<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[LUKENILAND.CO.UK]]></title><description><![CDATA[Musings on Programming, Technology and life ]]></description><link>https://lukeniland.co.uk/</link><image><url>https://lukeniland.co.uk/favicon.png</url><title>LUKENILAND.CO.UK</title><link>https://lukeniland.co.uk/</link></image><generator>Ghost 5.50</generator><lastBuildDate>Sat, 25 Apr 2026 21:22:08 GMT</lastBuildDate><atom:link href="https://lukeniland.co.uk/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Vue 2 Testing With Pinia]]></title><description><![CDATA[My issue came when I tried to migrate my tests once the Vuex stores in the components had been replaced with Pinia ones....]]></description><link>https://lukeniland.co.uk/vue-2-testing-with-pinia/</link><guid isPermaLink="false">64c13f566f2e6804c6b962ba</guid><category><![CDATA[Programming]]></category><category><![CDATA[vue]]></category><dc:creator><![CDATA[Luke Niland]]></dc:creator><pubDate>Fri, 04 Aug 2023 14:03:11 GMT</pubDate><content:encoded><![CDATA[<p>So I do some work for a company where we are building an application in vue. When we started, vue3 had literally only just been released, the ecosystem around it was very immature and a couple of key libraries (vuetify) hadn&apos;t even been ported over yet, so at the time we decided to write the app using vue2.</p><p>However, fast forward a year, and the vue3 ecosystem is a lot more stable and development has stopped on vue2, so it looks like it&apos;s time to make the upgrade. 
Now the application isn&apos;t that huge, but moving all the components over will take time. Luckily, we should be able to do it gradually instead of big bang, as vue2 components can be written using the new composition API.</p><p>Another change is the move away from <a href="https://vuex.vuejs.org/?ref=lukeniland.co.uk">Vuex </a>to <a href="https://pinia.vuejs.org/?ref=lukeniland.co.uk">Pinia </a>as the state store. Again this is backward compatible with vue2, so this seemed like a good place to start. </p><p>I&apos;m not going to talk much about the actual migration of the stores here, but it was actually pretty straightforward, and they are much more pleasant to use than vuex stores; the issue I had was testing with them.</p><h2 id="please-just-give-me-an-example-%F0%9F%93%9D">Please just give me an example! &#x1F4DD;</h2><p>My issue came when I tried to migrate my tests once the Vuex stores in the components had been replaced with Pinia ones. Obviously, they all started to fail, so I headed over to the testing section of the documentation (<a href="https://pinia.vuejs.org/cookbook/testing.html?ref=lukeniland.co.uk">https://pinia.vuejs.org/cookbook/testing.html</a>) to see how to get going.</p><p>Now it might just be me, but I struggled to understand a couple of the things going on here. My main issue was there&apos;s a lack of example component tests online showing how to simply create and inject a mocked Pinia store into the component you are testing. After much head-scratching I finally got one working, as below:</p><pre><code class="language-javascript">import { mount, createLocalVue } from &apos;@vue/test-utils&apos;
import MyComponent from &apos;@/components/MyComponent&apos;
import { PiniaVuePlugin, setActivePinia, defineStore } from &apos;pinia&apos;
import { createTestingPinia } from &apos;@pinia/testing&apos;

const localVue = createLocalVue()
localVue.use(PiniaVuePlugin)

describe(&apos;MyComponent.vue&apos;, () =&gt; {
    let wrapper, ourPinia

    // create pinia instance. No auto-stubbing, so we control the outcomes
    ourPinia = createTestingPinia({
        stubActions: false,
    })

    // activate our instance
    setActivePinia(ourPinia)
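
    // NB (assumption - check the @pinia/testing docs for your setup): if the
    // test runner is not auto-detected you may also need to pass a spy factory,
    // e.g. createTestingPinia({ stubActions: false, createSpy: jest.fn })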

    const actionId = 1

    const actionsResponse = [
        {
            id: actionId,
            triggerFormSubmit: true
        }
    ]


    const mountWithProps = function () {
        const app = document.createElement(&apos;div&apos;)
        app.setAttribute(&apos;data-app&apos;, true)
        document.body.append(app)

        wrapper = mount(MyComponent, {
            localVue,
            pinia: ourPinia,                
        })
    }

    beforeEach(() =&gt; {

        //setup our store
        const useMyStore = defineStore(&apos;MyStore&apos;, {
            state: () =&gt; ({ actions: actionsResponse }),        
                getters: {
                    getAllActions: (state) =&gt; state.actions,
                },
                actions: {
                    setActions() {
                        // make sure the promise resolves, otherwise callers
                        // awaiting this action will hang
                        return new Promise((resolve) =&gt; {
                            this.actions = actionsResponse
                            resolve()
                        })
                    },
                },
        })

        useMyStore(ourPinia) 
    })

    afterEach(() =&gt; {
        wrapper.destroy()
    })


    it(&apos;onActionTrigger with triggerFormSubmit set emits event&apos;, async () =&gt; {
        // assign
        mountWithProps()

        // act
        wrapper.vm.onActionTrigger(actionId)

        // assert - await the next tick so the emit has actually happened
        // before we check for it
        await wrapper.vm.$nextTick()
        expect(wrapper.emitted().actionWorkFlowSubmitTrigger).toBeTruthy()
    })

})</code></pre><h2 id="so-whats-going-on-%F0%9F%A4%94">So, what&apos;s going on? &#x1F914;</h2><p>The first couple of bits are actually pretty well documented. We create a <em>localvue</em> and push the <em>PiniaVuePlugin </em>in. Then we create a testing pinia instance using <em>createTestingPinia</em>; here I&apos;m also telling it not to auto-mock the actions so we can decide what happens. Then we activate our instance ready to use. </p><p>Once we have our instance set up, we need to create our mocked store. Make sure to name it the same as the &apos;real&apos; one so it drops in over the top of it. We define our store as if creating a real one, but here we just mock out the state/getters/actions. Then we pass our store into the active Pinia instance. </p><p>One of the bits that had me stumped for a while was that, by default, if you don&apos;t include the <em>stubActions: false</em> option when creating the test Pinia, all the actions get auto-mocked. This could be fine in some instances, but the auto-mocked actions don&apos;t return promises, so all my calls failed as I waited for them to resolve. This is why I am manually mocking any actions and making sure to return a promise where required. </p>]]></content:encoded></item><item><title><![CDATA[Updating ssh on PowerShell to trust new host keys 🔒]]></title><description><![CDATA[<p>So, some of BitBucket&apos;s host keys have been leaked, so they are moving to new keys. 
This is all fine, but my current ssh setup wasn&apos;t using the new algorithms, as I could see when running</p><pre><code>ssh git@bitbucket.org host_key_info</code></pre><p>The output was]]></description><link>https://lukeniland.co.uk/updating-ssh-key/</link><guid isPermaLink="false">647600246f2e6804c6b9627a</guid><category><![CDATA[security]]></category><dc:creator><![CDATA[Luke Niland]]></dc:creator><pubDate>Tue, 30 May 2023 14:04:29 GMT</pubDate><content:encoded><![CDATA[<p>So, some of BitBucket&apos;s host keys have been leaked, so they are moving to new keys. This is all fine, but my current ssh setup wasn&apos;t using the new algorithms, as I could see when running</p><pre><code>ssh git@bitbucket.org host_key_info</code></pre><p>The output was telling me I was using the old ssh-rsa algo, so I needed to update my known hosts with the new details.</p><p>The instructions on <a href="https://bitbucket.org/blog/ssh-host-key-changes?ref=lukeniland.co.uk">https://bitbucket.org/blog/ssh-host-key-changes</a> gave me the first part: I needed to update my known_hosts file with the new trusted keys. </p><pre><code>ssh-keygen -R bitbucket.org</code></pre><p>This cleared out the old entries stored against the hostname (the new keys get trusted the next time you connect and accept them), but it left the old entries stored against BitBucket&apos;s IP address behind, resulting in me getting an error when connecting. It was easy to get around, but it was very annoying.</p><pre><code>Warning: the ECDSA host key for &apos;bitbucket.org&apos; differs from the key for the IP address &apos;xxxxxxxxxx&apos;
Offending key for IP in C:\Users\Luke/.ssh/known_hosts:9
Matching host key in C:\Users\Luke/.ssh/known_hosts:13
Are you sure you want to continue connecting (yes/no)? yes</code></pre><p>The only way I could find to clean this up was by opening the known_hosts file and removing the offending line (in my case line 9); once I ran the connection again, I stopped getting the error. (Alternatively, <code>ssh-keygen -R</code> also accepts an IP address, so running it against the offending IP should clear that entry too.)</p>]]></content:encoded></item><item><title><![CDATA[C# Custom API Key Authentication Handler 🔒]]></title><description><![CDATA[As part of a recent project we realized some form of API key auth was going to be required, and we ended up rolling a custom, fairly simple version by implementing our own version of the  Microsoft.AspNetCore.Authentication.AuthenticationHandler]]></description><link>https://lukeniland.co.uk/custom-api-key-authentication-handler-in-c/</link><guid isPermaLink="false">617bcc2479b8291370b2a54e</guid><category><![CDATA[Programming]]></category><category><![CDATA[c#]]></category><dc:creator><![CDATA[Luke Niland]]></dc:creator><pubDate>Tue, 09 May 2023 18:00:00 GMT</pubDate><content:encoded><![CDATA[<p>Ahh, authentication, the word that all developers dread. While it has become a bit easier to work with in C# over the last few years, it&apos;s still sometimes not all that straightforward. <br>As part of a recent project we realized some form of API key auth was going to be required, and we ended up rolling a custom, fairly simple version by implementing our own version of the &#xA0;<em>Microsoft.AspNetCore.Authentication.AuthenticationHandler. </em></p><h2 id="so-how-does-it-work">So, how does it work?</h2><p>When we call a method on a controller, we can use the <em>Authorize</em> attribute to tell the application a user must be authenticated to use said method or controller. If the user is not authenticated then an HTTP 401 is normally sent back. There are various ways to wire up what this <em>Authorize</em> attribute actually does. 
In our case, we are going to create a new handler for the authentication that is going to pull an HTTP header from the request, check it&apos;s valid, then return the status from our authentication handler.</p><h2 id="our-new-handler">Our new handler</h2><p>So first, we need to scaffold our handler up. Create a new class that inherits from the <em>AuthenticationHandler </em>class and brings in all the boilerplate code</p><pre><code class="language-csharp">public class ApiKeyAuthenticationHandler : AuthenticationHandler&lt;AuthenticationSchemeOptions&gt;
{
    public ApiKeyAuthenticationHandler(
        IOptionsMonitor&lt;AuthenticationSchemeOptions&gt; options,
        ILoggerFactory logger,
        UrlEncoder encoder,
        ISystemClock clock)
        : base(options, logger, encoder, clock)
    {
    }

    // our authentication logic will go in here
    protected override Task&lt;AuthenticateResult&gt; HandleAuthenticateAsync()
    {
        throw new NotImplementedException();
    }
}
</code></pre><p>Pretty standard stuff, we&apos;re just overriding the <em>HandleAuthenticateAsync </em>method with our own code. Next, we need to have a list of valid API keys injected into our handler. In this example, we are just going to drop them into our <em>appsettings.json </em>file as a dictionary and bring them in using IOptions (don&apos;t forget to register them in DI). We also need to set up a couple of string constants for our auth scheme name and API key HTTP header name. So now our boilerplate code will look like this:</p><pre><code class="language-csharp">public class ApiKeyAuthenticationHandler : AuthenticationHandler&lt;AuthenticationSchemeOptions&gt;
{
    public static readonly string SchemeName = &quot;APIKeyAuthentication&quot;;
    public static readonly string ApiKeyHeaderName = &quot;X-API-Key&quot;;

    private readonly IDictionary&lt;string, string&gt; _apiKeys;

    public ApiKeyAuthenticationHandler(
        IOptionsMonitor&lt;AuthenticationSchemeOptions&gt; options,
        ILoggerFactory logger,
        UrlEncoder encoder,
        ISystemClock clock,
        IOptions&lt;IDictionary&lt;string, string&gt;&gt; apiKeys)
        : base(options, logger, encoder, clock)
    {
        _apiKeys = apiKeys.Value;
    }

    protected override Task&lt;AuthenticateResult&gt; HandleAuthenticateAsync()
    {
    }
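
    // Illustrative only (not from the original post): the API keys could live
    // in appsettings.json as a simple key -&gt; customer-name dictionary, e.g.
    //   &quot;ApiKeys&quot;: { &quot;some-long-random-key&quot;: &quot;Customer A&quot; }
    // and be bound in ConfigureServices with something like:
    //   services.Configure&lt;Dictionary&lt;string, string&gt;&gt;(Configuration.GetSection(&quot;ApiKeys&quot;));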
}</code></pre><p>Cool, so once we have that we can start to actually go and see if we have the header, and if we do then pull the payload out and check it exists in our allowed list of clients. We&apos;re going to wrap that up in a try/catch block so that if anything goes wrong we return a fail (unauthorized) result. So let&apos;s build that logic up:</p><pre><code class="language-csharp"> try
    {
        if (!Request.Headers.ContainsKey(ApiKeyHeaderName))
            {
                Logger.LogWarning(&quot;Missing {HeaderName} Header&quot;, ApiKeyHeaderName);
                return Task.FromResult(AuthenticateResult.Fail($&quot;Missing {ApiKeyHeaderName} Header&quot;));
            }

        // the X-API-Key header carries a plain value, so read it straight off
        // the request rather than parsing it as an Authorization-style header
        var apiKeyHeader = Request.Headers[ApiKeyHeaderName].ToString();
        if (!_apiKeys.ContainsKey(apiKeyHeader))
        {
            Logger.LogWarning(&quot;Invalid API Key passed: {ApiKey}&quot;, apiKeyHeader);
            return Task.FromResult(AuthenticateResult.Fail(&quot;API Key not found&quot;));
        }
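
        // Illustrative only - the original post elides this step here: the key
        // is valid, so grab the customer we hold against it (assuming the
        // dictionary value is the customer name and APICustomer has a settable
        // Name property)
        var customer = new APICustomer { Name = _apiKeys[Request.Headers[ApiKeyHeaderName].ToString()] };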

        
    }
    catch (Exception e)
    {
        Logger.LogError(e, &quot;API Key authentication failed&quot;);
        return Task.FromResult(AuthenticateResult.Fail(&quot;API authentication failed. Unable to authenticate&quot;));
    }</code></pre><p>So most of that should be pretty self-explanatory: we are checking to make sure we have the expected HTTP header, and if we don&apos;t then we return a fail result. </p><p>If we do, then we try and find the key in our dictionary of known API keys; if it doesn&apos;t exist then at this point we again send a failed result back. If it does then we need to move on to the next part, creating our <em>AuthenticationTicket.</em></p><p>First, we can create some claims for the user. In this example we are just going to add one that holds the customer&apos;s name, which will come from the information we hold in our API keys settings in config. Update this with any claims you might need.</p><pre><code class="language-csharp"> private static IEnumerable&lt;Claim&gt; CreateClaimsForCustomer(APICustomer customer)
        {
            return new List&lt;Claim&gt;
            {
                new Claim(ClaimTypes.NameIdentifier, customer.Name)
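                // Illustrative only: add any other claims your app needs here,
                // e.g. new Claim(ClaimTypes.Role, &quot;ApiClient&quot;)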
            };
        }</code></pre><p>Now we can create claims, we just need to create our ticket, add the claims, and return the ticket along with a Success result</p><pre><code class="language-csharp">var identity = new ClaimsIdentity(CreateClaimsForCustomer(customer), Scheme.Name);
var principal = new ClaimsPrincipal(identity);
var ticket = new AuthenticationTicket(principal, Scheme.Name);
return Task.FromResult(AuthenticateResult.Success(ticket));</code></pre><p>That should be it for our handler</p><h2 id="wire-it-into-the-pipeline">Wire It into the Pipeline</h2><p>Now it&apos;s just a case of plugging it into our <em>Startup </em>file. Under <em>ConfigureServices </em>add the following section somewhere</p><pre><code class="language-csharp">services.AddAuthentication(ApiKeyAuthenticationHandler.SchemeName)
    .AddScheme&lt;AuthenticationSchemeOptions, ApiKeyAuthenticationHandler&gt;(ApiKeyAuthenticationHandler.SchemeName, null);</code></pre><p>This tells the application to add our handler as an authentication method. Then again in <em>Startup </em>under <em>Configure</em> we need to add the line (make sure it comes before any <em>app.UseAuthorization()</em> call in the pipeline)</p><pre><code class="language-csharp">app.UseAuthentication();</code></pre><p>And we should be ready to go! Now in your controllers, above the class just add the <em>[Authorize]</em> attribute and your controller will require authentication using our handler. Simple, but a good start to understanding and controlling access to simple applications.</p>]]></content:encoded></item><item><title><![CDATA[Ghost Blog and Syntax Highlighting]]></title><description><![CDATA[<p>Just a quick post as this took me a bit of working out. I want to start having some code snippets on here, obviously formatted nicely and with syntax highlighting. As most of my code is c# there were a couple of steps I had to follow to get it</p>]]></description><link>https://lukeniland.co.uk/ghost-blog-and-syntax-highlighting/</link><guid isPermaLink="false">64088a956f2e6804c6b96080</guid><dc:creator><![CDATA[Luke Niland]]></dc:creator><pubDate>Wed, 08 Mar 2023 13:26:41 GMT</pubDate><content:encoded><![CDATA[<p>Just a quick post as this took me a bit of working out. I want to start having some code snippets on here, obviously formatted nicely and with syntax highlighting. 
As most of my code is c# there were a couple of steps I had to follow to get it to work.</p><p>I&apos;m not going to go into how to add prism in detail as there are plenty of posts about that, this is more to specifically get c# snippets working.</p><p>So it turns out as well as needing the css file (<em>prism-okaidia</em> for example) and the core js (<em>prism.min.js</em>) and csharp (<em>prism-csharp.min.js</em>) scripts pulled down, you also need 2 more as well</p><p><em>prism-c.min.js</em><br><em>prism-clike.min.js</em></p><p>This is because the <em>prism-csharp.min.js </em>has a dependency on <em>prism-clike.min.js</em> and that in turn has one on <em>prism-c.min.js</em>. Once you reference all these files from the CDN (<a href="https://cdnjs.com/libraries/prism?ref=lukeniland.co.uk">https://cdnjs.com/libraries/prism</a>) your csharp snippets should start to show up nicely, like:</p><pre><code class="language-csharp">/// &lt;summary&gt;
///     I am a test class
/// &lt;/summary&gt;
public class TestClass1
{
	private string _string1 {get; set;}
	public string String1 {get;}
}</code></pre>]]></content:encoded></item><item><title><![CDATA[Dapr Event Publishing and Double types 👨‍💻]]></title><description><![CDATA[Dapr (not to be confused with Dapper) is a really good piece of software to use in your Kubernetes clusters, it helps with everything from inter-service communication to handling event publishing.]]></description><link>https://lukeniland.co.uk/dapranddoubletypes/</link><guid isPermaLink="false">6259907c79b8291370b2a703</guid><category><![CDATA[Programming]]></category><category><![CDATA[dapr]]></category><category><![CDATA[c#]]></category><dc:creator><![CDATA[Luke Niland]]></dc:creator><pubDate>Wed, 08 Mar 2023 13:08:45 GMT</pubDate><content:encoded><![CDATA[<p><a href="https://dapr.io/?ref=lukeniland.co.uk">Dapr </a>(not to be confused with <a href="https://github.com/DapperLib/Dapper?ref=lukeniland.co.uk">Dapper</a>) is a really good piece of software to use in your Kubernetes clusters, it helps with everything from inter-service communication to handling event publishing.</p><p>I have however come across a bit of an issue with the way it handles the rounding/accuracy of a double data type.</p><h3 id="the-problem-">The Problem...</h3><p>I had some geography points (lat/longs) that were being updated in a service, and this service was then publishing the updates onto the service bus for other services to consume. One day a user pointed out that some of the positions of the lat/longs seemed to be slightly wrong, not massively but it was noticeable.</p><p>I checked the logs to find the data coming in and the points looked correct, but they were wrong when they ended up in the database. As I looked into it further I could see that the numbers were losing accuracy somewhere down the line. </p><p>I created a test case and pumped it through the system, I could see the data going into the event fine but when it was consumed on the other side it had changed. 
Hmm, what&apos;s going on here?</p><p>As I couldn&apos;t see anything actually wrong, I headed over to the github issues for the netcore dapr, and after a search yielded no results I asked the question myself. </p><p>It sounded like it was to do with the way the type was being converted into a JSON object, but no one really had a solution for fixing it, apart from &apos;use a string instead of a double&apos;. I wasn&apos;t all that happy with that answer, so I came up with a different solution instead.</p><h3 id="hacky-fix-incoming-">Hacky Fix incoming...</h3><p>So instead of using a string in my model to hold the data, I used a custom <a href="https://learn.microsoft.com/en-us/dotnet/standard/serialization/system-text-json/converters-how-to?pivots=dotnet-7-0&amp;ref=lukeniland.co.uk">JsonConverter </a>instead.</p><p>The idea is to pass the data through another process as it gets serialized/deserialized so we can change its format etc. To create a new converter you just need to implement the <em>JsonConverter&lt;T&gt;</em> type, and then you can override the <em>Read </em>and <em>Write </em>methods. This is a converter I created:</p><pre><code class="language-csharp">public class JsonDoubleLatLongConverter : JsonConverter&lt;double&gt;
{
    public override double Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        // doubles written by this converter arrive as strings, so parse them
        // back using the same invariant culture we write with
        if (reader.TokenType == JsonTokenType.String)
        {
            return double.Parse(reader.GetString(), CultureInfo.InvariantCulture);
        }

        return reader.GetDouble();
    }

    public override void Write(Utf8JsonWriter writer, double value, JsonSerializerOptions options)
    {
        // &quot;G17&quot; is the round-trip format for a double - all 17 significant
        // digits are kept, so no accuracy is lost
        writer.WriteStringValue(value.ToString(&quot;G17&quot;, CultureInfo.InvariantCulture));
    }
}
</code></pre><p>So on write (serialize), we are outputting our double as a string using the G17 round-trip format, and when we read we parse the string back into a double; this way it keeps our accuracy.</p><p>So now to use this converter, we just need to add a new attribute to our model properties where we want it to be used:</p><pre><code class="language-csharp">[JsonConverter(typeof(JsonDoubleLatLongConverter))]
public double? LocationLongitude { get; set; }

[JsonConverter(typeof(JsonDoubleLatLongConverter))]
public double? LocationLatitude { get; set; }
</code></pre><p>So now when the events get published we can keep the properties as the correct types, but they are no longer losing any accuracy. </p>]]></content:encoded></item><item><title><![CDATA[Custom GMMK Fullsize]]></title><description><![CDATA[Custom painted GMMK Fullsize ISO]]></description><link>https://lukeniland.co.uk/custom-gmmk-fullsize/</link><guid isPermaLink="false">6221de5679b8291370b2a671</guid><category><![CDATA[Keyboards]]></category><dc:creator><![CDATA[Luke Niland]]></dc:creator><pubDate>Fri, 04 Mar 2022 11:00:23 GMT</pubDate><media:content url="https://lukeniland.co.uk/content/images/2022/03/20220304_092432.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://lukeniland.co.uk/content/images/2022/03/20220304_092432.jpg" alt="Custom GMMK Fullsize"><p>So, for quite a long time I&apos;ve wanted to make a keyboard for my wife. The issue has always been she needs a full size ISO board, and the custom community never seems to sell them. </p><p>Last year I ended up on a whim buying the Osumekeys <a href="https://www.osumekeys.com/product/sakura?ref=lukeniland.co.uk">sakura </a>set, mainly because they used the same colourway my wife had picked out for our bedroom, so now I really needed to get a board built! </p><p>With there being no full size board group buys or anything in sight, I came across the GMMK barebones kit. It almost ticked every box: ISO, hotswap keys, full size, reasonable price and in stock. The issue however was it was only available in black.</p><p>Now, I really wanted the board to be white to match the colourway of the keys, so I decided to do something I&apos;d never tried before and paint the board. It&apos;s made of Aluminium, so I read up on the best way to paint it. </p><h3 id="first-step-strip-it-down">First Step, strip it down </h3><p>This was fairly straightforward, and there are youtube videos to help. 
First off, there are about 8 screws in the top plate you need to remove, then once the top and PCB are apart from the bottom, there are more screws (hidden under masking tape) to remove the PCB from the top. Once that&apos;s done you need to carefully remove the cable from the PCB, and keep all the screws and PCB safe to one side.</p><p>Next step is to sand down the top and bottom parts of the board. I used a light grade sandpaper; this helps the primer layer &apos;stick&apos; to the board. This is the first time I&apos;ve tried painting Aluminium, but the opinion seemed to be you <em>needed</em> a primer layer on first or the paint would not stick. </p><h3 id="next-lets-paint">Next, let&apos;s paint</h3><p>Like I mentioned before, I used a primer for my first layer; this was a special kind used for metal that etched the surface so the paint would stick better. Then I had my main coloured paint (white), and I also had a sealer/lacquer layer as well to help protect the paint. All these items were car paints that are normally used to touch up scratches, but they seemed to be the best I could find.</p><p>Now I had all my paints, I started off with the primer: I sprayed a thin layer on both sides of the top and bottom pieces, left it to dry for 24 hours, then repeated another 2 times. </p><p>Then it was the same with the main paint, thin layer over the primer layer, let it dry and repeat until I was happy with the finish </p><figure class="kg-card kg-image-card"><img src="https://lukeniland.co.uk/content/images/2022/03/20220125_145142.jpg" class="kg-image" alt="Custom GMMK Fullsize" loading="lazy"></figure><p>Then same again for the lacquer, a couple of layers with drying in between</p><h3 id="putting-it-all-back-together">Putting it all back together</h3><p>Once it was dry, it was time to put it all back together, hoping I hadn&apos;t managed to lose any screws. This was pretty easy and was just a case of doing what I did to take it apart but backwards. 
</p><figure class="kg-card kg-image-card"><img src="https://lukeniland.co.uk/content/images/2022/03/20220206_190154.jpg" class="kg-image" alt="Custom GMMK Fullsize" loading="lazy"></figure><p>Now the first time I did this and plugged the keyboard in, I was getting no power at all to the board, which did worry me a bit. I took the top off again and realised I hadn&apos;t plugged the cable back into the PCB. Oh dear. Once it was back together again the board lit up as soon as it was plugged in</p><h3 id="switch-s-and-keys">Switches and Keys</h3><p>Almost there! So the board is hotswap, so that means we can change the switches out if my wife doesn&apos;t like them. I chose to start her off with Gateron Browns, I have used these before and think they are a good place to start with tactile switches. </p><p>After a few bent pins and tests, I filled the board with these switches.</p><figure class="kg-card kg-image-card"><img src="https://lukeniland.co.uk/content/images/2022/03/20220209_105718.jpg" class="kg-image" alt="Custom GMMK Fullsize" loading="lazy"></figure><p>Now time for the keys, like I said I got a set of Osume <a href="https://www.osumekeys.com/product/sakura?ref=lukeniland.co.uk">sakura</a>&apos;s for this build. I really liked the colourway, and they seem to be good quality PBT caps, with added iso support. The only slight issue was the colour of the blue was slightly off in the R1 that I got, but they are supposed to be sending new versions that closer match the colour they should have been. </p><p>And here is the result, I&apos;m waiting for her birthday to give her the board, hopefully she&apos;s gonna like it! 
</p><figure class="kg-card kg-image-card"><img src="https://lukeniland.co.uk/content/images/2022/03/20220304_092432-1.jpg" class="kg-image" alt="Custom GMMK Fullsize" loading="lazy"></figure>]]></content:encoded></item><item><title><![CDATA[Adventures in MetaMask 📈]]></title><description><![CDATA[I was trying to send some USDT from MetaMask to Binance, but every time I tried the transaction it was failing, with a rather cryptic ...]]></description><link>https://lukeniland.co.uk/adventures-in-metamask/</link><guid isPermaLink="false">610063ee79b8291370b2a520</guid><category><![CDATA[CryptoCurrency]]></category><dc:creator><![CDATA[Luke Niland]]></dc:creator><pubDate>Wed, 28 Jul 2021 08:03:58 GMT</pubDate><content:encoded><![CDATA[<p>So just a quick post in case anyone else ever sees this. I was trying to send some USDT (although I think the token was irrelevant) from MetaMask to Binance, but every time I tried the transaction it was failing, with a rather cryptic message:<br><br><em>BEP-20 Token Transfer Error (Unable to locate corresponding Transfer Event Logs), Check with Sender.</em></p><figure class="kg-card kg-image-card"><img src="https://lukeniland.co.uk/content/images/2021/07/image-1.png" class="kg-image" alt loading="lazy"></figure><p>I tried changing the gas fees, I even converted to USDT to see if that would make any difference but again same error.</p><p>The only thing I noticed was the amount I was trying to send had <em>lots </em>of decimal places, like 10.191919191919 or similar. Just as a test I rounded it up to a whole number, and next thing you know the transaction flew through. So, in future be careful using the max button as you could end up with this happening to you! 
</p>]]></content:encoded></item><item><title><![CDATA[DeFi you are the future]]></title><description><![CDATA[I kept hearing the word DeFi, but never really understood what it was, just that people kept saying it was the future, so this week I've decided to have a bit more of a deep dive into it. ]]></description><link>https://lukeniland.co.uk/defithefuture/</link><guid isPermaLink="false">60c8bde679b8291370b2a347</guid><category><![CDATA[DeFi]]></category><category><![CDATA[CryptoCurrency]]></category><dc:creator><![CDATA[Luke Niland]]></dc:creator><pubDate>Sun, 20 Jun 2021 14:54:41 GMT</pubDate><content:encoded><![CDATA[<p>I&apos;ve been interested in cryptocurrencies (BTC, ETH etc.) for a few years, I had (and sold) some coins in the 2017/2018 boom and crash, got scared due to its volatility and then dived back in again last year during the pandemic.</p><p>I kept hearing the word DeFi, but never really understood what it was, just that people kept saying it was the future, so this week I&apos;ve decided to have a bit more of a deep dive into it. </p><p>The thing that really got me interested was watching the gas fees on the Ethereum network go crazy a couple of weeks ago (I think at one point the average fee was hitting over $20). If you don&apos;t know what a <em>gas fee</em> is then in simple terms it&apos;s how much it costs for a transaction on the Ethereum network. </p><p>While I was reading about the high gas fees I started reading about the <a href="https://polygon.technology/?ref=lukeniland.co.uk">Polygon network</a>. This is pretty much a copy of the Ethereum blockchain, but with the goal to make the transactions much faster, and a lot cheaper. Like much cheaper, less than 1c in most cases. This opens up the world of DeFi to normal retail investors, as moving your money into the DeFi exchanges and swapping your assets no longer costs a fortune. 
</p><h3 id="my-first-defi-transaction">My first DeFi Transaction </h3><p>So first of all I needed to get some of my collateral onto the Polygon network, and for that you need a bridge. I had some USDC on the BSC that I could play with, so the first thing I did was send that to Metamask; that was probably the first expense, as I had to pay about 30c or something to get it into there. </p><p>Next I needed to swap the coins onto the Polygon network. I was originally going to use <a href="https://bridge.orbitchain.io/?ref=lukeniland.co.uk">bridge.orbitchain.io</a> but that only supports a small subset of coins on the BSC that I didn&apos;t have. After a bit of searching I came across <a href="https://xpollinate.io/?ref=lukeniland.co.uk">xpollinate.io</a>. From here you can go from the BSC to the Polygon network with one of the stablecoins. Win!</p><figure class="kg-card kg-image-card"><img src="https://lukeniland.co.uk/content/images/2021/06/image.png" class="kg-image" alt loading="lazy"></figure><p><em>NB - There was a slight issue the first time I tried as there wasn&apos;t enough liquidity for my coin. If that&apos;s the case give it a bit of time and try again later</em></p><p>Once I&apos;d told it to swap, I had to allow the transaction in Metamask, and a few minutes later (I think I had to wait about 5 minutes) I had the USDC sat in my wallet on the Polygon network, sweet!</p><h3 id="defi-exchange">DeFi Exchange </h3><p>Next I needed an exchange. Now, as I&apos;m new to the space I wanted to start out with something as low risk as possible, where I could just add my assets to a <a href="https://coinmarketcap.com/alexandria/glossary/liquidity-pool?ref=lukeniland.co.uk">Liquidity Pool </a>and let them earn some interest, and if possible some staking rewards as well. <br>My first thought was to use <a href="https://aave.com/?ref=lukeniland.co.uk">Aave </a>as I had heard lots of people talking about them on reddit/youtube. 
Using them you can add your assets to an LP, and then take out a loan using your staked assets as collateral, which you can then go and invest in other projects. At the time I was looking, the rewards for taking out a loan were actually higher than the interest on it! While this all seemed tempting, I wasn&apos;t really ready to get that deep into it yet. That&apos;s when I stumbled across <a href="https://lukeniland.co.uk/p/2682d9b8-6e61-40e2-a19f-87eec46f6ffb/polygon.curve.fi">polygon.curve.fi</a>.</p><p>Curve is pretty much an LP for stablecoins: you can stake on one of their various pools, getting a reasonable APY and also rewards (such as Matic and CRV). This looked like it was probably what I was looking for. Decent APY and reward tokens as well.</p><p>The next part is where DeFi comes into its own. When you want to use one of these services, there&apos;s no sign-up, no verification, nothing. It&apos;s not like Coinbase or Binance, where you need to get verified before you can buy anything. All you need to do is connect your wallet and you&apos;re good to go; it really is that simple. The only thing with Curve is that the interface takes a bit of getting used to; it&apos;s a bit like an old-school console application!</p><figure class="kg-card kg-image-card"><img src="https://lukeniland.co.uk/content/images/2021/06/image-1.png" class="kg-image" alt loading="lazy"></figure><p>So I simply picked a pool and deposited my USDC tokens into it, and that&apos;s it! My tokens are now in the LP, earning interest, and I&apos;m also earning Matic and CRV on top of them as well. I can easily withdraw my assets any time I want, or just go and claim my rewards if I want them. </p><h3 id="be-your-own-bank">Be your Own Bank</h3><p>So now, using services such as these, you can effectively be your own bank. You choose where you want to invest your money and who you loan it out to, based on the projects/companies that you like and that serve your best interests. 
Obviously it&apos;s still early days, and you have to be very careful where you deposit your money, but going forward this is giving the average person much more control over their own finances.<br>Oh, and this is not financial advice, always do your own research! </p><p><strong><em>Update - </em></strong>Since writing this originally, you can now withdraw from Binance directly to the Polygon chain, so you don&apos;t have to mess about with a bridge anymore. Game changer! <br><br>This only seems to be available for the Matic token, but that doesn&apos;t really matter. When you go to withdraw, just pick Matic, input your wallet address, pick the Matic network and boom!</p><figure class="kg-card kg-image-card"><img src="https://lukeniland.co.uk/content/images/2021/07/image-2.png" class="kg-image" alt loading="lazy"></figure><p>Compared to normal Polygon transactions the fee is a bit high (10c), but that&apos;s nothing really compared to the ETH network. It&apos;s a bit of a pain at the minute depositing onto Binance, as they have been blocked by quite a few banks, but you can always transfer your funds in from other exchanges.</p>]]></content:encoded></item><item><title><![CDATA[Thoughts on WSL 2]]></title><description><![CDATA[So a few weeks ago Microsoft released the May 2020 update for Windows 10. Normally this would pass me by, but this update contained a big update]]></description><link>https://lukeniland.co.uk/thoughts-on-wsl-2/</link><guid isPermaLink="false">5ef2063379b8291370b2a2ad</guid><category><![CDATA[Programming]]></category><dc:creator><![CDATA[Luke Niland]]></dc:creator><pubDate>Wed, 24 Jun 2020 06:37:22 GMT</pubDate><content:encoded><![CDATA[<p>So a few weeks ago Microsoft released the May 2020 update for Windows 10. 
Normally this would pass me by, but this update contained a big change to the way the Windows Subsystem for Linux works.</p><p>I use WSL/Ubuntu along with the Windows Terminal pretty much every day, and I flipping love it; having the Linux tools on my Windows machine is amazing and so useful as a developer. The only issue I have ever really had was the speed. It&apos;s not that bad, but it could be better &#x1F422;</p><p>So, this release was supposed to increase the speed five-fold or something, so I thought great, pulled the update down and updated my Ubuntu install to a WSL2 version. </p><p>After everything had finished updating, I fired up the terminal, did a git pull, did a bit of work, then did a git status. And I waited, and waited. It came back, but it was noticeably slower than it was before the update. So I tried another repo. Same again.</p><p>Now, all of my code is held on the Windows side of my machine, as I work in Visual Studio and VSCode for my development. As a test I created a new repo on the Linux file system, then opened it up in VSCode via the command line, made some changes and did a git status. It was pretty speedy. I did a commit, made some more changes/added files etc and it was still fast, like really fast. I went back to my original repo and tried the same, but it was slower again.</p><p>So, I&apos;ve come to the conclusion that WSL2 is only faster if you do your work on the Linux side exclusively; people like me who still open their stuff up from Windows and just use the command line tools don&apos;t seem to have been helped (quite the opposite in fact) by this release. It&apos;s not the end of the world and I&apos;ll be carrying on using it every day, it&apos;s just a little disappointing it&apos;s no faster &#x1F61F; </p>]]></content:encoded></item><item><title><![CDATA[But it's only a Keyboard 🖮]]></title><description><![CDATA[As a developer I spend a lot of time in front of my keyboard, but until recently I paid it very little attention. 
Sure, I have a...]]></description><link>https://lukeniland.co.uk/but-its-only-a-keyboard/</link><guid isPermaLink="false">5ee4decad3ce697dbd727d90</guid><category><![CDATA[Keyboards]]></category><dc:creator><![CDATA[Luke Niland]]></dc:creator><pubDate>Sun, 21 Jun 2020 20:51:55 GMT</pubDate><content:encoded><![CDATA[<p>As a developer I spend a lot of time in front of my keyboard, but until recently I paid it very little attention. Sure, I have a decent mechanical keyboard at home (a <a href="https://www.corsair.com/eu/en/k70-rgb-gaming-keyboard?ref=lukeniland.co.uk">Corsair K70</a>) that has served me well. Really well in fact; it&apos;s a great keyboard for development and gaming.<br><br>However, since the start of the year I&apos;ve found myself being drawn more towards the custom mechanical keyboard community, and my current drive to become more efficient with my coding by using more shortcuts has piqued my interest.</p><p>So I started lurking in a couple of places: the Reddit community <a href="https://www.reddit.com/r/CustomKeyboards/?ref=lukeniland.co.uk">r/CustomKeyboards</a>, and the forum over at <a href="https://geekhack.org/?ref=lukeniland.co.uk">geekhack.org</a>. People on there are doing some amazing work with their boards; some of them are works of art.</p><p>One of the first things that struck me was that a big percentage of the builds were 60% keyboards. This essentially means the function keys on the top row are missing, there&apos;s no number pad and no dedicated arrow keys. There are also a few TKL (tenkeyless) layouts that have no number pad, and a few split styles.</p><h3 id="decisions-decisions">Decisions, Decisions</h3><p>So I thought I&apos;d have a try at making my first custom board. Now, when you start to look into this you&apos;ll soon find that a) lots of the custom stuff people sell is <em>really </em>expensive and b) you have to wait <em>ages</em> to get your gear. Especially if you go for a group buy. 
As I&apos;m not sure what the best layout for me really is, I don&apos;t want to start dropping hundreds of pounds on something quite yet, so with that in mind I started looking on the big Chinese retail sites <a href="https://www.banggood.com/?ref=lukeniland.co.uk">banggood</a> and <a href="https://www.aliexpress.com/?ref=lukeniland.co.uk">aliexpress</a>. Now, these don&apos;t have anywhere near the best quality gear, but it&apos;s pretty cheap and there&apos;s a good selection. </p><p>I&apos;d already decided to have a try with a 60% layout; I loved the look of them, I rarely use my number pad and there was a good selection of base kits. In the end I went with the following:</p><!--kg-card-begin: markdown--><ul>
<li>GK61 Hot Swappable 60% RGB Keyboard from banggood. This was a full kit with a PCB mounted in a case, and the PCB was hot-swappable, so you don&apos;t have to solder the switches in, meaning you can change them out if you want</li>
<li>Gateron Brown Switches from banggood. I use Cherry Reds at the moment; they have a nice feel to them but are maybe slightly loud. The Gaterons are slightly less &apos;clicky&apos; but should still have the same feel. Also, they are the same size as the Cherrys, so most keycaps will fit</li>
<li>PBT Joker ANSI Keycaps from Ebay. There were loads of group buy kits I was looking at, but most of those would take 3-4 months to get here and would cost about &#xA3;90. I wanted something now; I liked the way these looked, and the legends shine through so you can see the RGB, so I decided to spring for these</li>
<li>The cable. The keyboard has a USB-C port, and you did get a standard cable with the kit, but I opted to make my own. I&apos;ll do a post about that at some point.</li>
</ul>
<!--kg-card-end: markdown--><p>So that was it, I ordered all my stuff and sat back. The switches came in about a week, then I waited for the case/PCB kit. And waited, and waited, and waited...<br>Now I know ordering from aliexpress/banggood can be a bit hit and miss, but I had to wait about 2 months for the rest of my order to arrive. Yep, 2 months. I was giving up hope and then one day, it just turned up. I think you also need quite a bit of patience to play the custom keyboard game</p><h3 id="the-mail-the-mail-is-here">The mail, the mail is here</h3><p>So finally, I had everything I needed. Building the board was really straightforward: the PCB was already mounted in the case, so all I had to do was pop the switches in and put the keycaps on. The stabilisers were even already in place, which was a nice touch. See the bottom of the post for some images of the various stages of the build.</p><p>So then, I started using it. I&apos;m actually typing this post on it. First thoughts: for the price I paid, it&apos;s a good quality board, and the caps are pretty good as well. I&apos;m also enjoying the brown switches; they have good feedback and sound, and I think I prefer them to the reds I&apos;ve been using for years. As for the actual layout, I&apos;m not sure I like it. I need to spend a bit more time with it before I commit, but at the moment I find I miss the dedicated arrow keys quite a lot, and it&apos;s strange not having any F keys as well. But I&apos;m going to stick with it for now and give it a good try. 
Watch this space...</p><figure class="kg-card kg-gallery-card kg-width-wide"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://lukeniland.co.uk/content/images/2020/06/IMG_20200604_124721682.jpg" width="2000" height="1500" loading="lazy" alt srcset="https://lukeniland.co.uk/content/images/size/w600/2020/06/IMG_20200604_124721682.jpg 600w, https://lukeniland.co.uk/content/images/size/w1000/2020/06/IMG_20200604_124721682.jpg 1000w, https://lukeniland.co.uk/content/images/size/w1600/2020/06/IMG_20200604_124721682.jpg 1600w, https://lukeniland.co.uk/content/images/size/w2400/2020/06/IMG_20200604_124721682.jpg 2400w" sizes="(min-width: 720px) 720px"></div><div class="kg-gallery-image"><img src="https://lukeniland.co.uk/content/images/2020/06/IMG_20200608_154735515.jpg" width="2000" height="1500" loading="lazy" alt srcset="https://lukeniland.co.uk/content/images/size/w600/2020/06/IMG_20200608_154735515.jpg 600w, https://lukeniland.co.uk/content/images/size/w1000/2020/06/IMG_20200608_154735515.jpg 1000w, https://lukeniland.co.uk/content/images/size/w1600/2020/06/IMG_20200608_154735515.jpg 1600w, https://lukeniland.co.uk/content/images/size/w2400/2020/06/IMG_20200608_154735515.jpg 2400w" sizes="(min-width: 720px) 720px"></div><div class="kg-gallery-image"><img src="https://lukeniland.co.uk/content/images/2020/06/IMG_20200610_185948250.jpg" width="2000" height="1500" loading="lazy" alt srcset="https://lukeniland.co.uk/content/images/size/w600/2020/06/IMG_20200610_185948250.jpg 600w, https://lukeniland.co.uk/content/images/size/w1000/2020/06/IMG_20200610_185948250.jpg 1000w, https://lukeniland.co.uk/content/images/size/w1600/2020/06/IMG_20200610_185948250.jpg 1600w, https://lukeniland.co.uk/content/images/size/w2400/2020/06/IMG_20200610_185948250.jpg 2400w" sizes="(min-width: 720px) 720px"></div></div><div class="kg-gallery-row"><div class="kg-gallery-image"><img 
src="https://lukeniland.co.uk/content/images/2020/06/IMG_20200610_185941131.jpg" width="2000" height="1500" loading="lazy" alt srcset="https://lukeniland.co.uk/content/images/size/w600/2020/06/IMG_20200610_185941131.jpg 600w, https://lukeniland.co.uk/content/images/size/w1000/2020/06/IMG_20200610_185941131.jpg 1000w, https://lukeniland.co.uk/content/images/size/w1600/2020/06/IMG_20200610_185941131.jpg 1600w, https://lukeniland.co.uk/content/images/size/w2400/2020/06/IMG_20200610_185941131.jpg 2400w" sizes="(min-width: 720px) 720px"></div><div class="kg-gallery-image"><img src="https://lukeniland.co.uk/content/images/2020/06/IMG_20200616_135414913.jpg" width="2000" height="1500" loading="lazy" alt srcset="https://lukeniland.co.uk/content/images/size/w600/2020/06/IMG_20200616_135414913.jpg 600w, https://lukeniland.co.uk/content/images/size/w1000/2020/06/IMG_20200616_135414913.jpg 1000w, https://lukeniland.co.uk/content/images/size/w1600/2020/06/IMG_20200616_135414913.jpg 1600w, https://lukeniland.co.uk/content/images/size/w2400/2020/06/IMG_20200616_135414913.jpg 2400w" sizes="(min-width: 720px) 720px"></div></div></div></figure><p></p><p></p>]]></content:encoded></item><item><title><![CDATA[Investors in Keystrokes 👨‍💻]]></title><description><![CDATA[I thought about how sometimes you watch someone writing code, and they never seem to go anywhere near the mouse, they just float across the keyboard like a person possessed, and figured that might be an area I could actually improve on]]></description><link>https://lukeniland.co.uk/investors-in-keystrokes/</link><guid isPermaLink="false">5edcf9e6f1382727730688d6</guid><category><![CDATA[Programming]]></category><dc:creator><![CDATA[Luke Niland]]></dc:creator><pubDate>Sun, 07 Jun 2020 14:32:32 GMT</pubDate><content:encoded><![CDATA[<h3 
id="so-when-this-lockdown-started-i-figured-instead-of-just-watching-more-box-sets-or-playing-games-i-should-maybe-try-and-learn-something-new-or-at-least-make-myself-more-efficient-at-something-">So when this lockdown started, I figured instead of just watching more box sets or playing games, I should maybe try and learn something new, or at least make myself more efficient at something. </h3><p>After getting slightly obsessed with keyboards (more on that in another post) I started wondering how people could actually be productive with a <a href="https://blog.wooting.nl/the-ultimate-guide-to-keyboard-layouts-and-form-factors/?ref=lukeniland.co.uk">60% keyboard</a>, and that got me thinking more about keyboard shortcuts.</p><p>I thought about how sometimes you watch someone writing code, and they never seem to go anywhere near the mouse, they just float across the keyboard like a person possessed, and figured that might be an area I could actually improve on.</p><h2 id="lets-go-old-school-">Let&apos;s go old school &#x1F5A5;&#xFE0F;</h2><p>Even though I do 95% of my work in Visual Studio and VS Code I did (for a couple of days) have another go at trying my hand at <a href="https://en.wikipedia.org/wiki/Vim_(text_editor)?ref=lukeniland.co.uk">VIM</a>. Now, I do kind of know how to use VIM; when I&apos;m in a Linux shell I do tend to use it to edit files, but I&apos;m not that proficient in it. So, I installed the plugins for VS Code and started trying to get the hang of it. I lasted about 2 days and just ended up turning it off again. My brain&apos;s too used to using the arrow and ctrl keys, and that&apos;s not really how VIM works. Ok, back to the drawing board.</p><h2 id="working-with-what-you-already-have">Working with what you already have</h2><p>After abandoning VIM, I figured I&apos;d just try and learn some more of the default keyboard shortcuts my editors of choice had built in. 
Now there are hundreds, and you can, if you want, make your own (more on that later), but there are a few I came across that I now find myself using every day. Not sure how I missed them, to be honest.</p><!--kg-card-begin: markdown--><p><code>Ctrl + Tab</code> - Ok, this one is so obvious now I don&apos;t know how I missed it. This opens a list of all your open tabs and lets you flick between them.</p>
<p><code>Ctrl + T</code> - I&apos;ve been using this one for a while; it lets you start typing to find a class/file/method and jump right into it</p>
<p><code>Ctrl + End/Home</code> - Jump to the start or the end of the open file. Just using <code>Home</code> or <code>End</code> will take you to the start or end of the current line</p>
<p><code>Ctrl + Shift + P</code> - In VS Code this opens the command palette, where you can run various commands from. I use this a lot if I copy a file in, to change the language mode and format the text</p>
<p><code>Ctrl + Left/Right</code> - Use this to skip whole words to the left and right. If you combine it with <code>Shift</code> you can select the words as well</p>
<p><code>Alt + Enter</code> - This was a bit of a game changer as well. In Visual Studio, when you get the dreaded red squiggle under something, place the cursor over the word in question and this keypress will open up the Quick Actions menu to let you fix it, or give you hints on what to do</p>
<!--kg-card-end: markdown--><p>Those are the main ones I&apos;ve started using, and I do feel like it&apos;s making me code a lot quicker. Not quite coding at the speed of thought, but faster than an asthmatic ant with some heavy shopping like I was before.</p><h2 id="knocking-it-up-a-notch">Knocking it up a notch</h2><!--kg-card-begin: markdown--><p>As I said before, I did try VIM for a couple of days, but just couldn&apos;t get used to it. I was a bit annoyed with myself as there&apos;s a lot to like. One of the main functions was being able to easily delete or change a word; I found myself using the <code>cw</code> and <code>dd</code> commands all the time.</p>
<p>So that&apos;s obviously not going to work in the normal world of Visual Studio, but I did find there were a couple of shortcuts to delete the word to the left/right of the cursor, and to delete the current line. By default these were bound to some keys already, but it made more sense (at least in my head) to bind them to alternate keys that mirrored VIM.</p>
<p>So, with that in mind, I fired up the keyboard shortcut settings in VS Code (<code>Ctrl+K</code> <code>Ctrl+S</code>), found the commands I needed and re-mapped them to some new keys, like so:</p>
<p>Delete word to left, command <code>deleteWordLeft</code>, new binding <code>Ctrl+D</code> <code>Ctrl+R</code><br>
Delete word to right, command <code>deleteWordRight</code>, new binding <code>Ctrl+D</code> <code>Ctrl+W</code><br>
Delete line, command <code>editor.action.deleteLines</code>, new binding <code>Ctrl+D</code> <code>Ctrl+D</code></p>
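<p>For reference, these remappings just end up as entries in VS Code&apos;s <code>keybindings.json</code> file (you can open it with the &quot;Preferences: Open Keyboard Shortcuts (JSON)&quot; command). A rough sketch of what mine looks like is below; the <code>when</code> clause is optional, it just limits the chords to when a text box has focus:</p>
<pre><code class="language-json">[
    { "key": "ctrl+d ctrl+r", "command": "deleteWordLeft", "when": "textInputFocus" },
    { "key": "ctrl+d ctrl+w", "command": "deleteWordRight", "when": "textInputFocus" },
    { "key": "ctrl+d ctrl+d", "command": "editor.action.deleteLines", "when": "textInputFocus" }
]</code></pre>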
<!--kg-card-end: markdown--><p>If I come across any others of use, or update my prefs with something else VIM-like, I&apos;ll update this post</p><h3 id="related-links">Related Links</h3><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://code.visualstudio.com/docs/getstarted/keybindings?ref=lukeniland.co.uk"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Visual Studio Code Key Bindings</div><div class="kg-bookmark-description">Here you will find the complete list of key bindings for Visual Studio Code and how to change them.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://code.visualstudio.com/favicon.ico" alt><span class="kg-bookmark-author">Visual Studio Code</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://code.visualstudio.com/assets/docs/getstarted/keybinding/customization_keybindings.png" alt></div></a></figure><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://github.com/VSCodeVim/Vim?ref=lukeniland.co.uk"><div class="kg-bookmark-content"><div class="kg-bookmark-title">VSCodeVim/Vim</div><div class="kg-bookmark-description">:star: Vim for Visual Studio Code. Contribute to VSCodeVim/Vim development by creating an account on GitHub.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://github.githubassets.com/favicons/favicon.svg" alt><span class="kg-bookmark-author">GitHub</span><span class="kg-bookmark-publisher">VSCodeVim</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://avatars0.githubusercontent.com/u/15813386?s=400&amp;v=4" alt></div></a></figure>]]></content:encoded></item></channel></rss>