<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[Dean Hume's Blog]]></title><description><![CDATA[My name is Dean Hume, and I am an author, blogger and am passionate about making games!]]></description><link>https://deanhume.com/</link><image><url>https://deanhume.com/favicon.png</url><title>Dean Hume&apos;s Blog</title><link>https://deanhume.com/</link></image><generator>Ghost 5.38</generator><lastBuildDate>Fri, 23 Feb 2024 07:34:04 GMT</lastBuildDate><atom:link href="https://deanhume.com/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Getting started with Azure's Hybrid and Embedded Text-to-Speech]]></title><description><![CDATA[<p>Over the past few months, I&apos;ve been experimenting with Azure&apos;s Text-to-Speech service. It is a super powerful API that enables fluid, natural-sounding text to speech that matches the tone and emotion of human voices.</p><p>Whether you are building an app or a game, Text-to-Speech can be</p>]]></description><link>https://deanhume.com/azure-hybrid-and-embedded-text-to-speech/</link><guid isPermaLink="false">6582ad95c919137deeb7a806</guid><category><![CDATA[Text-to-Speech]]></category><category><![CDATA[AI]]></category><dc:creator><![CDATA[Dean Hume]]></dc:creator><pubDate>Thu, 15 Feb 2024 09:41:58 GMT</pubDate><content:encoded><![CDATA[<p>Over the past few months, I&apos;ve been experimenting with Azure&apos;s Text-to-Speech service. It is a super powerful API that enables fluid, natural-sounding text to speech that matches the tone and emotion of human voices.</p><p>Whether you are building an app or a game, Text-to-Speech can be very useful. 
For example, think of the different stages of game development - during concept and pre-production, text-to-speech can help build out the feel of the game and enhance your scripts before you record with real voice actors. </p><figure class="kg-card kg-image-card"><img src="https://deanhume.com/content/images/2024/02/text-to-speech-game-development.jpg" class="kg-image" alt loading="lazy" width="1000" height="516" srcset="https://deanhume.com/content/images/size/w600/2024/02/text-to-speech-game-development.jpg 600w, https://deanhume.com/content/images/2024/02/text-to-speech-game-development.jpg 1000w" sizes="(min-width: 720px) 720px"></figure><p>During release and production, it can be used to provide accessibility options to suit the needs of your users. At the time of writing this article, there are over 456 voices across 147 languages that you can choose from! </p><p>As you are reading through this, you might be thinking to yourself... hang on... this uses a cloud service, so how would this work for an offline game? Or when a user loses connection?</p><p>This is where Hybrid (and Embedded) speech comes into play, and in this article, we are going to explore an example that will work in both online and offline scenarios.</p><h2 id="how-hybrid-speech-works">How Hybrid Speech works</h2><p>Hybrid speech uses the cloud speech service by default and embedded speech as a fallback in case cloud connectivity is limited or slow.</p><p>In fact, you could ship your entire app with just the embedded speech and not use the cloud service at all. It&apos;s worth mentioning that embedded speech is slightly limited: while its quality is good, the cloud option returns the highest-quality speech. To give you an idea of what this sounds like, let&apos;s compare the two versions. 
The first is the embedded speech version:</p><div class="kg-card kg-audio-card"><img src alt="audio-thumbnail" class="kg-audio-thumbnail kg-audio-hide"><div class="kg-audio-thumbnail placeholder"><svg width="24" height="24" fill="none" xmlns="http://www.w3.org/2000/svg"><path fill-rule="evenodd" clip-rule="evenodd" d="M7.5 15.33a.75.75 0 1 0 0 1.5.75.75 0 0 0 0-1.5Zm-2.25.75a2.25 2.25 0 1 1 4.5 0 2.25 2.25 0 0 1-4.5 0ZM15 13.83a.75.75 0 1 0 0 1.5.75.75 0 0 0 0-1.5Zm-2.25.75a2.25 2.25 0 1 1 4.5 0 2.25 2.25 0 0 1-4.5 0Z"/><path fill-rule="evenodd" clip-rule="evenodd" d="M14.486 6.81A2.25 2.25 0 0 1 17.25 9v5.579a.75.75 0 0 1-1.5 0v-5.58a.75.75 0 0 0-.932-.727.755.755 0 0 1-.059.013l-4.465.744a.75.75 0 0 0-.544.72v6.33a.75.75 0 0 1-1.5 0v-6.33a2.25 2.25 0 0 1 1.763-2.194l4.473-.746Z"/><path fill-rule="evenodd" clip-rule="evenodd" d="M3 1.5a.75.75 0 0 0-.75.75v19.5a.75.75 0 0 0 .75.75h18a.75.75 0 0 0 .75-.75V5.133a.75.75 0 0 0-.225-.535l-.002-.002-3-2.883A.75.75 0 0 0 18 1.5H3ZM1.409.659A2.25 2.25 0 0 1 3 0h15a2.25 2.25 0 0 1 1.568.637l.003.002 3 2.883a2.25 2.25 0 0 1 .679 1.61V21.75A2.25 2.25 0 0 1 21 24H3a2.25 2.25 0 0 1-2.25-2.25V2.25c0-.597.237-1.169.659-1.591Z"/></svg></div><div class="kg-audio-player-container"><audio src="https://deanhume.com/content/media/2024/02/device-22.wav" preload="metadata"></audio><div class="kg-audio-title">Embedded Text to Speech</div><div class="kg-audio-player"><button class="kg-audio-play-icon"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><path d="M23.14 10.608 2.253.164A1.559 1.559 0 0 0 0 1.557v20.887a1.558 1.558 0 0 0 2.253 1.392L23.14 13.393a1.557 1.557 0 0 0 0-2.785Z"/></svg></button><button class="kg-audio-pause-icon kg-audio-hide"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><rect x="3" y="1" width="7" height="22" rx="1.5" ry="1.5"/><rect x="14" y="1" width="7" height="22" rx="1.5" ry="1.5"/></svg></button><span class="kg-audio-current-time">0:00</span><div class="kg-audio-time">/<span 
class="kg-audio-duration">0:02</span></div><input type="range" class="kg-audio-seek-slider" max="100" value="0"><button class="kg-audio-playback-rate">1&#xD7;</button><button class="kg-audio-unmute-icon"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><path d="M15.189 2.021a9.728 9.728 0 0 0-7.924 4.85.249.249 0 0 1-.221.133H5.25a3 3 0 0 0-3 3v2a3 3 0 0 0 3 3h1.794a.249.249 0 0 1 .221.133 9.73 9.73 0 0 0 7.924 4.85h.06a1 1 0 0 0 1-1V3.02a1 1 0 0 0-1.06-.998Z"/></svg></button><button class="kg-audio-mute-icon kg-audio-hide"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><path d="M16.177 4.3a.248.248 0 0 0 .073-.176v-1.1a1 1 0 0 0-1.061-1 9.728 9.728 0 0 0-7.924 4.85.249.249 0 0 1-.221.133H5.25a3 3 0 0 0-3 3v2a3 3 0 0 0 3 3h.114a.251.251 0 0 0 .177-.073ZM23.707 1.706A1 1 0 0 0 22.293.292l-22 22a1 1 0 0 0 0 1.414l.009.009a1 1 0 0 0 1.405-.009l6.63-6.631A.251.251 0 0 1 8.515 17a.245.245 0 0 1 .177.075 10.081 10.081 0 0 0 6.5 2.92 1 1 0 0 0 1.061-1V9.266a.247.247 0 0 1 .073-.176Z"/></svg></button><input type="range" class="kg-audio-volume-slider" max="100" value="100"></div></div></div><p>And the second is the cloud-based version:</p><div class="kg-card kg-audio-card"><img src alt="audio-thumbnail" class="kg-audio-thumbnail kg-audio-hide"><div class="kg-audio-thumbnail placeholder"><svg width="24" height="24" fill="none" xmlns="http://www.w3.org/2000/svg"><path fill-rule="evenodd" clip-rule="evenodd" d="M7.5 15.33a.75.75 0 1 0 0 1.5.75.75 0 0 0 0-1.5Zm-2.25.75a2.25 2.25 0 1 1 4.5 0 2.25 2.25 0 0 1-4.5 0ZM15 13.83a.75.75 0 1 0 0 1.5.75.75 0 0 0 0-1.5Zm-2.25.75a2.25 2.25 0 1 1 4.5 0 2.25 2.25 0 0 1-4.5 0Z"/><path fill-rule="evenodd" clip-rule="evenodd" d="M14.486 6.81A2.25 2.25 0 0 1 17.25 9v5.579a.75.75 0 0 1-1.5 0v-5.58a.75.75 0 0 0-.932-.727.755.755 0 0 1-.059.013l-4.465.744a.75.75 0 0 0-.544.72v6.33a.75.75 0 0 1-1.5 0v-6.33a2.25 2.25 0 0 1 1.763-2.194l4.473-.746Z"/><path fill-rule="evenodd" clip-rule="evenodd" d="M3 1.5a.75.75 0 0 
0-.75.75v19.5a.75.75 0 0 0 .75.75h18a.75.75 0 0 0 .75-.75V5.133a.75.75 0 0 0-.225-.535l-.002-.002-3-2.883A.75.75 0 0 0 18 1.5H3ZM1.409.659A2.25 2.25 0 0 1 3 0h15a2.25 2.25 0 0 1 1.568.637l.003.002 3 2.883a2.25 2.25 0 0 1 .679 1.61V21.75A2.25 2.25 0 0 1 21 24H3a2.25 2.25 0 0 1-2.25-2.25V2.25c0-.597.237-1.169.659-1.591Z"/></svg></div><div class="kg-audio-player-container"><audio src="https://deanhume.com/content/media/2024/02/cloud-22.wav" preload="metadata"></audio><div class="kg-audio-title">Cloud Text to Speech</div><div class="kg-audio-player"><button class="kg-audio-play-icon"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><path d="M23.14 10.608 2.253.164A1.559 1.559 0 0 0 0 1.557v20.887a1.558 1.558 0 0 0 2.253 1.392L23.14 13.393a1.557 1.557 0 0 0 0-2.785Z"/></svg></button><button class="kg-audio-pause-icon kg-audio-hide"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><rect x="3" y="1" width="7" height="22" rx="1.5" ry="1.5"/><rect x="14" y="1" width="7" height="22" rx="1.5" ry="1.5"/></svg></button><span class="kg-audio-current-time">0:00</span><div class="kg-audio-time">/<span class="kg-audio-duration">0:02</span></div><input type="range" class="kg-audio-seek-slider" max="100" value="0"><button class="kg-audio-playback-rate">1&#xD7;</button><button class="kg-audio-unmute-icon"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><path d="M15.189 2.021a9.728 9.728 0 0 0-7.924 4.85.249.249 0 0 1-.221.133H5.25a3 3 0 0 0-3 3v2a3 3 0 0 0 3 3h1.794a.249.249 0 0 1 .221.133 9.73 9.73 0 0 0 7.924 4.85h.06a1 1 0 0 0 1-1V3.02a1 1 0 0 0-1.06-.998Z"/></svg></button><button class="kg-audio-mute-icon kg-audio-hide"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><path d="M16.177 4.3a.248.248 0 0 0 .073-.176v-1.1a1 1 0 0 0-1.061-1 9.728 9.728 0 0 0-7.924 4.85.249.249 0 0 1-.221.133H5.25a3 3 0 0 0-3 3v2a3 3 0 0 0 3 3h.114a.251.251 0 0 0 .177-.073ZM23.707 1.706A1 1 0 0 0 22.293.292l-22 22a1 1 0 0 0 0 1.414l.009.009a1 1 0 0 0 
1.405-.009l6.63-6.631A.251.251 0 0 1 8.515 17a.245.245 0 0 1 .177.075 10.081 10.081 0 0 0 6.5 2.92 1 1 0 0 0 1.061-1V9.266a.247.247 0 0 1 .073-.176Z"/></svg></button><input type="range" class="kg-audio-volume-slider" max="100" value="100"></div></div></div><p>If you listen really closely, you can hear that there is a slight improvement in the tone and cadence of the cloud speech service. There is not much difference, and the embedded version sounds pretty good too!</p><p>In this article, I am going to take you through a basic example of hybrid and embedded speech using Azure&apos;s Text to Speech Service.</p><h2 id="lets-get-started">Let&apos;s get started</h2><p>Before we get started, we need to download the voices that we will use with the embedded version of the code. That is, the voices that will actually &quot;ship&quot; with the code. In order to acquire the voices, you will need to apply for access - follow <a href="https://aka.ms/csgate-embedded-speech">this link to request access</a> to the voices. </p><p>Once you have the voices, we can then start building out our example. First off, let&apos;s start by creating a <a href="https://learn.microsoft.com/en-us/visualstudio/ide/create-new-project?view=vs-2022">new project in Visual Studio Code</a>. Next, add a new class called <em>Keys </em>that will contain the keys and settings that we need.</p><!--kg-card-begin: markdown--><pre><code>public class Keys
{
    public static string EmbeddedSpeechSynthesisVoicePath = @&quot;\voices\en-us&quot;; 
    public static string EmbeddedSpeechSynthesisVoiceKey = &quot;your_key&quot;; 
    public static string EmbeddedSpeechSynthesisVoiceName = &quot;en-US-JennyNeural&quot;; 
    
    public static string CloudSpeechSubscriptionKey = &quot;subscription_key&quot;; 
    public static string CloudSpeechServiceRegion = &quot;eastus&quot;; 
    public static string SpeechRecognitionLocale = &quot;en-US&quot;; 
    public static string SpeechSynthesisLocale = &quot;en-US&quot;; 
}
</code></pre>
<!--kg-card-end: markdown--><p>Let&apos;s break down the code above. Firstly, the variable <strong>EmbeddedSpeechSynthesisVoicePath </strong>points to the file location where the voices are located and <strong>EmbeddedSpeechSynthesisVoiceKey</strong> is the key that you need to access the voices. You&apos;ll be given these when you apply for access as mentioned above. I&apos;ve also chosen the voice of &quot;Jenny&quot;, but you could choose any from the <a href="https://speech.microsoft.com/portal/voicegallery">Voice Gallery</a>.</p><p>As we are using a hybrid model, we&apos;ll need to provide some cloud details from an Azure speech instance. I created a new speech instance on the Azure portal and on the overview page, I selected the <strong>CloudSpeechSubscriptionKey</strong> and <strong>CloudSpeechServiceRegion </strong>from the portal (highlighted in yellow below).</p><figure class="kg-card kg-image-card"><img src="https://deanhume.com/content/images/2024/02/azure-portal-text-to-speech.jpg" class="kg-image" alt loading="lazy" width="1200" height="1190" srcset="https://deanhume.com/content/images/size/w600/2024/02/azure-portal-text-to-speech.jpg 600w, https://deanhume.com/content/images/size/w1000/2024/02/azure-portal-text-to-speech.jpg 1000w, https://deanhume.com/content/images/2024/02/azure-portal-text-to-speech.jpg 1200w" sizes="(min-width: 720px) 720px"></figure><p>In the code above, we are also providing the locales that we are going to be using in the variables <strong>SpeechRecognitionLocale </strong>and <strong>SpeechSynthesisLocale</strong>. In this case I&apos;m using US English, but you could use any language and locale of your choice.</p><h2 id="configuring-the-embedded-speech">Configuring the Embedded Speech</h2><p>Now that we have the keys configured, we can create the config for the embedded speech model. 
I&apos;ve created a new class file called <strong>Settings </strong>and added the following code.</p><!--kg-card-begin: markdown--><pre><code>public class Settings
{

public static EmbeddedSpeechConfig CreateEmbeddedSpeechConfig()
{
    List&lt;string&gt; paths = new List&lt;string&gt;();

    var synthesisVoicePath = Keys.EmbeddedSpeechSynthesisVoicePath;
    if (!string.IsNullOrEmpty(synthesisVoicePath) &amp;&amp; !synthesisVoicePath.Equals(&quot;YourEmbeddedSpeechSynthesisVoicePath&quot;))
    {
        paths.Add(synthesisVoicePath);
    }

    // Make sure that there is a voice path defined above.
    if (paths.Count == 0)
    {
        Console.Error.WriteLine(&quot;## ERROR: No model path(s) specified.&quot;);
        return null;
    }

    var config = EmbeddedSpeechConfig.FromPaths(paths.ToArray());

    if (!string.IsNullOrEmpty(Keys.EmbeddedSpeechSynthesisVoiceName))
    {
        // Mandatory configuration for embedded speech synthesis.
        config.SetSpeechSynthesisVoice(Keys.EmbeddedSpeechSynthesisVoiceName, Keys.EmbeddedSpeechSynthesisVoiceKey);
        if (Keys.EmbeddedSpeechSynthesisVoiceName.Contains(&quot;Neural&quot;))
        {
            // Embedded neural voices only support 24kHz sample rate.
            config.SetSpeechSynthesisOutputFormat(SpeechSynthesisOutputFormat.Riff24Khz16BitMonoPcm);
        }
    }

    return config;
}
}
</code></pre>
<!--kg-card-end: markdown--><p>The code above uses the file paths and keys that we set in the Keys class file. We&apos;ll be using this <strong>EmbeddedSpeechConfig </strong>object to create speech using the local voices on file.</p><h2 id="configuring-the-hybrid-speech">Configuring the Hybrid Speech</h2><p>In the same way that we set up the <strong>EmbeddedSpeechConfig </strong>object, we&apos;ll need to create a <strong>HybridSpeechConfig </strong>object via a <em>CreateHybridSpeechConfig()</em> method. </p><!--kg-card-begin: markdown--><pre><code>public static HybridSpeechConfig CreateHybridSpeechConfig()
{
    var cloudSpeechConfig = SpeechConfig.FromSubscription(Keys.CloudSpeechSubscriptionKey, Keys.CloudSpeechServiceRegion);

    cloudSpeechConfig.SpeechRecognitionLanguage = Keys.SpeechRecognitionLocale;
    cloudSpeechConfig.SpeechSynthesisLanguage = Keys.SpeechSynthesisLocale;

    var embeddedSpeechConfig = CreateEmbeddedSpeechConfig();

    var config = HybridSpeechConfig.FromConfigs(cloudSpeechConfig, embeddedSpeechConfig);

    return config;
}
</code></pre>
<!--kg-card-end: markdown--><p>You&apos;ll notice that the code above calls the <em>CreateEmbeddedSpeechConfig()</em> function that we created earlier. This is because the hybrid config uses the cloud speech service by default and then falls back to the embedded speech if cloud connectivity is limited or slow. </p><h2 id="try-it-out">Try it out</h2><p>With all this in place, we are now ready to start calling the API and synthesizing some speech. </p><!--kg-card-begin: markdown--><pre><code>/// &lt;summary&gt;
/// Synthesizes speech using the hybrid speech system and outputs it to the default speaker.
/// &lt;/summary&gt;
private static async Task HybridSynthesisToSpeaker()
{
    var textToSpeak = &quot;Hello, this is a test of the hybrid speech system.&quot;;

    var speechConfig = Settings.CreateHybridSpeechConfig();

    using var audioConfig = AudioConfig.FromDefaultSpeakerOutput();

    using var synthesizer = new SpeechSynthesizer(speechConfig, audioConfig);
    
    using var result = await synthesizer.SpeakTextAsync(textToSpeak);
}
</code></pre>
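Conceptually, the hybrid behaviour is just a fallback pattern: prefer the cloud, and drop back to the on-device voices if the cloud call fails. The sketch below is plain JavaScript rather than the Azure Speech SDK, and `cloudSynth`/`embeddedSynth` are hypothetical stand-ins for the two synthesizers:

```javascript
// A minimal sketch of the hybrid fallback idea (not the Azure SDK).
// cloudSynth and embeddedSynth are hypothetical async functions that
// each return synthesized audio for the given text.
async function hybridSynthesize(text, cloudSynth, embeddedSynth) {
  try {
    // Prefer the cloud service - it returns the highest-quality speech.
    return await cloudSynth(text);
  } catch (err) {
    // Cloud connectivity is limited or slow - fall back to embedded speech.
    return await embeddedSynth(text);
  }
}
```

In the real SDK, <em>HybridSpeechConfig.FromConfigs</em> wires this behaviour up for you, so you never write the fallback logic yourself.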
<!--kg-card-end: markdown--><p>When you call <em>HybridSynthesisToSpeaker().Wait()</em>, you should now hear something coming from your speaker!</p><p>In these code samples, I have created a simple string that we are passing through to the API. Depending on your use case, you could build more complex examples using <a href="https://learn.microsoft.com/en-us/azure/ai-services/speech-service/speech-synthesis-markup">Speech Synthesis Markup Language (SSML)</a>. SSML is an XML-based markup language that you can use to fine-tune your text-to-speech output attributes such as pitch, pronunciation, speaking rate, volume, and more. </p><h2 id="using-embedded-speech-only">Using Embedded Speech only</h2><p>If you prefer to use embedded speech only, you can call the <em>CreateEmbeddedSpeechConfig() </em>function that we created earlier.</p><!--kg-card-begin: markdown--><pre><code>/// &lt;summary&gt;
/// Synthesizes speech using the embedded speech system and outputs it to the default speaker.
/// &lt;/summary&gt;
private static async Task EmbeddedSynthesisToSpeaker()
{
    var textToSpeak = &quot;Hello, this is a test of the embedded speech system.&quot;;

    var speechConfig = Settings.CreateEmbeddedSpeechConfig();
    
    using var audioConfig = AudioConfig.FromDefaultSpeakerOutput();

    using var synthesizer = new SpeechSynthesizer(speechConfig, audioConfig);

    using var result = await synthesizer.SpeakTextAsync(textToSpeak);
}
</code></pre>
<!--kg-card-end: markdown--><p>When you call <em>EmbeddedSynthesisToSpeaker().Wait()</em>, you should now hear embedded speech coming from your speaker!</p><h2 id="summary">Summary</h2><p>I&apos;ve barely scratched the surface of the capabilities of text-to-speech; there is so much more to experiment with! If you&apos;d like to learn more about Embedded/Hybrid Speech, I recommend reading the <a href="https://learn.microsoft.com/en-us/azure/ai-services/speech-service/embedded-speech?tabs=windows-target%2Cjre&amp;pivots=programming-language-csharp">following article</a> for more information.</p><p>In this article, we covered both embedded and hybrid speech options, and it&apos;s also worth mentioning that you can ship with only embedded speech if you prefer. In the code example that we ran through, we used C#, but there are other language options available, including C++ and Java SDKs.</p>]]></content:encoded></item><item><title><![CDATA[Azure Function Timer Trigger not firing - NCrontab]]></title><description><![CDATA[<p>If you&apos;ve recently ventured into the world of <a href="https://azure.microsoft.com/en-gb/products/functions/">Azure Functions</a>, you&apos;re likely familiar with the versatility they offer when it comes to scheduling tasks. 
In this guide, we&apos;ll delve into using <a href="https://learn.microsoft.com/en-gb/azure/azure-functions/functions-create-scheduled-function">Azure Functions&apos; Time Trigger</a> function with Node.js and Visual Studio Code,</p>]]></description><link>https://deanhume.com/azure-http-trigger/</link><guid isPermaLink="false">651d2c1ec919137deeb7a735</guid><dc:creator><![CDATA[Dean Hume]]></dc:creator><pubDate>Wed, 04 Oct 2023 11:03:57 GMT</pubDate><media:content url="https://deanhume.com/content/images/2023/10/azure-function-timer-trigger.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://deanhume.com/content/images/2023/10/azure-function-timer-trigger.jpg" alt="Azure Function Timer Trigger not firing - NCrontab"><p>If you&apos;ve recently ventured into the world of <a href="https://azure.microsoft.com/en-gb/products/functions/">Azure Functions</a>, you&apos;re likely familiar with the versatility they offer when it comes to scheduling tasks. In this guide, we&apos;ll delve into using <a href="https://learn.microsoft.com/en-gb/azure/azure-functions/functions-create-scheduled-function">Azure Functions&apos; Time Trigger</a> function with Node.js and Visual Studio Code, uncovering some important nuances along the way.</p><p>This article assumes that you have some familiarity with Node.js and a basic understanding of Azure.</p><h3 id="getting-started">Getting Started</h3><p>To begin, you&apos;ll need Node.js and Visual Studio Code installed, along with the Azure Functions extension. When you create a new function, it comes with some boilerplate code that includes a schedule property defining when the function should run. By default, it&apos;s set to run every 5 minutes.</p><!--kg-card-begin: markdown--><pre><code>const { app } = require(&apos;@azure/functions&apos;);

app.timer(&apos;timerTrigger2&apos;, {
    schedule: &apos;0 */5 * * * *&apos;,
    handler: (myTimer, context) =&gt; {
        context.log(&apos;Timer function processed request.&apos;);
    }
});
</code></pre>
<!--kg-card-end: markdown--><p>The <strong>schedule parameter</strong> in the above code tells the Azure Function how often to fire.</p><h3 id="the-challenge">The Challenge</h3><p>But what if you want your function to run every 11 hours? Crafting the right Cron expression can be tricky, and that&apos;s where I encountered some difficulties. I turned to an online Cron expression generator, hoping it would simplify the process. Here&apos;s what it gave me:</p><!--kg-card-begin: markdown--><pre><code>0 0 */11 * *
</code></pre>
<!--kg-card-end: markdown--><p>Feeling confident, I integrated this expression into my code above and deployed the Azure Function. However, it didn&apos;t work as expected. I scoured the logs for clues but found nothing. To troubleshoot and debug locally, I temporarily changed the Cron expression to run every minute, and it worked flawlessly. So why wasn&apos;t it firing every few hours?</p><h3 id="the-revelation">The Revelation</h3><p>After some online research, I uncovered the reason behind the issue. Azure Functions use an <strong>NCRONTAB expression</strong>, which requires a six-part format instead of the traditional five-part Cron expression. The generator that I found online had provided a five-part expression, leading to the problem.</p><p>To be fair, the Visual Studio Code extension for Azure Functions provides the correct six-part format by default. My mistake was replacing it with a five-part expression from an online tool - d&apos;oh!</p><h3 id="the-solution">The Solution</h3><p>To rectify this, I recommend using the <a href="https://ncrontab.swimburger.net/">NCrontab Expression Tester</a>. This user-friendly tool not only helps you test your expressions but also generates the correct six-part Cron expressions tailored for Azure Functions. </p><p>When I updated my code to use the six-part format:</p><!--kg-card-begin: markdown--><pre><code>0 0 */11 * * *
</code></pre>
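As a side note, `*/11` in the hours field matches every hour divisible by 11, so this schedule fires at 00:00, 11:00 and 22:00 each day. A small JavaScript sketch (a hypothetical helper, not part of the Azure tooling) shows how a step field expands:

```javascript
// Expand a cron step field like "*/11" over an inclusive 0..max range.
// Hypothetical helper for illustration - only handles the */n form.
function expandStepField(field, max) {
  const match = /^\*\/(\d+)$/.exec(field);
  if (!match) {
    throw new Error('this sketch only handles */n fields');
  }
  const step = Number(match[1]);
  const values = [];
  for (let value = 0; value <= max; value += step) {
    values.push(value);
  }
  return values;
}

// The hours field of "0 0 */11 * * *":
console.log(expandStepField('*/11', 23)); // [ 0, 11, 22 ]
```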
<!--kg-card-end: markdown--><p>It ran perfectly!</p><h3 id="why-ncrontab">Why NCrontab?</h3><p>You might be wondering why Azure Functions use the NCrontab six-part format instead of the traditional five-part format. The answer lies in its flexibility. The six-part format allows you to specify seconds, enabling you to run your functions with higher precision and frequency.</p><!--kg-card-begin: markdown--><pre><code>* * * * * *
- - - - - -
| | | | | |
| | | | | +--- day of week (0 - 6) (Sunday=0)
| | | | +----- month (1 - 12)
| | | +------- day of month (1 - 31)
| | +--------- hour (0 - 23)
| +----------- min (0 - 59)
+------------- sec (0 - 59)
</code></pre>
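Given the diagram above, a cheap way to catch the five-part mistake before deploying is simply counting the whitespace-separated fields. This is a hypothetical JavaScript check for illustration, not something from the Azure tooling:

```javascript
// NCRONTAB expressions for Azure Functions timer triggers have six
// fields (seconds first); classic cron expressions have only five.
function cronFieldCount(expression) {
  return expression.trim().split(/\s+/).length;
}

function looksLikeNcrontab(expression) {
  return cronFieldCount(expression) === 6;
}

console.log(looksLikeNcrontab('0 0 */11 * *'));   // false - classic five-part cron
console.log(looksLikeNcrontab('0 0 */11 * * *')); // true  - six-part NCRONTAB
```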
<!--kg-card-end: markdown--><p>For further details on NCrontab expressions, you can visit the GitHub repository <a href="https://github.com/atifaziz/NCrontab">here</a>.</p><p>Happy coding!</p>]]></content:encoded></item><item><title><![CDATA[Using a Raspberry Pi to track the progress of your homebrew]]></title><description><![CDATA[In this article I go into detail about how I hooked up a Raspberry Pi to continually track and monitor the progress of my beer all whilst graphing it out to another application. It gets a little geeky!]]></description><link>https://deanhume.com/using-a-raspberry-pi-to-track-the-progress-of-your-homebrew/</link><guid isPermaLink="false">6419d57e92605e0b555b234f</guid><category><![CDATA[Charts]]></category><category><![CDATA[Raspberry Pi]]></category><category><![CDATA[Linux]]></category><dc:creator><![CDATA[Dean Hume]]></dc:creator><pubDate>Thu, 18 Aug 2022 10:48:32 GMT</pubDate><media:content url="https://deanhume.com/content/images/2023/10/raspberry-pi-homebrew.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://deanhume.com/content/images/2023/10/raspberry-pi-homebrew.jpg" alt="Using a Raspberry Pi to track the progress of your homebrew"><p>In my spare time, I like to do a bit of <a href="https://humebrew.com">homebrewing</a>. While it isn&apos;t the usual sort of topic that I post on this site, I wanted to cross-post an article I wrote about automating some of the brewing process.</p><p>If you&apos;ve ever goofed around with Raspberry Pis before, you&apos;ll know how fun they can be. <a href="https://humebrew.com/tilt-hydrometer-to-a-raspberry-pi-to-brewfather/">In this article</a> I go into detail about how I hooked up a Raspberry Pi to continually track and monitor the progress of my beer all whilst graphing it out to another application. 
It gets a little geeky!</p><figure class="kg-card kg-image-card"><img src="https://deanhume.com/content/images/2022/08/raspberry-pi-tilt-pi.jpg" class="kg-image" alt="Using a Raspberry Pi to track the progress of your homebrew" loading="lazy"></figure><p>Please head over to my homebrew website over at <a href="https://humebrew.com/tilt-hydrometer-to-a-raspberry-pi-to-brewfather/">Humebrew.com</a> to find out more!</p>]]></content:encoded></item><item><title><![CDATA[Has Service Worker usage increased on this website over the past four years?]]></title><description><![CDATA[It's been almost four years since I started collecting data on Service Worker usage and support on this website and in this article I show the details.]]></description><link>https://deanhume.com/service-worker-growth-over-time/</link><guid isPermaLink="false">6419d57e92605e0b555b234a</guid><category><![CDATA[Service Workers]]></category><category><![CDATA[Progressive Web Apps]]></category><dc:creator><![CDATA[Dean Hume]]></dc:creator><pubDate>Fri, 01 Apr 2022 08:20:39 GMT</pubDate><media:content url="https://deanhume.com/content/images/2023/10/service-worker-usage.png" medium="image"/><content:encoded><![CDATA[<img src="https://deanhume.com/content/images/2023/10/service-worker-usage.png" alt="Has Service Worker usage increased on this website over the past four years?"><p>I think <a href="https://deanhume.com/progressive-web-apps-my-new-book-is-available/">Service Workers are pretty cool</a>. They enable us to do so many powerful things with the web today such as intercepting network requests, caching and retrieving resources and delivering push messages to name a few. In fact, I&apos;ve been running a Service Worker file on this website for around the past four years. 
</p><p>A few years ago, I wrote an article that showed how to determine the <a href="https://deanhume.com/determining-service-worker-support/">Service Worker usage and support</a> for the users that visited your site by pushing statistics to Google Analytics. &#xA0;It also went a step further and determined if the users were being served cached data through the supported service worker.</p><p>It&apos;s been almost four years since I started collecting data and I wanted to look back over time and see if there has been any change to the numbers. Let&apos;s first start with the overall number - the chart below represents the percentage of users that visit this website that have browsers that support Service Workers. This number includes desktop and all devices (mobiles, tablets).</p><!--kg-card-begin: html--><div class="infogram-embed" data-id="5316af28-d9af-4bf7-8822-bbff52acf4a8" data-type="interactive" data-title="Column Chart"></div><script>!function(e,i,n,s){var t="InfogramEmbeds",d=e.getElementsByTagName("script")[0];if(window[t]&&window[t].initialized)window[t].process&&window[t].process();else if(!e.getElementById(n)){var o=e.createElement("script");o.async=1,o.id=n,o.src="https://e.infogram.com/js/dist/embed-loader-min.js",d.parentNode.insertBefore(o,d)}}(document,0,"infogram-async");</script><div style="padding:8px 0;font-family:Arial!important;font-size:13px!important;line-height:15px!important;text-align:center;border-top:1px solid #dadada;margin:0 30px"><br></div><!--kg-card-end: html--><p>From the chart above, you can see that between 2019 and 2021 the percentage of users has remained relatively consistent. The numbers for 2022 have jumped considerably, but it&apos;s worth pointing out that at the time of writing this article, we are only 3 months into this year. I expect that this number will flatten out over time.</p><p>If you click on the second tab in the chart above, you&apos;ll notice the number of &quot;controlled&quot; users to the site. 
In this case it means how many pages were actually <em>controlled </em>by an active service worker. For example, if someone visits a site with a service worker on it, it will get installed the first time and only used the next time they reload the page or navigate away. I wanted to determine how many people were both supported and using my service worker.</p><p>You can see from these numbers that over the years it has remained pretty stable around the 25% mark.</p><p>As I looked at the data over time, it was clear that the desktop usage has remained pretty stable with very little growth. I wanted to dive a little more into the mobile phone usage on this site and the chart below represents these numbers.</p><!--kg-card-begin: html--><div class="infogram-embed" data-id="358b8eb9-1d25-4da6-9641-9cb8b282a60d" data-type="interactive" data-title="Mobile: Total Service Worker Usage"></div><script>!function(e,i,n,s){var t="InfogramEmbeds",d=e.getElementsByTagName("script")[0];if(window[t]&&window[t].initialized)window[t].process&&window[t].process();else if(!e.getElementById(n)){var o=e.createElement("script");o.async=1,o.id=n,o.src="https://e.infogram.com/js/dist/embed-loader-min.js",d.parentNode.insertBefore(o,d)}}(document,0,"infogram-async");</script><div style="padding:8px 0;font-family:Arial!important;font-size:13px!important;line-height:15px!important;text-align:center;border-top:1px solid #dadada;margin:0 30px"></div><!--kg-card-end: html--><p>As you can see from the chart above, the number of mobile devices that visit this site that support Service Workers has remained pretty stable. There doesn&apos;t seem like much of a story to tell here. While 2022 looks like it has increased a bit, at the time of writing this article, we are only 3 months into this year. I expect these numbers to flatten out.</p><p>Of all the data that I managed to collect, I only started to notice a trend when it came to iPhones. 
The chart below shows the percentage of iPhone users that visited this site that were both supported (and controlled) by a Service Worker.</p><!--kg-card-begin: html--><div class="infogram-embed" data-id="4135e627-1332-4843-ba4b-2c2a9987b025" data-type="interactive" data-title="Copy: Mobile: Total Service Worker Usage"></div><script>!function(e,i,n,s){var t="InfogramEmbeds",d=e.getElementsByTagName("script")[0];if(window[t]&&window[t].initialized)window[t].process&&window[t].process();else if(!e.getElementById(n)){var o=e.createElement("script");o.async=1,o.id=n,o.src="https://e.infogram.com/js/dist/embed-loader-min.js",d.parentNode.insertBefore(o,d)}}(document,0,"infogram-async");</script><div style="padding:8px 0;font-family:Arial!important;font-size:13px!important;line-height:15px!important;text-align:center;border-top:1px solid #dadada;margin:0 30px"></div><!--kg-card-end: html--><p><a href="https://caniuse.com/serviceworkers">Apple&apos;s Safari browser shipped Service Worker support</a> on their browser a few years ago and it&apos;s pretty cool to see the usage increase. There has been a significant increase from 14% in 2019 to around 39% this year (so far). </p><p>In terms of the number of iPhone users to this site that are controlled by a Service Worker, this number has remained pretty stable with a slight increase this year.</p><h3 id="summary">Summary</h3><p>While I realise that my website is just a small blog and is in no way indicative of the global trend, it has been pretty cool to see how usage is changing over many years. 
When I first wrote about Service Worker and PWAs, it seemed like a pipe dream that Apple would even consider adding support, but as you can see from the data, these numbers are trending in the right direction.</p>]]></content:encoded></item><item><title><![CDATA[Leading Virtual Teams - Book Review]]></title><description><![CDATA[I’m always on the lookout to improve both myself and my team's daily work, so when I came across a book entitled Leading Virtual Teams by the Harvard Business Review, it really caught my eye.]]></description><link>https://deanhume.com/leading-virtual-teams-book-review/</link><guid isPermaLink="false">6419d57e92605e0b555b234d</guid><category><![CDATA[Leadership]]></category><category><![CDATA[Book Review]]></category><dc:creator><![CDATA[Dean Hume]]></dc:creator><pubDate>Mon, 14 Mar 2022 15:08:14 GMT</pubDate><content:encoded><![CDATA[<p>When COVID first entered the world and I started working from home, my hunch was that it was only going to be for a few weeks. Two years later, that hunch has proven quite wrong! I am a people person and enjoy meeting face to face with others, so it has definitely taken me some time to adjust. </p><p>If you had asked me at the start of working remotely whether I would prefer to work at home or in the office&#x2026;..I would have chosen the office every time. As time has gone on, I have definitely started to lean towards working from home, but as you may know, that isn&#x2019;t without its challenges either. 
</p><p>I&#x2019;m always on the lookout to improve both myself and my team&apos;s daily work, so when I came across a book entitled <a href="https://store.hbr.org/product/leading-virtual-teams-hbr-20-minute-manager-series/10005"><em>Leading Virtual Teams</em></a> by the Harvard Business Review, it really caught my eye.<br><br></p><figure class="kg-card kg-image-card"><img src="https://lh3.googleusercontent.com/b67vcydKrqlZOz3zgCngNsbQuLEnExI1DeHKo3QlT6cMuRzCHz49LBmafoQObljauI0e4wgdl65HsXOV7aeWY-kFz1QN5dPJ3U013ZCw5NLk0qBtFxxzOxu2T3UW9e33JxvSxBT8" class="kg-image" alt loading="lazy"></figure><p><br>Published in 2016, it was written by the authors <em>before </em>the pandemic and before so many people started working from home. What I really like about this book is that the advice still stands; the pandemic hasn&#x2019;t changed the need for something like this. This book is part of the Harvard Business Review <em>20 Minute Manager Series,</em> which means it is a concise, practical primer so you&#x2019;ll have time to dive straight into the meat of the topic.</p><p>It starts off with some of the important things that you need to get right to be successful on a virtual team. First and foremost on this list is getting the right people on the team, which requires a special set of traits:</p><ul><li><strong>Communication </strong>- &#x201C;<em>Good virtual team members know how to be precise and concise in multiple media, and they err on the side of overcommunication</em>&#x201D;.</li><li><strong>Collaboration style</strong> - &#x201C;<em>Virtual teamwork requires self discipline and self motivation, since team members must stay on schedule and ask for help when necessary. 
Remote work is not ideal for people who need a lot of supervision</em>&#x201D;.</li><li><strong>Temperament </strong>- &#x201C;<em>Look for people who will be generous in negotiating conflict in a low information environment and resilient working alone under pressure</em>&#x201D;.</li><li><strong>Technology </strong>- &#x201C;<em>Seek out people who are open-minded to new technology and competent in tools</em>&#x201D;.</li><li><strong>Size </strong>- &#x201C;<em>When it comes to the number of folks on your team, aim low. Research shows that smaller teams are more effective and more motivated</em>&#x201D;.</li></ul><p>While you might not be in a position to build a team from scratch and choose people with these traits, it does help as a starting point to identify gaps in an existing team and tweak it accordingly.</p><p>The book also has a chapter entitled<strong> Manage the Technology</strong> in which it goes into detail about the importance of technology in the remote working world. I won&#x2019;t go too much into detail here, as each organization has its own unique tools, but the key takeaway for me was around establishing rules for the use of the technology within your organization / team. How do you share and store content? What is the etiquette for the use of this technology? While the book doesn&#x2019;t suggest which tools to use, it does talk about agreeing on using a standardised list of products. Regardless of whether you are going to use Slack or Teams - it suggests choosing <strong>one </strong>and sticking with it.</p><p>For me, one of the key areas of this book is around creating a shared vision with the team. What is the purpose of your team? Can you explain it in clear, compelling language? Whether you are working with a team on a long-term project or a shorter goal, it&#x2019;s important to clearly define the vision for the team and what will make it successful. 
By documenting this vision and goal, you can use it as a reference point to bring everyone back to common ground when distance and time chip away at the team&apos;s cohesion.</p><p>As a People Manager, I know how tough it is to keep a remote working team engaged and happy. The book goes into some detail and provides a few tips and tricks such as:</p><ul><li>Providing praise and recognising collaborative behaviour when you see it</li><li>Encouraging people to acknowledge each other&apos;s work</li><li>Playing games together!</li><li>Finding a daily working rhythm with the team</li></ul><p>Finally, Leading Virtual Teams also touches on the common problems that you might face with remote teams. There is a part of this chapter that is dedicated to managing conflict on a virtual team. It is not a question of if, but rather when you will face conflict with a team. Dealing with this remotely can be sensitive ground and will need to be handled differently from how you would face to face. The book provides a few examples and ideas for how to navigate this territory.</p><h2 id="final-verdict">Final Verdict</h2><p><strong>Would I buy this book?</strong> Yes, absolutely. It&#x2019;s an easy read and is filled with practical, actionable details. There isn&#x2019;t a lot of &#x201C;fluff&#x201D; - it gets to the point and leaves you with useful insights.</p><p>This book was written long before the majority of workers were forced to work remotely due to the pandemic. The authors have had real-world experience implementing these practices and have faced challenges to which they provide practical solutions.</p><p>While some of the topics covered may come naturally to you and your organization, there are definitely some tips and tricks that you can pick up and use to improve your team&apos;s remote working. 
The book doesn&#x2019;t go super deep into all of these topics and perhaps skims the surface a little more than I&#x2019;d like, but, after all, it is part of the 20 Minute Manager Series for a reason. I&#x2019;d still recommend you buy this book!</p>]]></content:encoded></item><item><title><![CDATA[Introducing Onesie - A tool for One on One meetings]]></title><description><![CDATA[Onesie is a tool to help you get the most out of your One on One meetings. It has features that allow you to track, take notes, and help you improve engagement with your team.]]></description><link>https://deanhume.com/introducing-onesie-tool-for-one-on-one-meetings/</link><guid isPermaLink="false">6419d57e92605e0b555b234b</guid><category><![CDATA[Leadership]]></category><dc:creator><![CDATA[Dean Hume]]></dc:creator><pubDate>Thu, 19 Aug 2021 16:18:23 GMT</pubDate><content:encoded><![CDATA[<p>In my day to day role as an Engineering Manager, a key part of my job is having One on One meetings with the people in my team. A One on One is a dedicated time in the calendar for an open-ended conversation between a manager and an employee. These meetings aren&apos;t meant to be status reports, but rather a time to check in and reflect, and may even involve coaching, venting, etc.</p><p>When I have One on One meetings with my reports, I always take notes and jot them down on paper. The problem I found was that I never organised these notes properly or used them in the best way. I always had to shuffle back and forwards in my notepad and often lost notes. This gave me an idea and encouraged me to build a little something called <a href="https://onesie.tech/">Onesie</a>. </p><figure class="kg-card kg-image-card"><img src="https://deanhume.com/content/images/2021/08/onesie-one-on-one-hero.png" class="kg-image" alt loading="lazy"></figure><p><a href="https://onesie.tech/">Onesie </a>is a tool to help you get the most out of your One on One meetings. 
It has features that allow you to take notes and help you improve engagement with your team.</p><p>Like most of us these days, I work almost exclusively remotely, and trying to maintain a connection with my team whilst being remote has been challenging. Without regular contact, One on Ones can sometimes run out of energy and lose momentum. For this reason, Onesie has a built-in function that suggests topics to talk about or important themes that you could run through with your employee.</p><p>Onesie is designed to work offline. So once you visit the site for the first time, it is cached and ready to use, even without a network connection. This can be particularly handy if you are on your mobile device and you lose connection, or perhaps working from a cafe that has no WiFi.</p><figure class="kg-card kg-image-card"><img src="https://deanhume.com/content/images/2021/08/onesie-one-on-one-iphone.jpg" class="kg-image" alt loading="lazy"></figure><p>Even if you aren&apos;t a people manager with line reports, Onesie is a useful tool to take notes from a conversation and then refer to them at a later stage (or the next time that you speak to that person). </p><p>Best of all, Onesie is completely free. I built this project as a tool for myself, but also out of love for the craft of people management. I would love for you to start using it, and welcome any feedback or improvements that you might have. </p><p>Head over to <a href="https://onesie.tech/">Onesie </a>today to learn more.</p>]]></content:encoded></item><item><title><![CDATA[Ghost CMS & Linux - Fixing "No Space Left on Device" Issue]]></title><description><![CDATA[I was quite surprised to find that I was greeted with an annoying HTTP 503 error on my site over the Christmas period. 
After taking a closer look at the logs on Amazon, it turns out that I had actually run out of disk space on the instance.]]></description><link>https://deanhume.com/ghost-blog-fixing-no-space-left-on-device-issue/</link><guid isPermaLink="false">6419d57e92605e0b555b2348</guid><category><![CDATA[Ghost]]></category><dc:creator><![CDATA[Dean Hume]]></dc:creator><pubDate>Thu, 07 Jan 2021 13:40:50 GMT</pubDate><content:encoded><![CDATA[<p>A few years ago, I <a href="https://deanhume.com/amazon-aws-ec2-ghost-cms-setup/">transitioned my blog from a custom ASP.NET website to Ghost CMS</a>. I&apos;ve been really happy with Ghost - it&apos;s easy to set up and get running, and the Ghost community is really great.</p><!--kg-card-begin: markdown--><p><img src="https://deanhume.com/content/images/2018/10/ghost-logo.png" alt="Ghost CMS Logo" loading="lazy"></p>
<!--kg-card-end: markdown--><p>Life often gets in the way of blogging, and I haven&apos;t made any new posts to this blog for a while. I&apos;ve had a few <a href="https://deanhume.com/amazon-aws-ec2-ghost-cms-setup/">on and off issues with Amazon EC2 Linux instances</a> and this blog over time, but generally things were working as expected. The blog has largely remained untouched for a year or so, which is why I was quite surprised to find that I was greeted with an annoying <strong>HTTP 503 error</strong> on my site over the Christmas period.</p><p>I thought this might just be an issue with the site, so I tried the usual <strong>Stop Instance</strong> and <strong>Restart Instance</strong>. That didn&apos;t help. </p><p>After taking a closer look at the logs on Amazon, it turns out that I had actually run out of disk space on the instance. This is a bit weird considering I had 10 GB assigned to the volume - after all, this is only a small blog!</p><h2 id="increasing-the-size-of-the-volume">Increasing the size of the volume</h2><p>My first thoughts were to get the site back up and running by increasing the size of the volume (or disk space) assigned to the instance. You can do this from the EC2 Management Console by selecting the instance, choosing <strong>Storage </strong>and then clicking on the <strong>Volume ID</strong><em> </em>(highlighted in yellow below).</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://deanhume.com/content/images/2021/01/ec2-volume-id.PNG" class="kg-image" alt loading="lazy"><figcaption>Select the Volume to update</figcaption></figure><p>Next, select from the <em>Actions </em>drop down menu and choose <strong>Modify Volume</strong>.</p><figure class="kg-card kg-image-card"><img src="https://deanhume.com/content/images/2021/01/ec2-modify-volume.PNG" class="kg-image" alt loading="lazy"></figure><p>Choose the new size of the volume and select OK. 
</p><h2 id="using-growpart">Using Growpart</h2><p>Once you&apos;ve increased the size of the volume, it turns out that there is still one more step that needs to take place. You need to tell the partition to use the &quot;new space&quot; that you&apos;ve just given it. Without this step, the partition keeps its old size and the extra space goes unused.</p><p>You can see this by SSH&apos;ing into the instance and typing <em><strong>lsblk </strong></em>in the terminal.</p><!--kg-card-begin: markdown--><pre><code>NAME    MAJ:MIN RM  SIZE RO TYPE MOUNTPOINT
xvda    202:0    0   40G  0 disk
&#x2514;&#x2500;xvda1 202:1    0   10G  0 part /
loop1     7:1    0 97.9M  1 loop /snap/core/10444
loop3     7:3    0 97.9M  1 loop /snap/core/10577
loop4     7:4    0 55.4M  1 loop /snap/core18/1932
loop6     7:6    0 55.4M  1 loop /snap/core18/1944
</code></pre>
<!--kg-card-end: markdown--><p>In my case the partition <strong>xvda1 </strong>is still assigned 10 GB, but the underlying disk <strong>xvda </strong>is now 40 GB. </p><p>The simple solution is to run the following command against the disk, telling partition 1 to grow into the new space it has been allocated:</p><!--kg-card-begin: markdown--><p><code>$ sudo growpart /dev/xvda 1 </code></p>
<!--kg-card-end: markdown--><p>However, when I did this I was presented with the following error:</p><pre><code>mkdir: cannot create directory &#x2018;/tmp/growpart.2626&#x2019;: No space left on device</code></pre><!--kg-card-begin: markdown--><p>Arrrgh! This meant that I didn&apos;t even have enough disk space to expand the disk space. I was screwed! One of the suggestions mentioned online was to use the <code>$ apt-get autoremove</code> command to remove those dependencies that were installed with applications and are no longer used by anything else on the system. Unfortunately, I was also met with the &quot;<em>No space left on device</em>&quot; error when I ran the command.</p>
<!--kg-card-end: markdown--><p>I searched for other files to remove, but being a bit of an amateur with Linux, I decided to delete the safest files...the log files. I did this by typing the following command in the terminal:</p><!--kg-card-begin: markdown--><pre><code>$ find /var/log -type f -delete
</code></pre>
<!--kg-card-end: markdown--><p>Whew! This bought me an additional 300 MB, which was just enough to run the growpart command again. </p><!--kg-card-begin: markdown--><p><code>$ sudo growpart /dev/xvda 1</code></p>
<!--kg-card-end: markdown--><p>Success! If I run the <strong>lsblk</strong> command to verify that partition 1 is expanded to 40 GB, I see the following:</p><!--kg-card-begin: markdown--><pre><code>NAME    MAJ:MIN RM  SIZE RO TYPE MOUNTPOINT
xvda    202:0    0   40G  0 disk
&#x2514;&#x2500;xvda1 202:1    0   40G  0 part /
loop1     7:1    0 97.9M  1 loop /snap/core/10444
loop3     7:3    0 97.9M  1 loop /snap/core/10577
loop4     7:4    0 55.4M  1 loop /snap/core18/1932
loop6     7:6    0 55.4M  1 loop /snap/core18/1944
</code></pre>
<!--kg-card-end: markdown--><p>Next, I ran the resize2fs command so that the file system itself grows to fill the newly enlarged partition:</p><!--kg-card-begin: markdown--><p><code>$ sudo resize2fs /dev/xvda1</code></p>
<!--kg-card-end: markdown--><p>Finally, I started the Ghost CMS again with the following <a href="https://deanhume.com/amazon-aws-ec2-ghost-cms-setup/#stoppingandstartingghostagain">command</a>:</p><!--kg-card-begin: markdown--><p><code>sudo /opt/bitnami/ctlscript.sh start</code></p>
<!--kg-card-end: markdown--><p>Once I ran that command I was greeted with:</p><!--kg-card-begin: markdown--><pre><code>&#x2139; Checking if logged in user is directory owner [skipped]
&#x2714; Checking current folder permissions
&#x2714; Validating config
&#x2714; Checking memory availability
&#x2714; Starting Ghost
You can access your publication at http://xx.xxx.xx.x:80
Your admin interface is located at http://xx.xxx.xx.x:80/ghost/
</code></pre>
<!--kg-card-end: markdown--><p>I never thought I&apos;d be so happy to see the output from a terminal!</p><h2 id="summary">Summary</h2><p>I still haven&apos;t been able to get to the bottom of why a small Ghost CMS should be taking up such a large amount of disk space, but for the moment I am happy that my site is up and running. I now have CloudWatch alarms in place to alert me if disk usage grows too large.</p><p>If anyone has any guesses as to why this happens with Ghost hosted on Linux, let me know!</p>]]></content:encoded></item><item><title><![CDATA[Book Review - Accelerate: The Science of Lean Software and Devops]]></title><description><![CDATA[During my morning coffee session, I came across the book - Accelerate: The Science of Lean Software and Devops, and the title of the book instantly drew me in.]]></description><link>https://deanhume.com/accelerate-devops-book-review/</link><guid isPermaLink="false">6419d57e92605e0b555b2347</guid><category><![CDATA[Book Review]]></category><category><![CDATA[Book]]></category><category><![CDATA[Leadership]]></category><dc:creator><![CDATA[Dean Hume]]></dc:creator><pubDate>Mon, 03 Aug 2020 15:51:44 GMT</pubDate><content:encoded><![CDATA[<p>During my morning coffee session, I like to check through my RSS feed and see what is happening out there in the world of technology. I came across this article by the Spotify R&amp;D team entitled &#x201C;<a href="https://engineering.atspotify.com/2020/07/22/leveraging-mobile-infrastructure-with-data-driven-decisions/">Leveraging Mobile Infrastructure with Data-Driven Decisions</a>&#x201D;. 
They referenced the book <a href="https://itrevolution.com/book/accelerate/"><strong>Accelerate: The Science of Lean Software and Devops</strong></a>, and the title of the book instantly drew me in.</p><p>I&#x2019;ve just finished reading the book and I can say that I definitely enjoyed it.</p><figure class="kg-card kg-image-card"><img src="https://deanhume.com/content/images/2020/08/accelerate.jpeg" class="kg-image" alt loading="lazy"></figure><p>Through four years of research, the authors set out to find a way to measure software delivery performance&#x2015;and what drives it&#x2015;using rigorous statistical methods. This book presents both the findings and the science behind that research, making the information accessible for readers to apply in their own organizations. Personally, I think it&apos;s quite cool to see scientific data applied to some of the everyday things that we do in technology organizations. I also found it interesting to see how some of the top performing companies repeatedly seemed to pull away from the &#x201C;pack&#x201D; by simply applying many of the techniques described in this book. The book takes a look at well-known best practices such as Continuous Delivery, Lean Management, and Transformational Leadership.</p><p>If you are looking for a book that tells you <em>how </em>to build and scale a high performing technology organisation, then this book doesn&#x2019;t quite cover it. This book goes more into <strong><em>why </em></strong>you should be doing things a certain way and backs this up with facts.</p><p>Part 1 of this book explores what the research team found after trawling through the data. Part 2 dives into the science behind the book and gives an introduction to Psychometrics. 
Finally, Part 3 looks into transformation, which focuses on how leadership and management can help drive these improvements.</p><p>If I had one takeaway from this book (which surprised me), it was that the leadership of an organisation plays a pivotal role in the team&apos;s results. I really liked this quote:</p><blockquote>&#x201C;Leadership really does have a powerful impact on results. A good leader affects a team&#x2019;s ability to deliver code, architect good systems, and apply Lean principles to how the team manages its work and develops products. All of these have a measurable impact on satisfaction, efficiency, and the ability to achieve organisational goals&#x201D;.</blockquote><p>If you&#x2019;d like a bit of insight into the book, I&#x2019;d recommend checking out the following resource, which contains a <a href="https://files.ontraport.com/media/phpleXP7K?Expires=1730762516&amp;Signature=P~VbcKyK7C3dsOosbxDMTx10LxI9Z7qPOCf71ywoa~YvAfQtB7lFZnLarBbfiGTeP53PqN5FervF4hycT5WyzCmcgDwzT~ZUIz81pXZjIeFlgtkdMpljnTrSmlXs0bi5fpWBZF~JTTHLgd-iOo6QoSukjk7Hqtd2s1Q73zBjtc6uqlJNoydoRd7hUO4RS5YAJ8knqiLvI2AkQL3E2V3Bz4guvz7~goKj1lee0ryQT2sz~38Qu~VoROPQ7zge2KUvIbp6CDGadGb0cAI7DkNn8hmoZRrKj7VXdfATGZD0958YXijHq0t9U9QxDBqsHFt~U-JabEHdXslTSA6a~0oFtQ__&amp;Key-Pair-Id=APKAJVAAMVW6XQYWSTNA">useful table of high-performance team, management and leadership behaviours and practices</a>.</p><p>Overall, this is a great book and definitely gives a good insight into why some of the best practices that we take for granted are worth doing and lead to high performing technology organisations. This book concentrates more on the why as opposed to the how. 
If you are looking for a how to, I&#x2019;d recommend reading <a href="https://www.amazon.co.uk/DevOps-Handbook-World-Class-Reliability-Organizations-ebook/dp/B01M9ASFQ3">The DevOps Handbook</a> to learn more.</p><p>I hope you enjoy the book!</p>]]></content:encoded></item><item><title><![CDATA[The People Manager: A Guide for the First Time Manager - has been published!]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>Behind the scenes, I have been working on a new book with my EA colleague <a href="https://www.linkedin.com/in/jo-root-17b6371/">Jo Root</a>. I am super excited to announce that <a href="https://www.amazon.co.uk/People-Manager-Guide-First-Managers-ebook/dp/B084WVGG94">The People Manager: A Guide for the First Time Manager</a> has been published! This is a personal project that has been a year in the making.</p>]]></description><link>https://deanhume.com/book-the-people-manager/</link><guid isPermaLink="false">6419d57e92605e0b555b2346</guid><category><![CDATA[Book Review]]></category><category><![CDATA[Book]]></category><category><![CDATA[Leadership]]></category><dc:creator><![CDATA[Dean Hume]]></dc:creator><pubDate>Thu, 26 Mar 2020 10:45:04 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>Behind the scenes, I have been working on a new book with my EA colleague <a href="https://www.linkedin.com/in/jo-root-17b6371/">Jo Root</a>. I am super excited to announce that <a href="https://www.amazon.co.uk/People-Manager-Guide-First-Managers-ebook/dp/B084WVGG94">The People Manager: A Guide for the First Time Manager</a> has been published! This is a personal project that has been a year in the making.</p>
<p><img src="https://deanhume.com/content/images/2020/03/the-people-manager-book.jpg" alt="The People Manager - A Guide for the First Time Manager - Book" loading="lazy"></p>
<p>The book is designed to demystify the challenges of People Management and give best practices, personal hints and tips from our own experience, splitting up what might seem an overwhelming new role into clear, focused and hopefully manageable chapters of advice and coaching.</p>
<p><img src="https://deanhume.com/content/images/2020/03/people-manager-book.jpg" alt="People Manager Book - A Guide for the First Time Manager" loading="lazy"></p>
<p>Becoming a People Manager is an exciting new challenge in your career but can also feel overwhelming. Being new to the role will no doubt mean that you have many questions that you want to find answers for. Your previous work experience, whilst invaluable for your personal development, may not give you the tools or confidence you need to springboard straight into your new role.</p>
<p>This book takes you through practical steps and real-world examples to help you become a confident and ultimately successful People Manager.</p>
<p>Our hope is that you will:</p>
<ul>
<li>Discover tried-and-tested examples of tools and processes you can use in your new role.</li>
<li>Explore the role of a People Manager and learn about what to expect.</li>
<li>Learn how to run effective and engaging one-on-one sessions with your team.</li>
<li>Master the art of building happy teams.</li>
<li>Learn how to handle conflict amongst your teams.</li>
<li>Discover how to continually learn and grow in your new role.</li>
</ul>
<p>We hope that new People Managers out there (or those returning to the role) will find this book invaluable for their learning, growth and success in their careers.</p>
<p>The book is now live on <a href="https://www.amazon.co.uk/People-Manager-Guide-First-Managers-ebook/dp/B084WVGG94">Amazon</a> and available to order!</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[My Favourite Soft Skills Interview Questions]]></title><description><![CDATA[<p>In my day to day role, a big part of what I do involves hiring and assessing Software Engineers. It might not be to everyone&#x2019;s liking, but I really do enjoy the interview process. It&#x2019;s a great chance to meet new people and I often learn</p>]]></description><link>https://deanhume.com/my-favourite-soft-skills-interview-questions/</link><guid isPermaLink="false">6419d57e92605e0b555b2345</guid><dc:creator><![CDATA[Dean Hume]]></dc:creator><pubDate>Tue, 21 Jan 2020 17:17:00 GMT</pubDate><content:encoded><![CDATA[<p>In my day to day role, a big part of what I do involves hiring and assessing Software Engineers. It might not be to everyone&#x2019;s liking, but I really do enjoy the interview process. It&#x2019;s a great chance to meet new people and I often learn new things from them during the process.</p><p>When it comes to hiring for a technical role, more often than not, the emphasis might be placed on the technical skills of the candidate. Of course, if it is a technical role, it is vital to assess these skills. However, one part of an interview process that I feel is often overlooked is the soft skills. </p><p>At this point you might be wondering - what are<strong> soft skills</strong>? If you look up the definition of soft skills in the dictionary, it is defined as &#x201C;<em>personal attributes that enable someone to interact effectively and harmoniously with other people</em>&#x201D; and I think that it sums this up perfectly. These &#x201C;soft skills&#x201D; are also referred to as emotional intelligence and are an important part of the hiring equation. As a manager, soft skills are those &#x201C;fluffier&#x201D; attributes that are hard to define. Employees who possess these attributes are generally nice people to work with. 
I like to think that soft skills fall under the umbrella of creativity, listening skills, and team skills.</p><p>In this article, I want to share some of my favourite soft skill interview questions that I have come across (or used myself) during the interview process. Many of them are aimed towards Software Engineering roles, but you could adapt them depending on the role that you are hiring for.</p><p>There is no right or wrong answer to these questions - they are simply there to give you inspiration during your next interview.</p><p>Questions such as:</p><ul><li>Why are you leaving your old workplace?</li><li>Why are you interested in working here?</li><li>How would your colleagues describe you if I asked them to tell me about you?</li><li>What criticisms or strengths might they mention?</li><li>Do you have any goals that you would like to achieve in the next few years?</li><li>What made you want to be an [insert role here]?</li><li>What was the last book that you read?</li><li>Do you have any personal projects?</li><li>What frustrates you at work?</li><li>Tell me about a time when you had a difficult working relationship with someone at work - how did you handle it?</li><li>When do you consider a piece of work to be finished?</li><li>What do you think of TDD? Do you like it - why or why not?</li><li>What&#x2019;s the worst technical mistake that you&#x2019;ve made? Or maybe the &#x201C;best&#x201D; outage you&#x2019;ve ever been part of?</li><li>What do you think is the best way to collaborate on a project?</li><li>Imagine that you are in the middle of a development cycle and there is a major change in the functionality of a feature that you have been working on. How do you respond? What questions do you ask?</li><li>What do you do when someone in your team strongly disagrees with you? How do you ensure your opinion is heard? 
</li><li>How do you ensure you hear others&apos; opinions?</li></ul><p>These are just a few questions that I like to ask and they give you a general understanding of the candidate&apos;s self-awareness, passion, curiosity, temperament and approach to teamwork - just to name a few attributes! </p><p>If you are a hiring manager and you&apos;ve never focussed on soft skills during an interview - give it a try, you never know what you might learn during your next interview.</p>]]></content:encoded></item><item><title><![CDATA[Experimenting with the Streams API]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>I am always looking for ways to build faster, slicker web pages. Where applicable, I&#x2019;ll use great new browser features such as service workers, HTTP/2 and improved compression, just to name a few. But what if I told you there was a way to build even faster</p>]]></description><link>https://deanhume.com/experimenting-with-the-streams-api/</link><guid isPermaLink="false">6419d57e92605e0b555b2344</guid><category><![CDATA[JavaScript]]></category><category><![CDATA[Web Performance]]></category><dc:creator><![CDATA[Dean Hume]]></dc:creator><pubDate>Tue, 19 Mar 2019 11:08:17 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>I am always looking for ways to build faster, slicker web pages. Where applicable, I&#x2019;ll use great new browser features such as service workers, HTTP/2 and improved compression, just to name a few. But what if I told you there was a way to build even faster web pages? I&#x2019;d like to introduce you to the <a href="https://streams.spec.whatwg.org/">Streams API</a>.</p>
<p>Before we go any further, let&#x2019;s go right back to basics. <em>What is a stream?</em> A stream is data that is created, processed, and consumed in an incremental fashion, without ever reading all of it into memory. Streams have been available server-side for years, but <a href="https://streams.spec.whatwg.org/">web streams</a> are now available in JavaScript, and you can start processing raw data bit by bit as soon as it arrives on the client side, without needing to generate a buffer, string, or blob.</p>
<p>The benefit of using streams is that if you are sending large chunks of data over the web, you can start processing the data immediately as you receive it, without having to wait for the full download to complete. For example, think of a large video file and how long it might take to download in full. Using a stream allows you to download only the small amount of data that you need to view the video instead of the whole file - this means that you can start watching as quickly as the network can deliver just those bytes.</p>
<!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://deanhume.com/content/images/2019/03/streams-api.jpg" class="kg-image" alt loading="lazy"></figure><!--kg-card-begin: markdown--><p>Streams also come with many other benefits, one of them being that they reduce the amount of memory that a large resource takes up. For example, if we needed to download a large file, process it, and keep it in memory, this could become a problem. However, with streaming, we can reduce the amount of memory that a large resource takes up because we are processing the data piece by piece; this feature is known as flow control and plays an important role in web streams.</p>
<p>At this point, you might be a little skeptical about using streams - but let me do my best to convince you of their great advantages. In this article, I am going to download and process a large JSON file using the Streams API and instantly write the data to a web page as we receive it, instead of when all of the data is downloaded.</p>
<h2 id="whatisndjson">What is NDJSON?</h2>
<p>Before we go any further, it&#x2019;s worth diving into a data format called NDJSON that we will be using throughout this example. If you haven&#x2019;t heard of <a href="http://ndjson.org/">NDJSON</a> before, it is a type of JSON called Newline delimited JSON. <em>Wait...not another data format!?</em> I totally agree with you, but NDJSON is needed because there is currently no standard for transporting instances of JSON text within a stream protocol. NDJSON looks very similar to JSON, the only difference is that each new line contains a new record, which allows us to stream and process one record at a time. If we sent traditional JSON in a stream, and processed it in chunks, it wouldn&#x2019;t come through as valid JSON. This is why NDJSON is perfect for this situation.</p>
<p>Let&#x2019;s look at a simple example comparing the two formats. Imagine the following JSON file:</p>
<pre><code>[
    {&quot;id&quot;:1,&quot;name&quot;:&quot;Alice&quot;},
    {&quot;id&quot;:2,&quot;name&quot;:&quot;Bob&quot;},
    {&quot;id&quot;:3,&quot;name&quot;:&quot;Carol&quot;}
]
</code></pre>
<p>The exact same data expressed as NDJSON looks like this:</p>
<pre><code>{&quot;id&quot;:1,&quot;name&quot;:&quot;Alice&quot;}
{&quot;id&quot;:2,&quot;name&quot;:&quot;Bob&quot;}
{&quot;id&quot;:3,&quot;name&quot;:&quot;Carol&quot;}
</code></pre>
<p>If you&#x2019;d like to learn more about NDJSON, I recommend reading the <a href="https://gist.github.com/clue/c3e9090798cc583e9fa3b7a5f68757ee">following article</a>.</p>
<p>With this in mind, let&#x2019;s get started!</p>
<h2 id="gettingstarted">Getting Started</h2>
<p>In order to get started, we need to get a little acquainted with the <a href="https://developers.google.com/web/updates/2015/03/introduction-to-fetch">Fetch API</a>. If you&#x2019;ve used it before, you may be aware that the Fetch API exposes Response bodies as ReadableStream instances. This means that they represent a readable stream of byte data. By tapping into this ReadableStream, we are able to process data incrementally in a stream instead of buffering it all into memory and processing it in one go.</p>
<p>We&#x2019;ll be looking at a concept called &#x201C;piping&#x201D; shortly which I think helps describe streams really well. Piping provides a chainable way of piping the current stream through a transform stream or any other writable/readable pair. The great thing about it is that it allows data to flow from one pipe to the next!</p>
<p>Let&#x2019;s take a look at the following function.</p>
<pre><code>/**
 * Fetch and process the stream
 */
async function process() {
    // Retrieve NDJSON from the server
    const response = await fetch(&apos;http://localhost:3000/request&apos;);

    const results = response.body
        // From bytes to text:
        .pipeThrough(new TextDecoderStream())
        // Buffer until newlines:
        .pipeThrough(splitStream(&apos;\n&apos;))
        // Parse chunks as JSON:
        .pipeThrough(parseJSON());

    // Loop through the results and write to the DOM
    writeToDOM(results.getReader());
}
</code></pre>
<p>In the code above, we started off by making a request to a local server for an NDJSON file. The body of the response is available as a <a href="https://streams.spec.whatwg.org/#rs-model">readable stream</a> which, as the name implies, allows us to read data out of it.</p>
<p>Using the <strong>pipeThrough()</strong> method, we can pipe the data we received through to another type of stream (Writeable, Transform, etc.) and process it accordingly. In the code above, I&#x2019;ve piped the body of the response through to another function called <strong>splitStream()</strong> and then again to another one called <strong>parseJSON()</strong>. Firstly, <strong>splitStream()</strong> takes the result of the NDJSON file and splits it based on each new line - making this the perfect format for streaming! Next, <strong>parseJSON()</strong> takes each chunk of data and parses the JSON to ensure that it is valid.</p>
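<p>The two helper functions aren&#x2019;t shown here, so below is a minimal sketch of how <strong>splitStream()</strong> and <strong>parseJSON()</strong> could be implemented as transform streams - the names match the usage above, but the implementation details are assumptions rather than the exact code behind the demo.</p>

```javascript
// Sketch of the two helpers used in process(). TransformStream is
// available in modern browsers and in Node.js 18+.
function splitStream(delimiter) {
  let buffer = '';
  return new TransformStream({
    transform(chunk, controller) {
      buffer += chunk;
      const parts = buffer.split(delimiter);
      // The last part may be an incomplete record - keep it for the next chunk
      buffer = parts.pop();
      parts.forEach((part) => controller.enqueue(part));
    },
    flush(controller) {
      // Emit any remaining record when the stream closes
      if (buffer.trim() !== '') controller.enqueue(buffer);
    }
  });
}

function parseJSON() {
  return new TransformStream({
    transform(chunk, controller) {
      // Skip empty lines and turn each complete line into an object
      if (chunk.trim() !== '') controller.enqueue(JSON.parse(chunk));
    }
  });
}
```

<p>Because each complete line of NDJSON is a valid JSON record, buffering until the next newline is all that is needed before parsing.</p>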
<p>Now that our stream is ready to use, we can write the data to the page. With the ReadableStream, we can use <strong>getReader()</strong> to create a reader that locks the stream to the new reader. While the stream is locked, no other reader can be acquired until this one is released. This functionality is especially useful for creating abstractions that desire the ability to consume a stream in its entirety. By getting a reader for the stream, you can ensure nobody else can interleave reads with yours or cancel the stream, which would interfere with your abstraction.</p>
<p>We can then iterate through the results in the reader.</p>
<pre><code>/**
 * Read through the results and write to the DOM
 * @param {object} reader 
 */
function writeToDOM(reader) {
    reader.read().then(
        ({ value, done }) =&gt; {
            if (done) {
                console.log(&quot;The stream was already closed!&quot;);

            } else {
                // Build up the values
                let result = document.createElement(&apos;div&apos;);
                result.innerHTML = `&lt;div&gt;ID: ${value.id} - Phone: ${value.phone} - Result: ${value.result}&lt;/div&gt;&lt;br&gt;`;

                // Prepend to the target
                targetDiv.insertBefore(result, targetDiv.firstChild);

                // Recursively call
                writeToDOM(reader);
            }
        },
        e =&gt; console.error(&quot;The stream became errored and cannot be read from!&quot;, e)
    );
}
</code></pre>
<p>In the code above, I am iterating through each result in the reader and prepending the results to a DIV on the page. <strong>In a real world example, you might want to display your results differently, but this gives you an example of what is possible with streams!</strong></p>
<p>Using the Developer Tools in Google Chrome, let&#x2019;s inspect the results with streaming in place.</p>
<!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://deanhume.com/content/images/2019/03/streaming-ndjson.gif" class="kg-image" alt loading="lazy"></figure><!--kg-card-begin: markdown--><p>In the animation above, you can see that as I reload the page, the results are instantly displayed thanks to the streaming. Even though the rest of the HTTP request is still being downloaded and chunked, we are still able to process and display the results. Without streaming, we would have to wait the full 8 seconds for the file to download, and then display the results. This is much smoother!</p>
<h2 id="browsersupport">Browser Support</h2>
<script src="https://cdn.jsdelivr.net/gh/ireade/caniuse-embed/caniuse-embed.min.js"></script>
<p class="ciu_embed" data-feature="streams" data-periods="future_1,current,past_1" data-accessible-colours="false">
<a href="http://caniuse.com/#feat=streams">Can I Use streams?</a> Data on support for the streams feature across the major browsers from caniuse.com.
</p>
<h2 id="summary">Summary</h2>
<p>I hope I&#x2019;ve managed to convince you how awesome streams are on the web. There are some great resources online that I <a href="https://jakearchibald.com/2016/streaming-template-literals/">thoroughly</a> <a href="https://streams.spec.whatwg.org/demos/">recommend</a> <a href="https://www.sitepen.com/blog/a-guide-to-faster-web-app-io-and-data-operations-with-streams/">reading</a>. It&#x2019;s worth mentioning that you wouldn&#x2019;t want to use streams in every situation, but where applicable they can make a big difference to the performance of your web application.</p>
<p>Web streams allow you to stream data to your users, allowing the browser to process data piece by piece as it is downloaded. Without streaming, we need to wait for the entire contents of a download to complete before we return a response, but by streaming the data instead, we can return the results of the download and process it piece by piece, allowing us to render something onto the screen even sooner. Faster web pages = happier users!</p>
<p>If you&#x2019;d like to see a working example of this code, please head over to <a href="https://github.com/deanhume/streams">GitHub</a> to find out more.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Brewfather - Progressive App Review]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>In my spare time I enjoy a little bit of homebrewing. What first started out as an experiment with a beer kit, soon turned into a geeky fascination with the science behind brewing beer. I am one of those annoying people that sniffs the beer and comments on the aroma</p>]]></description><link>https://deanhume.com/brewfather-progressive-app-review/</link><guid isPermaLink="false">6419d57e92605e0b555b2343</guid><category><![CDATA[Progressive Web Apps]]></category><category><![CDATA[Service Workers]]></category><dc:creator><![CDATA[Dean Hume]]></dc:creator><pubDate>Mon, 04 Feb 2019 12:19:34 GMT</pubDate><media:content url="https://deanhume.com/content/images/2019/02/humbrew-beer-hero.jpg" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><img src="https://deanhume.com/content/images/2019/02/humbrew-beer-hero.jpg" alt="Brewfather - Progressive App Review"><p>In my spare time I enjoy a little bit of homebrewing. What first started out as an experiment with a beer kit, soon turned into a geeky fascination with the science behind brewing beer. I am one of those annoying people that sniffs the beer and comments on the aroma before tasting it. Believe it or not, I even have a blog dedicated to <a href="https://humebrew.com">homebrewing</a>!</p>
<p>Like many hobbies out there, you can bet yourself that a piece of software exists to match the hobby. This couldn&#x2019;t be truer for homebrewing.</p>
<p>When I first started exploring this hobby, there were many desktop apps and mobile phone apps, but they all felt pretty clunky and I could only see my data if I was logged onto a specific device. It seemed as if the default implementation was always desktop apps. After a bit of searching around I discovered a web app called <a href="https://brewfather.app">Brewfather</a>. Pleased with the easy to use interface, and as a web developer, I naturally started exploring under the hood and noticed that this was a fully fledged Progressive Web App. As it turns out, an application like this is perfect for the web and it&#x2019;s also really good to see developers using Progressive Web Apps for commercial, real world examples.</p>
<p>In this article, I wanted to dig under the hood and take a closer look at how a great PWA like this is built and being used in the wild. (<em>It&#x2019;s worth mentioning that I am in no way affiliated with Brewfather - it&#x2019;s just a cool web app!</em>)</p>
<h2 id="whatisbrewfather">What is Brewfather?</h2>
<p>Before I go any further, many of you may not be familiar with Brewfather (and you don&#x2019;t need to be to read this article!), so it is worth explaining a little more. Brewfather allows you to easily create and manage your homebrewing recipes as well as keep track of all the details in your batches and your ingredients inventory. It&#x2019;s got some great little built in calculators that help you determine your beer style, sugar content and bitterness levels to name a few.</p>
<p><img src="https://deanhume.com/content/images/2019/02/brewfather-1.png" alt="Brewfather - Progressive App Review" loading="lazy"></p>
<p>If this was just built as a simple web app with no PWA functionality, it would be a great web app! However, by adding PWA functionality, the Brewfather team have managed to take it a step further. Brewfather has been written as a Single Page App (SPA) using Ionic v3 and uses a service worker under the hood to cache any assets that the user will need again. This means that it works completely offline - allowing homebrewers to access their recipes and brewing details wherever they are, regardless of network connection. This definitely allows this app to shine and takes it to the next level!</p>
<h2 id="lookingunderthehood">Looking under the hood</h2>
<p>Before we go any further, let&#x2019;s take a deeper look under the hood. In order to find out a little bit more about Brewfather, I opened up Google Chrome and used the Developer tools.</p>
<p>With Chrome Developer tools open, if you head over to the <strong>Application</strong> tab, you can see that the web app has a Web App Manifest file, icons, a start URL and theme colours.</p>
<p><img src="https://deanhume.com/content/images/2019/02/chrome-application-tab.PNG" alt="Brewfather - Progressive App Review" loading="lazy"></p>
<p>If we navigate to the <strong>Service Workers</strong> tab, you should see something a little like the image below.  You&#x2019;ll notice that a service worker has been installed and is controlling the page.</p>
<p><img src="https://deanhume.com/content/images/2019/02/brewfather-service-worker.PNG" alt="Brewfather - Progressive App Review" loading="lazy"></p>
<h2 id="workboxprecaching">Workbox &amp; precaching</h2>
<p>With a little bit of digging under the hood, I noticed that Brewfather is using <a href="https://developers.google.com/web/tools/workbox/">Workbox.js</a>. If you&#x2019;ve not heard of Workbox before, it is a set of libraries that make it easy to cache assets and take full advantage of features used to build Progressive Web Apps. Think of it as a set of JavaScript libraries for adding offline support to web apps.</p>
<p>One feature of service workers is the ability to save a set of files to the cache when the service worker is installing. This is often referred to as &quot;precaching&quot;, since you are caching content ahead of the service worker being used.</p>
<p>The main reasons for doing this is that it gives developers control over the cache, meaning they can determine when and how long a file is cached as well as serve it to the browser without going to the network, meaning it can be used to create web apps that work offline.</p>
<p>Workbox takes a lot of the heavy lifting out of precaching by simplifying the API and ensuring assets are downloaded efficiently.</p>
<p>You can see this in action on the Brewfather serviceworker.js file.</p>
<pre><code>self.workbox.precaching.precacheAndRoute([
  {
    &quot;url&quot;: &quot;assets/font/_flaticon.scss&quot;,
    &quot;revision&quot;: &quot;fa8ccf9cd4c3ed4b4122abd45ffef487&quot;
  },
  {
    &quot;url&quot;: &quot;assets/font/AngelineVintage.woff&quot;,
    &quot;revision&quot;: &quot;c2f686c43322502df1c28c5105b64f06&quot;
  },
  {
    &quot;url&quot;: &quot;assets/font/flaticon.css&quot;,
    &quot;revision&quot;: &quot;8a0da5d051f6ab11e5701c7b1651ad5c&quot;
  },
  {
    &quot;url&quot;: &quot;assets/font/Flaticon.woff&quot;,
    &quot;revision&quot;: &quot;7bc7d85cf32f8cf612ec6656d77fe1a0&quot;
  },
  {
    &quot;url&quot;: &quot;assets/fonts/ionicons.scss&quot;,
    &quot;revision&quot;: &quot;c1fdfabf9cbd412b444f064d27641f10&quot;
  }
]);

</code></pre>
<p>In the code snippet above, you can see that each file has a revision hash associated with it. The <a href="https://developers.google.com/web/tools/workbox/guides/precache-files/">Workbox CLI tool</a> generates a unique hash for each file. If any changes are made to these files, the next time the CLI is run, it will update the hash, thus invalidating the cache for that file. The resulting change to the service worker file also means that the browser picks the update up.</p>
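<p>To give a rough idea of what this looks like, a hypothetical <strong>workbox-config.js</strong> for the CLI&#x2019;s <em>injectManifest</em> mode might look something like this - the directories and glob patterns below are assumptions for illustration, not Brewfather&#x2019;s actual configuration:</p>

```javascript
// Hypothetical workbox-config.js - the paths and globs are assumptions.
module.exports = {
  globDirectory: 'www/',
  // File types to precache and fingerprint with a revision hash
  globPatterns: ['**/*.{html,js,css,woff,scss}'],
  // Source service worker containing the precacheAndRoute() call
  swSrc: 'src/serviceworker.js',
  // Output service worker with the precache manifest injected
  swDest: 'www/serviceworker.js'
};
```

<p>Running <code>workbox injectManifest workbox-config.js</code> regenerates the manifest, updating the revision hash of any file that changed.</p>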
<p>If you&#x2019;d like to learn more about this, I recommend heading over to the <a href="https://developers.google.com/web/tools/workbox/modules/workbox-precaching">Workbox documentation</a>. I&#x2019;ve also previously written about <a href="https://deanhume.com/getting-started-with-workbox-javascript-libraries-for-your-next-progressive-web-app/">Workbox</a> and how you can get any web app up and running with this great library.</p>
<h2 id="firebase">Firebase</h2>
<p>As I was browsing through the Brewfather web app, I noticed that when I was offline, I was still able to make changes to recipes and batches even when I navigated away. After a little digging under the hood, it turns out that if you make changes and save them on the site, the network requests go to <a href="https://firebase.google.com/docs/firestore/">Firestore</a>, which is a flexible, scalable NoSQL cloud database for storing and syncing data for client- and server-side development. The great thing about Firebase and Firestore is that it is a realtime database that takes care of the online/offline data syncing.</p>
<p><img src="https://deanhume.com/content/images/2019/02/firebase.png" alt="Brewfather - Progressive App Review" loading="lazy"></p>
<p>If you&#x2019;d like to learn more, I&#x2019;ve previously written about <a href="https://deanhume.com/a-basic-guide-to-firebase-for-the-web/">Firebase</a> on this blog. I do have to admit that my article is a little out of date and their API has been updated since this was last published.</p>
<h2 id="offlinesupport">Offline Support</h2>
<p>The great thing about Progressive Web Apps is that they allow your users to access their content regardless of network connection. While this is great for static content, it can be a bit tricky when the user wants to save changes when they are offline. As a web developer, you currently have the ability to use <a href="https://developers.google.com/web/updates/2015/12/background-sync">Background Sync</a>, which allows you to defer actions until the user has stable connectivity. The thing is, it&#x2019;s great for simple actions, but it can be a bit tricky when you need total control across a complex web application. This is where <a href="https://firebase.google.com/docs/firestore/">Cloud Firestore</a> comes in.</p>
<p>It supports offline data persistence and it caches a copy of the Cloud Firestore data that your app is actively using, so your app can access the data when the device is offline. You can write, read, and query the cached data. When the device comes back online, Firestore synchronizes any local changes made by your app to the data stored remotely in Cloud Firestore.</p>
<p>Let&#x2019;s turn our focus back to the Brewfather app. If you look a little closer under the hood using Chrome Developer tools, you can see that Firestore uses IndexedDb to save this information locally and then update the server when the user comes online again.</p>
<p><img src="https://deanhume.com/content/images/2019/02/Firestore-IndexedDb.JPG" alt="Brewfather - Progressive App Review" loading="lazy"></p>
<p>If you&#x2019;d like to learn how to build a web app using these great offline features of Firestore, I recommend reading the <a href="https://firebase.google.com/docs/firestore/manage-data/enable-offline">following article</a>.</p>
<h2 id="summary">Summary</h2>
<p>All in all, Brewfather is a great example of a Progressive Web App. It&#x2019;s easy to access, it works offline, is super fast, and lets you access your data on the go. What more could you ask of a web application!</p>
<p>A big thank you to <a href="https://mobile.twitter.com/BrewfatherApp">Thomas Gangs&#xF8;y</a> for letting me review his PWA and for building such a great web app.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Determining Service Worker Support for your Site]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>Have you ever thought about building a Progressive Web App or even adding a service worker to your website? Perhaps you&#x2019;ve considered it, but weren&#x2019;t too sure about whether or not your users were on browsers that supported these features. Would they even benefit from their</p>]]></description><link>https://deanhume.com/determining-service-worker-support/</link><guid isPermaLink="false">6419d57e92605e0b555b2341</guid><category><![CDATA[Progressive Web Apps]]></category><category><![CDATA[Service Workers]]></category><category><![CDATA[Web Development]]></category><dc:creator><![CDATA[Dean Hume]]></dc:creator><pubDate>Thu, 08 Nov 2018 16:23:16 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>Have you ever thought about building a Progressive Web App or even adding a service worker to your website? Perhaps you&#x2019;ve considered it, but weren&#x2019;t too sure about whether or not your users were on browsers that supported these features. Would they even benefit from their features? Speaking from experience, I know how challenging it can be to convince key stakeholders about trying out new technology on your product.</p>
<p>Without a doubt, one of the best ways to go about this is to start tracking usage and check the results. It&apos;s much easier to convince someone about a great piece of tech, if you are able to show the benefit to the users.</p>
<p>In this article, I am going to run through a small snippet of code that you can use to track service worker support on your site. We&apos;ll be pushing the results through to Google Analytics, but this could be tailored to suit any Web analytics package.</p>
<h2 id="trackingserviceworkersupport">Tracking Service Worker Support</h2>
<p>On this very blog, I am using Google Analytics as a web analytics tool. It allows me to easily see useful information about how my site is performing and lets me track page views. It also has the ability to track certain events, which works perfectly for this use case.</p>
<p>I won&#x2019;t go into too much detail about how to set up Google Analytics on your site, as the <a href="https://support.google.com/analytics/answer/1008080?hl=en">online documentation</a> does a great job of this. However, let&apos;s take a look at an empty HTML page with the basic Google Analytics code in place.</p>
<pre><code>&lt;!DOCTYPE html&gt;
&lt;html&gt;
  &lt;head&gt;
    &lt;meta charset=&quot;UTF-8&quot;&gt;
    &lt;title&gt;Tracking Service Worker Support&lt;/title&gt;
&lt;script async src=&quot;https://www.googletagmanager.com/gtag/js?id=UA-xxx-1&quot;&gt;&lt;/script&gt;
&lt;script&gt;

     
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag(&apos;js&apos;, new Date());

    
  gtag(&apos;config&apos;, &apos;UA-xxx-1&apos;);
&lt;/script&gt;
  &lt;/head&gt;
  &lt;body&gt;
  
  &lt;/body&gt;
&lt;/html&gt;
</code></pre>
<p>According to the documentation, the code above is a typical setup for Google Analytics. In order for us to start using this, we need to add a check to determine service worker support and push it through to Google Analytics.</p>
<p>I added the following code to each page just after the closing <strong>body</strong> tag.</p>
<pre><code>&lt;script&gt;
function getServiceWorkerSupport() {
  if (&apos;serviceWorker&apos; in navigator) {
    return &apos;supported&apos;;
  } else {
    return &apos;unsupported&apos;;
  }
}


// Push the result to Google Analytics
gtag(&apos;event&apos;, &apos;service-worker-support&apos;, {
      &apos;event_category&apos; : &apos;service-worker-support&apos;,
      &apos;event_label&apos; : getServiceWorkerSupport()
    })
&lt;/script&gt;
</code></pre>
<p>Let&#x2019;s break this down. I&apos;ve created a function called <em>getServiceWorkerSupport</em>() which uses <a href="https://developer.mozilla.org/en-US/docs/Learn/Tools_and_testing/Cross_browser_testing/Feature_detection">feature detection</a> to determine if the current browser is capable of using service workers.</p>
<p>For each visit to my site, this information will get pushed through to Google Analytics. For this blog, I wanted to take things a little further and determine exactly how many pages were actually controlled by an active service worker. For example, if someone visits a site with a service worker on it, it will get installed the first time and only used the next time they reload the page or navigate away. I wanted to determine how many people were both supported and using my service worker.</p>
<p>With this in mind, I updated the function above to include the condition below.</p>
<pre><code>function getServiceWorkerSupport() {
  if (&apos;serviceWorker&apos; in navigator) {
    return navigator.serviceWorker.controller ? &apos;controlled&apos; : &apos;supported&apos;;
  } else {
    return &apos;unsupported&apos;;
  }
}
</code></pre>
<p>After a few days of tracking data, I logged into the Google Analytics dashboard and checked the numbers. Start by expanding the menu and navigate to <strong>Behavior</strong> -&gt; <strong>Events</strong> -&gt; <strong>Top Events</strong> and you should see something a little similar to the image below.</p>
<p><img src="https://deanhume.com/content/images/2018/11/service-worker-support-google-analytics.JPG" alt="Service Worker Support - Google Analytics" loading="lazy"></p>
<p>From there, choose the Event label as the primary dimension and show as a chart. I&#x2019;ve tried to outline the image below (in red) to give you a better idea of this in action.</p>
<p><img src="https://deanhume.com/content/images/2018/11/service-worker-support.JPG" alt="Service Worker Support" loading="lazy"></p>
<p>In the image above, the blue (83.8%) represents all visits to the site from browsers that support service workers. The green (10.5%) represents visits that were controlled by a service worker and served cached data. Finally, the orange shows that around 3% of all visits to my site were from browsers that don&#x2019;t support service workers.</p>
<p>That is a big portion of my users that not only have browser support for service workers, but also already benefit from the service worker on my site!</p>
<h2 id="summary">Summary</h2>
<p>It&apos;s worth mentioning that a big portion of my users are tech savvy and interested in learning about new browser technology. Depending on your target audience, you might find that your numbers are completely different.</p>
<p>If you find that the majority of your users would benefit from converting your site to a Progressive Web App, or even a service worker, it might be worth considering it. Who knows&#x2026;. you might even find that service workers are <a href="https://deanhume.com/service-workers-can-save-the-environment/">good for the environment!</a></p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Tips for setting up a Ghost blog on Amazon AWS EC2]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>After a long overdue refresh, I have <a href="https://deanhume.com/converting-a-ghost-blog-to-a-progressive-web-app/">recently moved</a> this blog to Ghost CMS. Overall, I have been very impressed.  Ghost is easy to use as an editor, quick to make updates, and the Ghost team are constantly pushing out updates - which is great considering that it is completely</p>]]></description><link>https://deanhume.com/amazon-aws-ec2-ghost-cms-setup/</link><guid isPermaLink="false">6419d57e92605e0b555b2342</guid><category><![CDATA[Ghost]]></category><category><![CDATA[Amazon EC2]]></category><category><![CDATA[Blogging]]></category><dc:creator><![CDATA[Dean Hume]]></dc:creator><pubDate>Fri, 26 Oct 2018 10:15:25 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>After a long overdue refresh, I have <a href="https://deanhume.com/converting-a-ghost-blog-to-a-progressive-web-app/">recently moved</a> this blog to Ghost CMS. Overall, I have been very impressed.  Ghost is easy to use as an editor, quick to make updates, and the Ghost team are constantly pushing out updates - which is great considering that it is completely open source.</p>
<p><img src="https://deanhume.com/content/images/2018/10/ghost-logo.png" alt="Ghost CMS Logo" loading="lazy"></p>
<p>In order to get started with Ghost, I currently use the AWS free tier and run this blog on a Micro instance. Considering this is the smallest instance available, I have been very impressed with the performance of this site. I also chose to go with the <a href="https://docs.bitnami.com/aws/apps/ghost/">Bitnami Ghost stack for AWS</a> in order to get up and running as quickly as possible.</p>
<p>While Ghost has been ticking away nicely, I have recently been having some serious issues with my Amazon EC2 instance and I wanted to write this article to help any others that might be experiencing the same problem! I&#x2019;d also like to refer back to this in case it happens to me again.</p>
<h2 id="awsinstancescheduledforretirement">AWS Instance scheduled for Retirement</h2>
<p>As far as I was concerned, my blog was up and running and everything was working as expected. After about 3 months of running undisturbed, I received a warning from AWS that my &#x201C;<em>instance was scheduled for retirement due to underlying hardware degradation and that it may already be unreachable</em>&#x201D;. What!?</p>
<p>Sure enough, I logged in to the Amazon management console and it was down. My blog was also returning a <em>503 Service Unavailable</em> error. Fortunately I was able to stop the instance and restart it again - but this meant I needed to quickly backup my blog and restore it to another instance.</p>
<p>Wait...you don&#x2019;t back your blog up!? Yeah I know&#x2026;.rookie mistake!</p>
<h2 id="makingabackupofyourghostblog">Making a backup of your Ghost blog</h2>
<p>There are two ways to make a backup of your Ghost blog on EC2. The first option is to create an image of your EC2 instance which you can simply use to restore from at any point in time. Think of it as a &#x201C;snapshot&#x201D; for the given time.</p>
<p>If you log into the AWS console, select Instances and then select the instance you wish to back up. Next, choose <strong>Actions</strong> -&gt; <strong>Image</strong> -&gt; <strong>Create Image</strong>.</p>
<p><img src="https://deanhume.com/content/images/2018/10/ghost-create-image-1.JPG" alt="Amazon EC2 Dashboard - Ghost Create Image" loading="lazy"></p>
<p>This will then create an AMI under Images that you can use to restore from. You can simply choose <strong>Launch Instance</strong> and then look under <strong>My AMIs</strong> to restore from your newly created image.</p>
<p>You could even take this a step further and automate the whole process, so that you have an up to date image to restore from at any point. Using a <a href="https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/TakeScheduledSnapshot.html">Cloudwatch event</a>, you can take a scheduled snapshot of your instance. I won&#x2019;t go into it in this article, as this is an article in itself - but please follow the link above for more details.</p>
<p>The second option is to manually backup your blog using the UI on your Ghost blog. This option is ideal if you are looking to restore to a completely clean, new version of the Bitnami Ghost image. You&#x2019;ll need to start by exporting all of the content on your blog. Log into your admin interface, and head over to <strong>Labs</strong>.</p>
<p><img src="https://deanhume.com/content/images/2018/10/ghost-export-contents.JPG" alt="Ghost Admin Export Contents" loading="lazy"></p>
<p>Choose <strong>Export your Content</strong>, and it will download all of your posts and settings in a single JSON file.</p>
<p>If you have any redirects or routes set up on your Ghost blog, you can export them in the same way.</p>
<p><img src="https://deanhume.com/content/images/2018/10/ghost-redirects-routes.JPG" alt="Ghost Admin Redirects Routes" loading="lazy"></p>
<p>We are almost done - next you&#x2019;ll need to take a backup of the design of your blog. Head over to the <strong>Design</strong> tab in your admin interface.</p>
<p><img src="https://deanhume.com/content/images/2018/10/ghost-export-design.JPG" alt="Ghost Admin Export Design" loading="lazy"></p>
<p>From there, click on the download link and you&#x2019;ll receive a ZIP file containing all of the assets that make up the design of your site.</p>
<p>The best thing about the Ghost admin interface is that it allows you to import everything back in the same way. If you have another instance up and running, log into the admin interface and upload all of the exported content that you have. Nice and easy!</p>
<h2 id="backinguptheimagesonaghostblog">Backing up the images on a Ghost blog</h2>
<p>Now this is all well and good, but the eagle-eyed amongst you might have noticed that we haven&#x2019;t actually downloaded any of the images associated with the blog. You might have restored all of the content to your new instance, but if you visit the pages on your site, you may notice that your images now point to broken links!</p>
<p>At the moment, the Ghost admin interface doesn&#x2019;t let you download a copy of all of your images. Fortunately there is a way around this, although it&#x2019;s a bit of a pain. All of the images for your Bitnami Ghost blog are located at &#x2018;<em>~/apps/ghost/htdocs/content/images</em>&#x2019;.</p>
<p>I switch between Windows and a Mac - so I used <a href="https://www.ssh.com/ssh/putty/putty-manuals/0.68/Chapter5.html">PSCP</a> on a Windows machine for the command below. However, the command using SCP on Linux is almost identical. I started by downloading all of the images on my blog to my local machine.</p>
<pre><code>pscp -r -i C:\Users\dhume\Documents\xxx.ppk bitnami@xxx.compute-1.amazonaws.com:/home/bitnami/apps/ghost/htdocs/content/images C:\dean 
</code></pre>
<p>Let&#x2019;s break this down a little. Firstly, I am running the PSCP command and linking to the .ppk file with my login credentials. Putty SCP uses PPK instead of PEM, but you can easily <a href="https://stackoverflow.com/q/3190667/335567">create one using the Putty tool</a>. Next, I point to the EC2 instance that I want to copy the images from, as well as the path to the images on that instance. The last parameter is the destination folder on my local machine.</p>
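<p>For reference, the equivalent download using plain SCP on Linux or a Mac looks much the same - note that the key file, hostname and local folder below are placeholders that you&#x2019;ll need to swap for your own:</p>
<pre><code>scp -r -i ~/.ssh/my-key.pem bitnami@xxx.compute-1.amazonaws.com:/home/bitnami/apps/ghost/htdocs/content/images ~/ghost-backup
</code></pre>
<p>SCP can use your standard PEM key directly, so there is no need to convert it to a PPK file first.</p>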
<p>After running the PSCP command, I can see that all of the images have been downloaded to my local drive.</p>
<p><img src="https://deanhume.com/content/images/2018/10/ghost-export-images.JPG" alt="Ghost Export Images" loading="lazy"></p>
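<p>Once they are down, it&#x2019;s also worth keeping a dated archive of the images for safekeeping. Below is a small sketch of how you might do this in a Linux or Mac shell - the <em>IMAGES_DIR</em> folder name is a placeholder, so point it at wherever you downloaded the images to:</p>
<pre><code># Archive a downloaded images folder into a dated tarball.
# IMAGES_DIR is a placeholder - point it at your downloaded images folder.
IMAGES_DIR="${IMAGES_DIR:-./images}"
mkdir -p "$IMAGES_DIR"   # no-op if the folder already exists
BACKUP="ghost-images-$(date +%Y-%m-%d).tar.gz"
tar -czf "$BACKUP" -C "$(dirname "$IMAGES_DIR")" "$(basename "$IMAGES_DIR")"
echo "Archive written to $BACKUP"
</code></pre>
<p>Each run produces a new archive stamped with the current date, so you can keep several generations of image backups around.</p>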
<h2 id="restoringimagesonaghostblog">Restoring images on a Ghost blog</h2>
<p>So far so good! Our images have been downloaded onto the local drive, but we still need to upload them to the new EC2 instance. Using a similar command to the one above, I now need to flip the source and destination and point to the new EC2 instance.</p>
<pre><code>pscp -r -i C:\Users\dhume\Documents\xxx.ppk C:\dean\images\2018 bitnami@xxx.us-west-2.compute.amazonaws.com:/home/bitnami/apps/ghost/htdocs/content/images/ 
</code></pre>
<p>If everything runs successfully, you should now see the images on your site!</p>
<h2 id="removingthemanagebanneronaghostblogbitnami">Removing the &#x2018;manage&#x2019; banner on a Ghost blog (Bitnami)</h2>
<p>If you spin up a new Bitnami Ghost instance, there are still a few things that you need to do before the site is ready to go &#x2018;live&#x2019;. On a completely fresh instance, you may notice a banner in the bottom right-hand corner when you view your site in a web browser.</p>
<p><img src="https://deanhume.com/content/images/2018/10/bitnami-banner.png" alt="Bitnami Banner" loading="lazy"></p>
<p>In order to remove this, you&#x2019;ll need to <a href="https://docs.aws.amazon.com/quickstarts/latest/vmlaunch/step-2-connect-to-instance.html">SSH into your instance</a> and run the following command in the terminal.</p>
<pre><code>sudo /opt/bitnami/apps/ghost/bnconfig --disable_banner 1
</code></pre>
<p>If you refresh the page, you&#x2019;ll notice that this has now been removed.</p>
<h2 id="updatingthedomainnameonaghostblog">Updating the domain name on a Ghost blog</h2>
<p>When you spin up a new Bitnami Ghost instance on EC2, you&#x2019;ll be able to access it from the default domain that AWS assigns to it. However, you&#x2019;ll want to point this to a real domain name and that involves updating a config setting in order for Ghost to link correctly.</p>
<p>In order to do this, you&#x2019;ll need to SSH into your instance and run the following command.</p>
<pre><code>sudo /opt/bitnami/apps/ghost/bnconfig --machine_hostname yourdomainname.com
</code></pre>
<p>It takes a little while to finish running, but this command will update your domain accordingly. Provided you&#x2019;ve updated your DNS details with your hosting provider, all links will be in working order when you visit your domain.</p>
<p>I have sometimes found that the command above appends a <strong>:80</strong> port number to the end of the domain name. This can mess with RSS feeds and links. In that case, you can always directly edit the values in the config.production.json file using the following command:</p>
<pre><code>sudo nano /opt/bitnami/apps/ghost/htdocs/config.production.json
</code></pre>
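<p>Inside that file, the value to check is the <code>url</code> property. A trimmed-down example of roughly what it might look like (your own domain will differ, and the real file contains other settings too):</p>
<pre><code>{
  "url": "https://yourdomainname.com"
}
</code></pre>
<p>Make sure there is no <strong>:80</strong> on the end of the value, save the file, and restart Ghost for the change to take effect.</p>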
<h2 id="updatingaghostblogtothelatestversion">Updating a Ghost blog to the latest version</h2>
<p>Considering Ghost is completely open source and built largely by the community, it&#x2019;s pretty impressive to see the number of updates that they constantly ship. If you log into the admin section of your Ghost blog, and click on <strong>About</strong> - you should notice something a little like the image below.</p>
<p><img src="https://deanhume.com/content/images/2018/10/ghost-update-version.JPG" alt="Ghost CMS Update Version" loading="lazy"></p>
<p>The Ghost admin interface intelligently lets you know that you are on an older version and what new features are available if you update. Not surprisingly, updating is easier than you think.</p>
<p>Start off by SSH&#x2019;ing into your instance and navigating to /apps/ghost/htdocs. Next, you&#x2019;ll need to run the following command.</p>
<pre><code>sudo ghost update
</code></pre>
<p>If everything runs successfully, you&#x2019;ll have the latest version of Ghost up and running!</p>
<h2 id="stoppingandstartingghostagain">Stopping and Starting Ghost again</h2>
<p>I have noticed a few instances where you suddenly encounter a 503 error when you are running an Amazon Micro instance. This can be caused by low memory on the instance, which Ghost struggles with. I find that a quick stop and start of Ghost usually sorts this problem out.</p>
<pre><code>sudo /opt/bitnami/ctlscript.sh stop
</code></pre>
<pre><code>sudo /opt/bitnami/ctlscript.sh start
</code></pre>
<h2 id="summary">Summary</h2>
<p>That&#x2019;s about it. I hope that this article has been helpful to you. As I experiment with Ghost and make changes to this blog, I will be sure to keep it updated. Hopefully this serves not only to help you, but also as a reminder for myself in case this happens again!</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Software Team Maturity Matrix]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>Whether you lead a small team of developers, a tech startup or a huge department, it&#x2019;s important that you constantly strive to improve the processes and tools that you use. If things are going really well, it&#x2019;s easy to become confident in your abilities and overlook</p>]]></description><link>https://deanhume.com/software-team-maturity-matrix/</link><guid isPermaLink="false">6419d57e92605e0b555b2340</guid><dc:creator><![CDATA[Dean Hume]]></dc:creator><pubDate>Wed, 10 Oct 2018 15:48:41 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>Whether you lead a small team of developers, a tech startup or a huge department, it&#x2019;s important that you constantly strive to improve the processes and tools that you use. If things are going really well, it&#x2019;s easy to become confident in your abilities and overlook the need to self assess where you are at as a team.</p>
<p>One of my favourite tools for doing this is a <a href="https://en.wikipedia.org/wiki/Capability_Maturity_Model">Maturity Matrix</a>. The idea behind this matrix is that you have a list of best practices and you assess your team or department accordingly. For example, imagine you were building the perfect organisation - what would it look like? What tools / processes would they use? I like to jot all of these things down in the Maturity Matrix and compare where we might be as a team versus the reality of where we are at. It&#x2019;s a bit like the <a href="https://www.joelonsoftware.com/2000/08/09/the-joel-test-12-steps-to-better-code/">Joel test</a> but on a much larger scale.</p>
<p>Let&#x2019;s take the spreadsheet below as an example.</p>
<iframe width="140%" height="550px" src="https://docs.google.com/spreadsheets/d/e/2PACX-1vRw_do1p2-AVmNVTXEot2L_wZ3vRgbjOZkS25UeCC5GH1wON1kH_Qa2PWNFgC_zUINgOcNSPtHKVkBa/pubhtml?gid=751619418&amp;single=true&amp;widget=true&amp;headers=false"></iframe>
<p>The spreadsheet above contains a list of Software Engineering best practices for a department (click <a href="https://docs.google.com/spreadsheets/d/1auzG0Zh29FvksXBEnyCi72BCqM8POUwFHakWD3BHO8k/edit?usp=sharing">here to download</a> if you can&#x2019;t see above). Sure, this list might not contain every single best practice available and is somewhat customised to my experience - but it&apos;s a good start and gives an insight into the things that could be improved upon. The best thing about this is that you can tailor it to suit your own department&#x2019;s needs and regularly reassess where your team is at.</p>
<p><em>So, how do you use this?</em> Well, I like to use a Red / Amber / Green rating to compare where things are. Red is for non-existent. Amber is for work in progress, or something we partially do. Finally, Green is for something that we do well.</p>
<p>It&#x2019;s important to be honest with yourself and rate yourself accordingly - after all, this is only to help improve things for you and your team! Once you&#x2019;ve completed the Maturity Matrix you should start to get a bigger picture of the things that you need to work on. For example, if there are a lot of reds - then perhaps those are the more immediate things that you need to work on.</p>
<p>It&#x2019;s important to not only look at Software Engineering best practices - but also People best practices. Below is another Maturity Matrix that I like to use to assess how things are in terms of people.</p>
<iframe width="140%" height="300px" src="https://docs.google.com/spreadsheets/d/e/2PACX-1vRw_do1p2-AVmNVTXEot2L_wZ3vRgbjOZkS25UeCC5GH1wON1kH_Qa2PWNFgC_zUINgOcNSPtHKVkBa/pubhtml?gid=1433670064&amp;single=true&amp;widget=true&amp;headers=false"></iframe>
<p>Again, this list is something that I like to use and have personally built up - but it might not be suited to you - it&#x2019;s up to you to adjust and add accordingly.</p>
<h2 id="summary">Summary</h2>
<p>As a team, it&#x2019;s important to constantly look for new avenues to improve. I know that in my own experience, I can sometimes become lazy and neglect some of the important things that should be done. This is why I like using a Maturity Matrix like this to keep me and my teams honest.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item></channel></rss>