<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[Deltacode Blog]]></title><description><![CDATA[A web developer's blog by David De Sloovere]]></description><link>https://blog.deltacode.be/</link><image><url>https://blog.deltacode.be/favicon.png</url><title>Deltacode Blog</title><link>https://blog.deltacode.be/</link></image><generator>Ghost 3.35</generator><lastBuildDate>Thu, 09 Apr 2026 02:24:06 GMT</lastBuildDate><atom:link href="https://blog.deltacode.be/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Creating timelapses with Home Assistant, C# and Azure - step 2]]></title><description><![CDATA[Uploading images to Azure blob storage with a few lines of C# and one HTTP call. Part of a series of blog posts.]]></description><link>https://blog.deltacode.be/2022/06/07/home-assistant-upload-to-azure-blob-storage/</link><guid isPermaLink="false">629f9db588ca2d148c6a1604</guid><category><![CDATA[Home Assistant]]></category><category><![CDATA[Azure]]></category><category><![CDATA[C#]]></category><dc:creator><![CDATA[David De Sloovere]]></dc:creator><pubDate>Tue, 07 Jun 2022 19:06:47 GMT</pubDate><media:content url="https://blog.deltacode.be/content/images/2022/06/upload-from-home-assistant-to-azure.png" medium="image"/><content:encoded><![CDATA[<img src="https://blog.deltacode.be/content/images/2022/06/upload-from-home-assistant-to-azure.png" alt="Creating timelapses with Home Assistant, C# and Azure - step 2"><p>Part of a series of blog posts:<br>- Step 1: <a href="https://blog.deltacode.be/2022/06/07/home-assistant-camera-snapshot-timelapse/">Taking snapshots</a><br>- Step 2: Uploading to the cloud (this post)<br><em>- Step 3: this is work in progress</em></p><h2 id="step-2-upload-to-the-cloud">Step 2: Upload to the 
cloud</h2><p>For the creation of the timelapse itself, I will be using C# and the code will run on Azure. I've been writing C# almost daily since 2006, so this is what would work for me (and not take too much time doing it either).</p><p>There is no integration or service readily available in Home Assistant that will allow you to upload files to Azure Blob storage. <a href="https://www.home-assistant.io/integrations/rest_command/">RESTful Command</a> is a way to execute HTTP calls, but I didn't see how I could pass the contents of my image as the payload (template syntax won't get that far).</p><blockquote><em>I decided to code my way out of this, like any developer would do.</em></blockquote><p>You can upload a file to <a href="https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blobs-introduction">Azure Blob storage</a> using an official SDK (currently available in 7 languages) or just plain HTTP requests. For simplicity, I chose the HTTP request. You could probably swap some parts of the code to upload to other cloud storage that supports HTTP requests.</p><p>Unfortunately, I haven't written any real Python, but I knew of 2 projects that offer .NET / C# integration in Home Assistant: <a href="https://community.home-assistant.io/t/c-for-home-assistant/320200">C# for Home Assistant</a> and <a href="https://netdaemon.xyz/">NetDaemon</a>. I picked the first, for no reason other than that it seemed easier to set up.</p><h2 id="c-for-home-assistant">C# for Home Assistant</h2><blockquote><strong>Setting up C# for Home Assistant took only a few minutes.</strong></blockquote><p>It's detailed on the <a href="https://github.com/anhaehne/hhnl.HomeAssistantNet">wiki of the project</a>. In short: install the add-on, have Samba running (I already did), map a network drive, open the folder in VS Code, add user secrets and start coding.</p><p>The code is short and simple.
Loop through the files, read the content, PUT on Azure and clean up.</p><!--kg-card-begin: markdown--><pre><code class="language-csharp">[Automation]
[Schedule(Every.Minute, 15)]
public async Task UploadTimeLapsesAsync()
{
    var sasToken = Secrets.GetSecret(&quot;TimelapsesSasToken&quot;);
    var files = System.IO.Directory.GetFiles(&quot;/config/www/timelapse&quot;);
    _logger.LogInformation($&quot;Timelapse files found: {files.Length}&quot;);
    if (files.Length == 0)
    {
        return;
    }

    var client = new HttpClient();
    foreach (var fi in files)
    {
        _logger.LogInformation(fi);
        var bytes = System.IO.File.ReadAllBytes(fi);
        var fileName = System.IO.Path.GetFileName(fi);
        ByteArrayContent content = new System.Net.Http.ByteArrayContent(bytes);
        content.Headers.Add(&quot;x-ms-blob-type&quot;, &quot;BlockBlob&quot;);
        var response = await client.PutAsync($&quot;https://redacted.blob.core.windows.net/timelapse/{fileName}?{sasToken}&quot;, content);
        response.EnsureSuccessStatusCode();
        System.IO.File.Delete(fi);
    }
}
</code></pre>
<!--kg-card-end: markdown--><p>I use a timer as a workaround. There is no direct way of starting the C# automation from my 'snapshot automation' (see Step 1) other than using helpers or yet another workaround as described in the wiki. The timer hardly impacts my system.</p><p>The add-on has a basic UI to list the automations. It can also show the logs, which is extremely useful.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://blog.deltacode.be/content/images/2022/06/image-1.png" class="kg-image" alt="Creating timelapses with Home Assistant, C# and Azure - step 2" srcset="https://blog.deltacode.be/content/images/size/w600/2022/06/image-1.png 600w, https://blog.deltacode.be/content/images/size/w1000/2022/06/image-1.png 1000w, https://blog.deltacode.be/content/images/2022/06/image-1.png 1069w" sizes="(min-width: 720px) 720px"><figcaption>The C# for Home Assistant add-on - overview of the automations.</figcaption></figure><p>Drilling down into the logs, we see that 2 images were uploaded.<br>It took 0.4 seconds to read and upload the first file (513.13 KiB).<br>The second one took only 0.2 seconds (276.14 KiB).</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://blog.deltacode.be/content/images/2022/06/image-2.png" class="kg-image" alt="Creating timelapses with Home Assistant, C# and Azure - step 2" srcset="https://blog.deltacode.be/content/images/size/w600/2022/06/image-2.png 600w, https://blog.deltacode.be/content/images/size/w1000/2022/06/image-2.png 1000w, https://blog.deltacode.be/content/images/2022/06/image-2.png 1011w" sizes="(min-width: 720px) 720px"><figcaption>ILogger output is captured in the add-on.
Awesome!</figcaption></figure><p>The uploaded files in the Azure portal.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://blog.deltacode.be/content/images/2022/06/image.png" class="kg-image" alt="Creating timelapses with Home Assistant, C# and Azure - step 2"><figcaption>The images uploaded in the Azure Blob storage container.</figcaption></figure><p>In step 3 we will create the timelapse using C#. The post will follow soon.</p><p><em>If you know how I could write this in Python and integrate that in HA, please let me know in the comments.</em></p>]]></content:encoded></item><item><title><![CDATA[Creating timelapses with Home Assistant, C# and Azure - step 1]]></title><description><![CDATA[Taking camera snapshots at a regular interval, with the goal of creating timelapses. Part of a series of blog posts.]]></description><link>https://blog.deltacode.be/2022/06/07/home-assistant-camera-snapshot-timelapse/</link><guid isPermaLink="false">629f963c88ca2d148c6a155a</guid><category><![CDATA[Home Assistant]]></category><dc:creator><![CDATA[David De Sloovere]]></dc:creator><pubDate>Tue, 07 Jun 2022 18:48:44 GMT</pubDate><media:content url="https://blog.deltacode.be/content/images/2022/06/carbon.png" medium="image"/><content:encoded><![CDATA[<img src="https://blog.deltacode.be/content/images/2022/06/carbon.png" alt="Creating timelapses with Home Assistant, C# and Azure - step 1"><p>Part of a series of blog posts:<br>- Step 1: Taking snapshots (this post)<br>- Step 2: <a href="https://blog.deltacode.be/2022/06/07/home-assistant-upload-to-azure-blob-storage/">Uploading to the cloud</a><br>- Step 3: (this is work in progress)</p><p>I have a Unifi setup at home including some of their cameras. The mobile application has this really great feature where you can scroll through the timeline and the view updates instantly. It's like a fluid timelapse that you control.</p><p>This brought me to the idea that I want to create timelapses from snapshots.
I have the Unifi integration set up in Home Assistant and my cameras are available as entities already. It shouldn't be that hard to get the job done, right?</p><h2 id="step-1-taking-snapshots">Step 1: Taking snapshots</h2><p>Home Assistant already has a service to <a href="https://www.home-assistant.io/integrations/camera/#services">take snapshots</a>, and combined with the <a href="https://www.home-assistant.io/docs/automation/trigger/#time-pattern-trigger">time pattern trigger</a> you can easily save an image at a regular interval.</p><p>The only tricky part was to find a place where files could be written. I didn't see any errors with my initial folder, but also didn't see any files created. It must have been a permission issue. After some digging, I changed the path to /config/www and step 1 was complete.</p><p>Here's the YAML for my automation, where I take snapshots for 2 separate cameras, at an hourly interval.</p><!--kg-card-begin: markdown--><pre><code class="language-yaml">alias: Timelapse snapshots
description: ''
trigger:
  - platform: time_pattern
    hours: /1
condition: []
action:
  - service: camera.snapshot
    data:
      filename: /config/www/timelapse/carport-{{ now().strftime('%Y%m%d-%H%M%S') }}.jpg
    target:
      device_id: 53d064634f8c8dd6cf8f9f6e2c93aa75
  - service: camera.snapshot
    data:
      filename: /config/www/timelapse/doorbell-{{ now().strftime('%Y%m%d-%H%M%S') }}.jpg
    target:
      device_id: 7abd063efe059580aafef4ee9efaa3d5
mode: single
</code></pre>
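<p>The <code>strftime</code> template in the filenames above renders to a flat, sortable timestamp. A small Python sketch of what one rendered filename looks like (the timestamp here is an assumed example; in the automation, <code>now()</code> is the trigger time):</p><pre><code class="language-python">from datetime import datetime

# Hypothetical snapshot moment; in the automation, now() is the trigger time
taken_at = datetime(2022, 6, 7, 19, 6, 47)
filename = '/config/www/timelapse/carport-' + taken_at.strftime('%Y%m%d-%H%M%S') + '.jpg'
print(filename)  # /config/www/timelapse/carport-20220607-190647.jpg
</code></pre><p>Because the timestamp is date-first, sorting the filenames alphabetically also sorts them chronologically, which helps when stitching the frames together later.</p>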
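<p>On the permission issue mentioned above: Home Assistant only writes snapshots to paths it is allowed to access. If files silently fail to appear in a custom folder, whitelisting that folder via <code>allowlist_external_dirs</code> in configuration.yaml may be all that's needed (a sketch; adjust the path to your own setup):</p><pre><code class="language-yaml">homeassistant:
  allowlist_external_dirs:
    - /config/www/timelapse
</code></pre>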
<!--kg-card-end: markdown--><p>Continue to step 2 where I <a href="https://blog.deltacode.be/2022/06/07/home-assistant-upload-to-azure-blob-storage/">upload files from Home Assistant to the cloud</a>.</p>]]></content:encoded></item><item><title><![CDATA[dotnet new templates]]></title><description><![CDATA[Where is the dotnet new templates folder on my drive? How do I update the dotnet new templates?]]></description><link>https://blog.deltacode.be/2020/10/13/dotnet-new-templates-path/</link><guid isPermaLink="false">5f8555271b45091b00a8e585</guid><category><![CDATA[.NET]]></category><category><![CDATA[NuGet]]></category><dc:creator><![CDATA[David De Sloovere]]></dc:creator><pubDate>Tue, 13 Oct 2020 07:31:05 GMT</pubDate><content:encoded><![CDATA[<h3 id="path">Path</h3><p>Where are the template packages stored for <code>dotnet new</code>? I had to run a good old file search on my C: drive for this. They are stored in a path under the user profile, per SDK version.</p><p>On Windows it's something like this:</p><p>C:\Users\&lt;username&gt;\.templateengine\dotnetcli\&lt;sdkversion&gt;\packages</p><p>%userprofile%\.templateengine\dotnetcli\v3.1.402\packages</p><p>In that folder you can also see the versions of the packages/templates.</p><figure class="kg-card kg-image-card"><img src="https://blog.deltacode.be/content/images/2020/10/image-5.png" class="kg-image" alt srcset="https://blog.deltacode.be/content/images/size/w600/2020/10/image-5.png 600w, https://blog.deltacode.be/content/images/2020/10/image-5.png 870w" sizes="(min-width: 720px) 720px"></figure><h3 id="updating-via-nuget-org">Updating via NuGet.Org</h3><p>To update your local templates that have NuGet.org as their source, you can use these commands to check and update all at once:</p><!--kg-card-begin: markdown--><p><code>dotnet new --update-check</code><br>
<code>dotnet new --update-apply</code></p>
<!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://blog.deltacode.be/content/images/2020/10/image-7.png" class="kg-image" alt srcset="https://blog.deltacode.be/content/images/size/w600/2020/10/image-7.png 600w, https://blog.deltacode.be/content/images/2020/10/image-7.png 994w" sizes="(min-width: 720px) 720px"></figure><h3 id="updating-via-custom-package-source">Updating via custom package source</h3><p>Unfortunately these commands won't work to update templates that were installed from a custom source, e.g. Azure Artifacts.</p><p>The only thing I have found to work is to install the template again. That will download a new version. An example for installing and updating(!) a custom template package:</p><!--kg-card-begin: markdown--><p>dotnet new --install &quot;fsociety.templates&quot; --nuget-source <a href="https://pkgs.dev.azure.com/fsociety/_packaging/feedname/nuget/v3/index.json">https://pkgs.dev.azure.com/fsociety/_packaging/feedname/nuget/v3/index.json</a></p>
<!--kg-card-end: markdown--><p>If you check the .templateengine folder, you'll see the latest .nupkg is there.</p><p>I don't think there is a solution to update all templates at once when they are installed from a custom source. If you know one, please share in the comments.</p>]]></content:encoded></item><item><title><![CDATA[Giphy CLI]]></title><description><![CDATA[A .NET Global tool to search for a gif on Giphy and optionally open the link in the browser or copy the link or markdown to the clipboard.]]></description><link>https://blog.deltacode.be/2020/10/11/giphy-cli/</link><guid isPermaLink="false">5f82b47b1b45091b00a8e563</guid><dc:creator><![CDATA[David De Sloovere]]></dc:creator><pubDate>Sun, 11 Oct 2020 07:35:57 GMT</pubDate><content:encoded><![CDATA[<p>I have published an update of my Giphy CLI tool with a 'copy to clipboard' option. After the search result is displayed, you can now copy the URL or markdown of the gif to the clipboard, or choose to open the giphy.com page in the browser.</p><p>If you have the .NET Core SDK installed, you can install it with the following command:</p><!--kg-card-begin: markdown--><p><code>dotnet tool install --global GiphyCli</code></p>
<!--kg-card-end: markdown--><figure class="kg-card kg-image-card kg-width-wide"><img src="https://blog.deltacode.be/content/images/2020/10/image-4.png" class="kg-image" alt srcset="https://blog.deltacode.be/content/images/size/w600/2020/10/image-4.png 600w, https://blog.deltacode.be/content/images/2020/10/image-4.png 983w"></figure><p><a href="https://www.nuget.org/packages/GiphyCli/">https://www.nuget.org/packages/GiphyCli/</a></p><p>Did you try the tool? Let me know in the comments.</p>]]></content:encoded></item><item><title><![CDATA[Tag sources on build in Azure YAML Pipelines]]></title><description><![CDATA[Tagging Git sources with build number after successful build with Azure Pipelines YAML.]]></description><link>https://blog.deltacode.be/2020/10/09/tag-source-on-build-in-azure-yaml-pipelines/</link><guid isPermaLink="false">5f8068f71b45091b00a8e4c6</guid><category><![CDATA[Pipelines]]></category><category><![CDATA[Azure DevOps]]></category><category><![CDATA[Git]]></category><category><![CDATA[ALM]]></category><dc:creator><![CDATA[David De Sloovere]]></dc:creator><pubDate>Fri, 09 Oct 2020 14:20:31 GMT</pubDate><media:content url="https://blog.deltacode.be/content/images/2020/10/carbon.png" medium="image"/><content:encoded><![CDATA[<img src="https://blog.deltacode.be/content/images/2020/10/carbon.png" alt="Tag sources on build in Azure YAML Pipelines"><p>With the classic UI builds, we could easily access the option to tag source with every build or successful build.</p><figure class="kg-card kg-image-card"><img src="https://blog.deltacode.be/content/images/2020/10/image.png" class="kg-image" alt="Tag sources on build in Azure YAML Pipelines"></figure><p>When using Azure YAML Pipelines, this is still possible, but you really have to go out of your way to get to that option. From the YAML edit screen you'll need to go to the Triggers option via the kebab menu icon. 
After this you'll find the Tag sources option in the same place as before.</p><figure class="kg-card kg-image-card"><img src="https://blog.deltacode.be/content/images/2020/10/image-1.png" class="kg-image" alt="Tag sources on build in Azure YAML Pipelines"></figure><h2 id="self-checkout">Self checkout</h2><p>However, there is an undocumented way of doing this straight in the YAML. <br>You'll need to add an explicit step to check out the code, and at the end of your build steps call the <code>git tag</code> command to add the (lightweight) tag from the command line, followed by a <code>git push</code>. Here's a slimmed-down version using PowerShell:</p><!--kg-card-begin: markdown--><pre><code class="language-yml">steps:
- checkout: self
  clean: true
  persistCredentials: true

# restore, build, test, pack and push go here

- powershell: |
    Write-Host &quot;Tagging Build: $env:BuildNumber&quot;

    git tag $env:BuildNumber
    git push origin $env:BuildNumber
  env:
    BuildNumber: $(Build.BuildNumber)
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'))
</code></pre>
<!--kg-card-end: markdown--><h2 id="permissions">Permissions</h2><!--kg-card-begin: markdown--><p>You need the Git 'GenericContribute' permission to perform this action.</p>
<!--kg-card-end: markdown--><p>The <code>persistCredentials</code> option is required to allow all steps after the checkout to have access to the auth token for the <code>git push</code> operation.</p><p>You'll also need to set the <em>Contribute</em> permission to <em>Allowed</em> for the Build Service user.</p><figure class="kg-card kg-image-card"><img src="https://blog.deltacode.be/content/images/2020/10/image-3.png" class="kg-image" alt="Tag sources on build in Azure YAML Pipelines" srcset="https://blog.deltacode.be/content/images/size/w600/2020/10/image-3.png 600w, https://blog.deltacode.be/content/images/size/w1000/2020/10/image-3.png 1000w, https://blog.deltacode.be/content/images/2020/10/image-3.png 1275w" sizes="(min-width: 720px) 720px"></figure><p>A big advantage of placing the tagging inline is that I can now <strong>use the same YAML file </strong>for CI builds on pull requests and CD builds for approved PRs that go into <code>master</code>.</p><p>More information about the checkout step: <a href="https://docs.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&amp;tabs=schema%2Cparameter-schema#checkout">https://docs.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&amp;tabs=schema%2Cparameter-schema#checkout</a></p><p>PS: You can use PowerShell Core if you need cross-platform support. Or switch to bash.</p>]]></content:encoded></item><item><title><![CDATA[git fetch over all subfolders with PowerShell]]></title><description><![CDATA[<p>Want to fetch or pull all subfolders of your repos directory with a single line of PowerShell?</p><figure class="kg-card kg-image-card"><img src="https://blog.deltacode.be/content/images/2020/09/image.png" class="kg-image" alt></figure><!--kg-card-begin: markdown--><pre><code class="language-powershell">gci | foreach { write-host $_.fullname; push-location $_; &amp; git fetch }
</code></pre>
<!--kg-card-end: markdown-->]]></description><link>https://blog.deltacode.be/2020/09/29/git-fetch-over-all-subfolders-with-powershell/</link><guid isPermaLink="false">5f777fee6d3795227c24a4e9</guid><dc:creator><![CDATA[David De Sloovere]]></dc:creator><pubDate>Tue, 29 Sep 2020 16:10:26 GMT</pubDate><content:encoded><![CDATA[<p>Want to fetch or pull all subfolders of your repos directory with a single line of PowerShell?</p><figure class="kg-card kg-image-card"><img src="https://blog.deltacode.be/content/images/2020/09/image.png" class="kg-image" alt></figure><!--kg-card-begin: markdown--><pre><code class="language-powershell">gci | foreach { write-host $_.fullname; push-location $_; &amp; git fetch }
</code></pre>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Update Portainer]]></title><description><![CDATA[<p>I initialize my container with 2 extra options that are not in the official instructions. Adding <code>--name</code> as <code>portainer</code> and configuring the <code>--restart</code> policy as <code>unless-stopped</code>.</p><pre><code>docker run --name portainer --restart=unless-stopped -d -p 8000:8000 -p 9000:9000 -v /var/run/docker.sock:/var/run/docker.sock -v portainer_</code></pre>]]></description><link>https://blog.deltacode.be/2019/08/09/update-portainer/</link><guid isPermaLink="false">5f777fee6d3795227c24a4e8</guid><dc:creator><![CDATA[David De Sloovere]]></dc:creator><pubDate>Fri, 09 Aug 2019 18:08:00 GMT</pubDate><content:encoded><![CDATA[<p>I initialize my container with 2 extra options that are not in the official instructions. Adding <code>--name</code> as <code>portainer</code> and configuring the <code>--restart</code> policy as <code>unless-stopped</code>.</p><pre><code>docker run --name portainer --restart=unless-stopped -d -p 8000:8000 -p 9000:9000 -v /var/run/docker.sock:/var/run/docker.sock -v portainer_data:/data portainer/portainer-ce
</code></pre><p>Updating to a new version is just 3 lines and 1 line to recreate the container.</p><pre><code>sudo docker pull portainer/portainer-ce
sudo docker stop portainer
sudo docker rm portainer

docker run --name portainer --restart=unless-stopped -d -p 8000:8000 -p 9000:9000 -v /var/run/docker.sock:/var/run/docker.sock -v portainer_data:/data portainer/portainer-ce
</code></pre><p>Thanks to the <code>-v</code> parameter we used to bind a volume, all data is stored on the host and all settings are preserved.</p><p><a href="https://docs.docker.com/engine/reference/run/">docker run reference</a><br><a href="https://www.portainer.io/installation/">portainer installation</a></p>]]></content:encoded></item><item><title><![CDATA[Update raspbian via SSH]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>4 steps after you SSH into your Raspberry Pi running raspbian:</p>
<pre><code>sudo apt update
sudo apt dist-upgrade
sudo apt clean
sudo reboot
</code></pre>
<p>This is the short version of <a href="https://www.makeuseof.com/tag/raspberry-pi-update-raspbian-os/">https://www.makeuseof.com/tag/raspberry-pi-update-raspbian-os/</a></p>
<!--kg-card-end: markdown-->]]></description><link>https://blog.deltacode.be/2019/08/02/update-raspbian-via-ssh/</link><guid isPermaLink="false">5f777fee6d3795227c24a4e7</guid><dc:creator><![CDATA[David De Sloovere]]></dc:creator><pubDate>Fri, 02 Aug 2019 15:28:38 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>4 steps after you SSH into your Raspberry PI running raspbian:</p>
<pre><code>sudo apt update
sudo apt dist-upgrade
sudo apt clean
sudo reboot
</code></pre>
<p>This is the short version of <a href="https://www.makeuseof.com/tag/raspberry-pi-update-raspbian-os/">https://www.makeuseof.com/tag/raspberry-pi-update-raspbian-os/</a></p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Pimp My PowerShell]]></title><description><![CDATA[<!--kg-card-begin: markdown--><h2 id="installscoop">Install Scoop</h2>
<p><a href="http://scoop.sh/">http://scoop.sh/</a><br>
<em>A command-line installer for Windows</em></p>
<p><code>iex (new-object net.webclient).downloadstring('https://get.scoop.sh')</code></p>
<h2 id="installanicerprompt">Install a nicer prompt</h2>
<p>Install pshazz via Scoop<br>
<em>Give your powershell some pizazz.</em></p>
<p><code>scoop install pshazz</code></p>
<h2 id="installconcfg">Install concfg</h2>
<p><a href="https://github.com/lukesampson/concfg">https://github.com/lukesampson/concfg</a><br>
<em>concfg is a utility to import and export</em></p>]]></description><link>https://blog.deltacode.be/2018/06/05/powershell-theming-and-git-info/</link><guid isPermaLink="false">5f777fee6d3795227c24a4e5</guid><category><![CDATA[PowerShell]]></category><dc:creator><![CDATA[David De Sloovere]]></dc:creator><pubDate>Tue, 05 Jun 2018 18:07:26 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><h2 id="installscoop">Install Scoop</h2>
<p><a href="http://scoop.sh/">http://scoop.sh/</a><br>
<em>A command-line installer for Windows</em></p>
<p><code>iex (new-object net.webclient).downloadstring('https://get.scoop.sh')</code></p>
<h2 id="installanicerprompt">Install a nicer prompt</h2>
<p>Install pshazz via Scoop<br>
<em>Give your powershell some pizazz.</em></p>
<p><code>scoop install pshazz</code></p>
<h2 id="installconcfg">Install concfg</h2>
<p><a href="https://github.com/lukesampson/concfg">https://github.com/lukesampson/concfg</a><br>
<em>concfg is a utility to import and export Windows console settings like fonts and colors.</em></p>
<p><code>scoop install concfg</code></p>
<p>This is not just for PowerShell but also for cmd.</p>
<h2 id="installtheme">Install theme</h2>
<p>Install the Solarized preset theme via concfg</p>
<p><code>concfg import solarized</code></p>
<h2 id="visualstudioide">Visual Studio IDE</h2>
<p>In Visual Studio you can install the <em>Open Command Line</em> extension. This will open a command line of your choice with the <code>CTRL+space</code> shortcut. Configure it to open PowerShell (instead of default cmd). Don't forget to restart VS.</p>
<p><a href="https://marketplace.visualstudio.com/items?itemName=MadsKristensen.OpenCommandLine">https://marketplace.visualstudio.com/items?itemName=MadsKristensen.OpenCommandLine</a></p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Find repos with unpushed commits using PowerShell]]></title><description><![CDATA[Here's a one-liner to find unpushed commits in all subfolders of a given folder.]]></description><link>https://blog.deltacode.be/2017/12/18/find-unpushed-commits-with-powershell/</link><guid isPermaLink="false">5f777fee6d3795227c24a4e4</guid><category><![CDATA[Git]]></category><category><![CDATA[PowerShell]]></category><dc:creator><![CDATA[David De Sloovere]]></dc:creator><pubDate>Mon, 18 Dec 2017 16:26:01 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>Want to format your hard drive to reinstall Windows? Make sure you have pushed all commits to the server using this one-liner.</p>
<p>Run this PowerShell command in the root folder of your repositories. The <code>cd ..</code> at the end is to go back into the root folder so you can easily repeat the command.</p>
<pre><code>C:\Repos&gt; Get-ChildItem -Directory | Where-Object { Test-Path &quot;$($_.FullName)\.git&quot; } | ForEach-Object { Write-Host $($_.Name) ; Set-Location $($_.FullName); Invoke-Expression &quot;git log --branches --not --remotes &quot;}; cd ..
</code></pre>
<p>Aliases can be used to make this even shorter, like using <code>gci</code> instead of <code>Get-ChildItem</code> and <code>iex</code> for <code>Invoke-Expression</code>. But we don't need to fit this in a tweet. Short aliases are OK, but not every user knows what they stand for, so I'm using the complete command for clarity.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Fix slow startup of ASP.NET MVC 5 on Azure App Services]]></title><description><![CDATA[I will show you how to speed up your website's startup time using view compilation. Because Azure App Service disk I/O is slow, this will make a big impact.]]></description><link>https://blog.deltacode.be/2017/01/08/fix-slow-startup-of-asp-net-mvc-5-on-azure-app-services/</link><guid isPermaLink="false">5f777fee6d3795227c24a4e3</guid><category><![CDATA[ALM]]></category><category><![CDATA[MVC]]></category><category><![CDATA[Azure]]></category><dc:creator><![CDATA[David De Sloovere]]></dc:creator><pubDate>Sun, 08 Jan 2017 11:16:37 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p><em>You've deployed your MVC 5 website to Azure App Service (via Team Services); with pride you visit the site and start clicking... But wait, it's so slow even though you've done everything right, and 'it works on my machine'. Maybe you even scaled up from the free or shared tier, but it's still too slow.</em></p>
<h3 id="introduction">Introduction</h3>
<p>I will show you how to speed up your website's startup time using view compilation. Because Azure App Service disk I/O is slow, this will make a big impact. It'll improve startup on other servers too. <strong>The homepage of my application now starts up in 13 seconds; before, it was close to 30 seconds!</strong> How I 'measured' this is for another post.</p>
<p>If you publish via Visual Studio, you only need to tick the correct checkboxes. You can read this excellent <a href="http://gunnarpeipman.com/2016/08/asp-net-mvc-precompiling-views/">blog post by Gunnar Peipman</a> for some more background information and instructions. But we all know that publishing from Visual Studio isn't the correct way, right? Friends don't let friends 'right-click publish'. Just rub some DevOps on it (TM Donovan Brown)!</p>
<h3 id="build">Build</h3>
<p>I'm using the Team Services build system for this project. It's easy to set up the initial build pipeline, because you can pick a template to start from. The one I used is Azure WebApp under Deployment.</p>
<p><img src="https://blog.deltacode.be/content/images/2017/01/2017-01-08-12_08_47-Start.png" alt></p>
<p>It has steps for NuGet restore, build, tests, deployment and publishing the artifact back to Team Services to store with your build results. The hosted build agent, free for up to 240 minutes a month, offers what I need.</p>
<p><img src="https://blog.deltacode.be/content/images/2017/01/2017-01-08-12_23_28-Start.png" alt></p>
<p>Instead of going through the edit build, execute build, check output loop on Team Services, I just opened a 'developer command prompt' that has msbuild.exe in the path and started looking for the right combination of parameters. Yes, our goal can be achieved just by adding MSBuild arguments.</p>
<p><em>Disclaimer</em></p>
<p>Builds will be slower, by about 30-40 seconds for me. But your site will start up faster because you've moved some of the workload from the web server to the build server. So it's worth it.</p>
<h3 id="step1">Step 1</h3>
<p>Add <strong>/p:MvcBuildViews=true</strong>, just to show you the difference between building views and precompiling views.</p>
<pre><code>msbuild.exe build.sln /p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:PackageLocation=&quot;C:\out\\&quot; /p:platform=&quot;any cpu&quot; /p:configuration=&quot;release&quot; /p:VisualStudioVersion=&quot;14.0&quot; /p:MvcBuildViews=true
</code></pre>
<p>This will build your views and give you compile-time errors for issues in your views. Say you mistype a variable name: you'd normally only see this at run time (after deployment). With this option, the build will fail and you won't have a deployment with an error in a view. <em>This will not make your site or pages load faster</em>.</p>
<h3 id="step2">Step 2</h3>
<p>Replace with <strong>/p:PrecompileBeforePublish=true</strong>.</p>
<pre><code>msbuild.exe build.sln /p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:PackageLocation=&quot;C:\out\\&quot; /p:platform=&quot;any cpu&quot; /p:configuration=&quot;release&quot; /p:VisualStudioVersion=&quot;14.0&quot; /p:PrecompileBeforePublish=true
</code></pre>
<p>This will not only build your views, but also put them into a dll. No more dynamic compilation by the server at runtime (aka at request time).</p>
<p><img src="https://blog.deltacode.be/content/images/2017/01/precompile-1.png" alt="Output of precompiled views"></p>
<p>For every view you'll get one dll. The more dlls the server has to load, the slower it gets. So let's merge them into a single dll.</p>
<h3 id="step3">Step 3</h3>
<p>Add <strong>/p:UseMerge=true /p:SingleAssemblyName=AppCode</strong></p>
<pre><code>msbuild.exe build.sln /p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:PackageLocation=&quot;C:\out\\&quot; /p:platform=&quot;any cpu&quot; /p:configuration=&quot;release&quot; /p:VisualStudioVersion=&quot;14.0&quot; /p:PrecompileBeforePublish=true /p:UseMerge=true /p:SingleAssemblyName=AppCode
</code></pre>
<p>All the App_Web dlls have been merged into a single dll. Also notice that the App_global.asax.dll file is gone now; it's been merged into AppCode too. In the build log you'll see calls to aspnet_compiler.exe and aspnet_merge.exe.</p>
<p><img src="https://blog.deltacode.be/content/images/2017/01/precompile-merge-1.png" alt></p>
<h3 id="cshtml">.cshtml</h3>
<p>What happened to the .cshtml files? They are still in the same place, but they have been changed. If you open one of the views you'll see that it contains only a placeholder text. This indicates that the views have been precompiled. It also makes clear that you no longer have the option to edit a view for a quick fix directly on the web server (precompiled, remember?). You'll have to follow your CI/CD flow (like a pro).</p>
<p><img src="https://blog.deltacode.be/content/images/2017/01/cshtml-placeholder.png" alt></p>
<p>You'll also find a .compiled file in the bin folder for every view. It contains XML that links the .cshtml file to the compiled class in the AppCode assembly file.</p>
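<p>For the curious, such a .compiled file looks roughly like this (the view name, type name and hashes below are illustrative, not from a real build):</p>

```xml
<?xml version="1.0" encoding="utf-8"?>
<preserve resultType="3" virtualPath="/Views/Home/Index.cshtml"
    hash="..." filehash="..." flags="110000"
    assembly="AppCode" type="ASP._Page_Views_Home_Index_cshtml" />
```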
<h3 id="summary">Summary</h3>
<p>Add this to the build parameters and you'll have an ASP.NET MVC 5 site that starts up faster, with every page loading faster on its first request.</p>
<p><code>/p:PrecompileBeforePublish=true /p:UseMerge=true /p:SingleAssemblyName=AppCode</code></p>
<p><img src="https://blog.deltacode.be/content/images/2017/01/2017-01-08-10_01_10-Edit-GamePrijzenWeb-Deploy.png" alt></p>
<p>Full list of msbuild arguments in Team Services build:</p>
<p><code>/p:PrecompileBeforePublish=true /p:UseMerge=true /p:SingleAssemblyName=AppCode /p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:PackageLocation=&quot;$(build.artifactstagingdirectory)\&quot;</code></p>
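<p>If you prefer to keep the build definition clean, the same properties can also live in a publish profile selected with <code>/p:PublishProfile</code>. A minimal sketch (the file and profile names are just examples):</p>

```xml
<!-- Properties\PublishProfiles\Release.pubxml (example name) -->
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <PrecompileBeforePublish>true</PrecompileBeforePublish>
    <UseMerge>true</UseMerge>
    <SingleAssemblyName>AppCode</SingleAssemblyName>
  </PropertyGroup>
</Project>
```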
<h5 id="protip">ProTip</h5>
<p>Keep your site alive by using a monitoring tool like <a href="http://www.uptimerobot.com">uptimerobot.com</a> or <a href="https://appbeat.io">appbeat.io</a> (which runs on .NET, I believe)! Their free tiers allow checks at a 5-minute interval, which should be enough to keep the site warm.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Why Remove and Add ExtensionlessUrlHandler?]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>I was starting a new ASP.NET MVC 5 project today, and like always was copy/pasting some code from a previous project. No need to reinvent anything if the work has been done... right?</p>
<p>In the web.config I saw the following under <code>&lt;system.webServer&gt;&lt;handlers&</code></p>]]></description><link>https://blog.deltacode.be/2017/01/04/why-remove-and-add-extensionlessurlhandler/</link><guid isPermaLink="false">5f777fee6d3795227c24a4e2</guid><category><![CDATA[.NET]]></category><category><![CDATA[MVC]]></category><category><![CDATA[ASP.NET]]></category><category><![CDATA[IIS]]></category><dc:creator><![CDATA[David De Sloovere]]></dc:creator><pubDate>Wed, 04 Jan 2017 17:52:55 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>I was starting a new ASP.NET MVC 5 project today, and like always was copy/pasting some code from a previous project. No need to reinvent anything if the work has been done... right?</p>
<p>In the web.config I saw the following under <code>&lt;system.webServer&gt;&lt;handlers&gt;</code></p>
<pre><code>&lt;system.webServer&gt;
    &lt;handlers&gt;
        ...
        &lt;remove name=&quot;ExtensionlessUrlHandler-Integrated-4.0&quot; /&gt;
        &lt;add name=&quot;ExtensionlessUrlHandler-Integrated-4.0&quot; path=&quot;*.&quot; verb=&quot;*&quot; type=&quot;System.Web.Handlers.TransferRequestHandler&quot; preCondition=&quot;integratedMode,runtimeVersionv4.0&quot; /&gt;
        ...
    &lt;/handlers&gt;
&lt;/system.webServer&gt;
</code></pre>
<p>Why would that even be there? Removing and adding again... There must be a point to this, right?</p>
<p>The default <code>ExtensionlessUrlHandler-Integrated-4.0</code> handler only registers with a <strong>small set of verbs: GET, HEAD, POST, DEBUG</strong>. So you can't use PUT or DELETE with an extensionless url if you don't replace the handler. Something you would want to do with a RESTful Web API. The new handler registers for all verbs, and <strong>that includes PUT and DELETE</strong>.</p>
<p><img src="https://blog.deltacode.be/content/images/2017/01/extensionlessurlhandler-1.png" alt="ExtensionlessUrlHandler-Integrated-4.0 in IIS"></p>
<p>There are a total of 3 <code>ExtensionlessUrlHandler</code>s. These are part of ASP.NET v4.0. On older machines you have to install a KB/QFE to enable MVC (and WebForms?) to handle extensionless URLs, which wasn't possible in the previous version. See this <a href="https://blogs.msdn.microsoft.com/tmarq/2010/05/26/how-extensionless-urls-are-handled-by-asp-net-v4/">link</a> for more information.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Using Azure Functions as glue between Octopus deploy and Microsoft Teams Channel]]></title><description><![CDATA[Post a message on a Microsoft Teams channel when Octopus Deploy has deployed an application in production, using the serverless Azure Function platform.]]></description><link>https://blog.deltacode.be/2016/11/17/azure-functions-octopus-deploy-and-microsoft-teams-channel/</link><guid isPermaLink="false">5f777fee6d3795227c24a4e1</guid><category><![CDATA[Azure]]></category><category><![CDATA[C#]]></category><dc:creator><![CDATA[David De Sloovere]]></dc:creator><pubDate>Thu, 17 Nov 2016 21:19:23 GMT</pubDate><media:content url="https://blog.deltacode.be/content/images/2016/11/2016-11-19-18_30_34-General--TryOut--_-Microsoft-Teams.png" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><img src="https://blog.deltacode.be/content/images/2016/11/2016-11-19-18_30_34-General--TryOut--_-Microsoft-Teams.png" alt="Using Azure Functions as glue between Octopus deploy and Microsoft Teams Channel"><p>In this post we'll use Azure Functions as a gateway that receives a payload from Octopus Deploy (after a deployment to production) and posts a notification to a Microsoft Teams channel.</p>
<p>The Azure Function code for this blogpost can be found at <a href="https://github.com/DavidDeSloovere/AzureFunction-OctopusToMicrosoftTeams">https://github.com/DavidDeSloovere/AzureFunction-OctopusToMicrosoftTeams</a></p>
<h2 id="introduction">Introduction</h2>
<p>A few weeks ago Octopus Deploy released version 3.5 of the amazing deployment tool. In this version they introduced 'Subscriptions'. This feature allows you to <em>subscribe</em> to all kinds of events via email or webhook. Together with the release they also published a post showcasing the use of 'Subscriptions' to send a notification of an Octopus Deploy event to a Slack channel via Zapier. Great idea. We use Octopus Deploy at Natch, but... we don't use Slack, and I don't want to use Zapier. We did activate Microsoft Teams recently; it comes for free with our Office 365 subscription and we are always looking to improve communication and workflow. So I got this idea:</p>
<blockquote>
<p>Mission: post a message on a Microsoft Teams channel when Octopus Deploy has deployed an application in production, with Azure Functions in the middle</p>
</blockquote>
<h2 id="whatisazurefunctions">What is Azure Functions</h2>
<p>With the announcement of Azure Functions going GA (General Availability) yesterday (2016-11-16), it seemed like the right time to write this blogpost. Azure Functions is Microsoft's implementation of the serverless computing model on top of Azure. Serverless computing is the latest evolution of the cloud; Amazon has had Lambda on AWS for two years already.</p>
<blockquote>
<p>Azure Functions is a solution for easily running small pieces of code, or &quot;functions,&quot; in the cloud.</p>
</blockquote>
<p>We've come a long way. It started with buying physical hardware and running servers on it (been there, done that). Then came virtualization: renting resources and running your own VMs (IaaS). That evolved into platforms like Azure Websites, where you don't own or maintain VMs anymore (PaaS). IaaS and PaaS are still valid models, but you're paying for those resources 24/7, even if your application only runs a few hours a day or is only visited during office hours.</p>
<p>With serverless computing a lot of this changes. You only pay when your code is executed. You only maintain your code. The underlying infrastructure is taken care of for you by the hosting provider.</p>
<p>Microsoft used the WebJobs platform as a starting point for Azure Functions. With the amazing Azure portal, the online editor code-named 'Monaco', and source control integration for CD, Azure Functions is set to be a great candidate when exploring options. A Function can be written in C# (with Roslyn under the hood) or in JavaScript (running on Node.js).</p>
<h2 id="whatismicrosoftteams">What is Microsoft Teams</h2>
<p>In short, Microsoft's answer to Slack. Released just a few weeks back (2016-11-02).</p>
<blockquote>
<p>Microsoft Teams, the new chat-based workspace in Office 365</p>
</blockquote>
<p>I believe Slack got a little bit scared, because they released this passive-aggressive 'Welcome Microsoft Teams' press release right after Microsoft's announcement.</p>
<h2 id="flow">Flow</h2>
<p>To set this all up, we have 3 parts:</p>
<ul>
<li>An Octopus Deploy Subscription that posts its payload to a webhook...</li>
<li>An Azure function that receives the payload, extracts the data and posts a message</li>
<li>A Microsoft Teams channel with an 'Incoming Webhook' connector</li>
</ul>
<h3 id="microsoftteams">Microsoft Teams</h3>
<p>We'll start at the last step of the flow, because we need the webhook url for the Azure Function.</p>
<p>In the channel, add a connector of the type 'Incoming Webhook'.</p>
<p><img src="https://blog.deltacode.be/content/images/2016/11/2016-11-17-22_15_50-General--TryOut--_-Microsoft-Teams.png" alt="Using Azure Functions as glue between Octopus deploy and Microsoft Teams Channel"></p>
<p>All you need is a name (e.g. Octopus Deploy) and you'll get back a webhook url under the outlook.office365.com domain. You might also want to upload the Octopus logo, because it will be displayed with the message. You'll need the webhook url in the next step, so keep it on the clipboard.</p>
<p><img src="https://blog.deltacode.be/content/images/2016/11/2016-11-17-22_17_52-General--TryOut--_-Microsoft-Teams.png" alt="Using Azure Functions as glue between Octopus deploy and Microsoft Teams Channel"></p>
<h3 id="azurefunction">Azure Function</h3>
<p>In the Azure portal, add a new <em>Function App</em>. This app allows you to add multiple 'functions'. We'll only need one, and we'll use 'GenericWebhookCSharp' as the starting point. This template is configured to trigger on an incoming HTTP request.</p>
<p>Under the Application Settings of the Function App, add a new 'app setting' named 'TeamsWebHookUrl' and paste the O365 webhook url as the value. This app setting will be available as an environment variable when we write our C# code. That way the url isn't hardcoded, and it doesn't have to go into source control.</p>
<p><img src="https://blog.deltacode.be/content/images/2016/11/2016-11-17-21_11_55-Application-settings---Microsoft-Azure.png" alt="Using Azure Functions as glue between Octopus deploy and Microsoft Teams Channel"></p>
<p>Copy the code from the GitHub project (which I also set up as a continuous integration source) or from this gist.</p>
<script src="https://gist.github.com/DavidDeSloovere/31f188235149f08d9a657b73643aef29.js"></script>
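<p>The function above is C#, but the same glue would be just as small in JavaScript. Below is a rough sketch of the idea only; the Octopus payload properties and the Teams message shape are simplified assumptions, not the exact schemas:</p>

```javascript
// Sketch: turn an Octopus Deploy subscription payload into a simple
// Teams message (property names here are simplified assumptions).
function buildTeamsMessage(octopusPayload) {
  const event = octopusPayload.Payload.Event;
  return {
    title: event.Category, // e.g. "DeploymentSucceeded"
    text: event.Message    // human-readable summary from Octopus
  };
}

// Post it to the Teams 'Incoming Webhook' connector; the url comes from
// the app setting, which surfaces as an environment variable.
async function notifyTeams(octopusPayload) {
  const response = await fetch(process.env.TeamsWebHookUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildTeamsMessage(octopusPayload))
  });
  return response.status;
}
```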
<p>Previously, without using a service like Zapier, I would have:</p>
<ul>
<li>created an ASP.NET MVC/WebApi project -&gt; too much work (lazy developer = good developer)</li>
<li>hosted it on a webserver (or azure) -&gt; would run all the time (consume too much resources, expensive)</li>
</ul>
<p><strong>The code is very short and will be called a few times a day at most. Which is a great fit for an Azure Function.</strong></p>
<p>Before we continue to the next step, copy the Function url from the Develop tab.</p>
<p><img src="https://blog.deltacode.be/content/images/2016/11/2016-11-17-21_36_50-Function-app---Microsoft-Azure.png" alt="Using Azure Functions as glue between Octopus deploy and Microsoft Teams Channel"></p>
<h3 id="octopusdeploy">Octopus Deploy</h3>
<p>To activate a 'Subscription' you can either click around in Octopus Deploy until you stumble upon it, or give up (yes, it's hidden very well). If you gave up, here is where you can find it: Configuration &gt; Audit &gt; Subscriptions (top right).</p>
<p>Create a new subscription. Give it a name, set filters and paste the Azure Function url in the 'Payload URL' field.</p>
<p><img src="https://blog.deltacode.be/content/images/2016/11/2016-11-17-21_57_05-ProductionDeployments---Octopus-Deploy.png" alt="Using Azure Functions as glue between Octopus deploy and Microsoft Teams Channel"></p>
<h3 id="finishingup">Finishing up</h3>
<p>Now let's see if our setup works:</p>
<ul>
<li>Trigger an Octopus Deploy event that matches the filters of the subscription you have set up.</li>
<li>In the Azure Portal, navigate to the monitor tab of your function, there is even a 'live event stream'</li>
<li>Within a few seconds, your Microsoft Teams channel will receive the message that was sent by Octopus Deploy.</li>
<li>Enjoy.</li>
</ul>
<p><img src="https://blog.deltacode.be/content/images/2016/11/2016-11-17-22_03_49-General--TryOut--_-Microsoft-Teams.png" alt="Using Azure Functions as glue between Octopus deploy and Microsoft Teams Channel"></p>
<h3 id="aboutwritingthefunction">About writing the function</h3>
<p>I started writing the code for the function in the Azure portal. The editor is pretty amazing, considering it's running in the browser. There's a handy 'test' feature where you can quickly simulate a request to the function (which is really more of a webhook). This allows you to iterate quickly without leaving the browser. When the code was done (is code ever done?) I moved it to a GitHub repo and set up the continuous integration option under the Function App settings. Now every commit to the master branch automatically updates the Azure Function. So easy. I hadn't tried Azure Functions before, nor had I used Octopus Deploy subscriptions or the Microsoft Teams webhook connector. Still, I was able to get this all running in under 2 hours. Writing this blog post was actually more work.</p>
<p>Don't hesitate to leave a comment below or fork the github repo.</p>
<h5 id="someinterestinglinks">Some interesting links:</h5>
<ul>
<li><a href="https://azure.microsoft.com/en-us/services/functions/">Azure Functions official site</a></li>
<li><a href="https://octopus.com/blog/subscriptions">Octopus Deploy blogpost - Octopus Deploy event to Slack channel</a></li>
<li><a href="https://www.troyhunt.com/azure-functions-in-practice/">Troy Hunt - Azure Functions in Practice</a></li>
<li><a href="https://slackhq.com/dear-microsoft-8d20965d2849#.px9xtvud9">Slack 'Welcomes' Microsoft Teams</a></li>
<li><a href="https://dev.outlook.com/Connectors/GetStarted">O365 Connector</a></li>
</ul>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Fix "VT-x is not available (VERR_VMX_NO_VMX)" Virtualbox error]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>I don’t know if it’s caused by new build of Win 10 or something else, but I started getting these errors when starting my Virtual box machines. Well not on all machine actually.</p>
<pre><code>Failed to open a session for the virtual machine Win7-VS2012.
VT-x is not available (VERR_</code></pre>]]></description><link>https://blog.deltacode.be/2015/10/21/vt-x-is-not-available-verr_vmx_no_vmx-result-code-e_fail-0x80004005/</link><guid isPermaLink="false">5f777fee6d3795227c24a4e0</guid><category><![CDATA[virtual box]]></category><dc:creator><![CDATA[David De Sloovere]]></dc:creator><pubDate>Wed, 21 Oct 2015 10:27:58 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>I don’t know if it’s caused by a new build of Win 10 or something else, but I started getting these errors when starting my VirtualBox machines. Well, not on all machines actually.</p>
<pre><code>Failed to open a session for the virtual machine Win7-VS2012.
VT-x is not available (VERR_VMX_NO_VMX).
Result Code: E_FAIL (0x80004005)
Component: ConsoleWrap
Interface: IConsole {872da645-4a9b-1727-bee2-5585105b9eed}
</code></pre>
<p>It took some time and digging to figure out that only the 64-bit VMs threw this error, not the 32-bit ones. After some interwebs clicking, I got to this <a href="https://derekgusoff.wordpress.com/2012/09/05/run-hyper-v-and-virtualbox-on-the-same-machine/">post by Derek Gusoff</a> where he describes how to run Hyper-V and VirtualBox on the same host.</p>
<p>Although he doesn’t mention the error, the root cause of his problem is the same: Hyper-V is greedy, claims the “VT-x” thingie at boot, and doesn’t share with VirtualBox at all.</p>
<p>The two solutions:</p>
<ul>
<li>Either remove Hyper-V from Windows (if you’re not using it, obviously) via ‘Turn Windows features on or off’</li>
<li>Edit the BCD so Windows starts with the Hyper-V hypervisor disabled (<code>hypervisorlaunchtype</code> set to off), as <a href="https://derekgusoff.wordpress.com/2012/09/05/run-hyper-v-and-virtualbox-on-the-same-machine/">the post</a> illustrates.</li>
</ul>
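<p>For the second option, the BCD change comes down to a single command from an elevated command prompt, followed by a reboot:</p>

```
:: Hand VT-x back to VirtualBox (reboot afterwards)
bcdedit /set hypervisorlaunchtype off

:: Later, to give it back to Hyper-V (reboot again)
bcdedit /set hypervisorlaunchtype auto
```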
<p><em>Edit May 2017: Docker for Windows requires Hyper-V.</em></p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Windows - fix "HTTP server doesn't seem to support byte ranges" Vagrant error]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p><a href="https://www.vagrantup.com/">Vagrant</a> is very popular in the Linux community, but it also runs on Windows (where it comes with its own set of problems I guess).</p>
<p>I was trying to set up some machine, but the boxes needed to be downloaded first. Unfortunately my machine went into sleep mode before the</p>]]></description><link>https://blog.deltacode.be/2015/10/20/vagrant-error-http-server-doesnt-seem-to-support-byte-ranges-on-windows/</link><guid isPermaLink="false">5f777fee6d3795227c24a4df</guid><category><![CDATA[vagrant]]></category><category><![CDATA[windows]]></category><dc:creator><![CDATA[David De Sloovere]]></dc:creator><pubDate>Tue, 20 Oct 2015 18:44:18 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p><a href="https://www.vagrantup.com/">Vagrant</a> is very popular in the Linux community, but it also runs on Windows (where it comes with its own set of problems I guess).</p>
<p>I was trying to set up a machine, but the boxes needed to be downloaded first. Unfortunately my machine went into sleep mode before the download finished.<br>
Calling <code>vagrant up</code> resulted in the following error:</p>
<pre><code>An error occurred while downloading the remote file. The error
message, if any, is reproduced below. Please fix this error and try
again.

HTTP server doesn't seem to support byte ranges. Cannot resume.
</code></pre>
<p>Thanks to <a href="http://branetheory.org/2014/12/06/2135/">this blog post</a> that describes how to fix this on Linux, I was able to figure out how to solve it on Windows. You just have to run this command to remove the temp directory:</p>
<pre><code>rmdir %VAGRANT_HOME%\tmp /S
</code></pre>
<p>As you can see, <code>%VAGRANT_HOME%</code> points to the directory where Vagrant stores its files.</p>
<p>If you want to change the Vagrant home directory after installing Vagrant, e.g. to another (larger) drive, you can run this command:</p>
<pre><code>setx VAGRANT_HOME Z:\VirtualMachines\VagrantHome /m
</code></pre>
<p>New to Vagrant? Here’s the <a href="https://www.vagrantup.com/downloads.html">Vagrant download page</a> (although you should install it via <a href="https://chocolatey.org/">Chocolatey</a>) and the <a href="https://docs.vagrantup.com/v2/getting-started/index.html">Getting Started</a> documentation.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item></channel></rss>