I am a self-taught developer with knowledge of Azure, C, C++, C#, TypeScript/JavaScript, HTML and CSS, along with several databases. Having worked in IT for over 30 years, and co-founded and sold a successful company, I am now working on products for the TV industry. Follow me on Twitter @coderanger, Mastodon @coderanger@dotnet.social and GitHub.

PowerShell Tips

As I am forever forgetting things, I thought this would be the best place to put these tips, for myself and anyone else.

Renaming files in a folder

Get-ChildItem -Filter "*1x1.png" | Rename-Item -NewName {$_.name -replace '1x1.png','1x1-light.png' }

(Add -Recurse to the Get-ChildItem call to recurse through sub-folders.)

Deleting files in a folder

Get-ChildItem -Filter "*1x1.png" | Remove-Item

ASP.NET Core Onboarding Woes

I thought I would finally do a bit of dabbling in ASP.NET Core and boy, the onboarding experience is something, isn't it?

Firstly, I decided on a server-side rendered experience instead of a SPA, and as I love TypeScript, that is what I will be using for any client-side scripting.

I had already decided on supporting only modern browsers, and I don't want to complicate things with a slow and painful bundling experience using Rollup or webpack, which makes debugging awful.

ASP.NET Core

So the first issue I noticed was the ugly URLs; capitalised URL parts … really, urgh! I understand this is based on the file names, but I don't want to rename the files to all lowercase, as that's not the .NET way.

After a fair amount of digging, this is resolved with a routing option in your Startup.cs ConfigureServices method:

services.AddRouting( options =>
{
  options.LowercaseUrls = true;
} );

Why this is not the default, I have no idea!

Typescript

Ok, so now onto adding in TypeScript; again, why is this not already set up in the default templates? Maybe then they would have resolved all the pain points that took me many hours of messing about to work around.

Also, as I only want to target modern browsers (Edge, Firefox, Chrome, Safari), I want to be able to use the latest features like modern ES module support and so on.

Ok, so this took me a while of mucking about and working around, but it seems that once you add the TypeScript MSBuild package, Visual Studio 2019 automatically looks for and finds tsconfig.json files. However, what I wanted (and is normal) is to have separate production and development configurations, so that production does not include comments or map files.

After trying csproj conditions (which didn't work and gave build errors) and extending files in separate folders (which also didn't work), the only solution I found (so far) was the following setup, which I am not against, albeit it's not ideal:

  • tsconfig.base.json - this contains my base options: include/exclude directories, module settings, etc.
  • tsconfig.debug.bak - this extends the base and contains options specific to debug (see below)
  • tsconfig.release.bak - like the debug.bak but with release options

tsconfig.base.json

{
  "compilerOptions": {
    "target": "ES2020",
    "module": "ES2020",
    "moduleResolution": "Classic",
    "lib": [ "DOM", "ES2020" ],
    "noImplicitAny": true,
    "noEmitOnError": true,
    "alwaysStrict": true,
    "outDir": "wwwroot/js",
    "allowUmdGlobalAccess": true,
    "forceConsistentCasingInFileNames": true
  },
  "include": [
    "scripts/**/*"
  ],
  "exclude": [
    "wwwroot/lib/**/*",
    "wwwroot/js/**/*"
  ]
}

tsconfig.debug.bak

{
  "extends": "./tsconfig.base.json",
  "compilerOptions": {
    "removeComments": false,
    "sourceMap": true
  }
}

tsconfig.release.bak

{
  "extends": "./tsconfig.base.json",
  "compilerOptions": {
    "removeComments": true,
    "sourceMap": false
  }
}

The last piece of this little puzzle is to set a Pre-Build Event that copies the debug or release file over tsconfig.json based on the current configuration:

del "tsconfig.json"
copy "tsconfig.$(ConfigurationName).bak" "tsconfig.json"
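If you want to sanity-check the swap mechanism outside of Visual Studio, here's the same idea sketched with Unix shell equivalents of del/copy (the file contents and the hard-coded Debug configuration are just placeholders standing in for $(ConfigurationName)):

```shell
# Sketch of the pre-build config swap in a throwaway directory
cd "$(mktemp -d)"
printf '{ "sourceMap": true }\n'  > tsconfig.Debug.bak
printf '{ "sourceMap": false }\n' > tsconfig.Release.bak

CONFIG=Debug                              # stands in for $(ConfigurationName)
rm -f tsconfig.json                       # del "tsconfig.json"
cp "tsconfig.$CONFIG.bak" tsconfig.json   # copy "tsconfig.$(ConfigurationName).bak" ...

cat tsconfig.json                         # now holds the Debug options
```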

All the above now allows you to have modern TypeScript using imports in an ASP.NET Core project.

The only caveat (and again, it's odd that there is no option for this) is that your import statements need .js appended to the module name. This works both when TypeScript compiles and in the browser. You also need to include the main script as a module.

Here are some examples.

app.ts

export class App {
  constructor() {
  }

  public startup() {
    // Initialise and start our application
  }
}

site.ts

import { App } from './app.js';

$(document).ready(() => {
  const app = new App();
  app.startup();
});

_Layout.cshtml

<script src="~/js/site.js" asp-append-version="true" type="module"></script>

I hope this helps someone who might be discovering the same points as me.

Now available as a free Visual Studio Extension

Forcing Semantic Release Compatible Commits

We use the excellent semantic-release system for our Node projects at work; it's great for automating version numbers and change logs.

The way it works is that you format your commit messages in a particular way to mark the changes as a 'fix', 'feature' or 'breaking change'. When semantic-release is run, it determines which version numbers need increasing and also generates a change log.

However, it's easy to format the message incorrectly, or forget completely, which will cause all sorts of issues when you try to do a release.

So I created a simple git hook to check my message format:

#!/bin/bash
# bash is required: the script uses arrays and [[ ]]

# Config options

min_length=4
max_length=50
types=("feat" "fix" "perf")

# End config options


regexpstart="^("
regexp="${regexpstart}"

for type in "${types[@]}"
do
  if [ "$regexp" != "$regexpstart" ]; then
    regexp="${regexp}|"
  fi
  regexp="${regexp}$type"
done

regexp="${regexp})(\(.+\))?: "
regexp="${regexp}.{$min_length,$max_length}$"

function print_error() {
  echo -e "\n\e[1m\e[31m[INVALID COMMIT MESSAGE]"
  echo -e "------------------------\033[0m\e[0m"
  echo -e "\e[1mValid types:\e[0m \e[34m${types[@]}\033[0m"
  echo -e "\e[1mMax length (first line):\e[0m \e[34m$max_length\033[0m"
  echo -e "\e[1mMin length (first line):\e[0m \e[34m$min_length\033[0m\n"
}

# get the first line of the commit message
INPUT_FILE=$1
START_LINE=$(head -n1 "$INPUT_FILE")

if [[ ! $START_LINE =~ $regexp ]]; then
  # commit message is invalid according to semantic-release conventions
  print_error
  exit 1
fi
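To illustrate what the hook accepts, here's the equivalent pattern checked against a few sample messages (the pattern is reconstructed standalone here with the same defaults: types feat/fix/perf, optional scope, subject between 4 and 50 characters):

```shell
# Standalone version of the pattern the hook builds above
regexp='^(feat|fix|perf)(\(.+\))?: .{4,50}$'

check() {
  if [[ $1 =~ $regexp ]]; then
    echo "valid:   $1"
  else
    echo "invalid: $1"
  fi
}

check "feat(parser): add YAML support"   # valid: type, scope and subject
check "fix: handle empty commit body"    # valid: the scope is optional
check "updated some stuff"               # invalid: no type prefix
check "feat: abc"                        # invalid: subject shorter than 4 chars
```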

You can save the above script as commit-msg into the .git/hooks/ folder of any existing repo (remember to make it executable); or you can run the following commands to set things up for every new repo you init in the future:

git config --global init.templatedir '~/.git-templates'
mkdir -p ~/.git-templates/hooks
cp commit-msg ~/.git-templates/hooks
chmod +x ~/.git-templates/hooks/commit-msg

Using nvmrc on Windows

Unfortunately, nvm use on Windows does not switch the Node version to the one specified in the `.nvmrc` file, as this is not supported by nvm for Windows.

So the easiest solution is a simple PowerShell command that approximates the behaviour: switch to the version specified and, if it isn't already installed, download and install it first:

nvm use $(Get-Content .nvmrc).replace( 'v', '' );

However, that's a bit awkward, so we can go a step further and create an 'alias' for a function that wraps the command:

function callnvm() {
  $versionDesired = $(Get-Content .nvmrc).replace( 'v', '' );
  $response = nvm use $versionDesired;
  if ($response -match 'is not installed') {
    if ($response -match '64-bit') {
      nvm install $versionDesired x64
    } else {
      nvm install $versionDesired x86
    }
    nvm use $versionDesired;
  }
}
Set-Alias nvmu -value "callnvm"

Now we only need to type nvmu in a project folder for it to work properly.

However, this will only work for the current session, so to make it useful for any project and every session, we can add this content to the PowerShell profile for the current user.

You can get the location of this file by typing $profile in a PowerShell session; then edit or create the profile file and place the content shown above into it.

Converting an Azure Pipeline with Task Groups to YAML

We have a fairly complex pipeline which builds, tests and deploys our ASP.NET MVC app to an Azure WebApp in an App Service Environment. Because we have several high-profile customers, we actually deploy the app to a separate web app for each customer 'instance', so they have database and application isolation.

Because each customer instance is identical except for some App Settings pointing to a separate database, deployment is the same apart from the web app location. Currently we have a Task Group with parameters setting the name of the instance (for the task display name), the app location and the staging URL so we can run tests.

I would prefer to use the new YAML pipelines for this app, so it's easier to add new customer 'instances' in the future and we can source-control the pipeline.

After some investigation, I discovered I can pass parameter 'objects' into a template YAML file to do pretty much what I want; the only tricky bit was having multiple properties per instance parameter 'object' and using the new template ${{ each }} expression.

Below is how I constructed my YAML files for this solution.

azure_pipelines.yml

pool:
  name: Hosted VS2017
  demands:
  - npm
  - msbuild
  - visualstudio
  - vstest

steps:
- template: azure_webapp_template.yml
  parameters:
    webapps:
    - name: Customer 1
      url: customer1.azurewebsites.net
    - name: Customer 2
      url: customer2.azurewebsites.net
    - name: Customer 3
      url: customer3.azurewebsites.net
    - name: Customer 4
      url: customer4.azurewebsites.net

As you can see above, we are creating a webapps object with some nested properties for each 'webapp'.

Then, in our 'template', we can iterate over each of the objects in the webapps parameter and expand the properties in our iterated tasks.

azure_webapp_template.yml

# Proving ability to loop over params a number of times

parameters:
- name: 'webapps'
  type: object
  default: {}

steps:
- ${{ each webapp in parameters.webapps }}:

  - task: PowerShell@2
    displayName: 'Task Group Test 1 ${{webapp.name}}'
    inputs:
      targetType: 'inline'
      script: |
        Write-Host "Name: ${{webapp.name}} with url ${{webapp.url}}"
      failOnStderr: true
      workingDirectory: '$(Build.SourcesDirectory)'

  - task: PowerShell@2
    displayName: 'Task Group Test 2 ${{webapp.name}}'
    inputs:
      targetType: 'inline'
      script: |
        Write-Host "Name: ${{webapp.name}} with url ${{webapp.url}}"
      failOnStderr: true
      workingDirectory: '$(Build.SourcesDirectory)'

I hope this is of some use to others.

Azure Front-End Timeout

I experienced a particularly frustrating issue over the weekend with our Azure WebApp, and it took quite a while to find the cause.

Basically, one of the functions of our WebApp generates Word and Excel documents with images and text; these can be pretty large and, with images, can take anywhere from 1 minute to 10 minutes to create.

What we suddenly started experiencing with a new customer was that, at 4 minutes, the Ajax call would complete with an error containing a small HTML fragment:

<html><head><title>500 - The request timed out.</title></head><body><font color="#aa0000"><h2>500 - The request timed out.</h2></font>The web server failed to respond within the specified time.</body></html>

This was odd, as the server was still busy downloading images and generating the documents in the background … but how could that be, when the request had timed out and the response had been closed?

None of this happened when running locally, so after a lot of investigation we concluded it wasn't our code and was an Azure issue.

What we ended up doing was responding with a 'heartbeat' (in our case a carriage return) and flushing the buffer, then changing our jQuery Ajax call to move our client finalisation code from the 'success' event to the 'complete' event.

This worked.

Azure support have just got back to me and said that this is 'by design': the PaaS front ends will kill the request with a 'timeout' error after 240 seconds. This timeout period cannot be adjusted, and the solution is to re-architect the code.

I will now look into an improved architecture, possibly using WebJobs and polling to take this generation out of the request pipeline, which is a far better solution <inherited code disclaimer>.

Getting Office365 Planner Now

I was excited about Office365 Planner, but was dismayed to learn that it will only be released in Preview to select customers in Q4 of this year.

Urgh, then why announce it?! I will have long forgotten about it by then.

Anyway, I was taking a look at my apps and found the Planner icon on my Apps list, but only when in the Sway app. So you can use it now! Unless they take it away again soon.

Adding changes to the previous commit in Git

I end up needing to do this all the time, but thankfully it's very easy to do; just stage the extra changes as normal and amend the commit.

Stage your missed changes:

git add .

Then just --amend the commit:

git commit --amend -m "Add new commit message which overwrites the previous one"
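As a quick sanity check, here's the whole flow in a throwaway repository. Note that if you only want to fold in the staged changes and keep the existing message, --amend --no-edit does exactly that:

```shell
# Demo in a disposable repo: --amend folds staged changes into the last commit
cd "$(mktemp -d)"
git init -q .
git config user.email demo@example.com
git config user.name  Demo

echo "first" > file.txt
git add . && git commit -q -m "feat: add file"

echo "missed" >> file.txt            # the change we forgot
git add .
git commit -q --amend --no-edit      # keep the original message

git log --oneline | wc -l            # still a single commit
```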