Simon Online

2021-07-08

Enable TeamCity Symbol Server

First off, a symbol server is a server which stores the symbols from a built binary so you don’t have to ship PDB files along with your compiled code to be able to debug it. You can hook up Visual Studio to search a symbol server when you’re debugging so that you can drop into code for something like a shared NuGet package. TeamCity, as it turns out, has a plugin to support being a symbol server. Here is how you get started with it:

  1. Install the symbol server plugin by going to Administration > Plugins > Browse plugins repository and searching for symbol
  2. On your build agents install the Windows debugging tools, which ship as part of the Windows SDK. For Windows 10 you can grab it here: https://developer.microsoft.com/en-us/windows/downloads/windows-10-sdk/ During the install you’ll be prompted for which components you want to install and you can turn off everything except the debugging tools.
  3. Remember to restart your build agents so they can register the debugging tools as being installed. You can check by going to the build agent in TeamCity and clicking on Parameters.

    In there, at the bottom, you should find an entry for the debugger.
  4. In the projects you want symbols for, enable the symbol server feature
  5. In the build artifacts you need to ensure that both the PDB and the associated EXE or DLL are selected as artifacts.

That’s pretty much it. In your builds you should now see a few log messages letting you know that the symbol server indexing is working.

Now you can hook up Visual Studio to use this by going into settings, searching for symbols, and pasting the URL of the TeamCity server with /app/symbols on the end of it (for example, https://your-teamcity-server/app/symbols) into the box.

Now when you’re debugging in visual studio you’ll have access to the symbols.

2021-07-06

Enable SSO for Snowflake using Azure AD

So you want to enable single sign on for your AD users to Snowflake? There are a bunch of good reasons to do this: it makes managing users easier, and deleting a user in AD deletes them in Snowflake so you don’t have a laundry list of places to remove them from when they leave.

The process has two sides: setting up the Snowflake integration on the AD side, and then letting Snowflake know where to authenticate its users.

Azure Side

  1. Go to Azure AD and click on Enterprise Applications on the left hand side
  2. Click New Application, search for Snowflake, select it, and create it
  3. In there set up the links to your Snowflake tenant for single sign on by selecting Single sign-on on the left
  4. Fill in the URLs for your Snowflake instance. The only thing that you really need to pay attention to is that you use the account name of your already created Snowflake instance.
  5. Download the Base64 certificate from the SAML Signing Certificate section
  6. Assign a test user to the Snowflake integration by clicking on Users and groups and adding an existing user

Snowflake Side

  1. Run this query in Snowflake. It adds a SAML identity provider and then sets up single sign on
use role accountadmin;
alter account set saml_identity_provider = '{
"certificate": "<Paste the content of downloaded certificate from Azure portal>",
"ssoUrl":"<Login URL value which you have copied from the Azure portal, something like https://login.microsoftonline.com/44xxxx25-xxxx-415b-bedc-xxxxxxxxxxxxxx/saml2>",
"type":"custom",
"label":"AzureAD"
}';
alter account set sso_login_page = TRUE;
  2. Hook up the user you created earlier in AD
    CREATE USER simon_timms PASSWORD = '' LOGIN_NAME = 'user@somedomain.com' DISPLAY_NAME = 'Simon Timms';
    

You should now be able to log in with your AD account. Open up an incognito tab and go to your snowflake instance. In there click on the SSO option and enter your AD credentials.

Automatic Provisioning

Obviously it sucks to provision users manually in Snowflake, so you can have AD sync changes over to it. To do this, start with Snowflake. You’ll need to create a role that can provision users and a SCIM integration for it.

create or replace role aad_provisioner;
grant create user on account to role aad_provisioner;
grant create role on account to role aad_provisioner;
grant role aad_provisioner to role accountadmin;
create or replace security integration aad_provisioning
    type = scim
    scim_client = 'azure'
    run_as_role = 'AAD_PROVISIONER';
select system$generate_scim_access_token('AAD_PROVISIONING');

This should give you a long key which you should copy.

Go back to the AD app and click on Provisioning. In there change over to automatic provisioning. Enter the token in the Secret Token field, and in the Tenant Url field enter your usual URL but this time with /scim/v2 on the end of it.
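The Tenant Url usually ends up looking something like this (swap in your own account name, mycompany here is just an example):

https://mycompany.snowflakecomputing.com/scim/v2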

Test the connection and ensure that it can connect properly. With that done you’ll need to turn the provisioning status on.

Adding Users to the Sync

If you want to add a new user to the sync then go back to the Snowflake app under Enterprise Applications in Azure AD. In there click on Users and groups.

Then click the add users and groups button. In there you can select your user and click Assign. That should be it. It may take a few minutes to sync. You can always check the status of the sync by going to the Provisioning item.

Gotchas!

The biggest one here is that the Snowflake key used in automatic provisioning only has a lifespan of 6 months. It is almost certainly going to break horribly at that time. You should mitigate this by having the sync job email you if it fails, which can be done in the settings page in Azure.

To get a new token you’ll need to log into Snowflake and run the following query:

select system$generate_scim_access_token('AAD_PROVISIONING');

This will generate a new token and you’ll need to copy it back into Azure. A gotcha inside a gotcha here is that running this command can only be done as ACCOUNTADMIN, so you need to switch to that role first.
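In a worksheet that ends up looking something like this (the integration name matches the one created above):

use role accountadmin;
select system$generate_scim_access_token('AAD_PROVISIONING');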

2021-06-24

Azure Automation

Azure Automation is a service that allows running small scripts to automate tasks inside Azure. For instance, if you want to scale a database up and down depending on the time of day, this is an ideal place to do it.

There are basically three concepts in it:

  1. Runbook - a script that you write and publish from within Azure Automation. The supported languages include Python (2 and 3!) and PowerShell. There is also a graphical builder which basically just runs PowerShell cmdlets.
  2. Jobs - executions of the runbook. These can take parameters and pass them off to a runbook. The job logs what it is doing but the logging is a bit sketchy. You should consider reviewing the JSON output to see exactly what went wrong with your job instead of relying on the UI.
  3. Schedule - You can kick off a job at any point in time using a schedule. Schedules allow passing parameters to the jobs.

Powershell Gotchas

For some reason, likely the typical Microsoft support of legacy software, the Azure modules included in PowerShell by default are the old AzureRM ones and not the newer, more awesome Az modules. You can go to the module gallery to install more modules.

However, a little problem with that is that the module installation process doesn’t handle dependencies, so if you want to install something like Az.Sql, which relies on Az.Accounts, then you need to go install Az.Accounts first. The installation takes way longer than you’d logically expect, so I sure hope you don’t need to install something like Az proper which has 40 dependencies.

Example Script

This script will scale a database to the desired level



Param(
 [string]$ResourceGroupName,
 [string]$ServerName,
 [string]$DatabaseName,
 [string]$TargetEdition,
 [string]$TargetServiceObjective
)

$connectionName = "AzureRunAsConnection"
try
{
    # Get the connection "AzureRunAsConnection "
    $servicePrincipalConnection=Get-AutomationConnection -Name $connectionName         

    "Logging in to Azure..."
    Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $servicePrincipalConnection.TenantId `
        -ApplicationId $servicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint 
}
catch {
    if (!$servicePrincipalConnection)
    {
        $ErrorMessage = "Connection $connectionName not found."
        throw $ErrorMessage
    } else{
        Write-Error -Message $_.Exception
        throw $_.Exception
    }
}


echo "Scaling the database"
Set-AzSqlDatabase -ResourceGroupName $ResourceGroupName -DatabaseName $DatabaseName -ServerName $ServerName -Edition $TargetEdition -RequestedServiceObjectiveName $TargetServiceObjective
echo "Scaling complete"
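You don’t have to wait for a schedule to try it out. Once the runbook is published you can kick off a job from your own machine with the Az.Automation module - a rough sketch, with made-up resource and runbook names you’d swap for your own:

# Start the runbook as a job, passing the same parameters the script expects
Start-AzAutomationRunbook `
    -ResourceGroupName "my-resource-group" `
    -AutomationAccountName "my-automation-account" `
    -Name "ScaleDatabase" `
    -Parameters @{
        ResourceGroupName      = "my-resource-group"
        ServerName             = "my-sql-server"
        DatabaseName           = "my-database"
        TargetEdition          = "Standard"
        TargetServiceObjective = "S3"
    }
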
2021-06-17

Getting started with Storybook and Vue

  1. Starting with an empty folder you can run
     npx sb init
    
  2. During the init you’ll be prompted for the project type - select vue
  3. If this is brand new then you’ll need to install vue. The template assumes you have it installed already.
     npm install vue vue-template-compiler
    
  4. Run storybook with

     npm run storybook
    

    This will get storybook running and you’ll be presented with the browser interface for it

Adding Vuetify

  1. In the project install vuetify
    npm install vuetify
    
  2. In the .storybook folder add a preview-head.html file. This will be included in the project template. Set the content to

     <link href="https://cdn.jsdelivr.net/npm/@mdi/font@4.x/css/materialdesignicons.min.css" rel="stylesheet">
     <link href="https://cdn.jsdelivr.net/npm/vuetify@2.x/dist/vuetify.min.css" rel="stylesheet">
    
  3. Create a new file called vuetify_storybook.js and add to it

import Vue from 'vue';
import Vuetify from 'vuetify'; // loads all components
import 'vuetify/dist/vuetify.min.css'; // all the css for components
import en from 'vuetify/es5/locale/en';

Vue.use(Vuetify);

export default new Vuetify({
    lang: {
        locales: { en },
        current: 'en'
    }
});
  4. In the .storybook folder open preview.js and add

     import { addDecorator } from '@storybook/vue';
     import vuetify from './vuetify_storybook';

     addDecorator(() => ({
         vuetify,
         template: `
             <v-app>
                 <v-main>
                     <v-container fluid>
                         <story/>
                     </v-container>
                 </v-main>
             </v-app>
         `,
     }));
    

    This will add Vuetify wrapping to the project. You can now just go ahead and use the components in your .vue files. Here is an example:

     <template>
         <div>
             <v-text-field dense label="User name" hint="You can use your email"></v-text-field>
             <v-text-field dense label="Password" hint="You need to use upper case and lower case"></v-text-field>
         </div>
     </template>
     <script>
     module.exports = {
         data: function () {
             return {
             userName: null,
             password: null,
             rememberMe: false,
             };
         },
         computed: {
             isValid: function () {
             return true;
             },
         },
     };
     </script>
    

    Networking

    If you’re using a service layer then you can shim that in to prevent making network calls. However that might not be what you want to do, so you can instead shim in something to intercept all network calls. This can be done using the mock service worker addon https://storybook.js.org/addons/msw-storybook-addon

    To get it working install it

     npm i -D msw msw-storybook-addon
    

    Then to the preview.js file you can add a hook for it

     import { initializeWorker, mswDecorator } from 'msw-storybook-addon';
    
     initializeWorker();
     addDecorator(mswDecorator);
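
    Then individual stories can declare the network calls they expect via the msw parameter. This is a rough sketch - the exact shape depends on the addon version, and SomeStory, the /api/user endpoint, and the response are all made up for illustration:

     import { rest } from 'msw';

     SomeStory.parameters = {
         msw: [
             rest.get('/api/user', (req, res, ctx) =>
                 res(ctx.json({ userName: 'simon' }))
             ),
         ],
     };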
    
2021-06-16

Quick Noda Time Conversions

Noda Time makes working with timezones, well, not a snap but better than dental surgery.

Convert a DateTime and TzDB Timezone to UTC

A TzDB timezone is one that looks like America/Edmonton or, one might presume, Mars/OlympusMons.

DateTimeZone timezone = DateTimeZoneProviders.Tzdb.GetZoneOrNull(timezoneId);
ZoneLocalMappingResolver customResolver = Resolvers.CreateMappingResolver(Resolvers.ReturnLater, Resolvers.ReturnStartOfIntervalAfter);
var localDateTime = LocalDateTime.FromDateTime(dateTime);
var zonedDateTime = timezone.ResolveLocal(localDateTime, customResolver);
return zonedDateTime.ToDateTimeUtc();

Convert from a UTC to a zoned DateTime

 var local = new LocalDateTime(dateTime.Year, dateTime.Month, dateTime.Day, dateTime.Hour, dateTime.Minute, dateTime.Second);
var tz = DateTimeZoneProviders.Tzdb[timeZoneID];
return local.InZoneLeniently(tz);

But be careful with this one because it might produce weird results around time change periods. If you want to avoid ambiguity, or at least throw an exception for it, consider InZoneStrictly.
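A minimal sketch of the stricter version, using the same dateTime and timeZoneID assumptions as above:

var local = new LocalDateTime(dateTime.Year, dateTime.Month, dateTime.Day, dateTime.Hour, dateTime.Minute, dateTime.Second);
var tz = DateTimeZoneProviders.Tzdb[timeZoneID];
// Throws AmbiguousTimeException or SkippedTimeException around DST transitions
return local.InZoneStrictly(tz);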

2021-06-11

Installing Fonts on Windows with Powershell

You’d like to think that in 2021 installing a font would involve just copying it and some advanced AI system would notice it and install it on Windows. Again, the future has failed us.

Let’s say you have a folder of TTF fonts you need to install. Just copying them to the c:\windows\fonts directory won’t work. You need to copy them with a magic COM command that is probably left over from when file names in Windows looked like PROGRA~1. I’ve seen some scripts which add the font to the Windows registry but I didn’t have much luck getting them to work and they feel fragile should Microsoft ever update font handling (ha!).

Here is a script that will copy over all the fonts in the current directory.

echo "Install fonts"
$fonts = (New-Object -ComObject Shell.Application).Namespace(0x14)
foreach ($file in gci *.ttf)
{
    $fileName = $file.Name
    if (-not(Test-Path -Path "C:\Windows\fonts\$fileName" )) {
        echo $fileName
        dir $file | %{ $fonts.CopyHere($_.fullname) }
    }
}
cp *.ttf c:\windows\fonts\

The fonts don’t seem to get installed using the same file name as they arrive with, so that last cp line puts the original files in the fonts directory; that way you can run this script multiple times and it will only install the new fonts. If you wanted to get cool you could check a checksum and reinstall fonts where the checksum doesn’t match - a sketch of that is below. Don’t bother trying to use CopyHere with the flag 0x14 thinking it will overwrite fonts. That doesn’t work for the font directory.
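Here is a rough, untested sketch of the checksum idea. It reuses the $fonts object from the script above and assumes the copy kept in c:\windows\fonts is the thing to compare against:

foreach ($file in gci *.ttf)
{
    $installed = "C:\Windows\fonts\$($file.Name)"
    # Reinstall when the font is missing or its contents have changed
    $changed = (-not (Test-Path $installed)) -or ((Get-FileHash $file.FullName).Hash -ne (Get-FileHash $installed).Hash)
    if ($changed) {
        echo "Installing $($file.Name)"
        $fonts.CopyHere($file.FullName)
        cp $file.FullName $installed -Force
    }
}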

If you want to check and see which fonts are visible to .NET on the system then you can try

[void] [System.Reflection.Assembly]::LoadWithPartialName("System.Drawing")
(New-Object System.Drawing.Text.InstalledFontCollection).Families
2021-06-07

Transport for Azure Service Bus

There are two transport mechanisms for Service Bus:

  • AMQP
  • AMQP over web sockets

The default is to use plain AMQP but this uses port 5671. Often this port may be blocked by firewalls. You can switch over to using the web socket based version which uses port 443 - much more commonly open already on firewalls.

.NET Code

You just need to update the TransportType in the Service Bus setup

var client = new ServiceBusClient(Configuration["ServiceBusConnection"], new ServiceBusClientOptions
{
    TransportType = ServiceBusTransportType.AmqpWebSockets
});

Azure Functions

The simplest way of getting websockets to work on functions is to update the connection string to mention it

Endpoint=sb://someendpoint.servicebus.windows.net/;SharedAccessKeyName=SenderPolicy;SharedAccessKey=asecretkey;TransportType=AmqpWebSockets
2021-06-07

Add user to role in SQL Server

This can be done with

sp_addrolemember @rolename = 'role', @membername = 'security_account'

example

sp_addrolemember @rolename = 'db_owner', @membername = 'evil_hacker_account'

another example

sp_addrolemember @rolename = 'db_datareader', @membername = 'datafactory'

and another

sp_addrolemember @rolename = 'db_datawriter', @membername = 'asca_webapp'
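
Note that sp_addrolemember is deprecated these days; on current versions of SQL Server and on Azure SQL the same thing can be done with ALTER ROLE, roughly like

alter role db_datareader add member datafactory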

The built-in database roles are:

  • db_owner - Members can perform all configuration and maintenance activities on the database, and can also drop the database in SQL Server. (In SQL Database and Azure Synapse, some maintenance activities require server-level permissions and cannot be performed by db_owners.)
  • db_securityadmin - Members can modify role membership for custom roles only and manage permissions. Members of this role can potentially elevate their privileges and their actions should be monitored.
  • db_accessadmin - Members can add or remove access to the database for Windows logins, Windows groups, and SQL Server logins.
  • db_backupoperator - Members can back up the database.
  • db_ddladmin - Members can run any Data Definition Language (DDL) command in a database.
  • db_datawriter - Members can add, delete, or change data in all user tables.
  • db_datareader - Members can read all data from all user tables and views. User objects can exist in any schema except sys and INFORMATION_SCHEMA.
  • db_denydatawriter - Members cannot add, modify, or delete any data in the user tables within a database.
  • db_denydatareader - Members cannot read any data from the user tables and views within a database.

2021-06-03

Sequences

Sequences are a handy feature in SQL Server which provide an increasing, unique number. You wouldn’t typically use them directly but might use them under the covers in an identity. However, from time to time they are useful when you need numbers but your primary key is a uniqueidentifier, or you need two different ways of numbering records. I’ve been using them to associate records in a table into groups.

create SEQUENCE Seq_PermitNumber 
    start with 1 
    increment by 1

You can then use them like this

update tblManualPayment 
   set PermitNumber = next value for Seq_PermitNumber 
 where PermitNumber is null

This will give each record a unique permit number.
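If you want new rows numbered automatically you can also hang the sequence off a default constraint - a quick sketch reusing the table and sequence from above (the constraint name is just made up):

alter table tblManualPayment
    add constraint DF_tblManualPayment_PermitNumber
    default (next value for Seq_PermitNumber) for PermitNumber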

2021-05-20

Using Durable Entities

Durable entities are basically blobs of state that are stored somewhere (probably table storage). You can retrieve them and signal them with changes. They can be tied directly into standard Azure functions.

You build one as pretty much a POCO that looks like

[JsonObject(MemberSerialization.OptIn)]
public class DuplicatePreventer
{
    [JsonProperty("value")]
    public int CurrentValue { get; set; }

    public void Add(int amount) => this.CurrentValue += amount;

    public void Reset() => this.CurrentValue = 0;

    public int Get() => this.CurrentValue;

    [FunctionName(nameof(DuplicatePreventer))]
    public static Task Run([EntityTrigger] IDurableEntityContext ctx)
        => ctx.DispatchAsync<DuplicatePreventer>();
}

In this example there is one piece of state: the CurrentValue. You can retrieve it using the Get() function. Add and Reset are other signals you can send to the state.

Using it in a function involves adding a client to the signature of the function like so

[FunctionName("ShopifyPurchaseWebhook")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
    [DurableClient] IDurableEntityClient client,
    ILogger log)
{
        ...
}

Once you have the client you can retrieve an existing state by specifying an entityId and then getting it from the client

var entityId = new EntityId(nameof(DuplicatePreventer), webhook.order_number.ToString());
var duplicationPreventionEntity = await client.ReadEntityStateAsync<DuplicatePreventer>(entityId);

This gets you back a wrapper which includes properties like EntityExists and EntityState.

You can signal changes in the entity through an unfortunate interface that looks like

await client.SignalEntityAsync(entityId, "Add", 1);

That’s right, strings are back in style.

Gotchas

If you create the durable entity in your function and then request its value right away you won’t get the correct value - you just get null. I’d bet they are using some sort of outbox model that only sends data updates at the end of the function execution.
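A rough sketch of the surprise, using the same client and entityId as in the function above:

// Signal the entity and then immediately try to read it back
await client.SignalEntityAsync(entityId, "Add", 1);
var state = await client.ReadEntityStateAsync<DuplicatePreventer>(entityId);
// state.EntityExists may still be false and state.EntityState null here;
// the signal is queued and only gets processed after the function finishes.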