2020-12-31

Sensitivity classification and bulk load

This problem has caught me twice now, and both times it cost me hours of painful debugging to track down. I have some data that I bulk load into a database on a regular basis, about 100 executions an hour. It had been working flawlessly, but then I got a warning from SQL Azure that some of my columns didn’t have a sensitivity classification on them. I believe this feature is designed to annotate columns as containing financial or personal information.

The docs don’t mention much about what annotating a column as sensitive actually does, except that it requires some special permissions to access the data. However, once we applied the sensitivity classification, bulk loading against the database stopped working, and it failed with only the error “Internal connection error”. That error led me down all sorts of paths that weren’t correct. The problem even happened on my local dev box, where the account attached to SQL Server had permission to do anything and everything. Once the sensitivity classification was removed, everything worked perfectly again.
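
For reference, classifications are applied and removed with T-SQL along these lines; the table and column names here are just placeholders. Dropping the classification is what got bulk loading working again for me:

-- apply a classification (placeholder table/column names)
add sensitivity classification to dbo.Customers.Email
with (label = 'Confidential', information_type = 'Contact Info');

-- remove it again, which is what fixed my bulk loads
drop sensitivity classification from dbo.Customers.Email;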

Unfortunately, I was caught by this thing twice; hence the blog post, so that next time I google this I’ll find my own article.

2020-12-03

Running DbUp commands against master

I ran into a little situation today where I needed to deploy a database change that created a new user on our database. We deploy all our changes using the fantastic tool DbUp so I figured the solution would be pretty easy, something like this:

use master;
create login billyzane with password='%XerTE#%^REFGK&*^reg5t';

However, when I executed this script, DbUp reported that it was unable to write to the SchemaVersions table. This is a special table in which DbUp keeps track of the change scripts it has applied. Of course it was unable to write to that table: the table lives in the original database, and the script had just switched the connection over to master. My databases have different names in different environments (dev, prod, …), so I couldn’t just add another use statement at the end to switch back to the original database, because I didn’t know what it was called.

Fortunately, I already have the database name in a variable used for substitution against the script in DbUp. The code for this looks like this:

var dbName = connectionString.Split(';')
    .Where(x => x.StartsWith("Database") || x.StartsWith("Initial Catalog"))
    .Single()
    .Split('=')
    .Last();
var upgrader =
    DeployChanges.To
        .SqlDatabase(connectionString)
        .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly(), s =>
          ShouldUseScript(s, baseline, sampleData))
        .LogToConsole()
        .WithExecutionTimeout(TimeSpan.FromMinutes(5))
        .WithVariable("DbName", dbName)
        .Build();

var result = upgrader.PerformUpgrade();
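
As an aside, if you’d rather not split the connection string by hand, SqlConnectionStringBuilder will normalize both the Database and Initial Catalog keywords for you; a minimal sketch, assuming System.Data.SqlClient (or Microsoft.Data.SqlClient) is referenced:

using System.Data.SqlClient;

// InitialCatalog is populated whether the string says "Database=" or "Initial Catalog="
var dbName = new SqlConnectionStringBuilder(connectionString).InitialCatalog;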

With the DbName variable in place, I was able to change my script to:

use master;
create login billyzane with password='%XerTE#%^REFGK&*^reg5t';
use $DbName$;

Which ran perfectly. Thanks, DbUp!

2020-11-20

Updating SMS factor phone number in Okta

Hello, and welcome to post number 2 today about the Okta API! In the previous post we got into resending the SMS to the same phone number already in the factor. However, people sometimes change their phone number, and you have to update it for them. To do this you’ll want to reverify their phone number. Unfortunately, if you simply try to enroll a new SMS factor, Okta will return an error saying that an existing factor of that type already exists.

You actually need to follow a three-step process:

  1. Find the existing factors
  2. Remove the existing SMS factor
  3. Re-enroll the factor, being sure to specify updatePhone (sketched below)
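
Before the jump, here is a rough sketch of those three steps from nodejs. It assumes node-fetch and an SSWS API token; the org URL, environment variable, and function names are my own placeholders, not anything from the post:

import fetch from 'node-fetch'; // 2020-era node has no global fetch

const oktaBase = 'https://your-org.okta.com'; // placeholder org URL
const headers = {
  Authorization: `SSWS ${process.env.OKTA_API_TOKEN}`,
  'Content-Type': 'application/json',
};

async function updateSmsPhoneNumber(userId: string, phoneNumber: string) {
  // 1. Find the existing factors
  const res = await fetch(`${oktaBase}/api/v1/users/${userId}/factors`, { headers });
  const factors = (await res.json()) as Array<{ id: string; factorType: string }>;
  const sms = factors.find(f => f.factorType === 'sms');

  // 2. Remove the existing SMS factor
  if (sms) {
    await fetch(`${oktaBase}/api/v1/users/${userId}/factors/${sms.id}`, {
      method: 'DELETE',
      headers,
    });
  }

  // 3. Re-enroll, passing updatePhone=true so Okta accepts the new number
  await fetch(`${oktaBase}/api/v1/users/${userId}/factors?updatePhone=true`, {
    method: 'POST',
    headers,
    body: JSON.stringify({
      factorType: 'sms',
      provider: 'OKTA',
      profile: { phoneNumber },
    }),
  });
}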

Read More

2020-11-20

Resending an SMS verification code in Okta from nodejs

The title of this post seems really long, but I suppose that’s fine because it does address a very specific problem I had. I needed to resend an SMS to somebody running through Okta account creation in our mobile application. We use a nodejs (I know, I’m not happy about it either) backend on lambda (and the hits keep coming). We don’t take people through the external, browser-based login because we wanted the operation to be seamless inside the app.

The verification screen looks a little like this (in dark mode)

okta-resend-sms-nodejs
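
The full walkthrough is behind the Read More link, but the heart of it is Okta’s factor resend endpoint. A minimal sketch; the org URL, token handling, and exact request body are my assumptions rather than anything from the post:

import fetch from 'node-fetch'; // 2020-era node has no global fetch

// POST /api/v1/users/{userId}/factors/{factorId}/resend triggers a fresh SMS
async function resendSmsCode(userId: string, factorId: string) {
  await fetch(
    `https://your-org.okta.com/api/v1/users/${userId}/factors/${factorId}/resend`,
    {
      method: 'POST',
      headers: {
        Authorization: `SSWS ${process.env.OKTA_API_TOKEN}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({ factorType: 'sms', provider: 'OKTA' }),
    }
  );
}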

Read More

2020-11-18

Allocating a Serverless Database in SQL Azure

I’m pretty big on the SQL Azure Serverless SKU. It allows you to scale databases up and down automatically within a band between 0.75 and 40 vCores on Gen5 hardware. It also supports auto-pausing, which can shut down the entire database during periods of inactivity. I’m provisioning a bunch of databases for a client and we’re not sure what performance tier is going to be needed. Eventually we may move to an elastic pool, but initially we wanted to allocate the databases in a serverless configuration so we can ascertain a performance envelope. We wanted to allocate the resources in a terraform template but had a little trouble figuring it out.
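
The full template is after the jump, but for flavour, the shape of a serverless allocation with the azurerm provider’s azurerm_mssql_database resource looks something like this; the names and values here are my own placeholders:

resource "azurerm_mssql_database" "example" {
  name                        = "exampledb"
  server_id                   = azurerm_mssql_server.example.id
  sku_name                    = "GP_S_Gen5_4"   # serverless general purpose, Gen5, 4 max vCores
  min_capacity                = 0.75            # bottom of the autoscale band
  auto_pause_delay_in_minutes = 60              # -1 disables auto-pause
}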

Read More

2020-11-05

Azure Processor Limits

Ran into a fun little quirk in Azure today. We wanted to allocate a pretty beefy machine, an M32ms. The problem was that, for the region we were looking at, it wasn’t showing up on our list of VM sizes. We checked and there were certainly VMs of that size available in the region; we just couldn’t see them. So we ran the command

az vm list-usage --location "westus" --output table

And that returned a bunch of information about the quota limits we had in place. Sure enough, in there we had

Name                               Current Value   Limit
Standard MS Family vCPUs           0               0
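
Incidentally, if the full table is too noisy, you can filter it down with a JMESPath query; something like this, matching on the display name since I don’t remember the internal quota name offhand:

az vm list-usage --location "westus" --query "[?contains(name.localizedValue, 'MS Family')]" --output table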

We opened a support request to increase the quota on that VM family. We also had a weirdly low limit on total vCPUs in the region

Total Regional vCPUs               0               10

Support fixed that for us too, and we were then able to create the VM we were looking for.

2020-11-05

Querying Across Databases In SQL Azure

I seem to be picking up a few projects lately which require migrating data up to Azure SQL from an on-premises database. One of the things that people tend to do when they have on-premises databases is query across databases or link servers together. It is a really tempting prospect to be able to query the orders database from the customers database. There are, of course, numerous problems with taking this approach, not the least of which is that it makes it very difficult to change the database schema. We have all heard that it is madness to integrate applications at the database level, and that’s one of the reasons.

Unfortunately, whacking developers with a ruler and making them rewrite their business logic to observe proper domain boundaries isn’t always on the cards. This is a problem when migrating them to SQL Azure because querying across databases, even ones on the same server, isn’t permitted.

Broken query across databases

This is where the new Elastic Query comes in. I should warn at this point that the functionality is still in preview, but it’s been in preview for a couple of years, so I think it is pretty stable. I feel a little bit disingenuous describing it as “new” now, but it is new to me. Using it is pretty easy and doesn’t even require the Azure portal.

Let’s imagine that you have two databases, one of which contains a collection of Products and a second of which contains a list of Orders whose line items reference just the product id. Your mission is to query and get a list of orders and the product name. To start, we can set up a couple of databases. I called mine testias and testias2, and I had them both on the same instance of SQL Azure, but you don’t have to.

Two databases on the same server

Product Database

create table Products( 
id uniqueidentifier primary key default newid(),
name nvarchar(50));

-- explicit ids so the order line items below can reference them
insert into Products(id, name) values('3829A43D-FD2A-4B7C-9A09-23DBF030C1DC', 'socks');
insert into Products(id, name) values('233BC430-BA3F-4F5C-B3EA-4B82867FC040', 'gloves');
insert into Products(id, name) values('95A20D82-EC26-4769-8840-804B88630A01', 'hats');

Orders Database

create table orders(id uniqueidentifier primary key default newid(),
date date);

create table orderLineItems(id uniqueidentifier primary key default newid(),
orderId uniqueidentifier,
productId uniqueidentifier,
quantity int,
foreign key (orderId) references orders(id));

declare @orderID uniqueidentifier = newid();
insert into orders(id, date)
values(@orderID, '2020-11-01');

insert into orderLineItems(orderId, productId, quantity) values(@orderID, '3829A43D-FD2A-4B7C-9A09-23DBF030C1DC', 10);
insert into orderLineItems(orderId, productId, quantity) values(@orderID, '233BC430-BA3F-4F5C-B3EA-4B82867FC040', 1);
insert into orderLineItems(orderId, productId, quantity) values(@orderID, '95A20D82-EC26-4769-8840-804B88630A01', 2);

set @orderID = newid();
insert into orders(id, date)
values(@orderID, '2020-11-02');

insert into orderLineItems(orderId, productId, quantity) values(@orderID, '3829A43D-FD2A-4B7C-9A09-23DBF030C1DC', 16);
insert into orderLineItems(orderId, productId, quantity) values(@orderID, '233BC430-BA3F-4F5C-B3EA-4B82867FC040', 99);
insert into orderLineItems(orderId, productId, quantity) values(@orderID, '95A20D82-EC26-4769-8840-804B88630A01', 0);

Now we need to hook up the databases to be able to see each other. We’re actually just going to make products visible from the orders database. It makes more sense to me to run these queries in the database which contains the most data to minimize how much data needs to cross the wire to the other database.

So first up, we need to tell the Orders database about the credentials needed to access the remote products database. To do this we need to use a SQL account on the products database; Windows accounts and integrated security don’t currently work for this.
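
If the SQL account doesn’t exist on the products side yet, creating it looks something like this (the login name matches the credential below and the password is obviously a placeholder; the login is created against master, the user against the products database):

-- against master on the products server
create login ProductsDBUser with password = 'wouNHk41l9fBBcqadwWiq3ert';

-- against the products database (testias)
create user ProductsDBUser for login ProductsDBUser;
alter role db_datareader add member ProductsDBUser;

With that in place, back in the Orders database we store the credentials: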

create master key encryption by password = 'monkeyNose!2';
create database scoped credential ProductDatabaseCredentials 
with identity = 'ProductsDBUser', 
secret = 'wouNHk41l9fBBcqadwWiq3ert';

Next we set up an external data source for the products

create external data source ProductsSource with 
(type=RDBMS, location = 'testias.database.windows.net', 
database_name = 'testias', credential = ProductDatabaseCredentials);

Finally we create a table definition in the Orders database that matches the remote table (without any defaults or constraints).

create external table Products(
id uniqueidentifier,
name nvarchar(50))
with (data_source = ProductsSource);
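
As far as I know, the same with clause also accepts schema_name and object_name options, so if the remote table had a different name or schema than your local definition you could map it explicitly:

create external table Products(
id uniqueidentifier,
name nvarchar(50))
with (data_source = ProductsSource,
schema_name = 'dbo',
object_name = 'Products');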

We now have a products table in the external tables section in the object explorer

Tables from both databases

We can query the external table and even join it against the tables in this database

select name, ol.quantity
from orderLineItems ol
inner join Products p on ol.productId = p.id;

name    quantity
socks   16
socks   10
gloves  1
gloves  99
hats    2
hats    0

So it is possible to run queries across databases in Azure, but it takes a little setup and a little bit of thought about which database should own the external table.

Possible Gotchas

  • I forgot to configure the firewall to allow Azure resources to talk to the database, so I had to go back and add that
  • Inserting into the external table isn’t supported, which is good; make the changes directly in the source database

2020-09-28

The trimStart rabbit hole

I was bragging to David about a particularly impressive piece of TypeScript code I wrote last week:

if (body.trim().startsWith('<')) { //100% infallible xml detection

He, rightly, pointed out that trimStart would probably be more efficient. Of course it would! However, when I went to make that change, there were only trim, trimLeft and trimRight in my TypeScript auto-complete drop-down.

TrimStart and TrimEnd are missing

Odd. This was for sure a real function because it appears in the MDN docs.

A reasonable person would have used trimLeft and moved on but it was Monday and I was full of passion for programming. So I went down the rabbit hole.

Checking out the TypeScript directory in my node_modules, I found quite a few definition files in there. These are the definition files that describe the JavaScript language itself rather than any libraries. Included in that bunch was one called lib.es2019.string.d.ts, which contains the changes that were made to the language in ES2019.

interface String {
    /** Removes the trailing white space and line terminator characters from a string. */
    trimEnd(): string;

    /** Removes the leading white space and line terminator characters from a string. */
    trimStart(): string;

    /** Removes the leading white space and line terminator characters from a string. */
    trimLeft(): string;

    /** Removes the trailing white space and line terminator characters from a string. */
    trimRight(): string;
}

So I must be targeting the wrong thing! Sure enough, in my tsconfig.json I was targeting es5 on this project. When we started, this project was using an older version of node on lambda that didn’t have support for more recent versions of ES. I checked, and the lambda was running node 12.18.3; support for ES2020 landed in node 12.9, so I was good to move up to es2020 as a target.
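
For what it’s worth, the change in tsconfig.json is just the target (the rest of the compilerOptions are omitted here):

{
  "compilerOptions": {
    "target": "es2020"
  }
}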

Incidentally you can check the running node version in JavaScript by running

console.log('Versions: ' + JSON.stringify(process.versions));

After updating my tsconfig.json and restarting the language server, all was right in the world.

The missing functions appear

2020-07-11

Azure Functions for NServiceBus

A while back I blogged about Durable Functions vs. NServiceBus Sagas. At the time I had some good chats with Sean Feldman, and one of the ideas he tossed around was the possibility of running NServiceBus on top of Azure Functions. I snorted at the idea because starting up NServiceBus is kind of a heavyweight process. There is assembly scanning and DI setup to do which, in my experience, takes a good second or two. It wouldn’t be cost effective to run all that startup code on Functions. But Sean set out to prove me wrong.

Read More