2015-01-24

CSS Animated Confirmations

I’ve been playing a little bit with changing how my app acknowledges that actions are running and have completed. I use a system, from time to time, that gives no feedback while an action is running. It drives me up the wall and puts me right back in my Computer Science 301 class with the teacher talking about the importance of feedback. Rapid feedback is what makes the difference between an application feeling snappy and feeling slow. Even if two actions take the same time to complete, the system that does something to indicate it is working right away will make users feel far better.

So, knowing that my users are more and more on modern browsers, I thought I would give some animation a try.



This was my first attempt and I’m going to put it into the field and see what people think. It is implemented using the CSS animation property. I start with some keyframe definitions:

  @keyframes save-successful-animation {
      0%  { background-color: white; }
      25% { background-color: #DFF2BF; }
      75% { background-color: #DFF2BF; }
      100% { background-color: white; }
  }

I’ve listed the un-prefixed version here but you’ll need to add the -moz, -webkit or -ms prefixes as required for the browsers you support. You could also use a CSS preprocessor to do that for you (check out http://bourbon.io/). The animation here changes the background colour to a light green, holds it for half of the animation, and then turns it back to white.

Next we need to apply the style to our element

.saveSuccessful {
    animation: save-successful-animation 3s forwards;
}

And finally hook up some JavaScript to trigger it on click

$(".trigger").click(function(event){
          var target = $(event.target).siblings(".target");
          if(target.hasClass("saveSuccessful"))
          {
              var replacement = target.clone(true);
            target.before(replacement);
            target.remove();
            }
          else{
              target.addClass("saveSuccessful"); 
          }
      });

I remove and re-insert the element if the class already exists, as that is the easiest way to restart the animation (see http://css-tricks.com/restart-css-animation/).
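
A lighter-weight trick that gets passed around for the same problem is to remove the class, force a reflow, and then add the class back. A minimal sketch of that approach (plain DOM rather than jQuery, using the class name from this example):

function restartSaveAnimation(element) {
    // Removing the class alone isn't enough; reading offsetWidth forces a
    // reflow so the browser treats the re-added class as a fresh animation.
    element.classList.remove("saveSuccessful");
    void element.offsetWidth;
    element.classList.add("saveSuccessful");
}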

Now when users click the button they get a nifty little animation while the save is going on. I’ve ignored failure conditions here, but this is already going to be a big win for my users.
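
To tie the animation to an actual save you would add the class when the request starts and deal with the result when it completes. A rough sketch, assuming a hypothetical /save endpoint and the markup from the demo above:

$(".trigger").click(function(event){
    var target = $(event.target).siblings(".target");
    target.addClass("saveSuccessful"); // give feedback immediately

    // Hypothetical endpoint and payload; a real version would swap in a
    // failure animation here instead of just removing the class.
    $.post("/save", { id: target.data("id") })
        .fail(function(){
            target.removeClass("saveSuccessful");
        });
});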

<style>

@-moz-keyframes save-successful-animation {
    0%  { background-color: white; }
    25% { background-color: #DFF2BF; }
    75% { background-color: #DFF2BF; }
    100% { background-color: white; }
}

@-ms-keyframes save-successful-animation {
    0%  { background-color: white; }
    25% { background-color: #DFF2BF; }
    75% { background-color: #DFF2BF; }
    100% { background-color: white; }
}

@-webkit-keyframes save-successful-animation {
    0%  { background-color: white; }
    25% { background-color: #DFF2BF; }
    75% { background-color: #DFF2BF; }
    100% { background-color: white; }
}

@keyframes save-successful-animation {
    0%  { background-color: white; }
    25% { background-color: #DFF2BF; }
    75% { background-color: #DFF2BF; }
    100% { background-color: white; }
}

.saveSuccessful {
    -moz-animation: save-successful-animation 3s forwards;
    -webkit-animation: save-successful-animation 3s forwards;
    -o-animation: save-successful-animation 3s forwards;
    animation: save-successful-animation 3s forwards;
}
</style>

<script>
    $(function(){
        $(".trigger").click(function(event){
            var target = $(event.target).siblings(".target");
            if(target.hasClass("saveSuccessful")){
                var replacement = target.clone(true);
                target.before(replacement);
                target.remove();
            }
            else {
                target.addClass("saveSuccessful");
            }
        });
    });
</script>
2015-01-22

Getting Bower Components in Gulp

I’m embarking on an adventure in which I update the way my work project handles JavaScript. Inspired by David Paquette’s blog, I’m moving to gulp and dropping most of the built-in bundling and minification stuff from ASP.net. This is really just part of a big yak shaving effort to try out using React on the site. I didn’t expect it to turn into this huge effort but it is really going to result in a better solution in the end. It is also another step on the way to aligning how we in the .net community develop JavaScript with the way the greater web development community develops JavaScript. It is also the way that the next version of ASP.net will handle these tasks.

One of the many yaks that need shaving is moving many of my JavaScript dependencies to bower. Bower is to JavaScript as NuGet is to .net, CPAN is to Perl or gem is to Ruby: it is a package manager. There is often some confusion between bower and npm as both are package managers for JavaScript. I like to think of npm as the package manager for build infrastructure and things that run on node, whereas bower handles packages that are sent over the wire to my clients.

So on a normal project you might have jquery, underscore/lodash and require.js all installed by bower.

We would like to bundle up all of these bower components into a single minified file along with all our other site-specific JavaScript. We can use gulp, a build tool for JavaScript, to do that. Unfortunately bower packages may contain far more files than you actually need, and there doesn’t seem to be a good standard in place for describing which file should be loaded into your bundle. Some projects include a bower.json file that defines a main file to include. This is an excerpt from the bower.json file for the flux library:

{
  "name": "flux",
  "description": "An application architecture based on a unidirectional data flow",
  "version": "2.0.2",
  "main": "dist/Flux.js",
  ...

Notice the main file listed there. If we could read the bower.json files from all our bower packages then we could figure out which files to include. There is, as with all things gulp, a plugin for that. You can install it by running

npm install --save-dev main-bower-files

Now you can reference this from your gulp file by doing

var mainBowerFiles = require('main-bower-files');

You can plug this task into gulp like so

var concat = require('gulp-concat');  // assumes gulp-concat has been installed via npm
var uglify = require('gulp-uglify');  // assumes gulp-uglify has been installed via npm

gulp.task('3rdpartybundle', function(){
  return gulp.src(mainBowerFiles())
    .pipe(uglify())
    .pipe(concat('all.min.js'))
    .pipe(gulp.dest('./Scripts/'));
});
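
If you want the bundle rebuilt automatically you can hang this task off a default task and a watch. A minimal sketch using the task name from above (the watched path is just a guess at what your project might look like):

gulp.task('default', ['3rdpartybundle']);

gulp.task('watch', function(){
  // Rebuild the third party bundle whenever the bower configuration changes.
  gulp.watch('bower.json', ['3rdpartybundle']);
});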

From time to time you might find a package that fails to properly specify a main file. This is a bit annoying and certainly something you should consider fixing and submitting back to the author. To work around it you can specify an override in your own bower.json file.

"dependencies": {
    "marty": "~0.8.3",
    "react": "~0.12.2"
  },
  "overrides": {
    "superagent":{
      "main": "superagent.js"
    }
  }

Great! Okay, now what if you have an ordering dependency? Perhaps you need to load requirejs as the last thing in your bundle. This can be done through the gulp-order plugin. Start with npm and install the plugin:

npm install --save-dev gulp-order

Again you’ll need to specify the package inside the gulp file

var order = require('gulp-order');

Now you can plug the ordering into the build pipeline

gulp.task('3rdpartybundle', function(){
  return gulp.src(mainBowerFiles({paths: {bowerJson: 'bower.json', bowerDirectory: 'bower_components'}}))
    .pipe(order(["*react*", "*requirejs*"]))
    .pipe(uglify())
    .pipe(concat(config.all3rdPartyFile))
    .pipe(gulp.dest('./Scripts/'));
});

The ordering plugin doesn’t, at the time of writing, support matching the last rule so making something appear last is harder than making it appear first. There is, however, a pull request out to fix that.

This is really just scratching the surface of the nifty stuff you can get up to with gulp. I’m also building TypeScript files, transpiling ES6 to ES5 and linting my JavaScript. It’s a brave new world!
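
To give a flavour of that, a lint task is only a few more lines. This is just a sketch assuming the gulp-jshint plugin and a made-up source path:

var jshint = require('gulp-jshint');

gulp.task('lint', function(){
  return gulp.src('./Scripts/app/**/*.js')
    .pipe(jshint())
    .pipe(jshint.reporter('default'));
});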

2015-01-15

Importing On-Premise SQL to SQL Azure

Microsoft has done a great job building Azure and SQL Azure. However, one of the places where I feel like they have fallen down is how to get your on-premise data up into Azure. It isn’t that there isn’t a good way, it is that there are a million different ways to do it. If you look at the official MSDN entry there are 10 different approaches. How are we to know which one to use? I think we could realistically reduce the number to two methods:

  1. Export and import a data-tier application
  2. Synchronize with an on-premise database

The first scenario is what you want in most instances. It will require downtime for your application as any data created between exporting your database and importing it will not be seen up in Azure SQL. It is a point in time migration.

If you have a zero-downtime requirement, or your database is so large that it will take an appreciable period of time to export and import, then you can use data sync. This will synchronize your database to the cloud and you then simply need to switch over to using your up-to-date database in the cloud.

This article is going to be about method #1.

The first step is to find the database you want to export in SQL Server Management Studio

Database selected

Now under Tasks select Export Data-tier Application. This will create a .bacpac file that is portable to pretty much any SQL Server.

Select export data-tier application

Here you can export directly to a blob storage container on Azure. I’ve blanked out the storage account and container here for SECURITY. I just throw the backup in an existing container I have for moving data about. If you don’t have a container just create a new storage account and put a container in it.


The export may take some time depending on the size of your database, but the nice part is that the bacpac file is uploaded to Azure as part of the process so you can just fire and forget the export.

Now jump over to the Azure management portal. I like to use the older portal for this as it provides some more fidelity when it comes to finding and selecting your bacpac file. Here you can create a new database from an export. Unfortunately there doesn’t seem to be a way to apply an import to an existing database. That’s probably not a very common use case anyway.

Create database

Here we can specify the various settings for the new database. One thing I would suggest is to use a more performant database tier than you might usually. It will speed up the import and you can always scale down afterwards.

Import settings

Now you just need to wait for the import to finish and you’ll have a brand new database on Azure complete with all your data.

Import complete

2015-01-09

Missing Constraints in ParkPlus

Parking in Calgary is always a bit of an adventure, as it is in most North American cities. All the city-owned parking and even some private lots are managed by a company called The Calgary Parking Authority, a name that is wholly reminiscent of some Orwellian nightmare. Any organization that gives out parking tickets is certain to be universally hated, so I feel for anybody who works there.

What sets the Calgary Parking Authority apart is that they have developed some rather nifty technology to police the parking. They call it ParkPlus and it comes in a couple of parts. The first is a series of signs around town with unique numbers on them.

Zone sign

These identify the parking zone you’re in. A zone might span a single city block or it might cover a large multi-storey parking lot. The next part is the parking terminal or kiosk.

Parkplus kiosk

These solar-powered boxes act as data entry terminals. They are scattered around the city, with one per block in most places. Into them you enter your license plate number and the zone number along with some form of payment.

The final piece of the puzzle is a fleet of camera equipped vehicles that drive up and down the streets taking pictures of license plates.

Camera car

If the license plate is in a zone for which payment is required and no payment has been made then they mail you out a ticket. Other than driving the vehicles there is almost no human involvement on their side. It is actually a really good application of technology to solve a problem.

Unless something goes wrong, that is.

I used one of these things a few weeks ago and it told me that my form of payment was unrecognized. The error seemed odd to me as VISA is pretty well known. So I tried MasterCard, same deal. American Express? Same deal. Diners Club? Haha, who has one of those? “This kiosk”, I thought, “is dead”. Now the customer-friendly approach here would be for the Calgary Parking Authority to admit that if a kiosk is broken you don’t have to pay. So of course they do the opposite and make you walk to another kiosk. In my case, 5 other kiosks. Every kiosk told me the same thing. Eventually I gave up and fed it cash.

A few weeks pass and I get my VISA statement. Lo and behold, there are multiple charges for Calgary parking for the same time period, for the same license plate, in the same zone. This is a condition that should not be permitted. When charging the credit card it is trivial to check the database to see if there is already a parking window that would cover the new transaction. If one exists, and the new window is not substantially longer than the current window, then the charge should not be applied.
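
The check I have in mind is only a few lines of logic. A purely hypothetical sketch, in JavaScript for illustration (I have no idea what their actual system looks like):

// Given the paid parking windows already recorded for this plate and zone,
// decide whether a new charge should actually be applied.
function shouldCharge(existingWindows, requested) {
    var alreadyCovered = existingWindows.some(function(w) {
        return w.plate === requested.plate &&
               w.zone === requested.zone &&
               w.start <= requested.start &&
               w.end >= requested.end;
    });
    return !alreadyCovered;
}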

I’m disappointed that there is such a simple technical solution here that isn’t being applied.

I have another whole rant about the difficulty of getting my money refunded but you wouldn’t be interested in that.

2015-01-02

Shaving the Blog Yak

I’ve been on wordpress.com for 2 years now. It does a pretty fair job of hosting my blog and it does abstract away a lot of the garbage you usually have to deal with when it comes to hosting your own blog. However, it costs about $100 a year, which is a fair amount of money when I have Azure hosting burning a hole in my pocket. There are also a few things that really bug me about wordpress.com.

The first is that I can’t control the plugins or other features to the degree that I would like. I really would like to change the way that code is displayed on the blog but it is pretty much fixed, and I only get around it by farming out the display of code to GitHub.

Second, I’m always super nervous about anything written in PHP. You can shoot yourself in the foot using any programming language but PHP is like gluing the business end of a rail gun to your big toe. One of the most interesting things I’ve read recently about PHP was this blog post which suggests that something like 78% of PHP installs are insecure. Hilarious.

The third, and biggest, thing is that all the content on wordpress.com is locked into wordpress.com. Sure, there is an export feature but frankly it sucks and you’re really in trouble if you want to move quickly to something else. For instance, exporting in a Ghost-compatible format requires installing a Ghost export plugin. This can’t be done because wordpress.com doesn’t allow installing your own plugins.

So, having a few days off over Christmas, I decided it was time to make the move. I was also a bit prompted by the Ghost From Source series of posts from Dave Wesst who showed how easy it was to get Ghost running on Azure.

As I expected this whole process was jolly involved. The first thing I wanted was to get an export in a format that Ghost could read. This meant getting the Ghost export plugin working on wordpress.com. This is, of course, impossible. There is no ability to install your own plugins on wordpress.com.

So I kicked up a new Ubuntu Trusty VM on my OS X box using Vagrant. I used Trusty because I had an image for it downloaded already. The Vagrantfile looked like

# -*- mode: ruby -*-
# vi: set ft=ruby :

# Vagrantfile API/syntax version. Don't touch unless you know what you're doing!
VAGRANTFILE_API_VERSION = "2"

Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
  config.vm.box = "ubuntu/trusty64"
  config.vm.network "forwarded_port", guest: 80, host: 8000
end

Once I was SSHed in I just followed the steps for setting up WordPress at https://help.ubuntu.com/community/WordPress.

I dropped in the plugins for WordPress import and Ghost export. I then exported the WordPress file which is, of course, a giant blob of XML.

The export seems to contain all sorts of data including links to images and comments. Ghost doesn’t support either of those things very well. That’s fine we’ll get to those.

Once I had the WordPress backup installed into my local WordPress I could then use the Ghost export, which dumped a giant JSON file. Ah, JSON, already this is looking better than WordPress.

I downloaded the latest Ghost and went about getting it set up locally.

git clone git@github.com:TryGhost/Ghost.git
cd Ghost
npm install
grunt init
npm start

This got Ghost up and running locally at http://localhost:2368. I jumped into the admin at http://localhost:2368/ghost and set up the parameters for my blog. I next went to the labs section of the admin and imported the JSON dump.

# Issues

I was amazed that it was all more or less working. Things missing were

  1. Routes incorrect - old ones had dates in them and the new ones don’t
  2. Github gists not working
  3. Some crazy character encoding issues

They all need to be dealt with.

## Routes

The problem is that the old URLs were like http://blog.simontimms.com/2014/08/19/experiments-with-azure-event-hubs/ while the new ones would be http://blog.simontimms.com/experiments-with-azure-event-hubs/. I found a router folder which pointed me to frontend.js. The contents looked very much like an expressjs router, and in fact that is what it was. Unfortunately the router in express is pretty lightweight and doesn’t do parameter extraction, so I had to rewrite the URL inside the controller itself. Check out the commit. It isn’t pretty but it works.

Later on I was poking around in the settings table of the Ghost database and found that there is a setting called permalink that dictates how the permalinks are generated. I changed mine from /:slug to /:year/:month/:day/:slug and that got things working better without source hackery. This feature is, like most of Ghost, a black hole of documentation.

## Gists Not Working

Because there was no good syntax highlighting on WordPress I used embedded gists for code. To do that, all you did was put in a GitHub URL and it would be replaced with the gist.

There are a number of ways to fix this. The first is to simply update the content of the posts, the second is to catch the content as it is passed out to the view and change it, and the third is to change the templating. It is a tough call, but with proper syntax highlighting in Ghost I don’t really have any need for continuing support for gists. I decided to fix it in the JSON dump and just reimport everything.

I thought I would paste in the regex I used because it is, frankly, hilarious.

%s/https:\\\/\\\/gist.github.com\\\/\d\+/<script src='&.js'><\\\/script>/g
%s/https:\\\/\\\/gist.github.com\\\/stimms\\\/\w\+/<script src='&.js'><\\\/script>/g

## Crazy Character Encoding

Some characters look to have come across in the wrong format. Some UTF jazz, I imagine. I’ve kind of ignored it for now and I’ll run some SQL to fix it when I have a few minutes to kill.

# Databases

By default Ghost uses a SQLite database on the file system. This is a big problem for hosting on Azure websites: the local file system is not persistent and may disappear at a moment’s notice. Ghost is built on top of the Knex data access layer and this data layer works just fine against MySQL as well as PostgreSQL. There is a free version of MySQL from ClearDB which has a very limited size and a very limited number of connections. This is actually fine for me because I don’t have a whole lot of people reading my blog (tell your friends to read this blog).

So I decided to use MySQL and so began a yak shaving to rival the best. The first adventure was to figure out how to set the connection string. Of course, this isn’t documented anywhere, but this is the configuration file I came up with:

database: {
    client: 'mysql',
    connection: {
        host: 'us-cdbr-azure-west-a.cloudapp.net',
        user: 'someuser',
        password: 'somepassword',
        database: 'simononline',
        charset: 'utf8',
        debug: false
    }
},

I logged into the Ghost instance I had on Azure websites and found that it worked fine. However, when I attempted to import the dump into MySQL I was met with an error. As it turns out the free MySQL has a limit of 4 connections and the import seems to use more than that. I feel like it is probably a bug in Ghost that it opens up so many connections, but probably not one that really needs a lot of attention.

And so began the shaving.

I jumped back over to my Vagrant Ubuntu instance and found it already had MySQL installed as a dependency for WordPress. I didn’t, however, know the password for it. So I had to figure out how to reset the password for MySQL. I followed the instructions at https://dev.mysql.com/doc/refman/5.0/en/resetting-permissions.html

That got the password changed but I still couldn’t log into it from the host system. Turns out you have to grant access specifically from that host.

SET PASSWORD FOR 'root'@'10.0.2.2' = PASSWORD('fish');

Finally I got into the database and could point my local Ghost install at it. That generated the database and I could import into it without worrying too much about the number of connections opened.

Then I ran mysqldump to export the contents of the database into a file. Finally I sucked this into the Azure instance of MySQL.

# Comments

Ghost has no comments but you can use Disqus to handle that for you. I’m pretty much okay with that so I followed the instructions here:

https://help.disqus.com/customer/portal/articles/466255-exporting-comments-from-wordpress-to-disqus

and got all my comments back up and running.

# Conclusion
So everything is moved over and I’m exhausted. There are still a couple of things to fix:

  1. No SSL on the full domain, only on the xxx.azurewebsites.net one
  2. Images still hosted on the wordpress domain
  3. No backups of MySQL yet

I can tell you this jazz cost me way more time than $100 but it is a victory and I really needed a victory.

2014-12-01

Getting Hammered Because of DNSimple?

Poor DNSimple is, as of writing, undergoing a massive denial of service attack. I have a number of domains with them and, up until now, I’ve been very happy with them. Now it isn’t fair of me to blame them for my misfortune as I should have put in place a redundant DNS server; I’ve just never seen a DNS system go belly up in this fashion before. I also keep the TTL on my DNS records pretty low to mitigate any failures on my hosting provider’s side. This means that when the DNS system fails, people’s caches are emptied very quickly.

DNS has been up and down all day but it is so bad now that something had to be done. Obviously I need to have some redundancy anyway, so I set up an account on easyDNS. I chose them because their logo contains a lightning bolt, which is yellow, and yellow rhymes with mellow and that reminds me of my co-worker, Lyndsay, who is super calm about everything and never gets upset. It probably doesn’t matter much which DNS provider you use so long as it isn’t Big Bob’s Discount DNS.

I set up a new account there and put in the same DNS information I’d managed to retrieve from DNSimple during one of its brief periods of being up. I had the information written down too, so either way it wouldn’t have been too serious to recreate it. It does suggest, however, that DNS records are something else you need to back up.

In EasyDNS I set up a new domain

Screen Shot 2014-12-01 at 10.11.29 PM

In the DNS section I set up the same records as I had in my DNSimple account.

Screen Shot 2014-12-01 at 10.14.02 PM

Finally I jumped over to my registrar and entered two of the EasyDNS servers as the DNS servers for my domain and left two DNSimple servers. This is not the ideal way of setting up multiple DNS servers. However, from what I can tell DNSimple doesn’t support zone transfers or secondary DNS, so the round-robin approach is as good as I’m going to get.

Screen Shot 2014-12-01 at 10.34.34 PM

With the new records in place and the registrar changed over everything started working much better. So now I have redundant DNS servers for about $20/year. Great deal.

2014-11-23

Books

socialData

2013 - Use a variety of JavaScript technologies to build visualizations of data taken from social media sites. Includes sections on LinkedIn, Facebook and StackOverflow.

mastering

2014 - Covers all the original GoF patterns as well as functional patterns, MV* patterns and messaging patterns.

2014-11-16

I Have a Short Memory or So I Wrote Another Book

Last year I wrote a book then I blogged about writing a book. I concluded that post with

I guess watch this spot to see if I end up writing another book. -Simon Timms

Well, my friendly watchers, it has happened. This time I wrote a book about JavaScript patterns. I feel like a bit of a fraud because I don’t really believe that I know much about either patterns or JavaScript. I did manage to find some way to fill over 250 pages with content. I also feel like this book is much better than the last book, if only because it is physically heavier.

book

I agreed to write it for a few reasons:

  1. I had forgotten how much time it takes to write a book. Seriously, it takes forever. Even when you’re done you’re never done. There are countless revisions and reviews and goodness knows what else to eat up your time. Maybe a review only takes 10 minutes a chapter, but 12 chapters later and you’ve used up another 2 hours. I don’t like to do the math around dollars an hour for writing but it is low, like below-minimum-wage low. I met Charlie Russel at the Microsoft MVP Summit earlier this year and had a good chat with him about making a living writing books. He has been in the game for many years and has written more books than I have pairs of socks (even if you relax your constraints and let any two socks be a pair regardless of matching). He told me of a golden age when it was possible to make a good living writing books. Those days are gone and we’re all in a mad dash to the bottom, which is free, and free isn’t maintainable. That’s a topic for another day.

  2. I liked the topic. Patterns are good things to know. I would never recommend that you go out of your way to implement patterns but having a knowledge of them will help you solve common problems in a sensible way. It is said there are only so many basic plots for a story and I think patterns are like that too. You start writing a story and while it is unique you’ll find that one of the archetypal plots emerges. You can also never go wrong talking about JavaScript.

  3. I figured this book would get more exposure than the last one. My last book was pretty niche. I can’t imagine there are more than 2 or 3 dozen people in the world who would be interested in visualizing social media data to the degree they would buy a book on it. This one, however, should have a much broader reach. I’ve been working on getting my name out there in the hopes that the next time I’m looking for work it is easier.

If you happen to be one of the people interested in JavaScript and how to write it building on the patterns we’ve spent 20 years discovering then go on, buy the book.

This time, however, I’m serious! I’m not writing any more books through a traditional publisher. I’ve already turned down an offer to write what would have been a really interesting one. For my next foray I’m going to publish through LeanPub. They are a nice and funny group of folks whose hands-off approach allows for much more creativity around pricing and even production of the book. I’m also done with writing books by myself; I need some companionship on the journey.

There will be more books, just watch this space!

2014-11-14

ASP.net 5 Configuration

On Twitter yesterday I had a good conversation with Matt Honeycutt about configuration in ASP.net 5. It started with

The more I think about it: yeah, putting app-specific connection strings and things in evn variables is a TERRIBLE idea. #AspNetvNext

Today I’m seeing a lot more questions about how configuration works in ASP.net vNext 5 (sorry, still getting used to that).

Hail Hydra! How do I avoid collision with env variables bw apps on the same server? #AspNetvNext

Isn’t it high time to rename #AJAX to #AJAJ? : Everything gone JSON now! Even the configuration files in #AspNetvNext?

It sounds like there is some clarification needed about how configuration works in ASP.net 5.

The first thing is that configuration in ASP.net 5 is completely pluggable. You no longer need to rely on the convoluted Web.config file for all your configuration. All the configuration code is found in the Configuration repository on GitHub. You should start by looking at the Configuration.cs file; this is the container class that holds the configuration for your application. It is basically a box full of strings. How we get things into that box is the interesting part.

In the standard template for a new ASP.net 5 project you’ll find a class called Startup. Within that class is the code that sets up the configuration sources.

In the default configuration we’re reading from a JSON-based configuration file and then overriding it with variables taken from the environment. So if you were developing and wanted to enable an option called SendMailToTestServer then you could simply define that in your environment and it would override the value from the JSON file.

Looking again in the Configuration repository we see that there are a number of other configuration sources such as

  • INI files
  • XML files
  • In-memory values

The interface you need to implement to create your own source is simple, and if you extend BaseConfigurationSource that should get you most of the way there. So if you want to keep your configuration in ZooKeeper then all you would need to do is implement your own source that can talk to ZooKeeper. Certain configuration providers also allow changes in the configuration to be committed back to them.

The next point of confusion I’m seeing is related to how environment variables work. For the most part .net developers think of environment variables as being like PATH: you set it once and it is globally set for all processes on that machine. Those of us from a more Linuxy/UNIXy background have a whole different interpretation.

Environment variables are simply pieces of information that are inherited by child processes. So when you go to set your PATH variable by right clicking on My Computer in Windows (it is still called that, right?) you’re setting a default set of environment variables that are inherited by all launched processes. You can set them in other ways, though.

Try this: open up two instances of powershell. In the first one type

$env:asp = "rocks"
echo $env:asp

You should see “rocks” echoed back. This sets an environment variable and then echoes it out. Now let’s see if the other instance of powershell has been polluted by this variable. Type

echo $env:asp

Nothing comes back! This is because the environments, after launch, are separate for each process. Now back to the first window and type

start powershell

This should get you a third powershell window. In this one type

echo $env:asp

Ah, now you can see “rocks” echoed again. That’s because this new powershell inherited its environment from the parent process.

Environments are NOT global. So you should have no issue running as many instances of ASP.net 5 on your computer as you like without fear of cross-polluting them, so long as you don’t set your variables globally.

Why even bother with environment variables? Because they are a common language spoken by every operating system (maybe not OpenVMS, but let’s be realists here). They are also already supported by Azure. If you set up a configuration variable in an Azure Website then that is set in the environment. That’s how you can easily configure a node application or anything else. Finally, it helps eliminate that thing where you accidentally alter and check in a configuration file with settings specific to your computer and break the rest of your team. Instead of altering the default configuration file you could just set an environment variable, or you could set up a private settings file.

In my case an AddPrivateJsonFile helper extends the JSON configuration source and swallows missing-file exceptions, allowing your code to work flawlessly in production.

In a non-cloud production environment I would still tend to use a file-based configuration system instead of environment variables.

The new configuration system is extensible and powerful. It allows for chaining sources and solves a lot of problems in a more elegant fashion than the old XML based transforms. I love it.

2014-11-11

Is ASP.net 5 too much?

I’ve been pretty busy as of late on a number of projects and so I’ve not been paying as much attention as I’d like to the development of ASP.net vNext, or as it is now called, ASP.net 5. If you haven’t been watching the development I can tell you it is a very impressive operation. I watched two days worth of presentations on it at the MVP Summit and pretty much every part of ASP.net 5 is brand new.

The project has adopted a lot of ideas from the OWIN project to specify a more general interface for serving web pages built on .net technologies. They’ve also pulled in a huge number of ideas from the node community. Build tools such as grunt and gulp have been integrated into Visual Studio 2015. At the same time, the need for Visual Studio has been removed. Coupled with the open sourcing of the .net framework, developing .net applications on OSX or Linux is perfectly possible.

I don’t think it is any secret that the vision of people like Scott Hanselman is that ASP.net will be a small 8 or 10 meg download that fits in with the culture being taught at coding schools. Frankly this is needed because those schools put a lot of stress on platforms like Ruby, Python or node. They’re pumping out developers at an alarming rate. Dropping the expense of Visual Studio makes the teaching of .net a whole lot more realistic.

ASP.net 5 is moving the platform away from proprietary technologies to open source tools and technologies. If you thought it was revolutionary when jQuery was included in Visual Studio out of the box, you ain’t seen nothing yet.

The thought around the summit was that with node mired in the whole Node Forward controversy there was a great opportunity for a platform with real enterprise support like ASP.net to gain big market share.

Basically ASP.net 5 is ASP.net with everything done right. Roslyn is great, the project file structure is clean and clear and even packaging, the bane of our existence, is vastly improved.

But are we moving too fast?

For the average ASP.net developer we’re introducing at least

  • node
  • npm
  • grunt
  • bower
  • sass/less
  • json project files
  • dependency injection as a first class citizen
  • different directory structure
  • fragmented .net framework

That’s a lot of newish stuff to learn. If you’re a polyglot developer then you’re probably already familiar with many of these things through working in other languages. The average, monolingual, developer is going to have a lot of trouble with this.

Folks I’ve talked to at Microsoft have likened this change to the migration from classic ASP to ASP.net and from WebForms to MVC. I think it is a bigger change than either of those. With each of those transitions there were really only one or two things to learn. Classic ASP to ASP.net brought a new language on the server (C# or VB.net) and the introduction of WebForms. Honestly, though, you could still pretty much write classic ASP in ASP.net without too much change. MVC was a bit of a transition too, but you could still write using Response and all the other things with which you had built up comfort in WebForms.

ASP.net 5 is a whole lot of moving parts built on a raft of technologies. To use a Hanselman term, it is a lot of Lego bricks. A lot of Lego can either make a great model or it can make a mess.

I really feel like we’re heading for a mess in most cases.

ASP.net 5 is great for expert developers, but we’re not all expert developers. In fact the vast majority of developers are just average.

So what can be done to bring the power of ASP.net 5 to the masses and still save them from the mess?

  1. Tooling. I’ve seen some sneak peeks at where the tooling is going and the team is great. The WebEssentials team is hard at work fleshing out helper tools for integration into Visual Studio.

  2. Training. I run a .net group in Calgary and I can tell you that I’m already planning hackathons on ASP.net 5 for the summer of 2015. It sure would be great if Microsoft could throw me a couple hundred bucks to buy pizza and such. We provide a lot of training and discussion opportunities and Microsoft does fund us, but this is a once-in-a-decade sort of thing.

  3. Document everything, like crazy. There is limited budget inside Microsoft for technical writing. You can see this in the general decline in the quality of documentation as of late. Everybody is on a budget, but good documentation is what made .net accessible in the first place. Documentation isn’t just a cost center; it drives adoption of your technology. Do it.

  4. Target the node people. If ASP.net can successfully pull developers from node projects onto existing ASP.net teams then they’ll bring with them all sorts of knowledge about npm and other chunks of the tool chain. Having just one person on the team with that experience will be a boon.

The success of ASP.net 5 is dependent on how quickly average developers can be brought up to speed. In a world where a discussion of dependency injection gets blank stares I’m, frankly, worried. Many of the developers with whom I talk are pigeonholed into a single language or technology. They will need to become polyglots. It is going to be a heck of a ride. Now, if you’ll excuse me, I have to go learn about something called “gulp”.