2022-04-04

Redis Cheat Sheet

Running in Docker

Quickly get started with

docker run --name somename -p 6379:6379 redis

Connection

The simplest connection string is localhost, which connects to the local machine on the default port 6379.
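
If you want to connect from code instead of the CLI, it looks something like this. This is a minimal sketch assuming the node-redis client for Node.js (any Redis client works along the same lines) and the container from the docker run above already running.

import { createClient } from 'redis';

async function main() {
  // Connect to the local Redis started with the docker run command above
  const client = createClient({ url: 'redis://localhost:6379' });
  await client.connect();

  // Round-trip a simple key/value to confirm the connection works
  await client.set('blah', '7');
  console.log(await client.get('blah')); // prints 7

  await client.quit();
}

main().catch(console.error);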

Querying

Redis is a really simple server to which you can just telnet (or use the redis-cli) to run commands.

List All Keys

This might not be a great idea against a prod server with lots of keys

keys *

Get key type

Redis supports a bunch of different data primitives like a simple key/value, a list, a hash, a zset, … To find the type of a key use type

type HereForElizabethAnneSlovak
+zset

Set key value

set blah 7

This works for updates too

Get key value

get blah

Get everything in a zset

ZRANGE "id" 0 -1

Count everything in a zset

zcount HereForjohnrufuscolginjr. -inf +inf

Get the first thing in a zset

ZRANGE "id" 0 0

Get everything in a set

SMEMBERS HereFeedHashTags

Get the first member of a list

LPOP somekey

Get the last member of a list

RPOP somekey

Get the contents of a hash

HGETALL somekey

Clear out all databases

FLUSHALL

Clear just the current db

FLUSHDB

Stats on keys

INFO keyspace

Get an idea of db size

INFO Memory
2022-03-18

Fixing VS Code Rubocop Issues

I run a rubocop extension in my VS Code.

Recently the project I was on did a Ruby update and my rubocop stopped working with an error from the extension.

The issue here was that the rubocop in the project was newer than the globally installed rubocop, so it was returning empty output. This extension doesn’t look like it uses rbenv properly, so I needed to globally update rubocop, which I did with

/usr/local/bin/rubocop -v   # 1.22.3
sudo gem install rubocop
/usr/local/bin/rubocop -v   # 1.26

I still had some errors about missing rules and needed to also do

sudo gem install rubocop-rake
sudo gem install rubocop-rails
sudo gem install rubocop-performance

Ideally I’d like this extension to use the rbenv version of ruby but this gets me sorted for now.

2022-03-16

Unsupported Architecture for fsevents with Oryx

I updated the version of the lock file on a project the other day in the hopes it might restore a little bit more quickly. However, for some steps in my build an older version of NPM was being used. This older version didn’t have support for the new lock file version and, while it is supposed to be compatible, it seemed like optional dependencies like fsevents were causing a legit issue.

npm WARN read-shrinkwrap This version of npm is compatible with lockfileVersion@1, but package-lock.json was generated for lockfileVersion@2. I'll try to do my best with it!
npm ERR! code EBADPLATFORM
npm ERR! notsup Unsupported platform for fsevents@2.3.2: wanted {"os":"darwin","arch":"any"} (current: {"os":"linux","arch":"x64"})
npm ERR! notsup Valid OS:    darwin
npm ERR! notsup Valid Arch:  any
npm ERR! notsup Actual OS:   linux
npm ERR! notsup Actual Arch: x64

npm ERR! A complete log of this run can be found in:
npm ERR!     /root/.npm/_logs/2022-03-09T13_58_42_006Z-debug.log

In theory these fsevents errors should just be warnings because they are optional dependencies. The real fix was to update the version of node used by Oryx, the build engine for Azure Static Web Apps, which listens to the version of node defined in package.json. Adding this section to the package.json fixed everything

"engines": {
    "node": ">=16.0",
    "npm": ">=8.0"
  },
2022-02-20

Exclude node_modules from Backblaze Backups on Windows

There are some articles out there about how to exclude node_modules from Backblaze backups on OSX but I couldn’t find anything about Windows.

What you want to do is open up C:\ProgramData\Backblaze\bzdata\bzexcluderules_editable.xml and add a new rule.

<excludefname_rule plat="win" osVers="*"  ruleIsOptional="t" skipFirstCharThenStartsWith="*" contains_1="node_modules" contains_2="*" doesNotContain="*" endsWith="*" hasFileExtension="*" />

If you want to exclude .git folders too (they can also contain a lot of small files that are slow to back up) also add

<excludefname_rule plat="win" osVers="*"  ruleIsOptional="t" skipFirstCharThenStartsWith="*" contains_1=".git" contains_2="*" doesNotContain="*" endsWith="*" hasFileExtension="*" />
2022-02-18

Parsing Vue Router Path Parameters

In the Vue router you can set up path parameters that are bound to the rendered component. For instance you might have a route like this:

{
    path: '/reports/:reportId/:reportName/:favorite',
    name: 'Reports',
    component: ReportView,
    props: true
}

This will bind the parameters reportId, reportName and favorite on the component. However, when you drop into that component and look at the values passed in, you will see that they are all strings. Of course that makes sense; the router doesn’t really know if the things you pass in are strings or something else. Consider the route /reports/17/awesome report/false. Here reportId and favorite are going to be strings, even though you probably want a number and a boolean.

You can work around that by giving props in the router a function rather than just a boolean.

{
    path: '/reports/:reportId/:reportName/:favorite',
    name: 'Reports',
    component: ReportView,
    props: (route) => ({
      ...route.params,
      reportId: parseInt(route.params.reportId),
      favorite: route.params.favorite === 'true',
    })
  }
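
For completeness, the receiving component can then declare the prop types it now expects. Here is a rough sketch of what ReportView might declare; the exact component definition is an assumption, not taken from the original post.

export default {
  name: 'ReportView',
  props: {
    // Converted to a number by the router props function above
    reportId: { type: Number, required: true },
    reportName: { type: String, required: true },
    // Converted to a boolean by the router props function above
    favorite: { type: Boolean, default: false },
  },
};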
2022-02-16

Installing the AzFilesHybrid PowerShell Module

If you don’t do a lot of PowerShell then the instructions on how to install the AzFilesHybrid module can be lacking. Here is how to do it

  1. Download the module from https://github.com/Azure-Samples/azure-files-samples/releases
  2. Unzip the file downloaded in step 1
  3. Go into the folder and run the copy command
     ./CopyToPSPath.ps1
  4. Install the module with
     Install-Module -Name AzFilesHybrid -Force


With this module installed you can then run things like Join-AzStorageAccount to get a fileshare joined to the domain

Join-AzStorageAccount -ResourceGroupName "rg-azfileshare" -StorageAccountName "sa-azfileshare" -DomainName "somedomain.com" -DomainUserName "jane" -DomainUserPassword "password"
2022-01-27

Purge CDN in DevOps

In order to purge a cache in the build pipeline you can use some random task that some dude wrote or you can just use the Azure CLI.

Here is an example of what it would look like to purge the entire CDN top to bottom

- task: AzureCLI@2
  displayName: 'Invalidate CDN Cache'
  inputs:
    azureSubscription: 'Azure'
    scriptType: 'batch'
    scriptLocation: 'inlineScript'
    inlineScript: 'az cdn endpoint purge --content-paths "/*"  -n devascacdnendpoint -g devasca-rg --no-wait --profile-name devascacdn'
2021-12-15

Kafka and .NET - Part 3 - Finally at .NET

It has taken us 2 seemingly unrelated posts to get here but we finally made it to the point where we can actually run .NET and interact with Kafka. We need to create two basic programs: one that writes to Kafka and another that reads from it. There is a Kafka client for .NET available in nuget and that’s where our story will start.

Read More

2021-12-09

Kafka and .NET - Part 1 - What is Kafka?

C# Advent

This post is one of many that are part of 2021’s C# Advent. There are a ton of really great posts this year and some new bloggers to discover. I’d strongly encourage you to check it out.

Years ago I was talking to somebody (I’m sorry, I don’t recall who) who was bemoaning the lack of innovative data storage technologies in the .NET space. I honestly didn’t have an answer to that. We have RavenDB but I would avoid that if I possibly could. In the Java space there were Neo4J, Voldemort, Elasticsearch, Cassandra…

Fortunately, those of us who are ostensibly .NET developers don’t have to rely on a data storage tool being written in the same language we’re using to benefit from it. One of the technologies I’ve been looking at recently is Apache Kafka, which is another Java-based data storage and routing tool. It is effectively a messaging system, but Kafka adds persistence on top of that.

Read More