Bootsy FTW


I just published a new TypeScript functional-light library on NPM: Bootsy. It has a lot of great ease-of-use features for functional-light JavaScript.

The intended target is Node, but it’s small and has no dependencies, so there’s no reason you couldn’t also use it for browser-based applications.

Bootsy should play just fine with other functional libraries like Lodash or Ramda.

Check us out on NPM or GitHub for more information!

Why name it Bootsy?

Bootsy Collins is one of the most successful funk (func?) musicians to grace us with his music. From laying down some of the most classic lines for James Brown to Parliament/Funkadelic and his own solo efforts, Bootsy is known not for extreme complexity, but for slick, fun grooves. We’re drawing inspiration from his music for our functional library.

Introducing Serverless-Multi-Region-Plugin

We use the Serverless Framework on one of my projects at work. Deploying serverless applications is a piece of cake, but we needed an active-active multi-region failover setup to meet our disaster recovery needs. The setup we want kinda looks like this:

So as usual, I started by trying to use something that was already out there, and I found “Serverless-Multi-Regional-Plugin”. Just one problem…it didn’t really work.

It was a good start, but it left a number of things out. It didn’t set up the API Gateway base path properly, it wasn’t set up to await the outcome of the API deployment, and it didn’t set up the health checks needed for failover to occur out of the box. The setup also required a lot of properties to be explicitly set that I just wanted derived from the host name. In addition, there were no unit tests in the project. That being said, it was a great start to build from.

So I basically added all the things I mentioned. I allow almost everything to be derived from just a couple of settings if you have a domain name set up and a certificate registered. But…I also allow all of the original settings to explicitly override the settings derived by convention. I also automatically added in some default health checks and set up the CloudFront base path so that everything just works. And finally, I added a bunch of unit tests to make sure all of the core configuration settings actually did what I expected them to do.
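To give a rough sense of the convention-driven goal, here’s a sketch of what the serverless.yml wiring might look like. The key names under custom are illustrative only, not the plugin’s actual schema, so consult the README linked below for the real options:

# serverless.yml (sketch only; key names are hypothetical, see the README)
plugins:
  - serverless-multi-region-plugin

custom:
  # With a hosted zone and a certificate already registered, the idea is
  # that most other settings can be derived from the host name alone.
  multiRegion:
    hostName: api.example.com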

After completing all of this work, of course I put in a PR….but no response. I guess the maintainer of the project has let it go. So I tweaked the name a bit, and now we have:

Serverless-Multi-Region-Plugin!! I know, very original.

Check it out, use it, improve it!

NPM: https://www.npmjs.com/package/serverless-multi-region-plugin

GitHub: https://github.com/unbill/serverless-multi-region-plugin

Introducing Mongo-Up

A month or so ago, I was in need of a Node-based utility to deploy changes or data migrations to my MongoDB database. I work in an industry with high compliance needs, so manually scripting changes wasn’t appropriate for us. And logging onto our production Mongo was out of the question.

I started by using a project called Migrate-Mongo, but because we use AWS and I needed to retrieve configuration dynamically, I added some async configuration functionality, and the pull request was accepted.

Next, I needed a way to run certain scripts idempotently with every release, so I added support for scripts that run before and after every release (like ensuring indexes). This PR added some significant complexity and amounted to a significant rewrite, but I added a ton of tests and made sure that coverage was at 100%.
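To give a feel for the format, here’s a minimal sketch of a migration file in the Migrate-Mongo style that this work builds on. The file name and the index details are made up for illustration; check the README for the exact API:

// migrations/20180101000000-add-user-email-index.js (hypothetical example)
module.exports = {
  // Applied once, in order, when the release migrations run
  up: function (db) {
    return db.collection('users')
      .createIndex({ email: 1 }, { unique: true });
  },
  // Reverses the change if the migration is rolled back
  down: function (db) {
    return db.collection('users').dropIndex('email_1');
  }
};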

Understandably, the Migrate-Mongo owner didn’t want to incorporate such a large change into the original project. But that’s the great thing about OSS: I just created Mongo-Up to make this great functionality available to all of you great peeps!

Check it out, use it, improve it!

NPM: https://www.npmjs.com/package/mongo-up

GitHub: https://github.com/unbill/mongo-up

 

GeoGems Part 1: Getting Set Up


In this installment, we’ll be getting our Ubuntu environment set up for ASP.Net Core development. I’m using an Ubuntu 14.04 VM on my Mac. If you choose a different Linux distribution, hopefully this will still help, but mileage may vary.

Ubuntu

This will just be a short article on getting your development environment set up. For this article, I’m using Ubuntu 14.04 desktop edition. I’m working on a Mac and running Ubuntu as a VM via VMware Fusion. There are some good instructions out there on getting this set up…not much to it, really, although their Ubuntu download link has gone bad.

Open a command line and make sure our package manager is up to date:

apt-get update

or if needed, prefix with sudo as shown below.

sudo apt-get update

In the commands that follow, I’m assuming that if you run into security issues you will use sudo to prefix installation steps. The exception is when installing Node/NPM. There we want to avoid using sudo, which is why I use the specific NVM technique described below.

Now make sure that Unzip and cURL are installed (again prefix with sudo if required).

apt-get install unzip curl

Visual Studio Code

Visual Studio Code is just a download and install. Just make sure you download the .deb installer if you are using Ubuntu. This will take you to the Ubuntu Software Center and prompt you to install Visual Studio Code. In my case, the installer warned me that it didn’t like the package for Code…I proceeded anyway and it installed just fine. Not sure what the deal is there…this is the just-released 1.0 drop of Visual Studio Code, so maybe it’s just a glitch in the installer. In addition, the installer should also set up launching from the command line: typing “code {your directory here}” launches Code in the specified directory.

After starting up Visual Studio Code, assuming you are using the Unity GUI, you will see the Visual Studio Code icon in the Launcher Bar on the left side.  You may want to right click the Code icon and select Lock to Launcher so the icon stays put when Code is closed.


In order to support C# development, we’ll be adding the OmniSharp plugin. Basically, you can hit F1 to bring up the Command Palette and then type/select Install Extension from the command selections. Then choose/type C# from the available installations. In a few seconds, OmniSharp will be installed, but a restart of Code will be necessary to complete the installation. Here are the official instructions.

Node

Of course Node is required for build and scaffolding tools etc… For this demo I’ve installed Node 4.4.3, but I don’t believe there should be any issues using 5.x. I used the NVM method to install Node because it allows Node packages to be installed globally without requiring sudo permissions.

The full instructions are here.

The Basics

Open a command line and run the following.

curl https://raw.githubusercontent.com/creationix/nvm/v0.30.2/install.sh | bash

Restart your command line.

To install Node/NPM with NVM:

nvm install 4.4.3


And now we need to install some build tools to help us out via NPM. These commands should not require sudo if the NVM method worked properly.

npm install -g yo bower grunt-cli gulp

Installing .Net

Now we’re going to install .Net tooling to allow us to control the various .Net runtime versions on our system. Note that this tooling will change when RC2 comes out; I’ll update this post when RC2 officially drops. The information below was taken mostly from this source.

To install the .Net Version Manager (DNVM):

curl -sSL https://raw.githubusercontent.com/aspnet/Home/dev/dnvminstall.sh | DNX_BRANCH=dev sh && source ~/.dnx/dnvm/dnvm.sh

To install other .Net prerequisites via the Ubuntu package manager:

sudo apt-get install libunwind8 gettext libssl-dev libcurl4-openssl-dev zlib1g libicu-dev uuid-dev

Now we’re going to use the DNVM to install the latest DNX available for the Core CLR. This is a compact, performant, and cross-platform version of the common language runtime (CLR) that we will be using.

dnvm upgrade -r coreclr

ASP.Net Core applications run on a lightweight and performant web server called Kestrel that uses a cross-platform I/O library called Libuv. The following builds and installs Libuv:

sudo apt-get install make automake libtool curl
curl -sSL https://github.com/libuv/libuv/archive/v1.8.0.tar.gz | sudo tar zxfv - -C /usr/local/src
cd /usr/local/src/libuv-1.8.0
sudo sh autogen.sh
sudo ./configure
sudo make
sudo make install
sudo rm -rf /usr/local/src/libuv-1.8.0 && cd ~/
sudo ldconfig

Docker

We’ll be using Docker as our method of packaging and deploying our ASP.Net applications. The instructions to install Docker on Ubuntu can be found here. Make sure you follow all instructions for Ubuntu 14.04 (or whichever version you are using).

Yeoman Templates

If you followed the Node instructions above, you’ve already installed Yeoman from NPM. All we need to do now is install the ASP.Net Yeoman templates to get us started:

npm install -g generator-aspnet

The GitHub repository has much more thorough information.

That’s it as far as getting set up for now.  In our next installment, we’ll build our first ASP.Net Web API on Linux that targets the core framework.

GeoGems: An ASP.Net Core App on Linux


Articles in This Series

GeoGems Part 1: Getting Set Up

The Idea

I’m starting a new “practice project” to suss out creating a complex .Net application on Linux. The application is going to be called GeoGems.io. I’m a runner, but now that I’m older, have a child, and am slower etc…I don’t really run to be fast anymore. I just run because I love to run. But I love to trail run and run on routes I’ve never encountered before. A lot of times I’ll come across awesome urban art or other really cool landmarks. I was thinking how cool it would be to have a running app that showed you cool things nearby that others had tagged…so you could change course and go check things out as you run. It would also allow you, as a runner, to share things you’ve found. So the basic idea is that you can use an app to stop and take pictures and “GeoCode” and tag sites as you run or walk. I’d also like to track some basic running stats like distance and pace. Kind of a combination of a running app and a geocaching app, but based on landmarks as the things to find.

My Technology Stack

The goal is to use free/open-source tools in all possible cases to keep this zero cost. I want to create this app with C# running ASP.Net Core on Linux, and I want to use Docker as my deployment approach. I’m thinking about Aurelia for the admin front end and Ionic 2 for the mobile web/web app. If Aurelia releases their Aurelia Interface product any time soon, I might swap that in for Ionic. I’ll probably want some kind of event-sourcing backend for recording route events, so I’m probably going to try Marten, since it sits on Postgres and also has a document-store capability (and then the underlying relational DB if I need it). Hopefully they release the event store support soon.

Code Tools

I’m planning on coding this on Linux (Ubuntu) as well, so I’ll be using Visual Studio Code with OmniSharp. I’ll also use Code as my JavaScript editor for client apps.

Deployment

I want to deploy to the cloud, so I’m thinking DigitalOcean (although AWS does have their free plan). You just get so much bang for your buck with DigitalOcean.

Why This Approach

There are several reasons. The first is that I’ve been a Windows/Mac user my entire life, including my entire coding life, and I’ve always wanted to explore Linux in more detail. And the move for .Net Core to support Linux is compelling on several fronts:

  • Because it’s there…why not give it a shot?
  • Cost – Price out some Windows VMs on Azure vs Linux VMs. The Linux VMs come in at 60% of the cost. That’s huge for a startup or even ambitious hobby projects.
  • Platforms/Hosts – Using Linux opens up several awesome and inexpensive hosts like DigitalOcean. It also opens up some great DevOps/deployment tools like Docker. I know Windows containers are on the horizon…but they are definitely behind the curve, and they still need to run on Windows (see Cost).
  • .Net and C# are awesome – I’ve used various Node frameworks and they are impressive…but there’s something to be said about the sweet spot that C# hits as a strongly typed language with so many features, like:
    • Lambdas/LINQ/Delegates
    • Async/Await
    • Great Generics support
    • Dynamics
    • Immutables
    • I could go on for a while here…it’s a fantastic language…and C# 7 looks to be even better.
  • .Net has a solid and growing open source community…and now even .Net Core is open source…so long Ballmer…
  • .Net Core Performance on Kestrel is impressive…even for the RC, so you’re getting a lot of bang for your compute buck.  This great performance means smaller/fewer servers in your farm.

So why not just develop on Windows using Visual Studio 2015 (which I have available) and target the core framework so that you can run on Linux?  There are a couple of reasons.  The first is that, as mentioned above, I want to improve my comfort with Linux as well as get a better grasp of using a “minimal toolset” approach.  The other is ease of integration with Docker. Docker can be set up on Windows (currently by using VirtualBox)…but it’s really just a mini Linux VM…why not just get on Linux instead where it’s a first class citizen?

That’s all for now…I’ll use this as a landing page for upcoming posts.

Mapster 2.0 Released!

We’ve released Mapster 2.0, and it’s looking really good! This definitely puts Mapster back on top as the most complete, fastest .Net mapper out there. If you need something that won’t get bogged down under heavy load but still has very rich features, this one’s for you.

Huge shout-out to Chaowlert, who took this project and ran with it, adding all of the new updates and optimizations.

Downloads

GitHub

Nuget (PM> Install-Package Mapster)

New Features

  • Big speed improvements.
  • Projection is improved to generate nicer SQL queries.
  • Mapster is now able to map structs.
  • Flagged enums are supported.
  • Settings are now much more flexible:
    • You can now both opt in and opt out of settings.
    • Setting inheritance can inherit from interfaces.
    • Setting inheritance is now combined (it no longer picks only from the closest parent).
    • New rule-based settings let you define settings at a more granular level.
    • Settings are no longer static; you can overload settings to use a different configuration for a particular mapping.
  • You can ignore properties using attributes.
  • You can now set up your mapping between members of different types, i.e. config.Map(dest => dest.AgeString, src => src.AgeInt).
  • Mapster now supports circular reference mapping!
  • Supports more frameworks (.NET 4.0, 4.5, .NET Core RC 5.4).

Benchmarks

Engine          Structs   Simple objects   Parent-Child   Parent-Children   Complex objects   Advance mapping
AutoMapper      10871     27075            20895          19199             19333             21496
ExpressMapper   690       1350             1195           1678              3130              3920
OoMapper        -         2043             1277           1416              2777              -
ValueInjector   8534      21089            17008          12355             16876             19970
TinyMapper      -         1282             -              -                 -                 -
Mapster         -         2382             1892           1626              4287              6756
Mapster 2.0     515       1251             950            1037              2455              2342
Native          458       790              870            1253              3037              2754

(NOTE: The benchmark runner is from ExpressMapper. Benchmarks were run against the largest data set; times are in milliseconds, lower is better. A dash means the library does not support that scenario.)

Give it a shot!

 

Nozus JS 1: Intro to Sails with Passport and JWT (JSON Web Token) Auth

This project extends from some previous posts on creating a SPA style application with Node (Sails) and Aurelia. I’m not going to go into detail about installing Node, NPM, or Sails, other than when it’s germane to the subject.  I’m assuming you are already set up with all of the needed basics for Node/Sails development. If not, visit the Sails site to get started.

We’ll start by creating a new directory for our test projects and bringing up a command line. What we’re creating here is an API, so there’s no need for any front end mixed into the Sails application. We can create that later if we like. The command to create a new app without a front end is:

sails new myApi --no-frontend

The basics of the app should now be present in the “myApi” directory (or whatever you called it). If you open up the project in your favorite editor, you will see the following structure:

[Screenshot: the generated project structure]

Most of the action will happen in the api folder but we’ll also need to set up some config settings. But first let’s go ahead and install some dependencies. Make sure you are in the root folder of your project and install the following from the command line:

npm install jsonwebtoken --save
npm install bcrypt-nodejs --save
npm install passport --save
npm install passport-jwt --save
npm install passport-local --save

Alternately, you could just add these to the package.json file and run npm install.

Config

Now we’ll add our passport configuration.  Another nice thing about Sails is that putting a js file in the /config folder means that it will be run when you “lift” or start the app. In the config folder, create passport.js and add the following code:


/**
 * Passport configuration file where you should configure strategies
 */
var passport = require('passport');
var LocalStrategy = require('passport-local').Strategy;
var JwtStrategy = require('passport-jwt').Strategy;

var EXPIRES_IN_MINUTES = 60 * 24;
var SECRET = process.env.tokenSecret || "4ukI0uIVnB3iI1yxj646fVXSE3ZVk4doZgz6fTbNg7jO41EAtl20J5F7Trtwe7OM";
var ALGORITHM = "HS256";
var ISSUER = "nozus.com";
var AUDIENCE = "nozus.com";

/**
 * Configuration object for local strategy
 */
var LOCAL_STRATEGY_CONFIG = {
  usernameField: 'email',
  passwordField: 'password',
  passReqToCallback: false
};

/**
 * Configuration object for JWT strategy
 */
var JWT_STRATEGY_CONFIG = {
  secretOrKey: SECRET,
  issuer : ISSUER,
  audience: AUDIENCE,
  passReqToCallback: false
};

/**
 * Triggers when user authenticates via local strategy
 */
function _onLocalStrategyAuth(email, password, next) {
  User.findOne({email: email})
    .exec(function (error, user) {
      if (error) return next(error, false, {});

      if (!user) return next(null, false, {
        code: 'E_USER_NOT_FOUND',
        message: email + ' is not found'
      });

      // TODO: replace with new cipher service type
      if (!CipherService.comparePassword(password, user))
        return next(null, false, {
          code: 'E_WRONG_PASSWORD',
          message: 'Password is wrong'
        });

      return next(null, user, {});
    });
}

/**
 * Triggers when user authenticates via JWT strategy
 */
function _onJwtStrategyAuth(payload, next) {
  var user = payload.user;
  return next(null, user, {});
}

passport.use(
  new LocalStrategy(LOCAL_STRATEGY_CONFIG, _onLocalStrategyAuth));
passport.use(
  new JwtStrategy(JWT_STRATEGY_CONFIG, _onJwtStrategyAuth));

module.exports.jwtSettings = {
  expiresInMinutes: EXPIRES_IN_MINUTES,
  secret: SECRET,
  algorithm : ALGORITHM,
  issuer : ISSUER,
  audience : AUDIENCE
};

So there are a few things going on here. First, we’re importing both passport and the two strategies we’re going to set up for now (local and JWT). We’ll add social login in the next post. Next, we add configuration for our two auth strategies. For the local strategy, we’ll use email and password to log in. For JWT, we’ll set up the parameters needed to verify the token: secret, issuer, and audience. It should be noted that the issuer and audience are optional but provide some additional verification.

Next, we’ll add some functions to react to the auth request for each strategy. For local auth, this means making sure the user matches a user in our database, returning a false response with a message on failure or the user on success. For JWT auth, the strategy has already validated the token internally. If you want additional validation here, you can check the user ID against the database to verify that the user actually exists and is active. For the sake of simplicity, I’m going to trust the token, retrieve the user information from the token payload, and just return it.
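If you did want that extra check, a sketch of a stricter JWT callback might look like the following (it assumes, as our createToken method below does, that the user’s id is embedded in the token payload):

/**
 * Stricter variant of _onJwtStrategyAuth (illustrative sketch):
 * confirms the user in the token still exists before trusting it.
 */
function _onJwtStrategyAuth(payload, next) {
  User.findOne({id: payload.user.id})
    .exec(function (error, user) {
      if (error) return next(error, false, {});

      if (!user) return next(null, false, {
        code: 'E_USER_NOT_FOUND',
        message: 'User no longer exists'
      });

      return next(null, user, {});
    });
}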

Last, we are exporting some settings to be used elsewhere in the API. This is another nice feature of Sails: the ability to export settings in a config file and make them available globally. In our case, we are exporting jwtSettings from our config. To access these settings from anywhere, we can use the global accessor sails.config.jwtSettings.

Services

Next, add a new file under api/services called CipherService. In Sails, services and models are named using PascalCase by convention. I need a better name for this one, but it will do for the time being. This is where we’ll put all of our code that does hashing and/or token work.

var bcrypt = require('bcrypt-nodejs');
var jwt = require('jsonwebtoken');

module.exports = {
  secret: sails.config.jwtSettings.secret,
  issuer: sails.config.jwtSettings.issuer,
  audience: sails.config.jwtSettings.audience,

  /**
   * Hash the password field of the passed user.
   */
  hashPassword: function (user) {
    if (user.password) {
      user.password = bcrypt.hashSync(user.password);
    }
  },

  /**
   * Compare user password hash with unhashed password
   * @returns boolean indicating a match
   */
  comparePassword: function(password, user){
    return bcrypt.compareSync(password, user.password);
  },

  /**
   * Create a token based on the passed user
   * @param user
   */
  createToken: function(user)
  {
    return jwt.sign({
        user: user.toJSON()
      },
      sails.config.jwtSettings.secret,
      {
        algorithm: sails.config.jwtSettings.algorithm,
        expiresInMinutes: sails.config.jwtSettings.expiresInMinutes,
        issuer: sails.config.jwtSettings.issuer,
        audience: sails.config.jwtSettings.audience
      }
    );
  }
};

We’ll use these methods elsewhere. The hashPassword/comparePassword methods are pretty self-explanatory. Bcrypt-nodejs even adds a salt to the hashed password by default. Pretty nice. The createToken method uses JsonWebToken’s sign method to create a new token. It accepts a payload (the user in our case), a secret for signing the self-contained JWT, and some metadata including the algorithm, expiration, issuer, and audience. Issuer and audience will also be validated when the token is verified coming back in, so they give a little extra protection. The inclusion of the whole user in the payload is really just for example. In the real world, you might include the user id and claims etc…
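Verification itself is handled for us by passport-jwt, but to make the issuer/audience checks concrete, this is roughly what happens under the hood when a token comes back in (a sketch using jsonwebtoken directly, not code we need in our project):

var jwt = require('jsonwebtoken');

// Validates the signature, expiration, issuer and audience in one call
jwt.verify(token, sails.config.jwtSettings.secret, {
  issuer: sails.config.jwtSettings.issuer,
  audience: sails.config.jwtSettings.audience
}, function (err, payload) {
  if (err) return console.log('Invalid, expired, or mismatched token');
  // payload.user is the user object we embedded in createToken
});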

API Generation

Next we’ll create our User API. Return to the root command line of the application and run the following command:

sails generate api User

If you now observe your controllers and models directories, you’ll see the generated controller and model for User. The controller can be left as-is for now. Although it looks empty, it’s functional! It will use the default blueprints included with Sails to provide its functionality. Now we’ll want to enhance the model. Open the api/models/User.js file and add the following code:

/**
 * User
 * @description :: Model for storing users
 */
module.exports = {
    schema: true,
    attributes: {
        username: {
            type: 'string',
            required: true,
            unique: true,
            alphanumericdashed: true
        },
        password: {
            type: 'string'
        },
        email: {
            type: 'string',
            email: true,
            required: true,
            unique: true
        },
        firstName: {
            type: 'string',
            defaultsTo: ''
        },
        lastName: {
            type: 'string',
            defaultsTo: ''
        },
        photo: {
            type: 'string',
            defaultsTo: '',
            url: true
        },
        socialProfiles: {
            type: 'object',
            defaultsTo: {}
        },

        toJSON: function () {
            var obj = this.toObject();
            delete obj.password;
            delete obj.socialProfiles;
            return obj;
        }
    },
    beforeUpdate: function (values, next) {
        CipherService.hashPassword(values);
        next();
    },
    beforeCreate: function (values, next) {
        CipherService.hashPassword(values);
        next();
    }
};

Here we’re adding a bunch of properties to our User schema, but also a method to remove sensitive data before converting our object to JSON. We also have beforeUpdate and beforeCreate delegates defined to hash our password before saving to the data store. You can see that the hashing uses our CipherService, and one of the nice things about Sails is that it makes our services available globally. So here we can just use CipherService without needing a require. Models are also available globally by name. This is a huge advantage of Sails.

Auth Controller

Now we have to add an Auth endpoint to perform signup/signin etc… We’re only generating a controller here, so you can just add the js file yourself or do it the Sails way:

sails generate controller Auth

Now that we have our AuthController, let’s add the following actions:

/**
 * AuthController
 * @description :: Server-side logic for managing user authorization
 */
var passport = require('passport');
/**
 * Triggers when user authenticates via passport
 * @param {Object} req Request object
 * @param {Object} res Response object
 * @param {Object} error Error object
 * @param {Object} user User profile
 * @param {Object} info Info if some error occurs
 * @private
 */
function _onPassportAuth(req, res, error, user, info) {
  if (error) return res.serverError(error);
  if (!user) return res.unauthorized(null, info && info.code, info && info.message);

  return res.ok({
    // TODO: replace with new type of cipher service
    token: CipherService.createToken(user),
    user: user
  });
}

module.exports = {
  /**
   * Sign up in system
   * @param {Object} req Request object
   * @param {Object} res Response object
   */
  signup: function (req, res) {
    User
      .create(_.omit(req.allParams(), 'id'))
      .then(function (user) {
        return {
          // TODO: replace with new type of cipher service
          token: CipherService.createToken(user),
          user: user
        };
      })
      .then(res.created)
      .catch(res.serverError);
  },

  /**
   * Sign in by local strategy in passport
   * @param {Object} req Request object
   * @param {Object} res Response object
   */
  signin: function (req, res) {
    passport.authenticate('local', 
      _onPassportAuth.bind(this, req, res))(req, res);
  },
};

So here we’re doing two basic things: sign up and sign in. The signup method uses the built-in create method provided by the User model to create a new user. The signin method uses Passport’s local strategy to log in. In both cases, on success we generate a token so that the user is signed in automatically. All requests that follow should include the returned token in a header. We’ll demo this shortly, but we have one final item to add first: the login policy.
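We’ll demo the flow with Postman below, but as a sketch of how a SPA client might consume these endpoints (using the browser fetch API; the URL and credentials mirror the test data used later):

// Sign in, then send the token on a subsequent request (illustrative sketch)
fetch('http://localhost:1337/auth/signin', {
  method: 'POST',
  headers: {'Content-Type': 'application/json'},
  body: JSON.stringify({email: 'test1@test.com', password: 'testdude'})
})
  .then(function (res) { return res.json(); })
  .then(function (body) {
    // Our custom responses below wrap payloads in a data property
    return fetch('http://localhost:1337/user', {
      headers: {'Authorization': 'JWT ' + body.data.token}
    });
  });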

Policies

In Sails, the way to add Express middleware is via policies. If you examine the policy structure, that’s really exactly what it is. In our case, we need a policy that will protect our non-auth controllers from requests that don’t carry a token. Look at the api/policies folder; there may be a sessionAuth.js file already present. You can delete this file, as we won’t use it. Now add a new file to api/policies called isAuthenticated.js with the following contents:

/**
 * isAuthenticated
 * @description :: Policy to inject user in req via JSON Web Token
 */
var passport = require('passport');

module.exports = function (req, res, next) {
  passport.authenticate('jwt', function (error, user, info) {
    if (error) return res.serverError(error);
    if (!user)
      return res.unauthorized(null, info && info.code, info && info.message);

    req.user = user;
    next();
  })(req, res);
};

The policy is pretty simple: it authenticates the user using the ‘jwt’ strategy that we implemented earlier in the passport configuration. If no valid token is present, this returns an unauthorized response. Otherwise, it adds the user object from the token to the request.

Responses

Let’s briefly discuss Sails responses, another cool Sails convention. If you take a look at api/responses, you’ll see a bunch of responses that have been created for you already. These make it easy to define and reuse typical responses. So to return an OK response, instead of defining our response over and over, we can just do: return res.ok(data). In our case, we need to add two new custom responses: created and unauthorized.  In the responses folder add created.js with the following content:

/**
 * 201 (Created) Response
 * Successful creation occurred (via either POST or PUT).
 * Set the Location header to contain a link 
 * to the newly-created resource (on POST).
 * Response body content may or may not be present.
 */
module.exports = function (data, code, message, root) {
  var response = _.assign({
    code: code || 'CREATED',
    message: message 
       || 'The request has resulted in a new resource being created',
    data: data || {}
  }, root);

  this.req._sails.log.silly('Sent (201 CREATED)\n', response);

  this.res.status(201);
  this.res.jsonx(response);
};

Now create an unauthorized.js file in the same folder:

/**
 * 401 (Unauthorized) Response
 * Similar to 403 Forbidden.
 * Specifically for authentication failed or not yet provided.
 */
module.exports = function (data, code, message, root) {
  var response = _.assign({
    code: code || 'E_UNAUTHORIZED',
    message: message || 'Missing or invalid authentication token',
    data: data || {}
  }, root);

  this.req._sails.log.silly('Sent (401 UNAUTHORIZED)\n', response);

  this.res.status(401);
  this.res.jsonx(response);
};

I’ve also customized the other responses in accordance with some guidance provided by the Sails API Yeoman generator, but I’m not going to go through all of them. If you would like to copy them, feel free to reference the project on GitHub. While you’re there, also check out the overridden blueprints in api/blueprints. These let you customize the standard processing of the different operations exposed by the API controllers (unless you override them specifically in a controller). I also took these blueprints from the Sails API Yeoman generator.

Now we have our policy…how do we use it? In the /config folder, you should see a policies.js file. Open this file and adjust the code to the following:

module.exports.policies = {

    '*': ['isAuthenticated'],

    AuthController: {
        '*': true
    }
};

This is applying the following rules:

  1. Protect all controllers from unauthenticated users.
  2. Override this for the AuthController and allow anybody to hit that.
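Sails policies can also be applied per action rather than per controller. As a quick sketch (the UserController mapping here is hypothetical, just to show the shape):

module.exports.policies = {
  '*': ['isAuthenticated'],

  AuthController: {
    '*': true
  },

  // Hypothetical: open a single action while protecting the rest
  UserController: {
    '*': ['isAuthenticated'],
    find: true
  }
};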

Loose Ends

OK, before we test this thing out, just a couple of small cleanups that will make Sails bug you less on “lift”. First, open /config/models.js. Let’s set up a standard migration policy so it won’t bug us on every project start. Uncomment or add the migrate line and change it to ‘drop’ for now. This will drop the data store each time the application runs. You can change it to ‘alter’ if you don’t want this behavior.

migrate: 'drop'

Now we’ll tell Sails not to expect a Gruntfile. When you create a Sails app with --no-frontend, it doesn’t scaffold a Gruntfile, but it still warns that there isn’t one on every start. Go to /.sailsrc and disable the Grunt hook:

{
  "generators": {
    "modules": {}
  },
  "hooks":{
    "grunt":false
  }
}

Testing it Out!

Wow…that was a lot of stuff to run through, but we’re basically done! Let’s test it out. If you don’t already have Postman (or your preferred test tool) installed, go ahead and install it.

Now let’s start up our API. At your project root terminal, type:

sails lift

You should see Sails start up on port 1337 by default. Now fire up Postman and sign up a user by posting the following JSON payload to your local signup endpoint. In my case, this is: http://localhost:1337/auth/signup

{
 "username":"testdude",
 "email":"test1@test.com",
 "password":"testdude"
}


The response should look like:

{
  "code": "CREATED",
  "message": "The request has been fulfilled and resulted in a new resource being created",
  "data": {
    "token": "eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ1c2VyIjp7InVzZXJuYW1lIjoidGVzdGR1ZGUiLCJlbWFpbCI6InRlc3QxQHRlc3QuY29tIiwiZmlyc3ROYW1lIjoiIiwibGFzdE5hbWUiOiIiLCJwaG90byI6IiIsImNyZWF0ZWRBdCI6IjIwMTUtMDQtMjRUMjE6NTg6MTkuMjcxWiIsInVwZGF0ZWRBdCI6IjIwMTUtMDQtMjRUMjE6NTg6MTkuMjcxWiIsImlkIjoyfSwiaWF0IjoxNDI5OTEyNjk5LCJleHAiOjE0Mjk5OTkwOTksImF1ZCI6Im5venVzLmNvbSIsImlzcyI6Im5venVzLmNvbSJ9.j9mSeoHJiNb_rzxqJ8Cefv5ctcMVzbgvnUlvAWhbXas",
    "user": {
      "username": "testdude",
      "email": "test1@test.com",
      "firstName": "",
      "lastName": "",
      "photo": "",
      "createdAt": "2015-04-24T21:58:19.271Z",
      "updatedAt": "2015-04-24T21:58:19.271Z",
      "id": 2
    }
  }
}

You’ll notice that a token was generated.  Copy the token that your API generated so that it’s available later.

We’ll now attempt to sign in as well. Post the following JSON to your local signin endpoint. In my case, this is: http://localhost:1337/auth/signin. Earlier in our Passport config we set up local auth to use email and password, but we could change this to use the username if that is preferable.

{
 "email":"test1@test.com",
 "password":"testdude"
}

In this case, the response should be almost identical, but the response code will be a 200 OK instead of a 201 Created.

Now let’s try to access a resource without our token. Perform a GET on your local user endpoint. In my case: http://localhost:1337/user

You should get a response indicating that you are not authorized:

{
 "code": "E_UNAUTHORIZED",
 "message": "No auth token",
 "data": {}
}

Now let’s update our request to add an Authorization header. For the value, add “JWT”, then a space, then the token you copied previously. So an example value would be:

JWT eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ1c2VyIjp7InVzZXJuYW1lIjoidGVzdGR1ZGUiLCJlbWFpbCI6InRlc3QxQHRlc3QuY29tIiwiZmlyc3ROYW1lIjoiIiwibGFzdE5hbWUiOiIiLCJwaG90byI6IiIsImNyZWF0ZWRBdCI6IjIwMTUtMDQtMjRUMjE6NTg6MTkuMjcxWiIsInVwZGF0ZWRBdCI6IjIwMTUtMDQtMjRUMjE6NTg6MTkuMjcxWiIsImlkIjoyfSwiaWF0IjoxNDI5OTEzNTI3LCJleHAiOjE0Mjk5OTk5MjcsImF1ZCI6Im5venVzLmNvbSIsImlzcyI6Im5venVzLmNvbSJ9.9FqGAVnJNM3SWRupOqCoW7tPJqu0ChZt5f2_En6GKqo


Now send the GET request, and you should get back the user record we created when we signed up!

{
  "code": "OK",
  "message": "Operation is successfully executed",
  "data": [
    {
      "username": "testdude",
      "email": "test1@test.com",
      "firstName": "",
      "lastName": "",
      "photo": "",
      "createdAt": "2015-04-24T21:58:19.271Z",
      "updatedAt": "2015-04-24T21:58:19.271Z",
      "id": 2
    }
  ]
}

Final Thoughts

Seems like we did a lot of stuff here, but it was all pretty easy thanks to the conventions and code generation provided by Sails. In upcoming articles, we’ll look at expanding this example to include social auth, a front end via Aurelia and additional storage mechanisms (we’re just using disk here).

The code for this example is available here. The code in my uploaded GitHub example uses PostgreSQL instead of writing to disk, but that can easily be changed in the config/models.js file. Just point to any connection you have set up in the config/connections.js file.

** Warnings **

Always remember that this style of token auth is not secure if the token can be intercepted. In production, always perform communications with the API (at least those that send the token or any sensitive information) over SSL.

Additionally, you don’t want to check any production secrets into public source control; in this case, that means the secret in the Passport config file. In a follow-up we’ll talk about how to handle this, but you can read about config overrides here.
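Note that our passport config above already reads process.env.tokenSecret before falling back to the hard-coded default, so one simple approach is to supply the real secret through the environment at startup:

tokenSecret=some-long-random-production-secret sails lift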

Nozus JS Preface: Sails with Passport and JWT – Ermahgerd!

The Sails project in this case is only going to serve as a RESTful API. We’ll be using Aurelia on the front end, so we don’t need any presentation. What we do need is authentication provided by the API.

So my local auth implementation went through a few stages and much hair pulling. I first took a look at sails-generate-auth. This seemed like a great starting place…but I ran into some issues:

  • You can use tokens to protect the API, but the social auth implementation is geared towards server-emitted UI by default. I want something more SPA-oriented.
  • It doesn’t currently use JWT, although it supports bearer tokens.

So I moved on to Waterlock. Waterlock has a lot of cool features and is super-easy to set up. In addition, it appears to be more oriented towards a SPA/API implementation.

But I ran into some issues: it uses session in situations where tokens are being used. This defeats some of the advantages of using tokens and reduces horizontal scalability. Some remedies are being worked on, but they have been stuck in pull-request status for several months, although it looks as though they may be resolved soon. So I figured I could just reference the GitHub fork from npm until this gets resolved…but I was still concerned about the use of session.

And then there’s local auth in Waterlock: there is no register endpoint. There is a login endpoint that auto-registers you if it can’t find you. 😦 So if you fat-finger your username, it just goes ahead and makes a new user. Again, there is a pull request that hasn’t yet been merged.

At this point I decided I needed to keep looking. I could try to get those pull requests across the line, but I was already leaning away because of the opacity/complexity of the library as a whole and the session requirement. I don’t need something this complex, but it’s a great reference implementation.

So next I found a Yeoman generator for Sails that is specialized for APIs and implements JWT-based auth. This is a really cool project, and major kudos to Eugene, who was also very responsive to questions. But then, just when I felt so close: the Yeoman generator failed because of a Windows file system issue, probably a path-length problem due to nested npm packages. I’ll see you in hell, Windows file system!!! Eugene mentioned that he hadn’t tested it on Windows…so there it was.

This may have been a good thing because the generator/template also outputs a lot of functionality that is both very cool and yet unnecessary for my project. The upside being that you can blow out a complex API in almost no time, but I didn’t want to go through and attempt to clean out everything that I didn’t want/need.  I also wanted to go through the process myself so I understood it better.

So I used the generator templates as a reference project and borrowed some great things from it including updated API blueprints and a basic approach for issuing and validating JWT tokens.

So hopefully if you come across this post and are in a similar boat (npi), I can save you some of my frustration (aka “earned knowledge”). I’ll be adding the implementation information in the next post, but a reference implementation is already available in GitHub.  Hope that’s helpful!

Next in series:

Nozus JS 1: Intro to Sails with Passport and JWT (JSON Web Token) Auth

Nozus: Changing the API to Node JS and Sails

So after futzing around with ASP.Net 5/MVC 6 for a bit and becoming frustrated with attempting to implement token-based auth, I decided to switch over to Node JS, using Sails as an alternative for my API.

Why Sails? I took a look at it a few months back and was impressed with the easy setup and the Waterline ORM. I then had a recent job opportunity where the client was using Sails. I passed on the opportunity but decided to review Sails again. The combination of convention with easy overrides is super productive and flexible. Additionally, the documentation is well done for an OSS framework.

I look forward to the eventual rollout of ASP.Net 5, but it’s still under heavy dev and what works one day doesn’t on the next pull.  This is totally understandable but I’m ready to move along and get something working.  I’m already biting off a fair amount of learnin’ by attempting Node and Aurelia.  So without further ado, I’ve added the following growing list of articles on setting up Nozus in node, including examples in GitHub.

Nozus JS Preface: Sails with Passport and JWT – Ermahgerd!

Nozus JS 1: Intro to Sails with Passport and JWT (JSON Web Token) Auth

Nozus Step 3: Adding ASP.Net Identity to MVC 6

In the previous post, we added some simple logging to our API using Serilog and simple middleware. In this installment, we’ll be adding identity management to that project using the standard ASP.Net Identity libraries.

First, let’s do a little project cleanup. In Nozus.Data and Nozus.Domain, go ahead and delete the default Class1 that was added when we created those projects. Also, from the HomeController in Web.Api, make sure you remove the thrown exception added at the end of the previous post.

Now we’re going to add in the packages needed for the identity system. Open your Package Manager Console (Tools -> NuGet Package Manager -> Package Manager Console). In the console, select your VNext package source and install the Identity.EntityFramework package into both the Data and Web.Api projects. Make sure to include the -IncludePrerelease flag, which can be abbreviated to “-inc”.

PM> Install-Package Microsoft.AspNet.Identity.EntityFramework -inc
Installing NuGet package Microsoft.AspNet.Identity.EntityFramework.3.0.0-beta3.
PM> Install-Package Microsoft.AspNet.Identity.EntityFramework -inc
Installing NuGet package Microsoft.AspNet.Identity.EntityFramework.3.0.0-beta3.

We’re going to be using SQL Server as the data store, but you can use any applicable store.  From the Package Manager Console, install EntityFramework.SqlServer into the Web.Api project only.

PM> Install-Package EntityFramework.SqlServer -inc
Installing NuGet package EntityFramework.SqlServer.7.0.0-beta3.

Finally, in our Domain project, we’re going to add in a package to support basic Authentication classes.  We’ll derive from these later.

PM> Install-Package Microsoft.AspNet.Identity.Authentication -inc
Installing NuGet package Microsoft.AspNet.Identity.Authentication.3.0.0-alpha4.

Now that all packages are installed…let’s put identity into place.  We’re going to first add in our user and role classes into the Domain project.  In the Domain project, add an Entities folder and inside of that folder add an Identity folder. Inside of the Identity folder, add two classes, AppUser and AppRole.  For the moment, these classes require no implementation other than inheriting from the proper identity classes.  AppUser will inherit from IdentityUser<int> and AppRole will inherit from IdentityRole<int>.  The type specifies the type of the ID that will be used in the entities and the database.  By default, an int will be set up as an Identity column in SQL Server.

using Microsoft.AspNet.Identity;
namespace Nozus.Domain.Entities.Identity
{
    public class AppUser : IdentityUser<int>
    {}
}

using Microsoft.AspNet.Identity;
namespace Nozus.Domain.Entities.Identity
{
   public class AppRole : IdentityRole<int>
   {}
}

Now add a reference to the Domain project from the Data project.  Add a reference to both Domain and Data from the Web.Api.  In the Data project, add an AppDbContext class.  Inherit this class from IdentityDbContext<AppUser, AppRole, int>.


using Microsoft.AspNet.Identity.EntityFramework;
using Nozus.Domain.Entities.Identity;
namespace Nozus.Data
{
    public class AppDbContext
        : IdentityDbContext<AppUser, AppRole, int>
    {}
}

[Image: IdentityClasses] We’ll use this as our EF DbContext for the solution, and we can dual-purpose it by inheriting from IdentityDbContext. We’re also specifying our User and Role types, as well as the type for the IDs, which applies to both User and Role. The project structure should now look similar to the image at the right.

Now let’s fix up our Web.Api project to use this identity setup. All the initial action here is going to be in the Startup class because, as expected, the identity functionality is set up using middleware. In the ConfigureServices method, we’ll configure both Entity Framework and Identity as follows:


public void ConfigureServices(IServiceCollection services)
{
    services.AddLogging(Configuration);

    // Add EF services to the services container.
    services.AddEntityFramework(Configuration)
        .AddSqlServer()
        .AddDbContext<AppDbContext>();

    // Add Identity services to the services container.
    services.AddIdentity<AppUser, AppRole>(Configuration)
        .AddEntityFrameworkStores<AppDbContext, int>();

    services.AddMvc();
    services.Configure<MvcOptions>(options =>
    {
        options.OutputFormatters.RemoveAll(formatter =>
            formatter.Instance is XmlDataContractSerializerOutputFormatter);
    });
}

You can  see that we’re adding EF, telling it to use SQL Server and then giving it the type of our DbContext.  Next we’re adding identity, telling it our user and role types, and letting it know to use an EF store with our DbContext and the type of IDs it will be using.  Next, we’ll do some more setup in the Configure method.


public void Configure(IApplicationBuilder app,
    IHostingEnvironment env, ILoggerFactory loggerfactory)
{
    loggerfactory.AddSerilog(GetLoggerConfiguration());
    app.UseStaticFiles();

    //*** Tell the app to use Identity ***//
    app.UseIdentity();

    app.UseMiddleware<ErrorLoggingMiddleware>();
    app.UseMvc(routes =>
    {
        routes.MapRoute(
            name: "default",
            template: "{controller}/{action}/{id?}",
            defaults: new {
                controller = "Home",
                action = "Index" });
    });

    //*** Initialize the DB ***//
    if (env.EnvironmentName == "Development")
        InitializeDb(app.ApplicationServices).Wait();
}

private static async Task InitializeDb(
    IServiceProvider applicationServices)
{
    using (var dbContext =
        applicationServices.GetService<AppDbContext>())
    {
        var sqlServerDatabase =
            (SqlServerDatabase)dbContext.Database;
        await sqlServerDatabase.EnsureDeletedAsync();
        if (await sqlServerDatabase.EnsureCreatedAsync())
        {
            // We could add some test data...perhaps later
        }
    }
}

Here, we’re telling the app to use identity.  We’re also calling our InitializeDb method to drop and recreate our database. This is mainly in place for dev purposes, so you could call alternate database setups depending on whether you were running integration tests/developing/etc…  We’ll just have an empty database to start with for now.  We’re using the built in env.EnvironmentName to test that this is running in Dev before dropping and recreating our database.

When you installed the EF package into the Web.Api project, it should have automatically added a config section to your config.json file. Take a look; it should look similar to the following:

{
  "Data": {
    "DefaultConnection": {
      "ConnectionString": "Server=localhost;Database=Nozus;Trusted_Connection=True;MultipleActiveResultSets=true"
    }
  },
  "EntityFramework": {
    "AppDbContext": {
      "ConnectionString": "Data:DefaultConnection:ConnectionString"
    }
  }
}

The Data section should contain a connection string. By default, it will set up a connection string to LocalDB with a random database name. I’ve changed it to point to my local SQL Server Developer Edition, but LocalDB will work just fine. You can explicitly set which connection string EF uses, but by default, it’s going to find the entry under the EntityFramework section that matches the name of the DbContext. So in this case, make sure that it’s named “AppDbContext”.

When exchanging data with our API, it’s important that we define contracts for data going in and out. We don’t want to use the domain classes, as they will have lots of properties that we don’t want to expose to the outside world.  If there isn’t already a Models folder in your Web.Api project, add one.  In this folder, create a new class called UserModel.  It should look like the following:

using System.ComponentModel.DataAnnotations;

namespace Nozus.Web.Api.Models
{
    public class UserModel
    {
        public int? Id { get; set; }

        [Required]
        [Display(Name = "User name")]
        public string UserName { get; set; }

        [Required]
        [StringLength(100,
            ErrorMessage = "The {0} must be at least {2} characters long.",
            MinimumLength = 6)]
        [DataType(DataType.Password)]
        [Display(Name = "Password")]
        public string Password { get; set; }

        [DataType(DataType.Password)]
        [Display(Name = "Confirm password")]
        [Compare("Password",
            ErrorMessage = "The password and confirmation password do not match.")]
        public string ConfirmPassword { get; set; }
    }
}

So now that we’ve got a model class, how do we map from our domain AppUser class to this UserModel class? You can roll your own mappers, which is fine, but you might want to use a mapping library. In .Net, almost everybody uses AutoMapper. It is very versatile and works in a variety of environments. So feel free to use AutoMapper if you like, but I’m going to use Mapster instead. It’s another mapper that I currently support. I use it because it gives me everything I need from AutoMapper and is ~10-50x faster on average. The setup of these two mappers is very similar, so they translate easily in most cases. Install the Mapster package into your Web.Api project, selecting nuget.org as your source.

PM> Install-Package Mapster
Installing NuGet package Mapster.1.14.0.

Now create a Mapping folder in your Web.Api project.  Create a class in this folder called UserMapping.  It should inherit from Mapster’s Registry class. A registry class can be scanned at startup to register all of the mappings.

using Mapster;
using Mapster.Registration;
using Nozus.Domain.Entities.Identity;
using Nozus.Web.Api.Models;

namespace Nozus.Web.Api.Mapping
{
    public class UserMapping : Registry
    {
        public override void Apply()
        {
            TypeAdapterConfig<AppUser, UserModel>.NewConfig()
                .Ignore(dest => dest.ConfirmPassword)
                .Ignore(dest => dest.Password);
        }
    }
}

Here, we’re telling Mapster to map AppUser to UserModel and to ignore the destination ConfirmPassword and Password fields since we don’t want those returned.  Now in the Startup.cs constructor, we’ll add a call to find and register all of our Registry classes.  Of course right now there’s just the one, but it will grow as the application grows.

public Startup(IHostingEnvironment env)
{
    Configuration = new Configuration()
        .AddJsonFile("config.json")
        .AddEnvironmentVariables();

    // Scan for our mappings
    Mapster.Registration.Registrar
        .RegisterFromAssembly(Assembly.GetExecutingAssembly());
}

So now, to finish up, we’ll implement our AccountController. This will do things like register new users, reset passwords, inactivate users, etc… For right now, we’ll just add methods to create a new user and to retrieve a user. Add the code below to your AccountController class.

using System.Net;
using System.Threading.Tasks;
using Mapster;
using Microsoft.AspNet.Identity;
using Microsoft.AspNet.Mvc;
using Nozus.Domain.Entities.Identity;
using Nozus.Web.Api.Models;

namespace Nozus.Web.Api.Controllers
{
    [Route("[controller]")]
    public class AccountController : Controller
    {
        private readonly UserManager<AppUser> _userManager;

        public AccountController(UserManager<AppUser> userManager)
        {
            _userManager = userManager;
        }

        // POST api/Account/
        [AllowAnonymous]
        [HttpPost]
        public async Task<IActionResult> Post(
            [FromBody] UserModel userModel)
        {
            if (!ModelState.IsValid)
            {
                return new BadRequestObjectResult(ModelState);
            }
            var user = new AppUser { UserName = userModel.UserName };

            IdentityResult idResult =
                await _userManager.CreateAsync(user, userModel.Password);

            IActionResult errorResult = GetErrorResult(idResult);
            if (errorResult != null)
            {
                return errorResult;
            }

            // Put together a response
            string url =
                Url.RouteUrl("GetUserById", new { userId = user.Id },
                    Request.Scheme, Request.Host.ToUriComponent());
            Context.Response.Headers["Location"] = url;
            Context.Response.StatusCode = (int)HttpStatusCode.Created;

            return new ObjectResult(TypeAdapter.Adapt<UserModel>(user));
        }

        // GET api/Account/1
        [HttpGet("{userId:int}", Name = "GetUserById")]
        public async Task<IActionResult> GetUserById(int userId)
        {
            var user =
                await _userManager.FindByIdAsync(userId.ToString());

            return new ObjectResult(TypeAdapter.Adapt<UserModel>(user));
        }

        private IActionResult GetErrorResult(IdentityResult result)
        {
            if (!result.Succeeded)
            {
                if (result.Errors != null)
                {
                    foreach (var error in result.Errors)
                    {
                        ModelState.AddModelError(error.Code, error.Description);
                    }
                }
                return new BadRequestObjectResult(ModelState);
            }
            return null;
        }
    }
}

So there’s a fair amount going on here.  The Post method accepts a UserModel, validates it using the annotations on the class and returns a BadRequest response with the model errors if validation fails.  If it succeeds, it uses the UserManager that is included with Identity to attempt to create an AppUser.  If the create fails, we will build an error response and return it. If the create is successful, it will store this new AppUser in our database and will also populate its assigned UserId.  We’ll create an object response and use Mapster to map the AppUser back to a response UserModel, but we’ll also add a header to the response that points to the Get uri that can be used to retrieve the user.

Now let’s try this out. Start your project, and the Home page should be displayed. If you don’t have it installed already, install Postman. Fire up Postman, and let’s perform a POST to our API to create a new user.


Select POST from the method dropdown. Set the URL to {your api url}/Account. Here, mine is http://localhost:13171/Account. Add a Content-Type header and set it to application/json. Put some JSON in the payload that will pass password validation:

{
 "userName": "RicoSuave",
 "password": "MyPassword_123",
 "confirmPassword": "MyPassword_123"
}

Now click the Send button.  The tabs on the bottom should show the response.  A new user with ID should be returned.  If you examine the Headers tab, you will also see the Location header we added to the response.  Take note of the ID and try out the GET that we added to retrieve the user as well.  Note that because our DB setup method in Startup.cs deletes the DB on each restart, your users will be lost with each run of your project. Feel free to change this.

Before we end, there’s one more change I want to make. Most consumers expect our response to come across as camel-cased JSON. In addition, we don’t want to transmit null values; in JSON, we should just skip those properties to keep our payload concise. Open the Startup.cs class, and in the ConfigureServices method, replace the code that sets our MvcOptions with the following:

services.Configure<MvcOptions>(options =>
{
  options.OutputFormatters.RemoveAll(
    f => f.Instance is XmlDataContractSerializerOutputFormatter);

  var formatter = options.OutputFormatters.FirstOrDefault(
    f => f.Instance is JsonOutputFormatter);

  var jsonFormatter = formatter?.Instance as JsonOutputFormatter;
  if (jsonFormatter != null)
  {
    jsonFormatter.SerializerSettings.ContractResolver = 
      new CamelCasePropertyNamesContractResolver();
    jsonFormatter.SerializerSettings.NullValueHandling = 
      NullValueHandling.Ignore;
  }
});

This will drop null values from our json formatting and camel case all properties by default.  Try out the API again to see the difference.

Next time, we’ll take a look at setting up a basic Aurelia project and calling our basic API.