Nozus JS 1: Intro to Sails with Passport and JWT (JSON Web Token) Auth

This project extends from some previous posts on creating a SPA style application with Node (Sails) and Aurelia. I’m not going to go into detail about installing Node, NPM, or Sails, other than when it’s germane to the subject.  I’m assuming you are already set up with all of the needed basics for Node/Sails development. If not, visit the Sails site to get started.

We’ll start by creating a new directory for our test projects and bringing up a command line. What we’re creating here is an API, so there’s no need for any front end mixed into the Sails application; we can add that later if we like. The command to create a new app without a front end is:

sails new myApi --no-frontend

The basics of the app should now be present in the “myApi” directory (or whatever you called it). If you open up the project in your favorite editor, you will see the following structure:

[Screenshot: the generated project structure]

Most of the action will happen in the api folder but we’ll also need to set up some config settings. But first let’s go ahead and install some dependencies. Make sure you are in the root folder of your project and install the following from the command line:

npm install jsonwebtoken --save
npm install bcrypt-nodejs --save
npm install passport --save
npm install passport-jwt --save
npm install passport-local --save

Alternatively, you could just add these to the package.json file and run npm install.
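If you go the package.json route, the dependencies section would look something like the following. The version ranges here are illustrative only; pin whatever versions npm resolves for you:

```json
{
  "dependencies": {
    "bcrypt-nodejs": "0.0.3",
    "jsonwebtoken": "^5.0.0",
    "passport": "^0.2.0",
    "passport-jwt": "^1.0.0",
    "passport-local": "^1.0.0"
  }
}
```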

Config

Now we’ll add our passport configuration.  Another nice thing about Sails is that putting a js file in the /config folder means that it will be run when you “lift” or start the app. In the config folder, create passport.js and add the following code:


/**
 * Passport configuration file where you should configure strategies
 */
var passport = require('passport');
var LocalStrategy = require('passport-local').Strategy;
var JwtStrategy = require('passport-jwt').Strategy;

var EXPIRES_IN_MINUTES = 60 * 24; // token lifetime: 24 hours
var SECRET = process.env.tokenSecret || "4ukI0uIVnB3iI1yxj646fVXSE3ZVk4doZgz6fTbNg7jO41EAtl20J5F7Trtwe7OM";
var ALGORITHM = "HS256";
var ISSUER = "nozus.com";
var AUDIENCE = "nozus.com";

/**
 * Configuration object for local strategy
 */
var LOCAL_STRATEGY_CONFIG = {
  usernameField: 'email',
  passwordField: 'password',
  passReqToCallback: false
};

/**
 * Configuration object for JWT strategy
 */
var JWT_STRATEGY_CONFIG = {
  secretOrKey: SECRET,
  issuer : ISSUER,
  audience: AUDIENCE,
  passReqToCallback: false
};

/**
 * Triggers when user authenticates via local strategy
 */
function _onLocalStrategyAuth(email, password, next) {
  User.findOne({email: email})
    .exec(function (error, user) {
      if (error) return next(error, false, {});

      if (!user) return next(null, false, {
        code: 'E_USER_NOT_FOUND',
        message: email + ' is not found'
      });

      // TODO: replace with new cipher service type
      if (!CipherService.comparePassword(password, user))
        return next(null, false, {
          code: 'E_WRONG_PASSWORD',
          message: 'Password is wrong'
        });

      return next(null, user, {});
    });
}

/**
 * Triggers when user authenticates via JWT strategy
 */
function _onJwtStrategyAuth(payload, next) {
  var user = payload.user;
  return next(null, user, {});
}

passport.use(
  new LocalStrategy(LOCAL_STRATEGY_CONFIG, _onLocalStrategyAuth));
passport.use(
  new JwtStrategy(JWT_STRATEGY_CONFIG, _onJwtStrategyAuth));

module.exports.jwtSettings = {
  expiresInMinutes: EXPIRES_IN_MINUTES,
  secret: SECRET,
  algorithm : ALGORITHM,
  issuer : ISSUER,
  audience : AUDIENCE
};

So there are a few things going on here. First, we’re importing both Passport and the two strategies we’re going to set up for now (local and JWT); we’ll add social login in the next post. Next, we add configuration for our two auth strategies. For the local strategy, we’ll use email and password to log in. For JWT, we set up the parameters needed to verify the token: secret, issuer and audience. Note that the issuer and audience are optional but provide some additional verification.

Next we add functions to handle the auth request for each strategy. For local auth, this means checking the credentials against the user in our database and returning a false response with a message on failure, or the user on success. For JWT auth, the strategy has already validated the token internally. If you want additional validation here, you can check the user ID against the database to verify that the user actually exists and is active. For the sake of simplicity, I’m going to trust the token, retrieve the user information from the token payload, and just return it.
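If you do want the stricter variant, it could be sketched like this. Note this is my own illustration, not code from the project: `findUser` stands in for Sails' global `User.findOne(...).exec(...)` so the shape of the callback is clear.

```javascript
// Hypothetical stricter JWT callback: instead of trusting the token
// payload, re-check the user against the data store. `findUser` is a
// stand-in for User.findOne({id: ...}).exec(...) in a real Sails app.
function onJwtStrategyAuthStrict(findUser, payload, next) {
  findUser(payload.user.id, function (error, user) {
    if (error) return next(error, false, {});

    if (!user) return next(null, false, {
      code: 'E_USER_NOT_FOUND',
      message: 'User no longer exists'
    });

    return next(null, user, {});
  });
}
```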

Last, we export some settings to be used elsewhere in the API. This is another nice feature of Sails: settings exported from a config file are available globally. In our case, we’re exporting jwtSettings, which can be accessed from anywhere via sails.config.jwtSettings.

Services

Next, add a new file under api/services called CipherService.js. In Sails, services and models are named using PascalCase by convention. I need a better name for this one, but it will do for the time being. This is where we’ll put all of our hashing and token code.

var bcrypt = require('bcrypt-nodejs');
var jwt = require('jsonwebtoken');

module.exports = {
  secret: sails.config.jwtSettings.secret,
  issuer: sails.config.jwtSettings.issuer,
  audience: sails.config.jwtSettings.audience,

  /**
   * Hash the password field of the passed user.
   */
  hashPassword: function (user) {
    if (user.password) {
      user.password = bcrypt.hashSync(user.password);
    }
  },

  /**
   * Compare user password hash with unhashed password
   * @returns boolean indicating a match
   */
  comparePassword: function(password, user){
    return bcrypt.compareSync(password, user.password);
  },

  /**
   * Create a token based on the passed user
   * @param user
   */
  createToken: function(user)
  {
    return jwt.sign({
        user: user.toJSON()
      },
      sails.config.jwtSettings.secret,
      {
        algorithm: sails.config.jwtSettings.algorithm,
        expiresInMinutes: sails.config.jwtSettings.expiresInMinutes,
        issuer: sails.config.jwtSettings.issuer,
        audience: sails.config.jwtSettings.audience
      }
    );
  }
};

We’ll use these methods elsewhere. The hashPassword/comparePassword methods are pretty self-explanatory; bcrypt-nodejs even adds a salt to the hashed password by default. Pretty nice. The createToken method uses jsonwebtoken to create a new token via the sign method, which accepts a payload (the user in our case), a secret for the signature, and some metadata including the algorithm, expiration, issuer and audience. The issuer and audience will also be validated when the token is verified coming back in, which gives a little extra protection. The inclusion of the whole user in the payload is really just for example; in the real world, you might include just the user ID and claims.

API Generation

Next we’ll create our User API. Return to the root command line of the application and run the following command:

sails generate api User

If you now look in your controllers and models directories, you’ll see the generated controller and model for User. The controller can be left as-is for now; although it looks empty, it’s functional! It will use the default blueprints included with Sails to provide functionality. Now we’ll want to enhance the model. Open api/models/User.js and add the following code:

/**
 * User
 * @description :: Model for storing users
 */
module.exports = {
    schema: true,
    attributes: {
        username: {
            type: 'string',
            required: true,
            unique: true,
            alphanumericdashed: true
        },
        password: {
            type: 'string'
        },
        email: {
            type: 'string',
            email: true,
            required: true,
            unique: true
        },
        firstName: {
            type: 'string',
            defaultsTo: ''
        },
        lastName: {
            type: 'string',
            defaultsTo: ''
        },
        photo: {
            type: 'string',
            defaultsTo: '',
            url: true
        },
        socialProfiles: {
            type: 'object',
            defaultsTo: {}
        },

        toJSON: function () {
            var obj = this.toObject();
            delete obj.password;
            delete obj.socialProfiles;
            return obj;
        }
    },
    beforeUpdate: function (values, next) {
        CipherService.hashPassword(values);
        next();
    },
    beforeCreate: function (values, next) {
        CipherService.hashPassword(values);
        next();
    }
};

Here we’re adding a bunch of properties to our User schema, plus a toJSON method to remove sensitive data before the object is converted to JSON. We also define beforeUpdate and beforeCreate lifecycle callbacks to hash the password before saving to the data store. The hashing uses our CipherService, and one of the nice things about Sails is that it makes services, and likewise models, available globally by name; so here we can just use CipherService without needing a require.
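To see that flow in isolation, here’s a toy version of the two hooks. This is not Waterline itself, and the hash is a stand-in for bcrypt: beforeCreate mutates the incoming values, and toJSON strips the sensitive fields before serialization.

```javascript
// Toy stand-ins for the model lifecycle above -- not Waterline itself.
function beforeCreate(values, next) {
  // In the real model this calls CipherService.hashPassword(values)
  values.password = 'hashed:' + values.password; // stand-in for bcrypt
  next();
}

function toJSON(record) {
  var obj = Object.assign({}, record); // stand-in for this.toObject()
  delete obj.password;
  delete obj.socialProfiles;
  return obj;
}

var values = { email: 'test1@test.com', password: 'secret', socialProfiles: {} };
beforeCreate(values, function () {});   // values.password is now hashed
var safe = toJSON(values);              // safe has no password field
```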

Auth Controller

Now we have to add an Auth endpoint to perform signup, signin and so on. We’re only generating a controller here, so you can either just add the js file or do it the Sails way:

sails generate controller Auth

Now that we have our AuthController, let’s add the following actions:

/**
 * AuthController
 * @description :: Server-side logic for managing user authorization
 */
var passport = require('passport');
/**
 * Triggers when user authenticates via passport
 * @param {Object} req Request object
 * @param {Object} res Response object
 * @param {Object} error Error object
 * @param {Object} user User profile
 * @param {Object} info Info if some error occurs
 * @private
 */
function _onPassportAuth(req, res, error, user, info) {
  if (error) return res.serverError(error);
  if (!user) return res.unauthorized(null, info && info.code, info && info.message);

  return res.ok({
    // TODO: replace with new type of cipher service
    token: CipherService.createToken(user),
    user: user
  });
}

module.exports = {
  /**
   * Sign up in system
   * @param {Object} req Request object
   * @param {Object} res Response object
   */
  signup: function (req, res) {
    User
      .create(_.omit(req.allParams(), 'id'))
      .then(function (user) {
        return {
          // TODO: replace with new type of cipher service
          token: CipherService.createToken(user),
          user: user
        };
      })
      .then(res.created)
      .catch(res.serverError);
  },

  /**
   * Sign in by local strategy in passport
   * @param {Object} req Request object
   * @param {Object} res Response object
   */
  signin: function (req, res) {
    passport.authenticate('local', 
      _onPassportAuth.bind(this, req, res))(req, res);
  },
};

So here we’re doing two basic things: sign up and sign in. The signup method uses the built-in create method provided by the User model to create a new user. The signin method uses Passport’s local strategy to log the user in. In both cases, on success we generate a token so the user is signed in automatically. All requests that follow should include the returned token in the header. We’ll demo this shortly, but we have one final item to add first: the login policy.

Policies

In Sails, the way to add Express middleware is via policies. If you examine the policy structure, that’s really exactly what it is. In our case, we need a policy that will protect our non-auth controllers from requests that don’t have a token. Look at the api/policies folder. There may be a sessionAuth.js file already present. You can delete this file as we won’t use it. Now add a new file to api/policies called isAuthenticated.js with the following contents:

/**
 * isAuthenticated
 * @description :: Policy to inject user in req via JSON Web Token
 */
var passport = require('passport');

module.exports = function (req, res, next) {
  passport.authenticate('jwt', function (error, user, info) {
    if (error) return res.serverError(error);
    if (!user)
      return res.unauthorized(null, info && info.code, info && info.message);

    req.user = user;
    next();
  })(req, res);
};

The policy is pretty simple: It authenticates the user using the ‘jwt’ strategy that we earlier implemented in the passport configuration. If no token is present, this will return an unauthorized response. Otherwise it will add the user object from the token to the request.
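Once the policy has run, any protected action can rely on req.user. A hypothetical action illustrating this (the `me` endpoint name is my own, not part of the generated controller):

```javascript
// Hypothetical controller action: returns the current user that the
// isAuthenticated policy attached to the request from the JWT payload.
function me(req, res) {
  return res.ok(req.user);
}

module.exports = { me: me };
```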

Responses

Let’s briefly discuss Sails responses, another cool Sails convention. If you take a look at api/responses, you’ll see a bunch of responses that have been created for you already. These make it easy to define and reuse typical responses. So to return an OK response, instead of defining our response over and over, we can just do: return res.ok(data). In our case, we need to add two new custom responses: created and unauthorized.  In the responses folder add created.js with the following content:

/**
 * 201 (Created) Response
 * Successful creation occurred (via either POST or PUT).
 * Set the Location header to contain a link 
 * to the newly-created resource (on POST).
 * Response body content may or may not be present.
 */
module.exports = function (data, code, message, root) {
  var response = _.assign({
    code: code || 'CREATED',
    message: message 
       || 'The request has resulted in a new resource being created',
    data: data || {}
  }, root);

  this.req._sails.log.silly('Sent (201 CREATED)\n', response);

  this.res.status(201);
  this.res.jsonx(response);
};

Now create an unauthorized.js file in the same folder:

/**
 * 401 (Unauthorized) Response
 * Similar to 403 Forbidden.
 * Specifically for authentication failed or not yet provided.
 */
module.exports = function (data, code, message, root) {
  var response = _.assign({
    code: code || 'E_UNAUTHORIZED',
    message: message || 'Missing or invalid authentication token',
    data: data || {}
  }, root);

  this.req._sails.log.silly('Sent (401 UNAUTHORIZED)\n', response);

  this.res.status(401);
  this.res.jsonx(response);
};

I’ve also customized the other responses in accordance with guidance provided by the Sails API Yeoman generator, but I’m not going to go through all of them. If you would like to copy them, feel free to reference the project on GitHub. While you’re there, also check out the overridden blueprints in api/blueprints. These let you customize the standard processing of the operations exposed by the API controllers (unless you override them specifically in a controller). I also took these blueprints from the Sails API Yeoman generator.

Now we have our policy…how do we use it? In the /config folder, you should see a policies.js file. Open this file and adjust the code to the following:

module.exports.policies = {

    '*': ['isAuthenticated'],

    AuthController: {
        '*': true
    }
};

This is applying the following rules:

  1. Protect all controllers from unauthenticated users.
  2. Override this for the AuthController and allow anybody to hit that.
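Policies can also be mapped per action if you need finer control. A hypothetical variant (these action mappings are illustrative, not part of this project):

```javascript
module.exports.policies = {

    '*': ['isAuthenticated'],

    UserController: {
        // e.g. allow anonymous reads while keeping writes protected
        find: true,
        findOne: true
    },

    AuthController: {
        '*': true
    }
};
```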

Loose Ends

OK, before we test this thing out, just a couple of small cleanups that will make Sails bug you less on “lift”. First open /config/models.js. Let’s set up a standard migration policy so it won’t prompt on every project start. Uncomment or add the migrate line and set it to 'drop' for now. This will drop the data store each time the application runs. You can change it to 'alter' if you don’t want this behavior.

migrate: 'drop'

Now we’ll tell Sails not to expect a Gruntfile. When you create a Sails app with --no-frontend, it doesn’t scaffold a Gruntfile, but it still warns that there isn’t one on every start. Go to /.sailsrc and disable the Grunt hook:

{
  "generators": {
    "modules": {}
  },
  "hooks":{
    "grunt":false
  }
}

Testing it Out!

Wow… that was a lot of stuff to run through, but we’re basically done! Let’s test it out. If you don’t already have Postman (or your preferred test tool) installed, go ahead and install it.

Now let’s start up our API. At your project root terminal, type:

sails lift

You should see Sails start up on port 1337 by default. Now fire up Postman and sign up a user by posting the following JSON payload to your local signup endpoint. In my case, this is: http://localhost:1337/auth/signup

{
  "username": "testdude",
  "email": "test1@test.com",
  "password": "testdude"
}

[Screenshot: signup request in Postman]

The response should look like:

{
  "code": "CREATED",
  "message": "The request has been fulfilled and resulted in a new resource being created",
  "data": {
    "token": "eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ1c2VyIjp7InVzZXJuYW1lIjoidGVzdGR1ZGUiLCJlbWFpbCI6InRlc3QxQHRlc3QuY29tIiwiZmlyc3ROYW1lIjoiIiwibGFzdE5hbWUiOiIiLCJwaG90byI6IiIsImNyZWF0ZWRBdCI6IjIwMTUtMDQtMjRUMjE6NTg6MTkuMjcxWiIsInVwZGF0ZWRBdCI6IjIwMTUtMDQtMjRUMjE6NTg6MTkuMjcxWiIsImlkIjoyfSwiaWF0IjoxNDI5OTEyNjk5LCJleHAiOjE0Mjk5OTkwOTksImF1ZCI6Im5venVzLmNvbSIsImlzcyI6Im5venVzLmNvbSJ9.j9mSeoHJiNb_rzxqJ8Cefv5ctcMVzbgvnUlvAWhbXas",
    "user": {
      "username": "testdude",
      "email": "test1@test.com",
      "firstName": "",
      "lastName": "",
      "photo": "",
      "createdAt": "2015-04-24T21:58:19.271Z",
      "updatedAt": "2015-04-24T21:58:19.271Z",
      "id": 2
    }
  }
}

You’ll notice that a token was generated.  Copy the token that your API generated so that it’s available later.

We’ll now attempt to sign in as well. Post the following JSON to your local signin endpoint. In my case, this is: http://localhost:1337/auth/signin. Earlier in our Passport config we set up local auth to use email and password, but we could change this to use the username if that’s preferable.

{
  "email": "test1@test.com",
  "password": "testdude"
}

In this case, the response should be almost identical but the response code will be a 200-OK instead of a 201-Created.

Now let’s try to access a resource without our token. Perform a GET on your local user endpoint. In my case: http://localhost:1337/user

You should get a response indicating that you are not authorized:

{
  "code": "E_UNAUTHORIZED",
  "message": "No auth token",
  "data": {}
}

Now let’s update our request to add an Authorization header. For the value, add “JWT”, then a space, then the token you copied previously. An example value would be:

JWT eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ1c2VyIjp7InVzZXJuYW1lIjoidGVzdGR1ZGUiLCJlbWFpbCI6InRlc3QxQHRlc3QuY29tIiwiZmlyc3ROYW1lIjoiIiwibGFzdE5hbWUiOiIiLCJwaG90byI6IiIsImNyZWF0ZWRBdCI6IjIwMTUtMDQtMjRUMjE6NTg6MTkuMjcxWiIsInVwZGF0ZWRBdCI6IjIwMTUtMDQtMjRUMjE6NTg6MTkuMjcxWiIsImlkIjoyfSwiaWF0IjoxNDI5OTEzNTI3LCJleHAiOjE0Mjk5OTk5MjcsImF1ZCI6Im5venVzLmNvbSIsImlzcyI6Im5venVzLmNvbSJ9.9FqGAVnJNM3SWRupOqCoW7tPJqu0ChZt5f2_En6GKqo

[Screenshot: GET /user request in Postman]

Now send the GET request and you should get back the user record we created when we signed up!

{
  "code": "OK",
  "message": "Operation is successfully executed",
  "data": [
    {
      "username": "testdude",
      "email": "test1@test.com",
      "firstName": "",
      "lastName": "",
      "photo": "",
      "createdAt": "2015-04-24T21:58:19.271Z",
      "updatedAt": "2015-04-24T21:58:19.271Z",
      "id": 2
    }
  ]
}

Final Thoughts

Seems like we did a lot of stuff here, but it was all pretty easy thanks to the conventions and code generation provided by Sails. In upcoming articles, we’ll look at expanding this example to include social auth, a front end via Aurelia and additional storage mechanisms (we’re just using disk here).

The code for this example is available here. The code in my uploaded GitHub example uses PostgreSQL instead of writing to disk, but that can easily be changed in the config/models.js file. Just point to any connection you have set up in the config/connections.js file.

Warnings

Always remember that this style of token auth is not secure if the token can be intercepted. In production, always perform communications with the API (at least those that send the token or any sensitive information) over SSL.

Additionally, you don’t want to check any production secrets into public source control; in this case, that means the secret in the passport config file. In a follow-up we’ll talk about how to handle this, but you can read about config overrides here.

Nozus JS Preface: Sails with Passport and JWT – Ermahgerd!

The Sails project in this case is only going to serve as a RESTful API. We’ll be using Aurelia on the front end, so we don’t need any presentation. What we do need is authentication provided by the API.

So my local auth implementation went through a few stages and much hair pulling. I first took a look at sails-generate-auth. This seemed like a great starting place…but I ran into some issues:

  • You can use tokens to protect the API, but the social auth implementation is geared towards server-rendered UI by default. I want something more SPA-oriented.
  • It doesn’t currently use JWT, although it supports bearer tokens.

So I moved on to Waterlock. Waterlock has a lot of cool features and is super-easy to set up. In addition, it appears to be more oriented towards a SPA/API implementation.

But I ran into some issues: it uses sessions in situations where tokens are being used, which defeats some of the advantages of tokens and reduces horizontal scalability. Some remedies are being worked on, but they have been stuck in pull-request status for several months, although it looks as though they may be resolved soon. So I figured I could just reference the GitHub fork from npm until this gets resolved… but I was still concerned about the session usage.

And then there’s local auth in Waterlock: there is no register endpoint. There is a login endpoint that auto-registers you if it can’t find you. :( So if you fat-finger your username, it just goes ahead and makes a new user. Again, there is a pull request that hasn’t yet been merged.

At this point I decided I needed to keep looking. I could try and get those pull requests across the line but I’m already leaning away because of the opacity/complexity of the library as a whole and the session requirement.  I don’t need something this complex but it’s a great reference implementation.

So next I found a Yeoman generator for Sails that is specialized for APIs and implements JWT-based auth. This is a really cool project, and major kudos to Eugene, who was also very responsive to questions. But then, just when I felt so close: the Yeoman generator failed because of a Windows file system issue, probably a path-length problem due to nested npm packages. I’ll see you in hell, Windows file system!!! Eugene mentioned that he hadn’t tested it on Windows… so there it was.

This may have been a good thing because the generator/template also outputs a lot of functionality that is both very cool and yet unnecessary for my project. The upside being that you can blow out a complex API in almost no time, but I didn’t want to go through and attempt to clean out everything that I didn’t want/need.  I also wanted to go through the process myself so I understood it better.

So I used the generator templates as a reference project and borrowed some great things from it including updated API blueprints and a basic approach for issuing and validating JWT tokens.

So hopefully if you come across this post and are in a similar boat (npi), I can save you some of my frustration (aka “earned knowledge”). I’ll be adding the implementation information in the next post, but a reference implementation is already available in GitHub.  Hope that’s helpful!

Next in series:

Nozus JS 1: Intro to Sails with Passport and JWT (JSON Web Token) Auth

Nozus: Changing the API to Node JS and Sails

So after futzing around with ASP.Net 5/MVC 6 for a bit and becoming frustrated with attempting to implement token-based auth, I decided to switch over to Node JS, using Sails as an alternative for my API.

Why Sails?  I took a look at it a few months back and was impressed with the easy setup and Waterline ORM. I then had a recent job opportunity where the client was using Sails.  I passed on the opportunity but decided to review Sails again. The combination of convention with easy overrides is super productive and flexible.  Additionally, the documentation is well done for an OSS framework.

I look forward to the eventual rollout of ASP.Net 5, but it’s still under heavy dev and what works one day doesn’t on the next pull.  This is totally understandable but I’m ready to move along and get something working.  I’m already biting off a fair amount of learnin’ by attempting Node and Aurelia.  So without further ado, I’ve added the following growing list of articles on setting up Nozus in node, including examples in GitHub.

Nozus JS Preface: Sails with Passport and JWT – Ermahgerd!

Nozus JS 1: Intro to Sails with Passport and JWT (JSON Web Token) Auth

Nozus Step 3: Adding ASP.Net Identity to MVC 6

In the previous post, we added some simple logging to our API using Serilog and simple middleware. In this installment, we’ll be adding identity management to that project using the standard ASP.Net Identity libraries.

First, let’s do a little project cleanup. In Nozus.Data and Nozus.Domain, go ahead and delete the default Class1 that was added when we created those projects. Also, from the HomeController in Web.Api, make sure you remove the thrown exception added at the end of the previous post.

Now we’re going to add in the packages needed for the identity system. Open your Package Manager Console (Tools –> Nuget Package Manager –> Package Manager Console). In the console, select your VNext package source and install the Identity.EntityFramework package into both the Data and Web.Api projects. Make sure to include prerelease packages; the IncludePrerelease flag can be abbreviated to “-inc”.

PM> Install-Package Microsoft.AspNet.Identity.EntityFramework -inc
Installing NuGet package Microsoft.AspNet.Identity.EntityFramework.3.0.0-beta3.
PM> Install-Package Microsoft.AspNet.Identity.EntityFramework -inc
Installing NuGet package Microsoft.AspNet.Identity.EntityFramework.3.0.0-beta3.

We’re going to be using SQL Server as the data store, but you can use any applicable store.  From the Package Manager Console, install EntityFramework.SqlServer into the Web.Api project only.

PM> Install-Package EntityFramework.SqlServer -inc
Installing NuGet package EntityFramework.SqlServer.7.0.0-beta3.

Finally, in our Domain project, we’re going to add in a package to support basic Authentication classes.  We’ll derive from these later.

PM> Install-Package Microsoft.AspNet.Identity.Authentication -inc
Installing NuGet package Microsoft.AspNet.Identity.Authentication.3.0.0-alpha4.

Now that all packages are installed…let’s put identity into place.  We’re going to first add in our user and role classes into the Domain project.  In the Domain project, add an Entities folder and inside of that folder add an Identity folder. Inside of the Identity folder, add two classes, AppUser and AppRole.  For the moment, these classes require no implementation other than inheriting from the proper identity classes.  AppUser will inherit from IdentityUser<int> and AppRole will inherit from IdentityRole<int>.  The type specifies the type of the ID that will be used in the entities and the database.  By default, an int will be set up as an Identity column in SQL Server.

using Microsoft.AspNet.Identity;
namespace Nozus.Domain.Entities.Identity
{
    public class AppUser : IdentityUser<int>
    {}
}

using Microsoft.AspNet.Identity;
namespace Nozus.Domain.Entities.Identity
{
   public class AppRole : IdentityRole<int>
   {}
}

Now add a reference to the Domain project from the Data project.  Add a reference to both Domain and Data from the Web.Api.  In the Data project, add an AppDbContext class.  Inherit this class from IdentityDbContext<AppUser, AppRole, int>.


using Microsoft.AspNet.Identity.EntityFramework;
using Nozus.Domain.Entities.Identity;
namespace Nozus.Data
{
    public class AppDbContext
        : IdentityDbContext<AppUser, AppRole, int>
    {}
}

We’ll use this as our EF DbContext for the solution, and we can dual-purpose it by inheriting from IdentityDbContext. We’re also specifying our user and role types as well as the type for the IDs, which applies to both User and Role. The project structure should now look similar to the image at the right.

Now let’s fix up our Web.Api project to use this identity setup. All the initial action happens in the Startup class because, as expected, the identity functionality is set up using middleware. In the ConfigureServices method, we’ll configure both Entity Framework and Identity as follows:


public void ConfigureServices(IServiceCollection services)
{
    services.AddLogging(Configuration);

    // Add EF services to the services container.
    services.AddEntityFramework(Configuration)
        .AddSqlServer()
        .AddDbContext<AppDbContext>();

    // Add Identity services to the services container.
    services.AddIdentity<AppUser, AppRole>(Configuration)
        .AddEntityFrameworkStores<AppDbContext, int>();

    services.AddMvc();
    services.Configure<MvcOptions>(options =>
    {
        options.OutputFormatters.RemoveAll(formatter =>
            formatter.Instance is XmlDataContractSerializerOutputFormatter);
    });
}

You can  see that we’re adding EF, telling it to use SQL Server and then giving it the type of our DbContext.  Next we’re adding identity, telling it our user and role types, and letting it know to use an EF store with our DbContext and the type of IDs it will be using.  Next, we’ll do some more setup in the Configure method.


public void Configure(IApplicationBuilder app, 
  IHostingEnvironment env, ILoggerFactory loggerfactory)
{
  loggerfactory.AddSerilog(GetLoggerConfiguration());
  app.UseStaticFiles();

  //*** Tell the app to use Identity ***//
  app.UseIdentity();

  app.UseMiddleware<ErrorLoggingMiddleware>();
  app.UseMvc(routes =>
  {
    routes.MapRoute(
    name: "default",
    template: "{controller}/{action}/{id?}",
    defaults: new { 
        controller = "Home", 
        action = "Index" });
  });

  //***  Initialize the DB ***//
  if(env.EnvironmentName == "Development")
    InitializeDb(app.ApplicationServices).Wait();
}

private static async Task InitializeDb(
  IServiceProvider applicationServices)
{
  using (var dbContext = 
    applicationServices.GetService<AppDbContext>())
  {
    var sqlServerDatabase = 
      (SqlServerDatabase)dbContext.Database;
    await sqlServerDatabase.EnsureDeletedAsync();
    if (await sqlServerDatabase.EnsureCreatedAsync())
    {
      //We could add some test data...perhaps later
    }
  }
}

Here, we’re telling the app to use identity.  We’re also calling our InitializeDb method to drop and recreate our database. This is mainly in place for dev purposes, so you could call alternate database setups depending on whether you were running integration tests/developing/etc…  We’ll just have an empty database to start with for now.  We’re using the built in env.EnvironmentName to test that this is running in Dev before dropping and recreating our database.

When you installed the EF package into the Web.Api project, it should have automatically added a config section to your config.json file. Take a look; it should look similar to the following:

{
  "Data": {
    "DefaultConnection": {
      "ConnectionString": "Server=localhost;Database=Nozus;Trusted_Connection=True;MultipleActiveResultSets=true"
    }
  },
  "EntityFramework": {
    "AppDbContext": {
      "ConnectionString": "Data:DefaultConnection:ConnectionString"
    }
  }
}

The data section should contain a connection string.  By default it will set up a connection string to localDB with a random database name.  I’ve changed it to point to my local Sql Server Dev Edition, but localDB will work just fine.  You can explicitly set up which connection string EF uses, but by default, it’s going to find the entry under the EntityFramework section that matches the name of the DbContext.  So in this case, make sure that it’s named “AppDbContext”.
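If you’d rather not rely on that naming convention, you can point the context at a connection string explicitly when registering EF. A sketch under the beta-era APIs (the UseSqlServer overload shown is how I’d expect it to look; verify against your installed build):

```csharp
// In Startup.ConfigureServices: read the connection string from config
// explicitly instead of relying on the EntityFramework:AppDbContext
// naming convention.
services.AddEntityFramework(Configuration)
    .AddSqlServer()
    .AddDbContext<AppDbContext>(options =>
        options.UseSqlServer(
            Configuration["Data:DefaultConnection:ConnectionString"]));
```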

When exchanging data with our API, it’s important that we define contracts for data going in and out. We don’t want to use the domain classes, as they will have lots of properties that we don’t want to expose to the outside world.  If there isn’t already a Models folder in your Web.Api project, add one.  In this folder, create a new class called UserModel.  It should look like the following:

using System.ComponentModel.DataAnnotations;

namespace Nozus.Web.Api.Models
{
  public class UserModel
  {
    public int? Id { get; set; }

    [Required]
    [Display(Name = "User name")]
    public string UserName { get; set; }

    [Required]
    [StringLength(100,
      ErrorMessage =
        "The {0} must be at least {2} characters long.",
      MinimumLength = 6)]
    [DataType(DataType.Password)]
    [Display(Name = "Password")]
    public string Password { get; set; }

    [DataType(DataType.Password)]
    [Display(Name = "Confirm password")]
    [Compare("Password",
      ErrorMessage =
        "The password and confirmation password do not match.")]
    public string ConfirmPassword { get; set; }
  }
}

So now that we’ve got a model class, how do we map from our domain AppUser class to this UserModel class? You can roll your own mappers, which is fine, but you might want to use a mapping library.  In .Net almost everybody uses AutoMapper. It is very versatile and works in a variety of environments.  So feel free to use AutoMapper if you like, but I’m going to use Mapster instead. It’s another mapper that I currently support.  I use it because it gives me everything I need from AutoMapper and is ~10-50x faster on average. The setup of these two mappers is very similar, so they translate easily in most cases.  Install the Mapster package into your Web.Api project, selecting nuget.org as your source.

PM> Install-Package Mapster
Installing NuGet package Mapster.1.14.0.

Now create a Mapping folder in your Web.Api project.  Create a class in this folder called UserMapping.  It should inherit from Mapster’s Registry class. A registry class can be scanned at startup to register all of the mappings.

using Mapster;
using Mapster.Registration;
using Nozus.Domain.Entities.Identity;
using Nozus.Web.Api.Models;

namespace Nozus.Web.Api.Mapping
{
  public class UserMapping : Registry
  {
    public override void Apply()
    {
      TypeAdapterConfig<AppUser, UserModel>.NewConfig()
        .Ignore(dest => dest.ConfirmPassword)
        .Ignore(dest => dest.Password);
    }
  }
}

Here, we’re telling Mapster to map AppUser to UserModel and to ignore the destination ConfirmPassword and Password fields since we don’t want those returned.  Now in the Startup.cs constructor, we’ll add a call to find and register all of our Registry classes.  Of course right now there’s just the one, but it will grow as the application grows.

public Startup(IHostingEnvironment env)
{
  Configuration = new Configuration()
  .AddJsonFile("config.json")
  .AddEnvironmentVariables();
  //Scan for our mappings
  Mapster.Registration.Registrar
  .RegisterFromAssembly(Assembly.GetExecutingAssembly());
}

So now to finish up we’ll implement our AccountController. This will do things like register new users, reset passwords, inactivate users, etc…  For right now we’ll just add methods to create a new user and to retrieve a user.  Add the code below to your AccountController class.

using System.Net;
using System.Threading.Tasks;
using Mapster;
using Microsoft.AspNet.Identity;
using Microsoft.AspNet.Mvc;
using Nozus.Domain.Entities.Identity;
using Nozus.Web.Api.Models;

namespace Nozus.Web.Api.Controllers
{
  [Route("[controller]")]
  public class AccountController : Controller
  {
    private readonly UserManager<AppUser> _userManager;

    public AccountController(UserManager<AppUser> userManager)
    {
      _userManager = userManager;
    }

    // POST /Account
    [AllowAnonymous]
    [HttpPost]
    public async Task<IActionResult> Post(
      [FromBody] UserModel userModel)
    {
      if (!ModelState.IsValid)
      {
        return new BadRequestObjectResult(ModelState);
      }
      var user = new AppUser { UserName = userModel.UserName };

      IdentityResult idResult =
        await _userManager.CreateAsync(user, userModel.Password);

      IActionResult errorResult = GetErrorResult(idResult);
      if (errorResult != null)
      {
        return errorResult;
      }

      // Put together a 201 response with a Location header
      string url =
        Url.RouteUrl("GetUserById", new { userId = user.Id },
          Request.Scheme, Request.Host.ToUriComponent());
      Context.Response.Headers["Location"] = url;
      Context.Response.StatusCode = (int)HttpStatusCode.Created;

      return new ObjectResult(TypeAdapter.Adapt<UserModel>(user));
    }

    // GET /Account/1
    [HttpGet("{userId:int}", Name = "GetUserById")]
    public async Task<IActionResult> GetUserById(int userId)
    {
      var user =
        await _userManager.FindByIdAsync(userId.ToString());

      return new ObjectResult(TypeAdapter.Adapt<UserModel>(user));
    }

    private IActionResult GetErrorResult(IdentityResult result)
    {
      if (!result.Succeeded)
      {
        if (result.Errors != null)
        {
          foreach (var error in result.Errors)
          {
            ModelState.AddModelError(error.Code, error.Description);
          }
        }
        return new BadRequestObjectResult(ModelState);
      }
      return null;
    }
  }
}

So there’s a fair amount going on here.  The Post method accepts a UserModel, validates it using the annotations on the class, and returns a BadRequest response with the model errors if validation fails.  If it succeeds, it uses the UserManager that is included with Identity to attempt to create an AppUser.  If the create fails, we build an error response and return it. If the create is successful, it stores the new AppUser in our database and also populates its assigned UserId.  We create an object response and use Mapster to map the AppUser back to a response UserModel, but we also add a Location header to the response that points to the GET endpoint that can be used to retrieve the user.

Now let’s try this out.  Start your project and the Home page should be displayed. If you don’t have it installed already, install Postman.  Fire up Postman and let’s perform a POST to our API to create a new user.

CreateUserPostman

Select POST from the method dropdown.  Set the url to {your api url}/Account. Here, mine is http://localhost:13171/Account.  Add a Content-Type header and set it to application/json.  Put some json in the payload that will pass password validation:

{
 "userName": "RicoSuave",
 "password": "MyPassword_123",
 "confirmPassword": "MyPassword_123"
}

Now click the Send button.  The tabs on the bottom should show the response.  A new user with ID should be returned.  If you examine the Headers tab, you will also see the Location header we added to the response.  Take note of the ID and try out the GET that we added to retrieve the user as well.  Note that because our DB setup method in Startup.cs deletes the DB on each restart, your users will be lost with each run of your project. Feel free to change this.
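If you’d rather exercise the endpoint from code than from Postman, a minimal console client might look like the sketch below. This is a hypothetical client, not part of the project; the port (13171) matches my local run and will differ on your machine, and the async Main form assumes a recent C# compiler.

```csharp
// Hypothetical client posting the same payload that Postman sent.
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public class CreateUserClient
{
    public static async Task Main()
    {
        using (var client = new HttpClient())
        {
            var json = "{ \"userName\": \"RicoSuave\", " +
                       "\"password\": \"MyPassword_123\", " +
                       "\"confirmPassword\": \"MyPassword_123\" }";

            var response = await client.PostAsync(
                "http://localhost:13171/Account",   // your port will differ
                new StringContent(json, Encoding.UTF8, "application/json"));

            // Expect 201 Created with a Location header pointing at
            // GET /Account/{id}, per the controller above.
            Console.WriteLine(response.StatusCode);
            Console.WriteLine(response.Headers.Location);
        }
    }
}
```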

Before we end, there’s one more change I want to make.  Most consumers expect our response to come across as camel-cased json.  In addition, we don’t want to transmit null values; in json, we should just skip those properties to keep our payload concise.  Open the Startup.cs class and in the ConfigureServices method, replace the code that sets our MvcOptions with the following:

services.Configure<MvcOptions>(options =>
{
  options.OutputFormatters.RemoveAll(
    f => f.Instance is XmlDataContractSerializerOutputFormatter);

  var formatter = options.OutputFormatters.FirstOrDefault(
    f => f.Instance is JsonOutputFormatter);

  var jsonFormatter = formatter?.Instance as JsonOutputFormatter;
  if (jsonFormatter != null)
  {
    jsonFormatter.SerializerSettings.ContractResolver = 
      new CamelCasePropertyNamesContractResolver();
    jsonFormatter.SerializerSettings.NullValueHandling = 
      NullValueHandling.Ignore;
  }
});

This will drop null values from our json formatting and camel case all properties by default.  Try out the API again to see the difference.
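To illustrate the difference with the user from the earlier Postman example (values are illustrative):

```
Before (default formatter settings): Pascal-cased, nulls serialized
{ "Id": 1, "UserName": "RicoSuave", "Password": null, "ConfirmPassword": null }

After: camel-cased, nulls dropped
{ "id": 1, "userName": "RicoSuave" }
```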

Next time, we’ll take a look at setting up a basic Aurelia project and calling our basic API.

Nozus Step 2: Setting up MVC 6 with Basic Logging

The previous post in this series covered basic MVC 6 API project setup. In this post, we’re going to build on that and set up some baseline logging functionality in the API.  We’ll continue to enhance logging as the project progresses.

We’ll start by setting up logging.  In our case, we’re going to use Serilog.  It’s the new kid on the block for .Net logging and looks like it combines some nice elements of structured data with basic log messages.  This can easily be swapped for NLog.  Right now there isn’t 5.0 support for Log4Net, but expect that to change.

If it’s not already available, open the Package Manager Console: Tools –> Nuget Package Manager –> Package Manager Console. In the package source, select your VNext package source and the Web.Api project. At the prompt:

PM> Install-Package Microsoft.Framework.ConfigurationModel.Json -includeprerelease
Installing NuGet package Microsoft.Framework.ConfigurationModel.Json.1.0.0-beta3.
PM> Install-Package Microsoft.Framework.Logging.Serilog -includeprerelease
Installing NuGet package Microsoft.Framework.Logging.Serilog.1.0.0-beta3.

So we just installed JSON configuration support (no more web.config) and support for Serilog.  Now let’s add a config.json file.  Right click on the Web.Api project and select Add –> New Item…  Select ASP.Net Configuration File.  This should correspond to a config.json file.  Just add the default config.json to the project; we’ll add some things to it in a later installment.  I lowercase the file name, but that’s optional.

We’re going to add logging into the Startup.cs now. Open Startup.cs, add a Configuration property, and set it in the constructor. This will set up config information from our config.json and will also add configuration in from any environment variables that may be set in the deployment environment.

 public Startup(IHostingEnvironment env)
 {
   // Setup configuration sources.
   Configuration = new Configuration()
   .AddJsonFile("config.json")
   .AddEnvironmentVariables();
 }

 public IConfiguration Configuration { get; set; }

Now we’ll update the ConfigureServices method.  ConfigureServices sets up services in our DI container.  We’ll add in logging by calling AddLogging() and, while we’re in here, we’ll go ahead and remove XML as a potential output format. Notice that AddLogging reads in the configuration we set up in the constructor.  We’re going JSON only with these services. Why? Because I don’t really want to support XML. Leave it if you like.

public void ConfigureServices(IServiceCollection services)
{
  services.AddLogging(Configuration);
  services.AddMvc();
  services.Configure<MvcOptions>(options =>
  {
    options.OutputFormatters.RemoveAll(formatter =>
      formatter.Instance is XmlDataContractSerializerOutputFormatter);
  });
}

Next, we’ll configure the actual middleware pipeline.  For this we use the Configure method. It’s a little confusing that we have ConfigureServices (DI) and Configure (Pipeline), but that’s the convention.  We’ll add Serilog as our logger and create a method to set up our logger configuration. Serilog can alternately read from app settings if that is preferred and I’ll probably switch it over to do that at some point.  Notice here that I’m writing out to a rolling file on my D drive.  You can write to whatever location or writer works for you.

public void Configure(IApplicationBuilder app,
IHostingEnvironment env, ILoggerFactory loggerFactory)
{
  loggerFactory.AddSerilog(GetLoggerConfiguration());
  app.UseStaticFiles();
  // Add MVC to the request pipeline.
  app.UseMvc(routes =>
  {
    routes.MapRoute(
    name: "default",
    template: "{controller}/{action}/{id?}",
    defaults: new {
        controller = "Home",
        action = "Index" });
  });
}

private static LoggerConfiguration GetLoggerConfiguration()
{
  return new LoggerConfiguration()
    .Enrich.WithMachineName()
    .Enrich.WithProcessId()
    .Enrich.WithThreadId()
    .MinimumLevel.Debug()
    .WriteTo.RollingFile(@"D:\Logs\Nozus.Web.Api\file-{Date}.txt",
      outputTemplate:
        "{Timestamp:yyyy-MM-dd HH:mm:ss.fff zzz} {Level}:{EventId} [{SourceContext}] {Message}{NewLine}{Exception}");
}

Now that we have the Serilog package installed, if you try to compile you might notice a problem.  The compiler has some complaints, and if you look closely, you’ll notice that it’s complaining about Serilog and, in addition, about ASP.Net Core 5.0.

CoreErrors

So if you’re not already aware, we have two flavors of .Net 5.0: the Core flavor, which is being touted as the minimal/side-by-side/deployable/cloud flavor, versus the full framework.  The problem here is that Serilog (and most other legacy .Net assemblies) won’t be compatible with the Core flavor.  Expect this to change as .Net 5.0 goes live and the migration begins.  Right now we are compiling for both the Core and full flavors of the framework.  For now, we’re going to remove Core compilation from our project.  Open the project.json file and remove the aspnetcore50 entry from the frameworks section, leaving:

 "frameworks": {
     "aspnet50": { }
 }

Now compilation should succeed.  Let’s put in some basic logging as an error catch-all.  There are a few ways to accomplish this.  One way is to use the built-in error handling middleware.  To me, this is more geared towards MVC, where one wants to perform some logging and then perhaps render an alternate view from the standard one.  In my case, since this is an API, I just want to log the error and send the standard 500 error out the door.  Perhaps I’ll add in a Production mode later that sends an alternate view out.  The nice thing is that since ASP.Net is now open source, we can see what the error handling middleware does and just make a simpler version of it.  Since I just need something stupid simple, I added my own middleware.

In the Web.Api project add a Middleware folder.  In this folder create a new class called ErrorLoggingMiddleware.  It should look like this:

using System;
using System.Threading.Tasks;
using Microsoft.AspNet.Builder;
using Microsoft.AspNet.Http;
using Microsoft.Framework.Logging;

namespace Nozus.Web.Api.Middleware
{
  public class ErrorLoggingMiddleware
  {
    private readonly RequestDelegate _next;
    private readonly ILogger _logger;

    public ErrorLoggingMiddleware(RequestDelegate next,
      ILoggerFactory loggerFactory)
    {
      _next = next;
      _logger = loggerFactory.Create<ErrorLoggingMiddleware>();
    }

    public async Task Invoke(HttpContext context)
    {
      try
      {
        await _next(context);
      }
      catch (Exception ex)
      {
        _logger.WriteError("An unhandled exception has occurred: "
          + ex.Message, ex);
        throw; // Don't swallow the error
      }
    }
  }
}

All we’re doing is logging the error and throwing it up the chain for eventual handling by the framework. One really nice thing you’ll notice is that our dependencies are injected into the middleware for us.  So middleware is now hooked into the core DI mechanism and is truly first class.  Very nice…  You can also see the basic middleware pattern.  This is the same basic OWIN pattern you may be used to: middleware is a Russian doll where the next middleware component is injected into and wrapped by the current one.  So in our case we just call the next middleware component and log something if there’s an error. Really easy, and the possibilities for open-source/third-party middleware are huge.  I expect this to explode with ASP.Net 5.

So now we need to call our middleware.  In the Configure method, we’ll use the UseMiddleware extension method to add in our custom middleware.

public void Configure(IApplicationBuilder app,
  IHostingEnvironment env, ILoggerFactory loggerFactory)
{
  loggerFactory.AddSerilog(GetLoggerConfiguration());

  app.UseStaticFiles();

  app.UseMiddleware<ErrorLoggingMiddleware>();

  // Add MVC to the request pipeline.
  app.UseMvc(routes =>
  {
    routes.MapRoute(
      name: "default",
      template: "{controller}/{action}/{id?}",
      defaults: new {
        controller = "Home",
        action = "Index" });
  });
}

So easy and now we’re set.  One thing to remember is that the order of adding middleware determines the call sequence, so it’s important to add your component in the right place, which may differ depending on what you are trying to accomplish.

So now as a quick example, let’s log something from our home page by throwing an error from our default HomeController.  Just open the HomeController and have the Index method throw an error, something like:

[HttpGet("/")]
public IActionResult Index()
{
  throw new InvalidOperationException("Ghost in the machine!");
  return View(); // unreachable, just here for the demo
}

Let’s start up our app; it should immediately error, since the exception gets rethrown up the chain by our middleware.

ErrorShot

But if we go to our log file location that we set earlier, we should now have a log file with our error properly logged.

2015-03-11 21:02:58.723 -05:00 Error: [Nozus.Web.Api.Middleware.ErrorLoggingMiddleware] An unhandled exception has occurred: Ghost in the machine!
System.InvalidOperationException: Ghost in the machine!
 at Nozus.Web.Api.Controllers.HomeController.Index() in C:\Users\Visual Studio 14\Projects\Another\Nozus.Web.Api\Controllers\HomeController.cs:line 14
--- End of stack trace from previous location where exception was thrown --etc....

Just remember to remove the thrown error before proceeding further.  In the next installment we’ll set up basic identity management…then we may switch over to Aurelia before getting back to social logins.

Nozus Step 1: Creating a Web API with MVC 6 – Project Setup

This article involves basic Visual Studio and project setup and should go fairly quickly.  I’m going to start out by creating a Web Api using MVC 6.  For reference, I’m using Visual Studio 2015 Preview 6.  Instead of beginning from complete scratch, I’m going to start with the Web API template.

From Visual Studio, go to File –> New –> Project.

In the New Project dialog box, select Web from the template tree and ASP.Net Web Application as the template.

NewProject

In the resulting web project template modal, select ASP.Net 5 Preview Web API. Note:  The Web API template is new with Preview 6.

NewProject2

Now let’s add a couple of basic projects to our solution to round out the API: a Domain class library for all domain entities and interfaces, and a Data project for any repositories or EF 7 DataContexts.

Right click the solution node in the Solution Explorer and select Add –> New Project

NewProjectClassLib

InitialSolution

In the Add New Project dialog, select ASP.Net 5 Class Library as the template. Do this twice: once for a .Domain project and once for a .Data project.  The initial structure should look something like the image to the right.

Now that some basic project structure is set up, let’s add in some needed Nuget packages.  The first thing to do is to make sure the Nuget package manager is up to date.  Go to Tools –> Extensions and Updates… and look in the Updates node of the navigation tree to see if there are any updates available for the Nuget Package Manager.

Now we need to set up the package manager such that it’s pointing to the correct source of packages for ASP.Net 5.

Open the Nuget settings by going to Tools –> Nuget Package Manager –> Package Manager Settings.  The Package Manager Settings dialog will open.

Navigate to Package Sources, and if it’s not already present, add an entry for AspNetVNext packages with source: https://www.myget.org/F/aspnetmaster/api/v2.

NugetSetup

Before we get started adding packages, let’s make sure the packages we have are up to date.  Right click the solution node in the Solution Explorer and select Manage Nuget Packages…  In the Nuget Package Manager dialog, select the AspNetVNext source with Upgrade available and Include prerelease as filters. Install any available official M$ upgrades.

UpgradePackages

If you open the project.json in the Web.Api project, it should look something like the structure below.  We’ll talk more about this structure in the next installment.

{
  "webroot": "wwwroot",
  "version": "1.0.0-*",
  "dependencies": {
    "Microsoft.AspNet.Server.IIS": "1.0.0-beta3",
    "Microsoft.AspNet.Mvc": "6.0.0-beta3",
    "Microsoft.AspNet.StaticFiles": "1.0.0-beta3",
    "Microsoft.AspNet.Server.WebListener": "1.0.0-beta3",
    "System.Runtime": "4.0.20"
  },
  "frameworks": {
    "aspnet50": { },
    "aspnetcore50": { }
  },
  "exclude": [
    "wwwroot",
    "node_modules",
    "bower_components"
  ],
  "bundleExclude": [
    "node_modules",
    "bower_components",
    "**.kproj",
    "**.user",
    "**.vspscc"
  ]
}

Go ahead and run the Web.Api project.  The browser of your choice should launch and you should see a page like the following:

AppRunning

In the next article, we’ll set up basic logging and then identity management.

So many cool tools, which way to go…

I was beginning to work on a new exploratory project where I wanted to get a little deeper into some technologies that I don’t get to use every day at work. I’ve got a plan to build a cloud-based web site that does a few things.  So the choices were Angular or Aurelia on the front end and Node or ASP.Net 5/MVC 6 on the back end.

I’ve been doing various Node and Angular exercises and tutorials for the past year or so and have also been excited about the release of ASP.Net 5 and what it promises.  Although I’ve been put off by the recent Node schism, I’m fully confident that either a reconciliation will take place or IO.js will run away with the ball.  And the Node/IO community is so incredibly vibrant right now.  It seems like there’s a package for anything/everything; you can literally see the energy coming off of that ecosystem. Recently, I’ve also been impressed by the Aurelia project and its embrace of ES 6 and other emerging and established standards.  Aurelia will probably be a niche player in the SPA market when compared to Angular/React/Ember etc…, but it’s really cool and looks well put together, so why not try it out.

But as cool as Node is, there are things I’d miss terribly from the .Net ecosystem, like LINQ, (a hopefully more performant) EF, and the truly awesome productivity tools in VS and R#, as well as all of my previous experience. I also want to support .Net’s move into OSS and embrace/trust its own community.  The source code out on GitHub has already been an incredible help to me; I think that move is already paying off.

I’ve decided to stick to my .Net roots and try out ASP.Net 5/MVC 6 as an API on the backend, trying to learn it while it’s still relatively new, and attempt Aurelia on the front end with help from Bootstrap.  We’ll see how it works out…