Simple Engineering

Some nodejs projects rely on expressjs for routing. Past a certain threshold, some request handlers start looking like copycats. Extreme cases become a nightmare to debug and hinder scalability. Increasing code reusability and modularity improves overall testability, and along the way scalability and user experience. The question we have to ask is: how do we get there?

This blog article explores some of the ways to achieve that. In the context of expressjs routes, we will focus on making sure most of the parts are accessible and testable.

In this article we will talk about:

  • The need to modularize expressjs routes
  • How to modularize expressjs routes for reusability
  • How to modularize expressjs routes for testability
  • The need for a manifest route modularization strategy
  • How to modularize expressjs routes for composability
  • How to modularize expressjs route handlers for reusability
  • How to modularize expressjs route handlers for performance
  • How to modularize expressjs route handlers for composability
  • The need to have route handlers as controllers
  • How to specialize routes handlers as controllers

Even though this blog post was designed to offer complementary materials to those who bought my Testing nodejs Applications book, the content can help any software developer tune up their working environment. You can use this link to buy the book.

Show me the code

While following the simple principle of “make it work”, you realize that route lines of code (LoC) grow linearly (or lean towards exponential growth) as feature requests increase. All this growth can happen inside one file, or in a single function. Assuming all our models are NOT in the same files as our routes, the following source code may be available to us in the early days of a project:

var User = require('./models').User; 
/** code that initializes everything then comes this route*/
app.get('/users/:id', function(req, res, next){
  User.findById(req.params.id, function(error, user){
    if(error) return next(error);
    return res.status(200).json(user);
  });
});

/**
 * More code, more time, more developers 
 * Then you realize that you actually need:
 */ 
app.get('/admin/:id', function(req, res, next){
  User.findById(req.params.id, function(error, user){
    if(error) return next(error);
    return res.status(200).json(user);
  });
});

Example:

What can possibly go wrong?

When trying to figure out how to approach modularization of expressjs routes, the following points highlight some challenges:

  • Understanding where to start, and where to stop when modularizing routes
  • Making a choice between a layered architecture with or without controllers
  • Making a choice between a layered architecture with or without services

In the next sections, we will explore more on the points raised earlier.

The need to modularize expressjs routes

One heavily relied-upon feature in expressjs is its router. Routes tend to grow out of proportion and can be a source of trouble when the time comes to test, refactor or extend existing functionality. One of the tools that makes our job easier is applying modularization techniques to expressjs routes.

How to modularize expressjs routes for reusability

Each route is defined only once per application, so the notion of route re-usability may not be as evident as it should be in such a context.

However, when we look closer at the construction of a handler, we get a sense of how a route's work can be spread across multiple instances and use cases. When we look at the path itself, it is possible to find matching suffixes.

Matching suffixes indicate that multiple routes may indeed be using one handler. To keep it simple: different contexts, same actions. /admin/add/user, /profile/add/user, /school/:id/add/user, etc. All of the roots, or prefixes of /add/user, are contexts in which the same action is taking place.

Deep down, the end result is a user being added. In all likelihood, the user is going to be added to the same database table or collection.

//in one file 
let router = require('express').Router();
router.post('/add/user', addUser);
module.exports = router;

//later in another file 
let router = require('express').Router(),
  add = require('./one/file');

router.use('/admin', add);
router.use('/profile', add);
router.use('/school/:id', add);
module.exports = router;

The modularization of routes, for that matter — route handlers, should not stop at their ability to be reusable.

Modularization can guarantee the stability of routes and their handlers. To put things in perspective, for two distinct routes that share the same route handler, a change in parameter naming should not affect other routes. Likewise, a change in route handler affects routes using the same handler but does not necessarily affect any route configuration.

Like in other use cases, modularizing an expressjs route consists of two major changes. The first step is to identify, name and eject route handlers. This step may be a bit challenging when middleware is involved. The second and last step is to move and group similar handlers under the same library. The said library can be exposed to the public using the index trick we discussed in other blog posts.
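
As a minimal sketch, assuming a hypothetical /handlers directory, the two steps could look like the following:

//in handlers/get-user.js ~ the handler is named and ejected from the route
var User = require('../models').User;
module.exports = function getUser(req, res, next){
  User.findById(req.params.id, function(error, user){
    if(error) return next(error);
    return res.status(200).json(user);
  });
};

//in handlers/index.js ~ the index trick exposes the whole library
module.exports = {
  getUser: require('./get-user')
};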

How to modularize expressjs routes for testability

The challenge when mocking an expressjs route is losing the route handler implementation in the process.

That may not be an issue when executing integration or end-to-end testing tasks. Taking into consideration that individual handlers can be tested in isolation, we get the benefits of reducing the number of tests and mocking work required per route.

The second challenge is to find a sweet spot between integration testing and unit testing, and to apply both ideas to the route and route handler, per test case needs.

Loading any library in unit tests is expensive, let alone loading the entire expressjs in every unit test. To avoid this, loading express from a mockable library, or injecting the expressjs application as needed, may be two healthy alternatives to look into.
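
To illustrate the injection alternative, here is a sketch in which the expressjs application is passed in as a parameter, so unit tests can supply a bare fake instead of loading expressjs (file names are hypothetical):

//in routes/user.js ~ the app is injected, expressjs is never required here
var getUser = require('../controller/user');
module.exports = function userRoutes(app){
  app.get('/users/:id', getUser);
  return app;
};

//in user-routes.spec.js ~ a bare object stands in for the expressjs app
var sinon = require('sinon');
var expect = require('chai').expect;
var appMock = { get: sinon.spy() };
require('../routes/user')(appMock);
expect(appMock.get.calledWith('/users/:id')).to.equal(true);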

The need for a manifest route modularization strategy

There is a common pattern that reveals itself at the end of the modularization effort. Related routes can be grouped into independent modules, to be reused independently on demand. To make this thought a reality, the manifest route technique attaches a route to a router and makes that router available and ready to be used by other routers, or by an expressjs application.
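
A minimal sketch of the manifest route technique, assuming a hypothetical /users feature directory:

//in users/index.js ~ the manifest attaches related routes to one router
var router = require('express').Router();
router.get('/users/:id', require('./get-user'));
router.post('/users', require('./add-user'));
module.exports = router;

//in the application, or in another router, the module is mounted on demand
app.use('/api', require('./users'));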

How to modularize expressjs routes for composability

There is a lot to unpack when dealing with the composability of expressjs routes. The takeaway when composing routes is that a route should be defined in a way that it can be plugged into any router and just work. Another example would be the ability to mount a server or an expressjs app instance on the route definition from the get-go and have an application that just works.
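
As a sketch of that idea, a manifest router such as the one above can be plugged into another router, or mounted on an app instance, and just work:

var users = require('./users');//a manifest router, as sketched earlier

//plugged into any other router
var api = require('express').Router();
api.use('/v1', users);

//or mounted on an expressjs app instance from the get-go
var app = require('express')();
app.use('/api', api);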

How to modularize expressjs route handlers for reusability

The reusability aspect comes in handy to help reduce instances of code duplication. One can argue that this also helps with performance, as well as better test coverage. The advanced use case of higher re-usability ends in a controller, or a well-organized module of handlers.

How to modularize expressjs route handlers for performance

The nodejs module loader is expensive. To be fair, reading a file is expensive. The node_modules directory is notorious for the number of directories and files associated with it. It is no surprise that reading and loading all those files may become a performance bottleneck. The fewer files we read from the disk, the better. The following section on modularization for composability is a living example of how modularization can go hand in hand with performance improvements.

How to modularize expressjs route handlers for composability

Both in this blog post and in the ones that came before it, we strive to make the application more reusable, while at the same time reducing the time it takes to load the application for use or testing purposes. One way of reducing the number of imports is to leverage thunks or injections.
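
A sketch of the thunk/injection flavor: the module exports a function that receives its dependencies, so nothing heavy is loaded at require time (the route and model are the ones used earlier):

//in routes/user.js ~ dependencies are injected instead of imported
module.exports = function(app, User){
  app.get('/users/:id', function(req, res, next){
    User.findById(req.params.id, function(error, user){
      if(error) return next(error);
      return res.status(200).json(user);
    });
  });
};

//at composition time ~ real, or fake, dependencies can be supplied
require('./routes/user')(app, require('./models').User);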

The need to have route handlers as controllers

If we look up close, route handlers are tightly coupled to a route. Previous techniques broke the coupling and moved individual route handlers into their own modules. Another up-close look reveals two key points: first, some route handlers are copycats; second, some route handlers are related to the point where they may constitute an independent entity on their own. If we group all handlers related to providing one feature, we land squarely in the controller space.

How to specialize routes handlers as controllers

If there exist multiple ways to brew a beer, there should be multiple ways of clustering related handlers in the same module or component! Ok, let's admit that that example does not have any sound logic, but you see the point.

One of the ways to group related handlers is to start grouping by feature. Then, if for some reason multiple features happen to use similar (or copycat) handlers, choosing an advanced level of abstraction becomes ideal. When we have an equivalent of a base controller, that base controller can move to a common library. The name of the common library can be, for instance: /core, /common, or even /lib. We can get creative here.
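
As a sketch, assuming a hypothetical /lib/controller.js, a base controller factory could be shared across features:

//in lib/controller.js ~ a hypothetical base controller factory
module.exports = function makeGetById(Model){
  return function getById(req, res, next){
    Model.findById(req.params.id, function(error, doc){
      if(error) return next(error);
      return res.status(200).json(doc);
    });
  };
};

//in controller/user.js ~ the feature controller reuses the base
var makeGetById = require('../lib/controller');
var User = require('../models').User;
module.exports = makeGetById(User);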

Modularization of Express routes

The easy way to mitigate the duplication is by grouping similar functions into the same file. Since the service layer is sometimes not relevant, we can group functions into a controller.

//in controller/user.js
var User = require('../models').User;
module.exports = function(req, res, next){
  User.findById(req.params.id, function(error, user){
    if(error || !user){
      return next(error);//return right away
    }
    return res.status(200).json(user);
  });
};

//in routes/user.js
var getUser = require('../controller/user');
var router = require('express').Router();
router.get('/users/:id', getUser);
router.get('/admin/:id', getUser);
//exporting the router
module.exports = router;

Example:

Both controller/user.js and the two routes can be tested in isolation.
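
For instance, a minimal sketch of such an isolated test, assuming User.findById has been stubbed as covered in the mocking articles:

var sinon = require('sinon');
var getUser = require('./controller/user');

//User.findById is assumed stubbed, e.g. to yield (null, {name: 'jane'})
var req = { params: { id: '1234' } };
var res = { status: sinon.stub().returns({ json: sinon.spy() }) };
var next = sinon.spy();

getUser(req, res, next);
//assert on res.status/json, or on next, depending on the stubbed output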

Conclusion

The complexity that comes with working on large-scale nodejs/expressjs applications reduces significantly when the application is in fact well modularized. Modularization is a key strategy in making expressjs routes more re-usable, composable, and stable as the rest of the system evolves. Modularization not only brings elegance to the routes, but also reduces the possibility of route redundancy and improves testability.

In this article, we revisited techniques that improve the elegance of expressjs routes, their testability, and re-usability. We focused more on layering the route into routes and controllers, as well as applying modularization techniques based on module.exports and index files. There are additional complementary materials in the “Testing nodejs applications” book.

#snippets #code #annotations #question #discuss

The depth of an HTTP request or response mock brings a level of complexity to the whole system. In this article, we revisit some techniques used to mock HTTP request/response when used in the same test case.

In this article we will talk about:

  • Mocking Request Objects
  • Mocking Response Objects
  • Mocking Request and Response objects in the same test case
  • When it makes sense to mock both Request and Response

Even though this blog post was designed to offer complementary materials to those who bought my Testing nodejs Applications book, the content can help any software developer tune up their working environment. You can use this link to buy the book.

Show me the code

var UserModel = require('../models').User;//assuming the model is exposed via ../models
module.exports.getUsers = function getUsers(req, res, next){
  UserModel.find(req.params, function(error, users){
    if(error) return next(error);
    return res.status(200).json(users);
  });
};

Example: in controller/get-users.js

What can possibly go wrong?

When trying to figure out how to approach mocking request and response objects, the following points may be a challenge:

  • Stubbing the right request/response methods
  • Mocking output that can be consumed by other callers
  • Stubbing request/response handlers in the same test case
  • Strategic mocking that can make a live server obsolete

How to mock Request/Response Objects the easy way

Testing an expressjs middleware provides a good use case where mocking a request and response in the same test case makes sense.

Key objectives are:

  • Spying on whether certain calls have been made
  • Making sure requests don't leave the local machine
var sinon = require('sinon'),
    chai = require('chai'),
    expect = chai.expect,
    getUsers = require('./controller').getUsers;

describe("getUsers()", function() {
  it("should guarantee a response", function() {
    //UserModel.find is assumed to be stubbed beforehand, so no database is hit
    var req  = {}, 
      res  = { send: sinon.spy() }, 
      next = sinon.spy();
    getUsers(req, res, next);
    expect(res.send.calledOnce).to.equal(true);
  });     
});

code excerpt adapted from – Unit Testing Controllers the Easy Way in Express 4

Particular case: how to mock a response that uses streaming, or other hard-to-mock interfaces. Keyword: keep the flow intact, but fake the read/write data instead.
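
A sketch of that idea with node's built-in PassThrough stream standing in for a streaming response ~ the flow stays intact, but the data is fake:

var PassThrough = require('stream').PassThrough;

//the fake response keeps the streaming flow intact
var resMock = new PassThrough();
var chunks = [];
resMock.on('data', function(chunk){ chunks.push(chunk); });
resMock.on('end', function(){
  //assert on Buffer.concat(chunks).toString() here
});
//the handler under test writes/pipes into resMock, then ends it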

Mocking request

The request object provided by node-mocks-http is pretty similar to the request provided by the native http module found in the nodejs library.

var httpMock = require('node-mocks-http');
var request;
//When method = GET|DELETE
request = httpMock.createRequest({method: method, url: url});

//When method = PUT|POST
request = httpMock.createRequest({method, url, body: body});

Mocking Response

//initialization(or beforeEach)
var response = httpMock.createResponse({
    eventEmitter: require('events').EventEmitter
});

//Usage: somewhere in tests
let next = sinon.spy();
getUsers(request, response, next);
response.on('end|data|error', function(error){
  //write tests in this closure
});

Using node-mocks-http is in the gray area of integration testing. However, this technique can be valuable in use cases where the first strategy falls short.

There is more on integration testing mocking strategy: How to Mock HTTP Request and Response ~ Integration testing use case

Conclusion

In this article, we revisited strategies to mock HTTP Request and Response methods in the same test case, while using mock data to emulate interaction with remote systems. We also re-iterated the difference between stubbing and mocking, and how spies (fakes) fall into the testing big picture. There are additional complementary materials in the “Testing nodejs applications” book on this very same subject.

#snippets #http #request #response #mocking #stubbing

Mocking HTTP requests, for that matter responses, is essential in most unit test scenarios. Depending on the depth we want the mock to kick in, this task can become quite a feat on its own. In this article, we revisit some techniques that can make our life easy when mocking requests in integration testing scenarios.

This article is a followup to How to Mock HTTP Request and Response

In this article we will talk about:

  • Stubbing HTTP Request Objects
  • Mocking Request and Response objects in the same test case
  • When it makes sense to mock both Request and Response

Even though this blog post was designed to offer complementary materials to those who bought my Testing nodejs Applications book, the content can help any software developer tune up their working environment. You can use this link to buy the book.

Show me the code

//in users/get-user.js
var User = require('./models').User; 
module.exports = function getUser(req, res, next){
  User.findById(req.params.id, function(error, user){
    if(error) return next(error);
    return res.status(200).json(user);
  });
};

//Router that uses the authentication middleware
var router = require('express').Router();
var authenticated = require('./middleware/authenticated');
var getUser = require('./users/get-user');
router.get('/users/:id', authenticated, getUser);
module.exports = router;

Example:

What can possibly go wrong?

Some challenges associated with stubbing HTTP requests:

  • How deep a stub should go

Show me the tests

The next section has the following traits baked in:

  • When to use: Testing all routes at once
  • When to use: Asserting on nature of the response output
  • When not to use: When running unit testing
  • When to use: When running integration tests
// Add promise support if this does not exist natively.
if (!global.Promise) {
    global.Promise = require('q');//or any other promise library 
}

var chai = require('chai'),
  chaiHttp = require('chai-http'),
  app = require('express')();

chai.use(chaiHttp); //registering the plugin

//mounting the routes to be tested on the app
require('./lib/routes')(app);

//use an agent to retain cookies across requests
//agent.post()|agent.get()|agent.del()|agent.put()
var agent = chai.request.agent(app);

//initialization of app can be express or other HTTP compatible server.
it('works', function(done){
    chai.request(app)
    .put('/user/me')//.post|get|delete
    .send({ password: '123', confirm: '123' })
    .end(function (err, res) {
        expect(err).to.be.null;
        expect(res).to.have.status(200);
        //more possible assertions 
        expect(res).to.have.header('x-api-key');
        expect(res).to.have.headers;//Assert that a Response or Request object has headers.
        expect(res).to.be.json;//.html|.text 
        expect(res).to.redirect;//.to.not.redirect
        expect(res).to.have.param('orderby');//test sent parameters
        expect(res).to.have.param('orderby', 'date');//test sent parameter values 
        expect(res).to.have.cookie('session_id');//test cookie parameters
        done();
    });
});

//keeping port open 
var requester = chai.request(app).keepOpen();
it('works - parallel requests', function(){
    Promise.all([requester.get('/a'), requester.get('/b')])
    .then(responses => { /**do - more assertions here */})
    .then(() => requester.close());
});

This strategy has not been tested on routes that read/write streams.

To the question: when does it make sense to mock both Request and Response, the answer is: it depends. In the event where we are interested in replicating interactions with a third-party system via requests/responses, then it makes sense to mock both.

Conclusion

In this article, we established the difference between Mocking versus Stubbing HTTP requests.

We also established the cost associated with HTTP request every time a test is executed.

With this knowledge, we reviewed ways to reduce costs by strategically stubbing HTTP read/write operations to make tests fail fast, without losing test effectiveness. There are additional complementary materials in the “Testing nodejs applications” book.

#snippets #http #request #response #mocking #stubbing

Mocking and stubbing walk hand in hand. Stubbing redis pub/sub, a feature of a datastore widely adopted in the nodejs ecosystem, can be a setback when testing WebSocket endpoints. This article brings clarity, and a path forward, to it.

In this article we will talk about:

  • Stubbing redis clients
  • Replacing redis with a drop-in replacement
  • How to avoid spinning up a redis server

Even though this blog post was designed to offer complementary materials to those who bought my Testing nodejs Applications book, the content can help any software developer tune up their working environment. You can use this link to buy the book.

Show me the code

module.exports = function(req, res, next){
  User.findById(req.user, (error, user) => {
    if(error) return next(error); 
    new Messenger(options).send().then((response) => {
      redisClient.publish(Messenger.SYSTEM_EVENT, payload);
      //schedule a delayed job 
      return res.status(200).json({message: 'Some Message'});
    });
  });
};

//service based equivalent using a service layer
module.exports = function(req, res, next){
  UserService.findById(req.user)
    .then(user => new Messenger(options).send())
    .then(response => new RedisService(redisClient).publish(Messenger.SYSTEM_EVENT, payload))
    .then(response => res.status(200).json({message: 'Some Message'}))
    .catch(error => next(error));
};

The use of arrow functions instead of the function keyword serves to shorten the code. It is possible to replace all arrow functions with the function keyword, for readability.

What can possibly go wrong?

The following points may be a challenge when mocking datastore access:

  • Same level of challenge as when mocking database access functions
  • Asynchronous nature of pub/sub clients, characteristic to queue processing systems
  • When the application is using redis (local or remote)
  • Running tests without spinning up a redis server

The following sections will explore how to make the points stated above work.

Show me the tests

There is more than one way to go about mocking. I had to preview 3 libraries and choose the one that fits my needs best.

Some of the libraries we can tap into to make mocking possible are: rewire, fakeredis, proxyquire and sinon.

Mocking redis using rewire

var sinon = require('sinon');
var Rewire = require('rewire');
//module to mock redisClient from 
var controller = Rewire("/path/to/controller.js");
//the mock object + stubs
var redisMock = {
  //get|pub|sub are stubs that can return a promise|or do other things
  get: sinon.spy(function(options){return "someValue";}),
  pub: sinon.spy(function(options){return "someValue";}),
  sub: sinon.spy(function(options){return "someValue";})
};
//replacing `redis` client methods ~ this does not prevent spinning up a new `redis` server
controller.__set__('redisClient', redisMock);

Example:

Mocking redis using fakeredis. fakeredis provides a drop-in replacement for redis's createClient() function and its functionalities.

var redis = require("redis");    
var fakeredis = require('fakeredis'); 
var sinon = require('sinon'); 
var assert = require('chai').assert; 

var users, client; 
describe('redis', function(){
  before(function(){
    sinon.stub(redis, 'createClient', fakeredis.createClient);
    client = redis.createClient(); //or anywhere in code it can be initialized
  });

  after(function(done){
    client.flushdb(function(error){
      redis.createClient.restore();
      done();
    });
  });
});

Example:

Two of the alternatives whose examples do not figure in this article are mocking redis using redis-mock and proxyquire.

The goal of the redis-mock project is to create a feature-complete mock of node_redis (https://github.com/mranney/node_redis), so that it may be used interchangeably when writing unit tests for code that depends on redis.
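
For reference, a minimal redis-mock sketch may look like the following:

var redisMock = require('redis-mock');
var client = redisMock.createClient();

client.set('key', 'value', function(error, reply){
  //no redis server involved ~ assert on reply here
});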

Conclusion

In this article, we revisited strategies to mock redis access methods and replace response objects with mock data.

Testing in parallel can stress the redis server. Mocking redis clients makes tests faster, reduces friction on the network, and prevents stressing the redis server, especially when it is shared with other production applications.

There are additional complementary materials in the “Testing nodejs applications” book.

#snippets #code #annotations #question #discuss

How to Mock mongodb database access functions

From a cost perspective, the fewer database read/write operations, the better. Not all test cases are created equal, and with intensive read/write operations comes big accountability. This blog is an exposé of some techniques to mock database access without compromising the quality of test results.

In this article we will talk about:

  • Stubbing database access methods to provide mock of their output
  • Mocking output of database access chained method
  • Mocking mongoose/mongodb connections.
  • Database drop-in replacements for faster testing
  • How to avoid spinning up a database server in a unit test context
  • Mocking streaming to/from database systems

Even though this blog post was designed to offer complementary materials to those who bought my Testing nodejs Applications book, the content can help any software developer tune up their working environment. You can use this link to buy the book.

The mongoose model comes with helpers baked in. The example below implicitly makes functions such as save(), find() and update() available by default. The trouble starts settling in when we realize that some functions are used with an instance, for example new User().save(), while others are made available on the class declaration instead, for example User.find(). From this vantage point, it becomes clear that the same stubbing techniques cannot be applied to both.

Another real struggle is figuring out how to stub custom helpers. In our example, those are statics.findByName() and methods.addEmail(). To better understand how to stub those two categories of functions, we should start by understanding how they are unique in their own ways, and how they stack up against the instance and class functions mentioned above.

Show me the code

We can think of database access from two perspectives. The first is a set of functions designed to extend mongoose utilities via statics and methods properties. The second is from a usage perspective, or other entities using a mongoose model method to talk to the database.

Difference between statics and methods

Let's look at both scenarios, first when our functions are expected to extend mongoose capabilities


var mongoose = require('mongoose');
var UserSchema = new mongoose.Schema({name: String});

UserSchema.statics.findByName = function(name, next){
    //statics: `this` gives access to the compiled model
    return this.where({'name': name}).exec(next);
};

UserSchema.methods.addEmail = function(email, next){
    //methods: `this` is a document instance; this.model() retrieves the compiled model
    return this.model('User').find({ type: this.type }, next);
};

//exporting the model 
module.exports = mongoose.model('User', UserSchema);        

Example: mongoose Model definition example in core/user/model

And next, when our functions are expected to leverage existing capabilities

const Contact = require('core/contact/model');
function addContact(params, next){	
    return new Contact(params).save((error, contact) => {	
        if(error) return next(error);	
        return next(null, contact);	
    });	
}

function findContact(id, next){	
    return Contact.findById(id, (error, contact) => {	
        if(error) return next(error);	    
        return next(null, contact);	    
    });	
}

function findContacts(params, next){ 
    return Contact.find(params, (error, contacts) => {
        if(error) return next(error);
        return next(null, contacts);
    });
}

Example: mongoose Model usage example in core/contact/model

What can possibly go wrong?

The following points may be a challenge when testing the model layer:

  • Hitting database for any reason slows down Unit Tests
  • Stubbing database access functions, while making it possible to validate callback implementations via spies
  • Making tests less dependent on database server
  • Providing re-usable mocks

Tools

It is feasible to replace database access functions with fakes that emulate similar corresponding actions. Two libraries that come to mind when doing this are sinon and sinon-mongoose.

How do we apply the same techniques to test SQL-based alternatives to mongoose, such as knex?

There is a feature-complete wrapper of most mongoose utilities: mockgoose.

Database drop-in replacements

Replacing a database with a drop-in replacement for testing purposes makes it possible to avoid spinning up a database server altogether. The following example highlights good practices when testing with a live database (local development).

var mongoose = require('mongoose');
describe('User', function(){
  
  before(function(){
    mongoose.connect(process.env.CONNECTION_URL);
  });

  after(function(){
    mongoose.connection.close(); 
    mongoose.disconnect(); 
  });
  
  //Do the tests here. 
});

This approach is not convenient, since every time we need to test another model, we will need to spin up a database. Read/write operations are also expensive. mockgoose and mongodb-memory-server provide alternatives that mock the whole database infrastructure.
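
As an illustration, a hedged sketch with mongodb-memory-server (the exact API varies across versions):

var MongoMemoryServer = require('mongodb-memory-server').MongoMemoryServer;
var mongoose = require('mongoose');

var mongod;
before(async function(){
  //spins up an in-memory mongod ~ no external database server needed
  mongod = await MongoMemoryServer.create();
  await mongoose.connect(mongod.getUri());
});

after(async function(){
  await mongoose.disconnect();
  await mongod.stop();
});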

For the sake of simplicity, we are going to avoid that, and rely on test doubles instead.

Mocking database access functions

Functions that access or change database state can be replaced by fakes/stubs capable of supplying output identical to data coming from a real database system.

There are a couple of solutions that can be used, one of them is based on test double libraries such as sinon.


//cb simulates the callback function ~ it takes control from where the stub left off.
//After executing the stub, the regular callback executes as in regular circumstances.
function cb(fn, params){
  return fn.apply(this, arguments);
  //check if params are the ones that have to be applied instead, and apply them
}

//Model should be an actual model, eg: User|Contact|Address, etc
var saveStub = sinon.stub(Contact.prototype, 'save', cb);
var findStub = sinon.stub(Contact, 'find', cb);
var findByIdStub = sinon.stub(Contact, 'findById', cb);

Stub mongoose with sinon-mongoose

The following is the order in which libraries are loaded to stub the entire mongoose library.

First, we will need to replace the default promise with Promise A+, or another promise library of your choice. Second, we will need to replace mongoose with sinon-mongoose. And the trick is completed.

// Using sinon-as-promised with a custom promise library
var sinon = require('sinon');
var Promise = require('promise');
require('sinon-as-promised')(Promise);

// Adding sinon-mongoose to mongoose 
var mongoose = require('mongoose');
require('sinon-mongoose');

The promise is not the only kid on the block, in the promised land. The mongoose documentation showcases how to replace the default library with BYOL (bring your own library).

In the next example, we replace the default mongoose promise library with bluebird, another promise library that made the rounds in the nodejs and JavaScript communities.


var bluebird = require('bluebird');
var mongoose = require('mongoose');
mongoose.Promise = bluebird;

var uri = 'mongodb://localhost:27017/mongoose_test';
var options = { promiseLibrary: bluebird };
var db = mongoose.createConnection(uri, options);

That is good as far as information goes, but it does not necessarily help our current task at hand.

Mocking Library: — with callbacks

//in model/user.js
var UserSchema = new mongoose.Schema({name: String});
mongoose.model('User', UserSchema);


//in test.spec.js
var sinon = require('sinon');
var mongoose = require('mongoose');

describe('User', function(){
    before(function(){
        //model is declared in model/user.js
        this.User = mongoose.model('User');
        this.UserMock = sinon.mock(this.User);
    });
    after(function(){
        this.UserMock.restore();
    });

    it('#save()', function(){
        var self = this,
            user = {name: 'Max Zuckerberg'},
            results = Object.assign({}, user, {_id: '11122233aabb'});
        //yields works for callbacks
        //.chain('sort').withArgs('-date')
        this.UserMock
            .expects('save')
            .withArgs(user)
            .yields(null, results);  

        new this.User(user).save(function(err, user){
            //add all assertions here. 
            self.UserMock.verify();//verifying 
            self.UserMock.restore();//restoring 
        });
    });
    
});

A mock works on an actual object, i.e. an instance of a model. save() is defined on Document, and not on the model object itself. This explains why we spy on the prototype: sinon.stub(User.prototype, 'save', cb).

Without the mock, it becomes impossible to chain extra functions such as .exec(), .stream(), etc. A double stub may be the answer to testing such edge cases. A quick example is provided to give an idea of what we mean by double stub.

var results = { ... }//mock of user data
sinon
 .stub(User.prototype, 'save', cb)
 .returns({
     exec: sinon.stub().yields(null, results)
 });

Alternatively, we can use .create() instead. But that may be a little too late, in case the application already adopted save() and uses it in multiple places. Needless to mention that .create() looks a little bit off here; it achieves the same thing in this instance, but requires far more changes.

It is also possible to rely on libraries such as factory girl, as explained in this SO answer

Mocking Library: — with promises


//in User.js
MongooseModel
 .find()
 .limit(10)
 .sort('-date')
 .exec()
 .then(result =>  result);

//in user.model.spec.js 
var sinon = require('sinon');
var mongoose = require('mongoose');
require('sinon-mongoose');
require('sinon-as-promised');

//describe 
describe('User', function(){
    it('works', function(){
        sinon.mock(MongooseModel)
        .expects('find')
        .chain('limit').withArgs(10)
        .chain('sort').withArgs('-date')
        .chain('exec')
        .resolves('SOME_VALUE'); 
        //.yields(null, 'SOME_VALUES')       
    });
});

Mocking Library: — with streams

The following is a drop-in replacement for the usage of a model paired with a stream.


//code example  
UserModelMock
 .find()
 .stream()
 .pipe(new Transformer())
 .pipe(res);


// in tests ~ using the double mock technique ~ return a readable stream
sinon.stub(Model, 'find').returns({
  stream: sinon.stub().returns(readableStreamMock)
});
//readableStreamMock has to have generated data for testing purposes. 

//in tests ~ create a writable stream compatible with the response object somehow 
writableStream.on('data|end|close|finish', function(){
  expect(Model.find.called, 'find() has been called').to.equal(true);
});

The technique to test streams has been intensively covered in the Testing nodejs Applications book. The ideas in this code sample are rough, and still have loopholes that are covered in the said book.

Final notes

Models should be created once, across all tests.

The error OverwriteModelError: Cannot overwrite 'Activity' model once compiled. means one of the following occurred:

  • The caps were wrong while importing the model => import User from 'model/user'
  • The model definition was wrong => var userSchema = new Schema({}); module.exports = mongoose.model('user', userSchema) (a new schema, and not just a schema ~ this was my case)
  • The model was compiled twice => guard with module.exports = mongoose.models.User || mongoose.model('user', userSchema);

There are more answers on this subject on StackOverflow

Conclusion

In this article, we established the cost of hitting the database every time a unit test runs, and how to avoid the worst-case scenario by mocking out the most expensive parts of the model layer.

We also reviewed how to reduce such costs by strategically stubbing read/write operations to make tests fast, without losing test effectiveness.

Testing tends to be more of an art than a science; practice makes perfect. There are additional complementary materials in the “Testing nodejs applications” book.

Mocking results of a single function is crystal clear in most use cases. However, there is a level of difficulty linked to mocking more than one chained function. This article highlights some techniques to overcome this challenge.

In this article we will talk about:

  • Stubbing standalone functions
  • Stubbing chained functions
  • Stubbing chained methods

Even though this blog post was designed to offer complementary materials to those who bought my Testing nodejs Applications book, the content can help any software developer tune up their working environment. You can use this link to buy the book.

Show me the code

//standalone
module.exports = function normalizeName(obj){
  let attrs = (obj.name || "").split(' ');
  return {
    first: attrs[0], 
    last: attrs[attrs.length - 1], 
    full: obj.name
  };
};

//with callback
Order
  .find()
	.populate()
	.sort()
	.exec(function(err, order){ 
    /** ... */
  });

//with a promise
Order
  .find()
  .populate()
  .sort()
  .exec()
  .then(function(order){ /** ... */ });

Keyvan Fatehi managed to hack something amazing ~ that is in fact the blueprint of this article

What can possibly go wrong?

If you haven't already, there is an article about “Test Doubles” that makes a case on how stubs/mocks and spies stack up. There is also “How to spy/stub methods” as a complementary article.

Some challenges we notice by looking at the above examples:

  • Given that most spying/stubbing techniques require an object, which standalone functions do not have
  • Each chaining results in an independent object, which makes spying/stubbing methods a little off the charts

Show me the tests

Functions' return value mocks can be achieved via two main channels: via a spy of a function, or via a stub on an object that hosts the function.

The choice to use mongoose models is deliberate, to have something that is widely used, at least by backend developers. However, we have to highlight that the techniques presented below can be applied to any other object or function. There are also some advanced techniques, such as asynchronous code via callbacks, promises, and even streaming baked into the library, that would otherwise add complexity to examples we want to keep simple.

Standalone Functions

Before we jump headfirst into it, let's see what infrastructure sinon has to offer when it comes to mocking (spying and stubbing included).


let outputMock = { ... };
sinon.stub(obj, 'func').returns(outputMock);
sinon.stub(obj, 'func').callsFake(function fake(){ return outputMock; })
let func = sinon.spy(function fake(){ return outputMock; });

With the exception of the spy notation, backing objects are required in the previous three examples to be able to mock function return values. This is telling: to mock an independent function, we will either have to re-assign the function to a spy whenever the function is needed, or attach our independent function to some form of object.

Using modularization techniques + an index file, we can solve this challenge. If normalizeName() is located in the /utils directory, /utils/index.js will export all files under /utils. We will then be able to mock utils#normalizeName using one of the examples stated earlier.

//ES5 ~ in utils/index.js
var normalizeName = require('./normalize-name');
module.exports = { normalizeName: normalizeName };
//ESNext ~ in utils/index.js
export * from "./normalize-name";

//ES5
var utils = require('./utils');
//ESNext
import * as utils from './utils';

//Mocking
let nameMock = {first: 'Eliud', last: 'Kipchoge', full: 'Eliud Kipchoge'};
sinon.stub(utils, 'normalizeName').returns(nameMock);
sinon.stub(utils, 'normalizeName').callsFake(() => nameMock)
let normalizeName = sinon.spy(() => nameMock);

Arrow functions are used to keep the example small, but they can easily be replaced with full-fledged function declarations.

Chained Functions

Stubbing chained functions: one of the tricks to make a function chainable is to return a special kind of object. This object has to have access to both the previous and the new function contexts.

var utils = module.exports = {
  name: '', 
  _name: {full: ''},
  first: function(optional){
    if(optional) { this.name = optional; }
    this._name.full = this.name; 
    this._name.first=  this.name.split(' ')[0];
    return this;
  },
  last: function(optional){
    if(optional) { this.name = optional; }
    this._name.full = this.name; 
    let attrs = this.name.split(' ');
    this._name.last = attrs[attrs.length - 1];
    return this;
  }, 
  value: function(){ 
    return this._name; 
  }
};

console.log(utils.first('Eliud Kipchoge').last().value());
//logs { first: "Eliud", last: "Kipchoge", full: "Eliud Kipchoge"}

From this angle, we can choose to mock the last function in the call tree and let other functions do their job, or mock in a cascading fashion.

//Mock the last function call in the chain
let nameMock = {first: 'Eliud', last: 'Kipchoge', full: 'Eliud Kipchoge'};
sinon.stub(utils, 'value').returns(nameMock);

//Mock in a cascading fashion
sinon.stub(utils, 'first').returns({
  last: sinon.stub().returns({
    value: sinon.stub().returns(nameMock)
  })
});

Example:

The depth of cascading stub/mocks depends on how far function usage is headed. The more chaining, the deeper the stubbing can get.

Chained Methods

Stubbing chained methods is not different from stubbing chained functions. Methods are, by definition, a collection of functions belonging to the same class. Since objects are instances of a class, we can keep the mental model we adopted earlier intact.

Order.find()
	.populate()
	.sort()
	.exec(function(err, order){ /** ... */});

//Slight modification of original code ~ each link in the chain gets its own stub
sinon.stub(Order, 'find').returns({
  populate: sinon.stub().returns({
    sort: sinon.stub().returns({
      exec: sinon.stub().yields(null, {
        id: "1234553"
      })
    })
  })
});

Chained Methods with a promise

What can happen if a promise is involved in a chain of functions?

We have two alternatives, and both are equally important. We can opt for the cascading approach we used in previous examples, or adopt a library that does some heavy lifting for us. The set of libraries, to be more specific sinon-mongoose and sinon-as-promised, provides a sinon-like mocking experience with capabilities to resolve promises.


var sinon = require('sinon');
require('sinon-as-promised');
require('sinon-mongoose');

sinon.mock(Order)
  .expects('find')
  .chain('populate').withArgs('customer provider product')
  .chain('limit').withArgs(10)
  .chain('sort').withArgs('-date')
  .chain('exec')
  .resolves(resultsMock);

Conclusion

In this article, we revisited strategies to mock chained functions which are supposed to return data at each point of the chain. We also re-iterated the difference between stubbing and mocking, and how spies (fakes) fall into the testing picture. There are additional complementary materials in the “Testing nodejs applications” book.

This blog is a follow-up to “How to add good looking code snippets in presentations, documentation and blog posts”. The difference is that we are going to stress one single kind of demo that has had quite a revival these days: the gif.

Demo, or it didn't happen.

In this article we will talk about:

  • How some developers are leveraging gifs for demos
  • Some apps that help make gif demos stand out
  • Tools for making screen capture and screencasting
  • Resources for hacking WebRTC ~ good for those interested in rolling out their own stuff.

Even though this blog post was designed to offer complementary materials to those who bought my Testing nodejs Applications book, the content can help any software developer tune up their working environment. You can use this link to buy the book.

Blogs

Apps

Hacks

Conclusion

In this article, we revisited how to demo using gifs. We provided references to tutorials, apps, and blogs to help achieve good-looking demos without breaking the bank.

Modules come in multiple flavors. To name a few: functions, objects, classes, configuration metadata, initialization data or servers can all be treated as modules. We will explore ways to modularize Routes, Middleware, Services, Models and utility libraries in a realistic setting.

This article is more technical; “How to modularize nodejs applications” offers a more theoretical approach to thinking about modularization.

As a recap, two key elements are going to be leveraged to modularize components, as stated in the first article: the ES5 module.exports utility (and its ESNext import/export equivalents), and the usage of an index file at every top-level directory.

The technique we are going to explore is based on two enhancements: making most things importable (or exportable, depending on perspective), and adding an index file to every level of the project directory structure.

If you haven't read it yet, the follow-up to this article is Overview on testing nodejs applications. You may give it a bird's-eye view or test drive to get a sense of what testing nodejs looks like.

In this article we will talk about:

  • Introducing the Layered Architecture
  • The need to make nodejs server modular
  • Modularization of nodejs server
  • The need to make expressjs routes modular
  • Modularization of expressjs routes
  • The need to have the controller layer
  • Modularization of expressjs routes using controllers
  • The need to abstract business logic in a service layer
  • Modularization of business logic under service layer
  • The need to have a configuration layer
  • Modularization of configuration layer

Even though this blog post was designed to offer complementary materials to those who bought my Testing nodejs Applications book, the content can help any software developer tune up their working environment. You can use this link to buy the book.

Show me the code

var express = require('express'),
    path = require('path'),
    n = path.normalize,
    j = path.join,
    app = express();

/**Data Layer*/
var mongoose = require("mongoose");
mongoose.connect('mongodb://localhost:27017/devdb');
var User = require('./models').User; 

/**
 * Essential Middlewares 
 */
app.use(express.logger());
app.use(express.cookieParser());
app.use(express.session({ secret: 'angrybirds' }));
app.use(express.bodyParser());
app.use((req, res, next) => { /** Adding CORS support here */ });

app.use((req, res) => res.sendFile(n(j(__dirname, 'index.html'))));


/** .. more routes + code for app ... */
app.get('/', function (req, res) {
  return res.send('Hello World!');
});


/** code that initialize everything, then comes this route*/
app.get('/users/:id', function(req, res, next){
  User.findById(req.params.id, function(error, user){
    if(error) return next(error);
    return res.status(200).json(user);
  });
});

/**
 * More code, more time, more developers 
 * Then you realize that you actually need:
 */ 
app.get('/admin/:id', function(req, res, next){
  User.findById(req.params.id, function(error, user){
    if(error) return next(error);
    return res.status(200).json(user);
  });
});
/**
 * This would work just fine, but we may also have a requirement to listen to Twitter changes 
app.listen(port, function () {
  console.log('Example app listening on port 3000!')
});
*/

var server = require('http').createServer(app);
server.listen(app.get('port'), () => console.log(`Listening on ${ process.env.PORT || 8080 }`));
var wss = require('socket.io')(server);
//Handling realtime data
wss.on('connection', (socket) => {
    socket.on('error', () => {});
    socket.on('pong', () => {});
    socket.on('disconnect', () => {});
    socket.on('message', () => {});
});

Example:

What can possibly go wrong?

How do we go from the code posted above to code that looks like the snippet below?

Looking at the code above, one would wonder how easy, or hard, maintenance would turn out to be when asked to add or sunset a feature from a program similar to the one presented above.

When trying to figure out how to approach modularization, the following key points may be a challenge:

  • Identifying key layers good enough to make an application work
  • Slicing the application following a well-defined project structure
  • Applying modularization techniques to each layer

When done right, ideally, we will have code that looks like the following at the end of the exercise.

var express = require('express'),
  app = express(), 
  http = require('http'),
  server = http.createServer(app);

...
require('./config')
require('./utils/mongodb');
require('./utils/middleware')(app);
require('./routes')(app);
require('./realtime')(app, server);
...

server.listen(app.get('port'));
module.exports.server = server; 

Example:

The following sections will explore how to make the points stated above work.

The need to have a modular system

  • One of the side effects of creating a layered architecture is an explosion in directory and file count
  • The complexity that the directory count of layered systems brings to the table makes maintaining layered architecture software intimidating
  • Modularization reduces such intimidation. It makes it easy to track source code files and libraries to import to make the program work. Grouping those modules under one exportable banner is key to making this concept work.

The need to have a Layered Architecture

Large codebases tend to be hard to maintain, and nodejs applications are not an exception to this reality. Updates in 3rd party integrations and the evolution of the language or libraries are some of the reasons to revisit a codebase under hibernation.

When talking about the layered architecture, we will be referencing additional layers: the introduction of the controller, the model, and later on the service layer.

A layered architecture makes sure system components are decoupled, and changes in any of the components of the integrated system do not directly translate into a rewrite of dependent modules.

For instance, layered architectures make sure swapping a payment system from an expensive one to a cheaper one does not cause a service disruption. It also makes sure both systems can actually work in tandem.

Another example that may strike your interest: when there is a change in a route handler, the route declaration should stay intact. Ideally, write once and forget.

Every step of the way, while making our application layered, we will keep the above-mentioned details in mind.

The need to make nodejs server modular

Disambiguation: server here does not mean the computer that runs the code, but the actual code that serves as an entry point (server) to the node application.

The nodejs server should be easy to test, like any other application component. To be able to test a nodejs server in isolation, we have to be able to import the nodejs server as a module. A server, by definition, requires the use of network resources. Those resources involve expensive read/write operations and may introduce unwanted side effects when not mocked out in unit tests. Hence, another reason to make the nodejs server modular is the need to mock expensive read/write operations that may impede test performance.

More on how to modularize nodejs servers is explained in Modularization of nodejs servers

Modularization of nodejs server

Modularization of the nodejs server, as with any other module forming procedure, requires 2 steps. The first step is to make sure the server is exportable as any other object. The second step is to provide access to it via an index file.

//in server.js 
var express = require('express'),
    app = express();
var server = require('http').createServer(app);
    server.listen(app.get('port'), () => console.log(`Listening on ${ process.env.PORT || 8080 }`));

//Modularizing the server 
module.exports = server;

//Adding the server export in index.js
module.exports = require('./server');

Once this process is done, it becomes possible to import the server, be it in unit tests or in other sections.

The above code has a caveat: every time we require the server, the script will automatically start listening. To prevent this from happening, the server initialization code can be moved into a configurable function, also known as a thunk.

That modification can make the code look as in following example:

module.exports.server = function(app){
  return function(){
    var m = `Listening on ${ process.env.PORT || 8080 }`;
    var server = require('http').createServer(app);
    server.listen(app.get('port'), () => console.log(m));
    return server;
  }
}

This modification makes sure that, as long as the thunk returned by require('./server').server(app) has not been invoked, server.listen will not run.
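
For clarity, a usage sketch of the thunk:

//listening starts only when the thunk is invoked
var serverThunk = require('./server').server(app);
var server = serverThunk();//server.listen runs here

//in unit tests, the thunk can be left uncalled, or called against a fake app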

The need to make expressjs routes modular

The need for a router and route handlers. The router is obviously a cornerstone of an expressjs application.

For the nodejs only use case, the introduction of a router may be subtle, especially when there is no clear reason to respond to client requests, for instance on command line applications or scripts designed to be run by a server process.

However, in case we need to respond to some requests, it is possible that some request handlers start looking a little like copycats past a certain threshold. Hence the need to make route handlers less repetitive and more reusable.

expressjs route handler comes into the picture, to simplify and avoid boilerplate when it comes to HTTP request/response operations.

Separation of route handler from the route ~ or why tightly coupled handlers are not a good idea.

The route constitutes one part of the application that should not change often. How well that holds depends on how modular the route handler turns out to be.

When the time comes to test a route, it may be hard to isolate one route from the rest of the pack. Since isolated tests require the piece of code under test to be available on its own, we have no other choice but to make routes importable/exportable.

The next section showcases how modularization of a route can be achieved in a few steps.

Modularization of expressjs Routes

Modularization of expressjs route, as for any other module forming procedure, requires 2 steps at a minimum. The first step is to make sure a route is exportable. The second step is to provide access to it via an index file.

To make the route “exportable” means to extract the route out of the server definition.

More on how to modularize the router with the manifest routes technique is explained in Modularizing nodejs applications ~ Manifest routes

//Get this construct off server.js 
app.get('/users/:id', function(req, res, next){
  User.findById(req.params.id, function(error, user){
    if(error) return next(error);
    return res.status(200).json(user);
  });
});

// First iteration makes the route exportable ~ in route/user.js
var User = require('../models').User;
var router = require('express').Router();
router.get('/users/:id', function(req, res, next){
  User.findById(req.params.id, function(error, user){
    if(error) return next(error);
    return res.status(200).json(user);
  });
});
module.exports = router;

// Second iteration ~ in route/index.js
module.exports = require('./user');
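With the router exportable, an isolated test becomes possible. The following sketch assumes the supertest library, a common companion for exercising expressjs routes without binding to a real port:

//in test/routes.spec.js ~ a minimal sketch, assumes supertest is installed
var request = require('supertest'),
    express = require('express'),
    app = express();

//Mount only the router under test
app.use(require('../route'));

request(app)
  .get('/users/1234')
  .expect(200)
  .end(function(error, res){
    if(error) throw error;
    console.log('route responded with', res.body);
  });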

It is always possible to do better while staying within the limits of acceptable refactoring. One improvement to consider is naming the route handler and extracting it from the route definition. This does the groundwork to transform route handlers into controllers.

// Router definition ~ in route/user.js 
router.get('/users/:id', function(req, res, next){
  User.findById(req.params.id, function(error, user){
    if(error) return next(error);
    return res.status(200).json(user);
  });
});

//Naming and extracting route handler results in: 
function getUsersById(req, res, next){
  User.findById(req.params.id, function(error, user){
    if(error) return next(error);
    return res.status(200).json(user);
  });
}
router.get('/users/:id', getUsersById);
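The payoff shows as soon as a second route needs the same behavior: the /admin/:id copycat from the opening example can now reuse the named handler instead of duplicating it.

//Reusing the named handler across routes ~ no copy/paste required
router.get('/users/:id', getUsersById);
router.get('/admin/:id', getUsersById);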

The need for middleware

In the expressjs context, and nodejs in general, middleware provides a way to run validations and verifications before handing control over to the rest of the router.

Commonly used middleware includes CORS, authentication(passportjs), JSON body parsing(transforming a request body into a JSON object) and route logging, to name a few.

Modularization of middleware

Modularization of middleware, as for any other module forming procedure, requires 2 steps at a minimum. The first step is to make sure the middleware is exportable as any other function. The second step is to provide access to it, via index.

An extra step, optional in certain instances, is to identify opportunities to create new middleware and to attach all middleware from one point instead of multiple imports.

/**
 * Essential Middlewares 
 */
app.use(express.logger());
app.use(express.cookieParser());
app.use(express.session({ secret: 'angrybirds' }));
app.use(express.bodyParser());
//Opportunity to modularize 
app.use((req, res, next) => { /** Adding CORS support here */ });

//First step ~ extract cors middleware + move to middleware/cors.js 
function cors(req, res, next){ /** Adding CORS support here */ }
module.exports = cors;
//...then mount it from server.js
app.use(cors);

//Second step ~ Export via middleware/index.js
module.exports = require('./cors');
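For completeness, a hedged sketch of what the extracted cors function might contain; the header values are permissive defaults for illustration, not a production recommendation:

//in middleware/cors.js ~ a minimal sketch
module.exports = function cors(req, res, next){
  res.setHeader('Access-Control-Allow-Origin', '*');
  res.setHeader('Access-Control-Allow-Methods', 'GET,POST,PUT,DELETE,OPTIONS');
  res.setHeader('Access-Control-Allow-Headers', 'Content-Type,Authorization');
  if(req.method === 'OPTIONS') return res.status(204).end();
  return next();
};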

The extra step would be to group middleware calls into a thunk. This kind of module makes it possible to configure a function and use it later on.

//Add initialization function in utils/middleware.js
var cors = require('../middleware/cors');
module.exports = function middleware(express, app){
  //optional
  //return function(){ 
    app.use(express.logger());
    app.use(express.cookieParser());
    app.use(express.session({ secret: 'angrybirds' }));
    app.use(express.bodyParser());
    app.use(cors);
  //}
};

//How to use the new middleware in server.js
require('./utils/middleware')(express, app);

The need for a controller

Modularization of routes leaves one remnant that can be improved upon: the request handler is tightly coupled with the route it serves. That makes re-usability a little hard, isolated testing a bit hacky and developers' lives a little more miserable.

One further step to modularization is the introduction of a standalone handler. Generally speaking, with some exceptions, when a route handler is extracted from the rest of the route, the resulting code qualifies as a controller. A controller can be interchangeable between related and unrelated routes alike.

The need of a controller, or a controller layer, is independent of a router, or routing system adopted. This makes it possible to pass around controllers to various routing systems, as long as they adhere to some sort of similar request handler function signature.

Modularization of controllers

Three steps are required to make controllers modular. The first two steps have been described in the introduction: the introduction of import/export constructs and of the index file. The first step includes making the route handler exportable, which means ejecting the handler out of the route definition.

Modularization of expressjs route handlers as controllers.

Modularization of expressjs controller, as for any other module forming procedure, requires 2 steps at a minimum. The first step is to make sure a controller is exportable. The second step is to provide access to it, via index.

In the case of expressjs, the controller is basically the request/response handler; this notion does not necessarily carry over to controllers outside the expressjs framework.

//Ejecting handler out of route definition + export to controller/user.js 
//The filename can also be user/controller.js for organization by feature projects 
var User = require('../models').User;
function getUserById(req, res, next){
  User.findById(req.params.id, function(error, user){
    if(error) return next(error);
    return res.status(200).json(user);
  });
}

//First iteration ~ export handler as a controller in controller/user.js  
module.exports = getUserById;

// Second iteration ~ in controller/index.js for organization by category projects
module.exports = require('./user');
//OR, for organization by feature projects, in user/index.js:
//module.exports = require('./controller');

Following this logic, a controller is nothing but the ejected route handler.
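Once ejected, a controller can be exercised with plain stubs for req/res/next, with no router and no network involved. A minimal sketch, assuming the User model call is stubbed elsewhere(for instance with a library such as sinon):

//in test/controller.spec.js ~ hand-rolled stubs standing in for express objects
var getUserById = require('../controller/user');

var req = { params: { id: '1234' } };
var res = {
  status: function(code){ this.code = code; return this; },
  json: function(payload){ console.log('responded with', this.code, payload); }
};
var next = function(error){ console.log('error path taken', error); };

//Invoked exactly the way express would invoke it
getUserById(req, res, next);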

The need for a model layer

The model translates application business logic into well-organized data structures within a program; that, in essence, is why a model layer is needed.

Having a data model layer is a whole new story. Managing models becomes cumbersome in some circumstances. That is why there is a plethora of ORMs/ODMs and whatnot. These tools bring DRY to model management.

Separation of Model from Route. A tightly coupled model/router combo is not always a good idea, particularly when the application tends to grow beyond database fetch and dump scenarios. Keeping models tightly coupled to the route makes the application difficult to test and scale, and makes it harder to reduce code duplication. Code duplication is known to increase technical debt.

More on how to modularize the router is explained in How to modularize expressjs routes

Modularization of model layers

Modularization of the model, as for any other module forming procedure, requires 2 steps at a minimum. The first step is to make sure the model is exportable as any other function or object. The second step is to provide access to it, via index.

It is worth mentioning that model definitions may come with an accompanying schema definition. This is usually the case when coupling a nodejs application to a mongodb database via the mongoose ODM(Object Document Mapping).

//In utils/mongodb.js we initialize the mongodb connection
var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost:27017/devdb');
module.exports = mongoose;

//Can be exported in utils/index.js 
module.exports = require('./mongodb');

//How to use this
require('./utils/mongodb');

The model layer is not composed of the database connection and initialization alone; there is also the schema definition. The order in which we load and compile schemas into models is as follows: connect to the database first, then load the models.
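A hedged sketch of that order of operations; the schema fields below are invented for illustration:

//in models/user.js ~ schema definition + model compilation
var mongoose = require('mongoose');
var UserSchema = new mongoose.Schema({
  name: String,
  email: String
});
module.exports = mongoose.model('User', UserSchema);

//in models/index.js ~ connect first, then expose compiled models
require('../utils/mongodb'); //establishes the connection
module.exports.User = require('./user');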

The need for real-time data

Most modern applications require a stream of information to be pushed to customers in near real-time. Some applications are designed with real-time capabilities at their core; others have the real-time element as an add-on.

For instance, a stock trading application cannot afford to lose a minute sending updates to customers who need such invaluable information as soon as stock prices change. The same applies to maps and instant communication applications. Realtime capability is a core feature in such applications.

WebSocket is a protocol that starts out as an HTTP handshake, then upgrades the connection for two-way communication; it is what makes the near real-time magic happen.

Decoupling the real-time portion of the application makes it a little easier to test, deploy and maintain.

Modularization of WebSocket handlers

How to modularize the WebSocket?

Modularization of WebSocket handlers, as for any other module forming procedure, requires 2 steps at a minimum. The first step is to make sure the handlers are exportable as any other function. The second step is to provide access to them, via index.

We have to make sure we isolate WebSocket objects, as the event model of the WebSocket implementation requires a network, which may not be available in a unit testing context.

Even if the network were available, unit testing over a network is expensive; it should be mocked out to keep test cases small and fast.

More on how to modularize WebSocket is available on How to modularize socket.io/expressjs application

var express = require('express'),
    app = express(), 
    http = require('http'),
    server = http.createServer(app),
    wss = require('socket.io')(server);
//Handling realtime data
wss.on('connection', (socket) => {
    socket.on('error', () => {});
    socket.on('pong', () => {});
    socket.on('disconnect', () => {});
    socket.on('message', () => {});
});

//in ./realtime.js 
module.exports = function(app, server){
  var wss = require('socket.io')(server);
  //Named handlers ~ kept here, or extracted into their own module(see the sketch below)
  function onError(error){ }
  function onPong(){ }
  function onDisconnect(){ }
  function onMessage(message){ }
  //Handling realtime data
  wss.on('connection', (socket) => {
      socket.on('error', onError);
      socket.on('pong', onPong);
      socket.on('disconnect', onDisconnect);
      socket.on('message', onMessage);
  });
};

//How to use: in server.js 
require('./realtime')(app, server);
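To push the isolation one step further, the named handlers can live in their own module, so they can be unit tested without any socket or network in sight. A sketch, with handlers.js as an assumed filename:

//in handlers.js ~ exportable, network-free, unit testable
module.exports = {
  onError: function(error){ console.error('socket error', error); },
  onPong: function(){ /** heartbeat bookkeeping */ },
  onDisconnect: function(){ /** cleanup */ },
  onMessage: function(message){ console.log('received', message); }
};

//in realtime.js ~ wiring the imported handlers
var handlers = require('./handlers');
//wss.on('connection', (socket) => socket.on('message', handlers.onMessage));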

The need for a service layer

The MVC approach decouples the model from the view, and both are supposed to be held together by a controller. However, using models inside a controller assumes that the database or ORM used to negotiate data with the underlying database will not change over time. That is not always the case.

The API of the ORM/ODM may change over time. The ORM/ODM may be sunset at one point. Data that used to come from one kind of database may be moved to another database.

All of the above changes require our model to be resilient and ready to adapt to change over time.

The examples provided above are all related to the data model, but integration with third-party services can well fall under the same category.

To sum up, there is a clear need to abstract business logic into a service layer, so that changes stay within the limits of what we can control. There is a clear need for the modularization of business logic under the service banner.

Modularization of the service layer

There are 3 steps we can take to make the service layer modular. Two of those steps have already been mentioned in the introduction. The last and most critical is to group related individual model handling modules, or third party integration links, into a service.

The following module can be used as a service:

//initial request handler ~ talks to the model directly
var User = require('./models').User;
module.exports = function(req, res, next){
  return User.findById(req.params.id, function(error, user){
    if(error) return next(error);
    return res.status(200).json(user);
  });
};

//service declaration ~ e.g. in services/user.js
var User = require('../models').User;
function UserService(){}

UserService.prototype.getUser = function(id, next){
  return User.findById(id, function(error, user){
    return next(error, user);
  });
};
module.exports = UserService;

//request|controller handler ~ now depends on the service, not the model
var UserService = require('../services/user');
module.exports = function(req, res, next){
  return new UserService().getUser(req.params.id, function(error, user){
    if(error) return next(error);
    return res.status(200).json(user);
  });
};

The signature of findById may change in the future, or the function is renamed. Whatever happens, we have one single place in our library that is subject to such a change.
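To make that concrete, suppose the ODM sunsets findById in favor of a filter-based lookup. A hedged sketch of the one-place change, with the new call shape assumed for illustration, while every controller keeps calling getUser unchanged:

//in services/user.js ~ the only file that has to change
UserService.prototype.getUser = function(id, next){
  //was: User.findById(id, ...)
  return User.findOne({ _id: id }, function(error, user){
    return next(error, user);
  });
};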

More on how to modularize the service layer is explained in Modularize nodejs service layer

The need for a configuration layer

The service layer, the controller layer, the model layer, servers, and whatnot all require pre-determined environment variables that enable those layers to work from one environment to the next. Those variables are known under the configuration moniker. They are sensitive and prone to unintentional leaks to the public. Modularization of such a layer, in part, resolves this issue.

When it comes to databases, there is a need to modularize database connection strings and secrets and get them out of the codebase. This idea can be extended to other secrets and variables needed to run the application.
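A hedged sketch of how that looks in practice; the variable names are illustrative:

# in .env ~ kept out of version control
MONGODB_URI=mongodb://localhost:27017/devdb
PORT=8080
SESSION_SECRET=angrybirds

//in utils/mongodb.js ~ the connection string moves out of the codebase
mongoose.connect(process.env.MONGODB_URI);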

Modularization of configuration layer

The dotenv library brings server configurations into the midst of the application. The beauty of this library is its ability to completely separate configuration from code.

//in utils/config.js 
var dotenv = require('dotenv'),
  dotenvExpand = require('dotenv-expand');

module.exports = function(){
  var config = dotenv.config();
  dotenvExpand(config);
  return config;
};

//in utils/index.js
module.exports = require('./config')();

//Usage 
var config = require('./utils');

More on how to modularize configuration variables is explained in Modularize nodejs application configuration

To sum up

As a recapitulation, the code we exposed at the beginning of the current article may end up looking something like this:

var express = require('express'),
    app = express(),
    http = require('http'),
    server = http.createServer(app);
//@todo add modules here to make the server a bit smaller and granular 
...
require('./config');
require('./utils/mongodb');
require('./utils/middleware')(express, app);
require('./routes')(app);
require('./realtime')(app, server);
...
server.listen(app.get('port'));
module.exports.server = server; 


Conclusion

In this article, we revisited how modularization can be achieved by leveraging the power of module.exports(or import/export in ES2015 and later). The variety of components that are candidates for modularization makes it imperative for modularization to stay minimalist; that is why we leveraged the index file, to make sure we do not overload already complex architectures. There are additional, deeper complementary materials around modularization and project layout in the “Testing nodejs applications” book.
