Simple Engineering

expressjs

A server relies on network resources, some of which perform expensive read/write operations. Testing a server therefore introduces side effects, some of them expensive, that can cause unintended consequences when they are not mocked. To limit the chances of breaking something, servers have to be tested in isolation.

The question to ask at this stage is: how do we get there? This blog article explores some of the ways to answer that question.

The motivation for modularization is to reduce the complexity associated with large-scale expressjs applications. In the context of nodejs servers, we will shift the focus to making most of the parts accessible for tests in isolation.

In this article we will talk about:

  • How to modularize nodejs server for reusability.
  • How to modularize nodejs server for testability.
  • How to modularize nodejs server for composability.

Even though this blog post was designed to offer complementary materials to those who bought my Testing nodejs Applications book, the content can help any software developer to tune up their working environment. You can use this link to buy the book.

Show me the code

A nodejs application server comes in two flavors: using the native nodejs library, or adopting a server provided by a framework, in our case expressjs.

Using the expressjs framework, a classic server looks like the following example:

var express = require('express'),
    app = express(),
    port = process.env.PORT || 3000;
/** .. more routes + code for app ... */
app.get('/', function (req, res) {
  return res.send('Hello World!');
});

app.listen(port, function () {
  console.log('Example app listening on port ' + port + '!');
});
//source: https://expressjs.com/en/starter/hello-world.html

Example: a classic expressjs hello-world server

As requirements increase, this file grows exponentially. Most applications run on top of expressjs, a popular library in the nodejs world. To keep server.js small, regardless of requirements and dependent modules, moving most of the code into modules makes a difference.

var http = require('http'),
  hostname = 'localhost',
  port = process.env.PORT || 3000,
  server = http.createServer(function(req, res){
    res.statusCode = 200;
    res.setHeader('Content-Type', 'text/plain');
    res.end('Hello World\n');
  });

//Alternatively
var express = require('express'),
    app = express(),
    server = http.createServer(app);
require('./app/routes')(app);

server.listen(port, hostname, function (){
  console.log(['Server running at http://', hostname, ':', port].join(''));
});
//source: https://nodejs.org/api/synopsis.html#synopsis_example

Example: a native http server, and an expressjs alternative sharing the same http.createServer

What can possibly go wrong?

When trying to figure out how to approach modularizing nodejs servers, the following points may be a challenge:

  • Understanding where to start, and where to stop with server modularization
  • Understanding key parts that need abstraction, or how/where to inject dependencies
  • Making servers testable

The following sections explore how to make the points stated above work.

How to modularize nodejs server for reusability

How to apply modularization techniques in a server context, or put differently: how to break a larger server file into smaller, more granular modules.

The server reusability becomes an issue when it becomes clear that the server bootstrapping code either needs some refactoring or presents an opportunity to add extra test coverage.

In order to make the server available to the third-party sandboxed testing environment, the server has to be exportable first.

In order to load the server and mock/stub certain areas of its code, the server still has to be exportable.

Like any other modularization technique we have used, two steps come into play. Since our case involves multiple players, for instance expressjs, WebSocket and whatnot, we have to treat the HTTP server as an equal of those other possible servers.

How to modularize nodejs server for testability

Simulating start/stop while running tests is the catalyst of this exercise.

Testability and composability are other real drivers for making the server modular. A modular server is easy to load into the testing sandbox like any other object, and makes it possible to mock any dependency we deem unnecessary or that prevents us from getting the job done.

Related reading: Simulation of start/stop while running tests; How to correctly unit test an express server (a better code structure organization makes it easy to test, get coverage, etc.); Testing nodejs with mocha.

The previous example shows how much simpler server initialization becomes, though that comes with an additional library to install. Modularizing the two code segments above makes it possible to test the server in isolation.

module.exports = server;

Example: Modularization – this line makes server available in our tests ~ source

How to modularize nodejs server for composability

The challenge is to expose the HTTP server in a way that redis/websocket or agenda can re-use the same server: making the server injectable.

The composability of the server is rather counter-intuitive. In most cases, the server is injected into other components so that those components can mount additional server capabilities. The code sample proves this point by making the HTTP server available to a WebSocket component, so that the WebSocket can be aware of, and attach to, the same instance of the HTTP server.

var http = require('http'), 
    app = require('express')(),
    server = http.createServer(app),
    sio = require("socket.io")(server);

...

module.exports = server;

Conclusion

Modularization is key to making a nodejs server elegant, and serves as a baseline for performance improvements and improved testability. In this article, we revisited how to achieve nodejs server modularity, with stress on testability and code reusability. There are additional complementary materials in the “Testing nodejs applications” book.

References

tags: #snippets #modularization #nodejs #expressjs

We assume most of the system components to be accessible for testability. However, that is challenging when routes are a little bit complex. To reduce the complexity that comes with working on large-scale expressjs routes, we will apply a technique known as manifest routes to make route declarations change-proof, keeping them stable as the rest of the application evolves.

In this article we will talk about:

  • The need for the manifest routes technique
  • How to apply manifest routes as a modularization technique


Show me the code

var express = require('express');
var app = express();
var port = process.env.PORT || 3000;
var User = require('./models').User;

app.get('/', function(req, res, next) {  
  res.render('index', { title: 'Express' });
});

/** code that initializes everything, then comes this route */
app.get('/users/:id', function(req, res, next){
  User.findById(req.params.id, function(error, user){
    if(error) return next(error);
    return res.status(200).json(user);
  });
});

app.listen(port, function () {
  console.log('Example app listening on port ' + port + '!');
});

What can possibly go wrong?

When trying to figure out how to approach modularization of expressjs routes with a manifest route pattern, the following points may be a challenge:

  • Where to start with modularization without breaking the rest of the application
  • How to introduce the layered architecture, without incurring additional test burden, but making it easier to isolate tests

The following sections explore how to make the points stated above work.

The need for the manifest routes technique

There is a subtle nuance that is missing when following traditional approaches to modularization.

When adding an index file as a part of the modularization process, exporting the content of directories ~ and sub-directories, for that matter ~ does not by itself yield routes that can be plugged into an existing expressjs application.

The remedy is to create the routes, isolate them, export them, and manifest them to the outer world.

How to apply manifest routes to handlers for reusability

The handlers are a beast in their own way.

A collection of related route handlers can be used as a baseline to create the controller layer. The modularization of this newly created/revealed layer can be achieved in two steps as was the case for other use cases. The first step consists of naming, ejecting, and exporting single functions as modules. The second step consists of adding an index to every directory and exporting the content of the directory.

Manifest routes

In essence, requiring a top-level directory looks for index.js at the top of that directory and makes all the route content accessible to the caller.

var routes = require('./routes'); 

Example: /routes has index.js at top level directory ~ source
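The mechanics behind that single require can be sketched without expressjs at all: each directory's index manifests what the directory contains, and the top level re-exports each sub-directory once. A dependency-free sketch (the file names in the comments are hypothetical):

```javascript
// routes/users/get-user.js ~ one handler per file
var getUser = function (req, res) { return 'user:' + req.params.id; };

// routes/users/new-user.js
var newUser = function (req, res) { return 'created'; };

// routes/users/index.js ~ manifests the directory's handlers
var users = { getUser: getUser, newUser: newUser };

// routes/index.js ~ manifests each sub-directory once
var routes = { users: users };

// Callers reach any handler through a single entry point,
// exactly as require('./routes') would resolve routes/index.js
var result = routes.users.getUser({ params: { id: 12 } });
console.log(result); // 'user:12'
```

With expressjs, the same shape holds, except each index exports a Router and mounts its children with router.use().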

A typical default entry point of the application:

var express = require('express');  
var router = express.Router();

router.get('/', function(req, res, next) {  
  return res.render('index', { title: 'Express' });
});
module.exports = router;  

Example: default /index entry point

Anatomy of a route handler

module.exports = function (req, res) {  };

Example: routes/users/get-user|new-user|delete-user.js

“The most elegant configuration that I've found is to turn the larger routes with lots of sub-routes into a directory instead of a single route file” – Chev source

When the individual routes/users sub-directories are put together, the resulting index looks like the following code sample:

var router = require('express').Router();  
router.get('/get/:id', require('./get-user.js'));  
router.post('/new', require('./new-user.js'));  
router.post('/delete/:id', require('./delete-user.js'));  
module.exports = router;    

Example: routes/users/index.js

Update when routes/users/favorites/ adds more sub-directories

router.use('/favorites', require('./favorites')); 
...
module.exports = router;

Example: routes/users/index.js ~ after adding a new favorites requirement

We can go the extra mile and group route handlers into controllers. Using a route with a controller's route handlers would look as in the following example:

var router = require('express').Router();
var catalogues = require('./controllers/catalogues');

router.route('/catalogues')
  .get(catalogues.getItem)
  .post(catalogues.createItem);
module.exports = router;

Conclusion

Modularization makes expressjs routes reusable, composable, and stable as the rest of the system evolves. Modularization brings elegance to route composition, improved testability, and reduces instances of redundancy.

In this article, we revisited a technique that improves expressjs routes' elegance, testability, and re-usability, known under the manifest route moniker. We also restate that the manifest route technique is an extra mile in modularizing expressjs routes. There are additional complementary materials in the “Testing nodejs applications” book.

References

#snippets #modularization #manifest-routes #nodejs #expressjs

In most integration and end-to-end route testing, a live server may be deemed critical to making reasonable test assertions. A live server is not always a good idea, especially in a sandboxed environment such as a CI environment where opening server ports may be restricted, if not outright prohibited. In this article, we explore how mocking HTTP requests/responses makes the use of an actual server obsolete.

In this article we will talk about:

  • Mocking the Server instance
  • Mocking Route's Request/Response objects
  • Modularization of routes and revealing server instance
  • Auto reload (hot reload) using nodemon, supervisor or forever


Show me the code

//getProfile route handler, typically in its own module
var User = require('./models').User; 
module.exports = function getProfile(req, res, next){
  User.findById(req.params.id, function(error, user){
    if(error) return next(error);
    return res.status(200).json(user);
  });
};

//Router with authentication middleware
var router = require('express').Router();
var authenticated = require('./middleware/authenticated');
var getUser = require('./users/get-user');
router.get('/users/:id', authenticated, getUser);
module.exports = router;

What can possibly go wrong?

When trying to figure out how to approach testing expressjs routes, the driving force behind falling into the integration-testing trap is the need to start a server. The following points may be a challenge:

  • Routes should be served at any time while testing
  • Testing in a sandboxed environment restricts what a server can do (opening new ports, serving requests, etc.)
  • Mocking request/response objects takes the need for a server out of the picture

Testing routes without spinning up a server

The key is mocking request/response objects. A typical REST integration test shares similarities with the following snippet.


var app = require('express')(),
  request = require('./support/http');

describe('req .route', function(){
  it('should serve on route /user/:id/edit', function(done){
    app.get('/user/:id/edit', function(req, res){
      expect(req.route.path).to.equal('/user/:id/edit');
      res.end();
    });

    request(app)
      .get('/user/12/edit')
      .expect(200, done);
  });
  it('should serve get requests', function(done){
    app.get('/user/:id/edit', function(req, res){
      expect(req.route.method).to.equal('get');
      res.end();
    });

    request(app)
    .get('/user/12/edit')
    .expect(200, done);
  });
});

Example: route tests without manually starting a server

This example is adapted from Stack Overflow and supertest. supertest spins up a server if necessary. In case we don't want a server at all, dupertest can be a reasonable alternative. request = require('./support/http') is a utility that may use either of those two libraries to provide a request.

Choosing tools

If you haven't already, reading “How to choose the right tools” blog post gives insights on a framework we used to choose the tools we suggest in this blog.

Following our own Choosing the right tools framework, we suggest adopting the following tools, when testing expressjs routes by mocking out the server:

  • There exist well-respected test runners such as jasmine (jasmine-node), ava and jest in the wild. mocha can do just fine for example's sake.
  • There are also code instrumentation tools in the wild. mocha integrates well with istanbul, a test coverage and reporting library.
  • supertest, nock and dupertest are frameworks for mocking HTTP; nock intercepts requests. dupertest responds better to our demands (not spinning up a server).

Workflow

If you haven't already, read the “How to write test cases developers will love” article.

# In package.json, under "scripts.test" - add the next line
> "istanbul cover _mocha -- --color --reporter mocha-lcov-reporter specs"
# OR with nyc: "nyc mocha --color --reporter mocha-lcov-reporter specs"

# Then run the tests using 
$ npm test

Example: istanbul generates reports as tests progress

Conclusion

To sum up, it pays off to spend extra time writing some tests. Effective tests can be written before, as well as after writing code. The balance should be at the discretion of the developer.

Testing nodejs routes is quite intimidating on the first encounter. This article contributed to shifting that fear into opportunities.

Removing the server dependency makes it easy to validate the most common use cases at a lower cost. Writing a good, meaningful test message is pure art. There are additional complementary materials in the “Testing nodejs applications” book.

References

#tdd #testing #nodejs #expressjs #server


This blog post approaches testing a fairly large nodejs application from a real-world perspective and with refactoring in mind. The use cases address the advanced concepts that testing expressjs routes entails.

Automated testing of any JavaScript project is quite intimidating for newbies and veterans alike.

In this article we will talk about:

  • Healthy test coverage of routes
  • Modularization of routes for testability
  • Mock Route's Request/Response Objects when necessary
  • Mock requests to third-party endpoints such as Payment Gateway.

Additional challenges while testing expressjs routes:

  • Test code, not the output
  • Mock requests to Payment Gateway, etc.
  • Mock database read/write operations
  • Be able to cover exceptions and missing data structures


Show me the code

//getProfile route handler, typically in its own module
var User = require('./models').User; 
module.exports = function getProfile(req, res, next){
  User.findById(req.params.id, function(error, user){
    if(error) return next(error);
    return res.status(200).json(user);
  });
};

//Router with authentication middleware
var router = require('express').Router();
var authenticated = require('./middleware/authenticated');
var getProfile = require('./settings/get-profile');
router.get('/profile/:id', authenticated, getProfile);
module.exports = router;

Example: a profile route handler and its authenticated router

What can possibly go wrong?

When unit testing expressjs routes, the following challenges may arise:

  • Drawing a line between tests that fall into the unit testing category versus those tests that fall into the integration testing camp.
  • Being mindful that authenticated routes can appear in the picture
  • Mocking database read/write operations, or other layers (controller/service) that are not critical (core) to validating the route's expectations

Choosing tools

If you haven't already, reading “How to choose the right tools” blog post gives insights on a framework we used to choose the tools we suggest in this blog.

Following our own Choosing the right tools framework, we suggest adopting the following tools, when testing expressjs routes:

  • We can technically have auto-reload or hot-reload using: pm2, nodemon or forever. We recommend supervisor.
  • We can choose amongst a myriad of test runners, for instance, jasmine(jasmine-node), ava or jest. We recommend mocha. The stack mocha, chai and sinon can be worth it as well.
  • The supertest framework for testing RESTful APIs, and nock for mocking HTTP.
  • Code under test is instrumented, but default reporting tools do not always suit our every project's needs. For test coverage reporting we recommend istanbul.

Workflow

It is possible to generate reports as tests progress.

The latest versions of istanbul ship under the nyc name.

# In package.json, under "scripts.test" - add the next line
> "istanbul cover _mocha -- --color --reporter mocha-lcov-reporter specs"

# Then run the tests using 
$ npm test

Show me the test

If you haven't already, read the “How to write test cases developers will love” article.

The mainstream philosophy about automated testing is to write failing tests, followed by code that resolves the failing use cases. This is not always the case, especially when dealing with legacy code, or poorly tested code. The less puritan approach is to at least write tests while the code is still fresh in memory.

In this article, we assume the reader knows how to mock routes; otherwise, this blog has articles that cover the basics of mocking routes' request/response objects and mocking database read/write functions.

The common source of frustration, and of the bad decision-making that sometimes follows, is not being able to define boundaries: when to start refactoring, and when to stop.

Testing a route handler in isolation looks like testing any function. In our case, the User.findById() function that the request relies on should be mocked.

For more, see how to mock mongoose read/write functions.
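The gist of that mocking operation can be hand-rolled; sinon.stub(User, 'findById').yields(null, doc) achieves the same in one line. The User shape below is an assumption:

```javascript
// Stand-in for the mongoose model under test (shape is an assumption)
var User = {
  findById: function (id, cb) { /* the real method talks to MongoDB */ }
};

// Swap the method before the test...
var original = User.findById;
User.findById = function (id, cb) {
  return cb(null, { _id: id, name: 'Jane Doe' }); // canned document
};

// ...exercise code that depends on it...
var seen;
User.findById(1234, function (error, user) {
  seen = user.name;
});

// ...then restore it afterwards, as sinon's restore() would.
User.findById = original;
```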

describe('getProfile', () => {
  let req, res, next, error, sessionObject;
  beforeEach(() => {
    next = sinon.spy();
    sessionObject = { ... };//mocking the session object
    req = { params: {id: 1234}, user: sessionObject };
    // status() always returns the same object, so res.status().json
    // refers to one spy across calls
    res = { status: sinon.stub().returns({ json: sinon.spy() }) };
  });

  it('returns a profile', () => {
    getProfile(req, res, next);
    expect(res.status().json).toHaveBeenCalled();
  });
  
  it('fails when no profile is found', () => {
    getProfile(req, res, next);
    expect(next).toHaveBeenCalledWith([error, null]);
  });

});

Please refer to this article to learn more about mocking mongoose read/write functions.

Testing an entire route falls into the integration testing category. Whether to connect to a live database or use a live server is up to the programmer, but the best (fast/efficient) approach is to mock out those two expensive parts as well.

var router = require('./profile/router'),
    request = require('./support/http');
describe('/profile/:id', () => {
  it('returns a profile', done => {
    request(router)
      .get('/profile/12')
      .expect(200, done);
  });

  it('fails when no profile is found', done => {
    request(router)
      .get('/profile/NONEXISTENT')
      .expect(500, done);
  });
});

request = require('./support/http') is a utility that may use either supertest or dupertest to provide a request.

Conclusion

When paying off technical debt, small bad moves can build up into a catastrophe, such as downtime with little failure traceability. Good test coverage increases confidence when refactoring and refines boundaries, while reducing the introduction of new bugs into the codebase.

In this article, we reviewed how testing tends to be more of an art than a science. We also stressed the fact that, as in any art, practice makes perfect ~ testing routes, just like testing controllers, can be challenging when interacting with external systems is involved. There are additional complementary materials in the “Testing nodejs applications” book.

References

#snippets #expressjs #routes #discuss