Simple Engineering

mocking

The stream API provides an asynchronous computation model for heavy workloads that keeps a small memory footprint. As exciting as that may sound, testing streams is somewhat intimidating. This blog lays out some key elements necessary to be successful when mocking the stream API.

Keep in mind that there is a clear difference between mocks versus stubs/spies/fakes, even though we use mock interchangeably throughout this article.

In this article we will talk about:

  • Understanding the difference between Readable and Writable streams
  • Stubbing Writable stream
  • Stubbing Readable stream
  • Stubbing Duplex or Transformer streams

Even though this blog post was designed to offer complementary materials to those who bought my Testing nodejs Applications book, the content can help any software developer to tune up their working environment. You can use this link to buy the book. Testing nodejs Applications Book Cover

Show me the code

var fs = require('fs');
var gzip = require('zlib').createGzip();//quick example to show multiple pipings
var route = require('express').Router(); 
//getter() reads a large file of songs metadata, transforms it and sends back scaled down metadata 
route.get('/songs', function getter(req, res, next){
        let rstream = fs.createReadStream('./several-TB-of-songs.json'); 
        rstream.
            pipe(new MetadataStreamTransformer()).
            pipe(gzip).
            pipe(res);
        // forwarding the error to the next handler     
        rstream.on('error', (error) => next(error, null));
});

At a glance, the code is supposed to read a very large JSON file (TBs of metadata about songs), apply some transformations, gzip, and send the response to the caller by piping the results onto the response object.

The next example demonstrates what a typical transformer such as MetadataStreamTransformer looks like.

const inherits = require('util').inherits;
const Transform = require('stream').Transform;

function MetadataStreamTransformer(options){
    if(!(this instanceof MetadataStreamTransformer)){
        return new MetadataStreamTransformer(options);
    }
    this.options = Object.assign({}, options, {objectMode: true});//<= enforces object mode chunks
    Transform.call(this, this.options);
}
inherits(MetadataStreamTransformer, Transform);
MetadataStreamTransformer.prototype._transform = function(chunk, encoding, next){
    //minimalistic implementation 
    //@todo  process chunk + by adding/removing elements
    let data = JSON.parse(typeof chunk === 'string' ? chunk : chunk.toString('utf8'));
    this.push({id: (data || {}).id || Math.random() });//Math.random() as a placeholder id
    if(typeof next === 'function') next();
};

MetadataStreamTransformer.prototype._flush = function(next) {
    this.push(null);//tells that operation is over 
    if(typeof next === 'function') {next();}
};

The inheritance style used in this program may be old-fashioned, but it illustrates well, in a prototypal way, that our MetadataStreamTransformer inherits from Stream#Transform.

What can possibly go wrong?

Stubbing functions in a stream processing scenario may yield the following challenges:

  • How to deal with the asynchronous nature of streams
  • Identifying areas where it makes sense to add a stub, for instance: expensive operations
  • Identifying key areas needing drop-in replacements, for instance reading from a third party source over the network.

Primer

The key points when stubbing streams are:

  • To identify where the heavy lifting is happening. In pure stream terms, functions that execute _read() and _write() are our main focus.
  • To isolate some entities, to be able to test small parts in isolation. For instance, make sure we test MetadataStreamTransformer in isolation, and mock any response fed into .pipe() operator in other places.

What is the difference between readable vs writable vs duplex streams? The long answer is available in substack's Stream Handbook

Generally speaking, Readable streams produce data that can be fed into Writable streams. Readable streams can be .pipe()d on, but not into. Readable streams have readable|data events and, implementation-wise, implement ._read() from the Stream#Readable interface.

Writable streams can be .pipe()d into, but not on. For example, res in the examples above is piped into from an existing stream. The opposite is not always guaranteed. Writable streams also have drain|finish events and, implementation-wise, implement ._write() from the Stream#Writable interface.

Duplex streams go both ways. They have the ability to read from the previous stream and write to the next stream. Transformer streams are duplex and implement ._transform() from the Stream#Transform interface.

Modus Operandi

How to test the above code by taking on smaller pieces?

  • fs.createReadStream won't be tested, but stubbed and returns a mocked readable stream
  • .pipe() will be stubbed to return a chain of stream operators
  • gzip and res won't be tested, and are therefore stubbed to return writable+readable mocked stream objects
  • rstream.on('error', cb): stub the readable stream with a read error, spy on next() and check whether it has been called
  • MetadataStreamTransformer will be tested in isolation and MetadataStreamTransformer._transform() will be treated as any other function, except it accepts streams and emits events

How to stub stream functions

describe('/songs', () => {
    before(() => {
        sinon.stub(fs, 'createReadStream').returns({
            pipe: sinon.stub().returns({
                pipe: sinon.stub().returns({
                    pipe: sinon.stub().returns(responseMock)
                })
            }),
            on: sinon.spy(() => true)
        });
    });
    after(() => fs.createReadStream.restore());
});

This way of chained stubbing is available in our toolbox. Great power comes with great responsibilities, and wielding this sword may not always be a good idea.

There is an alternative at the very end of this discussion.
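In the meantime, a lighter variant of the chained stub, sketched here without sinon, is a self-returning pipe fake: every call to .pipe() records its target and returns the same object, so a single stub covers any chain depth:

```javascript
// Hand-rolled fake readable: .pipe() returns the fake itself,
// so rstream.pipe(a).pipe(b).pipe(c) works at any depth.
function makeFakeReadStream() {
  const calls = { pipe: [], on: [] };
  const fake = {
    calls,
    pipe(target) { calls.pipe.push(target); return fake; },
    on(event, handler) { calls.on.push(event); return fake; }
  };
  return fake;
}

const rstream = makeFakeReadStream();
rstream.pipe('transformer').pipe('gzip').pipe('res');
// rstream.calls.pipe now records all three targets in order
```

sinon can achieve the same shape with sinon.stub().returnsThis(), keeping the stub depth independent of the number of .pipe() calls.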

Testing the transformer stream class in isolation may be broken down into:

  • stub the whole Transform instance
  • Or stub the .push() and simulate a write by feeding in the readable mocked stream of data

The stubbed push() is a good place to add assertions.

it('_transform()', function(done){
    var Readable = require('stream').Readable;
    var rstream = new Readable({read: function(){}}); 
    var mockPush = sinon.stub(MetadataStreamTransformer.prototype, 'push', function(data){
        assert.isNumber(data.id);//testing data sent to callers, etc.
        return true;
    });
    var tstream = new MetadataStreamTransformer();
    rstream.push('{"id": 1}');
    rstream.push('{"id": 2}');
    rstream.push(null);//closes the mocked readable stream
    rstream.pipe(tstream);
    tstream.on('finish', function(){
        expect(tstream.push.called, '#push() has been called').to.equal(true);
        mockPush.restore(); 
        done();
    });
});

How to Mock Stream Response Objects

The classic example of a readable stream is reading from a file. This example shows how to mock fs.createReadStream and return a readable stream that can be asserted on.

//the stub can emit two or more data events + close the stream
var Readable = require('stream').Readable;
var rstream = new Readable({read: function(){}});
sinon.stub(fs, 'createReadStream', function(file){ 
    //trick from @link https://stackoverflow.com/a/33154121/132610
    assert(file, '#createReadStream received a file');
    process.nextTick(function(){
        rstream.emit('data', '{"id":1}');
        rstream.emit('data', '{"id":2}');
        rstream.emit('end');
    });
    return rstream; 
});

var pipeStub = sinon.spy(rstream, 'pipe');
//Once called, the above structure will stream two elements: good enough to simulate reading a file.
//`gzip` (another transformer stream) can be stubbed the same way 
var next = sinon.stub();
//use this function, or call the whole route 
getter(req, res, next);
//expectations follow: 
expect(rstream.pipe.called, '#pipe() has been called').to.equal(true);

Conclusion

In this article, we established the difference between Readable and Writable streams and how to stub each one of them when unit testing.

Testing tends to be more of an art than a science; practice makes perfect. There are additional complementary materials in the “Testing nodejs applications” book.


tags: #snippets #TDD #streams #nodejs #mocking

There is a striking similarity between testing expressjs route handlers and controllers. That similarity and its test exploration are the subject matter of this article.

Few resources about testing in general address advanced concepts such as how to isolate components for better composability and healthy test coverage. One of the components that improve composability, at least in layered nodejs applications, is the controller.

In this article we will talk about:

  • Mocking controller Request/Response objects
  • Providing healthy test coverage to controllers
  • Avoiding controller integration test trap


Show me the code

//Session Object in settings/controller/get-profile  
module.exports = function getProfile(req, res, next){
    let user = req.session.user;
    UserModel.findById(user._id, (error, user) => {
        if(error) return next(error, null);
        return res.status(200).json(user); 
    });     
};

This code is a valid controller and a valid handler. There is a caveat in its design that makes the case for introducing a service layer in the application.
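To illustrate the caveat, here is a hypothetical service-layer sketch: once database access hides behind an injected service, the controller depends on a small surface that is trivial to fake (the names below are illustrative, not from the codebase):

```javascript
// Hypothetical service layer: the model is injected, so tests can
// pass a fake and never touch the database.
function makeUserService(UserModel) {
  return {
    findById(id) {
      return new Promise((resolve, reject) => {
        UserModel.findById(id, (error, user) => {
          if (error) return reject(error);
          resolve(user);
        });
      });
    }
  };
}

// In a test, a fake model replaces the database entirely.
const fakeModel = {
  findById(id, cb) { cb(null, { _id: id, name: 'jane' }); }
};

let found;
makeUserService(fakeModel).findById('abc123').then(user => { found = user; });
```

A controller written against such a service no longer needs database stubs at all.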

What can possibly go wrong?

When trying to figure out how to approach testing expressjs controllers in a Unit Test context, the following points may be a challenge:

  • How to refactor unit tests when the controller layer gets introduced in place of route handlers.
  • Mocking database read/write operations, or the service layer if any, that are not core/critical to validating the controller's expectations
  • Test-driven refactoring of the controller to adopt a service layer, to abstract the database and third-party services.

The following sections will explore more on making points stated above work.

Choosing tools

If you haven't already, reading “How to choose the right tools” blog post gives insights on a framework we used to choose the tools we suggest in this blog.

Following our own “Choosing the right tools” framework, we adopted the following tools (that made sense to complete current article) on testing expressjs controllers:

  • We can choose among a myriad of test runners, for instance jasmine(jasmine-node), ava or jest. We chose mocha.
  • The stack mocha, chai and sinon (assertion and test doubles libraries) is worth a shot.
  • supertest framework for mocking RESTful APIs and nock for mocking HTTP.
  • Code under test is instrumented, but default reporting tools do not always suit every project's needs. For test coverage reporting we recommend istanbul.

Workflow

It is possible to generate reports as tests progress.

The latest versions of istanbul use the name nyc.

# In package.json at "test" - add next line
> "istanbul cover _mocha -- --color --reporter mocha-lcov-reporter specs"

# Then run the tests using 
$ npm test --coverage 

Show me the tests

If you haven't already, read the “How to write test cases developers will love”

It is not always obvious why to have a controller layer in a nodejs application. When the controller is already part of the application, it may well be problematic to test it, in a way that provides value to the application as a whole, without sacrificing “time to market”.

describe('getProfile', () => {
  let req, res, next, json, error;
  beforeEach(() => {
    next = sinon.spy();
    json = sinon.spy();
    sessionObject = { ... };//mocking session object
    req = { params: {id: 1234}, session: {user: sessionObject} };
    res = { status: sinon.stub().returns({ json }) };
  });

  it('returns a profile', () => {
    getProfile(req, res, next);
    expect(json.called, '#json() has been called').to.equal(true);
  });
  
  it('fails when no profile is found', () => {
    getProfile(req, res, next);
    expect(next.calledWith(error, null), '#next() received an error').to.equal(true);
  });

});

The integration testing of the request may look a bit like in the following paragraph:

var router = require('./profile/router'),
    request = require('./support/http');
describe('/profile/:id', () => {
  it('returns a profile', done => {
    request(router)
      .get('/profile/12')
      .expect(200, done);
  });

  it('fails when no profile is found', done => {
    request(router)
      .get('/profile/NONEXISTENT')
      .expect(500, done);
  });
});

request = require('./support/http') is a utility that may use either supertest or dupertest to provide a request.

Once the above process is refined, more complex use cases can be sliced into more manageable, testable cases. The following are some of the complex use cases we can think of for now:

module.exports = function(req, res, next){
  User.findById(req.user, function(error, user){
    if(error) return next(error); 
    new Messenger(options).send().then(function(response){
      redisClient.publish(Messenger.SYSTEM_EVENT, payload);
      //schedule a delayed job 
      return res.status(200).json({message: 'Some Message'});
    });
  });
};

It may be hard to mock one single use case, with callbacks. That is where slicing, and grouping libraries into reusable services can come in handy. Once a library has a corresponding wrapper service, it becomes easy to mock the service as we wish.

module.exports = function(req, res, next){
  UserService.findById(req.user)
    .then(function(user){ return new Messenger(options).send(); })
    .then(function(response){ return new RedisService(redisClient).publish(Messenger.SYSTEM_EVENT, payload); })
    .then(function(response){ return res.status(200).json(message); })
    .catch(function(error){ return next(error); });
};
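As a concrete sketch of such a wrapper, here is a hypothetical RedisService, with a hand-rolled fake client standing in for sinon; the service and method names are illustrative, not an existing API:

```javascript
// Illustrative wrapper: grouping redisClient calls behind a small service.
function RedisService(client) {
  this.client = client;
}
RedisService.prototype.publish = function(channel, payload) {
  return new Promise((resolve, reject) => {
    this.client.publish(channel, JSON.stringify(payload), (error, count) => {
      if (error) return reject(error);
      resolve(count);
    });
  });
};

// In a test, the whole client is a hand-rolled fake: no redis server needed.
const published = [];
const fakeClient = {
  publish(channel, message, cb) {
    published.push({ channel: channel, message: message });
    cb(null, 1);
  }
};
new RedisService(fakeClient).publish('SYSTEM_EVENT', { id: 1 });
```

Because the fake client is injected, assertions can run against `published` without any network I/O.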

Alternatively, using an in-memory database can alleviate the task of mocking the whole database. The other, more viable way to go is to restructure the application and add a service layer. The service layer makes it possible to test all these features in isolation.

Conclusion

Automated testing of any JavaScript project is quite intimidating for newbies and veterans alike. In this article, we reviewed how testing tends to be more of an art than a science. We also stressed the fact that, as in any art, practice makes perfect ~ testing controllers, just like testing routers, can be challenging, especially when interaction with external systems is involved. There are additional complementary materials in the “Testing nodejs applications” book.


#snippets #code #annotations #question #discuss

The asynchronous computation model makes nodejs flexible for performing heavy computations while keeping a relatively low memory footprint. The stream API is one of those computation models; this article explores how to approach testing it.

In this article we will talk about:

  • Difference between Readable/Writable and Duplex streams
  • Testing Writable stream
  • Testing Readable stream
  • Testing Duplex or Transformer streams


Show me the code

//Read + Transform + Write stream processing example
var fs = require('fs'),
    gzip = require('zlib').createGzip(),
    route = require('express').Router(); 
//getter() reads a large file of songs metadata, transforms it and sends back scaled down metadata 
route.get('/songs', function getter(req, res, next){
    let rstream = fs.createReadStream('./several-tb-of-songs.json'); 
    rstream
        .pipe(new MetadataStreamTransformer())
        .pipe(gzip)
        .pipe(res);
    // forwarding the error to the next handler     
    rstream.on('error', error => next(error, null));
});

//Transformer Stream example
const inherits = require('util').inherits,
    Transform = require('stream').Transform;

function MetadataStreamTransformer(options){
    if(!(this instanceof MetadataStreamTransformer)){
        return new MetadataStreamTransformer(options);
    }
    // enforces object mode chunks
    this.options = Object.assign({}, options, {objectMode: true});
    Transform.call(this, this.options);
}

inherits(MetadataStreamTransformer, Transform);
MetadataStreamTransformer.prototype._transform = function(chunk, encoding, next){
    //minimalistic implementation 
    //@todo  process chunk + by adding/removing elements
    let data = JSON.parse(typeof chunk === 'string' ? chunk : chunk.toString('utf8'));
    this.push({id: (data || {}).id || Math.random() });//Math.random() as a placeholder id
    if(typeof next === 'function') next();
};

MetadataStreamTransformer.prototype._flush = function(next) {
    this.push(null);//tells that operation is over 
    if(typeof next === 'function') {next();}
};

The example above provides a clear picture of the context in which Readable, Writable, and Duplex(Transform) streams can be used.

What can possibly go wrong?

Streams are particularly hard to test because of their asynchronous nature, and streams doing I/O against the filesystem or third-party endpoints are no exception. It is easy to fall into the integration testing trap when testing nodejs streams.

Among other things, the following are challenges we may expect when (unit) testing streams:

  • Identifying areas where it makes sense to stub
  • Choosing the right mock object output to feed into stubs
  • Mocking stream read/transform/write operations

There is an article dedicated to stubbing stream functions, so the current text will not go into detail about the stubbing parts.

Choosing tools

If you haven't already, reading “How to choose the right tools” blog post gives insights on a framework we used to choose the tools we suggest in this blog.

Following our own “Choosing the right tools” framework, we adopted the tools below. They are not the only possible choice, rather the ones that made sense to complete this article:

  • We can choose among a myriad of test runners, for instance jasmine(jasmine-node), ava or jest. mocha was appealing in the context of this write-up, but choosing any other test runner does not make this article obsolete.
  • The stack mocha, chai, and sinon (assertion and test doubles libraries) is worth a shot.
  • node-mocks-http framework for mocking HTTP Request/Response objects.
  • Code under test is instrumented to make test progress reporting possible. The test coverage reporting tool we adopted, also widely used by the mocha community, is istanbul.

Workflow

It is possible to generate reports as tests progress.

The latest versions of istanbul use the name nyc.

# In package.json at "test" - add next line
> "istanbul cover _mocha -- --color --reporter mocha-lcov-reporter specs"

# Then run the tests using 
$ npm test --coverage 

Show me the tests

If you haven't already, read the “How to write test cases developers will love”

We assume we are approaching the testing of a fairly large nodejs application from a real-world perspective, and with refactoring in mind. A good way to think about large scale is to focus on smaller things and how they integrate (expand) with the rest of the application.

The philosophy of test-driven development is to write failing tests, followed by code that resolves the failing use cases, then refactor, rinse and repeat. In most real-world settings, writing tests may start at any given moment, depending on multiple variables, one of which is the pressure and timeline of the project at hand.

It is not a new concept for some tests to be written after the fact (characterization tests). Another case is when dealing with legacy code, or a simply ill-tested code base. That is the case we are dealing with in our sample use case.

The first thing to do is read the code and identify areas of improvement before we start writing tests. The clear improvement opportunity is to eject the function getter() out of the router. Our new construct looks like the following: route.get('/songs', getter); which allows us to test getter() in isolation.
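The ejection itself is a small mechanical change; the sketch below uses a placeholder handler body and assumed file names just to show the shape:

```javascript
// songs/get-songs.js — the handler exported on its own;
// the body here is a placeholder for the stream processing shown earlier.
const getter = function getter(req, res, next) {
  res.end('ok');
};
module.exports = getter;

// songs/router.js would now wire it up with:
//   route.get('/songs', require('./get-songs'));

// A test can call the handler directly, no router or server needed:
let body;
const resFake = { end(chunk) { body = chunk; } };
getter({ params: {} }, resFake, () => {});
```

With the handler in its own module, the router file stays a thin wiring layer.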

Our skeleton looks a bit like the following lines.

describe('getter()', () => {
  let req, res, next, json;
  beforeEach(() => {
    next = sinon.spy();
    json = sinon.spy();
    sessionObject = { ... };//mocking session object
    req = { params: {id: 1234}, user: sessionObject };
    res = { status: sinon.stub().returns({ json }) };
  });
    //...
});

Let's examine the case where the stream is actually going to fail.

Note that we lack a way to get the handle on the stream object, as the handler does not return any object to tap into. Luckily, the response and request objects are both instances of streams. So a good mocking can come to our rescue.


//...
let eventEmitter = require('events').EventEmitter,
  httpMock = require('node-mocks-http');

//...
it('fails when no songs are found', function(done){
    var self = this; 
    this.next = sinon.spy();
    this.req = httpMock.createRequest({method, url, body});
    this.res = httpMock.createResponse({eventEmitter: eventEmitter});
    
    getter(this.req, this.res, this.next);
    this.res.on('error', function(error){
        assert(self.next.called, 'next() has been called');
        done();//the error is expected here: done(error) would fail the test
    });
});

Mocking both request and response objects makes more sense in our context. Likewise, to mock the success response cases, the reader stream's fs.createReadStream() has to be stubbed to eject a stream of fake content. This time, this.res.on('end') will be used to make assertions.

Conclusion

Automated testing of streams is quite intimidating for newbies and veterans alike. There are enough use cases in the book to get you past that mark.

In this article, we reviewed how testing tends to be more of an art than a science. We also stressed the fact that, as in any art, practice makes perfect ~ testing streams is particularly challenging, especially when a read/write is involved. There are additional complementary materials in the “Testing nodejs applications” book.


#snippets #tdd #streams #nodejs #mocking

The depth of an HTTP request or response mock brings a level of complexity to the whole system. In this article, we revisit some techniques used to mock HTTP request/response when used in the same test case.

In this article we will talk about:

  • Mocking Request Objects
  • Mocking Response Objects
  • Mocking Request and Response object in the same test case.
  • When does it make sense to mock both Request and Response.


Show me the code

module.exports.getUsers = function getUsers(req, res, next){
  UserModel.find(req.params, function(error, users){
    if(error) return next(error, null);
    return res.status(200).json(users);
  });
};

Example: in controller/get-users.js

What can possibly go wrong?

When trying to figure out how to approach mocking request and response objects, the following points may be a challenge:

  • Stubbing the right request/response methods
  • Mock output that can be consumed by the other callers
  • Stubbing request/response handlers in the same test case
  • Strategic mocking that can make a live server obsolete

How to mock Request/Response Objects the easy way

Testing an expressjs middleware provides a good use case where mocking a request and response in the same test case makes sense.

Key objectives are:

  • Spying if certain calls have been called
  • Make sure the requests don't leave the local machine.
var sinon = require('sinon'),
    chai = require('chai'),
    expect = chai.expect,
    getUsers = require('./controller').getUsers;

describe("getUsers()", function() {
  it("should guarantee a response", function() {
    var json = sinon.spy(),
      req  = {}, 
      res  = { status: sinon.stub().returns({ json: json }) }, 
      next = sinon.spy();
    getUsers(req, res, next);
    expect(json.calledOnce).to.equal(true);
  });     
});

code excerpt adapted from – Unit Testing Controllers the Easy Way in Express 4

A particular case: how to mock a response that uses streaming, or other hard-to-mock interfaces. The key: leave the flow intact, but fake the read/write data instead.

Mocking request

The request object provided by node-mocks-http is pretty similar to the request provided by the native http module in the nodejs library.

var request;
//When method = GET|DELETE
request = httpMock.createRequest({method: method, url: url});

//When method = PUT|POST
var request = httpMock.createRequest({method, url, body: body})

Mocking Response

//initialization(or beforeEach)
var response = httpMock.createResponse({
    eventEmitter: require('events').EventEmitter
});

//Usage: somewhere in tests
let next = sinon.spy();
getUsers(request, response, next);
response.on('end|data|error', function(error){
  //write assertions in this closure.
});

Using node-mocks-http falls in the gray area between unit and integration testing. However, this technique can be valuable in use cases where the first strategy falls short.

There is more on integration testing mocking strategy: How to Mock HTTP Request and Response ~ Integration testing use case

Conclusion

In this article, we revisited strategies to mock HTTP Request and Response methods in the same test case, while using mock data to emulate interaction with remote systems. We also reiterated the difference between stubbing and mocking, and how spies (fakes) fall into the testing big picture. There are additional complementary materials in the “Testing nodejs applications” book on this very same subject.


#snippets #http #request #response #mocking #stubbing

Mocking HTTP requests and, for that matter, responses, is essential in most unit test scenarios. Depending on the depth we want the mock to kick in, this task can become quite a feat on its own. In this article, we revisit some techniques that can make our life easy when mocking requests in integration testing scenarios.

This article is a followup to How to Mock HTTP Request and Response

In this article we will talk about:

  • Stubbing HTTP Request Objects
  • Mocking Request and Response object in the same test case.
  • When does it make sense to mock both Request and Response.


Show me the code

//in users/get-user.js
var User = require('./models').User; 
module.exports = function getUser(req, res, next){
  User.findById(req.params.id, function(error, user){
    if(error) return next(error);
    return res.status(200).json(user);
  });
};

//Router with the Authentication Middleware
var router = require('express').Router();
var authenticated = require('./middleware/authenticated');
var getUser = require('./users/get-user');
router.get('/users/:id', authenticated, getUser);
module.exports = router;

Example: a controller and the router wiring it behind middleware.

What can possibly go wrong?

Some challenges associated with stubbing HTTP requests:

  • How deep a stub should go
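How deep depends on what the test exercises. A shallow stub of the authenticated middleware, for instance, lets requests through without touching the real auth logic; the session shape below is an assumption for illustration:

```javascript
// Shallow stub: replaces the real authenticated middleware entirely,
// injecting a fake user instead of verifying credentials.
function authenticatedStub(req, res, next) {
  req.user = { _id: 'abc123', role: 'tester' };
  next();
}

// A deeper stub would keep the real middleware and fake only the token
// verification it depends on; the shallow version is enough when the
// route under test only needs req.user to be present.
const req = {};
authenticatedStub(req, {}, () => {});
```

Mounting authenticatedStub in place of authenticated keeps route tests focused on the handler, not the auth flow.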

Show me the tests

The next section has the following traits baked in:

  • When to use: Testing all routes at once
  • When to use: Asserting on nature of the response output
  • When not to use: When running unit testing
  • When to use: When running integration tests
// Add promise support if this does not exist natively.
if (!global.Promise) {
    global.Promise = require('q');//or any other promise library 
}

var chai = require('chai'),
  chaiHttp = require('chai-http');
chai.use(chaiHttp); //registering the plugin

//initialization of app: can be express or any other HTTP compatible server.
var app = require('express')();
//mounting the routes to be tested
require('./lib/routes')(app);
//use an agent to retain cookies instead 
var agent = chai.request.agent(app);
//agent.post()|agent.get()|agent.del()|agent.put()
it('works', function(done){
    chai.request(app)
    .put('/user/me')//.post|get|delete
    .send({ password: '123', confirm: '123' })
    .end(function (err, res) {
        expect(err).to.be.null;
        expect(res).to.have.status(200);
        //more possible assertions 
        expect(res).to.have.header('x-api-key');
        expect(res).to.have.headers;//Assert that the response has headers.
        expect(res).to.be.json;//.html|.text 
        expect(res).to.redirect;//.to.not.redirect
        expect(res).to.have.param('orderby');//test sent parameters
        expect(res).to.have.param('orderby', 'date');//test sent parameter values 
        expect(res).to.have.cookie('session_id');//test cookie parameters
        done();
    });
});

//keeping port open 
var requester = chai.request(app).keepOpen();
it('works - parallel requests', function(){
    Promise.all([requester.get('/a'), requester.get('/b')])
    .then(responses => { /**do - more assertions here */})
    .then(() => requester.close());
});

This strategy has not been tested on routes that read/write streams.

To the question “when does it make sense to mock both Request and Response”, the answer is: it depends. In the event where we are interested in replicating interactions with a third-party system via requests/responses, then it makes sense to mock both.

Conclusion

In this article, we established the difference between Mocking versus Stubbing HTTP requests.

We also established the cost associated with HTTP request every time a test is executed.

With this knowledge, we reviewed ways to reduce costs by strategically stubbing HTTP read/write operations to make tests fail fast, without losing test effectiveness. There are additional complementary materials in the “Testing nodejs applications” book.


#snippets #http #request #response #mocking #stubbing