API Testing in Node.js — using Chai, Mocha and GitHub Actions

Lakshyajit Laxmikant
6 min read · May 6, 2022

In this article I will discuss how to set up a basic API testing architecture for a Node.js application backed by Postgres.

Now there are different kinds of testing typically done in a large-scale production application, namely unit testing, end-to-end testing, integration testing etc. Here, to keep things simple, we'll look at an example of integration testing, since it involves connecting to our PostgreSQL DB as well as exercising the methods at the repository/service and middleware layers.

According to Google:

Unit testing involves testing individual pieces of code while integration testing involves testing modules of code to understand how they perform alone and how they interact with each other.

So as the above definition suggests, integration testing helps us understand how all the methods written in our application work together and interact with each other across various flows.

Now before we go any further, let me break down what our Node.js application consists of.

  1. The application consists of a simple CRUD API for a “Task” schema and uses Postgres running on Docker as the database.
  2. We’ll be using “sequelize” as the ORM to connect to Postgres from our application. (Doc reference: https://sequelize.org/api/v6/identifiers.html)
  3. We’ll be using Chai and Mocha to test our endpoints. Both are standard libraries used for API testing in Node.js.
  4. We will also set up a GitHub Actions pipeline to trigger our tests when code is pushed to certain branches. This helps ensure code quality throughout the development process, as we are notified immediately if the pipeline fails.

Note: To keep this article short I will only discuss a few of the many test cases. The entire code for the application can be found in the GitHub repo here: https://github.com/lakshyajit165/node_postgres_cicd. I suggest cloning the repo to get a full picture of how the code is written. It’s pretty simple if you’re even a little familiar with Node.js. I have made sure the code is well documented and formatted so that readers don’t have too much trouble browsing it.

I suggest readers spin up a Postgres Docker container by running the following command in the terminal:

docker run -d -p 5432:5432 --name node-postgres -e POSTGRES_PASSWORD=postgres -e POSTGRES_DB=node_postgres_cicd -e POSTGRES_USER=postgres postgres

I have included the .env file format and details in the README of the project in the GitHub repo. Now, before moving ahead with the testing functionality, here’s what our schema looks like, as defined in Node.js using sequelize. The good thing is that we don’t need to exec into our running Postgres Docker container and create the tables ourselves. The ORM will do that for us according to the schema we’ve defined in our application.

We’ve imported this schema along with all the db config variables in the index.js file inside the “models” folder as follows.

Now let’s discuss the “Create Task”(POST) endpoint.

This is what the code for this endpoint looks like at the controller/service and repository layers.

At the Controller Layer:

At the service layer:

At the repository layer:

As you can see, it’s a simple API with no complex setup/schemas. Now, in order to do the integration testing for this “create-task” functionality, we’ll create a “test” folder in the project root and inside it a file called “app.test.js”. The folder structure looks like this -

Folder structure for writing tests
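In text form, the layout is roughly the following (individual file names other than those mentioned in this article are assumptions):

```
node_postgres_cicd/
├── app.js
├── models/
│   ├── index.js
│   └── task.model.js
├── test/
│   └── app.test.js
├── .github/
│   └── workflows/
│       └── test.yml
└── package.json
```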

So to get started with the testing we’ll make the necessary imports and declare some variables as follows:

As you can see, the imports include setting the environment, which in our case is set to “test” because we are doing integration testing. We’ve also imported the app.js file along with our model schema and ORM, which will be necessary for making some DB calls. For the “create-task” functionality I have considered 3 test cases — 2 failure cases and 1 success case. As mentioned previously, I’ve included a simple middleware to do some sanity checks on the request payload as the request hits our application. Based on that, we can test failure cases such as: what should the response look like if some attribute(s) are missing from the request payload? The success case creates a task successfully and tests the corresponding response. For the create-task functionality the request payload consists of 3 things — “title”, “description” and “created_by”. Here is the code for the test cases related to this flow:

We are using the “describe” function to encapsulate a set of tests for a particular flow. In real production environments, where an entire microservice might have hundreds of test cases, we can (and in fact should) give a more meaningful description in this function. Here I’ve only written “POST task” to keep things simple and clear. Under this we have the 3 cases described above. The first case is missing the “title” field from the request payload. Notice how the assertions read like meaningful English sentences/phrases, as we’ve written res.body.should.be.a(“object”) and res.body.should.have.property(“errorMessage”). Similarly, in the second test the entire request payload is an empty object, and the 3rd case is the success case where we create a task in the DB.

Notice that after the 3 cases I have included an async delete function. This is part of the clean-up process. Consider this: even though we’re testing in our test environment, where the DB is different from the one used in production, we don’t want the DB to get bloated as the test functions increase over time. So it’s good practice to delete a resource once we’ve tested all the required functionality. Here I have kept this function blank because I’ve included a separate test for the “delete” functionality in the same file below.

So this was the test for the “create-task” functionality. Similarly, I have included tests (both failure and success cases) for the other functionalities like GET, PUT and DELETE as well. I encourage readers to check those out to get an idea of how the entire test suite works. Once all our tests are written, we modify the test script in our package.json file, which lets us run all the tests together with a simple “npm test” command in the terminal.

The script in the package.json file is modified as follows:

mocha --timeout 10000 --exit
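In context, the scripts section of package.json would look something like this. The --timeout 10000 flag raises mocha’s default per-test timeout of 2 seconds to 10 seconds (useful when tests hit a real database), and --exit forces mocha to shut the process down once the run finishes, even if the app keeps DB connections or server handles open:

```json
{
  "scripts": {
    "test": "mocha --timeout 10000 --exit"
  }
}
```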

Now when we run “npm test” in our terminal we should get a result of something like this -

We see that all our tests are passing. Now the only thing left is to define a .yml file and push our code to a GitHub repository to leverage the power of GitHub Actions. But before we push the .yml file to the repo, we have to define some “secrets” in our repository as follows (these are basically the variables declared in the .env file):

Once this is done, we can go ahead and configure our .yml file. It’s present inside the .github/workflows folder and for our case, I have configured it as follows:
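The file in the repo is authoritative; as a rough sketch, a workflow for this kind of setup usually looks like the following. The branch names, Node version and secret names here are assumptions (the secrets referenced are the ones defined in the previous step), and the service container mirrors the local docker run command from earlier:

```yaml
name: run-tests

on:
  push:
    branches: [master]
  pull_request:
    branches: [master]

jobs:
  test:
    runs-on: ubuntu-latest
    services:
      # Throwaway Postgres for the integration tests
      postgres:
        image: postgres
        env:
          POSTGRES_USER: ${{ secrets.DB_USER }}
          POSTGRES_PASSWORD: ${{ secrets.DB_PASSWORD }}
          POSTGRES_DB: ${{ secrets.DB_NAME }}
        ports:
          - 5432:5432
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: 16
      - run: npm ci
      - run: npm test
        env:
          DB_HOST: localhost
          DB_USER: ${{ secrets.DB_USER }}
          DB_PASSWORD: ${{ secrets.DB_PASSWORD }}
          DB_NAME: ${{ secrets.DB_NAME }}
```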

The only thing to note here is the “on” clause. You can define how you want to set up this clause: on which branch, on a pull request, etc. We can also add branch protection rules alongside this so that, for example, a PR from another branch can be merged to master only when the testing job succeeds and the changes are approved by a reviewer. Once this file is defined and pushed to GitHub, the job will be triggered according to the rules defined in it. For my case it looked something like this -

And that’s pretty much it! We’ve set up a testing architecture within a short span of time — no doubt GitHub Actions is one of the best friends a developer can get! 😎

Feel free to reach out in case of any issues, or comment on this article and I will get back to you.
