Writing Scalable Architecture for Node.js
Node.js makes it easy to quickly build web applications, thanks to its rich ecosystem of packages and frameworks like Express.js. However, as an application grows in size and complexity, it's critical to design the architecture in a way that will scale.
Many Node.js applications start out with a flat file structure and simple routes. But as new features and APIs are added, this approach can lead to a tangled mess of code that is difficult to reason about and maintain. Requests are funneled through a single large file, making it hard to find the logic for specific routes.
Luckily, there are a number of best practices and architectural patterns we can apply to keep even large Node.js codebases modular, organized and scalable. Let's dive into some of the key principles.
Principles of Scalable Node.js Architecture
There are a few overarching principles to keep in mind when structuring a Node.js application for scale:
Separate concerns. Each part of your codebase should have a single, well-defined responsibility. Avoid duplication and extract shared logic into reusable modules. Keep route handlers lean by moving complex business logic into separate service modules.
Keep your codebase predictable. Follow established conventions for file and directory naming, code style, and application structure. Use a consistent, repeatable approach for common tasks like authentication and database queries.
Don't overthink it. Start with a simple structure and only add complexity as your application demands it. Introduce architectural patterns incrementally as your codebase grows.
With those guiding principles in mind, let's look at a typical Express application structure that sets us up for scalability.
Structuring an Express Application
Here's an example of what an Express application structure might look like for a large application:
config/
  index.js
  db.js
server/
  controllers/
    dashboard.js
    login.js
    register.js
  models/
    User.js
  routes/
    index.js
    v1/
      index.js
  services/
    authentication/
      login.js
      register.js
    dashboard/
      dashboard.js
  index.js
Let's break this down piece by piece.
Config
At the top level, we have a config directory for configuration settings that may vary between environments, like the database connection. Extracting these into a separate directory makes it easy to manage different settings for development, staging and production.
A typical config/index.js file might look something like this:
const _ = require('lodash');

const env = process.env.NODE_ENV || 'development';
const envConfig = require('./' + env);

const defaultConfig = {
  env,
};

module.exports = _.merge(defaultConfig, envConfig);
This lets us define specific settings for each environment in a dedicated file, like config/development.js:
module.exports = {
  db: 'mongodb://localhost:27017/myapp',
  jwtSecret: 'topsecret',
  port: 3000,
};
And then merge them together with any default settings that apply in all environments.
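To make the merge step concrete, here is a dependency-free sketch of what lodash's _.merge does with these objects. A small recursive helper stands in for lodash so the example runs on its own, and the config values are illustrative:

```javascript
// Minimal stand-in for lodash's _.merge: recursively copies keys from
// source into target, so env-specific settings override the defaults
// while unrelated defaults survive.
function merge(target, source) {
  for (const key of Object.keys(source)) {
    if (
      typeof target[key] === 'object' && target[key] !== null &&
      typeof source[key] === 'object' && source[key] !== null
    ) {
      merge(target[key], source[key]);
    } else {
      target[key] = source[key];
    }
  }
  return target;
}

// Illustrative values, not from a real config.
const defaultConfig = { env: 'development', logging: { level: 'info' } };
const envConfig = { port: 3000, logging: { level: 'debug' } };

const config = merge(defaultConfig, envConfig);
console.log(config);
// { env: 'development', logging: { level: 'debug' }, port: 3000 }
```

The key property is that the merge is deep: envConfig only overrides logging.level, without wiping out the rest of the logging object.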
Server
The server directory contains the core pieces of our Express application. This is where we define our models, controllers, routes and services.
Let's look at each part in more detail.
Models
The models directory contains our Mongoose models, which define the schema and behavior for interacting with MongoDB collections.
A typical model looks like this:
const mongoose = require('mongoose');

const UserSchema = new mongoose.Schema({
  email: {
    type: String,
    required: true,
    unique: true,
    lowercase: true,
    trim: true,
  },
  passwordHash: {
    type: String,
    required: true,
  },
});

module.exports = mongoose.model('User', UserSchema);
Services
The services directory contains modules that encapsulate business logic and data access. This is where we define complex processes like user registration, authentication, and any database queries or updates.
Keeping this logic in dedicated services, rather than in our route handlers, helps keep our routes thin and focused on their primary responsibility: handling HTTP requests and responses.
Here's what a typical service module looks like:
const bcrypt = require('bcrypt');
const User = require('../../models/User');

async function register(email, password) {
  const user = await User.findOne({ email });
  if (user) {
    throw new Error('Email already registered');
  }

  const passwordHash = await bcrypt.hash(password, 10);
  const newUser = new User({
    email,
    passwordHash,
  });

  return newUser.save();
}

module.exports = {
  register,
};
Controllers
The controllers directory contains modules that define the request handlers for our routes. They take in requests from the routes, invoke the appropriate services to fulfill the request, and send the response back to the client.
Here's an example of a basic controller:
const registerService = require('../services/authentication/register');

async function register(req, res) {
  const { email, password } = req.body;

  try {
    const user = await registerService.register(email, password);
    res.json(user);
  } catch (err) {
    res.status(400).json({ error: err.message });
  }
}

module.exports = {
  register,
};
Routes
Finally, the routes directory contains the route definitions for our API. It's common to break these up according to the API version. Each version has an index.js file that mounts the routes for that particular version.
Here's what a routes/v1/index.js file might look like:
const express = require('express');
const router = express.Router();

const loginController = require('../../controllers/login');
const registerController = require('../../controllers/register');
const dashboardController = require('../../controllers/dashboard');

router.post('/register', registerController.register);
router.post('/login', loginController.login);
router.get('/dashboard', dashboardController.getDashboard);

module.exports = router;
And here's how we might mount these routes in server/index.js:
const express = require('express');
const config = require('../config');
const v1Router = require('./routes/v1');

const app = express();
app.use('/api/v1', v1Router);
app.listen(config.port);
Authentication with Passport.js and JWT
For any application with user accounts, robust authentication and authorization is a key concern.
The most common approach with Node.js APIs is to use JSON Web Tokens (JWTs). When a user logs in with their credentials, the server generates a cryptographically-signed token which is returned to the client. The client includes this token in the headers of subsequent requests to authenticate themselves.
To generate and verify these tokens in Express, we can use the jsonwebtoken package. And to handle the actual authentication logic and middleware, we can use Passport.js.
Here's what the authentication flow might look like in our API:
- The client sends a POST request to /api/v1/login with an email and password in the request body.
- The /login route handler invokes the authService.login method, passing in the email and password.
- The service looks up the user by email and verifies the password is correct using bcrypt to compare the hashes.
- If the password is invalid, the service throws an error. Otherwise, it generates a JWT using jsonwebtoken and returns it to the controller.
- The controller sends the JWT back to the client in the response.
- On subsequent requests, the client includes the JWT in the Authorization header.
- Protected routes use the passport-jwt strategy to validate the token and attach the authenticated user to the Express request object.
Here's what a login route handler might look like:
const jwt = require('jsonwebtoken');
const passport = require('passport');
const jwtSecret = require('../config').jwtSecret;

function login(req, res, next) {
  passport.authenticate('local', (err, user) => {
    if (err) return next(err);
    if (!user) return res.status(401).json({ message: 'Invalid credentials' });

    const token = jwt.sign({ sub: user.id }, jwtSecret);
    res.json({ token });
  })(req, res, next);
}

module.exports = {
  login,
};
And here's how we can protect routes with the passport-jwt strategy:
const passport = require('passport');
const JwtStrategy = require('passport-jwt').Strategy;
const ExtractJwt = require('passport-jwt').ExtractJwt;
const User = require('../models/User');
const jwtSecret = require('../config').jwtSecret;

const jwtOptions = {
  jwtFromRequest: ExtractJwt.fromAuthHeaderAsBearerToken(),
  secretOrKey: jwtSecret,
};

passport.use(new JwtStrategy(jwtOptions, (payload, done) => {
  User.findById(payload.sub)
    .then(user => done(null, user || false))
    .catch(err => done(err, false));
}));

// Protected route
router.get('/dashboard', passport.authenticate('jwt', { session: false }), dashboardController.getDashboard);
Testing and Deployment
With a well-structured codebase in place, testing and deployment become much easier.
We can write unit tests for our models, services and controller methods to ensure they behave as expected in isolation. Integration tests let us verify that all the pieces work together correctly. And end-to-end tests allow us to test the entire flow from HTTP request to response.
Popular testing tools in the Node.js ecosystem include:
- Jest for unit testing and mocking
- Mocha as a general-purpose test runner
- Chai for assertions
- Sinon for test spies, stubs and mocks
- Supertest for integration testing Express applications
Deploying a Node.js application is largely a matter of provisioning a production environment with the required dependencies (Node.js, MongoDB, etc.) and then pulling down the latest code from version control.
We can use a process manager like PM2 to run our application and automatically restart it if it crashes. And a reverse proxy like Nginx can handle tasks like load balancing, caching and SSL termination.
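As a sketch of the PM2 approach, a minimal ecosystem file might look like the following. The app name and script path here are illustrative, not taken from a real deployment:

```javascript
// ecosystem.config.js — a minimal PM2 config sketch.
module.exports = {
  apps: [
    {
      name: 'myapp',               // illustrative app name
      script: 'server/index.js',   // entry point from our structure
      instances: 'max',            // one process per CPU core
      exec_mode: 'cluster',        // cluster mode for load balancing
      env_production: {
        NODE_ENV: 'production',    // applied with --env production
      },
    },
  ],
};
```

With this in place, `pm2 start ecosystem.config.js --env production` launches the app across all cores and restarts any process that crashes.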
For a more hands-off approach, platforms like Heroku or AWS Elastic Beanstalk can automatically provision the environment, deploy the code, and scale the application as needed.
Conclusion
Building a scalable architecture for a Node.js application boils down to a few key principles: separating concerns, keeping the codebase predictable, and avoiding over-engineering.
By structuring our codebase with clear separation between routes, controllers, models and services, we set ourselves up to more easily maintain and extend our application over time.
Adding in authentication with Passport.js and JWT allows us to secure our API endpoints with a well-tested, battle-hardened library.
And taking a config-driven approach makes it simple to specify different settings for each environment.
There's no one-size-fits-all approach to scalable architecture. But by keeping our codebases modular and following these best practices, we can ensure our Node.js applications will be able to grow and succeed for the long haul.