Standalone applications

There are several ways of mounting a Nest application. You can create a web app, a microservice or just a bare Nest standalone application (without any network listeners). The Nest standalone application is a wrapper around the Nest IoC container, which holds all instantiated classes. We can obtain a reference to any existing instance from within any imported module directly using the standalone application object. Thus, you can take advantage of the Nest framework anywhere, including, for example, scripted CRON jobs. You can even build a CLI on top of it.

Getting started

To create a Nest standalone application, use the following construction:

@@filename()
async function bootstrap() {
  const app = await NestFactory.createApplicationContext(ApplicationModule);
  // application logic...
}
bootstrap();

The standalone application object allows you to obtain a reference to any instance registered within the Nest application. Let's imagine that we have a TasksService in the TasksModule. This class provides a set of methods that we want to call from within a CRON job.

@@filename()
const app = await NestFactory.createApplicationContext(ApplicationModule);
const tasksService = app.get(TasksService);

To access the TasksService instance we use the get() method. The get() method acts like a query that searches for an instance in each registered module. Alternatively, for strict context checking, pass an options object with the strict: true property. With this option in effect, you have to navigate through specific modules to obtain a particular instance from the selected context.

@@filename()
const app = await NestFactory.createApplicationContext(ApplicationModule);
const tasksService = app.select(TasksModule).get(TasksService, { strict: true });

Following is a summary of the methods available for retrieving instance references from the standalone application object.

  • get() - Retrieves an instance of a controller or provider (including guards, filters, and so on) available in the application context.
  • select() - Navigates through the modules graph to pull out a specific instance from the selected module (used together with strict mode as described above).

info Hint In non-strict mode, the root module is selected by default. To select any other module, you need to navigate the modules graph manually, step by step.
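"""

For instance, with strict mode enabled you reach a provider by chaining select() calls down the imports chain before calling get(). The module layout below (a TasksModule imported by a CatsModule) is purely illustrative; substitute the modules from your own graph:

const tasksService = app
  .select(CatsModule)
  .select(TasksModule)
  .get(TasksService, { strict: true });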

If you want the node application to close after the script finishes (e.g., for a script running CRON jobs), add await app.close() to the end of your bootstrap function:

@@filename()
async function bootstrap() {
  const app = await NestFactory.createApplicationContext(ApplicationModule);
  // application logic...
  await app.close();
}
bootstrap();

Providers

Providers are a fundamental concept in Nest. Many of the basic Nest classes may be treated as a provider – services, repositories, factories, helpers, and so on. The main idea of a provider is that it can inject dependencies; this means objects can create various relationships with each other, and the function of "wiring up" instances of objects can largely be delegated to the Nest runtime system. A provider is simply a class annotated with an @Injectable() decorator.

In the previous chapter, we built a simple CatsController. Controllers should handle HTTP requests and delegate more complex tasks to providers. Providers are plain JavaScript classes with an @Injectable() decorator preceding their class declaration.

info Hint Since Nest enables the possibility to design and organize dependencies in a more OO-way, we strongly recommend following the SOLID principles.

Services

Let's start by creating a simple CatsService. This service will be responsible for data storage and retrieval, and is designed to be used by the CatsController, so it's a good candidate to be defined as a provider. Thus, we decorate the class with @Injectable().

@@filename(cats.service)
import { Injectable } from '@nestjs/common';
import { Cat } from './interfaces/cat.interface';

@Injectable()
export class CatsService {
  private readonly cats: Cat[] = [];

  create(cat: Cat) {
    this.cats.push(cat);
  }

  findAll(): Cat[] {
    return this.cats;
  }
}
@@switch
import { Injectable } from '@nestjs/common';

@Injectable()
export class CatsService {
  constructor() {
    this.cats = [];
  }

  create(cat) {
    this.cats.push(cat);
  }

  findAll() {
    return this.cats;
  }
}

info Hint To create a service using the CLI, simply execute the $ nest g service cats command.

Our CatsService is a basic class with one property and two methods. The only new feature is that it uses the @Injectable() decorator. The @Injectable() decorator attaches metadata, which tells Nest that this class is a Nest provider. By the way, this example also uses a Cat interface, which probably looks something like this:

@@filename(interfaces/cat.interface)
export interface Cat {
  name: string;
  age: number;
  breed: string;
}

Now that we have a service class to retrieve cats, let's use it inside the CatsController:

@@filename(cats.controller)
import { Controller, Get, Post, Body } from '@nestjs/common';
import { CreateCatDto } from './dto/create-cat.dto';
import { CatsService } from './cats.service';
import { Cat } from './interfaces/cat.interface';

@Controller('cats')
export class CatsController {
  constructor(private catsService: CatsService) {}

  @Post()
  async create(@Body() createCatDto: CreateCatDto) {
    this.catsService.create(createCatDto);
  }

  @Get()
  async findAll(): Promise<Cat[]> {
    return this.catsService.findAll();
  }
}
@@switch
import { Controller, Get, Post, Body, Bind, Dependencies } from '@nestjs/common';
import { CatsService } from './cats.service';

@Controller('cats')
@Dependencies(CatsService)
export class CatsController {
  constructor(catsService) {
    this.catsService = catsService;
  }

  @Post()
  @Bind(Body())
  async create(createCatDto) {
    this.catsService.create(createCatDto);
  }

  @Get()
  async findAll() {
    return this.catsService.findAll();
  }
}

The CatsService is injected through the class constructor. Notice the use of the private syntax. This shorthand allows us to both declare and initialize the catsService member immediately in the same location.

Dependency injection

Nest is built around the strong design pattern commonly known as Dependency injection. We recommend reading a great article about this concept in the official Angular documentation.

In Nest, thanks to TypeScript capabilities, it's extremely easy to manage dependencies because they are resolved just by type. In the example below, Nest will resolve the catsService by creating and returning an instance of CatsService (or, in the normal case of a singleton, returning the existing instance if it has already been requested elsewhere). This dependency is resolved and passed to your controller's constructor (or assigned to the indicated property):

constructor(private catsService: CatsService) {}

Scopes

Providers normally have a lifetime ("scope") synchronized with the application lifecycle. When the application is bootstrapped, every dependency must be resolved, and therefore every provider has to be instantiated. Similarly, when the application shuts down, each provider will be destroyed. However, there are ways to make your provider lifetime request-scoped as well. You can read more about these techniques here.
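"""

As a quick preview of the syntax covered in that chapter, a provider can opt into request scope by passing an options object to @Injectable(). The RequestLoggerService name below is purely illustrative:

import { Injectable, Scope } from '@nestjs/common';

@Injectable({ scope: Scope.REQUEST })
export class RequestLoggerService {
  // A new instance of this provider is created for each incoming request,
  // instead of the default application-wide singleton.
}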

Custom providers

Nest has a built-in inversion of control ("IoC") container that resolves relationships between providers. This feature underlies the dependency injection feature described above, but is in fact far more powerful than what we've described so far. The @Injectable() decorator is only the tip of the iceberg, and is not the only way to define providers. In fact, you can use plain values, classes, and either asynchronous or synchronous factories. More examples are provided here.
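"""

To give a flavor of what such providers look like, the sketch below registers a plain value and a factory result under custom tokens; the token names ('APP_CONFIG', 'CONNECTION') and the shapes of the values are illustrative, not part of the Nest API:

import { Module } from '@nestjs/common';

@Module({
  providers: [
    // Value provider: binds an existing object to a token.
    { provide: 'APP_CONFIG', useValue: { retries: 3 } },
    // Factory provider: whatever the factory returns becomes the provider instance.
    {
      provide: 'CONNECTION',
      useFactory: () => ({ connected: true }),
    },
  ],
})
export class AppModule {}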

Optional providers

Occasionally, you might have dependencies which do not necessarily have to be resolved. For instance, your class may depend on a configuration object, but if none is passed, the default values should be used. In such a case, the dependency becomes optional, because lack of the configuration provider wouldn't lead to errors.

To indicate a provider is optional, use the @Optional() decorator in the constructor's signature.

import { Injectable, Optional, Inject } from '@nestjs/common';

@Injectable()
export class HttpService<T> {
  constructor(@Optional() @Inject('HTTP_OPTIONS') private httpClient: T) {}
}

Note that in the example above we are using a custom provider, which is the reason we include the HTTP_OPTIONS custom token. Previous examples showed constructor-based injection indicating a dependency through a class in the constructor. Read more about custom providers and their associated tokens here.
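"""

For context, an optional token like this is typically satisfied (when present) by a custom provider registered in some module. The sketch below shows one possible registration; the HttpModule name, the import path, and the options object are assumptions for illustration:

import { Module } from '@nestjs/common';
import { HttpService } from './http.service'; // assumed location of the HttpService above

@Module({
  providers: [
    HttpService,
    // If this provider is omitted, @Optional() @Inject('HTTP_OPTIONS') resolves to undefined.
    { provide: 'HTTP_OPTIONS', useValue: { baseUrl: 'http://localhost:3000' } },
  ],
})
export class HttpModule {}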

Property-based injection

The technique we've used so far is called constructor-based injection, as providers are injected via the constructor method. In some very specific cases, property-based injection might be useful. For instance, if your top-level class depends on either one or multiple providers, passing them all the way up by calling super() in sub-classes from the constructor can be very tedious. In order to avoid this, you can use the @Inject() decorator at the property level.

import { Injectable, Inject } from '@nestjs/common';

@Injectable()
export class HttpService<T> {
  @Inject('HTTP_OPTIONS')
  private readonly httpClient: T;
}

warning Warning If your class doesn't extend another provider, you should always prefer using constructor-based injection.

Provider registration

Now that we have defined a provider (CatsService), and we have a consumer of that service (CatsController), we need to register the service with Nest so that it can perform the injection. We do this by editing our module file (app.module.ts) and adding the service to the providers array of the @Module() decorator.

@@filename(app.module)
import { Module } from '@nestjs/common';
import { CatsController } from './cats/cats.controller';
import { CatsService } from './cats/cats.service';

@Module({
  controllers: [CatsController],
  providers: [CatsService],
})
export class AppModule {}

Nest will now be able to resolve the dependencies of the CatsController class.

This is how our directory structure should look now:

src
├── cats
│   ├── dto
│   │   └── create-cat.dto.ts
│   ├── interfaces
│   │   └── cat.interface.ts
│   ├── cats.service.ts
│   └── cats.controller.ts
├── app.module.ts
└── main.ts

Manual instantiation

Thus far, we've discussed how Nest automatically handles most of the details of resolving dependencies. In certain circumstances, you may need to step outside of the built-in Dependency Injection system and manually retrieve or instantiate providers. We briefly discuss two such topics below.

To get existing instances, or instantiate providers dynamically, you can use Module reference.

To get providers within the bootstrap() function (for example for standalone applications without controllers, or to utilize a configuration service during bootstrapping) see Standalone applications.
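"""

As a brief sketch of the first approach (covered fully in the linked chapter), ModuleRef can be injected like any other provider and used to look up instances at run-time. The CatsRunner class below is hypothetical, and the lookup assumes CatsService is registered in the current module:

import { Injectable } from '@nestjs/common';
import { ModuleRef } from '@nestjs/core';
import { CatsService } from './cats/cats.service';

@Injectable()
export class CatsRunner {
  constructor(private moduleRef: ModuleRef) {}

  run() {
    // Retrieves the existing CatsService instance from the current module.
    const catsService = this.moduleRef.get(CatsService);
    return catsService.findAll();
  }
}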

Controllers

Controllers are responsible for handling incoming requests and returning responses to the client.

A controller's purpose is to receive specific requests for the application. The **routing** mechanism controls which controller receives which requests. Frequently, each controller has more than one route, and different routes can perform different actions.

In order to create a basic controller, we use classes and decorators. Decorators associate classes with required metadata and enable Nest to create a routing map (tie requests to the corresponding controllers).

Routing

In the following example we'll use the @Controller() decorator, which is required to define a basic controller. We'll specify an optional route path prefix of cats. Using a path prefix in a @Controller() decorator allows us to easily group a set of related routes, and minimize repetitive code. For example, we may choose to group a set of routes that manage interactions with a customer entity under the route /customers. In that case, we could specify the path prefix customers in the @Controller() decorator so that we don't have to repeat that portion of the path for each route in the file.

@@filename(cats.controller)
import { Controller, Get } from '@nestjs/common';

@Controller('cats')
export class CatsController {
  @Get()
  findAll(): string {
    return 'This action returns all cats';
  }
}
@@switch
import { Controller, Get } from '@nestjs/common';

@Controller('cats')
export class CatsController {
  @Get()
  findAll() {
    return 'This action returns all cats';
  }
}

info Hint To create a controller using the CLI, simply execute the $ nest g controller cats command.

The @Get() HTTP request method decorator before the findAll() method tells Nest to create a handler for a specific endpoint for HTTP requests. The endpoint corresponds to the HTTP request method (GET in this case) and the route path. What is the route path? The route path for a handler is determined by concatenating the (optional) prefix declared for the controller, and any path specified in the request decorator. Since we've declared a prefix for every route (cats), and haven't added any path information in the decorator, Nest will map GET /cats requests to this handler. As mentioned, the path includes both the optional controller path prefix and any path string declared in the request method decorator. For example, a path prefix of customers combined with the decorator @Get('profile') would produce a route mapping for requests like GET /customers/profile.
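"""

To make that composition concrete, the hypothetical controller below pairs the customers prefix with a method-level path, yielding the GET /customers/profile route just described:

import { Controller, Get } from '@nestjs/common';

@Controller('customers')
export class CustomersController {
  // Route path = controller prefix ('customers') + handler path ('profile')
  // => handles GET /customers/profile
  @Get('profile')
  getProfile(): string {
    return 'This action returns the customer profile';
  }
}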

In our example above, when a GET request is made to this endpoint, Nest routes the request to our user-defined findAll() method. Note that the method name we choose here is completely arbitrary. We obviously must declare a method to bind the route to, but Nest doesn't attach any significance to the method name chosen.

This method will return a 200 status code and the associated response, which in this case is just a string. Why does that happen? To explain, we'll first introduce the concept that Nest employs two different options for manipulating responses:

  • Standard (recommended): Using this built-in method, when a request handler returns a JavaScript object or array, it will automatically be serialized to JSON. When it returns a JavaScript primitive type (e.g., string, number, boolean), however, Nest will send just the value without attempting to serialize it. This makes response handling simple: just return the value, and Nest takes care of the rest. Furthermore, the response's status code is always 200 by default, except for POST requests which use 201. We can easily change this behavior by adding the @HttpCode(...) decorator at the handler level (see Status codes).
  • Library-specific: We can use the library-specific (e.g., Express) response object, which can be injected using the @Res() decorator in the method handler signature (e.g., findAll(@Res() response)). With this approach, you have the ability (and the responsibility) to use the native response handling methods exposed by that object. For example, with Express, you can construct responses using code like response.status(200).send().

warning Warning You cannot use both approaches at the same time. Nest detects when the handler is using either @Res() or @Next(), indicating you have chosen the library-specific option. If both approaches are used at the same time, the Standard approach is automatically disabled for this single route and will no longer work as expected.

Request object

Handlers often need access to the client request details. Nest provides access to the request object of the underlying platform (Express by default). We can access the request object by instructing Nest to inject it by adding the @Req() decorator to the handler's signature.

@@filename(cats.controller)
import { Controller, Get, Req } from '@nestjs/common';
import { Request } from 'express';

@Controller('cats')
export class CatsController {
  @Get()
  findAll(@Req() request: Request): string {
    return 'This action returns all cats';
  }
}
@@switch
import { Controller, Bind, Get, Req } from '@nestjs/common';

@Controller('cats')
export class CatsController {
  @Get()
  @Bind(Req())
  findAll(request) {
    return 'This action returns all cats';
  }
}

info Hint In order to take advantage of express typings (as in the request: Request parameter example above), install the @types/express package.

The request object represents the HTTP request and has properties for the request query string, parameters, HTTP headers, and body (read more here). In most cases, it's not necessary to grab these properties manually. We can use dedicated decorators instead, such as @Body() or @Query(), which are available out of the box. Below is a list of the provided decorators and the plain platform-specific objects they represent.

  • @Request(), @Req() -> req
  • @Response(), @Res()* -> res
  • @Next() -> next
  • @Session() -> req.session
  • @Param(key?: string) -> req.params / req.params[key]
  • @Body(key?: string) -> req.body / req.body[key]
  • @Query(key?: string) -> req.query / req.query[key]
  • @Headers(name?: string) -> req.headers / req.headers[name]
  • @Ip() -> req.ip
  • @HostParam() -> req.hosts

* For compatibility with typings across underlying HTTP platforms (e.g., Express and Fastify), Nest provides @Res() and @Response() decorators. @Res() is simply an alias for @Response(). Both directly expose the underlying native platform response object interface. When using them, you should also import the typings for the underlying library (e.g., @types/express) to take full advantage. Note that when you inject either @Res() or @Response() in a method handler, you put Nest into Library-specific mode for that handler, and you become responsible for managing the response. When doing so, you must issue some kind of response by making a call on the response object (e.g., res.json(...) or res.send(...)), or the HTTP server will hang.

info Hint To learn how to create your own custom decorators, visit this chapter.

Resources

Earlier, we defined an endpoint to fetch the cats resource (GET route). We'll typically also want to provide an endpoint that creates new records. For this, let's create the POST handler:

@@filename(cats.controller)
import { Controller, Get, Post } from '@nestjs/common';

@Controller('cats')
export class CatsController {
  @Post()
  create(): string {
    return 'This action adds a new cat';
  }

  @Get()
  findAll(): string {
    return 'This action returns all cats';
  }
}
@@switch
import { Controller, Get, Post } from '@nestjs/common';

@Controller('cats')
export class CatsController {
  @Post()
  create() {
    return 'This action adds a new cat';
  }

  @Get()
  findAll() {
    return 'This action returns all cats';
  }
}

It's that simple. Nest provides decorators for all of the standard HTTP methods: @Get(), @Post(), @Put(), @Delete(), @Patch(), @Options(), and @Head(). In addition, @All() defines an endpoint that handles all of them.

Route wildcards

Pattern-based routes are supported as well. For instance, the asterisk is used as a wildcard, and will match any combination of characters.

@Get('ab*cd')
findAll() {
  return 'This route uses a wildcard';
}

The 'ab*cd' route path will match abcd, ab_cd, abecd, and so on. The characters ?, +, *, and () may be used in a route path, and are subsets of their regular expression counterparts. The hyphen (-) and the dot (.) are interpreted literally by string-based paths.

Status code

As mentioned, the response status code is always 200 by default, except for POST requests which are 201. We can easily change this behavior by adding the @HttpCode(...) decorator at a handler level.

@Post()
@HttpCode(204)
create() {
  return 'This action adds a new cat';
}

info Hint Import HttpCode from the @nestjs/common package.

Often, your status code isn't static but depends on various factors. In that case, you can use a library-specific response (inject using @Res()) object (or, in case of an error, throw an exception).
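"""

As a sketch of that approach (inside a controller, with Express as the underlying platform and the CreateCatDto used elsewhere in these notes), the handler below chooses the status code at run-time via the injected response object. The condition is illustrative only, and remember that using @Res() switches this route to the library-specific mode described earlier:

@Post()
create(@Res() res: Response, @Body() body: CreateCatDto) {
  // Illustrative condition: report 201 when a name is supplied, 400 otherwise.
  const status = body.name ? HttpStatus.CREATED : HttpStatus.BAD_REQUEST;
  res.status(status).send();
}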

Headers

To specify a custom response header, you can either use a @Header() decorator or a library-specific response object (and call res.header() directly).

@Post()
@Header('Cache-Control', 'none')
create() {
  return 'This action adds a new cat';
}

info Hint Import Header from the @nestjs/common package.

Redirection

To redirect a response to a specific URL, you can either use a @Redirect() decorator or a library-specific response object (and call res.redirect() directly).

@Redirect() takes a required url argument, and an optional statusCode argument. The statusCode defaults to 302 (Found) if omitted.

@Get()
@Redirect('https://nestjs.com', 301)

Sometimes you may want to determine the HTTP status code or the redirect URL dynamically. Do this by returning an object from the route handler method with the shape:

{
  "url": string,
  "statusCode": number
}

Returned values will override any arguments passed to the @Redirect() decorator. For example:

@Get('docs')
@Redirect('https://docs.nestjs.com', 302)
getDocs(@Query('version') version) {
  if (version && version === '5') {
    return { url: 'https://docs.nestjs.com/v5/' };
  }
}

Route parameters

Routes with static paths won't work when you need to accept dynamic data as part of the request (e.g., GET /cats/1 to get cat with id 1). In order to define routes with parameters, we can add route parameter tokens in the path of the route to capture the dynamic value at that position in the request URL. The route parameter token in the @Get() decorator example below demonstrates this usage. Route parameters declared in this way can be accessed using the @Param() decorator, which should be added to the method signature.

@@filename()
@Get(':id')
findOne(@Param() params): string {
  console.log(params.id);
  return `This action returns a #${params.id} cat`;
}
@@switch
@Get(':id')
@Bind(Param())
findOne(params) {
  console.log(params.id);
  return `This action returns a #${params.id} cat`;
}

@Param() is used to decorate a method parameter (params in the example above), and makes the route parameters available as properties of that decorated method parameter inside the body of the method. As seen in the code above, we can access the id parameter by referencing params.id. You can also pass in a particular parameter token to the decorator, and then reference the route parameter directly by name in the method body.

info Hint Import Param from the @nestjs/common package.

@@filename()
@Get(':id')
findOne(@Param('id') id): string {
  return `This action returns a #${id} cat`;
}
@@switch
@Get(':id')
@Bind(Param('id'))
findOne(id) {
  return `This action returns a #${id} cat`;
}

Sub-Domain Routing

The @Controller() decorator can take a host option to require that the HTTP host of the incoming requests matches some specific value.

@Controller({ host: 'admin.example.com' })
export class AdminController {
  @Get()
  index(): string {
    return 'Admin page';
  }
}

Warning Since Fastify lacks support for nested routers, when using sub-domain routing, the (default) Express adapter should be used instead.

Similar to a route path, the host option can use tokens to capture the dynamic value at that position in the host name. The host parameter token in the @Controller() decorator example below demonstrates this usage. Host parameters declared in this way can be accessed using the @HostParam() decorator, which should be added to the method signature.

@Controller({ host: ':account.example.com' })
export class AccountController {
  @Get()
  getInfo(@HostParam('account') account: string) {
    return account;
  }
}

Scopes

For people coming from different programming language backgrounds, it might be unexpected to learn that in Nest, almost everything is shared across incoming requests. We have a connection pool to the database, singleton services with global state, etc. Remember that Node.js doesn't follow the request/response Multi-Threaded Stateless Model in which every request is processed by a separate thread. Hence, using singleton instances is fully safe for our applications.

However, there are edge-cases when request-based lifetime of the controller may be the desired behavior, for instance per-request caching in GraphQL applications, request tracking or multi-tenancy. Learn how to control scopes here.
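"""

As a small sketch of what opting out of the singleton default looks like, a controller can declare request scope through the options object accepted by @Controller() (details in the injection scopes chapter):

import { Controller, Scope } from '@nestjs/common';

@Controller({ path: 'cats', scope: Scope.REQUEST })
export class CatsController {
  // A new controller instance is created for each incoming request.
}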

Asynchronicity

We love modern JavaScript and we know that data extraction is mostly asynchronous. That's why Nest supports and works well with async functions.

info Hint Learn more about the async / await feature here.

Every async function has to return a Promise. This means that you can return a deferred value that Nest will be able to resolve by itself. Let's see an example of this:

@@filename(cats.controller)
@Get()
async findAll(): Promise<any[]> {
  return [];
}
@@switch
@Get()
async findAll() {
  return [];
}

The above code is fully valid. Furthermore, Nest route handlers are even more powerful by being able to return RxJS observable streams. Nest will automatically subscribe to the source underneath and take the last emitted value (once the stream is completed).

@@filename(cats.controller)
@Get()
findAll(): Observable<any[]> {
  return of([]);
}
@@switch
@Get()
findAll() {
  return of([]);
}

Both of the above approaches work and you can use whatever fits your requirements.

Request payloads

Our previous example of the POST route handler didn't accept any client params. Let's fix this by adding the @Body() decorator here.

But first (if you use TypeScript), we need to determine the DTO (Data Transfer Object) schema. A DTO is an object that defines how the data will be sent over the network. We could determine the DTO schema by using TypeScript interfaces, or by using simple classes. Interestingly, we recommend using classes here. Why? Classes are part of the JavaScript ES6 standard, and therefore they are preserved as real entities in the compiled JavaScript. On the other hand, since TypeScript interfaces are removed during the transpilation, Nest can't refer to them at runtime. This is important because features such as Pipes enable additional possibilities when they have access to the metatype of the variable at runtime.

Let's create the CreateCatDto class:

@@filename(create-cat.dto)
export class CreateCatDto {
  name: string;
  age: number;
  breed: string;
}

It has only three basic properties. Thereafter we can use the newly created DTO inside the CatsController:

@@filename(cats.controller)
@Post()
async create(@Body() createCatDto: CreateCatDto) {
  return 'This action adds a new cat';
}
@@switch
@Post()
@Bind(Body())
async create(createCatDto) {
  return 'This action adds a new cat';
}

Handling errors

There's a separate chapter about handling errors (i.e., working with exceptions) here.

Full resource sample

Below is an example that makes use of several of the available decorators to create a basic controller. This controller exposes a couple of methods to access and manipulate internal data.

@@filename(cats.controller)
import { Controller, Get, Query, Post, Body, Put, Param, Delete } from '@nestjs/common';
import { CreateCatDto, UpdateCatDto, ListAllEntities } from './dto';

@Controller('cats')
export class CatsController {
  @Post()
  create(@Body() createCatDto: CreateCatDto) {
    return 'This action adds a new cat';
  }

  @Get()
  findAll(@Query() query: ListAllEntities) {
    return `This action returns all cats (limit: ${query.limit} items)`;
  }

  @Get(':id')
  findOne(@Param('id') id: string) {
    return `This action returns a #${id} cat`;
  }

  @Put(':id')
  update(@Param('id') id: string, @Body() updateCatDto: UpdateCatDto) {
    return `This action updates a #${id} cat`;
  }

  @Delete(':id')
  remove(@Param('id') id: string) {
    return `This action removes a #${id} cat`;
  }
}
@@switch
import { Controller, Get, Query, Post, Body, Put, Param, Delete, Bind } from '@nestjs/common';

@Controller('cats')
export class CatsController {
  @Post()
  @Bind(Body())
  create(createCatDto) {
    return 'This action adds a new cat';
  }

  @Get()
  @Bind(Query())
  findAll(query) {
    console.log(query);
    return `This action returns all cats (limit: ${query.limit} items)`;
  }

  @Get(':id')
  @Bind(Param('id'))
  findOne(id) {
    return `This action returns a #${id} cat`;
  }

  @Put(':id')
  @Bind(Param('id'), Body())
  update(id, updateCatDto) {
    return `This action updates a #${id} cat`;
  }

  @Delete(':id')
  @Bind(Param('id'))
  remove(id) {
    return `This action removes a #${id} cat`;
  }
}

Getting up and running

With the above controller fully defined, Nest still doesn't know that CatsController exists and as a result won't create an instance of this class.

Controllers always belong to a module, which is why we include the controllers array within the @Module() decorator. Since we haven't yet defined any other modules except the root AppModule, we'll use that to introduce the CatsController:

@@filename(app.module)
import { Module } from '@nestjs/common';
import { CatsController } from './cats/cats.controller';

@Module({
  controllers: [CatsController],
})
export class AppModule {}

We attached the metadata to the module class using the @Module() decorator, and Nest can now easily reflect which controllers have to be mounted.

Appendix: Library-specific approach

So far we've discussed the Nest standard way of manipulating responses. The second way of manipulating the response is to use a library-specific response object. In order to inject a particular response object, we need to use the @Res() decorator. To show the differences, let's rewrite the CatsController to the following:

@@filename()
import { Controller, Get, Post, Res, HttpStatus } from '@nestjs/common';
import { Response } from 'express';

@Controller('cats')
export class CatsController {
  @Post()
  create(@Res() res: Response) {
    res.status(HttpStatus.CREATED).send();
  }

  @Get()
  findAll(@Res() res: Response) {
     res.status(HttpStatus.OK).json([]);
  }
}
@@switch
import { Controller, Get, Post, Bind, Res, Body, HttpStatus } from '@nestjs/common';

@Controller('cats')
export class CatsController {
  @Post()
  @Bind(Res(), Body())
  create(res, createCatDto) {
    res.status(HttpStatus.CREATED).send();
  }

  @Get()
  @Bind(Res())
  findAll(res) {
     res.status(HttpStatus.OK).json([]);
  }
}

Though this approach works, and does in fact allow for more flexibility in some ways by providing full control of the response object (headers manipulation, library-specific features, and so on), it should be used with care. In general, the approach is much less clear and does have some disadvantages. The main disadvantage is that you lose compatibility with Nest features that depend on Nest standard response handling, such as Interceptors and the @HttpCode() decorator. Also, your code can become platform-dependent (as underlying libraries may have different APIs on the response object), and harder to test (you'll have to mock the response object, etc.).

As a result, the Nest standard approach should always be preferred when possible.

Custom route decorators

Nest is built around a language feature called decorators. Decorators are a well-known concept in a lot of commonly used programming languages, but in the JavaScript world, they're still relatively new. In order to better understand how decorators work, we recommend reading this article. Here's a simple definition:

An ES2016 decorator is an expression which returns a function and can take a target, name and property descriptor as arguments. You apply it by prefixing the decorator with an @ character and placing this at the very top of what you are trying to decorate. Decorators can be defined for either a class, a method or a property.
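"""

For readers new to the syntax, here is a minimal, framework-independent method decorator sketch; LogCall is a hypothetical decorator used only for illustration, and it assumes experimentalDecorators is enabled in tsconfig.json:

function LogCall(
  target: object,
  propertyKey: string,
  descriptor: PropertyDescriptor,
) {
  const original = descriptor.value;
  // Replace the method with a wrapper that logs each invocation before delegating.
  descriptor.value = function (...args: unknown[]) {
    console.log(`Calling ${propertyKey} with`, args);
    return original.apply(this, args);
  };
  return descriptor;
}

class CatsRepository {
  @LogCall
  findAll(): string[] {
    return [];
  }
}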

Param decorators

Nest provides a set of useful param decorators that you can use together with the HTTP route handlers. Below is a list of the provided decorators and the plain Express (or Fastify) objects they represent.

  • @Request(), @Req() -> req
  • @Response(), @Res() -> res
  • @Next() -> next
  • @Session() -> req.session
  • @Param(param?: string) -> req.params / req.params[param]
  • @Body(param?: string) -> req.body / req.body[param]
  • @Query(param?: string) -> req.query / req.query[param]
  • @Headers(param?: string) -> req.headers / req.headers[param]
  • @Ip() -> req.ip
  • @HostParam() -> req.hosts

Additionally, you can create your own custom decorators. Why is this useful?

In the Node.js world, it's common practice to attach properties to the request object. Then you manually extract them in each route handler, using code like the following:

const user = req.user;

In order to make your code more readable and transparent, you can create a @User() decorator and reuse it across all of your controllers.

@@filename(user.decorator)
import { createParamDecorator, ExecutionContext } from '@nestjs/common';

export const User = createParamDecorator(
  (data: unknown, ctx: ExecutionContext) => {
    const request = ctx.switchToHttp().getRequest();
    return request.user;
  },
);

Then, you can simply use it wherever it fits your requirements.

@@filename()
@Get()
async findOne(@User() user: UserEntity) {
  console.log(user);
}
@@switch
@Get()
@Bind(User())
async findOne(user) {
  console.log(user);
}

Passing data

When the behavior of your decorator depends on some conditions, you can use the data parameter to pass an argument to the decorator's factory function. One use case for this is a custom decorator that extracts properties from the request object by key. Let's assume, for example, that our authentication layer validates requests and attaches a user entity to the request object. The user entity for an authenticated request might look like:

{
  "id": 101,
  "firstName": "Alan",
  "lastName": "Turing",
  "email": "[email protected]",
  "roles": ["admin"]
}

Let's define a decorator that takes a property name as key, and returns the associated value if it exists (or undefined if it doesn't exist, or if the user object has not been created).

@@filename(user.decorator)
import { createParamDecorator, ExecutionContext } from '@nestjs/common';

export const User = createParamDecorator(
  (data: string, ctx: ExecutionContext) => {
    const request = ctx.switchToHttp().getRequest();
    const user = request.user;

    return data ? user && user[data] : user;
  },
);
@@switch
import { createParamDecorator } from '@nestjs/common';

export const User = createParamDecorator((data, ctx) => {
  const request = ctx.switchToHttp().getRequest();
  const user = request.user;

  return data ? user && user[data] : user;
});

Here's how you could then access a particular property via the @User() decorator in the controller:

@@filename()
@Get()
async findOne(@User('firstName') firstName: string) {
  console.log(`Hello ${firstName}`);
}
@@switch
@Get()
@Bind(User('firstName'))
async findOne(firstName) {
  console.log(`Hello ${firstName}`);
}

You can use this same decorator with different keys to access different properties. If the user object is deep or complex, this can make for easier and more readable request handler implementations.

info Hint For TypeScript users, note that createParamDecorator<T>() is a generic. This means you can explicitly enforce type safety, for example createParamDecorator<string>((data, ctx) => ...). Alternatively, specify a parameter type in the factory function, for example createParamDecorator((data: string, ctx) => ...). If you omit both, the type for data will be any.
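"""

For illustration, here is what the explicitly typed variant of the decorator above could look like; the body is unchanged, only the type of data is pinned down to string:

import { createParamDecorator, ExecutionContext } from '@nestjs/common';

export const User = createParamDecorator<string>(
  (data, ctx: ExecutionContext) => {
    const request = ctx.switchToHttp().getRequest();
    const user = request.user;

    // data is now typed as string by the generic parameter.
    return data ? user && user[data] : user;
  },
);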

Working with pipes

Nest treats custom param decorators in the same fashion as the built-in ones (@Body(), @Param() and @Query()). This means that pipes are executed for the custom annotated parameters as well (in our examples, the user argument). Moreover, you can apply the pipe directly to the custom decorator:

@@filename()
@Get()
async findOne(@User(new ValidationPipe()) user: UserEntity) {
  console.log(user);
}
@@switch
@Get()
@Bind(User(new ValidationPipe()))
async findOne(user) {
  console.log(user);
}

Decorator composition

Nest provides a helper method to compose multiple decorators. For example, suppose you want to combine all decorators related to authentication into a single decorator. This could be done with the following construction:

@@filename(auth.decorator)
import { applyDecorators } from '@nestjs/common';

export function Auth(...roles: Role[]) {
  return applyDecorators(
    SetMetadata('roles', roles),
    UseGuards(AuthGuard, RolesGuard),
    ApiBearerAuth(),
    ApiUnauthorizedResponse({ description: 'Unauthorized' }),
  );
}
@@switch
import { applyDecorators } from '@nestjs/common';

export function Auth(...roles) {
  return applyDecorators(
    SetMetadata('roles', roles),
    UseGuards(AuthGuard, RolesGuard),
    ApiBearerAuth(),
    ApiUnauthorizedResponse({ description: 'Unauthorized' }),
  );
}

You can then use this custom @Auth() decorator as follows:

@Get('users')
@Auth('admin')
findAllUsers() {}

This has the effect of applying all four decorators with a single declaration.

warning Warning The @ApiHideProperty() decorator from the @nestjs/swagger package is not composable and won't work properly with the applyDecorators function.

Official NestJS Consulting

Our goal is to ensure that your developers are successful and productive with NestJS as well as other modern technologies in today's ever-changing tech world.

Official Support

With official support, get expert help directly from the NestJS core team. We tackle your toughest challenges, and collaborate with your team on many levels such as:

  • Providing technical guidance & architectural reviews
  • Mentoring team members
  • Advising on best practices
  • Helping with design decisions
  • Addressing security & performance concerns
  • Performing in-depth code reviews

Team Augmentation & Development

With team augmentation, NestJS core team members can work directly with your team on a daily basis to help take your project to the next level. Consider us “part of your team”, tackling the most ambitious projects - right by your side.

NestJS Best Practices

Frequent code reviews can eliminate potentially hazardous bugs & issues at an early stage and help enforce best practices. Let us perform PR reviews & audits to ensure your code quality, performance, and security.

First-hand access

A direct communication channel will boost team velocity, giving you quick access to discuss and solve problems.

NestJS Workshops and Trainings

We provide solid kick-off training as well as more advanced sessions that give teams an in-depth understanding of NestJS. We offer on-site workshops and remote intensive sessions which help get you up and running quickly within the NestJS ecosystem.

Contact us!

Let's talk about how we can help you become successful with NestJS.

Reach out to us at [email protected], and let’s talk about your project & team’s needs!

Exception filters

Nest comes with a built-in exceptions layer which is responsible for processing all unhandled exceptions across an application. When an exception is not handled by your application code, it is caught by this layer, which then automatically sends an appropriate user-friendly response.

Out of the box, this action is performed by a built-in global exception filter, which handles exceptions of type HttpException (and subclasses of it). When an exception is unrecognized (is neither HttpException nor a class that inherits from HttpException), the built-in exception filter generates the following default JSON response:

{
  "statusCode": 500,
  "message": "Internal server error"
}

Throwing standard exceptions

Nest provides a built-in HttpException class, exposed from the @nestjs/common package. For typical HTTP REST/GraphQL API based applications, it's best practice to send standard HTTP response objects when certain error conditions occur.

For example, in the CatsController, we have a findAll() method (a GET route handler). Let's assume that this route handler throws an exception for some reason. To demonstrate this, we'll hard-code it as follows:

@@filename(cats.controller)
@Get()
async findAll() {
  throw new HttpException('Forbidden', HttpStatus.FORBIDDEN);
}

info Hint We used the HttpStatus here. This is a helper enum imported from the @nestjs/common package.

When the client calls this endpoint, the response looks like this:

{
  "statusCode": 403,
  "message": "Forbidden"
}

The HttpException constructor takes two required arguments which determine the response:

  • The response argument defines the JSON response body. It can be a string or an object as described below.
  • The status argument defines the HTTP status code.

By default, the JSON response body contains two properties:

  • statusCode: defaults to the HTTP status code provided in the status argument
  • message: a short description of the HTTP error based on the status

To override just the message portion of the JSON response body, supply a string in the response argument. To override the entire JSON response body, pass an object in the response argument. Nest will serialize the object and return it as the JSON response body.

The second constructor argument - status - should be a valid HTTP status code. Best practice is to use the HttpStatus enum imported from @nestjs/common.

Here's an example overriding the entire response body:

@@filename(cats.controller)
@Get()
async findAll() {
  throw new HttpException({
    status: HttpStatus.FORBIDDEN,
    error: 'This is a custom message',
  }, HttpStatus.FORBIDDEN);
}

Using the above, this is how the response would look:

{
  "status": 403,
  "error": "This is a custom message"
}

Custom exceptions

In many cases, you will not need to write custom exceptions, and can use the built-in Nest HTTP exception, as described in the next section. If you do need to create customized exceptions, it's good practice to create your own exceptions hierarchy, where your custom exceptions inherit from the base HttpException class. With this approach, Nest will recognize your exceptions, and automatically take care of the error responses. Let's implement such a custom exception:

@@filename(forbidden.exception)
import { HttpException, HttpStatus } from '@nestjs/common';

export class ForbiddenException extends HttpException {
  constructor() {
    super('Forbidden', HttpStatus.FORBIDDEN);
  }
}

Since ForbiddenException extends the base HttpException, it will work seamlessly with the built-in exception handler, and therefore we can use it inside the findAll() method.

@@filename(cats.controller)
@Get()
async findAll() {
  throw new ForbiddenException();
}

Built-in HTTP exceptions

Nest provides a set of standard exceptions that inherit from the base HttpException. These are exposed from the @nestjs/common package, and represent many of the most common HTTP exceptions:

  • BadRequestException
  • UnauthorizedException
  • NotFoundException
  • ForbiddenException
  • NotAcceptableException
  • RequestTimeoutException
  • ConflictException
  • GoneException
  • HttpVersionNotSupportedException
  • PayloadTooLargeException
  • UnsupportedMediaTypeException
  • UnprocessableEntityException
  • InternalServerErrorException
  • NotImplementedException
  • ImATeapotException
  • MethodNotAllowedException
  • BadGatewayException
  • ServiceUnavailableException
  • GatewayTimeoutException
  • PreconditionFailedException
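"""

For example, any of these can be thrown directly from a route handler, and Nest will translate it into the matching HTTP response. The findOne() lookup below is hypothetical (the sample CatsService defined earlier has no such method) and serves only to show the pattern:

@Get(':id')
async findOne(@Param('id') id: string) {
  const cat = await this.catsService.findOne(id);
  if (!cat) {
    // Produces a 404 response whose body contains the message below.
    throw new NotFoundException(`Cat with id ${id} not found`);
  }
  return cat;
}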

Exception filters

While the base (built-in) exception filter can automatically handle many cases for you, you may want full control over the exceptions layer. For example, you may want to add logging or use a different JSON schema based on some dynamic factors. Exception filters are designed for exactly this purpose. They let you control the exact flow of control and the content of the response sent back to the client.

Let's create an exception filter that is responsible for catching exceptions which are an instance of the HttpException class, and implementing custom response logic for them. To do this, we'll need to access the underlying platform Request and Response objects. We'll access the Request object so we can pull out the original url and include that in the logging information. We'll use the Response object to take direct control of the response that is sent, using the response.json() method.

@@filename(http-exception.filter)
import { ExceptionFilter, Catch, ArgumentsHost, HttpException } from '@nestjs/common';
import { Request, Response } from 'express';

@Catch(HttpException)
export class HttpExceptionFilter implements ExceptionFilter {
  catch(exception: HttpException, host: ArgumentsHost) {
    const ctx = host.switchToHttp();
    const response = ctx.getResponse<Response>();
    const request = ctx.getRequest<Request>();
    const status = exception.getStatus();

    response
      .status(status)
      .json({
        statusCode: status,
        timestamp: new Date().toISOString(),
        path: request.url,
      });
  }
}
@@switch
import { Catch, HttpException } from '@nestjs/common';

@Catch(HttpException)
export class HttpExceptionFilter {
  catch(exception, host) {
    const ctx = host.switchToHttp();
    const response = ctx.getResponse();
    const request = ctx.getRequest();
    const status = exception.getStatus();

    response
      .status(status)
      .json({
        statusCode: status,
        timestamp: new Date().toISOString(),
        path: request.url,
      });
  }
}

info Hint All exception filters should implement the generic ExceptionFilter<T> interface. This requires you to provide the catch(exception: T, host: ArgumentsHost) method with its indicated signature. T indicates the type of the exception.

The @Catch(HttpException) decorator binds the required metadata to the exception filter, telling Nest that this particular filter is looking for exceptions of type HttpException and nothing else. The @Catch() decorator may take a single parameter, or a comma-separated list. This lets you set up the filter for several types of exceptions at once.
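"""

As a sketch of the comma-separated form, the filter below is bound to two of the built-in HttpException subclasses at once; the ClientErrorFilter name and the response shape are illustrative only:

import {
  ArgumentsHost,
  BadRequestException,
  Catch,
  ExceptionFilter,
  HttpException,
  NotFoundException,
} from '@nestjs/common';

@Catch(BadRequestException, NotFoundException)
export class ClientErrorFilter implements ExceptionFilter {
  catch(exception: HttpException, host: ArgumentsHost) {
    // Only these two exception types reach this filter; everything else falls
    // through to other filters or the built-in global filter.
    const response = host.switchToHttp().getResponse();
    response.status(exception.getStatus()).json({ message: exception.message });
  }
}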

Arguments host

Let's look at the parameters of the catch() method. The exception parameter is the exception object currently being processed. The host parameter is an ArgumentsHost object. ArgumentsHost is a powerful utility object that we'll examine further in the execution context chapter*. In this code sample, we use some of its helper methods to obtain references to the Request and Response objects that are being passed to the original request handler (in the controller where the exception originates). Learn more about ArgumentsHost here.

*The reason for this level of abstraction is that ArgumentsHost functions in all contexts (e.g., the HTTP server context we're working with now, but also Microservices and WebSockets). In the execution context chapter we'll see how we can access the appropriate underlying arguments for any execution context with the power of ArgumentsHost and its helper functions. This will allow us to write generic exception filters that operate across all contexts.

Binding filters

Let's tie our new HttpExceptionFilter to the CatsController's create() method.

@@filename(cats.controller)
@Post()
@UseFilters(new HttpExceptionFilter())
async create(@Body() createCatDto: CreateCatDto) {
  throw new ForbiddenException();
}
@@switch
@Post()
@UseFilters(new HttpExceptionFilter())
@Bind(Body())
async create(createCatDto) {
  throw new ForbiddenException();
}

info Hint The @UseFilters() decorator is imported from the @nestjs/common package.

We have used the @UseFilters() decorator here. Similar to the @Catch() decorator, it can take a single filter instance, or a comma-separated list of filter instances. Here, we created the instance of HttpExceptionFilter in place. Alternatively, you may pass the class (instead of an instance), leaving responsibility for instantiation to the framework, and enabling dependency injection.

@@filename(cats.controller)
@Post()
@UseFilters(HttpExceptionFilter)
async create(@Body() createCatDto: CreateCatDto) {
  throw new ForbiddenException();
}
@@switch
@Post()
@UseFilters(HttpExceptionFilter)
@Bind(Body())
async create(createCatDto) {
  throw new ForbiddenException();
}

info Hint Prefer applying filters by using classes instead of instances when possible. It reduces memory usage since Nest can easily reuse instances of the same class across your entire module.

In the example above, the HttpExceptionFilter is applied only to the single create() route handler, making it method-scoped. Exception filters can be scoped at different levels: method-scoped, controller-scoped, or global-scoped. For example, to set up a filter as controller-scoped, you would do the following:

@@filename(cats.controller)
@UseFilters(new HttpExceptionFilter())
export class CatsController {}

This construction sets up the HttpExceptionFilter for every route handler defined inside the CatsController.

To create a global-scoped filter, you would do the following:

@@filename(main)
async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  app.useGlobalFilters(new HttpExceptionFilter());
  await app.listen(3000);
}
bootstrap();

warning Warning The useGlobalFilters() method does not set up filters for gateways or hybrid applications.

Global-scoped filters are used across the whole application, for every controller and every route handler. In terms of dependency injection, global filters registered from outside of any module (with useGlobalFilters() as in the example above) cannot inject dependencies since this is done outside the context of any module. In order to solve this issue, you can register a global-scoped filter directly from any module using the following construction:

@@filename(app.module)
import { Module } from '@nestjs/common';
import { APP_FILTER } from '@nestjs/core';

@Module({
  providers: [
    {
      provide: APP_FILTER,
      useClass: HttpExceptionFilter,
    },
  ],
})
export class AppModule {}

info Hint When using this approach to perform dependency injection for the filter, note that regardless of the module where this construction is employed, the filter is, in fact, global. Where should this be done? Choose the module where the filter (HttpExceptionFilter in the example above) is defined. Also, useClass is not the only way of dealing with custom provider registration. Learn more here.

You can add as many filters with this technique as needed; simply add each to the providers array.

Catch everything

In order to catch every unhandled exception (regardless of the exception type), leave the @Catch() decorator's parameter list empty, e.g., @Catch().

import {
  ExceptionFilter,
  Catch,
  ArgumentsHost,
  HttpException,
  HttpStatus,
} from '@nestjs/common';

@Catch()
export class AllExceptionsFilter implements ExceptionFilter {
  catch(exception: unknown, host: ArgumentsHost) {
    const ctx = host.switchToHttp();
    const response = ctx.getResponse();
    const request = ctx.getRequest();

    const status =
      exception instanceof HttpException
        ? exception.getStatus()
        : HttpStatus.INTERNAL_SERVER_ERROR;

    response.status(status).json({
      statusCode: status,
      timestamp: new Date().toISOString(),
      path: request.url,
    });
  }
}

In the example above, the filter will catch each exception thrown, regardless of its type (class).

Inheritance

Typically, you'll create fully customized exception filters crafted to fulfill your application requirements. However, there might be use-cases when you would like to simply extend the built-in default global exception filter, and override the behavior based on certain factors.

In order to delegate exception processing to the base filter, you need to extend BaseExceptionFilter and call the inherited catch() method.

@@filename(all-exceptions.filter)
import { Catch, ArgumentsHost } from '@nestjs/common';
import { BaseExceptionFilter } from '@nestjs/core';

@Catch()
export class AllExceptionsFilter extends BaseExceptionFilter {
  catch(exception: unknown, host: ArgumentsHost) {
    super.catch(exception, host);
  }
}
@@switch
import { Catch } from '@nestjs/common';
import { BaseExceptionFilter } from '@nestjs/core';

@Catch()
export class AllExceptionsFilter extends BaseExceptionFilter {
  catch(exception, host) {
    super.catch(exception, host);
  }
}

warning Warning Method-scoped and Controller-scoped filters that extend the BaseExceptionFilter should not be instantiated with new. Instead, let the framework instantiate them automatically.

The above implementation is just a shell demonstrating the approach. Your implementation of the extended exception filter would include your tailored business logic (e.g., handling various conditions).

Global filters can extend the base filter. This can be done in either of two ways.

The first method is to inject the HttpServer reference when instantiating the custom global filter:

async function bootstrap() {
  const app = await NestFactory.create(AppModule);

  const { httpAdapter } = app.get(HttpAdapterHost);
  app.useGlobalFilters(new AllExceptionsFilter(httpAdapter));

  await app.listen(3000);
}
bootstrap();

The second method is to use the APP_FILTER token as shown here.
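"""

Concretely, that second method could look like the following sketch, mirroring the APP_FILTER registration shown earlier but pointing at the extended AllExceptionsFilter:

import { Module } from '@nestjs/common';
import { APP_FILTER } from '@nestjs/core';
import { AllExceptionsFilter } from './all-exceptions.filter';

@Module({
  providers: [
    {
      provide: APP_FILTER,
      useClass: AllExceptionsFilter,
    },
  ],
})
export class AppModule {}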

First steps

In this set of articles, you'll learn the core fundamentals of Nest. To get familiar with the essential building blocks of Nest applications, we'll build a basic CRUD application with features that cover a lot of ground at an introductory level.

Language

We're in love with TypeScript, but above all - we love Node.js. That's why Nest is compatible with both TypeScript and pure JavaScript. Nest takes advantage of the latest language features, so to use it with vanilla JavaScript we need a Babel compiler.

We'll mostly use TypeScript in the examples we provide, but you can always switch the code snippets to vanilla JavaScript syntax (simply click to toggle the language button in the upper right hand corner of each snippet).

Prerequisites

Please make sure that Node.js (>= 10.13.0) is installed on your operating system.

Setup

Setting up a new project is quite simple with the Nest CLI. With npm installed, you can create a new Nest project with the following commands in your OS terminal:

$ npm i -g @nestjs/cli
$ nest new project-name

The project directory will be created, node modules and a few other boilerplate files will be installed, and a src/ directory will be created and populated with several core files.

src
├── app.controller.ts
├── app.module.ts
└── main.ts

Here's a brief overview of those core files:

  • app.controller.ts - Basic controller sample with a single route.
  • app.module.ts - The root module of the application.
  • main.ts - The entry file of the application which uses the core function NestFactory to create a Nest application instance.

The main.ts includes an async function, which will bootstrap our application:

@@filename(main)

import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  await app.listen(3000);
}
bootstrap();
@@switch
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  await app.listen(3000);
}
bootstrap();

To create a Nest application instance, we use the core NestFactory class. NestFactory exposes a few static methods that allow creating an application instance. The create() method returns an application object, which fulfills the INestApplication interface. This object provides a set of methods which are described in the coming chapters. In the main.ts example above, we simply start up our HTTP listener, which lets the application await inbound HTTP requests.

Note that a project scaffolded with the Nest CLI creates an initial project structure that encourages developers to follow the convention of keeping each module in its own dedicated directory.

Platform

Nest aims to be a platform-agnostic framework. Platform independence makes it possible to create reusable logical parts that developers can take advantage of across several different types of applications. Technically, Nest is able to work with any Node HTTP framework once an adapter is created. There are two HTTP platforms supported out-of-the-box: express and fastify. You can choose the one that best suits your needs.

  • platform-express - Express is a well-known minimalist web framework for node. It's a battle tested, production-ready library with lots of resources implemented by the community. The @nestjs/platform-express package is used by default. Many users are well served with Express, and need take no action to enable it.
  • platform-fastify - Fastify is a high performance and low overhead framework highly focused on providing maximum efficiency and speed. Read how to use it here.

Whichever platform is used, it exposes its own application interface. These are seen respectively as NestExpressApplication and NestFastifyApplication.

When you pass a type to the NestFactory.create() method, as in the example below, the app object will have methods available exclusively for that specific platform. Note, however, you don't need to specify a type unless you actually want to access the underlying platform API.

const app = await NestFactory.create<NestExpressApplication>(AppModule);

Running the application

Once the installation process is complete, you can run the following command at your OS command prompt to start the application listening for inbound HTTP requests:

$ npm run start

This command starts the app with the HTTP server listening on the port defined in the src/main.ts file. Once the application is running, open your browser and navigate to http://localhost:3000/. You should see the Hello World! message.

Guards

A guard is a class annotated with the @Injectable() decorator. Guards should implement the CanActivate interface.

Guards have a single responsibility. They determine whether a given request will be handled by the route handler or not, depending on certain conditions (like permissions, roles, ACLs, etc.) present at run-time. This is often referred to as authorization. Authorization (and its cousin, authentication, with which it usually collaborates) has typically been handled by middleware in traditional Express applications. Middleware is a fine choice for authentication, since things like token validation and attaching properties to the request object are not strongly connected with a particular route context (and its metadata).

But middleware, by its nature, is dumb. It doesn't know which handler will be executed after calling the next() function. On the other hand, Guards have access to the ExecutionContext instance, and thus know exactly what's going to be executed next. They're designed, much like exception filters, pipes, and interceptors, to let you interpose processing logic at exactly the right point in the request/response cycle, and to do so declaratively. This helps keep your code DRY and declarative.

info Hint Guards are executed after each middleware, but before any interceptor or pipe.

Authorization guard

As mentioned, authorization is a great use case for Guards because specific routes should be available only when the caller (usually a specific authenticated user) has sufficient permissions. The AuthGuard that we'll build now assumes an authenticated user (and that, therefore, a token is attached to the request headers). It will extract and validate the token, and use the extracted information to determine whether the request can proceed or not.

@@filename(auth.guard)
import { Injectable, CanActivate, ExecutionContext } from '@nestjs/common';
import { Observable } from 'rxjs';

@Injectable()
export class AuthGuard implements CanActivate {
  canActivate(
    context: ExecutionContext,
  ): boolean | Promise<boolean> | Observable<boolean> {
    const request = context.switchToHttp().getRequest();
    return validateRequest(request);
  }
}
@@switch
import { Injectable } from '@nestjs/common';

@Injectable()
export class AuthGuard {
  async canActivate(context) {
    const request = context.switchToHttp().getRequest();
    return validateRequest(request);
  }
}

The logic inside the validateRequest() function can be as simple or sophisticated as needed. The main point of this example is to show how guards fit into the request/response cycle.
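Purely as an illustration, a hypothetical validateRequest() could simply check for the presence of an authorization header; the real validation strategy (JWT verification, session lookup, etc.) is up to you:

import { Request } from 'express';

// Hypothetical helper, for illustration only: accept any request that
// carries an Authorization header.
export function validateRequest(request: Request): boolean {
  return typeof request.headers.authorization === 'string';
}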

Every guard must implement a canActivate() function. This function should return a boolean, indicating whether the current request is allowed or not. It can return the response either synchronously or asynchronously (via a Promise or Observable). Nest uses the return value to control the next action:

  • if it returns true, the request will be processed.
  • if it returns false, Nest will deny the request.

Execution context

The canActivate() function takes a single argument, the ExecutionContext instance. The ExecutionContext inherits from ArgumentsHost. We saw ArgumentsHost previously in the exception filters chapter. In the sample above, we are just using the same helper methods defined on ArgumentsHost that we used earlier, to get a reference to the Request object. You can refer back to the Arguments host section of the exception filters chapter for more on this topic.

By extending ArgumentsHost, ExecutionContext also adds several new helper methods that provide additional details about the current execution process. These details can be helpful in building more generic guards that can work across a broad set of controllers, methods, and execution contexts. Learn more about ExecutionContext here.
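For example, getHandler() and getClass() return references to the route handler method and its controller class, which is what makes metadata-driven guards possible. A quick sketch (the LoggingGuard name is illustrative):

import { CanActivate, ExecutionContext, Injectable } from '@nestjs/common';

@Injectable()
export class LoggingGuard implements CanActivate {
  canActivate(context: ExecutionContext): boolean {
    // getClass() returns the controller class; getHandler() returns the route handler method.
    console.log(`${context.getClass().name} -> ${context.getHandler().name}`);
    return true;
  }
}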

Role-based authentication

Let's build a more functional guard that permits access only to users with a specific role. We'll start with a basic guard template, and build on it in the coming sections. For now, it allows all requests to proceed:

@@filename(roles.guard)
import { Injectable, CanActivate, ExecutionContext } from '@nestjs/common';
import { Observable } from 'rxjs';

@Injectable()
export class RolesGuard implements CanActivate {
  canActivate(
    context: ExecutionContext,
  ): boolean | Promise<boolean> | Observable<boolean> {
    return true;
  }
}
@@switch
import { Injectable } from '@nestjs/common';

@Injectable()
export class RolesGuard {
  canActivate(context) {
    return true;
  }
}

Binding guards

Like pipes and exception filters, guards can be controller-scoped, method-scoped, or global-scoped. Below, we set up a controller-scoped guard using the @UseGuards() decorator. This decorator may take a single argument, or a comma-separated list of arguments. This lets you easily apply the appropriate set of guards with one declaration.

@@filename()
@Controller('cats')
@UseGuards(RolesGuard)
export class CatsController {}

info Hint The @UseGuards() decorator is imported from the @nestjs/common package.

Above, we passed the RolesGuard type (instead of an instance), leaving responsibility for instantiation to the framework and enabling dependency injection. As with pipes and exception filters, we can also pass an in-place instance:

@@filename()
@Controller('cats')
@UseGuards(new RolesGuard())
export class CatsController {}

The construction above attaches the guard to every handler declared by this controller. If we wish the guard to apply only to a single method, we apply the @UseGuards() decorator at the method level.
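For instance, a method-scoped binding looks like this (a sketch reusing the same RolesGuard; the handler body mirrors the earlier CatsController examples):

@Controller('cats')
export class CatsController {
  @Post()
  @UseGuards(RolesGuard)
  async create(@Body() createCatDto: CreateCatDto) {
    this.catsService.create(createCatDto);
  }
}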

In order to set up a global guard, use the useGlobalGuards() method of the Nest application instance:

@@filename()
const app = await NestFactory.create(AppModule);
app.useGlobalGuards(new RolesGuard());

warning Notice In the case of hybrid apps the useGlobalGuards() method doesn't set up guards for gateways and microservices by default (see Hybrid application for information on how to change this behavior). For "standard" (non-hybrid) microservice apps, useGlobalGuards() does mount the guards globally.

Global guards are used across the whole application, for every controller and every route handler. In terms of dependency injection, global guards registered from outside of any module (with useGlobalGuards() as in the example above) cannot inject dependencies since this is done outside the context of any module. In order to solve this issue, you can set up a guard directly from any module using the following construction:

@@filename(app.module)
import { Module } from '@nestjs/common';
import { APP_GUARD } from '@nestjs/core';

@Module({
  providers: [
    {
      provide: APP_GUARD,
      useClass: RolesGuard,
    },
  ],
})
export class AppModule {}

info Hint When using this approach to perform dependency injection for the guard, note that regardless of the module where this construction is employed, the guard is, in fact, global. Where should this be done? Choose the module where the guard (RolesGuard in the example above) is defined. Also, useClass is not the only way of dealing with custom provider registration. Learn more here.

Setting roles per handler

Our RolesGuard is working, but it's not very smart yet. We're not yet taking advantage of the most important guard feature - the execution context. It doesn't yet know about roles, or which roles are allowed for each handler. The CatsController, for example, could have different permission schemes for different routes. Some might be available only for an admin user, and others could be open for everyone. How can we match roles to routes in a flexible and reusable way?

This is where custom metadata comes into play (learn more here). Nest provides the ability to attach custom metadata to route handlers through the @SetMetadata() decorator. This metadata supplies our missing role data, which a smart guard needs to make decisions. Let's take a look at using @SetMetadata():

@@filename(cats.controller)
@Post()
@SetMetadata('roles', ['admin'])
async create(@Body() createCatDto: CreateCatDto) {
  this.catsService.create(createCatDto);
}
@@switch
@Post()
@SetMetadata('roles', ['admin'])
@Bind(Body())
async create(createCatDto) {
  this.catsService.create(createCatDto);
}

info Hint The @SetMetadata() decorator is imported from the @nestjs/common package.

With the construction above, we attached the roles metadata (roles is a key, while ['admin'] is a particular value) to the create() method. While this works, it's not good practice to use @SetMetadata() directly in your routes. Instead, create your own decorators, as shown below:

@@filename(roles.decorator)
import { SetMetadata } from '@nestjs/common';

export const Roles = (...roles: string[]) => SetMetadata('roles', roles);
@@switch
import { SetMetadata } from '@nestjs/common';

export const Roles = (...roles) => SetMetadata('roles', roles);

This approach is much cleaner and more readable, and is strongly typed. Now that we have a custom @Roles() decorator, we can use it to decorate the create() method.

@@filename(cats.controller)
@Post()
@Roles('admin')
async create(@Body() createCatDto: CreateCatDto) {
  this.catsService.create(createCatDto);
}
@@switch
@Post()
@Roles('admin')
@Bind(Body())
async create(createCatDto) {
  this.catsService.create(createCatDto);
}

Putting it all together

Let's now go back and tie this together with our RolesGuard. Currently, it simply returns true in all cases, allowing every request to proceed. We want to make the return value conditional, based on comparing the roles assigned to the current user to the actual roles required by the current route being processed. In order to access the route's role(s) (custom metadata), we'll use the Reflector helper class, which is provided out of the box by the framework and exposed from the @nestjs/core package.

@@filename(roles.guard)
import { Injectable, CanActivate, ExecutionContext } from '@nestjs/common';
import { Reflector } from '@nestjs/core';

@Injectable()
export class RolesGuard implements CanActivate {
  constructor(private reflector: Reflector) {}

  canActivate(context: ExecutionContext): boolean {
    const roles = this.reflector.get<string[]>('roles', context.getHandler());
    if (!roles) {
      return true;
    }
    const request = context.switchToHttp().getRequest();
    const user = request.user;
    return matchRoles(roles, user.roles);
  }
}
@@switch
import { Injectable, Dependencies } from '@nestjs/common';
import { Reflector } from '@nestjs/core';

@Injectable()
@Dependencies(Reflector)
export class RolesGuard {
  constructor(reflector) {
    this.reflector = reflector;
  }

  canActivate(context) {
    const roles = this.reflector.get('roles', context.getHandler());
    if (!roles) {
      return true;
    }
    const request = context.switchToHttp().getRequest();
    const user = request.user;
    return matchRoles(roles, user.roles);
  }
}

info Hint In the Node.js world, it's common practice to attach the authorized user to the request object. Thus, in our sample code above, we are assuming that request.user contains the user instance and allowed roles. In your app, you will probably make that association in your custom authentication guard (or middleware).

warning Warning The logic inside the matchRoles() function can be as simple or sophisticated as needed. The main point of this example is to show how guards fit into the request/response cycle.
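As an illustration only, a hypothetical matchRoles() might pass when the user holds at least one of the roles required by the handler:

// Hypothetical helper, for illustration only: pass if the user holds
// at least one of the roles attached to the route via @Roles().
function matchRoles(requiredRoles: string[], userRoles: string[] = []): boolean {
  return requiredRoles.some((role) => userRoles.includes(role));
}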

Refer to the Reflection and metadata section of the Execution context chapter for more details on utilizing Reflector in a context-sensitive way.

When a user with insufficient privileges requests an endpoint, Nest automatically returns the following response:

{
  "statusCode": 403,
  "message": "Forbidden resource",
  "error": "Forbidden"
}

Note that behind the scenes, when a guard returns false, the framework throws a ForbiddenException. If you want to return a different error response, you should throw your own specific exception. For example:

throw new UnauthorizedException();

Any exception thrown by a guard will be handled by the exceptions layer (global exceptions filter and any exceptions filters that are applied to the current context).

Interceptors

An interceptor is a class annotated with the @Injectable() decorator. Interceptors should implement the NestInterceptor interface.

Interceptors have a set of useful capabilities which are inspired by the Aspect Oriented Programming (AOP) technique. They make it possible to:

  • bind extra logic before / after method execution
  • transform the result returned from a function
  • transform the exception thrown from a function
  • extend the basic function behavior
  • completely override a function depending on specific conditions (e.g., for caching purposes)

Basics

Each interceptor implements the intercept() method, which takes two arguments. The first one is the ExecutionContext instance (exactly the same object as for guards). The ExecutionContext inherits from ArgumentsHost. We saw ArgumentsHost before in the exception filters chapter. There, we saw that it's a wrapper around arguments that have been passed to the original handler, and contains different arguments arrays based on the type of the application. You can refer back to the exception filters for more on this topic.

Execution context

By extending ArgumentsHost, ExecutionContext also adds several new helper methods that provide additional details about the current execution process. These details can be helpful in building more generic interceptors that can work across a broad set of controllers, methods, and execution contexts. Learn more about ExecutionContext here.

Call handler

The second argument is a CallHandler. The CallHandler interface implements the handle() method, which you can use to invoke the route handler method at some point in your interceptor. If you don't call the handle() method in your implementation of the intercept() method, the route handler method won't be executed at all.

This approach means that the intercept() method effectively wraps the request/response stream. As a result, you may implement custom logic both before and after the execution of the final route handler. It's clear that you can write code in your intercept() method that executes before calling handle(), but how do you affect what happens afterward? Because the handle() method returns an Observable, we can use powerful RxJS operators to further manipulate the response. Using Aspect Oriented Programming terminology, the invocation of the route handler (i.e., calling handle()) is called a Pointcut, indicating that it's the point at which our additional logic is inserted.

Consider, for example, an incoming POST /cats request. This request is destined for the create() handler defined inside the CatsController. If an interceptor which does not call the handle() method is called anywhere along the way, the create() method won't be executed. Once handle() is called (and its Observable has been returned), the create() handler will be triggered. And once the response stream is received via the Observable, additional operations can be performed on the stream, and a final result returned to the caller.

Aspect interception

The first use case we'll look at is to use an interceptor to log user interaction (e.g., storing user calls, asynchronously dispatching events or calculating a timestamp). We show a simple LoggingInterceptor below:

@@filename(logging.interceptor)
import { Injectable, NestInterceptor, ExecutionContext, CallHandler } from '@nestjs/common';
import { Observable } from 'rxjs';
import { tap } from 'rxjs/operators';

@Injectable()
export class LoggingInterceptor implements NestInterceptor {
  intercept(context: ExecutionContext, next: CallHandler): Observable<any> {
    console.log('Before...');

    const now = Date.now();
    return next
      .handle()
      .pipe(
        tap(() => console.log(`After... ${Date.now() - now}ms`)),
      );
  }
}
@@switch
import { Injectable } from '@nestjs/common';
import { Observable } from 'rxjs';
import { tap } from 'rxjs/operators';

@Injectable()
export class LoggingInterceptor {
  intercept(context, next) {
    console.log('Before...');

    const now = Date.now();
    return next
      .handle()
      .pipe(
        tap(() => console.log(`After... ${Date.now() - now}ms`)),
      );
  }
}

info Hint The NestInterceptor<T, R> is a generic interface in which T indicates the type of an Observable<T> (supporting the response stream), and R is the type of the value wrapped by Observable<R>.

warning Notice Interceptors, like controllers, providers, guards, and so on, can inject dependencies through their constructor.

Since handle() returns an RxJS Observable, we have a wide choice of operators we can use to manipulate the stream. In the example above, we used the tap() operator, which invokes our anonymous logging function upon graceful or exceptional termination of the observable stream, but doesn't otherwise interfere with the response cycle.

Binding interceptors

In order to set up the interceptor, we use the @UseInterceptors() decorator imported from the @nestjs/common package. Like pipes and guards, interceptors can be controller-scoped, method-scoped, or global-scoped.

@@filename(cats.controller)
@UseInterceptors(LoggingInterceptor)
export class CatsController {}

info Hint The @UseInterceptors() decorator is imported from the @nestjs/common package.

Using the above construction, each route handler defined in CatsController will use LoggingInterceptor. When someone calls the GET /cats endpoint, you'll see the following output in your standard output:

Before...
After... 1ms

Note that we passed the LoggingInterceptor type (instead of an instance), leaving responsibility for instantiation to the framework and enabling dependency injection. As with pipes, guards, and exception filters, we can also pass an in-place instance:

@@filename(cats.controller)
@UseInterceptors(new LoggingInterceptor())
export class CatsController {}

As mentioned, the construction above attaches the interceptor to every handler declared by this controller. If we want to restrict the interceptor's scope to a single method, we simply apply the decorator at the method level.
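For example (a sketch reusing the LoggingInterceptor above; the findAll() body is a placeholder):

@Controller('cats')
export class CatsController {
  @Get()
  @UseInterceptors(LoggingInterceptor)
  findAll() {
    return [];
  }
}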

In order to set up a global interceptor, we use the useGlobalInterceptors() method of the Nest application instance:

const app = await NestFactory.create(AppModule);
app.useGlobalInterceptors(new LoggingInterceptor());

Global interceptors are used across the whole application, for every controller and every route handler. In terms of dependency injection, global interceptors registered from outside of any module (with useGlobalInterceptors(), as in the example above) cannot inject dependencies since this is done outside the context of any module. In order to solve this issue, you can set up an interceptor directly from any module using the following construction:

@@filename(app.module)
import { Module } from '@nestjs/common';
import { APP_INTERCEPTOR } from '@nestjs/core';

@Module({
  providers: [
    {
      provide: APP_INTERCEPTOR,
      useClass: LoggingInterceptor,
    },
  ],
})
export class AppModule {}

info Hint When using this approach to perform dependency injection for the interceptor, note that regardless of the module where this construction is employed, the interceptor is, in fact, global. Where should this be done? Choose the module where the interceptor (LoggingInterceptor in the example above) is defined. Also, useClass is not the only way of dealing with custom provider registration. Learn more here.

Response mapping

We already know that handle() returns an Observable. The stream contains the value returned from the route handler, and thus we can easily mutate it using RxJS's map() operator.

warning Warning The response mapping feature doesn't work with the library-specific response strategy (using the @Res() object directly is forbidden).

Let's create the TransformInterceptor, which will modify each response in a trivial way to demonstrate the process. It will use RxJS's map() operator to assign the response object to the data property of a newly created object, returning the new object to the client.

@@filename(transform.interceptor)
import { Injectable, NestInterceptor, ExecutionContext, CallHandler } from '@nestjs/common';
import { Observable } from 'rxjs';
import { map } from 'rxjs/operators';

export interface Response<T> {
  data: T;
}

@Injectable()
export class TransformInterceptor<T> implements NestInterceptor<T, Response<T>> {
  intercept(context: ExecutionContext, next: CallHandler): Observable<Response<T>> {
    return next.handle().pipe(map(data => ({ data })));
  }
}
@@switch
import { Injectable } from '@nestjs/common';
import { map } from 'rxjs/operators';

@Injectable()
export class TransformInterceptor {
  intercept(context, next) {
    return next.handle().pipe(map(data => ({ data })));
  }
}

info Hint Nest interceptors work with both synchronous and asynchronous intercept() methods. You can simply switch the method to async if necessary.

With the above construction, when someone calls the GET /cats endpoint, the response would look like the following (assuming that route handler returns an empty array []):

{
  "data": []
}
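Per the hint above, an asynchronous variant of the same interceptor is also possible. A sketch (the class name and the awaited call are purely illustrative placeholders for real asynchronous setup work):

import { Injectable, NestInterceptor, ExecutionContext, CallHandler } from '@nestjs/common';
import { Observable } from 'rxjs';
import { map } from 'rxjs/operators';

@Injectable()
export class AsyncTransformInterceptor implements NestInterceptor {
  async intercept(context: ExecutionContext, next: CallHandler): Promise<Observable<any>> {
    // Illustrative placeholder for some asynchronous work performed before the handler runs.
    await Promise.resolve();
    return next.handle().pipe(map(data => ({ data })));
  }
}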

Interceptors have great value in creating re-usable solutions to requirements that occur across an entire application. For example, imagine we need to transform each occurrence of a null value to an empty string ''. We can do it using one line of code and bind the interceptor globally so that it will automatically be used by each registered handler.

@@filename()
import { Injectable, NestInterceptor, ExecutionContext, CallHandler } from '@nestjs/common';
import { Observable } from 'rxjs';
import { map } from 'rxjs/operators';

@Injectable()
export class ExcludeNullInterceptor implements NestInterceptor {
  intercept(context: ExecutionContext, next: CallHandler): Observable<any> {
    return next
      .handle()
      .pipe(map(value => value === null ? '' : value ));
  }
}
@@switch
import { Injectable } from '@nestjs/common';
import { map } from 'rxjs/operators';

@Injectable()
export class ExcludeNullInterceptor {
  intercept(context, next) {
    return next
      .handle()
      .pipe(map(value => value === null ? '' : value ));
  }
}

Exception mapping

Another interesting use-case is to take advantage of RxJS's catchError() operator to override thrown exceptions:

@@filename(errors.interceptor)
import {
  Injectable,
  NestInterceptor,
  ExecutionContext,
  BadGatewayException,
  CallHandler,
} from '@nestjs/common';
import { Observable, throwError } from 'rxjs';
import { catchError } from 'rxjs/operators';

@Injectable()
export class ErrorsInterceptor implements NestInterceptor {
  intercept(context: ExecutionContext, next: CallHandler): Observable<any> {
    return next
      .handle()
      .pipe(
        catchError(err => throwError(new BadGatewayException())),
      );
  }
}
@@switch
import { Injectable, BadGatewayException } from '@nestjs/common';
import { throwError } from 'rxjs';
import { catchError } from 'rxjs/operators';

@Injectable()
export class ErrorsInterceptor {
  intercept(context, next) {
    return next
      .handle()
      .pipe(
        catchError(err => throwError(new BadGatewayException())),
      );
  }
}

Stream overriding

There are several reasons why we may sometimes want to completely prevent calling the handler and return a different value instead. An obvious example is to implement a cache to improve response time. Let's take a look at a simple cache interceptor that returns its response from a cache. In a realistic example, we'd want to consider other factors like TTL, cache invalidation, cache size, etc., but that's beyond the scope of this discussion. Here we'll provide a basic example that demonstrates the main concept.

@@filename(cache.interceptor)
import { Injectable, NestInterceptor, ExecutionContext, CallHandler } from '@nestjs/common';
import { Observable, of } from 'rxjs';

@Injectable()
export class CacheInterceptor implements NestInterceptor {
  intercept(context: ExecutionContext, next: CallHandler): Observable<any> {
    const isCached = true;
    if (isCached) {
      return of([]);
    }
    return next.handle();
  }
}
@@switch
import { Injectable } from '@nestjs/common';
import { of } from 'rxjs';

@Injectable()
export class CacheInterceptor {
  intercept(context, next) {
    const isCached = true;
    if (isCached) {
      return of([]);
    }
    return next.handle();
  }
}

Our CacheInterceptor has a hardcoded isCached variable and a hardcoded response [] as well. The key point to note is that we return a new stream here, created by the RxJS of() operator, therefore the route handler won't be called at all. When someone calls an endpoint that makes use of CacheInterceptor, the response (a hardcoded, empty array) will be returned immediately. In order to create a generic solution, you can take advantage of Reflector and create a custom decorator. The Reflector is well described in the guards chapter.
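One possible direction is a sketch like the following, where a hypothetical @Cacheable() decorator (built from @SetMetadata()) marks handlers that may be served from the cache, and the interceptor reads that marker through Reflector; the "cache hit" itself is still hardcoded:

import { Injectable, NestInterceptor, ExecutionContext, CallHandler, SetMetadata } from '@nestjs/common';
import { Reflector } from '@nestjs/core';
import { Observable, of } from 'rxjs';

// Hypothetical marker decorator, for illustration only.
export const Cacheable = () => SetMetadata('isCacheable', true);

@Injectable()
export class CacheInterceptor implements NestInterceptor {
  constructor(private reflector: Reflector) {}

  intercept(context: ExecutionContext, next: CallHandler): Observable<any> {
    // Only handlers decorated with @Cacheable() are short-circuited.
    const isCacheable = this.reflector.get<boolean>('isCacheable', context.getHandler());
    if (isCacheable) {
      // Still a hardcoded "cache hit"; a real implementation would consult a store.
      return of([]);
    }
    return next.handle();
  }
}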

More operators

The possibility of manipulating the stream using RxJS operators gives us many capabilities. Let's consider another common use case. Imagine you would like to handle timeouts on route requests. When your endpoint doesn't return anything after a period of time, you want to terminate with an error response. The following construction enables this:

@@filename(timeout.interceptor)
import { Injectable, NestInterceptor, ExecutionContext, CallHandler, RequestTimeoutException } from '@nestjs/common';
import { Observable, throwError, TimeoutError } from 'rxjs';
import { catchError, timeout } from 'rxjs/operators';

@Injectable()
export class TimeoutInterceptor implements NestInterceptor {
  intercept(context: ExecutionContext, next: CallHandler): Observable<any> {
    return next.handle().pipe(
      timeout(5000),
      catchError(err => {
        if (err instanceof TimeoutError) {
          return throwError(new RequestTimeoutException());
        }
        return throwError(err);
      }),
    );
  }
}
@@switch
import { Injectable, RequestTimeoutException } from '@nestjs/common';
import { Observable, throwError, TimeoutError } from 'rxjs';
import { catchError, timeout } from 'rxjs/operators';

@Injectable()
export class TimeoutInterceptor {
  intercept(context, next) {
    return next.handle().pipe(
      timeout(5000),
      catchError(err => {
        if (err instanceof TimeoutError) {
          return throwError(new RequestTimeoutException());
        }
        return throwError(err);
      }),
    );
  }
}

After 5 seconds, request processing will be canceled. You can also add custom logic before throwing RequestTimeoutException (e.g. release resources).

Introduction

Nest (NestJS) is a framework for building efficient, scalable Node.js server-side applications. It uses progressive JavaScript, is built with and fully supports TypeScript (yet still enables developers to code in pure JavaScript) and combines elements of OOP (Object Oriented Programming), FP (Functional Programming), and FRP (Functional Reactive Programming).

Under the hood, Nest makes use of robust HTTP Server frameworks like Express (the default) and optionally can be configured to use Fastify as well!

Nest provides a level of abstraction above these common Node.js frameworks (Express/Fastify), but also exposes their APIs directly to the developer. This gives developers the freedom to use the myriad of third-party modules which are available for the underlying platform.

Philosophy

In recent years, thanks to Node.js, JavaScript has become the “lingua franca” of the web for both front and backend applications. This has given rise to awesome projects like Angular, React and Vue, which improve developer productivity and enable the creation of fast, testable, and extensible frontend applications. However, while plenty of superb libraries, helpers, and tools exist for Node (and server-side JavaScript), none of them effectively solve the main problem: architecture.

Nest provides an out-of-the-box application architecture which allows developers and teams to create highly testable, scalable, loosely coupled, and easily maintainable applications. The architecture is heavily inspired by Angular.

Installation

To get started, you can either scaffold the project with the Nest CLI, or clone a starter project (both will produce the same outcome).

To scaffold the project with the Nest CLI, run the following commands. This will create a new project directory, and populate the directory with the initial core Nest files and supporting modules, creating a conventional base structure for your project. Creating a new project with the Nest CLI is recommended for first-time users. We'll continue with this approach in First Steps.

$ npm i -g @nestjs/cli
$ nest new project-name

Alternatively, to install the TypeScript starter project with Git:

$ git clone https://github.com/nestjs/typescript-starter.git project
$ cd project
$ npm install
$ npm run start

Open your browser and navigate to http://localhost:3000/.

To install the JavaScript flavor of the starter project, use javascript-starter.git in the command sequence above.

You can also manually create a new project from scratch by installing the core and supporting files with npm (or yarn). In this case, of course, you'll be responsible for creating the project boilerplate files yourself.

$ npm i --save @nestjs/core @nestjs/common rxjs reflect-metadata

Middleware

Middleware is a function which is called before the route handler. Middleware functions have access to the request and response objects, and the next() middleware function in the application’s request-response cycle. The next middleware function is commonly denoted by a variable named next.

Nest middleware are, by default, equivalent to express middleware. The following description from the official express documentation describes the capabilities of middleware:

Middleware functions can perform the following tasks:
  • execute any code.
  • make changes to the request and the response objects.
  • end the request-response cycle.
  • call the next middleware function in the stack.
  • if the current middleware function does not end the request-response cycle, it must call next() to pass control to the next middleware function. Otherwise, the request will be left hanging.

You implement custom Nest middleware in either a function, or in a class with an @Injectable() decorator. The class should implement the NestMiddleware interface, while the function does not have any special requirements. Let's start by implementing a simple middleware feature using the class method.

@@filename(logger.middleware)
import { Injectable, NestMiddleware } from '@nestjs/common';
import { Request, Response } from 'express';

@Injectable()
export class LoggerMiddleware implements NestMiddleware {
  use(req: Request, res: Response, next: Function) {
    console.log('Request...');
    next();
  }
}
@@switch
import { Injectable } from '@nestjs/common';

@Injectable()
export class LoggerMiddleware {
  use(req, res, next) {
    console.log('Request...');
    next();
  }
}

Dependency injection

Nest middleware fully supports Dependency Injection. Just as with providers and controllers, they are able to inject dependencies that are available within the same module. As usual, this is done through the constructor.
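A sketch of such an injection, assuming a hypothetical ConfigService provider registered in the same module (the provider name, its file path, and the isRequestLoggingEnabled() method are all illustrative):

import { Injectable, NestMiddleware } from '@nestjs/common';
import { Request, Response } from 'express';
// Hypothetical provider assumed to be declared in the same module.
import { ConfigService } from './config.service';

@Injectable()
export class LoggerMiddleware implements NestMiddleware {
  constructor(private readonly configService: ConfigService) {}

  use(req: Request, res: Response, next: Function) {
    // Hypothetical configuration check, for illustration only.
    if (this.configService.isRequestLoggingEnabled()) {
      console.log('Request...');
    }
    next();
  }
}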

Applying middleware

There is no place for middleware in the @Module() decorator. Instead, we set them up using the configure() method of the module class. Modules that include middleware have to implement the NestModule interface. Let's set up the LoggerMiddleware at the AppModule level.

@@filename(app.module)
import { Module, NestModule, MiddlewareConsumer } from '@nestjs/common';
import { LoggerMiddleware } from './common/middleware/logger.middleware';
import { CatsModule } from './cats/cats.module';

@Module({
  imports: [CatsModule],
})
export class AppModule implements NestModule {
  configure(consumer: MiddlewareConsumer) {
    consumer
      .apply(LoggerMiddleware)
      .forRoutes('cats');
  }
}
@@switch
import { Module } from '@nestjs/common';
import { LoggerMiddleware } from './common/middleware/logger.middleware';
import { CatsModule } from './cats/cats.module';

@Module({
  imports: [CatsModule],
})
export class AppModule {
  configure(consumer) {
    consumer
      .apply(LoggerMiddleware)
      .forRoutes('cats');
  }
}

In the above example we have set up the LoggerMiddleware for the /cats route handlers that were previously defined inside the CatsController. We may also further restrict a middleware to a particular request method by passing an object containing the route path and request method to the forRoutes() method when configuring the middleware. In the example below, notice that we import the RequestMethod enum to reference the desired request method type.

@@filename(app.module)
import { Module, NestModule, RequestMethod, MiddlewareConsumer } from '@nestjs/common';
import { LoggerMiddleware } from './common/middleware/logger.middleware';
import { CatsModule } from './cats/cats.module';

@Module({
  imports: [CatsModule],
})
export class AppModule implements NestModule {
  configure(consumer: MiddlewareConsumer) {
    consumer
      .apply(LoggerMiddleware)
      .forRoutes({ path: 'cats', method: RequestMethod.GET });
  }
}
@@switch
import { Module, RequestMethod } from '@nestjs/common';
import { LoggerMiddleware } from './common/middleware/logger.middleware';
import { CatsModule } from './cats/cats.module';

@Module({
  imports: [CatsModule],
})
export class AppModule {
  configure(consumer) {
    consumer
      .apply(LoggerMiddleware)
      .forRoutes({ path: 'cats', method: RequestMethod.GET });
  }
}

info Hint The configure() method can be made asynchronous using async/await (e.g., you can await completion of an asynchronous operation inside the configure() method body).

Route wildcards

Pattern based routes are supported as well. For instance, the asterisk is used as a wildcard, and will match any combination of characters:

forRoutes({ path: 'ab*cd', method: RequestMethod.ALL });

The 'ab*cd' route path will match abcd, ab_cd, abecd, and so on. The characters ?, +, *, and () may be used in a route path, and are subsets of their regular expression counterparts. The hyphen (-) and the dot (.) are interpreted literally by string-based paths.

warning Warning The fastify package uses the latest version of the path-to-regexp package, which no longer supports wildcard asterisks *. Instead, you must use parameters (e.g., (.*), :splat*).

Middleware consumer

The MiddlewareConsumer is a helper class. It provides several built-in methods to manage middleware. All of them can be simply chained in the fluent style. The forRoutes() method can take a single string, multiple strings, a RouteInfo object, a controller class and even multiple controller classes. In most cases you'll probably just pass a list of controllers separated by commas. Below is an example with a single controller:

@@filename(app.module)
import { Module, NestModule, MiddlewareConsumer } from '@nestjs/common';
import { LoggerMiddleware } from './common/middleware/logger.middleware';
import { CatsModule } from './cats/cats.module';
import { CatsController } from './cats/cats.controller';

@Module({
  imports: [CatsModule],
})
export class AppModule implements NestModule {
  configure(consumer: MiddlewareConsumer) {
    consumer
      .apply(LoggerMiddleware)
      .forRoutes(CatsController);
  }
}
@@switch
import { Module } from '@nestjs/common';
import { LoggerMiddleware } from './common/middleware/logger.middleware';
import { CatsModule } from './cats/cats.module';
import { CatsController } from './cats/cats.controller';

@Module({
  imports: [CatsModule],
})
export class AppModule {
  configure(consumer) {
    consumer
      .apply(LoggerMiddleware)
      .forRoutes(CatsController);
  }
}

info Hint The apply() method may either take a single middleware, or multiple arguments to specify multiple middlewares.

Excluding routes

At times we want to exclude certain routes from having the middleware applied. We can easily exclude certain routes with the exclude() method. This method can take a single string, multiple strings, or a RouteInfo object identifying routes to be excluded, as shown below:

consumer
  .apply(LoggerMiddleware)
  .exclude(
    { path: 'cats', method: RequestMethod.GET },
    { path: 'cats', method: RequestMethod.POST },
    'cats/(.*)',
  )
  .forRoutes(CatsController);

info Hint The exclude() method supports wildcard parameters using the path-to-regexp package.

With the example above, LoggerMiddleware will be bound to all routes defined inside CatsController except the three passed to the exclude() method.

Functional middleware

The LoggerMiddleware class we've been using is quite simple. It has no members, no additional methods, and no dependencies. Why can't we just define it in a simple function instead of a class? In fact, we can. This type of middleware is called functional middleware. Let's transform the logger middleware from class-based into functional middleware to illustrate the difference:

@@filename(logger.middleware)
import { Request, Response } from 'express';

export function logger(req: Request, res: Response, next: Function) {
  console.log(`Request...`);
  next();
};
@@switch
export function logger(req, res, next) {
  console.log(`Request...`);
  next();
};

And use it within the AppModule:

@@filename(app.module)
consumer
  .apply(logger)
  .forRoutes(CatsController);

info Hint Consider using the simpler functional middleware alternative any time your middleware doesn't need any dependencies.

Multiple middleware

As mentioned above, in order to bind multiple middleware that are executed sequentially, simply provide a comma separated list inside the apply() method:

consumer.apply(cors(), helmet(), logger).forRoutes(CatsController);

Global middleware

If we want to bind middleware to every registered route at once, we can use the use() method that is supplied by the INestApplication instance:

const app = await NestFactory.create(AppModule);
app.use(logger);
await app.listen(3000);

Migration guide

This article provides a set of guidelines for migrating from Nest version 6 to version 7.

Custom route decorators

The Custom decorators API has been unified for all types of applications. Now, whether you're creating a GraphQL application or a REST API, the factory passed into the createParamDecorator() function will take the ExecutionContext (read more here) object as a second argument.

@@filename()
// Before
import { createParamDecorator } from '@nestjs/common';

export const User = createParamDecorator((data, req) => {
  return req.user;
});

// After
import { createParamDecorator, ExecutionContext } from '@nestjs/common';

export const User = createParamDecorator(
  (data: unknown, ctx: ExecutionContext) => {
    const request = ctx.switchToHttp().getRequest();
    return request.user;
  },
);
@@switch
// Before
import { createParamDecorator } from '@nestjs/common';

export const User = createParamDecorator((data, req) => {
  return req.user;
});

// After
import { createParamDecorator } from '@nestjs/common';

export const User = createParamDecorator((data, ctx) => {
  const request = ctx.switchToHttp().getRequest();
  return request.user;
});

Microservices

To avoid code duplication, the MicroserviceOptions interface has been removed from the @nestjs/common package. Therefore, now when you're creating a microservice (through either the createMicroservice() or connectMicroservice() method), you should pass the generic type parameter to get code autocompletion.

@@filename()
// Before
const app = await NestFactory.createMicroservice(AppModule);

// After
const app = await NestFactory.createMicroservice<MicroserviceOptions>(AppModule);
@@switch
// Before
const app = await NestFactory.createMicroservice(AppModule);

// After
const app = await NestFactory.createMicroservice(AppModule);

info Hint The MicroserviceOptions interface is exported from the @nestjs/microservices package.

GraphQL

In the version 6 major release of NestJS, we introduced the code-first approach as a compatibility layer between the type-graphql package and the @nestjs/graphql module. Eventually, our team decided to reimplement all the features from scratch due to a lack of flexibility. To avoid numerous breaking changes, the public API is backward-compatible and may resemble type-graphql.

In order to migrate your existing application, simply rename all the type-graphql imports to the @nestjs/graphql package. If you used more advanced features, you might need to also:

  • use Type (imported from @nestjs/common) instead of ClassType (imported from type-graphql)
  • move methods that require @Args() from object types (classes annotated with @ObjectType() decorator) under resolver classes (and use @ResolveField() decorator instead of @Field())

Terminus

In the version 7 major release of @nestjs/terminus, a new simplified API has been introduced to run health checks. The previously required peer dependency @godaddy/terminus has been removed, which allows us to integrate our health checks automatically into Swagger! Read more about the removal of @godaddy/terminus here.

For most users, the biggest change will be the removal of the TerminusModule.forRootAsync function. With the next major version, this function will be completely removed. To migrate to the new API, you will need to create a new controller, which will handle your health checks.

@@filename()
// Before
@Injectable()
export class TerminusOptionsService implements TerminusOptionsFactory {
  constructor(
    private dns: DNSHealthIndicator,
  ) {}

  createTerminusOptions(): TerminusModuleOptions {
    const healthEndpoint: TerminusEndpoint = {
      url: '/health',
      healthIndicators: [
        async () => this.dns.pingCheck('google', 'https://google.com'),
      ],
    };
    return {
      endpoints: [healthEndpoint],
    };
  }
}

@Module({
  imports: [
    TerminusModule.forRootAsync({
      useClass: TerminusOptionsService
    })
  ]
})
export class AppModule { }

// After
@Controller('health')
export class HealthController {
  constructor(
    private health: HealthCheckService,
    private dns: DNSHealthIndicator,
  ) { }

  @Get()
  @HealthCheck()
  healthCheck() {
    return this.health.check([
      async () => this.dns.pingCheck('google', 'https://google.com'),
    ]);
  }
}

@Module({
  imports: [
    TerminusModule
  ]
})
export class AppModule { }

@@switch

// Before
@Injectable()
@Dependencies(DNSHealthIndicator)
export class TerminusOptionsService {
  constructor(
    private dns,
  ) {}

  createTerminusOptions() {
    const healthEndpoint = {
      url: '/health',
      healthIndicators: [
        async () => this.dns.pingCheck('google', 'https://google.com'),
      ],
    };
    return {
      endpoints: [healthEndpoint],
    };
  }
}

@Module({
  imports: [
    TerminusModule.forRootAsync({
      useClass: TerminusOptionsService
    })
  ]
})
export class AppModule { }

// After
@Controller('/health')
@Dependencies(HealthCheckService, DNSHealthIndicator)
export class HealthController {
  constructor(
    private health,
    private dns,
  ) { }

  @Get('/')
  @HealthCheck()
  healthCheck() {
    return this.health.check([
      async () => this.dns.pingCheck('google', 'https://google.com'),
    ])
  }
}

@Module({
  controllers: [
    HealthController
  ],
  imports: [
    TerminusModule
  ]
})
export class AppModule { }

warning Warning If you have set a Global Prefix in your Nest application and you have not used the useGlobalPrefix Terminus option, the URL of your health check will change. Make sure to update the reference to that URL, or use the legacy Terminus API until nestjs/nest#963 is fixed.

If you are forced to use the legacy API, you can also disable deprecation messages for the time being.

TerminusModule.forRootAsync({
  useFactory: () => ({
    disableDeprecationWarnings: true,
    endpoints: [
      // ...
    ]
  })
})

You should enable shutdown hooks in your main.ts file. The Terminus integration will listen on POSIX signals such as SIGTERM (see the Application shutdown chapter for more information). When enabled, the health check route(s) will automatically respond with a Service Unavailable (503) HTTP error response when the server is shutting down.
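A minimal sketch of that bootstrap change (enableShutdownHooks() is part of the Nest application instance API; the module and port are the usual starter defaults):

import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  // Enables listeners for shutdown signals such as SIGTERM.
  app.enableShutdownHooks();
  await app.listen(3000);
}
bootstrap();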

With the removal of @godaddy/terminus, you will need to update your import statements to use @nestjs/terminus instead. Most notable is the import of the HealthCheckError.

@@filename(custom.health)
// Before
import { HealthCheckError } from '@godaddy/terminus';
// After
import { HealthCheckError } from '@nestjs/terminus';

Once you have fully migrated, make sure you uninstall @godaddy/terminus.

npm uninstall --save @godaddy/terminus

HTTP exceptions body

Previously, the generated response bodies for the HttpException class and other exceptions derived from it (e.g., BadRequestException or NotFoundException) were inconsistent. In the latest major release, these exception responses will follow the same structure.

/*
 * Sample outputs for "throw new ForbiddenException('Forbidden resource')"
 */

// Before
{
  "statusCode": 403,
  "message": "Forbidden resource"
}

// After
{
  "statusCode": 403,
  "message": "Forbidden resource",
  "error": "Forbidden"
}

Validation errors schema

In past releases, the ValidationPipe threw an array of the ValidationError objects returned by the class-validator package. Now, ValidationPipe will map errors to a list of plain strings representing error messages.

// Before
{
  "statusCode": 400,
  "error": "Bad Request",
  "message": [
    {
      "target": {},
      "property": "email",
      "children": [],
      "constraints": {
        "isEmail": "email must be an email"
      }
    }
  ]
}

// After
{
  "statusCode": 400,
  "message": ["email must be an email"],
  "error": "Bad Request"
}

If you prefer the previous approach, you can restore it by setting the exceptionFactory function:

new ValidationPipe({
  exceptionFactory: errors => new BadRequestException(errors),
});

Implicit type conversion (ValidationPipe)

With the auto-transformation option enabled (transform: true), the ValidationPipe will now perform conversion of primitive types. In the following example, the findOne() method takes one argument which represents an extracted id path parameter:

@Get(':id')
findOne(@Param('id') id: number) {
  console.log(typeof id === 'number'); // true
  return 'This action returns a user';
}

By default, every path parameter and query parameter comes over the network as a string. In the above example, we specified the id type as a number (in the method signature). Therefore, the ValidationPipe will try to automatically convert a string identifier to a number.
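For reference, a sketch of enabling that behavior globally (ValidationPipe is imported from @nestjs/common):

const app = await NestFactory.create(AppModule);
app.useGlobalPipes(new ValidationPipe({ transform: true }));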

Microservice channels (bidirectional communication)

To enable the request-response message type, Nest creates two logical channels - one is responsible for transferring the data while the other waits for incoming responses. For some underlying transports, such as NATS, this dual-channel support is provided out-of-the-box. For others, Nest compensates by manually creating separate channels.

Let's say that we have a single message handler @MessagePattern('getUsers'). In the past, Nest built two channels from this pattern: getUsers_ack (for requests) and getUsers_res (for responses). With version 7, this naming scheme changes. Now Nest will build getUsers (for requests) and getUsers.reply (for responses) instead. Also, specifically for the MQTT transport strategy, the response channel would be getUsers/reply (to avoid conflicts with topic wildcards).

Deprecations

All deprecations (from Nest version 5 to version 6) have been finally removed (e.g., the deprecated @ReflectMetadata decorator).

Node.js

This release drops support for Node v8. We strongly recommend using the latest LTS version.

Modules

A module is a class annotated with a @Module() decorator. The @Module() decorator provides metadata that Nest makes use of to organize the application structure.

Each application has at least one module, a root module. The root module is the starting point Nest uses to build the application graph - the internal data structure Nest uses to resolve module and provider relationships and dependencies. While very small applications may theoretically have just the root module, this is not the typical case. We want to emphasize that modules are strongly recommended as an effective way to organize your components. Thus, for most applications, the resulting architecture will employ multiple modules, each encapsulating a closely related set of capabilities.

The @Module() decorator takes a single object whose properties describe the module:

providers the providers that will be instantiated by the Nest injector and that may be shared at least across this module
controllers the set of controllers defined in this module which have to be instantiated
imports the list of imported modules that export the providers which are required in this module
exports the subset of providers that are provided by this module and should be available in other modules which import this module

The module encapsulates providers by default. This means that it's impossible to inject providers that are neither directly part of the current module nor exported from the imported modules. Thus, you may consider the exported providers from a module as the module's public interface, or API.

Feature modules

The CatsController and CatsService belong to the same application domain. As they are closely related, it makes sense to move them into a feature module. A feature module simply organizes code relevant for a specific feature, keeping code organized and establishing clear boundaries. This helps us manage complexity and develop with SOLID principles, especially as the size of the application and/or team grow.

To demonstrate this, we'll create the CatsModule.

@@filename(cats/cats.module)
import { Module } from '@nestjs/common';
import { CatsController } from './cats.controller';
import { CatsService } from './cats.service';

@Module({
  controllers: [CatsController],
  providers: [CatsService],
})
export class CatsModule {}

info Hint To create a module using the CLI, simply execute the $ nest g module cats command.

Above, we defined the CatsModule in the cats.module.ts file, and moved everything related to this module into the cats directory. The last thing we need to do is import this module into the root module (the AppModule, defined in the app.module.ts file).

@@filename(app.module)
import { Module } from '@nestjs/common';
import { CatsModule } from './cats/cats.module';

@Module({
  imports: [CatsModule],
})
export class AppModule {}

Here is how our directory structure looks now:

src
cats
dto
create-cat.dto.ts
interfaces
cat.interface.ts
cats.service.ts
cats.controller.ts
cats.module.ts
app.module.ts
main.ts

Shared modules

In Nest, modules are singletons by default, and thus you can share the same instance of any provider between multiple modules effortlessly.

Every module is automatically a shared module. Once created it can be reused by any module. Let's imagine that we want to share an instance of the CatsService between several other modules. In order to do that, we first need to export the CatsService provider by adding it to the module's exports array, as shown below:

@@filename(cats.module)
import { Module } from '@nestjs/common';
import { CatsController } from './cats.controller';
import { CatsService } from './cats.service';

@Module({
  controllers: [CatsController],
  providers: [CatsService],
  exports: [CatsService]
})
export class CatsModule {}

Now any module that imports the CatsModule has access to the CatsService and will share the same instance with all other modules that import it as well.
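A sketch of a consuming module (ReportsModule and ReportsService are hypothetical names used for illustration):

@@filename(reports.module)
import { Module } from '@nestjs/common';
import { CatsModule } from './cats/cats.module';
import { ReportsService } from './reports.service';

@Module({
  imports: [CatsModule],
  providers: [ReportsService],
})
export class ReportsModule {}

Any provider declared in ReportsModule (such as the hypothetical ReportsService) can now inject CatsService through its constructor and will receive the same singleton instance.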

Module re-exporting

As seen above, Modules can export their internal providers. In addition, they can re-export modules that they import. In the example below, the CommonModule is both imported into and exported from the CoreModule, making it available for other modules which import this one.

@Module({
  imports: [CommonModule],
  exports: [CommonModule],
})
export class CoreModule {}

Dependency injection

A module class can inject providers as well (e.g., for configuration purposes):

@@filename(cats.module)
import { Module } from '@nestjs/common';
import { CatsController } from './cats.controller';
import { CatsService } from './cats.service';

@Module({
  controllers: [CatsController],
  providers: [CatsService],
})
export class CatsModule {
  constructor(private catsService: CatsService) {}
}
@@switch
import { Module, Dependencies } from '@nestjs/common';
import { CatsController } from './cats.controller';
import { CatsService } from './cats.service';

@Module({
  controllers: [CatsController],
  providers: [CatsService],
})
@Dependencies(CatsService)
export class CatsModule {
  constructor(catsService) {
    this.catsService = catsService;
  }
}

However, module classes themselves cannot be injected as providers due to circular dependency.

Global modules

If you have to import the same set of modules everywhere, it can get tedious. Unlike in Nest, Angular providers are registered in the global scope. Once defined, they're available everywhere. Nest, however, encapsulates providers inside the module scope. You aren't able to use a module's providers elsewhere without first importing the encapsulating module.

When you want to provide a set of providers which should be available everywhere out-of-the-box (e.g., helpers, database connections, etc.), make the module global with the @Global() decorator.

import { Module, Global } from '@nestjs/common';
import { CatsController } from './cats.controller';
import { CatsService } from './cats.service';

@Global()
@Module({
  controllers: [CatsController],
  providers: [CatsService],
  exports: [CatsService],
})
export class CatsModule {}

The @Global() decorator makes the module global-scoped. Global modules should be registered only once, generally by the root or core module. In the above example, the CatsService provider will be ubiquitous, and modules that wish to inject the service will not need to import the CatsModule in their imports array.

info Hint Making everything global is not a good design decision. Global modules are available to reduce the amount of necessary boilerplate. The imports array is generally the preferred way to make the module's API available to consumers.

Dynamic modules

The Nest module system includes a powerful feature called dynamic modules. This feature enables you to easily create customizable modules that can register and configure providers dynamically. Dynamic modules are covered extensively here. In this chapter, we'll give a brief overview to complete the introduction to modules.

Following is an example of a dynamic module definition for a DatabaseModule:

@@filename()
import { Module, DynamicModule } from '@nestjs/common';
import { createDatabaseProviders } from './database.providers';
import { Connection } from './connection.provider';

@Module({
  providers: [Connection],
})
export class DatabaseModule {
  static forRoot(entities = [], options?): DynamicModule {
    const providers = createDatabaseProviders(options, entities);
    return {
      module: DatabaseModule,
      providers: providers,
      exports: providers,
    };
  }
}
@@switch
import { Module } from '@nestjs/common';
import { createDatabaseProviders } from './database.providers';
import { Connection } from './connection.provider';

@Module({
  providers: [Connection],
})
export class DatabaseModule {
  static forRoot(entities = [], options?) {
    const providers = createDatabaseProviders(options, entities);
    return {
      module: DatabaseModule,
      providers: providers,
      exports: providers,
    };
  }
}

info Hint The forRoot() method may return a dynamic module either synchronously or asynchronously (i.e., via a Promise).

This module defines the Connection provider by default (in the @Module() decorator metadata), but additionally - depending on the entities and options objects passed into the forRoot() method - exposes a collection of providers, for example, repositories. Note that the properties returned by the dynamic module extend (rather than override) the base module metadata defined in the @Module() decorator. That's how both the statically declared Connection provider and the dynamically generated repository providers are exported from the module.
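
For context, the createDatabaseProviders() helper is not shown in this chapter. A rough sketch of what it might look like follows; the `${entity.name}Repository` token and the connection.getRepository() call are hypothetical, used only to illustrate the shape of the providers that forRoot() generates:

import { Connection } from './connection.provider';

export function createDatabaseProviders(options: any = {}, entities: any[] = []) {
  // Build one custom provider per entity, backed by the Connection provider.
  // The token naming convention and the getRepository() call are illustrative only.
  return entities.map((entity) => ({
    provide: `${entity.name}Repository`,
    useFactory: (connection: any) => connection.getRepository(entity, options),
    inject: [Connection],
  }));
}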

If you want to register a dynamic module in the global scope, set the global property to true.

{
  global: true,
  module: DatabaseModule,
  providers: providers,
  exports: providers,
}

warning Warning As mentioned above, making everything global is not a good design decision.

The DatabaseModule can be imported and configured in the following manner:

import { Module } from '@nestjs/common';
import { DatabaseModule } from './database/database.module';
import { User } from './users/entities/user.entity';

@Module({
  imports: [DatabaseModule.forRoot([User])],
})
export class AppModule {}

If you want to in turn re-export a dynamic module, you can omit the forRoot() method call in the exports array:

import { Module } from '@nestjs/common';
import { DatabaseModule } from './database/database.module';
import { User } from './users/entities/user.entity';

@Module({
  imports: [DatabaseModule.forRoot([User])],
  exports: [DatabaseModule],
})
export class AppModule {}

The Dynamic modules chapter covers this topic in greater detail, and includes a working example.

Pipes

A pipe is a class annotated with the @Injectable() decorator. Pipes should implement the PipeTransform interface.

Pipes have two typical use cases:

  • transformation: transform input data to the desired form (e.g., from string to integer)
  • validation: evaluate input data and if valid, simply pass it through unchanged; otherwise, throw an exception when the data is incorrect

In both cases, pipes operate on the arguments being processed by a controller route handler. Nest interposes a pipe just before a method is invoked, and the pipe receives the arguments destined for the method and operates on them. Any transformation or validation operation takes place at that time, after which the route handler is invoked with any (potentially) transformed arguments.

Nest comes with a number of built-in pipes that you can use out-of-the-box. You can also build your own custom pipes. In this chapter, we'll introduce the built-in pipes and show how to bind them to route handlers. We'll then examine several custom-built pipes to show how you can build one from scratch.

info Hint Pipes run inside the exceptions zone. This means that when a Pipe throws an exception it is handled by the exceptions layer (global exceptions filter and any exceptions filters that are applied to the current context). Given the above, it should be clear that when an exception is thrown in a Pipe, no controller method is subsequently executed. This gives you a best-practice technique for validating data coming into the application from external sources at the system boundary.

Built-in pipes

Nest comes with six pipes available out-of-the-box:

  • ValidationPipe
  • ParseIntPipe
  • ParseBoolPipe
  • ParseArrayPipe
  • ParseUUIDPipe
  • DefaultValuePipe

They're exported from the @nestjs/common package.

Let's take a quick look at using ParseIntPipe. This is an example of the transformation use case, where the pipe ensures that a method handler parameter is converted to a JavaScript integer (or throws an exception if the conversion fails). Later in this chapter, we'll show a simple custom implementation for a ParseIntPipe. The example techniques below also apply to the other built-in transformation pipes (ParseBoolPipe, ParseArrayPipe and ParseUUIDPipe, which we'll refer to as the Parse* pipes in this chapter).

Binding pipes

To use a pipe, we need to bind an instance of the pipe class to the appropriate context. In our ParseIntPipe example, we want to associate the pipe with a particular route handler method, and make sure it runs before the method is called. We do so with the following construct, which we'll refer to as binding the pipe at the method parameter level:

@Get(':id')
async findOne(@Param('id', ParseIntPipe) id: number) {
  return this.catsService.findOne(id);
}

This ensures that one of the following two conditions is true: either the parameter we receive in the findOne() method is a number (as expected in our call to this.catsService.findOne()), or an exception is thrown before the route handler is called.

For example, assume the route is called like:

GET localhost:3000/abc

Nest will throw an exception like this:

{
  "statusCode": 400,
  "message": "Validation failed (numeric string is expected)",
  "error": "Bad Request"
}

The exception will prevent the body of the findOne() method from executing.

In the example above, we pass a class (ParseIntPipe), not an instance, leaving responsibility for instantiation to the framework and enabling dependency injection. As with pipes and guards, we can instead pass an in-place instance. Passing an in-place instance is useful if we want to customize the built-in pipe's behavior by passing options:

@Get(':id')
async findOne(
  @Param('id', new ParseIntPipe({ errorHttpStatusCode: HttpStatus.NOT_ACCEPTABLE }))
  id: number,
) {
  return this.catsService.findOne(id);
}

Binding the other transformation pipes (all of the Parse* pipes) works similarly. These pipes all work in the context of validating route parameters, query string parameters and request body values.

For example with a query string parameter:

@Get()
async findOne(@Query('id', ParseIntPipe) id: number) {
  return this.catsService.findOne(id);
}

Here's an example of using the ParseUUIDPipe to parse a string parameter and validate if it is a UUID.

@@filename()
@Get(':uuid')
async findOne(@Param('uuid', new ParseUUIDPipe()) uuid: string) {
  return this.catsService.findOne(uuid);
}
@@switch
@Get(':uuid')
@Bind(Param('uuid', new ParseUUIDPipe()))
async findOne(uuid) {
  return this.catsService.findOne(uuid);
}

info Hint When using ParseUUIDPipe() you are parsing a UUID of version 3, 4, or 5. If you only require a specific version of UUID, you can pass the version in the pipe options.
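
For example, to accept only version 4 UUIDs, the handler above could be bound as follows (a small variation of the same route):

@Get(':uuid')
async findOne(@Param('uuid', new ParseUUIDPipe({ version: '4' })) uuid: string) {
  return this.catsService.findOne(uuid);
}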

Above we've seen examples of binding the various Parse* family of built-in pipes. Binding validation pipes is a little bit different; we'll discuss that in the following section.

info Hint Also, see Validation techniques for extensive examples of validation pipes.

Custom pipes

As mentioned, you can build your own custom pipes. While Nest provides a robust built-in ParseIntPipe and ValidationPipe, let's build simple custom versions of each from scratch to see how custom pipes are constructed.

We start with a simple ValidationPipe. Initially, we'll have it simply take an input value and immediately return the same value, behaving like an identity function.

@@filename(validation.pipe)
import { PipeTransform, Injectable, ArgumentMetadata } from '@nestjs/common';

@Injectable()
export class ValidationPipe implements PipeTransform {
  transform(value: any, metadata: ArgumentMetadata) {
    return value;
  }
}
@@switch
import { Injectable } from '@nestjs/common';

@Injectable()
export class ValidationPipe {
  transform(value, metadata) {
    return value;
  }
}

info Hint PipeTransform<T, R> is a generic interface that must be implemented by any pipe. The generic interface uses T to indicate the type of the input value, and R to indicate the return type of the transform() method.

Every pipe must implement the transform() method to fulfill the PipeTransform interface contract. This method has two parameters:

  • value
  • metadata

The value parameter is the currently processed method argument (before it is received by the route handling method), and metadata is the currently processed method argument's metadata. The metadata object has these properties:

export interface ArgumentMetadata {
  type: 'body' | 'query' | 'param' | 'custom';
  metatype?: Type<unknown>;
  data?: string;
}

These properties describe the currently processed argument.

type Indicates whether the argument is a body @Body(), query @Query(), param @Param(), or a custom parameter (read more here).
metatype Provides the metatype of the argument, for example, String. Note: the value is undefined if you either omit a type declaration in the route handler method signature, or use vanilla JavaScript.
data The string passed to the decorator, for example @Body('string'). It's undefined if you leave the decorator parenthesis empty.
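
To make this concrete, here is an illustrative sketch of the metadata a pipe would receive for a handler parameter declared as @Body('name') name: string:

import { ArgumentMetadata } from '@nestjs/common';

// For @Body('name') name: string, the pipe receives roughly this metadata
const exampleMetadata: ArgumentMetadata = {
  type: 'body',
  metatype: String,
  data: 'name',
};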

warning Warning TypeScript interfaces disappear during transpilation. Thus, if a method parameter's type is declared as an interface instead of a class, the metatype value will be Object.
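
To illustrate that warning, the sketch below contrasts the two declarations (the names are hypothetical; only the class form retains runtime type information for the pipe):

// Declared as an interface: erased during transpilation, so the pipe sees metatype === Object
export interface CreateCatDtoAsInterface {
  name: string;
  age: number;
  breed: string;
}

// Declared as a class: preserved at runtime, so the pipe sees metatype === CreateCatDto
export class CreateCatDto {
  name: string;
  age: number;
  breed: string;
}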

Schema based validation

Let's make our validation pipe a little more useful. Take a closer look at the create() method of the CatsController, where we probably would like to ensure that the post body object is valid before attempting to run our service method.

@@filename()
@Post()
async create(@Body() createCatDto: CreateCatDto) {
  this.catsService.create(createCatDto);
}
@@switch
@Post()
async create(@Body() createCatDto) {
  this.catsService.create(createCatDto);
}

Let's focus in on the createCatDto body parameter. Its type is CreateCatDto:

@@filename(create-cat.dto)
export class CreateCatDto {
  name: string;
  age: number;
  breed: string;
}

We want to ensure that any incoming request to the create method contains a valid body. So we have to validate the three members of the createCatDto object. We could do this inside the route handler method, but doing so is not ideal as it would break the single responsibility rule (SRP).

Another approach could be to create a validator class and delegate the task there. This has the disadvantage that we would have to remember to call this validator at the beginning of each method.

How about creating validation middleware? This could work, but unfortunately it's not possible to create generic middleware which can be used across all contexts across the whole application. This is because middleware is unaware of the execution context, including the handler that will be called and any of its parameters.

This is, of course, exactly the use case for which pipes are designed. So let's go ahead and refine our validation pipe.

Object schema validation

There are several approaches available for doing object validation in a clean, DRY way. One common approach is to use schema-based validation. Let's go ahead and try that approach.

The Joi library allows you to create schemas in a straightforward way, with a readable API. Let's build a validation pipe that makes use of Joi-based schemas.

Start by installing the required package:

$ npm install --save @hapi/joi
$ npm install --save-dev @types/hapi__joi

In the code sample below, we create a simple class that takes a schema as a constructor argument. We then apply the schema.validate() method, which validates our incoming argument against the provided schema.

As noted earlier, a validation pipe either returns the value unchanged, or throws an exception.

In the next section, you'll see how we supply the appropriate schema for a given controller method using the @UsePipes() decorator. Doing so makes our validation pipe re-usable across contexts, just as we set out to do.

@@filename()
import { PipeTransform, Injectable, ArgumentMetadata, BadRequestException } from '@nestjs/common';
import { ObjectSchema } from '@hapi/joi';

@Injectable()
export class JoiValidationPipe implements PipeTransform {
  constructor(private schema: ObjectSchema) {}

  transform(value: any, metadata: ArgumentMetadata) {
    const { error } = this.schema.validate(value);
    if (error) {
      throw new BadRequestException('Validation failed');
    }
    return value;
  }
}
@@switch
import { Injectable, BadRequestException } from '@nestjs/common';

@Injectable()
export class JoiValidationPipe {
  constructor(schema) {
    this.schema = schema;
  }

  transform(value, metadata) {
    const { error } = this.schema.validate(value);
    if (error) {
      throw new BadRequestException('Validation failed');
    }
    return value;
  }
}

Binding validation pipes

Earlier, we saw how to bind transformation pipes (like ParseIntPipe and the rest of the Parse* pipes).

Binding validation pipes is also very straightforward.

In this case, we want to bind the pipe at the method call level. In our current example, we need to do the following to use the JoiValidationPipe:

  1. Create an instance of the JoiValidationPipe
  2. Pass the context-specific Joi schema in the class constructor of the pipe
  3. Bind the pipe to the method

We do that using the @UsePipes() decorator as shown below:

@@filename()
@Post()
@UsePipes(new JoiValidationPipe(createCatSchema))
async create(@Body() createCatDto: CreateCatDto) {
  this.catsService.create(createCatDto);
}
@@switch
@Post()
@Bind(Body())
@UsePipes(new JoiValidationPipe(createCatSchema))
async create(createCatDto) {
  this.catsService.create(createCatDto);
}

info Hint The @UsePipes() decorator is imported from the @nestjs/common package.
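
The createCatSchema referenced above is not defined in this chapter. A minimal sketch using the @hapi/joi package installed earlier might look like the following (the exact validation rules are illustrative):

import * as Joi from '@hapi/joi';

export const createCatSchema = Joi.object({
  name: Joi.string().required(),
  age: Joi.number().integer().required(),
  breed: Joi.string().required(),
});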

Class validator

warning Warning The techniques in this section require TypeScript, and are not available if your app is written using vanilla JavaScript.

Let's look at an alternate implementation for our validation technique.

Nest works well with the class-validator library. This powerful library allows you to use decorator-based validation. Decorator-based validation is extremely powerful, especially when combined with Nest's Pipe capabilities since we have access to the metatype of the processed property. Before we start, we need to install the required packages:

$ npm i --save class-validator class-transformer

Once these are installed, we can add a few decorators to the CreateCatDto class. Here we see a significant advantage of this technique: the CreateCatDto class remains the single source of truth for our Post body object (rather than having to create a separate validation class).

@@filename(create-cat.dto)
import { IsString, IsInt } from 'class-validator';

export class CreateCatDto {
  @IsString()
  name: string;

  @IsInt()
  age: number;

  @IsString()
  breed: string;
}

info Hint Read more about the class-validator decorators here.

Now we can create a ValidationPipe class that uses these annotations.

@@filename(validation.pipe)
import { PipeTransform, Injectable, ArgumentMetadata, BadRequestException } from '@nestjs/common';
import { validate } from 'class-validator';
import { plainToClass } from 'class-transformer';

@Injectable()
export class ValidationPipe implements PipeTransform<any> {
  async transform(value: any, { metatype }: ArgumentMetadata) {
    if (!metatype || !this.toValidate(metatype)) {
      return value;
    }
    const object = plainToClass(metatype, value);
    const errors = await validate(object);
    if (errors.length > 0) {
      throw new BadRequestException('Validation failed');
    }
    return value;
  }

  private toValidate(metatype: Function): boolean {
    const types: Function[] = [String, Boolean, Number, Array, Object];
    return !types.includes(metatype);
  }
}

warning Notice Above, we have used the class-transformer library. It's made by the same author as the class-validator library, and as a result, they play very well together.

Let's go through this code. First, note that the transform() method is marked as async. This is possible because Nest supports both synchronous and asynchronous pipes. We make this method async because some of the class-validator validations can be async (utilize Promises).

Next note that we are using destructuring to extract the metatype field (extracting just this member from an ArgumentMetadata) into our metatype parameter. This is just shorthand for getting the full ArgumentMetadata and then having an additional statement to assign the metatype variable.

Next, note the helper function toValidate(). It's responsible for bypassing the validation step when the current argument being processed is a native JavaScript type (these can't have validation decorators attached, so there's no reason to run them through the validation step).

Next, we use the class-transformer function plainToClass() to transform our plain JavaScript argument object into a typed object so that we can apply validation. The reason we must do this is that the incoming post body object, when deserialized from the network request, does not have any type information (this is the way the underlying platform, such as Express, works). Class-validator needs to use the validation decorators we defined for our DTO earlier, so we need to perform this transformation to treat the incoming body as an appropriately decorated object, not just a plain vanilla object.

Finally, as noted earlier, since this is a validation pipe it either returns the value unchanged, or throws an exception.

The last step is to bind the ValidationPipe. Pipes can be parameter-scoped, method-scoped, controller-scoped, or global-scoped. Earlier, with our Joi-based validation pipe, we saw an example of binding the pipe at the method level. In the example below, we'll bind the pipe instance to the route handler @Body() decorator so that our pipe is called to validate the post body.

@@filename(cats.controller)
@Post()
async create(
  @Body(new ValidationPipe()) createCatDto: CreateCatDto,
) {
  this.catsService.create(createCatDto);
}

Parameter-scoped pipes are useful when the validation logic concerns only one specified parameter.

Global scoped pipes

Since the ValidationPipe was created to be as generic as possible, we can realize its full utility by setting it up as a global-scoped pipe so that it is applied to every route handler across the entire application.

@@filename(main)
async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  app.useGlobalPipes(new ValidationPipe());
  await app.listen(3000);
}
bootstrap();

warning Notice In the case of hybrid apps the useGlobalPipes() method doesn't set up pipes for gateways and microservices. For "standard" (non-hybrid) microservice apps, useGlobalPipes() does mount pipes globally.

Global pipes are used across the whole application, for every controller and every route handler.

Note that in terms of dependency injection, global pipes registered from outside of any module (with useGlobalPipes() as in the example above) cannot inject dependencies since the binding has been done outside the context of any module. In order to solve this issue, you can set up a global pipe directly from any module using the following construction:

@@filename(app.module)
import { Module } from '@nestjs/common';
import { APP_PIPE } from '@nestjs/core';

@Module({
  providers: [
    {
      provide: APP_PIPE,
      useClass: ValidationPipe,
    },
  ],
})
export class AppModule {}

info Hint When using this approach to perform dependency injection for the pipe, note that regardless of the module where this construction is employed, the pipe is, in fact, global. Where should this be done? Choose the module where the pipe (ValidationPipe in the example above) is defined. Also, useClass is not the only way of dealing with custom provider registration. Learn more here.

Transformation use case

Validation isn't the only use case for custom pipes. At the beginning of this chapter, we mentioned that a pipe can also transform the input data to the desired format. This is possible because the value returned from the transform function completely overrides the previous value of the argument.

When is this useful? Consider that sometimes the data passed from the client needs to undergo some change - for example converting a string to an integer - before it can be properly handled by the route handler method. Furthermore, some required data fields may be missing, and we would like to apply default values. Transformation pipes can perform these functions by interposing a processing function between the client request and the request handler.

Here's a simple ParseIntPipe which is responsible for parsing a string into an integer value. (As noted above, Nest has a built-in ParseIntPipe that is more sophisticated; we include this as a simple example of a custom transformation pipe).

@@filename(parse-int.pipe)
import { PipeTransform, Injectable, ArgumentMetadata, BadRequestException } from '@nestjs/common';

@Injectable()
export class ParseIntPipe implements PipeTransform<string, number> {
  transform(value: string, metadata: ArgumentMetadata): number {
    const val = parseInt(value, 10);
    if (isNaN(val)) {
      throw new BadRequestException('Validation failed');
    }
    return val;
  }
}
@@switch
import { Injectable, BadRequestException } from '@nestjs/common';

@Injectable()
export class ParseIntPipe {
  transform(value, metadata) {
    const val = parseInt(value, 10);
    if (isNaN(val)) {
      throw new BadRequestException('Validation failed');
    }
    return val;
  }
}

We can then bind this pipe to the selected param as shown below:

@@filename()
@Get(':id')
async findOne(@Param('id', new ParseIntPipe()) id) {
  return this.catsService.findOne(id);
}
@@switch
@Get(':id')
@Bind(Param('id', new ParseIntPipe()))
async findOne(id) {
  return this.catsService.findOne(id);
}

Another useful transformation case would be to select an existing user entity from the database using an id supplied in the request:

@@filename()
@Get(':id')
findOne(@Param('id', UserByIdPipe) userEntity: UserEntity) {
  return userEntity;
}
@@switch
@Get(':id')
@Bind(Param('id', UserByIdPipe))
findOne(userEntity) {
  return userEntity;
}

We leave the implementation of this pipe to the reader, but note that like all other transformation pipes, it receives an input value (an id) and returns an output value (a UserEntity object). This can make your code more declarative and DRY by abstracting boilerplate code out of your handler and into a common pipe.
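
Although the implementation is left to the reader, one possible sketch is shown below. The UsersService, its findOne() method, and the import paths are hypothetical; the point is only that the pipe receives the raw id and returns the resolved entity:

import { ArgumentMetadata, Injectable, NotFoundException, PipeTransform } from '@nestjs/common';
import { UsersService } from './users.service'; // hypothetical service
import { UserEntity } from './user.entity'; // hypothetical entity

@Injectable()
export class UserByIdPipe implements PipeTransform<string, Promise<UserEntity>> {
  constructor(private readonly usersService: UsersService) {}

  async transform(id: string, metadata: ArgumentMetadata): Promise<UserEntity> {
    // Pipes may be asynchronous, so the (hypothetical) database lookup can be awaited here
    const user = await this.usersService.findOne(id);
    if (!user) {
      throw new NotFoundException(`User with id ${id} not found`);
    }
    return user;
  }
}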

Providing defaults

Parse* pipes expect a parameter's value to be defined. They throw an exception upon receiving null or undefined values. To allow an endpoint to handle missing querystring parameter values, we have to provide a default value to be injected before the Parse* pipes operate on these values. The DefaultValuePipe serves that purpose. Simply instantiate a DefaultValuePipe in the @Query() decorator before the relevant Parse* pipe, as shown below:

@@filename()
@Get()
async findAll(
  @Query('activeOnly', new DefaultValuePipe(false), ParseBoolPipe) activeOnly: boolean,
  @Query('page', new DefaultValuePipe(0), ParseIntPipe) page: number,
) {
  return this.catsService.findAll({ activeOnly, page });
}

The built-in ValidationPipe

As a reminder, you don't have to build a generic validation pipe on your own since the ValidationPipe is provided by Nest out-of-the-box. The built-in ValidationPipe offers more options than the sample we built in this chapter, which has been kept basic for the sake of illustrating the mechanics of a custom-built pipe. You can find full details, along with lots of examples here.
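
For example, the built-in pipe can be registered globally with a couple of its options enabled; transform and whitelist are options of the built-in ValidationPipe, and the bootstrap shape below mirrors the earlier example:

import { NestFactory } from '@nestjs/core';
import { ValidationPipe } from '@nestjs/common';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  // transform: convert incoming payloads to DTO class instances
  // whitelist: strip properties that have no validation decorators
  app.useGlobalPipes(new ValidationPipe({ transform: true, whitelist: true }));
  await app.listen(3000);
}
bootstrap();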

Support

Nest is an MIT-licensed open source project with its ongoing development made possible thanks to the support of the community. This framework is the result of a long road, full of sleepless nights, working after hours, and busy weekends.

How can you help?

Nest doesn't have a large company behind it continuously paying for the hours spent on development. I rely fully on the goodness ❤️ of the people. However, I would love to make this framework even more powerful and to be fully focused on delivering you great solutions that make the coding process enjoyable. To help me do that, I run a few supporting platforms:

If you have fallen in love with Nest, or you run a business which is using Nest, consider sponsoring its development to ensure that the project your product relies on is actively maintained and improved. Also, your support could help me work more on content that benefits the whole Nest community, whether writing educational blog posts or recording videos.

Contributing to Nest

We would love for you to contribute to Nest and help make it even better than it is today! As a contributor, here are the guidelines we would like you to follow:

Got a Question or Problem?

Do not open issues for general support questions as we want to keep GitHub issues for bug reports and feature requests. You've got a much better chance of getting your question answered on Stack Overflow, where questions should be tagged with the nestjs tag.

Stack Overflow is a much better place to ask questions since:

  • questions and answers stay available for public viewing so your question / answer might help someone else
  • Stack Overflow's voting system assures that the best answers are prominently visible.

To save your time and ours, we will systematically close all issues that are requests for general support and redirect people to Stack Overflow.

If you would like to chat about the question in real-time, you can reach out via our gitter channel.

Found a Bug?

If you find a bug in the source code, you can help us by submitting an issue to our GitHub Repository. Even better, you can submit a Pull Request with a fix.

Missing a Feature?

You can request a new feature by submitting an issue to our GitHub Repository. If you would like to implement a new feature, please submit an issue with a proposal for your work first, to be sure that we can use it. Please consider what kind of change it is:

  • For a Major Feature, first open an issue and outline your proposal so that it can be discussed. This will also allow us to better coordinate our efforts, prevent duplication of work, and help you to craft the change so that it is successfully accepted into the project. For your issue name, please prefix your proposal with [discussion], for example "[discussion]: your feature idea".
  • Small Features can be crafted and directly submitted as a Pull Request.

Submission Guidelines

Submitting an Issue

Before you submit an issue, please search the issue tracker, maybe an issue for your problem already exists and the discussion might inform you of workarounds readily available.

Unfortunately, we are not able to investigate / fix bugs without a minimal reproduction, so if we don't hear back from you, we will close issues that don't have enough information to be reproduced.

You can file new issues by filling out our new issue form.

Submitting a Pull Request (PR)

Before you submit your Pull Request (PR) consider the following guidelines:

  1. Search GitHub for an open or closed PR that relates to your submission. You don't want to duplicate effort.
  2. Fork the nestjs/nest repo.

  3. Make your changes in a new git branch:

    git checkout -b my-fix-branch master
  4. Create your patch, including appropriate test cases.

  5. Follow our Coding Rules.

  6. Run the full Nest test suite (see common scripts), and ensure that all tests pass.

  7. Commit your changes using a descriptive commit message that follows our commit message conventions. Adherence to these conventions is necessary because release notes are automatically generated from these messages.

    git commit -a

    Note: the optional commit -a command line option will automatically "add" and "rm" edited files.

  8. Push your branch to GitHub:

    git push origin my-fix-branch
  9. In GitHub, send a pull request to nestjs:master.

  • If we suggest changes then:

    • Make the required updates.

    • Re-run the Nest test suites to ensure tests are still passing.

    • Rebase your branch and force push to your GitHub repository (this will update your Pull Request):

      git rebase master -i
      git push -f

That's it! Thank you for your contribution!

After your pull request is merged

After your pull request is merged, you can safely delete your branch and pull the changes from the main (upstream) repository:

  • Delete the remote branch on GitHub either through the GitHub web UI or your local shell as follows:

    git push origin --delete my-fix-branch
  • Check out the master branch:

    git checkout master -f
  • Delete the local branch:

    git branch -D my-fix-branch
  • Update your master with the latest upstream version:

    git pull --ff upstream master

Coding Rules

To ensure consistency throughout the source code, keep these rules in mind as you are working:

  • All features or bug fixes must be tested by one or more specs (unit-tests).

Commit Message Guidelines

We have very precise rules over how our git commit messages can be formatted. This leads to more readable messages that are easy to follow when looking through the project history. But also, we use the git commit messages to generate the Nest change log.

Commit Message Format

Each commit message consists of a header, a body and a footer. The header has a special format that includes a type, a scope and a subject:

<type>(<scope>): <subject>
<BLANK LINE>
<body>
<BLANK LINE>
<footer>

The header is mandatory and the scope of the header is optional.

Any line of the commit message cannot be longer than 100 characters! This allows the message to be easier to read on GitHub as well as in various git tools.

Footer should contain a closing reference to an issue if any.

Samples: (even more samples)

docs(changelog): update change log to beta.5
bugfix(@nestjs/core): need to depend on latest rxjs and zone.js

The version in our package.json gets copied to the one we publish, and users need the latest of these.

Revert

If the commit reverts a previous commit, it should begin with revert:, followed by the header of the reverted commit. In the body it should say: This reverts commit <hash>., where the hash is the SHA of the commit being reverted.

Type

Must be one of the following:

  • build: Changes that affect the build system or external dependencies (example scopes: gulp, broccoli, npm)
  • ci: Changes to our CI configuration files and scripts (example scopes: Travis, Circle, BrowserStack, SauceLabs)
  • docs: Documentation only changes
  • feature: A new feature
  • bugfix: A bug fix
  • perf: A code change that improves performance
  • refactor: A code change that neither fixes a bug nor adds a feature
  • style: Changes that do not affect the meaning of the code (white-space, formatting, missing semi-colons, etc)
  • test: Adding missing tests or correcting existing tests

Subject

The subject contains succinct description of the change:

  • use the imperative, present tense: "change" not "changed" nor "changes"
  • don't capitalize first letter
  • no dot (.) at the end

Body

Just as in the subject, use the imperative, present tense: "change" not "changed" nor "changes". The body should include the motivation for the change and contrast this with previous behavior.

Footer

The footer should contain any information about Breaking Changes and is also the place to reference GitHub issues that this commit Closes.

Breaking Changes should start with the word BREAKING CHANGE: with a space or two newlines. The rest of the commit message is then used for this.

A detailed explanation can be found in this document.


A progressive Node.js framework for building efficient and scalable server-side applications.


Description

This project is built on top of the Angular CLI. It uses the Dgeni documentation generator to compile source documentation in markdown format into the published format. The repository contains the source code of docs.nestjs.com, the official Nest documentation.

Installing

Install project dependencies and start a local server with the following terminal commands:

$ npm install
$ npm run start

Navigate to http://localhost:4200/.

All pages are written in markdown and located in the content directory.

Build

Run npm run build to build the project. The build artifacts will be stored in the dist/ directory.

To run build in watch mode, run npm run build:watch. Any content changes will be recompiled and rebuilt, and the content served at http://localhost:4200/.

Use npm run build:prod for a production build.

Support

Nest is an MIT-licensed open source project. It can grow thanks to the sponsors and support by the amazing backers. If you'd like to join them, please read more here.

Stay in touch

License

Nest is MIT licensed.

Libraries

Many applications need to solve the same general problems, or re-use a modular component in several different contexts. Nest has a few ways of addressing this, but each works at a different level to solve the problem in a way that helps meet different architectural and organizational objectives.

Nest modules are useful for providing an execution context that enables sharing components within a single application. Modules can also be packaged with npm to create a reusable library that can be installed in different projects. This can be an effective way to distribute configurable, re-usable libraries that can be used by different, loosely connected or unaffiliated organizations (e.g., by distributing/installing 3rd party libraries).

For sharing code within closely organized groups (e.g., within company/project boundaries), it can be useful to have a more lightweight approach to sharing components. Monorepos have arisen as a construct to enable that, and within a monorepo, a library provides a way to share code in an easy, lightweight fashion. In a Nest monorepo, using libraries enables easy assembly of applications that share components. In fact, this encourages decomposition of monolithic applications and development processes to focus on building and composing modular components.

Nest libraries

A Nest library is a Nest project that differs from an application in that it cannot run on its own. A library must be imported into a containing application in order for its code to execute. The built-in support for libraries described in this section is only available for monorepos (standard mode projects can achieve similar functionality using npm packages).

For example, an organization may develop an AuthModule that manages authentication by implementing company policies that govern all internal applications. Rather than build that module separately for each application, or physically packaging the code with npm and requiring each project to install it, a monorepo can define this module as a library. When organized this way, all consumers of the library module can see an up-to-date version of the AuthModule as it is committed. This can have significant benefits for coordinating component development and assembly, and simplifying end-to-end testing.

Creating libraries

Any functionality that is suitable for re-use is a candidate for being managed as a library. Deciding what should be a library, and what should be part of an application, is an architectural design decision. Creating libraries involves more than simply copying code from an existing application to a new library. When packaged as a library, the library code must be decoupled from the application. This may require more time up front and force some design decisions that you may not face with more tightly coupled code. But this additional effort can pay off when the library can be used to enable more rapid application assembly across multiple applications.

To get started with creating a library, run the following command:

nest g library my-library

When you run the command, the library schematic prompts you for a prefix (AKA alias) for the library:

What prefix would you like to use for the library (default: @app)?

This creates a new project in your workspace called my-library. A library-type project, like an application-type project, is generated into a named folder using a schematic. Libraries are managed under the libs folder of the monorepo root. Nest creates the libs folder the first time a library is created.

The files generated for a library are slightly different from those generated for an application. Here are the contents of the libs folder after executing the command above:

libs
  my-library
    src
      my-library.service.ts
      my-library.module.ts
      index.ts
    tsconfig.lib.json

The nest-cli.json file will have a new entry for the library under the "projects" key:

...
"projects": {
  "my-library": {
    "type": "library",
    "root": "libs/my-library",
    "entryFile": "index",
    "sourceRoot": "libs/my-library/src",
    "compilerOptions": {
      "tsConfigPath": "libs/my-library/tsconfig.lib.json"
    }
  }
}
...

There are two differences in nest-cli.json metadata between libraries and applications:

  • the "type" property is set to "library" instead of "application"
  • the "entryFile" property is set to "index" instead of "main"

These differences key the build process to handle libraries appropriately. For example, a library exports its functions through the index.js file.
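
For illustration, the generated index.ts is typically just a barrel file that re-exports the library's public surface (the exact contents may vary by CLI version):

export * from './my-library.module';
export * from './my-library.service';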

As with application-type projects, libraries each have their own tsconfig.lib.json file that extends the root (monorepo-wide) tsconfig.json file. You can modify this file, if necessary, to provide library-specific compiler options.

You can build the library with the CLI command:

nest build my-library

Using libraries

With the automatically generated configuration files in place, using libraries is straightforward. How would we import MyLibraryService from the my-library library into the my-project application?

First, note that using library modules is the same as using any other Nest module. What the monorepo does is manage paths in a way that importing libraries and generating builds is now transparent. To use MyLibraryService, we need to import its declaring module. We can modify my-project/src/app.module.ts as follows to import MyLibraryModule.

import { Module } from '@nestjs/common';
import { AppController } from './app.controller';
import { AppService } from './app.service';
import { MyLibraryModule } from '@app/my-library';

@Module({
  imports: [MyLibraryModule],
  controllers: [AppController],
  providers: [AppService],
})
export class AppModule {}

Notice above that we've used a path alias of @app in the ES module import line, which was the prefix we supplied with the nest g library command above. Under the covers, Nest handles this through tsconfig path mapping. When adding a library, Nest updates the global (monorepo) tsconfig.json file's "paths" key like this:

"paths": {
    "@app/my-library": [
        "libs/my-library/src"
    ],
    "@app/my-library/*": [
        "libs/my-library/src/*"
    ]
}

So, in a nutshell, the combination of the monorepo and library features has made it easy and intuitive to include library modules into applications.
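
For completeness, consuming the library's provider then works like any other injection. A minimal sketch, assuming MyLibraryModule exports MyLibraryService and that the service exposes a hypothetical getHello() method:

import { Injectable } from '@nestjs/common';
import { MyLibraryService } from '@app/my-library';

@Injectable()
export class AppService {
  constructor(private readonly myLibraryService: MyLibraryService) {}

  getHello(): string {
    // Delegate to the provider exported by the library module
    return this.myLibraryService.getHello();
  }
}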

This same mechanism enables building and deploying applications that compose libraries. Once you've imported the MyLibraryModule, running nest build handles all the module resolution automatically and bundles the app along with any library dependencies, for deployment. The default compiler for a monorepo is webpack, so the resulting distribution bundles all of the transpiled JavaScript files into a single file. You can also switch to tsc as described here.

Overview

The Nest CLI is a command-line interface tool that helps you to initialize, develop, and maintain your Nest applications. It assists in multiple ways, including scaffolding the project, serving it in development mode, and building and bundling the application for production distribution. It embodies best-practice architectural patterns to encourage well-structured apps.

Installation

Note: In this guide we describe using npm to install packages, including the Nest CLI. Other package managers may be used at your discretion. With npm, you have several options available for managing how your OS command line resolves the location of the nest CLI binary file. Here, we describe installing the nest binary globally using the -g option. This provides a measure of convenience, and is the approach we assume throughout the documentation. Note that installing any npm package globally leaves the responsibility of ensuring they're running the correct version up to the user. It also means that if you have different projects, each will run the same version of the CLI. A reasonable alternative is to use the npx program (or similar features with other package managers) to ensure that you run a managed version of the Nest CLI. We recommend you consult the npx documentation and/or your DevOps support staff for more information.

Install the CLI globally using the npm install -g command (see the Note above for details about global installs).

$ npm install -g @nestjs/cli

Basic workflow

Once installed, you can invoke CLI commands directly from your OS command line through the nest executable. See the available nest commands by entering the following:

$ nest --help

Get help on an individual command using the following construct. Substitute any command, like new, add, etc., where you see generate in the example below to get detailed help on that command:

$ nest generate --help

To create, build and run a new basic Nest project in development mode, go to the folder that should be the parent of your new project, and run the following commands:

$ nest new my-nest-project
$ cd my-nest-project
$ npm run start:dev

In your browser, open http://localhost:3000 to see the new application running. The app will automatically recompile and reload when you change any of the source files.

Project structure

When you run nest new, Nest generates a boilerplate application structure by creating a new folder and populating an initial set of files. You can continue working in this default structure, adding new components, as described throughout this documentation. We refer to the project structure generated by nest new as standard mode. Nest also supports an alternate structure for managing multiple projects and libraries called monorepo mode.

Aside from a few specific considerations around how the build process works (essentially, monorepo mode simplifies build complexities that can sometimes arise from monorepo-style project structures), and built-in library support, the rest of the Nest features, and this documentation, apply equally to both standard and monorepo mode project structures. In fact, you can easily switch from standard mode to monorepo mode at any time in the future, so you can safely defer this decision while you're still learning about Nest.

You can use either mode to manage multiple projects. Here's a quick summary of the differences:

| Feature | Standard Mode | Monorepo Mode |
| ------- | ------------- | ------------- |
| Multiple projects | Separate file system structure | Single file system structure |
| node_modules & package.json | Separate instances | Shared across monorepo |
| Default compiler | tsc | webpack |
| Compiler settings | Specified separately | Monorepo defaults that can be overridden per project |
| Config files like tslint.json, .prettierrc, etc. | Specified separately | Shared across monorepo |
| nest build and nest start commands | Target defaults automatically to the (only) project in the context | Target defaults to the default project in the monorepo |
| Libraries | Managed manually, usually via npm packaging | Built-in support, including path management and bundling |

Read the sections on Workspaces and Libraries for more detailed information to help you decide which mode is most suitable for you.

CLI command syntax

All nest commands follow the same format:

nest commandOrAlias requiredArg [optionalArg] [options]

For example:

$ nest new my-nest-project --dry-run

Here, new is the commandOrAlias. The new command has an alias of n. my-nest-project is the requiredArg. If a requiredArg is not supplied on the command line, nest will prompt for it. Also, --dry-run has an equivalent short-hand form -d. With this in mind, the following command is the equivalent of the above:

$ nest n my-nest-project -d

Most commands, and some options, have aliases. Try running nest new --help to see these options and aliases, and to confirm your understanding of the above constructs.

Command overview

Run nest <command> --help for any of the following commands to see command-specific options.

See usage for detailed descriptions for each command.

| Command | Alias | Description |
| ------- | ----- | ----------- |
| new | n | Scaffolds a new standard mode application with all boilerplate files needed to run. |
| generate | g | Generates and/or modifies files based on a schematic. |
| build | | Compiles an application or workspace into an output folder. |
| start | | Compiles and runs an application (or default project in a workspace). |
| add | | Imports a library that has been packaged as a nest library, running its install schematic. |
| update | u | Update @nestjs dependencies in the package.json "dependencies" list to their @latest version. |
| info | i | Displays information about installed nest packages and other helpful system info. |

Nest CLI and scripts

This section provides additional background on how the nest command interacts with compilers and scripts to help DevOps personnel manage the development environment.

A Nest application is a standard TypeScript application that needs to be compiled to JavaScript before it can be executed. There are various ways to accomplish the compilation step, and developers/teams are free to choose a way that works best for them. With that in mind, Nest provides a set of tools out-of-the-box that seek to do the following:

  • Provide a standard build/execute process, available at the command line, that "just works" with reasonable defaults.
  • Ensure that the build/execute process is open, so developers can directly access the underlying tools to customize them using native features and options.
  • Remain a completely standard TypeScript/Node.js framework, so that the entire compile/deploy/execute pipeline can be managed by any external tools that the development team chooses to use.

This goal is accomplished through a combination of the nest command, a locally installed TypeScript compiler, and package.json scripts. We describe how these technologies work together below. This should help you understand what's happening at each step of the build/execute process, and how to customize that behavior if necessary.

The nest binary

The nest command is an OS level binary (i.e., runs from the OS command line). This command actually encompasses 3 distinct areas, described below. We recommend that you run the build (nest build) and execution (nest start) sub-commands via the package.json scripts provided automatically when a project is scaffolded (see typescript starter if you wish to start by cloning a repo, instead of running nest new).

Build

nest build is a wrapper on top of the standard tsc compiler (for standard projects) or the webpack compiler (for monorepos). It does not add any other compilation features or steps except for handling tsconfig-paths out of the box. The reason it exists is that most developers, especially when starting out with Nest, do not need to adjust compiler options (e.g., tsconfig.json file) which can sometimes be tricky.

See the nest build documentation for more details.

Execution

nest start simply ensures the project has been built (same as nest build), then invokes the node command in a portable, easy way to execute the compiled application. As with builds, you are free to customize this process as needed, either using the nest start command and its options, or completely replacing it. The entire process is a standard TypeScript application build and execute pipeline, and you are free to manage the process as such.

See the nest start documentation for more details.

Generation

The nest generate commands, as the name implies, generate new Nest projects, or components within them.

Package scripts

Running the nest commands at the OS command level requires that the nest binary be installed globally. This is a standard feature of npm, and outside of Nest's direct control. One consequence of this is that the globally installed nest binary is not managed as a project dependency in package.json. For example, two different developers can be running two different versions of the nest binary. The standard solution for this is to use package scripts so that you can treat the tools used in the build and execute steps as development dependencies.

When you run nest new, or clone the typescript starter, Nest populates the new project's package.json scripts with commands like build and start. It also installs the underlying compiler tools (such as typescript) as dev dependencies.

You run the build and execute scripts with commands like:

$ npm run build

and

$ npm run start

These commands use npm's script running capabilities to execute nest build or nest start using the locally installed nest binary. By using these built-in package scripts, you have full dependency management over the Nest CLI commands*. This means that, by following this recommended usage, all members of your organization can be assured of running the same version of the commands.

*This applies to the build and start commands. The nest new and nest generate commands aren't part of the build/execute pipeline, so they operate in a different context, and do not come with built-in package.json scripts.

For most developers/teams, it is recommended to utilize the package scripts for building and executing their Nest projects. You can fully customize the behavior of these scripts via their options (--path, --webpack, --webpackPath) and/or customize the tsc or webpack compiler options files (e.g., tsconfig.json) as needed. You are also free to run a completely custom build process to compile the TypeScript (or even to execute TypeScript directly with ts-node).

Backward compatibility

Because Nest applications are pure TypeScript applications, previous versions of the Nest build/execute scripts will continue to operate. You are not required to upgrade them. You can choose to take advantage of the new nest build and nest start commands when you are ready, or continue running previous or customized scripts.

Migration

While you are not required to make any changes, you may want to migrate to using the new CLI commands instead of using tools such as tsc-watch or ts-node. In this case, simply install the latest version of the @nestjs/cli, both globally and locally:

$ npm install -g @nestjs/cli
$ cd  /some/project/root/folder
$ npm install -D @nestjs/cli

You can then replace the scripts defined in package.json with the following ones:

"build": "nest build",
"start": "nest start",
"start:dev": "nest start --watch",
"start:debug": "nest start --debug --watch",

CLI command reference

nest new

Creates a new (standard mode) Nest project.

$ nest new <name> [options]
$ nest n <name> [options]
Description

Creates and initializes a new Nest project. Prompts for package manager.

  • Creates a folder with the given <name>
  • Populates the folder with configuration files
  • Creates sub-folders for source code (/src) and end-to-end tests (/test)
  • Populates the sub-folders with default files for app components and tests
Arguments
Argument Description
<name> The name of the new project
Options
Option Description
--dry-run Reports changes that would be made, but does not change the filesystem.
Alias: -d
--skip-git Skip git repository initialization.
Alias: -g
--skip-install Skip package installation.
Alias: -s
--package-manager [package-manager] Specify package manager. Use npm or yarn. Package manager must be installed globally.
Alias: -p
--language [language] Specify programming language (TS or JS).
Alias: -l
--collection [collectionName] Specify schematics collection. Use package name of installed npm package containing schematic.
Alias: -c

nest generate

Generates and/or modifies files based on a schematic

$ nest generate <schematic> <name> [options]
$ nest g <schematic> <name> [options]
Arguments
Argument Description
<schematic> The schematic or collection:schematic to generate. See the table below for the available schematics.
<name> The name of the generated component.
Schematics
| Name | Alias | Description |
| ---- | ----- | ----------- |
| app | | Generate a new application within a monorepo (converting to monorepo if it's a standard structure). |
| library | lib | Generate a new library within a monorepo (converting to monorepo if it's a standard structure). |
| class | cl | Generate a new class. |
| controller | co | Generate a controller declaration. |
| decorator | d | Generate a custom decorator. |
| filter | f | Generate a filter declaration. |
| gateway | ga | Generate a gateway declaration. |
| guard | gu | Generate a guard declaration. |
| interface | | Generate an interface. |
| interceptor | in | Generate an interceptor declaration. |
| middleware | mi | Generate a middleware declaration. |
| module | mo | Generate a module declaration. |
| pipe | pi | Generate a pipe declaration. |
| provider | pr | Generate a provider declaration. |
| resolver | r | Generate a resolver declaration. |
| service | s | Generate a service declaration. |
Options
Option Description
--dry-run Reports changes that would be made, but does not change the filesystem.
Alias: -d
--project [project] Project that element should be added to.
Alias: -p
--flat Do not generate a folder for the element.
--collection [collectionName] Specify schematics collection. Use package name of installed npm package containing schematic.
Alias: -c
--spec Enforce spec files generation (default)
--no-spec Disable spec files generation

nest build

Compiles an application or workspace into an output folder.

$ nest build <name> [options]
Arguments
Argument Description
<name> The name of the project to build.
Options
Option Description
--path [path] Path to tsconfig file.
Alias -p
--config [path] Path to nest-cli configuration file.
Alias -c
--watch Run in watch mode (live-reload)
Alias -w
--webpack Use webpack for compilation.
--webpackPath Path to webpack configuration.
--tsc Force use tsc for compilation.

nest start

Compiles and runs an application (or default project in a workspace).

$ nest start <name> [options]
Arguments
Argument Description
<name> The name of the project to run.
Options
Option Description
--path [path] Path to tsconfig file.
Alias -p
--config [path] Path to nest-cli configuration file.
Alias -c
--watch Run in watch mode (live-reload)
Alias -w
--preserveWatchOutput Keep outdated console output in watch mode instead of clearing the screen. (tsc watch mode only)
--watchAssets Run in watch mode (live-reload), watching non-TS files (assets). See Assets for more details.
--debug [hostport] Run in debug mode (with --inspect flag)
Alias -d
--webpack Use webpack for compilation.
--webpackPath Path to webpack configuration.
--tsc Force use tsc for compilation.
--exec [binary] Binary to run (default: node).
Alias -e

nest add

Imports a library that has been packaged as a nest library, running its install schematic.

$ nest add <name> [options]
Arguments
Argument Description
<name> The name of the library to import.

nest update

Updates @nestjs dependencies in the package.json "dependencies" list to their @latest version.

Options
Option Description
--force Do upgrade instead of update
Alias -f
--tag Update to tagged version (use @latest, @<tag>, etc)
Alias -t

nest info

Displays information about installed nest packages and other helpful system info. For example:

 _   _             _      ___  _____  _____  _     _____
| \ | |           | |    |_  |/  ___|/  __ \| |   |_   _|
|  \| |  ___  ___ | |_     | |\ `--. | /  \/| |     | |
| . ` | / _ \/ __|| __|    | | `--. \| |    | |     | |
| |\  ||  __/\__ \| |_ /\__/ //\__/ /| \__/\| |_____| |_
\_| \_/ \___||___/ \__|\____/ \____/  \____/\_____/\___/

[System Information]
OS Version : macOS High Sierra
NodeJS Version : v8.9.0
YARN Version : 1.5.1
[Nest Information]
microservices version : 6.0.0
websockets version : 6.0.0
testing version : 6.0.0
common version : 6.0.0
core version : 6.0.0

Workspaces

Nest has two modes for organizing code:

  • standard mode: useful for building individual project-focused applications that have their own dependencies and settings, and don't need to optimize for sharing modules or for complex builds. This is the default mode.
  • monorepo mode: this mode treats code artifacts as part of a lightweight monorepo, and may be more appropriate for teams of developers and/or multi-project environments. It automates parts of the build process to make it easy to create and compose modular components, promotes code re-use, makes integration testing easier, makes it easy to share project-wide artifacts like tslint rules and other configuration policies, and is easier to use than alternatives like git submodules. Monorepo mode employs the concept of a workspace, represented in the nest-cli.json file, to coordinate the relationship between the components of the monorepo.

It's important to note that virtually all of Nest's features are independent of your code organization mode. The only effect of this choice is how your projects are composed and how build artifacts are generated. All other functionality, from the CLI to core modules to add-on modules, works the same in either mode.

Also, you can easily switch from standard mode to monorepo mode at any time, so you can delay this decision until the benefits of one or the other approach become more clear.

Standard mode

When you run nest new, a new project is created for you using a built-in schematic. Nest does the following:

  1. Create a new folder, corresponding to the name argument you provide to nest new
  2. Populate that folder with default files corresponding to a minimal base-level Nest application. You can examine these files at the typescript-starter repository.
  3. Provide additional files such as nest-cli.json, package.json and tsconfig.json that configure and enable various tools for compiling, testing and serving your application.

From there, you can modify the starter files, add new components, add dependencies (e.g., npm install), and otherwise develop your application as covered in the rest of this documentation.

Monorepo mode

To enable monorepo mode, you start with a standard mode structure, and add projects. A project can be a full application (which you add to the workspace with the command nest generate app) or a library (which you add to the workspace with the command nest generate library). We'll discuss the details of these specific types of project components below. The key point to note now is that it is the act of adding a project to an existing standard mode structure that converts it to monorepo mode. Let's look at an example.

If we run:

nest new my-project

We've constructed a standard mode structure, with a folder structure that looks like this:

src
  app.controller.ts
  app.service.ts
  app.module.ts
  main.ts
node_modules
nest-cli.json
package.json
tsconfig.json
tslint.json

We can convert this to a monorepo mode structure as follows:

cd my-project
nest generate app my-app

At this point, nest converts the existing structure to a monorepo mode structure. This results in a few important changes. The folder structure now looks like this:

apps
  my-app
    src
      app.controller.ts
      app.service.ts
      app.module.ts
      main.ts
    tsconfig.app.json
  my-project
    src
      app.controller.ts
      app.service.ts
      app.module.ts
      main.ts
    tsconfig.app.json
nest-cli.json
package.json
tsconfig.json
tslint.json

The generate app schematic has reorganized the code - moving each application project under the apps folder, and adding a project-specific tsconfig.app.json file in each project's root folder. Our original my-project app has become the default project for the monorepo, and is now a peer with the just-added my-app, located under the apps folder. We'll cover default projects below.

error Warning The conversion of a standard mode structure to monorepo only works for projects that have followed the canonical Nest project structure. Specifically, during conversion, the schematic attempts to relocate the src and test folders in a project folder beneath the apps folder in the root. If a project does not use this structure, the conversion will fail or produce unreliable results.

Workspace projects

A monorepo uses the concept of a workspace to manage its member entities. Workspaces are composed of projects. A project may be either:

  • an application: a full Nest application including a main.ts file to bootstrap the application. Aside from compile and build considerations, an application-type project within a workspace is functionally identical to an application within a standard mode structure.
  • a library: a library is a way of packaging a general purpose set of features (modules, providers, controllers, etc.) that can be used within other projects. A library cannot run on its own, and has no main.ts file. Read more about libraries here.

All workspaces have a default project (which should be an application-type project). This is defined by the top-level "root" property in the nest-cli.json file, which points at the root of the default project (see CLI properties below for more details). Usually, this is the standard mode application you started with, and later converted to a monorepo using nest generate app. When you follow these steps, this property is populated automatically.

Default projects are used by nest commands like nest build and nest start when a project name is not supplied.

For example, in the above monorepo structure, running

$ nest start

will start up the my-project app. To start my-app, we'd use:

$ nest start my-app

Applications

Application-type projects, or what we might informally refer to as just "applications", are complete Nest applications that you can run and deploy. You generate an application-type project with nest generate app.

This command automatically generates a project skeleton, including the standard src and test folders from the typescript starter. Unlike standard mode, an application project in a monorepo does not have its own package dependencies (package.json) or other project configuration artifacts like .prettierrc and tslint.json. Instead, the monorepo-wide dependencies and config files are used.

However, the schematic does generate a project-specific tsconfig.app.json file in the root folder of the project. This config file automatically sets appropriate build options, including setting the compilation output folder properly. The file extends the top-level (monorepo) tsconfig.json file, so you can manage global settings monorepo-wide, but override them if needed at the project level.
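
As a purely illustrative sketch (the exact contents generated by the schematic depend on your CLI version), such a project-level apps/my-app/tsconfig.app.json might look roughly like this, extending the root tsconfig.json and redirecting the output folder:

{
  "extends": "../../tsconfig.json",
  "compilerOptions": {
    "outDir": "../../dist/apps/my-app"
  }
}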

Libraries

As mentioned, library-type projects, or simply "libraries", are packages of Nest components that need to be composed into applications in order to run. You generate a library-type project with nest generate library. Deciding what belongs in a library is an architectural design decision. We discuss libraries in depth in the libraries chapter.

CLI properties

Nest keeps the metadata needed to organize, build and deploy both standard and monorepo structured projects in the nest-cli.json file. Nest automatically adds to and updates this file as you add projects, so you usually do not have to think about it or edit its contents. However, there are some settings you may want to change manually, so it's helpful to have an overview understanding of the file.

After running the steps above to create a monorepo, our nest-cli.json file looks like this:

{
  "collection": "@nestjs/schematics",
  "sourceRoot": "apps/my-project/src",
  "monorepo": true,
  "root": "apps/my-project",
  "compilerOptions": {
    "webpack": true,
    "tsConfigPath": "apps/my-project/tsconfig.app.json"
  },
  "projects": {
    "my-project": {
      "type": "application",
      "root": "apps/my-project",
      "entryFile": "main",
      "sourceRoot": "apps/my-project/src",
      "compilerOptions": {
        "tsConfigPath": "apps/my-project/tsconfig.app.json"
      }
    },
    "my-app": {
      "type": "application",
      "root": "apps/my-app",
      "entryFile": "main",
      "sourceRoot": "apps/my-app/src",
      "compilerOptions": {
        "tsConfigPath": "apps/my-app/tsconfig.app.json"
      }
    }
  }
}

The file is divided into sections:

  • a global section with top-level properties controlling standard and monorepo-wide settings
  • a top level property ("projects") with metadata about each project. This section is present only for monorepo-mode structures.

The top-level properties are as follows:

  • "collection": points at the collection of schematics used to generate components; you generally should not change this value
  • "sourceRoot": points at the root of the source code for the single project in standard mode structures, or the default project in monorepo mode structures
  • "compilerOptions": a map with keys specifying compiler options and values specifying the option setting; see details below
  • "generateOptions": a map with keys specifying global generate options and values specifying the option setting; see details below
  • "monorepo": (monorepo only) for a monorepo mode structure, this value is always true
  • "root": (monorepo only) points at the project root of the default project

Global compiler options

These properties specify the compiler to use as well as various options that affect any compilation step, whether as part of nest build or nest start, and regardless of the compiler, whether tsc or webpack.

Property Name Property Value Type Description
webpack boolean If true, use webpack compiler. If false or not present, use tsc. In monorepo mode, the default is true (use webpack), in standard mode, the default is false (use tsc). See below for details.
tsConfigPath string (monorepo only) Points at the file containing the tsconfig.json settings that will be used when nest build or nest start is called without a project option (e.g., when the default project is built or started).
webpackConfigPath string Points at a webpack options file. If not specified, Nest looks for the file webpack.config.js. See below for more details.
deleteOutDir boolean If true, whenever the compiler is invoked, it will first remove the compilation output directory (as configured in tsconfig.json, where the default is ./dist).
assets array Enables automatically distributing non-TypeScript assets whenever a compilation step begins (asset distribution does not happen on incremental compiles in --watch mode). See below for details.
watchAssets boolean If true, run in watch-mode, watching all non-TypeScript assets. (For more fine-grained control of the assets to watch, see Assets section below).
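
As an illustrative sketch (not output from any schematic), a project that prefers plain tsc compilation of individual files and a clean output folder on every build could set the following in nest-cli.json:

{
  "compilerOptions": {
    "webpack": false,
    "deleteOutDir": true
  },
  ...
}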

Global generate options

These properties specify the default generate options to be used by the nest generate command.

Property Name Property Value Type Description
spec boolean or object If the value is boolean, a value of true enables spec generation by default and a value of false disables it. A flag passed on the CLI command line overrides this setting, as does a project-specific generateOptions setting (more below). If the value is an object, each key represents a schematic name, and the boolean value determines whether the default spec generation is enabled / disabled for that specific schematic.

The following example uses a boolean value to specify that spec file generation should be disabled by default for all projects:

{
  "generateOptions": {
    "spec": false
  },
  ...
}

In the following example, spec file generation is disabled only for service schematics (e.g., nest generate service...):

{
  "generateOptions": {
    "spec": {
      "service": false
    }
  },
  ...
}

error Warning When specifying the spec as an object, the key for the generation schematic does not currently support automatic alias handling. This means that if you specify, for example, service: false and then generate a service via its alias s, the spec file would still be generated. To make sure both the normal schematic name and the alias work as intended, specify both the normal command name and the alias, as seen below.

{
  "generateOptions": {
    "spec": {
      "service": false,
      "s": false
    }
  },
  ...
}

Project-specific generate options

In addition to providing global generate options, you may also specify project-specific generate options. The project specific generate options follow the exact same format as the global generate options, but are specified directly on each project.

Project-specific generate options override global generate options.

{
  "projects": {
    "cats-project": {
      "generateOptions": {
        "spec": {
          "service": false
        }
      },
      ...
    }
  },
  ...
}

notice Notice The order of precedence for generate options is as follows. Options specified on the CLI command line take precedence over project-specific options. Project-specific options override global options.

Specified compiler

The reason for the different default compilers is that for larger projects (e.g., as is more typical in a monorepo), webpack can have significant advantages in build times and in producing a single file that bundles all project components together. If you wish to generate individual files, set "webpack" to false, which will cause the build process to use tsc.

Webpack options

The webpack options file can contain standard webpack configuration options. For example, to tell webpack to bundle node_modules (which are excluded by default), add the following to webpack.config.js:

module.exports = {
  externals: [],
};

Since the webpack config file is a JavaScript file, you can even expose a function that takes default options and returns a modified object:

module.exports = function (options) {
  return {
    ...options,
    externals: [],
  };
};

Assets

TypeScript compilation automatically distributes compiler output (.js and .d.ts files) to the specified output directory. It can also be convenient to distribute non-TypeScript files, such as .graphql files, images, .html files and other assets. This allows you to treat nest build (and any initial compilation step) as a lightweight development build step, where you may be editing non-TypeScript files and iteratively compiling and testing.

The value of the assets key should be an array of elements specifying the files to be distributed. The elements can be simple strings with glob-like file specs, for example:

"assets": ["**/*.graphql"],
"watchAssets": true,

For finer control, the elements can be objects with the following keys:

  • "include": glob-like file specifications for the assets to be distributed
  • "exclude": glob-like file specifications for assets to be excluded from the include list
  • "outDir": a string specifying the path (relative to the root folder) where the assets should be distributed. Defaults to the same output directory configured for compiler output.
  • "watchAssets": boolean; if true, run in watch mode watching specified assets

For example:

"assets": [
  { "include": "**/*.graphql", "exclude": "**/omitted.graphql", "watchAssets": true },
]

error Warning Setting watchAssets in a top-level compilerOptions property overrides any watchAssets settings within the assets property.

Project properties

This element exists only for monorepo-mode structures. You generally should not edit these properties, as they are used by Nest to locate projects and their configuration options within the monorepo.

Who is using Nest?

We proudly help various companies build their products at scale. If you are using Nest and would like to be listed here, see this thread. We would be happy to put your logo here!

Companies

To our knowledge, all of the following companies have built awesome projects on top of our framework:

Global prefix

To set a prefix for every route registered in an HTTP application, use the setGlobalPrefix() method of the INestApplication instance.

const app = await NestFactory.create(AppModule);
app.setGlobalPrefix('v1');

HTTP adapter

Occasionally, you may want to access the underlying HTTP server, either within the Nest application context or from the outside.

Every native (platform-specific) HTTP server/library (e.g., Express and Fastify) instance is wrapped in an adapter. The adapter is registered as a globally available provider that can be retrieved from the application context, as well as injected into other providers.

Outside application context strategy

To get a reference to the HttpAdapter from outside of the application context, call the getHttpAdapter() method.

@@filename()
const app = await NestFactory.create(ApplicationModule);
const httpAdapter = app.getHttpAdapter();

In-context strategy

To get a reference to the HttpAdapterHost from within the application context, inject it using the same technique as any other existing provider (e.g., using constructor injection).

@@filename()
export class CatsService {
  constructor(private adapterHost: HttpAdapterHost) {}
}
@@switch
@Dependencies(HttpAdapterHost)
export class CatsService {
  constructor(adapterHost) {
    this.adapterHost = adapterHost;
  }
}

info Hint The HttpAdapterHost is imported from the @nestjs/core package.

The HttpAdapterHost is not an actual HttpAdapter. To get the actual HttpAdapter instance, simply access the httpAdapter property.

const adapterHost = app.get(HttpAdapterHost);
const httpAdapter = adapterHost.httpAdapter;

The httpAdapter is the actual instance of the HTTP adapter used by the underlying framework. It is an instance of either ExpressAdapter or FastifyAdapter (both classes extend AbstractHttpAdapter).

The adapter object exposes several useful methods to interact with the HTTP server. However, if you want to access the library instance (e.g., the Express instance) directly, call the getInstance() method.

const instance = httpAdapter.getInstance();
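
For example, assuming the underlying platform is Express, the instance obtained above is the Express application object itself, so you can call native Express methods on it directly (a sketch; the 'trust proxy' setting is just illustrative):

// valid only when the Express adapter is in use
instance.set('trust proxy', 1);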

Hybrid application

A hybrid application is one that listens for HTTP requests and also makes use of one or more connected microservices. The INestApplication instance can be connected with INestMicroservice instances through the connectMicroservice() method.

const app = await NestFactory.create(AppModule);
const microservice = app.connectMicroservice({
  transport: Transport.TCP,
});

await app.startAllMicroservicesAsync();
await app.listen(3001);

To connect multiple microservice instances, issue the call to connectMicroservice() for each microservice:

const app = await NestFactory.create(AppModule);
// microservice #1
const microserviceTcp = app.connectMicroservice<MicroserviceOptions>({
  transport: Transport.TCP,
  options: {
    port: 3001,
  },
});
// microservice #2
const microserviceRedis = app.connectMicroservice<MicroserviceOptions>({
  transport: Transport.REDIS,
  options: {
    url: 'redis://localhost:6379',
  },
});

await app.startAllMicroservicesAsync();
await app.listen(3001);

Sharing configuration

By default, a hybrid application will not inherit global pipes, interceptors, guards, and filters configured for the main (HTTP-based) application. To inherit these configuration properties from the main application, set the inheritAppConfig property in the second argument (an optional options object) of the connectMicroservice() call, as follows:

const microservice = app.connectMicroservice(
  {
    transport: Transport.TCP,
  },
  { inheritAppConfig: true },
);

HTTPS

To create an application that uses the HTTPS protocol, set the httpsOptions property in the options object passed to the create() method of the NestFactory class:

const httpsOptions = {
  key: fs.readFileSync('./secrets/private-key.pem'),
  cert: fs.readFileSync('./secrets/public-certificate.pem'),
};
const app = await NestFactory.create(ApplicationModule, {
  httpsOptions,
});
await app.listen(3000);

If you use the FastifyAdapter, create the application as follows:

const app = await NestFactory.create<NestFastifyApplication>(
  ApplicationModule,
  new FastifyAdapter({ https: httpsOptions }),
);

Multiple simultaneous servers

The following recipe shows how to instantiate a Nest application that listens on multiple ports (for example, on a non-HTTPS port and an HTTPS port) simultaneously.

const httpsOptions = {
  key: fs.readFileSync('./secrets/private-key.pem'),
  cert: fs.readFileSync('./secrets/public-certificate.pem'),
};

const server = express();
const app = await NestFactory.create(
  ApplicationModule,
  new ExpressAdapter(server),
);
await app.init();

http.createServer(server).listen(3000);
https.createServer(httpsOptions, server).listen(443);

info Hint The ExpressAdapter is imported from the @nestjs/platform-express package. The http and https packages are native Node.js packages.

Warning This recipe does not work with GraphQL Subscriptions.

Request lifecycle

Nest applications handle requests and produce responses in a sequence we refer to as the request lifecycle. With the use of middleware, pipes, guards, and interceptors, it can be challenging to track down where a particular piece of code executes during the request lifecycle, especially as global, controller level, and route level components come into play. In general, a request flows through middleware to guards, then to interceptors, then to pipes and finally back to interceptors on the return path (as the response is generated).

Middleware

Middleware is executed in a particular sequence. First, Nest runs globally bound middleware (such as middleware bound with app.use) and then it runs module bound middleware, which are determined on paths. Middleware are run sequentially in the order they are bound, similar to the way middleware in Express works. In the case of middleware bound across different modules, the middleware bound to the root module will run first, and then middleware will run in the order that the modules are added to the imports array.
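
As a minimal sketch of module bound middleware (LoggerMiddleware is a hypothetical middleware class), note that the binding happens in the module's configure() method, and the order of apply() calls determines the execution order within the module:

import { Module, NestModule, MiddlewareConsumer } from '@nestjs/common';
import { LoggerMiddleware } from './logger.middleware'; // hypothetical middleware

@Module({})
export class AppModule implements NestModule {
  configure(consumer: MiddlewareConsumer) {
    // module bound middleware, restricted to routes under the 'cats' path
    consumer.apply(LoggerMiddleware).forRoutes('cats');
  }
}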

Guards

Guard execution starts with global guards, then proceeds to controller guards, and finally to route guards. As with middleware, guards run in the order in which they are bound. For example:

@UseGuards(Guard1, Guard2)
@Controller('cats')
export class CatsController {
  constructor(private catsService: CatsService) {}

  @UseGuards(Guard3)
  @Get()
  getCats(): Cats[] {
    return this.catsService.getCats();
  }
}

Guard1 will execute before Guard2 and both will execute before Guard3.

info Hint When speaking about globally bound vs. controller- or locally-bound components, the difference is where the guard (or other component) is bound. If you are using app.useGlobalGuards() or providing the component via a module, it is globally bound. Otherwise, it is bound to a controller if the decorator precedes a controller class, or to a route if the decorator precedes a route declaration.

Interceptors

Interceptors, for the most part, follow the same pattern as guards, with one catch: as interceptors return RxJS Observables, the observables will be resolved in a first in last out manner. So inbound requests will go through the standard global, controller, route level resolution, but the response side of the request (i.e., after returning from the controller method handler) will be resolved from route to controller to global. Also, any errors thrown by pipes, controllers, or services can be read in the catchError operator of an interceptor.
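
For example, here is a sketch of an interceptor (a hypothetical ErrorsInterceptor) that observes such errors on the response path using the catchError operator:

import { Injectable, NestInterceptor, ExecutionContext, CallHandler } from '@nestjs/common';
import { Observable, throwError } from 'rxjs';
import { catchError } from 'rxjs/operators';

@Injectable()
export class ErrorsInterceptor implements NestInterceptor {
  intercept(context: ExecutionContext, next: CallHandler): Observable<any> {
    return next.handle().pipe(
      // errors thrown by pipes, the controller, or services surface here on the way back out
      catchError((err) => throwError(err)),
    );
  }
}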

Pipes

Pipes follow the standard global to controller to route bound sequence, with the same first in, first out ordering with regard to the @UsePipes() parameters. However, at the route parameter level, if you have multiple pipes running, they will run in order from the last parameter with a pipe to the first. This also applies to the route level and controller level pipes. For example, if we have the following controller:

@UsePipes(GeneralValidationPipe)
@Controller('cats')
export class CatsController {
  constructor(private catsService: CatsService) {}

  @UsePipes(RouteSpecificPipe)
  @Patch(':id')
  updateCat(
    @Body() body: UpdateCatDTO,
    @Param() params: UpdateCatParams,
    @Query() query: UpdateCatQuery,
  ) {
    return this.catsService.updateCat(body, params, query);
  }
}

then the GeneralValidationPipe will run for the query, then the params, and then the body objects before moving on to the RouteSpecificPipe, which follows the same order. If any parameter-specific pipes were in place, they would run (again, from the last parameter to the first) after the controller and route level pipes.

Filters

Filters are the only component that does not resolve global first. Instead, filters resolve from the lowest level possible, meaning execution starts with any route bound filters, proceeds next to controller level filters, and finally to global filters. Note that exceptions cannot be passed from filter to filter; if a route level filter catches the exception, a controller or global level filter cannot catch the same exception. The only way to achieve an effect like this is to use inheritance between the filters.

info Hint Filters are only executed if an uncaught exception occurs during the request process. Caught exceptions, such as those handled with a try/catch block, will not trigger exception filters. As soon as an uncaught exception is encountered, the rest of the lifecycle is ignored and the request skips straight to the filter.
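
For example, a route bound filter (HttpExceptionFilter is hypothetical here) is attached with the @UseFilters() decorator and, per the resolution order above, gets the first chance at an uncaught exception thrown by this handler:

@Controller('cats')
export class CatsController {
  constructor(private catsService: CatsService) {}

  @UseFilters(HttpExceptionFilter)
  @Get()
  getCats(): Cats[] {
    return this.catsService.getCats();
  }
}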

Summary

In general, the request lifecycle looks like the following:

  1. Incoming request
  2. Globally bound middleware
  3. Module bound middleware
  4. Global guards
  5. Controller guards
  6. Route guards
  7. Global interceptors (pre-controller)
  8. Controller interceptors (pre-controller)
  9. Route interceptors (pre-controller)
  10. Global pipes
  11. Controller pipes
  12. Route pipes
  13. Route parameter pipes
  14. Controller (method handler)
  15. Service (if exists)
  16. Route interceptor (post-request)
  17. Controller interceptor (post-request)
  18. Global interceptor (post-request)
  19. Exception filters (route, then controller, then global)
  20. Server response

Asynchronous providers

At times, application startup should be delayed until one or more asynchronous tasks have completed. For example, you may not want to start accepting requests until the connection with the database has been established. You can achieve this using asynchronous providers.

The syntax for this is to use async/await with a useFactory provider. The factory returns a Promise, and the factory function can await asynchronous tasks. Nest will await resolution of the promise before instantiating any class that depends on (injects) such a provider.

{
  provide: 'ASYNC_CONNECTION',
  useFactory: async () => {
    const connection = await createConnection(options);
    return connection;
  },
}

info Hint Learn more about custom provider syntax here.

Injection

Asynchronous providers are injected into other components by their tokens, like any other provider. In the example above, you would use the construct @Inject('ASYNC_CONNECTION').
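
For example, a hypothetical CatsRepository could receive the resolved connection like this:

@Injectable()
export class CatsRepository {
  constructor(@Inject('ASYNC_CONNECTION') private connection) {}
}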

Example

The TypeORM recipe has a more substantial example of an asynchronous provider.

Circular dependency

A circular dependency occurs when two classes depend on each other. For example, class A needs class B, and class B also needs class A. Circular dependencies can arise in Nest between modules and between providers.

While circular dependencies should be avoided where possible, you can't always do so. In such cases, Nest enables resolving circular dependencies between providers in two ways. In this chapter, we describe using forward referencing as one technique, and using the ModuleRef class to retrieve a provider instance from the DI container as another.

We also describe resolving circular dependencies between modules.

Forward reference

A forward reference allows Nest to reference classes which aren't yet defined using the forwardRef() utility function. For example, if CatsService and CommonService depend on each other, both sides of the relationship can use @Inject() and the forwardRef() utility to resolve the circular dependency. Otherwise Nest won't instantiate them because all of the essential metadata won't be available. Here's an example:

@@filename(cats.service)
@Injectable()
export class CatsService {
  constructor(
    @Inject(forwardRef(() => CommonService))
    private commonService: CommonService,
  ) {}
}
@@switch
@Injectable()
@Dependencies(forwardRef(() => CommonService))
export class CatsService {
  constructor(commonService) {
    this.commonService = commonService;
  }
}

info Hint The forwardRef() function is imported from the @nestjs/common package.

That covers one side of the relationship. Now let's do the same with CommonService:

@@filename(common.service)
@Injectable()
export class CommonService {
  constructor(
    @Inject(forwardRef(() => CatsService))
    private catsService: CatsService,
  ) {}
}
@@switch
@Injectable()
@Dependencies(forwardRef(() => CatsService))
export class CommonService {
  constructor(catsService) {
    this.catsService = catsService;
  }
}

warning Warning The order of instantiation is indeterminate. Make sure your code does not depend on which constructor is called first.

ModuleRef class alternative

An alternative to using forwardRef() is to refactor your code and use the ModuleRef class to retrieve a provider on one side of the (otherwise) circular relationship. Learn more about the ModuleRef utility class here.

Module forward reference

In order to resolve circular dependencies between modules, use the same forwardRef() utility function on both sides of the modules association. For example:

@@filename(common.module)
@Module({
  imports: [forwardRef(() => CatsModule)],
})
export class CommonModule {}
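
The counterpart declaration on the other side of the association mirrors this:

@@filename(cats.module)
@Module({
  imports: [forwardRef(() => CommonModule)],
})
export class CatsModule {}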

Custom providers

In earlier chapters, we touched on various aspects of Dependency Injection (DI) and how it is used in Nest. One example of this is the constructor based dependency injection used to inject instances (often service providers) into classes. You won't be surprised to learn that Dependency Injection is built in to the Nest core in a fundamental way. So far, we've only explored one main pattern. As your application grows more complex, you may need to take advantage of the full features of the DI system, so let's explore them in more detail.

DI fundamentals

Dependency injection is an inversion of control (IoC) technique wherein you delegate instantiation of dependencies to the IoC container (in our case, the NestJS runtime system), instead of doing it in your own code imperatively. Let's examine what's happening in this example from the Providers chapter.

First, we define a provider. The @Injectable() decorator marks the CatsService class as a provider.

@@filename(cats.service)
import { Injectable } from '@nestjs/common';
import { Cat } from './interfaces/cat.interface';

@Injectable()
export class CatsService {
  private readonly cats: Cat[] = [];

  findAll(): Cat[] {
    return this.cats;
  }
}
@@switch
import { Injectable } from '@nestjs/common';

@Injectable()
export class CatsService {
  constructor() {
    this.cats = [];
  }

  findAll() {
    return this.cats;
  }
}

Then we request that Nest inject the provider into our controller class:

@@filename(cats.controller)
import { Controller, Get } from '@nestjs/common';
import { CatsService } from './cats.service';
import { Cat } from './interfaces/cat.interface';

@Controller('cats')
export class CatsController {
  constructor(private catsService: CatsService) {}

  @Get()
  async findAll(): Promise<Cat[]> {
    return this.catsService.findAll();
  }
}
@@switch
import { Controller, Get, Bind, Dependencies } from '@nestjs/common';
import { CatsService } from './cats.service';

@Controller('cats')
@Dependencies(CatsService)
export class CatsController {
  constructor(catsService) {
    this.catsService = catsService;
  }

  @Get()
  async findAll() {
    return this.catsService.findAll();
  }
}

Finally, we register the provider with the Nest IoC container:

@@filename(app.module)
import { Module } from '@nestjs/common';
import { CatsController } from './cats/cats.controller';
import { CatsService } from './cats/cats.service';

@Module({
  controllers: [CatsController],
  providers: [CatsService],
})
export class AppModule {}

What exactly is happening under the covers to make this work? There are three key steps in the process:

  1. In cats.service.ts, the @Injectable() decorator declares the CatsService class as a class that can be managed by the Nest IoC container.

  2. In cats.controller.ts, CatsController declares a dependency on the CatsService token with constructor injection:

  constructor(private catsService: CatsService)
  3. In app.module.ts, we associate the token CatsService with the class CatsService from the cats.service.ts file. We'll see below exactly how this association (also called registration) occurs.

When the Nest IoC container instantiates a CatsController, it first looks for any dependencies*. When it finds the CatsService dependency, it performs a lookup on the CatsService token, which returns the CatsService class, per the registration step (#3 above). Assuming SINGLETON scope (the default behavior), Nest will then either create an instance of CatsService, cache it, and return it, or if one is already cached, return the existing instance.

*This explanation is a bit simplified to illustrate the point. One important area we glossed over is that the process of analyzing the code for dependencies is very sophisticated, and happens during application bootstrapping. One key feature is that dependency analysis (or "creating the dependency graph"), is transitive. In the above example, if the CatsService itself had dependencies, those too would be resolved. The dependency graph ensures that dependencies are resolved in the correct order - essentially "bottom up". This mechanism relieves the developer from having to manage such complex dependency graphs.

Standard providers

Let's take a closer look at the @Module() decorator. In app.module, we declare:

@Module({
  controllers: [CatsController],
  providers: [CatsService],
})

The providers property takes an array of providers. So far, we've supplied those providers via a list of class names. In fact, the syntax providers: [CatsService] is short-hand for the more complete syntax:

providers: [
  {
    provide: CatsService,
    useClass: CatsService,
  },
];

Now that we see this explicit construction, we can understand the registration process. Here, we are clearly associating the token CatsService with the class CatsService. The short-hand notation is merely a convenience to simplify the most common use-case, where the token is used to request an instance of a class by the same name.

Custom providers

What happens when your requirements go beyond those offered by Standard providers? Here are a few examples:

  • You want to create a custom instance instead of having Nest instantiate (or return a cached instance of) a class
  • You want to re-use an existing class in a second dependency
  • You want to override a class with a mock version for testing

Nest allows you to define Custom providers to handle these cases. It provides several ways to define custom providers. Let's walk through them.

Value providers: useValue

The useValue syntax is useful for injecting a constant value, putting an external library into the Nest container, or replacing a real implementation with a mock object. Let's say you'd like to force Nest to use a mock CatsService for testing purposes.

import { CatsService } from './cats.service';

const mockCatsService = {
  /* mock implementation
  ...
  */
};

@Module({
  imports: [CatsModule],
  providers: [
    {
      provide: CatsService,
      useValue: mockCatsService,
    },
  ],
})
export class AppModule {}

In this example, the CatsService token will resolve to the mockCatsService mock object. useValue requires a value - in this case a literal object that has the same interface as the CatsService class it is replacing. Because of TypeScript's structural typing, you can use any object that has a compatible interface, including a literal object or a class instance instantiated with new.

Non-class-based provider tokens

So far, we've used class names as our provider tokens (the value of the provide property in a provider listed in the providers array). This is matched by the standard pattern used with constructor based injection, where the token is also a class name. (Refer back to DI Fundamentals for a refresher on tokens if this concept isn't entirely clear). Sometimes, we may want the flexibility to use strings or symbols as the DI token. For example:

import { connection } from './connection';

@Module({
  providers: [
    {
      provide: 'CONNECTION',
      useValue: connection,
    },
  ],
})
export class AppModule {}

In this example, we are associating a string-valued token ('CONNECTION') with a pre-existing connection object we've imported from an external file.

warning Notice In addition to using strings as token values, you can also use JavaScript symbols.

We've previously seen how to inject a provider using the standard constructor based injection pattern. This pattern requires that the dependency be declared with a class name. The 'CONNECTION' custom provider uses a string-valued token. Let's see how to inject such a provider. To do so, we use the @Inject() decorator. This decorator takes a single argument - the token.

@@filename()
@Injectable()
export class CatsRepository {
  constructor(@Inject('CONNECTION') connection: Connection) {}
}
@@switch
@Injectable()
@Dependencies('CONNECTION')
export class CatsRepository {
  constructor(connection) {}
}

info Hint The @Inject() decorator is imported from the @nestjs/common package.

While we directly use the string 'CONNECTION' in the above examples for illustration purposes, for clean code organization, it's best practice to define tokens in a separate file, such as constants.ts. Treat them much as you would symbols or enums that are defined in their own file and imported where needed.
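
For example (a sketch using a hypothetical constants.ts file and consumer):

// constants.ts
export const CONNECTION = 'CONNECTION';

// cats.repository.ts
import { Inject, Injectable } from '@nestjs/common';
import { CONNECTION } from './constants';

@Injectable()
export class CatsRepository {
  // the injected value is whatever was registered under the CONNECTION token
  constructor(@Inject(CONNECTION) private connection) {}
}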

Class providers: useClass

The useClass syntax allows you to dynamically determine a class that a token should resolve to. For example, suppose we have an abstract (or default) ConfigService class. Depending on the current environment, we want Nest to provide a different implementation of the configuration service. The following code implements such a strategy.

const configServiceProvider = {
  provide: ConfigService,
  useClass:
    process.env.NODE_ENV === 'development'
      ? DevelopmentConfigService
      : ProductionConfigService,
};

@Module({
  providers: [configServiceProvider],
})
export class AppModule {}

Let's look at a couple of details in this code sample. You'll notice that we define configServiceProvider with a literal object first, then pass it in the module decorator's providers property. This is just a bit of code organization, but is functionally equivalent to the examples we've used thus far in this chapter.

Also, we have used the ConfigService class name as our token. For any class that depends on ConfigService, Nest will inject an instance of the provided class (DevelopmentConfigService or ProductionConfigService) overriding any default implementation that may have been declared elsewhere (e.g., a ConfigService declared with an @Injectable() decorator).

Factory providers: useFactory

The useFactory syntax allows for creating providers dynamically. The actual provider will be supplied by the value returned from a factory function. The factory function can be as simple or complex as needed. A simple factory may not depend on any other providers. A more complex factory can itself inject other providers it needs to compute its result. For the latter case, the factory provider syntax has a pair of related mechanisms:

  1. The factory function can accept (optional) arguments.
  2. The (optional) inject property accepts an array of providers that Nest will resolve and pass as arguments to the factory function during the instantiation process. The two lists should be correlated: Nest will pass instances from the inject list as arguments to the factory function in the same order.

The example below demonstrates this.

@@filename()
const connectionFactory = {
  provide: 'CONNECTION',
  useFactory: (optionsProvider: OptionsProvider) => {
    const options = optionsProvider.get();
    return new DatabaseConnection(options);
  },
  inject: [OptionsProvider],
};

@Module({
  providers: [connectionFactory],
})
export class AppModule {}
@@switch
const connectionFactory = {
  provide: 'CONNECTION',
  useFactory: (optionsProvider) => {
    const options = optionsProvider.get();
    return new DatabaseConnection(options);
  },
  inject: [OptionsProvider],
};

@Module({
  providers: [connectionFactory],
})
export class AppModule {}

Alias providers: useExisting

The useExisting syntax allows you to create aliases for existing providers. This creates two ways to access the same provider. In the example below, the (string-based) token 'AliasedLoggerService' is an alias for the (class-based) token LoggerService. Assume we have two different dependencies, one for 'AliasedLoggerService' and one for LoggerService. If both dependencies are specified with SINGLETON scope, they'll both resolve to the same instance.

@Injectable()
class LoggerService {
  /* implementation details */
}

const loggerAliasProvider = {
  provide: 'AliasedLoggerService',
  useExisting: LoggerService,
};

@Module({
  providers: [LoggerService, loggerAliasProvider],
})
export class AppModule {}
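
Either token can then be used for injection. For instance (a sketch), a consumer could request the alias and still receive the single LoggerService instance:

@Injectable()
export class CatsService {
  constructor(@Inject('AliasedLoggerService') private logger: LoggerService) {}
}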

Non-service based providers

While providers often supply services, they are not limited to that usage. A provider can supply any value. For example, a provider may supply an array of configuration objects based on the current environment, as shown below:

const configFactory = {
  provide: 'CONFIG',
  useFactory: () => {
    return process.env.NODE_ENV === 'development' ? devConfig : prodConfig;
  },
};

@Module({
  providers: [configFactory],
})
export class AppModule {}
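
A consumer then injects the configuration value by its token; whichever object the factory returned (devConfig or prodConfig) is what gets injected. A sketch with a hypothetical AppService:

@Injectable()
export class AppService {
  constructor(@Inject('CONFIG') private config) {}
}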

Export custom provider

Like any provider, a custom provider is scoped to its declaring module. To make it visible to other modules, it must be exported. To export a custom provider, we can either use its token or the full provider object.

The following example shows exporting using the token:

@@filename()
const connectionFactory = {
  provide: 'CONNECTION',
  useFactory: (optionsProvider: OptionsProvider) => {
    const options = optionsProvider.get();
    return new DatabaseConnection(options);
  },
  inject: [OptionsProvider],
};

@Module({
  providers: [connectionFactory],
  exports: ['CONNECTION'],
})
export class AppModule {}
@@switch
const connectionFactory = {
  provide: 'CONNECTION',
  useFactory: (optionsProvider) => {
    const options = optionsProvider.get();
    return new DatabaseConnection(options);
  },
  inject: [OptionsProvider],
};

@Module({
  providers: [connectionFactory],
  exports: ['CONNECTION'],
})
export class AppModule {}

Alternatively, export with the full provider object:

@@filename()
const connectionFactory = {
  provide: 'CONNECTION',
  useFactory: (optionsProvider: OptionsProvider) => {
    const options = optionsProvider.get();
    return new DatabaseConnection(options);
  },
  inject: [OptionsProvider],
};

@Module({
  providers: [connectionFactory],
  exports: [connectionFactory],
})
export class AppModule {}
@@switch
const connectionFactory = {
  provide: 'CONNECTION',
  useFactory: (optionsProvider) => {
    const options = optionsProvider.get();
    return new DatabaseConnection(options);
  },
  inject: [OptionsProvider],
};

@Module({
  providers: [connectionFactory],
  exports: [connectionFactory],
})
export class AppModule {}

Dynamic modules

The Modules chapter covers the basics of Nest modules, and includes a brief introduction to dynamic modules. This chapter expands on the subject of dynamic modules. Upon completion, you should have a good grasp of what they are and how and when to use them.

Introduction

Most application code examples in the Overview section of the documentation make use of regular, or static, modules. Modules define groups of components like providers and controllers that fit together as a modular part of an overall application. They provide an execution context, or scope, for these components. For example, providers defined in a module are visible to other members of the module without the need to export them. When a provider needs to be visible outside of a module, it is first exported from its host module, and then imported into its consuming module.

Let's walk through a familiar example.

First, we'll define a UsersModule to provide and export a UsersService. UsersModule is the host module for UsersService.

import { Module } from '@nestjs/common';
import { UsersService } from './users.service';

@Module({
  providers: [UsersService],
  exports: [UsersService],
})
export class UsersModule {}

Next, we'll define an AuthModule, which imports UsersModule, making UsersModule's exported providers available inside AuthModule:

import { Module } from '@nestjs/common';
import { AuthService } from './auth.service';
import { UsersModule } from '../users/users.module';

@Module({
  imports: [UsersModule],
  providers: [AuthService],
  exports: [AuthService],
})
export class AuthModule {}

These constructs allow us to inject UsersService in, for example, the AuthService that is hosted in AuthModule:

import { Injectable } from '@nestjs/common';
import { UsersService } from '../users/users.service';

@Injectable()
export class AuthService {
  constructor(private usersService: UsersService) {}
  /*
    Implementation that makes use of this.usersService
  */
}

We'll refer to this as static module binding. All the information Nest needs to wire together the modules has already been declared in the host and consuming modules. Let's unpack what's happening during this process. Nest makes UsersService available inside AuthModule by:

  1. Instantiating UsersModule, including transitively importing other modules that UsersModule itself consumes, and transitively resolving any dependencies (see Custom providers).
  2. Instantiating AuthModule, and making UsersModule's exported providers available to components in AuthModule (just as if they had been declared in AuthModule).
  3. Injecting an instance of UsersService in AuthService.

Dynamic module use case

With static module binding, there's no opportunity for the consuming module to influence how providers from the host module are configured. Why does this matter? Consider the case where we have a general purpose module that needs to behave differently in different use cases. This is analogous to the concept of a "plugin" in many systems, where a generic facility requires some configuration before it can be used by a consumer.

A good example with Nest is a configuration module. Many applications find it useful to externalize configuration details by using a configuration module. This makes it easy to dynamically change the application settings in different deployments: e.g., a development database for developers, a staging database for the staging/testing environment, etc. By delegating the management of configuration parameters to a configuration module, the application source code remains independent of configuration parameters.

The challenge is that the configuration module itself, since it's generic (similar to a "plugin"), needs to be customized by its consuming module. This is where dynamic modules come into play. Using dynamic module features, we can make our configuration module dynamic so that the consuming module can use an API to control how the configuration module is customized at the time it is imported.

In other words, dynamic modules provide an API for importing one module into another, and customizing the properties and behavior of that module when it is imported, as opposed to using the static bindings we've seen so far.

Config module example

We'll be using the basic version of the example code from the configuration chapter for this section. The completed version as of the end of this chapter is available as a working example here.

Our requirement is to make ConfigModule accept an options object to customize it. Here's the feature we want to support. The basic sample hard-codes the location of the .env file to be in the project root folder. Let's suppose we want to make that configurable, such that you can manage your .env files in any folder of your choosing. For example, imagine you want to store your various .env files in a folder under the project root called config (i.e., a sibling folder to src). You'd like to be able to choose different folders when using the ConfigModule in different projects.

Dynamic modules give us the ability to pass parameters into the module being imported so we can change its behavior. Let's see how this works. It's helpful if we start from the end-goal of how this might look from the consuming module's perspective, and then work backwards. First, let's quickly review the example of statically importing the ConfigModule (i.e., an approach which has no ability to influence the behavior of the imported module). Pay close attention to the imports array in the @Module() decorator:

import { Module } from '@nestjs/common';
import { AppController } from './app.controller';
import { AppService } from './app.service';
import { ConfigModule } from './config/config.module';

@Module({
  imports: [ConfigModule],
  controllers: [AppController],
  providers: [AppService],
})
export class AppModule {}

Let's consider what a dynamic module import, where we're passing in a configuration object, might look like. Compare the difference in the imports array between these two examples:

import { Module } from '@nestjs/common';
import { AppController } from './app.controller';
import { AppService } from './app.service';
import { ConfigModule } from './config/config.module';

@Module({
  imports: [ConfigModule.register({ folder: './config' })],
  controllers: [AppController],
  providers: [AppService],
})
export class AppModule {}

Let's see what's happening in the dynamic example above. What are the moving parts?

  1. ConfigModule is a normal class, so we can infer that it must have a static method called register(). We know it's static because we're calling it on the ConfigModule class, not on an instance of the class. Note: this method, which we will create soon, can have any arbitrary name, but by convention we should call it either forRoot() or register().
  2. The register() method is defined by us, so we can accept any input arguments we like. In this case, we're going to accept a simple options object with suitable properties, which is the typical case.
  3. We can infer that the register() method must return something like a module since its return value appears in the familiar imports list, which we've seen so far includes a list of modules.

In fact, what our register() method will return is a DynamicModule. A dynamic module is nothing more than a module created at run-time, with the same exact properties as a static module, plus one additional property called module. Let's quickly review a sample static module declaration, paying close attention to the module options passed in to the decorator:

@Module({
  imports: [DogsService],
  controllers: [CatsController],
  providers: [CatsService],
  exports: [CatsService]
})

Dynamic modules must return an object with the exact same interface, plus one additional property called module. The module property serves as the name of the module, and should be the same as the class name of the module, as shown in the example below.

info Hint For a dynamic module, all properties of the module options object are optional except module.

What about the static register() method? We can now see that its job is to return an object that has the DynamicModule interface. When we call it, we are effectively providing a module to the imports list, similar to the way we would do so in the static case by listing a module class name. In other words, the dynamic module API simply returns a module, but rather than fix the properties in the @Module() decorator, we specify them programmatically.

There are still a couple of details to cover to help make the picture complete:

  1. We can now state that the @Module() decorator's imports property can take not only a module class name (e.g., imports: [UsersModule]), but also an expression that returns a dynamic module (e.g., imports: [ConfigModule.register(...)]).
  2. A dynamic module can itself import other modules. We won't do so in this example, but if the dynamic module depends on providers from other modules, you would import them using the optional imports property. Again, this is exactly analogous to the way you'd declare metadata for a static module using the @Module() decorator.

Armed with this understanding, we can now look at what our dynamic ConfigModule declaration must look like. Let's take a crack at it.

import { DynamicModule, Module } from '@nestjs/common';
import { ConfigService } from './config.service';

@Module({})
export class ConfigModule {
  static register(): DynamicModule {
    return {
      module: ConfigModule,
      providers: [ConfigService],
      exports: [ConfigService],
    };
  }
}

It should now be clear how the pieces tie together. Calling ConfigModule.register(...) returns a DynamicModule object with properties which are essentially the same as those that, until now, we've provided as metadata via the @Module() decorator.

info Hint Import DynamicModule from @nestjs/common.

Our dynamic module isn't very interesting yet, however, as we haven't introduced any capability to configure it as we said we would like to do. Let's address that next.

Module configuration

The obvious solution for customizing the behavior of the ConfigModule is to pass it an options object in the static register() method, as we guessed above. Let's look once again at our consuming module's imports property:

import { Module } from '@nestjs/common';
import { AppController } from './app.controller';
import { AppService } from './app.service';
import { ConfigModule } from './config/config.module';

@Module({
  imports: [ConfigModule.register({ folder: './config' })],
  controllers: [AppController],
  providers: [AppService],
})
export class AppModule {}

That nicely handles passing an options object to our dynamic module. How do we then use that options object in the ConfigModule? Let's consider that for a minute. We know that our ConfigModule is basically a host for providing and exporting an injectable service - the ConfigService - for use by other providers. It's actually our ConfigService that needs to read the options object to customize its behavior. Let's assume for the moment that we know how to somehow get the options from the register() method into the ConfigService. With that assumption, we can make a few changes to the service to customize its behavior based on the properties from the options object. (Note: for the time being, since we haven't actually determined how to pass it in, we'll just hard-code options. We'll fix this in a minute).

import { Injectable } from '@nestjs/common';
import * as dotenv from 'dotenv';
import * as fs from 'fs';
import * as path from 'path';
import { EnvConfig } from './interfaces';

@Injectable()
export class ConfigService {
  private readonly envConfig: EnvConfig;

  constructor() {
    const options = { folder: './config' };

    const filePath = `${process.env.NODE_ENV || 'development'}.env`;
    const envFile = path.resolve(__dirname, '../../', options.folder, filePath);
    this.envConfig = dotenv.parse(fs.readFileSync(envFile));
  }

  get(key: string): string {
    return this.envConfig[key];
  }
}

Now our ConfigService knows how to find the .env file in the folder we've specified in options.

Our remaining task is to somehow inject the options object from the register() step into our ConfigService. And of course, we'll use dependency injection to do it. This is a key point, so make sure you understand it. Our ConfigModule is providing ConfigService. ConfigService in turn depends on the options object that is only supplied at run-time. So, at run-time, we'll need to first bind the options object to the Nest IoC container, and then have Nest inject it into our ConfigService. Remember from the Custom providers chapter that providers can include any value, not just services, so we're fine using dependency injection to handle a simple options object.

Let's tackle binding the options object to the IoC container first. We do this in our static register() method. Remember that we are dynamically constructing a module, and one of the properties of a module is its list of providers. So what we need to do is define our options object as a provider. This will make it injectable into the ConfigService, which we'll take advantage of in the next step. In the code below, pay attention to the providers array:

import { DynamicModule, Module } from '@nestjs/common';
import { ConfigService } from './config.service';

@Module({})
export class ConfigModule {
  static register(options): DynamicModule {
    return {
      module: ConfigModule,
      providers: [
        {
          provide: 'CONFIG_OPTIONS',
          useValue: options,
        },
        ConfigService,
      ],
      exports: [ConfigService],
    };
  }
}

Now we can complete the process by injecting the 'CONFIG_OPTIONS' provider into the ConfigService. Recall that when we define a provider using a non-class token we need to use the @Inject() decorator as described here.

import * as dotenv from 'dotenv';
import * as fs from 'fs';
import * as path from 'path';
import { Injectable, Inject } from '@nestjs/common';
import { EnvConfig } from './interfaces';

@Injectable()
export class ConfigService {
  private readonly envConfig: EnvConfig;

  constructor(@Inject('CONFIG_OPTIONS') private options) {
    const filePath = `${process.env.NODE_ENV || 'development'}.env`;
    const envFile = path.resolve(__dirname, '../../', options.folder, filePath);
    this.envConfig = dotenv.parse(fs.readFileSync(envFile));
  }

  get(key: string): string {
    return this.envConfig[key];
  }
}

One final note: for simplicity we used a string-based injection token ('CONFIG_OPTIONS') above, but best practice is to define it as a constant (or Symbol) in a separate file, and import that file. For example:

export const CONFIG_OPTIONS = 'CONFIG_OPTIONS';

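For reference, here's a minimal sketch of how the constant might be shared between the module and the service, assuming it lives in a config/constants.ts file (the file layout is illustrative):

// config/constants.ts
export const CONFIG_OPTIONS = 'CONFIG_OPTIONS';

// config/config.module.ts
import { DynamicModule, Module } from '@nestjs/common';
import { ConfigService } from './config.service';
import { CONFIG_OPTIONS } from './constants';

@Module({})
export class ConfigModule {
  static register(options): DynamicModule {
    return {
      module: ConfigModule,
      providers: [
        { provide: CONFIG_OPTIONS, useValue: options },
        ConfigService,
      ],
      exports: [ConfigService],
    };
  }
}

// config/config.service.ts (constructor only)
constructor(@Inject(CONFIG_OPTIONS) private options) { /* ... */ }
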
Example

A full example of the code in this chapter can be found here.

Execution context

Nest provides several utility classes that help make it easy to write applications that function across multiple application contexts (e.g., Nest HTTP server-based, microservices and WebSockets application contexts). These utilities provide information about the current execution context which can be used to build generic guards, filters, and interceptors that can work across a broad set of controllers, methods, and execution contexts.

We cover two such classes in this chapter: ArgumentsHost and ExecutionContext.

ArgumentsHost class

The ArgumentsHost class provides methods for retrieving the arguments being passed to a handler. It allows choosing the appropriate context (e.g., HTTP, RPC (microservice), or WebSockets) to retrieve the arguments from. The framework provides an instance of ArgumentsHost, typically referenced as a host parameter, in places where you may want to access it. For example, the catch() method of an exception filter is called with an ArgumentsHost instance.

ArgumentsHost simply acts as an abstraction over a handler's arguments. For example, for HTTP server applications (when @nestjs/platform-express is being used), the host object encapsulates Express's [request, response, next] array, where request is the request object, response is the response object, and next is a function that controls the application's request-response cycle. On the other hand, for GraphQL applications, the host object contains the [root, args, context, info] array.

Current application context

When building generic guards, filters, and interceptors which are meant to run across multiple application contexts, we need a way to determine the type of application that our method is currently running in. Do this with the getType() method of ArgumentsHost:

if (host.getType() === 'http') {
  // do something that is only important in the context of regular HTTP requests (REST)
} else if (host.getType() === 'rpc') {
  // do something that is only important in the context of Microservice requests
} else if (host.getType<GqlContextType>() === 'graphql') {
  // do something that is only important in the context of GraphQL requests
}

info Hint The GqlContextType is imported from the @nestjs/graphql package.

With the application type available, we can write more generic components, as shown below.

Host handler arguments

To retrieve the array of arguments being passed to the handler, one approach is to use the host object's getArgs() method.

const [req, res, next] = host.getArgs();

You can pluck a particular argument by index using the getArgByIndex() method:

const request = host.getArgByIndex(0);
const response = host.getArgByIndex(1);

In these examples we retrieved the request and response objects by index, which is not typically recommended as it couples the application to a particular execution context. Instead, you can make your code more robust and reusable by using one of the host object's utility methods to switch to the appropriate application context for your application. The context switch utility methods are shown below.

/**
 * Switch context to RPC.
 */
switchToRpc(): RpcArgumentsHost;
/**
 * Switch context to HTTP.
 */
switchToHttp(): HttpArgumentsHost;
/**
 * Switch context to WebSockets.
 */
switchToWs(): WsArgumentsHost;

Let's rewrite the previous example using the switchToHttp() method. The host.switchToHttp() helper call returns an HttpArgumentsHost object that is appropriate for the HTTP application context. The HttpArgumentsHost object has two useful methods we can use to extract the desired objects. We also use the Express type assertions in this case to return native Express typed objects:

const ctx = host.switchToHttp();
const request = ctx.getRequest<Request>();
const response = ctx.getResponse<Response>();

Similarly, WsArgumentsHost and RpcArgumentsHost have methods to return the appropriate objects in the WebSockets and microservices contexts, respectively. Here are the methods for WsArgumentsHost:

export interface WsArgumentsHost {
  /**
   * Returns the data object.
   */
  getData<T>(): T;
  /**
   * Returns the client object.
   */
  getClient<T>(): T;
}

Following are the methods for RpcArgumentsHost:

export interface RpcArgumentsHost {
  /**
   * Returns the data object.
   */
  getData<T>(): T;

  /**
   * Returns the context object.
   */
  getContext<T>(): T;
}
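
To make this concrete, below is a hedged sketch of a catch-all exception filter that combines getType() with the context-switch helpers. The response shape and the logging are illustrative assumptions, not framework requirements (a real RPC filter would typically return an error Observable instead):

import { ArgumentsHost, Catch, ExceptionFilter, HttpException } from '@nestjs/common';
import { Response } from 'express';

@Catch()
export class AllContextsExceptionFilter implements ExceptionFilter {
  catch(exception: unknown, host: ArgumentsHost) {
    if (host.getType() === 'http') {
      // HTTP context: reply with a minimal JSON error body
      const response = host.switchToHttp().getResponse<Response>();
      const status =
        exception instanceof HttpException ? exception.getStatus() : 500;
      response.status(status).json({ statusCode: status });
    } else if (host.getType() === 'rpc') {
      // RPC (microservice) context: just log here for the sake of the sketch
      console.error('RPC error', exception);
    }
  }
}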

ExecutionContext class

ExecutionContext extends ArgumentsHost, providing additional details about the current execution process. Like ArgumentsHost, Nest provides an instance of ExecutionContext in places you may need it, such as in the canActivate() method of a guard and the intercept() method of an interceptor. It provides the following methods:

export interface ExecutionContext extends ArgumentsHost {
  /**
   * Returns the type of the controller class which the current handler belongs to.
   */
  getClass<T>(): Type<T>;
  /**
   * Returns a reference to the handler (method) that will be invoked next in the
   * request pipeline.
   */
  getHandler(): Function;
}

The getHandler() method returns a reference to the handler about to be invoked. The getClass() method returns the type of the Controller class which this particular handler belongs to. For example, in an HTTP context, if the currently processed request is a POST request, bound to the create() method on the CatsController, getHandler() returns a reference to the create() method and getClass() returns the CatsController type (not instance).

const methodKey = ctx.getHandler().name; // "create"
const className = ctx.getClass().name; // "CatsController"

The ability to access references to both the current class and handler method provides great flexibility. Most importantly, it gives us the opportunity to access the metadata set through the @SetMetadata() decorator from within guards or interceptors. We cover this use case below.

Reflection and metadata

Nest provides the ability to attach custom metadata to route handlers through the @SetMetadata() decorator. We can then access this metadata from within our class to make certain decisions.

@@filename(cats.controller)
@Post()
@SetMetadata('roles', ['admin'])
async create(@Body() createCatDto: CreateCatDto) {
  this.catsService.create(createCatDto);
}
@@switch
@Post()
@SetMetadata('roles', ['admin'])
@Bind(Body())
async create(createCatDto) {
  this.catsService.create(createCatDto);
}

info Hint The @SetMetadata() decorator is imported from the @nestjs/common package.

With the construction above, we attached the roles metadata (roles is a metadata key and ['admin'] is the associated value) to the create() method. While this works, it's not good practice to use @SetMetadata() directly in your routes. Instead, create your own decorators, as shown below:

@@filename(roles.decorator)
import { SetMetadata } from '@nestjs/common';

export const Roles = (...roles: string[]) => SetMetadata('roles', roles);
@@switch
import { SetMetadata } from '@nestjs/common';

export const Roles = (...roles) => SetMetadata('roles', roles);

This approach is much cleaner and more readable, and is strongly typed. Now that we have a custom @Roles() decorator, we can use it to decorate the create() method.

@@filename(cats.controller)
@Post()
@Roles('admin')
async create(@Body() createCatDto: CreateCatDto) {
  this.catsService.create(createCatDto);
}
@@switch
@Post()
@Roles('admin')
@Bind(Body())
async create(createCatDto) {
  this.catsService.create(createCatDto);
}

To access the route's role(s) (custom metadata), we'll use the Reflector helper class, which is provided out of the box by the framework and exposed from the @nestjs/core package. Reflector can be injected into a class in the normal way:

@@filename(roles.guard)
@Injectable()
export class RolesGuard {
  constructor(private reflector: Reflector) {}
}
@@switch
@Injectable()
@Dependencies(Reflector)
export class RolesGuard {
  constructor(reflector) {
    this.reflector = reflector;
  }
}

info Hint The Reflector class is imported from the @nestjs/core package.

Now, to read the handler metadata, use the get() method.

const roles = this.reflector.get<string[]>('roles', context.getHandler());

The Reflector#get method allows us to easily access the metadata by passing in two arguments: a metadata key and a context (decorator target) to retrieve the metadata from. In this example, the specified key is 'roles' (refer back to the roles.decorator.ts file above and the SetMetadata() call made there). The context is provided by the call to context.getHandler(), which results in extracting the metadata for the currently processed route handler. Remember, getHandler() gives us a reference to the route handler function.
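
Putting these pieces together, here is a hedged sketch of a complete RolesGuard. The request.user.roles shape is an assumption about your authentication layer, not something Nest provides:

import { CanActivate, ExecutionContext, Injectable } from '@nestjs/common';
import { Reflector } from '@nestjs/core';

@Injectable()
export class RolesGuard implements CanActivate {
  constructor(private reflector: Reflector) {}

  canActivate(context: ExecutionContext): boolean {
    const roles = this.reflector.get<string[]>('roles', context.getHandler());
    if (!roles) {
      // No roles metadata on this handler: allow the request through
      return true;
    }
    const request = context.switchToHttp().getRequest();
    // Assumption: an upstream auth mechanism attaches a user with a roles array
    const user = request.user;
    return Boolean(user?.roles?.some((role: string) => roles.includes(role)));
  }
}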

Alternatively, we may organize our controller by applying metadata at the controller level, which applies to all routes in the controller class.

@@filename(cats.controller)
@Roles('admin')
@Controller('cats')
export class CatsController {}
@@switch
@Roles('admin')
@Controller('cats')
export class CatsController {}

In this case, to extract controller metadata, we pass context.getClass() as the second argument (to provide the controller class as the context for metadata extraction) instead of context.getHandler():

@@filename(roles.guard)
const roles = this.reflector.get<string[]>('roles', context.getClass());
@@switch
const roles = this.reflector.get('roles', context.getClass());

Given the ability to provide metadata at multiple levels, you may need to extract and merge metadata from several contexts. The Reflector class provides two utility methods used to help with this. These methods extract both controller and method metadata at once, and combine them in different ways.

Consider the following scenario, where you've supplied 'roles' metadata at both levels.

@@filename(cats.controller)
@Roles('user')
@Controller('cats')
export class CatsController {
  @Post()
  @Roles('admin')
  async create(@Body() createCatDto: CreateCatDto) {
    this.catsService.create(createCatDto);
  }
}
@@switch
@Roles('user')
@Controller('cats')
export class CatsController {
  @Post()
  @Roles('admin')
  @Bind(Body())
  async create(createCatDto) {
    this.catsService.create(createCatDto);
  }
}

If your intent is to specify 'user' as the default role, and override it selectively for certain methods, you would probably use the getAllAndOverride() method.

const roles = this.reflector.getAllAndOverride<string[]>('roles', [
  context.getHandler(),
  context.getClass(),
]);

A guard with this code, running in the context of the create() method, with the above metadata, would result in roles containing ['admin'].

To get metadata for both and merge it (this method merges both arrays and objects), use the getAllAndMerge() method:

const roles = this.reflector.getAllAndMerge<string[]>('roles', [
  context.getHandler(),
  context.getClass(),
]);

This would result in roles containing ['user', 'admin'].

For both of these merge methods, you pass the metadata key as the first argument, and an array of metadata target contexts (i.e., calls to the getHandler() and/or getClass()) methods) as the second argument.

Lifecycle Events

A Nest application, as well as every application element, has a lifecycle managed by Nest. Nest provides lifecycle hooks that give visibility into key lifecycle events, and the ability to act (run registered code on your module, injectable or controller) when they occur.

Lifecycle sequence

The following diagram depicts the sequence of key application lifecycle events, from the time the application is bootstrapped until the node process exits. We can divide the overall lifecycle into three phases: initializing, running and terminating. Using this lifecycle, you can plan for appropriate initialization of modules and services, manage active connections, and gracefully shut down your application when it receives a termination signal.

Lifecycle events

Lifecycle events happen during application bootstrapping and shutdown. Nest calls registered lifecycle hook methods on modules, injectables and controllers at each of the following lifecycle events (shutdown hooks need to be enabled first, as described below). As shown in the diagram above, Nest also calls the appropriate underlying methods to begin listening for connections, and to stop listening for connections.

In the following table, onModuleDestroy, beforeApplicationShutdown and onApplicationShutdown are only triggered if you explicitly call app.close() or if the process receives a special system signal (such as SIGTERM) and you have correctly called enableShutdownHooks at application bootstrap (see below Application shutdown part).

Lifecycle hook method Lifecycle event triggering the hook method call
onModuleInit() Called once the host module's dependencies have been resolved.
onApplicationBootstrap() Called once all modules have been initialized, but before listening for connections.
onModuleDestroy()* Called after a termination signal (e.g., SIGTERM) has been received.
beforeApplicationShutdown()* Called after all onModuleDestroy() handlers have completed (Promises resolved or rejected); once complete (Promises resolved or rejected), all existing connections will be closed (app.close() called).
onApplicationShutdown()* Called after connections close (app.close() resolves).

* For these events, if you're not calling app.close() explicitly, you must opt-in to make them work with system signals such as SIGTERM. See Application shutdown below.

warning Warning The lifecycle hooks listed above are not triggered for request-scoped classes. Request-scoped classes are not tied to the application lifecycle and their lifespan is unpredictable. They are exclusively created for each request and automatically garbage-collected after the response is sent.

Usage

Each lifecycle hook is represented by an interface. Interfaces are technically optional because they do not exist after TypeScript compilation. Nonetheless, it's good practice to use them in order to benefit from strong typing and editor tooling. To register a lifecycle hook, implement the appropriate interface. For example, to register a method to be called during module initialization on a particular class (e.g., Controller, Provider or Module), implement the OnModuleInit interface by supplying an onModuleInit() method, as shown below:

@@filename()
import { Injectable, OnModuleInit } from '@nestjs/common';

@Injectable()
export class UsersService implements OnModuleInit {
  onModuleInit() {
    console.log(`The module has been initialized.`);
  }
}
@@switch
import { Injectable } from '@nestjs/common';

@Injectable()
export class UsersService {
  onModuleInit() {
    console.log(`The module has been initialized.`);
  }
}

Asynchronous initialization

Both the OnModuleInit and OnApplicationBootstrap hooks allow you to defer the application initialization process (return a Promise or mark the method as async and await an asynchronous method completion in the method body).

@@filename()
async onModuleInit(): Promise<void> {
  await this.fetch();
}
@@switch
async onModuleInit() {
  await this.fetch();
}

Application shutdown

The onModuleDestroy(), beforeApplicationShutdown() and onApplicationShutdown() hooks are called in the terminating phase (in response to an explicit call to app.close() or upon receipt of system signals such as SIGTERM if opted-in). This feature is often used with Kubernetes to manage containers' lifecycles, by Heroku for dynos or similar services.

Shutdown hook listeners consume system resources, so they are disabled by default. To use shutdown hooks, you must enable listeners by calling enableShutdownHooks():

import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);

  // Starts listening for shutdown hooks
  app.enableShutdownHooks();

  await app.listen(3000);
}
bootstrap();

warning Warning Due to inherent platform limitations, NestJS has limited support for application shutdown hooks on Windows. You can expect SIGINT to work, as well as SIGBREAK and, to some extent, SIGHUP - read more. However, SIGTERM will never work on Windows because killing a process in the task manager is unconditional, "i.e., there's no way for an application to detect or prevent it". Here's some relevant documentation from libuv to learn more about how SIGINT, SIGBREAK and others are handled on Windows. Also, see the Node.js documentation on Process Signal Events.

info Info enableShutdownHooks consumes memory by starting listeners. In cases where you are running multiple Nest apps in a single Node process (e.g., when running parallel tests with Jest), Node may complain about excessive listener processes. For this reason, enableShutdownHooks is not enabled by default. Be aware of this condition when you are running multiple instances in a single Node process.

When the application receives a termination signal it will call any registered onModuleDestroy(), beforeApplicationShutdown(), then onApplicationShutdown() methods (in the sequence described above) with the corresponding signal as the first parameter. If a registered function awaits an asynchronous call (returns a promise), Nest will not continue in the sequence until the promise is resolved or rejected.

@@filename()
@Injectable()
class UsersService implements OnApplicationShutdown {
  onApplicationShutdown(signal: string) {
    console.log(signal); // e.g. "SIGINT"
  }
}
@@switch
@Injectable()
class UsersService implements OnApplicationShutdown {
  onApplicationShutdown(signal) {
    console.log(signal); // e.g. "SIGINT"
  }
}
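
To observe the full terminating sequence in one place, here is a hedged sketch of a single service implementing all three shutdown-phase hooks; the class name and logging are illustrative only:

import {
  Injectable,
  OnModuleDestroy,
  BeforeApplicationShutdown,
  OnApplicationShutdown,
} from '@nestjs/common';

@Injectable()
export class ConnectionService
  implements OnModuleDestroy, BeforeApplicationShutdown, OnApplicationShutdown
{
  onModuleDestroy() {
    // 1. Runs first, after app.close() is called or a termination signal arrives
    console.log('onModuleDestroy');
  }

  beforeApplicationShutdown(signal?: string) {
    // 2. Runs after all onModuleDestroy() handlers have settled
    console.log('beforeApplicationShutdown', signal);
  }

  onApplicationShutdown(signal?: string) {
    // 3. Runs last, after existing connections have been closed
    console.log('onApplicationShutdown', signal);
  }
}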

Module reference

Nest provides the ModuleRef class to navigate the internal list of providers and obtain a reference to any provider using its injection token as a lookup key. The ModuleRef class also provides a way to dynamically instantiate both static and scoped providers. ModuleRef can be injected into a class in the normal way:

@@filename(cats.service)
@Injectable()
export class CatsService {
  constructor(private moduleRef: ModuleRef) {}
}
@@switch
@Injectable()
@Dependencies(ModuleRef)
export class CatsService {
  constructor(moduleRef) {
    this.moduleRef = moduleRef;
  }
}

info Hint The ModuleRef class is imported from the @nestjs/core package.

Retrieving instances

The ModuleRef instance (hereafter we'll refer to it as the module reference) has a get() method. This method retrieves a provider, controller, or injectable (e.g., guard, interceptor, etc.) that exists (has been instantiated) in the current module using its injection token/class name.

@@filename(cats.service)
@Injectable()
export class CatsService implements OnModuleInit {
  private service: Service;
  constructor(private moduleRef: ModuleRef) {}

  onModuleInit() {
    this.service = this.moduleRef.get(Service);
  }
}
@@switch
@Injectable()
@Dependencies(ModuleRef)
export class CatsService {
  constructor(moduleRef) {
    this.moduleRef = moduleRef;
  }

  onModuleInit() {
    this.service = this.moduleRef.get(Service);
  }
}

warning Warning You can't retrieve scoped providers (transient or request-scoped) with the get() method. Instead, use the technique described below. Learn how to control scopes here.

To retrieve a provider from the global context (for example, if the provider has been injected in a different module), pass the { strict: false } option as a second argument to get().

this.moduleRef.get(Service, { strict: false });

Resolving scoped providers

To dynamically resolve a scoped provider (transient or request-scoped), use the resolve() method, passing the provider's injection token as an argument.

@@filename(cats.service)
@Injectable()
export class CatsService implements OnModuleInit {
  private transientService: TransientService;
  constructor(private moduleRef: ModuleRef) {}

  async onModuleInit() {
    this.transientService = await this.moduleRef.resolve(TransientService);
  }
}
@@switch
@Injectable()
@Dependencies(ModuleRef)
export class CatsService {
  constructor(moduleRef) {
    this.moduleRef = moduleRef;
  }

  async onModuleInit() {
    this.transientService = await this.moduleRef.resolve(TransientService);
  }
}

The resolve() method returns a unique instance of the provider, from its own DI container sub-tree. Each sub-tree has a unique context identifier. Thus, if you call this method more than once and compare instance references, you will see that they are not equal.

@@filename(cats.service)
@Injectable()
export class CatsService implements OnModuleInit {
  constructor(private moduleRef: ModuleRef) {}

  async onModuleInit() {
    const transientServices = await Promise.all([
      this.moduleRef.resolve(TransientService),
      this.moduleRef.resolve(TransientService),
    ]);
    console.log(transientServices[0] === transientServices[1]); // false
  }
}
@@switch
@Injectable()
@Dependencies(ModuleRef)
export class CatsService {
  constructor(moduleRef) {
    this.moduleRef = moduleRef;
  }

  async onModuleInit() {
    const transientServices = await Promise.all([
      this.moduleRef.resolve(TransientService),
      this.moduleRef.resolve(TransientService),
    ]);
    console.log(transientServices[0] === transientServices[1]); // false
  }
}

To generate a single instance across multiple resolve() calls, and ensure they share the same generated DI container sub-tree, you can pass a context identifier to the resolve() method. Use the ContextIdFactory class to generate a context identifier. This class provides a create() method that returns an appropriate unique identifier.

@@filename(cats.service)
@Injectable()
export class CatsService implements OnModuleInit {
  constructor(private moduleRef: ModuleRef) {}

  async onModuleInit() {
    const contextId = ContextIdFactory.create();
    const transientServices = await Promise.all([
      this.moduleRef.resolve(TransientService, contextId),
      this.moduleRef.resolve(TransientService, contextId),
    ]);
    console.log(transientServices[0] === transientServices[1]); // true
  }
}
@@switch
@Injectable()
@Dependencies(ModuleRef)
export class CatsService {
  constructor(moduleRef) {
    this.moduleRef = moduleRef;
  }

  async onModuleInit() {
    const contextId = ContextIdFactory.create();
    const transientServices = await Promise.all([
      this.moduleRef.resolve(TransientService, contextId),
      this.moduleRef.resolve(TransientService, contextId),
    ]);
    console.log(transientServices[0] === transientServices[1]); // true
  }
}

info Hint The ContextIdFactory class is imported from the @nestjs/core package.

Getting current sub-tree

Occasionally, you may want to resolve an instance of a request-scoped provider within a request context. Let's say that CatsService is request-scoped and you want to resolve the CatsRepository instance which is also marked as a request-scoped provider. In order to share the same DI container sub-tree, you must obtain the current context identifier instead of generating a new one (e.g., with the ContextIdFactory.create() function, as shown above). To obtain the current context identifier, start by injecting the request object using the @Inject() decorator.

@@filename(cats.service)
@Injectable()
export class CatsService {
  constructor(
    @Inject(REQUEST) private request: Record<string, unknown>,
  ) {}
}
@@switch
@Injectable()
@Dependencies(REQUEST)
export class CatsService {
  constructor(request) {
    this.request = request;
  }
}

info Hint Learn more about the request provider here.

Now, use the getByRequest() method of the ContextIdFactory class to create a context id based on the request object, and pass this to the resolve() call:

const contextId = ContextIdFactory.getByRequest(this.request);
const catsRepository = await this.moduleRef.resolve(CatsRepository, contextId);

Instantiating custom classes dynamically

To dynamically instantiate a class that wasn't previously registered as a provider, use the module reference's create() method.

@@filename(cats.service)
@Injectable()
export class CatsService implements OnModuleInit {
  private catsFactory: CatsFactory;
  constructor(private moduleRef: ModuleRef) {}

  async onModuleInit() {
    this.catsFactory = await this.moduleRef.create(CatsFactory);
  }
}
@@switch
@Injectable()
@Dependencies(ModuleRef)
export class CatsService {
  constructor(moduleRef) {
    this.moduleRef = moduleRef;
  }

  async onModuleInit() {
    this.catsFactory = await this.moduleRef.create(CatsFactory);
  }
}

This technique enables you to conditionally instantiate different classes outside of the framework container.
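
As a hedged sketch of what that can look like, the factory classes and the environment flag below are illustrative assumptions rather than anything prescribed by the framework:

import { Injectable, OnModuleInit } from '@nestjs/common';
import { ModuleRef } from '@nestjs/core';

// Hypothetical strategy classes; note they are never registered as providers
class FastCatsFactory {}
class SafeCatsFactory {}

@Injectable()
export class CatsService implements OnModuleInit {
  private catsFactory: FastCatsFactory | SafeCatsFactory;
  constructor(private moduleRef: ModuleRef) {}

  async onModuleInit() {
    // Pick an implementation based on a (hypothetical) configuration flag
    this.catsFactory =
      process.env.CATS_MODE === 'fast'
        ? await this.moduleRef.create(FastCatsFactory)
        : await this.moduleRef.create(SafeCatsFactory);
  }
}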

Platform agnosticism

Nest is a platform-agnostic framework. This means you can develop reusable logical parts that can be used across different types of applications. For example, most components can be re-used without change across different underlying HTTP server frameworks (e.g., Express and Fastify), and even across different types of applications (e.g., HTTP server frameworks, Microservices with different transport layers, and Web Sockets).

Build once, use everywhere

The Overview section of the documentation primarily shows coding techniques using HTTP server frameworks (e.g., apps providing a REST API or providing an MVC-style server-side rendered app). However, all those building blocks can be used on top of different transport layers (microservices or websockets).

Furthermore, Nest comes with a dedicated GraphQL module. You can use GraphQL as your API layer interchangeably with providing a REST API.

In addition, the application context feature helps to create any kind of Node.js application - including things like CRON jobs and CLI apps - on top of Nest.

Nest aspires to be a full-fledged platform for Node.js apps that brings a higher-level of modularity and reusability to your applications. Build once, use everywhere!

Injection scopes

For people coming from different programming language backgrounds, it might be unexpected to learn that in Nest, almost everything is shared across incoming requests. We have a connection pool to the database, singleton services with global state, etc. Remember that Node.js doesn't follow the request/response Multi-Threaded Stateless Model in which every request is processed by a separate thread. Hence, using singleton instances is fully safe for our applications.

However, there are edge-cases when request-based lifetime may be the desired behavior, for instance per-request caching in GraphQL applications, request tracking, and multi-tenancy. Injection scopes provide a mechanism to obtain the desired provider lifetime behavior.

Provider scope

A provider can have any of the following scopes:

DEFAULT A single instance of the provider is shared across the entire application. The instance lifetime is tied directly to the application lifecycle. Once the application has bootstrapped, all singleton providers have been instantiated. Singleton scope is used by default.
REQUEST A new instance of the provider is created exclusively for each incoming request. The instance is garbage-collected after the request has completed processing.
TRANSIENT Transient providers are not shared across consumers. Each consumer that injects a transient provider will receive a new, dedicated instance.

info Hint Using singleton scope is recommended for most use cases. Sharing providers across consumers and across requests means that an instance can be cached and its initialization occurs only once, during application startup.

Usage

Specify injection scope by passing the scope property to the @Injectable() decorator options object:

import { Injectable, Scope } from '@nestjs/common';

@Injectable({ scope: Scope.REQUEST })
export class CatsService {}

Similarly, for custom providers, set the scope property in the long-hand form for a provider registration:

{
  provide: 'CACHE_MANAGER',
  useClass: CacheManager,
  scope: Scope.TRANSIENT,
}

info Hint Import the Scope enum from @nestjs/common.

warning Notice Gateways should not use request-scoped providers because they must act as singletons. Each gateway encapsulates a real socket and cannot be instantiated multiple times.

Singleton scope is used by default, and need not be declared. If you do want to declare a provider as singleton scoped, use the Scope.DEFAULT value for the scope property.

Controller scope

Controllers can also have scope, which applies to all request method handlers declared in that controller. Like provider scope, the scope of a controller declares its lifetime. For a request-scoped controller, a new instance is created for each inbound request, and garbage-collected when the request has completed processing.

Declare controller scope with the scope property of the ControllerOptions object:

@Controller({
  path: 'cats',
  scope: Scope.REQUEST,
})
export class CatsController {}

Scope hierarchy

Scope bubbles up the injection chain. A controller that depends on a request-scoped provider will, itself, be request-scoped.

Imagine the following dependency graph: CatsController <- CatsService <- CatsRepository. If CatsService is request-scoped (and the others are default singletons), the CatsController will become request-scoped as it is dependent on the injected service. The CatsRepository, which is not dependent, would remain singleton-scoped.
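
Here is a hedged sketch of that graph in code. Only CatsService declares Scope.REQUEST explicitly, yet CatsController effectively becomes request-scoped as well because it injects that service:

import { Controller, Injectable, Scope } from '@nestjs/common';

@Injectable()
export class CatsRepository {} // remains a singleton: no request-scoped dependency

@Injectable({ scope: Scope.REQUEST })
export class CatsService {
  constructor(private catsRepository: CatsRepository) {}
}

@Controller('cats')
export class CatsController {
  // Request-scoped by propagation: it depends on a request-scoped provider
  constructor(private catsService: CatsService) {}
}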

Request provider

In an HTTP server-based application (e.g., using @nestjs/platform-express or @nestjs/platform-fastify), you may want to access a reference to the original request object when using request-scoped providers. You can do this by injecting the REQUEST object.

import { Injectable, Scope, Inject } from '@nestjs/common';
import { REQUEST } from '@nestjs/core';
import { Request } from 'express';

@Injectable({ scope: Scope.REQUEST })
export class CatsService {
  constructor(@Inject(REQUEST) private request: Request) {}
}

Because of underlying platform/protocol differences, you access the inbound request slightly differently for Microservice or GraphQL applications. In GraphQL applications, you inject CONTEXT instead of REQUEST:

import { Injectable, Scope, Inject } from '@nestjs/common';
import { CONTEXT } from '@nestjs/graphql';

@Injectable({ scope: Scope.REQUEST })
export class CatsService {
  constructor(@Inject(CONTEXT) private context) {}
}

You then configure your context value (in the GraphQLModule) to contain request as its property.
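
A hedged sketch of such a configuration, exposing the inbound HTTP request as a req property on the GraphQL context (the remaining module options are elided):

GraphQLModule.forRoot({
  // ... other module options
  context: ({ req }) => ({ req }),
});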

Performance

Using request-scoped providers will have an impact on application performance. While Nest tries to cache as much metadata as possible, it will still have to create an instance of your class on each request. Hence, it will slow down your average response time and overall benchmarking result. Unless a provider must be request-scoped, it is strongly recommended that you use the default singleton scope.

Testing

Automated testing is considered an essential part of any serious software development effort. Automation makes it easy to repeat individual tests or test suites quickly during development. This helps ensure that releases meet quality and performance goals. Automation helps increase coverage and provides a faster feedback loop to developers. Automation both increases the productivity of individual developers and ensures that tests are run at critical development lifecycle junctures, such as source code control check-in, feature integration, and version release.

Such tests often span a variety of types, including unit tests, end-to-end (e2e) tests, integration tests, and so on. While the benefits are unquestionable, it can be tedious to set them up. Nest strives to promote development best practices, including effective testing, so it includes features such as the following to help developers and teams build and automate tests. Nest:

  • automatically scaffolds default unit tests for components and e2e tests for applications
  • provides default tooling (such as a test runner that builds an isolated module/application loader)
  • provides integration with Jest and Supertest out-of-the-box, while remaining agnostic to testing tools
  • makes the Nest dependency injection system available in the testing environment for easily mocking components

As mentioned, you can use any testing framework that you like, as Nest doesn't force any specific tooling. Simply replace the elements needed (such as the test runner), and you will still enjoy the benefits of Nest's ready-made testing facilities.

Installation

To get started, first install the required package:

$ npm i --save-dev @nestjs/testing

Unit testing

In the following example, we test two classes: CatsController and CatsService. As mentioned, Jest is provided as the default testing framework. It serves as a test-runner and also provides assert functions and test-double utilities that help with mocking, spying, etc. In the following basic test, we manually instantiate these classes, and ensure that the controller and service fulfill their API contract.

@@filename(cats.controller.spec)
import { CatsController } from './cats.controller';
import { CatsService } from './cats.service';

describe('CatsController', () => {
  let catsController: CatsController;
  let catsService: CatsService;

  beforeEach(() => {
    catsService = new CatsService();
    catsController = new CatsController(catsService);
  });

  describe('findAll', () => {
    it('should return an array of cats', async () => {
      const result = ['test'];
      jest.spyOn(catsService, 'findAll').mockImplementation(() => result);

      expect(await catsController.findAll()).toBe(result);
    });
  });
});
@@switch
import { CatsController } from './cats.controller';
import { CatsService } from './cats.service';

describe('CatsController', () => {
  let catsController;
  let catsService;

  beforeEach(() => {
    catsService = new CatsService();
    catsController = new CatsController(catsService);
  });

  describe('findAll', () => {
    it('should return an array of cats', async () => {
      const result = ['test'];
      jest.spyOn(catsService, 'findAll').mockImplementation(() => result);

      expect(await catsController.findAll()).toBe(result);
    });
  });
});

info Hint Keep your test files located near the classes they test. Testing files should have a .spec or .test suffix.

Because the above sample is trivial, we aren't really testing anything Nest-specific. Indeed, we aren't even using dependency injection (notice that we pass an instance of CatsService to our catsController). This form of testing - where we manually instantiate the classes being tested - is often called isolated testing as it is independent from the framework. Let's introduce some more advanced capabilities that help you test applications that make more extensive use of Nest features.

Testing utilities

The @nestjs/testing package provides a set of utilities that enable a more robust testing process. Let's rewrite the previous example using the built-in Test class:

@@filename(cats.controller.spec)
import { Test } from '@nestjs/testing';
import { CatsController } from './cats.controller';
import { CatsService } from './cats.service';

describe('CatsController', () => {
  let catsController: CatsController;
  let catsService: CatsService;

  beforeEach(async () => {
    const moduleRef = await Test.createTestingModule({
        controllers: [CatsController],
        providers: [CatsService],
      }).compile();

    catsService = moduleRef.get<CatsService>(CatsService);
    catsController = moduleRef.get<CatsController>(CatsController);
  });

  describe('findAll', () => {
    it('should return an array of cats', async () => {
      const result = ['test'];
      jest.spyOn(catsService, 'findAll').mockImplementation(() => result);

      expect(await catsController.findAll()).toBe(result);
    });
  });
});
@@switch
import { Test } from '@nestjs/testing';
import { CatsController } from './cats.controller';
import { CatsService } from './cats.service';

describe('CatsController', () => {
  let catsController;
  let catsService;

  beforeEach(async () => {
    const moduleRef = await Test.createTestingModule({
        controllers: [CatsController],
        providers: [CatsService],
      }).compile();

    catsService = moduleRef.get(CatsService);
    catsController = moduleRef.get(CatsController);
  });

  describe('findAll', () => {
    it('should return an array of cats', async () => {
      const result = ['test'];
      jest.spyOn(catsService, 'findAll').mockImplementation(() => result);

      expect(await catsController.findAll()).toBe(result);
    });
  });
});

The Test class is useful for providing an application execution context that essentially mocks the full Nest runtime, but gives you hooks that make it easy to manage class instances, including mocking and overriding. The Test class has a createTestingModule() method that takes a module metadata object as its argument (the same object you pass to the @Module() decorator). This method returns a TestingModule instance which in turn provides a few methods. For unit tests, the important one is the compile() method. This method bootstraps a module with its dependencies (similar to the way an application is bootstrapped in the conventional main.ts file using NestFactory.create()), and returns a module that is ready for testing.

info Hint The compile() method is asynchronous and therefore has to be awaited. Once the module is compiled you can retrieve any static instance it declares (controllers and providers) using the get() method.

TestingModule inherits from the module reference class, and therefore inherits its ability to dynamically resolve scoped providers (transient or request-scoped). Do this with the resolve() method (the get() method can only retrieve static instances).

const moduleRef = await Test.createTestingModule({
  controllers: [CatsController],
  providers: [CatsService],
}).compile();

catsService = await moduleRef.resolve(CatsService);

warning Warning The resolve() method returns a unique instance of the provider, from its own DI container sub-tree. Each sub-tree has a unique context identifier. Thus, if you call this method more than once and compare instance references, you will see that they are not equal.

info Hint Learn more about the module reference features here.

Instead of using the production version of any provider, you can override it with a custom provider for testing purposes. For example, you can mock a database service instead of connecting to a live database. We'll cover overrides in the next section, but they're available for unit tests as well.
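
For instance, here is a hedged sketch of swapping the real CatsService for a plain mock object inside a unit test; the mock's shape is an assumption about what the controller actually calls:

const catsServiceMock = { findAll: () => ['test'] };

const moduleRef = await Test.createTestingModule({
  controllers: [CatsController],
  providers: [CatsService],
})
  // Replace the real service with the mock for this testing module only
  .overrideProvider(CatsService)
  .useValue(catsServiceMock)
  .compile();

catsController = moduleRef.get<CatsController>(CatsController);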

End-to-end testing

Unlike unit testing, which focuses on individual modules and classes, end-to-end (e2e) testing covers the interaction of classes and modules at a more aggregate level -- closer to the kind of interaction that end-users will have with the production system. As an application grows, it becomes hard to manually test the end-to-end behavior of each API endpoint. Automated end-to-end tests help us ensure that the overall behavior of the system is correct and meets project requirements. To perform e2e tests we use a similar configuration to the one we just covered in unit testing. In addition, Nest makes it easy to use the Supertest library to simulate HTTP requests.

@@filename(cats.e2e-spec)
import * as request from 'supertest';
import { Test } from '@nestjs/testing';
import { CatsModule } from '../../src/cats/cats.module';
import { CatsService } from '../../src/cats/cats.service';
import { INestApplication } from '@nestjs/common';

describe('Cats', () => {
  let app: INestApplication;
  let catsService = { findAll: () => ['test'] };

  beforeAll(async () => {
    const moduleRef = await Test.createTestingModule({
      imports: [CatsModule],
    })
      .overrideProvider(CatsService)
      .useValue(catsService)
      .compile();

    app = moduleRef.createNestApplication();
    await app.init();
  });

  it(`/GET cats`, () => {
    return request(app.getHttpServer())
      .get('/cats')
      .expect(200)
      .expect({
        data: catsService.findAll(),
      });
  });

  afterAll(async () => {
    await app.close();
  });
});
@@switch
import * as request from 'supertest';
import { Test } from '@nestjs/testing';
import { CatsModule } from '../../src/cats/cats.module';
import { CatsService } from '../../src/cats/cats.service';

describe('Cats', () => {
  let app;
  let catsService = { findAll: () => ['test'] };

  beforeAll(async () => {
    const moduleRef = await Test.createTestingModule({
      imports: [CatsModule],
    })
      .overrideProvider(CatsService)
      .useValue(catsService)
      .compile();

    app = moduleRef.createNestApplication();
    await app.init();
  });

  it(`/GET cats`, () => {
    return request(app.getHttpServer())
      .get('/cats')
      .expect(200)
      .expect({
        data: catsService.findAll(),
      });
  });

  afterAll(async () => {
    await app.close();
  });
});

In this example, we build on some of the concepts described earlier. In addition to the compile() method we used earlier, we now use the createNestApplication() method to instantiate a full Nest runtime environment. We save a reference to the running app in our app variable so we can use it to simulate HTTP requests.

We simulate HTTP tests using the request() function from Supertest. We want these HTTP requests to route to our running Nest app, so we pass the request() function a reference to the HTTP listener that underlies Nest (which, in turn, may be provided by the Express platform). Hence the construction request(app.getHttpServer()). The call to request() hands us a wrapped HTTP Server, now connected to the Nest app, which exposes methods to simulate an actual HTTP request. For example, using request(...).get('/cats') will initiate a request to the Nest app that is identical to an actual HTTP request like get '/cats' coming in over the network.

In this example, we also provide an alternate (test-double) implementation of the CatsService which simply returns a hard-coded value that we can test for. Use overrideProvider() to provide such an alternate implementation. Similarly, Nest provides methods to override guards, interceptors, filters and pipes with the overrideGuard(), overrideInterceptor(), overrideFilter(), and overridePipe() methods respectively.

Each of the override methods returns an object with 3 different methods that mirror those described for custom providers:

  • useClass: you supply a class that will be instantiated to provide the instance to override the object (provider, guard, etc.).
  • useValue: you supply an instance that will override the object.
  • useFactory: you supply a function that returns an instance that will override the object.

Each of the override method types, in turn, returns the TestingModule instance, and can thus be chained with other methods in the fluent style. You should use compile() at the end of such a chain to cause Nest to instantiate and initialize the module.
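
For example, here is a hedged sketch that stubs out a guard in the same fluent chain used above; the AuthGuard class is an illustrative assumption:

const moduleRef = await Test.createTestingModule({
  imports: [CatsModule],
})
  .overrideProvider(CatsService)
  .useValue(catsService)
  // Replace the (hypothetical) AuthGuard with a stub that always allows access
  .overrideGuard(AuthGuard)
  .useValue({ canActivate: () => true })
  .compile();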

The compiled module has several useful methods, as described in the following table:

createNestApplication() Creates and returns a Nest application (INestApplication instance) based on the given module. Note that you must manually initialize the application using the init() method.
createNestMicroservice() Creates and returns a Nest microservice (INestMicroservice instance) based on the given module.
get() Retrieves a static instance of a controller or provider (including guards, filters, etc.) available in the application context. Inherited from the module reference class.
resolve() Retrieves a dynamically created scoped instance (request or transient) of a controller or provider (including guards, filters, etc.) available in the application context. Inherited from the module reference class.
select() Navigates through the module's dependency graph; can be used to retrieve a specific instance from the selected module (used along with strict mode (strict: true) in get() method).

info Hint Keep your e2e test files inside the e2e directory. The testing files should have a .e2e-spec or .e2e-test suffix.

Testing request-scoped instances

Request-scoped providers are created uniquely for each incoming request. The instance is garbage-collected after the request has completed processing. This poses a problem, because we can't access a dependency injection sub-tree generated specifically for a tested request.

We know (based on the sections above) that the resolve() method can be used to retrieve a dynamically instantiated class. Also, as described here, we know we can pass a unique context identifier to control the lifecycle of a DI container sub-tree. How do we leverage this in a testing context?

The strategy is to generate a context identifier beforehand and force Nest to use this particular ID to create a sub-tree for all incoming requests. In this way we'll be able to retrieve instances created for a tested request.

To accomplish this, use jest.spyOn() on the ContextIdFactory:

const contextId = ContextIdFactory.create();
jest
  .spyOn(ContextIdFactory, 'getByRequest')
  .mockImplementation(() => contextId);

Now we can use the contextId to access a single generated DI container sub-tree for any subsequent request.

catsService = await moduleRef.resolve(CatsService, contextId);

CLI Plugin

warning Warning This chapter applies only to the code first approach.

TypeScript's metadata reflection system has several limitations which make it impossible to, for instance, determine what properties a class consists of or recognize whether a given property is optional or required. However, some of these constraints can be addressed at compilation time. Nest provides a plugin that enhances the TypeScript compilation process to reduce the amount of boilerplate code required.

info Hint This plugin is opt-in. If you prefer, you can declare all decorators manually, or only specific decorators where you need them.

Overview

The GraphQL plugin will automatically:

  • annotate all input object, object type and args classes properties with @Field unless @HideField is used
  • set the nullable property depending on the question mark (e.g. name?: string will set nullable: true)
  • set the type property depending on the type (supports arrays as well)

Please note that your filenames must have one of the following suffixes in order to be analyzed by the plugin: ['.input.ts', '.args.ts', '.entity.ts', '.model.ts'] (e.g., author.entity.ts). If you are using a different suffix, you can adjust the plugin's behavior by specifying the typeFileNameSuffix option (see below).

With what we've learned so far, you have to duplicate a lot of code to let the package know how your type should be declared in GraphQL. For example, you could define a simple Author class as follows:

@@filename(authors/models/author.model)
@ObjectType()
export class Author {
  @Field(type => Int)
  id: number;

  @Field({ nullable: true })
  firstName?: string;

  @Field({ nullable: true })
  lastName?: string;

  @Field(type => [Post])
  posts: Post[];
}

While not a significant issue with medium-sized projects, it becomes verbose & hard to maintain once you have a large set of classes.

By enabling the GraphQL plugin, the above class definition can be declared simply:

@@filename(authors/models/author.model)
@ObjectType()
export class Author {
  @Field(type => Int)
  id: number;
  firstName?: string;
  lastName?: string;
  posts: Post[];
}

The plugin adds appropriate decorators on-the-fly based on the Abstract Syntax Tree. Thus, you won't have to struggle with @Field decorators scattered throughout the code.

info Hint The plugin will automatically add any missing @Field() decorators, but if you need to override their options, simply set them explicitly via @Field().

Using the CLI plugin

To enable the plugin, open nest-cli.json (if you use Nest CLI) and add the following plugins configuration:

{
  "collection": "@nestjs/schematics",
  "sourceRoot": "src",
  "compilerOptions": {
    "plugins": ["@nestjs/graphql/plugin"]
  }
}

You can use the options property to customize the behavior of the plugin.

"plugins": [
  {
    "name": "@nestjs/graphql/plugin",
    "options": {
      "typeFileNameSuffix": [".input.ts", ".args.ts"]
    }
  }
]

The options property has to fulfill the following interface:

export interface PluginOptions {
  typeFileNameSuffix?: string[];
}

Option Default Description
typeFileNameSuffix ['.input.ts', '.args.ts', '.entity.ts', '.model.ts'] GraphQL types files suffix

If you don't use the CLI but instead have a custom webpack configuration, you can use this plugin in combination with ts-loader:

getCustomTransformers: (program: any) => ({
  before: [require('@nestjs/graphql/plugin').before({}, program)]
}),

Complexity

warning Warning This chapter applies only to the code first approach.

Query complexity allows you to define how complex certain fields are, and to restrict queries with a maximum complexity. The idea is to define how complex each field is by using a simple number. A common default is to give each field a complexity of 1. In addition, the complexity calculation of a GraphQL query can be customized with so-called complexity estimators. A complexity estimator is a simple function that calculates the complexity for a field. You can add any number of complexity estimators to the rule, which are then executed one after another. The first estimator that returns a numeric complexity value determines the complexity for that field.

The @nestjs/graphql package integrates very well with tools like graphql-query-complexity, which provide a cost analysis-based solution. With this library, you can reject queries to your GraphQL server that are deemed too costly to execute.

Installation

To begin using it, we first install the required dependency.

$ npm install --save graphql-query-complexity

Getting started

Once the installation process is complete, we can define the ComplexityPlugin class:

import { GraphQLSchemaHost, Plugin } from '@nestjs/graphql';
import {
  ApolloServerPlugin,
  GraphQLRequestListener,
} from 'apollo-server-plugin-base';
import { GraphQLError } from 'graphql';
import {
  fieldExtensionsEstimator,
  getComplexity,
  simpleEstimator,
} from 'graphql-query-complexity';

@Plugin()
export class ComplexityPlugin implements ApolloServerPlugin {
  constructor(private gqlSchemaHost: GraphQLSchemaHost) {}

  requestDidStart(): GraphQLRequestListener {
    const { schema } = this.gqlSchemaHost;

    return {
      didResolveOperation({ request, document }) {
        const complexity = getComplexity({
          schema,
          operationName: request.operationName,
          query: document,
          variables: request.variables,
          estimators: [
            fieldExtensionsEstimator(),
            simpleEstimator({ defaultComplexity: 1 }),
          ],
        });
        if (complexity >= 20) {
          throw new GraphQLError(
            `Query is too complex: ${complexity}. Maximum allowed complexity: 20`,
          );
        }
        console.log('Query Complexity:', complexity);
      },
    };
  }
}

For demonstration purposes, we specified the maximum allowed complexity as 20. In the example above, we used two estimators: simpleEstimator and fieldExtensionsEstimator.

  • simpleEstimator: the simple estimator returns a fixed complexity for each field
  • fieldExtensionsEstimator: the field extensions estimator extracts the complexity value for each field of your schema

info Hint Remember to add this class to the providers array in any module.

Field-level complexity

With this plugin in place, we can now define the complexity for any field by specifying the complexity property in the options object passed into the @Field() decorator, as follows:

@Field({ complexity: 3 })
title: string;

Alternatively, you can define the estimator function:

@Field({ complexity: (options: ComplexityEstimatorArgs) => ... })
title: string;
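
As a hedged sketch of such an estimator, the count argument below is a hypothetical field argument, while childComplexity is supplied by graphql-query-complexity:

import { ComplexityEstimatorArgs } from 'graphql-query-complexity';

@Field(type => [Post], {
  // Cost grows with the (assumed) requested page size plus the children's cost
  complexity: (options: ComplexityEstimatorArgs) =>
    (options.args.count ?? 1) * options.childComplexity + 1,
})
posts: Post[];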

Directives

A directive can be attached to a field or fragment inclusion, and can affect execution of the query in any way the server desires (read more here). The GraphQL specification provides several default directives:

  • @include(if: Boolean) - only include this field in the result if the argument is true
  • @skip(if: Boolean) - skip this field if the argument is true
  • @deprecated(reason: String) - marks field as deprecated with message

A directive is an identifier preceded by a @ character, optionally followed by a list of named arguments, which can appear after almost any element in the GraphQL query and schema languages.

Custom directives

To create a custom schema directive, declare a class which extends the SchemaDirectiveVisitor class exported from the apollo-server package.

import { SchemaDirectiveVisitor } from 'apollo-server';
import { defaultFieldResolver, GraphQLField } from 'graphql';

export class UpperCaseDirective extends SchemaDirectiveVisitor {
  visitFieldDefinition(field: GraphQLField<any, any>) {
    const { resolve = defaultFieldResolver } = field;
    field.resolve = async function (...args) {
      const result = await resolve.apply(this, args);
      if (typeof result === 'string') {
        return result.toUpperCase();
      }
      return result;
    };
  }
}

info Hint Note that directives cannot be decorated with the @Injectable() decorator. Thus, they are not able to inject dependencies.

Now, register the UpperCaseDirective in the GraphQLModule.forRoot() method:

GraphQLModule.forRoot({
  // ...
  schemaDirectives: {
    upper: UpperCaseDirective,
  },
});

Once registered, the @upper directive can be used in our schema. However, the way you apply the directive will vary depending on the approach you use (code first or schema first).

Code first

In the code first approach, use the @Directive() decorator to apply the directive.

@Directive('@upper')
@Field()
title: string;

info Hint The @Directive() decorator is exported from the @nestjs/graphql package.

Directives can be applied on fields, field resolvers, input and object types, as well as queries, mutations, and subscriptions. Here's an example of the directive applied on the query handler level:

@Directive('@deprecated(reason: "This query will be removed in the next version")')
@Query(returns => Author, { name: 'author' })
async getAuthor(@Args({ name: 'id', type: () => Int }) id: number) {
  return this.authorsService.findOneById(id);
}

Directives applied through the @Directive() decorator will not be reflected in the generated schema definition file.

Schema first

In the schema first approach, apply directives directly in SDL.

directive @upper on FIELD_DEFINITION

type Post {
  id: Int!
  title: String! @upper
  votes: Int
}

Enums

Enumeration types are a special kind of scalar that is restricted to a particular set of allowed values (read more here). This allows you to:

  • validate that any arguments of this type are one of the allowed values
  • communicate through the type system that a field will always be one of a finite set of values

Code first

When using the code first approach, you define a GraphQL enum type by simply creating a TypeScript enum.

export enum AllowedColor {
  RED,
  GREEN,
  BLUE,
}

With this in place, register the AllowedColor enum using the registerEnumType function exported from the @nestjs/graphql package:

registerEnumType(AllowedColor, {
  name: 'AllowedColor',
});

Now you can reference AllowedColor in your types:

@Field(type => AllowedColor)
favoriteColor: AllowedColor;

This will result in generating the following part of the GraphQL schema in SDL:

enum AllowedColor {
  RED
  GREEN
  BLUE
}

Schema first

To define an enumerator in the schema first approach, simply create a GraphQL enum with SDL.

enum AllowedColor {
  RED
  GREEN
  BLUE
}

Then you can use the typings generation feature (as shown in the quick start chapter) to generate corresponding TypeScript definitions:

export enum AllowedColor {
  RED
  GREEN
  BLUE
}

Sometimes a backend forces a different value for an enum internally than in the public API. In this example, the API exposes RED, while the resolvers may use #f00 internally instead (read more here). To accomplish this, declare a resolver object for the AllowedColor enum:

export const allowedColorResolver: Record<keyof typeof AllowedColor, any> = {
  RED: '#f00',
};

info Hint All decorators are exported from the @nestjs/graphql package.

Then use this resolver object together with the resolvers property of the GraphQLModule#forRoot() method, as follows:

GraphQLModule.forRoot({
  resolvers: {
    AllowedColor: allowedColorResolver,
  },
});

Extensions

warning Warning This chapter applies only to the code first approach.

Extensions is an advanced, low-level feature that lets you define arbitrary data in the types configuration. Attaching custom metadata to certain fields allows you to create more sophisticated, generic solutions. For example, with extensions, you can define field-level roles required to access particular fields. Such roles can be reflected at runtime to determine whether the caller has sufficient permissions to retrieve a specific field.

Adding custom metadata

To attach custom metadata for a field, use the @Extensions() decorator exported from the @nestjs/graphql package.

@Field()
@Extensions({ role: Role.ADMIN })
password: string;

In the example above, we assigned the role metadata property the value of Role.ADMIN. Role is a simple TypeScript enum that groups all the user roles available in our system.

Note, in addition to setting metadata on fields, you can use the @Extensions() decorator at the class level and method level (e.g., on the query handler).
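
For example, here is a minimal sketch of the decorator applied at the query handler level (the users query, the User type, and the UsersService are hypothetical):

@Extensions({ role: Role.ADMIN })
@Query(returns => [User], { name: 'users' })
async getUsers() {
  // The role metadata is attached to the generated Query field itself
  return this.usersService.findAll();
}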

Using custom metadata

The logic that leverages the custom metadata can be as complex as needed. For example, you can create a simple interceptor that stores/logs events per method invocation, or create a sophisticated guard that analyzes requested fields, iterates through the GraphQLObjectType definition, and matches the roles required to retrieve specific fields with the caller permissions (field-level permissions system).

Let's define a FieldRolesGuard that implements a basic version of such a field-level permissions system.

import { CanActivate, ExecutionContext, Injectable } from '@nestjs/common';
import { GqlExecutionContext } from '@nestjs/graphql';
import { GraphQLNonNull, GraphQLObjectType, GraphQLResolveInfo } from 'graphql';
import * as graphqlFields from 'graphql-fields';

@Injectable()
export class FieldRolesGuard implements CanActivate {
  canActivate(context: ExecutionContext): boolean {
    const info = GqlExecutionContext.create(context).getInfo<
      GraphQLResolveInfo
    >();
    const returnType = (info.returnType instanceof GraphQLNonNull
      ? info.returnType.ofType
      : info.returnType) as GraphQLObjectType;

    const fields = returnType.getFields();
    const requestedFields = graphqlFields(info);

    Object.entries(fields)
      .filter(([key]) => key in requestedFields)
      .map(([_, field]) => field)
      .filter(field => field.extensions && field.extensions.role)
      .forEach(field => {
        // match user and field roles here
        console.log(field.extensions.role);
      });

    return true;
  }
}

warning Warning For illustration purposes, we assumed that every resolver returns either the GraphQLObjectType or GraphQLNonNull that wraps the object type. In a real-world application, you should cover other cases (scalars, etc.). Note that using this particular implementation can lead to unexpected errors (e.g., missing getFields() method).

In the example above, we've used the graphql-fields package that turns the GraphQLResolveInfo object into an object that consists of the requested fields. We used this specific library to make the presented example somewhat simpler.

With this guard in place, if the return type of any resolver contains a field annotated with the @Extensions({ role: Role.ADMIN }) decorator, this role (Role.ADMIN) will be logged in the console if requested in the GraphQL query.

Federation

Apollo Federation offers a means of splitting your monolithic GraphQL server into independent microservices. It consists of two components: a gateway and one or more federated microservices. Each microservice holds part of the schema and the gateway merges the schemas into a single schema that can be consumed by the client.

To quote the Apollo docs, Federation is designed with these core principles:

  • Building a graph should be declarative. With federation, you compose a graph declaratively from within your schema instead of writing imperative schema stitching code.
  • Code should be separated by concern, not by types. Often no single team controls every aspect of an important type like a User or Product, so the definition of these types should be distributed across teams and codebases, rather than centralized.
  • The graph should be simple for clients to consume. Together, federated services can form a complete, product-focused graph that accurately reflects how it’s being consumed on the client.
  • It’s just GraphQL, using only spec-compliant features of the language. Any language, not just JavaScript, can implement federation.

warning Warning Apollo Federation currently does not support subscriptions.

In the next example, we'll set up a demo application with a gateway and two federated endpoints: a Users service and a Posts service.

Federated example: Users

First install the optional dependency for federation:

$ npm install --save @apollo/federation

The User service has a simple schema. Note the @key directive: it tells the Apollo query planner that a particular instance of User can be fetched if you have its id. Also note that we extend the Query type.

type User @key(fields: "id") {
  id: ID!
  name: String!
}

extend type Query {
  getUser(id: ID!): User
}

Our resolver has one extra method: resolveReference. It's called by the Apollo Gateway whenever a related resource requires a User instance. We'll see an example of this in the Posts service later on. Please note the @ResolveReference decorator.

import { Args, Query, Resolver, ResolveReference } from '@nestjs/graphql';
import { UsersService } from './users.service';

@Resolver('User')
export class UsersResolvers {
  constructor(private usersService: UsersService) {}

  @Query()
  getUser(@Args('id') id: string) {
    return this.usersService.findById(id);
  }

  @ResolveReference()
  resolveReference(reference: { __typename: string; id: string }) {
    return this.usersService.findById(reference.id);
  }
}

Finally, we hook everything up in a module together with a GraphQLFederationModule. This module accepts the same options as the regular GraphQLModule.

import { Module } from '@nestjs/common';
import { GraphQLFederationModule } from '@nestjs/graphql';
import { UsersResolvers } from './users.resolvers';

@Module({
  imports: [
    GraphQLFederationModule.forRoot({
      typePaths: ['**/*.graphql'],
    }),
  ],
  providers: [UsersResolvers],
})
export class AppModule {}

Federated example: Posts

The Posts service references the User type in its schema by marking it with the extend keyword. It also adds one property to the User type. Note the @key directive used for matching instances of User, and the @external directive indicating that the id field is managed elsewhere.

type Post @key(fields: "id") {
  id: ID!
  title: String!
  body: String!
  user: User
}

extend type User @key(fields: "id") {
  id: ID! @external
  posts: [Post]
}

extend type Query {
  getPosts: [Post]
}

Our resolver has one method of interest here: getUser(). It returns a reference containing __typename and any additional properties your application needs to resolve the reference, in this case only an id. The __typename is used by the GraphQL gateway to pinpoint the microservice responsible for the User type and request the instance; the Users service discussed above then handles that request in its resolveReference() method.

import { Query, Resolver, Parent, ResolveProperty } from '@nestjs/graphql';
import { PostsService } from './posts.service';
import { Post } from './posts.interfaces';

@Resolver('Post')
export class PostsResolvers {
  constructor(private postsService: PostsService) {}

  @Query('getPosts')
  getPosts() {
    return this.postsService.findAll();
  }

  @ResolveProperty('user')
  getUser(@Parent() post: Post) {
    return { __typename: 'User', id: post.userId };
  }
}

The Posts service has virtually the same module, but is included below for the sake of completeness:

import { Module } from '@nestjs/common';
import { GraphQLFederationModule } from '@nestjs/graphql';
import { PostsResolvers } from './posts.resolvers';

@Module({
  imports: [
    GraphQLFederationModule.forRoot({
      typePaths: ['**/*.graphql'],
    }),
  ],
  providers: [PostsResolvers],
})
export class AppModule {}

Federated example: Gateway

First install the optional dependency for the gateway: npm install --save @apollo/gateway.

Our gateway only needs a list of endpoints and will auto-discover the schemas from there. The code for our gateway is therefore very short:

import { Module } from '@nestjs/common';
import { GraphQLGatewayModule } from '@nestjs/graphql';

@Module({
  imports: [
    GraphQLGatewayModule.forRoot({
      server: {
        // ... Apollo server options
        cors: true,
      },
      gateway: {
        serviceList: [
          { name: 'users', url: 'http://user-service/graphql' },
          { name: 'posts', url: 'http://post-service/graphql' },
        ],
      },
    }),
  ],
})
export class AppModule {}

info Hint Apollo recommends that you don't rely on the service discovery in a production environment but use their Graph Manager instead.

Sharing context

You can customize the requests between the gateway and federated services using a build service. This allows you to share context about the request. You can easily extend the default RemoteGraphQLDataSource and implement one of the hooks. Please refer to Apollo Docs on RemoteGraphQLDataSource for more information about the possibilities.

import { Module } from '@nestjs/common';
import { GATEWAY_BUILD_SERVICE, GraphQLGatewayModule } from '@nestjs/graphql';
import { RemoteGraphQLDataSource } from '@apollo/gateway';
import { decode } from 'jsonwebtoken';

class AuthenticatedDataSource extends RemoteGraphQLDataSource {
  async willSendRequest({ request, context }) {
    const { userId } = await decode(context.jwt);
    request.http.headers.set('x-user-id', userId);
  }
}

@Module({
  providers: [
    {
      provide: AuthenticatedDataSource,
      useValue: AuthenticatedDataSource,
    },
    {
      provide: GATEWAY_BUILD_SERVICE,
      useFactory: AuthenticatedDataSource => {
        return ({ name, url }) => new AuthenticatedDataSource({ url });
      },
      inject: [AuthenticatedDataSource],
    },
  ],
  exports: [GATEWAY_BUILD_SERVICE],
})
class BuildServiceModule {}

@Module({
  imports: [
    GraphQLGatewayModule.forRootAsync({
      useFactory: async () => ({
        gateway: {
          serviceList: [
            /* services */
          ],
        },
        server: {
          context: ({ req }) => ({
            jwt: req.headers.authorization,
          }),
        },
      }),
      imports: [BuildServiceModule],
      inject: [GATEWAY_BUILD_SERVICE],
    }),
  ],
})
export class AppModule {}

Async configuration

Both the Federation and Gateway modules support asynchronous initialization using the same forRootAsync that's documented in Quick start.
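
For example, here is a minimal sketch using a factory function, mirroring the forRootAsync() usage shown elsewhere in these chapters:

GraphQLFederationModule.forRootAsync({
  useFactory: async () => ({
    typePaths: ['**/*.graphql'],
  }),
}),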

Other features

In the GraphQL world, there is a lot of debate about handling issues like authentication, or side-effects of operations. Should we handle things inside the business logic? Should we use a higher-order function to enhance queries and mutations with authorization logic? Or should we use schema directives? There is no single one-size-fits-all answer to these questions.

Nest helps address these issues with its cross-platform features like guards and interceptors. The philosophy is to reduce redundancy and provide tooling that helps create well-structured, readable, and consistent applications.

Overview

You can use standard guards, interceptors, filters and pipes in the same fashion with GraphQL as with any RESTful application. Additionally, you can easily create your own decorators by leveraging the custom decorators feature. Let's take a look at a sample GraphQL query handler.

@Query('author')
@UseGuards(AuthGuard)
async getAuthor(@Args('id', ParseIntPipe) id: number) {
  return this.authorsService.findOneById(id);
}

As you can see, GraphQL works with both guards and pipes in the same way as HTTP REST handlers. Because of this, you can move your authentication logic to a guard; you can even reuse the same guard class across both a REST and GraphQL API interface. Similarly, interceptors work across both types of applications in the same way:

@Mutation()
@UseInterceptors(EventsInterceptor)
async upvotePost(@Args('postId') postId: number) {
  return this.postsService.upvoteById({ id: postId });
}

Execution context

Since GraphQL receives a different type of data in the incoming request, the execution context received by both guards and interceptors is somewhat different with GraphQL vs. REST. GraphQL resolvers have a distinct set of arguments: root, args, context, and info. Thus guards and interceptors must transform the generic ExecutionContext to a GqlExecutionContext. This is straightforward:

import { CanActivate, ExecutionContext, Injectable } from '@nestjs/common';
import { GqlExecutionContext } from '@nestjs/graphql';

@Injectable()
export class AuthGuard implements CanActivate {
  canActivate(context: ExecutionContext): boolean {
    const ctx = GqlExecutionContext.create(context);
    return true;
  }
}

The GqlExecutionContext object returned by GqlExecutionContext.create() exposes a get method for each GraphQL resolver argument (e.g., getArgs(), getContext(), etc.). Once transformed, we can easily pick out any GraphQL argument for the current request.
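
For instance, a hypothetical guard might read both the resolver arguments and the shared context. Here is a minimal sketch; the PostOwnerGuard name, the postId argument, and the user property on the context are assumptions:

import { CanActivate, ExecutionContext, Injectable } from '@nestjs/common';
import { GqlExecutionContext } from '@nestjs/graphql';

@Injectable()
export class PostOwnerGuard implements CanActivate {
  canActivate(context: ExecutionContext): boolean {
    const ctx = GqlExecutionContext.create(context);
    const args = ctx.getArgs();        // resolver arguments, e.g. { postId: 42 }
    const { user } = ctx.getContext(); // shared GraphQL context (assumes a user was attached to it)
    return Boolean(user) && typeof args.postId === 'number';
  }
}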

Exception filters

Nest standard exception filters are compatible with GraphQL applications as well. As with ExecutionContext, GraphQL apps should transform the ArgumentsHost object to a GqlArgumentsHost object.

@Catch(HttpException)
export class HttpExceptionFilter implements GqlExceptionFilter {
  catch(exception: HttpException, host: ArgumentsHost) {
    const gqlHost = GqlArgumentsHost.create(host);
    return exception;
  }
}

info Hint Both GqlExceptionFilter and GqlArgumentsHost are imported from the @nestjs/graphql package.

Note that unlike the REST case, you don't use the native response object to generate a response.

Custom decorators

As mentioned, the custom decorators feature works as expected with GraphQL resolvers.

export const User = createParamDecorator(
  (data: unknown, ctx: ExecutionContext) =>
    GqlExecutionContext.create(ctx).getContext().user,
);

Use the @User() custom decorator as follows:

@Mutation()
async upvotePost(
  @User() user: UserEntity,
  @Args('postId') postId: number,
) {}

info Hint In the above example, we have assumed that the user object is assigned to the context of your GraphQL application.

Execute enhancers at the field resolver level

In the GraphQL context, Nest does not run enhancers (the generic name for interceptors, guards and filters) at the field resolver level (see this issue); they only run for the top-level @Query()/@Mutation() method. You can tell Nest to execute interceptors, guards or filters for methods annotated with @ResolveField() by setting the fieldResolverEnhancers option in GqlModuleOptions. Pass it a list of 'interceptors', 'guards', and/or 'filters' as appropriate:

GraphQLModule.forRoot({
  fieldResolverEnhancers: ['interceptors']
}),

warning Warning Enabling enhancers for field resolvers can cause performance issues when you are returning lots of records and your field resolver is executed thousands of times. For this reason, when you enable fieldResolverEnhancers, we advise you to skip execution of enhancers that are not strictly necessary for your field resolvers. You can do this using the following helper function:

import { ExecutionContext } from '@nestjs/common';
import { GqlContextType, GqlExecutionContext } from '@nestjs/graphql';

export function isResolvingGraphQLField(context: ExecutionContext): boolean {
  if (context.getType<GqlContextType>() === 'graphql') {
    const gqlContext = GqlExecutionContext.create(context);
    const info = gqlContext.getInfo();
    const parentType = info.parentType.name;
    return parentType !== 'Query' && parentType !== 'Mutation';
  }
  return false;
}
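
For example, a hypothetical interceptor could bail out early whenever a field resolver is being executed. Here is a minimal sketch; the EventsInterceptor name and its logging logic are placeholders:

import { CallHandler, ExecutionContext, Injectable, NestInterceptor } from '@nestjs/common';
import { Observable } from 'rxjs';

@Injectable()
export class EventsInterceptor implements NestInterceptor {
  intercept(context: ExecutionContext, next: CallHandler): Observable<any> {
    // Field resolvers may run thousands of times per request; skip them entirely
    if (isResolvingGraphQLField(context)) {
      return next.handle();
    }
    // ... logging / event-tracking logic for top-level queries and mutations
    return next.handle();
  }
}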

Interfaces

Like many type systems, GraphQL supports interfaces. An Interface is an abstract type that includes a certain set of fields that a type must include to implement the interface (read more here).

Code first

When using the code first approach, you define a GraphQL interface by creating an abstract class annotated with the @InterfaceType() decorator exported from the @nestjs/graphql package.

import { Field, ID, InterfaceType } from '@nestjs/graphql';

@InterfaceType()
export abstract class Character {
  @Field(type => ID)
  id: string;

  @Field()
  name: string;
}

warning Warning TypeScript interfaces cannot be used to define GraphQL interfaces.

This will result in generating the following part of the GraphQL schema in SDL:

interface Character {
  id: ID!
  name: String!
}

Now, to implement the Character interface, use the implements key:

@ObjectType({
  implements: [Character],
})
export class Human implements Character {
  id: string;
  name: string;
}

info Hint The @ObjectType() decorator is exported from the @nestjs/graphql package.

The default resolveType() function generated by the library extracts the type based on the value returned from the resolver method. This means that you must return class instances (you cannot return literal JavaScript objects).
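
For illustration, here is a minimal sketch of a query handler that returns class instances (assuming the Human class shown above and in-memory data):

@Query(returns => [Character])
async characters(): Promise<Character[]> {
  const human = new Human();
  human.id = '1';
  human.name = 'Luke Skywalker';
  // Returning a class instance (rather than an object literal) lets the default
  // resolveType() function detect that this Character is a Human
  return [human];
}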

To provide a customized resolveType() function, pass the resolveType property to the options object passed into the @InterfaceType() decorator, as follows:

@InterfaceType({
  resolveType(book) {
    if (book.colors) {
      return ColoringBook;
    }
    return TextBook;
  },
})
export abstract class Book {
  @Field(type => ID)
  id: string;

  @Field()
  title: string;
}

Schema first

To define an interface in the schema first approach, simply create a GraphQL interface with SDL.

interface Character {
  id: ID!
  name: String!
}

Then, you can use the typings generation feature (as shown in the quick start chapter) to generate corresponding TypeScript definitions:

export interface Character {
  id: string;
  name: string;
}

Interfaces require an extra __resolveType field in the resolver map to determine which type the interface should resolve to. Let's create a CharactersResolver class and define the __resolveType method:

@Resolver('Character')
export class CharactersResolver {
  @ResolveField()
  __resolveType(value) {
    if ('age' in value) {
      return Person;
    }
    return null;
  }
}

info Hint All decorators are exported from the @nestjs/graphql package.

Mapped types

warning Warning This chapter applies only to the code first approach.

As you build out features like CRUD (Create/Read/Update/Delete) it's often useful to construct variants on a base entity type. Nest provides several utility functions that perform type transformations to make this task more convenient.

Partial

When building input validation types (also called DTOs), it's often useful to build create and update variations on the same type. For example, the create variant may require all fields, while the update variant may make all fields optional.

Nest provides the PartialType() utility function to make this task easier and minimize boilerplate.

The PartialType() function returns a type (class) with all the properties of the input type set to optional. For example, suppose we have a create type as follows:

@InputType()
class CreateUserInput {
  @Field()
  email: string;

  @Field()
  password: string;

  @Field()
  firstName: string;
}

By default, all of these fields are required. To create a type with the same fields, but with each one optional, use PartialType() passing the class reference (CreateUserInput) as an argument:

@InputType()
export class UpdateUserInput extends PartialType(CreateUserInput) {}

info Hint The PartialType() function is imported from the @nestjs/graphql package.

The PartialType() function takes an optional second argument that is a reference to a decorator factory. This argument can be used to change the decorator function applied to the resulting (child) class. If not specified, the child class effectively uses the same decorator as the parent class (the class referenced in the first argument). In the example above, we are extending CreateUserInput, which is annotated with the @InputType() decorator. Since we want UpdateUserInput to also be treated as if it were decorated with @InputType(), we didn't need to pass InputType as the second argument. If the parent and child types are different (e.g., the parent is decorated with @ObjectType), we would pass InputType as the second argument. For example:

@InputType()
export class UpdateUserInput extends PartialType(User, InputType) {}

Pick

The PickType() function constructs a new type (class) by picking a set of properties from an input type. For example, suppose we start with a type like:

@InputType()
class CreateUserInput {
  @Field()
  email: string;

  @Field()
  password: string;

  @Field()
  firstName: string;
}

We can pick a set of properties from this class using the PickType() utility function:

@InputType()
export class UpdateEmailInput extends PickType(CreateUserInput, [
  'email',
] as const) {}

info Hint The PickType() function is imported from the @nestjs/graphql package.

Omit

The OmitType() function constructs a type by picking all properties from an input type and then removing a particular set of keys. For example, suppose we start with a type like:

@InputType()
class CreateUserInput {
  @Field()
  email: string;

  @Field()
  password: string;

  @Field()
  firstName: string;
}

We can generate a derived type that has every property except email as shown below. In this construct, the second argument to OmitType is an array of property names.

@InputType()
export class UpdateUserInput extends OmitType(CreateUserInput, [
  'email',
] as const) {}

info Hint The OmitType() function is imported from the @nestjs/graphql package.

Intersection

The IntersectionType() function combines two types into one new type (class). For example, suppose we start with two types like:

@InputType()
class CreateUserInput {
  @Field()
  email: string;

  @Field()
  password: string;
}

@ObjectType()
export class AdditionalUserInfo {
  @Field()
  firstName: string;

  @Field()
  lastName: string;
}

We can generate a new type that combines all properties in both types.

@InputType()
export class UpdateUserInput extends IntersectionType(
  CreateUserInput,
  AdditionalUserInfo,
) {}

info Hint The IntersectionType() function is imported from the @nestjs/graphql package.

Composition

The type mapping utility functions are composable. For example, the following will produce a type (class) that has all of the properties of the CreateUserInput type except for email, and those properties will be set to optional:

@InputType()
export class UpdateUserInput extends PartialType(
  OmitType(CreateUserInput, ['email'] as const),
) {}

Mutations

Most discussions of GraphQL focus on data fetching, but any complete data platform needs a way to modify server-side data as well. In REST, any request could end up causing side-effects on the server, but best practice suggests we should not modify data in GET requests. GraphQL is similar - technically any query could be implemented to cause a data write. However, like REST, it's recommended to observe the convention that any operations that cause writes should be sent explicitly via a mutation (read more here).

The official Apollo documentation uses an upvotePost() mutation example. This mutation implements a method to increase a post's votes property value. To create an equivalent mutation in Nest, we'll make use of the @Mutation() decorator.

Code first

Let's add another method to the AuthorResolver used in the previous section (see resolvers).

@Mutation(returns => Post)
async upvotePost(@Args({ name: 'postId', type: () => Int }) postId: number) {
  return this.postsService.upvoteById({ id: postId });
}

info Hint All decorators (e.g., @Resolver, @ResolveField, @Args, etc.) are exported from the @nestjs/graphql package.

This will result in generating the following part of the GraphQL schema in SDL:

type Mutation {
  upvotePost(postId: Int!): Post
}

The upvotePost() method takes postId (Int) as an argument and returns an updated Post entity. For the reasons explained in the resolvers section, we have to explicitly set the expected type.

If the mutation needs to take an object as an argument, we can create an input type. The input type is a special kind of object type that can be passed in as an argument (read more here). To declare an input type, use the @InputType() decorator.

import { InputType, Field } from '@nestjs/graphql';

@InputType()
export class UpvotePostInput {
  @Field()
  postId: number;
}

info Hint The @InputType() decorator takes an options object as an argument, so you can, for example, specify the input type's description. Note that, due to TypeScript's metadata reflection system limitations, you must either use the @Field decorator to manually indicate a type, or use a CLI plugin.

We can then use this type in the resolver class:

@Mutation(returns => Post)
async upvotePost(
  @Args('upvotePostData') upvotePostData: UpvotePostInput,
) {}

Schema first

Let's extend our AuthorResolver used in the previous section (see resolvers).

@Mutation()
async upvotePost(@Args('postId') postId: number) {
  return this.postsService.upvoteById({ id: postId });
}

Note that we assumed above that the business logic has been moved to the PostsService (querying the post and incrementing its votes property). The logic inside the PostsService class can be as simple or sophisticated as needed. The main point of this example is to show how resolvers can interact with other providers.

The last step is to add our mutation to the existing types definition.

type Author {
  id: Int!
  firstName: String
  lastName: String
  posts: [Post]
}

type Post {
  id: Int!
  title: String
  votes: Int
}

type Query {
  author(id: Int!): Author
}

type Mutation {
  upvotePost(postId: Int!): Post
}

The upvotePost(postId: Int!): Post mutation is now available to be called as part of our application's GraphQL API.

Plugins

Plugins enable you to extend Apollo Server's core functionality by performing custom operations in response to certain events. Currently, these events correspond to individual phases of the GraphQL request lifecycle, and to the startup of Apollo Server itself (read more here). For example, a basic logging plugin might log the GraphQL query string associated with each request that's sent to Apollo Server.

Custom plugins

To create a plugin, declare a class annotated with the @Plugin decorator exported from the @nestjs/graphql package. Also, for better code autocompletion, implement the ApolloServerPlugin interface from the apollo-server-plugin-base package.

import { Plugin } from '@nestjs/graphql';
import {
  ApolloServerPlugin,
  GraphQLRequestListener,
} from 'apollo-server-plugin-base';

@Plugin()
export class LoggingPlugin implements ApolloServerPlugin {
  requestDidStart(): GraphQLRequestListener {
    console.log('Request started');
    return {
      willSendResponse() {
        console.log('Will send response');
      },
    };
  }
}

With this in place, we can register the LoggingPlugin as a provider.

@Module({
  providers: [LoggingPlugin],
})
export class CommonModule {}

Nest will automatically instantiate the plugin and apply it to Apollo Server.

Using external plugins

There are several plugins provided out-of-the-box. To use an existing plugin, simply import it and add it to the plugins array:

GraphQLModule.forRoot({
  // ...
  plugins: [ApolloServerOperationRegistry({ /* options */})]
}),

info Hint The ApolloServerOperationRegistry plugin is exported from the apollo-server-plugin-operation-registry package.

Harnessing the power of TypeScript & GraphQL

GraphQL is a powerful query language for APIs and a runtime for fulfilling those queries with your existing data. It's an elegant approach that solves many problems typically found with REST APIs. For background, we suggest reading this comparison between GraphQL and REST. GraphQL combined with TypeScript helps you develop better type safety with your GraphQL queries, giving you end-to-end typing.

In this chapter, we assume a basic understanding of GraphQL, and focus on how to work with the built-in @nestjs/graphql module. The GraphQLModule is a wrapper around the Apollo server. We use this proven GraphQL package to provide a way to use GraphQL with Nest.

Installation

Start by installing the required packages:

$ npm i @nestjs/graphql graphql-tools graphql apollo-server-express

info Hint If using Fastify, instead of installing apollo-server-express, you should install apollo-server-fastify.

Overview

Nest offers two ways of building GraphQL applications, the code first and the schema first methods. You should choose the one that works best for you. Most of the chapters in this GraphQL section are divided into two main parts: one you should follow if you adopt code first, and the other to be used if you adopt schema first.

In the code first approach, you use decorators and TypeScript classes to generate the corresponding GraphQL schema. This approach is useful if you prefer to work exclusively with TypeScript and avoid context switching between language syntaxes.

In the schema first approach, the source of truth is GraphQL SDL (Schema Definition Language) files. SDL is a language-agnostic way to share schema files between different platforms. Nest automatically generates your TypeScript definitions (using either classes or interfaces) based on the GraphQL schemas to reduce the need to write redundant boilerplate code.

Getting started with GraphQL & TypeScript

Once the packages are installed, we can import the GraphQLModule and configure it with the forRoot() static method.

@@filename()
import { Module } from '@nestjs/common';
import { GraphQLModule } from '@nestjs/graphql';

@Module({
  imports: [
    GraphQLModule.forRoot({}),
  ],
})
export class AppModule {}

The forRoot() method takes an options object as an argument. These options are passed through to the underlying Apollo instance (read more about available settings here). For example, if you want to disable the playground and turn off debug mode, pass the following options:

@@filename()
import { Module } from '@nestjs/common';
import { GraphQLModule } from '@nestjs/graphql';

@Module({
  imports: [
    GraphQLModule.forRoot({
      debug: false,
      playground: false,
    }),
  ],
})
export class AppModule {}

As mentioned, these options will be forwarded to the ApolloServer constructor.

GraphQL playground

The playground is a graphical, interactive, in-browser GraphQL IDE, available by default on the same URL as the GraphQL server itself. To access the playground, you need a basic GraphQL server configured and running. To see it now, you can install and build the working example here. Alternatively, if you're following along with these code samples, once you've completed the steps in the Resolvers chapter, you can access the playground.

With that in place, and with your application running in the background, you can then open your web browser and navigate to http://localhost:3000/graphql (host and port may vary depending on your configuration). You will then see the GraphQL playground, as shown below.

Multiple endpoints

Another useful feature of the @nestjs/graphql module is the ability to serve multiple endpoints at once. This lets you decide which modules should be included in which endpoint. By default, GraphQL searches for resolvers throughout the whole app. To limit this scan to only a subset of modules, use the include property.

GraphQLModule.forRoot({
  include: [CatsModule],
}),
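
For example, a minimal sketch of two endpoints served by the same application might look as follows; the path option and the hypothetical CatsModule and DogsModule are assumptions:

GraphQLModule.forRoot({
  include: [CatsModule],
  path: '/graphql/cats',
}),
GraphQLModule.forRoot({
  include: [DogsModule],
  path: '/graphql/dogs',
}),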

Code first

In the code first approach, you use decorators and TypeScript classes to generate the corresponding GraphQL schema.

To use the code first approach, start by adding the autoSchemaFile property to the options object:

GraphQLModule.forRoot({
  autoSchemaFile: join(process.cwd(), 'src/schema.gql'),
}),

The autoSchemaFile property value is the path where your automatically generated schema will be created. Alternatively, the schema can be generated on-the-fly in memory. To enable this, set the autoSchemaFile property to true:

GraphQLModule.forRoot({
  autoSchemaFile: true,
}),

By default, the types in the generated schema will be in the order they are defined in the included modules. To sort the schema lexicographically, set the sortSchema property to true:

GraphQLModule.forRoot({
  autoSchemaFile: join(process.cwd(), 'src/schema.gql'),
  sortSchema: true,
}),

A fully working code first sample is available here.

Schema first

To use the schema first approach, start by adding a typePaths property to the options object. The typePaths property indicates where the GraphQLModule should look for GraphQL SDL schema definition files you'll be writing. These files will be combined in memory; this allows you to split your schemas into several files and locate them near their resolvers.

GraphQLModule.forRoot({
  typePaths: ['./**/*.graphql'],
}),

You will typically also need to have TypeScript definitions (classes and interfaces) that correspond to the GraphQL SDL types. Creating the corresponding TypeScript definitions by hand is redundant and tedious. It leaves us without a single source of truth -- each change made within SDL forces us to adjust TypeScript definitions as well. To address this, the @nestjs/graphql package can automatically generate TypeScript definitions from the abstract syntax tree (AST). To enable this feature, add the definitions options property when configuring the GraphQLModule.

GraphQLModule.forRoot({
  typePaths: ['./**/*.graphql'],
  definitions: {
    path: join(process.cwd(), 'src/graphql.ts'),
  },
}),

The path property of the definitions object indicates where to save generated TypeScript output. By default, all generated TypeScript types are created as interfaces. To generate classes instead, specify the outputAs property with a value of 'class'.

GraphQLModule.forRoot({
  typePaths: ['./**/*.graphql'],
  definitions: {
    path: join(process.cwd(), 'src/graphql.ts'),
    outputAs: 'class',
  },
}),

The above approach dynamically generates TypeScript definitions each time the application starts. Alternatively, it may be preferable to build a simple script to generate these on demand. For example, assume we create the following script as generate-typings.ts:

import { GraphQLDefinitionsFactory } from '@nestjs/graphql';
import { join } from 'path';

const definitionsFactory = new GraphQLDefinitionsFactory();
definitionsFactory.generate({
  typePaths: ['./src/**/*.graphql'],
  path: join(process.cwd(), 'src/graphql.ts'),
  outputAs: 'class',
});

Now you can run this script on demand:

$ ts-node generate-typings

info Hint You can compile the script beforehand (e.g., with tsc) and use node to execute it.

To enable watch mode for the script (to automatically generate typings whenever any .graphql file changes), pass the watch option to the generate() method.

definitionsFactory.generate({
  typePaths: ['./src/**/*.graphql'],
  path: join(process.cwd(), 'src/graphql.ts'),
  outputAs: 'class',
  watch: true,
});

To automatically generate the additional __typename field for every object type, enable the emitTypenameField option.

definitionsFactory.generate({
  // ...,
  emitTypenameField: true,
});

To generate resolvers (queries, mutations, subscriptions) as plain fields without arguments, enable the skipResolverArgs option.

definitionsFactory.generate({
  // ...,
  skipResolverArgs: true,
});

A fully working schema first sample is available here.

Accessing generated schema

In some circumstances (for example end-to-end tests), you may want to get a reference to the generated schema object. In end-to-end tests, you can then run queries using the graphql object without using any HTTP listeners.

You can access the generated schema (in either the code first or schema first approach), using the GraphQLSchemaHost class:

const { schema } = app.get(GraphQLSchemaHost);

info Hint You must call the GraphQLSchemaHost#schema getter after the application has been initialized (after the onModuleInit hook has been triggered by either the app.listen() or app.init() method).
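
For example, here is a minimal sketch of an end-to-end test that executes a query directly against the generated schema, assuming the author query from the resolvers chapter and a Jest-style assertion:

import { graphql } from 'graphql';
import { GraphQLSchemaHost } from '@nestjs/graphql';

// inside an e2e test, after app.init() has completed
const { schema } = app.get(GraphQLSchemaHost);
const result = await graphql(schema, '{ author(id: 1) { id firstName } }');
expect(result.errors).toBeUndefined();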

Async configuration

When you need to pass module options asynchronously instead of statically, use the forRootAsync() method. As with most dynamic modules, Nest provides several techniques to deal with async configuration.

One technique is to use a factory function:

GraphQLModule.forRootAsync({
  useFactory: () => ({
    typePaths: ['./**/*.graphql'],
  }),
}),

Like other factory providers, our factory function can be async and can inject dependencies through inject.

GraphQLModule.forRootAsync({
  imports: [ConfigModule],
  useFactory: async (configService: ConfigService) => ({
    typePaths: configService.getString('GRAPHQL_TYPE_PATHS'),
  }),
  inject: [ConfigService],
}),

Alternatively, you can configure the GraphQLModule using a class instead of a factory, as shown below:

GraphQLModule.forRootAsync({
  useClass: GqlConfigService,
}),

The construction above instantiates GqlConfigService inside GraphQLModule, using it to create an options object. Note that in this example, the GqlConfigService has to implement the GqlOptionsFactory interface, as shown below. The GraphQLModule will call the createGqlOptions() method on the instantiated object of the supplied class.

@Injectable()
class GqlConfigService implements GqlOptionsFactory {
  createGqlOptions(): GqlModuleOptions {
    return {
      typePaths: ['./**/*.graphql'],
    };
  }
}

If you want to reuse an existing options provider instead of creating a private copy inside the GraphQLModule, use the useExisting syntax.

GraphQLModule.forRootAsync({
  imports: [ConfigModule],
  useExisting: ConfigService,
}),

Resolvers

Resolvers provide the instructions for turning a GraphQL operation (a query, mutation, or subscription) into data. They return the same shape of data we specify in our schema -- either synchronously or as a promise that resolves to a result of that shape. Typically, you create a resolver map manually. The @nestjs/graphql package, on the other hand, generates a resolver map automatically using the metadata provided by decorators you use to annotate classes. To demonstrate the process of using the package features to create a GraphQL API, we'll create a simple authors API.

Code first

In the code first approach, we don't follow the typical process of creating our GraphQL schema by writing GraphQL SDL by hand. Instead, we use TypeScript decorators to generate the SDL from TypeScript class definitions. The @nestjs/graphql package reads the metadata defined through the decorators and automatically generates the schema for you.

Object types

Most of the definitions in a GraphQL schema are object types. Each object type you define should represent a domain object that an application client might need to interact with. For example, our sample API needs to be able to fetch a list of authors and their posts, so we should define the Author type and Post type to support this functionality.

If we were using the schema first approach, we'd define such a schema with SDL like this:

type Author {
  id: Int!
  firstName: String
  lastName: String
  posts: [Post]
}

In this case, using the code first approach, we define schemas using TypeScript classes and using TypeScript decorators to annotate the fields of those classes. The equivalent of the above SDL in the code first approach is:

@@filename(authors/models/author.model)
import { Field, Int, ObjectType } from '@nestjs/graphql';
import { Post } from './post';

@ObjectType()
export class Author {
  @Field(type => Int)
  id: number;

  @Field({ nullable: true })
  firstName?: string;

  @Field({ nullable: true })
  lastName?: string;

  @Field(type => [Post])
  posts: Post[];
}

info Hint TypeScript's metadata reflection system has several limitations which make it impossible, for instance, to determine what properties a class consists of or recognize whether a given property is optional or required. Because of these limitations, we must either explicitly use the @Field() decorator in our schema definition classes to provide metadata about each field's GraphQL type and optionality, or use a CLI plugin to generate these for us.

The Author object type, like any class, is made of a collection of fields, with each field declaring a type. A field's type corresponds to a GraphQL type. A field's GraphQL type can be either another object type or a scalar type. A GraphQL scalar type is a primitive (like ID, String, Boolean, or Int) that resolves to a single value.

info Hint In addition to GraphQL's built-in scalar types, you can define custom scalar types (read more).

The above Author object type definition will cause Nest to generate the SDL we showed above:

type Author {
  id: Int!
  firstName: String
  lastName: String
  posts: [Post]
}

The @Field() decorator accepts an optional type function (e.g., type => Int), and optionally an options object.

The type function is required when there's the potential for ambiguity between the TypeScript type system and the GraphQL type system. Specifically: it is not required for string and boolean types; it is required for arrays, numbers (which must be mapped to either a GraphQL Int or Float) and object types. The type function should simply return the desired GraphQL type (as shown in various examples in these chapters).

The options object can have any of the following key/value pairs:

  • nullable: for specifying whether a field is nullable (in SDL, each field is non-nullable by default); boolean
  • description: for setting a field description; string
  • deprecationReason: for marking a field as deprecated; string

For example:

@Field({ description: `Book title`, deprecationReason: 'Not useful in v2 schema' })
title: string;

info Hint You can also add a description to, or deprecate, the whole object type: @ObjectType({ description: 'Author model' }).

When the field is an array, we must manually indicate the array type in the @Field() decorator's type function, as shown below:

@Field(type => [Post])
posts: Post[];

info Hint Using array bracket notation ([ ]), we can indicate the depth of the array. For example, using [[Int]] would represent an integer matrix.
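
For example, a hypothetical matrix field could be declared like this:

@Field(type => [[Int]])
grid: number[][];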

To declare that an array's items (not the array itself) are nullable, set the nullable property to 'items' as shown below:

@Field(type => [Post], { nullable: 'items' })
posts: Post[];

info Hint If both the array and its items are nullable, set nullable to 'itemsAndList' instead.

Now that the Author object type is created, let's define the Post object type.

@@filename(posts/models/post.model)
import { Field, Int, ObjectType } from '@nestjs/graphql';

@ObjectType()
export class Post {
  @Field(type => Int)
  id: number;

  @Field()
  title: string;

  @Field(type => Int, { nullable: true })
  votes?: number;
}

The Post object type will result in generating the following part of the GraphQL schema in SDL:

@@filename(schema.gql)
type Post {
  id: Int!
  title: String!
  votes: Int
}

Code first resolver

At this point, we've defined the objects (type definitions) that can exist in our data graph, but clients don't yet have a way to interact with those objects. To address that, we need to create a resolver class. In the code first method, a resolver class both defines resolver functions and generates the Query type. This will be clear as we work through the example below:

@@filename(authors/authors.resolver)
@Resolver(of => Author)
export class AuthorsResolver {
  constructor(
    private authorsService: AuthorsService,
    private postsService: PostsService,
  ) {}

  @Query(returns => Author)
  async author(@Args('id', { type: () => Int }) id: number) {
    return this.authorsService.findOneById(id);
  }

  @ResolveField()
  async posts(@Parent() author: Author) {
    const { id } = author;
    return this.postsService.findAll({ authorId: id });
  }
}

info Hint All decorators (e.g., @Resolver, @ResolveField, @Args, etc.) are exported from the @nestjs/graphql package.

You can define multiple resolver classes. Nest will combine these at run time. See the module section below for more on code organization.

warning Note The logic inside the AuthorsService and PostsService classes can be as simple or sophisticated as needed. The main point of this example is to show how to construct resolvers and how they can interact with other providers.

In the example above, we created the AuthorsResolver which defines one query resolver function and one field resolver function. To create a resolver, we create a class with resolver functions as methods, and annotate the class with the @Resolver() decorator.

In this example, we defined a query handler to get the author object based on the id sent in the request. To specify that the method is a query handler, use the @Query() decorator.

The argument passed to the @Resolver() decorator is optional, but comes into play when our graph becomes non-trivial. It's used to supply a parent object used by field resolver functions as they traverse down through an object graph.

In our example, since the class includes a field resolver function (for the posts property of the Author object type), we must supply the @Resolver() decorator with a value to indicate which class is the parent type (i.e., the corresponding ObjectType class name) for all field resolvers defined within this class. As should be clear from the example, when writing a field resolver function, it's necessary to access the parent object (the object the field being resolved is a member of). In this example, we populate an author's posts array with a field resolver that calls a service which takes the author's id as an argument. Hence the need to identify the parent object in the @Resolver() decorator. Note the corresponding use of the @Parent() method parameter decorator to then extract a reference to that parent object in the field resolver.

We can define multiple @Query() resolver functions (both within this class, and in any other resolver class), and they will be aggregated into a single Query type definition in the generated SDL along with the appropriate entries in the resolver map. This allows you to define queries close to the models and services that they use, and to keep them well organized in modules.

Query type names

In the above examples, the @Query() decorator generates a GraphQL schema query type name based on the method name. For example, consider the following construction from the example above:

@Query(returns => Author)
async author(@Args('id', { type: () => Int }) id: number) {
  return this.authorsService.findOneById(id);
}

This generates the following entry for the author query in our schema (the query type uses the same name as the method name):

type Query {
  author(id: Int!): Author
}

info Hint Learn more about GraphQL queries here.

Conventionally, we prefer to decouple these names; for example, we might use a name like getAuthor() for the query handler method, but still use author for the query type name. The same applies to our field resolvers. We can easily do this by passing the mapping names as arguments of the @Query() and @ResolveField() decorators, as shown below:

@@filename(authors/authors.resolver)
@Resolver(of => Author)
export class AuthorsResolver {
  constructor(
    private authorsService: AuthorsService,
    private postsService: PostsService,
  ) {}

  @Query(returns => Author, { name: 'author' })
  async getAuthor(@Args('id', { type: () => Int }) id: number) {
    return this.authorsService.findOneById(id);
  }

  @ResolveField('posts', returns => [Post])
  async getPosts(@Parent() author: Author) {
    const { id } = author;
    return this.postsService.findAll({ authorId: id });
  }
}

The getAuthor handler method above will result in generating the following part of the GraphQL schema in SDL:

type Query {
  author(id: Int!): Author
}

Query decorator options

The @Query() decorator's options object (where we pass { name: 'author' } above) accepts a number of key/value pairs:

  • name: name of the query; a string
  • description: a description that will be used to generate GraphQL schema documentation (e.g., in GraphQL playground); a string
  • deprecationReason: sets query metadata to show the query as deprecated (e.g., in GraphQL playground); a string
  • nullable: whether the query can return a null data response; boolean or 'items' or 'itemsAndList' (see above for details of 'items' and 'itemsAndList')

Args decorator options

Use the @Args() decorator to extract arguments from a request for use in the method handler. This works in a very similar fashion to REST route parameter argument extraction.

Usually your @Args() decorator will be simple, and not require an object argument as seen with the getAuthor() method above. For example, if the type of an identifier is string, the following construction is sufficient, and simply plucks the named field from the inbound GraphQL request for use as a method argument.

@Args('id') id: string

In the getAuthor() case, the number type is used, which presents a challenge. The number TypeScript type doesn't give us enough information about the expected GraphQL representation (e.g., Int vs. Float). Thus we have to explicitly pass the type reference. We do that by passing a second argument to the Args() decorator, containing argument options, as shown below:

@Query(returns => Author, { name: 'author' })
async getAuthor(@Args('id', { type: () => Int }) id: number) {
  return this.authorsService.findOneById(id);
}

The options object allows us to specify the following optional key value pairs:

  • type: a function returning the GraphQL type
  • defaultValue: a default value; any
  • description: description metadata; string
  • deprecationReason: to deprecate a field and provide meta data describing why; string
  • nullable: whether the field is nullable

Query handler methods can take multiple arguments. Let's imagine that we want to fetch an author based on its firstName and lastName. In this case, we can call @Args twice:

getAuthor(
  @Args('firstName', { nullable: true }) firstName?: string,
  @Args('lastName', { defaultValue: '' }) lastName?: string,
) {}

Dedicated arguments class

With inline @Args() calls, code like the example above becomes bloated. Instead, you can create a dedicated GetAuthorArgs arguments class and access it in the handler method as follows:

@Args() args: GetAuthorArgs

Create the GetAuthorArgs class using @ArgsType() as shown below:

@@filename(authors/dto/get-author.args)
import { MinLength } from 'class-validator';
import { Field, ArgsType } from '@nestjs/graphql';

@ArgsType()
class GetAuthorArgs {
  @Field({ nullable: true })
  firstName?: string;

  @Field({ defaultValue: '' })
  @MinLength(3)
  lastName: string;
}

info Hint Again, due to TypeScript's metadata reflection system limitations, it's required to either use the @Field decorator to manually indicate type and optionality, or use a CLI plugin.

This will result in generating the following part of the GraphQL schema in SDL:

type Query {
  author(firstName: String, lastName: String = ''): Author
}

info Hint Note that arguments classes like GetAuthorArgs play very well with the ValidationPipe (read more).
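
For example, here is a minimal sketch of a handler that receives the GetAuthorArgs class, with the ValidationPipe enabled globally so the @MinLength() rule above is enforced (the findOne() service method is hypothetical):

// main.ts
app.useGlobalPipes(new ValidationPipe());

// authors.resolver.ts
@Query(returns => Author, { name: 'author' })
async getAuthor(@Args() args: GetAuthorArgs) {
  return this.authorsService.findOne(args);
}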

Class inheritance

You can use standard TypeScript class inheritance to create base classes with generic utility type features (fields and field properties, validations, etc.) that can be extended. For example, you may have a set of pagination related arguments that always include the standard offset and limit fields, but also other index fields that are type-specific. You can set up a class hierarchy as shown below.

Base @ArgsType() class:

@ArgsType()
class PaginationArgs {
  @Field(type => Int)
  offset: number = 0;

  @Field(type => Int)
  limit: number = 10;
}

Type specific sub-class of the base @ArgsType() class:

@ArgsType()
class GetAuthorArgs extends PaginationArgs {
  @Field({ nullable: true })
  firstName?: string;

  @Field({ defaultValue: '' })
  @MinLength(3)
  lastName: string;
}

The same approach can be taken with @ObjectType() objects. Define generic properties on the base class:

@ObjectType()
class Character {
  @Field(type => Int)
  id: number;

  @Field()
  name: string;
}

Add type-specific properties on sub-classes:

@ObjectType()
class Warrior extends Character {
  @Field()
  level: number;
}

You can use inheritance with a resolver as well. You can ensure type safety by combining inheritance and TypeScript generics. For example, to create a base class with a generic findAll query, use a construction like this:

function BaseResolver<T extends Type<unknown>>(classRef: T): any {
  @Resolver({ isAbstract: true })
  abstract class BaseResolverHost {
    @Query(type => [classRef], { name: `findAll${classRef.name}` })
    async findAll(): Promise<T[]> {
      return [];
    }
  }
  return BaseResolverHost;
}

Note the following:

  • an explicit return type (any above) is required: otherwise TypeScript complains about the usage of a private class definition. Recommended: define an interface instead of using any.
  • Type is imported from the @nestjs/common package
  • The isAbstract: true property indicates that SDL (Schema Definition Language) statements shouldn't be generated for this class. Note, you can set this property for other types as well to suppress SDL generation.

Here's how you could generate a concrete sub-class of the BaseResolver:

@Resolver(of => Recipe)
export class RecipesResolver extends BaseResolver(Recipe) {
  constructor(private recipesService: RecipesService) {
    super();
  }
}

This construct would generate the following SDL:

type Query {
  findAllRecipe: [Recipe!]!
}

Generics

We saw one use of generics above. This powerful TypeScript feature can be used to create useful abstractions. For example, here's a sample cursor-based pagination implementation based on this documentation:

import { Field, ObjectType, Int } from '@nestjs/graphql';
import { Type } from '@nestjs/common';

export function Paginated<T>(classRef: Type<T>): any {
  @ObjectType(`${classRef.name}Edge`)
  abstract class EdgeType {
    @Field(type => String)
    cursor: string;

    @Field(type => classRef)
    node: T;
  }

  @ObjectType({ isAbstract: true })
  abstract class PaginatedType {
    @Field(type => [EdgeType], { nullable: true })
    edges: EdgeType[];

    @Field(type => [classRef], { nullable: true })
    nodes: T[];

    @Field(type => Int)
    totalCount: number;

    @Field()
    hasNextPage: boolean;
  }
  return PaginatedType;
}

With the above base class defined, we can now easily create specialized types that inherit this behavior. For example:

@ObjectType()
class PaginatedAuthor extends Paginated(Author) {}

Schema first

As mentioned in the previous chapter, in the schema first approach we start by manually defining schema types in SDL (read more). Consider the following SDL type definitions.

info Hint For convenience in this chapter, we've aggregated all of the SDL in one location (e.g., one .graphql file, as shown below). In practice, you may find it appropriate to organize your code in a modular fashion. For example, it can be helpful to create individual SDL files with type definitions representing each domain entity, along with related services, resolver code, and the Nest module definition class, in a dedicated directory for that entity. Nest will aggregate all the individual schema type definitions at run time.

type Author {
  id: Int!
  firstName: String
  lastName: String
  posts: [Post]
}

type Post {
  id: Int!
  title: String!
  votes: Int
}

type Query {
  author(id: Int!): Author
}

Schema first resolver

The schema above exposes a single query - author(id: Int!): Author.

info Hint Learn more about GraphQL queries here.

Let's now create an AuthorsResolver class that resolves author queries:

@@filename(authors/authors.resolver)
@Resolver('Author')
export class AuthorsResolver {
  constructor(
    private authorsService: AuthorsService,
    private postsService: PostsService,
  ) {}

  @Query()
  async author(@Args('id') id: number) {
    return this.authorsService.findOneById(id);
  }

  @ResolveField()
  async posts(@Parent() author) {
    const { id } = author;
    return this.postsService.findAll({ authorId: id });
  }
}

info Hint All decorators (e.g., @Resolver, @ResolveField, @Args, etc.) are exported from the @nestjs/graphql package.

warning Note The logic inside the AuthorsService and PostsService classes can be as simple or sophisticated as needed. The main point of this example is to show how to construct resolvers and how they can interact with other providers.

The @Resolver() decorator is required. It takes an optional string argument with the name of a class. This class name is required whenever the class includes @ResolveField() decorators to inform Nest that the decorated method is associated with a parent type (the Author type in our current example). Alternatively, instead of setting @Resolver() at the top of the class, this can be done for each method:

@Resolver('Author')
@ResolveField()
async posts(@Parent() author) {
  const { id } = author;
  return this.postsService.findAll({ authorId: id });
}

In this case (@Resolver() decorator at the method level), if you have multiple @ResolveField() decorators inside a class, you must add @Resolver() to all of them. This is not considered the best practice (as it creates extra overhead).

info Hint Any class name argument passed to @Resolver() does not affect queries (@Query() decorator) or mutations (@Mutation() decorator).

warning Warning Using the @Resolver decorator at the method level is not supported with the code first approach.

In the above examples, the @Query() and @ResolveField() decorators are associated with GraphQL schema types based on the method name. For example, consider the following construction from the example above:

@Query()
async author(@Args('id') id: number) {
  return this.authorsService.findOneById(id);
}

This generates the following entry for the author query in our schema (the query type uses the same name as the method name):

type Query {
  author(id: Int!): Author
}

Conventionally, we would prefer to decouple these, using names like getAuthor() or getPosts() for our resolver methods. We can easily do this by passing the mapping name as an argument to the decorator, as shown below:

@@filename(authors/authors.resolver)
@Resolver('Author')
export class AuthorsResolver {
  constructor(
    private authorsService: AuthorsService,
    private postsService: PostsService,
  ) {}

  @Query('author')
  async getAuthor(@Args('id') id: number) {
    return this.authorsService.findOneById(id);
  }

  @ResolveField('posts')
  async getPosts(@Parent() author) {
    const { id } = author;
    return this.postsService.findAll({ authorId: id });
  }
}

Generating types

Assuming that we use the schema first approach and have enabled the typings generation feature (with outputAs: 'class' as shown in the previous chapter), once you run the application it will generate the following file in the location you specified in the GraphQLModule.forRoot() method (for example, in src/graphql.ts):

@@filename(graphql)
export class Author {
  id: number;
  firstName?: string;
  lastName?: string;
  posts?: Post[];
}

export class Post {
  id: number;
  title: string;
  votes?: number;
}

export abstract class IQuery {
  abstract author(id: number): Author | Promise<Author>;
}

By generating classes (instead of the default technique of generating interfaces), you can use declarative validation decorators in combination with the schema first approach, which is an extremely useful technique (read more). For example, you could add class-validator decorators to the generated CreatePostInput class as shown below to enforce minimum and maximum string lengths on the title field:

import { MinLength, MaxLength } from 'class-validator';

export class CreatePostInput {
  @MinLength(3)
  @MaxLength(50)
  title: string;
}

warning Notice To enable auto-validation of your inputs (and parameters), use ValidationPipe. Read more about validation here and more specifically about pipes here.
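
For reference, a minimal way to enable auto-validation globally is to register the ValidationPipe in your bootstrap function (a sketch; the module name and port are placeholders):

import { ValidationPipe } from '@nestjs/common';
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  // Validates all incoming inputs decorated with class-validator rules
  app.useGlobalPipes(new ValidationPipe());
  await app.listen(3000);
}
bootstrap();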

However, if you add decorators directly to the automatically generated file, they will be overwritten each time the file is generated. Instead, create a separate file and simply extend the generated class.

import { MinLength, MaxLength } from 'class-validator';
import { Post } from '../../graphql';

export class CreatePostInput extends Post {
  @MinLength(3)
  @MaxLength(50)
  title: string;
}

GraphQL argument decorators

We can access the standard GraphQL resolver arguments using dedicated decorators. Below is a comparison of the Nest decorators and the plain Apollo parameters they represent; a short example follows the argument descriptions below.

@Root() and @Parent() root/parent
@Context(param?: string) context / context[param]
@Info(param?: string) info / info[param]
@Args(param?: string) args / args[param]

These arguments have the following meanings:

  • root: an object that contains the result returned from the resolver on the parent field, or, in the case of a top-level Query field, the rootValue passed from the server configuration.
  • context: an object shared by all resolvers in a particular query; typically used to contain per-request state.
  • info: an object that contains information about the execution state of the query.
  • args: an object with the arguments passed into the field in the query.
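
For example, a field resolver could use several of these decorators side by side. This is only a sketch: the currentUser context property is an assumption, and PostsService is the service from the resolver example above.

import { Context, Info, Parent, ResolveField, Resolver } from '@nestjs/graphql';
import { GraphQLResolveInfo } from 'graphql';

@Resolver('Author')
export class AuthorsResolver {
  constructor(private postsService: PostsService) {}

  @ResolveField('posts')
  async getPosts(
    @Parent() author,                    // the resolved parent Author object
    @Context('currentUser') currentUser, // hypothetical per-request state
    @Info() info: GraphQLResolveInfo,    // execution state of the query
  ) {
    console.log(`Resolving ${info.fieldName} for user ${currentUser?.id}`);
    return this.postsService.findAll({ authorId: author.id });
  }
}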

Module

Once we're done with the above steps, we have declaratively specified all the information needed by the GraphQLModule to generate a resolver map. The GraphQLModule uses reflection to introspect the metadata provided via the decorators, and transforms classes into the correct resolver map automatically.

The only other thing you need to take care of is to provide (i.e., list as a provider in some module) the resolver class(es) (AuthorsResolver), and to import the module (AuthorsModule) somewhere, so Nest will be able to utilize it.

For example, we can do this in an AuthorsModule, which can also provide other services needed in this context. Be sure to import AuthorsModule somewhere (e.g., in the root module, or some other module imported by the root module).

@@filename(authors/authors.module)
@Module({
  imports: [PostsModule],
  providers: [AuthorsService, AuthorsResolver],
})
export class AuthorsModule {}

info Hint It is helpful to organize your code by your so-called domain model (similar to the way you would organize entry points in a REST API). In this approach, keep your models (ObjectType classes), resolvers and services together within a Nest module representing the domain model. Keep all of these components in a single folder per module. When you do this, and use the Nest CLI to generate each element, Nest will wire all of these parts together (locating files in appropriate folders, generating entries in provider and imports arrays, etc.) automatically for you.

Scalars

A GraphQL object type has a name and fields, but at some point those fields have to resolve to some concrete data. That's where the scalar types come in: they represent the leaves of the query (read more here). GraphQL includes the following default types: Int, Float, String, Boolean and ID. In addition to these built-in types, you may need to support custom atomic data types (e.g., Date).

Code first

The code-first approach ships with five scalars, three of which are simple aliases for existing GraphQL types.

  • ID (alias for GraphQLID) - represents a unique identifier, often used to refetch an object or as the key for a cache
  • Int (alias for GraphQLInt) - a signed 32‐bit integer
  • Float (alias for GraphQLFloat) - a signed double-precision floating-point value
  • GraphQLISODateTime - a date-time string at UTC (used by default to represent Date type)
  • GraphQLTimestamp - a numeric string which represents time and date as number of milliseconds from start of UNIX epoch

The GraphQLISODateTime (e.g. 2019-12-03T09:54:33Z) is used by default to represent the Date type. To use the GraphQLTimestamp instead, set the dateScalarMode of the buildSchemaOptions object to 'timestamp' as follows:

GraphQLModule.forRoot({
  buildSchemaOptions: {
    dateScalarMode: 'timestamp',
  }
}),

In addition, you can create custom scalars. For example, to create a Date scalar, simply create a new class.

import { Scalar, CustomScalar } from '@nestjs/graphql';
import { Kind, ValueNode } from 'graphql';

@Scalar('Date', type => Date)
export class DateScalar implements CustomScalar<number, Date> {
  description = 'Date custom scalar type';

  parseValue(value: number): Date {
    return new Date(value); // value from the client
  }

  serialize(value: Date): number {
    return value.getTime(); // value sent to the client
  }

  parseLiteral(ast: ValueNode): Date {
    if (ast.kind === Kind.INT) {
      return new Date(parseInt(ast.value, 10)); // ast.value is a string
    }
    return null;
  }
}

With this in place, register DateScalar as a provider.

@Module({
  providers: [DateScalar],
})
export class CommonModule {}

Now we can use the Date type in our classes.

@Field()
creationDate: Date;

Schema first

To define a custom scalar (read more about scalars here), create a type definition and a dedicated resolver. Here (as in the official documentation), we’ll use the graphql-type-json package for demonstration purposes. This npm package defines a JSON GraphQL scalar type.

Start by installing the package:

$ npm i --save graphql-type-json

Once the package is installed, we pass a custom resolver to the forRoot() method:

import GraphQLJSON from 'graphql-type-json';

@Module({
  imports: [
    GraphQLModule.forRoot({
      typePaths: ['./**/*.graphql'],
      resolvers: { JSON: GraphQLJSON },
    }),
  ],
})
export class ApplicationModule {}

Now we can use the JSON scalar in our type definitions:

scalar JSON

type Foo {
  field: JSON
}

Another method to define a scalar type is to create a simple class. Assume we want to enhance our schema with the Date type.

import { Scalar, CustomScalar } from '@nestjs/graphql';
import { Kind, ValueNode } from 'graphql';

@Scalar('Date')
export class DateScalar implements CustomScalar<number, Date> {
  description = 'Date custom scalar type';

  parseValue(value: number): Date {
    return new Date(value); // value from the client
  }

  serialize(value: Date): number {
    return value.getTime(); // value sent to the client
  }

  parseLiteral(ast: ValueNode): Date {
    if (ast.kind === Kind.INT) {
      return new Date(parseInt(ast.value, 10)); // ast.value is a string
    }
    return null;
  }
}

With this in place, register DateScalar as a provider.

@Module({
  providers: [DateScalar],
})
export class CommonModule {}

Now we can use the Date scalar in type definitions.

scalar Date

Generating SDL

warning Warning This chapter applies only to the code first approach.

To manually generate a GraphQL SDL schema (i.e., without running an application, connecting to the database, hooking up resolvers, etc.), use the GraphQLSchemaBuilderModule.

async function generateSchema() {
  const app = await NestFactory.create(GraphQLSchemaBuilderModule);
  await app.init();

  const gqlSchemaFactory = app.get(GraphQLSchemaFactory);
  const schema = await gqlSchemaFactory.create([RecipesResolver]);
  console.log(printSchema(schema));
}

info Hint The GraphQLSchemaBuilderModule and GraphQLSchemaFactory are imported from the @nestjs/graphql package. The printSchema function is imported from the graphql package.

Usage

The gqlSchemaFactory.create() method takes an array of resolver class references. For example:

const schema = await gqlSchemaFactory.create([
  RecipesResolver,
  AuthorsResolver,
  PostsResolvers,
]);

It also takes a second optional argument with an array of scalar classes:

const schema = await gqlSchemaFactory.create(
  [RecipesResolver, AuthorsResolver, PostsResolvers],
  [DurationScalar, DateScalar],
);

Lastly, you can pass an options object:

const schema = await gqlSchemaFactory.create([RecipesResolver], {
  skipCheck: true,
  orphanedTypes: [],
});

  • skipCheck: ignore schema validation; boolean, defaults to false
  • orphanedTypes: list of classes that are not explicitly referenced (not part of the object graph) to be generated. Normally, if a class is declared but isn't otherwise referenced in the graph, it's omitted. The property value is an array of class references.
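
Putting the pieces together, you could write the generated SDL to a file instead of printing it. This is a sketch; the schema.gql file name and the resolver list are placeholders:

import { writeFileSync } from 'fs';
import { NestFactory } from '@nestjs/core';
import { GraphQLSchemaBuilderModule, GraphQLSchemaFactory } from '@nestjs/graphql';
import { printSchema } from 'graphql';

async function generateSchemaFile() {
  const app = await NestFactory.create(GraphQLSchemaBuilderModule);
  await app.init();

  const gqlSchemaFactory = app.get(GraphQLSchemaFactory);
  const schema = await gqlSchemaFactory.create([RecipesResolver]);
  writeFileSync('schema.gql', printSchema(schema)); // file name is arbitrary
  await app.close();
}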

Subscriptions

In addition to fetching data using queries and modifying data using mutations, the GraphQL spec supports a third operation type, called subscription. GraphQL subscriptions are a way to push data from the server to the clients that choose to listen to real time messages from the server. Subscriptions are similar to queries in that they specify a set of fields to be delivered to the client, but instead of immediately returning a single answer, a channel is opened and a result is sent to the client every time a particular event happens on the server.

A common use case for subscriptions is notifying the client side about particular events, for example the creation of a new object, updated fields and so on (read more here).

Enable subscriptions

To enable subscriptions, set the installSubscriptionHandlers property to true.

GraphQLModule.forRoot({
  installSubscriptionHandlers: true,
}),

Code first

To create a subscription using the code first approach, we use the @Subscription() decorator and the PubSub class from the graphql-subscriptions package, which provides a simple publish/subscribe API.

The following subscription handler takes care of subscribing to an event by calling PubSub#asyncIterator. This method takes a single argument, the triggerName, which corresponds to an event topic name.

const pubSub = new PubSub();

@Resolver(of => Author)
export class AuthorResolver {
  // ...
  @Subscription(returns => Comment)
  commentAdded() {
    return pubSub.asyncIterator('commentAdded');
  }
}

info Hint All decorators are exported from the @nestjs/graphql package, while the PubSub class is exported from the graphql-subscriptions package.

warning Note PubSub is a class that exposes a simple publish and subscribe API. Read more about it here. Note that the Apollo docs warn that the default implementation is not suitable for production (read more here). Production apps should use a PubSub implementation backed by an external store (read more here).

This will result in generating the following part of the GraphQL schema in SDL:

type Subscription {
  commentAdded(): Comment!
}

Note that subscriptions, by definition, return an object with a single top level property whose key is the name of the subscription. This name is either inherited from the name of the subscription handler method (i.e., commentAdded above), or is provided explicitly by passing an option with the key name as the second argument to the @Subscription() decorator, as shown below.

  @Subscription(returns => Comment, {
    name: 'commentAdded',
  })
  addCommentHandler() {
    return pubSub.asyncIterator('commentAdded');
  }

This construct produces the same SDL as the previous code sample, but allows us to decouple the method name from the subscription.

Publishing

Now, to publish the event, we use the PubSub#publish method. This is often used within a mutation to trigger a client-side update when a part of the object graph has changed. For example:

@@filename(posts/posts.resolver)
@Mutation(returns => Post)
async addComment(
  @Args('postId', { type: () => Int }) postId: number,
  @Args('comment', { type: () => Comment }) comment: CommentInput,
) {
  const newComment = this.commentsService.addComment({ id: postId, comment });
  pubSub.publish('commentAdded', { commentAdded: newComment });
  return newComment;
}

The PubSub#publish method takes a triggerName (again, think of this as an event topic name) as the first parameter, and an event payload as the second parameter. As mentioned, the subscription, by definition, returns a value and that value has a shape. Look again at the generated SDL for our commentAdded subscription:

type Subscription {
  commentAdded(): Comment!
}

This tells us that the subscription must return an object with a top-level property name of commentAdded that has a value which is a Comment object. The important point to note is that the shape of the event payload emitted by the PubSub#publish method must correspond to the shape of the value expected to be returned by the subscription. So, in our example above, the pubSub.publish('commentAdded', { commentAdded: newComment }) statement publishes a commentAdded event with the appropriately shaped payload. If these shapes don't match, your subscription will fail during the GraphQL validation phase.

Filtering subscriptions

To filter out specific events, set the filter property to a filter function. This function acts similar to the function passed to an array filter. It takes two arguments: payload containing the event payload (as sent by the event publisher), and variables taking any arguments passed in during the subscription request. It returns a boolean determining whether this event should be published to client listeners.

@Subscription(returns => Comment, {
  filter: (payload, variables) =>
    payload.commentAdded.title === variables.title,
})
commentAdded(@Args('title') title: string) {
  return pubSub.asyncIterator('commentAdded');
}

Mutating subscription payloads

To mutate the published event payload, set the resolve property to a function. The function receives the event payload (as sent by the event publisher) and returns the appropriate value.

@Subscription(returns => Comment, {
  resolve: value => value,
})
commentAdded() {
  return pubSub.asyncIterator('commentAdded');
}

warning Note If you use the resolve option, you should return the unwrapped payload (e.g., with our example, return a newComment object directly, not a { commentAdded: newComment } object).

If you need to access injected providers (e.g., use an external service to validate the data), use the following construction.

@Subscription(returns => Comment, {
  resolve(this: AuthorResolver, value) {
    // "this" refers to an instance of "AuthorResolver"
    return value;
  }
})
commentAdded() {
  return pubSub.asyncIterator('commentAdded');
}

The same construction works with filters:

@Subscription(returns => Comment, {
  filter(this: AuthorResolver, payload, variables) {
    // "this" refers to an instance of "AuthorResolver"
    return payload.commentAdded.title === variables.title;
  }
})
commentAdded() {
  return pubSub.asyncIterator('commentAdded');
}

Schema first

To create an equivalent subscription in Nest, we'll make use of the @Subscription() decorator.

const pubSub = new PubSub();

@Resolver('Author')
export class AuthorResolver {
  // ...
  @Subscription()
  commentAdded() {
    return pubSub.asyncIterator('commentAdded');
  }
}

To filter out specific events based on context and arguments, set the filter property.

@Subscription('commentAdded', {
  filter: (payload, variables) =>
    payload.commentAdded.title === variables.title,
})
commentAdded() {
  return pubSub.asyncIterator('commentAdded');
}

To mutate the published payload, we can use a resolve function.

@Subscription('commentAdded', {
  resolve: value => value,
})
commentAdded() {
  return pubSub.asyncIterator('commentAdded');
}

If you need to access injected providers (e.g., use an external service to validate the data), use the following construction:

@Subscription('commentAdded', {
  resolve(this: AuthorResolver, value) {
    // "this" refers to an instance of "AuthorResolver"
    return value;
  }
})
commentAdded() {
  return pubSub.asyncIterator('commentAdded');
}

The same construction works with filters:

@Subscription('commentAdded', {
  filter(this: AuthorResolver, payload, variables) {
    // "this" refers to an instance of "AuthorResolver"
    return payload.commentAdded.title === variables.title;
  }
})
commentAdded() {
  return pubSub.asyncIterator('commentAdded');
}

The last step is to update the type definitions file.

type Author {
  id: Int!
  firstName: String
  lastName: String
  posts: [Post]
}

type Post {
  id: Int!
  title: String
  votes: Int
}

type Query {
  author(id: Int!): Author
}

type Comment {
  id: String
  content: String
}

type Subscription {
  commentAdded(title: String!): Comment
}

With this, we've created a single commentAdded(title: String!): Comment subscription. You can find a full sample implementation here.

PubSub

We instantiated a local PubSub instance above. The preferred approach is to define PubSub as a provider and inject it through the constructor (using the @Inject() decorator). This allows us to re-use the instance across the whole application. For example, define a provider as follows, then inject 'PUB_SUB' where needed.

{
  provide: 'PUB_SUB',
  useValue: new PubSub(),
}
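
A minimal sketch of that setup might look as follows. Author and Comment are the classes from the earlier examples; the module layout is illustrative only:

import { Inject, Module } from '@nestjs/common';
import { Resolver, Subscription } from '@nestjs/graphql';
import { PubSub } from 'graphql-subscriptions';

@Resolver(of => Author)
export class AuthorResolver {
  constructor(@Inject('PUB_SUB') private pubSub: PubSub) {}

  @Subscription(returns => Comment)
  commentAdded() {
    return this.pubSub.asyncIterator('commentAdded');
  }
}

@Module({
  providers: [
    { provide: 'PUB_SUB', useValue: new PubSub() },
    AuthorResolver,
  ],
})
export class AuthorsModule {}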

Customize subscriptions server

To customize the subscriptions server (e.g., change the listener port), use the subscriptions options property (read more).

GraphQLModule.forRoot({
  installSubscriptionHandlers: true,
  subscriptions: {
    keepAlive: 5000,
  }
}),

Unions

Union types are very similar to interfaces, but they don't get to specify any common fields between the types (read more here). Unions are useful for returning disjoint data types from a single field.

Code first

To define a GraphQL union type, we must define classes that this union will be composed of. Following the example from the Apollo documentation, we'll create two classes. First, Book:

import { Field, ObjectType } from '@nestjs/graphql';

@ObjectType()
export class Book {
  @Field()
  title: string;
}

And then Author:

import { Field, ObjectType } from '@nestjs/graphql';

@ObjectType()
export class Author {
  @Field()
  name: string;
}

With this in place, register the Result union using the createUnionType function exported from the @nestjs/graphql package:

export const ResultUnion = createUnionType({
  name: 'Result',
  types: () => [Author, Book],
});

Now, we can reference the ResultUnion in our query:

@Query(returns => [ResultUnion])
search(): Array<typeof ResultUnion> {
  return [new Author(), new Book()];
}

This will result in generating the following part of the GraphQL schema in SDL:

type Author {
  name: String!
}

type Book {
  title: String!
}

union ResultUnion = Author | Book

type Query {
  search: [ResultUnion!]!
}

The default resolveType() function generated by the library will extract the type based on the value returned from the resolver method. That means returning class instances instead of literal JavaScript objects is obligatory.
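
For example, the search() resolver above should build instances rather than plain object literals (the sample data below is ours):

@Query(returns => [ResultUnion])
search(): Array<typeof ResultUnion> {
  const author = Object.assign(new Author(), { name: 'Jane' });
  const book = Object.assign(new Book(), { title: 'NestJS in Action' });
  // Returning real instances lets the default resolveType() detect each type
  return [author, book];
}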

To provide a customized resolveType() function, pass the resolveType property to the options object passed into the createUnionType() function, as follows:

export const ResultUnion = createUnionType({
  name: 'Result',
  types: () => [Author, Book],
  resolveType(value) {
    if (value.name) {
      return Author;
    }
    if (value.title) {
      return Book;
    }
    return null;
  },
});

Schema first

To define a union in the schema first approach, simply create a GraphQL union with SDL.

type Author {
  name: String!
}

type Book {
  title: String!
}

union ResultUnion = Author | Book

Then, you can use the typings generation feature (as shown in the quick start chapter) to generate corresponding TypeScript definitions:

export class Author {
  name: string;
}

export class Book {
  title: string;
}

export type ResultUnion = Author | Book;

Unions require an extra __resolveType field in the resolver map to determine which type the union should resolve to. Let's create a ResultUnionResolver class and define the __resolveType method:

@Resolver('ResultUnion')
export class ResultUnionResolver {
  @ResolveField()
  __resolveType(value) {
    if (value.name) {
      return 'Author';
    }
    if (value.title) {
      return 'Book';
    }
    return null;
  }
}

info Hint All decorators are exported from the @nestjs/graphql package.

Overview

In addition to traditional (sometimes called monolithic) application architectures, Nest natively supports the microservice architectural style of development. Most of the concepts discussed elsewhere in this documentation, such as dependency injection, decorators, exception filters, pipes, guards and interceptors, apply equally to microservices. Wherever possible, Nest abstracts implementation details so that the same components can run across HTTP-based platforms, WebSockets, and Microservices. This section covers the aspects of Nest that are specific to microservices.

In Nest, a microservice is fundamentally an application that uses a different transport layer than HTTP.

Nest supports several built-in transport layer implementations, called transporters, which are responsible for transmitting messages between different microservice instances. Most transporters natively support both request-response and event-based message styles. Nest abstracts the implementation details of each transporter behind a canonical interface for both request-response and event-based messaging. This makes it easy to switch from one transport layer to another -- for example to leverage the specific reliability or performance features of a particular transport layer -- without impacting your application code.

Installation

To start building microservices, first install the required package:

$ npm i --save @nestjs/microservices

Getting started

To instantiate a microservice, use the createMicroservice() method of the NestFactory class:

@@filename(main)
import { NestFactory } from '@nestjs/core';
import { Transport, MicroserviceOptions } from '@nestjs/microservices';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.createMicroservice<MicroserviceOptions>(
    AppModule,
    {
      transport: Transport.TCP,
    },
  );
  app.listen(() => console.log('Microservice is listening'));
}
bootstrap();
@@switch
import { NestFactory } from '@nestjs/core';
import { Transport, MicroserviceOptions } from '@nestjs/microservices';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.createMicroservice(AppModule, {
    transport: Transport.TCP,
  });
  app.listen(() => console.log('Microservice is listening'));
}
bootstrap();

info Hint Microservices use the TCP transport layer by default.

The second argument of the createMicroservice() method is an options object. This object may consist of two members:

transport Specifies the transporter (for example, Transport.NATS)
options A transporter-specific options object that determines transporter behavior

The options object is specific to the chosen transporter. The TCP transporter exposes the properties described below. For other transporters (e.g., Redis, MQTT, etc.), see the relevant chapter for a description of the available options.

host Connection hostname
port Connection port
retryAttempts Number of times to retry message (default: 0)
retryDelay Delay between message retry attempts (ms) (default: 0)

Patterns

Microservices recognize both messages and events by patterns. A pattern is a plain value, for example, a literal object or a string. Patterns are automatically serialized and sent over the network along with the data portion of a message. In this way, message senders and consumers can coordinate which requests are consumed by which handlers.

Request-response

The request-response message style is useful when you need to exchange messages between various external services. With this paradigm, you can be certain that the service has actually received the message (without the need to manually implement a message ACK protocol). However, the request-response paradigm is not always the best choice. For example, streaming transporters that use log-based persistence, such as Kafka or NATS streaming, are optimized for solving a different range of issues, more aligned with an event messaging paradigm (see event-based messaging below for more details).

To enable the request-response message type, Nest creates two logical channels - one is responsible for transferring the data while the other waits for incoming responses. For some underlying transports, such as NATS, this dual-channel support is provided out-of-the-box. For others, Nest compensates by manually creating separate channels. There can be overhead for this, so if you do not require a request-response message style, you should consider using the event-based method.

To create a message handler based on the request-response paradigm use the @MessagePattern() decorator, which is imported from the @nestjs/microservices package.

@@filename(math.controller)
import { Controller } from '@nestjs/common';
import { MessagePattern } from '@nestjs/microservices';

@Controller()
export class MathController {
  @MessagePattern({ cmd: 'sum' })
  accumulate(data: number[]): number {
    return (data || []).reduce((a, b) => a + b);
  }
}
@@switch
import { Controller } from '@nestjs/common';
import { MessagePattern } from '@nestjs/microservices';

@Controller()
export class MathController {
  @MessagePattern({ cmd: 'sum' })
  accumulate(data) {
    return (data || []).reduce((a, b) => a + b);
  }
}

In the above code, the accumulate() message handler listens for messages that fulfill the { cmd: 'sum' } message pattern. The message handler takes a single argument, the data passed from the client. In this case, the data is an array of numbers which are to be accumulated.

Asynchronous responses

Message handlers are able to respond either synchronously or asynchronously. Hence, async methods are supported.

@@filename()
@MessagePattern({ cmd: 'sum' })
async accumulate(data: number[]): Promise<number> {
  return (data || []).reduce((a, b) => a + b);
}
@@switch
@MessagePattern({ cmd: 'sum' })
async accumulate(data) {
  return (data || []).reduce((a, b) => a + b);
}

A message handler is also able to return an Observable, in which case the result values will be emitted until the stream is completed.

@@filename()
@MessagePattern({ cmd: 'sum' })
accumulate(data: number[]): Observable<number> {
  return from([1, 2, 3]);
}
@@switch
@MessagePattern({ cmd: 'sum' })
accumulate(data) {
  return from([1, 2, 3]);
}

In the example above, the message handler will respond 3 times (with each item from the array).

Event-based

While the request-response method is ideal for exchanging messages between services, it is less suitable when your message style is event-based - when you just want to publish events without waiting for a response. In that case, you do not want the overhead required by request-response for maintaining two channels.

Suppose you would like to simply notify another service that a certain condition has occurred in this part of the system. This is the ideal use case for the event-based message style.

To create an event handler, we use the @EventPattern() decorator, which is imported from the @nestjs/microservices package.

@@filename()
@EventPattern('user_created')
async handleUserCreated(data: Record<string, unknown>) {
  // business logic
}
@@switch
@EventPattern('user_created')
async handleUserCreated(data) {
  // business logic
}

The handleUserCreated() event handler listens for the 'user_created' event. The event handler takes a single argument, the data passed from the client (in this case, an event payload which has been sent over the network).

Decorators

In more sophisticated scenarios, you may want to access more information about the incoming request. For example, in the case of NATS with wildcard subscriptions, you may want to get the original subject that the producer has sent the message to. Likewise, in Kafka you may want to access the message headers. In order to accomplish that, you can use built-in decorators as follows:

@@filename()
@MessagePattern('time.us.*')
getDate(@Payload() data: number[], @Ctx() context: NatsContext) {
  console.log(`Subject: ${context.getSubject()}`); // e.g. "time.us.east"
  return new Date().toLocaleTimeString(...);
}
@@switch
@Bind(Payload(), Ctx())
@MessagePattern('time.us.*')
getDate(data, context) {
  console.log(`Subject: ${context.getSubject()}`); // e.g. "time.us.east"
  return new Date().toLocaleTimeString(...);
}

info Hint @Payload(), @Ctx() and NatsContext are imported from @nestjs/microservices.

Client

A client Nest application can exchange messages or publish events to a Nest microservice using the ClientProxy class. This class defines several methods, such as send() (for request-response messaging) and emit() (for event-driven messaging) that let you communicate with a remote microservice. Obtain an instance of this class in one of the following ways.

One technique is to import the ClientsModule, which exposes the static register() method. This method takes an argument which is an array of objects representing microservice transporters. Each such object has a name property, an optional transport property (default is Transport.TCP), and an optional transporter-specific options property.

The name property serves as an injection token that can be used to inject an instance of a ClientProxy where needed. The value of the name property, as an injection token, can be an arbitrary string or JavaScript symbol, as described here.

The options property is an object with the same properties we saw in the createMicroservice() method earlier.

@Module({
  imports: [
    ClientsModule.register([
      { name: 'MATH_SERVICE', transport: Transport.TCP },
    ]),
  ]
  ...
})

Once the module has been imported, we can inject an instance of the ClientProxy configured as specified via the 'MATH_SERVICE' transporter options shown above, using the @Inject() decorator.

constructor(
  @Inject('MATH_SERVICE') private client: ClientProxy,
) {}

info Hint The ClientsModule and ClientProxy classes are imported from the @nestjs/microservices package.

At times we may need to fetch the transporter configuration from another service (say a ConfigService), rather than hard-coding it in our client application. To do this, we can register a custom provider using the ClientProxyFactory class. This class has a static create() method, which accepts a transporter options object, and returns a customized ClientProxy instance.

@Module({
  providers: [
    {
      provide: 'MATH_SERVICE',
      useFactory: (configService: ConfigService) => {
        const mathSvcOptions = configService.getMathSvcOptions();
        return ClientProxyFactory.create(mathSvcOptions);
      },
      inject: [ConfigService],
    }
  ]
  ...
})

info Hint The ClientProxyFactory is imported from the @nestjs/microservices package.

Another option is to use the @Client() property decorator.

@Client({ transport: Transport.TCP })
client: ClientProxy;

info Hint The @Client() decorator is imported from the @nestjs/microservices package.

Using the @Client() decorator is not the preferred technique, as it is harder to test and harder to share a client instance.

The ClientProxy is lazy. It doesn't initiate a connection immediately. Instead, it will be established before the first microservice call, and then reused across each subsequent call. However, if you want to delay the application bootstrapping process until a connection is established, you can manually initiate a connection using the ClientProxy object's connect() method inside the OnApplicationBootstrap lifecycle hook.

@@filename()
async onApplicationBootstrap() {
  await this.client.connect();
}

If the connection cannot be created, the connect() method will reject with the corresponding error object.
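
For example, you could surface such a failure during bootstrap (a sketch; the logging is illustrative):

async onApplicationBootstrap() {
  try {
    await this.client.connect();
  } catch (err) {
    // The rejection carries the transporter-specific error object
    console.error('Could not connect to the microservice', err);
    throw err;
  }
}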

Sending messages

The ClientProxy exposes a send() method. This method is intended to call the microservice and returns an Observable with its response. Thus, we can subscribe to the emitted values easily.

@@filename()
accumulate(): Observable<number> {
  const pattern = { cmd: 'sum' };
  const payload = [1, 2, 3];
  return this.client.send<number>(pattern, payload);
}
@@switch
accumulate() {
  const pattern = { cmd: 'sum' };
  const payload = [1, 2, 3];
  return this.client.send(pattern, payload);
}

The send() method takes two arguments, pattern and payload. The pattern should match one defined in a @MessagePattern() decorator. The payload is a message that we want to transmit to the remote microservice. This method returns a cold Observable, which means that you have to explicitly subscribe to it before the message will be sent.
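
For instance, to trigger the call imperatively you can convert the Observable to a Promise. This sketch assumes RxJS 7+ (firstValueFrom); with older RxJS versions, toPromise() plays the same role:

import { ClientProxy } from '@nestjs/microservices';
import { firstValueFrom } from 'rxjs';

async function sumViaMicroservice(client: ClientProxy): Promise<number> {
  const result$ = client.send<number>({ cmd: 'sum' }, [1, 2, 3]); // nothing sent yet
  return firstValueFrom(result$); // subscribing sends the message and awaits the reply
}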

Publishing events

To send an event, use the ClientProxy object's emit() method. This method publishes an event to the message broker.

@@filename()
async publish() {
  this.client.emit<number>('user_created', new UserCreatedEvent());
}
@@switch
async publish() {
  this.client.emit('user_created', new UserCreatedEvent());
}

The emit() method takes two arguments, pattern and payload. The pattern should match one defined in an @EventPattern() decorator. The payload is an event payload that we want to transmit to the remote microservice. This method returns a hot Observable (unlike the cold Observable returned by send()), which means that whether or not you explicitly subscribe to the observable, the proxy will immediately try to deliver the event.

Scopes

For people coming from different programming language backgrounds, it might be unexpected to learn that in Nest, almost everything is shared across incoming requests. We have a connection pool to the database, singleton services with global state, etc. Remember that Node.js doesn't follow the request/response Multi-Threaded Stateless Model in which every request is processed by a separate thread. Hence, using singleton instances is fully safe for our applications.

However, there are edge-cases when request-based lifetime of the handler may be the desired behavior, for instance per-request caching in GraphQL applications, request tracking or multi-tenancy. Learn how to control scopes here.

Request-scoped handlers and providers can inject RequestContext using the @Inject() decorator in combination with CONTEXT token:

import { Injectable, Scope, Inject } from '@nestjs/common';
import { CONTEXT, RequestContext } from '@nestjs/microservices';

@Injectable({ scope: Scope.REQUEST })
export class CatsService {
  constructor(@Inject(CONTEXT) private ctx: RequestContext) {}
}

This provides access to the RequestContext object, which has two properties:

export interface RequestContext<T = any> {
  pattern: string | Record<string, any>;
  data: T;
}

The data property is the message payload sent by the message producer. The pattern property is the pattern used to identify an appropriate handler to handle the incoming message.
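
For example, a request-scoped provider could log both properties (a sketch; the describe() method is hypothetical):

import { Injectable, Scope, Inject } from '@nestjs/common';
import { CONTEXT, RequestContext } from '@nestjs/microservices';

@Injectable({ scope: Scope.REQUEST })
export class CatsService {
  constructor(@Inject(CONTEXT) private ctx: RequestContext<number[]>) {}

  describe(): string {
    // pattern identifies the handler; data is the payload sent by the producer
    return `Handling ${JSON.stringify(this.ctx.pattern)} with payload ${this.ctx.data}`;
  }
}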

Exception filters

The only difference between the HTTP exception filter layer and the corresponding microservices layer is that instead of throwing HttpException, you should use RpcException.

throw new RpcException('Invalid credentials.');

info Hint The RpcException class is imported from the @nestjs/microservices package.

With the sample above, Nest will handle the thrown exception and return the error object with the following structure:

{
  "status": "error",
  "message": "Invalid credentials."
}
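
For instance, a message handler might throw it when validation fails. This is only a sketch; the pattern and the authService calls are hypothetical:

@MessagePattern({ cmd: 'login' })
login(credentials: { username: string; password: string }) {
  // authService is a hypothetical injected provider
  if (!this.authService.validate(credentials)) {
    throw new RpcException('Invalid credentials.');
  }
  return this.authService.signIn(credentials);
}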

Filters

Microservice exception filters behave similarly to HTTP exception filters, with one small difference. The catch() method must return an Observable.

@@filename(rpc-exception.filter)
import { Catch, RpcExceptionFilter, ArgumentsHost } from '@nestjs/common';
import { Observable, throwError } from 'rxjs';
import { RpcException } from '@nestjs/microservices';

@Catch(RpcException)
export class ExceptionFilter implements RpcExceptionFilter<RpcException> {
  catch(exception: RpcException, host: ArgumentsHost): Observable<any> {
    return throwError(exception.getError());
  }
}
@@switch
import { Catch } from '@nestjs/common';
import { throwError } from 'rxjs';

@Catch(RpcException)
export class ExceptionFilter {
  catch(exception, host) {
    return throwError(exception.getError());
  }
}

warning Warning You cannot set up global microservice exception filters when using a hybrid application.

The following example uses a manually instantiated method-scoped filter. Just as with HTTP based applications, you can also use controller-scoped filters (i.e., prefix the controller class with a @UseFilters() decorator).

@@filename()
@UseFilters(new ExceptionFilter())
@MessagePattern({ cmd: 'sum' })
accumulate(data: number[]): number {
  return (data || []).reduce((a, b) => a + b);
}
@@switch
@UseFilters(new ExceptionFilter())
@MessagePattern({ cmd: 'sum' })
accumulate(data) {
  return (data || []).reduce((a, b) => a + b);
}

Inheritance

Typically, you'll create fully customized exception filters crafted to fulfill your application requirements. However, there might be use-cases when you would like to simply extend the core exception filter, and override the behavior based on certain factors.

In order to delegate exception processing to the base filter, you need to extend BaseRpcExceptionFilter and call the inherited catch() method.

@@filename()
import { Catch, ArgumentsHost } from '@nestjs/common';
import { BaseRpcExceptionFilter } from '@nestjs/microservices';

@Catch()
export class AllExceptionsFilter extends BaseRpcExceptionFilter {
  catch(exception: any, host: ArgumentsHost) {
    return super.catch(exception, host);
  }
}
@@switch
import { Catch } from '@nestjs/common';
import { BaseRpcExceptionFilter } from '@nestjs/microservices';

@Catch()
export class AllExceptionsFilter extends BaseRpcExceptionFilter {
  catch(exception, host) {
    return super.catch(exception, host);
  }
}

The above implementation is just a shell demonstrating the approach. Your implementation of the extended exception filter would include your tailored business logic (e.g., handling various conditions).

gRPC

gRPC is a modern, open source, high performance RPC framework that can run in any environment. It can efficiently connect services in and across data centers with pluggable support for load balancing, tracing, health checking and authentication.

Like many RPC systems, gRPC is based on the concept of defining a service in terms of functions (methods) that can be called remotely. For each method, you define the parameters and return types. Services, parameters, and return types are defined in .proto files using Google's open source language-neutral protocol buffers mechanism.

With the gRPC transporter, Nest uses .proto files to dynamically bind clients and servers to make it easy to implement remote procedure calls, automatically serializing and deserializing structured data.

Installation

To start building gRPC-based microservices, first install the required packages:

$ npm i --save grpc @grpc/proto-loader

Overview

Like other Nest microservices transport layer implementations, you select the gRPC transporter mechanism using the transport property of the options object passed to the createMicroservice() method. In the following example, we'll set up a hero service. The options property provides metadata about that service; its properties are described below.

@@filename(main)
const app = await NestFactory.createMicroservice<MicroserviceOptions>(AppModule, {
  transport: Transport.GRPC,
  options: {
    package: 'hero',
    protoPath: join(__dirname, 'hero/hero.proto'),
  },
});
@@switch
const app = await NestFactory.createMicroservice(AppModule, {
  transport: Transport.GRPC,
  options: {
    package: 'hero',
    protoPath: join(__dirname, 'hero/hero.proto'),
  },
});

info Hint The join() function is imported from the path package; the Transport enum is imported from the @nestjs/microservices package.

Options

The gRPC transporter options object exposes the properties described below.

package Protobuf package name (matches package setting from .proto file). Required
protoPath Absolute (or relative to the root dir) path to the .proto file. Required
url Connection url. String in the format ip address/dns name:port (for example, 'localhost:50051') defining the address/port on which the transporter establishes a connection. Optional. Defaults to 'localhost:5000'
protoLoader NPM package name for the utility to load .proto files. Optional. Defaults to '@grpc/proto-loader'
loader @grpc/proto-loader options. These provide detailed control over the behavior of .proto files. Optional. See here for more details
credentials Server credentials. Optional. Read more here

Sample gRPC service

Let's define our sample gRPC service called HeroesService. In the above options object, the protoPath property sets a path to the .proto definitions file hero.proto. The hero.proto file is structured using protocol buffers. Here's what it looks like:

// hero/hero.proto
syntax = "proto3";

package hero;

service HeroesService {
  rpc FindOne (HeroById) returns (Hero) {}
}

message HeroById {
  int32 id = 1;
}

message Hero {
  int32 id = 1;
  string name = 2;
}

Our HeroesService exposes a FindOne() method. This method expects an input argument of type HeroById and returns a Hero message (protocol buffers use message elements to define both parameter types and return types).

Next, we need to implement the service. To define a handler that fulfills this definition, we use the @GrpcMethod() decorator in a controller, as shown below. This decorator provides the metadata needed to declare a method as a gRPC service method.

info Hint The @MessagePattern() decorator (read more) introduced in previous microservices chapters is not used with gRPC-based microservices. The @GrpcMethod() decorator effectively takes its place for gRPC-based microservices.

@@filename(heroes.controller)
@Controller()
export class HeroesController {
  @GrpcMethod('HeroesService', 'FindOne')
  findOne(data: HeroById, metadata: any): Hero {
    const items = [
      { id: 1, name: 'John' },
      { id: 2, name: 'Doe' },
    ];
    return items.find(({ id }) => id === data.id);
  }
}
@@switch
@Controller()
export class HeroesController {
  @GrpcMethod('HeroesService', 'FindOne')
  findOne(data, metadata) {
    const items = [
      { id: 1, name: 'John' },
      { id: 2, name: 'Doe' },
    ];
    return items.find(({ id }) => id === data.id);
  }
}

info Hint The @GrpcMethod() decorator is imported from the @nestjs/microservices package.

The decorator shown above takes two arguments. The first is the service name (e.g., 'HeroesService'), corresponding to the HeroesService service definition in hero.proto. The second (the string 'FindOne') corresponds to the FindOne() rpc method defined within HeroesService in the hero.proto file.

The findOne() handler method takes two arguments, the data passed from the caller and metadata that stores gRPC request metadata.

Both @GrpcMethod() decorator arguments are optional. If called without the second argument (e.g., 'FindOne'), Nest will automatically associate the .proto file rpc method with the handler based on converting the handler name to upper camel case (e.g., the findOne handler is associated with the FindOne rpc call definition). This is shown below.

@@filename(heroes.controller)
@Controller()
export class HeroesController {
  @GrpcMethod('HeroesService')
  findOne(data: HeroById, metadata: any): Hero {
    const items = [
      { id: 1, name: 'John' },
      { id: 2, name: 'Doe' },
    ];
    return items.find(({ id }) => id === data.id);
  }
}
@@switch
@Controller()
export class HeroesController {
  @GrpcMethod('HeroesService')
  findOne(data, metadata) {
    const items = [
      { id: 1, name: 'John' },
      { id: 2, name: 'Doe' },
    ];
    return items.find(({ id }) => id === data.id);
  }
}

You can also omit the first @GrpcMethod() argument. In this case, Nest automatically associates the handler with the service definition from the proto definitions file based on the class name where the handler is defined. For example, in the following code, class HeroesService associates its handler methods with the HeroesService service definition in the hero.proto file based on the matching of the name 'HeroesService'.

@@filename(heroes.controller)
@Controller()
export class HeroesService {
  @GrpcMethod()
  findOne(data: HeroById, metadata: any): Hero {
    const items = [
      { id: 1, name: 'John' },
      { id: 2, name: 'Doe' },
    ];
    return items.find(({ id }) => id === data.id);
  }
}
@@switch
@Controller()
export class HeroesService {
  @GrpcMethod()
  findOne(data, metadata) {
    const items = [
      { id: 1, name: 'John' },
      { id: 2, name: 'Doe' },
    ];
    return items.find(({ id }) => id === data.id);
  }
}

Client

Nest applications can act as gRPC clients, consuming services defined in .proto files. You access remote services through a ClientGrpc object. You can obtain a ClientGrpc object in several ways.

The preferred technique is to import the ClientsModule. Use the register() method to bind a package of services defined in a .proto file to an injection token, and to configure the service. The name property is the injection token. For gRPC services, use transport: Transport.GRPC. The options property is an object with the same properties described above.

imports: [
  ClientsModule.register([
    {
      name: 'HERO_PACKAGE',
      transport: Transport.GRPC,
      options: {
        package: 'hero',
        protoPath: join(__dirname, 'hero/hero.proto'),
      },
    },
  ]),
];

info Hint The register() method takes an array of objects. Register multiple packages by providing a comma separated list of registration objects.

Once registered, we can inject the configured ClientGrpc object with @Inject(). Then we use the ClientGrpc object's getService() method to retrieve the service instance, as shown below.

@Injectable()
export class AppService implements OnModuleInit {
  private heroesService: HeroesService;

  constructor(@Inject('HERO_PACKAGE') private client: ClientGrpc) {}

  onModuleInit() {
    this.heroesService = this.client.getService<HeroesService>('HeroesService');
  }

  getHero(): Observable<string> {
    return this.heroesService.findOne({ id: 1 });
  }
}

Notice that there is a small difference compared to the technique used in other microservice transport methods. Instead of the ClientProxy class, we use the ClientGrpc class, which provides the getService() method. The getService() generic method takes a service name as an argument and returns its instance (if available).

Alternatively, you can use the @Client() decorator to instantiate a ClientGrpc object, as follows:

@Injectable()
export class AppService implements OnModuleInit {
  @Client({
    transport: Transport.GRPC,
    options: {
      package: 'hero',
      protoPath: join(__dirname, 'hero/hero.proto'),
    },
  })
  client: ClientGrpc;

  private heroesService: HeroesService;

  onModuleInit() {
    this.heroesService = this.client.getService<HeroesService>('HeroesService');
  }

  getHero(): Observable<string> {
    return this.heroesService.findOne({ id: 1 });
  }
}

Finally, for more complex scenarios, we can inject a dynamically configured client using the ClientProxyFactory class as described here.

In either case, we end up with a reference to our HeroesService proxy object, which exposes the same set of methods that are defined inside the .proto file. Now, when we access this proxy object (i.e., heroesService), the gRPC system automatically serializes requests, forwards them to the remote system, returns a response, and deserializes the response. Because gRPC shields us from these network communication details, heroesService looks and acts like a local provider.

Note, all service methods are lower camel cased (in order to follow the natural convention of the language). So, for example, while our .proto file HeroesService definition contains the FindOne() function, the heroesService instance will provide the findOne() method.

interface HeroesService {
  findOne(data: { id: number }): Observable<any>;
}

A message handler is also able to return an Observable, in which case the result values will be emitted until the stream is completed.

@@filename(heroes.controller)
@Get()
call(): Observable<any> {
  return this.heroesService.findOne({ id: 1 });
}
@@switch
@Get()
call() {
  return this.heroesService.findOne({ id: 1 });
}

A full working example is available here.

gRPC Streaming

gRPC on its own supports long-term live connections, conventionally known as streams. Streams are useful for cases such as chatting, observations, or chunked data transfers. Find more details in the official documentation here.

Nest supports gRPC stream handlers in two possible ways:

  • RxJS Subject + Observable handler: useful for writing responses right inside of a controller method or for passing them down to a Subject/Observable consumer
  • Pure gRPC call stream handler: useful when passed to some executor which will handle the rest of the dispatch for the Node standard Duplex stream handler.

Streaming sample

Let's define a new sample gRPC service called HelloService. The hello.proto file is structured using protocol buffers. Here's what it looks like:

// hello/hello.proto
syntax = "proto3";

package hello;

service HelloService {
  rpc BidiHello(stream HelloRequest) returns (stream HelloResponse);
  rpc LotsOfGreetings(stream HelloRequest) returns (HelloResponse);
}

message HelloRequest {
  string greeting = 1;
}

message HelloResponse {
  string reply = 1;
}

info Hint The LotsOfGreetings method can be simply implemented with the @GrpcMethod decorator (as in the examples above) since the returned stream can emit multiple values.

Based on this .proto file, let's define the HelloService interface:

interface HelloService {
  bidiHello(upstream: Observable<HelloRequest>): Observable<HelloResponse>;
  lotsOfGreetings(
    upstream: Observable<HelloRequest>,
  ): Observable<HelloResponse>;
}

interface HelloRequest {
  greeting: string;
}

interface HelloResponse {
  reply: string;
}

Subject strategy

The @GrpcStreamMethod() decorator provides the function parameter as an RxJS Observable. Thus, we can receive and process multiple messages.

@GrpcStreamMethod()
bidiHello(messages: Observable<any>): Observable<any> {
  const subject = new Subject();

  const onNext = message => {
    console.log(message);
    subject.next({
      reply: 'Hello, world!'
    });
  };
  const onComplete = () => subject.complete();
  messages.subscribe(onNext, null, onComplete);

  return subject.asObservable();
}

info Hint For supporting full-duplex interaction with the @GrpcStreamMethod() decorator, the controller method must return an RxJS Observable.

According to the service definition (in the .proto file), the BidiHello method should stream requests to the service. To send multiple asynchronous messages to the stream from a client, we leverage the RxJS ReplaySubject class.

const helloService = this.client.getService<HelloService>('HelloService');
const helloRequest$ = new ReplaySubject<HelloRequest>();

helloRequest$.next({ greeting: 'Hello (1)!' });
helloRequest$.next({ greeting: 'Hello (2)!' });
helloRequest$.complete();

return helloService.bidiHello(helloRequest$);

In the example above, we wrote two messages to the stream (next() calls) and notified the service that we've completed sending the data (complete() call).

Call stream handler

When the method return value is defined as stream, the @GrpcStreamCall() decorator provides the function parameter as grpc.ServerDuplexStream, which supports standard methods like .on('data', callback), .write(message) or .cancel(). Full documentation on available methods can be found here.

Alternatively, when the method return value is not a stream, the @GrpcStreamCall() decorator provides two function parameters, respectively grpc.ServerReadableStream (read more here) and callback.

Let's start with implementing the BidiHello which should support a full-duplex interaction.

@GrpcStreamCall()
bidiHello(requestStream: any) {
  requestStream.on('data', message => {
    console.log(message);
    requestStream.write({
      reply: 'Hello, world!'
    });
  });
}

info Hint This decorator does not require any specific return parameter to be provided. It is expected that the stream will be handled similarly to any other standard stream type.

In the example above, we used the write() method to write objects to the response stream. The callback passed into the .on() method as a second parameter will be called every time our service receives a new chunk of data.

Let's implement the LotsOfGreetings method.

@GrpcStreamCall()
lotsOfGreetings(requestStream: any, callback: (err: unknown, value: HelloResponse) => void) {
  requestStream.on('data', message => {
    console.log(message);
  });
  requestStream.on('end', () => callback(null, { reply: 'Hello, world!' }));
}

Here we used the callback function to send the response once processing of the requestStream has been completed.

Guards

There is no fundamental difference between microservices guards and regular HTTP application guards. The only difference is that instead of throwing HttpException, you should use RpcException.

info Hint The RpcException class is exposed from @nestjs/microservices package.
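
The AuthGuard used in the binding example below is not defined in this excerpt. A minimal sketch of such a guard, assuming a hypothetical authToken field on the incoming message payload, might look like this:

import { CanActivate, ExecutionContext, Injectable } from '@nestjs/common';
import { RpcException } from '@nestjs/microservices';

@Injectable()
export class AuthGuard implements CanActivate {
  canActivate(context: ExecutionContext): boolean {
    // Read the data of the incoming RPC message.
    const data = context.switchToRpc().getData();

    // Hypothetical check: require an authToken field on the payload.
    if (!data || !data.authToken) {
      // In a microservice context, throw RpcException instead of HttpException.
      throw new RpcException('Forbidden');
    }
    return true;
  }
}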

Binding guards

The following example uses a method-scoped guard. Just as with HTTP based applications, you can also use controller-scoped guards (i.e., prefix the controller class with a @UseGuards() decorator).

@@filename()
@UseGuards(AuthGuard)
@MessagePattern({ cmd: 'sum' })
accumulate(data: number[]): number {
  return (data || []).reduce((a, b) => a + b);
}
@@switch
@UseGuards(AuthGuard)
@MessagePattern({ cmd: 'sum' })
accumulate(data) {
  return (data || []).reduce((a, b) => a + b);
}

Interceptors

There is no difference between regular interceptors and microservices interceptors. The following example uses a manually instantiated method-scoped interceptor. Just as with HTTP based applications, you can also use controller-scoped interceptors (i.e., prefix the controller class with a @UseInterceptors() decorator).
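
The TransformInterceptor used below is not defined in this excerpt. A minimal sketch of such an interceptor, assuming we simply want to wrap every handler result in a data envelope, might look like this:

import { CallHandler, ExecutionContext, Injectable, NestInterceptor } from '@nestjs/common';
import { Observable } from 'rxjs';
import { map } from 'rxjs/operators';

@Injectable()
export class TransformInterceptor implements NestInterceptor {
  intercept(context: ExecutionContext, next: CallHandler): Observable<any> {
    // Wrap whatever the handler returns into a { data: ... } envelope.
    return next.handle().pipe(map(value => ({ data: value })));
  }
}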

@@filename()
@UseInterceptors(new TransformInterceptor())
@MessagePattern({ cmd: 'sum' })
accumulate(data: number[]): number {
  return (data || []).reduce((a, b) => a + b);
}
@@switch
@UseInterceptors(new TransformInterceptor())
@MessagePattern({ cmd: 'sum' })
accumulate(data) {
  return (data || []).reduce((a, b) => a + b);
}

Kafka

Kafka is an open source, distributed streaming platform which has three key capabilities:

  • Publish and subscribe to streams of records, similar to a message queue or enterprise messaging system.
  • Store streams of records in a fault-tolerant durable way.
  • Process streams of records as they occur.

The Kafka project aims to provide a unified, high-throughput, low-latency platform for handling real-time data feeds. It integrates very well with Apache Storm and Spark for real-time streaming data analysis.

Kafka transporter is experimental.

Installation

To start building Kafka-based microservices, first install the required package:

$ npm i --save kafkajs

Overview

Like other Nest microservice transport layer implementations, you select the Kafka transporter mechanism using the transport property of the options object passed to the createMicroservice() method, along with an optional options property, as shown below:

@@filename(main)
const app = await NestFactory.createMicroservice<MicroserviceOptions>(ApplicationModule, {
  transport: Transport.KAFKA,
  options: {
    client: {
      brokers: ['localhost:9092'],
    }
  }
});
@@switch
const app = await NestFactory.createMicroservice(ApplicationModule, {
  transport: Transport.KAFKA,
  options: {
    client: {
      brokers: ['localhost:9092'],
    }
  }
});

info Hint The Transport enum is imported from the @nestjs/microservices package.

Options

The options property is specific to the chosen transporter. The Kafka transporter exposes the properties described below.

client Client configuration options (read more here)
consumer Consumer configuration options (read more here)
run Run configuration options (read more here)
subscribe Subscribe configuration options (read more here)
producer Producer configuration options (read more here)
send Send configuration options (read more here)

Client

There is a small difference in Kafka compared to other microservice transporters. Instead of the ClientProxy class, we use the ClientKafka class.

Like other microservice transporters, you have several options for creating a ClientKafka instance.

One method for creating an instance is to use the ClientsModule. To create a client instance with the ClientsModule, import it and use the register() method to pass an options object with the same properties shown above in the createMicroservice() method, as well as a name property to be used as the injection token. Read more about ClientsModule here.

@Module({
  imports: [
    ClientsModule.register([
      {
        name: 'HERO_SERVICE',
        transport: Transport.KAFKA,
        options: {
          client: {
            clientId: 'hero',
            brokers: ['localhost:9092'],
          },
          consumer: {
            groupId: 'hero-consumer'
          }
        }
      },
    ]),
  ]
  ...
})

Other options to create a client (either ClientProxyFactory or @Client()) can be used as well. You can read about them here.

Use the @Client() decorator as follows:

@Client({
  transport: Transport.KAFKA,
  options: {
    client: {
      clientId: 'hero',
      brokers: ['localhost:9092'],
    },
    consumer: {
      groupId: 'hero-consumer'
    }
  }
})
client: ClientKafka;

Message response subscription

The ClientKafka class provides the subscribeToResponseOf() method. The subscribeToResponseOf() method takes a request's topic name as an argument and adds the derived reply topic name to a collection of reply topics. This method is required when implementing the message pattern.

@@filename(heroes.controller)
onModuleInit() {
  this.client.subscribeToResponseOf('hero.kill.dragon');
}

If the ClientKafka instance is created asynchronously, the subscribeToResponseOf() method must be called before calling the connect() method.

@@filename(heroes.controller)
async onModuleInit() {
  this.client.subscribeToResponseOf('hero.kill.dragon');
  await this.client.connect();
}

Message pattern

The Kafka microservice message pattern utilizes two topics for the request and reply channels. The ClientKafka#send() method sends messages with a return address by associating a correlation id, reply topic, and reply partition with the request message. This requires the ClientKafka instance to be subscribed to the reply topic and assigned to at least one partition before sending a message.

Consequently, you need to have at least one reply topic partition for every Nest application running. For example, if you are running 4 Nest applications but the reply topic only has 3 partitions, then 1 of the Nest applications will error out when trying to send a message.

When new ClientKafka instances are launched they join the consumer group and subscribe to their respective topics. This process triggers a rebalance of topic partitions assigned to consumers of the consumer group.

Normally, topic partitions are assigned using the round robin partitioner, which assigns topic partitions to a collection of consumers sorted by consumer names which are randomly set on application launch. However, when a new consumer joins the consumer group, the new consumer can be positioned anywhere within the collection of consumers. This creates a condition where pre-existing consumers can be assigned different partitions when the pre-existing consumer is positioned after the new consumer. As a result, the consumers that are assigned different partitions will lose response messages of requests sent before the rebalance.

To prevent the ClientKafka consumers from losing response messages, a Nest-specific built-in custom partitioner is utilized. This custom partitioner assigns partitions to a collection of consumers sorted by high-resolution timestamps (process.hrtime()) that are set on application launch.
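
To make the request-response flow concrete, the following is a hypothetical sketch of the client side, assuming the 'HERO_SERVICE' client registered earlier and the 'hero.kill.dragon' topic used throughout this chapter:

import { Inject, Injectable, OnModuleInit } from '@nestjs/common';
import { ClientKafka } from '@nestjs/microservices';
import { Observable } from 'rxjs';

@Injectable()
export class HeroesService implements OnModuleInit {
  constructor(@Inject('HERO_SERVICE') private readonly client: ClientKafka) {}

  async onModuleInit() {
    // Subscribe to the derived reply topic before connecting (see above).
    this.client.subscribeToResponseOf('hero.kill.dragon');
    await this.client.connect();
  }

  killDragon(heroId: string, dragonId: string): Observable<any> {
    // send() publishes the request and resolves with the message
    // received on the derived 'hero.kill.dragon.reply' topic.
    return this.client.send('hero.kill.dragon', { heroId, dragonId });
  }
}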

Incoming

Nest receives incoming Kafka messages as an object with key, value, and headers properties that have values of type Buffer. Nest then parses these values by transforming the buffers into strings. If the string is "object like", Nest attempts to parse the string as JSON. The value is then passed to its associated handler.

Outgoing

Nest sends outgoing Kafka messages after a serialization process when publishing events or sending messages. This occurs on arguments passed to the ClientKafka emit() and send() methods or on values returned from a @MessagePattern method. This serialization "stringifies" objects that are not strings or buffers by using JSON.stringify() or the toString() prototype method.

@@filename(heroes.controller)
@Controller()
export class HeroesController {
  @MessagePattern('hero.kill.dragon')
  killDragon(@Payload() message: KillDragonMessage): any {
    const dragonId = message.dragonId;
    const items = [
      { id: 1, name: 'Mythical Sword' },
      { id: 2, name: 'Key to Dungeon' },
    ];
    return items;
  }
}

info Hint @Payload() is imported from the @nestjs/microservices package.

Outgoing messages can also be keyed by passing an object with the key and value properties. Keying messages is important for meeting the co-partitioning requirement.

@@filename(heroes.controller)
@Controller()
export class HeroesController {
  @MessagePattern('hero.kill.dragon')
  killDragon(@Payload() message: KillDragonMessage): any {
    const realm = 'Nest';
    const heroId = message.heroId;
    const dragonId = message.dragonId;

    const items = [
      { id: 1, name: 'Mythical Sword' },
      { id: 2, name: 'Key to Dungeon' },
    ];

    return {
      headers: {
        realm
      },
      key: heroId,
      value: items
    }
  }
}

Additionally, messages passed in this format can also contain custom headers set in the headers hash property. Header hash property values must be either of type string or type Buffer.

@@filename(heroes.controller)
@Controller()
export class HeroesController {
  @MessagePattern('hero.kill.dragon')
  killDragon(@Payload() message: KillDragonMessage): any {
    const realm = 'Nest';
    const heroId = message.heroId;
    const dragonId = message.dragonId;

    const items = [
      { id: 1, name: 'Mythical Sword' },
      { id: 2, name: 'Key to Dungeon' },
    ];

    return {
      headers: {
        kafka_nestRealm: realm
      },
      key: heroId,
      value: items
    }
  }
}

Context

In more sophisticated scenarios, you may want to access more information about the incoming request. When using the Kafka transporter, you can access the KafkaContext object.

@@filename()
@MessagePattern('hero.kill.dragon')
killDragon(@Payload() message: KillDragonMessage, @Ctx() context: KafkaContext) {
  console.log(`Topic: ${context.getTopic()}`);
}
@@switch
@Bind(Payload(), Ctx())
@MessagePattern('hero.kill.dragon')
killDragon(message, context) {
  console.log(`Topic: ${context.getTopic()}`);
}

info Hint @Payload(), @Ctx() and KafkaContext are imported from the @nestjs/microservices package.

To access the original Kafka IncomingMessage object, use the getMessage() method of the KafkaContext object, as follows:

@@filename()
@MessagePattern('hero.kill.dragon')
killDragon(@Payload() message: KillDragonMessage, @Ctx() context: KafkaContext) {
  const originalMessage = context.getMessage();
  const { headers, partition, timestamp } = originalMessage;
}
@@switch
@Bind(Payload(), Ctx())
@MessagePattern('hero.kill.dragon')
killDragon(message, context) {
  const originalMessage = context.getMessage();
  const { headers, partition, timestamp } = originalMessage;
}

Where the IncomingMessage fulfills the following interface:

interface IncomingMessage {
  topic: string;
  partition: number;
  timestamp: string;
  size: number;
  attributes: number;
  offset: string;
  key: any;
  value: any;
  headers: Record<string, any>;
}

Naming conventions

The Kafka microservice components append a description of their respective role onto the client.clientId and consumer.groupId options to prevent collisions between Nest microservice client and server components. By default, the ClientKafka components append -client and the ServerKafka components append -server to both of these options. Note how the provided values below are transformed in that way (as shown in the comments).

@@filename(main)
const app = await NestFactory.createMicroservice(ApplicationModule, {
  transport: Transport.KAFKA,
  options: {
    client: {
      clientId: 'hero', // hero-server
      brokers: ['localhost:9092'],
    },
    consumer: {
      groupId: 'hero-consumer' // hero-consumer-server
    },
  }
});

And for the client:

@@filename(heroes.controller)
@Client({
  transport: Transport.KAFKA,
  options: {
    client: {
      clientId: 'hero', // hero-client
      brokers: ['localhost:9092'],
    },
    consumer: {
      groupId: 'hero-consumer' // hero-consumer-client
    }
  }
})
client: ClientKafka;

info Hint Kafka client and consumer naming conventions can be customized by extending ClientKafka and ServerKafka in your own custom provider and overriding the constructor.

Since the Kafka microservice message pattern utilizes two topics for the request and reply channels, a reply pattern should be derived from the request topic. By default, the name of the reply topic is the composite of the request topic name with .reply appended.

@@filename(heroes.controller)
onModuleInit() {
  this.client.subscribeToResponseOf('hero.get'); // hero.get.reply
}

info Hint Kafka reply topic naming conventions can be customized by extending ClientKafka in your own custom provider and overriding the getResponsePatternName method.

MQTT

MQTT (Message Queuing Telemetry Transport) is an open source, lightweight messaging protocol, optimized for high-latency networks. This protocol provides a scalable and cost-efficient way to connect devices using a publish/subscribe model. A communication system built on MQTT consists of the publishing server, a broker and one or more clients. It is designed for constrained devices and low-bandwidth, high-latency or unreliable networks.

Installation

To start building MQTT-based microservices, first install the required package:

$ npm i --save mqtt

Overview

To use the MQTT transporter, pass the following options object to the createMicroservice() method:

@@filename(main)
const app = await NestFactory.createMicroservice<MicroserviceOptions>(ApplicationModule, {
  transport: Transport.MQTT,
  options: {
    url: 'mqtt://localhost:1883',
  },
});
@@switch
const app = await NestFactory.createMicroservice(ApplicationModule, {
  transport: Transport.MQTT,
  options: {
    url: 'mqtt://localhost:1883',
  },
});

info Hint The Transport enum is imported from the @nestjs/microservices package.

Options

The options object is specific to the chosen transporter. The MQTT transporter exposes the properties described here.

Client

Like other microservice transporters, you have several options for creating a MQTT ClientProxy instance.

One method for creating an instance is to use the ClientsModule. To create a client instance with the ClientsModule, import it and use the register() method to pass an options object with the same properties shown above in the createMicroservice() method, as well as a name property to be used as the injection token. Read more about ClientsModule here.

@Module({
  imports: [
    ClientsModule.register([
      {
        name: 'MATH_SERVICE',
        transport: Transport.MQTT,
        options: {
          url: 'mqtt://localhost:1883',
        }
      },
    ]),
  ]
  ...
})

Other options to create a client (either ClientProxyFactory or @Client()) can be used as well. You can read about them here.

Context

In more sophisticated scenarios, you may want to access more information about the incoming request. When using the MQTT transporter, you can access the MqttContext object.

@@filename()
@MessagePattern('notifications')
getNotifications(@Payload() data: number[], @Ctx() context: MqttContext) {
  console.log(`Topic: ${context.getTopic()}`);
}
@@switch
@Bind(Payload(), Ctx())
@MessagePattern('notifications')
getNotifications(data, context) {
  console.log(`Topic: ${context.getTopic()}`);
}

info Hint @Payload(), @Ctx() and MqttContext are imported from the @nestjs/microservices package.

To access the original mqtt packet, use the getPacket() method of the MqttContext object, as follows:

@@filename()
@MessagePattern('notifications')
getNotifications(@Payload() data: number[], @Ctx() context: MqttContext) {
  console.log(context.getPacket());
}
@@switch
@Bind(Payload(), Ctx())
@MessagePattern('notifications')
getNotifications(data, context) {
  console.log(context.getPacket());
}

Wildcards

A subscription may be to an explicit topic, or it may include wildcards. Two wildcards are available, + and #. + is a single-level wildcard, while # is a multi-level wildcard which covers many topic levels.

@@filename()
@MessagePattern('sensors/+/temperature/+')
getTemperature(@Ctx() context: MqttContext) {
  console.log(`Topic: ${context.getTopic()}`);
}
@@switch
@Bind(Ctx())
@MessagePattern('sensors/+/temperature/+')
getTemperature(context) {
  console.log(`Topic: ${context.getTopic()}`);
}

NATS

NATS is a simple, secure and high performance open source messaging system for cloud native applications, IoT messaging, and microservices architectures. The NATS server is written in the Go programming language, but client libraries to interact with the server are available for dozens of major programming languages. NATS supports both At Most Once and At Least Once delivery. It can run anywhere, from large servers and cloud instances, through edge gateways and even Internet of Things devices.

Installation

To start building NATS-based microservices, first install the required package:

$ npm i --save nats

Overview

To use the NATS transporter, pass the following options object to the createMicroservice() method:

@@filename(main)
const app = await NestFactory.createMicroservice<MicroserviceOptions>(ApplicationModule, {
  transport: Transport.NATS,
  options: {
    url: 'nats://localhost:4222',
  },
});
@@switch
const app = await NestFactory.createMicroservice(ApplicationModule, {
  transport: Transport.NATS,
  options: {
    url: 'nats://localhost:4222',
  },
});

info Hint The Transport enum is imported from the @nestjs/microservices package.

Options

The options object is specific to the chosen transporter. The NATS transporter exposes the properties described here. Additionally, there is a queue property which allows you to specify the name of the queue that your server should subscribe to (leave undefined to ignore this setting). Read more about NATS queue groups below.

Client

Like other microservice transporters, you have several options for creating a NATS ClientProxy instance.

One method for creating an instance is to use the ClientsModule. To create a client instance with the ClientsModule, import it and use the register() method to pass an options object with the same properties shown above in the createMicroservice() method, as well as a name property to be used as the injection token. Read more about ClientsModule here.

@Module({
  imports: [
    ClientsModule.register([
      {
        name: 'MATH_SERVICE',
        transport: Transport.NATS,
        options: {
          url: 'nats://localhost:4222',
        }
      },
    ]),
  ]
  ...
})

Other options to create a client (either ClientProxyFactory or @Client()) can be used as well. You can read about them here.

Request-response

For the request-response message style (read more), the NATS transporter uses NATS' built-in Request-Reply mechanism. A request is published on a given subject with a reply subject, and responders listen on that subject and send responses to the reply subject. Reply subjects are typically dynamically generated subjects (called an _INBOX) that are routed back to the requestor, regardless of the location of either party.

Event-based

For the event-based message style (read more), the NATS transporter uses NATS built-in Publish-Subscribe mechanism. A publisher sends a message on a subject and any active subscriber listening on that subject receives the message. Subscribers can also register interest in wildcard subjects that work a bit like a regular expression. This one-to-many pattern is sometimes called fan-out.
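
To illustrate the two styles side by side, here is a minimal, hypothetical sketch using the 'MATH_SERVICE' client registered above (the class and method names are assumptions):

import { Inject, Injectable } from '@nestjs/common';
import { ClientProxy } from '@nestjs/microservices';
import { Observable } from 'rxjs';

@Injectable()
export class MathService {
  constructor(@Inject('MATH_SERVICE') private readonly client: ClientProxy) {}

  sum(numbers: number[]): Observable<number> {
    // Request-response: published on the subject; the reply comes back via an _INBOX subject.
    return this.client.send<number>({ cmd: 'sum' }, numbers);
  }

  notify(payload: Record<string, any>): void {
    // Event-based: fire-and-forget publish, fanned out to all interested subscribers.
    this.client.emit('notifications', payload);
  }
}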

Queue groups

NATS provides a built-in load balancing feature called distributed queues. To create a queue subscription, use the queue property as follows:

@@filename(main)
const app = await NestFactory.createMicroservice(ApplicationModule, {
  transport: Transport.NATS,
  options: {
    url: 'nats://localhost:4222',
    queue: 'cats_queue',
  },
});

Context

In more sophisticated scenarios, you may want to access more information about the incoming request. When using the NATS transporter, you can access the NatsContext object.

@@filename()
@MessagePattern('notifications')
getNotifications(@Payload() data: number[], @Ctx() context: NatsContext) {
  console.log(`Subject: ${context.getSubject()}`);
}
@@switch
@Bind(Payload(), Ctx())
@MessagePattern('notifications')
getNotifications(data, context) {
  console.log(`Subject: ${context.getSubject()}`);
}

info Hint @Payload(), @Ctx() and NatsContext are imported from the @nestjs/microservices package.

Wildcards

A subscription may be to an explicit subject, or it may include wildcards.

@@filename()
@MessagePattern('time.us.*')
getDate(@Payload() data: number[], @Ctx() context: NatsContext) {
  console.log(`Subject: ${context.getSubject()}`); // e.g. "time.us.east"
  return new Date().toLocaleTimeString(...);
}
@@switch
@Bind(Payload(), Ctx())
@MessagePattern('time.us.*')
getDate(data, context) {
  console.log(`Subject: ${context.getSubject()}`); // e.g. "time.us.east"
  return new Date().toLocaleTimeString(...);
}

Pipes

There is no fundamental difference between regular pipes and microservices pipes. The only difference is that instead of throwing HttpException, you should use RpcException.

info Hint The RpcException class is exposed from @nestjs/microservices package.
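
As an illustration, a hand-rolled pipe that validates the number[] payload used by the sum handler below might look like the following sketch (the pipe class name is hypothetical):

import { ArgumentMetadata, Injectable, PipeTransform } from '@nestjs/common';
import { RpcException } from '@nestjs/microservices';

@Injectable()
export class NumberArrayPipe implements PipeTransform<unknown, number[]> {
  transform(value: unknown, metadata: ArgumentMetadata): number[] {
    // Reject anything that is not an array of numbers.
    if (!Array.isArray(value) || value.some(item => typeof item !== 'number')) {
      // In a microservice context, throw RpcException instead of HttpException.
      throw new RpcException('Validation failed: expected an array of numbers');
    }
    return value;
  }
}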

Binding pipes

The following example uses a manually instantiated method-scoped pipe. Just as with HTTP based applications, you can also use controller-scoped pipes (i.e., prefix the controller class with a @UsePipes() decorator).

@@filename()
@UsePipes(new ValidationPipe())
@MessagePattern({ cmd: 'sum' })
accumulate(data: number[]): number {
  return (data || []).reduce((a, b) => a + b);
}
@@switch
@UsePipes(new ValidationPipe())
@MessagePattern({ cmd: 'sum' })
accumulate(data) {
  return (data || []).reduce((a, b) => a + b);
}

RabbitMQ

RabbitMQ is an open-source and lightweight message broker which supports multiple messaging protocols. It can be deployed in distributed and federated configurations to meet high-scale, high-availability requirements. In addition, it's the most widely deployed message broker, used worldwide at small startups and large enterprises.

Installation

To start building RabbitMQ-based microservices, first install the required packages:

$ npm i --save amqplib amqp-connection-manager

Overview

To use the RabbitMQ transporter, pass the following options object to the createMicroservice() method:

@@filename(main)
const app = await NestFactory.createMicroservice<MicroserviceOptions>(ApplicationModule, {
  transport: Transport.RMQ,
  options: {
    urls: ['amqp://localhost:5672'],
    queue: 'cats_queue',
    queueOptions: {
      durable: false
    },
  },
});
@@switch
const app = await NestFactory.createMicroservice(ApplicationModule, {
  transport: Transport.RMQ,
  options: {
    urls: ['amqp://localhost:5672'],
    queue: 'cats_queue',
    queueOptions: {
      durable: false
    },
  },
});

info Hint The Transport enum is imported from the @nestjs/microservices package.

Options

The options property is specific to the chosen transporter. The RabbitMQ transporter exposes the properties described below.

urls Connection urls
queue Queue name which your server will listen to
prefetchCount Sets the prefetch count for the channel
isGlobalPrefetchCount Enables per channel prefetching
noAck If false, manual acknowledgment mode enabled
queueOptions Additional queue options (read more here)
socketOptions Additional socket options (read more here)

Client

Like other microservice transporters, you have several options for creating a RabbitMQ ClientProxy instance.

One method for creating an instance is to use the ClientsModule. To create a client instance with the ClientsModule, import it and use the register() method to pass an options object with the same properties shown above in the createMicroservice() method, as well as a name property to be used as the injection token. Read more about ClientsModule here.

@Module({
  imports: [
    ClientsModule.register([
      {
        name: 'MATH_SERVICE',
        transport: Transport.RMQ,
        options: {
          urls: ['amqp://localhost:5672'],
          queue: 'cats_queue',
          queueOptions: {
            durable: false
          },
        },
      },
    ]),
  ]
  ...
})

Other options to create a client (either ClientProxyFactory or @Client()) can be used as well. You can read about them here.

Context

In more sophisticated scenarios, you may want to access more information about the incoming request. When using the RabbitMQ transporter, you can access the RmqContext object.

@@filename()
@MessagePattern('notifications')
getNotifications(@Payload() data: number[], @Ctx() context: RmqContext) {
  console.log(`Pattern: ${context.getPattern()}`);
}
@@switch
@Bind(Payload(), Ctx())
@MessagePattern('notifications')
getNotifications(data, context) {
  console.log(`Pattern: ${context.getPattern()}`);
}

info Hint @Payload(), @Ctx() and RmqContext are imported from the @nestjs/microservices package.

To access the original RabbitMQ message (with the properties, fields, and content), use the getMessage() method of the RmqContext object, as follows:

@@filename()
@MessagePattern('notifications')
getNotifications(@Payload() data: number[], @Ctx() context: RmqContext) {
  console.log(context.getMessage());
}
@@switch
@Bind(Payload(), Ctx())
@MessagePattern('notifications')
getNotifications(data, context) {
  console.log(context.getMessage());
}

To retrieve a reference to the RabbitMQ channel, use the getChannelRef method of the RmqContext object, as follows:

@@filename()
@MessagePattern('notifications')
getNotifications(@Payload() data: number[], @Ctx() context: RmqContext) {
  console.log(context.getChannelRef());
}
@@switch
@Bind(Payload(), Ctx())
@MessagePattern('notifications')
getNotifications(data, context) {
  console.log(context.getChannelRef());
}

Message acknowledgement

To make sure a message is never lost, RabbitMQ supports message acknowledgements. An acknowledgement is sent back by the consumer to tell RabbitMQ that a particular message has been received, processed and that RabbitMQ is free to delete it. If a consumer dies (its channel is closed, connection is closed, or TCP connection is lost) without sending an ack, RabbitMQ will understand that a message wasn't processed fully and will re-queue it.

To enable manual acknowledgment mode, set the noAck property to false:

options: {
  urls: ['amqp://localhost:5672'],
  queue: 'cats_queue',
  noAck: false,
  queueOptions: {
    durable: false
  },
},

When manual consumer acknowledgements are turned on, we must send a proper acknowledgement from the worker to signal that we are done with a task.

@@filename()
@MessagePattern('notifications')
getNotifications(@Payload() data: number[], @Ctx() context: RmqContext) {
  const channel = context.getChannelRef();
  const originalMsg = context.getMessage();

  channel.ack(originalMsg);
}
@@switch
@Bind(Payload(), Ctx())
@MessagePattern('notifications')
getNotifications(data, context) {
  const channel = context.getChannelRef();
  const originalMsg = context.getMessage();

  channel.ack(originalMsg);
}

Redis

The Redis transporter implements the publish/subscribe messaging paradigm and leverages the Pub/Sub feature of Redis. Published messages are categorized in channels, without knowing what subscribers (if any) will eventually receive the message. Each microservice can subscribe to any number of channels. In addition, more than one channel can be subscribed to at a time. Messages exchanged through channels are fire-and-forget, which means that if a message is published and there are no subscribers interested in it, the message is removed and cannot be recovered. Thus, you don't have a guarantee that either messages or events will be handled by at least one service. A single message can be subscribed to (and received) by multiple subscribers.

Installation

To start building Redis-based microservices, first install the required package:

$ npm i --save redis

Overview

To use the Redis transporter, pass the following options object to the createMicroservice() method:

@@filename(main)
const app = await NestFactory.createMicroservice<MicroserviceOptions>(ApplicationModule, {
  transport: Transport.REDIS,
  options: {
    url: 'redis://localhost:6379',
  },
});
@@switch
const app = await NestFactory.createMicroservice(ApplicationModule, {
  transport: Transport.REDIS,
  options: {
    url: 'redis://localhost:6379',
  },
});

info Hint The Transport enum is imported from the @nestjs/microservices package.

Options

The options property is specific to the chosen transporter. The Redis transporter exposes the properties described below.

url Connection url
retryAttempts Number of times to retry message (default: 0)
retryDelay Delay between message retry attempts (ms) (default: 0)

Client

Like other microservice transporters, you have several options for creating a Redis ClientProxy instance.

One method for creating an instance is to use the ClientsModule. To create a client instance with the ClientsModule, import it and use the register() method to pass an options object with the same properties shown above in the createMicroservice() method, as well as a name property to be used as the injection token. Read more about ClientsModule here.

@Module({
  imports: [
    ClientsModule.register([
      {
        name: 'MATH_SERVICE',
        transport: Transport.REDIS,
        options: {
          url: 'redis://localhost:6379',
        }
      },
    ]),
  ]
  ...
})

Other options to create a client (either ClientProxyFactory or @Client()) can be used as well. You can read about them here.

Context

In more sophisticated scenarios, you may want to access more information about the incoming request. When using the Redis transporter, you can access the RedisContext object.

@@filename()
@MessagePattern('notifications')
getNotifications(@Payload() data: number[], @Ctx() context: RedisContext) {
  console.log(`Channel: ${context.getChannel()}`);
}
@@switch
@Bind(Payload(), Ctx())
@MessagePattern('notifications')
getNotifications(data, context) {
  console.log(`Channel: ${context.getChannel()}`);
}

info Hint @Payload(), @Ctx() and RedisContext are imported from the @nestjs/microservices package.

CLI Plugin

TypeScript's metadata reflection system has several limitations which make it impossible to, for instance, determine what properties a class consists of or recognize whether a given property is optional or required. However, some of these constraints can be addressed at compilation time. Nest provides a plugin that enhances the TypeScript compilation process to reduce the amount of boilerplate code required.

warning Hint This plugin is opt-in. If you prefer, you can declare all decorators manually, or only specific decorators where you need them.

Overview

The Swagger plugin will automatically:

  • annotate all DTO properties with @ApiProperty unless @ApiHideProperty is used
  • set the required property depending on the question mark (e.g. name?: string will set required: false)
  • set the type or enum property depending on the type (supports arrays as well)
  • set the default property based on the assigned default value
  • set several validation rules based on class-validator decorators (if classValidatorShim is set to true)
  • add a response decorator to every endpoint with a proper status and type (response model)

Please note that your filenames must have one of the following suffixes: ['.dto.ts', '.entity.ts'] (e.g., create-user.dto.ts) in order to be analyzed by the plugin.

If you are using a different suffix, you can adjust the plugin's behavior by specifying the dtoFileNameSuffix option (see below).
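
For instance, assuming a hypothetical .input.ts suffix, the plugin could be configured roughly as follows in nest-cli.json (the full options structure is described below):

"plugins": [
  {
    "name": "@nestjs/swagger/plugin",
    "options": {
      "dtoFileNameSuffix": [".input.ts", ".dto.ts"]
    }
  }
]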

Previously, if you wanted to provide an interactive experience with the Swagger UI, you had to duplicate a lot of code to let the package know how your models/components should be declared in the specification. For example, you could define a simple CreateUserDto class as follows:

export class CreateUserDto {
  @ApiProperty()
  email: string;

  @ApiProperty()
  password: string;

  @ApiProperty({ enum: RoleEnum, default: [], isArray: true })
  roles: RoleEnum[] = [];

  @ApiProperty({ required: false, default: true })
  isEnabled?: boolean = true;
}

While not a significant issue with medium-sized projects, it becomes verbose and hard to maintain once you have a large set of classes.

By enabling the Swagger plugin, the above class definition can be declared simply:

export class CreateUserDto {
  email: string;
  password: string;
  roles: RoleEnum[] = [];
  isEnabled?: boolean = true;
}

The plugin adds appropriate decorators on the fly based on the Abstract Syntax Tree. Thus you won't have to struggle with @ApiProperty decorators scattered throughout the code.

warning Hint The plugin will automatically generate any missing swagger properties, but if you need to override them, you simply set them explicitly via @ApiProperty().

Using the CLI plugin

To enable the plugin, open nest-cli.json (if you use Nest CLI) and add the following plugins configuration:

{
  "collection": "@nestjs/schematics",
  "sourceRoot": "src",
  "compilerOptions": {
    "plugins": ["@nestjs/swagger/plugin"]
  }
}

You can use the options property to customize the behavior of the plugin.

"plugins": [
  {
    "name": "@nestjs/swagger/plugin",
    "options": {
      "classValidatorShim": false
    }
  }
]

The options property has to fulfill the following interface:

export interface PluginOptions {
  dtoFileNameSuffix?: string[];
  controllerFileNameSuffix?: string[];
  classValidatorShim?: boolean;
}
Option Default Description
dtoFileNameSuffix ['.dto.ts', '.entity.ts'] DTO (Data Transfer Object) files suffix
controllerFileNameSuffix .controller.ts Controller files suffix
classValidatorShim true If set to true, the module will reuse class-validator validation decorators (e.g. @Max(10) will add max: 10 to schema definition)

If you don't use the CLI but instead have a custom webpack configuration, you can use this plugin in combination with ts-loader:

getCustomTransformers: (program: any) => ({
  before: [require('@nestjs/swagger/plugin').before({}, program)]
}),
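
The fragment above belongs in the ts-loader options of your webpack configuration. A rough, hypothetical sketch of where it fits:

// webpack.config.js
module.exports = {
  module: {
    rules: [
      {
        test: /\.ts$/,
        use: {
          loader: 'ts-loader',
          options: {
            // Register the @nestjs/swagger plugin as a custom TypeScript transformer.
            getCustomTransformers: (program) => ({
              before: [require('@nestjs/swagger/plugin').before({}, program)],
            }),
          },
        },
      },
    ],
  },
};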

Decorators

All of the available OpenAPI decorators have an Api prefix to distinguish them from the core decorators. Below is a full list of the exported decorators along with a designation of the level at which the decorator may be applied.

@ApiOperation() Method
@ApiResponse() Method / Controller
@ApiProduces() Method / Controller
@ApiConsumes() Method / Controller
@ApiBearerAuth() Method / Controller
@ApiOAuth2() Method / Controller
@ApiBasicAuth() Method / Controller
@ApiSecurity() Method / Controller
@ApiExtraModels() Method / Controller
@ApiBody() Method
@ApiParam() Method
@ApiQuery() Method
@ApiHeader() Method / Controller
@ApiExcludeEndpoint() Method
@ApiTags() Method / Controller
@ApiProperty() Model
@ApiPropertyOptional() Model
@ApiHideProperty() Model
@ApiExtension() Method

Introduction

The OpenAPI specification is a language-agnostic definition format used to describe RESTful APIs. Nest provides a dedicated module which allows generating such a specification by leveraging decorators.

Installation

To begin using it, we first install the required dependencies.

$ npm install --save @nestjs/swagger swagger-ui-express

If you use fastify, install fastify-swagger instead of swagger-ui-express:

$ npm install --save @nestjs/swagger fastify-swagger

Bootstrap

Once the installation process is complete, open the main.ts file and initialize Swagger using the SwaggerModule class:

import { NestFactory } from '@nestjs/core';
import { SwaggerModule, DocumentBuilder } from '@nestjs/swagger';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);

  const options = new DocumentBuilder()
    .setTitle('Cats example')
    .setDescription('The cats API description')
    .setVersion('1.0')
    .addTag('cats')
    .build();
  const document = SwaggerModule.createDocument(app, options);
  SwaggerModule.setup('api', app, document);

  await app.listen(3000);
}
bootstrap();

The DocumentBuilder helps to structure a base document that conforms to the OpenAPI Specification. It provides several methods that allow setting such properties as title, description, version, etc. In order to create a full document (with all HTTP routes defined) we use the createDocument() method of the SwaggerModule class. This method takes two arguments, an application instance and a Swagger options object.

Once we create a document, we can call the setup() method. It accepts:

  1. the path to mount the Swagger UI
  2. an application instance
  3. the document object instantiated above

Now you can run the following command to start the HTTP server:

$ npm run start

While the application is running, open your browser and navigate to http://localhost:3000/api. You should see the Swagger UI.

The SwaggerModule automatically reflects all of your endpoints. Note that the Swagger UI is created using either swagger-ui-express or fastify-swagger, depending on the platform.

info Hint To generate and download a Swagger JSON file, navigate to http://localhost:3000/api-json in your browser (assuming that your Swagger documentation is available under http://localhost:3000/api).

Example

A working example is available here.

Mapped types

As you build out features like CRUD (Create/Read/Update/Delete) it's often useful to construct variants on a base entity type. Nest provides several utility functions that perform type transformations to make this task more convenient.

Partial

When building input validation types (also called DTOs), it's often useful to build create and update variations on the same type. For example, the create variant may require all fields, while the update variant may make all fields optional.

Nest provides the PartialType() utility function to make this task easier and minimize boilerplate.

The PartialType() function returns a type (class) with all the properties of the input type set to optional. For example, suppose we have a create type as follows:

import { ApiProperty } from '@nestjs/swagger';

export class CreateCatDto {
  @ApiProperty()
  name: string;

  @ApiProperty()
  age: number;

  @ApiProperty()
  breed: string;
}

By default, all of these fields are required. To create a type with the same fields, but with each one optional, use PartialType() passing the class reference (CreateCatDto) as an argument:

export class UpdateCatDto extends PartialType(CreateCatDto) {}

info Hint The PartialType() function is imported from the @nestjs/swagger package.

Pick

The PickType() function constructs a new type (class) by picking a set of properties from an input type. For example, suppose we start with a type like:

import { ApiProperty } from '@nestjs/swagger';

export class CreateCatDto {
  @ApiProperty()
  name: string;

  @ApiProperty()
  age: number;

  @ApiProperty()
  breed: string;
}

We can pick a set of properties from this class using the PickType() utility function:

export class UpdateCatAgeDto extends PickType(CreateCatDto, ['age'] as const) {}

info Hint The PickType() function is imported from the @nestjs/swagger package.

Omit

The OmitType() function constructs a type by picking all properties from an input type and then removing a particular set of keys. For example, suppose we start with a type like:

import { ApiProperty } from '@nestjs/swagger';

export class CreateCatDto {
  @ApiProperty()
  name: string;

  @ApiProperty()
  age: number;

  @ApiProperty()
  breed: string;
}

We can generate a derived type that has every property except name as shown below. In this construct, the second argument to OmitType is an array of property names.

export class UpdateCatDto extends OmitType(CreateCatDto, ['name'] as const) {}

info Hint The OmitType() function is imported from the @nestjs/swagger package.

Intersection

The IntersectionType() function combines two types into one new type (class). For example, suppose we start with two types like:

import { ApiProperty } from '@nestjs/swagger';

export class CreateCatDto {
  @ApiProperty()
  name: string;

  @ApiProperty()
  breed: string;
}

export class AdditionalCatInfo {
  @ApiProperty()
  color: string;
}

We can generate a new type that combines all properties in both types.

export class UpdateCatDto extends IntersectionType(
  CreateCatDto,
  AdditionalCatInfo,
) {}

info Hint The IntersectionType() function is imported from the @nestjs/swagger package.

Composition

The type mapping utility functions are composable. For example, the following will produce a type (class) that has all of the properties of the CreateCatDto type except for name, and those properties will be set to optional:

export class UpdateCatDto extends PartialType(
  OmitType(CreateCatDto, ['name'] as const),
) {}

Migration guide

If you're currently using @nestjs/swagger@3.*, note the following breaking/API changes in version 4.0.

Breaking changes

The following decorators have been changed/renamed:

  • @ApiModelProperty is now @ApiProperty
  • @ApiModelPropertyOptional is now @ApiPropertyOptional
  • @ApiResponseModelProperty is now @ApiResponseProperty
  • @ApiImplicitQuery is now @ApiQuery
  • @ApiImplicitParam is now @ApiParam
  • @ApiImplicitBody is now @ApiBody
  • @ApiImplicitHeader is now @ApiHeader
  • @ApiOperation({ title: 'test' }) is now @ApiOperation({ summary: 'test' })
  • @ApiUseTags is now @ApiTags

DocumentBuilder breaking changes (updated method signatures):

  • addTag
  • addBearerAuth
  • addOAuth2
  • setContactEmail is now setContact
  • setHost has been removed
  • setSchemes has been removed (use addServer instead, e.g., addServer('http://'))

New methods

The following methods have been added:

  • addServer
  • addApiKey
  • addBasicAuth
  • addSecurity
  • addSecurityRequirements

Operations

In OpenAPI terms, paths are endpoints (resources), such as /users or /reports/summary, that your API exposes, and operations are the HTTP methods used to manipulate these paths, such as GET, POST or DELETE.

Tags

To attach a controller to a specific tag, use the @ApiTags(...tags) decorator.

@ApiTags('cats')
@Controller('cats')
export class CatsController {}

Headers

To define custom headers that are expected as part of the request, use @ApiHeader().

@ApiHeader({
  name: 'X-MyHeader',
  description: 'Custom header',
})
@Controller('cats')
export class CatsController {}

Responses

To define a custom HTTP response, use the @ApiResponse() decorator.

@Post()
@ApiResponse({ status: 201, description: 'The record has been successfully created.'})
@ApiResponse({ status: 403, description: 'Forbidden.'})
async create(@Body() createCatDto: CreateCatDto) {
  this.catsService.create(createCatDto);
}

Nest provides a set of short-hand API response decorators that inherit from the @ApiResponse decorator:

  • @ApiOkResponse()
  • @ApiCreatedResponse()
  • @ApiAcceptedResponse()
  • @ApiNoContentResponse()
  • @ApiMovedPermanentlyResponse()
  • @ApiBadRequestResponse()
  • @ApiUnauthorizedResponse()
  • @ApiNotFoundResponse()
  • @ApiForbiddenResponse()
  • @ApiMethodNotAllowedResponse()
  • @ApiNotAcceptableResponse()
  • @ApiRequestTimeoutResponse()
  • @ApiConflictResponse()
  • @ApiTooManyRequestsResponse()
  • @ApiGoneResponse()
  • @ApiPayloadTooLargeResponse()
  • @ApiUnsupportedMediaTypeResponse()
  • @ApiUnprocessableEntityResponse()
  • @ApiInternalServerErrorResponse()
  • @ApiNotImplementedResponse()
  • @ApiBadGatewayResponse()
  • @ApiServiceUnavailableResponse()
  • @ApiGatewayTimeoutResponse()
  • @ApiDefaultResponse()

@Post()
@ApiCreatedResponse({ description: 'The record has been successfully created.'})
@ApiForbiddenResponse({ description: 'Forbidden.'})
async create(@Body() createCatDto: CreateCatDto) {
  this.catsService.create(createCatDto);
}

To specify a return model for a request, we must create a class and annotate all properties with the @ApiProperty() decorator.

export class Cat {
  @ApiProperty()
  id: number;

  @ApiProperty()
  name: string;

  @ApiProperty()
  age: number;

  @ApiProperty()
  breed: string;
}

Then the Cat model can be used in combination with the type property of the response decorator.

@ApiTags('cats')
@Controller('cats')
export class CatsController {
  @Post()
  @ApiCreatedResponse({
    description: 'The record has been successfully created.',
    type: Cat,
  })
  async create(@Body() createCatDto: CreateCatDto): Promise<Cat> {
    return this.catsService.create(createCatDto);
  }
}

Let's open the browser and verify the generated Cat model:

File upload

You can enable file upload for a specific method with the @ApiBody decorator together with @ApiConsumes(). Here's a full example using the File Upload technique:

@UseInterceptors(FileInterceptor('file'))
@ApiConsumes('multipart/form-data')
@ApiBody({
  description: 'List of cats',
  type: FileUploadDto,
})
uploadFile(@UploadedFile() file) {}

Where FileUploadDto is defined as follows:

class FileUploadDto {
  @ApiProperty({ type: 'string', format: 'binary' })
  file: any;
}

Extensions

To add an extension to a request, use the @ApiExtension() decorator. The extension name must be prefixed with x-.

@ApiExtension('x-foo', { hello: 'world' })

Other features

Global prefix

To ignore a global prefix for routes set through setGlobalPrefix(), use ignoreGlobalPrefix:

const document = SwaggerModule.createDocument(app, options, {
  ignoreGlobalPrefix: true,
});

Multiple specifications

The SwaggerModule provides a way to support multiple specifications. In other words, you can serve different documentation, with different UIs, on different endpoints.

To support multiple specifications, your application must be written with a modular approach. The createDocument() method takes a 3rd argument, extraOptions, which is an object with a property named include. The include property takes a value which is an array of modules.

You can setup multiple specifications support as shown below:

import { NestFactory } from '@nestjs/core';
import { SwaggerModule, DocumentBuilder } from '@nestjs/swagger';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);

  /**
   * createDocument(application, configurationOptions, extraOptions);
   *
   * createDocument method takes an optional 3rd argument "extraOptions"
   * which is an object with "include" property where you can pass an Array
   * of Modules that you want to include in that Swagger Specification
   * E.g: CatsModule and DogsModule will have two separate Swagger Specifications which
   * will be exposed on two different SwaggerUI with two different endpoints.
   */

  const options = new DocumentBuilder()
    .setTitle('Cats example')
    .setDescription('The cats API description')
    .setVersion('1.0')
    .addTag('cats')
    .build();

  const catDocument = SwaggerModule.createDocument(app, options, {
    include: [CatsModule],
  });
  SwaggerModule.setup('api/cats', app, catDocument);

  const secondOptions = new DocumentBuilder()
    .setTitle('Dogs example')
    .setDescription('The dogs API description')
    .setVersion('1.0')
    .addTag('dogs')
    .build();

  const dogDocument = SwaggerModule.createDocument(app, secondOptions, {
    include: [DogsModule],
  });
  SwaggerModule.setup('api/dogs', app, dogDocument);

  await app.listen(3000);
}
bootstrap();

Now you can start your server with the following command:

$ npm run start

Navigate to http://localhost:3000/api/cats to see the Swagger UI for cats:

In turn, http://localhost:3000/api/dogs will expose the Swagger UI for dogs:

Security

To define which security mechanisms should be used for a specific operation, use the @ApiSecurity() decorator.

@ApiSecurity('basic')
@Controller('cats')
export class CatsController {}

Before you run your application, remember to add the security definition to your base document using DocumentBuilder:

const options = new DocumentBuilder().addSecurity('basic', {
  type: 'http',
  scheme: 'basic',
});

Some of the most popular authentication techniques are built-in (e.g., basic and bearer) and therefore you don't have to define security mechanisms manually as shown above.

Basic authentication

To enable basic authentication, use @ApiBasicAuth().

@ApiBasicAuth()
@Controller('cats')
export class CatsController {}

Before you run your application, remember to add the security definition to your base document using DocumentBuilder:

const options = new DocumentBuilder().addBasicAuth();

Bearer authentication

To enable bearer authentication, use @ApiBearerAuth().

@ApiBearerAuth()
@Controller('cats')
export class CatsController {}

Before you run your application, remember to add the security definition to your base document using DocumentBuilder:

const options = new DocumentBuilder().addBearerAuth();

OAuth2 authentication

To enable OAuth2, use @ApiOAuth2().

@ApiOAuth2(['pets:write'])
@Controller('cats')
export class CatsController {}

Before you run your application, remember to add the security definition to your base document using DocumentBuilder:

const options = new DocumentBuilder().addOAuth2();

Cookie authentication

To enable cookie authentication, use @ApiCookieAuth().

@ApiCookieAuth()
@Controller('cats')
export class CatsController {}

Before you run your application, remember to add the security definition to your base document using DocumentBuilder:

const options = new DocumentBuilder().addCookieAuth('optional-session-id');

Types and parameters

The SwaggerModule searches for all @Body(), @Query(), and @Param() decorators in route handlers to generate the API document. It also creates corresponding model definitions by taking advantage of reflection. Consider the following code:

@Post()
async create(@Body() createCatDto: CreateCatDto) {
  this.catsService.create(createCatDto);
}

info Hint To explicitly set the body definition use the @ApiBody() decorator (imported from the @nestjs/swagger package).

Based on the CreateCatDto, the following model definition Swagger UI will be created:

As you can see, the definition is empty although the class has a few declared properties. In order to make the class properties visible to the SwaggerModule, we have to either annotate them with the @ApiProperty() decorator or use the CLI plugin (read more in the Plugin section) which will do it automatically:

import { ApiProperty } from '@nestjs/swagger';

export class CreateCatDto {
  @ApiProperty()
  name: string;

  @ApiProperty()
  age: number;

  @ApiProperty()
  breed: string;
}

info Hint Instead of manually annotating each property, consider using the Swagger plugin (see Plugin section) which will automatically provide this for you.

Let's open the browser and verify the generated CreateCatDto model:

In addition, the @ApiProperty() decorator allows setting various Schema Object properties:

@ApiProperty({
  description: 'The age of a cat',
  minimum: 1,
  default: 1,
})
age: number;

info Hint Instead of explicitly typing @ApiProperty({ required: false }), you can use the @ApiPropertyOptional() short-hand decorator.

In order to explicitly set the type of the property, use the type key:

@ApiProperty({
  type: Number,
})
age: number;

Arrays

When the property is an array, we must manually indicate the array type as shown below:

@ApiProperty({ type: [String] })
names: string[];

info Hint Consider using the Swagger plugin (see Plugin section) which will automatically detect arrays.

Either include the type as the first element of an array (as shown above) or set the isArray property to true.
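
For example, the same property declared with the isArray flag might look like this:

@ApiProperty({ type: String, isArray: true })
names: string[];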

Circular dependencies

When you have circular dependencies between classes, use a lazy function to provide the SwaggerModule with type information:

@ApiProperty({ type: () => Node })
node: Node;

info Hint Consider using the Swagger plugin (see Plugin section) which will automatically detect circular dependencies.

Generics and interfaces

Since TypeScript does not store metadata about generics or interfaces, when you use them in your DTOs, SwaggerModule may not be able to properly generate model definitions at runtime. For instance, the following code won't be correctly inspected by the Swagger module:

createBulk(@Body() usersDto: CreateUserDto[])

In order to overcome this limitation, you can set the type explicitly:

@ApiBody({ type: [CreateUserDto] })
createBulk(@Body() usersDto: CreateUserDto[])

Enums

To identify an enum, we must manually set the enum property on the @ApiProperty with an array of values.

@ApiProperty({ enum: ['Admin', 'Moderator', 'User']})
role: UserRole;

Alternatively, define an actual TypeScript enum as follows:

export enum UserRole {
  Admin = 'Admin',
  Moderator = 'Moderator',
  User = 'User',
}

You can then use the enum directly with the @Query() parameter decorator in combination with the @ApiQuery() decorator.

@ApiQuery({ name: 'role', enum: UserRole })
async filterByRole(@Query('role') role: UserRole = UserRole.User) {}

With isArray set to true, the enum can be selected as a multi-select in the Swagger UI.
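
A sketch extending the @ApiQuery example above shows what such a declaration might look like:

@ApiQuery({ name: 'role', enum: UserRole, isArray: true })
async filterByRole(@Query('role') role: UserRole[] = []) {}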

Enums schema

By default, the enum property will add a raw definition of Enum on the parameter.

- breed:
    type: 'string'
    enum:
      - Persian
      - Tabby
      - Siamese

The above specification works fine for most cases. However, if you are utilizing a tool that takes the specification as input and generates client-side code, you might run into a problem with the generated code containing duplicated enums. Consider the following code snippet:

// generated client-side code
export class CatDetail {
  breed: CatDetailEnum;
}

export class CatInformation {
  breed: CatInformationEnum;
}

export enum CatDetailEnum {
  Persian = 'Persian',
  Tabby = 'Tabby',
  Siamese = 'Siamese',
}

export enum CatInformationEnum {
  Persian = 'Persian',
  Tabby = 'Tabby',
  Siamese = 'Siamese',
}

info Hint The above snippet is generated using a tool called NSwag.

You can see that now you have two enums that are exactly the same. To address this issue, you can pass an enumName along with the enum property in your decorator.

export class CatDetail {
  @ApiProperty({ enum: CatBreed, enumName: 'CatBreed' })
  breed: CatBreed;
}

The enumName property enables @nestjs/swagger to turn CatBreed into its own schema, which in turn makes the CatBreed enum reusable. The specification will look like the following:

CatDetail:
  type: 'object'
  properties:
    ...
    - breed:
        schema:
          $ref: '#/components/schemas/CatBreed'
CatBreed:
  type: string
  enum:
    - Persian
    - Tabby
    - Siamese

info Hint Any decorator that takes enum as a property will also take enumName.

Raw definitions

In some specific scenarios (e.g., deeply nested arrays, matrices), you may want to describe your type by hand.

@ApiProperty({
  type: 'array',
  items: {
    type: 'array',
    items: {
      type: 'number',
    },
  },
})
coords: number[][];

Likewise, in order to define your input/output content manually in controller classes, use the schema property:

@ApiBody({
  schema: {
    type: 'array',
    items: {
      type: 'array',
      items: {
        type: 'number',
      },
    },
  },
})
async create(@Body() coords: number[][]) {}

Extra models

To define additional models that should be inspected by the Swagger module, use the @ApiExtraModels() decorator:

@ApiExtraModels(ExtraModel)
export class CreateCatDto {}

Then, you can get the reference ($ref) to your model using getSchemaPath(ExtraModel):

'application/vnd.api+json': {
   schema: { $ref: getSchemaPath(ExtraModel) },
},

oneOf, anyOf, allOf

To combine schemas, you can use the oneOf, anyOf or allOf keywords (read more).

@ApiProperty({
  oneOf: [
    { $ref: getSchemaPath(Cat) },
    { $ref: getSchemaPath(Dog) },
  ],
})
pet: Cat | Dog;

If you want to define a polymorphic array (i.e., an array whose members span multiple schemas), you should use a raw definition (see above) to define your type by hand.

type Pet = Cat | Dog;

@ApiProperty({
  type: 'array',
  items: {
    oneOf: [
      { $ref: getSchemaPath(Cat) },
      { $ref: getSchemaPath(Dog) },
    ],
  },
})
pets: Pet[];

info Hint The getSchemaPath() function is imported from @nestjs/swagger.

Both Cat and Dog must be defined as extra models using the @ApiExtraModels() decorator (at the class-level).
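
For example, the registration could look like this (a sketch; it can be placed on any class or controller that the Swagger module scans):

@ApiExtraModels(Cat, Dog)
@Controller('pets')
export class PetsController {}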

CQRS

The flow of simple CRUD (Create, Read, Update and Delete) applications can be described using the following steps:

  1. The controllers layer handles HTTP requests and delegates tasks to the services layer.
  2. The services layer is where most of the business logic lives.
  3. Services use repositories / DAOs to change / persist entities.
  4. Entities act as containers for the values, with setters and getters.

In most cases, for small and medium-sized applications, this pattern is sufficient. However, when our requirements become more complex, the CQRS model may be more appropriate and scalable. To facilitate that model, Nest provides a lightweight CQRS module. This chapter describes how to use it.

Installation

First install the required package:

$ npm install --save @nestjs/cqrs

Commands

In this model, each action is called a Command. When a command is dispatched, the application reacts to it. Commands can be dispatched from the services layer, or directly from controllers/gateways. Commands are consumed by Command Handlers.

@@filename(heroes-game.service)
@Injectable()
export class HeroesGameService {
  constructor(private commandBus: CommandBus) {}

  async killDragon(heroId: string, killDragonDto: KillDragonDto) {
    return this.commandBus.execute(
      new KillDragonCommand(heroId, killDragonDto.dragonId)
    );
  }
}
@@switch
@Injectable()
@Dependencies(CommandBus)
export class HeroesGameService {
  constructor(commandBus) {
    this.commandBus = commandBus;
  }

  async killDragon(heroId, killDragonDto) {
    return this.commandBus.execute(
      new KillDragonCommand(heroId, killDragonDto.dragonId)
    );
  }
}

Here's a sample service that dispatches KillDragonCommand. Let's see how the command looks:

@@filename(kill-dragon.command)
export class KillDragonCommand {
  constructor(
    public readonly heroId: string,
    public readonly dragonId: string,
  ) {}
}
@@switch
export class KillDragonCommand {
  constructor(heroId, dragonId) {
    this.heroId = heroId;
    this.dragonId = dragonId;
  }
}

The CommandBus is a stream of commands. It delegates commands to the equivalent handlers. Each command must have a corresponding Command Handler:

@@filename(kill-dragon.handler)
@CommandHandler(KillDragonCommand)
export class KillDragonHandler implements ICommandHandler<KillDragonCommand> {
  constructor(private repository: HeroRepository) {}

  async execute(command: KillDragonCommand) {
    const { heroId, dragonId } = command;
    const hero = this.repository.findOneById(+heroId);

    hero.killEnemy(dragonId);
    await this.repository.persist(hero);
  }
}
@@switch
@CommandHandler(KillDragonCommand)
@Dependencies(HeroRepository)
export class KillDragonHandler {
  constructor(repository) {
    this.repository = repository;
  }

  async execute(command) {
    const { heroId, dragonId } = command;
    const hero = this.repository.findOneById(+heroId);

    hero.killEnemy(dragonId);
    await this.repository.persist(hero);
  }
}

With this approach, every application state change is driven by the occurrence of a Command. The logic is encapsulated in handlers, which makes it easy to add cross-cutting behavior such as logging, or persisting commands in the database (e.g., for diagnostic purposes).

Events

Command handlers neatly encapsulate logic. While beneficial, the application structure is still not flexible enough and not reactive. To remedy this, we also introduce events.

@@filename(hero-killed-dragon.event)
export class HeroKilledDragonEvent {
  constructor(
    public readonly heroId: string,
    public readonly dragonId: string,
  ) {}
}
@@switch
export class HeroKilledDragonEvent {
  constructor(heroId, dragonId) {
    this.heroId = heroId;
    this.dragonId = dragonId;
  }
}

Events are asynchronous. They are dispatched either by models or directly using EventBus. In order to dispatch events, models have to extend the AggregateRoot class.

@@filename(hero.model)
export class Hero extends AggregateRoot {
  constructor(private id: string) {
    super();
  }

  killEnemy(enemyId: string) {
    // logic
    this.apply(new HeroKilledDragonEvent(this.id, enemyId));
  }
}
@@switch
export class Hero extends AggregateRoot {
  constructor(id) {
    super();
    this.id = id;
  }

  killEnemy(enemyId) {
    // logic
    this.apply(new HeroKilledDragonEvent(this.id, enemyId));
  }
}

The apply() method does not dispatch events yet because there's no relationship between the model and the EventPublisher class. How do we associate the model and the publisher? By using the publisher's mergeObjectContext() method inside our command handler.

@@filename(kill-dragon.handler)
@CommandHandler(KillDragonCommand)
export class KillDragonHandler implements ICommandHandler<KillDragonCommand> {
  constructor(
    private repository: HeroRepository,
    private publisher: EventPublisher,
  ) {}

  async execute(command: KillDragonCommand) {
    const { heroId, dragonId } = command;
    const hero = this.publisher.mergeObjectContext(
      await this.repository.findOneById(+heroId),
    );
    hero.killEnemy(dragonId);
    hero.commit();
  }
}
@@switch
@CommandHandler(KillDragonCommand)
@Dependencies(HeroRepository, EventPublisher)
export class KillDragonHandler {
  constructor(repository, publisher) {
    this.repository = repository;
    this.publisher = publisher;
  }

  async execute(command) {
    const { heroId, dragonId } = command;
    const hero = this.publisher.mergeObjectContext(
      await this.repository.findOneById(+heroId),
    );
    hero.killEnemy(dragonId);
    hero.commit();
  }
}

Now everything works as expected. Notice that we need to commit() events since they're not being dispatched immediately. Obviously, an object doesn't have to exist up front. We can easily merge type context as well:

const HeroModel = this.publisher.mergeContext(Hero);
new HeroModel('id');

Now the model has the ability to publish events. Additionally, we can emit events manually using EventBus:

this.eventBus.publish(new HeroKilledDragonEvent());

info Hint The EventBus is an injectable class.
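
As such, it can be obtained through standard constructor injection (a minimal fragment):

constructor(private eventBus: EventBus) {}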

Each event can have multiple Event Handlers.

@@filename(hero-killed-dragon.handler)
@EventsHandler(HeroKilledDragonEvent)
export class HeroKilledDragonHandler implements IEventHandler<HeroKilledDragonEvent> {
  constructor(private repository: HeroRepository) {}

  handle(event: HeroKilledDragonEvent) {
    // logic
  }
}

Now we can move the write logic into the event handlers.

Sagas

This type of Event-Driven Architecture improves application reactiveness and scalability. Now, when we have events, we can simply react to them in various ways. Sagas are the final building block from an architectural point of view.

Sagas are an extremely powerful feature. A single saga may listen for 1..* events. Using the RxJS library, it can combine, merge, filter or apply other RxJS operators on the event stream. Each saga returns an Observable which contains a command. This command is dispatched asynchronously.

@@filename(heroes-game.saga)
@Injectable()
export class HeroesGameSagas {
  @Saga()
  dragonKilled = (events$: Observable<any>): Observable<ICommand> => {
    return events$.pipe(
      ofType(HeroKilledDragonEvent),
      map((event) => new DropAncientItemCommand(event.heroId, fakeItemID)),
    );
  }
}
@@switch
@Injectable()
export class HeroesGameSagas {
  @Saga()
  dragonKilled = (events$) => {
    return events$.pipe(
      ofType(HeroKilledDragonEvent),
      map((event) => new DropAncientItemCommand(event.heroId, fakeItemID)),
    );
  }
}

info Hint The ofType operator is exported from the @nestjs/cqrs package.

We declared a rule - when any hero kills the dragon, the ancient item should be dropped. With this in place, DropAncientItemCommand will be dispatched and processed by the appropriate handler.

Queries

The CqrsModule can also be used for handling queries. The QueryBus follows the same pattern as the CommandBus. Query handlers should implement the IQueryHandler interface and be marked with the @QueryHandler() decorator.
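
This chapter does not include a query example; the following is a minimal sketch of what one might look like, assuming a GetHeroesQuery and the HeroRepository used earlier (with a hypothetical findAll() method):

export class GetHeroesQuery {}

@QueryHandler(GetHeroesQuery)
export class GetHeroesHandler implements IQueryHandler<GetHeroesQuery> {
  constructor(private repository: HeroRepository) {}

  async execute(query: GetHeroesQuery) {
    return this.repository.findAll();
  }
}

A query is then dispatched through the QueryBus, e.g. this.queryBus.execute(new GetHeroesQuery()).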

Setup

Finally, let's look at how to set up the whole CQRS mechanism.

@@filename(heroes-game.module)
export const CommandHandlers = [KillDragonHandler, DropAncientItemHandler];
export const EventHandlers =  [HeroKilledDragonHandler, HeroFoundItemHandler];

@Module({
  imports: [CqrsModule],
  controllers: [HeroesGameController],
  providers: [
    HeroesGameService,
    HeroesGameSagas,
    ...CommandHandlers,
    ...EventHandlers,
    HeroRepository,
  ]
})
export class HeroesGameModule {}

Summary

CommandBus, QueryBus and EventBus are Observables. This means that you can easily subscribe to the whole stream and enrich your application with Event Sourcing.

A working example is available here.

CRUD utilities

warning Notice This chapter applies only to TypeScript.

Overview

CRUD is a community package (@nestjsx/crud) that helps you create database-centric Create/Read/Update/Delete (CRUD) controllers and services with ease, and provides a rich set of features for your RESTful API out-of-the-box:

  • Database agnostic extendable CRUD controller
  • Query string parsing with filtering, pagination, sorting, relations, nested relations, cache, etc.
  • Framework agnostic package with query builder for frontend usage
  • Query, path params and DTO validation
  • Overriding controller methods with ease
  • Tiny but powerful configuration (including global configuration)
  • Additional helper decorators
  • Swagger documentation

warning Notice Currently @nestjsx/crud only supports TypeORM. Other ORMs like Sequelize and Mongoose will be included in the near future.

In this chapter, you'll get an overview of how to create CRUD controllers and services using TypeORM. Complete documentation is available at the project's wiki. We assume that you have already successfully installed and set up the @nestjs/typeorm package. To learn more, see here.

Getting started

Start by installing all required dependencies:

npm i --save @nestjsx/crud @nestjsx/crud-typeorm typeorm class-transformer class-validator

Assuming that you already have some entities in your project:

@@filename(hero.entity)
import { Entity, PrimaryGeneratedColumn, Column } from 'typeorm';

@Entity()
export class Hero {
  @PrimaryGeneratedColumn()
  id: number;

  @Column()
  name: string;

  @Column({ type: 'number' })
  power: number;
}

The first step is to create a service:

@@filename(heroes.service)
import { Injectable } from '@nestjs/common';
import { InjectRepository } from '@nestjs/typeorm';
import { TypeOrmCrudService } from '@nestjsx/crud-typeorm';
import { Hero } from './hero.entity';

@Injectable()
export class HeroesService extends TypeOrmCrudService<Hero> {
  constructor(@InjectRepository(Hero) repo) {
    super(repo);
  }
}

The next step is to create a controller:

@@filename(heroes.controller)
import { Controller } from '@nestjs/common';
import { Crud } from '@nestjsx/crud';
import { Hero } from './hero.entity';
import { HeroesService } from './heroes.service';

@Crud({
  model: {
    type: Hero,
  },
})
@Controller('heroes')
export class HeroesController {
  constructor(public service: HeroesService) {}
}

And finally, we need to wire up everything in our module:

@@filename(heroes.module)
import { Module } from '@nestjs/common';
import { TypeOrmModule } from '@nestjs/typeorm';

import { Hero } from './hero.entity';
import { HeroesService } from './heroes.service';
import { HeroesController } from './heroes.controller';

@Module({
  imports: [TypeOrmModule.forFeature([Hero])],
  providers: [HeroesService],
  controllers: [HeroesController],
})
export class HeroesModule {}

warning Notice Do not forget to import the HeroesModule into the root ApplicationModule.

At this point, your application will have these newly created endpoints:

  • GET /heroes - get many heroes.
  • GET /heroes/:id - get one hero.
  • POST /heroes/bulk - create many heroes.
  • POST /heroes - create one hero.
  • PATCH /heroes/:id - update one hero.
  • PUT /heroes/:id - replace one hero.
  • DELETE /heroes/:id - delete one hero.

Filtering and pagination

CRUD provides rich tools for filtering and pagination. Here's a sample HTTP REST request:

GET /heroes?select=name&filter=power||gt||90&sort=name,ASC&page=1&limit=3

In this example, we:

  • requested the list of heroes and selected only the name attribute
  • filtered the list to include heroes with a power greater than 90
  • limited the result set to 3 within page 1
  • sorted by name in ASC order

The response object would look like this:

{
  "data": [
    {
      "id": 2,
      "name": "Batman"
    },
    {
      "id": 4,
      "name": "Flash"
    },
    {
      "id": 3,
      "name": "Superman"
    }
  ],
  "count": 3,
  "total": 14,
  "page": 1,
  "pageCount": 5
}

warning Notice Primary columns persist in the resource response object whether they were requested or not. In our example, this is the id column.

The complete list of query params and filter operators can be found in the project's Wiki.

Relations

Relations is another powerful feature. In your CRUD controller, you can specify the list of an entity's relations which are allowed to fetch within your API calls:

@Crud({
  model: {
    type: Hero,
  },
  join: {
    profile: {
      exclude: ['secret'],
    },
    faction: {
      eager: true,
      only: ['name'],
    },
  },
})
@Controller('heroes')
export class HeroesController {
  constructor(public service: HeroesService) {}
}

After specifying allowed relations in the @Crud() decorator options, you can make the following request:

info Request GET /heroes/25?join=profile||address,bio

The response will contain a hero object with a joined profile which includes the address and bio columns.

The response will also contain a faction object with the name column selected because it was set to eager: true (and thus persists in every response).

You can find more information about relations in the project's Wiki.

Path params validation

By default, CRUD creates a slug with the name id and validates it as a number. You can modify this behavior if desired. Assume your entity has a primary column _id - a UUID string - and you need to use it as a slug for your endpoints. This can be done with the following options:

@Crud({
  model: {
    type: Hero,
  },
  params: {
    _id: {
      field: '_id',
      type: 'uuid',
      primary: true,
    },
  },
})
@Controller('heroes')
export class HeroesController {
  constructor(public service: HeroesService) {}
}

For more params options please see the project's Wiki.

Request body validation

Request body validation is performed out-of-the-box by applying the standard Nest ValidationPipe on each POST, PUT or PATCH request. The model.type from the @Crud() decorator options is used as a DTO that describes validation rules.

To do that properly we use validation groups:

@@filename(hero.entity)
import { Entity, PrimaryGeneratedColumn, Column } from 'typeorm';
import { IsOptional, IsDefined, IsString, IsNumber } from 'class-validator';
import { CrudValidationGroups } from '@nestjsx/crud';

const { CREATE, UPDATE } = CrudValidationGroups;

@Entity()
export class Hero {
  @IsOptional({ always: true })
  @PrimaryGeneratedColumn()
  id: number;

  @IsOptional({ groups: [UPDATE] })
  @IsDefined({ groups: [CREATE] })
  @IsString({ always: true })
  @Column()
  name: string;

  @IsOptional({ groups: [UPDATE] })
  @IsDefined({ groups: [CREATE] })
  @IsNumber({}, { always: true })
  @Column({ type: 'number' })
  power: number;
}

warning Notice Full support of separate DTO classes for create and update actions is one of the main priorities for the next CRUD release.

Routes options

You can disable or enable generation of specific routes by passing the routes options property to the @Crud() decorator:

@Crud({
  model: {
    type: Hero,
  },
  routes: {
    only: ['getManyBase'],
    getManyBase: {
      decorators: [UseGuards(HeroAuthGuard)],
    },
  },
})
@Controller('heroes')
export class HeroesController {
  constructor(public service: HeroesService) {}
}

You can apply any method decorators by passing them to the specific route decorators array. This is convenient when you want to add some decorators without overriding base methods.

Documentation

The examples in this chapter cover only some of the CRUD features. You can find complete documentation on the project's Wiki page.

Documentation

Compodoc is a documentation tool for Angular applications. Since Nest and Angular share similar project and code structures, Compodoc works with Nest applications as well.

Setup

Setting up Compodoc inside an existing Nest project is very simple. Start by adding the dev-dependency with the following command in your OS terminal:

$ npm i -D @compodoc/compodoc

Generation

Generate project documentation using the following command (npm 6 is required for npx support). See the official documentation for more options.

$ npx @compodoc/compodoc -p tsconfig.json -s

Open your browser and navigate to http://localhost:8080. You should see the generated documentation for an initial Nest CLI project.

Contribute

You can participate and contribute to the Compodoc project here.

Hot Reload

The highest impact on your application's bootstrapping process is TypeScript compilation. Fortunately, with webpack HMR (Hot-Module Replacement), we don't need to recompile the entire project each time a change occurs. This significantly decreases the amount of time necessary to instantiate your application, and makes iterative development a lot easier.

warning Warning Note that webpack won't automatically copy your assets (e.g. graphql files) to the dist folder. Similarly, webpack is not compatible with glob static paths (e.g., the entities property in TypeOrmModule).

With CLI

If you are using the Nest CLI, the configuration process is pretty straightforward. The CLI wraps webpack, which allows use of the HotModuleReplacementPlugin.

Installation

First install the required packages:

$ npm i --save-dev webpack-node-externals start-server-webpack-plugin

Configuration

Once the installation is complete, create a webpack-hmr.config.js file in the root directory of your application.

const webpack = require('webpack');
const nodeExternals = require('webpack-node-externals');
const StartServerPlugin = require('start-server-webpack-plugin');

module.exports = function (options) {
  return {
    ...options,
    entry: ['webpack/hot/poll?100', options.entry],
    watch: true,
    externals: [
      nodeExternals({
        allowlist: ['webpack/hot/poll?100'],
      }),
    ],
    plugins: [
      ...options.plugins,
      new webpack.HotModuleReplacementPlugin(),
      new webpack.WatchIgnorePlugin([/\.js$/, /\.d\.ts$/]),
      new StartServerPlugin({ name: options.output.filename }),
    ],
  };
};

This function takes the original object containing the default webpack configuration and returns a modified one with the HotModuleReplacementPlugin applied.

Hot-Module Replacement

To enable HMR, open the application entry file (main.ts) and add the following webpack-related instructions:

declare const module: any;

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  await app.listen(3000);

  if (module.hot) {
    module.hot.accept();
    module.hot.dispose(() => app.close());
  }
}
bootstrap();

To simplify the execution process, add a script to your package.json file.

"start:dev": "nest build --webpack --webpackPath webpack-hmr.config.js"

Now simply open your command line and run the following command:

$ npm run start:dev

Without CLI

If you are not using the Nest CLI, the configuration will be slightly more complex (will require more manual steps).

Installation

First install the required packages:

$ npm i --save-dev webpack webpack-cli webpack-node-externals ts-loader start-server-webpack-plugin

Configuration

Once the installation is complete, create a webpack.config.js file in the root directory of your application.

const webpack = require('webpack');
const path = require('path');
const nodeExternals = require('webpack-node-externals');
const StartServerPlugin = require('start-server-webpack-plugin');

module.exports = {
  entry: ['webpack/hot/poll?100', './src/main.ts'],
  watch: true,
  target: 'node',
  externals: [
    nodeExternals({
      allowlist: ['webpack/hot/poll?100'],
    }),
  ],
  module: {
    rules: [
      {
        test: /\.tsx?$/,
        use: 'ts-loader',
        exclude: /node_modules/,
      },
    ],
  },
  mode: 'development',
  resolve: {
    extensions: ['.tsx', '.ts', '.js'],
  },
  plugins: [
    new webpack.HotModuleReplacementPlugin(),
    new StartServerPlugin({ name: 'server.js' }),
  ],
  output: {
    path: path.join(__dirname, 'dist'),
    filename: 'server.js',
  },
};

This configuration tells webpack a few essential things about your application: location of the entry file, which directory should be used to hold compiled files, and what kind of loader we want to use to compile source files. Generally, you should be able to use this file as-is, even if you don't fully understand all of the options.

Hot-Module Replacement

To enable HMR, open the application entry file (main.ts) and add the following webpack-related instructions:

declare const module: any;

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  await app.listen(3000);

  if (module.hot) {
    module.hot.accept();
    module.hot.dispose(() => app.close());
  }
}
bootstrap();

To simplify the execution process, add a script to your package.json file.

"start:dev": "webpack --config webpack.config.js"

Now simply open your command line and run the following command:

$ npm run start:dev

A working example is available here.

MongoDB (Mongoose)

Warning In this article, you'll learn how to create a DatabaseModule based on the Mongoose package from scratch using custom components. As a consequence, this solution contains a lot of overhead that you can avoid by using the dedicated, out-of-the-box @nestjs/mongoose package. To learn more, see here.

Mongoose is the most popular MongoDB object modeling tool.

Getting started

To start the adventure with this library we have to install all required dependencies:

@@filename()
$ npm install --save mongoose
$ npm install --save-dev @types/mongoose
@@switch
$ npm install --save mongoose

The first step is to establish the connection with our database using the connect() function. The connect() function returns a Promise, and therefore we have to create an async provider.

@@filename(database.providers)
import * as mongoose from 'mongoose';

export const databaseProviders = [
  {
    provide: 'DATABASE_CONNECTION',
    useFactory: (): Promise<typeof mongoose> =>
      mongoose.connect('mongodb://localhost/nest'),
  },
];
@@switch
import * as mongoose from 'mongoose';

export const databaseProviders = [
  {
    provide: 'DATABASE_CONNECTION',
    useFactory: () => mongoose.connect('mongodb://localhost/nest'),
  },
];

info Hint Following best practices, we declared the custom provider in a separate file with a *.providers.ts suffix.

Then, we need to export these providers to make them accessible to the rest of the application.

@@filename(database.module)
import { Module } from '@nestjs/common';
import { databaseProviders } from './database.providers';

@Module({
  providers: [...databaseProviders],
  exports: [...databaseProviders],
})
export class DatabaseModule {}

Now we can inject the Connection object using the @Inject() decorator. Each class that depends on the Connection async provider will wait until the Promise is resolved.
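
For example, a provider could receive the connection like this (a sketch; the class name is hypothetical, and the token matches the factory declared above):

import { Injectable, Inject } from '@nestjs/common';
import * as mongoose from 'mongoose';

@Injectable()
export class DatabaseHealthService {
  constructor(
    @Inject('DATABASE_CONNECTION') private connection: typeof mongoose,
  ) {}
}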

Model injection

With Mongoose, everything is derived from a Schema. Let's define the CatSchema:

@@filename(schemas/cat.schema)
import * as mongoose from 'mongoose';

export const CatSchema = new mongoose.Schema({
  name: String,
  age: Number,
  breed: String,
});

The CatSchema belongs to the cats directory. This directory represents the CatsModule.

Now it's time to create a Model provider:

@@filename(cats.providers)
import { Connection } from 'mongoose';
import { CatSchema } from './schemas/cat.schema';

export const catsProviders = [
  {
    provide: 'CAT_MODEL',
    useFactory: (connection: Connection) => connection.model('Cat', CatSchema),
    inject: ['DATABASE_CONNECTION'],
  },
];
@@switch
import { CatSchema } from './schemas/cat.schema';

export const catsProviders = [
  {
    provide: 'CAT_MODEL',
    useFactory: (connection) => connection.model('Cat', CatSchema),
    inject: ['DATABASE_CONNECTION'],
  },
];

Notice In real-world applications you should avoid magic strings. Both CAT_MODEL and DATABASE_CONNECTION should be kept in a separate constants.ts file.
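
A minimal constants.ts along those lines might look like this (the file itself is not shown in this article):

export const DATABASE_CONNECTION = 'DATABASE_CONNECTION';
export const CAT_MODEL = 'CAT_MODEL';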

Now we can inject the CAT_MODEL into the CatsService using the @Inject() decorator:

@@filename(cats.service)
import { Model } from 'mongoose';
import { Injectable, Inject } from '@nestjs/common';
import { Cat } from './interfaces/cat.interface';
import { CreateCatDto } from './dto/create-cat.dto';

@Injectable()
export class CatsService {
  constructor(
    @Inject('CAT_MODEL')
    private catModel: Model<Cat>,
  ) {}

  async create(createCatDto: CreateCatDto): Promise<Cat> {
    const createdCat = new this.catModel(createCatDto);
    return createdCat.save();
  }

  async findAll(): Promise<Cat[]> {
    return this.catModel.find().exec();
  }
}
@@switch
import { Injectable, Dependencies } from '@nestjs/common';

@Injectable()
@Dependencies('CAT_MODEL')
export class CatsService {
  constructor(catModel) {
    this.catModel = catModel;
  }

  async create(createCatDto) {
    const createdCat = new this.catModel(createCatDto);
    return createdCat.save();
  }

  async findAll() {
    return this.catModel.find().exec();
  }
}

In the above example we used the Cat interface. This interface extends the Document type from the mongoose package:

import { Document } from 'mongoose';

export interface Cat extends Document {
  readonly name: string;
  readonly age: number;
  readonly breed: string;
}

The database connection is asynchronous, but Nest makes this process completely invisible to the end user. The Cat model provider waits for the database connection, and the CatsService is delayed until the model is ready to use. The entire application can start once every class has been instantiated.

Here is a final CatsModule:

@@filename(cats.module)
import { Module } from '@nestjs/common';
import { CatsController } from './cats.controller';
import { CatsService } from './cats.service';
import { catsProviders } from './cats.providers';
import { DatabaseModule } from '../database/database.module';

@Module({
  imports: [DatabaseModule],
  controllers: [CatsController],
  providers: [
    CatsService,
    ...catsProviders,
  ],
})
export class CatsModule {}

warning Hint Do not forget to import the CatsModule into the root ApplicationModule.

Prisma

Prisma is an open-source database toolkit. You can use it to query data from a database inside a Node.js or TypeScript application. Prisma and NestJS go great together since you can use Prisma in your services to fulfill the data needs from your controllers.

Prisma is used as an alternative to writing plain SQL, or using another database access tool such as SQL query builders (like knex.js) or ORMs (like TypeORM and Sequelize). Prisma currently supports PostgreSQL, MySQL and SQLite.

While Prisma is a toolkit that contains multiple tools, the focus of this guide will be on using Prisma Client. Prisma Client is an auto-generated and type-safe query builder that lets you read and write data in your database.

info Note If you want to get a quick overview of how Prisma works, you can follow the Quickstart or read the Introduction in the documentation.

Getting started

In this recipe, you'll learn how to get started with NestJS and Prisma from scratch. You are going to build a sample NestJS application with a REST API that can read and write data in a database.

For the purpose of this guide, you'll use a SQLite database to save the overhead of setting up a database server. Note that you can still follow this guide, even if you're using PostgreSQL or MySQL – you'll get extra instructions for using these databases at the right places.

Create your NestJS project

To get started, install the NestJS CLI and create your app skeleton with the following commands:

$ npm install -g @nestjs/cli
$ nest new hello-prisma

See the First steps page to learn more about the project files created by this command. Note also that you can now run npm start to start your application. The REST API running at http://localhost:3000/ currently serves a single route that's implemented in src/app.controller.ts. Over the course of this guide, you'll implement additional routes to store and retrieve data about users and posts.

Set up Prisma

Start by installing the Prisma CLI as a development dependency in your project:

$ npm install @prisma/cli --save-dev

In the following steps, we'll be utilizing the Prisma CLI. As a best practice, it's recommended to invoke the CLI locally by prefixing it with npx:

$ npx prisma

If you're using Yarn, then you can install the Prisma CLI as follows:

$ yarn add @prisma/cli --dev

Once installed, you can invoke it by prefixing it with yarn:

$ yarn prisma

Now create your initial Prisma setup using the init command of the Prisma CLI:

$ npx prisma init

This command creates a new prisma directory with the following contents:

  • schema.prisma: Specifies your database connection and contains the database schema
  • .env: A dotenv file, typically used to store your database credentials in a group of environment variables

Create your SQLite database file and set the database connection

SQLite databases are simple files; no server is required to use a SQLite database. You can therefore create a new SQLite database by manually creating a new file on your file system.

Navigate into the new prisma directory and create a file called dev.db inside of it.

Next, open up schema.prisma and adjust the provider field of the datasource block to sqlite:

datasource db {
  provider = "sqlite"
  url      = env("DATABASE_URL")
}

generator client {
  provider = "prisma-client-js"
}

Now, open up .env and adjust the DATABASE_URL environment variable to look as follows:

DATABASE_URL="file:./dev.db"

If you're using PostgreSQL or MySQL instead of SQLite, adjust the configuration as described below.

PostgreSQL

If you're using PostgreSQL, you have to adjust the schema.prisma and .env files as follows:

schema.prisma

datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}

generator client {
  provider = "prisma-client-js"
}

.env

DATABASE_URL="postgresql://USER:PASSWORD@HOST:PORT/DATABASE?schema=SCHEMA"

Replace the placeholders spelled in all uppercase letters with your database credentials. Note that if you're unsure what to provide for the SCHEMA placeholder, it's most likely the default value public:

DATABASE_URL="postgresql://USER:PASSWORD@HOST:PORT/DATABASE?schema=public"

If you want to learn how to set up a PostgreSQL database, you can follow this guide on setting up a free PostgreSQL database on Heroku.

MySQL

If you're using MySQL, you have to adjust the schema.prisma and .env files as follows:

schema.prisma

datasource db {
  provider = "mysql"
  url      = env("DATABASE_URL")
}

generator client {
  provider = "prisma-client-js"
}

.env

DATABASE_URL="mysql://USER:PASSWORD@HOST:PORT/DATABASE"

Replace the placeholders spelled in all uppercase letters with your database credentials.

Create two database tables

In this section, you'll create two new tables in your database. Run the following SQL statements in your terminal:

Mac OS / Linux

$ sqlite3 dev.db \
'CREATE TABLE "User" (
  "id"    INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT,
  "name"  TEXT,
  "email" TEXT    NOT NULL UNIQUE
);
CREATE TABLE "Post" (
  "id"        INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT,
  "title"     TEXT    NOT NULL,
  "content"   TEXT,
  "published" BOOLEAN DEFAULT false,
  "authorId"  INTEGER REFERENCES "User"(id)
);
'

Windows

$ sqlite3 ./prisma/dev.db
CREATE TABLE "User" (
  "id"    INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT,
  "name"  TEXT,
  "email" TEXT    NOT NULL UNIQUE
);
CREATE TABLE "Post" (
  "id"        INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT,
  "title"     TEXT    NOT NULL,
  "content"   TEXT,
  "published" BOOLEAN DEFAULT false,
  "authorId"  INTEGER REFERENCES "User"(id)
);

Note that Prisma also features a schema migration tool called Prisma Migrate. Prisma Migrate lets you manually define your models in your Prisma schema and takes care of creating the tables in your database. Because Prisma Migrate is currently considered experimental, this guide uses an alternative workflow of using plain SQL to create your database tables and generate Prisma models via introspection.

Introspect your database to obtain your Prisma models in the Prisma schema

Now that you've created your database tables, you can introspect the database to generate your Prisma models. After that, you will install and generate Prisma Client, which will expose queries that are tailored to these models.

To introspect your database, run the following command in your terminal:

$ npx prisma introspect

This reads your SQL schema and translates each table into a corresponding Prisma model. Your schema.prisma file now looks as follows:

generator client {
  provider = "prisma-client-js"
}

datasource db {
  provider = "sqlite"
  url      = env("DATABASE_URL")
}

model User {
  email String  @unique
  id    Int     @default(autoincrement()) @id
  name  String?
  Post  Post[]
}

model Post {
  authorId  Int?
  content   String?
  id        Int      @default(autoincrement()) @id
  published Boolean? @default(false)
  title     String
  User      User?    @relation(fields: [authorId], references: [id])
}

With your Prisma models in place, you can install and generate Prisma Client.

Install and generate Prisma Client

To install Prisma Client in your project, run the following command in your terminal:

$ npm install @prisma/client

Note that during installation, Prisma automatically invokes the prisma generate command for you. In the future, you need to run this command after every change to your Prisma models to update your generated Prisma Client.
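
You can invoke it at any time via npx:

$ npx prisma generate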

info Note The prisma generate command reads your Prisma schema and updates the generated Prisma Client library inside node_modules/@prisma/client.

Use Prisma Client in your NestJS services

You're now able to send database queries with Prisma Client. If you want to learn more about building queries with Prisma Client, check out the API documentation.

When setting up your NestJS application, you'll want to abstract away the Prisma Client API for database queries within a service. To get started, you can create a new PrismaService that takes care of instantiating PrismaClient and connecting to your database.

Inside the src directory, create a new file called prisma.service.ts and add the following code to it:

import { Injectable, OnModuleInit, OnModuleDestroy } from '@nestjs/common';
import { PrismaClient } from '@prisma/client';

@Injectable()
export class PrismaService
  extends PrismaClient
  implements OnModuleInit, OnModuleDestroy {
  async onModuleInit() {
    await this.$connect();
  }

  async onModuleDestroy() {
    await this.$disconnect();
  }
}

Next, you can write services that you can use to make database calls for the User and Post models from your Prisma schema.

Still inside the src directory, create a new file called user.service.ts and add the following code to it:

import { Injectable } from '@nestjs/common';
import { PrismaService } from './prisma.service';
import {
  UserUpdateInput,
  User,
  UserCreateInput,
  UserWhereUniqueInput,
  UserWhereInput,
  UserOrderByInput,
} from '@prisma/client';

@Injectable()
export class UserService {
  constructor(private prisma: PrismaService) {}

  async user(userWhereUniqueInput: UserWhereUniqueInput): Promise<User | null> {
    return this.prisma.user.findOne({
      where: userWhereUniqueInput,
    });
  }

  async users(params: {
    skip?: number;
    take?: number;
    cursor?: UserWhereUniqueInput;
    where?: UserWhereInput;
    orderBy?: UserOrderByInput;
  }): Promise<User[]> {
    const { skip, take, cursor, where, orderBy } = params;
    return this.prisma.user.findMany({
      skip,
      take,
      cursor,
      where,
      orderBy,
    });
  }

  async createUser(data: UserCreateInput): Promise<User> {
    return this.prisma.user.create({
      data,
    });
  }

  async updateUser(params: {
    where: UserWhereUniqueInput;
    data: UserUpdateInput;
  }): Promise<User> {
    const { where, data } = params;
    return this.prisma.user.update({
      data,
      where,
    });
  }

  async deleteUser(where: UserWhereUniqueInput): Promise<User> {
    return this.prisma.user.delete({
      where,
    });
  }
}

Notice how you're using Prisma Client's generated types to ensure that the methods that are exposed by your service are properly typed. You therefore save the boilerplate of typing your models and creating additional interface or DTO files.

Now do the same for the Post model.

Still inside the src directory, create a new file called post.service.ts and add the following code to it:

import { Injectable } from '@nestjs/common';
import { PrismaService } from './prisma.service';
import {
  PostUpdateInput,
  Post,
  PostCreateInput,
  PostWhereUniqueInput,
  PostWhereInput,
  PostOrderByInput,
} from '@prisma/client';

@Injectable()
export class PostService {
  constructor(private prisma: PrismaService) {}

  async post(postWhereUniqueInput: PostWhereUniqueInput): Promise<Post | null> {
    return this.prisma.post.findOne({
      where: postWhereUniqueInput,
    });
  }

  async posts(params: {
    skip?: number;
    take?: number;
    cursor?: PostWhereUniqueInput;
    where?: PostWhereInput;
    orderBy?: PostOrderByInput;
  }): Promise<Post[]> {
    const { skip, take, cursor, where, orderBy } = params;
    return this.prisma.post.findMany({
      skip,
      take,
      cursor,
      where,
      orderBy,
    });
  }

  async createPost(data: PostCreateInput): Promise<Post> {
    return this.prisma.post.create({
      data,
    });
  }

  async updatePost(params: {
    where: PostWhereUniqueInput;
    data: PostUpdateInput;
  }): Promise<Post> {
    const { data, where } = params;
    return this.prisma.post.update({
      data,
      where,
    });
  }

  async deletePost(where: PostWhereUniqueInput): Promise<Post> {
    return this.prisma.post.delete({
      where,
    });
  }
}

Your UserService and PostService currently wrap the CRUD queries that are available in Prisma Client. In a real world application, the service would also be the place to add business logic to your application. For example, you could have a method called updatePassword inside the UserService that would be responsible for updating the password of a user.

Implement your REST API routes in the main app controller

Finally, you'll use the services you created in the previous sections to implement the different routes of your app. For the purpose of this guide, you'll put all your routes into the already existing AppController class.

Replace the contents of the app.controller.ts file with the following code:

import {
  Controller,
  Get,
  Param,
  Post,
  Body,
  Put,
  Delete,
} from '@nestjs/common';
import { UserService } from './user.service';
import { PostService } from './post.service';
import { User as UserModel, Post as PostModel } from '@prisma/client';

@Controller()
export class AppController {
  constructor(
    private readonly userService: UserService,
    private readonly postService: PostService,
  ) {}

  @Get('post/:id')
  async getPostById(@Param('id') id: string): Promise<PostModel> {
    return this.postService.post({ id: Number(id) });
  }

  @Get('feed')
  async getPublishedPosts(): Promise<PostModel[]> {
    return this.postService.posts({
      where: { published: true },
    });
  }

  @Get('filtered-posts/:searchString')
  async getFilteredPosts(
    @Param('searchString') searchString: string,
  ): Promise<PostModel[]> {
    return this.postService.posts({
      where: {
        OR: [
          {
            title: { contains: searchString },
          },
          {
            content: { contains: searchString },
          },
        ],
      },
    });
  }

  @Post('post')
  async createDraft(
    @Body() postData: { title: string; content?: string; authorEmail: string },
  ): Promise<PostModel> {
    const { title, content, authorEmail } = postData;
    return this.postService.createPost({
      title,
      content,
      User: {
        connect: { email: authorEmail },
      },
    });
  }

  @Post('user')
  async signupUser(
    @Body() userData: { name?: string; email: string },
  ): Promise<UserModel> {
    return this.userService.createUser(userData);
  }

  @Put('publish/:id')
  async publishPost(@Param('id') id: string): Promise<PostModel> {
    return this.postService.updatePost({
      where: { id: Number(id) },
      data: { published: true },
    });
  }

  @Delete('post/:id')
  async deletePost(@Param('id') id: string): Promise<PostModel> {
    return this.postService.deletePost({ id: Number(id) });
  }
}

This controller implements the following routes:

GET
  • /post/:id: Fetch a single post by its id
  • /feed: Fetch all published posts
  • /filtered-posts/:searchString: Filter posts by title or content
POST
  • /post: Create a new post
    • Body:
      • title: String (required): The title of the post
      • content: String (optional): The content of the post
      • authorEmail: String (required): The email of the user that creates the post
  • /user: Create a new user
    • Body:
      • email: String (required): The email address of the user
      • name: String (optional): The name of the user
PUT
  • /publish/:id: Publish a post by its id
DELETE
  • /post/:id: Delete a post by its id
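
Note that the AppController above relies on UserService, PostService, and PrismaService being registered as providers. The snippet below is a minimal sketch of how the default AppModule could be wired up for that; it is not part of the original recipe:

import { Module } from '@nestjs/common';
import { AppController } from './app.controller';
import { PrismaService } from './prisma.service';
import { UserService } from './user.service';
import { PostService } from './post.service';

@Module({
  controllers: [AppController],
  providers: [PrismaService, UserService, PostService],
})
export class AppModule {}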

Summary

In this recipe, you learned how to use Prisma along with NestJS to implement a REST API. The controller that implements the routes of the API is calling a PrismaService which in turn uses Prisma Client to send queries to a database to fulfill the data needs of incoming requests.

If you want to learn more about Prisma, be sure to check out the documentation.

Serve Static

In order to serve static content like a Single Page Application (SPA), we can use the ServeStaticModule from the @nestjs/serve-static package.

Installation

First we need to install the required package:

$ npm install --save @nestjs/serve-static

Bootstrap

Once the installation process is done, we can import the ServeStaticModule into the root AppModule and configure it by passing in a configuration object to the forRoot() method.

import { Module } from '@nestjs/common';
import { AppController } from './app.controller';
import { AppService } from './app.service';
import { ServeStaticModule } from '@nestjs/serve-static';
import { join } from 'path';

@Module({
  imports: [
    ServeStaticModule.forRoot({
      rootPath: join(__dirname, '..', 'client'),
    }),
  ],
  controllers: [AppController],
  providers: [AppService],
})
export class AppModule {}

With this in place, build the static website and place its content in the location specified by the rootPath property.

Summary

A working example is available here.

SQL (Sequelize)

warning Notice This chapter applies only to TypeScript.

Warning In this article, you'll learn how to create a DatabaseModule based on the Sequelize package from scratch using custom components. As a consequence, this technique contains a lot of overhead that you can avoid by using the dedicated, out-of-the-box @nestjs/sequelize package. To learn more, see here.

Sequelize is a popular Object Relational Mapper (ORM) written in vanilla JavaScript, but there is a sequelize-typescript TypeScript wrapper which provides a set of decorators and other extras for the base sequelize.

Getting started

To start the adventure with this library we have to install the following dependencies:

$ npm install --save sequelize sequelize-typescript mysql2
$ npm install --save-dev @types/sequelize

The first step is to create a Sequelize instance with an options object passed into the constructor. We also need to register all models (the alternative is to use the modelPaths property) and sync() our database tables.

@@filename(database.providers)
import { Sequelize } from 'sequelize-typescript';
import { Cat } from '../cats/cat.entity';

export const databaseProviders = [
  {
    provide: 'SEQUELIZE',
    useFactory: async () => {
      const sequelize = new Sequelize({
        dialect: 'mysql',
        host: 'localhost',
        port: 3306,
        username: 'root',
        password: 'password',
        database: 'nest',
      });
      sequelize.addModels([Cat]);
      await sequelize.sync();
      return sequelize;
    },
  },
];

warning Hint Following best practices, we declared the custom provider in a separate file with a *.providers.ts suffix.

Then, we need to export these providers to make them accessible to the rest of the application.

import { Module } from '@nestjs/common';
import { databaseProviders } from './database.providers';

@Module({
  providers: [...databaseProviders],
  exports: [...databaseProviders],
})
export class DatabaseModule {}

Now we can inject the Sequelize object using the @Inject() decorator. Each class that depends on the Sequelize async provider will wait until the Promise is resolved.

Model injection

In Sequelize, the Model defines a table in the database. Instances of this class represent a database row. First, we need at least one entity:

@@filename(cat.entity)
import { Table, Column, Model } from 'sequelize-typescript';

@Table
export class Cat extends Model<Cat> {
  @Column
  name: string;

  @Column
  age: number;

  @Column
  breed: string;
}

The Cat entity belongs to the cats directory. This directory represents the CatsModule. Now it's time to create a Repository provider:

@@filename(cats.providers)
import { Cat } from './cat.entity';

export const catsProviders = [
  {
    provide: 'CATS_REPOSITORY',
    useValue: Cat,
  },
];

Notice In real-world applications you should avoid magic strings. Both CATS_REPOSITORY and SEQUELIZE should be kept in a separate constants.ts file.

In Sequelize, we use static methods to manipulate the data, and thus we created an alias here.

Now we can inject the CATS_REPOSITORY into the CatsService using the @Inject() decorator:

@@filename(cats.service)
import { Injectable, Inject } from '@nestjs/common';
import { CreateCatDto } from './dto/create-cat.dto';
import { Cat } from './cat.entity';

@Injectable()
export class CatsService {
  constructor(
    @Inject('CATS_REPOSITORY') private catsRepository: typeof Cat) {}

  async findAll(): Promise<Cat[]> {
    return this.catsRepository.findAll<Cat>();
  }
}
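
The service above only exposes findAll(); persisting data works the same way through the injected repository. A small sketch, assuming a CreateCatDto similar to the one used in other chapters:

async create(createCatDto: CreateCatDto): Promise<Cat> {
  return this.catsRepository.create<Cat>({ ...createCatDto });
}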

The database connection is asynchronous, but Nest makes this process completely invisible to the end user. The CATS_REPOSITORY provider waits for the database connection, and the CatsService is delayed until the repository is ready to use. The entire application can start once every class has been instantiated.

Here is a final CatsModule:

@@filename(cats.module)
import { Module } from '@nestjs/common';
import { CatsController } from './cats.controller';
import { CatsService } from './cats.service';
import { catsProviders } from './cats.providers';
import { DatabaseModule } from '../database/database.module';

@Module({
  imports: [DatabaseModule],
  controllers: [CatsController],
  providers: [
    CatsService,
    ...catsProviders,
  ],
})
export class CatsModule {}

warning Hint Do not forget to import the CatsModule into the root ApplicationModule.

SQL (TypeORM)

warning Notice This chapter applies only to TypeScript.

Warning In this article, you'll learn how to create a DatabaseModule based on the TypeORM package from scratch using the custom providers mechanism. As a consequence, this solution contains a lot of overhead that you can avoid by using the dedicated, out-of-the-box @nestjs/typeorm package. To learn more, see here.

TypeORM is definitely the most mature Object Relational Mapper (ORM) available in the Node.js world. Since it's written in TypeScript, it works pretty well with the Nest framework.

Getting started

To start the adventure with this library we have to install all required dependencies:

$ npm install --save typeorm mysql

The first step is to establish the connection with our database using the createConnection() function imported from the typeorm package. The createConnection() function returns a Promise, and therefore we have to create an async provider.

@@filename(database.providers)
import { createConnection } from 'typeorm';

export const databaseProviders = [
  {
    provide: 'DATABASE_CONNECTION',
    useFactory: async () => await createConnection({
      type: 'mysql',
      host: 'localhost',
      port: 3306,
      username: 'root',
      password: 'root',
      database: 'test',
      entities: [
          __dirname + '/../**/*.entity{.ts,.js}',
      ],
      synchronize: true,
    }),
  },
];

warning Hint Following best practices, we declared the custom provider in a separate file with a *.providers.ts suffix.

Then, we need to export these providers to make them accessible to the rest of the application.

@@filename(database.module)
import { Module } from '@nestjs/common';
import { databaseProviders } from './database.providers';

@Module({
  providers: [...databaseProviders],
  exports: [...databaseProviders],
})
export class DatabaseModule {}

Now we can inject the Connection object using the @Inject() decorator. Each class that depends on the Connection async provider will wait until the Promise is resolved.

Repository pattern

TypeORM supports the repository design pattern, so each entity has its own repository. These repositories can be obtained from the database connection.

First, we need at least one entity. We are going to reuse the Photo entity from the official TypeORM documentation.

@@filename(photo.entity)
import { Entity, Column, PrimaryGeneratedColumn } from 'typeorm';

@Entity()
export class Photo {
  @PrimaryGeneratedColumn()
  id: number;

  @Column({ length: 500 })
  name: string;

  @Column('text')
  description: string;

  @Column()
  filename: string;

  @Column('int')
  views: number;

  @Column()
  isPublished: boolean;
}

The Photo entity belongs to the photo directory. This directory represents the PhotoModule. Now, let's create a Repository provider:

@@filename(photo.providers)
import { Connection, Repository } from 'typeorm';
import { Photo } from './photo.entity';

export const photoProviders = [
  {
    provide: 'PHOTO_REPOSITORY',
    useFactory: (connection: Connection) => connection.getRepository(Photo),
    inject: ['DATABASE_CONNECTION'],
  },
];

warning Notice In real-world applications you should avoid magic strings. Both PHOTO_REPOSITORY and DATABASE_CONNECTION should be kept in a separate constants.ts file.

Now we can inject the Repository<Photo> into the PhotoService using the @Inject() decorator:

@@filename(photo.service)
import { Injectable, Inject } from '@nestjs/common';
import { Repository } from 'typeorm';
import { Photo } from './photo.entity';

@Injectable()
export class PhotoService {
  constructor(
    @Inject('PHOTO_REPOSITORY')
    private photoRepository: Repository<Photo>,
  ) {}

  async findAll(): Promise<Photo[]> {
    return this.photoRepository.find();
  }
}
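
Writing data follows the same pattern; for example, a create method could delegate to the repository's save() method (a sketch, assuming a hypothetical CreatePhotoDto):

async create(createPhotoDto: CreatePhotoDto): Promise<Photo> {
  return this.photoRepository.save(createPhotoDto);
}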

The database connection is asynchronous, but Nest makes this process completely invisible to the end user. The PhotoRepository waits for the database connection, and the PhotoService is delayed until the repository is ready to use. The entire application can start once every class has been instantiated.

Here is a final PhotoModule:

@@filename(photo.module)
import { Module } from '@nestjs/common';
import { DatabaseModule } from '../database/database.module';
import { photoProviders } from './photo.providers';
import { PhotoService } from './photo.service';

@Module({
  imports: [DatabaseModule],
  providers: [
    ...photoProviders,
    PhotoService,
  ],
})
export class PhotoModule {}

warning Hint Do not forget to import the PhotoModule into the root ApplicationModule.

Healthchecks (Terminus)

The NestJS Terminus integration supports you with readiness / liveness health checks. Healthchecks are very important when it comes to complex backend setups. In a nutshell, a health check in the realm of web development usually consists of a special address, for example, https://my-website.com/health/readiness. A service, or a component of your infrastructure (e.g., Kubernetes) checks this address continuously. Depending on the HTTP status code returned from a GET request to this address the service will take action when it receives an "unhealthy" response. Since the definition of "healthy" or "unhealthy" varies with the type of service you provide, the NestJS Terminus integration supports you with a set of health indicators.

As an example, if your web server uses MongoDB to store its data, it would be crucial information whether MongoDB is still up and running. In that case, you can make use of the MongooseHealthIndicator. If configured correctly - more on that later - your health check address will return a healthy or unhealthy HTTP status code, depending on whether MongoDB is running.

Getting started

To get started with @nestjs/terminus we need to install the required dependency.

$ npm install --save @nestjs/terminus

Setting up a Healthcheck

A health check represents a summary of health indicators. A health indicator executes a check of a service, whether it is in a healthy or unhealthy state. A health check is positive if all the assigned health indicators are up and running. Because a lot of applications will need similar health indicators, @nestjs/terminus provides a set of predefined indicators, such as:

  • DNSHealthIndicator
  • TypeOrmHealthIndicator
  • MongooseHealthIndicator
  • MicroserviceHealthIndicator
  • GRPCHealthIndicator
  • MemoryHealthIndicator
  • DiskHealthIndicator

To get started with our first health check, we need to import the TerminusModule into our AppModule.

@@filename(app.module)
import { Module } from '@nestjs/common';
import { TerminusModule } from '@nestjs/terminus';

@Module({
  imports: [TerminusModule]
})
export class AppModule { }

Our healthcheck(s) can be executed using a controller, which can be easily set up using the NestJS CLI.

$ nest generate controller health

info Info It is highly recommended to enable shutdown hooks in your application. The Terminus integration makes use of this lifecycle event if enabled. Read more about shutdown hooks here.
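
Shutdown hooks are not enabled by default; a minimal sketch of enabling them in the bootstrap function:

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  app.enableShutdownHooks();
  await app.listen(3000);
}
bootstrap();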

DNS Healthcheck

Once we have installed @nestjs/terminus, imported our TerminusModule and created a new controller, we are ready to create a health check.

@@filename(health.controller)
@Controller('health')
export class HealthController {
  constructor(
    private health: HealthCheckService,
    private dns: DNSHealthIndicator,
  ) {}

  @Get()
  @HealthCheck()
  check() {
    return this.health.check([
      () => this.dns.pingCheck('nestjs-docs', 'https://docs.nestjs.com'),
    ]);
  }
}
@@switch
@Controller('health')
@Dependencies(HealthCheckService, DNSHealthIndicator)
export class HealthController {
  constructor(health, dns) {
    this.health = health;
    this.dns = dns;
  }

  @Get()
  @HealthCheck()
  healthCheck() {
    return this.health.check([
      async () => this.dns.pingCheck('nestjs-docs', 'https://docs.nestjs.com'),
    ])
  }
}

Our health check will now send a GET-request to the https://docs.nestjs.com address. If we get a healthy response from that address, our route at http://localhost:3000/health will return the following object with a 200 status code.

{
  "status": "ok",
  "info": {
    "nestjs-docs": {
      "status": "up"
    }
  },
  "error": {},
  "details": {
    "nestjs-docs": {
      "status": "up"
    }
  }
}

The interface of this response object can be accessed from the @nestjs/terminus package with the HealthCheckResult interface.

status If any health indicator failed, the status will be 'error'. If the NestJS app is shutting down but still accepting HTTP requests, the health check will have the 'shutting_down' status. 'error' | 'ok' | 'shutting_down'
info Object containing information about each health indicator with status 'up', or in other words, "healthy". object
error Object containing information about each health indicator with status 'down', or in other words, "unhealthy". object
details Object containing information about every health indicator. object
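Roughly, the interface has the following shape (a sketch based on the table above; consult the @nestjs/terminus typings for the exact definitions):

// Approximate shape only - the authoritative types live in @nestjs/terminus
interface HealthIndicatorResult {
  [key: string]: { status: string; [optionalKeys: string]: any };
}

interface HealthCheckResult {
  status: 'error' | 'ok' | 'shutting_down';
  info?: HealthIndicatorResult;   // indicators that are "up"
  error?: HealthIndicatorResult;  // indicators that are "down"
  details: HealthIndicatorResult; // all indicators
}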

Custom health indicator

In some cases, the predefined health indicators provided by @nestjs/terminus do not cover all of your health check requirements. In that case, you can set up a custom health indicator according to your needs.

Let's get started by creating a service that will represent our custom indicator. To get a basic understanding of how an indicator is structured, we will create an example DogHealthIndicator. This service should have the state 'up' if every Dog object has the type 'goodboy'. If that condition is not satisfied then it should throw an error.

@@filename(dog.health)
import { Injectable } from '@nestjs/common';
import { HealthIndicator, HealthIndicatorResult, HealthCheckError } from '@nestjs/terminus';

export interface Dog {
  name: string;
  type: string;
}

@Injectable()
export class DogHealthIndicator extends HealthIndicator {
  private dogs: Dog[] = [
    { name: 'Fido', type: 'goodboy' },
    { name: 'Rex', type: 'badboy' },
  ];

  async isHealthy(key: string): Promise<HealthIndicatorResult> {
    const badboys = this.dogs.filter(dog => dog.type === 'badboy');
    const isHealthy = badboys.length === 0;
    const result = this.getStatus(key, isHealthy, { badboys: badboys.length });

    if (isHealthy) {
      return result;
    }
    throw new HealthCheckError('Dogcheck failed', result);
  }
}
@@switch
import { Injectable } from '@nestjs/common';
import { HealthIndicator, HealthCheckError } from '@nestjs/terminus';

@Injectable()
export class DogHealthIndicator extends HealthIndicator {
  dogs = [
    { name: 'Fido', type: 'goodboy' },
    { name: 'Rex', type: 'badboy' },
  ];

  async isHealthy(key) {
    const badboys = this.dogs.filter(dog => dog.type === 'badboy');
    const isHealthy = badboys.length === 0;
    const result = this.getStatus(key, isHealthy, { badboys: badboys.length });

    if (isHealthy) {
      return result;
    }
    throw new HealthCheckError('Dogcheck failed', result);
  }
}

The next thing we need to do is register the health indicator as a provider.

@@filename(app.module)
import { Module } from '@nestjs/common';
import { TerminusModule } from '@nestjs/terminus';
import { DogHealthIndicator } from './dog.health';

@Module({
  controllers: [HealthController],
  imports: [TerminusModule],
  providers: [DogHealthIndicator]
})
export class AppModule { }

info Hint In a real-world application the DogHealthIndicator should be provided in a separate module, for example, DogModule, which then will be imported by the AppModule.
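Such a DogModule could look like the following sketch; the exports array is an assumption, so that the importing module (e.g., AppModule) can inject the indicator:

import { Module } from '@nestjs/common';
import { DogHealthIndicator } from './dog.health';

@Module({
  providers: [DogHealthIndicator],
  // Export the indicator so that modules importing DogModule can inject it
  exports: [DogHealthIndicator],
})
export class DogModule {}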

The last required step is to add the now-available health indicator to a health check endpoint. For that, we go back to our HealthController and add it to our check function.

@@filename(health.controller)
import { Controller, Get } from '@nestjs/common';
import { HealthCheck, HealthCheckService } from '@nestjs/terminus';
import { DogHealthIndicator } from './dog.health';

@Controller('health')
export class HealthController {
  constructor(
    private health: HealthCheckService,
    private dogHealthIndicator: DogHealthIndicator
  ) {}

  @Get()
  @HealthCheck()
  healthCheck() {
    return this.health.check([
      async () => this.dogHealthIndicator.isHealthy('dog'),
    ]);
  }
}
@@switch
import { Controller, Get, Dependencies } from '@nestjs/common';
import { HealthCheck, HealthCheckService } from '@nestjs/terminus';
import { DogHealthIndicator } from './dog.health';

@Controller('health')
@Dependencies(HealthCheckService, DogHealthIndicator)
export class HealthController {
  constructor(health, dogHealthIndicator) {
    this.health = health;
    this.dogHealthIndicator = dogHealthIndicator;
  }

  @Get()
  @HealthCheck()
  healthCheck() {
    return this.health.check([
      async () => this.dogHealthIndicator.isHealthy('dog'),
    ]);
  }
}

Authentication

Authentication is an essential part of most applications. There are many different approaches and strategies to handle authentication. The approach taken for any project depends on its particular application requirements. This chapter presents several approaches to authentication that can be adapted to a variety of different requirements.

Passport is the most popular Node.js authentication library, well-known by the community and successfully used in many production applications. It's straightforward to integrate this library with a Nest application using the @nestjs/passport module. At a high level, Passport executes a series of steps to:

  • Authenticate a user by verifying their "credentials" (such as username/password, JSON Web Token (JWT), or identity token from an Identity Provider)
  • Manage authenticated state (by issuing a portable token, such as a JWT, or creating an Express session)
  • Attach information about the authenticated user to the Request object for further use in route handlers

Passport has a rich ecosystem of strategies that implement various authentication mechanisms. While simple in concept, the set of Passport strategies you can choose from is large and presents a lot of variety. Passport abstracts these varied steps into a standard pattern, and the @nestjs/passport module wraps and standardizes this pattern into familiar Nest constructs.

In this chapter, we'll implement a complete end-to-end authentication solution for a RESTful API server using these powerful and flexible modules. You can use the concepts described here to implement any Passport strategy to customize your authentication scheme. You can follow the steps in this chapter to build this complete example. You can find a repository with a completed sample app here.

Authentication requirements

Let's flesh out our requirements. For this use case, clients will start by authenticating with a username and password. Once authenticated, the server will issue a JWT that can be sent as a bearer token in an authorization header on subsequent requests to prove authentication. We'll also create a protected route that is accessible only to requests that contain a valid JWT.

We'll start with the first requirement: authenticating a user. We'll then extend that by issuing a JWT. Finally, we'll create a protected route that checks for a valid JWT on the request.

First we need to install the required packages. Passport provides a strategy called passport-local that implements a username/password authentication mechanism, which suits our needs for this portion of our use case.

$ npm install --save @nestjs/passport passport passport-local
$ npm install --save-dev @types/passport-local

warning Notice For any Passport strategy you choose, you'll always need the @nestjs/passport and passport packages. Then, you'll need to install the strategy-specific package (e.g., passport-jwt or passport-local) that implements the particular authentication strategy you are building. In addition, you can also install the type definitions for any Passport strategy, as shown above with @types/passport-local, which provides assistance while writing TypeScript code.

Implementing Passport strategies

We're now ready to implement the authentication feature. We'll start with an overview of the process used for any Passport strategy. It's helpful to think of Passport as a mini framework in itself. The elegance of the framework is that it abstracts the authentication process into a few basic steps that you customize based on the strategy you're implementing. It's like a framework because you configure it by supplying customization parameters (as plain JSON objects) and custom code in the form of callback functions, which Passport calls at the appropriate time. The @nestjs/passport module wraps this framework in a Nest style package, making it easy to integrate into a Nest application. We'll use @nestjs/passport below, but first let's consider how vanilla Passport works.

In vanilla Passport, you configure a strategy by providing two things:

  1. A set of options that are specific to that strategy. For example, in a JWT strategy, you might provide a secret to sign tokens.
  2. A "verify callback", which is where you tell Passport how to interact with your user store (where you manage user accounts). Here, you verify whether a user exists (and/or create a new user), and whether their credentials are valid. The Passport library expects this callback to return a full user if the validation succeeds, or a null if it fails (failure is defined as either the user is not found, or, in the case of passport-local, the password does not match).

With @nestjs/passport, you configure a Passport strategy by extending the PassportStrategy class. You pass the strategy options (item 1 above) by calling the super() method in your subclass, optionally passing in an options object. You provide the verify callback (item 2 above) by implementing a validate() method in your subclass.
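In outline, a strategy wrapped by @nestjs/passport has the following shape (only a sketch of the pattern; the concrete LocalStrategy and JwtStrategy below fill in the details):

import { Injectable } from '@nestjs/common';
import { PassportStrategy } from '@nestjs/passport';
import { Strategy } from 'passport-local'; // any Passport strategy class can be used here

@Injectable()
export class ExampleStrategy extends PassportStrategy(Strategy) {
  constructor() {
    // Item 1: pass strategy-specific options to super()
    super();
  }

  // Item 2: the "verify callback" - return a user object on success, throw on failure
  async validate(username: string, password: string): Promise<any> {
    return { userId: 1, username };
  }
}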

We'll start by generating an AuthModule and in it, an AuthService:

$ nest g module auth
$ nest g service auth

As we implement the AuthService, we'll find it useful to encapsulate user operations in a UsersService, so let's generate that module and service now:

$ nest g module users
$ nest g service users

Replace the default contents of these generated files as shown below. For our sample app, the UsersService simply maintains a hard-coded in-memory list of users, and a find method to retrieve one by username. In a real app, this is where you'd build your user model and persistence layer, using your library of choice (e.g., TypeORM, Sequelize, Mongoose, etc.).

@@filename(users/users.service)
import { Injectable } from '@nestjs/common';

export type User = any;

@Injectable()
export class UsersService {
  private readonly users: User[];

  constructor() {
    this.users = [
      {
        userId: 1,
        username: 'john',
        password: 'changeme',
      },
      {
        userId: 2,
        username: 'chris',
        password: 'secret',
      },
      {
        userId: 3,
        username: 'maria',
        password: 'guess',
      },
    ];
  }

  async findOne(username: string): Promise<User | undefined> {
    return this.users.find(user => user.username === username);
  }
}
@@switch
import { Injectable } from '@nestjs/common';

@Injectable()
export class UsersService {
  constructor() {
    this.users = [
      {
        userId: 1,
        username: 'john',
        password: 'changeme',
      },
      {
        userId: 2,
        username: 'chris',
        password: 'secret',
      },
      {
        userId: 3,
        username: 'maria',
        password: 'guess',
      },
    ];
  }

  async findOne(username) {
    return this.users.find(user => user.username === username);
  }
}

In the UsersModule, the only change needed is to add the UsersService to the exports array of the @Module decorator so that it is visible outside this module (we'll soon use it in our AuthService).

@@filename(users/users.module)
import { Module } from '@nestjs/common';
import { UsersService } from './users.service';

@Module({
  providers: [UsersService],
  exports: [UsersService],
})
export class UsersModule {}
@@switch
import { Module } from '@nestjs/common';
import { UsersService } from './users.service';

@Module({
  providers: [UsersService],
  exports: [UsersService],
})
export class UsersModule {}

Our AuthService has the job of retrieving a user and verifying the password. We create a validateUser() method for this purpose. In the code below, we use convenient object destructuring with rest syntax to strip the password property from the user object before returning it. We'll be calling into the validateUser() method from our Passport local strategy in a moment.

@@filename(auth/auth.service)
import { Injectable } from '@nestjs/common';
import { UsersService } from '../users/users.service';

@Injectable()
export class AuthService {
  constructor(private usersService: UsersService) {}

  async validateUser(username: string, pass: string): Promise<any> {
    const user = await this.usersService.findOne(username);
    if (user && user.password === pass) {
      const { password, ...result } = user;
      return result;
    }
    return null;
  }
}
@@switch
import { Injectable, Dependencies } from '@nestjs/common';
import { UsersService } from '../users/users.service';

@Injectable()
@Dependencies(UsersService)
export class AuthService {
  constructor(usersService) {
    this.usersService = usersService;
  }

  async validateUser(username, pass) {
    const user = await this.usersService.findOne(username);
    if (user && user.password === pass) {
      const { password, ...result } = user;
      return result;
    }
    return null;
  }
}

Warning Warning Of course in a real application, you wouldn't store a password in plain text. You'd instead use a library like bcrypt, with a salted one-way hash algorithm. With that approach, you'd only store hashed passwords, and then compare the stored password to a hashed version of the incoming password, thus never storing or exposing user passwords in plain text. To keep our sample app simple, we violate that absolute mandate and use plain text. Don't do this in your real app!
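For illustration only, a hashed-password version of validateUser() might look roughly like this, assuming user.password stores a bcrypt hash and the bcrypt package is installed:

import * as bcrypt from 'bcrypt';

async validateUser(username: string, pass: string): Promise<any> {
  const user = await this.usersService.findOne(username);
  // Compare the plain-text candidate against the stored hash instead of using ===
  if (user && (await bcrypt.compare(pass, user.password))) {
    const { password, ...result } = user;
    return result;
  }
  return null;
}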

Now, we update our AuthModule to import the UsersModule.

@@filename(auth/auth.module)
import { Module } from '@nestjs/common';
import { AuthService } from './auth.service';
import { UsersModule } from '../users/users.module';

@Module({
  imports: [UsersModule],
  providers: [AuthService],
})
export class AuthModule {}
@@switch
import { Module } from '@nestjs/common';
import { AuthService } from './auth.service';
import { UsersModule } from '../users/users.module';

@Module({
  imports: [UsersModule],
  providers: [AuthService],
})
export class AuthModule {}

Implementing Passport local

Now we can implement our Passport local authentication strategy. Create a file called local.strategy.ts in the auth folder, and add the following code:

@@filename(auth/local.strategy)
import { Strategy } from 'passport-local';
import { PassportStrategy } from '@nestjs/passport';
import { Injectable, UnauthorizedException } from '@nestjs/common';
import { AuthService } from './auth.service';

@Injectable()
export class LocalStrategy extends PassportStrategy(Strategy) {
  constructor(private authService: AuthService) {
    super();
  }

  async validate(username: string, password: string): Promise<any> {
    const user = await this.authService.validateUser(username, password);
    if (!user) {
      throw new UnauthorizedException();
    }
    return user;
  }
}
@@switch
import { Strategy } from 'passport-local';
import { PassportStrategy } from '@nestjs/passport';
import { Injectable, UnauthorizedException, Dependencies } from '@nestjs/common';
import { AuthService } from './auth.service';

@Injectable()
@Dependencies(AuthService)
export class LocalStrategy extends PassportStrategy(Strategy) {
  constructor(authService) {
    super();
    this.authService = authService;
  }

  async validate(username, password) {
    const user = await this.authService.validateUser(username, password);
    if (!user) {
      throw new UnauthorizedException();
    }
    return user;
  }
}

We've followed the recipe described earlier for all Passport strategies. In our use case with passport-local, there are no configuration options, so our constructor simply calls super(), without an options object.

info Hint We can pass an options object in the call to super() to customize the behavior of the passport strategy. In this example, the passport-local strategy by default expects properties called username and password in the request body. Pass an options object to specify different property names, for example: super({ usernameField: 'email' }). See the Passport documentation for more information.

We've also implemented the validate() method. For each strategy, Passport will call the verify function (implemented with the validate() method in @nestjs/passport) using an appropriate strategy-specific set of parameters. For the local-strategy, Passport expects a validate() method with the following signature: validate(username: string, password: string): any.

Most of the validation work is done in our AuthService (with the help of our UsersService), so this method is quite straightforward. The validate() method for any Passport strategy will follow a similar pattern, varying only in the details of how credentials are represented. If a user is found and the credentials are valid, the user is returned so Passport can complete its tasks (e.g., creating the user property on the Request object), and the request handling pipeline can continue. If it's not found, we throw an exception and let our exceptions layer handle it.

Typically, the only significant difference in the validate() method for each strategy is how you determine if a user exists and is valid. For example, in a JWT strategy, depending on requirements, we may evaluate whether the userId carried in the decoded token matches a record in our user database, or matches a list of revoked tokens. Hence, this pattern of sub-classing and implementing strategy-specific validation is consistent, elegant and extensible.

We need to configure our AuthModule to use the Passport features we just defined. Update auth.module.ts to look like this:

@@filename(auth/auth.module)
import { Module } from '@nestjs/common';
import { AuthService } from './auth.service';
import { UsersModule } from '../users/users.module';
import { PassportModule } from '@nestjs/passport';
import { LocalStrategy } from './local.strategy';

@Module({
  imports: [UsersModule, PassportModule],
  providers: [AuthService, LocalStrategy],
})
export class AuthModule {}
@@switch
import { Module } from '@nestjs/common';
import { AuthService } from './auth.service';
import { UsersModule } from '../users/users.module';
import { PassportModule } from '@nestjs/passport';
import { LocalStrategy } from './local.strategy';

@Module({
  imports: [UsersModule, PassportModule],
  providers: [AuthService, LocalStrategy],
})
export class AuthModule {}

Built-in Passport Guards

The Guards chapter describes the primary function of Guards: to determine whether a request will be handled by the route handler or not. That remains true, and we'll use that standard capability soon. However, in the context of using the @nestjs/passport module, we will also introduce a slight new wrinkle that may at first be confusing, so let's discuss that now. Consider that your app can exist in two states, from an authentication perspective:

  1. the user/client is not logged in (is not authenticated)
  2. the user/client is logged in (is authenticated)

In the first case (user is not logged in), we need to perform two distinct functions:

  • Restrict the routes an unauthenticated user can access (i.e., deny access to restricted routes). We'll use Guards in their familiar capacity to handle this function, by placing a Guard on the protected routes. As you may anticipate, we'll be checking for the presence of a valid JWT in this Guard, so we'll work on this Guard later, once we are successfully issuing JWTs.

  • Initiate the authentication step itself when a previously unauthenticated user attempts to login. This is the step where we'll issue a JWT to a valid user. Thinking about this for a moment, we know we'll need to POST username/password credentials to initiate authentication, so we'll set up a POST /auth/login route to handle that. This raises the question: how exactly do we invoke the passport-local strategy in that route?

The answer is straightforward: by using another, slightly different type of Guard. The @nestjs/passport module provides us with a built-in Guard that does this for us. This Guard invokes the Passport strategy and kicks off the steps described above (retrieving credentials, running the verify function, creating the user property, etc).

The second case enumerated above (logged in user) simply relies on the standard type of Guard we already discussed to enable access to protected routes for logged in users.

Login route

With the strategy in place, we can now implement a bare-bones /auth/login route, and apply the built-in Guard to initiate the passport-local flow.

Open the app.controller.ts file and replace its contents with the following:

@@filename(app.controller)
import { Controller, Request, Post, UseGuards } from '@nestjs/common';
import { AuthGuard } from '@nestjs/passport';

@Controller()
export class AppController {
  @UseGuards(AuthGuard('local'))
  @Post('auth/login')
  async login(@Request() req) {
    return req.user;
  }
}
@@switch
import { Controller, Bind, Request, Post, UseGuards } from '@nestjs/common';
import { AuthGuard } from '@nestjs/passport';

@Controller()
export class AppController {
  @UseGuards(AuthGuard('local'))
  @Post('auth/login')
  @Bind(Request())
  async login(req) {
    return req.user;
  }
}

With @UseGuards(AuthGuard('local')) we are using an AuthGuard that @nestjs/passport automatically provisioned for us when we extended the passport-local strategy. Let's break that down. Our Passport local strategy has a default name of 'local'. We reference that name in the @UseGuards() decorator to associate it with code supplied by the passport-local package. This is used to disambiguate which strategy to invoke in case we have multiple Passport strategies in our app (each of which may provision a strategy-specific AuthGuard). While we only have one such strategy so far, we'll shortly add a second, so this is needed for disambiguation.

To test this out, we'll have our /auth/login route simply return the user for now. This also lets us demonstrate another Passport feature: Passport automatically creates a user object, based on the value we return from the validate() method, and assigns it to the Request object as req.user. Later, we'll replace this with code to create and return a JWT instead.

Since these are API routes, we'll test them using the commonly available cURL command-line tool. You can test with any of the user objects hard-coded in the UsersService.

$ # POST to /auth/login
$ curl -X POST http://localhost:3000/auth/login -d '{"username": "john", "password": "changeme"}' -H "Content-Type: application/json"
$ # result -> {"userId":1,"username":"john"}

While this works, passing the strategy name directly to the AuthGuard() introduces magic strings in the codebase. Instead, we recommend creating your own class, as shown below:

@@filename(auth/local-auth.guard)
import { Injectable } from '@nestjs/common';
import { AuthGuard } from '@nestjs/passport';

@Injectable()
export class LocalAuthGuard extends AuthGuard('local') {}

Now, we can update the /auth/login route handler and use the LocalAuthGuard instead:

@UseGuards(LocalAuthGuard)
@Post('auth/login')
async login(@Request() req) {
  return req.user;
}

JWT functionality

We're ready to move on to the JWT portion of our auth system. Let's review and refine our requirements:

  • Allow users to authenticate with username/password, returning a JWT for use in subsequent calls to protected API endpoints. We're well on our way to meeting this requirement. To complete it, we'll need to write the code that issues a JWT.
  • Create API routes which are protected based on the presence of a valid JWT as a bearer token

We'll need to install a couple more packages to support our JWT requirements:

$ npm install --save @nestjs/jwt passport-jwt
$ npm install --save-dev @types/passport-jwt

The @nestjs/jwt package (see more here) is a utility package that helps with JWT manipulation. The passport-jwt package is the Passport package that implements the JWT strategy and @types/passport-jwt provides the TypeScript type definitions.

Let's take a closer look at how a POST /auth/login request is handled. We've decorated the route using the built-in AuthGuard provided by the passport-local strategy. This means that:

  1. The route handler will only be invoked if the user has been validated
  2. The req parameter will contain a user property (populated by Passport during the passport-local authentication flow)

With this in mind, we can now finally generate a real JWT, and return it in this route. To keep our services cleanly modularized, we'll handle generating the JWT in the authService. Open the auth.service.ts file in the auth folder, and add the login() method, and import the JwtService as shown:

@@filename(auth/auth.service)
import { Injectable } from '@nestjs/common';
import { UsersService } from '../users/users.service';
import { JwtService } from '@nestjs/jwt';

@Injectable()
export class AuthService {
  constructor(
    private usersService: UsersService,
    private jwtService: JwtService
  ) {}

  async validateUser(username: string, pass: string): Promise<any> {
    const user = await this.usersService.findOne(username);
    if (user && user.password === pass) {
      const { password, ...result } = user;
      return result;
    }
    return null;
  }

  async login(user: any) {
    const payload = { username: user.username, sub: user.userId };
    return {
      access_token: this.jwtService.sign(payload),
    };
  }
}

@@switch
import { Injectable, Dependencies } from '@nestjs/common';
import { UsersService } from '../users/users.service';
import { JwtService } from '@nestjs/jwt';

@Dependencies(UsersService, JwtService)
@Injectable()
export class AuthService {
  constructor(usersService, jwtService) {
    this.usersService = usersService;
    this.jwtService = jwtService;
  }

  async validateUser(username, pass) {
    const user = await this.usersService.findOne(username);
    if (user && user.password === pass) {
      const { password, ...result } = user;
      return result;
    }
    return null;
  }

  async login(user) {
    const payload = { username: user.username, sub: user.userId };
    return {
      access_token: this.jwtService.sign(payload),
    };
  }
}

We're using the @nestjs/jwt library, which supplies a sign() function to generate our JWT from a subset of the user object properties, which we then return as a simple object with a single access_token property. Note: we choose a property name of sub to hold our userId value to be consistent with JWT standards. Don't forget to inject the JwtService provider into the AuthService.

We now need to update the AuthModule to import the new dependencies and configure the JwtModule.

First, create constants.ts in the auth folder, and add the following code:

@@filename(auth/constants)
export const jwtConstants = {
  secret: 'secretKey',
};
@@switch
export const jwtConstants = {
  secret: 'secretKey',
};

We'll use this to share our key between the JWT signing and verifying steps.

Warning Warning Do not expose this key publicly. We have done so here to make it clear what the code is doing, but in a production system you must protect this key using appropriate measures such as a secrets vault, environment variable, or configuration service.
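As a minimal sketch of that idea, the constant could read the secret from an environment variable rather than from source code (the variable name JWT_SECRET is an assumption for illustration):

// auth/constants.ts - sketch: keep the secret out of source control
export const jwtConstants = {
  secret: process.env.JWT_SECRET,
};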

Now, open auth.module.ts in the auth folder and update it to look like this:

@@filename(auth/auth.module)
import { Module } from '@nestjs/common';
import { AuthService } from './auth.service';
import { LocalStrategy } from './local.strategy';
import { UsersModule } from '../users/users.module';
import { PassportModule } from '@nestjs/passport';
import { JwtModule } from '@nestjs/jwt';
import { jwtConstants } from './constants';

@Module({
  imports: [
    UsersModule,
    PassportModule,
    JwtModule.register({
      secret: jwtConstants.secret,
      signOptions: { expiresIn: '60s' },
    }),
  ],
  providers: [AuthService, LocalStrategy],
  exports: [AuthService],
})
export class AuthModule {}
@@switch
import { Module } from '@nestjs/common';
import { AuthService } from './auth.service';
import { LocalStrategy } from './local.strategy';
import { UsersModule } from '../users/users.module';
import { PassportModule } from '@nestjs/passport';
import { JwtModule } from '@nestjs/jwt';
import { jwtConstants } from './constants';

@Module({
  imports: [
    UsersModule,
    PassportModule,
    JwtModule.register({
      secret: jwtConstants.secret,
      signOptions: { expiresIn: '60s' },
    }),
  ],
  providers: [AuthService, LocalStrategy],
  exports: [AuthService],
})
export class AuthModule {}

We configure the JwtModule using register(), passing in a configuration object. See here for more on the Nest JwtModule and here for more details on the available configuration options.

Now we can update the /auth/login route to return a JWT.

@@filename(app.controller)
import { Controller, Request, Post, UseGuards } from '@nestjs/common';
import { LocalAuthGuard } from './auth/local-auth.guard';
import { AuthService } from './auth/auth.service';

@Controller()
export class AppController {
  constructor(private authService: AuthService) {}

  @UseGuards(LocalAuthGuard)
  @Post('auth/login')
  async login(@Request() req) {
    return this.authService.login(req.user);
  }
}
@@switch
import { Controller, Bind, Request, Post, UseGuards } from '@nestjs/common';
import { LocalAuthGuard } from './auth/local-auth.guard';
import { AuthService } from './auth/auth.service';

@Controller()
export class AppController {
  constructor(private authService: AuthService) {}

  @UseGuards(LocalAuthGuard)
  @Post('auth/login')
  @Bind(Request())
  async login(req) {
    return this.authService.login(req.user);
  }
}

Let's go ahead and test our routes using cURL again. You can test with any of the user objects hard-coded in the UsersService.

$ # POST to /auth/login
$ curl -X POST http://localhost:3000/auth/login -d '{"username": "john", "password": "changeme"}' -H "Content-Type: application/json"
$ # result -> {"access_token":"eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9..."}
$ # Note: above JWT truncated

Implementing Passport JWT

We can now address our final requirement: protecting endpoints by requiring a valid JWT be present on the request. Passport can help us here too. It provides the passport-jwt strategy for securing RESTful endpoints with JSON Web Tokens. Start by creating a file called jwt.strategy.ts in the auth folder, and add the following code:

@@filename(auth/jwt.strategy)
import { ExtractJwt, Strategy } from 'passport-jwt';
import { PassportStrategy } from '@nestjs/passport';
import { Injectable } from '@nestjs/common';
import { jwtConstants } from './constants';

@Injectable()
export class JwtStrategy extends PassportStrategy(Strategy) {
  constructor() {
    super({
      jwtFromRequest: ExtractJwt.fromAuthHeaderAsBearerToken(),
      ignoreExpiration: false,
      secretOrKey: jwtConstants.secret,
    });
  }

  async validate(payload: any) {
    return { userId: payload.sub, username: payload.username };
  }
}
@@switch
import { ExtractJwt, Strategy } from 'passport-jwt';
import { PassportStrategy } from '@nestjs/passport';
import { Injectable } from '@nestjs/common';
import { jwtConstants } from './constants';

@Injectable()
export class JwtStrategy extends PassportStrategy(Strategy) {
  constructor() {
    super({
      jwtFromRequest: ExtractJwt.fromAuthHeaderAsBearerToken(),
      ignoreExpiration: false,
      secretOrKey: jwtConstants.secret,
    });
  }

  async validate(payload) {
    return { userId: payload.sub, username: payload.username };
  }
}

With our JwtStrategy, we've followed the same recipe described earlier for all Passport strategies. This strategy requires some initialization, so we do that by passing in an options object in the super() call. You can read more about the available options here. In our case, these options are:

  • jwtFromRequest: supplies the method by which the JWT will be extracted from the Request. We will use the standard approach of supplying a bearer token in the Authorization header of our API requests. Other options are described here.
  • ignoreExpiration: just to be explicit, we choose the default false setting, which delegates the responsibility of ensuring that a JWT has not expired to the Passport module. This means that if our route is supplied with an expired JWT, the request will be denied and a 401 Unauthorized response sent. Passport conveniently handles this automatically for us.
  • secretOrKey: we are using the expedient option of supplying a symmetric secret for signing the token. Other options, such as a PEM-encoded public key, may be more appropriate for production apps (see here for more information). In any case, as cautioned earlier, do not expose this secret publicly.

The validate() method deserves some discussion. For the jwt-strategy, Passport first verifies the JWT's signature and decodes the JSON. It then invokes our validate() method passing the decoded JSON as its single parameter. Based on the way JWT signing works, we're guaranteed that we're receiving a valid token that we have previously signed and issued to a valid user.

As a result of all this, our response to the validate() callback is trivial: we simply return an object containing the userId and username properties. Recall again that Passport will build a user object based on the return value of our validate() method, and attach it as a property on the Request object.

It's also worth pointing out that this approach leaves us room ('hooks' as it were) to inject other business logic into the process. For example, we could do a database lookup in our validate() method to extract more information about the user, resulting in a more enriched user object being available in our Request. This is also the place we may decide to do further token validation, such as looking up the userId in a list of revoked tokens, enabling us to perform token revocation. The model we've implemented here in our sample code is a fast, "stateless JWT" model, where each API call is immediately authorized based on the presence of a valid JWT, and a small bit of information about the requester (its userId and username) is available in our Request pipeline.
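For example, an enriched validate() might look roughly like the sketch below; the injected usersService and tokenService, along with findOne(), isTokenRevoked() and the roles property, are hypothetical and shown only to illustrate where such logic would hook in:

async validate(payload: any) {
  // Hypothetical: reject tokens that have been explicitly revoked
  if (await this.tokenService.isTokenRevoked(payload.sub)) {
    throw new UnauthorizedException();
  }
  // Hypothetical: enrich the request user with additional data from the database
  const user = await this.usersService.findOne(payload.username);
  return { userId: payload.sub, username: payload.username, roles: user?.roles };
}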

Add the new JwtStrategy as a provider in the AuthModule:

@@filename(auth/auth.module)
import { Module } from '@nestjs/common';
import { AuthService } from './auth.service';
import { LocalStrategy } from './local.strategy';
import { JwtStrategy } from './jwt.strategy';
import { UsersModule } from '../users/users.module';
import { PassportModule } from '@nestjs/passport';
import { JwtModule } from '@nestjs/jwt';
import { jwtConstants } from './constants';

@Module({
  imports: [
    UsersModule,
    PassportModule,
    JwtModule.register({
      secret: jwtConstants.secret,
      signOptions: { expiresIn: '60s' },
    }),
  ],
  providers: [AuthService, LocalStrategy, JwtStrategy],
  exports: [AuthService],
})
export class AuthModule {}
@@switch
import { Module } from '@nestjs/common';
import { AuthService } from './auth.service';
import { LocalStrategy } from './local.strategy';
import { JwtStrategy } from './jwt.strategy';
import { UsersModule } from '../users/users.module';
import { PassportModule } from '@nestjs/passport';
import { JwtModule } from '@nestjs/jwt';
import { jwtConstants } from './constants';

@Module({
  imports: [
    UsersModule,
    PassportModule,
    JwtModule.register({
      secret: jwtConstants.secret,
      signOptions: { expiresIn: '60s' },
    }),
  ],
  providers: [AuthService, LocalStrategy, JwtStrategy],
  exports: [AuthService],
})
export class AuthModule {}

By importing the same secret used when we signed the JWT, we ensure that the verify phase performed by Passport, and the sign phase performed in our AuthService, use a common secret.

Finally, we define the JwtAuthGuard class which extends the built-in AuthGuard:

@@filename(auth/jwt-auth.guard)
import { Injectable } from '@nestjs/common';
import { AuthGuard } from '@nestjs/passport';

@Injectable()
export class JwtAuthGuard extends AuthGuard('jwt') {}

Implement protected route and JWT strategy guards

We can now implement our protected route and its associated Guard.

Open the app.controller.ts file and update it as shown below:

@@filename(app.controller)
import { Controller, Get, Request, Post, UseGuards } from '@nestjs/common';
import { JwtAuthGuard } from './auth/jwt-auth.guard';
import { LocalAuthGuard } from './auth/local-auth.guard';
import { AuthService } from './auth/auth.service';

@Controller()
export class AppController {
  constructor(private authService: AuthService) {}

  @UseGuards(LocalAuthGuard)
  @Post('auth/login')
  async login(@Request() req) {
    return this.authService.login(req.user);
  }

  @UseGuards(JwtAuthGuard)
  @Get('profile')
  getProfile(@Request() req) {
    return req.user;
  }
}
@@switch
import { Controller, Dependencies, Bind, Get, Request, Post, UseGuards } from '@nestjs/common';
import { JwtAuthGuard } from './auth/jwt-auth.guard';
import { LocalAuthGuard } from './auth/local-auth.guard';
import { AuthService } from './auth/auth.service';

@Dependencies(AuthService)
@Controller()
export class AppController {
  constructor(authService) {
    this.authService = authService;
  }

  @UseGuards(LocalAuthGuard)
  @Post('auth/login')
  @Bind(Request())
  async login(req) {
    return this.authService.login(req.user);
  }

  @UseGuards(JwtAuthGuard)
  @Get('profile')
  @Bind(Request())
  getProfile(req) {
    return req.user;
  }
}

Once again, we're applying the AuthGuard that the @nestjs/passport module has automatically provisioned for us when we configured the passport-jwt module. This Guard is referenced by its default name, jwt. When our GET /profile route is hit, the Guard will automatically invoke our passport-jwt custom configured logic, validating the JWT, and assigning the user property to the Request object.

Ensure the app is running, and test the routes using cURL.

$ # GET /profile
$ curl http://localhost:3000/profile
$ # result -> {"statusCode":401,"error":"Unauthorized"}

$ # POST /auth/login
$ curl -X POST http://localhost:3000/auth/login -d '{"username": "john", "password": "changeme"}' -H "Content-Type: application/json"
$ # result -> {"access_token":"eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2Vybm... }

$ # GET /profile using access_token returned from previous step as bearer code
$ curl http://localhost:3000/profile -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2Vybm..."
$ # result -> {"userId":1,"username":"john"}

Note that in the AuthModule, we configured the JWT to have an expiration of 60 seconds. This is probably too short an expiration, and dealing with the details of token expiration and refresh is beyond the scope of this article. However, we chose that to demonstrate an important quality of JWTs and the passport-jwt strategy. If you wait 60 seconds after authenticating before attempting a GET /profile request, you'll receive a 401 Unauthorized response. This is because Passport automatically checks the JWT for its expiration time, saving you the trouble of doing so in your application.

We've now completed our JWT authentication implementation. JavaScript clients (such as Angular/React/Vue), and other JavaScript apps, can now authenticate and communicate securely with our API Server. You can find a complete version of the code in this chapter here.

Default strategy

In our AppController, we pass the name of the strategy in the AuthGuard() function. We need to do this because we've introduced two Passport strategies (passport-local and passport-jwt), both of which supply implementations of various Passport components. Passing the name disambiguates which implementation we're linking to. When multiple strategies are included in an application, we can declare a default strategy so that we no longer have to pass the name in the AuthGuard function if using that default strategy. Here's how to register a default strategy when importing the PassportModule. This code would go in the AuthModule:

import { Module } from '@nestjs/common';
import { AuthService } from './auth.service';
import { LocalStrategy } from './local.strategy';
import { UsersModule } from '../users/users.module';
import { PassportModule } from '@nestjs/passport';
import { JwtModule } from '@nestjs/jwt';
import { jwtConstants } from './constants';
import { JwtStrategy } from './jwt.strategy';

@Module({
  imports: [
    PassportModule.register({ defaultStrategy: 'jwt' }),
    JwtModule.register({
      secret: jwtConstants.secret,
      signOptions: { expiresIn: '60s' },
    }),
    UsersModule,
  ],
  providers: [AuthService, LocalStrategy, JwtStrategy],
  exports: [AuthService],
})
export class AuthModule {}

Request-scoped strategies

The passport API is based on registering strategies to the global instance of the library. Therefore strategies are not designed to have request-dependent options or to be dynamically instantiated per request (read more about the request-scoped providers). When you configure your strategy to be request-scoped, Nest will never instantiate it since it's not tied to any specific route. There is no physical way to determine which "request-scoped" strategies should be executed per request.

However, there are ways to dynamically resolve request-scoped providers within the strategy. For this, we leverage the module reference feature.

First, open the local.strategy.ts file and inject the ModuleRef in the normal way:

constructor(private moduleRef: ModuleRef) {
  super({
    passReqToCallback: true,
  });
}

info Hint The ModuleRef class is imported from the @nestjs/core package.

Be sure to set the passReqToCallback configuration property to true, as shown above.

In the next step, the request instance will be used to obtain the current context identifier, instead of generating a new one (read more about request context here).

Now, inside the validate() method of the LocalStrategy class, use the getByRequest() method of the ContextIdFactory class to create a context id based on the request object, and pass this to the resolve() call:

async validate(
  request: Request,
  username: string,
  password: string,
) {
  const contextId = ContextIdFactory.getByRequest(request);
  // "AuthService" is a request-scoped provider
  const authService = await this.moduleRef.resolve(AuthService, contextId);
  ...
}

In the example above, the resolve() method will asynchronously return the request-scoped instance of the AuthService provider (we assumed that AuthService is marked as a request-scoped provider).

Extending guards

In most cases, using a provided AuthGuard class is sufficient. However, there might be use-cases when you would like to simply extend the default error handling or authentication logic. For this, you can extend the built-in class and override methods within a sub-class.

import {
  ExecutionContext,
  Injectable,
  UnauthorizedException,
} from '@nestjs/common';
import { AuthGuard } from '@nestjs/passport';

@Injectable()
export class JwtAuthGuard extends AuthGuard('jwt') {
  canActivate(context: ExecutionContext) {
    // Add your custom authentication logic here
    // for example, call super.logIn(request) to establish a session.
    return super.canActivate(context);
  }

  handleRequest(err, user, info) {
    // You can throw an exception based on either "info" or "err" arguments
    if (err || !user) {
      throw err || new UnauthorizedException();
    }
    return user;
  }
}

Customize Passport

Any standard Passport customization options can be passed the same way, using the register() method. The available options depend on the strategy being implemented. For example:

PassportModule.register({ session: true });

You can also pass an options object to a strategy's constructor to configure the strategy itself. For the local strategy, for example, you can pass:

constructor(private authService: AuthService) {
  super({
    usernameField: 'email',
    passwordField: 'password',
  });
}

Take a look at the official Passport Website for property names.

Named strategies

When implementing a strategy, you can provide a name for it by passing a second argument to the PassportStrategy function. If you don't do this, each strategy will have a default name (e.g., 'jwt' for jwt-strategy):

export class JwtStrategy extends PassportStrategy(Strategy, 'myjwt')

Then, you refer to this via a decorator like @UseGuards(AuthGuard('myjwt')).
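To avoid the magic string here as well, you can wrap the named strategy in its own guard class, following the same pattern as the earlier LocalAuthGuard (a sketch):

import { Injectable } from '@nestjs/common';
import { AuthGuard } from '@nestjs/passport';

@Injectable()
export class MyJwtAuthGuard extends AuthGuard('myjwt') {}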

GraphQL

In order to use an AuthGuard with GraphQL, extend the built-in AuthGuard class and override the getRequest() method.

@Injectable()
export class GqlAuthGuard extends AuthGuard('jwt') {
  getRequest(context: ExecutionContext) {
    const ctx = GqlExecutionContext.create(context);
    return ctx.getContext().req;
  }
}

To use the above construct, be sure to pass the request (req) object as part of the context value in the GraphQL Module settings:

GraphQLModule.forRoot({
  context: ({ req }) => ({ req }),
});

To get the current authenticated user in your GraphQL resolver, you can define a @CurrentUser() decorator:

import { createParamDecorator, ExecutionContext } from '@nestjs/common';
import { GqlExecutionContext } from '@nestjs/graphql';

export const CurrentUser = createParamDecorator(
  (data: unknown, context: ExecutionContext) => {
    const ctx = GqlExecutionContext.create(context);
    return ctx.getContext().req.user;
  },
);

To use the above decorator in your resolver, be sure to include it as a parameter of your query or mutation:

@Query(returns => User)
@UseGuards(GqlAuthGuard)
whoAmI(@CurrentUser() user: User) {
  return this.usersService.findById(user.id);
}

Caching

Caching is a great and simple technique that helps improve your app's performance. It acts as a temporary data store providing high performance data access.

Installation

First install the required package:

$ npm install --save cache-manager

In-memory cache

Nest provides a unified API for various cache storage providers. The built-in one is an in-memory data store. However, you can easily switch to a more comprehensive solution, like Redis. In order to enable caching, first import the CacheModule and call its register() method.

import { CacheModule, Module } from '@nestjs/common';
import { AppController } from './app.controller';

@Module({
  imports: [CacheModule.register()],
  controllers: [AppController],
})
export class ApplicationModule {}

warning Warning In GraphQL applications, interceptors are executed separately for each field resolver. Thus, CacheModule (which uses interceptors to cache responses) will not work properly.

Then just tie the CacheInterceptor where you want to cache data.

@Controller()
@UseInterceptors(CacheInterceptor)
export class AppController {
  @Get()
  findAll(): string[] {
    return [];
  }
}

warning Warning Only GET endpoints are cached. Also, HTTP server routes that inject the native response object (@Res()) cannot use the Cache Interceptor. See response mapping for more details.

Global cache

To reduce the amount of required boilerplate, you can bind CacheInterceptor to all endpoints globally:

import { CacheModule, Module, CacheInterceptor } from '@nestjs/common';
import { AppController } from './app.controller';
import { APP_INTERCEPTOR } from '@nestjs/core';

@Module({
  imports: [CacheModule.register()],
  controllers: [AppController],
  providers: [
    {
      provide: APP_INTERCEPTOR,
      useClass: CacheInterceptor,
    },
  ],
})
export class ApplicationModule {}

Customize caching

All cached data has its own expiration time (TTL). To customize default values, pass the options object to the register() method.

CacheModule.register({
  ttl: 5, // seconds
  max: 10, // maximum number of items in cache
});

Global cache overrides

While global cache is enabled, cache entries are stored under a CacheKey that is auto-generated based on the route path. You may override certain cache settings (@CacheKey() and @CacheTTL()) on a per-method basis, allowing customized caching strategies for individual controller methods. This may be most relevant while using different cache stores.

@Controller()
export class AppController {
  @CacheKey('custom_key')
  @CacheTTL(20)
  findAll(): string[] {
    return [];
  }
}

info Hint The @CacheKey() and @CacheTTL() decorators are imported from the @nestjs/common package.

The @CacheKey() decorator may be used with or without a corresponding @CacheTTL() decorator and vice versa. One may choose to override only the @CacheKey() or only the @CacheTTL(). Settings that are not overridden with a decorator will use the default values as registered globally (see Customize caching).

WebSockets and Microservices

You can also apply the CacheInterceptor to WebSocket subscribers as well as microservice patterns (regardless of the transport method being used).

@@filename()
@CacheKey('events')
@UseInterceptors(CacheInterceptor)
@SubscribeMessage('events')
handleEvent(client: Client, data: string[]): string[] {
  return [];
}
@@switch
@CacheKey('events')
@UseInterceptors(CacheInterceptor)
@SubscribeMessage('events')
handleEvent(client, data) {
  return [];
}

However, the additional @CacheKey() decorator is required in order to specify a key used to subsequently store and retrieve cached data. Also, please note that you shouldn't cache everything. Actions which perform some business operations rather than simply querying the data should never be cached.

Additionally, you may specify a cache expiration time (TTL) by using the @CacheTTL() decorator, which will override the global default TTL value.

@@filename()
@CacheTTL(10)
@UseInterceptors(CacheInterceptor)
@SubscribeMessage('events')
handleEvent(client: Client, data: string[]): string[] {
  return [];
}
@@switch
@CacheTTL(10)
@UseInterceptors(CacheInterceptor)
@SubscribeMessage('events')
handleEvent(client, data) {
  return [];
}

info Hint The @CacheTTL() decorator may be used with or without a corresponding @CacheKey() decorator.

Different stores

This service takes advantage of cache-manager under the hood. The cache-manager package supports a wide range of useful stores, for example, the Redis store. A full list of supported stores is available here. To set up the Redis store, simply pass the package together with corresponding options to the register() method.

import * as redisStore from 'cache-manager-redis-store';
import { CacheModule, Module } from '@nestjs/common';
import { AppController } from './app.controller';

@Module({
  imports: [
    CacheModule.register({
      store: redisStore,
      host: 'localhost',
      port: 6379,
    }),
  ],
  controllers: [AppController],
})
export class ApplicationModule {}

Adjust tracking

By default, Nest uses the request URL (in an HTTP app) or cache key (in websockets and microservices apps, set through the @CacheKey() decorator) to associate cache records with your endpoints. Nevertheless, sometimes you might want to set up tracking based on different factors, for example, using HTTP headers (e.g. Authorization to properly identify profile endpoints).

In order to accomplish that, create a subclass of CacheInterceptor and override the trackBy() method.

@Injectable()
class HttpCacheInterceptor extends CacheInterceptor {
  trackBy(context: ExecutionContext): string | undefined {
    return 'key';
  }
}
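Expanding the stub above into something more realistic, the sketch below keys cache entries on the request URL plus the Authorization header; it assumes an HTTP application and skips caching for non-GET requests:

@Injectable()
export class HttpCacheInterceptor extends CacheInterceptor {
  trackBy(context: ExecutionContext): string | undefined {
    const request = context.switchToHttp().getRequest();
    // Returning undefined tells the interceptor not to cache this request
    if (request.method !== 'GET') {
      return undefined;
    }
    return `${request.url}:${request.headers.authorization || 'anonymous'}`;
  }
}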

Async configuration

You may want to asynchronously pass in module options instead of passing them statically at compile time. In this case, use the registerAsync() method, which provides several ways to deal with async configuration.

One approach is to use a factory function:

CacheModule.registerAsync({
  useFactory: () => ({
    ttl: 5,
  }),
});

Our factory behaves like all other asynchronous module factories (it can be async and is able to inject dependencies through inject).

CacheModule.registerAsync({
  imports: [ConfigModule],
  useFactory: async (configService: ConfigService) => ({
    ttl: configService.get('CACHE_TTL'),
  }),
  inject: [ConfigService],
});

Alternatively, you can use the useClass method:

CacheModule.registerAsync({
  useClass: CacheConfigService,
});

The above construction will instantiate CacheConfigService inside CacheModule and will use it to get the options object. The CacheConfigService has to implement the CacheOptionsFactory interface in order to provide the configuration options:

@Injectable()
class CacheConfigService implements CacheOptionsFactory {
  createCacheOptions(): CacheModuleOptions {
    return {
      ttl: 5,
    };
  }
}

If you wish to use an existing configuration provider imported from a different module, use the useExisting syntax:

CacheModule.registerAsync({
  imports: [ConfigModule],
  useExisting: ConfigService,
});

This works the same as useClass with one critical difference - CacheModule will look up imported modules to reuse any already-created ConfigService, instead of instantiating its own.

Compression

Compression can greatly decrease the size of the response body, thereby increasing the speed of a web app.

For high-traffic websites in production, it is strongly recommended to offload compression from the application server - typically in a reverse proxy (e.g., Nginx). In that case, you should not use compression middleware.

Use with Express (default)

Use the compression middleware package to enable gzip compression.

First install the required package:

$ npm i --save compression

Once the installation is complete, apply the compression middleware as global middleware.

import * as compression from 'compression';
// somewhere in your initialization file
app.use(compression());

Use with Fastify

If using the FastifyAdapter, you'll want to use fastify-compress:

$ npm i --save fastify-compress

Once the installation is complete, apply the fastify-compress middleware as global middleware.

import * as compression from 'fastify-compress';
// somewhere in your initialization file
app.register(compression);

By default, fastify-compress will use Brotli compression (on Node >= 11.7.0) when browsers indicate support for the encoding. While Brotli is quite efficient in terms of compression ratio, it's also quite slow. Due to this, you may want to tell fastify-compress to only use deflate and gzip to compress responses; you'll end up with larger responses but they'll be delivered much more quickly.

To specify encodings, provide a second argument to app.register:

app.register(compression, { encodings: ['gzip', 'deflate'] });

The above tells fastify-compress to only use gzip and deflate encodings, preferring gzip if the client supports both.

Configuration

Applications often run in different environments. Depending on the environment, different configuration settings should be used. For example, usually the local environment relies on specific database credentials, valid only for the local DB instance. The production environment would use a separate set of DB credentials. Since configuration variables change, best practice is to store configuration variables in the environment.

Externally defined environment variables are visible inside Node.js through the process.env global. We could try to solve the problem of multiple environments by setting the environment variables separately in each environment. This can quickly get unwieldy, especially in the development and testing environments where these values need to be easily mocked and/or changed.

In Node.js applications, it's common to use .env files, one per environment, with each file holding key-value pairs of configuration variables for that environment. Running an app in different environments is then just a matter of swapping in the correct .env file.

A good approach for using this technique in Nest is to create a ConfigModule that exposes a ConfigService which loads the appropriate .env file. While you may choose to write such a module yourself, for convenience Nest provides the @nestjs/config package out-of-the box. We'll cover this package in the current chapter.

Installation

To begin using it, we first install the required dependency.

$ npm i --save @nestjs/config

info Hint The @nestjs/config package internally uses dotenv.

Getting started

Once the installation process is complete, we can import the ConfigModule. Typically, we'll import it into the root AppModule and control its behavior using the .forRoot() static method. During this step, environment variable key/value pairs are parsed and resolved. Later, we'll see several options for accessing the ConfigService class of the ConfigModule in our other feature modules.

@@filename(app.module)
import { Module } from '@nestjs/common';
import { ConfigModule } from '@nestjs/config';

@Module({
  imports: [ConfigModule.forRoot()],
})
export class AppModule {}

The above code will load and parse a .env file from the default location (the project root directory), merge key/value pairs from the .env file with environment variables assigned to process.env, and store the result in a private structure that you can access through the ConfigService. The forRoot() method registers the ConfigService provider, which provides a get() method for reading these parsed/merged configuration variables. Since @nestjs/config relies on dotenv, it uses that package's rules for resolving conflicts in environment variable names. When a key exists both in the runtime environment as an environment variable (e.g., via OS shell exports like export DATABASE_USER=test) and in a .env file, the runtime environment variable takes precedence.

A sample .env file looks something like this:

DATABASE_USER=test
DATABASE_PASSWORD=test

Custom env file path

By default, the package looks for a .env file in the root directory of the application. To specify another path for the .env file, set the envFilePath property of an (optional) options object you pass to forRoot(), as follows:

ConfigModule.forRoot({
  envFilePath: '.development.env',
});

You can also specify multiple paths for .env files like this:

ConfigModule.forRoot({
  envFilePath: ['.env.development.local', '.env.development'],
});

If a variable is found in multiple files, the first one takes precedence.

Disable env variables loading

If you don't want to load the .env file, but instead would like to simply access environment variables from the runtime environment (as with OS shell exports like export DATABASE_USER=test), set the options object's ignoreEnvFile property to true, as follows:

ConfigModule.forRoot({
  ignoreEnvFile: true,
});

Use module globally

When you want to use ConfigModule in other modules, you'll need to import it (as is standard with any Nest module). Alternatively, declare it as a global module by setting the options object's isGlobal property to true, as shown below. In that case, you will not need to import ConfigModule in other modules once it's been loaded in the root module (e.g., AppModule).

ConfigModule.forRoot({
  isGlobal: true,
});

Custom configuration files

For more complex projects, you may utilize custom configuration files to return nested configuration objects. This allows you to group related configuration settings by function (e.g., database-related settings), and to store related settings in individual files to help manage them independently.

A custom configuration file exports a factory function that returns a configuration object. The configuration object can be any arbitrarily nested plain JavaScript object. The process.env object will contain the fully resolved environment variable key/value pairs (with .env file and externally defined variables resolved and merged as described above). Since you control the returned configuration object, you can add any required logic to cast values to an appropriate type, set default values, etc. For example:

@@filename(config/configuration)
export default () => ({
  port: parseInt(process.env.PORT, 10) || 3000,
  database: {
    host: process.env.DATABASE_HOST,
    port: parseInt(process.env.DATABASE_PORT, 10) || 5432
  }
});

We load this file using the load property of the options object we pass to the ConfigModule.forRoot() method:

import configuration from './config/configuration';

@Module({
  imports: [
    ConfigModule.forRoot({
      load: [configuration],
    }),
  ],
})
export class AppModule {}

info Notice The value assigned to the load property is an array, allowing you to load multiple configuration files (e.g. load: [databaseConfig, authConfig])

Using the ConfigService

To access configuration values from our ConfigService, we first need to inject ConfigService. As with any provider, we need to import its containing module - the ConfigModule - into the module that will use it (unless you set the isGlobal property in the options object passed to the ConfigModule.forRoot() method to true). Import it into a feature module as shown below.

@@filename(feature.module)
@Module({
  imports: [ConfigModule],
  // ...
})

Then we can inject it using standard constructor injection:

constructor(private configService: ConfigService) {}

And use it in our class:

// get an environment variable
const dbUser = this.configService.get<string>('DATABASE_USER');

// get a custom configuration value
const dbHost = this.configService.get<string>('database.host');

As shown above, use the configService.get() method to get a simple environment variable by passing the variable name. You can do TypeScript type hinting by passing the type, as shown above (e.g., get<string>(...)). The get() method can also traverse a nested custom configuration object (created via a Custom configuration file), as shown in the second example above.

You can also get the whole nested custom configuration object using an interface as the type hint:

interface DatabaseConfig {
  host: string;
  port: number;
}

const dbConfig = this.configService.get<DatabaseConfig>('database');

// you can now use `dbConfig.port` and `dbConfig.host`
const port = dbConfig.port;

The get() method also takes an optional second argument defining a default value, which will be returned when the key doesn't exist, as shown below:

// use "localhost" when "database.host" is not defined
const dbHost = this.configService.get<string>('database.host', 'localhost');

ConfigService has an optional generic (type argument) to help prevent accessing a config property that does not exist. Use it as shown below:

interface EnvironmentVariables {
  PORT: number;
  TIMEOUT: string;
}

// somewhere in the code
constructor(private configService: ConfigService<EnvironmentVariables>) {
  // this is valid
  const port = this.configService.get<number>('PORT');

  // this is invalid as URL is not a property on the EnvironmentVariables interface
  const url = this.configService.get<string>('URL');
}

warning Notice If you have nested properties in your config, like in the database.host example above, the interface must have a matching 'database.host': string; property. Otherwise a TypeScript error will be thrown.
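For illustration, here is a minimal sketch of extending the interface from the example above so that the nested database.host key used earlier becomes a valid, typed lookup (the exact keys are assumptions based on the examples in this chapter):

interface EnvironmentVariables {
  PORT: number;
  TIMEOUT: string;
  // nested values read with dot notation must appear as literal string keys
  'database.host': string;
}

// somewhere in the code
constructor(private configService: ConfigService<EnvironmentVariables>) {
  // valid, because 'database.host' is declared on the interface
  const dbHost = this.configService.get<string>('database.host');
}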

Configuration namespaces

The ConfigModule allows you to define and load multiple custom configuration files, as shown in Custom configuration files above. You can manage complex configuration object hierarchies with nested configuration objects as shown in that section. Alternatively, you can return a "namespaced" configuration object with the registerAs() function as follows:

@@filename(config/database.config)
export default registerAs('database', () => ({
  host: process.env.DATABASE_HOST,
  port: process.env.DATABASE_PORT || 5432
}));

As with custom configuration files, inside your registerAs() factory function, the process.env object will contain the fully resolved environment variable key/value pairs (with .env file and externally defined variables resolved and merged as described above).

info Hint The registerAs function is exported from the @nestjs/config package.

Load a namespaced configuration with the load property of the forRoot() method's options object, in the same way you load a custom configuration file:

import databaseConfig from './config/database.config';

@Module({
  imports: [
    ConfigModule.forRoot({
      load: [databaseConfig],
    }),
  ],
})
export class AppModule {}

Now, to get the host value from the database namespace, use dot notation. Use 'database' as the prefix to the property name, corresponding to the name of the namespace (passed as the first argument to the registerAs() function):

const dbHost = this.configService.get<string>('database.host');

A reasonable alternative is to inject the database namespace directly. This allows us to benefit from strong typing:

constructor(
  @Inject(databaseConfig.KEY)
  private dbConfig: ConfigType<typeof databaseConfig>,
) {}

info Hint The ConfigType is exported from the @nestjs/config package.
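To make this concrete, here is a minimal sketch of a provider that injects the namespaced configuration; the DatabaseService class name and import paths are assumptions for illustration:

import { Inject, Injectable } from '@nestjs/common';
import { ConfigType } from '@nestjs/config';
import databaseConfig from './config/database.config';

@Injectable()
export class DatabaseService {
  constructor(
    // databaseConfig.KEY is the injection token created by registerAs()
    @Inject(databaseConfig.KEY)
    private dbConfig: ConfigType<typeof databaseConfig>,
  ) {}

  getHost(): string | undefined {
    // strongly typed based on the return type of the registerAs() factory
    return this.dbConfig.host;
  }
}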

Partial registration

Thus far, we've processed configuration files in our root module (e.g., AppModule), with the forRoot() method. Perhaps you have a more complex project structure, with feature-specific configuration files located in multiple different directories. Rather than load all these files in the root module, the @nestjs/config package provides a feature called partial registration, which references only the configuration files associated with each feature module. Use the forFeature() static method within a feature module to perform this partial registration, as follows:

import databaseConfig from './config/database.config';

@Module({
  imports: [ConfigModule.forFeature(databaseConfig)],
})
export class DatabaseModule {}

info Warning In some circumstances, you may need to access properties loaded via partial registration using the onModuleInit() hook, rather than in a constructor. This is because the forFeature() method runs during module initialization, and the order of module initialization is indeterminate. If another module accesses values loaded this way in its constructor, the module that the configuration depends upon may not yet have initialized. The onModuleInit() method runs only after all the modules it depends upon have been initialized, so this technique is safe.
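As a brief sketch of this pattern, the provider below reads a partially registered value inside onModuleInit() instead of the constructor; the DatabaseService name and the database.host key are carried over from earlier examples purely for illustration:

import { Injectable, OnModuleInit } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';

@Injectable()
export class DatabaseService implements OnModuleInit {
  private dbHost: string | undefined;

  constructor(private configService: ConfigService) {}

  onModuleInit() {
    // safe: all modules this module depends upon have initialized by now,
    // so values registered via forFeature() are available
    this.dbHost = this.configService.get<string>('database.host');
  }
}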

Schema validation

It is standard practice to throw an exception during application startup if required environment variables haven't been provided or if they don't meet certain validation rules. The @nestjs/config package enables use of the Joi npm package to support this type of validation. With Joi, you define an object schema and validate JavaScript objects against it.

Install Joi (and its types, for TypeScript users):

$ npm install --save @hapi/joi
$ npm install --save-dev @types/hapi__joi

warning Notice The latest version of @hapi/joi requires Node v12 or later. If you are running an older version of Node, install v16.1.8 instead, since v17.0.2 and later cause errors during build time. For more information, please refer to the Joi documentation and the related GitHub issue.

Now we can define a Joi validation schema and pass it via the validationSchema property of the forRoot() method's options object, as shown below:

@@filename(app.module)
import * as Joi from '@hapi/joi';

@Module({
  imports: [
    ConfigModule.forRoot({
      validationSchema: Joi.object({
        NODE_ENV: Joi.string()
          .valid('development', 'production', 'test', 'provision')
          .default('development'),
        PORT: Joi.number().default(3000),
      }),
    }),
  ],
})
export class AppModule {}

By default, all schema keys are considered optional. Here, we set default values for NODE_ENV and PORT which will be used if we don't provide these variables in the environment (.env file or process environment). Alternatively, we can use the required() validation method to require that a value must be defined in the environment (.env file or process environment). In this case, the validation step will throw an exception if we don't provide the variable in the environment. See Joi validation methods for more on how to construct validation schemas.
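For example, a minimal sketch of requiring a variable (the DATABASE_HOST key is an arbitrary example) looks like this; startup will now fail if the value is missing:

ConfigModule.forRoot({
  validationSchema: Joi.object({
    // validation throws at startup if DATABASE_HOST is not provided
    DATABASE_HOST: Joi.string().required(),
    PORT: Joi.number().default(3000),
  }),
});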

By default, unknown environment variables (environment variables whose keys are not present in the schema) are allowed and do not trigger a validation exception. By default, all validation errors are reported. You can alter these behaviors by passing an options object via the validationOptions key of the forRoot() options object. This options object can contain any of the standard validation options properties provided by Joi validation options. For example, to reverse the two settings above, pass options like this:

@@filename(app.module)
import * as Joi from '@hapi/joi';

@Module({
  imports: [
    ConfigModule.forRoot({
      validationSchema: Joi.object({
        NODE_ENV: Joi.string()
          .valid('development', 'production', 'test', 'provision')
          .default('development'),
        PORT: Joi.number().default(3000),
      }),
      validationOptions: {
        allowUnknown: false,
        abortEarly: true,
      },
    }),
  ],
})
export class AppModule {}

The @nestjs/config package uses default settings of:

  • allowUnknown: controls whether or not to allow unknown keys in the environment variables. Defaults to true.
  • abortEarly: if true, stops validation on the first error; if false, returns all errors. Defaults to false.

Note that once you decide to pass a validationOptions object, any settings you do not explicitly pass will default to Joi standard defaults (not the @nestjs/config defaults). For example, if you leave allowUnknown unspecified in your custom validationOptions object, it will have the Joi default value of false. Hence, it is probably safest to specify both of these settings in your custom object.

Custom getter functions

ConfigService defines a generic get() method to retrieve a configuration value by key. We may also add getter functions to enable a little more natural coding style:

@@filename()
@Injectable()
export class ApiConfigService {
  constructor(private configService: ConfigService) {}

  get isAuthEnabled(): boolean {
    return this.configService.get('AUTH_ENABLED') === 'true';
  }
}
@@switch
@Dependencies(ConfigService)
@Injectable()
export class ApiConfigService {
  constructor(configService) {
    this.configService = configService;
  }

  get isAuthEnabled() {
    return this.configService.get('AUTH_ENABLED') === 'true';
  }
}

Now we can use the getter function as follows:

@@filename(app.service)
@Injectable()
export class AppService {
  constructor(apiConfigService: ApiConfigService) {
    if (apiConfigService.isAuthEnabled) {
      // Authentication is enabled
    }
  }
}
@@switch
@Dependencies(ApiConfigService)
@Injectable()
export class AppService {
  constructor(apiConfigService) {
    if (apiConfigService.isAuthEnabled) {
      // Authentication is enabled
    }
  }
}

Expandable variables

The @nestjs/config package supports environment variable expansion. With this technique, you can create nested environment variables, where one variable is referred to within the definition of another. For example:

APP_URL=mywebsite.com
SUPPORT_EMAIL=support@${APP_URL}

With this construction, the variable SUPPORT_EMAIL resolves to 'support@mywebsite.com'. Note the use of the ${...} syntax to trigger resolving the value of the variable APP_URL inside the definition of SUPPORT_EMAIL.

info Hint For this feature, @nestjs/config package internally uses dotenv-expand.

Enable environment variable expansion using the expandVariables property in the options object passed to the forRoot() method of the ConfigModule, as shown below:

@@filename(app.module)
@Module({
  imports: [
    ConfigModule.forRoot({
      // ...
      expandVariables: true,
    }),
  ],
})
export class AppModule {}

Using in the main.ts

While our config is stored in a service, it can still be used in the main.ts file. This way, you can use it to store variables such as the application port or the CORS host.

To access it, you must use the app.get() method, followed by the service reference:

const configService = app.get(ConfigService);

You can then use it as usual, by calling the get method with the configuration key:

const port = configService.get('PORT');
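Putting these pieces together, a minimal bootstrap sketch might look like the following; the fallback port of 3000 is an arbitrary assumption:

import { NestFactory } from '@nestjs/core';
import { ConfigService } from '@nestjs/config';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  const configService = app.get(ConfigService);
  // fall back to 3000 when PORT is not defined; raw environment values are
  // strings unless cast in a custom configuration file
  const port = configService.get('PORT', 3000);
  await app.listen(port);
}
bootstrap();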

File upload

To handle file uploading, Nest provides a built-in module based on the multer middleware package for Express. Multer handles data posted in the multipart/form-data format, which is primarily used for uploading files via an HTTP POST request. This module is fully configurable and you can adjust its behavior to your application requirements.

warning Warning Multer cannot process data which is not in the supported multipart format (multipart/form-data). Also, note that this package is not compatible with the FastifyAdapter.

Basic example

To upload a single file, simply tie the FileInterceptor() interceptor to the route handler and extract file from the request using the @UploadedFile() decorator.

@@filename()
@Post('upload')
@UseInterceptors(FileInterceptor('file'))
uploadFile(@UploadedFile() file) {
  console.log(file);
}
@@switch
@Post('upload')
@UseInterceptors(FileInterceptor('file'))
@Bind(UploadedFile())
uploadFile(file) {
  console.log(file);
}

info Hint The FileInterceptor() decorator is exported from the @nestjs/platform-express package. The @UploadedFile() decorator is exported from @nestjs/common.

The FileInterceptor() decorator takes two arguments:

  • fieldName: string that supplies the name of the field from the HTML form that holds a file
  • options: optional object of type MulterOptions. This is the same object used by the multer constructor (more details here).
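For example, here is a hedged sketch of passing an options object as the second argument; the destination path and the 1 MB size limit are arbitrary values chosen for illustration:

@Post('upload')
@UseInterceptors(
  FileInterceptor('file', {
    dest: './uploads',
    // standard multer "limits" option: reject files larger than 1 MB
    limits: { fileSize: 1024 * 1024 },
  }),
)
uploadFile(@UploadedFile() file) {
  console.log(file);
}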

Array of files

To upload an array of files (identified with a single field name), use the FilesInterceptor() decorator (note the plural Files in the decorator name). This decorator takes three arguments:

  • fieldName: as described above
  • maxCount: optional number defining the maximum number of files to accept
  • options: optional MulterOptions object, as described above

When using FilesInterceptor(), extract files from the request with the @UploadedFiles() decorator.

@@filename()
@Post('upload')
@UseInterceptors(FilesInterceptor('files'))
uploadFile(@UploadedFiles() files) {
  console.log(files);
}
@@switch
@Post('upload')
@UseInterceptors(FilesInterceptor('files'))
@Bind(UploadedFiles())
uploadFile(files) {
  console.log(files);
}

info Hint The FilesInterceptor() decorator is exported from the @nestjs/platform-express package. The @UploadedFiles() decorator is exported from @nestjs/common.

Multiple files

To upload multiple fields (all with different field name keys), use the FileFieldsInterceptor() decorator. This decorator takes two arguments:

  • uploadedFields: an array of objects, where each object specifies a required name property with a string value specifying a field name, as described above, and an optional maxCount property, as described above
  • options: optional MulterOptions object, as described above

When using FileFieldsInterceptor(), extract files from the request with the @UploadedFiles() decorator.

@@filename()
@Post('upload')
@UseInterceptors(FileFieldsInterceptor([
  { name: 'avatar', maxCount: 1 },
  { name: 'background', maxCount: 1 },
]))
uploadFile(@UploadedFiles() files) {
  console.log(files);
}
@@switch
@Post('upload')
@Bind(UploadedFiles())
@UseInterceptors(FileFieldsInterceptor([
  { name: 'avatar', maxCount: 1 },
  { name: 'background', maxCount: 1 },
]))
uploadFile(files) {
  console.log(files);
}

Any files

To upload all fields with arbitrary field name keys, use the AnyFilesInterceptor() decorator. This decorator can accept an optional options object as described above.

When using AnyFilesInterceptor(), extract files from the request with the @UploadedFiles() decorator.

@@filename()
@Post('upload')
@UseInterceptors(AnyFilesInterceptor())
uploadFile(@UploadedFiles() files) {
  console.log(files);
}
@@switch
@Post('upload')
@Bind(UploadedFiles())
@UseInterceptors(AnyFilesInterceptor())
uploadFile(files) {
  console.log(files);
}

Default options

You can specify multer options in the file interceptors as described above. To set default options, you can call the static register() method when you import the MulterModule, passing in supported options. You can use all options listed here.

MulterModule.register({
  dest: '/upload',
});

info Hint The MulterModule class is exported from the @nestjs/platform-express package.

Async configuration

When you need to set MulterModule options asynchronously instead of statically, use the registerAsync() method. As with most dynamic modules, Nest provides several techniques to deal with async configuration.

One technique is to use a factory function:

MulterModule.registerAsync({
  useFactory: () => ({
    dest: '/upload',
  }),
});

Like other factory providers, our factory function can be async and can inject dependencies through inject.

MulterModule.registerAsync({
  imports: [ConfigModule],
  useFactory: async (configService: ConfigService) => ({
    dest: configService.get<string>('MULTER_DEST'),
  }),
  inject: [ConfigService],
});

Alternatively, you can configure the MulterModule using a class instead of a factory, as shown below:

MulterModule.registerAsync({
  useClass: MulterConfigService,
});

The construction above instantiates MulterConfigService inside MulterModule, using it to create the required options object. Note that in this example, the MulterConfigService has to implement the MulterOptionsFactory interface, as shown below. The MulterModule will call the createMulterOptions() method on the instantiated object of the supplied class.

@Injectable()
class MulterConfigService implements MulterOptionsFactory {
  createMulterOptions(): MulterModuleOptions {
    return {
      dest: '/upload',
    };
  }
}

If you want to reuse an existing options provider instead of creating a private copy inside the MulterModule, use the useExisting syntax.

MulterModule.registerAsync({
  imports: [ConfigModule],
  useExisting: ConfigService,
});

HTTP module

Axios is a richly featured HTTP client package that is widely used. Nest wraps Axios and exposes it via the built-in HttpModule. The HttpModule exports the HttpService class, which exposes Axios-based methods to perform HTTP requests. The library also transforms the resulting HTTP responses into Observables.

To use the HttpService, first import HttpModule.

@Module({
  imports: [HttpModule],
  providers: [CatsService],
})
export class CatsModule {}

Next, inject HttpService using normal constructor injection.

info Hint HttpModule and HttpService are imported from the @nestjs/common package.

@@filename()
@Injectable()
export class CatsService {
  constructor(private httpService: HttpService) {}

  findAll(): Observable<AxiosResponse<Cat[]>> {
    return this.httpService.get('http://localhost:3000/cats');
  }
}
@@switch
@Injectable()
@Dependencies(HttpService)
export class CatsService {
  constructor(httpService) {
    this.httpService = httpService;
  }

  findAll() {
    return this.httpService.get('http://localhost:3000/cats');
  }
}

All HttpService methods return an AxiosResponse wrapped in an Observable object.
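If you only need the response payload, one common pattern is to map over the stream with RxJS; the sketch below assumes a Cat interface and import path like those used in earlier chapters:

import { HttpService, Injectable } from '@nestjs/common';
import { Observable } from 'rxjs';
import { map } from 'rxjs/operators';
import { Cat } from './interfaces/cat.interface'; // assumed path

@Injectable()
export class CatsService {
  constructor(private httpService: HttpService) {}

  findAll(): Observable<Cat[]> {
    return this.httpService
      .get<Cat[]>('http://localhost:3000/cats')
      // unwrap the AxiosResponse and emit only the response body
      .pipe(map((response) => response.data));
  }
}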

Configuration

Axios can be configured with a variety of options to customize the behavior of the HttpService. Read more about them here. To configure the underlying Axios instance, pass an optional options object to the register() method of HttpModule when importing it. This options object will be passed directly to the underlying Axios constructor.

@Module({
  imports: [
    HttpModule.register({
      timeout: 5000,
      maxRedirects: 5,
    }),
  ],
  providers: [CatsService],
})
export class CatsModule {}

Async configuration

When you need to pass module options asynchronously instead of statically, use the registerAsync() method. As with most dynamic modules, Nest provides several techniques to deal with async configuration.

One technique is to use a factory function:

HttpModule.registerAsync({
  useFactory: () => ({
    timeout: 5000,
    maxRedirects: 5,
  }),
});

Like other factory providers, our factory function can be async and can inject dependencies through inject.

HttpModule.registerAsync({
  imports: [ConfigModule],
  useFactory: async (configService: ConfigService) => ({
    timeout: configService.get('HTTP_TIMEOUT'),
    maxRedirects: configService.get('HTTP_MAX_REDIRECTS'),
  }),
  inject: [ConfigService],
});

Alternatively, you can configure the HttpModule using a class instead of a factory, as shown below.

HttpModule.registerAsync({
  useClass: HttpConfigService,
});

The construction above instantiates HttpConfigService inside HttpModule, using it to create an options object. Note that in this example, the HttpConfigService has to implement HttpModuleOptionsFactory interface as shown below. The HttpModule will call the createHttpOptions() method on the instantiated object of the supplied class.

@Injectable()
class HttpConfigService implements HttpModuleOptionsFactory {
  createHttpOptions(): HttpModuleOptions {
    return {
      timeout: 5000,
      maxRedirects: 5,
    };
  }
}

If you want to reuse an existing options provider instead of creating a private copy inside the HttpModule, use the useExisting syntax.

HttpModule.registerAsync({
  imports: [ConfigModule],
  useExisting: ConfigService,
});

Logger

Nest comes with a built-in text-based logger which is used during application bootstrapping and in several other circumstances, such as displaying caught exceptions (i.e., system logging). This functionality is provided via the Logger class in the @nestjs/common package. You can fully control the behavior of the logging system, including any of the following:

  • disable logging entirely
  • specify the log level of detail (e.g., display errors, warnings, debug information, etc.)
  • completely override the default logger
  • customize the default logger by extending it
  • make use of dependency injection to simplify composing and testing your application

You can also make use of the built-in logger, or create your own custom implementation, to log your own application-level events and messages.
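For example, a minimal sketch of using the built-in Logger for application-level messages inside a provider (the CatsService context is just illustrative):

import { Injectable, Logger } from '@nestjs/common';

@Injectable()
export class CatsService {
  // the context string appears in square brackets in each log line
  private readonly logger = new Logger(CatsService.name);

  findAll() {
    this.logger.log('Returning all cats');
    return [];
  }
}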

For more advanced logging functionality, you can make use of any Node.js logging package, such as Winston, to implement a completely custom, production grade logging system.

Basic customization

To disable logging, set the logger property to false in the (optional) Nest application options object passed as the second argument to the NestFactory.create() method.

const app = await NestFactory.create(ApplicationModule, {
  logger: false,
});
await app.listen(3000);

To enable specific logging levels, set the logger property to an array of strings specifying the log levels to display, as follows:

const app = await NestFactory.create(ApplicationModule, {
  logger: ['error', 'warn'],
});
await app.listen(3000);

Values in the array can be any combination of 'log', 'error', 'warn', 'debug', and 'verbose'.

Custom implementation

You can provide a custom logger implementation to be used by Nest for system logging by setting the value of the logger property to an object that fulfills the LoggerService interface. For example, you can tell Nest to use the built-in global JavaScript console object (which implements the LoggerService interface), as follows:

const app = await NestFactory.create(ApplicationModule, {
  logger: console,
});
await app.listen(3000);

Implementing your own custom logger is straightforward. Simply implement each of the methods of the LoggerService interface as shown below.

import { LoggerService } from '@nestjs/common';

export class MyLogger implements LoggerService {
  log(message: string) {
    /* your implementation */
  }
  error(message: string, trace: string) {
    /* your implementation */
  }
  warn(message: string) {
    /* your implementation */
  }
  debug(message: string) {
    /* your implementation */
  }
  verbose(message: string) {
    /* your implementation */
  }
}

You can then supply an instance of MyLogger via the logger property of the Nest application options object.

const app = await NestFactory.create(ApplicationModule, {
  logger: new MyLogger(),
});
await app.listen(3000);

This technique, while simple, doesn't utilize dependency injection for the MyLogger class. This can pose some challenges, particularly for testing, and limit the reusability of MyLogger. For a better solution, see the Dependency Injection section below.

Extend built-in logger

Rather than writing a logger from scratch, you may be able to meet your needs by extending the built-in Logger class and overriding selected behavior of the default implementation.

import { Logger } from '@nestjs/common';

export class MyLogger extends Logger {
  error(message: string, trace: string) {
    // add your tailored logic here
    super.error(message, trace);
  }
}

You can use such an extended logger in your feature modules as described in the Using the logger for application logging section below.

You can tell Nest to use your extended logger for system logging by passing an instance of it via the logger property of the application options object (as shown in the Custom implementation section above), or by using the technique shown in the Dependency Injection section below. If you do so, you should take care to call super, as shown in the sample code above, to delegate the specific log method call to the parent (built-in) class so that Nest can rely on the built-in features it expects.

Dependency injection

For more advanced logging functionality, you'll want to take advantage of dependency injection. For example, you may want to inject a ConfigService into your logger to customize it, and in turn inject your custom logger into other controllers and/or providers. To enable dependency injection for your custom logger, create a class that implements LoggerService and register that class as a provider in some module. For example, you can

  1. Define a MyLogger class that either extends the built-in Logger or completely overrides it, as shown in previous sections.
  2. Create a LoggerModule as shown below, and provide MyLogger from that module.
import { Module } from '@nestjs/common';
import { MyLogger } from './my-logger.service';

@Module({
  providers: [MyLogger],
  exports: [MyLogger],
})
export class LoggerModule {}

With this construct, you are now providing your custom logger for use by any other module. Because your MyLogger class is part of a module, it can use dependency injection (for example, to inject a ConfigService). There's one more technique needed to provide this custom logger for use by Nest for system logging (e.g., for bootstrapping and error handling).

Because application instantiation (NestFactory.create()) happens outside the context of any module, it doesn't participate in the normal Dependency Injection phase of initialization. So we must ensure that at least one application module imports the LoggerModule to trigger Nest to instantiate a singleton instance of our MyLogger class. We can then instruct Nest to use the same singleton instance of MyLogger with the following construction:

const app = await NestFactory.create(ApplicationModule, {
  logger: false,
});
app.useLogger(app.get(MyLogger));
await app.listen(3000);

Here we use the get() method on the NestApplication instance to retrieve the singleton instance of the MyLogger object. This technique is essentially a way to "inject" an instance of a logger for use by Nest. The app.get() call retrieves the singleton instance of MyLogger, and depends on that instance being first injected in another module, as described above.

You can also inject this MyLogger provider in your feature classes, thus ensuring consistent logging behavior across both Nest system logging and application logging. See Using the logger for application logging below for more information.

The only downside of this solution is that your first initialization messages won't be handled by your logger instance, though this rarely matters in practice.

Using the logger for application logging

We can combine several of the techniques above to provide consistent behavior and formatting across both Nest system logging and our own application event/message logging. In this section, we'll achieve this with the following steps:

  1. We extend the built-in logger and customize the context portion of the log message (e.g., the phrase NestFactory in square brackets in the log line shown below).
[Nest] 19096   - 12/08/2019, 7:12:59 AM   [NestFactory] Starting Nest application...
  2. We inject a transient instance of the Logger into our feature modules so that each one has its own custom context.
  3. We supply this extended logger for Nest to use for system logging.

To start, extend the built-in logger with code like the following. We supply the scope option as configuration metadata for the Logger class, specifying a transient scope, to ensure that we'll have a unique instance of the Logger in each feature module. In this example, we do not extend the individual Logger methods (like log(), warn(), etc.), though you may choose to do so.

import { Injectable, Scope, Logger } from '@nestjs/common';

@Injectable({ scope: Scope.TRANSIENT })
export class MyLogger extends Logger {}

Next, create a LoggerModule with a construction like this:

import { Module } from '@nestjs/common';
import { MyLogger } from './my-logger.service';

@Module({
  providers: [MyLogger],
  exports: [MyLogger],
})
export class LoggerModule {}

Next, import the LoggerModule into your feature module. Then set the logger context, and start using the context-aware custom logger, like this:

import { Injectable } from '@nestjs/common';
import { MyLogger } from './my-logger.service';

@Injectable()
export class CatsService {
  private readonly cats: Cat[] = [];

  constructor(private myLogger: MyLogger) {
    this.myLogger.setContext('CatsService');
  }

  findAll(): Cat[] {
    this.myLogger.warn('About to return cats!');
    return this.cats;
  }
}

Finally, instruct Nest to use an instance of the custom logger in your main.ts file as shown below. Of course in this example, we haven't actually customized the logger behavior (by extending the Logger methods like log(), warn(), etc.), so this step isn't actually needed. But it would be needed if you added custom logic to those methods and wanted Nest to use the same implementation.

const app = await NestFactory.create(ApplicationModule, {
  logger: false,
});
app.useLogger(new MyLogger());
await app.listen(3000);

Use external logger

Production applications often have specific logging requirements, including advanced filtering, formatting and centralized logging. Nest's built-in logger is used for monitoring Nest system behavior, and can also be useful for basic formatted text logging in your feature modules while in development, but production applications often take advantage of dedicated logging modules like Winston. As with any standard Node.js application, you can take full advantage of such modules in Nest.

Mongo

Nest supports two methods for integrating with the MongoDB database. You can either use the built-in TypeORM module described here, which has a connector for MongoDB, or use Mongoose, the most popular MongoDB object modeling tool. In this chapter we'll describe the latter, using the dedicated @nestjs/mongoose package.

Start by installing the required dependencies:

$ npm install --save @nestjs/mongoose mongoose
$ npm install --save-dev @types/mongoose

Once the installation process is complete, we can import the MongooseModule into the root AppModule.

@@filename(app.module)
import { Module } from '@nestjs/common';
import { MongooseModule } from '@nestjs/mongoose';

@Module({
  imports: [MongooseModule.forRoot('mongodb://localhost/nest')],
})
export class AppModule {}

The forRoot() method accepts the same configuration object as mongoose.connect() from the Mongoose package, as described here.

Model injection

With Mongoose, everything is derived from a Schema. Each schema maps to a MongoDB collection and defines the shape of the documents within that collection. Schemas are used to define Models. Models are responsible for creating and reading documents from the underlying MongoDB database.

Schemas can be created with NestJS decorators, or with Mongoose itself manually. Using decorators to create schemas greatly reduces boilerplate and improves overall code readability.

Let's define the CatSchema:

@@filename(schemas/cat.schema)
import { Prop, Schema, SchemaFactory } from '@nestjs/mongoose';
import { Document } from 'mongoose';

export type CatDocument = Cat & Document;

@Schema()
export class Cat {
  @Prop()
  name: string;

  @Prop()
  age: number;

  @Prop()
  breed: string;
}

export const CatSchema = SchemaFactory.createForClass(Cat);

info Hint Note that you can also generate a raw schema definition using the DefinitionsFactory class (from the @nestjs/mongoose package). This allows you to manually modify the schema definition generated based on the metadata you provided. This is useful for certain edge cases where it may be hard to represent everything with decorators.

The @Schema() decorator marks a class as a schema definition. It maps our Cat class to a MongoDB collection of the same name, but with an additional “s” at the end - so the final MongoDB collection name will be cats. This decorator accepts a single optional argument, which is a schema options object. Think of it as the object you would normally pass as the second argument of the mongoose.Schema class constructor (e.g., new mongoose.Schema(_, options)). To learn more about available schema options, see this chapter.

The @Prop() decorator defines a property in the document. For example, in the schema definition above, we defined three properties: name, age, and breed. The schema types for these properties are automatically inferred thanks to TypeScript metadata (and reflection) capabilities. However, in more complex scenarios in which types cannot be implicitly reflected (for example, arrays or nested object structures), types must be indicated explicitly, as follows:

@Prop([String])
tags: string[];

Alternatively, the @Prop() decorator accepts an options object argument (read more about the available options). With this, you can indicate whether a property is required or not, specify a default value, or mark it as immutable. For example:

@Prop({ required: true })
name: string;

Finally, the raw schema definition can also be passed to the decorator. This is useful when, for example, a property represents a nested object which is not defined as a class. For this, use the raw() function from the @nestjs/mongoose package, as follows:

@Prop(raw({
  firstName: { type: String },
  lastName: { type: String }
}))
details: Record<string, any>;

Alternatively, if you prefer not using decorators, you can define a schema manually. For example:

export const CatSchema = new mongoose.Schema({
  name: String,
  age: Number,
  breed: String,
});

The cat.schema file resides in a folder in the cats directory, where we also define the CatsModule. While you can store schema files wherever you prefer, we recommend storing them near their related domain objects, in the appropriate module directory.

Let's look at the CatsModule:

@@filename(cats.module)
import { Module } from '@nestjs/common';
import { MongooseModule } from '@nestjs/mongoose';
import { CatsController } from './cats.controller';
import { CatsService } from './cats.service';
import { Cat, CatSchema } from './schemas/cat.schema';

@Module({
  imports: [MongooseModule.forFeature([{ name: Cat.name, schema: CatSchema }])],
  controllers: [CatsController],
  providers: [CatsService],
})
export class CatsModule {}

The MongooseModule provides the forFeature() method to configure the module, including defining which models should be registered in the current scope. If you also want to use the models in another module, add MongooseModule to the exports section of CatsModule and import CatsModule in the other module.
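For instance, a minimal sketch of re-exporting the registered models so that other modules importing CatsModule can inject them as well:

@Module({
  imports: [MongooseModule.forFeature([{ name: Cat.name, schema: CatSchema }])],
  controllers: [CatsController],
  providers: [CatsService],
  // re-export so importing modules can also use @InjectModel(Cat.name)
  exports: [MongooseModule],
})
export class CatsModule {}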

Once you've registered the schema, you can inject a Cat model into the CatsService using the @InjectModel() decorator:

@@filename(cats.service)
import { Model } from 'mongoose';
import { Injectable } from '@nestjs/common';
import { InjectModel } from '@nestjs/mongoose';
import { Cat, CatDocument } from './schemas/cat.schema';
import { CreateCatDto } from './dto/create-cat.dto';

@Injectable()
export class CatsService {
  constructor(@InjectModel(Cat.name) private catModel: Model<CatDocument>) {}

  async create(createCatDto: CreateCatDto): Promise<Cat> {
    const createdCat = new this.catModel(createCatDto);
    return createdCat.save();
  }

  async findAll(): Promise<Cat[]> {
    return this.catModel.find().exec();
  }
}
@@switch
import { Model } from 'mongoose';
import { Injectable, Dependencies } from '@nestjs/common';
import { getModelToken } from '@nestjs/mongoose';
import { Cat } from './schemas/cat.schema';

@Injectable()
@Dependencies(getModelToken(Cat.name))
export class CatsService {
  constructor(catModel) {
    this.catModel = catModel;
  }

  async create(createCatDto) {
    const createdCat = new this.catModel(createCatDto);
    return createdCat.save();
  }

  async findAll() {
    return this.catModel.find().exec();
  }
}

Connection

At times you may need to access the native Mongoose Connection object. For example, you may want to make native API calls on the connection object. You can inject the Mongoose Connection by using the @InjectConnection() decorator as follows:

import { Injectable } from '@nestjs/common';
import { InjectConnection } from '@nestjs/mongoose';
import { Connection } from 'mongoose';

@Injectable()
export class CatsService {
  constructor(@InjectConnection() private connection: Connection) {}
}

Multiple databases

Some projects require multiple database connections. This can also be achieved with this module. To work with multiple connections, first create the connections. In this case, connection naming becomes mandatory.

@@filename(app.module)
import { Module } from '@nestjs/common';
import { MongooseModule } from '@nestjs/mongoose';

@Module({
  imports: [
    MongooseModule.forRoot('mongodb://localhost/test', {
      connectionName: 'cats',
    }),
    MongooseModule.forRoot('mongodb://localhost/users', {
      connectionName: 'users',
    }),
  ],
})
export class AppModule {}

warning Notice Please note that you shouldn't have multiple connections without a name, or with the same name, otherwise they will get overridden.

With this setup, you have to tell the MongooseModule.forFeature() function which connection should be used.

@Module({
  imports: [
    MongooseModule.forFeature([{ name: Cat.name, schema: CatSchema }], 'cats'),
  ],
})
export class AppModule {}

You can also inject the Connection for a given connection:

import { Injectable } from '@nestjs/common';
import { InjectConnection } from '@nestjs/mongoose';
import { Connection } from 'mongoose';

@Injectable()
export class CatsService {
  constructor(@InjectConnection('cats') private connection: Connection) {}
}

Hooks (middleware)

Middleware (also called pre and post hooks) are functions which are passed control during execution of asynchronous functions. Middleware is specified on the schema level and is useful for writing plugins (source). Calling pre() or post() after compiling a model does not work in Mongoose. To register a hook before model registration, use the forFeatureAsync() method of the MongooseModule along with a factory provider (i.e., useFactory). With this technique, you can access a schema object, then use the pre() or post() method to register a hook on that schema. See example below:

@Module({
  imports: [
    MongooseModule.forFeatureAsync([
      {
        name: Cat.name,
        useFactory: () => {
          const schema = CatSchema;
          schema.pre('save', () => console.log('Hello from pre save'));
          return schema;
        },
      },
    ]),
  ],
})
export class AppModule {}

Like other factory providers, our factory function can be async and can inject dependencies through inject.

@Module({
  imports: [
    MongooseModule.forFeatureAsync([
      {
        name: Cat.name,
        imports: [ConfigModule],
        useFactory: (configService: ConfigService) => {
          const schema = CatSchema;
          schema.pre('save', () =>
            console.log(
              `${configService.get('APP_NAME')}: Hello from pre save`,
            ),
          );
          return schema;
        },
        inject: [ConfigService],
      },
    ]),
  ],
})
export class AppModule {}

Plugins

To register a plugin for a given schema, use the forFeatureAsync() method.

@Module({
  imports: [
    MongooseModule.forFeatureAsync([
      {
        name: Cat.name,
        useFactory: () => {
          const schema = CatSchema;
          schema.plugin(require('mongoose-autopopulate'));
          return schema;
        },
      },
    ]),
  ],
})
export class AppModule {}

To register a plugin for all schemas at once, call the .plugin() method of the Connection object. You should access the connection before models are created; to do this, use the connectionFactory:

@@filename(app.module)
import { Module } from '@nestjs/common';
import { MongooseModule } from '@nestjs/mongoose';

@Module({
  imports: [
    MongooseModule.forRoot('mongodb://localhost/test', {
      connectionFactory: (connection) => {
        connection.plugin(require('mongoose-autopopulate'));
        return connection;
      }
    }),
  ],
})
export class AppModule {}

Testing

When unit testing an application, we usually want to avoid any database connection, making our test suites simpler to set up and faster to execute. But our classes might depend on models that are pulled from the connection instance. How do we resolve these classes? The solution is to create mock models.

To make this easier, the @nestjs/mongoose package exposes a getModelToken() function that returns a prepared injection token based on a token name. Using this token, you can easily provide a mock implementation using any of the standard custom provider techniques, including useClass, useValue, and useFactory. For example:

@Module({
  providers: [
    CatsService,
    {
      provide: getModelToken(Cat.name),
      useValue: catModel,
    },
  ],
})
export class CatsModule {}

In this example, a hardcoded catModel (object instance) will be provided whenever any consumer injects a Model<Cat> using an @InjectModel() decorator.
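In a unit test, the same token can be supplied to a testing module. Below is a minimal sketch assuming the default Jest setup; the mocked find()/exec() methods are assumptions about what the service under test happens to call:

import { Test } from '@nestjs/testing';
import { getModelToken } from '@nestjs/mongoose';
import { CatsService } from './cats.service';
import { Cat } from './schemas/cat.schema';

describe('CatsService', () => {
  it('should be defined', async () => {
    const moduleRef = await Test.createTestingModule({
      providers: [
        CatsService,
        {
          provide: getModelToken(Cat.name),
          // bare-bones mock; extend with whatever methods your service calls
          useValue: { find: jest.fn(), exec: jest.fn() },
        },
      ],
    }).compile();

    const catsService = moduleRef.get<CatsService>(CatsService);
    expect(catsService).toBeDefined();
  });
});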

Async configuration

When you need to pass module options asynchronously instead of statically, use the forRootAsync() method. As with most dynamic modules, Nest provides several techniques to deal with async configuration.

One technique is to use a factory function:

MongooseModule.forRootAsync({
  useFactory: () => ({
    uri: 'mongodb://localhost/nest',
  }),
});

Like other factory providers, our factory function can be async and can inject dependencies through inject.

MongooseModule.forRootAsync({
  imports: [ConfigModule],
  useFactory: async (configService: ConfigService) => ({
    uri: configService.get<string>('MONGODB_URI'),
  }),
  inject: [ConfigService],
});

Alternatively, you can configure the MongooseModule using a class instead of a factory, as shown below:

MongooseModule.forRootAsync({
  useClass: MongooseConfigService,
});

The construction above instantiates MongooseConfigService inside MongooseModule, using it to create the required options object. Note that in this example, the MongooseConfigService has to implement the MongooseOptionsFactory interface, as shown below. The MongooseModule will call the createMongooseOptions() method on the instantiated object of the supplied class.

@Injectable()
class MongooseConfigService implements MongooseOptionsFactory {
  createMongooseOptions(): MongooseModuleOptions {
    return {
      uri: 'mongodb://localhost/nest',
    };
  }
}

If you want to reuse an existing options provider instead of creating a private copy inside the MongooseModule, use the useExisting syntax.

MongooseModule.forRootAsync({
  imports: [ConfigModule],
  useExisting: ConfigService,
});

Example

A working example is available here.

Model-View-Controller

Nest, by default, makes use of the Express library under the hood. Hence, every technique for using the MVC (Model-View-Controller) pattern in Express applies to Nest as well.

First, let's scaffold a simple Nest application using the CLI tool:

$ npm i -g @nestjs/cli
$ nest new project

In order to create an MVC app, we also need a template engine to render our HTML views:

$ npm install --save hbs

We've used the hbs (Handlebars) engine, though you can use whatever fits your requirements. Once the installation process is complete, we need to configure the express instance using the following code:

@@filename(main)
import { NestFactory } from '@nestjs/core';
import { NestExpressApplication } from '@nestjs/platform-express';
import { join } from 'path';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create<NestExpressApplication>(
    AppModule,
  );

  app.useStaticAssets(join(__dirname, '..', 'public'));
  app.setBaseViewsDir(join(__dirname, '..', 'views'));
  app.setViewEngine('hbs');

  await app.listen(3000);
}
bootstrap();
@@switch
import { NestFactory } from '@nestjs/core';
import { join } from 'path';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(
    AppModule,
  );

  app.useStaticAssets(join(__dirname, '..', 'public'));
  app.setBaseViewsDir(join(__dirname, '..', 'views'));
  app.setViewEngine('hbs');

  await app.listen(3000);
}
bootstrap();

We told Express that the public directory will be used for storing static assets, views will contain templates, and the hbs template engine should be used to render HTML output.

Template rendering

Now, let's create a views directory and index.hbs template inside it. In the template, we'll print a message passed from the controller:

<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8" />
    <title>App</title>
  </head>
  <body>
    {{ message }}
  </body>
</html>

Next, open the app.controller file and replace the root() method with the following code:

@@filename(app.controller)
import { Get, Controller, Render } from '@nestjs/common';

@Controller()
export class AppController {
  @Get()
  @Render('index')
  root() {
    return { message: 'Hello world!' };
  }
}

In this code, we are specifying the template to use in the @Render() decorator, and the return value of the route handler method is passed to the template for rendering. Notice that the return value is an object with a property message, matching the message placeholder we created in the template.

While the application is running, open your browser and navigate to http://localhost:3000. You should see the Hello world! message.

Dynamic template rendering

If the application logic must dynamically decide which template to render, then we should use the @Res() decorator, and supply the view name in our route handler, rather than in the @Render() decorator:

info Hint When Nest detects the @Res() decorator, it injects the library-specific response object. We can use this object to dynamically render the template. Learn more about the response object API here.

@@filename(app.controller)
import { Get, Controller, Res, Render } from '@nestjs/common';
import { Response } from 'express';
import { AppService } from './app.service';

@Controller()
export class AppController {
  constructor(private appService: AppService) {}

  @Get()
  root(@Res() res: Response) {
    return res.render(
      this.appService.getViewName(),
      { message: 'Hello world!' },
    );
  }
}

Example

A working example is available here.

Fastify

As mentioned in this chapter, we are able to use any compatible HTTP provider together with Nest. One such library is Fastify. In order to create an MVC application with Fastify, we have to install the following packages:

$ npm i --save fastify point-of-view handlebars

The next steps cover almost the same process used with Express, with minor differences specific to the platform. Once the installation process is complete, open the main.ts file and update its contents:

@@filename(main)
import { NestFactory } from '@nestjs/core';
import { NestFastifyApplication, FastifyAdapter } from '@nestjs/platform-fastify';
import { AppModule } from './app.module';
import { join } from 'path';

async function bootstrap() {
  const app = await NestFactory.create<NestFastifyApplication>(
    AppModule,
    new FastifyAdapter(),
  );
  app.useStaticAssets({
    root: join(__dirname, '..', 'public'),
    prefix: '/public/',
  });
  app.setViewEngine({
    engine: {
      handlebars: require('handlebars'),
    },
    templates: join(__dirname, '..', 'views'),
  });
  await app.listen(3000);
}
bootstrap();
@@switch
import { NestFactory } from '@nestjs/core';
import { FastifyAdapter } from '@nestjs/platform-fastify';
import { AppModule } from './app.module';
import { join } from 'path';

async function bootstrap() {
  const app = await NestFactory.create(AppModule, new FastifyAdapter());
  app.useStaticAssets({
    root: join(__dirname, '..', 'public'),
    prefix: '/public/',
  });
  app.setViewEngine({
    engine: {
      handlebars: require('handlebars'),
    },
    templates: join(__dirname, '..', 'views'),
  });
  await app.listen(3000);
}
bootstrap();

The Fastify API is slightly different, but the end result of those method calls remains the same. One difference to notice with Fastify is that the template name passed into the @Render() decorator must include a file extension.

@@filename(app.controller)
import { Get, Controller, Render } from '@nestjs/common';

@Controller()
export class AppController {
  @Get()
  @Render('index.hbs')
  root() {
    return { message: 'Hello world!' };
  }
}

While the application is running, open your browser and navigate to http://localhost:3000. You should see the Hello world! message.

Example

A working example is available here.

Performance (Fastify)

By default, Nest makes use of the Express framework. As mentioned earlier, Nest also provides compatibility with other libraries such as, for example, Fastify. Nest achieves this framework independence by implementing a framework adapter whose primary function is to proxy middleware and handlers to appropriate library-specific implementations.

info Hint Note that in order for a framework adapter to be implemented, the target library has to provide similar request/response pipeline processing as found in Express.

Fastify provides a good alternative framework for Nest because it solves design issues in a similar manner to Express. However, Fastify is much faster than Express, achieving almost two times better benchmark results. A fair question is: why does Nest use Express as the default HTTP provider? The reason is that Express is widely-used, well-known, and has an enormous set of compatible middleware, which is available to Nest users out-of-the-box.

But since Nest provides framework-independence, you can easily migrate between them. Fastify can be a better choice when you place high value on very fast performance. To utilize Fastify, simply choose the built-in FastifyAdapter as shown in this chapter.

Installation

First, we need to install the required package:

$ npm i --save @nestjs/platform-fastify

Adapter

Once the Fastify platform is installed, we can use the FastifyAdapter.

import { NestFactory } from '@nestjs/core';
import {
  FastifyAdapter,
  NestFastifyApplication,
} from '@nestjs/platform-fastify';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create<NestFastifyApplication>(
    AppModule,
    new FastifyAdapter(),
  );
  await app.listen(3000);
}
bootstrap();

By default, Fastify listens only on the localhost 127.0.0.1 interface (read more). If you want to accept connections on other hosts, you should specify '0.0.0.0' in the listen() call:

async function bootstrap() {
  const app = await NestFactory.create<NestFastifyApplication>(
    AppModule,
    new FastifyAdapter(),
  );
  await app.listen(3000, '0.0.0.0');
}

Platform specific packages

Keep in mind that when you use the FastifyAdapter, Nest uses Fastify as the HTTP provider. This means that each recipe that relies on Express may no longer work. You should, instead, use Fastify equivalent packages.

Redirect response

Fastify handles redirect responses slightly differently than Express. To do a proper redirect with Fastify, return both the status code and the URL, as follows:

@Get()
index(@Res() res) {
  res.status(302).redirect('/login');
}

Fastify options

You can pass options into the Fastify constructor through the FastifyAdapter constructor. For example:

new FastifyAdapter({ logger: true });

Example

A working example is available here.

Queues

Queues are a powerful design pattern that help you deal with common application scaling and performance challenges. Some examples of problems that Queues can help you solve are:

  • Smooth out processing peaks. For example, if users can initiate resource-intensive tasks at arbitrary times, you can add these tasks to a queue instead of performing them synchronously. Then you can have worker processes pull tasks from the queue in a controlled manner. You can easily add new Queue consumers to scale up the back-end task handling as the application scales up.
  • Break up monolithic tasks that may otherwise block the Node.js event loop. For example, if a user request requires CPU intensive work like audio transcoding, you can delegate this task to other processes, freeing up user-facing processes to remain responsive.
  • Provide a reliable communication channel across various services. For example, you can queue tasks (jobs) in one process or service, and consume them in another. You can be notified (by listening for status events) upon completion, error or other state changes in the job life cycle from any process or service. When Queue producers or consumers fail, their state is preserved and task handling can restart automatically when nodes are restarted.

Nest provides the @nestjs/bull package as an abstraction/wrapper on top of Bull, a popular, well supported, high performance Node.js based Queue system implementation. The package makes it easy to integrate Bull Queues in a Nest-friendly way to your application.

Bull uses Redis to persist job data, so you'll need to have Redis installed on your system. Because it is Redis-backed, your Queue architecture can be completely distributed and platform-independent. For example, you can have some Queue producers, consumers, and listeners running in Nest on one (or several) nodes, and other producers, consumers, and listeners running on other Node.js platforms on other network nodes.

This chapter covers the @nestjs/bull package. We also recommend reading the Bull documentation for more background and specific implementation details.

Installation

To begin using it, we first install the required dependencies.

$ npm install --save @nestjs/bull bull
$ npm install --save-dev @types/bull

Once the installation process is complete, we can import the BullModule into the root AppModule.

@@filename(app.module)
import { Module } from '@nestjs/common';
import { BullModule } from '@nestjs/bull';

@Module({
  imports: [
    BullModule.registerQueue({
      name: 'audio',
      redis: {
        host: 'localhost',
        port: 6379,
      },
    }),
  ],
})
export class AppModule {}

The registerQueue() method is used to instantiate and/or register queues. Queues are shared across modules and processes that connect to the same underlying Redis database with the same credentials. Each queue is unique by its name property (see below). When sharing queues (across modules/processes), the first registerQueue() method to run both instantiates the queue and registers it for that module. Other modules (in the same or separate processes) simply register the queue. Queue registration creates an injection token that can be used to access the queue in a given Nest module.

For each queue, pass a configuration object containing the following properties:

  • name: string - A queue name, which will be used as both an injection token (for injecting the queue into controllers/providers), and as an argument to decorators to associate consumer classes and listeners with queues. Required.
  • limiter: RateLimiter - Options to control the rate at which the queue's jobs are processed. See RateLimiter for more information. Optional.
  • redis: RedisOpts - Options to configure the Redis connection. See RedisOpts for more information. Optional.
  • prefix: string - Prefix for all queue keys. Optional.
  • defaultJobOptions: JobOpts - Options to control the default settings for new jobs. See JobOpts for more information. Optional.
  • settings: AdvancedSettings - Advanced Queue configuration settings. These should usually not be changed. See AdvancedSettings for more information. Optional.

As noted, the name property is required. The rest of the options are optional, providing detailed control over queue behavior. These are passed directly to the Bull Queue constructor. Read more about these options here. When registering a queue in a second or subsequent module, it is best practice to omit all options but the name property from the configuration object. These options should be specified only in the module that instantiates the queue.
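
For example, a second module that only consumes the already-instantiated queue registers it by name alone (the module name below is hypothetical):

@@filename(audio-consumer.module)
import { Module } from '@nestjs/common';
import { BullModule } from '@nestjs/bull';

@Module({
  imports: [
    // The 'audio' queue was instantiated (with its Redis options) in AppModule;
    // registering by name here only creates the injection token for this module.
    BullModule.registerQueue({
      name: 'audio',
    }),
  ],
})
export class AudioConsumerModule {}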

info Hint Create multiple queues by passing multiple comma-separated configuration objects to the registerQueue() method.

Since jobs are persisted in Redis, each time a specific named queue is instantiated (e.g., when an app is started/restarted), it attempts to process any old jobs that may exist from a previous unfinished session.

Each queue can have one or many producers, consumers, and listeners. Consumers retrieve jobs from the queue in a specific order: FIFO (the default), LIFO, or according to priorities. Controlling queue processing order is discussed here.

Producers

Job producers add jobs to queues. Producers are typically application services (Nest providers). To add jobs to a queue, first inject the queue into the service as follows:

import { Injectable } from '@nestjs/common';
import { Queue } from 'bull';
import { InjectQueue } from '@nestjs/bull';

@Injectable()
export class AudioService {
  constructor(@InjectQueue('audio') private audioQueue: Queue) {}
}

info Hint The @InjectQueue() decorator identifies the queue by its name, as provided in the registerQueue() method call (e.g., 'audio').

Now, add a job by calling the queue's add() method, passing a user-defined job object. Jobs are represented as serializable JavaScript objects (since that is how they are stored in the Redis database). The shape of the job you pass is arbitrary; use it to represent the semantics of your job object.

const job = await this.audioQueue.add({
  foo: 'bar',
});

Named jobs

Jobs may have unique names. This allows you to create specialized consumers that will only process jobs with a given name.

const job = await this.audioQueue.add('transcode', {
  foo: 'bar',
});

Warning Warning When using named jobs, you must create processors for each unique name added to a queue, or the queue will complain that you are missing a processor for the given job. See here for more information on consuming named jobs.

Job options

Jobs can have additional options associated with them. Pass an options object after the job argument in the Queue.add() method. Job options properties are:

  • priority: number - Optional priority value. Ranges from 1 (highest priority) to MAX_INT (lowest priority). Note that using priorities has a slight impact on performance, so use them with caution.
  • delay: number - An amount of time (milliseconds) to wait until this job can be processed. Note that for accurate delays, both server and clients should have their clocks synchronized.
  • attempts: number - The total number of attempts to try the job until it completes.
  • repeat: RepeatOpts - Repeat job according to a cron specification. See RepeatOpts.
  • backoff: number | BackoffOpts - Backoff setting for automatic retries if the job fails. See BackoffOpts.
  • lifo: boolean - If true, adds the job to the right end of the queue instead of the left (default false).
  • timeout: number - The number of milliseconds after which the job should fail with a timeout error.
  • jobId: number | string - Override the job ID - by default, the job ID is a unique integer, but you can use this setting to override it. If you use this option, it is up to you to ensure the jobId is unique. If you attempt to add a job with an id that already exists, it will not be added.
  • removeOnComplete: boolean | number - If true, removes the job when it successfully completes. A numeric value specifies how many completed jobs to keep. Default behavior is to keep the job in the completed set.
  • removeOnFail: boolean | number - If true, removes the job when it fails after all attempts. A numeric value specifies how many failed jobs to keep. Default behavior is to keep the job in the failed set.
  • stackTraceLimit: number - Limits the number of stack trace lines that will be recorded in the stacktrace.

Here are a few examples of customizing jobs with job options.

To delay the start of a job, use the delay configuration property.

const job = await this.audioQueue.add(
  {
    foo: 'bar',
  },
  { delay: 3000 }, // 3 seconds delayed
);

To add a job to the right end of the queue (process the job as LIFO (Last In First Out)), set the lifo property of the configuration object to true.

const job = await this.audioQueue.add(
  {
    foo: 'bar',
  },
  { lifo: true },
);

To prioritize a job, use the priority property.

const job = await this.audioQueue.add(
  {
    foo: 'bar',
  },
  { priority: 2 },
);
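
The options listed above can also be combined in a single options object. The following sketch (values are illustrative) retries a failing job, waits between attempts, and removes it from Redis once it succeeds:

const job = await this.audioQueue.add(
  {
    foo: 'bar',
  },
  {
    attempts: 3, // try the job up to 3 times in total
    backoff: 5000, // wait 5 seconds between retries
    removeOnComplete: true, // drop the job from the completed set once it succeeds
  },
);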

Consumers

A consumer is a class defining methods that either process jobs added into the queue, or listen for events on the queue, or both. Declare a consumer class using the @Processor() decorator as follows:

import { Processor } from '@nestjs/bull';

@Processor('audio')
export class AudioConsumer {}

The decorator's string argument (e.g., 'audio') is the name of the queue to be associated with the class methods.

Within a consumer class, declare job handlers by decorating handler methods with the @Process() decorator.

import { Processor, Process } from '@nestjs/bull';
import { Job } from 'bull';

@Processor('audio')
export class AudioConsumer {
  @Process()
  async transcode(job: Job<unknown>) {
    let progress = 0;
    for (let i = 0; i < 10; i++) {
      await doSomething(job.data);
      progress += 10;
      job.progress(progress);
    }
    return {};
  }
}

The decorated method (e.g., transcode()) is called whenever the worker is idle and there are jobs to process in the queue. This handler method receives the job object as its only argument. The value returned by the handler method is stored in the job object and can be accessed later on, for example in a listener for the completed event.

Job objects have multiple methods that allow you to interact with their state. For example, the above code uses the progress() method to update the job's progress. See here for the complete Job object API reference.

You can designate that a job handler method will handle only jobs of a certain type (jobs with a specific name) by passing that name to the @Process() decorator as shown below. You can have multiple @Process() handlers in a given consumer class, corresponding to each job type (name). When you use named jobs, be sure to have a handler corresponding to each name.

@Process('transcode')
async transcode(job: Job<unknown>) { ... }
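
For example, a consumer that handles two job types might look like the following sketch (the 'translate' job name and the handler bodies are illustrative only):

import { Processor, Process } from '@nestjs/bull';
import { Job } from 'bull';

@Processor('audio')
export class AudioConsumer {
  @Process('transcode')
  async transcode(job: Job<unknown>) {
    // handles jobs added with this.audioQueue.add('transcode', { ... })
  }

  @Process('translate')
  async translate(job: Job<unknown>) {
    // handles jobs added with this.audioQueue.add('translate', { ... })
  }
}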

Event listeners

Bull generates a set of useful events when queue and/or job state changes occur. Nest provides a set of decorators that allow subscribing to a core set of standard events. These are exported from the @nestjs/bull package.

Event listeners must be declared within a consumer class (i.e., within a class decorated with the @Processor() decorator). To listen for an event, use one of the decorators in the table below to declare a handler for the event. For example, to listen to the event emitted when a job enters the active state in the audio queue, use the following construct:

import { Processor, Process, OnQueueActive } from '@nestjs/bull';
import { Job } from 'bull';

@Processor('audio')
export class AudioConsumer {

  @OnQueueActive()
  onActive(job: Job) {
    console.log(
      `Processing job ${job.id} of type ${job.name} with data ${job.data}...`,
    );
  }
  ...

Since Bull operates in a distributed (multi-node) environment, it defines the concept of event locality. This concept recognizes that events may be triggered either entirely within a single process, or on shared queues from different processes. A local event is one that is produced when an action or state change is triggered on a queue in the local process. In other words, when your event producers and consumers are local to a single process, all events happening on queues are local.

When a queue is shared across multiple processes, we encounter the possibility of global events. For a listener in one process to receive an event notification triggered by another process, it must register for a global event.

Event handlers are invoked whenever their corresponding event is emitted. The handler is called with the signature shown in the table below, providing access to information relevant to the event. We discuss one key difference between local and global event handler signatures below.

Local event listeners / Global event listeners - Handler method signature / When fired

@OnQueueError() / @OnGlobalQueueError() - handler(error: Error) - An error occurred. error contains the triggering error.
@OnQueueWaiting() / @OnGlobalQueueWaiting() - handler(jobId: number | string) - A Job is waiting to be processed as soon as a worker is idling. jobId contains the id for the job that has entered this state.
@OnQueueActive() / @OnGlobalQueueActive() - handler(job: Job) - Job job has started.
@OnQueueStalled() / @OnGlobalQueueStalled() - handler(job: Job) - Job job has been marked as stalled. This is useful for debugging job workers that crash or pause the event loop.
@OnQueueProgress() / @OnGlobalQueueProgress() - handler(job: Job, progress: number) - Job job's progress was updated to value progress.
@OnQueueCompleted() / @OnGlobalQueueCompleted() - handler(job: Job, result: any) - Job job successfully completed with a result result.
@OnQueueFailed() / @OnGlobalQueueFailed() - handler(job: Job, err: Error) - Job job failed with reason err.
@OnQueuePaused() / @OnGlobalQueuePaused() - handler() - The queue has been paused.
@OnQueueResumed() / @OnGlobalQueueResumed() - handler(job: Job) - The queue has been resumed.
@OnQueueCleaned() / @OnGlobalQueueCleaned() - handler(jobs: Job[], type: string) - Old jobs have been cleaned from the queue. jobs is an array of cleaned jobs, and type is the type of jobs cleaned.
@OnQueueDrained() / @OnGlobalQueueDrained() - handler() - Emitted whenever the queue has processed all the waiting jobs (even if there can be some delayed jobs not yet processed).
@OnQueueRemoved() / @OnGlobalQueueRemoved() - handler(job: Job) - Job job was successfully removed.

When listening for global events, the method signatures can be slightly different from their local counterpart. Specifically, any method signature that receives job objects in the local version, instead receives a jobId (number) in the global version. To get a reference to the actual job object in such a case, use the Queue#getJob method. This call should be awaited, and therefore the handler should be declared async. For example:

@OnGlobalQueueCompleted()
async onGlobalCompleted(jobId: number, result: any) {
  const job = await this.immediateQueue.getJob(jobId);
  console.log('(Global) on completed: job ', job.id, ' -> result: ', result);
}

info Hint To access the Queue object (to make a getJob() call), you must of course inject it. Also, the Queue must be registered in the module where you are injecting it.
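
Putting that hint into practice, the consumer can inject the queue it processes and resolve the job inside the global listener. A minimal sketch, reusing the 'audio' queue from this chapter:

import { InjectQueue, OnGlobalQueueCompleted, Processor } from '@nestjs/bull';
import { Queue } from 'bull';

@Processor('audio')
export class AudioConsumer {
  constructor(@InjectQueue('audio') private audioQueue: Queue) {}

  @OnGlobalQueueCompleted()
  async onGlobalCompleted(jobId: number, result: any) {
    // global events carry only the job id, so fetch the full Job object
    const job = await this.audioQueue.getJob(jobId);
    console.log('(Global) on completed: job ', job.id, ' -> result: ', result);
  }
}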

In addition to the specific event listener decorators, you can also use the generic @OnQueueEvent() decorator in combination with either BullQueueEvents or BullQueueGlobalEvents enums. Read more about events here.

Queue management

Queues have an API that allows you to perform management functions like pausing and resuming, retrieving the count of jobs in various states, and several others. You can find the full queue API here. Invoke any of these methods directly on the Queue object, as shown below with the pause/resume examples.

Pause a queue with the pause() method call. A paused queue will not process new jobs until resumed, but current jobs being processed will continue until they are finalized.

await audioQueue.pause();

To resume a paused queue, use the resume() method, as follows:

await audioQueue.resume();

Async configuration

You may want to pass your queue options asynchronously instead of statically. In this case, use the registerQueueAsync() method, which provides several ways to deal with async configuration.

One approach is to use a factory function:

BullModule.registerQueueAsync({
  name: 'audio',
  useFactory: () => ({
    redis: {
      host: 'localhost',
      port: 6379,
    },
  }),
});

Our factory behaves like any other asynchronous provider (e.g., it can be async and it's able to inject dependencies through inject).

BullModule.registerQueueAsync({
  name: 'audio',
  imports: [ConfigModule],
  useFactory: async (configService: ConfigService) => ({
    redis: {
      host: configService.get('QUEUE_HOST'),
      port: +configService.get('QUEUE_PORT'),
    },
  }),
  inject: [ConfigService],
});

Alternatively, you can use the useClass syntax:

BullModule.registerQueueAsync({
  name: 'audio',
  useClass: BullConfigService,
});

The construction above will instantiate BullConfigService inside BullModule and use it to provide an options object by calling createBullOptions(). Note that this means that the BullConfigService has to implement the BullOptionsFactory interface, as shown below:

@Injectable()
class BullConfigService implements BullOptionsFactory {
  createBullOptions(): BullModuleOptions {
    return {
      redis: {
        host: 'localhost',
        port: 6379,
      },
    };
  }
}

In order to prevent the creation of BullConfigService inside BullModule and use a provider imported from a different module, you can use the useExisting syntax.

BullModule.registerQueueAsync({
  name: 'audio',
  imports: [ConfigModule],
  useExisting: ConfigService,
});

This construction works the same as useClass with one critical difference - BullModule will lookup imported modules to reuse an existing ConfigService instead of instantiating a new one.

Example

A working example is available here.

Security

In this chapter we cover various techniques that help you to increase the security of your applications.

Helmet

Helmet can help protect your app from some well-known web vulnerabilities by setting HTTP headers appropriately. Generally, Helmet is just a collection of 14 smaller middleware functions that set security-related HTTP headers (read more).

Start by installing the required package. If you are using Express (default in Nest):

$ npm i --save helmet

Once the installation is complete, apply it as a global middleware.

import * as helmet from 'helmet';
// somewhere in your initialization file
app.use(helmet());

If you are using the FastifyAdapter, you'll need fastify-helmet instead:

$ npm i --save fastify-helmet

fastify-helmet should not be used as a middleware, but as a Fastify plugin, i.e., by using app.register():

import * as helmet from 'fastify-helmet';
// somewhere in your initialization file
app.register(helmet);
// or the following, but note that it's not type safe
// app.getHttpAdapter().register(helmet);

info Hint Note that applying helmet as global middleware or registering it must come before other calls to app.use() or setup functions that may call app.use(). This is due to the way the underlying platform (i.e., Express or Fastify) works, where the order in which middleware/routes are defined matters. If you use middleware like helmet or cors after you define a route, then that middleware will not apply to that route; it will only apply to routes defined after the middleware.

CORS

Cross-origin resource sharing (CORS) is a mechanism that allows resources to be requested from another domain. Under the hood, Nest makes use of the Express cors package. This package provides various options that you can customize based on your requirements. To enable CORS, call the enableCors() method on the Nest application object.

const app = await NestFactory.create(AppModule);
app.enableCors();
await app.listen(3000);

The enableCors() method takes an optional configuration object argument. The available properties of this object are described in the official CORS documentation.

Alternatively, enable CORS via the create() method's options object. Set the cors property to true to enable CORS with default settings. Alternatively, pass a CORS configuration object as the cors property value to customize its behavior.

const app = await NestFactory.create(AppModule, { cors: true });
await app.listen(3000);
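
To customize the behavior instead, pass a CORS configuration object as the cors value. The origin below is purely illustrative:

const app = await NestFactory.create(AppModule, {
  cors: {
    origin: 'https://example.com', // allow requests from this origin only
    methods: 'GET,POST',
    credentials: true,
  },
});
await app.listen(3000);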

CSRF

Cross-site request forgery (also known as CSRF or XSRF) is a type of malicious exploit of a website where unauthorized commands are transmitted from a user that the web application trusts. To mitigate this kind of attack you can use the csurf package.

Start by installing the required package:

$ npm i --save csurf

warning Warning As explained on the csurf middleware page, the csurf module requires either session middleware or a cookie-parser to be initialized first. Please see that documentation for further instructions.

Once the installation is complete, apply the csurf middleware as global middleware.

import * as csurf from 'csurf';
// somewhere in your initialization file
app.use(csurf());

Rate limiting

A common technique to protect applications from brute-force attacks is rate-limiting. Many Express packages exist to provide a rate-limiting feature. A popular one is express-rate-limit.

Start by installing the required package:

$ npm i --save express-rate-limit

Once the installation is complete, apply the rate-limiter as global middleware.

import * as rateLimit from 'express-rate-limit';
// somewhere in your initialization file
app.use(
  rateLimit({
    windowMs: 15 * 60 * 1000, // 15 minutes
    max: 100, // limit each IP to 100 requests per windowMs
  }),
);

When there is a load balancer or reverse proxy between the server and the internet, Express may need to be configured to trust the headers set by the proxy in order to get the correct IP for the end user. To do so, first use the NestExpressApplication platform interface when creating your app instance, then enable the trust proxy setting:

const app = await NestFactory.create<NestExpressApplication>(AppModule);
// see https://expressjs.com/en/guide/behind-proxies.html
app.set('trust proxy', 1);

info Hint If you use the FastifyAdapter, consider using fastify-rate-limit instead.

Serialization

Serialization is a process that happens before objects are returned in a network response. This is an appropriate place to provide rules for transforming and sanitizing the data to be returned to the client. For example, sensitive data like passwords should always be excluded from the response. Or, certain properties might require additional transformation, such as sending only a subset of properties of an entity. Performing these transformations manually can be tedious and error prone, and can leave you uncertain that all cases have been covered.

Overview

Nest provides a built-in capability to help ensure that these operations can be performed in a straightforward way. The ClassSerializerInterceptor interceptor uses the powerful class-transformer package to provide a declarative and extensible way of transforming objects. The basic operation it performs is to take the value returned by a method handler and apply the classToPlain() function from class-transformer. In doing so, it can apply rules expressed by class-transformer decorators on an entity/DTO class, as described below.

Exclude properties

Let's assume that we want to automatically exclude a password property from a user entity. We annotate the entity as follows:

import { Exclude } from 'class-transformer';

export class UserEntity {
  id: number;
  firstName: string;
  lastName: string;

  @Exclude()
  password: string;

  constructor(partial: Partial<UserEntity>) {
    Object.assign(this, partial);
  }
}

Now consider a controller with a method handler that returns an instance of this class.

@UseInterceptors(ClassSerializerInterceptor)
@Get()
findOne(): UserEntity {
  return new UserEntity({
    id: 1,
    firstName: 'Kamil',
    lastName: 'Mysliwiec',
    password: 'password',
  });
}

Warning Note that we must return an instance of the class. If you return a plain JavaScript object, for example, { user: new UserEntity() }, the object won't be properly serialized.

info Hint The ClassSerializerInterceptor is imported from @nestjs/common.

When this endpoint is requested, the client receives the following response:

{
  "id": 1,
  "firstName": "Kamil",
  "lastName": "Mysliwiec"
}

Note that the interceptor can be applied application-wide (as covered here). The combination of the interceptor and the entity class declaration ensures that any method that returns a UserEntity will be sure to remove the password property. This gives you a measure of centralized enforcement of this business rule.
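
As a sketch of the application-wide approach, the interceptor can be registered globally in the bootstrap function; it takes the Reflector instance retrieved from the application container:

import { ClassSerializerInterceptor } from '@nestjs/common';
import { NestFactory, Reflector } from '@nestjs/core';

const app = await NestFactory.create(AppModule);
app.useGlobalInterceptors(new ClassSerializerInterceptor(app.get(Reflector)));
await app.listen(3000);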

Expose properties

You can use the @Expose() decorator to provide alias names for properties, or to execute a function to calculate a property value (analogous to getter functions), as shown below.

@Expose()
get fullName(): string {
  return `${this.firstName} ${this.lastName}`;
}

Transform

You can perform additional data transformation using the @Transform() decorator. For example, the following construct returns the name property of the RoleEntity instead of returning the whole object.

@Transform(role => role.name)
role: RoleEntity;

Pass options

You may want to modify the default behavior of the transformation functions. To override default settings, pass them in an options object with the @SerializeOptions() decorator.

@SerializeOptions({
  excludePrefixes: ['_'],
})
@Get()
findOne(): UserEntity {
  return new UserEntity();
}

info Hint The @SerializeOptions() decorator is imported from @nestjs/common.

Options passed via @SerializeOptions() are passed as the second argument of the underlying classToPlain() function. In this example, we are automatically excluding all properties that begin with the _ prefix.
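
Other class-transformer options can be passed the same way. As an illustrative sketch (the adminNotes property and route below are not part of the example above), the groups option exposes a property only when the matching group is enabled for the handler:

// on the entity/DTO class
@Expose({ groups: ['admin'] })
adminNotes: string;

// on the controller handler
@SerializeOptions({ groups: ['admin'] })
@Get('admin-view')
findOneForAdmin(): UserEntity {
  return new UserEntity();
}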

WebSockets and Microservices

While this chapter shows examples using HTTP style applications (e.g., Express or Fastify), the ClassSerializerInterceptor works the same for WebSockets and Microservices, regardless of the transport method that is used.

Learn more

Read more about available decorators and options as provided by the class-transformer package here.

Database

Nest is database agnostic, allowing you to easily integrate with any SQL or NoSQL database. You have a number of options available to you, depending on your preferences. At the most general level, connecting Nest to a database is simply a matter of loading an appropriate Node.js driver for the database, just as you would with Express or Fastify.

You can also directly use any general-purpose Node.js database integration library or ORM, such as Sequelize (navigate to the Sequelize integration section), Knex.js (tutorial), TypeORM, and Prisma (recipe), to operate at a higher level of abstraction.

For convenience, Nest provides tight integration with TypeORM and Sequelize out-of-the-box with the @nestjs/typeorm and @nestjs/sequelize packages respectively, which we'll cover in the current chapter, and Mongoose with @nestjs/mongoose, which is covered in this chapter. These integrations provide additional NestJS-specific features, such as model/repository injection, testability, and asynchronous configuration to make accessing your chosen database even easier.

TypeORM Integration

For integrating with SQL and NoSQL databases, Nest provides the @nestjs/typeorm package. Nest uses TypeORM because it's the most mature Object Relational Mapper (ORM) available for TypeScript. Since it's written in TypeScript, it integrates well with the Nest framework.

To begin using it, we first install the required dependencies. In this chapter, we'll demonstrate using the popular MySQL Relational DBMS, but TypeORM provides support for many relational databases, such as PostgreSQL, Oracle, Microsoft SQL Server, SQLite, and even NoSQL databases like MongoDB. The procedure we walk through in this chapter will be the same for any database supported by TypeORM. You'll simply need to install the associated client API libraries for your selected database.

$ npm install --save @nestjs/typeorm typeorm mysql

Once the installation process is complete, we can import the TypeOrmModule into the root AppModule.

@@filename(app.module)
import { Module } from '@nestjs/common';
import { TypeOrmModule } from '@nestjs/typeorm';

@Module({
  imports: [
    TypeOrmModule.forRoot({
      type: 'mysql',
      host: 'localhost',
      port: 3306,
      username: 'root',
      password: 'root',
      database: 'test',
      entities: [],
      synchronize: true,
    }),
  ],
})
export class AppModule {}

The forRoot() method supports all the configuration properties exposed by the createConnection() function from the TypeORM package. In addition, there are several extra configuration properties described below.

retryAttempts Number of attempts to connect to the database (default: 10)
retryDelay Delay between connection retry attempts (ms) (default: 3000)
autoLoadEntities If true, entities will be loaded automatically (default: false)
keepConnectionAlive If true, connection will not be closed on application shutdown (default: false)

info Hint Learn more about the connection options here.

Alternatively, rather than passing a configuration object to forRoot(), we can create an ormconfig.json file in the project root directory.

{
  "type": "mysql",
  "host": "localhost",
  "port": 3306,
  "username": "root",
  "password": "root",
  "database": "test",
  "entities": ["dist/**/*.entity{.ts,.js}"],
  "synchronize": true
}

Then, we can call forRoot() without any options:

@@filename(app.module)
import { Module } from '@nestjs/common';
import { TypeOrmModule } from '@nestjs/typeorm';

@Module({
  imports: [TypeOrmModule.forRoot()],
})
export class AppModule {}

warning Warning Static glob paths (e.g., dist/**/*.entity{.ts,.js}) won't work properly with webpack.

warning Warning Note that the ormconfig.json file is loaded by the typeorm library. Thus, any of the extra properties described above (which are supported internally by way of the forRoot() method - for example, autoLoadEntities and retryDelay) won't be applied.

Once this is done, the TypeORM Connection and EntityManager objects will be available to inject across the entire project (without needing to import any modules), for example:

@@filename(app.module)
import { Connection } from 'typeorm';

@Module({
  imports: [TypeOrmModule.forRoot(), UsersModule],
})
export class AppModule {
  constructor(private connection: Connection) {}
}
@@switch
import { Connection } from 'typeorm';

@Dependencies(Connection)
@Module({
  imports: [TypeOrmModule.forRoot(), UsersModule],
})
export class AppModule {
  constructor(connection) {
    this.connection = connection;
  }
}

Repository pattern

TypeORM supports the repository design pattern, so each entity has its own repository. These repositories can be obtained from the database connection.

To continue the example, we need at least one entity. Let's define the User entity.

@@filename(user.entity)
import { Entity, Column, PrimaryGeneratedColumn } from 'typeorm';

@Entity()
export class User {
  @PrimaryGeneratedColumn()
  id: number;

  @Column()
  firstName: string;

  @Column()
  lastName: string;

  @Column({ default: true })
  isActive: boolean;
}

info Hint Learn more about entities in the TypeORM documentation.

The User entity file sits in the users directory. This directory contains all files related to the UsersModule. You can decide where to keep your model files, however, we recommend creating them near their domain, in the corresponding module directory.

To begin using the User entity, we need to let TypeORM know about it by inserting it into the entities array in the module forRoot() method options (unless you use a static glob path):

@@filename(app.module)
import { Module } from '@nestjs/common';
import { TypeOrmModule } from '@nestjs/typeorm';
import { User } from './users/user.entity';

@Module({
  imports: [
    TypeOrmModule.forRoot({
      type: 'mysql',
      host: 'localhost',
      port: 3306,
      username: 'root',
      password: 'root',
      database: 'test',
      entities: [User],
      synchronize: true,
    }),
  ],
})
export class AppModule {}

Next, let's look at the UsersModule:

@@filename(users.module)
import { Module } from '@nestjs/common';
import { TypeOrmModule } from '@nestjs/typeorm';
import { UsersService } from './users.service';
import { UsersController } from './users.controller';
import { User } from './user.entity';

@Module({
  imports: [TypeOrmModule.forFeature([User])],
  providers: [UsersService],
  controllers: [UsersController],
})
export class UsersModule {}

This module uses the forFeature() method to define which repositories are registered in the current scope. With that in place, we can inject the UsersRepository into the UsersService using the @InjectRepository() decorator:

@@filename(users.service)
import { Injectable } from '@nestjs/common';
import { InjectRepository } from '@nestjs/typeorm';
import { Repository } from 'typeorm';
import { User } from './user.entity';

@Injectable()
export class UsersService {
  constructor(
    @InjectRepository(User)
    private usersRepository: Repository<User>,
  ) {}

  findAll(): Promise<User[]> {
    return this.usersRepository.find();
  }

  findOne(id: string): Promise<User> {
    return this.usersRepository.findOne(id);
  }

  async remove(id: string): Promise<void> {
    await this.usersRepository.delete(id);
  }
}
@@switch
import { Injectable, Dependencies } from '@nestjs/common';
import { getRepositoryToken } from '@nestjs/typeorm';
import { User } from './user.entity';

@Injectable()
@Dependencies(getRepositoryToken(User))
export class UsersService {
  constructor(usersRepository) {
    this.usersRepository = usersRepository;
  }

  findAll() {
    return this.usersRepository.find();
  }

  findOne(id) {
    return this.usersRepository.findOne(id);
  }

  async remove(id) {
    await this.usersRepository.delete(id);
  }
}

warning Notice Don't forget to import the UsersModule into the root AppModule.

If you want to use the repository outside of the module which imports TypeOrmModule.forFeature, you'll need to re-export the providers generated by it. You can do this by exporting the whole module, like this:

@@filename(users.module)
import { Module } from '@nestjs/common';
import { TypeOrmModule } from '@nestjs/typeorm';
import { User } from './user.entity';

@Module({
  imports: [TypeOrmModule.forFeature([User])],
  exports: [TypeOrmModule]
})
export class UsersModule {}

Now if we import UsersModule in UserHttpModule, we can use @InjectRepository(User) in the providers of the latter module.

@@filename(users-http.module)
import { Module } from '@nestjs/common';
import { UsersModule } from './user.module';
import { UsersService } from './users.service';
import { UsersController } from './users.controller';

@Module({
  imports: [UsersModule],
  providers: [UsersService],
  controllers: [UsersController]
})
export class UserHttpModule {}

Relations

Relations are associations established between two or more tables. Relations are based on common fields from each table, often involving primary and foreign keys.

There are three types of relations:

One-to-one Every row in the primary table has one and only one associated row in the foreign table. Use the @OneToOne() decorator to define this type of relation.
One-to-many / Many-to-one Every row in the primary table has one or more related rows in the foreign table. Use the @OneToMany() and @ManyToOne() decorators to define this type of relation.
Many-to-many Every row in the primary table has many related rows in the foreign table, and every record in the foreign table has many related rows in the primary table. Use the @ManyToMany() decorator to define this type of relation.

To define relations in entities, use the corresponding decorators. For example, to define that each User can have multiple photos, use the @OneToMany() decorator.

@@filename(user.entity)
import { Entity, Column, PrimaryGeneratedColumn, OneToMany } from 'typeorm';
import { Photo } from '../photos/photo.entity';

@Entity()
export class User {
  @PrimaryGeneratedColumn()
  id: number;

  @Column()
  firstName: string;

  @Column()
  lastName: string;

  @Column({ default: true })
  isActive: boolean;

  @OneToMany(type => Photo, photo => photo.user)
  photos: Photo[];
}
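
The photo.user inverse side referenced above lives on the Photo entity. A minimal sketch, assuming the Photo entity sits in a photos directory and has an illustrative url column:

@@filename(photo.entity)
import { Entity, Column, PrimaryGeneratedColumn, ManyToOne } from 'typeorm';
import { User } from '../users/user.entity';

@Entity()
export class Photo {
  @PrimaryGeneratedColumn()
  id: number;

  @Column()
  url: string;

  // the inverse side of user.photos
  @ManyToOne(type => User, user => user.photos)
  user: User;
}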

info Hint To learn more about relations in TypeORM, visit the TypeORM documentation.

Auto-load entities

Manually adding entities to the entities array of the connection options can be tedious. In addition, referencing entities from the root module breaks application domain boundaries and leaks implementation details to other parts of the application. To solve this issue, static glob paths can be used (e.g., dist/**/*.entity{.ts,.js}).

Note, however, that glob paths are not supported by webpack, so if you are building your application within a monorepo, you won't be able to use them. To address this issue, an alternative solution is provided. To automatically load entities, set the autoLoadEntities property of the configuration object (passed into the forRoot() method) to true, as shown below:

@@filename(app.module)
import { Module } from '@nestjs/common';
import { TypeOrmModule } from '@nestjs/typeorm';

@Module({
  imports: [
    TypeOrmModule.forRoot({
      ...
      autoLoadEntities: true,
    }),
  ],
})
export class AppModule {}

With that option specified, every entity registered through the forFeature() method will be automatically added to the entities array of the configuration object.

warning Warning Note that entities that aren't registered through the forFeature() method, but are only referenced from the entity (via a relationship), won't be included by way of the autoLoadEntities setting.

Separating entity definition

You can define an entity and its columns right in the model, using decorators. But some people prefer to define entities and their columns inside separate files using the "entity schemas".

import { EntitySchema } from 'typeorm';
import { User } from './user.entity';

export const UserSchema = new EntitySchema<User>({
  name: 'User',
  target: User,
  columns: {
    id: {
      type: Number,
      primary: true,
      generated: true,
    },
    firstName: {
      type: String,
    },
    lastName: {
      type: String,
    },
    isActive: {
      type: Boolean,
      default: true,
    },
  },
  relations: {
    photos: {
      type: 'one-to-many',
      target: 'Photo', // the name of the PhotoSchema
    },
  },
});

warning Warning If you provide the target option, the name option value has to be the same as the name of the target class. If you do not provide the target, you can use any name.

Nest allows you to use an EntitySchema instance wherever an Entity is expected, for example:

import { Module } from '@nestjs/common';
import { TypeOrmModule } from '@nestjs/typeorm';
import { UserSchema } from './user.schema';
import { UsersController } from './users.controller';
import { UsersService } from './users.service';

@Module({
  imports: [TypeOrmModule.forFeature([UserSchema])],
  providers: [UsersService],
  controllers: [UsersController],
})
export class UsersModule {}

Transactions

A database transaction symbolizes a unit of work performed within a database management system against a database, and treated in a coherent and reliable way independent of other transactions. A transaction generally represents any change in a database (learn more).

There are many different strategies to handle TypeORM transactions. We recommend using the QueryRunner class because it gives full control over the transaction.

First, we need to inject the Connection object into a class in the normal way:

@Injectable()
export class UsersService {
  constructor(private connection: Connection) {}
}

info Hint The Connection class is imported from the typeorm package.

Now, we can use this object to create a transaction.

async createMany(users: User[]) {
  const queryRunner = this.connection.createQueryRunner();

  await queryRunner.connect();
  await queryRunner.startTransaction();
  try {
    await queryRunner.manager.save(users[0]);
    await queryRunner.manager.save(users[1]);

    await queryRunner.commitTransaction();
  } catch (err) {
    // since we have errors lets rollback the changes we made
    await queryRunner.rollbackTransaction();
  } finally {
    // you need to release a queryRunner which was manually instantiated
    await queryRunner.release();
  }
}

info Hint Note that the connection is used only to create the QueryRunner. However, to test this class would require mocking the entire Connection object (which exposes several methods). Thus, we recommend using a helper factory class (e.g., QueryRunnerFactory) and defining an interface with a limited set of methods required to maintain transactions. This technique makes mocking these methods pretty straightforward.
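
A possible shape for such a factory, purely as a sketch (the TransactionRunner interface name is an assumption; only the QueryRunner members needed for transactions are exposed):

import { Injectable } from '@nestjs/common';
import { Connection, EntityManager } from 'typeorm';

// Narrow interface so tests only have to mock these members.
interface TransactionRunner {
  connect(): Promise<void>;
  startTransaction(): Promise<void>;
  commitTransaction(): Promise<void>;
  rollbackTransaction(): Promise<void>;
  release(): Promise<void>;
  manager: EntityManager;
}

@Injectable()
export class QueryRunnerFactory {
  constructor(private connection: Connection) {}

  createQueryRunner(): TransactionRunner {
    // the real QueryRunner satisfies the narrower interface structurally
    return this.connection.createQueryRunner();
  }
}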

Alternatively, you can use the callback-style approach with the transaction method of the Connection object (read more).

async createMany(users: User[]) {
  await this.connection.transaction(async manager => {
    await manager.save(users[0]);
    await manager.save(users[1]);
  });
}

Using decorators to control the transaction (@Transaction() and @TransactionManager()) is not recommended.

Subscribers

With TypeORM subscribers, you can listen to specific entity events.

import {
  Connection,
  EntitySubscriberInterface,
  EventSubscriber,
  InsertEvent,
} from 'typeorm';
import { User } from './user.entity';

@EventSubscriber()
export class UserSubscriber implements EntitySubscriberInterface<User> {
  constructor(connection: Connection) {
    connection.subscribers.push(this);
  }

  listenTo() {
    return User;
  }

  beforeInsert(event: InsertEvent<User>) {
    console.log(`BEFORE USER INSERTED: `, event.entity);
  }
}

error Warning Event subscribers can not be request-scoped.

Now, add the UserSubscriber class to the providers array:

import { Module } from '@nestjs/common';
import { TypeOrmModule } from '@nestjs/typeorm';
import { User } from './user.entity';
import { UsersController } from './users.controller';
import { UsersService } from './users.service';
import { UserSubscriber } from './user.subscriber';

@Module({
  imports: [TypeOrmModule.forFeature([User])],
  providers: [UsersService, UserSubscriber],
  controllers: [UsersController],
})
export class UsersModule {}

info Hint Learn more about entity subscribers here.

Migrations

Migrations provide a way to incrementally update the database schema to keep it in sync with the application's data model while preserving existing data in the database. To generate, run, and revert migrations, TypeORM provides a dedicated CLI.

Migration classes are separate from the Nest application source code. Their lifecycle is maintained by the TypeORM CLI. Therefore, you are not able to leverage dependency injection and other Nest specific features with migrations. To learn more about migrations, follow the guide in the TypeORM documentation.
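
For reference, a migration generated by the TypeORM CLI is a plain class implementing MigrationInterface; the class name and SQL below are illustrative only:

import { MigrationInterface, QueryRunner } from 'typeorm';

export class AddUserIsActive1609459200000 implements MigrationInterface {
  public async up(queryRunner: QueryRunner): Promise<void> {
    // applied by `typeorm migration:run`
    await queryRunner.query(
      'ALTER TABLE `user` ADD `isActive` tinyint NOT NULL DEFAULT 1',
    );
  }

  public async down(queryRunner: QueryRunner): Promise<void> {
    // reverted by `typeorm migration:revert`
    await queryRunner.query('ALTER TABLE `user` DROP COLUMN `isActive`');
  }
}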

Multiple databases

Some projects require multiple database connections. This can also be achieved with this module. To work with multiple connections, first create the connections. In this case, connection naming becomes mandatory.

Suppose you have an Album entity stored in its own database.

const defaultOptions = {
  type: 'postgres',
  port: 5432,
  username: 'user',
  password: 'password',
  database: 'db',
  synchronize: true,
};

@Module({
  imports: [
    TypeOrmModule.forRoot({
      ...defaultOptions,
      host: 'user_db_host',
      entities: [User],
    }),
    TypeOrmModule.forRoot({
      ...defaultOptions,
      name: 'albumsConnection',
      host: 'album_db_host',
      entities: [Album],
    }),
  ],
})
export class AppModule {}

warning Notice If you don't set the name for a connection, its name is set to default. Please note that you shouldn't have multiple connections without a name, or with the same name, otherwise they will get overridden.

At this point, you have User and Album entities registered with their own connection. With this setup, you have to tell the TypeOrmModule.forFeature() method and the @InjectRepository() decorator which connection should be used. If you do not pass any connection name, the default connection is used.

@Module({
  imports: [
    TypeOrmModule.forFeature([User]),
    TypeOrmModule.forFeature([Album], 'albumsConnection'),
  ],
})
export class AppModule {}

You can also inject the Connection or EntityManager for a given connection:

@Injectable()
export class AlbumsService {
  constructor(
    @InjectConnection('albumsConnection')
    private connection: Connection,
    @InjectEntityManager('albumsConnection')
    private entityManager: EntityManager,
  ) {}
}

Testing

When it comes to unit testing an application, we usually want to avoid making a database connection, keeping our test suites independent and their execution process as fast as possible. But our classes might depend on repositories that are pulled from the connection instance. How do we handle that? The solution is to create mock repositories. In order to achieve that, we set up custom providers. Each registered repository is automatically represented by an <EntityName>Repository token, where EntityName is the name of your entity class.

The @nestjs/typeorm package exposes the getRepositoryToken() function which returns a prepared token based on a given entity.

@Module({
  providers: [
    UsersService,
    {
      provide: getRepositoryToken(User),
      useValue: mockRepository,
    },
  ],
})
export class UsersModule {}

Now a substitute mockRepository will be used as the UsersRepository. Whenever any class asks for UsersRepository using an @InjectRepository() decorator, Nest will use the registered mockRepository object.
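
In a unit test, this substitution is typically done with the Test utility from @nestjs/testing. A minimal sketch, stubbing only the repository method the service uses:

import { Test } from '@nestjs/testing';
import { getRepositoryToken } from '@nestjs/typeorm';
import { User } from './user.entity';
import { UsersService } from './users.service';

describe('UsersService', () => {
  it('returns all users', async () => {
    const mockRepository = {
      find: jest.fn().mockResolvedValue([]), // stub for Repository#find
    };

    const moduleRef = await Test.createTestingModule({
      providers: [
        UsersService,
        { provide: getRepositoryToken(User), useValue: mockRepository },
      ],
    }).compile();

    const usersService = moduleRef.get(UsersService);
    await expect(usersService.findAll()).resolves.toEqual([]);
  });
});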

Custom repository

TypeORM provides a feature called custom repositories. Custom repositories allow you to extend a base repository class, and enrich it with several special methods. To learn more about this feature, visit this page.

In order to create your custom repository, use the @EntityRepository() decorator and extend the Repository class.

@EntityRepository(Author)
export class AuthorRepository extends Repository<Author> {}

info Hint Both @EntityRepository() and Repository are imported from the typeorm package.

Once the class is created, the next step is to delegate instantiation responsibility to Nest. For this, we have to pass the AuthorRepository class to the TypeOrmModule.forFeature() method.

@Module({
  imports: [TypeOrmModule.forFeature([AuthorRepository])],
  controllers: [AuthorController],
  providers: [AuthorService],
})
export class AuthorModule {}

Afterward, simply inject the repository using the following construction:

@Injectable()
export class AuthorService {
  constructor(private authorRepository: AuthorRepository) {}
}

Async configuration

You may want to pass your repository module options asynchronously instead of statically. In this case, use the forRootAsync() method, which provides several ways to deal with async configuration.

One approach is to use a factory function:

TypeOrmModule.forRootAsync({
  useFactory: () => ({
    type: 'mysql',
    host: 'localhost',
    port: 3306,
    username: 'root',
    password: 'root',
    database: 'test',
    entities: [__dirname + '/**/*.entity{.ts,.js}'],
    synchronize: true,
  }),
});

Our factory behaves like any other asynchronous provider (e.g., it can be async and it's able to inject dependencies through inject).

TypeOrmModule.forRootAsync({
  imports: [ConfigModule],
  useFactory: (configService: ConfigService) => ({
    type: 'mysql',
    host: configService.get('HOST'),
    port: +configService.get<number>('PORT'),
    username: configService.get('USERNAME'),
    password: configService.get('PASSWORD'),
    database: configService.get('DATABASE'),
    entities: [__dirname + '/**/*.entity{.ts,.js}'],
    synchronize: true,
  }),
  inject: [ConfigService],
});

Alternatively, you can use the useClass syntax:

TypeOrmModule.forRootAsync({
  useClass: TypeOrmConfigService,
});

The construction above will instantiate TypeOrmConfigService inside TypeOrmModule and use it to provide an options object by calling createTypeOrmOptions(). Note that this means that the TypeOrmConfigService has to implement the TypeOrmOptionsFactory interface, as shown below:

@Injectable()
class TypeOrmConfigService implements TypeOrmOptionsFactory {
  createTypeOrmOptions(): TypeOrmModuleOptions {
    return {
      type: 'mysql',
      host: 'localhost',
      port: 3306,
      username: 'root',
      password: 'root',
      database: 'test',
      entities: [__dirname + '/**/*.entity{.ts,.js}'],
      synchronize: true,
    };
  }
}

In order to prevent the creation of TypeOrmConfigService inside TypeOrmModule and use a provider imported from a different module, you can use the useExisting syntax.

TypeOrmModule.forRootAsync({
  imports: [ConfigModule],
  useExisting: ConfigService,
});

This construction works the same as useClass with one critical difference - TypeOrmModule will lookup imported modules to reuse an existing ConfigService instead of instantiating a new one.

Example

A working example is available here.

Sequelize Integration

An alternative to using TypeORM is to use the Sequelize ORM with the @nestjs/sequelize package. In addition, we leverage the sequelize-typescript package which provides a set of additional decorators to declaratively define entities.

To begin using it, we first install the required dependencies. In this chapter, we'll demonstrate using the popular MySQL Relational DBMS, but Sequelize provides support for many relational databases, such as PostgreSQL, MySQL, Microsoft SQL Server, SQLite, and MariaDB. The procedure we walk through in this chapter will be the same for any database supported by Sequelize. You'll simply need to install the associated client API libraries for your selected database.

$ npm install --save @nestjs/sequelize sequelize sequelize-typescript mysql2
$ npm install --save-dev @types/sequelize

Once the installation process is complete, we can import the SequelizeModule into the root AppModule.

@@filename(app.module)
import { Module } from '@nestjs/common';
import { SequelizeModule } from '@nestjs/sequelize';

@Module({
  imports: [
    SequelizeModule.forRoot({
      dialect: 'mysql',
      host: 'localhost',
      port: 3306,
      username: 'root',
      password: 'root',
      database: 'test',
      models: [],
    }),
  ],
})
export class AppModule {}

The forRoot() method supports all the configuration properties exposed by the Sequelize constructor (read more). In addition, there are several extra configuration properties described below.

retryAttempts Number of attempts to connect to the database (default: 10)
retryDelay Delay between connection retry attempts (ms) (default: 3000)
autoLoadModels If true, models will be loaded automatically (default: false)
keepConnectionAlive If true, connection will not be closed on the application shutdown (default: false)
synchronize If true, automatically loaded models will be synchronized (default: false)

Once this is done, the Sequelize object will be available to inject across the entire project (without needing to import any modules), for example:

@@filename(app.service)
import { Injectable } from '@nestjs/common';
import { Sequelize } from 'sequelize-typescript';

@Injectable()
export class AppService {
  constructor(private sequelize: Sequelize) {}
}
@@switch
import { Injectable } from '@nestjs/common';
import { Sequelize } from 'sequelize-typescript';

@Dependencies(Sequelize)
@Injectable()
export class AppService {
  constructor(sequelize) {
    this.sequelize = sequelize;
  }
}

Models

Sequelize implements the Active Record pattern. With this pattern, you use model classes directly to interact with the database. To continue the example, we need at least one model. Let's define the User model.

@@filename(user.model)
import { Column, Model, Table } from 'sequelize-typescript';

@Table
export class User extends Model<User> {
  @Column
  firstName: string;

  @Column
  lastName: string;

  @Column({ defaultValue: true })
  isActive: boolean;
}

info Hint Learn more about the available decorators here.

The User model file sits in the users directory. This directory contains all files related to the UsersModule. You can decide where to keep your model files, however, we recommend creating them near their domain, in the corresponding module directory.

To begin using the User model, we need to let Sequelize know about it by inserting it into the models array in the module forRoot() method options:

@@filename(app.module)
import { Module } from '@nestjs/common';
import { SequelizeModule } from '@nestjs/sequelize';
import { User } from './users/user.model';

@Module({
  imports: [
    SequelizeModule.forRoot({
      dialect: 'mysql',
      host: 'localhost',
      port: 3306,
      username: 'root',
      password: 'root',
      database: 'test',
      models: [User],
    }),
  ],
})
export class AppModule {}

Next, let's look at the UsersModule:

@@filename(users.module)
import { Module } from '@nestjs/common';
import { SequelizeModule } from '@nestjs/sequelize';
import { User } from './user.model';
import { UsersController } from './users.controller';
import { UsersService } from './users.service';

@Module({
  imports: [SequelizeModule.forFeature([User])],
  providers: [UsersService],
  controllers: [UsersController],
})
export class UsersModule {}

This module uses the forFeature() method to define which models are registered in the current scope. With that in place, we can inject the UserModel into the UsersService using the @InjectModel() decorator:

@@filename(users.service)
import { Injectable } from '@nestjs/common';
import { InjectModel } from '@nestjs/sequelize';
import { User } from './user.model';

@Injectable()
export class UsersService {
  constructor(
    @InjectModel(User)
    private userModel: typeof User,
  ) {}

  async findAll(): Promise<User[]> {
    return this.userModel.findAll();
  }

  findOne(id: string): Promise<User> {
    return this.userModel.findOne({
      where: {
        id,
      },
    });
  }

  async remove(id: string): Promise<void> {
    const user = await this.findOne(id);
    await user.destroy();
  }
}
@@switch
import { Injectable, Dependencies } from '@nestjs/common';
import { getModelToken } from '@nestjs/sequelize';
import { User } from './user.model';

@Injectable()
@Dependencies(getModelToken(User))
export class UsersService {
  constructor(userModel) {
    this.userModel = userModel;
  }

  async findAll() {
    return this.userModel.findAll();
  }

  findOne(id) {
    return this.userModel.findOne({
      where: {
        id,
      },
    });
  }

  async remove(id) {
    const user = await this.findOne(id);
    await user.destroy();
  }
}

warning Notice Don't forget to import the UsersModule into the root AppModule.

If you want to use the repository outside of the module which imports SequelizeModule.forFeature, you'll need to re-export the providers generated by it. You can do this by exporting the whole module, like this:

@@filename(users.module)
import { Module } from '@nestjs/common';
import { SequelizeModule } from '@nestjs/sequelize';
import { User } from './user.model';

@Module({
  imports: [SequelizeModule.forFeature([User])],
  exports: [SequelizeModule]
})
export class UsersModule {}

Now if we import UsersModule in UserHttpModule, we can use @InjectModel(User) in the providers of the latter module.

@@filename(users-http.module)
import { Module } from '@nestjs/common';
import { UsersModule } from './user.module';
import { UsersService } from './users.service';
import { UsersController } from './users.controller';

@Module({
  imports: [UsersModule],
  providers: [UsersService],
  controllers: [UsersController]
})
export class UserHttpModule {}

Relations

Relations are associations established between two or more tables. Relations are based on common fields from each table, often involving primary and foreign keys.

There are three types of relations:

One-to-one Every row in the primary table has one and only one associated row in the foreign table
One-to-many / Many-to-one Every row in the primary table has one or more related rows in the foreign table
Many-to-many Every row in the primary table has many related rows in the foreign table, and every record in the foreign table has many related rows in the primary table

To define relations in entities, use the corresponding decorators. For example, to define that each User can have multiple photos, use the @HasMany() decorator.

@@filename(user.model)
import { Column, Model, Table, HasMany } from 'sequelize-typescript';
import { Photo } from '../photos/photo.model';

@Table
export class User extends Model<User> {
  @Column
  firstName: string;

  @Column
  lastName: string;

  @Column({ defaultValue: true })
  isActive: boolean;

  @HasMany(() => Photo)
  photos: Photo[];
}
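
The other side of this association carries the foreign key. A minimal sketch of the Photo model (the file location and url column are assumptions):

@@filename(photo.model)
import { BelongsTo, Column, ForeignKey, Model, Table } from 'sequelize-typescript';
import { User } from '../users/user.model';

@Table
export class Photo extends Model<Photo> {
  @Column
  url: string;

  @ForeignKey(() => User)
  @Column
  userId: number;

  @BelongsTo(() => User)
  user: User;
}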

info Hint To learn more about associations in Sequelize, read this chapter.

Auto-load models

Manually adding models to the models array of the connection options can be tedious. In addition, referencing models from the root module breaks application domain boundaries and causes leaking implementation details to other parts of the application. To solve this issue, automatically load models by setting both autoLoadModels and synchronize properties of the configuration object (passed into the forRoot() method) to true, as shown below:

@@filename(app.module)
import { Module } from '@nestjs/common';
import { SequelizeModule } from '@nestjs/sequelize';

@Module({
  imports: [
    SequelizeModule.forRoot({
      ...
      autoLoadModels: true,
      synchronize: true,
    }),
  ],
})
export class AppModule {}

With that option specified, every model registered through the forFeature() method will be automatically added to the models array of the configuration object.

warning Warning Note that models that aren't registered through the forFeature() method, but are only referenced from the model (via an association), won't be included.

Transactions

A database transaction symbolizes a unit of work performed within a database management system against a database, and treated in a coherent and reliable way independent of other transactions. A transaction generally represents any change in a database (learn more).

There are many different strategies to handle Sequelize transactions. Below is a sample implementation of a managed transaction (auto-callback).

First, we need to inject the Sequelize object into a class in the normal way:

@Injectable()
export class UsersService {
  constructor(private sequelize: Sequelize) {}
}

info Hint The Sequelize class is imported from the sequelize-typescript package.

Now, we can use this object to create a transaction.

async createMany() {
  try {
    await this.sequelize.transaction(async t => {
      const transactionHost = { transaction: t };

      await this.userModel.create(
          { firstName: 'Abraham', lastName: 'Lincoln' },
          transactionHost,
      );
      await this.userModel.create(
          { firstName: 'John', lastName: 'Boothe' },
          transactionHost,
      );
    });
  } catch (err) {
    // Transaction has been rolled back
    // err is whatever rejected the promise chain returned to the transaction callback
  }
}

info Hint Note that the Sequelize instance is used only to start the transaction. However, testing this class would require mocking the entire Sequelize object (which exposes several methods). Thus, we recommend using a helper factory class (e.g., TransactionRunner) and defining an interface with a limited set of methods required to maintain transactions. This technique makes mocking these methods pretty straightforward.
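As a minimal sketch of that suggestion (the TransactionRunner name and its single-method interface are assumptions, not part of @nestjs/sequelize), such a helper might look like this:

import { Injectable } from '@nestjs/common';
import { Transaction } from 'sequelize';
import { Sequelize } from 'sequelize-typescript';

// Narrow, easily mockable surface: consumers depend on this instead of Sequelize itself.
export interface TransactionRunner {
  runInTransaction<T>(work: (transaction: Transaction) => Promise<T>): Promise<T>;
}

@Injectable()
export class SequelizeTransactionRunner implements TransactionRunner {
  constructor(private readonly sequelize: Sequelize) {}

  runInTransaction<T>(work: (transaction: Transaction) => Promise<T>): Promise<T> {
    // Delegates to Sequelize's managed transaction (automatic commit/rollback).
    return this.sequelize.transaction(work);
  }
}

In unit tests, classes that depend on such a runner only need a stub of runInTransaction rather than a full Sequelize mock.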

Migrations

Migrations provide a way to incrementally update the database schema to keep it in sync with the application's data model while preserving existing data in the database. To generate, run, and revert migrations, Sequelize provides a dedicated CLI.

Migration classes are separate from the Nest application source code. Their lifecycle is maintained by the Sequelize CLI. Therefore, you are not able to leverage dependency injection and other Nest specific features with migrations. To learn more about migrations, follow the guide in the Sequelize documentation.
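For orientation, a typical workflow with the Sequelize CLI (sequelize-cli) looks roughly like the following; the migration name is just an example.

$ npx sequelize-cli migration:generate --name add-users-table
$ npx sequelize-cli db:migrate
$ npx sequelize-cli db:migrate:undo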

Multiple databases

Some projects require multiple database connections. This can also be achieved with this module. To work with multiple connections, first create the connections. In this case, connection naming becomes mandatory.

Suppose you have an Album entity stored in its own database.

const defaultOptions = {
  dialect: 'postgres',
  port: 5432,
  username: 'user',
  password: 'password',
  database: 'db',
  synchronize: true,
};

@Module({
  imports: [
    SequelizeModule.forRoot({
      ...defaultOptions,
      host: 'user_db_host',
      models: [User],
    }),
    SequelizeModule.forRoot({
      ...defaultOptions,
      name: 'albumsConnection',
      host: 'album_db_host',
      models: [Album],
    }),
  ],
})
export class AppModule {}

warning Notice If you don't set the name for a connection, its name is set to default. Please note that you shouldn't have multiple connections without a name, or with the same name, otherwise they will get overridden.

At this point, you have User and Album models registered with their own connection. With this setup, you have to tell the SequelizeModule.forFeature() method and the @InjectModel() decorator which connection should be used. If you do not pass any connection name, the default connection is used.

@Module({
  imports: [
    SequelizeModule.forFeature([User]),
    SequelizeModule.forFeature([Album], 'albumsConnection'),
  ],
})
export class AppModule {}
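A provider using the secondary connection can then inject its model by passing the connection name as the second argument to @InjectModel(). The AlbumsService below is a sketch assuming such a setup and a version of @nestjs/sequelize that supports the connection-name argument.

@Injectable()
export class AlbumsService {
  constructor(
    // The second argument selects the named connection; omit it to use the default one.
    @InjectModel(Album, 'albumsConnection')
    private albumModel: typeof Album,
  ) {}
}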

You can also inject the Sequelize instance for a given connection:

@Injectable()
export class AlbumsService {
  constructor(
    @InjectConnection('albumsConnection')
    private sequelize: Sequelize,
  ) {}
}

Testing

When it comes to unit testing an application, we usually want to avoid making a database connection, keeping our test suites independent and their execution process as fast as possible. But our classes might depend on models that are pulled from the connection instance. How do we handle that? The solution is to create mock models. In order to achieve that, we set up custom providers. Each registered model is automatically represented by a <ModelName>Model token, where ModelName is the name of your model class.

The @nestjs/sequelize package exposes the getModelToken() function which returns a prepared token based on a given model.

@Module({
  providers: [
    UsersService,
    {
      provide: getModelToken(User),
      useValue: mockModel,
    },
  ],
})
export class UsersModule {}

Now a substitute mockModel will be used as the UserModel. Whenever any class asks for UserModel using an @InjectModel() decorator, Nest will use the registered mockModel object.
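Putting this together in a test, a minimal sketch (assuming Jest and the @nestjs/testing package; file paths are assumptions) could look like this:

import { Test } from '@nestjs/testing';
import { getModelToken } from '@nestjs/sequelize';
import { UsersService } from './users.service';
import { User } from './user.model';

describe('UsersService', () => {
  it('returns users from the mocked model', async () => {
    // Only the methods the service actually calls need to be mocked.
    const mockModel = { findAll: jest.fn().mockResolvedValue([]) };

    const moduleRef = await Test.createTestingModule({
      providers: [
        UsersService,
        { provide: getModelToken(User), useValue: mockModel },
      ],
    }).compile();

    const usersService = moduleRef.get(UsersService);
    await expect(usersService.findAll()).resolves.toEqual([]);
  });
});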

Async configuration

You may want to pass your SequelizeModule options asynchronously instead of statically. In this case, use the forRootAsync() method, which provides several ways to deal with async configuration.

One approach is to use a factory function:

SequelizeModule.forRootAsync({
  useFactory: () => ({
    dialect: 'mysql',
    host: 'localhost',
    port: 3306,
    username: 'root',
    password: 'root',
    database: 'test',
    models: [],
  }),
});

Our factory behaves like any other asynchronous provider (e.g., it can be async and it's able to inject dependencies through inject).

SequelizeModule.forRootAsync({
  imports: [ConfigModule],
  useFactory: (configService: ConfigService) => ({
    dialect: 'mysql',
    host: configService.get('HOST'),
    port: +configService.get('PORT'),
    username: configService.get('USERNAME'),
    password: configService.get('PASSWORD'),
    database: configService.get('DATABASE'),
    models: [],
  }),
  inject: [ConfigService],
});

Alternatively, you can use the useClass syntax:

SequelizeModule.forRootAsync({
  useClass: SequelizeConfigService,
});

The construction above will instantiate SequelizeConfigService inside SequelizeModule and use it to provide an options object by calling createSequelizeOptions(). Note that this means that the SequelizeConfigService has to implement the SequelizeOptionsFactory interface, as shown below:

@Injectable()
class SequelizeConfigService implements SequelizeOptionsFactory {
  createSequelizeOptions(): SequelizeModuleOptions {
    return {
      dialect: 'mysql',
      host: 'localhost',
      port: 3306,
      username: 'root',
      password: 'root',
      database: 'test',
      models: [],
    };
  }
}

In order to prevent the creation of SequelizeConfigService inside SequelizeModule and use a provider imported from a different module, you can use the useExisting syntax.

SequelizeModule.forRootAsync({
  imports: [ConfigModule],
  useExisting: ConfigService,
});

This construction works the same as useClass with one critical difference: SequelizeModule will look up imported modules to reuse an existing ConfigService instead of instantiating a new one.

Example

A working example is available here.

Task Scheduling

Task scheduling allows you to schedule arbitrary code (methods/functions) to execute at a fixed date/time, at recurring intervals, or once after a specified interval. In the Linux world, this is often handled by packages like cron at the OS level. For Node.js apps, there are several packages that emulate cron-like functionality. Nest provides the @nestjs/schedule package, which integrates with the popular Node.js cron package. We'll cover this package in the current chapter.

Installation

To begin using it, we first install the required dependencies.

$ npm install --save @nestjs/schedule

To activate job scheduling, import the ScheduleModule into the root AppModule and run the forRoot() static method as shown below:

@@filename(app.module)
import { Module } from '@nestjs/common';
import { ScheduleModule } from '@nestjs/schedule';

@Module({
  imports: [
    ScheduleModule.forRoot()
  ],
})
export class AppModule {}

The .forRoot() call initializes the scheduler and registers any declarative cron jobs, timeouts and intervals that exist within your app. Registration occurs when the onApplicationBootstrap lifecycle hook occurs, ensuring that all modules have loaded and declared any scheduled jobs.

Declarative cron jobs

A cron job schedules an arbitrary function (method call) to run automatically. Cron jobs can run:

  • Once, at a specified date/time.
  • On a recurring basis; recurring jobs can run at a specified instant within a specified interval (for example, once per hour, once per week, once every 5 minutes)

Declare a cron job with the @Cron() decorator preceding the method definition containing the code to be executed, as follows:

import { Injectable, Logger } from '@nestjs/common';
import { Cron } from '@nestjs/schedule';

@Injectable()
export class TasksService {
  private readonly logger = new Logger(TasksService.name);

  @Cron('45 * * * * *')
  handleCron() {
    this.logger.debug('Called when the current second is 45');
  }
}

In this example, the handleCron() method will be called each time the current second is 45. In other words, the method will be run once per minute, at the 45 second mark.

The @Cron() decorator supports all standard cron patterns:

  • Asterisk (e.g. *)
  • Ranges (e.g. 1-3,5)
  • Steps (e.g. */2)

In the example above, we passed 45 * * * * * to the decorator. The following key shows how each position in the cron pattern string is interpreted:


* * * * * *
| | | | | |
| | | | | day of week
| | | | month
| | | day of month
| | hour
| minute
second (optional)

Some sample cron patterns are:

* * * * * * every second
45 * * * * * every minute, on the 45th second
* 10 * * * * every hour, at the start of the 10th minute
0 */30 9-17 * * * every 30 minutes between 9am and 5pm
0 30 11 * * 1-5 Monday to Friday at 11:30am

The @nestjs/schedule package provides a convenience enum with commonly used cron patterns. You can use this enum as follows:

import { Injectable, Logger } from '@nestjs/common';
import { Cron, CronExpression } from '@nestjs/schedule';

@Injectable()
export class TasksService {
  private readonly logger = new Logger(TasksService.name);

  @Cron(CronExpression.EVERY_45_SECONDS)
  handleCron() {
    this.logger.debug('Called every 45 seconds');
  }
}

In this example, the handleCron() method will be called every 45 seconds.

Alternatively, you can supply a JavaScript Date object to the @Cron() decorator. Doing so causes the job to execute exactly once, at the specified date.

info Hint Use JavaScript date arithmetic to schedule jobs relative to the current date. For example, @Cron(new Date(Date.now() + 10 * 1000)) to schedule a job to run 10 seconds after the app starts.

You can access and control a cron job after it's been declared, or dynamically create a cron job (where its cron pattern is defined at runtime) with the Dynamic API. To access a declarative cron job via the API, you must associate the job with a name by passing the name property in an optional options object as the second argument of the decorator, as shown below:

@Cron('* * 8 * * *', {
  name: 'notifications',
})
triggerNotifications() {}

Declarative intervals

To declare that a method should run at a (recurring) specified interval, prefix the method definition with the @Interval() decorator. Pass the interval value, as a number in milliseconds, to the decorator as shown below:

@Interval(10000)
handleInterval() {
  this.logger.debug('Called every 10 seconds');
}

info Hint This mechanism uses the JavaScript setInterval() function under the hood. You can also utilize a cron job to schedule recurring jobs.

If you want to control your declarative interval from outside the declaring class via the Dynamic API, associate the interval with a name using the following construction:

@Interval('notifications', 2500)
handleInterval() {}

The Dynamic API also enables creating dynamic intervals, where the interval's properties are defined at runtime, and listing and deleting them.

Declarative timeouts

To declare that a method should run (once) at a specified timeout, prefix the method definition with the @Timeout() decorator. Pass the relative time offset (in milliseconds), from application startup, to the decorator as shown below:

@Timeout(5000)
handleTimeout() {
  this.logger.debug('Called once after 5 seconds');
}

info Hint This mechanism uses the JavaScript setTimeout() function under the hood.

If you want to control your declarative timeout from outside the declaring class via the Dynamic API, associate the timeout with a name using the following construction:

@Timeout('notifications', 2500)
handleTimeout() {}

The Dynamic API also enables creating dynamic timeouts, where the timeout's properties are defined at runtime, and listing and deleting them.

Dynamic schedule module API

The @nestjs/schedule module provides a dynamic API that enables managing declarative cron jobs, timeouts and intervals. The API also enables creating and managing dynamic cron jobs, timeouts and intervals, where the properties are defined at runtime.

Dynamic cron jobs

Obtain a reference to a CronJob instance by name from anywhere in your code using the SchedulerRegistry API. First, inject SchedulerRegistry using standard constructor injection:

constructor(private schedulerRegistry: SchedulerRegistry) {}

info Hint Import the SchedulerRegistry from the @nestjs/schedule package.

Then use it in a class as follows. Assume a cron job was created with the following declaration:

@Cron('* * 8 * * *', {
  name: 'notifications',
})
triggerNotifications() {}

Access this job using the following:

const job = this.schedulerRegistry.getCronJob('notifications');

job.stop();
console.log(job.lastDate());

The getCronJob() method returns the named cron job. The returned CronJob object has the following methods:

  • stop() - stops a job that is scheduled to run.
  • start() - restarts a job that has been stopped.
  • setTime(time: CronTime) - stops a job, sets a new time for it, and then starts it
  • lastDate() - returns a string representation of the last date a job executed
  • nextDates(count: number) - returns an array (size count) of moment objects representing upcoming job execution dates.

info Hint Use toDate() on moment objects to render them in human readable form.

Create a new cron job dynamically using the SchedulerRegistry.addCronJob() method, as follows:

addCronJob(name: string, seconds: string) {
  const job = new CronJob(`${seconds} * * * * *`, () => {
    this.logger.warn(`time (${seconds}) for job ${name} to run!`);
  });

  this.schedulerRegistry.addCronJob(name, job);
  job.start();

  this.logger.warn(
    `job ${name} added for each minute at ${seconds} seconds!`,
  );
}

In this code, we use the CronJob object from the cron package to create the cron job. The CronJob constructor takes a cron pattern (just like the @Cron() decorator) as its first argument, and a callback to be executed when the cron timer fires as its second argument. The SchedulerRegistry.addCronJob() method takes two arguments: a name for the CronJob, and the CronJob object itself.

warning Warning Remember to inject the SchedulerRegistry before accessing it. Import CronJob from the cron package.

Delete a named cron job using the SchedulerRegistry.deleteCronJob() method, as follows:

deleteCron(name: string) {
  this.schedulerRegistry.deleteCronJob(name);
  this.logger.warn(`job ${name} deleted!`);
}

List all cron jobs using the SchedulerRegistry.getCronJobs() method as follows:

getCrons() {
  const jobs = this.schedulerRegistry.getCronJobs();
  jobs.forEach((value, key, map) => {
    let next;
    try {
      next = value.nextDates().toDate();
    } catch (e) {
      next = 'error: next fire date is in the past!';
    }
    this.logger.log(`job: ${key} -> next: ${next}`);
  });
}

The getCronJobs() method returns a map. In this code, we iterate over the map and attempt to access the nextDates() method of each CronJob. In the CronJob API, if a job has already fired and has no future firing dates, it throws an exception.

Dynamic intervals

Obtain a reference to an interval with the SchedulerRegistry.getInterval() method. As above, inject SchedulerRegistry using standard constructor injection:

constructor(private schedulerRegistry: SchedulerRegistry) {}

And use it as follows:

const interval = this.schedulerRegistry.getInterval('notifications');
clearInterval(interval);

Create a new interval dynamically using the SchedulerRegistry.addInterval() method, as follows:

addInterval(name: string, milliseconds: number) {
  const callback = () => {
    this.logger.warn(`Interval ${name} executing at time (${milliseconds})!`);
  };

  const interval = setInterval(callback, milliseconds);
  this.schedulerRegistry.addInterval(name, interval);
}

In this code, we create a standard JavaScript interval, then pass it to the SchedulerRegistry.addInterval() method. That method takes two arguments: a name for the interval, and the interval itself.

Delete a named interval using the SchedulerRegistry.deleteInterval() method, as follows:

deleteInterval(name: string) {
  this.schedulerRegistry.deleteInterval(name);
  this.logger.warn(`Interval ${name} deleted!`);
}

List all intervals using the SchedulerRegistry.getIntervals() method as follows:

getIntervals() {
  const intervals = this.schedulerRegistry.getIntervals();
  intervals.forEach(key => this.logger.log(`Interval: ${key}`));
}

Dynamic timeouts

Obtain a reference to a timeout with the SchedulerRegistry.getTimeout() method. As above, inject SchedulerRegistry using standard constructor injection:

constructor(private schedulerRegistry: SchedulerRegistry) {}

And use it as follows:

const timeout = this.schedulerRegistry.getTimeout('notifications');
clearTimeout(timeout);

Create a new timeout dynamically using the SchedulerRegistry.addTimeout() method, as follows:

addTimeout(name: string, milliseconds: number) {
  const callback = () => {
    this.logger.warn(`Timeout ${name} executing after (${milliseconds})!`);
  };

  const timeout = setTimeout(callback, milliseconds);
  this.schedulerRegistry.addTimeout(name, timeout);
}

In this code, we create a standard JavaScript timeout, then pass it to the SchedulerRegistry.addTimeout() method. That method takes two arguments: a name for the timeout, and the timeout itself.

Delete a named timeout using the SchedulerRegistry.deleteTimeout() method, as follows:

deleteTimeout(name: string) {
  this.schedulerRegistry.deleteTimeout(name);
  this.logger.warn(`Timeout ${name} deleted!`);
}

List all timeouts using the SchedulerRegistry.getTimeouts() method as follows:

getTimeouts() {
  const timeouts = this.schedulerRegistry.getTimeouts();
  timeouts.forEach(key => this.logger.log(`Timeout: ${key}`));
}

Example

A working example is available here.

Validation

It is best practice to validate the correctness of any data sent into a web application. To automatically validate incoming requests, Nest provides several pipes available right out-of-the-box:

  • ValidationPipe
  • ParseIntPipe
  • ParseBoolPipe
  • ParseArrayPipe
  • ParseUUIDPipe

The ValidationPipe makes use of the powerful class-validator package and its declarative validation decorators. The ValidationPipe provides a convenient approach to enforce validation rules for all incoming client payloads, where the specific rules are declared with simple annotations in local class/DTO declarations in each module.

Overview

In the Pipes chapter, we went through the process of building simple pipes and binding them to controllers, methods or to the global app to demonstrate how the process works. Be sure to review that chapter to best understand the topics of this chapter. Here, we'll focus on various real world use cases of the ValidationPipe, and show how to use some of its advanced customization features.

Using the built-in ValidationPipe

info Hint The ValidationPipe is imported from the @nestjs/common package.

Because this pipe uses the class-validator and class-transformer libraries, there are many options available. You configure these settings via a configuration object passed to the pipe. Following are the built-in options:

export interface ValidationPipeOptions extends ValidatorOptions {
  transform?: boolean;
  disableErrorMessages?: boolean;
  exceptionFactory?: (errors: ValidationError[]) => any;
}

In addition to these, all class-validator options (inherited from the ValidatorOptions interface) are available:

Option Type Description
skipMissingProperties boolean If set to true, validator will skip validation of all properties that are missing in the validating object.
whitelist boolean If set to true, validator will strip validated (returned) object of any properties that do not use any validation decorators.
forbidNonWhitelisted boolean If set to true, instead of stripping non-whitelisted properties validator will throw an exception.
forbidUnknownValues boolean If set to true, attempts to validate unknown objects fail immediately.
disableErrorMessages boolean If set to true, validation errors will not be returned to the client.
errorHttpStatusCode number This setting allows you to specify which exception type will be used in case of an error. By default it throws BadRequestException.
exceptionFactory Function Takes an array of the validation errors and returns an exception object to be thrown.
groups string[] Groups to be used during validation of the object.
dismissDefaultMessages boolean If set to true, the validation will not use default messages. Error messages will always be undefined if not explicitly set.
validationError.target boolean Indicates if target should be exposed in ValidationError
validationError.value boolean Indicates if validated value should be exposed in ValidationError.

info Notice Find more information about the class-validator package in its repository.

Auto-validation

We'll start by binding ValidationPipe at the application level, thus ensuring all endpoints are protected from receiving incorrect data.

async function bootstrap() {
  const app = await NestFactory.create(ApplicationModule);
  app.useGlobalPipes(new ValidationPipe());
  await app.listen(3000);
}
bootstrap();

To test our pipe, let's create a basic endpoint.

@Post()
create(@Body() createUserDto: CreateUserDto) {
  return 'This action adds a new user';
}

info Hint Since TypeScript does not store metadata about generics or interfaces, when you use them in your DTOs, ValidationPipe may not be able to properly validate incoming data. For this reason, consider using concrete classes in your DTOs.

Now we can add a few validation rules in our CreateUserDto. We do this using decorators provided by the class-validator package, described in detail here. In this fashion, any route that uses the CreateUserDto will automatically enforce these validation rules.

import { IsEmail, IsNotEmpty } from 'class-validator';

export class CreateUserDto {
  @IsEmail()
  email: string;

  @IsNotEmpty()
  password: string;
}

With these rules in place, if a request hits our endpoint with an invalid email property in the request body, the application will automatically respond with a 400 Bad Request code, along with the following response body:

{
  "statusCode": 400,
  "error": "Bad Request",
  "message": ["email must be an email"]
}

In addition to validating request bodies, the ValidationPipe can be used with other request object properties as well. Imagine that we would like to accept :id in the endpoint path. To ensure that only numbers are accepted for this request parameter, we can use the following construct:

@Get(':id')
findOne(@Param() params: FindOneParams) {
  return 'This action returns a user';
}

FindOneParams, like a DTO, is simply a class that defines validation rules using class-validator. It would look like this:

import { IsNumberString } from 'class-validator';

export class FindOneParams {
  @IsNumberString()
  id: number;
}

Disable detailed errors

Error messages can be helpful to explain what was incorrect in a request. However, some production environments prefer to disable detailed errors. Do this by passing an options object to the ValidationPipe:

app.useGlobalPipes(
  new ValidationPipe({
    disableErrorMessages: true,
  }),
);

As a result, detailed error messages won't be displayed in the response body.

Stripping properties

Our ValidationPipe can also filter out properties that should not be received by the method handler. In this case, we can whitelist the acceptable properties, and any property not included in the whitelist is automatically stripped from the resulting object. For example, if our handler expects email and password properties, but a request also includes an age property, this property can be automatically removed from the resulting DTO. To enable such behavior, set whitelist to true.

app.useGlobalPipes(
  new ValidationPipe({
    whitelist: true,
  }),
);

When set to true, this will automatically remove non-whitelisted properties (those without any decorator in the validation class).

Alternatively, you can stop the request from processing when non-whitelisted properties are present, and return an error response to the user. To enable this, set the forbidNonWhitelisted option property to true, in combination with setting whitelist to true.
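For example, the two options can be combined as follows:

app.useGlobalPipes(
  new ValidationPipe({
    whitelist: true,
    forbidNonWhitelisted: true,
  }),
);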

Transform payload objects

Payloads coming in over the network are plain JavaScript objects. The ValidationPipe can automatically transform payloads to be objects typed according to their DTO classes. To enable auto-transformation, set transform to true. This can be done at a method level:

@@filename(cats.controller)
@Post()
@UsePipes(new ValidationPipe({ transform: true }))
async create(@Body() createCatDto: CreateCatDto) {
  this.catsService.create(createCatDto);
}

To enable this behavior globally, set the option on a global pipe:

app.useGlobalPipes(
  new ValidationPipe({
    transform: true,
  }),
);

With the auto-transformation option enabled, the ValidationPipe will also perform conversion of primitive types. In the following example, the findOne() method takes one argument which represents an extracted id path parameter:

@Get(':id')
findOne(@Param('id') id: number) {
  console.log(typeof id === 'number'); // true
  return 'This action returns a user';
}

By default, every path parameter and query parameter comes over the network as a string. In the above example, we specified the id type as a number (in the method signature). Therefore, the ValidationPipe will try to automatically convert a string identifier to a number.

Explicit conversion

In the above section, we showed how the ValidationPipe can implicitly transform query and path parameters based on the expected type. However, this feature requires having auto-transformation enabled.

Alternatively (with auto-transformation disabled), you can explicitly cast values using the ParseIntPipe or ParseBoolPipe (note that ParseStringPipe is not needed because, as mentioned earlier, every path parameter and query parameter comes over the network as a string by default).

@Get(':id')
findOne(
  @Param('id', ParseIntPipe) id: number,
  @Query('sort', ParseBoolPipe) sort: boolean,
) {
  console.log(typeof id === 'number'); // true
  console.log(typeof sort === 'boolean'); // true
  return 'This action returns a user';
}

info Hint The ParseIntPipe and ParseBoolPipe are exported from the @nestjs/common package.

Parsing and validating arrays

TypeScript does not store metadata about generics or interfaces, so when you use them in your DTOs, ValidationPipe may not be able to properly validate incoming data. For instance, in the following code, createUserDtos won't be correctly validated:

@Post()
createBulk(@Body() createUserDtos: CreateUserDto[]) {
  return 'This action adds new users';
}

To validate the array, create a dedicated class which contains a property that wraps the array, or use the ParseArrayPipe.

@Post()
createBulk(
  @Body(new ParseArrayPipe({ items: CreateUserDto }))
  createUserDtos: CreateUserDto[],
) {
  return 'This action adds new users';
}
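The wrapper-class alternative mentioned above could look like the following sketch. The CreateUsersBulkDto name, its users property, and the import path are assumptions; it relies on class-validator's @ValidateNested() together with class-transformer's @Type().

import { Type } from 'class-transformer';
import { ValidateNested } from 'class-validator';
import { CreateUserDto } from './create-user.dto'; // path is an assumption

export class CreateUsersBulkDto {
  // Validates each element of the array against the CreateUserDto rules.
  @ValidateNested({ each: true })
  @Type(() => CreateUserDto)
  users: CreateUserDto[];
}

The handler would then accept @Body() dto: CreateUsersBulkDto and read dto.users.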

In addition, the ParseArrayPipe may come in handy when parsing query parameters. Let's consider a findByIds() method that returns users based on identifiers passed as query parameters.

@Get()
findByIds(
  @Query('id', new ParseArrayPipe({ items: Number, separator: ',' }))
  ids: number[],
) {
  return 'This action returns users by ids';
}

This construction validates the incoming query parameters from an HTTP GET request like the following:

GET /?ids=1,2,3

WebSockets and Microservices

While this chapter shows examples using HTTP style applications (e.g., Express or Fastify), the ValidationPipe works the same for WebSockets and microservices, regardless of the transport method that is used.

Learn more

Read more about custom validators, error messages, and available decorators as provided by the class-validator package here.

Adapters

The WebSockets module is platform-agnostic; hence, you can bring your own library (or even a native implementation) by making use of the WebSocketAdapter interface. This interface requires you to implement a few methods, described in the following table:

create Creates a socket instance based on passed arguments
bindClientConnect Binds the client connection event
bindClientDisconnect Binds the client disconnection event (optional*)
bindMessageHandlers Binds the incoming message to the corresponding message handler
close Terminates a server instance

Extend socket.io

The socket.io package is wrapped in an IoAdapter class. What if you would like to enhance the basic functionality of the adapter? For instance, your technical requirements may call for the ability to broadcast events across multiple load-balanced instances of your web service. For this, you can extend IoAdapter and override a single method whose responsibility is to instantiate new socket.io servers. But first, let's install the required package.

$ npm i --save socket.io-redis

Once the package is installed, we can create a RedisIoAdapter class.

import { IoAdapter } from '@nestjs/platform-socket.io';
import * as redisIoAdapter from 'socket.io-redis';

export class RedisIoAdapter extends IoAdapter {
  createIOServer(port: number, options?: any): any {
    const server = super.createIOServer(port, options);
    const redisAdapter = redisIoAdapter({ host: 'localhost', port: 6379 });

    server.adapter(redisAdapter);
    return server;
  }
}

Afterward, simply switch to your newly created Redis adapter.

const app = await NestFactory.create(ApplicationModule);
app.useWebSocketAdapter(new RedisIoAdapter(app));

Ws library

Another available adapter is the WsAdapter, which acts as a proxy between the framework and the fast, thoroughly tested ws library. This adapter is fully compatible with native browser WebSockets and is far faster than the socket.io package. Unfortunately, it has significantly fewer features available out-of-the-box, though in some cases you may not need them.

In order to use ws, we first have to install the required package:

$ npm i --save @nestjs/platform-ws

Once the package is installed, we can switch the adapter:

const app = await NestFactory.create(ApplicationModule);
app.useWebSocketAdapter(new WsAdapter(app));

info Hint The WsAdapter is imported from @nestjs/platform-ws.

Advanced (custom adapter)

For demonstration purposes, we are going to integrate the ws library manually. As mentioned, the adapter for this library is already created and is exposed from the @nestjs/platform-ws package as the WsAdapter class. Here is what a simplified implementation could look like:

@@filename(ws-adapter)
import * as WebSocket from 'ws';
import { WebSocketAdapter, INestApplicationContext } from '@nestjs/common';
import { MessageMappingProperties } from '@nestjs/websockets';
import { Observable, fromEvent, EMPTY } from 'rxjs';
import { mergeMap, filter } from 'rxjs/operators';

export class WsAdapter implements WebSocketAdapter {
  constructor(private app: INestApplicationContext) {}

  create(port: number, options: any = {}): any {
    return new WebSocket.Server({ port, ...options });
  }

  bindClientConnect(server, callback: Function) {
    server.on('connection', callback);
  }

  bindMessageHandlers(
    client: WebSocket,
    handlers: MessageMappingProperties[],
    process: (data: any) => Observable<any>,
  ) {
    fromEvent(client, 'message')
      .pipe(
        mergeMap(data => this.bindMessageHandler(data, handlers, process)),
        filter(result => result),
      )
      .subscribe(response => client.send(JSON.stringify(response)));
  }

  bindMessageHandler(
    buffer,
    handlers: MessageMappingProperties[],
    process: (data: any) => Observable<any>,
  ): Observable<any> {
    const message = JSON.parse(buffer.data);
    const messageHandler = handlers.find(
      handler => handler.message === message.event,
    );
    if (!messageHandler) {
      return EMPTY;
    }
    return process(messageHandler.callback(message.data));
  }

  close(server) {
    server.close();
  }
}

info Hint When you want to take advantage of the ws library, use the built-in WsAdapter instead of creating your own.

Then, we can set up a custom adapter using the useWebSocketAdapter() method:

@@filename(main)
const app = await NestFactory.create(ApplicationModule);
app.useWebSocketAdapter(new WsAdapter(app));

Example

A working example that uses WsAdapter is available here.

Exception filters

The only difference between the HTTP exception filter layer and the corresponding web sockets layer is that instead of throwing HttpException, you should use WsException.

throw new WsException('Invalid credentials.');

info Hint The WsException class is imported from the @nestjs/websockets package.

With the sample above, Nest will handle the thrown exception and emit the exception message with the following structure:

{
  status: 'error',
  message: 'Invalid credentials.'
}

Filters

Web sockets exception filters behave equivalently to HTTP exception filters. The following example uses a manually instantiated method-scoped filter. Just as with HTTP based applications, you can also use gateway-scoped filters (i.e., prefix the gateway class with a @UseFilters() decorator).

@UseFilters(new WsExceptionFilter())
@SubscribeMessage('events')
onEvent(client, data: any): WsResponse<any> {
  const event = 'events';
  return { event, data };
}

Inheritance

Typically, you'll create fully customized exception filters crafted to fulfill your application requirements. However, there might be use-cases when you would like to simply extend the core exception filter, and override the behavior based on certain factors.

In order to delegate exception processing to the base filter, you need to extend BaseWsExceptionFilter and call the inherited catch() method.

@@filename()
import { Catch, ArgumentsHost } from '@nestjs/common';
import { BaseWsExceptionFilter } from '@nestjs/websockets';

@Catch()
export class AllExceptionsFilter extends BaseWsExceptionFilter {
  catch(exception: unknown, host: ArgumentsHost) {
    super.catch(exception, host);
  }
}
@@switch
import { Catch } from '@nestjs/common';
import { BaseWsExceptionFilter } from '@nestjs/websockets';

@Catch()
export class AllExceptionsFilter extends BaseWsExceptionFilter {
  catch(exception, host) {
    super.catch(exception, host);
  }
}

The above implementation is just a shell demonstrating the approach. Your implementation of the extended exception filter would include your tailored business logic (e.g., handling various conditions).

Gateways

Most of the concepts discussed elsewhere in this documentation, such as dependency injection, decorators, exception filters, pipes, guards and interceptors, apply equally to gateways. Wherever possible, Nest abstracts implementation details so that the same components can run across HTTP-based platforms, WebSockets, and Microservices. This section covers the aspects of Nest that are specific to WebSockets.

In Nest, a gateway is simply a class annotated with @WebSocketGateway() decorator. Technically, gateways are platform-agnostic which makes them compatible with any WebSockets library once an adapter is created. There are two WS platforms supported out-of-the-box: socket.io and ws. You can choose the one that best suits your needs. Also, you can build your own adapter by following this guide.

info Hint Gateways can be treated as providers; this means they can inject dependencies through the class constructor. Also, gateways can be injected by other classes (providers and controllers) as well.

Installation

To start building WebSockets-based applications, first install the required package:

@@filename()
$ npm i --save @nestjs/websockets @nestjs/platform-socket.io
$ npm i --save-dev @types/socket.io
@@switch
$ npm i --save @nestjs/websockets @nestjs/platform-socket.io

Overview

In general, each gateway is listening on the same port as the HTTP server, unless your app is not a web application, or you have changed the port manually. This default behavior can be modified by passing an argument to the @WebSocketGateway(80) decorator where 80 is a chosen port number. You can also set a namespace used by the gateway using the following construction:

@WebSocketGateway(80, { namespace: 'events' })

warning Warning Gateways are not instantiated until they are referenced in the providers array of an existing module.
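In other words, a gateway only becomes active once it is listed as a provider, for example (the EventsGateway and EventsModule names are assumptions):

import { Module } from '@nestjs/common';
import { EventsGateway } from './events.gateway';

@Module({
  // The gateway is instantiated because it appears in the providers array.
  providers: [EventsGateway],
})
export class EventsModule {}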

You can pass any supported option to the socket constructor with the second argument to the @WebSocketGateway() decorator, as shown below:

@WebSocketGateway(81, { transports: ['websocket'] })

The gateway is now listening, but we have not yet subscribed to any incoming messages. Let's create a handler that will subscribe to the events messages and respond to the user with the exact same data.

@@filename(events.gateway)
@SubscribeMessage('events')
handleEvent(@MessageBody() data: string): string {
  return data;
}
@@switch
@Bind(MessageBody())
@SubscribeMessage('events')
handleEvent(data) {
  return data;
}

info Hint @SubscribeMessage() and @MessageBody() decorators are imported from @nestjs/websockets package.

If you would prefer not to use decorators, the following code is functionally equivalent:

@@filename(events.gateway)
@SubscribeMessage('events')
handleEvent(client: Socket, data: string): string {
  return data;
}
@@switch
@SubscribeMessage('events')
handleEvent(client, data) {
  return data;
}

In the example above, the handleEvent() function takes two arguments. The first one is a platform-specific socket instance, while the second one is the data received from the client. This approach is not recommended though, because it requires mocking the socket instance in each unit test.

Once the events message is received, the handler sends an acknowledgment with the same data that was sent over the network. In addition, it's possible to emit messages using a library-specific approach, for example, by making use of the client.emit() method. In order to access a connected socket instance, use the @ConnectedSocket() decorator.

@@filename(events.gateway)
@SubscribeMessage('events')
handleEvent(
  @MessageBody() data: string,
  @ConnectedSocket() client: Socket,
): string {
  return data;
}
@@switch
@Bind(MessageBody(), ConnectedSocket())
@SubscribeMessage('events')
handleEvent(data, client) {
  return data;
}

info Hint @ConnectedSocket() decorator is imported from @nestjs/websockets package.
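For instance, a handler using the library-specific client.emit() approach (a sketch, assuming the socket.io platform) might look like this:

@SubscribeMessage('events')
handleEvent(
  @MessageBody() data: string,
  @ConnectedSocket() client: Socket,
): void {
  // Emit directly through the socket.io client instead of returning an acknowledgment.
  client.emit('events', data);
}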

However, in this case, you won't be able to leverage interceptors. If you don't want to respond to the user, you can simply skip the return statement (or explicitly return a "falsy" value, e.g., undefined).

Now when a client emits the message as follows:

socket.emit('events', { name: 'Nest' });

The handleEvent() method will be executed. In order to listen for messages emitted from within the above handler, the client has to attach a corresponding acknowledgment listener:

socket.emit('events', { name: 'Nest' }, data => console.log(data));

Multiple responses

The acknowledgment is dispatched only once. Furthermore, it is not supported by native WebSockets implementations. To address this limitation, you may return an object consisting of two properties: event, which is the name of the emitted event, and data, which is to be forwarded to the client.

@@filename(events.gateway)
@SubscribeMessage('events')
handleEvent(@MessageBody() data: unknown): WsResponse<unknown> {
  const event = 'events';
  return { event, data };
}
@@switch
@Bind(MessageBody())
@SubscribeMessage('events')
handleEvent(data) {
  const event = 'events';
  return { event, data };
}

info Hint The WsResponse interface is imported from @nestjs/websockets package.

In order to listen for the incoming response(s), the client has to apply another event listener.

socket.on('events', data => console.log(data));

Asynchronous responses

Message handlers are able to respond either synchronously or asynchronously. Hence, async methods are supported. A message handler is also able to return an Observable, in which case the result values will be emitted until the stream is completed.

@@filename(events.gateway)
@SubscribeMessage('events')
onEvent(@MessageBody() data: unknown): Observable<WsResponse<number>> {
  const event = 'events';
  const response = [1, 2, 3];

  return from(response).pipe(
    map(data => ({ event, data })),
  );
}
@@switch
@Bind(MessageBody())
@SubscribeMessage('events')
onEvent(data) {
  const event = 'events';
  const response = [1, 2, 3];

  return from(response).pipe(
    map(data => ({ event, data })),
  );
}

In the example above, the message handler will respond 3 times (with each item from the array).

Lifecycle hooks

There are 3 useful lifecycle hooks available. All of them have corresponding interfaces and are described in the following table:

OnGatewayInit Forces implementation of the afterInit() method. Takes the library-specific server instance as an argument (and spreads the rest if required).
OnGatewayConnection Forces implementation of the handleConnection() method. Takes the library-specific client socket instance as an argument.
OnGatewayDisconnect Forces implementation of the handleDisconnect() method. Takes the library-specific client socket instance as an argument.

info Hint Each lifecycle interface is exposed from @nestjs/websockets package.
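A sketch of a gateway implementing all three hooks (assuming the socket.io platform; the EventsGateway name is an assumption) could look like this:

import {
  WebSocketGateway,
  OnGatewayInit,
  OnGatewayConnection,
  OnGatewayDisconnect,
} from '@nestjs/websockets';
import { Server, Socket } from 'socket.io';

@WebSocketGateway()
export class EventsGateway
  implements OnGatewayInit, OnGatewayConnection, OnGatewayDisconnect
{
  afterInit(server: Server) {
    // Called once, after the gateway is initialized, with the library-specific server.
  }

  handleConnection(client: Socket) {
    // Called for every new client connection.
  }

  handleDisconnect(client: Socket) {
    // Called when a client disconnects.
  }
}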

Server

Occasionally, you may want to have direct access to the native, platform-specific server instance. The reference to this object is passed as an argument to the afterInit() method (OnGatewayInit interface). Another option is to use the @WebSocketServer() decorator.

@WebSocketServer()
server: Server;

warning Notice The @WebSocketServer() decorator is imported from the @nestjs/websockets package.

Nest will automatically assign the server instance to this property once it is ready to use.

Example

A working example is available here.

Guards

There is no fundamental difference between web sockets guards and regular HTTP application guards. The only difference is that instead of throwing HttpException, you should use WsException.

info Hint The WsException class is exposed from @nestjs/websockets package.

Binding guards

The following example uses a method-scoped guard. Just as with HTTP based applications, you can also use gateway-scoped guards (i.e., prefix the gateway class with a @UseGuards() decorator).

@@filename()
@UseGuards(AuthGuard)
@SubscribeMessage('events')
handleEvent(client: Client, data: unknown): WsResponse<unknown> {
  const event = 'events';
  return { event, data };
}
@@switch
@UseGuards(AuthGuard)
@SubscribeMessage('events')
handleEvent(client, data) {
  const event = 'events';
  return { event, data };
}

Interceptors

There is no difference between regular interceptors and web sockets interceptors. The following example uses a manually instantiated method-scoped interceptor. Just as with HTTP based applications, you can also use gateway-scoped interceptors (i.e., prefix the gateway class with a @UseInterceptors() decorator).

@@filename()
@UseInterceptors(new TransformInterceptor())
@SubscribeMessage('events')
handleEvent(client: Client, data: unknown): WsResponse<unknown> {
  const event = 'events';
  return { event, data };
}
@@switch
@UseInterceptors(new TransformInterceptor())
@SubscribeMessage('events')
handleEvent(client, data) {
  const event = 'events';
  return { event, data };
}

Pipes

There is no fundamental difference between regular pipes and web sockets pipes. The only difference is that instead of throwing HttpException, you should use WsException. In addition, all pipes will only be applied to the data parameter (because validating or transforming the client instance is not useful).

info Hint The WsException class is exposed from @nestjs/websockets package.

Binding pipes

The following example uses a manually instantiated method-scoped pipe. Just as with HTTP based applications, you can also use gateway-scoped pipes (i.e., prefix the gateway class with a @UsePipes() decorator).

@@filename()
@UsePipes(new ValidationPipe())
@SubscribeMessage('events')
handleEvent(client: Client, data: unknown): WsResponse<unknown> {
  const event = 'events';
  return { event, data };
}
@@switch
@UsePipes(new ValidationPipe())
@SubscribeMessage('events')
handleEvent(client, data) {
  const event = 'events';
  return { event, data };
}