Prosper Otemuyiwa

Deploying Apollo GQL API to Zeit

I’m going to show you how to use Manifold, Apollo and Express to build a GraphQL API which we’ll then deploy to Zeit.

We’re going to create a simple note app that allows users to register and create notes. Our little API will showcase one of the best features of GraphQL: responses that include only the data you need.
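For instance, a client that only cares about note titles will be able to ask for exactly that and nothing more (a hypothetical query against the schema we define in this tutorial):

```graphql
{
  fetchAllNotes {
    title
  }
}
```

The response will contain note titles and nothing else, no matter how many other fields a note has.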

As you might expect, we have two models: users and notes. Users will be able to register, log in, create notes, update them and retrieve them later. We’ll also need to define the relationships between notes and users so we can, for example, get the notes that belong to a user or the author of a note.

Pretty straightforward, right? If you have no idea what GraphQL or Express.js is, you might still be able to keep up, but I recommend reading up on both first.

Let’s dive in!

Set up

Creating multiple accounts for different services, setting up credit card details on each of them, and tracking them all across several platforms is a pain.

That’s where Manifold comes in. We’ll use Manifold to organize a database and logging, so if you don’t already have a Manifold account, head over and create one now.

Once you’re logged in, you will be able to create resources by clicking the Create Resource button.

You will see a list of available products on the left, labelled Choose a product. For this tutorial, we will be using JawsDB PostgreSQL and LogDNA. Go ahead and create a resource for each; Manifold will provision new LogDNA and JawsDB accounts for you from its dashboard. Great! Specify an app name to group them together.

To see your credentials, click on the Export all credentials button.

This will reveal a modal pop-up with the credentials for the resources we have created. To copy them, click the Copy to clipboard button.

Create a .env file in your root folder; this will hold our LogDNA API key and our database hostname, username, password and name. It should look like this:
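The variable names below are placeholders (match them to whatever your code reads from process.env), and the values come from the credentials you exported from Manifold:

```shell
# Placeholder values — replace with the credentials exported from Manifold
LOGDNA_KEY=your-logdna-ingestion-key
DATABASE_HOST=your-database-host
DATABASE_USERNAME=your-database-username
DATABASE_PASSWORD=your-database-password
DATABASE_NAME=your-database-name
```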


Installing necessary packages

Ensure you have Node.js and npm installed, and create a package.json for our app using npm init. Next, install the following modules:

npm install --save apollo-server-express babel-preset-env body-parser express graphql-tools logdna node-env-file pg pg-hstore sequelize

Set up the app server

Create a server.js file in your root folder and copy the following code:

Embedded content:
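The embedded code is not shown here, but based on the walkthrough that follows, server.js looks roughly like this (the GRAPHQL_PORT handling is an assumption):

```javascript
import express from 'express';
import { graphqlExpress, graphiqlExpress } from 'apollo-server-express';
import bodyParser from 'body-parser';
import logger from './config/logdna';
import schema from './data/schema';

// Assumed: fall back to port 3000 when no PORT is set in the environment
const GRAPHQL_PORT = process.env.PORT || 3000;

const app = express();

// GraphQL endpoint, plus a GraphiQL UI pointed at it
app.use('/api', bodyParser.json(), graphqlExpress({ schema }));
app.use('/api-ui', graphiqlExpress({ endpointURL: '/api' }));

app.listen(GRAPHQL_PORT, () => logger.log('The app backend is now running'));
```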

In the code above, we imported the necessary modules. Let’s take each one and explain what it does.

import express from 'express';

Express.js, known simply as Express, is a web application framework for Node.js. It simplifies the process of setting up a server in Node.js.

import { graphqlExpress, graphiqlExpress } from 'apollo-server-express';

Here, we import the GraphQL middleware for Express, along with GraphiQL, the in-browser interface we will use to test our backend later on. Apart from giving you the flexibility of serving GraphiQL on a separate route, Apollo supports query batching, which can help reduce load on your server, and has built-in support for persisted queries, which can make your app faster and your server more secure.

import logger from './config/logdna';

This line imports the LogDNA logger, which we will use to log information about our backend API later on.

import bodyParser from 'body-parser';

The bodyParser module extracts the entire body portion of an incoming request stream and exposes it on req.body. It parses data submitted via HTTP POST requests.

import schema from './data/schema';

Here, we import our GraphQL schema, which we will use to initialize graphqlExpress.

const app = express();

Next, we create our server using the Express module we imported.

app.use('/api', bodyParser.json(), graphqlExpress({ schema }));
app.use('/api-ui', graphiqlExpress({ endpointURL: '/api' }));

Then, we define the routes GraphQL will use, initializing the graphqlExpress middleware with our schema and pointing GraphiQL at the /api endpoint.

app.listen(GRAPHQL_PORT, () => logger.log('The app backend is now running'));

Finally, we log a message when the server starts running, using the logger we imported from LogDNA.

Defining GraphQL schema

A schema tells the server what queries to accept and defines the types and the relationships between them; this is the core of any GraphQL server. Before we proceed, let’s understand what types are.

A type holds a set of fields related to that type which you can query. For example, a type Car can have fields such as make, model, color, engine capacity, year of production and details about the production factory; therefore, when you query type Car, you can get this information about a car.

A type can also include, in its set of fields, a field that returns an object of another type. This is common when there is a relationship between the two types. For example, our type User should be able to hold data about the notes belonging to a user.

In a GraphQL schema, ! signifies not nullable and [] signifies an array. Therefore, any field in our types with ! appended to it cannot be null.

Now, let’s create a folder named data, then create a schema.js file in it, i.e. data/schema.js:

Embedded content:

In our schema, type User has an id, email, password, firstname, lastname, bio and notes, which returns an array of the user’s notes.

type User {
 id: Int!
 email: String!
 password: String!
 firstname: String!
 lastname: String!
 bio: String!
 notes: [Note]
}

Type Note has id, title, body, createdAt, updatedAt and user, which returns the user that owns the note.

type Note {
 id: Int!
 title: String!
 body: String!
 user: User!
 createdAt: Int!
 updatedAt: Int!
}

Type Query tells the server which queries it will accept; it defines the entry point of every GraphQL server query. For example, take the login query: it shows that an email and a password are required, and that a User with those login details is returned.

GraphQL queries have a tree-like structure, which means execution begins at the top level of the query and cascades down until all resolver functions are executed.
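A toy illustration of this cascade in plain JavaScript (not Apollo’s actual execution engine, just the shape of it):

```javascript
// Toy illustration of top-down query execution (not Apollo's engine).
// The top-level field resolves first; its result is handed to child resolvers.
const resolvers = {
  user: () => ({ id: 1, firstname: 'Ada' }),
  notes: (user) => [{ title: 'First note', userId: user.id }],
};

function executeUserWithNotes() {
  const user = resolvers.user();        // top of the tree resolves first
  user.notes = resolvers.notes(user);   // child field resolves with the parent's result
  return user;
}

console.log(executeUserWithNotes());
```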

GraphQL validates all queries to make sure they follow the definition in the schema under Type Query before execution. If it encounters any syntax error, it stops execution and returns an error.

type Query {
 login(email: String!, password: String!): User
 createUser(email: String!, password: String!, firstname: String!, lastname: String!, bio: String!): User
 fetchAllUsers: [User]
 fetchUser(id: Int!): User
 updateUser(id: Int!, password: String!, firstname: String!, lastname: String!, bio: String!): Boolean
 createNote(title: String!, body: String!, userId: Int!): Note
 fetchNote(id: Int!): Note
 fetchAllNotes: [Note]
 updateNote(id: Int!, title: String!, body: String!): Boolean
 deleteNote(id: Int!): Boolean!
}
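In data/schema.js, these type definitions are typically combined with the resolver map (which we write later in this tutorial) using makeExecutableSchema from graphql-tools. Since the embed is not shown, here is a sketch:

```javascript
import { makeExecutableSchema } from 'graphql-tools';
import resolvers from './resolver';

// The full type User, type Note and type Query definitions from this
// section go inside this template string.
const typeDefs = `
  type User { ... }
  type Note { ... }
  type Query { ... }
`;

// Ties the schema definition to the resolver functions
export default makeExecutableSchema({ typeDefs, resolvers });
```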

Creating database migrations

Before we create our migration files, make sure the Sequelize CLI is installed globally via npm install -g sequelize-cli, then run this command:

sequelize init

The command above will create the migrations, config, seeders and models folders. Next, we need to configure Sequelize to be able to run the migrations.

Open the generated config/config.json and set the development block to use PostgreSQL:

  "development": {
    "dialect": "postgres"
  }

Sequelize will create our database tables based on our definitions. We definitely want our tables to closely mirror type User and type Note. Create migrations/users.js and copy the code below into it.

Embedded content:
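The embedded migration is not shown here; a users migration along these lines would match type User (the column types and table name are assumptions):

```javascript
module.exports = {
  // Creates the Users table with columns mirroring type User
  up: (queryInterface, Sequelize) =>
    queryInterface.createTable('Users', {
      id: { type: Sequelize.INTEGER, primaryKey: true, autoIncrement: true, allowNull: false },
      email: { type: Sequelize.STRING, allowNull: false, unique: true },
      password: { type: Sequelize.STRING, allowNull: false },
      firstname: { type: Sequelize.STRING, allowNull: false },
      lastname: { type: Sequelize.STRING, allowNull: false },
      bio: { type: Sequelize.STRING, allowNull: false },
      createdAt: { type: Sequelize.DATE, allowNull: false },
      updatedAt: { type: Sequelize.DATE, allowNull: false },
    }),

  // Drops the table when the migration is rolled back
  down: (queryInterface) => queryInterface.dropTable('Users'),
};
```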

To run the migration, run this command:

sequelize db:migrate

Now we need to do the same for our notes table. Create migrations/notes.js, copy the code below into it, and run the migration again:

Embedded content:

Creating models

We need to define our app models, the schema of our database. Models interface with our database tables and allow us to query for data in our tables, as well as insert new records into them. Create models/user.js and copy the code below:

Embedded content:
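The embed is not shown; a models/user.js matching the users migration might look like this (the field definitions are assumptions based on type User):

```javascript
// Model definition loaded by Sequelize; the table columns mirror type User.
// id, createdAt and updatedAt are managed by Sequelize automatically.
module.exports = (sequelize, DataTypes) =>
  sequelize.define('User', {
    email: { type: DataTypes.STRING, allowNull: false, unique: true },
    password: { type: DataTypes.STRING, allowNull: false },
    firstname: { type: DataTypes.STRING, allowNull: false },
    lastname: { type: DataTypes.STRING, allowNull: false },
    bio: { type: DataTypes.STRING, allowNull: false },
  });
```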

Next, create models/note.js and copy below:

Embedded content:

Setting up database connection

Now that we have our migrations executed and our database models created, let’s set up our database connection.

We set up Sequelize to connect to our PostgreSQL database, then import our models so that we can interface with our tables. We also need to specify the relationship between our models: a user can have many notes, and a note belongs to a single user. Now create data/connector.js and copy the code below:

Embedded content:
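Since the embed is not shown, here is a sketch of what data/connector.js might contain (the environment variable names follow our .env file and are assumptions; db.import is the Sequelize v4 way of loading model definitions):

```javascript
import Sequelize from 'sequelize';

// Connection details come from the .env file (variable names are assumptions)
const db = new Sequelize(
  process.env.DATABASE_NAME,
  process.env.DATABASE_USERNAME,
  process.env.DATABASE_PASSWORD,
  {
    host: process.env.DATABASE_HOST,
    dialect: 'postgres',
  }
);

// Load the model definitions we created earlier
const User = db.import('../models/user');
const Note = db.import('../models/note');

// A user has many notes; a note belongs to a single user.
// These associations give us user.getNotes() and note.getUser().
User.hasMany(Note);
Note.belongsTo(User);

export { User, Note, db };
```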

Setting up LogDNA

The importance of logging in an application cannot be overemphasized. You need to be able to track everything that goes on in your app. When something goes wrong in an app, the log repository/dashboard is the first place that is checked.

We will be logging various events in our app using LogDNA. Create a new file, config/logdna.js, and copy the code below:

Embedded content:
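The embed is not shown; a config/logdna.js along these lines would work, using the logdna package’s createLogger (the app name and the LOGDNA_KEY variable name are assumptions):

```javascript
import logdna from 'logdna';
import env from 'node-env-file';

// Load the LogDNA ingestion key from our .env file
env('.env');

// The app name here is hypothetical; pick anything that identifies this service
const options = {
  app: 'graphql-note-app',
};

// Exposes logger.log(), logger.warn(), etc., used throughout this tutorial
export default logdna.createLogger(process.env.LOGDNA_KEY, options);
```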

Writing resolvers

So far, our app cannot create users or notes; it needs to know what to do with each query in our data/schema.js. In GraphQL, this is done through resolver functions.

Resolver functions basically answer the question "how do I get the data to respond with?" They specify which backend our data comes from. The good thing about resolver functions is that they can talk to any backend: we could get data about users from another GraphQL server or another API endpoint, and data about notes from a PostgreSQL database.

In this tutorial, our resolver functions will work with our PostgreSQL database. Let’s create resolver functions for our schema. Create data/resolver.js and copy the code below:

Embedded content:

Let’s take a look at the first resolver function:

login(_, args) {
  return new Promise((resolve, reject) => {
    setTimeout(() => reject(logger.warn('Action has taken over 15secs')), 15000);
    User.findOne({
      where: {
        email: args.email,
        password: args.password,
      },
    }).then((user) => resolve(user), () => reject(logger.warn('Could not fetch user with email: ', args)));
  });
},

The function accepts two parameters: _ and args.

_: it contains the result returned from the resolver on the parent field, or, in the case of a top-level Query field, the rootValue passed from the server configuration. This argument enables the nested nature of GraphQL queries.

args: An object with the arguments passed into the field in the query. For example, if the field was called with author(name: "Ada"), the args object would be: { "name": "Ada" }.
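In plain JavaScript, that signature looks like this (authorResolver is a hypothetical example, not part of our schema):

```javascript
// Toy resolver demonstrating the (parent, args) signature described above
const authorResolver = (_, args) => ({ name: args.name });

// A query field author(name: "Ada") would invoke it like this:
const author = authorResolver(undefined, { name: 'Ada' });
console.log(author.name); // Ada
```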

The function returns a promise that either resolves with the user’s details, or rejects and logs the error.

User: {
  // Fetches all notes that belong to a user
  notes(user) {
    return user.getNotes();
  },
},
Note: {
  // Fetches the owner of a particular note
  user(note) {
    return note.getUser();
  },
},

The snippet above, from our resolver.js, shows how the relationship between notes and users is resolved.

Deploying to Zeit

Let’s quickly update our package.json so Babel transpiles our ES6 code to ES5. Add this to package.json:

"scripts": {
  "start": "nodemon ./server.js --exec babel-node"
}

Finally, it’s time to deploy our backend. We will use a simple tool called now, from Zeit. Run the command below:

npm install -g now

Next, run the now command to create your account. Then make sure you are in the project directory and run now again. This will generate a URL and deploy the application to Zeit.

Now, our backend API is ready for consumption!

Open your browser and navigate to URL_GENERATED_BY_ZEIT/api-ui, you will see an interface we can use to test our queries. Let’s try to create a new user:



{
  createUser(
    email: "",
    password: "Fun_1234",
    firstname: "Oluwafunsho",
    lastname: "Okubanjo",
    bio: "Developer Evangelist"
  ) {
    id
    firstname
  }
}
Also, let’s check our logs on the LogDNA dashboard.


Throughout this tutorial, we have been able to focus on building our backend. The ability to spin up resources at lightning speed, all from a single dashboard, means fewer problems to worry about.

Feel free to improve on the code in this tutorial on your own!

