Repositories

Alright. We have come quite a long way and are now hitting the halfway point of this guide. Today, we will be creating the database layer, a.k.a. repositories, for our little app.

First off, if you have not already done so, install the following libraries with npm/yarn:

  • knex
  • sqlite3

For the purposes of this guide, and to keep things simple, we will use sqlite. Feel free to use any database system supported by knex, but you will have to figure out the configuration settings yourself then. Still, the code stays the same, regardless of the DB.

Knex

For any doubts about how to use knex, simply visit their website. There you can find examples for just about anything you can do with it. I will not be explaining the library API in detail, though it should be quite self-evident what most commands do.

Setting up knex

First, we need to create the "Knexfile", which is where Knex's configuration lives. You can do that manually by creating knexfile.js in your project root or by typing knex init in your terminal. The latter will also give you a template, but for our needs, we will keep all our configuration in our ./config module, so knexfile.js will only contain:

// ./knexfile.js

module.exports = require('./config')(process.env.NODE_ENV).db;

Not much to it, is there? We simply have it read the database settings from the config for the current environment.
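For reference, the db key of the config might hold something like the following for sqlite. This is only a sketch: the exact shape of the ./config module and the database filename are assumptions carried over from earlier chapters, so match them to your own setup.

```javascript
// ./config/index.js (sketch — adjust to your actual config module)
// Per-environment settings; knexfile.js reads the `db` key.
// The env argument is accepted but unused in this minimal sketch.
module.exports = env => ({
  db: {
    client: 'sqlite3',
    connection: {
      filename: './data/db.sqlite', // the path is an assumption
    },
    migrations: {
      directory: './data/migrations',
    },
    useNullAsDefault: true, // knex asks for this flag with sqlite3
  },
});
```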

Creating the database and tables

You could, of course, create the database manually. But you really do not have to, and I would argue that you really should not. That is why the smart folks over at knex gave us a nice migration engine, which is fancy talk for "specify the initial db structure and all subsequent changes in a file, and we will take care of the SQL magic, even or especially when something goes wrong and a rollback to a previous version is required".

In your terminal, in the project root, type knex migrate:make initial. This will create your initial migration file in the ./data/migrations directory, if you have followed the guide so far. It will be named {timestamp}_initial.js.

Open that file and add the following:

// ./data/migrations/{timestamp}_initial.js

exports.up = function(knex, Promise) {
  return knex.schema.createTableIfNotExists('todos', function (table) {
    table.increments('id');
    table.timestamps();
    table.integer('type');
    table.text('content');
  });
};

exports.down = function(knex, Promise) {
  return knex.schema.dropTable('todos');
};

exports.up is the function used when creating or updating database tables, and exports.down is used when reverting to a previous version. Be careful with the latter: if a rollback simply drops the tables, you will lose their data.

Everything else should be understandable. If not, check knex's documentation.

Now, we are ready to run the migration. Go back to your project root in your terminal and type knex migrate:latest, press enter, and the magic will unfold. Your output should read something like:

Using environment: dev
Batch 1 run: 1 migrations
./data/migrations/20170405161126_initial.js

And voilà, your database and todos table are up and running, ready for action. If you want to browse the database, a simple, lightweight, apt-gettable tool is sqlitebrowser.

All we need now is a way for the remainder of our code to access this database. To easily achieve this, create ./data/index.js and enter

// ./data/index.js

const knex = require('knex');

module.exports = config => knex(config.db);

We will require the ./data module to get this database connection into the system.

Everything but the repositories...

Connecting the database to the system

Before we go into repository code proper, there are some other details I would like us to take care of, so everything runs smoothly.

We will have to add the database module, or rather the database connection it returns when initialized, to the globals object in ./index.js. We can do this simply with a require.

const db = require('./data')(config);

and then we put that into globals, so the whole object now reads as

const globals = {
  config,
  router,
  db,
};

Now the database connection is available everywhere globals is available.

Upgrading the services

We will also make two sets of upgrades to our services.

If you check ./services/todo.js, you will see that we use the base create method, followed by a get of the newly created entry. This is because knex returns only the index of the new item, not the whole item. When updating an existing entry, however, it returns only 1 or 0, depending on the success of the operation. We may also want to return the complete updated entry, not just the success status, which would mean duplicating the getting code.

Do not fear, DRY is here.

Instead of writing a new update override for the base.update method and then copying

.then(([id]) => base.get({ id })(query)(body))

from our current create method and altering it so it passes the right id, we will do it a bit differently.

First, open up ./services/base.js and add the following to the exported object:

_returnActual: result => id =>
  result
  .then(createdId =>
    globals.repositories[serviceName]
    .get({ id: id || createdId[0] })()()),

This is a method which will receive the result, i.e. the Promise returned by another method, and then get the database entry by id, which is either provided as an argument to the function (in case of update) or returned from the result (in case of create).

Now update ./services/todo.js so it reads as follows:

// ./services/todo.js

const baseServices = require('./base');

module.exports = (globals) => {
  const base = baseServices(globals)('todo');

  const save = params => query => body =>
    globals.repositories.todo
    .save(params)(query)(body);

  return Object.assign({}, base, {
    create: params => query => body =>
      base._returnActual(save(params)(query)(body))(),

    update: params => query => body =>
      base._returnActual(save(params)(query)(body))(params.id),
  });
};

What we have done here is:

  1. We defined a save function, which takes care of calling the save method of the repository. We did this because both create and update methods use the same call (for now).

  2. We have created an update override for the base.update method.

  3. We have wrapped both the create and update methods in base._returnActual, so we get the full newly created/altered object, and not just the id/status. We do not pass the id with create as the database will provide it upon entry creation. With update, however, we do provide it, as we already know it.
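To make the create/update flow concrete, here is a minimal, self-contained sketch of the _returnActual pattern. Plain in-memory objects stand in for the repository and database, so every name here (fakeRepo, rows) is illustrative only, not part of the app:

```javascript
// Sketch of the create/update flow with an in-memory stand-in repository.
const rows = [{ id: 1, content: 'existing todo' }];

const fakeRepo = {
  // save resolves to [newId] on insert, or 1/0 on update — mimicking knex
  save: params => query => body => {
    if (params.id) {
      const row = rows.find(r => r.id === params.id);
      if (!row) return Promise.resolve(0);
      Object.assign(row, body);
      return Promise.resolve(1);
    }
    const id = rows.length + 1;
    rows.push(Object.assign({ id }, body));
    return Promise.resolve([id]);
  },
  get: params => query => body =>
    Promise.resolve(rows.filter(r => r.id === params.id)),
};

// The _returnActual pattern: resolve the save, then fetch the full row,
// using either the id we already know or the one the insert produced.
const returnActual = result => id =>
  result.then(createdId =>
    fakeRepo.get({ id: id || createdId[0] })()());

// create: no id known up front, so it comes from the insert result
const create = body => returnActual(fakeRepo.save({})()(body))();
// update: the id is already known, so we pass it through
const update = (id, body) => returnActual(fakeRepo.save({ id })()(body))(id);
```

Calling create resolves to the full newly inserted row rather than just its id, and update resolves to the full row after the change, which is exactly what the service wrapping buys us.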

Finally, repositories

Now that we have done everything else, it is time to create real repositories, not just use those silly mocks.

First, we will create the base.js for repositories.

// ./repositories/base.js

const _ = require('lodash');

module.exports = globals => modelName => ({
  // Merge route params and query string into a single where clause;
  // if both are empty, the merge yields {} and matches all rows.
  get: params => query => body =>
    globals.db(`${modelName}s`)
    .where(_.merge({}, params, query)),

  delete: params => query => body =>
    globals.db(`${modelName}s`)
    .where(params)
    .delete(),

  // An id in params means we update the existing row; no id means insert.
  // Merging into a fresh object keeps the caller's body unmutated.
  save: params => query => body =>
    (params.id
      ? globals.db(`${modelName}s`)
        .where(params)
        .update(_.merge({}, body, { updated_at: new Date().toUTCString() }))
      : globals.db(`${modelName}s`)
        .insert(body)),
});

This should be quite clear. We use the knex instance saved in globals.db to perform database operations. The reason we do not have separate create and update methods here is that those are both operations that save things to the database. Depending on your stack, you may have to have separate ones, but here, we do not.
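If you want to sanity-check the branching without a real database, you can exercise the save logic with a recording stub in place of globals.db. The stub below is purely illustrative; it only mimics the chained knex calls the repository makes:

```javascript
// Recording stub standing in for the knex instance held in globals.db.
// It captures which operation the repository chose.
const calls = [];
const stubDb = table => ({
  where: function (clause) { calls.push({ table, clause }); return this; },
  update: body => calls.push({ op: 'update', body }),
  insert: body => calls.push({ op: 'insert', body }),
});

// The repository's save decision, as in ./repositories/base.js:
// an id in params selects update, no id selects insert.
const save = params => query => body =>
  (params.id
    ? stubDb('todos').where(params).update(body)
    : stubDb('todos').insert(body));

save({ id: 1 })()({ content: 'edit' });   // takes the update branch
save({})()({ content: 'brand new' });     // takes the insert branch
```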

And, finally, the new todo repository in its entirety:

// ./repositories/todo.js

const _ = require('lodash');
const baseRepositories = require('./base');

module.exports = (globals) => {
  const base = baseRepositories(globals)('todo');

  return _.merge({}, base, {});
};

No need to override anything there for now.

Yaaay!

So there we have it. A fully working system, from the externally exposed API all the way to the database.

We still need some things though. Currently, whatever data passes through our system is not really standardized. Or, to be more precise, there is nothing really enforcing the standard we have in mind. This will be the topic for our next chapter, which will cover Models - standardized packages of data all parts of our system will use to communicate with each other, so there can be no surprises.

Repositories — Github Repository
