Local DynamoDB
I'm working on a side project that uses DynamoDB. I got tired of waiting for my DynamoDB table changes to complete, so I decided to try spinning up a local development environment. Here's what I found.
⚠️ Heads up! This post is mostly a brain-dump of what I'm working on at the moment. Typically I try to organize my posts to help folks learn, but this one leans more towards "a bunch of pasted code samples". I'll clean it up when I have time.
Local DynamoDB with Docker
Fortunately, there are several community-maintained Docker images for DynamoDB. I went to Docker Hub and chose the most popular one, `cnadiminti/dynamodb-local`.

To test it out, you can run:

```shell
docker run -p 8000:8000 cnadiminti/dynamodb-local
```
Then, if your laptop is running Linux, you can visit http://localhost:8000/shell for an interactive shell and tutorial. This lets you not only iterate more quickly, but also run your local development environment for free, since you're not making requests against your AWS account.
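If you have the AWS CLI installed, you can also smoke-test the local endpoint from your terminal. Local DynamoDB doesn't validate credentials, but the CLI refuses to run without them, so the dummy values below are placeholders of my choosing:

```shell
# Local DynamoDB ignores credentials, but the CLI requires them to be set.
export AWS_ACCESS_KEY_ID=abc
export AWS_SECRET_ACCESS_KEY=xyz

# Point the CLI at the local container instead of AWS.
# Lists tables in the local database (empty on a fresh container).
aws dynamodb list-tables --endpoint-url http://localhost:8000 --region us-east-1
```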
Development server
You'll need to run your dev server in a way that allows you to direct requests to DynamoDB towards your container. Here's how I set that up:
```javascript
const AWS = require('aws-sdk');

const {DYNAMO_ENDPOINT} = process.env;

const dynamoOpts = {region: 'us-east-1'};
if (DYNAMO_ENDPOINT) {
  dynamoOpts.endpoint = DYNAMO_ENDPOINT;
}

const db = new AWS.DynamoDB(dynamoOpts);

module.exports = db;
```
Now, if you set the environment variable `DYNAMO_ENDPOINT` to `http://localhost:8000`, your dev server will use your local containerized database instead of the one on your AWS account.
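If you want to unit-test this endpoint-selection logic without pulling in `aws-sdk`, it can be factored into a small pure function. This is a sketch of my own; the function name isn't from the original setup:

```javascript
// Build the options object for the DynamoDB client from an environment
// map. Pure, so it is easy to test: pass process.env in real code, or a
// plain object in tests.
function buildDynamoOpts(env) {
  const opts = {region: 'us-east-1'};
  if (env.DYNAMO_ENDPOINT) {
    opts.endpoint = env.DYNAMO_ENDPOINT;
  }
  return opts;
}

module.exports = buildDynamoOpts;
```

The dev server module above would then reduce to `new AWS.DynamoDB(buildDynamoOpts(process.env))`.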
For a development server, I like to use `nodemon` to automatically reload the server when there are changes.

```shell
npm install --save-dev nodemon
```
Then, add the following to your `package.json`:

```json
{
  "scripts": {
    "dev": "nodemon dev-server.js"
  }
}
```
You can certainly begin developing like this, but there's one more trick that will bring everything together.
Docker Compose
If you use `docker-compose`, you can create a local dev environment with all of the environment variables set correctly. The approach I describe here assumes that your project is a monorepo with the server code located at `packages/server`, but if you adjust the paths it will work in a regular repo just as well.
Create an empty directory `.data` so you can persist your local development data.

Create a file `docker-compose.yml`:
```yaml
# Header: docker-compose.yml
version: '3'
services:
  dynamo:
    image: "cnadiminti/dynamodb-local"
    volumes:
      - ./.data/dynamodb:/dynamodb_local_db
    ports:
      - "8000:8000"
  server:
    image: "node:8"
    user: node
    working_dir: /home/node/server
    environment:
      DYNAMO_ENDPOINT: http://dynamo:8000
      # Note: even though these keys are garbage values, access key and
      # secret key are still required. Otherwise, `aws-sdk` will attempt
      # to read credentials from Amazon's `169.254.169.254` service and
      # fail.
      AWS_ACCESS_KEY_ID: "abc"
      AWS_SECRET_ACCESS_KEY: "xyz"
    depends_on:
      - dynamo
    links:
      - "dynamo:dynamo"
    volumes:
      - ./packages:/home/node
    command: npm run dev
```
Now, when you run `docker-compose up`, it will create a DynamoDB container, then connect your dev server to it. Even better, when you make changes to your server, it will automatically reload itself!
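One last thing to remember: a fresh local database has no tables, so your app needs to create them before it can do anything. A one-time setup script might look like the sketch below. The table name, schema, and `./db` module path are examples of my own, so adjust them to your app:

```javascript
// create-tables.js — one-time setup for the local database.
// Run it inside the server container with: node create-tables.js
const db = require('./db'); // the module from the "Development server" section

const params = {
  TableName: 'Notes',
  AttributeDefinitions: [{AttributeName: 'id', AttributeType: 'S'}],
  KeySchema: [{AttributeName: 'id', KeyType: 'HASH'}],
  ProvisionedThroughput: {ReadCapacityUnits: 1, WriteCapacityUnits: 1},
};

db.createTable(params)
  .promise()
  .then(() => console.log(`Created table ${params.TableName}`))
  .catch((err) => {
    // Ignore "already exists" so the script is safe to re-run.
    if (err.code !== 'ResourceInUseException') throw err;
  });
```

Since the data directory is mounted at `./.data/dynamodb`, anything you create this way survives container restarts.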