Configuring automatic backups for Contentful

Edwin Maldonado
5 min read · May 14, 2019


In this tutorial I will explain how to set up automatic backups for your Contentful Space(s) using an AWS Lambda Function and CloudWatch for scheduling.

Contentful maintains a Command Line Interface (CLI) that offers a space export command for downloading a backup of your Space. If you are interested in all the flags/options you can pass to the space export command, check the README on GitHub, but this is basically how it works:

contentful space export --space-id {SPACE-ID} --environment-id {SPACE-ENV} --management-token {CMA-TOKEN} --export-dir {YOUR-PATH}

It creates a backup as a single JSON file containing the whole space structure (content model, entries, assets, roles & permissions, and webhooks).
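To give a rough idea of what ends up in that file, the export is shaped roughly like this (an abridged sketch; the exact top-level keys depend on your space and the export options you pass):

{
  "contentTypes": [ ... ],
  "entries": [ ... ],
  "assets": [ ... ],
  "locales": [ ... ],
  "webhooks": [ ... ],
  "roles": [ ... ]
}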

The CLI space export command works great, but we have bigger plans: we want to automate this process by scheduling a routine to do it for us.

Luckily there is yet another tool, the Contentful export tool (contentful-export), a JavaScript library that lets you back up your space programmatically, just like the space export CLI.

The plan

  1. Generate a backup of your Space. Write a NodeJS script that collects the export options, downloads the backup, and saves it locally as a JSON file.
  2. Upload the backup to AWS S3. Once we have the JSON dump locally, the plan is to upload it to an S3 bucket where we can collect the backups.
  3. Lambda function. We will wrap our backup script in a Lambda function so we have the advantage of triggering it in a variety of ways. One good option is AWS CloudWatch, which can be configured to run like a cron job.
  4. Execute the backup script every day. For the purposes of this tutorial, I want to schedule my script on a daily basis. You are free to adapt it to your needs.

Implementation

Let’s create a new AWS Lambda function from the AWS Console.

You will quickly notice that there are three ways to provide your function's code:

  1. Use the built-in editor
  2. Upload a .zip file
  3. Upload a file from Amazon S3

We will use the Upload a .zip file option, which means we will have to upload a zip file containing our function's .js file and the node_modules it requires.

Create a package.json file with the dependencies below (the script uses Node's built-in fs module, so the only packages we need are aws-sdk and contentful-export):

package.json

{
  "dependencies": {
    "aws-sdk": "2.450.0",
    "contentful-export": "*"
  }
}
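Run npm install in the same folder so node_modules gets created next to your function file; that folder is what we will zip up later:

npm install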

Create another file called index.js. This file will contain our main function; for now we are only downloading our backup file to the local /tmp folder:

index.js

'use strict';

// aws-sdk and fs are not used yet; we will need them for the S3 upload step
const AWS = require('aws-sdk');
const fs = require('fs');
const contentfulExport = require('contentful-export');

exports.handler = async (event) => {
  const local_backup_path = '/tmp/';
  const file_name = 'contentful_backup.json';

  // Export options, read from the Lambda environment variables
  const options = {
    spaceId: process.env.SPACE_ID,
    environmentId: process.env.SPACE_ENV,
    managementToken: process.env.MANAGEMENT_TOKEN,
    contentFile: file_name,
    exportDir: local_backup_path,
    useVerboseRenderer: false,
    saveFile: true
  };

  try {
    await contentfulExport(options);
    return sendResponse(200, 'File downloaded');
  } catch (err) {
    console.log('Oh no! Some errors occurred!', err);
  }

  return sendResponse(200, 'End of the Function');
};

// Small helper to build the Lambda response object
const sendResponse = (status, body) => {
  return {
    statusCode: status,
    body: body
  };
};

Note that we are using environment variables in the script; you can set these ENV variables on the AWS Lambda function configuration page.
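You can also set them from the terminal with the AWS CLI if you prefer; here contentful-backup is just a placeholder for whatever you named your function, and the values are dummies:

aws lambda update-function-configuration \
  --function-name contentful-backup \
  --environment "Variables={SPACE_ID=your-space-id,SPACE_ENV=master,MANAGEMENT_TOKEN=your-cma-token}"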

We still need to upload the dump file to an S3 bucket, so head to the AWS Console and create a new bucket (or pick an existing one).
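If you prefer the terminal, creating a bucket is a one-liner with the AWS CLI (the bucket name below is just an example; bucket names must be globally unique):

aws s3 mb s3://my-contentful-backups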

Once you have your bucket in place, we need to set up the communication between our Lambda function and the S3 bucket. In order to do that, we have two options …

  1. Use your API credentials in the lambda function to talk to S3.
  2. Or set up a new IAM Role.

I will lean towards the IAM Role option, just to keep things clean and simple in our function. So again from the AWS Console, go to the Security, Identity, & Compliance section and choose IAM.

Create the new IAM role and make sure to attach the AmazonS3FullAccess policy to it:

IAM Role with full access to talk to AmazonS3
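Full access gets the job done, but if you would rather follow the principle of least privilege, a small custom policy that only allows writing into the backup prefix of your bucket is enough for this function (my-contentful-backups is a placeholder bucket name):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::my-contentful-backups/backups/*"
    }
  ]
}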

One last step to make it work: link the new IAM Role to your Lambda function's execution role:

super_lambda_role is my custom IAM Role with full access to AWS S3

Awesome! We just allowed our Lambda function to talk directly to our S3 bucket. Let's write the uploader part then; this is what the final function looks like:

'use strict';

const AWS = require('aws-sdk');
const fs = require('fs');
const contentfulExport = require('contentful-export');

const s3 = new AWS.S3();

exports.handler = async (event) => {
  const local_backup_path = '/tmp/';
  const file_name = 'contentful_backup.json';

  // Export options, read from the Lambda environment variables
  const options = {
    spaceId: process.env.SPACE_ID,
    environmentId: process.env.SPACE_ENV,
    managementToken: process.env.MANAGEMENT_TOKEN,
    contentFile: file_name,
    exportDir: local_backup_path,
    useVerboseRenderer: false,
    saveFile: true
  };

  try {
    // Download the backup to /tmp, read it into memory, then clean up the local file
    await contentfulExport(options);
    const outputFile = local_backup_path + file_name;
    const fileBuffer = fs.readFileSync(outputFile);
    fs.unlinkSync(outputFile);

    // Push the backup to S3
    await uploadFile(fileBuffer);
    return sendResponse(200, outputFile);
  } catch (err) {
    console.log('Oh no! Some errors occurred!', err);
  }

  return sendResponse(200, 'End of the Function');
};

// Small helper to build the Lambda response object
const sendResponse = (status, body) => {
  return {
    statusCode: status,
    body: body
  };
};

// Upload the backup buffer to S3 under a timestamped key
const uploadFile = async (buffer) => {
  const params = {
    Bucket: process.env.S3_BUCKET_NAME,
    Key: 'backups/' + Date.now().toString() + '.json',
    Body: buffer
  };
  return s3.putObject(params).promise();
};

Scheduling the function

Let’s pick a trigger for our Lambda function, specifically a CloudWatch Events rule.

Go ahead, create the trigger, and schedule it as you wish; I used the daily option:

Schedule expression: rate(1 day)
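If you would rather run the backup at a fixed time of day instead of every 24 hours from whenever the rule was created, CloudWatch also accepts cron expressions; for example, this one would run the function every day at 03:00 UTC:

Schedule expression: cron(0 3 * * ? *)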

Super, make sure to compress your deployment package (index.js and node_modules), upload the zip file to the Lambda function's code section, and test it!
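From the terminal, building and uploading the package could look something like this (contentful-backup is again a placeholder function name):

zip -r function.zip index.js node_modules
aws lambda update-function-code --function-name contentful-backup --zip-file fileb://function.zip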

This is what the final solution looks like on the AWS Lambda page

I hope this function helps you keep a continuous backup of your space.
