Syncing your DB and assets across environments in Craft 3
The most tedious part of deploying a website from local to staging to production is importing/exporting the database and user files (assets). No longer.
Syncing your database and assets across different environments (local, staging and production) is an issue that plagues most developers, whether they're deploying websites or apps. It’s tedious, time-consuming and error-prone 😑. Every time I find myself knee-deep in an SSH terminal importing and exporting Postgres database dumps I promise myself: “never again”.
The import/export dance 💃🏾
First, let’s define a couple of common scenarios where I find myself doing the import/export dance.
1) Deploying local to staging
At the beginning of development you’ve been working locally on your machine with a test database and test user uploads.
Your colleagues or client want to see it up and running so you create a staging server. When you go to deploy your site to this staging server you do the dance:
- Create a local database dump
- Zip-up existing user uploads
- SFTP/SCP and upload that database dump and zip file to the staging server
- SSH into the staging server
- Import the database dump and unzip the user uploads
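Concretely, the dance might look something like this shell sketch. Everything here — the SSH login, database name and upload paths — is a placeholder assumption; adjust for your own setup:

```shell
#!/bin/sh
# A sketch of the manual import/export dance, wrapped in one function.
# Host, database name and paths are placeholders.
deploy_to_staging() {
  staging_host="deploy@staging.example.com"  # assumed SSH login
  db_name="craft"
  stamp=$(date +%Y%m%d-%H%M%S)
  dump_file="craft-${stamp}.sql.gz"

  # 1. Create a local database dump (Postgres shown; use mysqldump for MySQL)
  pg_dump "$db_name" | gzip > "$dump_file"

  # 2. Zip up existing user uploads
  zip -r "uploads-${stamp}.zip" web/uploads

  # 3. Upload the dump and zip file to the staging server
  scp "$dump_file" "uploads-${stamp}.zip" "$staging_host:/tmp/"

  # 4. SSH in, import the dump and unzip the uploads
  ssh "$staging_host" "gunzip -c /tmp/$dump_file | psql $db_name && \
    unzip -o /tmp/uploads-${stamp}.zip -d /var/www/site/web"
}
```

Every one of those steps is a chance for something to go wrong, which is exactly why they're worth automating.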
In my experience, at least ten other things will happen in the meantime to frustrate this process: permissions, forgotten passwords, wrong versions …
2) Importing staging to local
You haven’t worked on a website for a while but the client comes to you for a new feature. You still have a local copy of the code but the database and/or assets are out-of-date. You have to log in to the staging server and do the dance.
3) Deploying staging to production
You have a staging site that is ready for deployment. The client has been loading all their content onto the staging site. You either have to repurpose the staging server as the production server or … do the dance.
The Ideal, Automated, World
If you are regularly deploying websites to different environments this begins to add up to an enormous amount of time doing tedious tasks.
There is a way to remove all of this faff though. First, create a more sensible architecture that centralises our backups in a single accessible location and second, automate, automate, automate.
Single source of truth 📜
We first need to create a “single source of truth”: a location that holds the definitive versions of our database and assets backups.
All of our various environments will push and pull to and from this location. Ideally all of the backups within this location will be versioned too.
This single-source-of-truth needs to be in the cloud and remotely available so that our other environments (and colleagues/clients) can access it (in the above diagram this is an AWS S3 bucket).
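If that central location is an S3 bucket, object versioning can be switched on once with the AWS CLI. A sketch (the bucket name is a placeholder):

```shell
#!/bin/sh
# One-time setup: enable S3 object versioning so every backup pushed to the
# bucket keeps its history. "feral-backups" is a placeholder bucket name.
enable_backup_versioning() {
  aws s3api put-bucket-versioning \
    --bucket feral-backups \
    --versioning-configuration Status=Enabled
}
```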
We then need to automate as many as possible of the steps mentioned in the import/export dance above:
- Automate the process of backing up our database and assets
- Automate the process of pushing/pulling these backups to and from our single source of truth.
- Automate the process of restoring these backups once pulled-down
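Stitched together, the backup-and-push side of this amounts to something like the following sketch. The database name ("craft"), upload path (web/uploads) and bucket layout are all assumptions:

```shell
#!/bin/sh
# Sketch of an automated backup-and-push step. Names and paths are
# assumptions -- substitute your own.
backup_and_push() {
  bucket="s3://feral-backups/craft-backups/my-site"
  stamp=$(date +%Y%m%d-%H%M%S)

  # 1. Back up the database and the asset volumes locally
  pg_dump craft | gzip > "db-${stamp}.sql.gz"
  tar -czf "assets-${stamp}.tar.gz" web/uploads

  # 2. Push both to the single source of truth
  aws s3 cp "db-${stamp}.sql.gz" "${bucket}/database/"
  aws s3 cp "assets-${stamp}.tar.gz" "${bucket}/volumes/"
}

# 3. To fully automate, schedule it, e.g. nightly at 2am via cron:
#    0 2 * * * /path/to/backup_and_push.sh
```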
A scripting approach:
Let’s be clear, these aren’t new ideas. In fact, there is already a de-facto library out there that handles many of the concepts mentioned above really well: craft-scripts.
This library was created by Andrew Welch a.k.a. NYStudio 107. As well as creating an absolute plethora of great plugins for Craft, he’s written a comprehensive blog post on the topic of DB and asset syncing that you should read before going any further.
This library uses bash scripts to give you the tools you need to do everything we mentioned above.
There are some limitations to this approach though.
- You need to integrate it into your codebase (by adding the scripts folder to your project).
- It uses bash scripts, so everything operates outside of the Craft ecosystem.
- You can’t run any of the scripts via the CP for example
- You can’t configure anything via the CP
- It requires additional environment configuration outside of the standard Craft .env file (via a separate .env.sh file)
- This configuration is significantly different depending on the environment.
While this certainly isn’t the end of the world, it becomes an issue for me when you are working on a number of sites and trying to automate provisioning and deployment via a service such as Laravel Forge. You have to start logging in to your servers again and copying/pasting configurations between environments, which can take a lot of extra time.
A plugin approach:
🚨 This is a beta plugin — not yet ready for production!
To overcome some of these shortcomings we’ve created a plugin to completely handle syncing between environments: craft-sync.
This plugin lets you backup and restore both your database and assets from within the Craft 3 control panel. It also lets you push and pull these backups from AWS S3, again via the control panel. Furthermore, everything is configured via the “settings” panel and Craft’s native .env file.
The big benefit of this approach is that you only have to configure the plugin once when you set up your site. After that you shouldn’t have to go near the command line again to backup and restore your database and assets.
You will also benefit from having a single-source of versioned backups in a private S3 bucket. Even better, you can schedule these backups to run regularly. 🎉
You need to have awscli installed in every environment you plan on using the plugin in. This is the command-line tool that will perform the actual upload/download between your site and AWS:
> brew install awscli # macOS (Homebrew)
> sudo apt-get install awscli # Ubuntu/Debian
Installing the plugin should be easy. Either search for it in the Plugin Store or run:
composer require weareferal/craft-sync
from the command line.
First, you need to add your AWS details. Go to the Settings > Plugins > Sync settings page:
You can enter the details directly, but it’s easier to migrate from environment to environment when you use environment variables instead. This means adding your AWS details to your .env file first:
AWS_ACCESS_KEY = "..."
AWS_SECRET_KEY = "..."
AWS_REGION = "us-west-2"
AWS_BUCKET_NAME = "feral-backups"
AWS_BUCKET_PREFIX = "craft-backups/my-site"
and then referencing those environment variables from the settings page:
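For example, assuming the settings page exposes a field per value (the field labels here are illustrative, not necessarily the plugin's exact wording), you enter the variable name prefixed with a dollar sign and Craft resolves it at runtime:

```
Access Key:     $AWS_ACCESS_KEY
Secret Key:     $AWS_SECRET_KEY
Region:         $AWS_REGION
Bucket Name:    $AWS_BUCKET_NAME
Bucket Prefix:  $AWS_BUCKET_PREFIX
```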
This is important as it means your access ID and secret key are not saved in the project.yaml file (which is checked into Git) and are instead only referenced via environment variables (which are not checked into Git).
Syncing, backing up and restoring
Once you have configured everything, head over to the Utilities > Sync section of the Craft CP to actually perform operations.
This is the guts of the plugin. Here you can:
- Create a new local database backup (this piggy-backs off the existing “Database Backup” utility action)
- Push all local database backups to S3
- Pull all remote S3 database backups to your local environment
- Restore a particular local database backup
Similarly you can do the same for your volume assets.
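Under the hood, these operations are roughly equivalent to the following manual steps. The bucket path, database name and dump filename below are placeholders, not the plugin's actual internals:

```shell
#!/bin/sh
# Roughly the manual equivalent of pull-and-restore. The bucket path,
# database name and dump filename are placeholders.
pull_and_restore() {
  bucket="s3://feral-backups/craft-backups/my-site"

  # Pull all remote database backups down into Craft's backup folder
  aws s3 sync "${bucket}/database/" storage/backups/

  # Restore one particular dump (Postgres shown; pipe to mysql for MySQL)
  gunzip -c storage/backups/db-20200101-020000.sql.gz | psql craft
}
```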
Craft natively handles database backup/restore which makes life easier, but volume backups are another story.
This plugin collects all the volumes you have set up in your Settings > Assets section, zips them all up and places them alongside your database backups in Craft’s storage/backups folder.
That should be everything you need to get started with craft-sync. If you’ve got any issues, ideas or feedback please get in touch via email@example.com and let us know.