Syncing your DB and assets across environments in Craft 3
The most tedious part of deploying a website from local to staging to production is importing/exporting the database and user files (assets). No longer.
Syncing your database and assets across different environments (local, staging and production) is an issue that plagues most developers, whether deploying websites or apps. It’s tedious, time-consuming and error-prone 😑. Every time I find myself knee-deep in an SSH terminal importing and exporting Postgres database dumps I promise myself: “never again”.
The import/export dance 💃🏾
First, let’s define a couple of common scenarios where I find myself doing the import/export dance.
1) Deploying local to staging
At the beginning of development you’ve been working locally on your machine with a test database and test user uploads.
Your colleagues or client want to see it up and running so you create a staging server. When you go to deploy your site to this staging server you do the dance (sketched in shell form after the list):
- Create a local database dump
- Zip-up existing user uploads
- Upload that database dump and zip file to the staging server via SFTP/SCP
- SSH into the staging server
- Import the database dump and unzip the assets
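In shell terms, the dance looks something like this (a rough sketch assuming Postgres; the hostnames and paths are placeholders):

```bash
# On the local machine: dump the database and zip the user uploads
pg_dump -U craft craft_db > craft_db.sql
zip -r uploads.zip web/uploads

# Ship both to the staging server (host and paths are placeholders)
scp craft_db.sql uploads.zip deploy@staging.example.com:/tmp/

# On the staging server: import the dump and unzip the assets
ssh deploy@staging.example.com
psql -U craft -d craft_db < /tmp/craft_db.sql
unzip /tmp/uploads.zip -d /var/www/site/
```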
In my experience, at least 10 other things will happen in the meantime to frustrate this process: permissions, forgotten passwords, wrong versions …
2) Importing staging to local
You haven’t worked on a website for a while but the client comes to you for a new feature. You still have a local copy of the code but the database and/or assets are out-of-date. You have to log in to the staging server and do the dance.
3) Deploying staging to production
You have a staging site that is ready for deployment. The client has been loading all their content onto the staging site. You either have to repurpose the staging server as the production server or … do the dance.
The Ideal, Automated World
If you are regularly deploying websites to different environments this begins to add up to an enormous amount of time doing tedious tasks.
There is a way to remove all of this faff though: first, create a more sensible architecture that centralises our database dumps and asset volume copies in a single accessible location; second, automate, automate, automate.
Single source of truth
We first need to create a “single source of truth”: a location that we’ll use to share files. All of our various environments will push and pull to and from this location. Ideally all of the files within this location will be versioned too.
This single source of truth needs to be in the cloud and remotely available so that our other environments (and colleagues/clients) can access it (in the above diagram this is an AWS S3 bucket).
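To make that concrete, pushing to and pulling from such a location can be as simple as a couple of `aws` CLI calls (the bucket and file names below reuse the example values from later in this post):

```bash
# Push a timestamped database dump up to the shared bucket
aws s3 cp craft_db_2020-01-15.sql.gz s3://feral-backups/craft-backups/my-site/

# ...and pull it back down from any other environment
aws s3 cp s3://feral-backups/craft-backups/my-site/craft_db_2020-01-15.sql.gz .
```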
Automation 🤖
We then need to automate as many of the steps from the import/export dance above as possible (sketched after the list):
- Automate the process of pushing our database and assets to our single source of truth
- Automate the process of pulling our database/assets from our single source of truth
- Automate the process of restoring these once pulled down
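Stitched together, the “push” side of this automation might look something like the following (a sketch only; the database credentials, paths and bucket name are assumptions):

```bash
#!/bin/bash
# push.sh — dump the database, archive the asset volumes and ship both to S3
set -e

TIMESTAMP=$(date +%Y-%m-%d_%H-%M-%S)
BUCKET="s3://feral-backups/craft-backups/my-site"

# 1) Dump and compress the database
pg_dump -U craft craft_db | gzip > "db_${TIMESTAMP}.sql.gz"

# 2) Archive the user uploads
tar -czf "volumes_${TIMESTAMP}.tar.gz" web/uploads

# 3) Push both, versioned by timestamp, to the single source of truth
aws s3 cp "db_${TIMESTAMP}.sql.gz" "$BUCKET/"
aws s3 cp "volumes_${TIMESTAMP}.tar.gz" "$BUCKET/"
```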
The scripting approach
Let’s be clear, these aren’t new ideas. In fact, there is already a de-facto library out there called craft-scripts that handles many of the concepts mentioned above really well.
This library was created by Andrew Welch a.k.a. nystudio107. As well as creating an absolute plethora of great plugins for Craft, he’s written a comprehensive blog post on the topic of database and asset syncing that you should read before going any further.
This library uses bash scripts to give you the tools you need to do everything we mentioned above.
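For example, pulling a remote database and remote assets down to your local environment becomes a couple of script invocations (script names as per the craft-scripts repo, but double-check its README):

```bash
# Pull the remote database down and restore it locally
./scripts/pull_db.sh

# Pull the remote asset volumes down to the local asset paths
./scripts/pull_assets.sh
```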
There are some limitations to this approach though.
- You need to integrate it into your codebase (by adding the `scripts` folder to your project).
- It uses bash scripts, so everything operates outside of the Craft ecosystem:
  - You can’t run any of the scripts via the CP, for example
  - You can’t configure anything via the CP
- It requires additional environment configuration outside of the standard Craft `.env` file (via a `scripts/.env.sh` file, sketched below), and this configuration is significantly different depending on the environment.
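To give a flavour of that per-environment file, a `scripts/.env.sh` looks something like the following (the variable names here are illustrative rather than exact; the craft-scripts repo ships an example env file with the real ones):

```bash
# scripts/.env.sh — per-environment settings (names illustrative)
LOCAL_ROOT_PATH="/home/forge/example.com"
LOCAL_ASSETS_PATH="${LOCAL_ROOT_PATH}/web/uploads"
LOCAL_DB_NAME="craft"
LOCAL_DB_USER="craft"
LOCAL_DB_PASSWORD="secret"

# The remote environment to pull from
REMOTE_SSH_LOGIN="forge@staging.example.com"
REMOTE_ROOT_PATH="/home/forge/staging.example.com"
REMOTE_DB_NAME="craft"
```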
While this certainly isn’t the end of the world, for me it becomes an issue when you are working on a number of sites and trying to automate provisioning and deployment via a service such as Laravel Forge. You have to start logging in to your servers again and copying/pasting configurations between environments, which can take a lot of extra time.
The plugin approach
To overcome some of these shortcomings we’ve created a plugin to handle syncing between environments:
Craft Remote Sync lets you push, pull & delete copies of your database and asset volumes from the comfort of the Control Panel (or the CLI if you prefer), letting you easily share files across your different environments.
This allows you to avoid having to SSH into machines or dump/restore databases manually. Everything is handled via a single “Utilities” tab in the Craft Control Panel.
Installation
Installing the plugin should be easy. Either search for it in the Craft Plugin Store or run the following from the command line:

```bash
composer require weareferal/remote-sync
```
Configuration
First you need to add your AWS details. Go to the Settings > Plugins > Remote Sync settings page:
You can enter the details directly, but it’s easier to migrate from environment to environment if you use environment variables instead. This means adding your AWS details to your `.env` file first:
```bash
AWS_ACCESS_KEY="..."
AWS_SECRET_KEY="..."
AWS_REGION="us-west-2"
AWS_BUCKET_NAME="feral-backups"
AWS_BUCKET_PREFIX="craft-backups/my-site"
```
and then referencing those environment variables from the settings page. Craft resolves any settings value prefixed with `$` against the environment, so you enter `$AWS_ACCESS_KEY` into the access key field, `$AWS_SECRET_KEY` into the secret key field, and so on:
This is important as it means your access key ID and secret key are not saved in the `project.yaml` file (which is checked into Git) and are instead only referenced via environment variables (which are not checked into Git).
Pushing, Pulling and Deleting
Once you have configured everything, head over to the Utilities > Remote Sync section of the Craft CP to actually perform operations.
This is the guts of the plugin. From here you can:
- Push your local database/asset volumes from the current machine to your remote S3 bucket
- Pull and restore a remote database/asset volume copy from the remote S3 bucket to your local machine
- Delete a remote copy of a database or volume altogether
You’ll see in the above screenshot that we currently have 4 files available in our remote S3 bucket. A workflow might look like this:
- On my local machine push my database and asset volumes to S3
- Login to my staging machine and pull & restore from the newly created files
- Follow up by deleting the remote files, now that I’ve synced everything
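The same workflow can be driven from the CLI. The console commands below are hypothetical stand-ins to show the shape of it; the plugin’s GitHub README documents the real ones:

```bash
# 1) On local: push the database and asset volumes up to S3
#    (command names are hypothetical — see the plugin README)
./craft remote-sync/database/push
./craft remote-sync/volumes/push

# 2) On staging: pull and restore the newly created files
./craft remote-sync/database/pull
./craft remote-sync/volumes/pull

# 3) Tidy up the remote copies once everything is synced
./craft remote-sync/database/delete
```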
There are plenty more features available, including emergency backups and queue support. Head over to the GitHub page to read the documentation in more detail.
If you’ve got questions or comments feel free to get in touch via timmy@weareferal.com, or chat to me on the Craft Discord, username timmy.
Interested in trying it out?
Head over to the Plugin Store to try Remote Sync out for yourself (it's free to try).