Syncing your database and assets across different environments (local, staging and production) is an issue that plagues most developers, whether they're deploying websites or apps. It's tedious, time-consuming and error-prone 😑. Every time I find myself knee-deep in an SSH terminal importing and exporting Postgres database dumps I promise myself "never again".

The import/export dance 💃🏾

First, let's define a couple of common scenarios where I find myself doing the import/export dance.

1) Deploying local to staging

At the beginning of development you've been working locally on your machine with a test database and test user uploads.

Your colleagues or client want to see it up and running, so you create a staging server. When you go to deploy your site to this staging server you do the dance:

  1. Create a local database dump
  2. Zip up existing user uploads
  3. SFTP/SCP that database dump and zip file up to the staging server
  4. SSH into the staging server
  5. Import the database dump and unzip the user uploads
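For a Postgres-backed site that dance looks roughly like the following sketch (the paths, hostnames, database names and credentials here are all placeholders):

# 1 & 2. Dump the local database and zip up the user uploads
pg_dump -U craft -d craft_local -Fc > craft-local.dump
zip -r uploads.zip web/uploads

# 3. Copy both files up to the staging server
scp craft-local.dump uploads.zip deploy@staging.example.com:/tmp/

# 4 & 5. SSH in, restore the database and unpack the uploads
ssh deploy@staging.example.com
pg_restore -U craft -d craft_staging --clean /tmp/craft-local.dump
unzip /tmp/uploads.zip -d /var/www/site/web/uploads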

In my experience, at least 10 other things will happen along the way to frustrate this process: permissions, forgotten passwords, wrong versions …

2) Importing staging to local

You haven't worked on a website for a while but the client comes to you for a new feature. You still have a local copy of the code but the database and/or assets are out of date. You have to log in to the production server and do the dance.

3) Deploying staging to production

You have a staging site that is ready for deployment. The client has been loading all their content onto the staging site. You either have to repurpose the staging server as the production server or … do the dance.

The ideal, automated world

If you are regularly deploying websites to different environments, this begins to add up to an enormous amount of time spent on tedious tasks.

There is a way to remove all of this faff though. First, create a more sensible architecture that centralises our database dumps and asset volume copies in a single, accessible location; and second, automate, automate, automate.

Single source of truth

We first need to create a "single source of truth": a location that we'll use to share files. All of our various environments will push and pull to and from this location. Ideally all of the files within this location will be versioned too.

This single source of truth needs to be in the cloud and remotely available so that our other environments (and colleagues/clients) can access it; in our case this is an AWS S3 bucket.

Automation 🤖

We then need to automate as many of the steps in the export/import dance above as possible:

  • Automate the process of pushing our database and assets to our single source of truth
  • Automate the process of pulling our database/assets from our single source of truth
  • Automate the process of restoring these once pulled down
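As a very rough sketch of what the "push" half of that automation boils down to, assuming the AWS CLI is installed and using the bucket details from later in this post as placeholders:

# Dump the database and push a timestamped copy to the shared S3 bucket
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
pg_dump -U craft -d craft_local -Fc > "backup_${TIMESTAMP}.dump"
aws s3 cp "backup_${TIMESTAMP}.dump" "s3://feral-backups/craft-backups/my-site/"

# Same idea for the assets: sync the uploads folder up to the bucket
aws s3 sync web/uploads "s3://feral-backups/craft-backups/my-site/uploads/"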

The scripting approach

Let's be clear, these aren't new ideas. In fact, there is already a de-facto library out there that handles many of the concepts mentioned above really well, called craft-scripts.

This library was created by Andrew Welch a.k.a. nystudio107. As well as creating an absolute plethora of great plugins for Craft, he's written a comprehensive blog post on the topic of DB and asset syncing that you should read before going any further.

This library uses bash scripts to give you the tools you need to do everything we mentioned above.

There are some limitations to this approach though:

  • You need to integrate it into your codebase (by adding the scripts folder to your project).
  • It uses bash scripts, so everything operates outside of the Craft ecosystem.
    • You can't run any of the scripts via the CP, for example
    • You can't configure anything via the CP
  • It requires additional environment configuration outside of the standard Craft .env file (via a scripts/.env.sh).
  • This configuration is significantly different depending on the environment.

While this certainly isn't the end of the world, for me where this becomes an issue is when you are working on a number of sites and trying to automate provisioning and deployment via a service such as Laravel Forge. You have to start logging in to your servers again and copying/pasting configurations between environments, which can take a lot of extra time.

The plugin approach

To overcome some of these shortcomings we've created a plugin to handle syncing between environments:

Craft Remote Sync lets you push, pull & delete copies of your database and asset volumes from the comfort of the Control Panel (or the CLI if you prefer), letting you easily share files across your different environments.

This allows you to avoid having to SSH into machines or dump/restore databases manually. Everything is handled via a single "Utilities" tab in the Craft Control Panel.

[Screenshot: The Remote Sync utilities page lets you push, pull and delete remote files.]

Installation

Installing the plugin should be easy. Either search for it in the Craft Plugin Store or run:

composer require weareferal/remote-sync

from the command line.
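If you'd rather finish the install from the terminal as well, Craft's own CLI can activate the plugin once Composer has pulled it in. This assumes the plugin handle is remote-sync; check the plugin's GitHub page if in doubt:

# Install and activate the plugin via Craft's console command
php craft plugin/install remote-sync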

Configuration

First you need to add your AWS details. Go to the Settings > Plugins > Remote Sync settings page.

You can enter the details directly, but it's easier to migrate from environment to environment when you use environment variables instead. This means adding your AWS details to your .env file first:

AWS_ACCESS_KEY = "..."
AWS_SECRET_KEY = "..."
AWS_REGION = "us-west-2"
AWS_BUCKET_NAME = "feral-backups"
AWS_BUCKET_PREFIX = "craft-backups/my-site"

.env

and then referencing those environment variables from the settings page:
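In the settings fields you use Craft's standard $VARIABLE syntax rather than the raw values, so the fields end up looking something like this (the exact field labels may differ slightly):

Access Key ID:     $AWS_ACCESS_KEY
Secret Access Key: $AWS_SECRET_KEY
Region:            $AWS_REGION
Bucket Name:       $AWS_BUCKET_NAME
Bucket Prefix:     $AWS_BUCKET_PREFIX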

This is important as it means your access key ID and secret key are not saved in the project.yaml file (which is checked into Git) and are instead only referenced via environment variables (which are not checked into Git).

Pushing, Pulling and Deleting

Once you have configured everything, head over to the Utilities > Remote Sync section of the Craft CP to actually perform operations.

This is the guts of the plugin. From here you can:

  • Push your local database/asset volumes from the current machine to your remote S3 bucket
  • Pull and restore a remote database/asset volume copy from the remote S3 bucket to your local machine
  • Delete a remote copy of a database or volume altogether

You'll see in the screenshot above that we currently have 4 files available in our remote S3 bucket. A workflow might look like:

  1. On my local machine, push my database and asset volumes to S3
  2. Log in to my staging machine and pull & restore from the newly created files
  3. Follow up by deleting the remote files, now that I've synced everything
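Because the plugin also exposes console commands, that same workflow can be scripted from the terminal. The command paths below are assumptions based on the plugin's naming rather than confirmed signatures, so check the GitHub README for the exact console commands:

# On the local machine: push the database and asset volumes up to S3
# (command paths assumed; see the plugin README for the exact ones)
php craft remote-sync/database/push
php craft remote-sync/volumes/push

# On the staging machine: pull down and restore the newest copies
php craft remote-sync/database/pull
php craft remote-sync/volumes/pull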

There are plenty more features available, including emergency backups and queue support. Head over to the GitHub page to read the documentation in more detail.

If you've got questions or comments feel free to get in touch via timmy@weareferal.com, or chat to me on the Craft Discord, username timmy.
