Importing Compose.IO MongoDB dump into local docker instance

Sometimes, in order to test performance on your local machine or to try data migrations against real data, you need to import a production database. To make things a bit more complicated, we are going to use Docker. In general I avoid writing tutorials like this, but I was surprised to find there are none for this particular operation.
If you inspect the contents of the dump that Compose provides, you'll see something like this:
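The exact file names depend on your database name, but for a pre-WiredTiger (MMAPv1) dump the extracted folder typically looks something like the following (`my_db` is a placeholder):

```
my_db.0
my_db.ns
local.0
local.ns
mongod.lock
journal/
```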

This folder contains raw database files, so there is no need for mongorestore or similar tools. Also keep in mind that even if your local MongoDB defaults to the WiredTiger storage engine, it will still pick up this older data format from the existing files.
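If you want to double-check which storage engine your local mongod actually picked up after the import, the mongo shell can tell you (the `storageEngine` field is reported by `db.serverStatus()` on MongoDB 3.0+):

```
db.serverStatus().storageEngine.name
```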

Just show me the code


# tab 1
# extract the backup (tar -C requires the target directory to exist)
mkdir -p backup
tar -xvf backup.tar -C backup

# stop running mongo instance
docker-compose stop mongo

# spin up mongo container and run a bash command in it
docker-compose run mongo /bin/bash

# executing code inside docker now!

# Assuming /data/db is persisted in a mounted volume, let's back up the old data in case we screw up:
cp -r /data/db /data/db_backup
rm -rf /data/db/*
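The back-up-and-clear step can be sanity-checked anywhere with stand-in directories (the paths below are throwaway substitutes for /data/db and /data/db_backup):

```shell
# demonstrate the backup-then-clear pattern on throwaway directories
mkdir -p data_db
echo "old" > data_db/old_file

# cp needs -r to copy a directory
cp -r data_db data_db_backup

# clear the live directory; the copy keeps the old data
rm -rf data_db/*

ls data_db_backup
```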
Now let's open a new tab and copy the backup data into our container:

# adjust container name if it's different
CONTAINER_ID=$(docker ps -qf "name=mongo")

# copy backup data into your docker instance
# (docker cp does not expand wildcards; "backup/." copies the directory's contents)
docker cp backup/. $CONTAINER_ID:/data/db
Back to the tab #1:

# tab #1, inside docker

# start mongod in the background, then open a mongo shell
mongod --fork --logpath /var/log/mongod.log
mongo

# now inside mongo

# rename database to whatever name you use locally
db.copyDatabase('my_db_production', 'my_db_development')

use my_db_production

# drop the db we copied from
db.dropDatabase()
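To confirm the copy worked, still in the mongo shell, you can list the databases and spot-check the collections in the copy:

```
show dbs
use my_db_development
db.getCollectionNames()
```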

Well, that was it. Even if you don't find it useful, I'm sure my future self will be glad I wrote it down.
