Content Data Management
This guide explains how to get the content library data from production into your local development environment, using the gyrinx-dumpdata export job and the custom loaddata_overwrite management command.
This process is only available for trusted developers and admins who have been granted access to the production infrastructure.
Why You Need This
Gyrinx is really limited without the content library data - it's what makes the application useful. The content library includes all the game data (fighters, equipment, weapons, skills, houses) that's managed by the Gyrinx content team in production.
Without this data, you'll have an empty shell of an application that's pretty hard to test or develop against.
The Process
1. Export from Production
The export uses the gyrinx-dumpdata Cloud Run job in production:
Access the Google Cloud Console (you need permissions)
Navigate to Cloud Run → Jobs
Find and run the gyrinx-dumpdata job
The job exports all content data to latest.json in the gyrinx-app-bootstrap-dump bucket
Or use the gcloud CLI:
# Trigger the dumpdata job
gcloud run jobs execute gyrinx-dumpdata --region=europe-west2

2. Download the Export
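The export lands as latest.json in the gyrinx-app-bootstrap-dump bucket. As a sketch, assuming you have gsutil authenticated against the production project (the gs:// path is inferred from the bucket name above), you can copy it into your working directory:

# Copy the latest export out of the dump bucket
gsutil cp gs://gyrinx-app-bootstrap-dump/latest.json .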
3. Import Locally
The loaddata_overwrite command replaces your local content with the production data:
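A minimal sketch of the invocation, assuming the command takes the fixture path as its only argument:

# Replace all local content with the production export (destructive!)
python manage.py loaddata_overwrite latest.json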
What loaddata_overwrite Does
This custom command is different from Django's built-in loaddata:
Clears existing content - Wipes all content models before importing (destructive!)
Handles foreign keys - Temporarily disables constraints during import
Skips historical records - Ignores django-simple-history tables
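For contrast, Django's built-in command merges rather than replaces - it creates or updates objects by primary key and never deletes anything:

# Built-in loaddata: upserts fixture objects, leaves everything else in place
python manage.py loaddata latest.json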
Content primary keys stay consistent over time, so as long as you develop against content exported from production, your local content references will remain valid.
Common Tasks
Getting Started with Development
When you first set up Gyrinx locally:
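Run the full export-download-import cycle, then start the dev server (a sketch that assumes the gcloud and gsutil steps shown above):

# 1. Export from production
gcloud run jobs execute gyrinx-dumpdata --region=europe-west2
# 2. Download the export
gsutil cp gs://gyrinx-app-bootstrap-dump/latest.json .
# 3. Import locally (destructive: wipes existing local content)
python manage.py loaddata_overwrite latest.json
python manage.py runserver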
Debugging Production Content
Need to investigate a content issue from production?
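Pull a fresh export, import it, and inspect the data locally - for example via the Django shell (a sketch; which models you query depends on the issue):

# Grab the current production content and load it
gsutil cp gs://gyrinx-app-bootstrap-dump/latest.json .
python manage.py loaddata_overwrite latest.json
# Poke at the content models interactively
python manage.py shell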
Warnings
This deletes all your local content data - The command wipes content models before importing
Don't commit latest.json - It's already in .gitignore; keep it that way
Need access - You must be a trusted developer with GCS permissions
Big file - The export can be large, depending on how much content exists
If Things Go Wrong
Import Failed?
The database might be partially cleared. Just run the command again - it'll clear everything and start fresh.
No Access?
If you can't access the GCS bucket or Cloud Run job, you'll need to ask an admin for:
Access to the production GCP project
Permissions for the gyrinx-app-bootstrap-dump bucket
Ability to run the gyrinx-dumpdata Cloud Run job
Corrupted JSON?
Re-download from the bucket. The export job creates valid JSON, so corruption usually happens during download.
Technical Details
The command lives at gyrinx/core/management/commands/loaddata_overwrite.py.
The command uses PostgreSQL's TRUNCATE CASCADE for fast deletion, but falls back to regular DELETE if that fails. Foreign key checks are disabled with SET session_replication_role = 'replica' during import.
Historical models (from django-simple-history) are automatically detected and skipped - they're managed separately by the history system.
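Put together, the deletion step looks roughly like the sketch below. This is an illustration of the pattern described above, not the actual implementation (that lives at the path given earlier), and the historical-model check is a simplification:

from django.db import connection

def clear_content_tables(models):
    """Sketch of the wipe step: TRUNCATE CASCADE with a DELETE fallback."""
    with connection.cursor() as cursor:
        # Relax foreign key enforcement for this session (PostgreSQL)
        cursor.execute("SET session_replication_role = 'replica'")
        for model in models:
            # Skip django-simple-history tables (simplified check)
            if model._meta.model_name.startswith("historical"):
                continue
            table = model._meta.db_table
            try:
                # Fast path: TRUNCATE CASCADE clears dependent rows too
                cursor.execute(f'TRUNCATE TABLE "{table}" CASCADE')
            except Exception:
                # Fall back to a plain DELETE if TRUNCATE isn't possible
                cursor.execute(f'DELETE FROM "{table}"')
        # Restore normal constraint enforcement
        cursor.execute("SET session_replication_role = 'origin'")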