Another instalment in the series on moving my code from a single machine into a containerised environment. I had previously assumed that I could write files to the machine's local drive and serve them from there. That assumption falls over in a containerised environment, where the filesystem is ephemeral, so I need an external object store like S3 or Google Cloud Storage to hold and serve the files.
I've decided to try out Google Cloud for hosting this application, so we're going to look at Google Cloud Storage in this instance.
A quick search of PyPI turns up the django-storages package, which can point Django's `DEFAULT_FILE_STORAGE` setting at Google Cloud Storage. A great start.
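In its simplest form, the configuration looks something like this. The bucket name is a placeholder of mine, and the backend path is the one django-storages documents for Google Cloud:

```python
# settings.py (sketch): the bucket name below is a placeholder.
# Requires the google extra: pip install django-storages[google]
DEFAULT_FILE_STORAGE = "storages.backends.gcloud.GoogleCloudStorage"
GS_BUCKET_NAME = "my-media-bucket"
```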
I've begun by creating a public Cloud Storage bucket, which I will then write to from my Django application.
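For reference, doing that programmatically might look something like the following, using the google-cloud-storage client library. The project and bucket names are placeholders, and granting an allUsers binding like this assumes the bucket uses IAM rather than legacy ACL-based permissions:

```python
# Sketch: create a bucket and make its objects publicly readable.
# The project and bucket names are placeholders.
from google.cloud import storage

client = storage.Client(project="my-project")
bucket = client.create_bucket("my-media-bucket")

# Grant read access on all objects in the bucket to everyone.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append(
    {"role": "roles/storage.objectViewer", "members": {"allUsers"}}
)
bucket.set_iam_policy(policy)
```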
Reading the documentation for the django-storages package, it appears that if I'm running my application within Google Cloud infrastructure, I'll automatically gain read/write access to my buckets, so long as they're in the same project. I might just go right ahead and try this out.
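That works because the client library falls back to Application Default Credentials when no key file is supplied, and on Google Cloud infrastructure those come from the attached service account. A quick way to sanity-check which credentials an environment will pick up:

```python
# Sketch: inspect the Application Default Credentials that
# google-cloud-storage would use in this environment.
import google.auth

credentials, project_id = google.auth.default()
print(f"Project: {project_id}")
print(f"Credentials: {type(credentials).__name__}")
```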
To activate this, I'll once again fall back to environment variables: local file storage when running a local copy of the application, and the Google Cloud Storage engine when running within the Google Cloud environment.
To achieve this, I'll need the following (a sketch follows the list):

- A `MEDIA_URL` setting based on the environment we're in
- A `DEFAULT_FILE_STORAGE` setting based on the environment we're in
- File URLs built from `MEDIA_URL`, combining it with the base URL at request time rather than storing absolute URLs
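Here's a minimal sketch of how the first two settings might switch on an environment variable. `USE_GCS` and the bucket variable are placeholder names of my own, not anything django-storages prescribes:

```python
# settings.py (sketch): the USE_GCS and GS_BUCKET_NAME
# environment variables are placeholders of my own.
import os

if os.environ.get("USE_GCS") == "true":
    # Running on Google Cloud: store and serve media from the bucket.
    DEFAULT_FILE_STORAGE = "storages.backends.gcloud.GoogleCloudStorage"
    GS_BUCKET_NAME = os.environ["GS_BUCKET_NAME"]
    MEDIA_URL = f"https://storage.googleapis.com/{GS_BUCKET_NAME}/"
else:
    # Running locally: store and serve media from the local filesystem.
    # BASE_DIR comes from the standard Django settings scaffold.
    MEDIA_URL = "/media/"
    MEDIA_ROOT = os.path.join(BASE_DIR, "media")
```

The third requirement then mostly takes care of itself: a FileField's `url` property asks the active storage backend to build the URL, so templates that use `{{ object.file.url }}` pick up whichever base URL is in play.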