Configuring Dragonfly to host files remotely on DigitalOcean Spaces? Pull in the dragonfly-s3_data_store gem:
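# Gemfile
gem 'dragonfly-s3_data_store'

And this is the config for you: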
# config/initializers/dragonfly.rb
require 'dragonfly'

Dragonfly.app.configure do
  # the usual stuff

  datastore :s3,
    bucket_name: 'my-bucket',
    access_key_id: 'DO-access-key',
    secret_access_key: 'DO-secret-key',
    region: 'DO-region', # e.g. 'nyc3'
    url_scheme: 'https',
    url_host: 'nyc3.digitaloceanspaces.com',
    fog_storage_options: {
      endpoint: 'https://nyc3.digitaloceanspaces.com'
    }
end
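Once that's in place, stored content resolves to a Spaces URL. A quick sanity check from the console (the photo.jpg file here is just an example):

uid = Dragonfly.app.store(File.new('photo.jpg'))
Dragonfly.app.remote_url_for(uid)
# => a URL built from url_scheme and url_host,
#    e.g. "https://nyc3.digitaloceanspaces.com/..."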
DigitalOcean did a pretty rad thing when building the API around its Spaces object storage service. To quote their documentation:
The API is interoperable with Amazon's AWS S3 API allowing you to interact with the service while using the tools you already know.
One more quote I'll pull for some additional context:
In most cases, when using a client library, setting the "endpoint" or "base" URL to ${REGION}.digitaloceanspaces.com and generating a Spaces key to replace your AWS IAM key will allow you to use Spaces in place of S3.
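You can see that swap in action at the Fog level, independent of Dragonfly. A minimal sketch with placeholder credentials:

require 'fog/aws'

storage = Fog::Storage.new(
  provider: 'AWS',
  aws_access_key_id: 'DO-access-key',
  aws_secret_access_key: 'DO-secret-key',
  region: 'nyc3',
  endpoint: 'https://nyc3.digitaloceanspaces.com' # Spaces endpoint, not AWS
)
storage.directories # lists your Spaces, via the S3 API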
Dragonfly's datastore configuration is no exception. Pass in :s3 as the datastore type, and plop in the access credentials as per usual. The only tricky bit is the fog_storage_options configuration key. You normally don't need to worry about it, but it comes into play when the dragonfly-s3_data_store gem builds its Fog::Storage object. The gem won't infer the endpoint from the configured url_host and url_scheme values, and without a specified endpoint, the fog-aws gem will supply a reasonable default along the lines of "s3-#{region}.amazonaws.com", which points at Amazon rather than at DigitalOcean.
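Roughly speaking, the gem hands those options straight to Fog. A paraphrase of what happens under the hood (not the gem's actual source; the variable names are illustrative):

storage = Fog::Storage.new({
  provider: 'AWS',
  aws_access_key_id: access_key_id,
  aws_secret_access_key: secret_access_key,
  region: region
}.merge(fog_storage_options))
# Without an :endpoint entry in fog_storage_options, fog-aws falls
# back to its AWS default, and requests never reach Spaces.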
So, set the fog storage endpoint, and the rest should click into place. Kudos to open source software for making this process so easy.