The @payloadcms/plugin-cloud-storage plugin allows you to upload files from Payload collections to a Google Cloud Storage bucket.
It relies on Node APIs that are not available in the browser.
The payload.config.ts file is used both by the Payload server and by the webpack build for the admin frontend.
The GCS adapter for this plugin relies on the @google-cloud/storage package, which uses Node APIs to process and upload files.
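For reference, a typical GCS setup with this plugin looks roughly like the sketch below. This is a minimal example based on the plugin's README; the import path and option names may differ between versions, and `myGcsAdapter` is simply the name referenced later in this post.

```ts
import { gcsAdapter } from '@payloadcms/plugin-cloud-storage/gcs'

// Sketch only: credentials and bucket are read from env vars. Parsing a
// service-account key from GCS_CREDENTIALS is one common approach, not the
// only one.
const myGcsAdapter = gcsAdapter({
  options: {
    credentials: JSON.parse(process.env.GCS_CREDENTIALS || '{}'),
  },
  bucket: process.env.GCS_BUCKET || '',
})
```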
When you add this plugin to your payload.config.ts and a collection uses the gcsAdapter, the plugin automatically extends your webpack config with the following alias (see: https://github.com/payloadcms/plugin-cloud-storage/blob/master/src/adapters/gcs/webpack.ts):
```ts
'@google-cloud/storage': path.resolve(__dirname, './mock.js')
```
This mocks the dependency that relies on Node APIs, allowing the frontend to be built without errors.
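For context, the adapter's webpack extension boils down to roughly the following. This is a simplified sketch of what the linked webpack.ts does, not a verbatim copy.

```ts
import path from 'path'
import type { Configuration } from 'webpack'

// Merge the mock alias into whatever webpack config Payload already built.
const extendWebpackConfig = (existing: Configuration): Configuration => ({
  ...existing,
  resolve: {
    ...existing.resolve,
    alias: {
      ...(existing.resolve?.alias || {}),
      '@google-cloud/storage': path.resolve(__dirname, './mock.js'),
    },
  },
})
```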
In my project, I want to allow both:
- Local filesystem uploads when neither GCS_CREDENTIALS nor GCS_BUCKET is set
- Cloud uploads when both GCS_CREDENTIALS and GCS_BUCKET are set
My first attempt was something similar to this:
```ts
cloudStorage({
  collections: {
    'heroImages': {
      adapter: !!GCS_CREDENTIALS && !!GCS_BUCKET && myGcsAdapter,
      disableLocalStorage: !!GCS_CREDENTIALS && !!GCS_BUCKET,
    },
  },
}),
```
However, when the adapter is not set, the plugin doesn't add the webpack alias for the dependency that uses Node APIs, yet the dependency is still imported via payload.config.ts. This causes webpack to emit many errors about Node APIs that it says should be polyfilled or mocked.
The best workaround I have found is to add the plugin to webpack's `noParse` option when you want to use the local filesystem. This stops webpack from parsing the plugin dependency and, therefore, @google-cloud/storage.
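In payload.config.ts, the override looks roughly like the sketch below. `useCloudStorage` is my own helper name, and the `noParse` pattern is illustrative; it may need tweaking to match your node_modules layout.

```ts
import { buildConfig } from 'payload/config'

const useCloudStorage = Boolean(process.env.GCS_CREDENTIALS && process.env.GCS_BUCKET)

export default buildConfig({
  // ...collections and plugins (including cloudStorage) as before
  admin: {
    webpack: (config) => ({
      ...config,
      module: {
        ...config.module,
        // When falling back to local storage, stop webpack from parsing the
        // plugin (and, transitively, @google-cloud/storage) so its Node API
        // imports never reach the admin bundle.
        noParse: useCloudStorage
          ? config.module?.noParse
          : /@payloadcms\/plugin-cloud-storage/,
      },
    }),
  },
})
```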
In the attached payload.config.ts on a fresh install, this can be demonstrated by commenting out the custom webpack config and running `npm run build`, which errors. Adding the custom webpack config back and running `npm run build` again succeeds.