
@jezhumble
Last active November 6, 2024 22:23
# Copyright 2019 Google LLC.
# SPDX-License-Identifier: Apache-2.0
# This snippet shows you how to use Blob.generate_signed_url() from within compute engine / cloud functions
# as described here: https://cloud.google.com/functions/docs/writing/http#uploading_files_via_cloud_storage
# (without needing access to a private key)
# Note: as described on that page, you need to run your function with a service account
# that has the role roles/iam.serviceAccountTokenCreator
import os
from datetime import datetime, timedelta

import google.auth
from google.auth import compute_engine
from google.auth.transport import requests
from google.cloud import storage

auth_request = requests.Request()
credentials, project = google.auth.default()
storage_client = storage.Client(project, credentials)
data_bucket = storage_client.lookup_bucket(os.getenv("BUCKET_NAME"))
signed_blob_path = data_bucket.blob("FILENAME")
# Naive datetimes are treated as UTC by generate_signed_url, so use utcnow().
expires_at = datetime.utcnow() + timedelta(minutes=30)
# This next line is the trick! These signing credentials call the IAM signBytes
# API instead of using a local private key.
signing_credentials = compute_engine.IDTokenCredentials(
    auth_request, "", service_account_email=credentials.service_account_email
)
signed_url = signed_blob_path.generate_signed_url(
    expires_at, credentials=signing_credentials, version="v4"
)
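
# The V4 signed URL embeds its own authorization, so any plain HTTP client can
# use it. A minimal usage sketch, assuming the `requests` HTTP library is
# installed (aliased to avoid clashing with google.auth's transport import above):
import requests as http_requests

response = http_requests.get(signed_url)
response.raise_for_status()
print(len(response.content))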
@jbn commented Jan 25, 2020

This is a misleading error message, because the signature is not the problem. What was actually missing was the correct "host" header for your Google bucket:

For me, I needed to do,

signed_url = signed_blob_path.generate_signed_url(expires_at, credentials=signing_credentials, version="v4", method='PUT')
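
For completeness, a minimal sketch of the matching client-side upload, assuming the plain requests library (the file name is illustrative; if you also pass content_type when signing, send the same Content-Type header on the upload):

import requests

with open("local_file", "rb") as f:
    response = requests.put(signed_url, data=f)
response.raise_for_status()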

@sparkinson commented May 14, 2020

I recently had problems getting this working. Eventually, I managed to create credentials that can sign blobs using:

import google.auth
from google.auth import compute_engine
from google.auth.transport import requests

auth_request = requests.Request()
# Uses the instance's default credentials; no need to pass service_account_email.
signing_credentials = compute_engine.IDTokenCredentials(auth_request, "")
signing_credentials.signer.sign(string_to_sign)  # string_to_sign: whatever bytes you need signed

This uses the default credentials for the compute instance.

Note that you need to add permissions to the default credentials to be able to sign URLs, and to enable the IAM service in your project:

gcloud iam service-accounts add-iam-policy-binding "[email protected]" \
    --member="serviceAccount:[email protected]" \
    --role="roles/iam.serviceAccountTokenCreator"
gcloud services enable iam.googleapis.com

The methods stated above were not working for me because the credentials obtained via google.auth.default() had not been refreshed, and so did not actually contain the service account email address. Attempts to refresh them threw scope errors which I did not know how to solve.

@0zeroth commented Oct 11, 2020

If you encounter TypeError: 'Request' object is not callable, you are using requests.Request() (from the requests HTTP library) rather than google.auth.transport.requests.Request().

Hoping this saves someone else from chasing the obvious for far too long...
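
A minimal sketch of the mix-up; note the error only surfaces later, when google-auth tries to call the object as a transport:

# Wrong: the requests HTTP library's Request is a plain object; google-auth
# later calls it and fails with TypeError: 'Request' object is not callable.
import requests
auth_request = requests.Request()

# Right: google-auth's transport adapter is callable.
from google.auth.transport.requests import Request
auth_request = Request()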

@meetchandan

Hi,

I want to create a signed URL with which I can upload a file to GCS.
Is passing "POST" as the method parameter enough?

@mezhaka commented Nov 17, 2020

I had a similar issue to what @jessjenk described above:

TransportError: Error calling the IAM signBytes API: b'{\n "error": {\n "code": 400,\n "message": "Invalid service account email (default).",\n "status": "INVALID_ARGUMENT"\n }\n}\n'

However, her workaround did not fix it in my case: the value of storage_client._credentials.service_account_email was default, even though the node I was running on had a different service account attached. I had to pass the service account email explicitly to make it work:

        signing_credentials = compute_engine.IDTokenCredentials(
            auth_request,
            "",
            service_account_email="[email protected]",
        )

@mezhaka commented Nov 19, 2020

Does anyone know if it's OK to create an IDTokenCredentials instance once and use it for the lifetime of the application? I have browsed the implementation a bit; it has a refresh method, which made me think that either something calls it when the token expires, or I am supposed to call it myself.
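
For what it's worth, google-auth credentials expose valid, expired, and refresh(); a sketch of reusing one instance and refreshing it manually before signing (whether generate_signed_url refreshes an expired token for you is not confirmed here):

# Reuse a single IDTokenCredentials instance across requests,
# refreshing it manually once the token has expired.
if not signing_credentials.valid:
    signing_credentials.refresh(auth_request)
signed_url = blob.generate_signed_url(expires_at, credentials=signing_credentials, version="v4")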

@econtal commented Jan 7, 2021

@mezhaka

However, her workaround did not fix it in my case: the value of storage_client._credentials.service_account_email was default, even though the node I was running on had a different service account attached.

Actually, you need to make some API call with your client for service_account_email to be populated automatically. So, for instance, this should work:

from google.auth import compute_engine
from google.auth.transport import requests
from google.cloud import storage

auth_request = requests.Request()
storage_client = storage.Client()
my_bucket = storage_client.get_bucket('my_bucket')  # this call populates the email
signing_credentials = compute_engine.IDTokenCredentials(
    auth_request,
    "",
    service_account_email=storage_client._credentials.service_account_email)

or if you don't like using a private attribute, you can be more explicit:

credentials, project = google.auth.default()
storage_client = storage.Client(project, credentials)
my_bucket = storage_client.get_bucket('my_bucket')  # this call refreshes the credentials
signing_credentials = compute_engine.IDTokenCredentials(
    auth_request,
    "",
    service_account_email=credentials.service_account_email)
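
Alternatively, a sketch that skips the bucket call and refreshes the default credentials directly; on plain GCE the refresh hits the metadata server and fills in the real email (though note @sparkinson reported refresh raising scope errors in some setups):

import google.auth
from google.auth import compute_engine
from google.auth.transport import requests

auth_request = requests.Request()
credentials, project = google.auth.default()
credentials.refresh(auth_request)  # metadata call populates service_account_email
signing_credentials = compute_engine.IDTokenCredentials(
    auth_request,
    "",
    service_account_email=credentials.service_account_email)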

@deven96 commented Apr 22, 2021

""" Generating a downloadable GET link  (that expires in 30 minutes) for a  file in bucket '"""
from google.auth.transport import requests
from google.auth import compute_engine
from datetime import datetime, timedelta
from google.cloud import storage

auth_request = requests.Request()
storage_client = storage.Client()
data_bucket = storage_client.bucket("BUCKET_NAME")
blob = data_bucket.get_blob("FILENAME")
expires_at_ms = datetime.now() + timedelta(minutes=30)
signing_credentials = compute_engine.IDTokenCredentials(auth_request, "")
signed_url = blob.generate_signed_url(expires_at_ms, credentials=signing_credentials)

I am using App Engine Python 3 and this worked for me. However, I had to add the IAM Service Account Token Creator role to my App Engine app's default service account [email protected]; otherwise it showed the error INFO:root:Error calling the IAM signBytes API: b'{\n "error": {\n "code": 403,\n "message": "The caller does not have permission",\n "status": "PERMISSION_DENIED"\n }\n}\n'. The role apparently enables the account to sign blobs.
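
The grant described above mirrors the earlier gcloud command; a hedged version for the App Engine default service account (PROJECT_ID is a placeholder):

gcloud iam service-accounts add-iam-policy-binding "[email protected]" \
    --member="serviceAccount:[email protected]" \
    --role="roles/iam.serviceAccountTokenCreator"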

@nguaman commented Oct 22, 2021

Adding the Service Account Token Creator role works for me in Google Cloud Run (Oct 2021).

@deven96 Thanks!

@igortxra

Thanks @deven96, this was very useful. Works for me too (Oct 2021).

@deven96 commented Oct 28, 2021

My pleasure! @nguaman @igortxra

@patrickchho commented Dec 24, 2021

@deven96 this works, thank you! (Dec 2021)

One more note for future readers: the solution works in the cloud environment. Locally, I get google.auth.exceptions.TransportError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true from the Google Compute Engine metadata service. To work around this locally, I use a service account JSON key.
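
A sketch of that local fallback (key file path and project are placeholders); the JSON key carries a private key, so generate_signed_url can sign locally without IDTokenCredentials:

from datetime import timedelta
from google.oauth2 import service_account
from google.cloud import storage

credentials = service_account.Credentials.from_service_account_file("key.json")
storage_client = storage.Client(project="YOUR_PROJECT", credentials=credentials)
blob = storage_client.bucket("BUCKET_NAME").blob("FILENAME")
signed_url = blob.generate_signed_url(timedelta(minutes=30), version="v4")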

@adriangb

Someone please correct me if I'm wrong, but this ends up making an HTTP request each time it needs to sign, right? And a synchronous one at that.

@mezhaka commented Apr 25, 2022

@adriangb This is my understanding as well. But there's a way to sign completely offline AFAIR.

@adriangb

Not without a private key, which is super problematic both for production and local development.

@saiayn commented Jul 9, 2023

If you need to accomplish this task within a Firebase Cloud Function using .onCall(), please follow the steps outlined below:

from firebase_admin import storage
from google.auth import compute_engine
from google.auth.transport.requests import Request
from datetime import timedelta

def your_function(parameter):
    # Create an authentication request
    auth_request = Request()

    # Get your IDTokenCredentials
    signing_credentials = compute_engine.IDTokenCredentials(
        auth_request,
        "",
        service_account_email='<ADD YOUR SERVICE ACCOUNT MAIL(Principal)>'
    )

    # Get your storage bucket
    data_bucket = storage.bucket('<YOUR BUCKET>')
    
    # Generate a signed URL for your bucket
    blob = data_bucket.blob(parameter)
    url = blob.generate_signed_url(
        expiration=timedelta(days=7),
        credentials=signing_credentials, 
        version="v4"
    )
    
    return url

Remember to replace '<ADD YOUR SERVICE ACCOUNT MAIL(Principal)>' and '<YOUR BUCKET>' with your actual service account email and bucket name, respectively.

@PieterT2000

If all of the above did not work for you, this worked for me:
https://stackoverflow.com/a/78647388/9713831
