# Copyright 2019 Google LLC.
# SPDX-License-Identifier: Apache-2.0
# This snippet shows you how to use Blob.generate_signed_url() from within compute engine / cloud functions
# as described here: https://cloud.google.com/functions/docs/writing/http#uploading_files_via_cloud_storage
# (without needing access to a private key)
# Note: as described in that page, you need to run your function with a service account
# with the permission roles/iam.serviceAccountTokenCreator
import os, google.auth
from google.auth.transport import requests
from google.auth import compute_engine
from datetime import datetime, timedelta
from google.cloud import storage

auth_request = requests.Request()
credentials, project = google.auth.default()
storage_client = storage.Client(project, credentials)
data_bucket = storage_client.lookup_bucket(os.getenv("BUCKET_NAME"))
signed_blob_path = data_bucket.blob("FILENAME")
expires_at = datetime.now() + timedelta(minutes=30)  # expiration is a datetime, not milliseconds
# This next line is the trick!
signing_credentials = compute_engine.IDTokenCredentials(auth_request, "", service_account_email=credentials.service_account_email)
signed_url = signed_blob_path.generate_signed_url(expires_at, credentials=signing_credentials, version="v4")
Maybe this link will solve some problems, take a look
https://stackoverflow.com/questions/54271402/sign-google-cloud-storage-urls-with-google-compute-engine-default-service-accoun/54272263#54272263
I recently had problems getting this working. Eventually, I managed to create credentials that can sign blobs using:
import google.auth.transport.requests
from google.auth import compute_engine

auth_request = google.auth.transport.requests.Request()
# This uses the default credentials by default; no need to pass service_account_email.
signing_credentials = compute_engine.IDTokenCredentials(auth_request, "")
signing_credentials.signer.sign(string_to_sign)
This uses the default credentials for the compute instance.
Note that you need to add permissions to the default credentials to be able to sign URLs, and to enable the IAM service in your project:
gcloud iam service-accounts add-iam-policy-binding --role=roles/iam.serviceAccountTokenCreator "[email protected]" --member="serviceAccount:[email protected]"
gcloud services enable iam.googleapis.com
The methods stated above were not working for me, as the client_credentials obtained using google.auth.default() had not been refreshed and so did not actually contain the service account email address. Attempts to refresh threw scope issues which I did not know how to solve.
If you encounter TypeError: 'Request' object is not callable, you are using requests.Request() and not google.auth.transport.requests.Request().
Hoping this saves someone else from chasing the obvious for far too long...
Hi,
I want to create a signed URL with which I can upload a file to GCS. Is just passing "POST" as the method parameter enough?
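For what it's worth, in GCS signed URLs POST is typically reserved for initiating resumable uploads (with an x-goog-resumable: start header); a simple direct upload is usually signed for PUT with a matching Content-Type. A minimal sketch of the PUT variant, reusing the IDTokenCredentials trick from the gist (the helper name and bucket/file names are hypothetical; imports are kept inside the function so the sketch stays importable without GCP libraries):

```python
from datetime import timedelta


def make_upload_url(bucket_name, blob_name, content_type="application/octet-stream"):
    # Hypothetical helper; imports are local so the module itself has no GCP dependency.
    from google.auth import compute_engine
    from google.auth.transport import requests
    from google.cloud import storage

    auth_request = requests.Request()
    # Same trick as in the gist: sign via the IAM signBytes API, no private key needed.
    signing_credentials = compute_engine.IDTokenCredentials(auth_request, "")
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    # Sign for PUT: the uploader must then send the exact same Content-Type header.
    return blob.generate_signed_url(
        expiration=timedelta(minutes=15),
        method="PUT",
        content_type=content_type,
        credentials=signing_credentials,
        version="v4",
    )
```

The client would then upload with, e.g., curl -X PUT -H "Content-Type: application/octet-stream" --upload-file local.bin "$SIGNED_URL".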
I had a similar issue to what @jessjenk described above:
TransportError: Error calling the IAM signBytes API: b'{\n "error": {\n "code": 400,\n "message": "Invalid service account email (default).",\n "status": "INVALID_ARGUMENT"\n }\n}\n'
However, her workaround did not fix it in my case: the value of storage_client._credentials.service_account_email was default, even though the node I was running it from had a different service account. I had to pass the service account email explicitly to make it work:
signing_credentials = compute_engine.IDTokenCredentials(
auth_request,
"",
service_account_email="[email protected]",
)
Does anyone know if it's OK to create an IDTokenCredentials instance once and use it for the lifetime of the application? I have browsed the implementation a bit: it has a refresh method, which made me think that either something will call that method when the token expires, or maybe I am supposed to call it myself?
However her workaround did not fix it in my case -- the value of storage_client._credentials.service_account_email was default, even though the node I was running it from had a different service account.
Actually, you need to make some API call with your client first so that service_account_email is automatically set to the real default email. So, for instance, this should work:
storage_client = storage.Client()
my_bucket = storage_client.get_bucket('my_bucket')
signing_credentials = compute_engine.IDTokenCredentials(
auth_request,
"",
service_account_email=storage_client._credentials.service_account_email)
or if you don't like using a private attribute, you can be more explicit:
import google.auth
from google.cloud import storage

credentials, project = google.auth.default()
storage_client = storage.Client(project, credentials)
my_bucket = storage_client.get_bucket('my_bucket')
signing_credentials = compute_engine.IDTokenCredentials(
    auth_request,
    "",
    service_account_email=credentials.service_account_email)
"""Generating a downloadable GET link (that expires in 30 minutes) for a file in a bucket."""
from google.auth.transport import requests
from google.auth import compute_engine
from datetime import datetime, timedelta
from google.cloud import storage
auth_request = requests.Request()
storage_client = storage.Client()
data_bucket = storage_client.bucket("BUCKET_NAME")
blob = data_bucket.get_blob("FILENAME")
expires_at_ms = datetime.now() + timedelta(minutes=30)
signing_credentials = compute_engine.IDTokenCredentials(auth_request, "")
signed_url = blob.generate_signed_url(expires_at_ms, credentials=signing_credentials)
I am using App Engine Python 3 and this worked for me; however, I had to add the IAM Service Account Token Creator role to my App Engine default service account [email protected], else it showed the error INFO:root:Error calling the IAM signBytes API: b'{\n "error": {\n "code": 403,\n "message": "The caller does not have permission",\n "status": "PERMISSION_DENIED"\n }\n}\n'. The role apparently enables the account to sign blobs.
Thanks @deven96 , this was very useful. Works for me too (oct-2021).
@deven96 this works - thank you! (dec-2021).
One more note for future readers: the solution works in the cloud environment. In a local environment, I get google.auth.exceptions.TransportError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true from the Google Compute Engine metadata service.
To work around this locally, I use the service account JSON key.
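That workaround can be sketched as follows, assuming a downloaded service account key file (the path and helper name are hypothetical; imports are local so the sketch reads without google-cloud-storage installed). With a private key on hand, signing happens entirely offline, so neither the metadata server nor the IAM signBytes API is involved:

```python
from datetime import timedelta


def make_local_signed_url(bucket_name, blob_name, key_path="service-account.json"):
    # key_path is a hypothetical local path to a downloaded JSON key file.
    from google.cloud import storage

    client = storage.Client.from_service_account_json(key_path)
    blob = client.bucket(bucket_name).blob(blob_name)
    # The key file contains a private key, so no credentials= override is
    # needed: the URL is signed locally, with no network call to IAM.
    return blob.generate_signed_url(expiration=timedelta(minutes=30), version="v4")
```

This is convenient for local development, but note the caveat raised below: a long-lived private key file is its own security liability.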
Someone please correct me if I'm wrong, but this ends up making an HTTP request each time it needs to sign, right? And a synchronous one at that.
@adriangb This is my understanding as well. But there's a way to sign completely offline AFAIR.
Not without a private key, which is super problematic both for production and local development.
If you need to accomplish this task within a Firebase Cloud function using .onCall(), please follow the steps outlined below:
from firebase_admin import storage
from google.auth import compute_engine
from google.auth.transport.requests import Request
from datetime import timedelta
def your_function(parameter):
    # Create an authentication request
    auth_request = Request()

    # Get your IDTokenCredentials
    signing_credentials = compute_engine.IDTokenCredentials(
        auth_request,
        "",
        service_account_email='<ADD YOUR SERVICE ACCOUNT MAIL(Principal)>'
    )

    # Get your storage bucket
    data_bucket = storage.bucket('<YOUR BUCKET>')

    # Generate a signed URL for your bucket
    blob = data_bucket.blob(parameter)
    url = blob.generate_signed_url(
        expiration=timedelta(days=7),
        credentials=signing_credentials,
        version="v4"
    )
    return url
Remember to replace '<ADD YOUR SERVICE ACCOUNT MAIL(Principal)>' and '<YOUR BUCKET>' with your actual service account email and bucket name, respectively.
If all of the above did not work for you, this worked for me:
https://stackoverflow.com/a/78647388/9713831
For me, I needed to do,