"""
Demo script for writing a pandas data frame to a CSV file on S3 using s3fs-supported pandas APIs
"""
import os

import pandas as pd

# The target bucket and AWS credentials are read from the environment.
AWS_S3_BUCKET = os.getenv("AWS_S3_BUCKET")
AWS_ACCESS_KEY_ID = os.getenv("AWS_ACCESS_KEY_ID")
AWS_SECRET_ACCESS_KEY = os.getenv("AWS_SECRET_ACCESS_KEY")
AWS_SESSION_TOKEN = os.getenv("AWS_SESSION_TOKEN")

books_df = pd.DataFrame(
    data={"Title": ["Book I", "Book II", "Book III"], "Price": [56.6, 59.87, 74.54]},
    columns=["Title", "Price"],
)

key = "files/books.csv"

# Writing to an s3:// URL requires the s3fs package and pandas >= 1.2
# (which introduced the storage_options parameter). The credentials are
# passed through storage_options directly to s3fs.
books_df.to_csv(
    f"s3://{AWS_S3_BUCKET}/{key}",
    index=False,
    storage_options={
        "key": AWS_ACCESS_KEY_ID,
        "secret": AWS_SECRET_ACCESS_KEY,
        "token": AWS_SESSION_TOKEN,
    },
)
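
# Optional sanity check (not part of the original gist): the same object
# can be read back with pd.read_csv and the same storage_options, again
# assuming s3fs is installed and the credentials grant read access.
books_read_df = pd.read_csv(
    f"s3://{AWS_S3_BUCKET}/{key}",
    storage_options={
        "key": AWS_ACCESS_KEY_ID,
        "secret": AWS_SECRET_ACCESS_KEY,
        "token": AWS_SESSION_TOKEN,
    },
)
print(books_read_df)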