CompactParquetFiles
# Read the S3 folder as glue dynamic data frames
input_dyf = glueContext.create_dynamic_frame_from_options("s3", {
        "paths": [ inputPath ],
        "recurse": True,
        "groupFiles": "inPartition"
    },
    format = "parquet"
)

# Repartition them as required
repartitionedDYF = input_dyf.repartition(numberOfPartitions)

# Write them as glue parquet files to boost performance.
# Note: glueparquet files are compatible with parquet files and can be read
# by any tool that reads a parquet file.
glueContext.write_dynamic_frame.from_options(
    frame = repartitionedDYF,
    connection_type = "s3",
    connection_options = {"path": outputPath},
    format = "glueparquet"
)
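
For completeness, here is a minimal sketch of how this snippet could sit inside a full Glue job script. It assumes the job is started with --input_path, --output_path, and --num_partitions arguments (hypothetical parameter names; the gist itself only assumes that glueContext, inputPath, outputPath, and numberOfPartitions are already defined):

import sys
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions

# Hypothetical job arguments; the names are illustrative only.
args = getResolvedOptions(sys.argv, ["JOB_NAME", "input_path", "output_path", "num_partitions"])

sc = SparkContext()
glueContext = GlueContext(sc)
job = Job(glueContext)
job.init(args["JOB_NAME"], args)

inputPath = args["input_path"]
outputPath = args["output_path"]
numberOfPartitions = int(args["num_partitions"])

# Read, repartition, and write back, as in the snippet above.
input_dyf = glueContext.create_dynamic_frame_from_options(
    "s3",
    {"paths": [inputPath], "recurse": True, "groupFiles": "inPartition"},
    format="parquet"
)
repartitionedDYF = input_dyf.repartition(numberOfPartitions)
glueContext.write_dynamic_frame.from_options(
    frame=repartitionedDYF,
    connection_type="s3",
    connection_options={"path": outputPath},
    format="glueparquet"
)

job.commit()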
As per the AWS documentation (https://docs.aws.amazon.com/glue/latest/dg/grouping-input-files.html), groupFiles is supported only for DynamicFrames created from the csv, ion, grokLog, json, and xml data formats; it is not supported for avro, parquet, or orc. Did your code work?