show logfile metadata --logFilePathPattern /tmp/hudi_trips_cow/.a5619a76-5896-4c48-b568-f63212bab228-0_20210513102613.log.1_0-41-440

╔════════════════╤═════════════╤═════════════════╤═════════════════════════════╤════════════════╗
║ InstantTime    │ RecordCount │ BlockType       │ HeaderMetadata              │ FooterMetadata ║
╠════════════════╪═════════════╪═════════════════╪═════════════════════════════╪════════════════╣
║ 20210513102646 │ 3           │ AVRO_DATA_BLOCK │ (full header printed below) │ {}             ║
╚════════════════╧═════════════╧═════════════════╧═════════════════════════════╧════════════════╝

HeaderMetadata: {"SCHEMA":"{\"type\":\"record\",\"name\":\"hudi_trips_cow_record\",\"namespace\":\"hoodie.hudi_trips_cow\",\"fields\":[{\"name\":\"_hoodie_commit_time\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null},{\"name\":\"_hoodie_commit_seqno\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null},{\"name\":\"_hoodie_record_key\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null},{\"name\":\"_hoodie_partition_path\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null},{\"name\":\"_hoodie_file_name\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null},{\"name\":\"typeId\",\"type\":\"int\"},{\"name\":\"eventTime\",\"type\":[\"null\",{\"type\":\"long\",\"logicalType\":\"timestamp-micros\"}],\"default\":null},{\"name\":\"preComb\",\"type\":\"long\"}]}","INSTANT_TIME":"20210513102646"}
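The SCHEMA entry in HeaderMetadata is a plain Avro schema string, so it can be sanity-checked with the standard Avro parser. Below is a minimal sketch (my addition, with the schema trimmed down to just the eventTime field) confirming that the timestamp-micros logical type is carried in the log block header:

import org.apache.avro.Schema

// trimmed copy of the SCHEMA string from the header above (only eventTime kept)
val schemaJson =
  """{"type":"record","name":"hudi_trips_cow_record","namespace":"hoodie.hudi_trips_cow",
    |"fields":[{"name":"eventTime","type":["null",{"type":"long","logicalType":"timestamp-micros"}],"default":null}]}""".stripMargin

val schema = new Schema.Parser().parse(schemaJson)
// the field is a union of ["null", long/timestamp-micros]; index 1 is the non-null branch
val eventTimeType = schema.getField("eventTime").schema().getTypes.get(1)
assert(eventTimeType.getLogicalType.getName == "timestamp-micros")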
// final output
scala> spark.sql("describe hudi_trips_snapshot").show()
+--------------------+---------+-------+
|            col_name|data_type|comment|
+--------------------+---------+-------+
| _hoodie_commit_time|   string|   null|
|_hoodie_commit_seqno|   string|   null|
|  _hoodie_record_key|   string|   null|
|_hoodie_partition...|   string|   null|
|   _hoodie_file_name|   string|   null|
|              typeId|      int|   null|
|           eventTime|timestamp|   null|
|             preComb|   bigint|   null|
+--------------------+---------+-------+

scala> spark.sql("select * from hudi_trips_snapshot").show()
+-------------------+--------------------+------------------+----------------------+--------------------+------+-------------------+-------+
|_hoodie_commit_time|_hoodie_commit_seqno|_hoodie_record_key|_hoodie_partition_path|   _hoodie_file_name|typeId|          eventTime|preComb|
+-------------------+--------------------+------------------+----------------------+--------------------+------+-------------------+-------+
|     20210513102649|  20210513102649_0_5|                 1|                      |a5619a76-5896-4c4...|     1|2020-01-01 23:00:01|      5|
|     20210513102649|  20210513102649_0_4|                 2|                      |a5619a76-5896-4c4...|     2|2020-12-29 09:54:00|      6|
|     20210513102649|  20210513102649_0_6|                 3|                      |a5619a76-5896-4c4...|     3|2021-05-09 10:12:43|      7|
+-------------------+--------------------+------------------+----------------------+--------------------+------+-------------------+-------+

scala> val hudi_df = spark.sql("select * from hudi_trips_snapshot")
hudi_df: org.apache.spark.sql.DataFrame = [_hoodie_commit_time: string, _hoodie_commit_seqno: string ... 6 more fields]

scala> hudi_df.schema
res15: org.apache.spark.sql.types.StructType = StructType(StructField(_hoodie_commit_time,StringType,true), StructField(_hoodie_commit_seqno,StringType,true), StructField(_hoodie_record_key,StringType,true), StructField(_hoodie_partition_path,StringType,true), StructField(_hoodie_file_name,StringType,true), StructField(typeId,IntegerType,false), StructField(eventTime,TimestampType,true), StructField(preComb,LongType,false)))
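As a quick follow-up to the schema printed above (my addition, not part of the captured session), the same point can be checked programmatically against the DataFrame:

import org.apache.spark.sql.types.TimestampType
// eventTime should come back as a Spark timestamp (not a raw long) after the MOR upsert
assert(hudi_df.schema("eventTime").dataType == TimestampType)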
// using a 0.9.0-SNAPSHOT build from master, Scala 2.11 and Spark 2
import org.apache.hudi.QuickstartUtils._
import scala.collection.JavaConversions._
import org.apache.spark.sql.SaveMode._
import org.apache.hudi.DataSourceReadOptions._
import org.apache.hudi.DataSourceWriteOptions._
import org.apache.hudi.config.HoodieWriteConfig._
import org.apache.spark.sql.types._
import org.apache.spark.sql.Row
import java.sql.Timestamp
import spark.implicits._

val tableName = "hudi_trips_cow"
val basePath = "file:///tmp/hudi_trips_cow"

// initial insert: two rows with java.sql.Timestamp values, written to a non-partitioned MOR table
val df = Seq(
  (1, Timestamp.valueOf("2014-01-01 23:00:01"), 0L),
  (2, Timestamp.valueOf("2016-12-29 09:54:00"), 2L)).toDF("typeId", "eventTime", "preComb")
df.schema
df.write.format("hudi").
  options(getQuickstartWriteConfigs).
  option(RECORDKEY_FIELD_OPT_KEY, "typeId").
  option(PRECOMBINE_FIELD_OPT_KEY, "preComb").
  option(KEYGENERATOR_CLASS_OPT_KEY, "org.apache.hudi.keygen.NonpartitionedKeyGenerator").
  option("hoodie.index.type", "SIMPLE").
  option(TABLE_TYPE_OPT_KEY, "MERGE_ON_READ").
  option(TABLE_NAME, tableName).
  option(OPERATION_OPT_KEY, "insert").
  mode(Overwrite).
  save(basePath)

// snapshot query after the insert
var tripsSnapshotDF1 = spark.
  read.
  format("hudi").
  load(basePath + "/*")
tripsSnapshotDF1.createOrReplaceTempView("hudi_trips_snapshot")
spark.sql("describe hudi_trips_snapshot").show()
spark.sql("select * from hudi_trips_snapshot").show()

// upsert: new eventTime/preComb values for keys 1 and 2, plus a new key 3;
// on a MOR table the updated rows land in log files rather than rewriting the base file
var df1 = Seq(
  (1, Timestamp.valueOf("2020-01-01 23:00:01"), 5L),
  (2, Timestamp.valueOf("2020-12-29 09:54:00"), 6L),
  (3, Timestamp.valueOf("2021-05-09 10:12:43"), 7L)
).toDF("typeId", "eventTime", "preComb")
df1.write.format("hudi").
  options(getQuickstartWriteConfigs).
  option(RECORDKEY_FIELD_OPT_KEY, "typeId").
  option(PRECOMBINE_FIELD_OPT_KEY, "preComb").
  option(KEYGENERATOR_CLASS_OPT_KEY, "org.apache.hudi.keygen.NonpartitionedKeyGenerator").
  option("hoodie.index.type", "SIMPLE").
  option(TABLE_TYPE_OPT_KEY, "MERGE_ON_READ").
  option(TABLE_NAME, tableName).
  option(OPERATION_OPT_KEY, "upsert").
  option("hoodie.combine.before.upsert", "false").
  mode(Append).
  save(basePath)

// ensure log files exist in the base path; if not, trigger another df1.write before re-reading
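// (my addition, not in the original gist) a quick way to do that check from the shell,
// assuming the non-partitioned layout above, i.e. log files sit directly under basePath
import org.apache.hadoop.fs.Path
val fs = new Path(basePath).getFileSystem(spark.sparkContext.hadoopConfiguration)
val hasLogFiles = fs.listStatus(new Path(basePath)).exists(_.getPath.getName.contains(".log."))
println(s"log files present: $hasLogFiles")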
// snapshot query after the upsert: should return the updated eventTime/preComb values
tripsSnapshotDF1 = spark.
  read.
  format("hudi").
  load(basePath + "/*")
tripsSnapshotDF1.createOrReplaceTempView("hudi_trips_snapshot")
spark.sql("describe hudi_trips_snapshot").show()
spark.sql("select * from hudi_trips_snapshot").show()