@nsivabalan
Created August 4, 2021 14:25
[INFO] T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.apache.hudi.TestConvertFilterToCatalystExpression
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.048 s - in org.apache.hudi.TestConvertFilterToCatalystExpression
[INFO] Running org.apache.hudi.TestDataSourceUtils
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/nsb/.m2/repository/org/slf4j/slf4j-log4j12/1.7.16/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/nsb/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.48 s - in org.apache.hudi.TestDataSourceUtils
[INFO] Running org.apache.hudi.TestMergeOnReadSnapshotRelation
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.031 s - in org.apache.hudi.TestMergeOnReadSnapshotRelation
[INFO] Running org.apache.hudi.TestDataSourceDefaults
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.553 s - in org.apache.hudi.TestDataSourceDefaults
[INFO] Running org.apache.hudi.payload.TestAWSDmsAvroPayload
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.005 s - in org.apache.hudi.payload.TestAWSDmsAvroPayload
[INFO] Running org.apache.hudi.TestHoodieSparkUtils
0 [main] WARN org.apache.spark.util.Utils - Your hostname, Sivabalans-MacBook-Pro.local resolves to a loopback address: 127.0.0.1; using 10.0.0.202 instead (on interface en0)
2 [main] WARN org.apache.spark.util.Utils - Set SPARK_LOCAL_IP if you need to bind to another address
428 [main] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.867 s - in org.apache.hudi.TestHoodieSparkUtils
[INFO] Running org.apache.hudi.TestQuickstartUtils
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.011 s - in org.apache.hudi.TestQuickstartUtils
[INFO] Running org.apache.hudi.client.TestBootstrap
root
|-- timestamp: long (nullable = true)
|-- _row_key: string (nullable = true)
|-- partition_path: string (nullable = true)
|-- rider: string (nullable = true)
|-- driver: string (nullable = true)
|-- begin_lat: double (nullable = true)
|-- begin_lon: double (nullable = true)
|-- end_lat: double (nullable = true)
|-- end_lon: double (nullable = true)
|-- fare: struct (nullable = true)
| |-- amount: double (nullable = true)
| |-- currency: string (nullable = true)
|-- tip_history: array (nullable = true)
| |-- element: struct (containsNull = true)
| | |-- amount: double (nullable = true)
| | |-- currency: string (nullable = true)
|-- _hoodie_is_deleted: boolean (nullable = true)
|-- datestr: string (nullable = true)
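Editor's note: the nested trip-record schema printed above could be declared roughly as follows with Spark's StructType API. This is only an illustrative sketch reconstructed from the printSchema() output in the log, not the schema definition actually used by TestBootstrap.

import org.apache.spark.sql.types._

// Struct reused for both `fare` and the elements of `tip_history`,
// per the printSchema() output above.
val fareType = StructType(Seq(
  StructField("amount", DoubleType, nullable = true),
  StructField("currency", StringType, nullable = true)))

val tripSchema = StructType(Seq(
  StructField("timestamp", LongType, nullable = true),
  StructField("_row_key", StringType, nullable = true),
  StructField("partition_path", StringType, nullable = true),
  StructField("rider", StringType, nullable = true),
  StructField("driver", StringType, nullable = true),
  StructField("begin_lat", DoubleType, nullable = true),
  StructField("begin_lon", DoubleType, nullable = true),
  StructField("end_lat", DoubleType, nullable = true),
  StructField("end_lon", DoubleType, nullable = true),
  StructField("fare", fareType, nullable = true),
  StructField("tip_history", ArrayType(fareType, containsNull = true), nullable = true),
  StructField("_hoodie_is_deleted", BooleanType, nullable = true),
  StructField("datestr", StringType, nullable = true)))

// A DataFrame built with tripSchema would reproduce the printSchema() output shown above.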
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
22210 [main] WARN org.apache.hudi.common.fs.FSUtils - try to delete instant file: [00000000000001__commit__COMPLETED]
root
|-- timestamp: long (nullable = true)
|-- _row_key: string (nullable = true)
|-- partition_path: string (nullable = true)
|-- rider: string (nullable = true)
|-- driver: string (nullable = true)
|-- begin_lat: double (nullable = true)
|-- begin_lon: double (nullable = true)
|-- end_lat: double (nullable = true)
|-- end_lon: double (nullable = true)
|-- fare: struct (nullable = true)
| |-- amount: double (nullable = true)
| |-- currency: string (nullable = true)
|-- tip_history: array (nullable = true)
| |-- element: struct (containsNull = true)
| | |-- amount: double (nullable = true)
| | |-- currency: string (nullable = true)
|-- _hoodie_is_deleted: boolean (nullable = true)
|-- datestr: string (nullable = true)
root
|-- timestamp: long (nullable = true)
|-- _row_key: string (nullable = true)
|-- partition_path: string (nullable = true)
|-- rider: string (nullable = true)
|-- driver: string (nullable = true)
|-- begin_lat: double (nullable = true)
|-- begin_lon: double (nullable = true)
|-- end_lat: double (nullable = true)
|-- end_lon: double (nullable = true)
|-- fare: struct (nullable = true)
| |-- amount: double (nullable = true)
| |-- currency: string (nullable = true)
|-- tip_history: array (nullable = true)
| |-- element: struct (containsNull = true)
| | |-- amount: double (nullable = true)
| | |-- currency: string (nullable = true)
|-- _hoodie_is_deleted: boolean (nullable = true)
|-- datestr: string (nullable = true)
52323 [main] WARN org.apache.hudi.common.fs.FSUtils - try to delete instant file: [00000000000001__commit__COMPLETED]
root
|-- timestamp: long (nullable = true)
|-- _row_key: string (nullable = true)
|-- partition_path: string (nullable = true)
|-- rider: string (nullable = true)
|-- driver: string (nullable = true)
|-- begin_lat: double (nullable = true)
|-- begin_lon: double (nullable = true)
|-- end_lat: double (nullable = true)
|-- end_lon: double (nullable = true)
|-- fare: struct (nullable = true)
| |-- amount: double (nullable = true)
| |-- currency: string (nullable = true)
|-- tip_history: array (nullable = true)
| |-- element: struct (containsNull = true)
| | |-- amount: double (nullable = true)
| | |-- currency: string (nullable = true)
|-- _hoodie_is_deleted: boolean (nullable = true)
|-- datestr: string (nullable = true)
root
|-- timestamp: long (nullable = true)
|-- _row_key: string (nullable = true)
|-- partition_path: string (nullable = true)
|-- rider: string (nullable = true)
|-- driver: string (nullable = true)
|-- begin_lat: double (nullable = true)
|-- begin_lon: double (nullable = true)
|-- end_lat: double (nullable = true)
|-- end_lon: double (nullable = true)
|-- fare: struct (nullable = true)
| |-- amount: double (nullable = true)
| |-- currency: string (nullable = true)
|-- tip_history: array (nullable = true)
| |-- element: struct (containsNull = true)
| | |-- amount: double (nullable = true)
| | |-- currency: string (nullable = true)
|-- _hoodie_is_deleted: boolean (nullable = true)
|-- datestr: string (nullable = true)
77215 [main] WARN org.apache.hudi.common.fs.FSUtils - try to delete instant file: [00000000000001__deltacommit__COMPLETED]
root
|-- timestamp: long (nullable = true)
|-- _row_key: string (nullable = true)
|-- partition_path: string (nullable = true)
|-- rider: string (nullable = true)
|-- driver: string (nullable = true)
|-- begin_lat: double (nullable = true)
|-- begin_lon: double (nullable = true)
|-- end_lat: double (nullable = true)
|-- end_lon: double (nullable = true)
|-- fare: struct (nullable = true)
| |-- amount: double (nullable = true)
| |-- currency: string (nullable = true)
|-- tip_history: array (nullable = true)
| |-- element: struct (containsNull = true)
| | |-- amount: double (nullable = true)
| | |-- currency: string (nullable = true)
|-- _hoodie_is_deleted: boolean (nullable = true)
|-- datestr: string (nullable = true)
root
|-- timestamp: long (nullable = true)
|-- _row_key: string (nullable = true)
|-- partition_path: string (nullable = true)
|-- rider: string (nullable = true)
|-- driver: string (nullable = true)
|-- begin_lat: double (nullable = true)
|-- begin_lon: double (nullable = true)
|-- end_lat: double (nullable = true)
|-- end_lon: double (nullable = true)
|-- fare: struct (nullable = true)
| |-- amount: double (nullable = true)
| |-- currency: string (nullable = true)
|-- tip_history: array (nullable = true)
| |-- element: struct (containsNull = true)
| | |-- amount: double (nullable = true)
| | |-- currency: string (nullable = true)
|-- _hoodie_is_deleted: boolean (nullable = true)
|-- datestr: string (nullable = true)
102479 [main] WARN org.apache.hudi.common.fs.FSUtils - try to delete instant file: [00000000000002__deltacommit__COMPLETED]
root
|-- timestamp: long (nullable = true)
|-- _row_key: string (nullable = true)
|-- partition_path: string (nullable = true)
|-- rider: string (nullable = true)
|-- driver: string (nullable = true)
|-- begin_lat: double (nullable = true)
|-- begin_lon: double (nullable = true)
|-- end_lat: double (nullable = true)
|-- end_lon: double (nullable = true)
|-- fare: struct (nullable = true)
| |-- amount: double (nullable = true)
| |-- currency: string (nullable = true)
|-- tip_history: array (nullable = true)
| |-- element: struct (containsNull = true)
| | |-- amount: double (nullable = true)
| | |-- currency: string (nullable = true)
|-- _hoodie_is_deleted: boolean (nullable = true)
|-- datestr: string (nullable = true)
root
|-- timestamp: long (nullable = true)
|-- _row_key: string (nullable = true)
|-- partition_path: string (nullable = true)
|-- rider: string (nullable = true)
|-- driver: string (nullable = true)
|-- begin_lat: double (nullable = true)
|-- begin_lon: double (nullable = true)
|-- end_lat: double (nullable = true)
|-- end_lon: double (nullable = true)
|-- fare: struct (nullable = true)
| |-- amount: double (nullable = true)
| |-- currency: string (nullable = true)
|-- tip_history: array (nullable = true)
| |-- element: struct (containsNull = true)
| | |-- amount: double (nullable = true)
| | |-- currency: string (nullable = true)
|-- _hoodie_is_deleted: boolean (nullable = true)
|-- datestr: string (nullable = true)
119444 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F02)
120040 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F02)
120202 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F02)
120740 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F02)
121260 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F02)
121436 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F02)
121916 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F02)
122614 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F02)
122840 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F02)
123031 [main] WARN org.apache.hudi.common.fs.FSUtils - try to delete instant file: [00000000000002__commit__COMPLETED]
128906 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
128912 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
129306 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
129311 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
129397 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
129400 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
129795 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
129801 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
130181 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
130188 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
130278 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
130281 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
130672 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
130680 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
131043 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
131052 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
131142 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
131145 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
root
|-- timestamp: long (nullable = true)
|-- _row_key: string (nullable = true)
|-- partition_path: string (nullable = true)
|-- rider: string (nullable = true)
|-- driver: string (nullable = true)
|-- begin_lat: double (nullable = true)
|-- begin_lon: double (nullable = true)
|-- end_lat: double (nullable = true)
|-- end_lon: double (nullable = true)
|-- fare: struct (nullable = true)
| |-- amount: double (nullable = true)
| |-- currency: string (nullable = true)
|-- tip_history: array (nullable = true)
| |-- element: struct (containsNull = true)
| | |-- amount: double (nullable = true)
| | |-- currency: string (nullable = true)
|-- _hoodie_is_deleted: boolean (nullable = true)
|-- datestr: string (nullable = true)
133111 [qtp1388983654-7605] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
133114 [qtp1388983654-7602] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
137756 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
137762 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
138020 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
138025 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
138030 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
138033 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
138311 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
138316 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
138558 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
138563 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
138569 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
138571 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
138921 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
138928 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
139153 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
139158 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
139164 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
139167 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
root
|-- timestamp: long (nullable = true)
|-- _row_key: string (nullable = true)
|-- partition_path: string (nullable = true)
|-- rider: string (nullable = true)
|-- driver: string (nullable = true)
|-- begin_lat: double (nullable = true)
|-- begin_lon: double (nullable = true)
|-- end_lat: double (nullable = true)
|-- end_lon: double (nullable = true)
|-- fare: struct (nullable = true)
| |-- amount: double (nullable = true)
| |-- currency: string (nullable = true)
|-- tip_history: array (nullable = true)
| |-- element: struct (containsNull = true)
| | |-- amount: double (nullable = true)
| | |-- currency: string (nullable = true)
|-- _hoodie_is_deleted: boolean (nullable = true)
|-- datestr: string (nullable = true)
143082 [main] WARN org.apache.hudi.common.fs.FSUtils - try to delete instant file: [00000000000002__commit__COMPLETED]
root
|-- timestamp: long (nullable = true)
|-- _row_key: string (nullable = true)
|-- partition_path: string (nullable = true)
|-- rider: string (nullable = true)
|-- driver: string (nullable = true)
|-- begin_lat: double (nullable = true)
|-- begin_lon: double (nullable = true)
|-- end_lat: double (nullable = true)
|-- end_lon: double (nullable = true)
|-- fare: struct (nullable = true)
| |-- amount: double (nullable = true)
| |-- currency: string (nullable = true)
|-- tip_history: array (nullable = true)
| |-- element: struct (containsNull = true)
| | |-- amount: double (nullable = true)
| | |-- currency: string (nullable = true)
|-- _hoodie_is_deleted: boolean (nullable = true)
|-- datestr: string (nullable = true)
root
|-- timestamp: long (nullable = true)
|-- _row_key: string (nullable = true)
|-- partition_path: string (nullable = true)
|-- rider: string (nullable = true)
|-- driver: string (nullable = true)
|-- begin_lat: double (nullable = true)
|-- begin_lon: double (nullable = true)
|-- end_lat: double (nullable = true)
|-- end_lon: double (nullable = true)
|-- fare: struct (nullable = true)
| |-- amount: double (nullable = true)
| |-- currency: string (nullable = true)
|-- tip_history: array (nullable = true)
| |-- element: struct (containsNull = true)
| | |-- amount: double (nullable = true)
| | |-- currency: string (nullable = true)
|-- _hoodie_is_deleted: boolean (nullable = true)
|-- datestr: string (nullable = true)
156279 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F02)
156749 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F02)
156902 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F02)
157279 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F02)
157706 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F02)
157865 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F02)
158239 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F02)
158603 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F02)
158763 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F02)
158902 [main] WARN org.apache.hudi.common.fs.FSUtils - try to delete instant file: [00000000000002__deltacommit__COMPLETED]
162642 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
162664 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
162980 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
162983 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
163058 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
163065 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
163689 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
163701 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
164277 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
164285 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
164389 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
164398 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
164770 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
164775 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
165050 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
165056 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
165135 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
165142 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
root
|-- timestamp: long (nullable = true)
|-- _row_key: string (nullable = true)
|-- partition_path: string (nullable = true)
|-- rider: string (nullable = true)
|-- driver: string (nullable = true)
|-- begin_lat: double (nullable = true)
|-- begin_lon: double (nullable = true)
|-- end_lat: double (nullable = true)
|-- end_lon: double (nullable = true)
|-- fare: struct (nullable = true)
| |-- amount: double (nullable = true)
| |-- currency: string (nullable = true)
|-- tip_history: array (nullable = true)
| |-- element: struct (containsNull = true)
| | |-- amount: double (nullable = true)
| | |-- currency: string (nullable = true)
|-- _hoodie_is_deleted: boolean (nullable = true)
|-- datestr: string (nullable = true)
167781 [qtp1359642704-9517] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
167781 [qtp1359642704-9516] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
171351 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
171358 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
171737 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
171744 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
171831 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
171842 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
172219 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
172225 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
172587 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
172599 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
172681 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
172691 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
173040 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
173045 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
173341 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
173347 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
173431 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
173441 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
173885 [qtp1359642704-9196] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
173885 [qtp1359642704-9514] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
175902 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
175909 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
176145 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
176151 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
176158 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
176167 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
176482 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
176490 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
176747 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
176755 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
176763 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
176774 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
177088 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
177096 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
177340 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
177347 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
177354 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F03)
177365 [main] WARN org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex - No value found for partition key (datestr=2020%252F04%252F01)
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 174.787 s - in org.apache.hudi.client.TestBootstrap
[INFO] Running org.apache.hudi.functional.TestEmptyCommit
178294 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
179297 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.727 s - in org.apache.hudi.functional.TestEmptyCommit
[INFO] Running org.apache.hudi.functional.TestMORDataSource
180045 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit2510720680826430950/dataset/.hoodie/metadata
180065 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit2510720680826430950/dataset/.hoodie/metadata
180072 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadataWriter - Cannot bootstrap metadata table as operation is in progress: [==>20210804095950__deltacommit__REQUESTED]
180073 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit2510720680826430950/dataset/.hoodie/metadata
181446 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit2510720680826430950/dataset/.hoodie/metadata
181448 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit2510720680826430950/dataset/.hoodie/metadata
181658 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit2510720680826430950/dataset/.hoodie/metadata
181661 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit2510720680826430950/dataset/.hoodie/metadata
181666 [qtp1970943023-10248] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit2510720680826430950/dataset/.hoodie/metadata
181670 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit2510720680826430950/dataset/.hoodie/metadata
183113 [main] WARN org.apache.spark.util.Utils - Truncated the string representation of a plan since it was too large. This behavior can be adjusted by setting 'spark.debug.maxToStringFields' in SparkEnv.conf.
193381 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
201765 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+------------------+-------------------+-------------+------------+-------------+------------------+----------+------------------+------------------+--------------------+----------------+--------+----------+--------------+---------+-------------------+---------+--------------------+---------+
|_hoodie_commit_time|_hoodie_commit_seqno| _hoodie_record_key|_hoodie_partition_path| _hoodie_file_name|_hoodie_is_deleted| _row_key| begin_lat| begin_lon|city_to_state|current_date| current_ts|distance_in_meters| driver| end_lat| end_lon| fare| height| nation| partition|partition_path| rider|seconds_since_epoch|timestamp| tip_history| weight|
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+------------------+-------------------+-------------+------------+-------------+------------------+----------+------------------+------------------+--------------------+----------------+--------+----------+--------------+---------+-------------------+---------+--------------------+---------+
| 20210804100012|20210804100012_2_...|7f9c76cd-a0db-425...| 2015/03/17|d7dae667-4a42-470...| false|7f9c76cd-a0db-425...|0.6315342499363908|0.16482743364319596| [CA]| 18843|1628085611874| 1056857216|driver-001|0.4340895130553858|0.3783025995311793|[67.9076343665697...|[0, 0, 6, 0, -8]|[Canada]|2015/03/17| 2015/03/17|rider-001|7137069777685952782| 0|[[60.080629670001...|0.4314068|
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+------------------+-------------------+-------------+------------+-------------+------------------+----------+------------------+------------------+--------------------+----------------+--------+----------+--------------+---------+-------------------+---------+--------------------+---------+
only showing top 1 row
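Editor's note: the tables above and below come from Dataset.show() calls inside the test. A minimal sketch of a query that would produce similar truncated output, including the "only showing top 1 row" footer, follows; the base path is hypothetical, and depending on the Hudi version the load path may need partition globs.

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[2]")
  .appName("hudi-snapshot-read")
  .getOrCreate()

// Hypothetical location of a Hudi table written by a test run.
val basePath = "/tmp/hudi_trips_mor"

// Snapshot read of the table; show(1) truncates long values to 20 characters
// and prints "only showing top 1 row", matching the log output.
val snapshotDF = spark.read.format("hudi").load(basePath)
snapshotDF.show(1)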
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+-------------------+------------------+-------------+------------+-------------+------------------+----------+-----------------+------------------+--------------------+-------------------+--------+----------+--------------+---------+-------------------+---------+--------------------+----------+
|_hoodie_commit_time|_hoodie_commit_seqno| _hoodie_record_key|_hoodie_partition_path| _hoodie_file_name|_hoodie_is_deleted| _row_key| begin_lat| begin_lon|city_to_state|current_date| current_ts|distance_in_meters| driver| end_lat| end_lon| fare| height| nation| partition|partition_path| rider|seconds_since_epoch|timestamp| tip_history| weight|
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+-------------------+------------------+-------------+------------+-------------+------------------+----------+-----------------+------------------+--------------------+-------------------+--------+----------+--------------+---------+-------------------+---------+--------------------+----------+
| 20210804100014|20210804100014_2_573|99dbd547-fe74-4f4...| 2015/03/17|d7dae667-4a42-470...| false|99dbd547-fe74-4f4...|0.15615342219168804|0.7541816209734739| [CA]| 18843|1628085613598| -983207909|driver-002|0.455616628316851|0.4391359804981412|[29.9530941470908...|[0, 0, 11, -85, 59]|[Canada]|2015/03/17| 2015/03/17|rider-002| 72235109169482157| 0|[[14.866596882454...|0.30283588|
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+-------------------+------------------+-------------+------------+-------------+------------------+----------+-----------------+------------------+--------------------+-------------------+--------+----------+--------------+---------+-------------------+---------+--------------------+----------+
only showing top 1 row
229430 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
234311 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
239696 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
251650 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
257364 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
267392 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
267855 [dispatcher-event-loop-6] WARN org.apache.spark.scheduler.TaskSetManager - Stage 0 contains a task of very large size (352 KB). The maximum recommended task size is 100 KB.
268069 [dispatcher-event-loop-4] WARN org.apache.spark.scheduler.TaskSetManager - Stage 1 contains a task of very large size (352 KB). The maximum recommended task size is 100 KB.
268303 [dispatcher-event-loop-3] WARN org.apache.spark.scheduler.TaskSetManager - Stage 2 contains a task of very large size (352 KB). The maximum recommended task size is 100 KB.
339369 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
343560 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
344023 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit6754697232724941965/dataset/.hoodie/metadata
344044 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit6754697232724941965/dataset/.hoodie/metadata
344047 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadataWriter - Cannot bootstrap metadata table as operation is in progress: [==>20210804100234__deltacommit__REQUESTED]
344048 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit6754697232724941965/dataset/.hoodie/metadata
345254 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit6754697232724941965/dataset/.hoodie/metadata
345257 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit6754697232724941965/dataset/.hoodie/metadata
345441 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit6754697232724941965/dataset/.hoodie/metadata
345443 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit6754697232724941965/dataset/.hoodie/metadata
345447 [qtp947917269-22367] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit6754697232724941965/dataset/.hoodie/metadata
345450 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit6754697232724941965/dataset/.hoodie/metadata
349635 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
349897 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1559019332568221431/dataset/.hoodie/metadata
349919 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1559019332568221431/dataset/.hoodie/metadata
349923 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadataWriter - Cannot bootstrap metadata table as operation is in progress: [==>20210804100239__deltacommit__REQUESTED]
349924 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1559019332568221431/dataset/.hoodie/metadata
351094 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1559019332568221431/dataset/.hoodie/metadata
351096 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1559019332568221431/dataset/.hoodie/metadata
351320 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1559019332568221431/dataset/.hoodie/metadata
351322 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1559019332568221431/dataset/.hoodie/metadata
351329 [qtp339416158-22781] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1559019332568221431/dataset/.hoodie/metadata
351334 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1559019332568221431/dataset/.hoodie/metadata
355646 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
359638 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+----------+-----------------+----------+---------+---------+
|_hoodie_commit_time|_hoodie_commit_seqno| _hoodie_record_key|_hoodie_partition_path| _hoodie_file_name|_hoodie_is_deleted| _row_key| driver| fare| partition| rider|timestamp|
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+----------+-----------------+----------+---------+---------+
| 20210804100250|20210804100250_2_...|32bceb6e-e0d9-46b...| 2015/03/17|8fa1d147-397b-43a...| false|32bceb6e-e0d9-46b...|driver-001|73.10832847420247|2015/03/17|rider-001| 0|
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+----------+-----------------+----------+---------+---------+
only showing top 1 row
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+----------+----------------+----------+---------+---------+
|_hoodie_commit_time|_hoodie_commit_seqno| _hoodie_record_key|_hoodie_partition_path| _hoodie_file_name|_hoodie_is_deleted| _row_key| driver| fare| partition| rider|timestamp|
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+----------+----------------+----------+---------+---------+
| 20210804100252|20210804100252_2_763|32bceb6e-e0d9-46b...| 2015/03/17|8fa1d147-397b-43a...| false|32bceb6e-e0d9-46b...|driver-002|23.8056569360648|2015/03/17|rider-002| 0|
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+----------+----------------+----------+---------+---------+
only showing top 1 row
364741 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
364975 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit6044580830939325324/dataset already exists. Deleting existing data & overwriting with new data.
367033 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+------------------+-------------------+-------------+------------+-------------+------------------+----------+------------------+-------------------+--------------------+-------------------+--------+----------+--------------+---------+--------------------+---------+--------------------+---------+
|_hoodie_commit_time|_hoodie_commit_seqno| _hoodie_record_key|_hoodie_partition_path| _hoodie_file_name|_hoodie_is_deleted| _row_key| begin_lat| begin_lon|city_to_state|current_date| current_ts|distance_in_meters| driver| end_lat| end_lon| fare| height| nation| partition|partition_path| rider| seconds_since_epoch|timestamp| tip_history| weight|
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+------------------+-------------------+-------------+------------+-------------+------------------+----------+------------------+-------------------+--------------------+-------------------+--------+----------+--------------+---------+--------------------+---------+--------------------+---------+
| 20210804100257|20210804100257_2_...|dcc704cb-3dfc-490...| 2015/03/17|e8b0d1fa-cbc0-485...| false|dcc704cb-3dfc-490...|0.6622667934333721|0.40307723620933456| [CA]| 18843|1628085777126| 1561595879|driver-001|0.0963352328826913|0.42267827366829325|[63.6319856405424...|[0, 0, 7, -87, -45]|[Canada]|2015/03/17| 2015/03/17|rider-001|-3764240240373317113| 0|[[56.349080862804...|0.5270361|
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+------------------+-------------------+-------------+------------+-------------+------------------+----------+------------------+-------------------+--------------------+-------------------+--------+----------+--------------+---------+--------------------+---------+--------------------+---------+
only showing top 1 row
+--------------------+------------------+--------+--------------------+
|_hoodie_commit_seqno| amount|currency| tip_history|
+--------------------+------------------+--------+--------------------+
|20210804100257_2_...|63.631985640542474| USD|[[56.349080862804...|
+--------------------+------------------+--------+--------------------+
only showing top 1 row
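Editor's note: the narrower tables (amount / currency / tip_history) are projections of the nested `fare` struct. A hedged sketch of such a projection, assuming the snapshotDF from the earlier sketch:

// Select nested struct fields by dotted path; Spark names the resulting
// columns "amount" and "currency", matching the headers above.
snapshotDF
  .select("_hoodie_commit_seqno", "fare.amount", "fare.currency", "tip_history")
  .show(1)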
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+------------------+-------------------+-------------+------------+-------------+------------------+----------+------------------+-------------------+--------------------+-------------------+--------+----------+--------------+---------+--------------------+---------+--------------------+---------+
|_hoodie_commit_time|_hoodie_commit_seqno| _hoodie_record_key|_hoodie_partition_path| _hoodie_file_name|_hoodie_is_deleted| _row_key| begin_lat| begin_lon|city_to_state|current_date| current_ts|distance_in_meters| driver| end_lat| end_lon| fare| height| nation| partition|partition_path| rider| seconds_since_epoch|timestamp| tip_history| weight|
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+------------------+-------------------+-------------+------------+-------------+------------------+----------+------------------+-------------------+--------------------+-------------------+--------+----------+--------------+---------+--------------------+---------+--------------------+---------+
| 20210804100257|20210804100257_2_...|dcc704cb-3dfc-490...| 2015/03/17|e8b0d1fa-cbc0-485...| false|dcc704cb-3dfc-490...|0.6622667934333721|0.40307723620933456| [CA]| 18843|1628085777126| 1561595879|driver-001|0.0963352328826913|0.42267827366829325|[63.6319856405424...|[0, 0, 7, -87, -45]|[Canada]|2015/03/17| 2015/03/17|rider-001|-3764240240373317113| 0|[[56.349080862804...|0.5270361|
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+------------------+-------------------+-------------+------------+-------------+------------------+----------+------------------+-------------------+--------------------+-------------------+--------+----------+--------------+---------+--------------------+---------+--------------------+---------+
only showing top 1 row
+--------------------+------------------+--------+--------------------+
|_hoodie_commit_seqno| amount|currency| tip_history|
+--------------------+------------------+--------+--------------------+
|20210804100257_2_...|63.631985640542474| USD|[[56.349080862804...|
+--------------------+------------------+--------+--------------------+
only showing top 1 row
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+-------------------+------------------+-------------+------------+-------------+------------------+----------+------------------+-------------------+--------------------+----------------+--------+----------+--------------+---------+-------------------+---------+--------------------+---------+
|_hoodie_commit_time|_hoodie_commit_seqno| _hoodie_record_key|_hoodie_partition_path| _hoodie_file_name|_hoodie_is_deleted| _row_key| begin_lat| begin_lon|city_to_state|current_date| current_ts|distance_in_meters| driver| end_lat| end_lon| fare| height| nation| partition|partition_path| rider|seconds_since_epoch|timestamp| tip_history| weight|
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+-------------------+------------------+-------------+------------+-------------+------------------+----------+------------------+-------------------+--------------------+----------------+--------+----------+--------------+---------+-------------------+---------+--------------------+---------+
| 20210804100257|20210804100257_1_...|69b288cc-0a5d-415...| 2015/03/16|1e045517-747a-4f8...| false|69b288cc-0a5d-415...|0.10992445731945999|0.7874704242964469| [CA]| 18843|1628085777126| -254578339|driver-001|0.7707505395832032|0.21966752744892826|[20.5552418771124...|[0, 0, 4, 0, 90]|[Canada]|2015/03/16| 2015/03/16|rider-001|1972906355628652078| 0|[[62.616605254988...|0.3053881|
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+-------------------+------------------+-------------+------------+-------------+------------------+----------+------------------+-------------------+--------------------+----------------+--------+----------+--------------+---------+-------------------+---------+--------------------+---------+
only showing top 1 row
+--------------------+------------------+--------+--------------------+
|_hoodie_commit_seqno| amount|currency| tip_history|
+--------------------+------------------+--------+--------------------+
|20210804100257_1_...|20.555241877112408| USD|[[62.616605254988...|
+--------------------+------------------+--------+--------------------+
only showing top 1 row
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+------------------+------------------+-------------+------------+-------------+------------------+----------+-------------------+----------------+--------------------+------------------+--------+----------+--------------+---------+-------------------+---------+--------------------+----------+
|_hoodie_commit_time|_hoodie_commit_seqno| _hoodie_record_key|_hoodie_partition_path| _hoodie_file_name|_hoodie_is_deleted| _row_key| begin_lat| begin_lon|city_to_state|current_date| current_ts|distance_in_meters| driver| end_lat| end_lon| fare| height| nation| partition|partition_path| rider|seconds_since_epoch|timestamp| tip_history| weight|
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+------------------+------------------+-------------+------------+-------------+------------------+----------+-------------------+----------------+--------------------+------------------+--------+----------+--------------+---------+-------------------+---------+--------------------+----------+
| 20210804100259|20210804100259_2_817|b85ebdaa-dd8c-46f...| 2015/03/17|e8b0d1fa-cbc0-485...| false|b85ebdaa-dd8c-46f...|0.2701100050821347|0.7686390187271622| [CA]| 18843|1628085778774| 1315236653|driver-002|0.35919236430729795|0.61757552806899|[33.4949270998965...|[0, 0, 8, -75, 72]|[Canada]|2015/03/17| 2015/03/17|rider-002|6678339034011801420| 0|[[90.231161676291...|0.21778327|
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+------------------+------------------+-------------+------------+-------------+------------------+----------+-------------------+----------------+--------------------+------------------+--------+----------+--------------+---------+-------------------+---------+--------------------+----------+
only showing top 1 row
+--------------------+------------------+--------+--------------------+
|_hoodie_commit_seqno| amount|currency| tip_history|
+--------------------+------------------+--------+--------------------+
|20210804100259_2_817|33.494927099896564| USD|[[90.231161676291...|
+--------------------+------------------+--------+--------------------+
only showing top 1 row
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+------------------+------------------+-------------+------------+-------------+------------------+----------+------------------+-------------------+--------------------+------------------+--------+----------+--------------+---------+--------------------+---------+--------------------+---------+
|_hoodie_commit_time|_hoodie_commit_seqno| _hoodie_record_key|_hoodie_partition_path| _hoodie_file_name|_hoodie_is_deleted| _row_key| begin_lat| begin_lon|city_to_state|current_date| current_ts|distance_in_meters| driver| end_lat| end_lon| fare| height| nation| partition|partition_path| rider| seconds_since_epoch|timestamp| tip_history| weight|
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+------------------+------------------+-------------+------------+-------------+------------------+----------+------------------+-------------------+--------------------+------------------+--------+----------+--------------+---------+--------------------+---------+--------------------+---------+
| 20210804100257|20210804100257_0_...|17e3d770-3497-4ed...| 2016/03/15|f1bf0c01-fd62-4ab...| false|17e3d770-3497-4ed...|0.7206276106446136|0.3518542351618481| [CA]| 18843|1628085777126| 1557959885|driver-001|0.1793572538288486|0.14289404900372016|[10.3037566084467...|[0, 0, 14, 44, 24]|[Canada]|2016/03/15| 2016/03/15|rider-001|-1859822735556898614| 0|[[59.849540953288...|0.9740608|
+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+--------------------+------------------+------------------+-------------+------------+-------------+------------------+----------+------------------+-------------------+--------------------+------------------+--------+----------+--------------+---------+--------------------+---------+--------------------+---------+
only showing top 1 row
+--------------------+-----------------+--------+--------------------+
|_hoodie_commit_seqno| amount|currency| tip_history|
+--------------------+-----------------+--------+--------------------+
|20210804100259_0_793|87.86898857567785| USD|[[31.708013486012...|
+--------------------+-----------------+--------+--------------------+
only showing top 1 row
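The two outputs above pair a full-row show with a projection of the nested fare struct. A minimal sketch of a read that yields output in this shape, assuming basePath points at the MOR table the test wrote (the path and the selected columns are inferred from the tables above, not taken from the test source):

    // read the Hudi table and project the nested fare fields next to the commit seqno
    val df = spark.read.format("hudi").load(basePath)
    df.show(1)
    df.select("_hoodie_commit_seqno", "fare.amount", "fare.currency", "tip_history").show(1)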
373949 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
379644 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
[INFO] Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 200.347 s - in org.apache.hudi.functional.TestMORDataSource
[INFO] Running org.apache.hudi.functional.TestDataSourceForBootstrap
399567 [main] WARN org.apache.spark.sql.SparkSession$Builder - Using an existing SparkSession; some configuration may not take effect.
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 49.708 s - in org.apache.hudi.functional.TestDataSourceForBootstrap
[INFO] Running org.apache.hudi.functional.TestCOWDataSource
430822 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
435372 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
435611 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit3981273977033752470/dataset/.hoodie/metadata
435631 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit3981273977033752470/dataset/.hoodie/metadata
435633 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadataWriter - Cannot bootstrap metadata table as operation is in progress: [==>20210804100405__commit__REQUESTED]
435634 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit3981273977033752470/dataset/.hoodie/metadata
436859 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit3981273977033752470/dataset/.hoodie/metadata
436861 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit3981273977033752470/dataset/.hoodie/metadata
437041 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit3981273977033752470/dataset/.hoodie/metadata
437044 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit3981273977033752470/dataset/.hoodie/metadata
437049 [qtp1040865534-28049] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit3981273977033752470/dataset/.hoodie/metadata
437052 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit3981273977033752470/dataset/.hoodie/metadata
441817 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
442072 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit7338162133435199157/dataset/.hoodie/metadata
442092 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit7338162133435199157/dataset/.hoodie/metadata
442094 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadataWriter - Cannot bootstrap metadata table as operation is in progress: [==>20210804100412__commit__REQUESTED]
442096 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit7338162133435199157/dataset/.hoodie/metadata
443715 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit7338162133435199157/dataset/.hoodie/metadata
443718 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit7338162133435199157/dataset/.hoodie/metadata
443958 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit7338162133435199157/dataset/.hoodie/metadata
443960 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit7338162133435199157/dataset/.hoodie/metadata
443965 [qtp332278596-28461] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit7338162133435199157/dataset/.hoodie/metadata
443969 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit7338162133435199157/dataset/.hoodie/metadata
448790 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
463652 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
466355 [main] WARN org.apache.hudi.common.util.ClusteringUtils - No content found in requested file for instant [==>20210804100436__replacecommit__REQUESTED]
467170 [Executor task launch worker for task 24] WARN org.apache.hudi.common.util.ClusteringUtils - No content found in requested file for instant [==>20210804100436__replacecommit__REQUESTED]
467176 [qtp588801314-29387] WARN org.apache.hudi.common.util.ClusteringUtils - No content found in requested file for instant [==>20210804100436__replacecommit__INFLIGHT]
+--------+
|count(1)|
+--------+
| 7|
+--------+
468651 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
471072 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
root
|-- _hoodie_commit_time: string (nullable = true)
|-- _hoodie_commit_seqno: string (nullable = true)
|-- _hoodie_record_key: string (nullable = true)
|-- _hoodie_partition_path: string (nullable = true)
|-- _hoodie_file_name: string (nullable = true)
|-- _row_key: string (nullable = true)
|-- name: string (nullable = true)
|-- timeStampValue: timestamp (nullable = true)
|-- dateValue: date (nullable = true)
|-- decimalValue: decimal(15,10) (nullable = true)
|-- timestamp: integer (nullable = true)
|-- partition: integer (nullable = true)
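The schema above is what a Hudi read reports: the five _hoodie_* metadata columns followed by the table's own columns (here a timestamp, a date and a decimal(15,10)). A read along these lines prints it, with basePath standing in for the temp table path from the log:

    // sketch: load the table written above and print its schema
    spark.read.format("hudi").load(basePath).printSchema()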
473133 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
478513 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit4515735856849668230/dataset already exists. Deleting existing data & overwriting with new data.
480975 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit4515735856849668230/dataset already exists. Deleting existing data & overwriting with new data.
482975 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit4515735856849668230/dataset already exists. Deleting existing data & overwriting with new data.
483113 [Executor task launch worker for task 256] ERROR org.apache.spark.executor.Executor - Exception in task 0.0 in stage 110.0 (TID 256)
java.lang.IllegalArgumentException: No enum constant org.apache.hudi.keygen.CustomAvroKeyGenerator.PartitionKeyType.DUMMY
at java.lang.Enum.valueOf(Enum.java:238)
at org.apache.hudi.keygen.CustomAvroKeyGenerator$PartitionKeyType.valueOf(CustomAvroKeyGenerator.java:52)
at org.apache.hudi.keygen.CustomAvroKeyGenerator.getPartitionPath(CustomAvroKeyGenerator.java:82)
at org.apache.hudi.keygen.CustomKeyGenerator.getPartitionPath(CustomKeyGenerator.java:68)
at org.apache.hudi.keygen.BaseKeyGenerator.getKey(BaseKeyGenerator.java:62)
at org.apache.hudi.HoodieSparkSqlWriter$$anonfun$3.apply(HoodieSparkSqlWriter.scala:173)
at org.apache.hudi.HoodieSparkSqlWriter$$anonfun$3.apply(HoodieSparkSqlWriter.scala:168)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:410)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:410)
at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:193)
at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:62)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
483113 [Executor task launch worker for task 257] ERROR org.apache.spark.executor.Executor - Exception in task 1.0 in stage 110.0 (TID 257)
java.lang.IllegalArgumentException: No enum constant org.apache.hudi.keygen.CustomAvroKeyGenerator.PartitionKeyType.DUMMY
at java.lang.Enum.valueOf(Enum.java:238)
at org.apache.hudi.keygen.CustomAvroKeyGenerator$PartitionKeyType.valueOf(CustomAvroKeyGenerator.java:52)
at org.apache.hudi.keygen.CustomAvroKeyGenerator.getPartitionPath(CustomAvroKeyGenerator.java:82)
at org.apache.hudi.keygen.CustomKeyGenerator.getPartitionPath(CustomKeyGenerator.java:68)
at org.apache.hudi.keygen.BaseKeyGenerator.getKey(BaseKeyGenerator.java:62)
at org.apache.hudi.HoodieSparkSqlWriter$$anonfun$3.apply(HoodieSparkSqlWriter.scala:173)
at org.apache.hudi.HoodieSparkSqlWriter$$anonfun$3.apply(HoodieSparkSqlWriter.scala:168)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:410)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:410)
at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:193)
at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:62)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
483145 [task-result-getter-2] WARN org.apache.spark.scheduler.TaskSetManager - Lost task 0.0 in stage 110.0 (TID 256, localhost, executor driver): java.lang.IllegalArgumentException: No enum constant org.apache.hudi.keygen.CustomAvroKeyGenerator.PartitionKeyType.DUMMY
at java.lang.Enum.valueOf(Enum.java:238)
at org.apache.hudi.keygen.CustomAvroKeyGenerator$PartitionKeyType.valueOf(CustomAvroKeyGenerator.java:52)
at org.apache.hudi.keygen.CustomAvroKeyGenerator.getPartitionPath(CustomAvroKeyGenerator.java:82)
at org.apache.hudi.keygen.CustomKeyGenerator.getPartitionPath(CustomKeyGenerator.java:68)
at org.apache.hudi.keygen.BaseKeyGenerator.getKey(BaseKeyGenerator.java:62)
at org.apache.hudi.HoodieSparkSqlWriter$$anonfun$3.apply(HoodieSparkSqlWriter.scala:173)
at org.apache.hudi.HoodieSparkSqlWriter$$anonfun$3.apply(HoodieSparkSqlWriter.scala:168)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:410)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:410)
at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:193)
at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:62)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
483148 [task-result-getter-2] ERROR org.apache.spark.scheduler.TaskSetManager - Task 0 in stage 110.0 failed 1 times; aborting job
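The stack traces above originate in CustomKeyGenerator, which parses each configured partition path field as a field:type pair and resolves the type against the PartitionKeyType enum; SIMPLE and TIMESTAMP are the supported values, so the deliberately bogus type behind DUMMY fails with the IllegalArgumentException shown. A hedged sketch of write options that would trigger the same failure (the option values are illustrative, not the test's exact configuration):

    val opts = Map(
      "hoodie.table.name" -> "hoodie_test",
      "hoodie.datasource.write.keygenerator.class" -> "org.apache.hudi.keygen.CustomKeyGenerator",
      "hoodie.datasource.write.recordkey.field" -> "_row_key",
      // "dummy" is not a PartitionKeyType (SIMPLE or TIMESTAMP), hence "No enum constant ... PartitionKeyType.DUMMY"
      "hoodie.datasource.write.partitionpath.field" -> "partition_path:dummy"
    )
    df.write.format("hudi").options(opts).mode("append").save(basePath)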
483204 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
491019 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
493729 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
496302 [main] WARN org.apache.hudi.common.util.ClusteringUtils - No content found in requested file for instant [==>20210804100506__replacecommit__REQUESTED]
496795 [Executor task launch worker for task 40] WARN org.apache.hudi.common.util.ClusteringUtils - No content found in requested file for instant [==>20210804100506__replacecommit__REQUESTED]
496795 [Executor task launch worker for task 38] WARN org.apache.hudi.common.util.ClusteringUtils - No content found in requested file for instant [==>20210804100506__replacecommit__REQUESTED]
496795 [Executor task launch worker for task 39] WARN org.apache.hudi.common.util.ClusteringUtils - No content found in requested file for instant [==>20210804100506__replacecommit__REQUESTED]
496800 [qtp1034177642-31127] WARN org.apache.hudi.common.util.ClusteringUtils - No content found in requested file for instant [==>20210804100506__replacecommit__INFLIGHT]
497840 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
499655 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - upsert is not applicable when Key: 'hoodie.datasource.write.insert.drop.duplicates' , default: false description: If set to true, filters out all duplicate records from incoming dataframe, during insert operations. since version: version is not defined deprecated after: version is not defined) is set to be true, overriding the Key: 'hoodie.datasource.write.operation' , default: upsert description: Whether to do upsert, insert or bulkinsert for the write operation. Use bulkinsert to load new data into a table, and there on use upsert/insert. bulk insert uses a disk based write path to scale to load large inputs without need to cache it. since version: version is not defined deprecated after: version is not defined) to be insert
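The warning above spells out the interaction: with hoodie.datasource.write.insert.drop.duplicates set to true, a requested upsert is overridden to an insert that drops records already present in the table. A sketch of options that would emit this warning; the two keys come straight from the message, the remaining options are plausible placeholders rather than the test's exact values:

    val opts = Map(
      "hoodie.table.name" -> "hoodie_test",
      "hoodie.datasource.write.recordkey.field" -> "_row_key",
      "hoodie.datasource.write.precombine.field" -> "timestamp",
      "hoodie.datasource.write.operation" -> "upsert",
      // drop.duplicates=true makes the writer log the warning above and run an insert instead
      "hoodie.datasource.write.insert.drop.duplicates" -> "true"
    )
    df.write.format("hudi").options(opts).mode("append").save(basePath)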
502302 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
507293 [main] WARN org.apache.hudi.common.util.ClusteringUtils - No content found in requested file for instant [==>20210804100517__replacecommit__REQUESTED]
507716 [Executor task launch worker for task 36] WARN org.apache.hudi.common.util.ClusteringUtils - No content found in requested file for instant [==>20210804100517__replacecommit__REQUESTED]
507721 [qtp1821610913-31911] WARN org.apache.hudi.common.util.ClusteringUtils - No content found in requested file for instant [==>20210804100517__replacecommit__INFLIGHT]
+--------+
|count(1)|
+--------+
| 13|
+--------+
508633 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
514653 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
516627 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
521053 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
523975 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit561509989922762132/dataset already exists. Deleting existing data & overwriting with new data.
525765 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
526015 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit4522736734822327663/dataset/.hoodie/metadata
526035 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit4522736734822327663/dataset/.hoodie/metadata
526038 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadataWriter - Cannot bootstrap metadata table as operation is in progress: [==>20210804100536__commit__REQUESTED]
526039 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit4522736734822327663/dataset/.hoodie/metadata
528327 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit4522736734822327663/dataset/.hoodie/metadata
528332 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit4522736734822327663/dataset/.hoodie/metadata
528577 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit4522736734822327663/dataset/.hoodie/metadata
528580 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit4522736734822327663/dataset/.hoodie/metadata
528585 [qtp1907323490-33299] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit4522736734822327663/dataset/.hoodie/metadata
528589 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit4522736734822327663/dataset/.hoodie/metadata
543944 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
554609 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
556975 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit7452459923070689387/dataset already exists. Deleting existing data & overwriting with new data.
558743 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
560577 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
563365 [main] WARN org.apache.hudi.common.util.ClusteringUtils - No content found in requested file for instant [==>20210804100613__replacecommit__REQUESTED]
563892 [Executor task launch worker for task 33] WARN org.apache.hudi.common.util.ClusteringUtils - No content found in requested file for instant [==>20210804100613__replacecommit__REQUESTED]
563892 [Executor task launch worker for task 34] WARN org.apache.hudi.common.util.ClusteringUtils - No content found in requested file for instant [==>20210804100613__replacecommit__REQUESTED]
563901 [qtp1392094242-34874] WARN org.apache.hudi.common.util.ClusteringUtils - No content found in requested file for instant [==>20210804100613__replacecommit__INFLIGHT]
564762 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
567032 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit5482053788243845338/dataset already exists. Deleting existing data & overwriting with new data.
568813 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
[INFO] Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 139.453 s - in org.apache.hudi.functional.TestCOWDataSource
[INFO] Running org.apache.hudi.TestHoodieFileIndex
568977 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset already exists. Deleting existing data & overwriting with new data.
569022 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset/.hoodie/metadata
569042 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset/.hoodie/metadata
569044 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadataWriter - Cannot bootstrap metadata table as operation is in progress: [==>20210804100619__commit__REQUESTED]
569045 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset/.hoodie/metadata
570331 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset/.hoodie/metadata
570333 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset/.hoodie/metadata
570520 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset/.hoodie/metadata
570522 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset/.hoodie/metadata
570529 [qtp76447703-35389] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset/.hoodie/metadata
570533 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset/.hoodie/metadata
571959 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset already exists. Deleting existing data & overwriting with new data.
572003 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset/.hoodie/metadata
572023 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset/.hoodie/metadata
572025 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadataWriter - Cannot bootstrap metadata table as operation is in progress: [==>20210804100621__commit__REQUESTED]
572026 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset/.hoodie/metadata
573277 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset/.hoodie/metadata
573280 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset/.hoodie/metadata
573510 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset/.hoodie/metadata
573514 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset/.hoodie/metadata
573519 [qtp1457224683-35578] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset/.hoodie/metadata
573523 [main] WARN org.apache.hudi.metadata.HoodieBackedTableMetadata - Metadata table was not found at path /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset/.hoodie/metadata
574843 [main] WARN org.apache.hudi.HoodieFileIndex - Cannot do the partition prune for table /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset.The partitionFragments size (2021,03,01,10) is not equal to the partition columns size(StructField(dt,StringType,true),StructField(hh,StringType,true))
574843 [main] WARN org.apache.hudi.HoodieFileIndex - Cannot do the partition prune for table /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset.The partitionFragments size (2021,03,02,10) is not equal to the partition columns size(StructField(dt,StringType,true),StructField(hh,StringType,true))
574873 [main] WARN org.apache.hudi.HoodieFileIndex - Cannot do the partition prune for table file:/var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset.The partitionFragments size (2021,03,01,10) is not equal to the partition columns size(StructField(dt,StringType,true),StructField(hh,StringType,true))
574873 [main] WARN org.apache.hudi.HoodieFileIndex - Cannot do the partition prune for table file:/var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1884847010781781286/dataset.The partitionFragments size (2021,03,02,10) is not equal to the partition columns size(StructField(dt,StringType,true),StructField(hh,StringType,true))
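The four warnings above state the mismatch directly: the physical partition path 2021/03/01/10 splits into four fragments while the table declares two partition columns (dt, hh), so HoodieFileIndex skips partition pruning and lists all partitions. The gist of the check is a length comparison along these lines (an illustrative sketch, not the actual implementation):

    val partitionFragments = "2021/03/01/10".split("/")                 // 4 fragments
    val partitionColumns = Seq("dt", "hh")                              // 2 declared partition columns
    val canPrune = partitionFragments.length == partitionColumns.length // false, so pruning is skipped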
575085 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
575854 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit8641909570568598962/dataset already exists. Deleting existing data & overwriting with new data.
578213 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit8641909570568598962/dataset already exists. Deleting existing data & overwriting with new data.
579974 [main] WARN org.apache.hudi.HoodieFileIndex - Cannot do the partition prune for table /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit8641909570568598962/dataset.The partitionFragments size (2021,03,02,10) is not equal to the partition columns size(StructField(dt,StringType,true),StructField(hh,StringType,true))
579974 [main] WARN org.apache.hudi.HoodieFileIndex - Cannot do the partition prune for table /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit8641909570568598962/dataset.The partitionFragments size (2021,03,01,10) is not equal to the partition columns size(StructField(dt,StringType,true),StructField(hh,StringType,true))
580450 [main] WARN org.apache.hudi.HoodieFileIndex - Cannot do the partition prune for table file:/var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit8641909570568598962/dataset.The partitionFragments size (2021,03,02,10) is not equal to the partition columns size(StructField(dt,StringType,true),StructField(hh,StringType,true))
580450 [main] WARN org.apache.hudi.HoodieFileIndex - Cannot do the partition prune for table file:/var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit8641909570568598962/dataset.The partitionFragments size (2021,03,01,10) is not equal to the partition columns size(StructField(dt,StringType,true),StructField(hh,StringType,true))
580711 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
581275 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit763351060335921180/dataset already exists. Deleting existing data & overwriting with new data.
583178 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
583807 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit4571807105396198020/dataset already exists. Deleting existing data & overwriting with new data.
585691 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
586244 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit5121419322483757533/dataset already exists. Deleting existing data & overwriting with new data.
588590 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
589298 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit2629129428961874335/dataset already exists. Deleting existing data & overwriting with new data.
593738 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
595224 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1269787836649900335/dataset already exists. Deleting existing data & overwriting with new data.
597870 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
598489 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1138954208432105400/dataset already exists. Deleting existing data & overwriting with new data.
600611 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
601332 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit4142566152793631846/dataset already exists. Deleting existing data & overwriting with new data.
603517 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
604220 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit1543652511358291474/dataset already exists. Deleting existing data & overwriting with new data.
605926 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
606665 [main] WARN org.apache.hudi.HoodieSparkSqlWriter$ - hoodie table at /var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/junit217163374617912958/dataset already exists. Deleting existing data & overwriting with new data.
608473 [main] WARN org.apache.hudi.testutils.HoodieClientTestHarness - Closing file-system instance used in previous test-run
[INFO] Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 40.087 s - in org.apache.hudi.TestHoodieFileIndex
[INFO]
[INFO] Results:
[INFO]
[INFO] Tests run: 91, Failures: 0, Errors: 0, Skipped: 0
[INFO]
[INFO]
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ hudi-spark_2.11 ---
Discovery starting.
Discovery completed in 1 second, 18 milliseconds.
Run starting. Expected test count is: 60
TestAvroConversionHelper:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/nsb/.m2/repository/org/slf4j/slf4j-log4j12/1.7.16/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/nsb/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
trans data from int [ 7, 365, 0 ] to date [ 1970-01-08, 1971-01-01, 1970-01-01 ]
- Logical type: date
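The conversion printed above is the Avro date logical type: an int counting days since the Unix epoch mapped onto a calendar date. The same mapping in plain Scala reproduces the values in the log:

    import java.time.LocalDate
    val days = Seq(7, 365, 0)
    val dates = days.map(d => LocalDate.ofEpochDay(d.toLong))
    // List(1970-01-08, 1971-01-01, 1970-01-01)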
HoodieSparkSqlWriterSuite:
- Parameters With Write Defaults
0 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] WARN org.apache.spark.util.Utils - Your hostname, Sivabalans-MacBook-Pro.local resolves to a loopback address: 127.0.0.1; using 10.0.0.202 instead (on interface en0)
4 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] WARN org.apache.spark.util.Utils - Set SPARK_LOCAL_IP if you need to bind to another address
121 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SparkContext - Running Spark version 2.4.4
792 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
933 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SparkContext - Submitted application: hoodie_test
1042 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SecurityManager - Changing view acls to: nsb
1043 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SecurityManager - Changing modify acls to: nsb
1044 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SecurityManager - Changing view acls groups to:
1044 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SecurityManager - Changing modify acls groups to:
1045 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SecurityManager - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(nsb); groups with view permissions: Set(); users with modify permissions: Set(nsb); groups with modify permissions: Set()
1565 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.util.Utils - Successfully started service 'sparkDriver' on port 58800.
1597 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SparkEnv - Registering MapOutputTracker
1623 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SparkEnv - Registering BlockManagerMaster
1627 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.storage.BlockManagerMasterEndpoint - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
1628 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.storage.BlockManagerMasterEndpoint - BlockManagerMasterEndpoint up
1642 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.storage.DiskBlockManager - Created local directory at /private/var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/blockmgr-097175b6-58d0-4103-b55e-097213c1a64c
1667 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore started with capacity 912.3 MB
1687 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SparkEnv - Registering OutputCommitCoordinator
1778 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.util.log - Logging initialized @5474ms
1890 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.Server - jetty-9.3.z-SNAPSHOT, build timestamp: 2019-02-15T08:53:49-08:00, git hash: eb70b240169fcf1abbd86af36482d1c49826fa0b
1924 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.Server - Started @5621ms
1959 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.AbstractConnector - Started ServerConnector@4334b9fd{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
1959 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.util.Utils - Successfully started service 'SparkUI' on port 4040.
1997 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@118ae1ca{/jobs,null,AVAILABLE,@Spark}
1997 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@3d607d35{/jobs/json,null,AVAILABLE,@Spark}
1998 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@ed80697{/jobs/job,null,AVAILABLE,@Spark}
1998 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@57c903e2{/jobs/job/json,null,AVAILABLE,@Spark}
1999 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@69fe55fb{/stages,null,AVAILABLE,@Spark}
2000 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@275c1aed{/stages/json,null,AVAILABLE,@Spark}
2000 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@4e3617fe{/stages/stage,null,AVAILABLE,@Spark}
2001 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@d79fd49{/stages/stage/json,null,AVAILABLE,@Spark}
2002 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@730e92ff{/stages/pool,null,AVAILABLE,@Spark}
2003 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@5c803ec4{/stages/pool/json,null,AVAILABLE,@Spark}
2003 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@3ebbe1d9{/storage,null,AVAILABLE,@Spark}
2004 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@7cead0f2{/storage/json,null,AVAILABLE,@Spark}
2005 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@7bf42cfa{/storage/rdd,null,AVAILABLE,@Spark}
2005 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@26df5b5e{/storage/rdd/json,null,AVAILABLE,@Spark}
2006 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@470bd375{/environment,null,AVAILABLE,@Spark}
2006 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@6d4ce626{/environment/json,null,AVAILABLE,@Spark}
2007 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@37859489{/executors,null,AVAILABLE,@Spark}
2008 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@6bdc2c8c{/executors/json,null,AVAILABLE,@Spark}
2008 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@72d5daf9{/executors/threadDump,null,AVAILABLE,@Spark}
2009 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@78227f53{/executors/threadDump/json,null,AVAILABLE,@Spark}
2020 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@63989d02{/static,null,AVAILABLE,@Spark}
2021 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@26b2b702{/,null,AVAILABLE,@Spark}
2022 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@2ec993d6{/api,null,AVAILABLE,@Spark}
2023 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@3157ee4e{/jobs/job/kill,null,AVAILABLE,@Spark}
2023 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@768ed999{/stages/stage/kill,null,AVAILABLE,@Spark}
2026 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.ui.SparkUI - Bound SparkUI to 0.0.0.0, and started at http://10.0.0.202:4040
2188 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.executor.Executor - Starting executor ID driver on host localhost
2271 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.util.Utils - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 58801.
2272 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.network.netty.NettyBlockTransferService - Server created on 10.0.0.202:58801
2274 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.storage.BlockManager - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2319 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.storage.BlockManagerMaster - Registering BlockManager BlockManagerId(driver, 10.0.0.202, 58801, None)
2324 [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerMasterEndpoint - Registering block manager 10.0.0.202:58801 with 912.3 MB RAM, BlockManagerId(driver, 10.0.0.202, 58801, None)
2328 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.storage.BlockManagerMaster - Registered BlockManager BlockManagerId(driver, 10.0.0.202, 58801, None)
2329 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.storage.BlockManager - Initialized BlockManager: BlockManagerId(driver, 10.0.0.202, 58801, None)
2571 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@1e46b33b{/metrics/json,null,AVAILABLE,@Spark}
4443 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.sql.internal.SharedState - loading hive config file: jar:file:/Users/nsb/.m2/repository/org/apache/spark/spark-sql_2.11/2.4.4/spark-sql_2.11-2.4.4-tests.jar!/hive-site.xml
4461 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.sql.internal.SharedState - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/Users/nsb/Documents/personal/projects/july21/hudi/hudi-spark-datasource/hudi-spark/spark-warehouse/').
4462 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.sql.internal.SharedState - Warehouse path is 'file:/Users/nsb/Documents/personal/projects/july21/hudi/hudi-spark-datasource/hudi-spark/spark-warehouse/'.
4472 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@74f22354{/SQL,null,AVAILABLE,@Spark}
4473 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@1e553a05{/SQL/json,null,AVAILABLE,@Spark}
4473 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@1d97d705{/SQL/execution,null,AVAILABLE,@Spark}
4474 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@70bbb286{/SQL/execution/json,null,AVAILABLE,@Spark}
4475 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@b7c4055{/static/sql,null,AVAILABLE,@Spark}
4902 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.sql.execution.streaming.state.StateStoreCoordinatorRef - Registered StateStoreCoordinator endpoint
5136 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@4334b9fd{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
5137 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://10.0.0.202:4040
5150 [dispatcher-event-loop-1] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
5172 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
5172 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
5179 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
5183 [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
5190 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext
- throw hoodie exception when invalid serializer
5192 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SparkContext - Running Spark version 2.4.4
5192 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SparkContext - Submitted application: test_append_mode
5194 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SecurityManager - Changing view acls to: nsb
5194 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SecurityManager - Changing modify acls to: nsb
5194 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SecurityManager - Changing view acls groups to:
5194 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SecurityManager - Changing modify acls groups to:
5194 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SecurityManager - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(nsb); groups with view permissions: Set(); users with modify permissions: Set(nsb); groups with modify permissions: Set()
5229 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.util.Utils - Successfully started service 'sparkDriver' on port 58802.
5231 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SparkEnv - Registering MapOutputTracker
5232 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SparkEnv - Registering BlockManagerMaster
5232 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.storage.BlockManagerMasterEndpoint - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
5232 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.storage.BlockManagerMasterEndpoint - BlockManagerMasterEndpoint up
5233 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.storage.DiskBlockManager - Created local directory at /private/var/folders/ym/8yjkm3n90kq8tk4gfmvk7y140000gn/T/blockmgr-2d25dce1-467b-4d27-9d67-e4d6249c7917
5234 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore started with capacity 912.3 MB
5235 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.SparkEnv - Registering OutputCommitCoordinator
5241 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.Server - jetty-9.3.z-SNAPSHOT, build timestamp: 2019-02-15T08:53:49-08:00, git hash: eb70b240169fcf1abbd86af36482d1c49826fa0b
5243 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.Server - Started @8940ms
5244 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.AbstractConnector - Started ServerConnector@149b4149{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
5244 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.util.Utils - Successfully started service 'SparkUI' on port 4040.
5244 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@e4e89e0{/jobs,null,AVAILABLE,@Spark}
5245 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@78803816{/jobs/json,null,AVAILABLE,@Spark}
5245 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@3911b8ce{/jobs/job,null,AVAILABLE,@Spark}
5245 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@52610c7{/jobs/job/json,null,AVAILABLE,@Spark}
5246 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@7fdd880a{/stages,null,AVAILABLE,@Spark}
5246 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@6644f6f{/stages/json,null,AVAILABLE,@Spark}
5246 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@328740a5{/stages/stage,null,AVAILABLE,@Spark}
5247 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@792de1a{/stages/stage/json,null,AVAILABLE,@Spark}
5247 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@149c9b3a{/stages/pool,null,AVAILABLE,@Spark}
5247 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@3027a1f1{/stages/pool/json,null,AVAILABLE,@Spark}
5247 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@382d4d38{/storage,null,AVAILABLE,@Spark}
5248 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@29de3629{/storage/json,null,AVAILABLE,@Spark}
5248 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@59eb3d63{/storage/rdd,null,AVAILABLE,@Spark}
5249 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@54ce3786{/storage/rdd/json,null,AVAILABLE,@Spark}
5249 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@658f4aa{/environment,null,AVAILABLE,@Spark}
5249 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@5eeff536{/environment/json,null,AVAILABLE,@Spark}
5250 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@5eefdbad{/executors,null,AVAILABLE,@Spark}
5250 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@7bb8a5f1{/executors/json,null,AVAILABLE,@Spark}
5250 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@3a04f388{/executors/threadDump,null,AVAILABLE,@Spark}
5251 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@1b63f7bb{/executors/threadDump/json,null,AVAILABLE,@Spark}
5252 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@3b7175c9{/static,null,AVAILABLE,@Spark}
5252 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@558c87b{/,null,AVAILABLE,@Spark}
5253 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@18ddf3eb{/api,null,AVAILABLE,@Spark}
5253 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@2c68e010{/jobs/job/kill,null,AVAILABLE,@Spark}
5254 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@6f30eef6{/stages/stage/kill,null,AVAILABLE,@Spark}
5254 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.ui.SparkUI - Bound SparkUI to 0.0.0.0, and started at http://10.0.0.202:4040
5270 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.executor.Executor - Starting executor ID driver on host localhost
5273 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.util.Utils - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 58803.
5273 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.network.netty.NettyBlockTransferService - Server created on 10.0.0.202:58803
5273 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.storage.BlockManager - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
5273 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.storage.BlockManagerMaster - Registering BlockManager BlockManagerId(driver, 10.0.0.202, 58803, None)
5274 [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerMasterEndpoint - Registering block manager 10.0.0.202:58803 with 912.3 MB RAM, BlockManagerId(driver, 10.0.0.202, 58803, None)
5274 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.storage.BlockManagerMaster - Registered BlockManager BlockManagerId(driver, 10.0.0.202, 58803, None)
5274 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.apache.spark.storage.BlockManager - Initialized BlockManager: BlockManagerId(driver, 10.0.0.202, 58803, None)
5275 [ScalaTest-main-running-HoodieSparkSqlWriterSuite] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@2d23b27c{/metrics/json,null,AVAILABLE,@Spark}
- throw hoodie exception when there already exist a table with different name with Append Save mode
- test_bulk_insert_for_GLOBAL_SORT
- test_bulk_insert_for_NONE
- test_bulk_insert_for_PARTITION_SORT
- test_bulk_insert_for_populate_meta_fields_true
- test_bulk_insert_for_populate_meta_fields_false
- test disable and enable meta fields
- test drop duplicates row writing for bulk_insert
- test insert dataset without precombine field
- test bulk insert dataset with datasource impl multiple rounds
- test basic HoodieSparkSqlWriter functionality with datasource insert for COPY_ON_WRITE with PARQUET as the base file format with populate meta fields true
- test basic HoodieSparkSqlWriter functionality with datasource insert for COPY_ON_WRITE with ORC as the base file format with populate meta fields true
- test basic HoodieSparkSqlWriter functionality with datasource insert for MERGE_ON_READ with PARQUET as the base file format with populate meta fields true
- test basic HoodieSparkSqlWriter functionality with datasource insert for MERGE_ON_READ with ORC as the base file format with populate meta fields true
- test basic HoodieSparkSqlWriter functionality with datasource insert for COPY_ON_WRITE with PARQUET as the base file format with populate meta fields false
- test basic HoodieSparkSqlWriter functionality with datasource insert for MERGE_ON_READ with PARQUET as the base file format with populate meta fields false
- test HoodieSparkSqlWriter functionality with datasource bootstrap for COPY_ON_WRITE
- test HoodieSparkSqlWriter functionality with datasource bootstrap for MERGE_ON_READ
- test schema evolution for COPY_ON_WRITE
- test schema evolution for MERGE_ON_READ
- Test build sync config for spark sql
- Test build sync config for skip Ro Suffix vals
- test Incremental View WithReplacement
- test Non partition table with metatable support
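The HoodieSparkSqlWriterSuite cases listed above exercise Hudi's Spark datasource write path (bulk_insert sort modes, meta-field population, COPY_ON_WRITE/MERGE_ON_READ with PARQUET or ORC base files). As a rough illustration of that path only, here is a minimal, hedged Scala sketch of a datasource insert; the table name, path, and option values are assumptions for illustration and are not taken from this log:

import org.apache.spark.sql.{SaveMode, SparkSession}

object HudiDatasourceInsertSketch {
  def main(args: Array[String]): Unit = {
    // Local session, comparable to the one the suite spins up in the lines above.
    // Assumes the hudi-spark bundle is on the classpath.
    val spark = SparkSession.builder()
      .appName("hudi-datasource-insert-sketch")
      .master("local[2]")
      .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .getOrCreate()
    import spark.implicits._

    // Illustrative single-row dataset; column names mirror the record that appears later in the log.
    val df = Seq((1, "a1", 10.0, 1000L)).toDF("id", "name", "price", "ts")

    // Standard Hudi datasource option keys (from Hudi's public docs); values here are placeholders.
    df.write.format("hudi")
      .option("hoodie.table.name", "h0")
      .option("hoodie.datasource.write.recordkey.field", "id")
      .option("hoodie.datasource.write.precombine.field", "ts")
      .option("hoodie.datasource.write.operation", "insert")
      .mode(SaveMode.Overwrite)
      .save("/tmp/hudi/h0")

    spark.stop()
  }
}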
TestStreamingSource:
- test cow stream source
- test mor stream source
TestInsertTable:
- Test Insert Into
174509 [pool-558-thread-2] ERROR org.apache.hudi.common.util.queue.BoundedInMemoryExecutor - error consuming records
org.apache.hudi.exception.HoodieUpsertException: Failed to combine/merge new record with old value in storage, for new record {HoodieRecord{key=HoodieKey { recordKey=id:1 partitionPath=}, currentLocation='HoodieRecordLocation {instantTime=20210804070955, fileId=7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0}', newLocation='HoodieRecordLocation {instantTime=20210804070957, fileId=7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0}'}}, old value {{"_hoodie_commit_time": "20210804070953", "_hoodie_commit_seqno": "20210804070953_0_8582", "_hoodie_record_key": "id:1", "_hoodie_partition_path": "", "_hoodie_file_name": "7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0_0-48-62_20210804070953.parquet", "id": 1, "name": "a1", "price": 10.0, "ts": 1000}}
at org.apache.hudi.io.HoodieMergeHandle.write(HoodieMergeHandle.java:321)
at org.apache.hudi.table.action.commit.AbstractMergeHelper$UpdateHandler.consumeOneRecord(AbstractMergeHelper.java:122)
at org.apache.hudi.table.action.commit.AbstractMergeHelper$UpdateHandler.consumeOneRecord(AbstractMergeHelper.java:112)
at org.apache.hudi.common.util.queue.BoundedInMemoryQueueConsumer.consume(BoundedInMemoryQueueConsumer.java:37)
at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.lambda$null$2(BoundedInMemoryExecutor.java:121)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hudi.exception.HoodieDuplicateKeyException: Duplicate key found for insert statement, key is: id:1
at org.apache.spark.sql.hudi.command.ValidateDuplicateKeyPayload.combineAndGetUpdateValue(InsertIntoHoodieTableCommand.scala:277)
at org.apache.hudi.io.HoodieMergeHandle.write(HoodieMergeHandle.java:301)
... 8 more
174527 [Executor task launch worker for task 151] ERROR org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor - Error upserting bucketType UPDATE for partition :0
org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: org.apache.hudi.exception.HoodieUpsertException: Failed to combine/merge new record with old value in storage, for new record {HoodieRecord{key=HoodieKey { recordKey=id:1 partitionPath=}, currentLocation='HoodieRecordLocation {instantTime=20210804070955, fileId=7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0}', newLocation='HoodieRecordLocation {instantTime=20210804070957, fileId=7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0}'}}, old value {{"_hoodie_commit_time": "20210804070953", "_hoodie_commit_seqno": "20210804070953_0_8582", "_hoodie_record_key": "id:1", "_hoodie_partition_path": "", "_hoodie_file_name": "7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0_0-48-62_20210804070953.parquet", "id": 1, "name": "a1", "price": 10.0, "ts": 1000}}
at org.apache.hudi.table.action.commit.SparkMergeHelper.runMerge(SparkMergeHelper.java:102)
at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleUpdateInternal(BaseSparkCommitActionExecutor.java:334)
at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleUpdate(BaseSparkCommitActionExecutor.java:325)
at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleUpsertPartition(BaseSparkCommitActionExecutor.java:298)
at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.lambda$execute$ecf5068c$1(BaseSparkCommitActionExecutor.java:156)
at org.apache.spark.api.java.JavaRDDLike$$anonfun$mapPartitionsWithIndex$1.apply(JavaRDDLike.scala:102)
at org.apache.spark.api.java.JavaRDDLike$$anonfun$mapPartitionsWithIndex$1.apply(JavaRDDLike.scala:102)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$25.apply(RDD.scala:853)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$25.apply(RDD.scala:853)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1182)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: org.apache.hudi.exception.HoodieUpsertException: Failed to combine/merge new record with old value in storage, for new record {HoodieRecord{key=HoodieKey { recordKey=id:1 partitionPath=}, currentLocation='HoodieRecordLocation {instantTime=20210804070955, fileId=7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0}', newLocation='HoodieRecordLocation {instantTime=20210804070957, fileId=7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0}'}}, old value {{"_hoodie_commit_time": "20210804070953", "_hoodie_commit_seqno": "20210804070953_0_8582", "_hoodie_record_key": "id:1", "_hoodie_partition_path": "", "_hoodie_file_name": "7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0_0-48-62_20210804070953.parquet", "id": 1, "name": "a1", "price": 10.0, "ts": 1000}}
at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.execute(BoundedInMemoryExecutor.java:147)
at org.apache.hudi.table.action.commit.SparkMergeHelper.runMerge(SparkMergeHelper.java:100)
... 33 more
Caused by: java.util.concurrent.ExecutionException: org.apache.hudi.exception.HoodieUpsertException: Failed to combine/merge new record with old value in storage, for new record {HoodieRecord{key=HoodieKey { recordKey=id:1 partitionPath=}, currentLocation='HoodieRecordLocation {instantTime=20210804070955, fileId=7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0}', newLocation='HoodieRecordLocation {instantTime=20210804070957, fileId=7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0}'}}, old value {{"_hoodie_commit_time": "20210804070953", "_hoodie_commit_seqno": "20210804070953_0_8582", "_hoodie_record_key": "id:1", "_hoodie_partition_path": "", "_hoodie_file_name": "7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0_0-48-62_20210804070953.parquet", "id": 1, "name": "a1", "price": 10.0, "ts": 1000}}
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:192)
at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.execute(BoundedInMemoryExecutor.java:141)
... 34 more
Caused by: org.apache.hudi.exception.HoodieUpsertException: Failed to combine/merge new record with old value in storage, for new record {HoodieRecord{key=HoodieKey { recordKey=id:1 partitionPath=}, currentLocation='HoodieRecordLocation {instantTime=20210804070955, fileId=7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0}', newLocation='HoodieRecordLocation {instantTime=20210804070957, fileId=7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0}'}}, old value {{"_hoodie_commit_time": "20210804070953", "_hoodie_commit_seqno": "20210804070953_0_8582", "_hoodie_record_key": "id:1", "_hoodie_partition_path": "", "_hoodie_file_name": "7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0_0-48-62_20210804070953.parquet", "id": 1, "name": "a1", "price": 10.0, "ts": 1000}}
at org.apache.hudi.io.HoodieMergeHandle.write(HoodieMergeHandle.java:321)
at org.apache.hudi.table.action.commit.AbstractMergeHelper$UpdateHandler.consumeOneRecord(AbstractMergeHelper.java:122)
at org.apache.hudi.table.action.commit.AbstractMergeHelper$UpdateHandler.consumeOneRecord(AbstractMergeHelper.java:112)
at org.apache.hudi.common.util.queue.BoundedInMemoryQueueConsumer.consume(BoundedInMemoryQueueConsumer.java:37)
at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.lambda$null$2(BoundedInMemoryExecutor.java:121)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more
Caused by: org.apache.hudi.exception.HoodieDuplicateKeyException: Duplicate key found for insert statement, key is: id:1
at org.apache.spark.sql.hudi.command.ValidateDuplicateKeyPayload.combineAndGetUpdateValue(InsertIntoHoodieTableCommand.scala:277)
at org.apache.hudi.io.HoodieMergeHandle.write(HoodieMergeHandle.java:301)
... 8 more
174532 [Executor task launch worker for task 151] ERROR org.apache.spark.executor.Executor - Exception in task 0.0 in stage 110.0 (TID 151)
org.apache.hudi.exception.HoodieUpsertException: Error upserting bucketType UPDATE for partition :0
at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleUpsertPartition(BaseSparkCommitActionExecutor.java:305)
at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.lambda$execute$ecf5068c$1(BaseSparkCommitActionExecutor.java:156)
at org.apache.spark.api.java.JavaRDDLike$$anonfun$mapPartitionsWithIndex$1.apply(JavaRDDLike.scala:102)
at org.apache.spark.api.java.JavaRDDLike$$anonfun$mapPartitionsWithIndex$1.apply(JavaRDDLike.scala:102)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$25.apply(RDD.scala:853)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$25.apply(RDD.scala:853)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1182)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: org.apache.hudi.exception.HoodieUpsertException: Failed to combine/merge new record with old value in storage, for new record {HoodieRecord{key=HoodieKey { recordKey=id:1 partitionPath=}, currentLocation='HoodieRecordLocation {instantTime=20210804070955, fileId=7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0}', newLocation='HoodieRecordLocation {instantTime=20210804070957, fileId=7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0}'}}, old value {{"_hoodie_commit_time": "20210804070953", "_hoodie_commit_seqno": "20210804070953_0_8582", "_hoodie_record_key": "id:1", "_hoodie_partition_path": "", "_hoodie_file_name": "7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0_0-48-62_20210804070953.parquet", "id": 1, "name": "a1", "price": 10.0, "ts": 1000}}
at org.apache.hudi.table.action.commit.SparkMergeHelper.runMerge(SparkMergeHelper.java:102)
at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleUpdateInternal(BaseSparkCommitActionExecutor.java:334)
at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleUpdate(BaseSparkCommitActionExecutor.java:325)
at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleUpsertPartition(BaseSparkCommitActionExecutor.java:298)
... 30 more
Caused by: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: org.apache.hudi.exception.HoodieUpsertException: Failed to combine/merge new record with old value in storage, for new record {HoodieRecord{key=HoodieKey { recordKey=id:1 partitionPath=}, currentLocation='HoodieRecordLocation {instantTime=20210804070955, fileId=7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0}', newLocation='HoodieRecordLocation {instantTime=20210804070957, fileId=7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0}'}}, old value {{"_hoodie_commit_time": "20210804070953", "_hoodie_commit_seqno": "20210804070953_0_8582", "_hoodie_record_key": "id:1", "_hoodie_partition_path": "", "_hoodie_file_name": "7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0_0-48-62_20210804070953.parquet", "id": 1, "name": "a1", "price": 10.0, "ts": 1000}}
at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.execute(BoundedInMemoryExecutor.java:147)
at org.apache.hudi.table.action.commit.SparkMergeHelper.runMerge(SparkMergeHelper.java:100)
... 33 more
Caused by: java.util.concurrent.ExecutionException: org.apache.hudi.exception.HoodieUpsertException: Failed to combine/merge new record with old value in storage, for new record {HoodieRecord{key=HoodieKey { recordKey=id:1 partitionPath=}, currentLocation='HoodieRecordLocation {instantTime=20210804070955, fileId=7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0}', newLocation='HoodieRecordLocation {instantTime=20210804070957, fileId=7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0}'}}, old value {{"_hoodie_commit_time": "20210804070953", "_hoodie_commit_seqno": "20210804070953_0_8582", "_hoodie_record_key": "id:1", "_hoodie_partition_path": "", "_hoodie_file_name": "7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0_0-48-62_20210804070953.parquet", "id": 1, "name": "a1", "price": 10.0, "ts": 1000}}
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:192)
at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.execute(BoundedInMemoryExecutor.java:141)
... 34 more
Caused by: org.apache.hudi.exception.HoodieUpsertException: Failed to combine/merge new record with old value in storage, for new record {HoodieRecord{key=HoodieKey { recordKey=id:1 partitionPath=}, currentLocation='HoodieRecordLocation {instantTime=20210804070955, fileId=7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0}', newLocation='HoodieRecordLocation {instantTime=20210804070957, fileId=7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0}'}}, old value {{"_hoodie_commit_time": "20210804070953", "_hoodie_commit_seqno": "20210804070953_0_8582", "_hoodie_record_key": "id:1", "_hoodie_partition_path": "", "_hoodie_file_name": "7c18bcb1-a446-4fa8-8c8a-d06aba2ceb82-0_0-48-62_20210804070953.parquet", "id": 1, "name": "a1", "price": 10.0, "ts": 1000}}
at org.apache.hudi.io.HoodieMergeHandle.write(HoodieMergeHandle.java:321)
at org.apache.hudi.table.action.commit.AbstractMergeHelper$UpdateHandler.consumeOneRecord(AbstractMergeHelper.java:122)
at org.apache.hudi.table.action.commit.AbstractMergeHelper$UpdateHandler.consumeOneRecord(AbstractMergeHelper.java:112)
at org.apache.hudi.common.util.queue.BoundedInMemoryQueueConsumer.consume(BoundedInMemoryQueueConsumer.java:37)
at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.lambda$null$2(BoundedInMemoryExecutor.java:121)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more
Caused by: org.apache.hudi.exception.HoodieDuplicateKeyException: Duplicate key found for insert statement, key is: id:1
at org.apache.spark.sql.hudi.command.ValidateDuplicateKeyPayload.combineAndGetUpdateValue(InsertIntoHoodieTableCommand.scala:277)
at org.apache.hudi.io.HoodieMergeHandle.write(HoodieMergeHandle.java:301)
... 8 more
174552 [task-result-getter-3] ERROR org.apache.spark.scheduler.TaskSetManager - Task 0 in stage 110.0 failed 1 times; aborting job
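For context on the ERROR output above: it is emitted inside the "Test Insert Into" case and bottoms out in HoodieDuplicateKeyException for key id:1, yet the run summary below reports every test green, so the failure appears to be an expected, asserted rejection of a duplicate-key insert. A minimal Scala sketch of the kind of SQL sequence that would trigger that rejection follows; the table name, location, and DDL/property syntax are assumptions (they vary across Hudi versions) and are not lines taken from the test:

import org.apache.spark.sql.SparkSession

object DuplicateKeyInsertSketch {
  def main(args: Array[String]): Unit = {
    // Assumes the hudi-spark bundle plus Hudi's SQL extension for INSERT INTO support.
    val spark = SparkSession.builder()
      .appName("hudi-duplicate-key-sketch")
      .master("local[2]")
      .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .config("spark.sql.extensions", "org.apache.spark.sql.hudi.HoodieSparkSessionExtension")
      .getOrCreate()

    // Assumption: strict insert mode makes Hudi reject duplicate keys on INSERT.
    spark.sql("set hoodie.sql.insert.mode = strict")

    // DDL property syntax (options vs tblproperties) differs across Hudi versions; this is one variant.
    spark.sql(
      """create table if not exists h0 (id int, name string, price double, ts long)
        |using hudi
        |options (primaryKey = 'id', preCombineField = 'ts')
        |location '/tmp/hudi/h0_sql'""".stripMargin)

    spark.sql("insert into h0 values (1, 'a1', 10.0, 1000)")
    // Re-inserting the same key is expected to fail with
    // HoodieDuplicateKeyException: "Duplicate key found for insert statement, key is: id:1".
    spark.sql("insert into h0 values (1, 'a1', 10.0, 1000)")

    spark.stop()
  }
}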
- Test Insert Into None Partitioned Table
- Test Insert Overwrite
- Test Different Type of Partition Column
- Test insert for uppercase table name
- Test Insert Exception
TestMergeIntoTable:
- Test MergeInto Basic
- Test MergeInto with ignored record
- Test MergeInto for MOR table
- Test MergeInto with insert only
- Test MergeInto For PreCombineField
- Merge Hudi to Hudi
- Test Different Type of PreCombineField
- Test MergeInto For MOR With Compaction On
- Test MereInto With Null Fields
- Test MereInto With All Kinds Of DataType
TestAvroConversionUtils:
- test convertStructTypeToAvroSchema
TestTruncateTable:
- Test Truncate Table
TestMergeIntoTable2:
- Test MergeInto for MOR table 2
- Test Merge Into CTAS Table
TestHoodieSqlBase:
TestAlterTable:
- Test Alter Table
TestDeleteTable:
- Test Delete Table
TestMereIntoLogOnlyTable:
- Test Query Log Only MOR Table
TestPartialUpdateForMergeInto:
- Test Partial Update
- Test MergeInto Exception
TestUpdateTable:
- Test Update Table
TestSqlStatement:
- Test Sql Statements
TestCreateTable:
- Test Create Managed Hoodie Table
- Test Create External Hoodie Table
- Test Table Column Validate
- Test Create Table As Select
Run completed in 10 minutes, 16 seconds.
Total number of tests run: 60
Suites: completed 17, aborted 0
Tests: succeeded 60, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
[INFO]
[INFO] --- jacoco-maven-plugin:0.8.5:report (post-unit-tests) @ hudi-spark_2.11 ---
[INFO] Loading execution data file /Users/nsb/Documents/personal/projects/july21/hudi/hudi-spark-datasource/hudi-spark/target/jacoco.exec
[INFO] Analyzed bundle 'hudi-spark_2.11' with 492 classes
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 20:49 min
[INFO] Finished at: 2021-08-04T10:17:22-04:00
[INFO] ------------------------------------------------------------------------