Created February 20, 2021 21:47
test logs
17483 [main] WARN org.apache.spark.sql.SparkSession$Builder - Using an existing SparkSession; some configuration may not take effect.
18361 [main] WARN org.apache.spark.sql.SparkSession$Builder - Using an existing SparkSession; some configuration may not take effect.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.partition_fields' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.ingestion.short_trip_db.dummy_table_short_trip.configFile' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.ingestion.uber_db.dummy_table_uber.configFile' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.schemaprovider.source.schema.file' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.partition_extractor_class' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.source.dfs.root' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'key.serializer' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.write.recordkey.field' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.write.keygenerator.class' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.source.kafka.topic' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.bulkinsert.shuffle.parallelism' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.assume_date_partitioning' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.kafka.source.maxEvents' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.ingestion.tablesToBeIngested' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.keygen.timebased.output.dateformat' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.schemaprovider.target.schema.file' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.write.partitionpath.field' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.delete.shuffle.parallelism' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.upsert.shuffle.parallelism' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.jdbcurl' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'value.serializer' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.insert.shuffle.parallelism' was supplied but isn't a known config.
18927 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.database' was supplied but isn't a known config.
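These `isn't a known config` warnings are benign: the full Hudi property set (including all `hoodie.*` keys) is handed to the Kafka `ConsumerConfig`, which logs every key it doesn't recognize. A minimal sketch of the kind of prefix filter that would keep `hoodie.*` keys out of the Kafka client properties (illustrative only; this is not what Hudi itself does, and it would not silence warnings for non-`hoodie` extras such as `key.serializer`):

```python
def kafka_client_props(props: dict) -> dict:
    """Drop hoodie.* keys so they never reach the Kafka consumer,
    avoiding the "isn't a known config" warnings for those keys."""
    return {k: v for k, v in props.items() if not k.startswith("hoodie.")}

combined = {
    "bootstrap.servers": "localhost:9092",
    "hoodie.datasource.write.recordkey.field": "_row_key",
    "hoodie.upsert.shuffle.parallelism": "2",
}
filtered = kafka_client_props(combined)
```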
18969 [main] WARN org.apache.spark.streaming.kafka010.KafkaUtils - overriding enable.auto.commit to false for executor
18969 [main] WARN org.apache.spark.streaming.kafka010.KafkaUtils - overriding auto.offset.reset to none for executor
18970 [main] ERROR org.apache.spark.streaming.kafka010.KafkaUtils - group.id is null, you should probably set it
18970 [main] WARN org.apache.spark.streaming.kafka010.KafkaUtils - overriding executor group.id to spark-executor-null
18971 [main] WARN org.apache.spark.streaming.kafka010.KafkaUtils - overriding receive.buffer.bytes to 65536 see KAFKA-3135
Checkpoint str topic1,0:2,1:3
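The checkpoint string above appears to encode the Kafka topic followed by per-partition offsets as `topic,partition:offset,partition:offset`. A small parsing sketch, assuming that format:

```python
def parse_checkpoint(ckpt: str):
    """Parse a checkpoint string like 'topic1,0:2,1:3' into
    (topic, {partition: offset})."""
    parts = ckpt.split(",")
    topic = parts[0]
    offsets = {}
    for pair in parts[1:]:
        partition, offset = pair.split(":")
        offsets[int(partition)] = int(offset)
    return topic, offsets

print(parse_checkpoint("topic1,0:2,1:3"))  # ('topic1', {0: 2, 1: 3})
```

The second checkpoint later in this log (`topic1,0:7,1:8`) parses the same way, showing both partitions advancing after the first batch of records is consumed.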
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.partition_fields' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.ingestion.short_trip_db.dummy_table_short_trip.configFile' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.ingestion.uber_db.dummy_table_uber.configFile' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.schemaprovider.source.schema.file' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.partition_extractor_class' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.source.dfs.root' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'key.serializer' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.write.recordkey.field' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.write.keygenerator.class' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.source.kafka.topic' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.bulkinsert.shuffle.parallelism' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.assume_date_partitioning' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.kafka.source.maxEvents' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.ingestion.tablesToBeIngested' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.keygen.timebased.output.dateformat' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.schemaprovider.target.schema.file' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.write.partitionpath.field' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.delete.shuffle.parallelism' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.upsert.shuffle.parallelism' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.jdbcurl' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'value.serializer' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.insert.shuffle.parallelism' was supplied but isn't a known config.
19742 [Executor task launch worker for task 0] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.database' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.partition_fields' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.ingestion.short_trip_db.dummy_table_short_trip.configFile' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.ingestion.uber_db.dummy_table_uber.configFile' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.schemaprovider.source.schema.file' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.partition_extractor_class' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.source.dfs.root' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'key.serializer' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.write.recordkey.field' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.write.keygenerator.class' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.source.kafka.topic' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.bulkinsert.shuffle.parallelism' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.assume_date_partitioning' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.kafka.source.maxEvents' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.ingestion.tablesToBeIngested' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.keygen.timebased.output.dateformat' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.schemaprovider.target.schema.file' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.write.partitionpath.field' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.delete.shuffle.parallelism' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.upsert.shuffle.parallelism' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.jdbcurl' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'value.serializer' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.insert.shuffle.parallelism' was supplied but isn't a known config.
20262 [Executor task launch worker for task 2] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.database' was supplied but isn't a known config.
GenRec 333 {"timestamp": 0, "_row_key": "e6327057-6772-44a7-ad75-92ceade4763d", "rider": "rider-000", "driver": "driver-000", "fare": 81.92868687714224, "_hoodie_is_deleted": false}
GenRec 333 {"timestamp": 0, "_row_key": "0e28a84b-e8f0-4942-9c77-3c464e530a8b", "rider": "rider-000", "driver": "driver-000", "fare": 41.06290929046368, "_hoodie_is_deleted": false}
GenRec 333 {"timestamp": 0, "_row_key": "42bb3940-7984-47db-aeef-916438d1624a", "rider": "rider-000", "driver": "driver-000", "fare": 90.65078444936647, "_hoodie_is_deleted": false}
GenRec 333 {"timestamp": 0, "_row_key": "3920a203-4d9e-4713-a82d-398eed0324cc", "rider": "rider-000", "driver": "driver-000", "fare": 38.63372961020515, "_hoodie_is_deleted": false}
GenRec 333 {"timestamp": 0, "_row_key": "9e60200d-bc23-4702-9116-a00fb83b68d8", "rider": "rider-000", "driver": "driver-000", "fare": 11.488393157088261, "_hoodie_is_deleted": false}
Key generator generated key HoodieKey { recordKey=e6327057-6772-44a7-ad75-92ceade4763d partitionPath=default}
Before Pre write
Pre write complete
Just before tagging
Tag location +++++++
Key generator generated key HoodieKey { recordKey=42bb3940-7984-47db-aeef-916438d1624a partitionPath=default}
Key generator generated key HoodieKey { recordKey=3920a203-4d9e-4713-a82d-398eed0324cc partitionPath=default}
Key generator generated key HoodieKey { recordKey=9e60200d-bc23-4702-9116-a00fb83b68d8 partitionPath=default}
Key generator generated key HoodieKey { recordKey=e6327057-6772-44a7-ad75-92ceade4763d partitionPath=default}
Key generator generated key HoodieKey { recordKey=0e28a84b-e8f0-4942-9c77-3c464e530a8b partitionPath=default}
rec1111 HoodieRecord{key=HoodieKey { recordKey=0e28a84b-e8f0-4942-9c77-3c464e530a8b partitionPath=default}, currentLocation='null', newLocation='null'}
rec1111 HoodieRecord{key=HoodieKey { recordKey=9e60200d-bc23-4702-9116-a00fb83b68d8 partitionPath=default}, currentLocation='null', newLocation='null'}
rec1111 HoodieRecord{key=HoodieKey { recordKey=42bb3940-7984-47db-aeef-916438d1624a partitionPath=default}, currentLocation='null', newLocation='null'}
rec1111 HoodieRecord{key=HoodieKey { recordKey=3920a203-4d9e-4713-a82d-398eed0324cc partitionPath=default}, currentLocation='null', newLocation='null'}
rec1111 HoodieRecord{key=HoodieKey { recordKey=e6327057-6772-44a7-ad75-92ceade4763d partitionPath=default}, currentLocation='null', newLocation='null'}
Part 1 :: default : e6327057-6772-44a7-ad75-92ceade4763d
within look up index
records per partition default -> 5
file info list complete
Load involved files complete
Just before findMatchingFilesForRecordKeys
Tagged rec :: HoodieRecord{key=HoodieKey { recordKey=0e28a84b-e8f0-4942-9c77-3c464e530a8b partitionPath=default}, currentLocation='null', newLocation='null'}
Tagged rec :: HoodieRecord{key=HoodieKey { recordKey=9e60200d-bc23-4702-9116-a00fb83b68d8 partitionPath=default}, currentLocation='null', newLocation='null'}
Tagged rec :: HoodieRecord{key=HoodieKey { recordKey=42bb3940-7984-47db-aeef-916438d1624a partitionPath=default}, currentLocation='null', newLocation='null'}
Tagged rec :: HoodieRecord{key=HoodieKey { recordKey=3920a203-4d9e-4713-a82d-398eed0324cc partitionPath=default}, currentLocation='null', newLocation='null'}
Tagged rec :: HoodieRecord{key=HoodieKey { recordKey=e6327057-6772-44a7-ad75-92ceade4763d partitionPath=default}, currentLocation='null', newLocation='null'}
tagging Complete
Building workload profile
Completed building workload profile
Write complete. Before post write
Handling INSERT for partition fe710a73-1704-44f1-b1aa-5d8db686078d
25450 [main] WARN org.apache.hudi.DefaultSource - Loading Base File Only View.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.partition_fields' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.ingestion.short_trip_db.dummy_table_short_trip.configFile' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.ingestion.uber_db.dummy_table_uber.configFile' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.schemaprovider.source.schema.file' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.partition_extractor_class' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.source.dfs.root' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'key.serializer' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.write.recordkey.field' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.write.keygenerator.class' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.source.kafka.topic' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.bulkinsert.shuffle.parallelism' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.assume_date_partitioning' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.kafka.source.maxEvents' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.ingestion.tablesToBeIngested' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.keygen.timebased.output.dateformat' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.deltastreamer.schemaprovider.target.schema.file' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.write.partitionpath.field' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.delete.shuffle.parallelism' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.upsert.shuffle.parallelism' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.jdbcurl' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'value.serializer' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.insert.shuffle.parallelism' was supplied but isn't a known config.
28182 [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'hoodie.datasource.hive_sync.database' was supplied but isn't a known config.
28193 [main] WARN org.apache.spark.streaming.kafka010.KafkaUtils - overriding enable.auto.commit to false for executor
28193 [main] WARN org.apache.spark.streaming.kafka010.KafkaUtils - overriding auto.offset.reset to none for executor
28193 [main] ERROR org.apache.spark.streaming.kafka010.KafkaUtils - group.id is null, you should probably set it
28193 [main] WARN org.apache.spark.streaming.kafka010.KafkaUtils - overriding executor group.id to spark-executor-null
28193 [main] WARN org.apache.spark.streaming.kafka010.KafkaUtils - overriding receive.buffer.bytes to 65536 see KAFKA-3135
Checkpoint str topic1,0:7,1:8
GenRec 333 {"timestamp": 0, "_row_key": "305589f2-4307-46ab-abef-f379d0bc9fda", "rider": "rider-000", "driver": "driver-000", "fare": 45.40019146422721, "_hoodie_is_deleted": false}
GenRec 333 {"timestamp": 0, "_row_key": "00fdb7aa-4112-4b68-b056-471672ebe2b9", "rider": "rider-000", "driver": "driver-000", "fare": 73.02987759991886, "_hoodie_is_deleted": false}
GenRec 333 {"timestamp": 0, "_row_key": "b797af8b-ff20-47c5-93b1-eafee85ff8cd", "rider": "rider-000", "driver": "driver-000", "fare": 77.21097247931135, "_hoodie_is_deleted": false}
GenRec 333 {"timestamp": 0, "_row_key": "ab085445-e0fa-4607-bfc1-6b7391b44e05", "rider": "rider-000", "driver": "driver-000", "fare": 42.49381794821758, "_hoodie_is_deleted": false}
GenRec 333 {"timestamp": 0, "_row_key": "ed0050b5-4519-423b-8532-53d6cdc357cd", "rider": "rider-000", "driver": "driver-000", "fare": 1.0872312870502165, "_hoodie_is_deleted": false}
GenRec 333 {"timestamp": 0, "_row_key": "a1ec5a3c-64a6-41d7-aae2-c1a6115f9816", "rider": "rider-000", "driver": "driver-000", "fare": 39.54939864908973, "_hoodie_is_deleted": false}
GenRec 333 {"timestamp": 0, "_row_key": "922cd40c-e662-424e-8326-aec51ea68f81", "rider": "rider-000", "driver": "driver-000", "fare": 44.561085373053935, "_hoodie_is_deleted": false}
GenRec 333 {"timestamp": 0, "_row_key": "967bdc9d-4925-457d-a148-a4fa5deb6076", "rider": "rider-000", "driver": "driver-000", "fare": 78.14655558162802, "_hoodie_is_deleted": false}
GenRec 333 {"timestamp": 0, "_row_key": "a1f9a065-45af-44b7-8ba6-1869e3d49b09", "rider": "rider-000", "driver": "driver-000", "fare": 28.072552620450796, "_hoodie_is_deleted": false}
GenRec 333 {"timestamp": 0, "_row_key": "e1dd7126-71fb-4876-9106-846c51031717", "rider": "rider-000", "driver": "driver-000", "fare": 93.00604432281203, "_hoodie_is_deleted": false}
Key generator generated key HoodieKey { recordKey=305589f2-4307-46ab-abef-f379d0bc9fda partitionPath=default}
Before Pre write
Pre write complete
Just before tagging
Tag location +++++++
Key generator generated key HoodieKey { recordKey=a1ec5a3c-64a6-41d7-aae2-c1a6115f9816 partitionPath=default}
Key generator generated key HoodieKey { recordKey=922cd40c-e662-424e-8326-aec51ea68f81 partitionPath=default}
Key generator generated key HoodieKey { recordKey=967bdc9d-4925-457d-a148-a4fa5deb6076 partitionPath=default}
Key generator generated key HoodieKey { recordKey=a1f9a065-45af-44b7-8ba6-1869e3d49b09 partitionPath=default}
Key generator generated key HoodieKey { recordKey=e1dd7126-71fb-4876-9106-846c51031717 partitionPath=default}
Key generator generated key HoodieKey { recordKey=305589f2-4307-46ab-abef-f379d0bc9fda partitionPath=default}
Key generator generated key HoodieKey { recordKey=00fdb7aa-4112-4b68-b056-471672ebe2b9 partitionPath=default}
Key generator generated key HoodieKey { recordKey=b797af8b-ff20-47c5-93b1-eafee85ff8cd partitionPath=default}
Key generator generated key HoodieKey { recordKey=ab085445-e0fa-4607-bfc1-6b7391b44e05 partitionPath=default}
Key generator generated key HoodieKey { recordKey=ed0050b5-4519-423b-8532-53d6cdc357cd partitionPath=default}
rec1111 HoodieRecord{key=HoodieKey { recordKey=00fdb7aa-4112-4b68-b056-471672ebe2b9 partitionPath=default}, currentLocation='null', newLocation='null'}
rec1111 HoodieRecord{key=HoodieKey { recordKey=b797af8b-ff20-47c5-93b1-eafee85ff8cd partitionPath=default}, currentLocation='null', newLocation='null'}
rec1111 HoodieRecord{key=HoodieKey { recordKey=a1ec5a3c-64a6-41d7-aae2-c1a6115f9816 partitionPath=default}, currentLocation='null', newLocation='null'}
rec1111 HoodieRecord{key=HoodieKey { recordKey=ab085445-e0fa-4607-bfc1-6b7391b44e05 partitionPath=default}, currentLocation='null', newLocation='null'}
rec1111 HoodieRecord{key=HoodieKey { recordKey=922cd40c-e662-424e-8326-aec51ea68f81 partitionPath=default}, currentLocation='null', newLocation='null'}
rec1111 HoodieRecord{key=HoodieKey { recordKey=967bdc9d-4925-457d-a148-a4fa5deb6076 partitionPath=default}, currentLocation='null', newLocation='null'}
rec1111 HoodieRecord{key=HoodieKey { recordKey=e1dd7126-71fb-4876-9106-846c51031717 partitionPath=default}, currentLocation='null', newLocation='null'}
rec1111 HoodieRecord{key=HoodieKey { recordKey=a1f9a065-45af-44b7-8ba6-1869e3d49b09 partitionPath=default}, currentLocation='null', newLocation='null'}
rec1111 HoodieRecord{key=HoodieKey { recordKey=305589f2-4307-46ab-abef-f379d0bc9fda partitionPath=default}, currentLocation='null', newLocation='null'}
rec1111 HoodieRecord{key=HoodieKey { recordKey=ed0050b5-4519-423b-8532-53d6cdc357cd partitionPath=default}, currentLocation='null', newLocation='null'}
Part 1 :: default : ed0050b5-4519-423b-8532-53d6cdc357cd
within look up index
records per partition default -> 10
110347 [Executor task launch worker for task 55] ERROR org.apache.hudi.common.table.view.PriorityBasedFileSystemView - Got error running preferred function. Trying secondary
org.apache.hudi.exception.HoodieRemoteException: Connect to sivabala-c02xg219jgh6.attlocal.net:59649 [sivabala-c02xg219jgh6.attlocal.net/192.168.1.75, sivabala-c02xg219jgh6.attlocal.net/fe80:0:0:0:10d5:1e59:ee54:3076, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:bdf4:70a5:7838:f812, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:cf2:a39a:d9e3:27c4, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:0:0:0:2f] failed: Connection refused (Connection refused)
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getLatestBaseFilesFromParams(RemoteHoodieTableFileSystemView.java:241)
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getLatestBaseFilesBeforeOrOn(RemoteHoodieTableFileSystemView.java:248)
at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.execute(PriorityBasedFileSystemView.java:97)
at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.getLatestBaseFilesBeforeOrOn(PriorityBasedFileSystemView.java:134)
at org.apache.hudi.index.HoodieIndexUtils.lambda$getLatestBaseFilesForAllPartitions$ff6885d8$1(HoodieIndexUtils.java:58)
at org.apache.hudi.client.common.HoodieSparkEngineContext.lambda$flatMap$7d470b86$1(HoodieSparkEngineContext.java:78)
at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$1$1.apply(JavaRDDLike.scala:125)
at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$1$1.apply(JavaRDDLike.scala:125)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
at scala.collection.AbstractIterator.to(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1334)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:945)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:945)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.http.conn.HttpHostConnectException: Connect to sivabala-c02xg219jgh6.attlocal.net:59649 [sivabala-c02xg219jgh6.attlocal.net/192.168.1.75, sivabala-c02xg219jgh6.attlocal.net/fe80:0:0:0:10d5:1e59:ee54:3076, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:bdf4:70a5:7838:f812, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:cf2:a39a:d9e3:27c4, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:0:0:0:2f] failed: Connection refused (Connection refused)
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:151)
at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:353)
at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:380)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:184)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:107)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55)
at org.apache.http.client.fluent.Request.execute(Request.java:151)
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.executeRequest(RemoteHoodieTableFileSystemView.java:172)
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getLatestBaseFilesFromParams(RemoteHoodieTableFileSystemView.java:237)
... 32 more
Caused by: java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.apache.http.conn.socket.PlainConnectionSocketFactory.connectSocket(PlainConnectionSocketFactory.java:74) | |
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:134) | |
... 45 more | |
190258 [Executor task launch worker for task 56] ERROR org.apache.hudi.common.table.view.PriorityBasedFileSystemView - Got error running preferred function. Trying secondary
org.apache.hudi.exception.HoodieRemoteException: Connect to sivabala-c02xg219jgh6.attlocal.net:59649 [sivabala-c02xg219jgh6.attlocal.net/192.168.1.75, sivabala-c02xg219jgh6.attlocal.net/fe80:0:0:0:10d5:1e59:ee54:3076, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:0:0:0:2f, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:cf2:a39a:d9e3:27c4, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:bdf4:70a5:7838:f812] failed: Connection refused (Connection refused)
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getLatestBaseFile(RemoteHoodieTableFileSystemView.java:493)
at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.execute(PriorityBasedFileSystemView.java:97)
at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.getLatestBaseFile(PriorityBasedFileSystemView.java:140)
at org.apache.hudi.io.HoodieReadHandle.getLatestDataFile(HoodieReadHandle.java:62)
at org.apache.hudi.io.HoodieReadHandle.createNewFileReader(HoodieReadHandle.java:67)
at org.apache.hudi.io.HoodieRangeInfoHandle.getMinMaxKeys(HoodieRangeInfoHandle.java:39)
at org.apache.hudi.index.bloom.SparkHoodieBloomIndex.lambda$loadInvolvedFiles$dac7877d$1(SparkHoodieBloomIndex.java:209)
at org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1.apply(JavaPairRDD.scala:1040)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:410)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
at scala.collection.AbstractIterator.to(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1334)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:945)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:945)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.http.conn.HttpHostConnectException: Connect to sivabala-c02xg219jgh6.attlocal.net:59649 [sivabala-c02xg219jgh6.attlocal.net/192.168.1.75, sivabala-c02xg219jgh6.attlocal.net/fe80:0:0:0:10d5:1e59:ee54:3076, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:0:0:0:2f, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:cf2:a39a:d9e3:27c4, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:bdf4:70a5:7838:f812] failed: Connection refused (Connection refused)
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:151)
at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:353)
at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:380)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:184)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:107)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55)
at org.apache.http.client.fluent.Request.execute(Request.java:151)
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.executeRequest(RemoteHoodieTableFileSystemView.java:172)
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getLatestBaseFile(RemoteHoodieTableFileSystemView.java:489)
... 31 more
Caused by: java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.apache.http.conn.socket.PlainConnectionSocketFactory.connectSocket(PlainConnectionSocketFactory.java:74)
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:134)
... 44 more
file info list complete
Load involved files complete
file comparisons : fe710a73-1704-44f1-b1aa-5d8db686078d-0 -> HoodieKey { recordKey=b797af8b-ff20-47c5-93b1-eafee85ff8cd partitionPath=default}
file comparisons : fe710a73-1704-44f1-b1aa-5d8db686078d-0 -> HoodieKey { recordKey=a1ec5a3c-64a6-41d7-aae2-c1a6115f9816 partitionPath=default}
file comparisons : fe710a73-1704-44f1-b1aa-5d8db686078d-0 -> HoodieKey { recordKey=ab085445-e0fa-4607-bfc1-6b7391b44e05 partitionPath=default}
file comparisons : fe710a73-1704-44f1-b1aa-5d8db686078d-0 -> HoodieKey { recordKey=922cd40c-e662-424e-8326-aec51ea68f81 partitionPath=default}
file comparisons : fe710a73-1704-44f1-b1aa-5d8db686078d-0 -> HoodieKey { recordKey=967bdc9d-4925-457d-a148-a4fa5deb6076 partitionPath=default}
file comparisons : fe710a73-1704-44f1-b1aa-5d8db686078d-0 -> HoodieKey { recordKey=e1dd7126-71fb-4876-9106-846c51031717 partitionPath=default}
file comparisons : fe710a73-1704-44f1-b1aa-5d8db686078d-0 -> HoodieKey { recordKey=a1f9a065-45af-44b7-8ba6-1869e3d49b09 partitionPath=default}
file comparisons : fe710a73-1704-44f1-b1aa-5d8db686078d-0 -> HoodieKey { recordKey=305589f2-4307-46ab-abef-f379d0bc9fda partitionPath=default}
Just before findMatchingFilesForRecordKeys
270399 [Executor task launch worker for task 65] ERROR org.apache.hudi.common.table.view.PriorityBasedFileSystemView - Got error running preferred function. Trying secondary
org.apache.hudi.exception.HoodieRemoteException: Connect to sivabala-c02xg219jgh6.attlocal.net:59649 [sivabala-c02xg219jgh6.attlocal.net/192.168.1.75, sivabala-c02xg219jgh6.attlocal.net/fe80:0:0:0:10d5:1e59:ee54:3076, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:bdf4:70a5:7838:f812, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:0:0:0:2f, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:cf2:a39a:d9e3:27c4] failed: Connection refused (Connection refused)
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getLatestBaseFile(RemoteHoodieTableFileSystemView.java:493)
at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.execute(PriorityBasedFileSystemView.java:97)
at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.getLatestBaseFile(PriorityBasedFileSystemView.java:140)
at org.apache.hudi.io.HoodieReadHandle.getLatestDataFile(HoodieReadHandle.java:62)
at org.apache.hudi.io.HoodieReadHandle.createNewFileReader(HoodieReadHandle.java:67)
at org.apache.hudi.io.HoodieKeyLookupHandle.<init>(HoodieKeyLookupHandle.java:66)
at org.apache.hudi.index.bloom.HoodieBloomIndexCheckFunction$LazyKeyCheckIterator.computeNext(HoodieBloomIndexCheckFunction.java:87)
at org.apache.hudi.index.bloom.HoodieBloomIndexCheckFunction$LazyKeyCheckIterator.computeNext(HoodieBloomIndexCheckFunction.java:60)
at org.apache.hudi.client.utils.LazyIterableIterator.next(LazyIterableIterator.java:119)
at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:462)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
at scala.collection.AbstractIterator.to(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1334)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:945)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:945)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.http.conn.HttpHostConnectException: Connect to sivabala-c02xg219jgh6.attlocal.net:59649 [sivabala-c02xg219jgh6.attlocal.net/192.168.1.75, sivabala-c02xg219jgh6.attlocal.net/fe80:0:0:0:10d5:1e59:ee54:3076, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:bdf4:70a5:7838:f812, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:0:0:0:2f, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:cf2:a39a:d9e3:27c4] failed: Connection refused (Connection refused)
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:151)
at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:353)
at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:380)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:184)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:107)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55)
at org.apache.http.client.fluent.Request.execute(Request.java:151)
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.executeRequest(RemoteHoodieTableFileSystemView.java:172)
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getLatestBaseFile(RemoteHoodieTableFileSystemView.java:489)
... 36 more
Caused by: java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.apache.http.conn.socket.PlainConnectionSocketFactory.connectSocket(PlainConnectionSocketFactory.java:74)
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:134)
... 49 more
270430 [Executor task launch worker for task 65] WARN org.apache.hudi.common.table.view.PriorityBasedFileSystemView - Routing request to secondary file-system view
Total records (8), bloom filter candidates (0)/fp(0), actual matches (0)
350700 [Executor task launch worker for task 66] ERROR org.apache.hudi.common.table.view.PriorityBasedFileSystemView - Got error running preferred function. Trying secondary
org.apache.hudi.exception.HoodieRemoteException: Connect to sivabala-c02xg219jgh6.attlocal.net:59649 [sivabala-c02xg219jgh6.attlocal.net/192.168.1.75, sivabala-c02xg219jgh6.attlocal.net/fe80:0:0:0:10d5:1e59:ee54:3076, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:bdf4:70a5:7838:f812, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:0:0:0:2f, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:cf2:a39a:d9e3:27c4] failed: Connection refused (Connection refused)
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getLatestBaseFile(RemoteHoodieTableFileSystemView.java:493)
at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.execute(PriorityBasedFileSystemView.java:97)
at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.getLatestBaseFile(PriorityBasedFileSystemView.java:140)
at org.apache.hudi.io.HoodieReadHandle.getLatestDataFile(HoodieReadHandle.java:62)
at org.apache.hudi.io.HoodieReadHandle.createNewFileReader(HoodieReadHandle.java:67)
at org.apache.hudi.io.HoodieKeyLookupHandle.<init>(HoodieKeyLookupHandle.java:66)
at org.apache.hudi.index.bloom.HoodieBloomIndexCheckFunction$LazyKeyCheckIterator.computeNext(HoodieBloomIndexCheckFunction.java:87)
at org.apache.hudi.index.bloom.HoodieBloomIndexCheckFunction$LazyKeyCheckIterator.computeNext(HoodieBloomIndexCheckFunction.java:60)
at org.apache.hudi.client.utils.LazyIterableIterator.next(LazyIterableIterator.java:119)
at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:462)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
at org.apache.spark.storage.memory.MemoryStore.putIteratorAsBytes(MemoryStore.scala:349)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1182)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.http.conn.HttpHostConnectException: Connect to sivabala-c02xg219jgh6.attlocal.net:59649 [sivabala-c02xg219jgh6.attlocal.net/192.168.1.75, sivabala-c02xg219jgh6.attlocal.net/fe80:0:0:0:10d5:1e59:ee54:3076, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:bdf4:70a5:7838:f812, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:0:0:0:2f, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:cf2:a39a:d9e3:27c4] failed: Connection refused (Connection refused)
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:151)
at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:353)
at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:380)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:184)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:107)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55)
at org.apache.http.client.fluent.Request.execute(Request.java:151)
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.executeRequest(RemoteHoodieTableFileSystemView.java:172)
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getLatestBaseFile(RemoteHoodieTableFileSystemView.java:489)
... 30 more
Caused by: java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.apache.http.conn.socket.PlainConnectionSocketFactory.connectSocket(PlainConnectionSocketFactory.java:74)
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:134)
... 43 more
350764 [Executor task launch worker for task 66] WARN org.apache.hudi.common.table.view.PriorityBasedFileSystemView - Routing request to secondary file-system view
Total records (8), bloom filter candidates (0)/fp(0), actual matches (0)
Tagged rec :: HoodieRecord{key=HoodieKey { recordKey=00fdb7aa-4112-4b68-b056-471672ebe2b9 partitionPath=default}, currentLocation='null', newLocation='null'}
Tagged rec :: HoodieRecord{key=HoodieKey { recordKey=b797af8b-ff20-47c5-93b1-eafee85ff8cd partitionPath=default}, currentLocation='null', newLocation='null'}
Tagged rec :: HoodieRecord{key=HoodieKey { recordKey=a1ec5a3c-64a6-41d7-aae2-c1a6115f9816 partitionPath=default}, currentLocation='null', newLocation='null'}
Tagged rec :: HoodieRecord{key=HoodieKey { recordKey=ab085445-e0fa-4607-bfc1-6b7391b44e05 partitionPath=default}, currentLocation='null', newLocation='null'}
Tagged rec :: HoodieRecord{key=HoodieKey { recordKey=922cd40c-e662-424e-8326-aec51ea68f81 partitionPath=default}, currentLocation='null', newLocation='null'}
Tagged rec :: HoodieRecord{key=HoodieKey { recordKey=967bdc9d-4925-457d-a148-a4fa5deb6076 partitionPath=default}, currentLocation='null', newLocation='null'}
Tagged rec :: HoodieRecord{key=HoodieKey { recordKey=e1dd7126-71fb-4876-9106-846c51031717 partitionPath=default}, currentLocation='null', newLocation='null'}
Tagged rec :: HoodieRecord{key=HoodieKey { recordKey=a1f9a065-45af-44b7-8ba6-1869e3d49b09 partitionPath=default}, currentLocation='null', newLocation='null'}
Tagged rec :: HoodieRecord{key=HoodieKey { recordKey=305589f2-4307-46ab-abef-f379d0bc9fda partitionPath=default}, currentLocation='null', newLocation='null'}
Tagged rec :: HoodieRecord{key=HoodieKey { recordKey=ed0050b5-4519-423b-8532-53d6cdc357cd partitionPath=default}, currentLocation='null', newLocation='null'}
Tagging complete
Building workload profile
Completed building workload profile
431211 [Executor task launch worker for task 76] ERROR org.apache.hudi.common.table.view.PriorityBasedFileSystemView - Got error running preferred function. Trying secondary
org.apache.hudi.exception.HoodieRemoteException: Connect to sivabala-c02xg219jgh6.attlocal.net:59649 [sivabala-c02xg219jgh6.attlocal.net/192.168.1.75, sivabala-c02xg219jgh6.attlocal.net/fe80:0:0:0:10d5:1e59:ee54:3076, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:bdf4:70a5:7838:f812, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:cf2:a39a:d9e3:27c4, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:0:0:0:2f] failed: Connection refused (Connection refused)
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getLatestBaseFilesFromParams(RemoteHoodieTableFileSystemView.java:241)
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getLatestBaseFilesBeforeOrOn(RemoteHoodieTableFileSystemView.java:248)
at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.execute(PriorityBasedFileSystemView.java:97)
at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.getLatestBaseFilesBeforeOrOn(PriorityBasedFileSystemView.java:134)
at org.apache.hudi.table.action.commit.UpsertPartitioner.getSmallFiles(UpsertPartitioner.java:271)
at org.apache.hudi.table.action.commit.UpsertPartitioner.lambda$getSmallFilesForPartitions$f1d92f9e$1(UpsertPartitioner.java:252)
at org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.apply(JavaPairRDD.scala:1043)
at org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.apply(JavaPairRDD.scala:1043)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:410)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
at scala.collection.AbstractIterator.to(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1334)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:945)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:945)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.http.conn.HttpHostConnectException: Connect to sivabala-c02xg219jgh6.attlocal.net:59649 [sivabala-c02xg219jgh6.attlocal.net/192.168.1.75, sivabala-c02xg219jgh6.attlocal.net/fe80:0:0:0:10d5:1e59:ee54:3076, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:bdf4:70a5:7838:f812, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:cf2:a39a:d9e3:27c4, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:0:0:0:2f] failed: Connection refused (Connection refused)
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:151)
at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:353)
at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:380)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:184)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:107)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55)
at org.apache.http.client.fluent.Request.execute(Request.java:151)
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.executeRequest(RemoteHoodieTableFileSystemView.java:172)
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getLatestBaseFilesFromParams(RemoteHoodieTableFileSystemView.java:237)
... 31 more
Caused by: java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.apache.http.conn.socket.PlainConnectionSocketFactory.connectSocket(PlainConnectionSocketFactory.java:74)
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:134)
... 44 more
Write complete. Before post write
Handling upsert for partition default
510962 [Executor task launch worker for task 79] ERROR org.apache.hudi.common.table.view.PriorityBasedFileSystemView - Got error running preferred function. Trying secondary
org.apache.hudi.exception.HoodieRemoteException: Connect to sivabala-c02xg219jgh6.attlocal.net:59649 [sivabala-c02xg219jgh6.attlocal.net/192.168.1.75, sivabala-c02xg219jgh6.attlocal.net/fe80:0:0:0:10d5:1e59:ee54:3076, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:0:0:0:2f, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:cf2:a39a:d9e3:27c4, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:bdf4:70a5:7838:f812] failed: Connection refused (Connection refused)
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getLatestBaseFile(RemoteHoodieTableFileSystemView.java:493)
at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.execute(PriorityBasedFileSystemView.java:97)
at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.getLatestBaseFile(PriorityBasedFileSystemView.java:140)
at org.apache.hudi.io.HoodieMergeHandle.<init>(HoodieMergeHandle.java:111)
at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.getUpdateHandle(BaseSparkCommitActionExecutor.java:341)
at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleUpdate(BaseSparkCommitActionExecutor.java:311)
at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleUpsertPartition(BaseSparkCommitActionExecutor.java:284)
at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.lambda$execute$ecf5068c$1(BaseSparkCommitActionExecutor.java:140)
at org.apache.spark.api.java.JavaRDDLike$$anonfun$mapPartitionsWithIndex$1.apply(JavaRDDLike.scala:102)
at org.apache.spark.api.java.JavaRDDLike$$anonfun$mapPartitionsWithIndex$1.apply(JavaRDDLike.scala:102)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$25.apply(RDD.scala:853)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$25.apply(RDD.scala:853)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1182)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) | |
at java.lang.Thread.run(Thread.java:748) | |
Caused by: org.apache.http.conn.HttpHostConnectException: Connect to sivabala-c02xg219jgh6.attlocal.net:59649 [sivabala-c02xg219jgh6.attlocal.net/192.168.1.75, sivabala-c02xg219jgh6.attlocal.net/fe80:0:0:0:10d5:1e59:ee54:3076, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:0:0:0:2f, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:cf2:a39a:d9e3:27c4, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:bdf4:70a5:7838:f812] failed: Connection refused (Connection refused) | |
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:151) | |
at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:353) | |
at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:380) | |
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236) | |
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184) | |
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88) | |
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110) | |
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:184) | |
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82) | |
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:107) | |
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55) | |
at org.apache.http.client.fluent.Request.execute(Request.java:151) | |
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.executeRequest(RemoteHoodieTableFileSystemView.java:172) | |
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getLatestBaseFile(RemoteHoodieTableFileSystemView.java:489) | |
... 36 more | |
Caused by: java.net.ConnectException: Connection refused (Connection refused) | |
at java.net.PlainSocketImpl.socketConnect(Native Method) | |
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350) | |
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206) | |
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188) | |
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392) | |
at java.net.Socket.connect(Socket.java:589) | |
at org.apache.http.conn.socket.PlainConnectionSocketFactory.connectSocket(PlainConnectionSocketFactory.java:74) | |
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:134) | |
... 49 more | |
Handling upsert for file fe710a73-1704-44f1-b1aa-5d8db686078d-0. going to merge | |
Handling upsert for file fe710a73-1704-44f1-b1aa-5d8db686078d-0. Complete | |
591697 [main] ERROR org.apache.hudi.common.table.view.PriorityBasedFileSystemView - Got error running preferred function. Trying secondary | |
org.apache.hudi.exception.HoodieRemoteException: Connect to sivabala-c02xg219jgh6.attlocal.net:59649 [sivabala-c02xg219jgh6.attlocal.net/192.168.1.75, sivabala-c02xg219jgh6.attlocal.net/fe80:0:0:0:10d5:1e59:ee54:3076, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:bdf4:70a5:7838:f812, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:0:0:0:2f, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:cf2:a39a:d9e3:27c4] failed: Connection refused (Connection refused) | |
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getPendingCompactionOperations(RemoteHoodieTableFileSystemView.java:430) | |
at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.execute(PriorityBasedFileSystemView.java:66) | |
at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.getPendingCompactionOperations(PriorityBasedFileSystemView.java:214) | |
at org.apache.hudi.table.action.clean.CleanPlanner.<init>(CleanPlanner.java:90) | |
at org.apache.hudi.table.action.clean.BaseCleanActionExecutor.requestClean(BaseCleanActionExecutor.java:70) | |
at org.apache.hudi.table.action.clean.BaseCleanActionExecutor.requestClean(BaseCleanActionExecutor.java:129) | |
at org.apache.hudi.table.action.clean.BaseCleanActionExecutor.execute(BaseCleanActionExecutor.java:214) | |
at org.apache.hudi.table.HoodieSparkCopyOnWriteTable.clean(HoodieSparkCopyOnWriteTable.java:225) | |
at org.apache.hudi.client.AbstractHoodieWriteClient.clean(AbstractHoodieWriteClient.java:595) | |
at org.apache.hudi.client.AbstractHoodieWriteClient.clean(AbstractHoodieWriteClient.java:607) | |
at org.apache.hudi.client.AbstractHoodieWriteClient.autoCleanOnCommit(AbstractHoodieWriteClient.java:454) | |
at org.apache.hudi.client.AbstractHoodieWriteClient.postCommit(AbstractHoodieWriteClient.java:413) | |
at org.apache.hudi.client.AbstractHoodieWriteClient.commitStats(AbstractHoodieWriteClient.java:178) | |
at org.apache.hudi.client.SparkRDDWriteClient.commit(SparkRDDWriteClient.java:117) | |
at org.apache.hudi.client.SparkRDDWriteClient.commit(SparkRDDWriteClient.java:68) | |
at org.apache.hudi.client.AbstractHoodieWriteClient.commit(AbstractHoodieWriteClient.java:153) | |
at org.apache.hudi.utilities.deltastreamer.DeltaSync.writeToSink(DeltaSync.java:473) | |
at org.apache.hudi.utilities.deltastreamer.DeltaSync.syncOnce(DeltaSync.java:277) | |
at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.lambda$sync$2(HoodieDeltaStreamer.java:170) | |
at org.apache.hudi.common.util.Option.ifPresent(Option.java:96) | |
at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.sync(HoodieDeltaStreamer.java:168) | |
at org.apache.hudi.utilities.functional.TestHoodieDeltaStreamer.testJsonKafkaDFSSource(TestHoodieDeltaStreamer.java:1110) | |
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) | |
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) | |
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) | |
at java.lang.reflect.Method.invoke(Method.java:498) | |
at org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:688) | |
at org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60) | |
at org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131) | |
at org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149) | |
at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestableMethod(TimeoutExtension.java:140) | |
at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestMethod(TimeoutExtension.java:84) | |
at org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115) | |
at org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105) | |
at org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106) | |
at org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64) | |
at org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45) | |
at org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37) | |
at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104) | |
at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98) | |
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$invokeTestMethod$6(TestMethodTestDescriptor.java:212) | |
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) | |
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeTestMethod(TestMethodTestDescriptor.java:208) | |
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:137) | |
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:71) | |
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:139) | |
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) | |
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:129) | |
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) | |
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:127) | |
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) | |
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:126) | |
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:84) | |
at java.util.ArrayList.forEach(ArrayList.java:1257) | |
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:38) | |
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:143) | |
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) | |
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:129) | |
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) | |
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:127) | |
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) | |
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:126) | |
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:84) | |
at java.util.ArrayList.forEach(ArrayList.java:1257) | |
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:38) | |
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:143) | |
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) | |
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:129) | |
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) | |
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:127) | |
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) | |
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:126) | |
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:84) | |
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:32) | |
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57) | |
at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:51) | |
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:107) | |
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:87) | |
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:53) | |
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:66) | |
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:51) | |
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:87) | |
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:66) | |
at com.intellij.junit5.JUnit5IdeaTestRunner.startRunnerWithArgs(JUnit5IdeaTestRunner.java:71) | |
at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33) | |
at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:220) | |
at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:53) | |
Caused by: org.apache.http.conn.HttpHostConnectException: Connect to sivabala-c02xg219jgh6.attlocal.net:59649 [sivabala-c02xg219jgh6.attlocal.net/192.168.1.75, sivabala-c02xg219jgh6.attlocal.net/fe80:0:0:0:10d5:1e59:ee54:3076, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:bdf4:70a5:7838:f812, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:0:0:0:2f, sivabala-c02xg219jgh6.attlocal.net/2600:1700:2e50:aba0:cf2:a39a:d9e3:27c4] failed: Connection refused (Connection refused) | |
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:151) | |
at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:353) | |
at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:380) | |
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236) | |
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184) | |
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88) | |
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110) | |
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:184) | |
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82) | |
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:107) | |
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55) | |
at org.apache.http.client.fluent.Request.execute(Request.java:151) | |
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.executeRequest(RemoteHoodieTableFileSystemView.java:172) | |
at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getPendingCompactionOperations(RemoteHoodieTableFileSystemView.java:426) | |
... 86 more | |
Caused by: java.net.ConnectException: Connection refused (Connection refused) | |
at java.net.PlainSocketImpl.socketConnect(Native Method) | |
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350) | |
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206) | |
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188) | |
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392) | |
at java.net.Socket.connect(Socket.java:589) | |
at org.apache.http.conn.socket.PlainConnectionSocketFactory.connectSocket(PlainConnectionSocketFactory.java:74) | |
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:134) | |
... 99 more | |
591727 [main] WARN org.apache.hudi.DefaultSource - Loading Base File Only View. |