Created August 21, 2020 19:44
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.766 s - in org.apache.hudi.index.hbase.TestHBaseIndexUsage
[INFO] Running org.apache.hudi.index.hbase.TestHBaseIndex
Formatting using clusterid: testClusterID
71151 [main] WARN org.apache.hadoop.metrics2.impl.MetricsConfig - Cannot locate configuration: tried hadoop-metrics2-namenode.properties,hadoop-metrics2.properties
71293 [main] WARN org.apache.hadoop.http.HttpRequestLog - Jetty request log can only be enabled using Log4j
71913 [main] WARN org.apache.hadoop.http.HttpRequestLog - Jetty request log can only be enabled using Log4j
Formatting using clusterid: testClusterID
78013 [main] WARN org.apache.hadoop.http.HttpRequestLog - Jetty request log can only be enabled using Log4j
78208 [main] WARN org.apache.hadoop.http.HttpRequestLog - Jetty request log can only be enabled using Log4j
79416 [IPC Server handler 4 on 63947] WARN org.apache.hadoop.hdfs.StateChange - DIR* FSDirectory.unprotectedRenameTo: failed to rename /var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/junit6082867201785643239/.hoodie/hoodie.properties.updated to /var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/junit6082867201785643239/.hoodie/hoodie.properties because destination exists
80997 [HBase-Metrics2-1] WARN org.apache.hadoop.metrics2.impl.MetricsConfig - Cannot locate configuration: tried hadoop-metrics2-datanode.properties,hadoop-metrics2.properties
83926 [IPC Server handler 6 on 63947] WARN org.apache.hadoop.hdfs.StateChange - DIR* FSDirectory.unprotectedRenameTo: failed to rename /var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/junit4814721806904878482/.hoodie/hoodie.properties.updated to /var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/junit4814721806904878482/.hoodie/hoodie.properties because destination exists
87247 [IPC Server handler 0 on 63947] WARN org.apache.hadoop.hdfs.StateChange - DIR* FSDirectory.unprotectedRenameTo: failed to rename /var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/junit9084697497621722248/.hoodie/hoodie.properties.updated to /var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/junit9084697497621722248/.hoodie/hoodie.properties because destination exists
90965 [IPC Server handler 4 on 63947] WARN org.apache.hadoop.hdfs.StateChange - DIR* FSDirectory.unprotectedRenameTo: failed to rename /var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/junit7707938724298754502/.hoodie/hoodie.properties.updated to /var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/junit7707938724298754502/.hoodie/hoodie.properties because destination exists
95516 [IPC Server handler 7 on 63947] WARN org.apache.hadoop.hdfs.StateChange - DIR* FSDirectory.unprotectedRenameTo: failed to rename /var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/junit5309717706962091829/.hoodie/hoodie.properties.updated to /var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/junit5309717706962091829/.hoodie/hoodie.properties because destination exists
97436 [IPC Server handler 3 on 63947] WARN org.apache.hadoop.hdfs.StateChange - DIR* FSDirectory.unprotectedRenameTo: failed to rename /var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/junit5309717706962091829/.hoodie/hoodie.properties.updated to /var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/junit5309717706962091829/.hoodie/hoodie.properties because destination exists
109719 [IPC Server handler 7 on 63947] WARN org.apache.hadoop.hdfs.StateChange - DIR* FSDirectory.unprotectedRenameTo: failed to rename /var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/junit3139705364637912536/.hoodie/hoodie.properties.updated to /var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/junit3139705364637912536/.hoodie/hoodie.properties because destination exists
109756 [dispatcher-event-loop-6] WARN org.apache.spark.scheduler.TaskSetManager - Stage 112 contains a task of very large size (120 KB). The maximum recommended task size is 100 KB.
111716 [IPC Server handler 1 on 63947] WARN org.apache.hadoop.hdfs.StateChange - DIR* FSDirectory.unprotectedRenameTo: failed to rename /var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/junit6166181391565948342/.hoodie/hoodie.properties.updated to /var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/junit6166181391565948342/.hoodie/hoodie.properties because destination exists
111753 [dispatcher-event-loop-5] WARN org.apache.spark.scheduler.TaskSetManager - Stage 115 contains a task of very large size (120 KB). The maximum recommended task size is 100 KB.
119972 [HBase-Metrics2-1] WARN org.apache.hadoop.metrics2.impl.MetricsConfig - Cannot locate configuration: tried hadoop-metrics2-datanode.properties,hadoop-metrics2.properties
125486 [main] WARN org.apache.hadoop.hdfs.server.datanode.DirectoryScanner - DirectoryScanner: shutdown has been called
125606 [DataNode: [[[DISK]file:/Users/sivabala/Documents/personal/projects/siva_hudi/apache_hudi/hudi/hudi-client/target/test-data/a40265f2-7472-4595-b6f1-a7aa2f889739/dfscluster_482b323f-6edc-4a78-bc7a-e0baf83c106c/dfs/data/data1/, [DISK]file:/Users/sivabala/Documents/personal/projects/siva_hudi/apache_hudi/hudi/hudi-client/target/test-data/a40265f2-7472-4595-b6f1-a7aa2f889739/dfscluster_482b323f-6edc-4a78-bc7a-e0baf83c106c/dfs/data/data2/]] heartbeating to localhost/127.0.0.1:63947] WARN org.apache.hadoop.hdfs.server.datanode.DataNode - BPOfferService for Block pool BP-1360534292-127.0.0.1-1598038564530 (Datanode Uuid 6db66987-7a89-44c6-be4e-9b1ac0d1500c) service to localhost/127.0.0.1:63947 interrupted
125606 [DataNode: [[[DISK]file:/Users/sivabala/Documents/personal/projects/siva_hudi/apache_hudi/hudi/hudi-client/target/test-data/a40265f2-7472-4595-b6f1-a7aa2f889739/dfscluster_482b323f-6edc-4a78-bc7a-e0baf83c106c/dfs/data/data1/, [DISK]file:/Users/sivabala/Documents/personal/projects/siva_hudi/apache_hudi/hudi/hudi-client/target/test-data/a40265f2-7472-4595-b6f1-a7aa2f889739/dfscluster_482b323f-6edc-4a78-bc7a-e0baf83c106c/dfs/data/data2/]] heartbeating to localhost/127.0.0.1:63947] WARN org.apache.hadoop.hdfs.server.datanode.DataNode - Ending block pool service for: Block pool BP-1360534292-127.0.0.1-1598038564530 (Datanode Uuid 6db66987-7a89-44c6-be4e-9b1ac0d1500c) service to localhost/127.0.0.1:63947
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 55.551 s - in org.apache.hudi.index.hbase.TestHBaseIndex
[INFO] Running org.apache.hudi.index.hbase.TestHBasePutBatchSizeCalculator
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 s - in org.apache.hudi.index.hbase.TestHBasePutBatchSizeCalculator
[INFO] Running org.apache.hudi.index.TestHoodieIndex
126340 [Executor task launch worker for task 0-SendThread(localhost:64262)] WARN org.apache.zookeeper.ClientCnxn - Session 0x174128564e10009 for server null, unexpected error, closing socket connection and attempting reconnect
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
	at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1081)
126702 [localhost:63956.activeMasterManager-SendThread(localhost:64262)] WARN org.apache.zookeeper.ClientCnxn - Session 0x174128564e10005 for server null, unexpected error, closing socket connection and attempting reconnect
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
	at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1081)
128143 [Executor task launch worker for task 0-SendThread(localhost:64262)] WARN org.apache.zookeeper.ClientCnxn - Session 0x174128564e10009 for server null, unexpected error, closing socket connection and attempting reconnect
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
	at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1081)
[ERROR] Tests run: 15, Failures: 0, Errors: 15, Skipped: 0, Time elapsed: 2.432 s <<< FAILURE! - in org.apache.hudi.index.TestHoodieIndex
[ERROR] org.apache.hudi.index.TestHoodieIndex.testTagLocationAndFetchRecordLocations(HoodieIndex$IndexType) Time elapsed: 0.327 s <<< ERROR!
org.apache.spark.SparkException:
Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
org.apache.hudi.testutils.FunctionalTestHarness.runBeforeEach(FunctionalTestHarness.java:132)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:688)
org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptLifecycleMethod(TimeoutExtension.java:126)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptBeforeEachMethod(TimeoutExtension.java:76)
org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
	at org.apache.hudi.index.TestHoodieIndex.setUp(TestHoodieIndex.java:81)
	at org.apache.hudi.index.TestHoodieIndex.testTagLocationAndFetchRecordLocations(TestHoodieIndex.java:249)
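Note: every TestHoodieIndex error above is the same SparkException raised in setUp — a SparkContext created earlier (via FunctionalTestHarness.runBeforeEach) was never stopped, so each subsequent test's attempt to create one fails. A minimal sketch of the usual remedy, assuming a JUnit 5 test with Spark on the classpath (the class and method names here are illustrative, not from the Hudi codebase):

```java
// Sketch only: requires spark-core and junit-jupiter on the classpath.
// The durable fix is stopping the context in teardown; the
// spark.driver.allowMultipleContexts flag named in the error message
// merely masks the leak.
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;

class SparkContextLifecycleSketch {  // hypothetical test class

  private JavaSparkContext jsc;

  @BeforeEach
  void setUp() {
    SparkConf conf = new SparkConf()
        .setMaster("local[2]")
        .setAppName("hudi-index-test");
    jsc = new JavaSparkContext(conf);  // fails if a context already exists in this JVM
  }

  @AfterEach
  void tearDown() {
    if (jsc != null) {
      jsc.stop();  // release the per-JVM context so the next test can create one
      jsc = null;
    }
  }
}
```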
[ERROR] org.apache.hudi.index.TestHoodieIndex.testTagLocationAndFetchRecordLocations(HoodieIndex$IndexType) Time elapsed: 0.185 s <<< ERROR!
org.apache.spark.SparkException:
Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
org.apache.hudi.testutils.FunctionalTestHarness.runBeforeEach(FunctionalTestHarness.java:132)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:688)
org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptLifecycleMethod(TimeoutExtension.java:126)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptBeforeEachMethod(TimeoutExtension.java:76)
org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
	at org.apache.hudi.index.TestHoodieIndex.setUp(TestHoodieIndex.java:81)
	at org.apache.hudi.index.TestHoodieIndex.testTagLocationAndFetchRecordLocations(TestHoodieIndex.java:249)
[ERROR] org.apache.hudi.index.TestHoodieIndex.testSimpleTagLocationAndUpdateWithRollback(HoodieIndex$IndexType) Time elapsed: 0.156 s <<< ERROR!
org.apache.spark.SparkException:
Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
org.apache.hudi.testutils.FunctionalTestHarness.runBeforeEach(FunctionalTestHarness.java:132)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:688)
org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptLifecycleMethod(TimeoutExtension.java:126)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptBeforeEachMethod(TimeoutExtension.java:76)
org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
	at org.apache.hudi.index.TestHoodieIndex.setUp(TestHoodieIndex.java:81)
	at org.apache.hudi.index.TestHoodieIndex.testSimpleTagLocationAndUpdateWithRollback(TestHoodieIndex.java:197)
[ERROR] org.apache.hudi.index.TestHoodieIndex.testSimpleTagLocationAndUpdateWithRollback(HoodieIndex$IndexType) Time elapsed: 0.201 s <<< ERROR!
org.apache.spark.SparkException:
Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
org.apache.hudi.testutils.FunctionalTestHarness.runBeforeEach(FunctionalTestHarness.java:132)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:688)
org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptLifecycleMethod(TimeoutExtension.java:126)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptBeforeEachMethod(TimeoutExtension.java:76)
org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
	at org.apache.hudi.index.TestHoodieIndex.setUp(TestHoodieIndex.java:81)
	at org.apache.hudi.index.TestHoodieIndex.testSimpleTagLocationAndUpdateWithRollback(TestHoodieIndex.java:197)
[ERROR] org.apache.hudi.index.TestHoodieIndex.testSimpleTagLocationAndUpdateWithRollback(HoodieIndex$IndexType) Time elapsed: 0.191 s <<< ERROR!
org.apache.spark.SparkException:
Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
org.apache.hudi.testutils.FunctionalTestHarness.runBeforeEach(FunctionalTestHarness.java:132)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:688)
org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptLifecycleMethod(TimeoutExtension.java:126)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptBeforeEachMethod(TimeoutExtension.java:76)
org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
	at org.apache.hudi.index.TestHoodieIndex.setUp(TestHoodieIndex.java:81)
	at org.apache.hudi.index.TestHoodieIndex.testSimpleTagLocationAndUpdateWithRollback(TestHoodieIndex.java:197)
[ERROR] org.apache.hudi.index.TestHoodieIndex.testSimpleTagLocationAndUpdateWithRollback(HoodieIndex$IndexType) Time elapsed: 0.198 s <<< ERROR!
org.apache.spark.SparkException:
Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
org.apache.hudi.testutils.FunctionalTestHarness.runBeforeEach(FunctionalTestHarness.java:132)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:688)
org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptLifecycleMethod(TimeoutExtension.java:126)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptBeforeEachMethod(TimeoutExtension.java:76)
org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
	at org.apache.hudi.index.TestHoodieIndex.setUp(TestHoodieIndex.java:81)
	at org.apache.hudi.index.TestHoodieIndex.testSimpleTagLocationAndUpdateWithRollback(TestHoodieIndex.java:197)
[ERROR] org.apache.hudi.index.TestHoodieIndex.testTagLocationAndDuplicateUpdate(HoodieIndex$IndexType) Time elapsed: 0.11 s <<< ERROR!
org.apache.spark.SparkException:
Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
org.apache.hudi.testutils.FunctionalTestHarness.runBeforeEach(FunctionalTestHarness.java:132)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:688)
org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptLifecycleMethod(TimeoutExtension.java:126)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptBeforeEachMethod(TimeoutExtension.java:76)
org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
	at org.apache.hudi.index.TestHoodieIndex.setUp(TestHoodieIndex.java:81)
	at org.apache.hudi.index.TestHoodieIndex.testTagLocationAndDuplicateUpdate(TestHoodieIndex.java:147)
[ERROR] org.apache.hudi.index.TestHoodieIndex.testTagLocationAndDuplicateUpdate(HoodieIndex$IndexType) Time elapsed: 0.11 s <<< ERROR!
org.apache.spark.SparkException:
Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
org.apache.hudi.testutils.FunctionalTestHarness.runBeforeEach(FunctionalTestHarness.java:132)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:688)
org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptLifecycleMethod(TimeoutExtension.java:126)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptBeforeEachMethod(TimeoutExtension.java:76)
org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
	at org.apache.hudi.index.TestHoodieIndex.setUp(TestHoodieIndex.java:81)
	at org.apache.hudi.index.TestHoodieIndex.testTagLocationAndDuplicateUpdate(TestHoodieIndex.java:147)
[ERROR] org.apache.hudi.index.TestHoodieIndex.testTagLocationAndDuplicateUpdate(HoodieIndex$IndexType) Time elapsed: 0.111 s <<< ERROR!
org.apache.spark.SparkException:
Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
org.apache.hudi.testutils.FunctionalTestHarness.runBeforeEach(FunctionalTestHarness.java:132)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:688)
org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptLifecycleMethod(TimeoutExtension.java:126)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptBeforeEachMethod(TimeoutExtension.java:76)
org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
	at org.apache.hudi.index.TestHoodieIndex.setUp(TestHoodieIndex.java:81)
	at org.apache.hudi.index.TestHoodieIndex.testTagLocationAndDuplicateUpdate(TestHoodieIndex.java:147)
[ERROR] org.apache.hudi.index.TestHoodieIndex.testTagLocationAndDuplicateUpdate(HoodieIndex$IndexType) Time elapsed: 0.123 s <<< ERROR!
org.apache.spark.SparkException:
Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
org.apache.hudi.testutils.FunctionalTestHarness.runBeforeEach(FunctionalTestHarness.java:132)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:688)
org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptLifecycleMethod(TimeoutExtension.java:126)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptBeforeEachMethod(TimeoutExtension.java:76)
org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
at org.apache.hudi.index.TestHoodieIndex.setUp(TestHoodieIndex.java:81)
at org.apache.hudi.index.TestHoodieIndex.testTagLocationAndDuplicateUpdate(TestHoodieIndex.java:147)
[ERROR] org.apache.hudi.index.TestHoodieIndex.testSimpleGlobalIndexTagLocationWhenShouldUpdatePartitionPath(HoodieIndex$IndexType) Time elapsed: 0.13 s <<< ERROR!
org.apache.spark.SparkException:
Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
org.apache.hudi.testutils.FunctionalTestHarness.runBeforeEach(FunctionalTestHarness.java:132)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:688)
org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptLifecycleMethod(TimeoutExtension.java:126)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptBeforeEachMethod(TimeoutExtension.java:76)
org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
at org.apache.hudi.index.TestHoodieIndex.setUp(TestHoodieIndex.java:81)
at org.apache.hudi.index.TestHoodieIndex.testSimpleGlobalIndexTagLocationWhenShouldUpdatePartitionPath(TestHoodieIndex.java:333)
[ERROR] org.apache.hudi.index.TestHoodieIndex.testSimpleTagLocationAndUpdate(HoodieIndex$IndexType) Time elapsed: 0.135 s <<< ERROR!
org.apache.spark.SparkException:
Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
org.apache.hudi.testutils.FunctionalTestHarness.runBeforeEach(FunctionalTestHarness.java:132)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:688)
org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptLifecycleMethod(TimeoutExtension.java:126)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptBeforeEachMethod(TimeoutExtension.java:76)
org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
at org.apache.hudi.index.TestHoodieIndex.setUp(TestHoodieIndex.java:81)
at org.apache.hudi.index.TestHoodieIndex.testSimpleTagLocationAndUpdate(TestHoodieIndex.java:97)
[ERROR] org.apache.hudi.index.TestHoodieIndex.testSimpleTagLocationAndUpdate(HoodieIndex$IndexType) Time elapsed: 0.134 s <<< ERROR!
org.apache.spark.SparkException:
Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
org.apache.hudi.testutils.FunctionalTestHarness.runBeforeEach(FunctionalTestHarness.java:132)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:688)
org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptLifecycleMethod(TimeoutExtension.java:126)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptBeforeEachMethod(TimeoutExtension.java:76)
org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
at org.apache.hudi.index.TestHoodieIndex.setUp(TestHoodieIndex.java:81)
at org.apache.hudi.index.TestHoodieIndex.testSimpleTagLocationAndUpdate(TestHoodieIndex.java:97)
[ERROR] org.apache.hudi.index.TestHoodieIndex.testSimpleTagLocationAndUpdate(HoodieIndex$IndexType) Time elapsed: 0.184 s <<< ERROR!
org.apache.spark.SparkException:
Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
org.apache.hudi.testutils.FunctionalTestHarness.runBeforeEach(FunctionalTestHarness.java:132)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:688)
org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptLifecycleMethod(TimeoutExtension.java:126)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptBeforeEachMethod(TimeoutExtension.java:76)
org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
at org.apache.hudi.index.TestHoodieIndex.setUp(TestHoodieIndex.java:81)
at org.apache.hudi.index.TestHoodieIndex.testSimpleTagLocationAndUpdate(TestHoodieIndex.java:97)
[ERROR] org.apache.hudi.index.TestHoodieIndex.testSimpleTagLocationAndUpdate(HoodieIndex$IndexType) Time elapsed: 0.105 s <<< ERROR!
org.apache.spark.SparkException:
Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
org.apache.hudi.testutils.FunctionalTestHarness.runBeforeEach(FunctionalTestHarness.java:132)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:688)
org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptLifecycleMethod(TimeoutExtension.java:126)
org.junit.jupiter.engine.extension.TimeoutExtension.interceptBeforeEachMethod(TimeoutExtension.java:76)
org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
at org.apache.hudi.index.TestHoodieIndex.setUp(TestHoodieIndex.java:81)
at org.apache.hudi.index.TestHoodieIndex.testSimpleTagLocationAndUpdate(TestHoodieIndex.java:97)
128309 [Thread-908] WARN org.apache.hadoop.hdfs.server.datanode.DirectoryScanner - DirectoryScanner: shutdown has been called
128432 [DataNode: [[[DISK]file:/private/var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/temp339958385408347581/dfs/data/data1/, [DISK]file:/private/var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/temp339958385408347581/dfs/data/data2/]] heartbeating to localhost/127.0.0.1:63998] WARN org.apache.hadoop.hdfs.server.datanode.DataNode - BPOfferService for Block pool BP-1511846068-127.0.0.1-1598038571425 (Datanode Uuid effdbea5-a3fa-4b66-8b2f-16a1e5a6545d) service to localhost/127.0.0.1:63998 interrupted
128432 [DataNode: [[[DISK]file:/private/var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/temp339958385408347581/dfs/data/data1/, [DISK]file:/private/var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/temp339958385408347581/dfs/data/data2/]] heartbeating to localhost/127.0.0.1:63998] WARN org.apache.hadoop.hdfs.server.datanode.DataNode - Ending block pool service for: Block pool BP-1511846068-127.0.0.1-1598038571425 (Datanode Uuid effdbea5-a3fa-4b66-8b2f-16a1e5a6545d) service to localhost/127.0.0.1:63998
128559 [localhost:63956.activeMasterManager-SendThread(localhost:64262)] WARN org.apache.zookeeper.ClientCnxn - Session 0x174128564e10005 for server null, unexpected error, closing socket connection and attempting reconnect
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1081)
129203 [Executor task launch worker for task 0-SendThread(localhost:64262)] WARN org.apache.zookeeper.ClientCnxn - Session 0x174128564e10009 for server null, unexpected error, closing socket connection and attempting reconnect
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1081)
129242 [localhost:63956.activeMasterManager-SendThread(localhost:64262)] WARN org.apache.zookeeper.ClientCnxn - Session 0x174128564e10005 for server null, unexpected error, closing socket connection and attempting reconnect
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1081)
[INFO]
[INFO] Results:
[INFO]
[ERROR] Errors:
[ERROR] TestHoodieIndex.testSimpleGlobalIndexTagLocationWhenShouldUpdatePartitionPath:333->setUp:81->HoodieClientTestHarness.initResources:100->HoodieClientTestHarness.initSparkContexts:139->HoodieClientTestHarness.initSparkContexts:126 » Spark
[ERROR] org.apache.hudi.index.TestHoodieIndex.testSimpleTagLocationAndUpdate(HoodieIndex$IndexType)
[ERROR] Run 1: TestHoodieIndex.testSimpleTagLocationAndUpdate:97->setUp:81->HoodieClientTestHarness.initResources:100->HoodieClientTestHarness.initSparkContexts:139->HoodieClientTestHarness.initSparkContexts:126 » Spark
[ERROR] Run 2: TestHoodieIndex.testSimpleTagLocationAndUpdate:97->setUp:81->HoodieClientTestHarness.initResources:100->HoodieClientTestHarness.initSparkContexts:139->HoodieClientTestHarness.initSparkContexts:126 » Spark
[ERROR] Run 3: TestHoodieIndex.testSimpleTagLocationAndUpdate:97->setUp:81->HoodieClientTestHarness.initResources:100->HoodieClientTestHarness.initSparkContexts:139->HoodieClientTestHarness.initSparkContexts:126 » Spark
[ERROR] Run 4: TestHoodieIndex.testSimpleTagLocationAndUpdate:97->setUp:81->HoodieClientTestHarness.initResources:100->HoodieClientTestHarness.initSparkContexts:139->HoodieClientTestHarness.initSparkContexts:126 » Spark
[INFO]
[ERROR] org.apache.hudi.index.TestHoodieIndex.testSimpleTagLocationAndUpdateWithRollback(HoodieIndex$IndexType)
[ERROR] Run 1: TestHoodieIndex.testSimpleTagLocationAndUpdateWithRollback:197->setUp:81->HoodieClientTestHarness.initResources:100->HoodieClientTestHarness.initSparkContexts:139->HoodieClientTestHarness.initSparkContexts:126 » Spark
[ERROR] Run 2: TestHoodieIndex.testSimpleTagLocationAndUpdateWithRollback:197->setUp:81->HoodieClientTestHarness.initResources:100->HoodieClientTestHarness.initSparkContexts:139->HoodieClientTestHarness.initSparkContexts:126 » Spark
[ERROR] Run 3: TestHoodieIndex.testSimpleTagLocationAndUpdateWithRollback:197->setUp:81->HoodieClientTestHarness.initResources:100->HoodieClientTestHarness.initSparkContexts:139->HoodieClientTestHarness.initSparkContexts:126 » Spark
[ERROR] Run 4: TestHoodieIndex.testSimpleTagLocationAndUpdateWithRollback:197->setUp:81->HoodieClientTestHarness.initResources:100->HoodieClientTestHarness.initSparkContexts:139->HoodieClientTestHarness.initSparkContexts:126 » Spark
[INFO]
[ERROR] org.apache.hudi.index.TestHoodieIndex.testTagLocationAndDuplicateUpdate(HoodieIndex$IndexType)
[ERROR] Run 1: TestHoodieIndex.testTagLocationAndDuplicateUpdate:147->setUp:81->HoodieClientTestHarness.initResources:100->HoodieClientTestHarness.initSparkContexts:139->HoodieClientTestHarness.initSparkContexts:126 » Spark
[ERROR] Run 2: TestHoodieIndex.testTagLocationAndDuplicateUpdate:147->setUp:81->HoodieClientTestHarness.initResources:100->HoodieClientTestHarness.initSparkContexts:139->HoodieClientTestHarness.initSparkContexts:126 » Spark
[ERROR] Run 3: TestHoodieIndex.testTagLocationAndDuplicateUpdate:147->setUp:81->HoodieClientTestHarness.initResources:100->HoodieClientTestHarness.initSparkContexts:139->HoodieClientTestHarness.initSparkContexts:126 » Spark
[ERROR] Run 4: TestHoodieIndex.testTagLocationAndDuplicateUpdate:147->setUp:81->HoodieClientTestHarness.initResources:100->HoodieClientTestHarness.initSparkContexts:139->HoodieClientTestHarness.initSparkContexts:126 » Spark
[INFO]
[ERROR] org.apache.hudi.index.TestHoodieIndex.testTagLocationAndFetchRecordLocations(HoodieIndex$IndexType)
[ERROR] Run 1: TestHoodieIndex.testTagLocationAndFetchRecordLocations:249->setUp:81->HoodieClientTestHarness.initResources:100->HoodieClientTestHarness.initSparkContexts:139->HoodieClientTestHarness.initSparkContexts:126 » Spark
[ERROR] Run 2: TestHoodieIndex.testTagLocationAndFetchRecordLocations:249->setUp:81->HoodieClientTestHarness.initResources:100->HoodieClientTestHarness.initSparkContexts:139->HoodieClientTestHarness.initSparkContexts:126 » Spark
[INFO]
[INFO]
[ERROR] Tests run: 63, Failures: 0, Errors: 5, Skipped: 0
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Hudi 0.6.0-rc1:
[INFO]
[INFO] Hudi ............................................... SUCCESS [  1.658 s]
[INFO] hudi-common ........................................ SUCCESS [  6.517 s]
[INFO] hudi-timeline-service .............................. SUCCESS [  2.304 s]
[INFO] hudi-hadoop-mr ..................................... SUCCESS [  3.408 s]
[INFO] hudi-client ........................................ FAILURE [02:17 min]
[INFO] hudi-sync-common ................................... SKIPPED
[INFO] hudi-hive-sync ..................................... SKIPPED
[INFO] hudi-spark_2.11 .................................... SKIPPED
[INFO] hudi-utilities_2.11 ................................ SKIPPED
[INFO] hudi-utilities-bundle_2.11 ......................... SKIPPED
[INFO] hudi-cli ........................................... SKIPPED
[INFO] hudi-dla-sync ...................................... SKIPPED
[INFO] hudi-sync .......................................... SKIPPED
[INFO] hudi-hadoop-mr-bundle .............................. SKIPPED
[INFO] hudi-hive-sync-bundle .............................. SKIPPED
[INFO] hudi-spark-bundle_2.11 ............................. SKIPPED
[INFO] hudi-presto-bundle ................................. SKIPPED
[INFO] hudi-timeline-server-bundle ........................ SKIPPED
[INFO] hudi-hadoop-docker ................................. SKIPPED
[INFO] hudi-hadoop-base-docker ............................ SKIPPED
[INFO] hudi-hadoop-namenode-docker ........................ SKIPPED
[INFO] hudi-hadoop-datanode-docker ........................ SKIPPED
[INFO] hudi-hadoop-history-docker ......................... SKIPPED
[INFO] hudi-hadoop-hive-docker ............................ SKIPPED
[INFO] hudi-hadoop-sparkbase-docker ....................... SKIPPED
[INFO] hudi-hadoop-sparkmaster-docker ..................... SKIPPED
[INFO] hudi-hadoop-sparkworker-docker ..................... SKIPPED
[INFO] hudi-hadoop-sparkadhoc-docker ...................... SKIPPED
[INFO] hudi-hadoop-presto-docker .......................... SKIPPED
[INFO] hudi-integ-test .................................... SKIPPED
[INFO] hudi-integ-test-bundle ............................. SKIPPED
[INFO] hudi-examples ...................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:31 min
[INFO] Finished at: 2020-08-21T15:37:03-04:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M4:test (default-test) on project hudi-client: There are test failures.
[ERROR]
[ERROR] Please refer to /Users/sivabala/Documents/personal/projects/siva_hudi/apache_hudi/hudi/hudi-client/target/surefire-reports for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date].dump, [date]-jvmRun[N].dump and [date].dumpstream.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hudi-client