n] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Formatting using clusterid: testClusterID
1701 [main] WARN org.apache.hadoop.metrics2.impl.MetricsConfig - Cannot locate configuration: tried hadoop-metrics2-namenode.properties,hadoop-metrics2.properties
1962 [main] WARN org.apache.hadoop.http.HttpRequestLog - Jetty request log can only be enabled using Log4j
3638 [main] WARN org.apache.hadoop.http.HttpRequestLog - Jetty request log can only be enabled using Log4j
5566 [main] WARN org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.metastore.local does not exist
5786 [main] WARN org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.metastore.local does not exist
10680 [main] WARN org.apache.hadoop.hive.metastore.ObjectStore - Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 2.3.0
10680 [main] WARN org.apache.hadoop.hive.metastore.ObjectStore - setMetaStoreSchemaVersion called but recording version is disabled: version = 2.3.0, comment = Set by MetaStore sivabala@192.168.1.75
10697 [main] WARN org.apache.hadoop.hive.metastore.ObjectStore - Failed to get database default, returning NoSuchObjectException
11667 [main] WARN org.apache.hadoop.hive.ql.session.SessionState - METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
12521 [pool-18-thread-1] WARN org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.metastore.local does not exist
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
14915 [HiveServer2-Handler-Pool: Thread-127] WARN org.apache.hadoop.security.ShellBasedUnixGroupsMapping - got exception trying to get groups for user anonymous: id: anonymous: no such user
id: anonymous: no such user
15001 [pool-18-thread-3] WARN org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.metastore.local does not exist
15001 [pool-18-thread-3] WARN org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
16433 [pool-18-thread-3] WARN org.apache.hadoop.hive.metastore.ObjectStore - Failed to get database testdb1, returning NoSuchObjectException
OK
16582 [pool-18-thread-3] WARN org.apache.hadoop.hive.metastore.ObjectStore - Failed to get database testdb1, returning NoSuchObjectException
OK
17318 [main] WARN org.apache.spark.util.Utils - Your hostname, sivabala-C02XG219JGH6 resolves to a loopback address: 127.0.0.1; using 192.168.1.75 instead (on interface en0)
17318 [main] WARN org.apache.spark.util.Utils - Set SPARK_LOCAL_IP if you need to bind to another address
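The bind-address warning above is harmless in a local run, and it can also be silenced in-process instead of exporting SPARK_LOCAL_IP. A minimal sketch, assuming a plain local-mode session (the app name and address are illustrative, not taken from the test):

```java
import org.apache.spark.sql.SparkSession;

public final class LocalSparkBind {
  // Sketch: pin the driver bind address so Spark does not have to guess
  // between the loopback address and the LAN interface.
  public static SparkSession localSession() {
    return SparkSession.builder()
        .master("local[2]")
        .appName("hudi-test-suite-local")                 // illustrative name
        .config("spark.driver.bindAddress", "127.0.0.1")  // alternative to SPARK_LOCAL_IP
        .getOrCreate();
  }
}
```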
20863 [main] WARN org.apache.spark.sql.SparkSession$Builder - Using an existing SparkSession; some configuration may not take effect.
20890 [main] WARN org.apache.spark.sql.SparkSession$Builder - Using an existing SparkSession; some configuration may not take effect.
21254 [main] WARN org.apache.spark.sql.SparkSession$Builder - Using an existing SparkSession; some configuration may not take effect.
21773 [main] WARN org.apache.hudi.integ.testsuite.dag.scheduler.DagScheduler - Running workloads
21774 [pool-38-thread-1] WARN org.apache.hudi.integ.testsuite.dag.scheduler.DagScheduler - executing node: 101d8b7a-3a0f-4bde-9154-c9cbe7857601 of type: class org.apache.hudi.integ.testsuite.dag.nodes.InsertNode
23084 [Executor task launch worker for task 2] WARN org.apache.hudi.integ.testsuite.generator.GenericRecordFullPayloadGenerator - The schema does not have any collections/complex fields. Cannot achieve minPayloadSize : 70000
23400 [Executor task launch worker for task 3] WARN org.apache.avro.mapreduce.AvroKeyInputFormat - Reader schema was not set. Use AvroJob.setInputKeySchema() if desired.
23768 [Executor task launch worker for task 4] WARN org.apache.avro.mapreduce.AvroKeyInputFormat - Reader schema was not set. Use AvroJob.setInputKeySchema() if desired.
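The two AvroKeyInputFormat warnings are benign: with no reader schema registered, Avro falls back to the writer schema embedded in each file. A minimal sketch of the remedy the message itself points at (the helper name and variables are hypothetical, not from the test suite):

```java
import org.apache.avro.Schema;
import org.apache.avro.mapreduce.AvroJob;
import org.apache.avro.mapreduce.AvroKeyInputFormat;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public final class AvroJobSetup {
  // Sketch: register an explicit reader schema so AvroKeyInputFormat
  // stops logging "Reader schema was not set".
  public static Job newAvroJob(Configuration conf, Schema readerSchema) throws Exception {
    Job job = Job.getInstance(conf);
    job.setInputFormatClass(AvroKeyInputFormat.class);
    AvroJob.setInputKeySchema(job, readerSchema);
    return job;
  }
}
```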
213549 [pool-38-thread-2] WARN org.apache.hudi.integ.testsuite.dag.scheduler.DagScheduler - executing node: 02cfe582-7ce4-441c-b703-74dc2afcf2ea of type: class org.apache.hudi.integ.testsuite.dag.nodes.HiveSyncNode
214241 [HiveServer2-Handler-Pool: Thread-321] WARN org.apache.hadoop.security.ShellBasedUnixGroupsMapping - got exception trying to get groups for user hive: id: hive: no such user
id: hive: no such user
214278 [pool-18-thread-4] WARN org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.metastore.local does not exist
214279 [pool-18-thread-4] WARN org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
215117 [pool-18-thread-4] ERROR org.apache.hadoop.hive.metastore.RetryingHMSHandler - AlreadyExistsException(message:Database testdb1 already exists)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_database(HiveMetaStore.java:921)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
at com.sun.proxy.$Proxy40.create_database(Unknown Source)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_database.getResult(ThriftHiveMetastore.java:10721)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_database.getResult(ThriftHiveMetastore.java:10705)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
OK
OK
OK
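This AlreadyExistsException is logged server-side by the metastore handler even when the client issues an idempotent create, so it is noisy but not fatal here: the sync proceeds and the subsequent statements return OK. A hypothetical JDBC equivalent of the idempotent form (the connection URL is illustrative; the test runs an embedded HiveServer2 on a random port):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public final class EnsureDatabase {
  // Sketch: IF NOT EXISTS succeeds whether or not testdb1 is already there;
  // the metastore may still log AlreadyExistsException internally.
  public static void ensureTestDb(String jdbcUrl) throws Exception {
    try (Connection conn = DriverManager.getConnection(jdbcUrl);
         Statement stmt = conn.createStatement()) {
      stmt.execute("CREATE DATABASE IF NOT EXISTS testdb1");
    }
  }
}
```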
215840 [pool-38-thread-1] WARN org.apache.hudi.integ.testsuite.dag.scheduler.DagScheduler - executing node: c08d83d6-0911-4164-b0b0-cfb7f4618a02 of type: class org.apache.hudi.integ.testsuite.dag.nodes.HiveQueryNode
216032 [pool-18-thread-5] WARN org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.metastore.local does not exist
216033 [pool-18-thread-5] WARN org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
217317 [pool-18-thread-5] ERROR org.apache.hadoop.hive.metastore.RetryingHMSHandler - AlreadyExistsException(message:Database testdb1 already exists)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_database(HiveMetaStore.java:921)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
at com.sun.proxy.$Proxy40.create_database(Unknown Source)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_database.getResult(ThriftHiveMetastore.java:10721)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_database.getResult(ThriftHiveMetastore.java:10705)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
OK
217487 [pool-18-thread-5] WARN org.apache.hadoop.hive.metastore.MetaStoreDirectSql - Failed to execute [SELECT "DBS"."NAME", "TBLS"."TBL_NAME", "COLUMNS_V2"."COLUMN_NAME","KEY_CONSTRAINTS"."POSITION", "KEY_CONSTRAINTS"."CONSTRAINT_NAME", "KEY_CONSTRAINTS"."ENABLE_VALIDATE_RELY" FROM "TBLS" INNER JOIN "KEY_CONSTRAINTS" ON "TBLS"."TBL_ID" = "KEY_CONSTRAINTS"."PARENT_TBL_ID" INNER JOIN "DBS" ON "TBLS"."DB_ID" = "DBS"."DB_ID" INNER JOIN "COLUMNS_V2" ON "COLUMNS_V2"."CD_ID" = "KEY_CONSTRAINTS"."PARENT_CD_ID" AND "COLUMNS_V2"."INTEGER_IDX" = "KEY_CONSTRAINTS"."PARENT_INTEGER_IDX" WHERE "KEY_CONSTRAINTS"."CONSTRAINT_TYPE" = 0 AND "DBS"."NAME" = ? AND "TBLS"."TBL_NAME" = ?] with parameters [testdb1, table1]
javax.jdo.JDODataStoreException: Error executing SQL query "SELECT "DBS"."NAME", "TBLS"."TBL_NAME", "COLUMNS_V2"."COLUMN_NAME","KEY_CONSTRAINTS"."POSITION", "KEY_CONSTRAINTS"."CONSTRAINT_NAME", "KEY_CONSTRAINTS"."ENABLE_VALIDATE_RELY" FROM "TBLS" INNER JOIN "KEY_CONSTRAINTS" ON "TBLS"."TBL_ID" = "KEY_CONSTRAINTS"."PARENT_TBL_ID" INNER JOIN "DBS" ON "TBLS"."DB_ID" = "DBS"."DB_ID" INNER JOIN "COLUMNS_V2" ON "COLUMNS_V2"."CD_ID" = "KEY_CONSTRAINTS"."PARENT_CD_ID" AND "COLUMNS_V2"."INTEGER_IDX" = "KEY_CONSTRAINTS"."PARENT_INTEGER_IDX" WHERE "KEY_CONSTRAINTS"."CONSTRAINT_TYPE" = 0 AND "DBS"."NAME" = ? AND "TBLS"."TBL_NAME" = ?".
at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:543)
at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:391)
at org.datanucleus.api.jdo.JDOQuery.executeWithArray(JDOQuery.java:267)
at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.executeWithArray(MetaStoreDirectSql.java:1750)
at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.getPrimaryKeys(MetaStoreDirectSql.java:1939)
at org.apache.hadoop.hive.metastore.ObjectStore$11.getSqlResult(ObjectStore.java:8551)
at org.apache.hadoop.hive.metastore.ObjectStore$11.getSqlResult(ObjectStore.java:8547)
at org.apache.hadoop.hive.metastore.ObjectStore$GetHelper.run(ObjectStore.java:2789)
at org.apache.hadoop.hive.metastore.ObjectStore.getPrimaryKeysInternal(ObjectStore.java:8559)
at org.apache.hadoop.hive.metastore.ObjectStore.getPrimaryKeys(ObjectStore.java:8537)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:101)
at com.sun.proxy.$Proxy38.getPrimaryKeys(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_primary_keys(HiveMetaStore.java:6828)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
at com.sun.proxy.$Proxy40.get_primary_keys(Unknown Source)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_primary_keys.getResult(ThriftHiveMetastore.java:12907)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_primary_keys.getResult(ThriftHiveMetastore.java:12891)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
NestedThrowablesStackTrace:
java.sql.SQLSyntaxErrorException: Table/View 'KEY_CONSTRAINTS' does not exist.
at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
at org.apache.derby.impl.jdbc.ConnectionChild.handleException(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedPreparedStatement.<init>(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedPreparedStatement20.<init>(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedPreparedStatement30.<init>(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedPreparedStatement40.<init>(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedPreparedStatement42.<init>(Unknown Source)
at org.apache.derby.jdbc.Driver42.newEmbedPreparedStatement(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedConnection.prepareStatement(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedConnection.prepareStatement(Unknown Source)
at com.jolbox.bonecp.ConnectionHandle.prepareStatement(ConnectionHandle.java:1193)
at org.datanucleus.store.rdbms.SQLController.getStatementForQuery(SQLController.java:345)
at org.datanucleus.store.rdbms.query.RDBMSQueryUtils.getPreparedStatementForQuery(RDBMSQueryUtils.java:211)
at org.datanucleus.store.rdbms.query.SQLQuery.performExecute(SQLQuery.java:633)
at org.datanucleus.store.query.Query.executeQuery(Query.java:1855)
at org.datanucleus.store.rdbms.query.SQLQuery.executeWithArray(SQLQuery.java:807)
at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:368)
at org.datanucleus.api.jdo.JDOQuery.executeWithArray(JDOQuery.java:267)
at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.executeWithArray(MetaStoreDirectSql.java:1750)
at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.getPrimaryKeys(MetaStoreDirectSql.java:1939)
at org.apache.hadoop.hive.metastore.ObjectStore$11.getSqlResult(ObjectStore.java:8551)
at org.apache.hadoop.hive.metastore.ObjectStore$11.getSqlResult(ObjectStore.java:8547)
at org.apache.hadoop.hive.metastore.ObjectStore$GetHelper.run(ObjectStore.java:2789)
at org.apache.hadoop.hive.metastore.ObjectStore.getPrimaryKeysInternal(ObjectStore.java:8559)
at org.apache.hadoop.hive.metastore.ObjectStore.getPrimaryKeys(ObjectStore.java:8537)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:101)
at com.sun.proxy.$Proxy38.getPrimaryKeys(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_primary_keys(HiveMetaStore.java:6828)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
at com.sun.proxy.$Proxy40.get_primary_keys(Unknown Source)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_primary_keys.getResult(ThriftHiveMetastore.java:12907)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_primary_keys.getResult(ThriftHiveMetastore.java:12891)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.sql.SQLException: Table/View 'KEY_CONSTRAINTS' does not exist.
at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
... 56 more
Caused by: ERROR 42X05: Table/View 'KEY_CONSTRAINTS' does not exist.
at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
at org.apache.derby.impl.sql.compile.FromBaseTable.bindTableDescriptor(Unknown Source)
at org.apache.derby.impl.sql.compile.FromBaseTable.bindNonVTITables(Unknown Source)
at org.apache.derby.impl.sql.compile.TableOperatorNode.bindNonVTITables(Unknown Source)
at org.apache.derby.impl.sql.compile.TableOperatorNode.bindNonVTITables(Unknown Source)
at org.apache.derby.impl.sql.compile.TableOperatorNode.bindNonVTITables(Unknown Source)
at org.apache.derby.impl.sql.compile.FromList.bindTables(Unknown Source)
at org.apache.derby.impl.sql.compile.SelectNode.bindNonVTITables(Unknown Source)
at org.apache.derby.impl.sql.compile.DMLStatementNode.bindTables(Unknown Source)
at org.apache.derby.impl.sql.compile.DMLStatementNode.bind(Unknown Source)
at org.apache.derby.impl.sql.compile.CursorNode.bindStatement(Unknown Source)
at org.apache.derby.impl.sql.GenericStatement.prepMinion(Unknown Source)
at org.apache.derby.impl.sql.GenericStatement.prepare(Unknown Source)
at org.apache.derby.impl.sql.conn.GenericLanguageConnectionContext.prepareInternalStatement(Unknown Source)
... 50 more
217496 [pool-18-thread-5] WARN org.apache.hadoop.hive.metastore.ObjectStore - Falling back to ORM path due to direct SQL failure (this is not an error): See previous errors; Error executing SQL query "SELECT "DBS"."NAME", "TBLS"."TBL_NAME", "COLUMNS_V2"."COLUMN_NAME","KEY_CONSTRAINTS"."POSITION", "KEY_CONSTRAINTS"."CONSTRAINT_NAME", "KEY_CONSTRAINTS"."ENABLE_VALIDATE_RELY" FROM "TBLS" INNER JOIN "KEY_CONSTRAINTS" ON "TBLS"."TBL_ID" = "KEY_CONSTRAINTS"."PARENT_TBL_ID" INNER JOIN "DBS" ON "TBLS"."DB_ID" = "DBS"."DB_ID" INNER JOIN "COLUMNS_V2" ON "COLUMNS_V2"."CD_ID" = "KEY_CONSTRAINTS"."PARENT_CD_ID" AND "COLUMNS_V2"."INTEGER_IDX" = "KEY_CONSTRAINTS"."PARENT_INTEGER_IDX" WHERE "KEY_CONSTRAINTS"."CONSTRAINT_TYPE" = 0 AND "DBS"."NAME" = ? AND "TBLS"."TBL_NAME" = ?". at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.executeWithArray(MetaStoreDirectSql.java:1762) at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.getPrimaryKeys(MetaStoreDirectSql.java:1939) at org.apache.hadoop.hive.metastore.ObjectStore$11.getSqlResult(ObjectStore.java:8551)
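This direct-SQL failure is expected with the embedded Derby metastore here: its schema predates the KEY_CONSTRAINTS table, so the getPrimaryKeys fast path fails and Hive falls through to the ORM path, exactly as the final warning says. A minimal sketch of two knobs that make this stricter or quieter, assuming Hive 2.3.x HiveConf (this is not the test's actual configuration):

```java
import org.apache.hadoop.hive.conf.HiveConf;

public final class MetastoreConfSketch {
  // Sketch: either fail fast on schema mismatch, or skip the direct-SQL
  // path whose missing KEY_CONSTRAINTS table triggers the warning above.
  public static HiveConf strictMetastoreConf() {
    HiveConf conf = new HiveConf();
    conf.setBoolVar(HiveConf.ConfVars.METASTORE_SCHEMA_VERIFICATION, true);
    conf.setBoolVar(HiveConf.ConfVars.METASTORE_TRY_DIRECT_SQL, false);
    return conf;
  }
}
```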
217921 [pool-18-thread-6] WARN org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.metastore.local does not exist
217922 [pool-18-thread-6] WARN org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
218641 [a07e57f6-dd4d-41a3-9161-d00fc630668f HiveServer2-Handler-Pool: Thread-363] ERROR org.apache.hadoop.hdfs.KeyProviderCache - Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
222229 [HiveServer2-Background-Pool: Thread-402] WARN org.apache.hadoop.hive.ql.Driver - Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
WARNING: Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
Query ID = sivabala_20200902104719_9773fe97-1747-429c-827c-138f8cd8b634
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks not specified. Estimated from input data size: 1
In order to change the average load for a reducer (in bytes):
set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
set mapreduce.job.reduces=<number>
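The banner above spells out the three reducer knobs, and the preceding warning flags Hive-on-MR as deprecated. A sketch of setting all of these programmatically; the values are illustrative, and "tez" assumes a Tez runtime on the classpath, which this test does not ship:

```java
import org.apache.hadoop.hive.conf.HiveConf;

public final class HiveExecConfSketch {
  public static HiveConf tunedConf() {
    HiveConf conf = new HiveConf();
    conf.setVar(HiveConf.ConfVars.HIVE_EXECUTION_ENGINE, "tez");            // avoid deprecated Hive-on-MR
    conf.setLongVar(HiveConf.ConfVars.BYTESPERREDUCER, 256L * 1024 * 1024); // avg load per reducer, bytes
    conf.setIntVar(HiveConf.ConfVars.MAXREDUCERS, 1009);                    // cap on reducer count
    conf.setInt("mapreduce.job.reduces", 2);                                // pin an exact reducer count
    return conf;
  }
}
```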
222447 [HiveServer2-Background-Pool: Thread-402] WARN org.apache.hadoop.mapreduce.JobResourceUploader - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
Job running in-process (local Hadoop)
2020-09-02 10:47:24,784 Stage-1 map = 100%, reduce = 100%
Ended Job = job_local1767562431_0001
MapReduce Jobs Launched:
Stage-Stage-1: HDFS Read: 1870664 HDFS Write: 952624 SUCCESS
Total MapReduce CPU Time Spent: 0 msec
OK
org.opentest4j.AssertionFailedError:
Expected :1
Actual :5
at org.junit.jupiter.api.AssertionUtils.fail(AssertionUtils.java:55)
at org.junit.jupiter.api.AssertionUtils.failNotEqual(AssertionUtils.java:62)
at org.junit.jupiter.api.AssertEquals.assertEquals(AssertEquals.java:166)
at org.junit.jupiter.api.AssertEquals.assertEquals(AssertEquals.java:161)
at org.junit.jupiter.api.Assertions.assertEquals(Assertions.java:611)
at org.apache.hudi.integ.testsuite.job.TestHoodieTestSuiteJob.testCOWFullDagFromYaml(TestHoodieTestSuiteJob.java:202)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:688)
at org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
at org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
at org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestableMethod(TimeoutExtension.java:140)
at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestMethod(TimeoutExtension.java:84)
at org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
at org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
at org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
at org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
at org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
at org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$invokeTestMethod$6(TestMethodTestDescriptor.java:212)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeTestMethod(TestMethodTestDescriptor.java:208)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:137)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:71)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:139)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:129)
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:127)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:126)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:84)
at java.util.ArrayList.forEach(ArrayList.java:1257)
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:38)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:143)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:129)
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:127)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:126)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:84)
at java.util.ArrayList.forEach(ArrayList.java:1257)
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:38)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:143)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:129)
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:127)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:126)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:84)
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:32)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:51)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:107)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:87)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:53)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:66)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:51)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:87)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:66)
at com.intellij.junit5.JUnit5IdeaTestRunner.startRunnerWithArgs(JUnit5IdeaTestRunner.java:69)
at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)
at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:230)
at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:58)
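The failure itself is a plain JUnit 5 equality check inside testCOWFullDagFromYaml (TestHoodieTestSuiteJob.java:202). A hypothetical reconstruction of the shape of that check; the method and variable names are invented, and whatever quantity the test actually compares (commit count, query result, or similar) is not visible in this log:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

public final class AssertionShape {
  // Sketch: the kind of comparison that yields "Expected :1 / Actual :5".
  static void verifyDagResult(long actualResult) {
    long expectedResult = 1L; // the value the test expects (assumed)
    assertEquals(expectedResult, actualResult);
  }
}
```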
224740 [main] WARN org.apache.hadoop.hdfs.server.datanode.DirectoryScanner - DirectoryScanner: shutdown has been called
224861 [DataNode: [[[DISK]file:/private/var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/temp6346781454238835690/dfs/data/data1/, [DISK]file:/private/var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/temp6346781454238835690/dfs/data/data2/]] heartbeating to localhost/127.0.0.1:56275] WARN org.apache.hadoop.hdfs.server.datanode.DataNode - BPOfferService for Block pool BP-1220230576-127.0.0.1-1599057822179 (Datanode Uuid c2604b5c-3826-478d-8c03-b3ae4dadcbdd) service to localhost/127.0.0.1:56275 interrupted
224861 [DataNode: [[[DISK]file:/private/var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/temp6346781454238835690/dfs/data/data1/, [DISK]file:/private/var/folders/2k/v2f22b650rbf2s5zkg3_4cc40000gn/T/temp6346781454238835690/dfs/data/data2/]] heartbeating to localhost/127.0.0.1:56275] WARN org.apache.hadoop.hdfs.server.datanode.DataNode - Ending block pool service for: Block pool BP-1220230576-127.0.0.1-1599057822179 (Datanode Uuid c2604b5c-3826-478d-8c03-b3ae4dadcbdd) service to localhost/127.0.0.1:56275
Process finished with exit code 255