Spark 1.2.0-SNAPSHOT + spark-cassandra-connector-assembly-1.2.0-SNAPSHOT
➜  spark git:(master) ./bin/spark-shell --master local --jars /Users/jacek/oss/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-assembly-1.2.0-SNAPSHOT.jar
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
14/09/23 22:13:03 INFO SecurityManager: Changing view acls to: jacek
14/09/23 22:13:03 INFO SecurityManager: Changing modify acls to: jacek
14/09/23 22:13:03 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jacek); users with modify permissions: Set(jacek)
14/09/23 22:13:03 INFO HttpServer: Starting HTTP Server
14/09/23 22:13:03 INFO Utils: Successfully started service 'HTTP class server' on port 62525.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.2.0-SNAPSHOT
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_67)
Type in expressions to have them evaluated.
Type :help for more information.
14/09/23 22:13:06 INFO SecurityManager: Changing view acls to: jacek
14/09/23 22:13:06 INFO SecurityManager: Changing modify acls to: jacek
14/09/23 22:13:06 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jacek); users with modify permissions: Set(jacek)
14/09/23 22:13:07 INFO Slf4jLogger: Slf4jLogger started
14/09/23 22:13:07 INFO Remoting: Starting remoting
14/09/23 22:13:07 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://[email protected]:62526]
14/09/23 22:13:07 INFO Remoting: Remoting now listens on addresses: [akka.tcp://[email protected]:62526]
14/09/23 22:13:07 INFO Utils: Successfully started service 'sparkDriver' on port 62526.
14/09/23 22:13:07 INFO SparkEnv: Registering MapOutputTracker
14/09/23 22:13:07 INFO SparkEnv: Registering BlockManagerMaster
14/09/23 22:13:07 INFO Utils: Successfully started service 'Connection manager for block manager' on port 62527.
14/09/23 22:13:07 INFO ConnectionManager: Bound socket to port 62527 with id = ConnectionManagerId(192.168.1.7,62527)
14/09/23 22:13:07 INFO DiskBlockManager: Created local directory at /var/folders/jf/y3127s650jq2cvv77s9r95zc0000gp/T/spark-local-20140923221307-2ff6
14/09/23 22:13:07 INFO MemoryStore: MemoryStore started with capacity 265.4 MB
14/09/23 22:13:07 INFO BlockManagerMaster: Trying to register BlockManager
14/09/23 22:13:07 INFO BlockManagerMasterActor: Registering block manager 192.168.1.7:62527 with 265.4 MB RAM
14/09/23 22:13:07 INFO BlockManagerMaster: Registered BlockManager
14/09/23 22:13:07 INFO HttpFileServer: HTTP File server directory is /var/folders/jf/y3127s650jq2cvv77s9r95zc0000gp/T/spark-d2f6102b-95c9-476f-9b08-7c1787f7c869
14/09/23 22:13:07 INFO HttpServer: Starting HTTP Server
14/09/23 22:13:07 INFO Utils: Successfully started service 'HTTP file server' on port 62528.
14/09/23 22:13:07 INFO Utils: Successfully started service 'SparkUI' on port 4040.
14/09/23 22:13:07 INFO SparkUI: Started SparkUI at http://192.168.1.7:4040
14/09/23 22:13:19 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/09/23 22:13:19 INFO SparkContext: Added JAR file:/Users/jacek/oss/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-assembly-1.2.0-SNAPSHOT.jar at http://192.168.1.7:62528/jars/spark-cassandra-connector-assembly-1.2.0-SNAPSHOT.jar with timestamp 1411503199680
14/09/23 22:13:19 INFO Executor: Using REPL class URI: http://192.168.1.7:62525
14/09/23 22:13:19 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://[email protected]:62526/user/HeartbeatReceiver
14/09/23 22:13:19 INFO SparkILoop: Created spark context..
Spark context available as sc.

scala> import org.apache.spark._
import org.apache.spark._

scala> import com.datastax.spark.connector._
import com.datastax.spark.connector._

scala> val rdd = sc.cassandraTable("myks","users")
warning: Class com.google.common.util.concurrent.ListenableFuture not found - continuing with a stub.
warning: Class com.google.common.util.concurrent.ListenableFuture not found - continuing with a stub.
warning: Class com.google.common.util.concurrent.ListenableFuture not found - continuing with a stub.
warning: Class com.google.common.util.concurrent.ListenableFuture not found - continuing with a stub.
java.lang.NoClassDefFoundError: com/google/common/util/concurrent/AbstractFuture
  at java.lang.ClassLoader.defineClass1(Native Method)
  at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
  at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
  at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
  at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
  at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
  at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
  at java.security.AccessController.doPrivileged(Native Method)
  at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
  at com.datastax.driver.core.Cluster.<init>(Cluster.java:113)
  at com.datastax.driver.core.Cluster.<init>(Cluster.java:100)
  at com.datastax.driver.core.Cluster.buildFrom(Cluster.java:169)
  at com.datastax.driver.core.Cluster$Builder.build(Cluster.java:1029)
  at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:167)
  at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:155)
  at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:155)
  at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:36)
  at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:61)
  at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:70)
  at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:95)
  at com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:106)
  at com.datastax.spark.connector.cql.Schema$.fromCassandra(Schema.scala:134)
  at com.datastax.spark.connector.rdd.CassandraRDD.tableDef$lzycompute(CassandraRDD.scala:222)
  at com.datastax.spark.connector.rdd.CassandraRDD.tableDef(CassandraRDD.scala:221)
  at com.datastax.spark.connector.rdd.CassandraRDD.<init>(CassandraRDD.scala:228)
  at com.datastax.spark.connector.SparkContextFunctions.cassandraTable(SparkContextFunctions.scala:48)
  at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:18)
  at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:23)
  at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:25)
  at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:27)
  at $iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
  at $iwC$$iwC$$iwC.<init>(<console>:31)
  at $iwC$$iwC.<init>(<console>:33)
  at $iwC.<init>(<console>:35)
  at <init>(<console>:37)
  at .<init>(<console>:41)
  at .<clinit>(<console>)
  at .<init>(<console>:7)
  at .<clinit>(<console>)
  at $print(<console>)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:846)
  at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1119)
  at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:672)
  at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:703)
  at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:667)
  at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
  at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
  at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
  at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
  at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
  at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)
  at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:968)
  at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
  at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
  at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
  at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
  at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
  at org.apache.spark.repl.Main$.main(Main.scala:31)
  at org.apache.spark.repl.Main.main(Main.scala)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:331)
  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.google.common.util.concurrent.AbstractFuture
  at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
  at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
  at java.security.AccessController.doPrivileged(Native Method)
  at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
  ... 72 more
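The `NoClassDefFoundError` above means the Cassandra Java driver could not find Guava's concurrent classes (`ListenableFuture`, `AbstractFuture`) at runtime, i.e. the connector assembly evidently does not bundle Guava. A quick way to confirm that is to list the assembly's entries; this is a diagnostic sketch (it assumes the JDK `jar` tool is on the PATH), with no output meaning the class is not bundled:

```shell
# List the assembly's entries and look for Guava's AbstractFuture.
# Empty output matches the NoClassDefFoundError seen in the REPL.
jar tf /Users/jacek/oss/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-assembly-1.2.0-SNAPSHOT.jar \
  | grep 'com/google/common/util/concurrent/AbstractFuture'
```

The next run below takes the obvious step of adding a standalone Guava jar (guava-16.0.1) to `--jars`, which trades this error for a different one.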
➜  spark git:(master) ./bin/spark-shell --master local --jars /Users/jacek/oss/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-assembly-1.2.0-SNAPSHOT.jar,/Users/jacek/.ivy2/cache/com.google.guava/guava/jars/guava-16.0.1.jar
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
14/09/24 00:17:08 INFO SecurityManager: Changing view acls to: jacek
14/09/24 00:17:08 INFO SecurityManager: Changing modify acls to: jacek
14/09/24 00:17:08 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jacek); users with modify permissions: Set(jacek)
14/09/24 00:17:08 INFO HttpServer: Starting HTTP Server
14/09/24 00:17:08 INFO Utils: Successfully started service 'HTTP class server' on port 65183.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.2.0-SNAPSHOT
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_67)
Type in expressions to have them evaluated.
Type :help for more information.
14/09/24 00:17:11 INFO SecurityManager: Changing view acls to: jacek
14/09/24 00:17:11 INFO SecurityManager: Changing modify acls to: jacek
14/09/24 00:17:11 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jacek); users with modify permissions: Set(jacek)
14/09/24 00:17:11 INFO Slf4jLogger: Slf4jLogger started
14/09/24 00:17:11 INFO Remoting: Starting remoting
14/09/24 00:17:11 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://[email protected]:65184]
14/09/24 00:17:11 INFO Remoting: Remoting now listens on addresses: [akka.tcp://[email protected]:65184]
14/09/24 00:17:11 INFO Utils: Successfully started service 'sparkDriver' on port 65184.
14/09/24 00:17:11 INFO SparkEnv: Registering MapOutputTracker
14/09/24 00:17:11 INFO SparkEnv: Registering BlockManagerMaster
14/09/24 00:17:11 INFO Utils: Successfully started service 'Connection manager for block manager' on port 65185.
14/09/24 00:17:11 INFO ConnectionManager: Bound socket to port 65185 with id = ConnectionManagerId(192.168.1.7,65185)
14/09/24 00:17:11 INFO DiskBlockManager: Created local directory at /var/folders/jf/y3127s650jq2cvv77s9r95zc0000gp/T/spark-local-20140924001711-6c11
14/09/24 00:17:11 INFO MemoryStore: MemoryStore started with capacity 265.4 MB
14/09/24 00:17:11 INFO BlockManagerMaster: Trying to register BlockManager
14/09/24 00:17:11 INFO BlockManagerMasterActor: Registering block manager 192.168.1.7:65185 with 265.4 MB RAM
14/09/24 00:17:11 INFO BlockManagerMaster: Registered BlockManager
14/09/24 00:17:11 INFO HttpFileServer: HTTP File server directory is /var/folders/jf/y3127s650jq2cvv77s9r95zc0000gp/T/spark-c25a7608-cd32-4627-a5f1-a2b7aff172f9
14/09/24 00:17:11 INFO HttpServer: Starting HTTP Server
14/09/24 00:17:11 INFO Utils: Successfully started service 'HTTP file server' on port 65186.
14/09/24 00:17:11 INFO Utils: Successfully started service 'SparkUI' on port 4040.
14/09/24 00:17:11 INFO SparkUI: Started SparkUI at http://192.168.1.7:4040
14/09/24 00:17:23 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/09/24 00:17:23 INFO SparkContext: Added JAR file:/Users/jacek/oss/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-assembly-1.2.0-SNAPSHOT.jar at http://192.168.1.7:65186/jars/spark-cassandra-connector-assembly-1.2.0-SNAPSHOT.jar with timestamp 1411510643429
14/09/24 00:17:23 INFO SparkContext: Added JAR file:/Users/jacek/.ivy2/cache/com.google.guava/guava/jars/guava-16.0.1.jar at http://192.168.1.7:65186/jars/guava-16.0.1.jar with timestamp 1411510643441
14/09/24 00:17:23 INFO Executor: Using REPL class URI: http://192.168.1.7:65183
14/09/24 00:17:23 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://[email protected]:65184/user/HeartbeatReceiver
14/09/24 00:17:23 INFO SparkILoop: Created spark context..
Spark context available as sc.

scala> import org.apache.spark._
import org.apache.spark._

scala> import com.datastax.spark.connector._
import com.datastax.spark.connector._

scala> val rdd = sc.cassandraTable("myks","users")
14/09/24 00:18:26 ERROR Futures$CombinedFuture: input future failed.
java.lang.IllegalAccessError: tried to access class org.spark-project.guava.common.base.Absent from class com.google.common.base.Optional
  at com.google.common.base.Optional.absent(Optional.java:79)
  at com.google.common.base.Optional.fromNullable(Optional.java:94)
  at com.google.common.util.concurrent.Futures$CombinedFuture.setOneValue(Futures.java:1608)
  at com.google.common.util.concurrent.Futures$CombinedFuture.access$400(Futures.java:1470)
  at com.google.common.util.concurrent.Futures$CombinedFuture$2.run(Futures.java:1548)
  at com.google.common.util.concurrent.MoreExecutors$SameThreadExecutorService.execute(MoreExecutors.java:297)
  at com.google.common.util.concurrent.ExecutionList.executeListener(ExecutionList.java:156)
  at com.google.common.util.concurrent.ExecutionList.add(ExecutionList.java:101)
  at com.google.common.util.concurrent.AbstractFuture.addListener(AbstractFuture.java:170)
  at com.google.common.util.concurrent.Futures$CombinedFuture.init(Futures.java:1545)
  at com.google.common.util.concurrent.Futures$CombinedFuture.<init>(Futures.java:1491)
  at com.google.common.util.concurrent.Futures.listFuture(Futures.java:1640)
  at com.google.common.util.concurrent.Futures.allAsList(Futures.java:983)
  at com.datastax.driver.core.CloseFuture$Forwarding.<init>(CloseFuture.java:73)
  at com.datastax.driver.core.Cluster$Manager$ClusterCloseFuture.<init>(Cluster.java:1924)
  at com.datastax.driver.core.Cluster$Manager.close(Cluster.java:1240)
  at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1168)
  at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:313)
  at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:170)
  at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:155)
  at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:155)
  at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:36)
  at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:61)
  at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:70)
  at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:95)
  at com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:106)
  at com.datastax.spark.connector.cql.Schema$.fromCassandra(Schema.scala:134)
  at com.datastax.spark.connector.rdd.CassandraRDD.tableDef$lzycompute(CassandraRDD.scala:222)
  at com.datastax.spark.connector.rdd.CassandraRDD.tableDef(CassandraRDD.scala:221)
  at com.datastax.spark.connector.rdd.CassandraRDD.<init>(CassandraRDD.scala:228)
  at com.datastax.spark.connector.SparkContextFunctions.cassandraTable(SparkContextFunctions.scala:48)
  at $line16.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:18)
  at $line16.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:23)
  at $line16.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:25)
  at $line16.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:27)
  at $line16.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
  at $line16.$read$$iwC$$iwC$$iwC.<init>(<console>:31)
  at $line16.$read$$iwC$$iwC.<init>(<console>:33)
  at $line16.$read$$iwC.<init>(<console>:35)
  at $line16.$read.<init>(<console>:37)
  at $line16.$read$.<init>(<console>:41)
  at $line16.$read$.<clinit>(<console>)
  at $line16.$eval$.<init>(<console>:7)
  at $line16.$eval$.<clinit>(<console>)
  at $line16.$eval.$print(<console>)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:846)
  at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1119)
  at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:672)
  at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:703)
  at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:667)
  at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
  at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
  at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
  at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
  at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
  at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)
  at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:968)
  at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
  at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
  at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
  at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
  at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
  at org.apache.spark.repl.Main$.main(Main.scala:31)
  at org.apache.spark.repl.Main.main(Main.scala)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:331)
  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
com.datastax.driver.core.exceptions.DriverInternalError: Unexpected exception thrown
  at com.datastax.driver.core.DefaultResultSetFuture.extractCauseFromExecutionException(DefaultResultSetFuture.java:258)
  at com.datastax.driver.core.Cluster.close(Cluster.java:441)
  at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:176)
  at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:155)
  at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:155)
  at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:36)
  at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:61)
  at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:70)
  at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:95)
  at com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:106)
  at com.datastax.spark.connector.cql.Schema$.fromCassandra(Schema.scala:134)
  at com.datastax.spark.connector.rdd.CassandraRDD.tableDef$lzycompute(CassandraRDD.scala:222)
  at com.datastax.spark.connector.rdd.CassandraRDD.tableDef(CassandraRDD.scala:221)
  at com.datastax.spark.connector.rdd.CassandraRDD.<init>(CassandraRDD.scala:228)
  at com.datastax.spark.connector.SparkContextFunctions.cassandraTable(SparkContextFunctions.scala:48)
  at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:18)
  at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:23)
  at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:25)
  at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:27)
  at $iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
  at $iwC$$iwC$$iwC.<init>(<console>:31)
  at $iwC$$iwC.<init>(<console>:33)
  at $iwC.<init>(<console>:35)
  at <init>(<console>:37)
  at .<init>(<console>:41)
  at .<clinit>(<console>)
  at .<init>(<console>:7)
  at .<clinit>(<console>)
  at $print(<console>)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:846)
  at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1119)
  at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:672)
  at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:703)
  at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:667)
  at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
  at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
  at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
  at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
  at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
  at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)
  at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:968)
  at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
  at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
  at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
  at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
  at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
  at org.apache.spark.repl.Main$.main(Main.scala:31)
  at org.apache.spark.repl.Main.main(Main.scala)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:331)
  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.IllegalAccessError: tried to access class org.spark-project.guava.common.base.Absent from class com.google.common.base.Optional
  at com.google.common.base.Optional.absent(Optional.java:79)
  at com.google.common.base.Optional.fromNullable(Optional.java:94)
  at com.google.common.util.concurrent.Futures$CombinedFuture.setOneValue(Futures.java:1608)
  at com.google.common.util.concurrent.Futures$CombinedFuture.access$400(Futures.java:1470)
  at com.google.common.util.concurrent.Futures$CombinedFuture$2.run(Futures.java:1548)
  at com.google.common.util.concurrent.MoreExecutors$SameThreadExecutorService.execute(MoreExecutors.java:297)
  at com.google.common.util.concurrent.ExecutionList.executeListener(ExecutionList.java:156)
  at com.google.common.util.concurrent.ExecutionList.add(ExecutionList.java:101)
  at com.google.common.util.concurrent.AbstractFuture.addListener(AbstractFuture.java:170)
  at com.google.common.util.concurrent.Futures$CombinedFuture.init(Futures.java:1545)
  at com.google.common.util.concurrent.Futures$CombinedFuture.<init>(Futures.java:1491)
  at com.google.common.util.concurrent.Futures.listFuture(Futures.java:1640)
  at com.google.common.util.concurrent.Futures.allAsList(Futures.java:983)
  at com.datastax.driver.core.CloseFuture$Forwarding.<init>(CloseFuture.java:73)
  at com.datastax.driver.core.Cluster$Manager$ClusterCloseFuture.<init>(Cluster.java:1924)
  at com.datastax.driver.core.Cluster$Manager.close(Cluster.java:1240)
  at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1168)
  at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:313)
  at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:170)
  ... 56 more
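Adding a plain Guava jar trades the missing class for a shading conflict. This Spark snapshot relocates Guava to `org.spark-project.guava` while an unrelocated `com.google.common.base.Optional` remains visible; that `Optional` references the relocated, package-private `Absent` class, and a cross-package access to a package-private class throws exactly the `IllegalAccessError` seen above. Mixing a stock Guava jar with Spark's shaded copy therefore cannot resolve it. One way out, sketched here as a hypothetical `build.sbt` fragment (it assumes an sbt-assembly version that supports `ShadeRule`, and the relocated package name is made up for illustration), is to relocate Guava inside the connector assembly so the two copies never see each other:

```scala
// build.sbt sketch (hypothetical): relocate the Guava classes bundled
// into the connector assembly so they cannot collide with the copy
// that Spark ships under org.spark-project.guava.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.common.**" -> "connector.shaded.guava.@1").inAll
)
```

With the connector's Guava relocated, the driver resolves its own `AbstractFuture`/`Optional` from the shaded package, and Spark's classes keep using theirs.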