scala.reflect.internal.MissingRequirementError: class org.apache.spark.sql.catalyst.ScalaReflection in JavaMirror with primordial classloader with boot classpath
15/05/01 10:11:38 INFO SparkContext: Running Spark version 1.3.1
15/05/01 10:11:38 WARN Utils: Your hostname, mbo-sia resolves to a loopback address: 127.0.1.1; using 10.0.2.15 instead (on interface eth0)
15/05/01 10:11:38 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/05/01 10:11:38 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/05/01 10:11:39 INFO SecurityManager: Changing view acls to: mbo
15/05/01 10:11:39 INFO SecurityManager: Changing modify acls to: mbo
15/05/01 10:11:39 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(mbo); users with modify permissions: Set(mbo)
15/05/01 10:11:39 INFO Slf4jLogger: Slf4jLogger started
15/05/01 10:11:39 INFO Remoting: Starting remoting
15/05/01 10:11:40 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.0.2.15:47877]
15/05/01 10:11:40 INFO Utils: Successfully started service 'sparkDriver' on port 47877.
15/05/01 10:11:40 INFO SparkEnv: Registering MapOutputTracker
15/05/01 10:11:40 INFO SparkEnv: Registering BlockManagerMaster
15/05/01 10:11:40 INFO DiskBlockManager: Created local directory at /tmp/spark-a0a6fd93-a545-47fb-9761-af9a49b120d7/blockmgr-22aba3b7-9ca4-4615-8a09-6fd28916fb40
15/05/01 10:11:40 INFO MemoryStore: MemoryStore started with capacity 535.7 MB
15/05/01 10:11:40 INFO HttpFileServer: HTTP File server directory is /tmp/spark-8ed30e33-74d1-4237-a7bf-054311978f1c/httpd-0681c8be-056f-4de4-9224-4418c98bd427
15/05/01 10:11:40 INFO HttpServer: Starting HTTP Server
15/05/01 10:11:40 INFO Server: jetty-8.y.z-SNAPSHOT
15/05/01 10:11:40 INFO AbstractConnector: Started SocketConnector@0.0.0.0:59121
15/05/01 10:11:40 INFO Utils: Successfully started service 'HTTP file server' on port 59121.
15/05/01 10:11:40 INFO SparkEnv: Registering OutputCommitCoordinator
15/05/01 10:11:40 INFO Server: jetty-8.y.z-SNAPSHOT
15/05/01 10:11:40 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
15/05/01 10:11:40 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/05/01 10:11:40 INFO SparkUI: Started SparkUI at http://10.0.2.15:4040
15/05/01 10:11:41 INFO Executor: Starting executor ID <driver> on host localhost
15/05/01 10:11:41 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@10.0.2.15:47877/user/HeartbeatReceiver
15/05/01 10:11:41 INFO NettyBlockTransferService: Server created on 42007
15/05/01 10:11:41 INFO BlockManagerMaster: Trying to register BlockManager
15/05/01 10:11:41 INFO BlockManagerMasterActor: Registering block manager localhost:42007 with 535.7 MB RAM, BlockManagerId(<driver>, localhost, 42007)
15/05/01 10:11:41 INFO BlockManagerMaster: Registered BlockManager
15/05/01 10:11:42 INFO MemoryStore: ensureFreeSpace(138675) called with curMem=0, maxMem=561701191
15/05/01 10:11:42 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 135.4 KB, free 535.5 MB)
15/05/01 10:11:42 INFO MemoryStore: ensureFreeSpace(18512) called with curMem=138675, maxMem=561701191
15/05/01 10:11:42 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 18.1 KB, free 535.5 MB)
15/05/01 10:11:42 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:42007 (size: 18.1 KB, free: 535.7 MB)
15/05/01 10:11:42 INFO BlockManagerMaster: Updated info of block broadcast_0_piece0
15/05/01 10:11:42 INFO SparkContext: Created broadcast 0 from textFile at JSONRelation.scala:114
15/05/01 10:11:42 INFO FileInputFormat: Total input paths to process : 1
15/05/01 10:11:42 INFO SparkContext: Starting job: isEmpty at JsonRDD.scala:51
15/05/01 10:11:42 INFO DAGScheduler: Got job 0 (isEmpty at JsonRDD.scala:51) with 1 output partitions (allowLocal=true)
15/05/01 10:11:42 INFO DAGScheduler: Final stage: Stage 0(isEmpty at JsonRDD.scala:51)
15/05/01 10:11:42 INFO DAGScheduler: Parents of final stage: List()
15/05/01 10:11:42 INFO DAGScheduler: Missing parents: List()
15/05/01 10:11:42 INFO DAGScheduler: Submitting Stage 0 (/home/mbo/sia/github-archive/2015-03-01-0.json MapPartitionsRDD[1] at textFile at JSONRelation.scala:114), which has no missing parents
15/05/01 10:11:42 INFO MemoryStore: ensureFreeSpace(2696) called with curMem=157187, maxMem=561701191
15/05/01 10:11:42 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 2.6 KB, free 535.5 MB)
15/05/01 10:11:42 INFO MemoryStore: ensureFreeSpace(1991) called with curMem=159883, maxMem=561701191
15/05/01 10:11:42 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 1991.0 B, free 535.5 MB)
15/05/01 10:11:42 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on localhost:42007 (size: 1991.0 B, free: 535.7 MB)
15/05/01 10:11:42 INFO BlockManagerMaster: Updated info of block broadcast_1_piece0
15/05/01 10:11:42 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:839
15/05/01 10:11:42 INFO DAGScheduler: Submitting 1 missing tasks from Stage 0 (/home/mbo/sia/github-archive/2015-03-01-0.json MapPartitionsRDD[1] at textFile at JSONRelation.scala:114)
15/05/01 10:11:42 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
15/05/01 10:11:42 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 1315 bytes)
15/05/01 10:11:42 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
15/05/01 10:11:43 INFO HadoopRDD: Input split: file:/home/mbo/sia/github-archive/2015-03-01-0.json:0+22319700
15/05/01 10:11:43 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
15/05/01 10:11:43 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
15/05/01 10:11:43 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
15/05/01 10:11:43 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
15/05/01 10:11:43 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
15/05/01 10:11:43 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 2487 bytes result sent to driver
15/05/01 10:11:43 INFO DAGScheduler: Stage 0 (isEmpty at JsonRDD.scala:51) finished in 0.213 s
15/05/01 10:11:43 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 194 ms on localhost (1/1)
15/05/01 10:11:43 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
15/05/01 10:11:43 INFO DAGScheduler: Job 0 finished: isEmpty at JsonRDD.scala:51, took 0.327185 s
15/05/01 10:11:43 INFO SparkContext: Starting job: reduce at JsonRDD.scala:54
15/05/01 10:11:43 INFO DAGScheduler: Got job 1 (reduce at JsonRDD.scala:54) with 2 output partitions (allowLocal=false)
15/05/01 10:11:43 INFO DAGScheduler: Final stage: Stage 1(reduce at JsonRDD.scala:54)
15/05/01 10:11:43 INFO DAGScheduler: Parents of final stage: List()
15/05/01 10:11:43 INFO DAGScheduler: Missing parents: List()
15/05/01 10:11:43 INFO DAGScheduler: Submitting Stage 1 (MapPartitionsRDD[3] at map at JsonRDD.scala:54), which has no missing parents
15/05/01 10:11:43 INFO MemoryStore: ensureFreeSpace(3208) called with curMem=161874, maxMem=561701191
15/05/01 10:11:43 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 3.1 KB, free 535.5 MB)
15/05/01 10:11:43 INFO MemoryStore: ensureFreeSpace(2290) called with curMem=165082, maxMem=561701191
15/05/01 10:11:43 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 2.2 KB, free 535.5 MB)
15/05/01 10:11:43 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on localhost:42007 (size: 2.2 KB, free: 535.7 MB)
15/05/01 10:11:43 INFO BlockManagerMaster: Updated info of block broadcast_2_piece0
15/05/01 10:11:43 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:839
15/05/01 10:11:43 INFO DAGScheduler: Submitting 2 missing tasks from Stage 1 (MapPartitionsRDD[3] at map at JsonRDD.scala:54)
15/05/01 10:11:43 INFO TaskSchedulerImpl: Adding task set 1.0 with 2 tasks
15/05/01 10:11:43 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, PROCESS_LOCAL, 1315 bytes)
15/05/01 10:11:43 INFO TaskSetManager: Starting task 1.0 in stage 1.0 (TID 2, localhost, PROCESS_LOCAL, 1315 bytes)
15/05/01 10:11:43 INFO Executor: Running task 0.0 in stage 1.0 (TID 1)
15/05/01 10:11:43 INFO Executor: Running task 1.0 in stage 1.0 (TID 2)
15/05/01 10:11:43 INFO HadoopRDD: Input split: file:/home/mbo/sia/github-archive/2015-03-01-0.json:0+22319700
15/05/01 10:11:43 INFO HadoopRDD: Input split: file:/home/mbo/sia/github-archive/2015-03-01-0.json:22319700+22319700
15/05/01 10:11:43 INFO BlockManager: Removing broadcast 1
15/05/01 10:11:43 INFO BlockManager: Removing block broadcast_1_piece0
15/05/01 10:11:43 INFO MemoryStore: Block broadcast_1_piece0 of size 1991 dropped from memory (free 561535810)
15/05/01 10:11:43 INFO BlockManagerInfo: Removed broadcast_1_piece0 on localhost:42007 in memory (size: 1991.0 B, free: 535.7 MB)
15/05/01 10:11:43 INFO BlockManagerMaster: Updated info of block broadcast_1_piece0
15/05/01 10:11:43 INFO BlockManager: Removing block broadcast_1
15/05/01 10:11:43 INFO MemoryStore: Block broadcast_1 of size 2696 dropped from memory (free 561538506)
15/05/01 10:11:43 INFO ContextCleaner: Cleaned broadcast 1
15/05/01 10:11:47 INFO Executor: Finished task 0.0 in stage 1.0 (TID 1). 44405 bytes result sent to driver
15/05/01 10:11:47 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 4672 ms on localhost (1/2)
15/05/01 10:11:48 INFO Executor: Finished task 1.0 in stage 1.0 (TID 2). 44273 bytes result sent to driver
15/05/01 10:11:48 INFO DAGScheduler: Stage 1 (reduce at JsonRDD.scala:54) finished in 4.892 s
15/05/01 10:11:48 INFO TaskSetManager: Finished task 1.0 in stage 1.0 (TID 2) in 4865 ms on localhost (2/2)
15/05/01 10:11:48 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool
15/05/01 10:11:48 INFO DAGScheduler: Job 1 finished: reduce at JsonRDD.scala:54, took 4.909011 s
15/05/01 10:11:48 INFO MemoryStore: ensureFreeSpace(69728) called with curMem=162685, maxMem=561701191
15/05/01 10:11:48 INFO MemoryStore: Block broadcast_3 stored as values in memory (estimated size 68.1 KB, free 535.5 MB)
15/05/01 10:11:48 INFO MemoryStore: ensureFreeSpace(25502) called with curMem=232413, maxMem=561701191
15/05/01 10:11:48 INFO MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 24.9 KB, free 535.4 MB)
15/05/01 10:11:48 INFO BlockManagerInfo: Added broadcast_3_piece0 in memory on localhost:42007 (size: 24.9 KB, free: 535.6 MB)
15/05/01 10:11:48 INFO BlockManagerMaster: Updated info of block broadcast_3_piece0
15/05/01 10:11:48 INFO SparkContext: Created broadcast 3 from textFile at App.scala:31
15/05/01 10:11:48 INFO FileInputFormat: Total input paths to process : 1
15/05/01 10:11:48 INFO SparkContext: Starting job: collect at App.scala:32
15/05/01 10:11:48 INFO DAGScheduler: Got job 2 (collect at App.scala:32) with 2 output partitions (allowLocal=false)
15/05/01 10:11:48 INFO DAGScheduler: Final stage: Stage 2(collect at App.scala:32)
15/05/01 10:11:48 INFO DAGScheduler: Parents of final stage: List()
15/05/01 10:11:48 INFO DAGScheduler: Missing parents: List()
15/05/01 10:11:48 INFO DAGScheduler: Submitting Stage 2 (/home/mbo/sia/ghEmployees.txt MapPartitionsRDD[5] at textFile at App.scala:31), which has no missing parents
15/05/01 10:11:48 INFO MemoryStore: ensureFreeSpace(2664) called with curMem=257915, maxMem=561701191
15/05/01 10:11:48 INFO MemoryStore: Block broadcast_4 stored as values in memory (estimated size 2.6 KB, free 535.4 MB)
15/05/01 10:11:48 INFO MemoryStore: ensureFreeSpace(1953) called with curMem=260579, maxMem=561701191
15/05/01 10:11:48 INFO MemoryStore: Block broadcast_4_piece0 stored as bytes in memory (estimated size 1953.0 B, free 535.4 MB)
15/05/01 10:11:48 INFO BlockManagerInfo: Added broadcast_4_piece0 in memory on localhost:42007 (size: 1953.0 B, free: 535.6 MB)
15/05/01 10:11:48 INFO BlockManagerMaster: Updated info of block broadcast_4_piece0
15/05/01 10:11:48 INFO SparkContext: Created broadcast 4 from broadcast at DAGScheduler.scala:839
15/05/01 10:11:48 INFO DAGScheduler: Submitting 2 missing tasks from Stage 2 (/home/mbo/sia/ghEmployees.txt MapPartitionsRDD[5] at textFile at App.scala:31)
15/05/01 10:11:48 INFO TaskSchedulerImpl: Adding task set 2.0 with 2 tasks
15/05/01 10:11:48 INFO TaskSetManager: Starting task 0.0 in stage 2.0 (TID 3, localhost, PROCESS_LOCAL, 1298 bytes)
15/05/01 10:11:48 INFO TaskSetManager: Starting task 1.0 in stage 2.0 (TID 4, localhost, PROCESS_LOCAL, 1298 bytes)
15/05/01 10:11:48 INFO Executor: Running task 0.0 in stage 2.0 (TID 3)
15/05/01 10:11:48 INFO Executor: Running task 1.0 in stage 2.0 (TID 4)
15/05/01 10:11:48 INFO HadoopRDD: Input split: file:/home/mbo/sia/ghEmployees.txt:0+1008
15/05/01 10:11:48 INFO HadoopRDD: Input split: file:/home/mbo/sia/ghEmployees.txt:1008+1009
15/05/01 10:11:48 INFO Executor: Finished task 1.0 in stage 2.0 (TID 4). 3010 bytes result sent to driver
15/05/01 10:11:48 INFO Executor: Finished task 0.0 in stage 2.0 (TID 3). 3017 bytes result sent to driver
15/05/01 10:11:48 INFO TaskSetManager: Finished task 1.0 in stage 2.0 (TID 4) in 27 ms on localhost (1/2)
15/05/01 10:11:48 INFO DAGScheduler: Stage 2 (collect at App.scala:32) finished in 0.018 s
15/05/01 10:11:48 INFO DAGScheduler: Job 2 finished: collect at App.scala:32, took 0.047903 s
15/05/01 10:11:48 INFO TaskSetManager: Finished task 0.0 in stage 2.0 (TID 3) in 38 ms on localhost (2/2)
15/05/01 10:11:48 INFO TaskSchedulerImpl: Removed TaskSet 2.0, whose tasks have all completed, from pool
15/05/01 10:11:48 INFO MemoryStore: ensureFreeSpace(13385) called with curMem=262532, maxMem=561701191
15/05/01 10:11:48 INFO MemoryStore: Block broadcast_5 stored as values in memory (estimated size 13.1 KB, free 535.4 MB)
15/05/01 10:11:48 INFO MemoryStore: ensureFreeSpace(2437) called with curMem=275917, maxMem=561701191
15/05/01 10:11:48 INFO MemoryStore: Block broadcast_5_piece0 stored as bytes in memory (estimated size 2.4 KB, free 535.4 MB)
15/05/01 10:11:48 INFO BlockManagerInfo: Added broadcast_5_piece0 in memory on localhost:42007 (size: 2.4 KB, free: 535.6 MB)
15/05/01 10:11:48 INFO BlockManagerMaster: Updated info of block broadcast_5_piece0
15/05/01 10:11:48 INFO SparkContext: Created broadcast 5 from broadcast at App.scala:34
Exception in thread "main" scala.reflect.internal.MissingRequirementError: class org.apache.spark.sql.catalyst.ScalaReflection in JavaMirror with primordial classloader with boot classpath [/home/mbo/bin/scala-ide/plugins/org.scala-ide.scala210.jars_4.0.0.201503031935/target/jars/scala-library.jar:/home/mbo/bin/scala-ide/plugins/org.scala-ide.scala210.jars_4.0.0.201503031935/target/jars/scala-reflect.jar:/home/mbo/bin/scala-ide/plugins/org.scala-ide.scala210.jars_4.0.0.201503031935/target/jars/scala-actor.jar:/home/mbo/bin/scala-ide/plugins/org.scala-ide.scala210.jars_4.0.0.201503031935/target/jars/scala-swing.jar:/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/resources.jar:/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/rt.jar:/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/sunrsasign.jar:/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/jsse.jar:/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/jce.jar:/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/charsets.jar:/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/rhino.jar:/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/jfr.jar:/usr/lib/jvm/java-7-openjdk-amd64/jre/classes] not found.
    at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
    at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
    at scala.reflect.internal.Mirrors$RootsBase.staticModuleOrClass(Mirrors.scala:72)
    at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:119)
    at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:21)
    at org.apache.spark.sql.catalyst.ScalaReflection$$typecreator1$1.apply(ScalaReflection.scala:127)
    at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:231)
    at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:231)
    at scala.reflect.api.TypeTags$class.typeOf(TypeTags.scala:335)
    at scala.reflect.api.Universe.typeOf(Universe.scala:59)
    at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:127)
    at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:30)
    at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:112)
    at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:30)
    at org.apache.spark.sql.UDFRegistration.register(UDFRegistration.scala:132)
    at org.sia.chapter03App.App$.main(App.scala:60)
    at org.sia.chapter03App.App.main(App.scala)
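
For context on where the exception comes from: the trace runs through org.apache.spark.sql.UDFRegistration.register (UDFRegistration.scala:132) into ScalaReflection.schemaFor, so it is the registration of a Scala function as a SQL UDF that triggers Scala runtime reflection, and the JavaMirror built over the boot-classpath jars listed in the message (the Scala IDE scala-library/scala-reflect plus the JRE jars) cannot see the spark-sql jar, so the ScalaReflection class itself cannot be resolved. The snippet below is only a minimal, hypothetical sketch of a UDF registration against a Spark 1.3.x SQLContext that exercises the same code path; the object name, app name, UDF name, and function body are invented for illustration and are not the actual App.scala from the trace.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Hypothetical sketch of a Spark 1.3.x program whose UDF registration reaches
// UDFRegistration.register -> ScalaReflection.schemaFor, the call chain shown
// in the stack trace above.
object UdfRegistrationSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("udf-sketch").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)

    // Registering a Scala function as a SQL UDF makes Catalyst derive the UDF's
    // return type via Scala runtime reflection (ScalaReflection.schemaFor).
    // If the runtime mirror is built only from the boot-classpath jars (as with
    // the Scala IDE scala-library/scala-reflect listed in the error), it cannot
    // load org.apache.spark.sql.catalyst.ScalaReflection, and the registration
    // fails with MissingRequirementError before the UDF is ever used in a query.
    sqlContext.udf.register("toUpper", (s: String) => s.toUpperCase)

    sc.stop()
  }
}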