Output from a first try at running the ChamberOfDeputies Spark job: spark-submit resolves the spark-xml package and starts up fine, then fails with InvalidInputException because the input XML path does not exist.
$ spark-submit \
> --class "ChamberOfDeputies" \
> --packages com.databricks:spark-xml_2.11:0.4.1 \
> target/scala-2.11/chamber-of-deputies_2.11-1.0.jar \
> ~/Code/serenata/serenata-de-amor/data/AnoAtual.xml
Ivy Default Cache set to: /home/temporal/.ivy2/cache
The jars for the packages stored in: /home/temporal/.ivy2/jars
:: loading settings :: url = jar:file:/usr/local/spark/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
com.databricks#spark-xml_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
	confs: [default]
	found com.databricks#spark-xml_2.11;0.4.1 in list
:: resolution report :: resolve 878ms :: artifacts dl 13ms
	:: modules in use:
	com.databricks#spark-xml_2.11;0.4.1 from list in [default]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
	---------------------------------------------------------------------
:: problems summary ::
:::: ERRORS
	unknown resolver null
	unknown resolver null
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
:: retrieving :: org.apache.spark#spark-submit-parent
	confs: [default]
	1 artifacts copied, 0 already retrieved (215kB/36ms)
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/03/14 17:50:01 INFO SparkContext: Running Spark version 2.1.0
17/03/14 17:50:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/03/14 17:50:04 WARN Utils: Your hostname, DarthMaul resolves to a loopback address: 127.0.1.1; using 192.168.15.14 instead (on interface eno1)
17/03/14 17:50:04 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
17/03/14 17:50:04 INFO SecurityManager: Changing view acls to: temporal
17/03/14 17:50:04 INFO SecurityManager: Changing modify acls to: temporal
17/03/14 17:50:04 INFO SecurityManager: Changing view acls groups to:
17/03/14 17:50:04 INFO SecurityManager: Changing modify acls groups to:
17/03/14 17:50:04 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(temporal); groups with view permissions: Set(); users with modify permissions: Set(temporal); groups with modify permissions: Set()
17/03/14 17:50:06 INFO Utils: Successfully started service 'sparkDriver' on port 45468.
17/03/14 17:50:06 INFO SparkEnv: Registering MapOutputTracker
17/03/14 17:50:07 INFO SparkEnv: Registering BlockManagerMaster
17/03/14 17:50:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/03/14 17:50:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/03/14 17:50:07 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-dd752ac5-098f-42d6-82ea-766ee6e724d2
17/03/14 17:50:07 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
17/03/14 17:50:08 INFO SparkEnv: Registering OutputCommitCoordinator
17/03/14 17:50:10 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/03/14 17:50:10 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.15.14:4040
17/03/14 17:50:10 INFO SparkContext: Added JAR file:/home/temporal/.ivy2/jars/com.databricks_spark-xml_2.11-0.4.1.jar at spark://192.168.15.14:45468/jars/com.databricks_spark-xml_2.11-0.4.1.jar with timestamp 1489524610789
17/03/14 17:50:10 INFO SparkContext: Added JAR file:/home/temporal/Documents/DSBR/chamber-of-deputies-preprocessing/target/scala-2.11/chamber-of-deputies_2.11-1.0.jar at spark://192.168.15.14:45468/jars/chamber-of-deputies_2.11-1.0.jar with timestamp 1489524610792
17/03/14 17:50:11 INFO Executor: Starting executor ID driver on host localhost
17/03/14 17:50:11 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 43726.
17/03/14 17:50:11 INFO NettyBlockTransferService: Server created on 192.168.15.14:43726
17/03/14 17:50:11 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/03/14 17:50:11 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.15.14, 43726, None)
17/03/14 17:50:11 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.15.14:43726 with 366.3 MB RAM, BlockManagerId(driver, 192.168.15.14, 43726, None)
17/03/14 17:50:11 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.15.14, 43726, None)
17/03/14 17:50:11 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.15.14, 43726, None)
17/03/14 17:50:13 INFO SharedState: Warehouse path is 'file:/home/temporal/Documents/DSBR/chamber-of-deputies-preprocessing/spark-warehouse'.
17/03/14 17:50:18 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 275.9 KB, free 366.0 MB)
17/03/14 17:50:19 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 23.0 KB, free 366.0 MB)
17/03/14 17:50:19 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.15.14:43726 (size: 23.0 KB, free: 366.3 MB)
17/03/14 17:50:19 INFO SparkContext: Created broadcast 0 from newAPIHadoopFile at XmlFile.scala:51
Exception in thread "main" org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: file:/home/temporal/Code/serenata/serenata-de-amor/data/AnoAtual.xml
	at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:323)
	at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:265)
	at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:387)
	at org.apache.spark.rdd.NewHadoopRDD.getPartitions(NewHadoopRDD.scala:125)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:252)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:250)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.rdd.RDD.partitions(RDD.scala:250)
	at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:252)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:250)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.rdd.RDD.partitions(RDD.scala:250)
	at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:252)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:250)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.rdd.RDD.partitions(RDD.scala:250)
	at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1.apply(RDD.scala:1129)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
	at org.apache.spark.rdd.RDD.treeAggregate(RDD.scala:1127)
	at com.databricks.spark.xml.util.InferSchema$.infer(InferSchema.scala:109)
	at com.databricks.spark.xml.XmlRelation$$anonfun$1.apply(XmlRelation.scala:46)
	at com.databricks.spark.xml.XmlRelation$$anonfun$1.apply(XmlRelation.scala:46)
	at scala.Option.getOrElse(Option.scala:121)
	at com.databricks.spark.xml.XmlRelation.<init>(XmlRelation.scala:45)
	at com.databricks.spark.xml.DefaultSource.createRelation(DefaultSource.scala:65)
	at com.databricks.spark.xml.DefaultSource.createRelation(DefaultSource.scala:43)
	at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:330)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:152)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:135)
	at ChamberOfDeputies$.main(ChamberOfDeputies.scala:35)
	at ChamberOfDeputies.main(ChamberOfDeputies.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/03/14 17:50:19 INFO SparkContext: Invoking stop() from shutdown hook
17/03/14 17:50:19 INFO SparkUI: Stopped Spark web UI at http://192.168.15.14:4040
17/03/14 17:50:19 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/03/14 17:50:19 INFO MemoryStore: MemoryStore cleared
17/03/14 17:50:19 INFO BlockManager: BlockManager stopped
17/03/14 17:50:19 INFO BlockManagerMaster: BlockManagerMaster stopped
17/03/14 17:50:19 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/03/14 17:50:19 INFO SparkContext: Successfully stopped SparkContext
17/03/14 17:50:19 INFO ShutdownHookManager: Shutdown hook called
17/03/14 17:50:19 INFO ShutdownHookManager: Deleting directory /tmp/spark-047c737d-a9f9-4c95-a1fb-a42a512fcedc
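
The failure itself is simple: the last argument to spark-submit, ~/Code/serenata/serenata-de-amor/data/AnoAtual.xml, does not exist on this machine (note the jar and warehouse paths live under ~/Documents/DSBR/chamber-of-deputies-preprocessing), so spark-xml's schema inference dies as soon as it tries to list the input. For reference, here is a minimal sketch of the kind of read the stack trace points at (DataFrameReader.load called from ChamberOfDeputies.scala:35); the object layout and the rowTag value are assumptions, not the job's actual source.

// Sketch of a spark-xml 0.4.1 read matching the stack trace
// (DataFrameReader.load -> XmlRelation -> InferSchema.infer).
// The rowTag value and overall structure are assumed, not taken
// from the real ChamberOfDeputies.scala.
import org.apache.spark.sql.SparkSession

object ChamberOfDeputies {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ChamberOfDeputies")
      .getOrCreate()

    // load() triggers schema inference, which lists the input path
    // up front; a missing file surfaces here as InvalidInputException.
    val expenses = spark.read
      .format("com.databricks.spark.xml")
      .option("rowTag", "DESPESA") // assumed row tag for AnoAtual.xml
      .load(args(0))

    expenses.printSchema()
    spark.stop()
  }
}

Verifying the path first (ls ~/Code/serenata/serenata-de-amor/data/AnoAtual.xml) or pointing the job at wherever the Serenata de Amor data was actually downloaded should get past this error. The "unknown resolver null" messages in the Ivy output are a generally harmless quirk of spark-submit's dependency resolution and can be ignored.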