@khajavi
Last active September 6, 2015
Running a Spark job programmatically from a web service on Tomcat fails with: java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1
App.java

package blah;

import static spark.Spark.get;

import javax.ws.rs.core.Response;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import spark.servlet.SparkApplication;

public class App implements SparkApplication {

    @Override
    public void init() {
        get("/hello", (req, res) -> {
            String sourcePath = "hdfs://spark:54310/input/*";
            SparkConf conf = new SparkConf().setAppName("TestLineCount");
            // Intended to ship this application's code to the executors, but in a
            // webapp this resolves to the exploded WEB-INF/classes path, not a jar.
            conf.setJars(new String[] { App.class.getProtectionDomain()
                    .getCodeSource().getLocation().getPath() });
            conf.setMaster("spark://tootak:7077");
            conf.set("spark.driver.allowMultipleContexts", "true");
            @SuppressWarnings("resource")
            JavaSparkContext sc = new JavaSparkContext(conf);
            JavaRDD<String> log = sc.textFile(sourcePath);
            // Identity filter; this lambda is what fails to deserialize on the executors.
            JavaRDD<String> lines = log.filter(x -> {
                return true;
            });
            return Response.ok(lines.count()).build();
        });
    }
}
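Why the job fails: filter(x -> { return true; }) hands Spark a Java 8 lambda, which is serialized into every task as a java.lang.invoke.SerializedLambda. Relinking it on an executor requires the defining class (blah.App) to be available there in the same form as on the driver, but setJars above points at the exploded WEB-INF/classes location (the log below even records "Added JAR ... App.class", a bare class file), so the executors never receive a loadable jar and the deserialized SerializedLambda cannot be cast back to a Function.

One hedged workaround, sketched with an invented class name (KeepAll is not part of the gist): use a named Function implementation instead of a lambda, and ship the code to the executors as a real jar.

    import org.apache.spark.api.java.function.Function;

    // Same behaviour as the original x -> true filter, but as a named class:
    // Spark's Function interface is Serializable, and a regular class avoids
    // the SerializedLambda relinking step on the executors.
    public class KeepAll implements Function<String, Boolean> {
        private static final long serialVersionUID = 1L;

        @Override
        public Boolean call(String line) {
            return Boolean.TRUE;
        }
    }

    // In init():  JavaRDD<String> lines = log.filter(new KeepAll());

Even with a named class, the containing jar still has to reach the executors; see the packaging note after the pom.xml below.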
Jan 27, 2015 9:58:34 PM org.apache.tomcat.util.digester.SetPropertiesRule begin
WARNING: [SetPropertiesRule]{Server/Service/Engine/Host/Context} Setting property 'source' to 'org.eclipse.jst.j2ee.server:test' did not find a matching property.
Jan 27, 2015 9:58:34 PM org.apache.catalina.startup.VersionLoggerListener log
INFO: Server version: Apache Tomcat/8.0.17
Jan 27, 2015 9:58:34 PM org.apache.catalina.startup.VersionLoggerListener log
INFO: Server built: Jan 9 2015 15:58:59 UTC
Jan 27, 2015 9:58:34 PM org.apache.catalina.startup.VersionLoggerListener log
INFO: Server number: 8.0.17.0
Jan 27, 2015 9:58:34 PM org.apache.catalina.startup.VersionLoggerListener log
INFO: OS Name: Linux
Jan 27, 2015 9:58:34 PM org.apache.catalina.startup.VersionLoggerListener log
INFO: OS Version: 3.13.0-43-generic
Jan 27, 2015 9:58:34 PM org.apache.catalina.startup.VersionLoggerListener log
INFO: Architecture: amd64
Jan 27, 2015 9:58:34 PM org.apache.catalina.startup.VersionLoggerListener log
INFO: JAVA_HOME: /usr/lib/jvm/java-8-oracle/jre
Jan 27, 2015 9:58:34 PM org.apache.catalina.startup.VersionLoggerListener log
INFO: JVM Version: 1.8.0_25-b17
Jan 27, 2015 9:58:34 PM org.apache.catalina.startup.VersionLoggerListener log
INFO: JVM Vendor: Oracle Corporation
Jan 27, 2015 9:58:34 PM org.apache.catalina.startup.VersionLoggerListener log
INFO: CATALINA_BASE: /home/milad/workspace-sts/.metadata/.plugins/org.eclipse.wst.server.core/tmp2
Jan 27, 2015 9:58:34 PM org.apache.catalina.startup.VersionLoggerListener log
INFO: CATALINA_HOME: /home/milad/workspace/packages/apache-tomcat-8.0.17
Jan 27, 2015 9:58:34 PM org.apache.catalina.startup.VersionLoggerListener log
INFO: Command line argument: -Dcatalina.base=/home/milad/workspace-sts/.metadata/.plugins/org.eclipse.wst.server.core/tmp2
Jan 27, 2015 9:58:34 PM org.apache.catalina.startup.VersionLoggerListener log
INFO: Command line argument: -Dcatalina.home=/home/milad/workspace/packages/apache-tomcat-8.0.17
Jan 27, 2015 9:58:34 PM org.apache.catalina.startup.VersionLoggerListener log
INFO: Command line argument: -Dwtp.deploy=/home/milad/workspace-sts/.metadata/.plugins/org.eclipse.wst.server.core/tmp2/wtpwebapps
Jan 27, 2015 9:58:34 PM org.apache.catalina.startup.VersionLoggerListener log
INFO: Command line argument: -Djava.endorsed.dirs=/home/milad/workspace/packages/apache-tomcat-8.0.17/endorsed
Jan 27, 2015 9:58:34 PM org.apache.catalina.startup.VersionLoggerListener log
INFO: Command line argument: -Dfile.encoding=UTF-8
Jan 27, 2015 9:58:34 PM org.apache.catalina.core.AprLifecycleListener lifecycleEvent
INFO: The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: /usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
Jan 27, 2015 9:58:34 PM org.apache.coyote.AbstractProtocol init
INFO: Initializing ProtocolHandler ["http-nio-8085"]
Jan 27, 2015 9:58:34 PM org.apache.tomcat.util.net.NioSelectorPool getSharedSelector
INFO: Using a shared selector for servlet write/read
Jan 27, 2015 9:58:34 PM org.apache.coyote.AbstractProtocol init
INFO: Initializing ProtocolHandler ["ajp-nio-8009"]
Jan 27, 2015 9:58:34 PM org.apache.tomcat.util.net.NioSelectorPool getSharedSelector
INFO: Using a shared selector for servlet write/read
Jan 27, 2015 9:58:34 PM org.apache.catalina.startup.Catalina load
INFO: Initialization processed in 403 ms
Jan 27, 2015 9:58:34 PM org.apache.catalina.core.StandardService startInternal
INFO: Starting service Catalina
Jan 27, 2015 9:58:34 PM org.apache.catalina.core.StandardEngine startInternal
INFO: Starting Servlet Engine: Apache Tomcat/8.0.17
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/milad/workspace-sts/.metadata/.plugins/org.eclipse.wst.server.core/tmp2/wtpwebapps/test/WEB-INF/lib/slf4j-simple-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/milad/workspace-sts/.metadata/.plugins/org.eclipse.wst.server.core/tmp2/wtpwebapps/test/WEB-INF/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.SimpleLoggerFactory]
Jan 27, 2015 9:58:39 PM org.apache.coyote.AbstractProtocol start
INFO: Starting ProtocolHandler ["http-nio-8085"]
Jan 27, 2015 9:58:39 PM org.apache.coyote.AbstractProtocol start
INFO: Starting ProtocolHandler ["ajp-nio-8009"]
Jan 27, 2015 9:58:39 PM org.apache.catalina.startup.Catalina start
INFO: Server startup in 4666 ms
[http-nio-8085-exec-4] WARN org.apache.spark.SparkContext - Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
blah.App.lambda$0(App.java:27)
blah.App$$Lambda$1/2025917060.handle(Unknown Source)
spark.SparkBase$1.handle(SparkBase.java:264)
spark.webserver.MatcherFilter.doFilter(MatcherFilter.java:154)
spark.servlet.SparkFilter.doFilter(SparkFilter.java:126)
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:142)
org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:610)
org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:516)
org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1086)
org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:659)
org.apache.coyote.http11.Http11NioProtocol$Http11ConnectionHandler.process(Http11NioProtocol.java:223)
org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1558)
[http-nio-8085-exec-3] WARN org.apache.spark.util.Utils - Your hostname, tootak resolves to a loopback address: 127.0.1.1; using 10.1.2.61 instead (on interface eth0)
[http-nio-8085-exec-3] WARN org.apache.spark.util.Utils - Set SPARK_LOCAL_IP if you need to bind to another address
[http-nio-8085-exec-4] INFO org.apache.spark.SecurityManager - Changing view acls to: milad
[http-nio-8085-exec-3] INFO org.apache.spark.SecurityManager - Changing view acls to: milad
[http-nio-8085-exec-4] INFO org.apache.spark.SecurityManager - Changing modify acls to: milad
[http-nio-8085-exec-3] INFO org.apache.spark.SecurityManager - Changing modify acls to: milad
[http-nio-8085-exec-3] INFO org.apache.spark.SecurityManager - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(milad); users with modify permissions: Set(milad)
[http-nio-8085-exec-4] INFO org.apache.spark.SecurityManager - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(milad); users with modify permissions: Set(milad)
[sparkDriver-akka.actor.default-dispatcher-2] INFO akka.event.slf4j.Slf4jLogger - Slf4jLogger started
[sparkDriver-akka.actor.default-dispatcher-2] INFO akka.event.slf4j.Slf4jLogger - Slf4jLogger started
[sparkDriver-akka.actor.default-dispatcher-3] INFO Remoting - Starting remoting
[sparkDriver-akka.actor.default-dispatcher-4] INFO Remoting - Starting remoting
[sparkDriver-akka.actor.default-dispatcher-4] INFO Remoting - Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.1.2.61:49917]
[sparkDriver-akka.actor.default-dispatcher-3] INFO Remoting - Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.1.2.61:56362]
[http-nio-8085-exec-3] INFO org.apache.spark.util.Utils - Successfully started service 'sparkDriver' on port 49917.
[http-nio-8085-exec-4] INFO org.apache.spark.util.Utils - Successfully started service 'sparkDriver' on port 56362.
[http-nio-8085-exec-3] INFO org.apache.spark.SparkEnv - Registering MapOutputTracker
[http-nio-8085-exec-4] INFO org.apache.spark.SparkEnv - Registering MapOutputTracker
[http-nio-8085-exec-3] INFO org.apache.spark.SparkEnv - Registering BlockManagerMaster
[http-nio-8085-exec-4] INFO org.apache.spark.SparkEnv - Registering BlockManagerMaster
[http-nio-8085-exec-4] INFO org.apache.spark.storage.DiskBlockManager - Created local directory at /tmp/spark-local-20150127215844-9ab4
[http-nio-8085-exec-3] INFO org.apache.spark.storage.DiskBlockManager - Created local directory at /tmp/spark-local-20150127215844-f257
[http-nio-8085-exec-3] INFO org.apache.spark.storage.MemoryStore - MemoryStore started with capacity 951.5 MB
[http-nio-8085-exec-4] INFO org.apache.spark.storage.MemoryStore - MemoryStore started with capacity 951.5 MB
[http-nio-8085-exec-3] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[http-nio-8085-exec-4] INFO org.apache.spark.HttpFileServer - HTTP File server directory is /tmp/spark-27f906a5-cddc-4cf2-9008-0e06b7d221ba
[http-nio-8085-exec-3] INFO org.apache.spark.HttpFileServer - HTTP File server directory is /tmp/spark-1cdf9132-b1b9-412d-9900-23646125e0e6
[http-nio-8085-exec-4] INFO org.apache.spark.HttpServer - Starting HTTP Server
[http-nio-8085-exec-3] INFO org.apache.spark.HttpServer - Starting HTTP Server
[http-nio-8085-exec-3] INFO org.eclipse.jetty.server.Server - jetty-8.1.14.v20131031
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.Server - jetty-8.1.14.v20131031
[http-nio-8085-exec-3] INFO org.eclipse.jetty.server.AbstractConnector - Started SocketConnector@0.0.0.0:43830
[http-nio-8085-exec-3] INFO org.apache.spark.util.Utils - Successfully started service 'HTTP file server' on port 43830.
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.AbstractConnector - Started SocketConnector@0.0.0.0:35097
[http-nio-8085-exec-4] INFO org.apache.spark.util.Utils - Successfully started service 'HTTP file server' on port 35097.
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.Server - jetty-8.1.14.v20131031
[http-nio-8085-exec-3] INFO org.eclipse.jetty.server.Server - jetty-8.1.14.v20131031
[http-nio-8085-exec-4] WARN org.eclipse.jetty.util.component.AbstractLifeCycle - FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.listen(Native Method)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:215)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.eclipse.jetty.server.Server.doStart(Server.java:293)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:194)
at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:204)
at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:204)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1676)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1667)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:204)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:102)
at org.apache.spark.SparkContext$$anonfun$10.apply(SparkContext.scala:269)
at org.apache.spark.SparkContext$$anonfun$10.apply(SparkContext.scala:269)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:269)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
at blah.App.lambda$0(App.java:27)
at blah.App$$Lambda$1/2025917060.handle(Unknown Source)
at spark.SparkBase$1.handle(SparkBase.java:264)
at spark.webserver.MatcherFilter.doFilter(MatcherFilter.java:154)
at spark.servlet.SparkFilter.doFilter(SparkFilter.java:126)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:142)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:610)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:516)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1086)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:659)
at org.apache.coyote.http11.Http11NioProtocol$Http11ConnectionHandler.process(Http11NioProtocol.java:223)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1558)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1515)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
[http-nio-8085-exec-4] WARN org.eclipse.jetty.util.component.AbstractLifeCycle - FAILED org.eclipse.jetty.server.Server@6d163d95: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.listen(Native Method)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:215)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.eclipse.jetty.server.Server.doStart(Server.java:293)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:194)
at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:204)
at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:204)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1676)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1667)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:204)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:102)
at org.apache.spark.SparkContext$$anonfun$10.apply(SparkContext.scala:269)
at org.apache.spark.SparkContext$$anonfun$10.apply(SparkContext.scala:269)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:269)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
at blah.App.lambda$0(App.java:27)
at blah.App$$Lambda$1/2025917060.handle(Unknown Source)
at spark.SparkBase$1.handle(SparkBase.java:264)
at spark.webserver.MatcherFilter.doFilter(MatcherFilter.java:154)
at spark.servlet.SparkFilter.doFilter(SparkFilter.java:126)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:142)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:610)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:516)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1086)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:659)
at org.apache.coyote.http11.Http11NioProtocol$Http11ConnectionHandler.process(Http11NioProtocol.java:223)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1558)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1515)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
[http-nio-8085-exec-3] INFO org.eclipse.jetty.server.AbstractConnector - Started SelectChannelConnector@0.0.0.0:4040
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/,null}
[http-nio-8085-exec-3] INFO org.apache.spark.util.Utils - Successfully started service 'SparkUI' on port 4040.
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/static,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/executors,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/environment,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/storage,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/stages,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.handler.ContextHandler - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[http-nio-8085-exec-3] INFO org.apache.spark.ui.SparkUI - Started SparkUI at http://10.1.2.61:4040
[http-nio-8085-exec-3] INFO org.apache.spark.SparkContext - Added JAR /home/milad/workspace-sts/.metadata/.plugins/org.eclipse.wst.server.core/tmp2/wtpwebapps/test/WEB-INF/classes/blah/App.class at http://10.1.2.61:43830/jars/App.class with timestamp 1422383325392
[http-nio-8085-exec-4] WARN org.apache.spark.util.Utils - Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.deploy.client.AppClient$ClientActor - Connecting to master spark://tootak:7077...
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.Server - jetty-8.1.14.v20131031
[http-nio-8085-exec-4] INFO org.eclipse.jetty.server.AbstractConnector - Started SelectChannelConnector@0.0.0.0:4041
[http-nio-8085-exec-4] INFO org.apache.spark.util.Utils - Successfully started service 'SparkUI' on port 4041.
[http-nio-8085-exec-4] INFO org.apache.spark.ui.SparkUI - Started SparkUI at http://10.1.2.61:4041
[http-nio-8085-exec-4] INFO org.apache.spark.SparkContext - Added JAR /home/milad/workspace-sts/.metadata/.plugins/org.eclipse.wst.server.core/tmp2/wtpwebapps/test/WEB-INF/classes/blah/App.class at http://10.1.2.61:35097/jars/App.class with timestamp 1422383325595
[sparkDriver-akka.actor.default-dispatcher-4] INFO org.apache.spark.deploy.client.AppClient$ClientActor - Connecting to master spark://tootak:7077...
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - Connected to Spark cluster with app ID app-20150127215845-0006
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - Connected to Spark cluster with app ID app-20150127215845-0005
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.deploy.client.AppClient$ClientActor - Executor added: app-20150127215845-0005/0 on worker-20150127202257-tootak-41695 (tootak:41695) with 2 cores
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - Granted executor ID app-20150127215845-0005/0 on hostPort tootak:41695 with 2 cores, 512.0 MB RAM
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.deploy.client.AppClient$ClientActor - Executor updated: app-20150127215845-0005/0 is now LOADING
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.deploy.client.AppClient$ClientActor - Executor updated: app-20150127215845-0005/0 is now RUNNING
[http-nio-8085-exec-3] INFO org.apache.spark.network.netty.NettyBlockTransferService - Server created on 57359
[http-nio-8085-exec-3] INFO org.apache.spark.storage.BlockManagerMaster - Trying to register BlockManager
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.storage.BlockManagerMasterActor - Registering block manager 10.1.2.61:57359 with 951.5 MB RAM, BlockManagerId(<driver>, 10.1.2.61, 57359)
[http-nio-8085-exec-3] INFO org.apache.spark.storage.BlockManagerMaster - Registered BlockManager
[http-nio-8085-exec-4] INFO org.apache.spark.network.netty.NettyBlockTransferService - Server created on 48324
[http-nio-8085-exec-4] INFO org.apache.spark.storage.BlockManagerMaster - Trying to register BlockManager
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.storage.BlockManagerMasterActor - Registering block manager 10.1.2.61:48324 with 951.5 MB RAM, BlockManagerId(<driver>, 10.1.2.61, 48324)
[http-nio-8085-exec-4] INFO org.apache.spark.storage.BlockManagerMaster - Registered BlockManager
[http-nio-8085-exec-4] INFO org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
[http-nio-8085-exec-3] INFO org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
[http-nio-8085-exec-3] INFO org.apache.spark.metrics.MetricsSystem - Metrics already registered
java.lang.IllegalArgumentException: A metric named app-20150127215845-0005.driver.DAGScheduler.stage.waitingStages already exists
at com.codahale.metrics.MetricRegistry.register(MetricRegistry.java:89)
at com.codahale.metrics.MetricRegistry.registerAll(MetricRegistry.java:383)
at com.codahale.metrics.MetricRegistry.register(MetricRegistry.java:83)
at org.apache.spark.metrics.MetricsSystem.registerSource(MetricsSystem.scala:133)
at org.apache.spark.SparkContext.initDriverMetrics(SparkContext.scala:500)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:504)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
at blah.App.lambda$0(App.java:27)
at blah.App$$Lambda$1/2025917060.handle(Unknown Source)
at spark.SparkBase$1.handle(SparkBase.java:264)
at spark.webserver.MatcherFilter.doFilter(MatcherFilter.java:154)
at spark.servlet.SparkFilter.doFilter(SparkFilter.java:126)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:142)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:610)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:516)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1086)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:659)
at org.apache.coyote.http11.Http11NioProtocol$Http11ConnectionHandler.process(Http11NioProtocol.java:223)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1558)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1515)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
[http-nio-8085-exec-3] INFO org.apache.spark.metrics.MetricsSystem - Metrics already registered
java.lang.IllegalArgumentException: A metric named app-20150127215845-0005.driver.BlockManager.disk.diskSpaceUsed_MB already exists
at com.codahale.metrics.MetricRegistry.register(MetricRegistry.java:89)
at com.codahale.metrics.MetricRegistry.registerAll(MetricRegistry.java:383)
at com.codahale.metrics.MetricRegistry.register(MetricRegistry.java:83)
at org.apache.spark.metrics.MetricsSystem.registerSource(MetricsSystem.scala:133)
at org.apache.spark.SparkContext.initDriverMetrics(SparkContext.scala:501)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:504)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
at blah.App.lambda$0(App.java:27)
at blah.App$$Lambda$1/2025917060.handle(Unknown Source)
at spark.SparkBase$1.handle(SparkBase.java:264)
at spark.webserver.MatcherFilter.doFilter(MatcherFilter.java:154)
at spark.servlet.SparkFilter.doFilter(SparkFilter.java:126)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:142)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:610)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:516)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1086)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:659)
at org.apache.coyote.http11.Http11NioProtocol$Http11ConnectionHandler.process(Http11NioProtocol.java:223)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1558)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1515)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
[http-nio-8085-exec-4] INFO org.apache.spark.storage.MemoryStore - ensureFreeSpace(133168) called with curMem=0, maxMem=997699092
[http-nio-8085-exec-4] INFO org.apache.spark.storage.MemoryStore - Block broadcast_0 stored as values in memory (estimated size 130.0 KB, free 951.4 MB)
[http-nio-8085-exec-3] WARN org.apache.spark.storage.BlockManager - Block broadcast_0 already exists on this machine; not re-adding it
[http-nio-8085-exec-4] INFO org.apache.spark.storage.MemoryStore - ensureFreeSpace(18512) called with curMem=133168, maxMem=997699092
[http-nio-8085-exec-4] INFO org.apache.spark.storage.MemoryStore - Block broadcast_0_piece0 stored as bytes in memory (estimated size 18.1 KB, free 951.3 MB)
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_0_piece0 in memory on 10.1.2.61:48324 (size: 18.1 KB, free: 951.5 MB)
[http-nio-8085-exec-4] INFO org.apache.spark.storage.BlockManagerMaster - Updated info of block broadcast_0_piece0
[http-nio-8085-exec-3] WARN org.apache.spark.storage.BlockManager - Block broadcast_0_piece0 already exists on this machine; not re-adding it
[http-nio-8085-exec-4] INFO org.apache.spark.SparkContext - Created broadcast 0 from textFile at App.java:28
[http-nio-8085-exec-3] INFO org.apache.spark.SparkContext - Created broadcast 0 from textFile at App.java:28
[http-nio-8085-exec-4] INFO org.apache.hadoop.mapred.FileInputFormat - Total input paths to process : 26
[http-nio-8085-exec-3] INFO org.apache.hadoop.mapred.FileInputFormat - Total input paths to process : 26
[http-nio-8085-exec-3] INFO org.apache.spark.SparkContext - Starting job: count at App.java:34
[http-nio-8085-exec-4] INFO org.apache.spark.SparkContext - Starting job: count at App.java:34
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.DAGScheduler - Got job 0 (count at App.java:34) with 26 output partitions (allowLocal=false)
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.DAGScheduler - Got job 0 (count at App.java:34) with 26 output partitions (allowLocal=false)
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.DAGScheduler - Final stage: Stage 0(count at App.java:34)
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.DAGScheduler - Final stage: Stage 0(count at App.java:34)
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.DAGScheduler - Parents of final stage: List()
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.DAGScheduler - Parents of final stage: List()
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.DAGScheduler - Missing parents: List()
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.DAGScheduler - Missing parents: List()
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.DAGScheduler - Submitting Stage 0 (FilteredRDD[2] at filter at App.java:30), which has no missing parents
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.DAGScheduler - Submitting Stage 0 (FilteredRDD[2] at filter at App.java:30), which has no missing parents
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.storage.MemoryStore - ensureFreeSpace(3360) called with curMem=151680, maxMem=997699092
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.storage.MemoryStore - Block broadcast_1 stored as values in memory (estimated size 3.3 KB, free 951.3 MB)
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.storage.MemoryStore - ensureFreeSpace(2403) called with curMem=155040, maxMem=997699092
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.storage.MemoryStore - Block broadcast_1_piece0 stored as bytes in memory (estimated size 2.3 KB, free 951.3 MB)
[sparkDriver-akka.actor.default-dispatcher-14] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_1_piece0 in memory on 10.1.2.61:48324 (size: 2.3 KB, free: 951.5 MB)
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.storage.BlockManagerMaster - Updated info of block broadcast_1_piece0
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.SparkContext - Created broadcast 1 from broadcast at DAGScheduler.scala:838
[sparkDriver-akka.actor.default-dispatcher-2] WARN org.apache.spark.storage.BlockManager - Block broadcast_1 already exists on this machine; not re-adding it
[sparkDriver-akka.actor.default-dispatcher-2] WARN org.apache.spark.storage.BlockManager - Block broadcast_1_piece0 already exists on this machine; not re-adding it
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.SparkContext - Created broadcast 1 from broadcast at DAGScheduler.scala:838
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 26 missing tasks from Stage 0 (FilteredRDD[2] at filter at App.java:30)
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 26 missing tasks from Stage 0 (FilteredRDD[2] at filter at App.java:30)
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 0.0 with 26 tasks
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 0.0 with 26 tasks
[sparkDriver-akka.actor.default-dispatcher-4] INFO org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - Registered executor: Actor[akka.tcp://sparkExecutor@tootak:36739/user/Executor#648126472] with ID 0
[sparkDriver-akka.actor.default-dispatcher-4] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 0.0 (TID 0, tootak, ANY, 1361 bytes)
[sparkDriver-akka.actor.default-dispatcher-4] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 0.0 (TID 1, tootak, ANY, 1362 bytes)
[sparkDriver-akka.actor.default-dispatcher-4] INFO org.apache.spark.storage.BlockManagerMasterActor - Registering block manager tootak:38347 with 265.1 MB RAM, BlockManagerId(0, tootak, 38347)
[sparkDriver-akka.actor.default-dispatcher-4] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_1_piece0 in memory on tootak:38347 (size: 2.3 KB, free: 265.1 MB)
[sparkDriver-akka.actor.default-dispatcher-4] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 0.0 (TID 2, tootak, ANY, 1362 bytes)
[sparkDriver-akka.actor.default-dispatcher-4] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 0.0 (TID 3, tootak, ANY, 1362 bytes)
[task-result-getter-0] WARN org.apache.spark.scheduler.TaskSetManager - Lost task 0.0 in stage 0.0 (TID 0, tootak): java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1
at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2089)
at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1261)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1999)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:57)
at org.apache.spark.scheduler.Task.run(Task.scala:56)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:196)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
[sparkDriver-akka.actor.default-dispatcher-14] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.1 in stage 0.0 (TID 4, tootak, ANY, 1361 bytes)
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Lost task 1.0 in stage 0.0 (TID 1) on executor tootak: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 1]
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Lost task 2.0 in stage 0.0 (TID 2) on executor tootak: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 2]
[sparkDriver-akka.actor.default-dispatcher-14] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.1 in stage 0.0 (TID 5, tootak, ANY, 1362 bytes)
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Lost task 3.0 in stage 0.0 (TID 3) on executor tootak: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 3]
[sparkDriver-akka.actor.default-dispatcher-14] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.1 in stage 0.0 (TID 6, tootak, ANY, 1362 bytes)
[sparkDriver-akka.actor.default-dispatcher-14] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.1 in stage 0.0 (TID 7, tootak, ANY, 1362 bytes)
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Lost task 0.1 in stage 0.0 (TID 4) on executor tootak: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 4]
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Lost task 2.1 in stage 0.0 (TID 5) on executor tootak: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 5]
[sparkDriver-akka.actor.default-dispatcher-4] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.2 in stage 0.0 (TID 8, tootak, ANY, 1362 bytes)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Lost task 3.1 in stage 0.0 (TID 6) on executor tootak: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 6]
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.2 in stage 0.0 (TID 9, tootak, ANY, 1362 bytes)
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Lost task 1.1 in stage 0.0 (TID 7) on executor tootak: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 7]
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.2 in stage 0.0 (TID 10, tootak, ANY, 1362 bytes)
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Lost task 2.2 in stage 0.0 (TID 8) on executor tootak: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 8]
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.3 in stage 0.0 (TID 11, tootak, ANY, 1362 bytes)
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Lost task 3.2 in stage 0.0 (TID 9) on executor tootak: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 9]
[sparkDriver-akka.actor.default-dispatcher-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.3 in stage 0.0 (TID 12, tootak, ANY, 1362 bytes)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Lost task 1.2 in stage 0.0 (TID 10) on executor tootak: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 10]
[sparkDriver-akka.actor.default-dispatcher-14] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.3 in stage 0.0 (TID 13, tootak, ANY, 1362 bytes)
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Lost task 2.3 in stage 0.0 (TID 11) on executor tootak: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 11]
[task-result-getter-3] ERROR org.apache.spark.scheduler.TaskSetManager - Task 2 in stage 0.0 failed 4 times; aborting job
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Lost task 3.3 in stage 0.0 (TID 12) on executor tootak: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 12]
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Lost task 1.3 in stage 0.0 (TID 13) on executor tootak: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 13]
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 0.0, whose tasks have all completed, from pool
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Cancelling stage 0
[http-nio-8085-exec-4] INFO org.apache.spark.scheduler.DAGScheduler - Job 0 failed: count at App.java:34, took 1.457926 s
[http-nio-8085-exec-4] ERROR spark.webserver.MatcherFilter -
org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 0.0 failed 4 times, most recent failure: Lost task 2.3 in stage 0.0 (TID 11, tootak): java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1
at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2089)
at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1261)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1999)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:57)
at org.apache.spark.scheduler.Task.run(Task.scala:56)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:196)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1214)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1203)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1202)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1202)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:696)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:696)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:696)
at org.apache.spark.scheduler.DAGSchedulerEventProcessActor$$anonfun$receive$2.applyOrElse(DAGScheduler.scala:1420)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.aroundReceive(DAGScheduler.scala:1375)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
[sparkDriver-akka.actor.default-dispatcher-14] INFO org.apache.spark.storage.BlockManager - Removing broadcast 1
[sparkDriver-akka.actor.default-dispatcher-14] INFO org.apache.spark.storage.BlockManager - Removing block broadcast_1
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.storage.BlockManagerInfo - Removed broadcast_1_piece0 on tootak:38347 in memory (size: 2.3 KB, free: 265.1 MB)
[sparkDriver-akka.actor.default-dispatcher-14] INFO org.apache.spark.storage.MemoryStore - Block broadcast_1 of size 3360 dropped from memory (free 997545009)
[sparkDriver-akka.actor.default-dispatcher-14] INFO org.apache.spark.storage.BlockManager - Removing block broadcast_1_piece0
[sparkDriver-akka.actor.default-dispatcher-14] INFO org.apache.spark.storage.MemoryStore - Block broadcast_1_piece0 of size 2403 dropped from memory (free 997547412)
[sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.storage.BlockManagerInfo - Removed broadcast_1_piece0 on 10.1.2.61:48324 in memory (size: 2.3 KB, free: 951.5 MB)
[sparkDriver-akka.actor.default-dispatcher-14] INFO org.apache.spark.storage.BlockManagerMaster - Updated info of block broadcast_1_piece0
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned broadcast 1
web.xml

<!DOCTYPE web-app PUBLIC
    "-//Sun Microsystems, Inc.//DTD Web Application 2.3//EN"
    "http://java.sun.com/dtd/web-app_2_3.dtd" >

<web-app>
    <display-name>Archetype Created Web Application</display-name>
    <filter>
        <filter-name>SparkFilter</filter-name>
        <filter-class>spark.servlet.SparkFilter</filter-class>
        <init-param>
            <param-name>applicationClass</param-name>
            <param-value>blah.App</param-value>
        </init-param>
    </filter>
    <filter-mapping>
        <filter-name>SparkFilter</filter-name>
        <url-pattern>/*</url-pattern>
    </filter-mapping>
</web-app>
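For context: spark.servlet.SparkFilter reads its applicationClass init-param, instantiates blah.App, and calls init() once at filter startup, which is how the /hello route gets registered. A hedged sketch of the same wiring done programmatically (Servlet 3.0 style; SparkFilterBootstrap is a name invented here, not part of the gist):

    import javax.servlet.FilterRegistration;
    import javax.servlet.ServletContextEvent;
    import javax.servlet.ServletContextListener;
    import javax.servlet.annotation.WebListener;

    // Equivalent of the web.xml above, registered from code instead.
    @WebListener
    public class SparkFilterBootstrap implements ServletContextListener {
        @Override
        public void contextInitialized(ServletContextEvent sce) {
            FilterRegistration.Dynamic reg = sce.getServletContext()
                    .addFilter("SparkFilter", spark.servlet.SparkFilter.class);
            reg.setInitParameter("applicationClass", "blah.App");
            // null dispatcher set means the default (REQUEST) dispatch only.
            reg.addMappingForUrlPatterns(null, true, "/*");
        }

        @Override
        public void contextDestroyed(ServletContextEvent sce) {
        }
    }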
pom.xml

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>temp</groupId>
    <artifactId>test</artifactId>
    <packaging>war</packaging>
    <version>0.0.1-SNAPSHOT</version>
    <name>test Maven Webapp</name>
    <url>http://maven.apache.org</url>
    <dependencies>
        <!-- Spark Java web framework (provides the get("/hello", ...) DSL) -->
        <dependency>
            <groupId>com.sparkjava</groupId>
            <artifactId>spark-core</artifactId>
            <version>2.1</version>
            <!-- Exclude the embedded Jetty; Tomcat is the servlet container here -->
            <exclusions>
                <exclusion>
                    <artifactId>jetty-server</artifactId>
                    <groupId>org.eclipse.jetty</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>jetty-webapp</artifactId>
                    <groupId>org.eclipse.jetty</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>jetty-security</artifactId>
                    <groupId>org.eclipse.jetty</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>jetty-servlet</artifactId>
                    <groupId>org.eclipse.jetty</groupId>
                </exclusion>
            </exclusions>
        </dependency>
        <!-- Apache Spark (unrelated to com.sparkjava despite the similar name) -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.2.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.2.0</version>
        </dependency>
    </dependencies>
    <build>
        <finalName>test</finalName>
        <plugins>
            <!-- Compile with Java 8 so the route and filter lambdas work -->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.2</version>
                <inherited>true</inherited>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
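Because the pom packages the application as a war, the compiled classes are deployed exploded under WEB-INF/classes rather than as a jar, and that exploded path is exactly what setJars ends up pointing at. A hedged sketch of one way around this (ClassesJar and all paths are invented for illustration): zip the classes directory into a temporary jar at startup and hand that to setJars, so executors receive something they can actually put on their classpath.

    import java.io.IOException;
    import java.io.UncheckedIOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.jar.JarEntry;
    import java.util.jar.JarOutputStream;
    import java.util.stream.Stream;

    // Hypothetical helper, not part of the gist.
    public final class ClassesJar {
        public static String create(Path classesDir) throws IOException {
            Path jar = Files.createTempFile("webapp-classes", ".jar");
            try (JarOutputStream out = new JarOutputStream(Files.newOutputStream(jar));
                 Stream<Path> files = Files.walk(classesDir)) {
                files.filter(Files::isRegularFile).forEach(p -> {
                    try {
                        // Jar entry names are '/'-separated, relative to the classes root.
                        out.putNextEntry(new JarEntry(
                                classesDir.relativize(p).toString().replace('\\', '/')));
                        Files.copy(p, out);
                        out.closeEntry();
                    } catch (IOException e) {
                        throw new UncheckedIOException(e);
                    }
                });
            }
            return jar.toString();
        }
    }

    // In App.init(), instead of the getProtectionDomain() path:
    // conf.setJars(new String[] { ClassesJar.create(
    //         java.nio.file.Paths.get("/path/to/webapp/WEB-INF/classes")) });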
@ashokjjo commented Sep 6, 2015
You have probably run it multiple times. You need to call sc.close() at the end, so that when you hit the endpoint again the job can rerun.
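This matches the warnings earlier in the log: "Another SparkContext is being constructed", the BindException on port 4040, and "A metric named ... already exists" all come from building a fresh context on every request without stopping the previous one. A hedged sketch of the commenter's advice applied to the handler (conf and sourcePath built as in App.init()):

    get("/hello", (req, res) -> {
        JavaSparkContext sc = new JavaSparkContext(conf);
        try {
            return Response.ok(sc.textFile(sourcePath).count()).build();
        } finally {
            // Release the SparkUI port and driver metric names for the next request.
            sc.stop();
        }
    });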
