@nsivabalan
Created August 17, 2021 04:17
21/08/17 04:15:02 INFO netty.NettyBlockTransferService: Server created on ip-172-31-33-9.us-east-2.compute.internal:38471
21/08/17 04:15:02 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
21/08/17 04:15:02 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, ip-172-31-33-9.us-east-2.compute.internal, 38471, None)
21/08/17 04:15:02 INFO storage.BlockManagerMasterEndpoint: Registering block manager ip-172-31-33-9.us-east-2.compute.internal:38471 with 2.7 GiB RAM, BlockManagerId(driver, ip-172-31-33-9.us-east-2.compute.internal, 38471, None)
21/08/17 04:15:02 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, ip-172-31-33-9.us-east-2.compute.internal, 38471, None)
21/08/17 04:15:02 INFO storage.BlockManager: external shuffle service port = 7337
21/08/17 04:15:02 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, ip-172-31-33-9.us-east-2.compute.internal, 38471, None)
21/08/17 04:15:02 INFO ui.ServerInfo: Adding filter to /metrics/json: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
21/08/17 04:15:02 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@798dad3d{/metrics/json,null,AVAILABLE,@Spark}
21/08/17 04:15:02 ERROR spark.SparkContext: Error initializing SparkContext.
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.s3a.S3AFileSystem not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2595)
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3269)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3301)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3352)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3320)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:479)
at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1853)
at org.apache.spark.deploy.history.EventLogFileWriter.<init>(EventLogFileWriters.scala:60)
at org.apache.spark.deploy.history.SingleEventLogFileWriter.<init>(EventLogFileWriters.scala:213)
at org.apache.spark.deploy.history.EventLogFileWriter$.apply(EventLogFileWriters.scala:181)
at org.apache.spark.scheduler.EventLoggingListener.<init>(EventLoggingListener.scala:64)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:576)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at org.apache.hudi.utilities.UtilHelpers.buildSparkContext(UtilHelpers.java:260)
at org.apache.hudi.integ.testsuite.HoodieTestSuiteJob.main(HoodieTestSuiteJob.java:168)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.s3a.S3AFileSystem not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2499)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2593)
... 27 more
21/08/17 04:15:02 INFO server.AbstractConnector: Stopped Spark@48bfb884{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
21/08/17 04:15:02 INFO ui.SparkUI: Stopped Spark web UI at http://ip-172-31-33-9.us-east-2.compute.internal:4040
21/08/17 04:15:02 INFO cluster.YarnClientSchedulerBackend: Interrupting monitor thread
21/08/17 04:15:02 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
21/08/17 04:15:02 INFO cluster.YarnClientSchedulerBackend: Shutting down all executors
21/08/17 04:15:02 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
21/08/17 04:15:02 INFO cluster.YarnClientSchedulerBackend: YARN client scheduler backend Stopped
21/08/17 04:15:02 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
21/08/17 04:15:02 INFO memory.MemoryStore: MemoryStore cleared
21/08/17 04:15:02 INFO storage.BlockManager: BlockManager stopped
21/08/17 04:15:02 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
21/08/17 04:15:02 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
21/08/17 04:15:02 INFO spark.SparkContext: Successfully stopped SparkContext
Exception in thread "main" java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.s3a.S3AFileSystem not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2595)
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3269)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3301)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3352)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3320)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:479)
at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1853)
at org.apache.spark.deploy.history.EventLogFileWriter.<init>(EventLogFileWriters.scala:60)
at org.apache.spark.deploy.history.SingleEventLogFileWriter.<init>(EventLogFileWriters.scala:213)
at org.apache.spark.deploy.history.EventLogFileWriter$.apply(EventLogFileWriters.scala:181)
at org.apache.spark.scheduler.EventLoggingListener.<init>(EventLoggingListener.scala:64)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:576)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at org.apache.hudi.utilities.UtilHelpers.buildSparkContext(UtilHelpers.java:260)
at org.apache.hudi.integ.testsuite.HoodieTestSuiteJob.main(HoodieTestSuiteJob.java:168)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.s3a.S3AFileSystem not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2499)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2593)
... 27 more
21/08/17 04:15:02 INFO util.ShutdownHookManager: Shutdown hook called
21/08/17 04:15:02 INFO util.ShutdownHookManager: Deleting directory /mnt/tmp/spark-3ddbf49a-8fb4-402e-8469-52d4390717de
21/08/17 04:15:02 INFO util.ShutdownHookManager: Deleting directory /mnt/tmp/spark-b5033a2c-39ec-4f0c-9dfd-f14339cd808c
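The failure above is `ClassNotFoundException: org.apache.hadoop.fs.s3a.S3AFileSystem`: the driver tries to open an `s3a://` filesystem (here for the Spark event log) but the `hadoop-aws` module, which provides that class, is not on the classpath. A minimal sketch of a fix is to ship `hadoop-aws` (and its matching AWS SDK bundle) with the job; the version numbers and the event-log bucket below are placeholders, not values taken from this log, and must be matched to the cluster's Hadoop version:

```shell
# Hypothetical spark-submit fragment: pull in hadoop-aws so the S3A
# filesystem class is available to the driver and executors.
# 3.2.1 / 1.11.901 are example versions; align them with your Hadoop build.
spark-submit \
  --packages org.apache.hadoop:hadoop-aws:3.2.1,com.amazonaws:aws-java-sdk-bundle:1.11.901 \
  --conf spark.eventLog.enabled=true \
  --conf spark.eventLog.dir=s3a://my-bucket/spark-logs/ \
  --class org.apache.hudi.integ.testsuite.HoodieTestSuiteJob \
  hudi-integ-test-bundle.jar
```

On EMR, an alternative is to place the jars under `$SPARK_HOME/jars` or add them via `--jars`; the key constraint either way is that `hadoop-aws` and the AWS SDK versions agree with the cluster's Hadoop release, since mismatches surface as similar `ClassNotFoundException` or `NoSuchMethodError` failures.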