Created
July 17, 2024 22:28
24/07/17 21:30:49 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
24/07/17 21:30:49 INFO DefaultNoHARMFailoverProxyProvider: Connecting to ResourceManager at yarn-resourcemanager/172.26.0.3:8032
24/07/17 21:30:49 INFO Configuration: resource-types.xml not found
24/07/17 21:30:49 INFO ResourceUtils: Unable to find 'resource-types.xml'.
24/07/17 21:30:49 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
24/07/17 21:30:49 INFO Client: Will allocate AM container, with 1408 MB memory including 384 MB overhead
24/07/17 21:30:49 INFO Client: Setting up container launch context for our AM
24/07/17 21:30:49 INFO Client: Setting up the launch environment for our AM container
24/07/17 21:30:49 INFO Client: Preparing resources for our AM container
24/07/17 21:30:49 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
24/07/17 21:30:52 INFO Client: Uploading resource file:/tmp/spark-2759cd36-49af-420e-ae36-d252500538ad/__spark_libs__14017948242901530524.zip -> file:/home/spark_user/.sparkStaging/application_1721244785338_0002/__spark_libs__14017948242901530524.zip
24/07/17 21:30:52 INFO Client: Uploading resource file:/opt/bitnami/spark/examples/jars/spark-examples_2.12-3.5.1.jar -> file:/home/spark_user/.sparkStaging/application_1721244785338_0002/spark-examples_2.12-3.5.1.jar
24/07/17 21:30:53 INFO Client: Uploading resource file:/tmp/spark-2759cd36-49af-420e-ae36-d252500538ad/__spark_conf__4046546026871803866.zip -> file:/home/spark_user/.sparkStaging/application_1721244785338_0002/__spark_conf__.zip
24/07/17 21:30:53 INFO SecurityManager: Changing view acls to: spark_user,spark
24/07/17 21:30:53 INFO SecurityManager: Changing modify acls to: spark_user,spark
24/07/17 21:30:53 INFO SecurityManager: Changing view acls groups to:
24/07/17 21:30:53 INFO SecurityManager: Changing modify acls groups to:
24/07/17 21:30:53 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: spark_user, spark; groups with view permissions: EMPTY; users with modify permissions: spark_user, spark; groups with modify permissions: EMPTY
24/07/17 21:30:53 INFO Client: Submitting application application_1721244785338_0002 to ResourceManager
24/07/17 21:30:53 INFO YarnClientImpl: Submitted application application_1721244785338_0002
24/07/17 21:30:54 INFO Client: Application report for application_1721244785338_0002 (state: ACCEPTED)
24/07/17 21:30:54 INFO Client:
	 client token: N/A
	 diagnostics: [Wed Jul 17 21:30:53 +0000 2024] Application is Activated, waiting for resources to be assigned for AM. Details : AM Partition = <DEFAULT_PARTITION> ; Partition Resource = <memory:8192, vCores:8> ; Queue's Absolute capacity = 100.0 % ; Queue's Absolute used capacity = 0.0 % ; Queue's Absolute max capacity = 100.0 % ; Queue's capacity (absolute resource) = <memory:8192, vCores:8> ; Queue's used capacity (absolute resource) = <memory:0, vCores:0> ; Queue's max capacity (absolute resource) = <memory:8192, vCores:8> ;
	 ApplicationMaster host: N/A
	 ApplicationMaster RPC port: -1
	 queue: default
	 start time: 1721251853191
	 final status: UNDEFINED
	 tracking URL: http://b967e49687bc:8088/proxy/application_1721244785338_0002/
	 user: spark_user
24/07/17 21:30:55 INFO Client: Application report for application_1721244785338_0002 (state: FAILED)
24/07/17 21:30:55 INFO Client:
	 client token: N/A
	 diagnostics: Application application_1721244785338_0002 failed 2 times due to AM Container for appattempt_1721244785338_0002_000002 exited with exitCode: -1000
Failing this attempt.Diagnostics: [2024-07-17 21:30:54.966]File file:/home/spark_user/.sparkStaging/application_1721244785338_0002/__spark_libs__14017948242901530524.zip does not exist
java.io.FileNotFoundException: File file:/home/spark_user/.sparkStaging/application_1721244785338_0002/__spark_libs__14017948242901530524.zip does not exist
	at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:915)
	at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:1236)
	at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:905)
	at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:462)
	at org.apache.hadoop.yarn.util.FSDownload.verifyAndCopy(FSDownload.java:275)
	at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:72)
	at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:425)
	at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:422)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
	at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:422)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.doDownloadCall(ContainerLocalizer.java:247)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:240)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:228)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
For more detailed output, check the application tracking page: http://b967e49687bc:8088/cluster/app/application_1721244785338_0002 Then click on links to logs of each attempt.
. Failing the application.
	 ApplicationMaster host: N/A
	 ApplicationMaster RPC port: -1
	 queue: default
	 start time: 1721251853191
	 final status: FAILED
	 tracking URL: http://b967e49687bc:8088/cluster/app/application_1721244785338_0002
	 user: spark_user
24/07/17 21:30:55 INFO Client: Deleted staging directory file:/home/spark_user/.sparkStaging/application_1721244785338_0002
24/07/17 21:30:55 ERROR Client: Application diagnostics message: Application application_1721244785338_0002 failed 2 times due to AM Container for appattempt_1721244785338_0002_000002 exited with exitCode: -1000
Failing this attempt.Diagnostics: [2024-07-17 21:30:54.966]File file:/home/spark_user/.sparkStaging/application_1721244785338_0002/__spark_libs__14017948242901530524.zip does not exist
java.io.FileNotFoundException: File file:/home/spark_user/.sparkStaging/application_1721244785338_0002/__spark_libs__14017948242901530524.zip does not exist
	at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:915)
	at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:1236)
	at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:905)
	at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:462)
	at org.apache.hadoop.yarn.util.FSDownload.verifyAndCopy(FSDownload.java:275)
	at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:72)
	at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:425)
	at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:422)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
	at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:422)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.doDownloadCall(ContainerLocalizer.java:247)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:240)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:228)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
For more detailed output, check the application tracking page: http://b967e49687bc:8088/cluster/app/application_1721244785338_0002 Then click on links to logs of each attempt.
. Failing the application.
Exception in thread "main" org.apache.spark.SparkException: Application application_1721244785338_0002 finished with failed status
	at org.apache.spark.deploy.yarn.Client.run(Client.scala:1309)
	at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1742)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:1029)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:194)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:217)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1120)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1129)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
24/07/17 21:30:55 INFO ShutdownHookManager: Shutdown hook called
24/07/17 21:30:55 INFO ShutdownHookManager: Deleting directory /tmp/spark-fc7d2147-bc62-417a-8a9f-e6e8f5d5942c
24/07/17 21:30:55 INFO ShutdownHookManager: Deleting directory /tmp/spark-2759cd36-49af-420e-ae36-d252500538ad
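A likely reading of this failure: the client staged resources under the local `file:` filesystem (note the WARN that neither `spark.yarn.jars` nor `spark.yarn.archive` is set, and the `file:/home/spark_user/.sparkStaging/...` upload paths), so the YARN NodeManager, running in a different container, cannot localize the archive and fails with `FileNotFoundException`. A minimal sketch of one possible fix, assuming HDFS is reachable at the hypothetical address `hdfs://namenode:9000` (all paths below are placeholders, not taken from this log):

```shell
# Sketch: stage Spark's jars on a filesystem every node can read,
# so the NodeManager can actually fetch them during localization.
# hdfs://namenode:9000 and the paths are assumptions -- substitute
# your cluster's values.
hdfs dfs -mkdir -p /spark/jars
hdfs dfs -put "$SPARK_HOME"/jars/*.jar /spark/jars/

spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.yarn.jars=hdfs://namenode:9000/spark/jars/*.jar \
  --conf spark.yarn.stagingDir=hdfs://namenode:9000/user/spark_user \
  "$SPARK_HOME"/examples/jars/spark-examples_2.12-3.5.1.jar
```

Alternatively, setting `fs.defaultFS` in `core-site.xml` to the shared filesystem makes the staging directory default to HDFS without extra `spark-submit` flags.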