A class cannot be Serializable if it contains references to
Spark structures (e.g. SparkSession, SparkConf) as attributes.
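For instance (a minimal sketch; DriverSideHelper and tag are hypothetical names), a helper class that keeps the context as a field cannot be shipped to the executors, even if it is declared Serializable:

import java.io.Serializable;
import org.apache.spark.api.java.JavaSparkContext;

// Declared Serializable, but serialization still fails at runtime
// because the JavaSparkContext field is itself not serializable.
class DriverSideHelper implements Serializable {
    private final JavaSparkContext sc;  // driver-only structure

    DriverSideHelper(JavaSparkContext sc) { this.sc = sc; }

    String tag(String s) { return "tag:" + s; }
}

Using an instance of this class inside rdd.map(...) throws java.io.NotSerializableException naming JavaSparkContext.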
Job aborted due to stage failure: Task not serializable:
If you see this error:
org.apache.spark.SparkException: Job aborted due to stage failure: Task not serializable: java.io.NotSerializableException: ...
The above error can be triggered when you initialize a variable on the driver (master) but then try to use it on one of the workers. In that case, Spark will try to serialize the object to ship it to the workers, and will fail if the object is not serializable. Consider the following code snippet:
NotSerializable notSerializable = new NotSerializable(); // created on the driver
JavaRDD<String> rdd = sc.textFile("/tmp/myfile");
// The lambda captures notSerializable, so Spark must serialize it
// to ship the task to the executors -- and fails.
rdd.map(s -> notSerializable.doSomething(s)).collect();
This will trigger the error. Here are some ways to fix it:
Make the class Serializable (sketched after the snippet below).
Declare the instance only within the lambda function passed to map (sketched below).
Make the NotSerializable object static so it is created once per executor JVM (sketched below).
Call rdd.foreachPartition and create the NotSerializable object there, like this:
rdd.foreachPartition(iter -> {
    // Created on the executor, once per partition -- never serialized.
    NotSerializable notSerializable = new NotSerializable();
    // ... now process iter
});
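For completeness, here are minimal sketches of the first three options (NotSerializable and doSomething are the placeholder names from the snippet above; adapt them to your own class):

// Option 1: make the class itself Serializable.
class NotSerializable implements java.io.Serializable {
    String doSomething(String s) { return s.toUpperCase(); }
}

// Option 2: construct the instance inside the lambda. Only the lambda
// is serialized; the object is created on the executor at run time.
rdd.map(s -> new NotSerializable().doSomething(s)).collect();

// Option 3: hold the instance in a static field. Statics are not
// serialized with the closure; each executor JVM initializes its own.
class Holder {
    static final NotSerializable INSTANCE = new NotSerializable();
}
rdd.map(s -> Holder.INSTANCE.doSomething(s)).collect();

Note that option 2 creates a new object per record, which is why the foreachPartition variant above (one object per partition) is usually the better choice for expensive objects.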