In PySpark, you can use:

```python
persisted_rdds = spark.sparkContext._jsc.getPersistentRDDs()
for (rdd_id, rdd) in persisted_rdds.items():
    rdd.unpersist()
```

where `spark` is of type `pyspark.sql.session.SparkSession`. Note that `_jsc` is an internal (underscore-prefixed) attribute exposing the Java `SparkContext`, so this relies on non-public API.