To use custom S3 endpoints with the latest Spark distribution, the hadoop-aws package must be added as an external dependency. Custom endpoints can then be configured according to the Hadoop S3A docs.
bin/spark-shell --packages org.apache.hadoop:hadoop-aws:2.7.2
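Once hadoop-aws is on the classpath, the endpoint can be set through the Hadoop configuration from within the shell. A minimal sketch follows; the endpoint URL, credentials, and bucket name are placeholders, assuming a local S3-compatible service:

// Point S3A at a custom endpoint from the spark-shell.
// Endpoint URL, credentials, and bucket below are placeholders.
sc.hadoopConfiguration.set("fs.s3a.endpoint", "http://localhost:9000")
sc.hadoopConfiguration.set("fs.s3a.access.key", "ACCESS_KEY")
sc.hadoopConfiguration.set("fs.s3a.secret.key", "SECRET_KEY")
val lines = sc.textFile("s3a://my-bucket/data.txt")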
//==================================================================
// SPARK INSTRUMENTATION
//==================================================================
import com.codahale.metrics.{MetricRegistry, Meter, Gauge}
import org.apache.spark.{SparkEnv, Accumulator}
import org.apache.spark.metrics.source.Source
import org.joda.time.DateTime
import scala.collection.mutable
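Building on these imports, a custom source can expose application metrics through Spark's metrics system. The sketch below is illustrative, not the original author's code: the class and metric names are assumptions, and since Source and SparkEnv.metricsSystem are private[spark] in many Spark versions, such a class typically has to be declared inside the org.apache.spark package.

// A minimal sketch of a custom metrics source using the imports above.
// Class and metric names are illustrative; declare the class under the
// org.apache.spark package if Source is private[spark] in your version.
class AppInstrumentation extends Source {
  override val sourceName: String = "app.instrumentation"
  override val metricRegistry: MetricRegistry = new MetricRegistry()

  // Meter: counts processed records and tracks their rate over time.
  val recordsProcessed: Meter = metricRegistry.meter("recordsProcessed")

  // Gauge: reports the wall-clock time (epoch millis) of the last batch.
  @volatile private var lastBatchTime: Long = new DateTime().getMillis
  metricRegistry.register("lastBatchTime", new Gauge[Long] {
    override def getValue: Long = lastBatchTime
  })

  def markBatchDone(): Unit = { lastBatchTime = new DateTime().getMillis }
}

// Register the source so configured sinks (JMX, Graphite, ...) report it.
SparkEnv.get.metricsSystem.registerSource(new AppInstrumentation)

From the application's processing loop, recordsProcessed.mark() is then called per record, and markBatchDone() after each completed batch.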