public class UserValidator {
  private Cryptographer cryptographer;

  public boolean checkPassword(String userName, String password) {
    User user = UserGateway.findByName(userName);
    if (user != User.NULL) {
      String codedPhrase = user.getPhraseEncodedByPassword();
      String phrase = cryptographer.decrypt(codedPhrase, password);
      if ("Valid Password".equals(phrase)) {
        // Hidden side effect: a successful check also initializes the session
        Session.initialize();
        return true;
      }
    }
    return false;
  }
}
public class StaticClass {
  static {
    throwsException();
  }

  private static void throwsException() {
    throw new RuntimeException("bam!");
  }
}
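Loading a class whose static initializer throws is the interesting part: the first reference fails with an ExceptionInInitializerError wrapping the original exception, and any later reference to the same class fails with a NoClassDefFoundError because the class is marked as failed. A minimal sketch of that behaviour (the object name and the use of Class.forName are illustrative, not part of the original):

object StaticClassDemo extends App {
  // First reference: the static initializer runs, throws "bam!", and the JVM
  // reports it as an ExceptionInInitializerError.
  try Class.forName("StaticClass") catch {
    case e: ExceptionInInitializerError => println(s"first load: ${e.getCause.getMessage}")
  }
  // Second reference: the class is already marked as failed, so the JVM throws
  // NoClassDefFoundError instead of re-running the initializer.
  try Class.forName("StaticClass") catch {
    case e: NoClassDefFoundError => println(s"second load: $e")
  }
}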
import org.json4s._
import org.json4s.jackson.JsonMethods.parse
import scala.io.Source.fromURL

object SparkAppStats {
  /**
   * (partial) representation of a Spark Stage object
   */
  case class SparkStage(name: String, shuffleWriteBytes: Long, memoryBytesSpilled: Long, diskBytesSpilled: Long)
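Only the top of that object appears above. A minimal sketch of how such stage stats can be pulled from the Spark REST API with json4s follows; the object name, the /stages endpoint path and the placeholder host/app-id are assumptions, not part of the original:

import org.json4s._
import org.json4s.jackson.JsonMethods.parse
import scala.io.Source.fromURL

object SparkStageStatsSketch {
  case class SparkStage(name: String, shuffleWriteBytes: Long, memoryBytesSpilled: Long, diskBytesSpilled: Long)

  implicit val formats: Formats = DefaultFormats

  // Replace <host> and <app-id> with the driver host and the running application's id
  val stagesUrl = "http://<host>:4040/api/v1/applications/<app-id>/stages"

  def main(args: Array[String]): Unit = {
    val stages = parse(fromURL(stagesUrl).mkString).extract[List[SparkStage]]
    println(s"stages: ${stages.size}")
    println(s"total shuffle write bytes: ${stages.map(_.shuffleWriteBytes).sum}")
    println(s"total memory bytes spilled: ${stages.map(_.memoryBytesSpilled).sum}")
    println(s"total disk bytes spilled: ${stages.map(_.diskBytesSpilled).sum}")
  }
}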
package com.kenshoo.kripke.core

import com.yammer.metrics.Metrics
import com.yammer.metrics.core.{MetricName, Counter}
import org.apache.spark.Accumulator
import org.apache.spark.rdd.RDD
import scala.reflect.ClassTag

object CounterBackedAccumulatorUtil {
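The body of that object is not shown above. Judging from the imports, it ties a Spark Accumulator to a Yammer Counter; a hedged sketch of one way to do that (the object and method names, and the "count elements flowing through an RDD" semantics, are assumptions):

import com.yammer.metrics.Metrics
import com.yammer.metrics.core.{Counter, MetricName}
import org.apache.spark.Accumulator
import org.apache.spark.rdd.RDD
import scala.reflect.ClassTag

object CounterBackedAccumulatorSketch {
  // Increment a Spark accumulator once per element, without changing the RDD's contents
  def countElements[T: ClassTag](rdd: RDD[T], acc: Accumulator[Long]): RDD[T] =
    rdd.map { element =>
      acc += 1L
      element
    }

  // Mirror the accumulator's value into a Yammer counter; acc.value is only
  // meaningful on the driver, after an action has materialized the RDD.
  def publish(acc: Accumulator[Long], name: MetricName): Counter = {
    val counter = Metrics.newCounter(name)
    counter.inc(acc.value)
    counter
  }
}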
import com.yammer.metrics.Metrics
import com.yammer.metrics.core.Gauge
import org.apache.spark.SparkContext

/**
 * Created by tzachz on 10/21/15
 */
object SparkContextInstrumentor {
  def instrument(context: SparkContext): SparkContext = {
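The method body is cut off above. A minimal sketch of what instrumenting a SparkContext with a Yammer gauge can look like (the gauge name and the choice of executor count as the measured value are assumptions):

import com.yammer.metrics.Metrics
import com.yammer.metrics.core.Gauge
import org.apache.spark.SparkContext

object SparkContextInstrumentorSketch {
  def instrument(context: SparkContext): SparkContext = {
    // Expose the number of executors (block managers) currently known to the context as a gauge
    Metrics.newGauge(classOf[SparkContext], "executor-count", new Gauge[Int] {
      override def value(): Int = context.getExecutorMemoryStatus.size
    })
    context
  }
}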
import java.text.SimpleDateFormat
import java.util.Date
import org.json4s._
import org.json4s.jackson.JsonMethods.parse
import scala.io.Source.fromURL

object SparkAppStats {
  val url = "http://<host>:4040/api/v1/applications/<app-name>/jobs"
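Again only the beginning of the object is shown. A sketch of consuming the /jobs endpoint in the same style follows; the SparkJob fields mirror the REST API's job listing, while the object name and the timestamp format are assumptions:

import java.text.SimpleDateFormat
import java.util.Date
import org.json4s._
import org.json4s.jackson.JsonMethods.parse
import scala.io.Source.fromURL

object SparkJobStatsSketch {
  // Partial representation of a Spark Job object returned by the REST API
  case class SparkJob(jobId: Int, name: String, submissionTime: String, completionTime: Option[String], status: String)

  implicit val formats: Formats = DefaultFormats

  // The API reports timestamps like "2015-10-21T14:27:34.522GMT" (format assumed here)
  val timestampFormat = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSzzz")

  val url = "http://<host>:4040/api/v1/applications/<app-name>/jobs"

  def main(args: Array[String]): Unit = {
    val jobs = parse(fromURL(url).mkString).extract[List[SparkJob]]
    jobs.foreach { job =>
      val submitted: Date = timestampFormat.parse(job.submissionTime)
      println(s"job ${job.jobId} '${job.name}' is ${job.status}, submitted at $submitted")
    }
  }
}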
import java.io.{File, IOException}
import java.net.InetAddress
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD
import org.slf4j.{Logger, LoggerFactory}
import scala.util.{Failure, Try}

object LocalDiskHealthCheck {
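The check itself is not shown above. One plausible shape, sketched here, is to fan many small tasks out across the cluster, have each task try to write to the local disk of whichever executor runs it, and collect the host names that fail; the method name, the probe count and the temp-file approach are assumptions:

import java.io.{File, IOException}
import java.net.InetAddress
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD
import org.slf4j.{Logger, LoggerFactory}
import scala.util.{Failure, Try}

object LocalDiskHealthCheckSketch {
  val logger: Logger = LoggerFactory.getLogger(getClass)

  def findUnwritableHosts(sc: SparkContext, numProbes: Int = 1000): Seq[String] = {
    // Each probe tries to create and delete a temp file on the executor's local disk
    val failures: RDD[Option[String]] = sc.parallelize(1 to numProbes, numProbes).map { _ =>
      val host = InetAddress.getLocalHost.getHostName
      Try {
        val f = File.createTempFile("disk-health-check", ".tmp")
        if (!f.delete()) throw new IOException(s"could not delete ${f.getAbsolutePath}")
      } match {
        case Failure(_) => Some(host)
        case _ => None
      }
    }
    val badHosts = failures.collect().flatten.distinct.toSeq
    badHosts.foreach(host => logger.error(s"local disk health check failed on $host"))
    badHosts
  }
}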
import org.apache.spark.SparkContext
import org.apache.spark.sql.expressions.{MutableAggregationBuffer, UserDefinedAggregateFunction}
import org.apache.spark.sql.types._
import org.apache.spark.sql.{Column, Row, SQLContext}

/**
 * UDAF combining maps, overriding any duplicate key with "latest" value
 * @param keyType DataType of Map key
 * @param valueType DataType of Map value
 * @param merge function to merge values of identical keys
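The class itself is cut off above. Leaving out the merge function (this sketch simply lets the value seen last win on duplicate keys), a map-combining UDAF built on Spark's UserDefinedAggregateFunction can look roughly like this; the class name and field names are illustrative:

import org.apache.spark.sql.Row
import org.apache.spark.sql.expressions.{MutableAggregationBuffer, UserDefinedAggregateFunction}
import org.apache.spark.sql.types._

class MapCombineUDAF(keyType: DataType, valueType: DataType) extends UserDefinedAggregateFunction {

  private val mapType = MapType(keyType, valueType)

  override def inputSchema: StructType = StructType(Seq(StructField("value", mapType)))
  override def bufferSchema: StructType = StructType(Seq(StructField("combined", mapType)))
  override def dataType: DataType = mapType
  override def deterministic: Boolean = true

  override def initialize(buffer: MutableAggregationBuffer): Unit =
    buffer(0) = Map.empty[Any, Any]

  // Per-row update: keys from the incoming map override keys already in the buffer
  override def update(buffer: MutableAggregationBuffer, input: Row): Unit =
    if (!input.isNullAt(0))
      buffer(0) = buffer.getMap[Any, Any](0) ++ input.getMap[Any, Any](0)

  // Partition merge: same rule, the right-hand side wins on duplicate keys
  override def merge(buffer1: MutableAggregationBuffer, buffer2: Row): Unit =
    buffer1(0) = buffer1.getMap[Any, Any](0) ++ buffer2.getMap[Any, Any](0)

  override def evaluate(buffer: Row): Any = buffer.getMap[Any, Any](0)
}

An instance can then be applied directly in an aggregation, e.g. df.groupBy("k").agg(new MapCombineUDAF(StringType, LongType)(col("m"))).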
// Inspired by https://gist.github.com/abesto/cdcdd38263eacf1cbb51
// Task creates a .dot file with all inter-module dependencies
// Supports any depth of nested modules
task moduleDependencyReport {
    doLast {
        def file = new File("project-dependencies.dot")
        file.delete()
        file << "digraph {\n"
        file << "splines=ortho\n"
AnodotReporterConfiguration anodotConf =
    new DefaultAnodotReporterConfiguration("your-token", 60, "https://api.anodot.com/api/v1/metrics");
Anodot3ReporterBuilder.builderFor(anodotConf)
    .build(metricRegistry)
    .start();