// Generic recommender: the models' location lives in some context R,
// and reading it yields a collection C of models.
case class BrainRecommender[R[_], Path, C[_], CleanedData, KPI, T](
    actorSystem: ActorSystem,
    modelsLocation: R[Path],
    readModels: Path => C[Model[Continuous[T]]],
    rec: Seq[Continuous[T]] => Recommendation[T])(
    implicit opts: Optimizable[Continuous[T]],
    mR: Monad[R],
    rC: Read[R, Path, C, Model[Continuous[T]]],
    fC: Functor[C]) {

  def issueRecommendation(): R[C[Recommendation[T]]] = {
    // load the models from the resolved location
    val models = mR.map(modelsLocation)(readModels)
    val rCollOfOptimal = mR.map(models)(collOfModels => fC.map(collOfModels)(m => opts.optimize(m, x => Set.empty)))
    mR.map(rCollOfOptimal)(collOfOptimal => fC.map(collOfOptimal)(o => rec(o)))
  }
}
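The two-layer `mR.map(...)(fC.map(...))` pattern above can be sketched in a dependency-free way, with `Option` standing in for `R` and `List` for `C` (both hypothetical stand-ins; in the real class the layers come from the implicit `Monad[R]` and `Functor[C]` instances):

```scala
object NestedMapSketch extends App {
  // Hypothetical stand-ins: Option plays the role of R, List the role of C.
  val modelsLocation: Option[String] = Some("/models")
  val readModels: String => List[Int] = _ => List(1, 2, 3) // "models"
  val optimize: Int => Int = _ * 10                        // "optimal points"
  val rec: Int => String = o => s"recommend-$o"            // "recommendations"

  // Each step maps through the outer layer, then through the inner one:
  val models: Option[List[Int]]     = modelsLocation.map(readModels)
  val optimal: Option[List[Int]]    = models.map(_.map(optimize))
  val result: Option[List[String]]  = optimal.map(_.map(rec))

  println(result) // Some(List(recommend-10, recommend-20, recommend-30))
}
```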
import org.specs2.mutable.Specification
import org.specs2.ScalaCheck
import fpinscala.gettingstarted._

class MyModuleSpec extends Specification with ScalaCheck {
  "fib" should {
    "return 0 for the 0th index" in { MyModule.fib(0) mustEqual 0 }
    "return 1 for the 1st index" in { MyModule.fib(1) mustEqual 1 }
  }
}
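Since the spec mixes in ScalaCheck, the Fibonacci recurrence itself can also be checked as a property rather than only pinning base cases. A dependency-free sketch of that idea, using a local `fib` stand-in (hypothetical; the real one lives in `MyModule`):

```scala
object FibPropertySketch extends App {
  // Local stand-in for MyModule.fib: tail-recursive, 0-indexed.
  def fib(n: Int): Int = {
    @annotation.tailrec
    def go(n: Int, prev: Int, cur: Int): Int =
      if (n == 0) prev else go(n - 1, cur, prev + cur)
    go(n, 0, 1)
  }

  // Property: every element is the sum of the two before it.
  assert((0 to 20).forall(n => fib(n + 2) == fib(n) + fib(n + 1)))

  println((0 to 7).map(fib).mkString(", ")) // 0, 1, 1, 2, 3, 5, 8, 13
}
```

With the real ScalaCheck dependency this becomes a `prop { ... }` block over generated indices instead of a fixed range.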
diff --git a/report-core/src/main/scala/mediative/util/wrapper/FileWrapper.scala b/report-core/src/main/scala/mediative/util/wrapper/FileWrapper.scala
new file mode 100644
index 0000000..acbdfd8
--- /dev/null
+++ b/report-core/src/main/scala/mediative/util/wrapper/FileWrapper.scala
@@ -0,0 +1,42 @@
+package mediative.util.wrapper
+
+import java.io
+
import org.apache.spark.sql.types._

case class CC1(i: Int, s: Option[Int])
case class MockExpected2(i: Int, s: String, d: Option[Double], cc1: Option[CC1])

object MockExpected2 {
  // Completed from the case-class shapes above: Option fields map to nullable columns.
  val expectedSchema2: StructType =
    StructType(Seq(
      StructField(name = "cc1", dataType = StructType(Seq(
        StructField(name = "i", dataType = IntegerType, nullable = false),
        StructField(name = "s", dataType = IntegerType, nullable = true))), nullable = true),
      StructField(name = "i", dataType = IntegerType, nullable = false),
      StructField(name = "s", dataType = StringType, nullable = true),
      StructField(name = "d", dataType = DoubleType, nullable = true)))
}
import org.apache.spark.SparkContext

case class B(d: Int)
case class C(c: Int)
case class A(b: B, i: Int, c: C)

val sc = new SparkContext()
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext._ // brings the implicit RDD-to-SchemaRDD conversion into scope (Spark 1.x)
val rdd = sc.parallelize(1 to 10).map(v => A(b = B(v), i = v, c = C(v)))
val fName = "lala.parquet"
rdd.saveAsParquetFile(fName)
def generateRandom(length: Int): String = scala.util.Random.alphanumeric.take(length).mkString("")

def timeInMs[R](block: => R): (Long, R) = {
  val t0 = System.currentTimeMillis()
  val result = block // call-by-name: the block is forced here, between the two timestamps
  val t1 = System.currentTimeMillis()
  ((t1 - t0), result)
}
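For example, timing the construction of a random string (the string's contents vary per run, so only its length and alphabet can be asserted):

```scala
object TimingExample extends App {
  def generateRandom(length: Int): String =
    scala.util.Random.alphanumeric.take(length).mkString("")

  def timeInMs[R](block: => R): (Long, R) = {
    val t0 = System.currentTimeMillis()
    val result = block
    val t1 = System.currentTimeMillis()
    ((t1 - t0), result)
  }

  val (elapsed, s) = timeInMs(generateRandom(16))
  assert(s.length == 16)
  assert(s.forall(_.isLetterOrDigit))
  println(s"built a ${s.length}-char string in ${elapsed}ms")
}
```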
// a Representation is just a wrapper around a type
trait Repr[T] extends wrappers.Base.Wrapper[T]

// one way of representing a value is by having the value itself:
case class Eval[T](value: T) extends Repr[T]
// or:
case class Identity[T](value: T) extends Repr[T]

// another way is to println() the value before returning it:
case class Debug[T](raw: T) extends Repr[T] {
  def value: T = { println(raw.toString); raw }
}
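A self-contained sketch of how the representations behave, using a simplified `Repr` that drops the external `wrappers.Base.Wrapper` dependency (an assumption made only so the snippet runs on its own):

```scala
object ReprSketch extends App {
  // Simplified stand-in for the real Repr: just exposes a value.
  trait Repr[T] { def value: T }
  case class Eval[T](value: T) extends Repr[T]
  case class Debug[T](raw: T) extends Repr[T] {
    def value: T = { println(raw.toString); raw }
  }

  val plain: Repr[Int] = Eval(42)
  val noisy: Repr[Int] = Debug(42)

  assert(plain.value == 42) // silent
  assert(noisy.value == 42) // also prints "42" first
}
```

The point of the abstraction: callers program against `Repr[T]` and stay oblivious to whether extracting the value has a side effect.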
/**
 * We want to register a table with Spark SQL whose rows are composed of 2 Longs and 1 String.
 * We would like to put restrictions on the shape of the String: for example, that it doesn't
 * contain non-alphanumeric characters (or whatever...)
 * Let's define what we want:
 */
// CleanString defined somewhere else
case class MyCaseClass(firstP: Long, secondP: Long, thirdP: CleanString)
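`CleanString` is defined elsewhere; one plausible shape for it (an assumption, not the original definition) is a smart constructor that only admits alphanumeric input, so that a `CleanString` value cannot exist unless the restriction holds:

```scala
object CleanStringSketch extends App {
  // Hypothetical stand-in for the real CleanString: the constructor is
  // private, so the only way in is through the validating fromString.
  final case class CleanString private (value: String)
  object CleanString {
    def fromString(s: String): Option[CleanString] =
      if (s.nonEmpty && s.forall(_.isLetterOrDigit)) Some(CleanString(s))
      else None
  }

  assert(CleanString.fromString("abc123").isDefined)
  assert(CleanString.fromString("not ok!").isEmpty)
}
```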
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

/**
 * I am trying to narrow down an Exception thrown by Spark when using "Factories".
 * The factories have parameters that are used in the classes' functions.
 *
 * To run this code: copy-paste the whole content into a Spark shell, then execute
 * Test.theMain(sc).
 */
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD
import scala.util.control.Exception._
import scala.util.parsing.combinator.Parsers
import scala.util.parsing.input.{CharSequenceReader, Reader, OffsetPosition}
import scala.language.postfixOps

case class myCaseClass(c: Char) extends Serializable