Luis Da Costa ldacosta

ldacosta / Reprs.scala
Last active August 29, 2015 14:22
I am not sure how to get the `R` out of the way
case class BrainRecommender[R[_], Path, C[_], CleanedData, KPI, T](
    actorSystem: ActorSystem,
    modelsLocation: R[Path],
    readModels: Path => C[Model[Continuous[T]]],
    rec: Seq[Continuous[T]] => Recommendation[T])
   (implicit opts: Optimizable[Continuous[T]], mR: Monad[R], rC: Read[R, Path, C, Model[Continuous[T]]], fC: Functor[C]) {
  def issueRecommendation(): C[BrainRecommendation] = {
    val models = mR.map(modelsLocation)(p => fC.map(readModels(p))(m => m))
    val RCollOfOptimal = mR.map(models)(collOfModels => fC.map(collOfModels)(m => opts.optimize(m, x => Set.empty)))
    // NB: this is an R[C[...]], not the declared C[BrainRecommendation] -- hence the question above
    mR.map(RCollOfOptimal)(collOfOptimal => fC.map(collOfOptimal)(o => rec(o)))
  }
}
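One way to attack the question, sketched below with a minimal, self-contained SimpleFunctor rather than the gist's actual type classes: compose the R and C functors once, so the nested mR.map(...)(fC.map(...)) pattern collapses into a single map over R[C[_]].

trait SimpleFunctor[F[_]] { def map[A, B](fa: F[A])(f: A => B): F[B] }

// the composite functor over R[C[_]] maps over the innermost value directly
def composeFunctors[R[_], C[_]](fR: SimpleFunctor[R], fC: SimpleFunctor[C]): SimpleFunctor[({ type L[A] = R[C[A]] })#L] =
  new SimpleFunctor[({ type L[A] = R[C[A]] })#L] {
    def map[A, B](rca: R[C[A]])(f: A => B): R[C[B]] =
      fR.map(rca)(ca => fC.map(ca)(f))
  }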
ldacosta / testSpecs2.scala
Created April 20, 2015 09:34
Tests Specs 2
import org.specs2.mutable.Specification
import org.specs2.ScalaCheck

import fpinscala.gettingstarted._

class MyModuleSpec extends Specification with ScalaCheck {
  "fib" should {
    "return 0 for 0th index" in { MyModule.fib(0) mustEqual 0 }
    "return 1 for 1st index" in { MyModule.fib(1) mustEqual 1 }
  }
}
diff --git a/report-core/src/main/scala/mediative/util/wrapper/FileWrapper.scala b/report-core/src/main/scala/mediative/util/wrapper/FileWrapper.scala
new file mode 100644
index 0000000..acbdfd8
--- /dev/null
+++ b/report-core/src/main/scala/mediative/util/wrapper/FileWrapper.scala
@@ -0,0 +1,42 @@
+package mediative.util.wrapper
+
+import java.io
+
ldacosta / mock.scala
Created March 31, 2015 18:08
StructField from case class
import org.apache.spark.SparkContext
import org.apache.spark.sql._

case class CC1(i: Int, s: Option[Int])
case class MockExpected2(i: Int, s: String, d: Option[Double], cc1: Option[CC1])

object MockExpected2 {
  val expectedSchema2: StructType = {
    StructType(Seq(
      StructField(name = "cc1", dataType = StructType({
        // (the remainder of the expected schema is cut off in this preview)

// second snippet: nested case classes saved to Parquet
case class B(d: Int)
case class C(c: Int)
case class A(b: B, i: Int, c: C)

val sc = new SparkContext()
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext._

val rdd = sc.parallelize(1 to 10).map(v => A(b = B(v), i = v, C(v)))
val fName = "lala.parquet"
rdd.saveAsParquetFile(fName)
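Since the gist's theme is deriving a StructField from a case class, note that Spark can derive such a schema itself via catalyst reflection. A sketch against the pre-1.3 API used above (ScalaReflection lives in an internal package, so this may differ between releases):

import org.apache.spark.sql.catalyst.ScalaReflection

// derive the StructType for CC1 instead of spelling out each StructField by hand
val cc1Schema = ScalaReflection.schemaFor[CC1].dataType.asInstanceOf[StructType]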
ldacosta / cmpSpeed.scala
Created November 28, 2014 10:28
What is the fastest way to fill up a Map?
def generateRandom(length: Int): String =
  scala.util.Random.alphanumeric.take(length).mkString("")

def timeInMs[R](block: => R): (Long, R) = {
  val t0 = System.currentTimeMillis()
  val result = block // call-by-name
  val t1 = System.currentTimeMillis()
  ((t1 - t0), result)
}
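The gist's question can be answered with these helpers. A sketch of one such comparison (hypothetical sizes and value function), pitting an immutable Map built by folding against a mutable Map filled in place:

val keys = (1 to 100000).map(_ => generateRandom(8))

val (tImmutable, _) = timeInMs {
  keys.foldLeft(Map.empty[String, Int])((m, k) => m + (k -> k.length))
}

val (tMutable, _) = timeInMs {
  val m = scala.collection.mutable.Map.empty[String, Int]
  keys.foreach(k => m += (k -> k.length))
  m.toMap
}

println(s"immutable fold: $tImmutable ms, mutable fill: $tMutable ms")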
// a Representation is just a wrapper around a type
trait Repr[T] extends wrappers.Base.Wrapper[T]

// one way of representing a value is by holding the value itself:
case class Eval[T](value: T) extends Repr[T]
// or:
case class Identity[T](value: T) extends Repr[T]

// another way is to println() the value before returning it:
case class Debug[T](raw: T) extends Repr[T] {
  def value: T = { println(raw.toString); raw }
}
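A hypothetical caller, assuming wrappers.Base.Wrapper[T] exposes def value: T: code written against Repr[T] is oblivious to which representation it receives, so the Debug side effect can be swapped in transparently.

def describe[T](r: Repr[T]): String = s"value is ${r.value}"

describe(Eval(42))   // returns "value is 42"
describe(Debug(42))  // prints "42" first, then returns "value is 42"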
ldacosta / SparkSQL-UDFTypes.scala
Last active August 29, 2015 14:08
Spark SQL: register a table using a case class that has user-defined types
/**
 * We want to register a table with Spark SQL whose rows are composed of 2 Longs and 1 String.
 * We would like to put restrictions on the shape of the String: for example, that it doesn't
 * contain non-alphanumeric characters (or whatever...).
 * Let's define what we want:
 */
// CleanString defined somewhere else
case class MyCaseClass(firstP: Long, secondP: Long, thirdP: CleanString)
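The gist stops here, but Spark 1.x schema inference only understands native types, so registering MyCaseClass directly would choke on CleanString. A common workaround, sketched with hypothetical names (a `records` RDD and a `raw` accessor on CleanString), is to project down to natives first:

// flatten the user-defined type to a plain String before Spark sees it
case class MyFlatRow(firstP: Long, secondP: Long, thirdP: String)

val flat = records.map((c: MyCaseClass) => MyFlatRow(c.firstP, c.secondP, c.thirdP.raw))
// Spark 1.x: import sqlContext.createSchemaRDD; flat.registerTempTable("myTable")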
ldacosta / SparkFactoryNotSerializable.scala
Last active August 29, 2015 14:07
I am trying to narrow down an Exception thrown by Spark when using "factories" to create classes. See example below ====> only factory2 works!
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

/**
 * I am trying to narrow down an Exception thrown by Spark when using "factories".
 * The factories have parameters that are used in the classes' functions.
 *
 * To run this code: copy-paste this whole content into a Spark shell, then execute Test.theMain(sc).
 */
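The gist's body is elided in this preview; below is a minimal sketch (hypothetical names) of the pattern that usually explains why only the second factory works. Referencing a constructor parameter inside a method turns it into a field, so the closure captures the whole non-serializable factory; copying it to a local val first does not.

class Factory1(param: Int) {
  // `param` becomes a field, so the lambda captures `this` => Task not serializable
  def build(rdd: RDD[Int]): RDD[Int] = rdd.map(_ + param)
}

class Factory2(param: Int) {
  def build(rdd: RDD[Int]): RDD[Int] = {
    val p = param // local copy: only `p` is captured, not the factory
    rdd.map(_ + p)
  }
}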
ldacosta / SparkSomethingNoSerializable
Last active August 29, 2015 14:07
Running the following code throws a NotSerializableException when using the case class
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD
import scala.util.control.Exception._
import scala.util.parsing.combinator.Parsers
import scala.util.parsing.input.{CharSequenceReader, Reader, OffsetPosition}
import scala.language.postfixOps
case class myCaseClass(c: Char) extends Serializable
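The rest of the gist is not shown in this preview; below is a minimal sketch (hypothetical names) of the usual failure mode when mixing parser combinators with Spark: the parser lives inside a non-serializable Parsers instance, so closures that touch it fail at task-serialization time even though myCaseClass itself extends Serializable.

class CharParsers extends Parsers { // not Serializable
  type Elem = Char
  val any: Parser[myCaseClass] = elem("any char", _ => true) ^^ myCaseClass
}

def parse(lines: RDD[String]): RDD[myCaseClass] = {
  val parsers = new CharParsers
  lines.flatMap { s =>
    // `parsers` is captured here => java.io.NotSerializableException at runtime
    parsers.any(new CharSequenceReader(s)) match {
      case parsers.Success(v, _) => Some(v)
      case _ => None
    }
  }
}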