PNW Scala Notes

Friday 11/14/14

notes by Steven Gangstead

Rapture: Art of the One-liner - Jon Pretty @propensive

rapture.io - a collection of libraries for Scala

jawn JSON parser. Quicker than Jackson, and I guess Jackson is normally considered the quickest.

My thoughts: Awesome talk. Look into rapture, looks useful. Jon is smart.

Scalaz gateway drug - @rit

Presentation in Deckset; the source is in markdown. http://www.decksetapp.com/

Either convention - Left is the error, Right is the success.
Problem - no monad operations like flatMap, so you end up writing a lot of if(_.isRight){...}

Scalaz alternative: Disjunction, written \/

Assumes you mostly want the right (success) side: right-side bias.
Right notation: \/-   Left notation: -\/
"Failure!".left === -\/("Failure!") === \/.left("Failure!")
"Success!".right === \/-("Success!") === \/.right("Success!")

Problem with Option:

for {
  x <- getOptionX
  y <- getOptionY(x)
} yield y

If the result is None, did getOptionX fail or did getOptionY fail? You don't know.

Convert an Option to a right disjunction:

for {
  x <- getOptionX \/> "No x found"
  y <- getOptionY(x) \/> "No y found"
} yield y

This yields \/-(y) if successful (remember the right bias), or else a helpful error state like -\/("No y found") or -\/("No x found").

scala> None \/> "no object found"
res: -\/("no object found")

A much more helpful message can be returned to the front end! There was some talk that the name Disjunction is bad because this doesn't fit the mathematical definition of disjunction. My take: Disjunction is a bad name and "\/" is a bad "name" or symbol. The structure is still useful.

Validation

|@| - the home alone operator, tie fighter, the scream; official name: the applicative builder.
Validation is like a Disjunction but has a Success and a Failure mode; you can append failures together with the *> operator.

scala> val foo = "foo".failure
scala> val bar = "bar".failure
scala> val baz = "baz".failure

scala> foo *> bar *> baz
res: Failure(foobarbaz)

NonEmptyList - list guaranteed to have at least one item

ValidationNel

guarantees that if there is a failure it will be a NonEmptyList
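A hedged sketch (scalaz 7 syntax from memory; the functions and Person class are invented) of how ValidationNel and |@| accumulate every failure instead of stopping at the first:

import scalaz._, Scalaz._

def nonEmpty(s: String): ValidationNel[String, String] =
  if (s.nonEmpty) s.successNel else "name is empty".failureNel

def parseAge(s: String): ValidationNel[String, Int] =
  try s.toInt.successNel catch { case _: NumberFormatException => s"'$s' is not a number".failureNel }

case class Person(name: String, age: Int)

(nonEmpty("Steven") |@| parseAge("33"))(Person)
// Success(Person(Steven,33))

(nonEmpty("") |@| parseAge("abc"))(Person)
// Failure(NonEmptyList(name is empty, 'abc' is not a number))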

Monad Transformers

Mentioned that they are cool and that they help in situations where you're having to do nested for-comprehensions.
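A hedged sketch of what that looks like with scalaz's OptionT (the findUser/findEmail functions are invented), flattening Future[Option[...]] steps into one for-comprehension:

import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global
import scalaz.OptionT
import scalaz.std.scalaFuture._

def findUser(id: Long): Future[Option[String]] = Future.successful(Some("gangstead"))
def findEmail(user: String): Future[Option[String]] = Future.successful(Some(s"$user@example.com"))

// without the transformer you flatMap/pattern match on the Option at every step;
// with OptionT both effects are handled by one for-comprehension
val email: OptionT[Future, String] = for {
  user  <- OptionT(findUser(42))
  email <- OptionT(findEmail(user))
} yield email

email.run // Future[Option[String]]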

See Eugene Yokota's Learning Scalaz: http://eed3si9n.com/learning-scalaz

My thoughts: @rit was a funny nerd. Really smart. Scalaz's biggest problem is using symbols for everything; it makes it hard to discuss and follow, and it's revolting to newcomers. My idea: make a scalaz fork called scalazed with names for everything.

Types Out Of Patmat - Stephen Compall

X <: Y

<: is called "duckface". X <: Y means X is a subtype of Y, but possibly X == Y.

Injectivity: for all values y and a trait K[R], if y.type <: K[S] and y.type <: K[T], then S = T.

My thoughts: I copied this stuff down but I didn't really follow what was going on. High level dry academic talk. He was mostly doing proofs of why pattern matching on types works in Scala. It's nice to know, but there isn't really anything actionable to do with it.

Does Stephen Compall work for Typelevel? His message was that there is a problem with pattern matching on empty lists or something and that this is a huge hole in Scala that affects everyone. I think he's just trying to drum up justification for typelevel's scala fork.

Don't Cross the Streams - Marc Millstone

Lean Analytics and Machine Learning (ANIML)

Counting is important

  • service alerting
  • big data analysis

Code and slides at: http://github.com/splittingfield/tallyho

Similar libraries: http://github.com/twitter/algebird http://github.com/addthis/streamlib

Rules of streaming data:

  • data is a sequence of elements
  • each element can only be processed once
  • data structures may only consume constant memory (regardless of stream size)

What follows is a bunch of mathematical proofs of different stream algorithms, like "approximate the cardinality of x in a stream" or "track the top N terms in a stream", accompanied by Scala implementations of those algorithms.
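Not one of the talk's algorithms, but a tiny sketch of what those constraints look like in practice: a running mean computed in a single pass with constant state, whatever the stream size.

def streamingMean(xs: Iterator[Double]): Double = {
  // each element is seen exactly once; only (count, mean) is kept in memory
  val (_, mean) = xs.foldLeft((0L, 0.0)) { case ((n, m), x) =>
    (n + 1, m + (x - m) / (n + 1))
  }
  mean
}

streamingMean(Iterator.fill(1000000)(scala.util.Random.nextDouble())) // ~0.5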

Fun rhyme to remember which style to use when translating between Java and Scala data structures:

import scala.collection.JavaConversions._ "... is a perversion"
import scala.collection.JavaConverters._ "... never cause hurters"
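A hedged example of the "converters" style: conversions are explicit .asScala / .asJava calls rather than silent implicits.

import scala.collection.JavaConverters._

val jList = new java.util.ArrayList[Int]()
jList.add(1)
jList.add(2)

val sList: Seq[Int] = jList.asScala        // explicit conversion, no surprises
val backToJava: java.util.List[Int] = sList.asJava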

Apache Spark I: From Scala Collections to Fast Interactive Big Data with Spark

Evan Chan

Spark: horizontally scalable, in-memory queries. Integration with Hadoop, S3, and most databases. Much more concise than Hadoop.
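A hedged sketch of the kind of code that makes Spark feel like plain Scala collections (word count; the path and local master are illustrative only):

import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("wordcount").setMaster("local[*]"))

val counts = sc.textFile("input.txt")       // illustrative path
  .flatMap(_.split("\\s+"))                 // the "if you can flatMap it" part
  .map(word => (word, 1))
  .reduceByKey(_ + _)

counts.take(10).foreach(println)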

Conclusion

If you can flatMap it then you can Spark it

My thoughts: Spark code looks pretty simple, much simpler than the equivalent Hadoop code, and there are lots of integrations beyond just Hadoop. Worth taking a look at.

Apache Spark II: Streaming Big Data Analytics with Team Apache, Scala & Akka

Helena Edelson

Kafka - distributed message passing http://kafka.apache.org/

My thoughts: showed some good example Cassandra code that's split up to run in actors. KillrWeather. It's on GitHub.

Miniboxing: JVM Generics without the overhead

Vlad Ureche

Vlad is a disciple of Odersky himself. He's getting his PhD from EPFL

Miniboxing: something to do with reducing the boilerplate Java code that would otherwise require an exponential number of Java classes when specializing generic type parameters (I didn't totally follow).

It's a compiler plugin.

Add the @miniboxed annotation to the source and it cuts the compile time. In the demo it went from 4 seconds to 3 seconds, a 25% reduction.

Then he changed an option flag in build.sbt and the compile time dropped to 1 second, no annotation necessary:

-P:minibox:mark-all // mark all type parameters as @miniboxed

Sometimes faster than the existing specialization feature, sometimes slower. The generated bytecode is also sometimes smaller than with @specialized, sometimes larger, depending on the library.
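From memory of the miniboxing docs, the annotation itself looks roughly like this (with the compiler plugin enabled):

// a generic pair whose type parameters are marked for miniboxing;
// the plugin generates primitive-friendly variants behind the scenes
class MboxPair[@miniboxed A, @miniboxed B](val first: A, val second: B)

val p = new MboxPair(1, 2.0) // would use the Int/Double variants, avoiding boxing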

http://scala-miniboxing.org

Special projects updates from the scala team at EPFL:

  • There's a scala staircase at EPFL
  • There was a spiral logo, then they built the stairs, then they changed the logo to be based on the stairs
  • YinYang - frontend multi-stage execution
  • Scala.js backend
  • Lightweight Modular Staging - program optimization
  • Dependent Object Types calculus - core type system of the dotty compiler
  • Pickling framework and Spores - support for distributed programming
  • Staged parser combinators - fast parser combinators through staging
  • dotty compiler - compiler for Scala but with the DOT type system (Odersky on the team)
  • scala.meta - metaprogramming support. Improved reflection, macros, and many more
  • scaladyno plugin - giving Scala a dynamic language look and feel
  • ScalaBlitz - optimization framework, speeding up collections
  • LMS-Kappa - protein simulator
  • Odds - probabilistic programming framework
  • Type debugger - debugging aid for Scala type errors
  • ScalaMeter - benchmarking framework - Google Caliper for Scala
  • Vector implementation for RRB trees - improved performance for Scala collections

Bridgeport Brewery tonight

Towards a Safer Scala

Leif Wickland

Use static analysis:

  • gives you a better compiler
  • automatic code review
  • automatic enforcement of a coding standard (not just a markdown doc)
  • fewer hidden bugs

Static Analysis options

  • IDE-based
  • frowny face: no enforcement, and the next dev might not have the same IDE setup

scalacOptions he recommends:

  • -Xfatal-warnings compiler switch
  • -deprecation won't allow use of deprecated code
  • -Xlint catches things
  • like a missing interpolator (a "$blah" string literal without the s prefix)
  • another thing it catches: inferred Any type
  • In 2.11 you can selectively set the settings
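Wired into a build, the options above would look something like this (a minimal build.sbt sketch; check your Scala version for the exact flag set):

scalacOptions ++= Seq(
  "-Xfatal-warnings", // promote warnings to errors
  "-deprecation",     // flag use of deprecated APIs
  "-Xlint"            // missing interpolators, inferred Any, etc.
)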

FindBugs

  • high false positive rate for Scala
  • don't use for Scala

Scalastyle

  • not a scalac plugin
  • sbt plugin for easy integration
  • can be tied to compile and test commands
  • used in the Coursera course
  • hard to configure
  • can suppress via comments
  • can ban "better java" code that uses vars, nulls, etc
  • don't ship ??? and println
  • rule set feels broad and unfocused
  • good for suppression; overall recommended

Abide

  • very new
  • supported by Typesafe
  • part of the main scala organization on GitHub
  • rules have mainstream flavor
  • has promise, but doesn't really exist yet
  • quicker compile times than WartRemover
  • can write rules to rewrite code
  • watch it, shows promise

WartRemover

  • moving forward fast
  • easy sbt integration
  • good documentation
  • typelevel
  • avoid problematic inference
  • product
  • serializable
  • any
  • ...
  • ban "better java"
  • bans partial methods which "throw"
  • List.head .tail .last
  • Option.get
  • "YOLO" methods
  • good signal/noise, only ok suppression, easy learning curve
  • recommended

Linter

  • many years of many forks
  • spotty documentation
  • scalac plugin without sbt plugin
  • configured via scalac switches
  • no suppression mechanism
  • more traditional linter style
  • good signal/noise in warnings
  • weak recommendation

Scapegoat

  • less than a year old
  • releases prolifically
  • good documentation effort
  • sbt plugin
  • WartRemover ++ Linter ++ Scalastyle
  • catches things like using list.find(...).isDefined instead of list.exists(...)
  • lonely sealed class
  • some drawbacks
    • overwhelming (turns 100+ rules on by default)
    • bans final case class
    • somewhat unclear how to disable a rule
    • bans wildcard imports
    • some of the rules are kind of misguided
  • medium recommendation, shows promise

best recommendations:

  • scalac
  • scalastyle
  • wartremover

Skeleton project on github

Existing project:

  1. start small, tighten screws
  2. start with rules disabled
  3. make linter gate in CI
  4. pick a rule the team agrees on
  5. clean up code or suppress
  6. enable rule as an error
  7. goto 3

slides at http://tinyurl.com/pnwslint

Saturday 11/15/14

Adding Tree and Tree: Distributed Decision Tree Learning

Avi Bryant @avibryant

Code examples in Brushfire - a distributed generic decision tree library in Scala that Avi promises will be open source RealSoonNow (tm)

Trying to make predictions. The data structure is a tree: interior nodes have decisions, leaf nodes have predictions in them.

The prediction is a generic type, T.

trait Tree[V,T] {
	def predict(features: Map[String,V]): T
}

In order to support distributed there is a type constraint on T:

Tree[V,T:Monoid]

T has to be a monoid, which essentially means "something which can be summed"
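Brushfire wasn't public yet, so this is only a hand-rolled sketch of what "T is a monoid" buys you: leaf predictions computed on different partitions of the data can simply be summed.

trait Monoid[T] {
  def zero: T
  def plus(a: T, b: T): T
}

// e.g. a prediction that is a count of outcomes per label
implicit val countMonoid: Monoid[Map[String, Long]] = new Monoid[Map[String, Long]] {
  def zero = Map.empty
  def plus(a: Map[String, Long], b: Map[String, Long]) =
    (a.keySet ++ b.keySet).map(k => k -> (a.getOrElse(k, 0L) + b.getOrElse(k, 0L))).toMap
}

def merge[T](parts: Seq[T])(implicit m: Monoid[T]): T = parts.foldLeft(m.zero)(m.plus)

merge(Seq(Map("spam" -> 3L), Map("spam" -> 1L, "ham" -> 2L))) // Map(spam -> 4, ham -> 2)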

Next Avi gave a detailed demo of how to do a map reduce job for Brushfire on Hadoop

What is new since "Programming in Scala"

Marconi Lanna @Originate

Book is by Odersky, Spoon and Venners

The 2nd edition came out in 2010 and covered Scala 2.8; no new edition of the book has been published since.

Venners 3 months ago: "we have plans, but no time". Agrees it's long overdue.

Scala Timeline

  • 2003: .8, .9
  • 2004: 1.0-1.3
  • 2005: 1.4
  • 2006: 2.0-2.3
  • scalac written in Scala
  • 2007: 2.4-2.6
  • Lift
  • 2008: 2.7
  • 2010: 2.8
  • Play 1.1 Scala support via plugin
  • Akka
  • 2011: 2.9
  • Typesafe
  • 2012
  • Play 2.0: native scala
  • 2013: 2.10
  • 2014: 2.11
  • 2016: 2.12

New in Scala 2.8

  • Lots of bug fixes since 2.7.7
  • redesigned collections
  • copy methods for case classes
  • package objects
  • boxed primitives
  • revamped repl
  • java converters
  • scaladoc2
  • binary compatibility between minor versions

New 2.9, 2.10, 2.11

"If it ain't broke in 2.10, 2.11 is going to fix it"

DelayedInit and the App trait (2.9)

//before 2.9
object Hello extends Application{...}

Not thread safe in the jvm, very slow

object Hello extends App{...}

Way faster

Range.foreach optimization (2.10)

foreach compiles to a while loop instead of a for loop. Resulting java code 10x faster (example given)

Parallel collections (2.9)

  • concurrent out-of-order semantics
  • associative operations work
  • non-commutative operations are still deterministic
  • example: string concatenation works as expected
  • timing example given. Simple switch from regular to parallel collection and time went from 400 to 100 ms
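A hedged sketch of that "simple switch": .par is the only change, and an associative operation like sum gives the same answer.

val xs = (1 to 1000000).toVector

val sequentialSum = xs.map(_ * 2L).sum      // regular collection
val parallelSum   = xs.par.map(_ * 2L).sum  // same code on a parallel collection

assert(sequentialSum == parallelSum)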

Generalized try-catch-finally (2.9)

Reusable exception handling

try
	body
catch
	handler
finally
	cleanup
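Because the handler is just an expression (a partial function), it can be defined once and reused; a small made-up sketch of what that enables:

// a reusable catch clause
val logAndDefault: PartialFunction[Throwable, Int] = {
  case e: NumberFormatException =>
    println(s"not a number: ${e.getMessage}")
    0
}

def parse(s: String): Int =
  try s.toInt catch logAndDefault

parse("42")  // 42
parse("abc") // prints a message, returns 0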

Cool, but superseded by the Try monad...

Error handling with Try (2.10)

Try is used to perform operations without the need to do explicit exception-handling in all places that an exception might occur

Only non-fatal exceptions are caught. System errors are thrown.

Should be similar performance to try-catch-finally (since that's what's going on in the background), but no test example given.

Try has monad operations:

  • map
  • flatMap
  • recover
  • recoverWith
  • filter
  • getOrElse
  • toOption

Question from the peanut gallery about the monad-ness of Try. Consensus: it looks like a monad and has monad-like functions, but is not a true monad because it doesn't satisfy all the monad laws.
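A small example of Try in use (made up for these notes):

import scala.util.{Try, Success, Failure}

def parsePort(s: String): Try[Int] =
  Try(s.toInt).filter(p => p > 0 && p < 65536)

parsePort("8080").map(_ + 1)   // Success(8081)
parsePort("-1").getOrElse(80)  // 80

parsePort("http") match {
  case Success(p) => println(s"port $p")
  case Failure(e) => println(s"bad port: $e")
}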

Implicit classes (2.10)

A more convenient syntax for defining extension methods. Must have a primary constructor with exactly one parameter.

implicit class A(n:Int){
   def x = ???
}

They are desugared into a class and an implicit method pairing:

class A(n: Int) {...}
implicit def A(n:Int) = new A(n)

Value classes (2.10)

used to avoid object allocation (conditions apply)

Type safety for custom data types without the runtime overhead, e.g. celsius, fahrenheit, weight, height, first name, email, age, etc.
Restrictions: only a primary constructor with exactly one val parameter; only methods (def) - no var, no extra val, no lazy val, no nested classes, traits, or objects; may not define equals or hashCode; cannot be extended by another class.

case class Age(age: Int) extends AnyVal
val age = Age(18)

At compile time Age is a type, but at runtime it's an Int

Extension methods (2.10)

Value classes and implicit classes can be combined to produce allocation-free extension methods. Equivalent to using an object with static helper methods; a simple mechanical transformation performed by the compiler.

Similar feature in C#
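A hedged sketch of the combination: an implicit class that is also a value class, giving an allocation-free extension method (names invented).

object StringSyntax {
  // implicit + AnyVal: the wrapper is never actually allocated
  implicit class RichWords(val s: String) extends AnyVal {
    def wordCount: Int = s.split("\\s+").count(_.nonEmpty)
  }
}

import StringSyntax._
"the quick brown fox".wordCount // 4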

String interpolation (2.10)

val name = "world"
assert(s"hello, $name" == "hello, world")

Supports arbitrary expressions, assert(s"${2 + 2}" == "4"), and works within triple quotes ("""...""").

Similar to features in many scripting languages, but Scala adds:

  • typesafe
  • at compile time
  • can define your own custom interpolators
  • example of a sql"..." interpolator that prevents sql injection attacks
  • example of a json"..." that is also useful
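Not the sql/json interpolators from the talk, but a toy sketch of how a custom interpolator is defined: it's just a method on an implicit wrapper of StringContext.

object Interpolators {
  implicit class ShoutInterpolator(val sc: StringContext) extends AnyVal {
    def shout(args: Any*): String = sc.s(args: _*).toUpperCase + "!"
  }
}
import Interpolators._

val name = "world"
shout"hello, $name" // "HELLO, WORLD!"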

Futures and Promises (2.10, 2.9.3)

Future

  • operations in parallel in an efficient and non-blocking (async) way
  • placeholder for a result that does not yet exist, but may become available

Callbacks are executed eventually (onComplete, onSuccess, onFailure)

Futures can be combined and transformed with monad functions and used in for-comprehensions.

nervous chuckles "I'm pretty sure it counts as a monad" waits for retort from math geeks

Example of how you can run futures in parallel by assigning them to vals before using them in a for-comprehension. If you define them inside the for-comprehension they will run sequentially.
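A hedged sketch of that ordering point (the delays are illustrative):

import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

def slowA = Future { Thread.sleep(1000); 1 }
def slowB = Future { Thread.sleep(1000); 2 }

// sequential: slowB is not started until slowA completes (~2s total)
val sequential = for { a <- slowA; b <- slowB } yield a + b

// parallel: both futures are already running before the for-comprehension (~1s total)
val fa = slowA
val fb = slowB
val parallel = for { a <- fa; b <- fb } yield a + b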

Where a Future is a read-only placeholder for a result that doesn't exist yet, a Promise is a writable, single-assignment container which completes a Future.

import scala.concurrent.Promise

val p = Promise[Int]()
val f = p.future

assert(!f.isCompleted)
p success 42
assert(f.isCompleted)

Dynamic trait (2.10)

  • Syntax sugar. A simple mechanical transformation performed by the compiler
  • Is not any sort of "dynamic type"
  • Is not any sort of "optional" static typing
  • enable flexible DSLs and convenient interfacing with dynamic languages and data formats like JSON

Extend Dynamic and implement at least one of the following methods:

  • applyDynamic
  • applyDynamicNamed
  • selectDynamic
  • updateDynamic
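A toy selectDynamic sketch (field names invented): the compiler rewrites the field access into a call with the name as a string.

import scala.language.dynamics

class Json(fields: Map[String, String]) extends Dynamic {
  def selectDynamic(name: String): String =
    fields.getOrElse(name, throw new NoSuchElementException(name))
}

val user = new Json(Map("name" -> "gangstead", "conference" -> "PNW Scala"))
user.name       // rewritten to user.selectDynamic("name") => "gangstead"
user.conference // "PNW Scala"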

Akka actors (2.10)

Since 2.10 Akka is the default actor library; the legacy Scala actors were deprecated in 2.11. Big topic all by itself.

Modularization (2.10)

Some of the more advanced language features have to be explicitly enabled: postfixOps, reflectiveCalls, experimental.macros, dynamics, existential types, ... (couldn't type fast enough)

Reflection, macros and quasiquotes (2.10, experimental)

Reflection

  • experimental means api might change without going through deprecation first

Macros

  • Metaprogramming: programs that modify themselves at compile time
  • code generation and advanced DSLs
  • Compile-time (macros) and runtime reflection
  • Macro examples: Play JSON API, Scala Pickling

Quasiquotes (2.11) are a significantly simplified notation to manipulate Scala syntax trees with ease.

q"foo + bar"
assert( q"foo + bar" equalsStructure q"foo.+(bar)")

quasiquotes can be decomposed via pattern matching:

  • example
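The example wasn't captured; a minimal decomposition might look like this (2.11 reflection API):

import scala.reflect.runtime.universe._

val tree = q"foo + bar"
val q"$lhs + $rhs" = tree
// lhs: Tree = foo, rhs: Tree = bar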

Case classes with more than 22 parameters (2.11)

  • you shouldn't have to use it, but it's there

New methods in collections

  • non-exhaustive list
  • too many to type out

sbt incremental compilation (2.11)

  • sbt 0.13.2
  • one line in build file
  • anecdotal example
  • 25% - 80% compile speed improvement
  • bigger results in bigger projects

Predef.??? (2.10)

  • placeholder for unimplemented methods

REPL colors (2.11.4)

  • pretty

2.12 and beyond

  • Java 8 support (2.12)
  • java 8-style closures and lambdas
  • java streams and functional interfaces
  • @interface traits
  • Java 8+ only
  • Funny tweets from @ScalaFacts on twitter
  • improved lazy vals initialization
  • async & await
  • collections library cleanup and simplification
  • compiler-based code style checker
  • deprecations
  • procedure syntax: def a {...}
  • XML literals
  • scala.swing
  • scala.js
  • scala compiler forks: Dotty (EPFL), Typelevel, that other guy

Presentation at http://github.com/marconilanna/PNWSCala2014

One Year of Akka

Ryan Tanner @youfoundryan

  • Works at Conspire - the evilest-named company.
  • They were originally using Java-based Akka, but converted to Scala.
  • Started converting small pieces from Java to Scala
  • used a mixed project with sbt
  • One of their guys sat down with no Scala experience and wrote a bunch of code
  • once it type checked it ran the first time

Problems they have:

  • hiring Scala devs
  • managing different Scala styles

"Are we just using Scala as a better Java"

  • A little bit
  • We're just starting out here
  • hard to sell full on Scala to a team with no functional programming experience
  • afraid to use scalaz or shapeless

Goals for year one:

  • Resiliency
  • Scalability
  • Clarity

First two goals met. Resiliency and Scalability closely linked. Akka "let it crash" means they don't worry about nodes crashing. We missed the mark on Clarity

"Akka won't save you from building a monolith"

You can wind up with a tightly coupled architecture. Nicely distributed, but tightly coupled. This made it extra hard to bring on new developers to the project.

"SRP isn't just for actors and funcitons, it's for everything"

SRP - Single Responsibility Program We didn't know where to start so we regret some of our early decisions. Proper design patterns around akka aren't well documented.

Why everything crashed and burned when we went live

Things started crashing when they went to EC2 and multiple customers started going through the pipeline. Didn't take into account any kind of back-pressure. Actors couldn't get any CPU time. The OS's out-of-memory killer was shutting down actors, and nothing showed up in their logs (just the kill statement in the system log).

What didn't work

  • Going to bigger and bigger EC2 instances
  • Tried tweaking actor dispatchers
  • Also didn't work
  • You still need to do some tweaking, but not much
  • All went back to lack of back pressure
  • Actor going off to do an expensive operation in a future and returning it
  • Actor moves on to the next message, even though that expensive resource isn't available yet
  • Learned you must bake in back pressure at every step in pipeline

Pull, don't push
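A hedged sketch of the work-pulling idea with classic Akka actors (all names invented); a real setup would also park idle workers and re-offer work when it arrives.

import akka.actor.{Actor, ActorRef}

case object GimmeWork
case class Work(payload: String)

class Master extends Actor {
  private var queue = Vector.empty[String]
  def receive = {
    case payload: String             => queue :+= payload
    case GimmeWork if queue.nonEmpty =>
      sender() ! Work(queue.head)    // hand out one unit per request
      queue = queue.tail
  }
}

class Worker(master: ActorRef) extends Actor {
  override def preStart(): Unit = master ! GimmeWork
  def receive = {
    case Work(payload) =>
      // ...expensive processing, bounded to one message at a time...
      master ! GimmeWork             // pull the next unit only when done
  }
}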

Was it worth the risk?

Yes, it was successful for the business. Akka's place in their infrastructure is secure.

Going forward

  • Keep services small
  • Think about how you would onboard a new dev
  • not saying you shouldn't use hard stuff
  • good indicator of where you are on the project
  • Assume that everything sucks and will crash at the worst possible time because it will

What is Conspire doing next

  • breaking apart the cluster into smaller services that communicate via Kafka
  • Throwing out Chef and Vagrant and deploying with Docker using CoreOS
  • sbt docker plugin is fantastic
  • Sticking with Akka and Scala
  • Eliminate top down ... something (I didn't type fast enough)

Hands-on Scala.js

Li Haoyi

Works at Dropbox

Scala.js

  • Scala -> Javascript compiler
  • Run scala code in web browser
  • Performance
  • 1-3x slower than raw JS
  • 10x slower than Scala-JVM
  • still 5x faster than Python
  • 150kb for a hello world application (JS code, not gzipped)
  • 400kb for a big program with lots of libraries
  • Shows some simple Scala code and the JS that is generated. It's pretty dense and hardly recognizable, but the generated code is pretty straightforward

http://scala-js.org Fiddle

Extends the reach of your Scala

  • Play Websites
  • Node.js modules
  • Chrome extensions
  • Autodesk Fusion plugins
  • Firefox OS?
  • Not just the JVM?

timeline

  • announced june 2013

Why Scala JS?

  • JS not a good language
  • The "good parts" are just 4-5% of the language
  • Use a compiler that compiles a good language just using that good 4-5%
  • Lots of other examples like this
  • JS is
  • verbose
  • too flexible
  • hard to write tools
  • scary to refactor

Interactive Web Pages

  • have to use JS because that's all you get in the browser
  • ignoring flash, which is dying anyways

Example:

  • built with sbt, incremental recompile
  • sbt console output statement goes to JS console in the browser (cool)
  • extra sbt plugin
  • 845kb of js output without any optimization from about 50 lines of code. On account of all the library code that has to be included. But compiles quickly.
  • With full optimization it is 140kb
  • not as small as ClojureScript and the Haskell equivalent(?), which are 40 and 20 kb

Puts his code in his public Dropbox folder, because he works at Dropbox. It runs from there.

Now he switches it around to be interactive and inserts into a div instead of painting on a canvas. All of the dom stuff is statically typed in the scala.js libraries.

Also brings in another library he wrote, Scalatags. This looks like a DSL he wrote to build HTML tags in Scala code. Much less verbose than doing document.getElementById.
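A hedged flavor of the Scalatags style (Text backend shown here; the talk used the browser/JsDom backend):

import scalatags.Text.all._

val fragment =
  div(cls := "greeting")(
    h1("Hello"),
    p("built without a single getElementById")
  )

println(fragment.render) // <div class="greeting"><h1>Hello</h1>...</div>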

Then he codes up an autocomplete widget, which he says is a hard thing to do in CoffeeScript. This guy is a wizard of live-coding.

Now he adds some ajax calls to an open weather rest api.

  • the ajax and json parsing of the result are very JS like.
  • you can then cast the json to something statically typed

Question from the audience: can the standard library be put on a CDN? Answer: not the way it's currently set up. They tried that before and it was 20mb of JS. What they do now is that unreachable code is not included.

Question: can you call other JS libraries (Angular, jQuery, whatever...)? Answer: yes. Two ways: dynamically with any library, or statically if you write a wrapper.

Question: is the boilerplate baseline going to get any smaller than 150kb, closer to the 60kb you get with ClojureScript? Answer: I don't know how we can get any better than we currently are doing. The Scala core is all intertwined and hard to pare down.

Cross-platform libraries

Scalatags started as a JVM library, but it doesn't need to be; it doesn't have JVM-specific functionality. Now it's also used for Scala.js.

You have to cross-compile from the source code, the same way you cross-compile across Scala versions. Other libraries like scalaz and shapeless weren't written for Scala.js.

JVM specific things that preclude scala-js usage:

  • thread
  • runtime
  • reflection: pickling, akka
  • scalatest
  • scalate
  • netty
  • spray
  • swing

these things are ok:

  • java.lang.*
  • scala.*
  • macros
  • scalaz
  • many more (couldn't type fast enough)

Demo

  • library with some shared code
  • separate code for things whose functionality differs depending on platform (JVM, JS)
  • couple extra lines in build.sbt and it cross compiles to both a jar and scala-js
  • unit tests with uTest
  • can share unit tests that when you test will run on both platforms
  • can have separate unit tests that are different on each platform

Client-server integration

Web development in theory would be much simpler if you had the same language on front and back end. Wouldn't have problems for example with serializing data.

Minimal Scala.js integration demo

  • took a simple spray/akka server
  • build a subproject "client"
  • compiles the client project to scala-js
  • changes the sbt configuration to have the client output wired up as a resource directory for the spray project
  • then changes the spray route to getFromResourceDirectory

Optimal Scala.js integration demo

  • shares lists between front and backend
  • serialization code is done by a macro so the shared methods are statically typed
  • typos caught in the editor (never get that from JS)

Client-server Takeaways:

  • wiring Scala.js into any existing project is trivial
  • sharing code between client/server is awesome
  • constants, algorithms, data structures, libraries, etc
  • type-safety makes shared code amazing
  • the whole setup actually works

Wrap up

Scala.js works!

  • Scala.js usable for all sorts of projects
  • Experience is great
  • Future looks promising

The future is now

  • scala.js provides multiple web-dev holy grails
  • shared interface between client/server
  • checked interfaces between c/s
  • sane shared language between c/s
  • whole program checked c/s
  • Not the future, but today
  • actually ~6months ago

Not great things:

  • small community
  • it's new
  • scala compiler is slowww, std lib bloated
  • even with incremental compiler and dead-code-elimination
  • no big corporate backing
  • just two guys and some extras
  • some rough edges
  • arguably fewer than JS itself

Unruly Creatures: Strategies for dealing with Real Numbers

Erik Osheim

I only caught the last 5 minutes of this talk. I did catch the summary.

Summary:

  • Int and Long are fast and dumb
  • SafeLong is still pretty fast but safe
  • Double is fast and as-smart-as-possible, but no smarter
  • using primitives "by default" is probably not smart
  • if you use Double you already work with approximations
  • Use Rational for exactness, and Real for "best" estimates
  • There is no silver bullet.
  • "but there are better bullets"
  • Previous talks have discussed generic numeric code
  • allows you to "try out" different number types through your application
  • can switch out if it doesn't work. Just a small change.
  • Spire goes to great lengths to support generic algebra
  • Hopefully these tradeoffs help motivate that work
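A hedged illustration of the Double-vs-Rational tradeoff from the summary, using Spire's Rational:

import spire.math.Rational

val approx = 0.1 + 0.2                          // 0.30000000000000004 (Double is approximate)
val exact  = Rational(1, 10) + Rational(2, 10)  // exactly 3/10

exact == Rational(3, 10) // true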

What every (Scala) programmer should know about category theory

Gabriel Claramunt @gclaramunt

Category theory is only easy to understand if you already know Category Theory

What is the most important technique for programming? Abstraction. Category theory is abstraction turned up to 11.

Why?

  • Has direct applicability to programming
  • We can go far just learning the basic vocabulary

What's a category?

  • Things
  • Connections between things
  • what we actually care about
  • The only things we want in a category
  • identity
  • composition
  • illustrative graph of objects (things) and arrows (morphisms) between them

What's a category? Laws:

  • identity as unit
  • composition is associative

How is it related to programming?

  • illustrative graph of objects, noted by types and the morphisms (arrows) are the functions that transform them
  • Types & functions*
  • types are the objects
  • functions are the arrows
  • lots of restrictions on functions, not valid for partial functions and exceptions
  • Results are valid in the context of "Fast and loose reasoning is morally correct"

Every category has an initial object (0)

  • for every object A there's a unique arrow 0 -> A

And a final object (1)

  • for every object A there's a unique arrow A -> 1
  • In "previous category" the types:
  • 0 = Nothing
  • 1 = Unit

Objects are types.

  • "new cateogory" Initial object = Any Final Object = Nothing
  • I asked about this and the previous category and didn't really understand the answer
  • Arrows are extends/with (traits)

Dual category (co-)

Reverse the arrows (Steven's note: Erik Meijer mentions duals a lot in the Coursera Reactive course)

Functors

Functors are the "structure-preserving" mappings between categories

Functors "map" one category to another

  • F(obj)
  • F(fn)

"structure-preserving"

  • miniproof I didn't understand

Endofunctors

"endo" = inside, within Functors where the source and target are the same category

  • mapping to the same type
  • in programming mostly dealing with endofunctors
  • upon clarification from the audience programming only uses endofunctors (maybe?) String -> Int the arrow (morphism) is a functor Option[String] -> Option[Int] is also a functor, it's a map functor on top of the other functor or something
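My own hedged illustration: mapping a function through Option is the "F(fn)" part of a functor, and Option is an endofunctor on Scala types.

val length: String => Int = _.length                       // an arrow String -> Int
val lifted: Option[String] => Option[Int] = _.map(length)  // the functor maps the arrow too

lifted(Some("hello")) // Some(5)
lifted(None)          // None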

Check out this guy's blog: James Iry. He's supposed to have a good category theory explanation of monads.

Natural Transformations

Allow you to move from functor to functor.
Example (headOption as a natural transformation from Seq to Option); both paths agree:
Seq[A] -> headOption -> Option[A] -> Option.fmap f -> Option[B]
Seq[A] -> Seq.fmap f -> Seq[B] -> headOption -> Option[B]

My thoughts: I don't know what I can do with this information.

Building a Better Future: Advanced Error Handling for Concurrent Programming with Scalaz and Shapeless

Jean-Rémi Desjardins, Eddie Carlson - work @whitepages

Sequence Futures in a well-typed, fail-fast manner

  • while preserving order

Example problem

Search system: a name fans out to 3 concurrent requests (phone, home, work); sequence them into a response after they all complete.

How to do it today

  • Future.sequence(...)
  • Limited typing - will infer a sequence of Any
  • will look fine but you get a match error at run time
  • Doesn't fail fast: won't return until they all return even if one fails instantly (more specifically, it returns early only if the first one fails)
  • For comprehension: for( p <- phone; h<-home; w<- work)...
  • Does not fail fast
  • async{ val p = await(phone) ... }
  • way to write programs that look sequential but run concurrently
  • same problems as for comprehension

Shapeless: HList

  • Solution to first problem (limited typing)
  • heterogeneous list
  • typed over each element
  • list-like api
  • generic function definition
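A hedged sketch of an HList on its own (the Future-sequencing from the talk lives in shapeless-contrib and isn't reproduced here):

import shapeless._

val record = 42 :: "home phone" :: true :: HNil // type: Int :: String :: Boolean :: HNil

val n: Int = record.head  // precise element types are preserved
val rest = record.tail    // String :: Boolean :: HNil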

Sequence over HList

  • we want to sequence an HList[Future[x]::Future[y]:: HNil] and return Future(HList[x]::HList[Y]::HNil)
  • Shapeless already has this. Defined in the shapeless-contrib lib

Scalaz

  • provides hierarchy of type classes
  • monad
  • Container
  • Option, Future, Either
  • Combination function defined in a non-parallel friendly way
  • doesn't lend itself to failing fast
  • Applicative
  • All monads are applicatives
  • Combination function defined in parallel-friendly way

Shapeless sequence

  • sequences over an applicative

Existing Monad Definition for Future

  • Does not fail fast in sequence

Fail-Fast Future Applicative Definition

  • These guys' solution, not part of scalaz
  • Now the shapeless contrib-sequence function fails fast
  • make it available via an implicit

Caveat

Type inference does not always play well with typeclasses and implicit parameters

Multiple type parameters

  • They defined a class CumulativeTask for their collection of futures (if I understood correctly)

Ordering of type parameters

Scalaz and shapeless assume the most important parameter comes last. If you don't follow that convention you end up having to write a lot more code to make it work.

Composing Project Archetypes with SBT AutoPlugins

Mark Schaake

New title: Combatting multiple build maintenance hell with autobuild plugins

Multiple Build Maintenance Hell: when you have limited visibility and control over many project builds, to the point where you feel paralyzed about maintaining cross-project consistency.

Hypothetical company: SOA, Inc

  • scala, akka and spray
  • Svc A uses some plugins and has a bunch of dependencies
  • Svc B uses the same plugins but uses newer version of scala, akka and spray
  • Svc C uses some different plugins and yet more dependencies
  • Svc D doesn't know and copies SvcA and uses the old plugins and old dependencies
  • etc... it gets out of hand

AI2 got into MBMH in a matter of months

  • got up to 52 projects and build files were getting huge
  • now have much smaller build files while having more projects

Solving MBMH

  • maximize consistency across projects
  • minimize build complexity (LOC in build files)
  • maximize agility to evolve standards
  • allow for stragglers (don't force upgrades)
  • easy build upgrade path

Other considerations

  • archetype settings shared by similar projects
  • core settings common to all projects
  • formatting and style
  • generate git version resource

Solution: Archetype SBT Plugins

  • Projects enable a single (versioned) archetype plugin

  • archetype plugin provides
  • core build settings (style, scala version, etc)
  • archetype build settings ( deploy, publish, etc)

Implementing Archetype plugins

  • How to wrap/ depend on other plugins?
  • How to include core settings in each archetype?
  • Core set of plugins (4 in this case)
  • Those bring in auxiliary plugins.

(Vanilla) SBT Plugins?

  • Loosely defined API
  • What are the right conventions?
  • not really defined

SBT AutoPlugins

  • New plugin standard (since SBT 0.13.5)
  • Well-defined plugin API
  • Less need for conventions
  • Killer feature: can compose plugins

Core Settings AutoPlugin

  • sample code
  • requires keyword to pull in other AutoPlugins
  • when this plugin is enabled...
  • adds projectSettings from required plugins
  • adds projectSettings from this plugin

Composing an archetype plugin

  • example WebServicePlugin
  • requires two other organization AutoPlugins (core and deploy)
  • When webserviceplugin is enabled
  • recursively composes other plugins
  • adds webservice plugin specific settings
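A hedged sketch of what such AutoPlugins might look like (plugin and setting names invented; sbt 0.13.5+ API):

import sbt._
import Keys._

object CoreSettingsPlugin extends AutoPlugin {
  override def requires = plugins.JvmPlugin   // compose on top of other AutoPlugins
  override def projectSettings = Seq(
    scalaVersion := "2.11.4",
    scalacOptions ++= Seq("-deprecation", "-Xlint")
  )
}

object WebServicePlugin extends AutoPlugin {
  override def requires = CoreSettingsPlugin  // recursively pulls in core settings
  override def projectSettings = Seq(
    // webservice-specific settings (deploy, publish, ...) would go here
  )
}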

Archetype Plugins in Action

  • Demo code from SBT command line
  • in latest SBT there is a plugins command that will list all the available AutoPlugins
  • Can define plugins to be automatically enabled if a trigger condition is met
  • happens as soon as you add that source for plugins
  • for example the Style plugin gets put on all projects
  • they are publishing their plugins publicly
  • unsure what to do if you don't want to
  • my guess is you just publish them to your org's nexus and resolve from there

check out their plugins at http://github.com/allenai/sbt-plugins
