Kun 1ambda

@1ambda
1ambda / JDBC.scala
Last active November 19, 2015 15:28
implementation - Programs as Values: JDBC Programming with Doobie
package free
import scalaz._, Scalaz._
import scalaz.effect._
/**
 * Programs as Values: JDBC Programming with Doobie
 *
 * video - https://www.youtube.com/watch?v=M5MF6M7FHPo
 * slide - http://tpolecat.github.io/assets/sbtb-slides.pdf
 */
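The gist preview stops at the header comment. A rough, dependency-free sketch of the talk's core idea follows: SQL-ish actions become plain data, a program is a value built from them, and meaning comes from an interpreter chosen later. The names here (SqlIO, Query, Update, dryRun) are illustrative only, not doobie's actual API.

sealed trait SqlIO[A] {
  def flatMap[B](f: A => SqlIO[B]): SqlIO[B] = Bind(this, f)
  def map[B](f: A => B): SqlIO[B] = flatMap(a => Pure(f(a)))
}
final case class Pure[A](a: A) extends SqlIO[A]
final case class Query(sql: String) extends SqlIO[List[String]]
final case class Update(sql: String) extends SqlIO[Int]
final case class Bind[A, B](fa: SqlIO[A], f: A => SqlIO[B]) extends SqlIO[B]

object SqlIO {
  // One possible interpreter: touch no database, just log what would run.
  def dryRun[A](io: SqlIO[A]): A = io match {
    case Pure(a)     => a
    case Query(sql)  => println(s"query:  $sql"); Nil
    case Update(sql) => println(s"update: $sql"); 0
    case Bind(fa, f) => dryRun(f(dryRun(fa)))
  }
}

object Demo extends App {
  // Nothing executes here; `program` is just a description of effects.
  val program: SqlIO[Int] =
    for {
      names <- Query("SELECT name FROM person")
      n     <- Update("DELETE FROM person")
    } yield names.size + n

  println(SqlIO.dryRun(program))
}

Because the program is a value, swapping dryRun for a real JDBC-backed interpreter would change what it does without changing what it is.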
@1ambda
1ambda / ts-compiler-host-impl.js
Created December 24, 2015 11:54
TypeError: hosts.fileExists is not a function on IntelliJ 14, TypeScript 1.7
var ts;
var options;
var typeScriptServiceDirectory;
var typeScriptServicePath;
var sessionId;
var logDebugData = true;
var logFileContent = false;
var sys;
var store;
var emitFilesArray;
@1ambda
1ambda / Kleisli.md
Created February 29, 2016 12:40
Kleisli

Composition is one of the central themes of functional programming. In this post we will look at how Kleisli lets us represent functions as types and compose them. After that, we will cover Reader and Writer, and then RWST, which combines them with State.

Kleisli

If State is S => (S, A) expressed as a type, shouldn't there also be a type that expresses A => B? If there were, we could compose such values just as Scala's andThen and compose combine ordinary functions. Kleisli is exactly the type that plays this role.

Kleisli represents a function A => M[B]

Looking at the type, it represents not simply A => B but A => M[B]. This means Kleisli provides a way to interpret M and to compose such functions. If we look at the actual implementation,
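the preview cuts off here, so below is a minimal sketch of the idea instead (simplified: scalaz's Kleisli abstracts over any monad M, while here M is fixed to Option to keep the example dependency-free).

final case class Kleisli[A, B](run: A => Option[B]) {
  // Like Function1#andThen, but threads results through Option's flatMap,
  // so a None anywhere short-circuits the whole pipeline.
  def andThen[C](next: Kleisli[B, C]): Kleisli[A, C] =
    Kleisli(a => run(a).flatMap(next.run))
}

object KleisliDemo extends App {
  val parse: Kleisli[String, Int] =
    Kleisli(s => scala.util.Try(s.toInt).toOption)
  val reciprocal: Kleisli[Int, Double] =
    Kleisli(n => if (n == 0) None else Some(1.0 / n))

  // Composes like plain functions, but failure is handled for us.
  val parseThenInvert = parse andThen reciprocal
  println(parseThenInvert.run("4"))    // Some(0.25)
  println(parseThenInvert.run("0"))    // None
  println(parseThenInvert.run("oops")) // None
}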

@1ambda
1ambda / gist:79958f7f16ffe599ef162b711f518b72
Created July 18, 2016 16:34
kafka-connect-stacktrace-2016-07-19
connect_1 | linger.ms = 0
connect_1 | (org.apache.kafka.clients.producer.ProducerConfig:178)
connect_1 | [2016-07-18 16:32:18,732] WARN The configuration config.storage.topic = connect-configs was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:186)
connect_1 | [2016-07-18 16:32:18,732] WARN The configuration group.id = connect-cluster was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:186)
connect_1 | [2016-07-18 16:32:18,732] WARN The configuration status.storage.topic = connect-status was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:186)
connect_1 | [2016-07-18 16:32:18,733] WARN The configuration internal.key.converter.schemas.enable = false was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:186)
connect_1 | [2016-07-18 16:32:18,733] WARN The configuration offset.flush.interval.ms = 10000 was supplied but isn't a known config. (org.apache.kafka
atom setting
1.6.2-bin-hadoop2.6 ./bin/pyspark --packages com.datastax.spark:spark-cassandra-connector_2.10:1.6.2,TargetHolding:pyspark-cassandra:0.3.5 --exclude-packages org.slf4j:slf4j-api
Python 2.7.12 (default, Dec 16 2016, 13:04:21)
[GCC 4.2.1 Compatible Apple LLVM 8.0.0 (clang-800.0.42.1)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
Ivy Default Cache set to: /Users/lambda/.ivy2/cache
The jars for the packages stored in: /Users/lambda/.ivy2/jars
:: loading settings :: url = jar:file:/Users/lambda/github/apache-spark/1.6.2-bin-hadoop2.6/lib/spark-assembly-1.6.2-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
com.datastax.spark#spark-cassandra-connector_2.10 added as a dependency
TargetHolding#pyspark-cassandra added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
bash-4.1# ./bin/pyspark --packages com.datastax.spark:spark-cassandra-connector_2.10:1.6.2,TargetHolding:pyspark-cassandra:0.3.5 --exclude-packages org.slf4j:slf4j-api
Ivy Default Cache set to: /root/.ivy2/cache
The jars for the packages stored in: /root/.ivy2/jars
:: loading settings :: url = jar:file:/usr/local/spark-1.6.0-bin-hadoop2.6/lib/spark-assembly-1.6.0-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
com.datastax.spark#spark-cassandra-connector_2.10 added as a dependency
TargetHolding#pyspark-cassandra added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
found com.datastax.spark#spark-cassandra-connector_2.10;1.6.2 in central
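For context, a minimal Scala sketch of what the packages resolved above are used for (hypothetical keyspace/table names; assumes a Cassandra node reachable at 127.0.0.1 and the Spark 1.6-era RDD API):

import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._ // adds cassandraTable to SparkContext

object CassandraReadDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("cassandra-read-demo")
      .set("spark.cassandra.connection.host", "127.0.0.1")
    val sc = new SparkContext(conf)

    // "test_ks" and "users" are placeholder names for illustration.
    val rows = sc.cassandraTable("test_ks", "users")
    println(rows.count())

    sc.stop()
  }
}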
@1ambda
1ambda / gist:c2eab7f6e4fa00a27deb14b76436842a
Created January 3, 2017 05:12
pyspark cassandra yarn cluster
zeppelin-web git:(ZEPPELIN-1850/remove-strip-loader-in-webpack) docker run -it -p 8088:8088 -p 8042:8042 -p 4040:4040 -h sandbox sequenceiq/spark:1.6.0 bash
Starting sshd: [ OK ]
Starting namenodes on [sandbox]
sandbox: starting namenode, logging to /usr/local/hadoop/logs/hadoop-root-namenode-sandbox.out
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-root-datanode-sandbox.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-root-secondarynamenode-sandbox.out
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn--resourcemanager-sandbox.out
root@moby:/# spark-shell --master mesos://master.mesos:5050 --packages com.datastax.spark:spark-cassandra-connector_2.10:1.6.2,TargetHolding:pyspark-cassandra:0.3.5 --exclude-packages org.slf4j:slf4j-api
Ivy Default Cache set to: /root/.ivy2/cache
The jars for the packages stored in: /root/.ivy2/jars
:: loading settings :: url = jar:file:/opt/spark/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
com.datastax.spark#spark-cassandra-connector_2.10 added as a dependency
TargetHolding#pyspark-cassandra added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
found com.datastax.spark#spark-cassandra-connector_2.10;1.6.2 in central
found joda-time#joda-time;2.3 in central
[root@moby sbin]# ps -ef
UID PID PPID C STIME TTY TIME CMD
root 1 0 0 06:35 ? 00:00:00 /bin/bash /etc/entrypoint.sh bash
root 152 1 0 06:35 ? 00:00:00 mesos-master --ip=0.0.0.0 --work_dir=/var/lib/mesos
root 153 1 0 06:35 ? 00:00:00 mesos-slave --master=0.0.0.0:5050 --work_dir=/var/lib
root 154 1 0 06:35 ? 00:00:00 bash
root 189 154 0 06:36 ? 00:00:00 ps -ef
[root@moby sbin]# ./bin/pyspark --master mesos://127.0.0.1:5050 --packages com.datastax.spark:spark-cassandra-connector_2.10:1.6.2,TargetHolding:pyspark-cassandra:0.3.5 --exclude-packages org.slf4j:slf4j-api
bash: ./bin/pyspark: No such file or directory
[root@moby sbin]# cd ..