
Huafeng Wang huafengw

  • Alibaba
  • Shanghai, China
huafengw / deployGearpump.sh
Last active December 17, 2015 06:48
Simple script for deploying Gearpump
#!/bin/bash
alias ssh="ssh -i ~/.ssh/id_rsa";
alias scp="scp -i ~/.ssh/id_rsa";
gearpumpVersion="pack-2.11.5-0.6.2-SNAPSHOT"
gearpumpDir="/root/gearpump-${gearpumpVersion}"
gearpumpPack="/root/gearpump-${gearpumpVersion}.tar.gz"
#cluster="intelidh-01 intelidh-03 intelidh-04 intelidh-06"
cluster="node10 node11 node12 node15"
masters="node10"
huafengw / gist:306f0250f29b50d154e1
Created February 3, 2015 09:51
Create JSON from a case class
import spray.json._
import DefaultJsonProtocol._

case class Person(id: Int, name: String)

object Test extends App {
  implicit val personFormat = jsonFormat2(Person.apply)
  val person = Person(1, "Tom").toJson
  println(person.prettyPrint)
}
huafengw / sol_storm_acker_on.yaml
Last active August 29, 2015 14:20
sol_storm_acker_on.yaml
metrics.enabled: true
metrics.poll: 5000 # 5 secs
metrics.time: 600000 # 10 mins
metrics.path: "reports"
# topology configurations
topology.workers: 16
topology.acker.executors: 48
topology.max.spout.pending: 1000
topology.executor.receive.buffer.size: 16384
huafengw / sol_storm_acker_off.yaml
Created April 28, 2015 08:07
sol_storm_acker_off.yaml
metrics.enabled: true
metrics.poll: 5000 # 5 secs
metrics.time: 600000 # 10 mins
metrics.path: "reports"
# topology configurations
topology.workers: 16
topology.acker.executors: 0
topology.max.spout.pending: 1000
topology.executor.receive.buffer.size: 16384
huafengw / sol_gearpump.yaml
Created April 28, 2015 08:08
sol_gearpump.yaml
# metrics configurations
metrics.enabled: false
metrics.poll: 60000 # 60 secs
metrics.time: 900000 # 15 mins
metrics.path: "reports"
# topology configurations
topology.workers: 16
topology.acker.executors: 0
topology.max.spout.pending: 1000
huafengw / stream-bench.sh
Created September 2, 2016 01:56
Yahoo streaming benchmark distributed shell script
#!/bin/bash
# Copyright 2015, Yahoo Inc.
# Licensed under the terms of the Apache License 2.0. Please see LICENSE file in the project root for terms.
alias ssh="ssh -i ~/.ssh/id_rsa";
alias scp="scp -i ~/.ssh/id_rsa";
set -o errtrace
kafkaCluster="node13-1 node13-2 node13-3 node13-4"
zookeeperCluster="node13-1 node13-2 node13-3"
huafengw / .mk
Created February 23, 2017 07:01
Build Tensorflow native shared library
# How to build Tensorflow native shared library
1. Download Tensorflow's source code to directory ${TENSORFLOW_HOME}.
2. Prepare the build environment following the instructions at https://www.tensorflow.org/install/install_sources
3. Add the following code to the file ${TENSORFLOW_HOME}/tensorflow/core/distributed_runtime/rpc/BUILD
```
cc_binary(
```
huafengw / .java
Created October 20, 2017 01:47
Determine whether a string is composed of repetitions of one of its substrings
public static boolean composedBySubString(String s) {
  if (s == null || s.length() < 2) {
    return false;
  }
  int step = 1;
  while (step <= (s.length() / 2)) {
    if (s.length() % step == 0) {
      String subString = s.substring(0, step);
      int start = step;
      boolean found = true;
      while (found && start < s.length()) {
        found = s.startsWith(subString, start);
        start += step;
      }
      if (found) return true;
    }
    step++;
  }
  return false;
}
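The same check can also be expressed with the classic string-doubling trick; here is a minimal Python sketch (the function name is just illustrative):

```python
def composed_by_substring(s):
    """True iff s is some shorter substring repeated two or more times.

    Doubling trick: if s is made of repeats, then s also occurs inside
    (s + s) at an index strictly between 0 and len(s).
    """
    if s is None or len(s) < 2:
        return False
    return (s + s).find(s, 1) < len(s)

print(composed_by_substring("abab"))  # True
print(composed_by_substring("aba"))   # False
```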
import tensorflow as tf

tensor_list = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12], [13, 14, 15, 16], [17, 18, 19, 20]]
tensor_list2 = [[[1, 2, 3, 4]], [[5, 6, 7, 8]], [[9, 10, 11, 12]], [[13, 14, 15, 16]], [[17, 18, 19, 20]]]

with tf.Session() as sess:
    x1 = tf.train.batch(tensor_list, batch_size=4, enqueue_many=False)
    x2 = tf.train.batch(tensor_list, batch_size=4, enqueue_many=True)
    y1 = tf.train.batch_join(tensor_list, batch_size=4, enqueue_many=False)
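The `enqueue_many` flag controls how the batching ops interpret shapes: with `enqueue_many=False` each input tensor is one example, so batching adds a new leading batch dimension; with `enqueue_many=True` the first dimension already indexes examples and is replaced by the batch size. A pure-Python shape sketch (an assumption-level mimic, not the real queue-based implementation):

```python
def batch_shape(component_shape, batch_size, enqueue_many):
    # enqueue_many=False: the whole tensor is ONE example, so batching
    # adds a new leading batch dimension.
    if not enqueue_many:
        return (batch_size,) + tuple(component_shape)
    # enqueue_many=True: the leading dimension already indexes examples,
    # so it is replaced by the batch size.
    return (batch_size,) + tuple(component_shape[1:])

# Each element of tensor_list above has shape (4,):
print(batch_shape((4,), 4, enqueue_many=False))  # (4, 4)
print(batch_shape((4,), 4, enqueue_many=True))   # (4,)
```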
huafengw / Test.scala
Created May 28, 2018 08:34
mleap example
package com.vip.mlp.dag
import ml.combust.bundle.BundleContext
import ml.combust.bundle.dsl.{Model, NodeShape, Value}
import ml.combust.bundle.op.OpModel
import org.apache.spark.ml.bundle.{ParamSpec, SimpleParamSpec, SimpleSparkOp, SparkBundleContext}
import org.apache.spark.ml.feature.IndexToString
import org.apache.spark.sql.SparkSession
import ml.combust.bundle.BundleFile
import ml.combust.mleap.spark.SparkSupport._