
@wyukawa
wyukawa / gist:3340614
Created August 13, 2012 13:15
Subset sum problem
def dfs(i, total, a, k):
    # All elements decided: does the accumulated total equal k?
    if i == len(a):
        return total == k
    # Branch 1: skip a[i]
    if dfs(i + 1, total, a, k):
        return True
    # Branch 2: take a[i]
    if dfs(i + 1, total + a[i], a, k):
        return True
    return False
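A quick usage sketch of the depth-first search above, self-contained with the exhausted-branch `return False` made explicit (the array and target values here are illustrative, not from the gist):

```python
def dfs(i, total, a, k):
    # Try both choices (skip / take) for each element of a.
    if i == len(a):
        return total == k
    if dfs(i + 1, total, a, k):         # skip a[i]
        return True
    if dfs(i + 1, total + a[i], a, k):  # take a[i]
        return True
    return False

a = [1, 3, 7, 10]
print(dfs(0, 0, a, 11))  # True  (1 + 10)
print(dfs(0, 0, a, 6))   # False (no subset sums to 6)
```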
@wyukawa
wyukawa / gist:3511506
Created August 29, 2012 11:58
Log of verifying that Pig works
$ wget http://ftp.jaist.ac.jp/pub/apache/pig/pig-0.10.0/pig-0.10.0.tar.gz
$ tar zxvf pig-0.10.0.tar.gz
$ cd pig-0.10.0
$ export JAVA_HOME=/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/
$ wget --no-check-certificate https://raw.github.com/tomwhite/hadoop-book/master/input/ncdc/micro-tab/sample.txt
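sample.txt is the tiny weather data set from the hadoop-book repository; the classic exercise this walkthrough builds toward is max temperature per year. A minimal Python sketch, assuming the tab-separated year/temperature/quality layout used by the book's examples (the sample lines below are illustrative, not copied from this gist):

```python
# Hypothetical lines in the year<TAB>temperature<TAB>quality layout
lines = [
    "1950\t0\t1",
    "1950\t22\t1",
    "1949\t111\t1",
    "1949\t78\t1",
]

# Compute the maximum temperature observed per year.
max_temp = {}
for line in lines:
    year, temp, quality = line.split("\t")
    t = int(temp)
    if year not in max_temp or t > max_temp[year]:
        max_temp[year] = t

print(max_temp)  # {'1950': 22, '1949': 111}
```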
@wyukawa
wyukawa / gist:3732210
Created September 16, 2012 12:23
Processing postal-code data with Pig
set pig.exec.mapPartAgg true;
A = load 'KEN_ALL_ROME.CSV'
using PigStorage(',')
as (
org_code:chararray,
postcode:chararray,
choiki_name:chararray,
choson_name:chararray,
todoufuken_name:chararray
@wyukawa
wyukawa / gist:3884759
Created October 13, 2012 14:14
EXISTS example
-- every subject scored 50 points or more
$ cat TestScores.csv
100,math,100
100,Japanese,80
100,science,80
200,math,80
200,Japanese,95
300,math,40
300,Japanese,90
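The query itself is cut off in the preview, but the intended condition (students whose every subject scored 50 or more) can be sketched in Python over the same rows:

```python
from collections import defaultdict

# Rows from TestScores.csv: (student_id, subject, score)
rows = [
    ("100", "math", 100), ("100", "Japanese", 80), ("100", "science", 80),
    ("200", "math", 80), ("200", "Japanese", 95),
    ("300", "math", 40), ("300", "Japanese", 90),
]

scores = defaultdict(list)
for sid, subject, score in rows:
    scores[sid].append(score)

# A student qualifies when no score falls below 50 -- the condition a
# NOT EXISTS subquery would express in SQL.
qualified = sorted(sid for sid, ss in scores.items()
                   if all(s >= 50 for s in ss))
print(qualified)  # ['100', '200']
```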
@wyukawa
wyukawa / gist:3897379
Created October 16, 2012 05:38
python twitter
#coding: utf-8
import twitter
twitter_search = twitter.Twitter(domain="search.twitter.com")
search_results = twitter_search.search(q="Hadoop", rpp=10, page=1)
tweets = [r['text'] for r in search_results['results']]
for tweet in tweets:
print tweet.encode("utf-8")
@wyukawa
wyukawa / gist:3919187
Created October 19, 2012 16:30
Pig's CUBE
$ cat cube.dat
dog,miami,12
cat,miami,18
turtle,tampa,4
dog,tampa,14
cat,naples,9
dog,naples,5
turtle,naples,1
$ cat cube.pig
a = load 'cube.dat' USING PigStorage(',') as (x:chararray,y:chararray,z:long);
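The preview stops after the load statement. Roughly, `CUBE(x, y)` in Pig 0.11+ aggregates over every combination of the two dimensions, substituting NULL for each rolled-up dimension; a Python sketch of the totals it would produce for this data (the follow-on CUBE/FOREACH statements are not shown in the gist):

```python
from collections import defaultdict

# Rows from cube.dat: (animal, city, count)
rows = [
    ("dog", "miami", 12), ("cat", "miami", 18), ("turtle", "tampa", 4),
    ("dog", "tampa", 14), ("cat", "naples", 9), ("dog", "naples", 5),
    ("turtle", "naples", 1),
]

# CUBE over (x, y): sum z for each of the four groupings
# (x, y), (x, *), (*, y), (*, *), with None standing in for NULL.
totals = defaultdict(int)
for x, y, z in rows:
    for dims in ((x, y), (x, None), (None, y), (None, None)):
        totals[dims] += z

print(totals[("dog", None)])  # 31 (dog across all cities)
print(totals[(None, None)])   # 63 (grand total)
```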
@wyukawa
wyukawa / gist:4663529
Created January 29, 2013 11:19
Notes on installing fluentd after switching to ruby 1.9 with rbenv, since my Mac was on ruby 1.8
$ brew install rbenv
$ brew install ruby-build
$ echo 'eval "$(rbenv init -)"' >> ~/.bashrc
$ rbenv install 1.9.3-pXXX
$ rbenv rehash
$ rbenv local 1.9.3-pXXX
$ gem install fluentd
$ rbenv rehash
$ fluentd --setup ./fluent
$ fluentd -c ./fluent/fluent.conf -vv &
@wyukawa
wyukawa / gist:4704891
Created February 4, 2013 03:35
Azkaban install
$ wget https://github.com/downloads/azkaban/azkaban/azkaban-0.10.zip
$ unzip azkaban-0.10.zip
$ cd azkaban-0.10
$ mkdir azkaban-jobs
$ ./bin/azkaban-server.sh --job-dir azkaban-jobs/
$ open http://localhost:8081
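The job directory is empty at this point. A minimal job definition to drop into azkaban-jobs/ uses Azkaban's properties-style .job format (the filename and command here are hypothetical):

```
# azkaban-jobs/hello.job
type=command
command=echo "hello azkaban"
```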
@wyukawa
wyukawa / gist:4759914
Last active December 12, 2015 10:29
fluentd's retry time is shown as being in the past
# stacktrace
2013-02-12 04:38:39 +0900: failed to communicate hdfs cluster, path: ...
2013-02-12 04:38:39 +0900: temporarily failed to flush the buffer, next retry will be at 2013-02-12 04:37:10 +0900. error="{...
2013-02-12 04:38:39 +0900: /usr/local/lib/ruby/gems/1.9.1/gems/webhdfs-0.5.1/lib/webhdfs/client_v1.rb:278:in `request'
2013-02-12 04:38:39 +0900: /usr/local/lib/ruby/gems/1.9.1/gems/webhdfs-0.5.1/lib/webhdfs/client_v1.rb:232:in `operate_requests'
2013-02-12 04:38:39 +0900: /usr/local/lib/ruby/gems/1.9.1/gems/webhdfs-0.5.1/lib/webhdfs/client_v1.rb:46:in `append'
2013-02-12 04:38:39 +0900: /usr/local/lib/ruby/gems/1.9.1/gems/fluent-plugin-webhdfs-0.1.0/lib/fluent/plugin/out_webhdfs.rb:100:in `send_data'
2013-02-12 04:38:39 +0900: /usr/local/lib/ruby/gems/1.9.1/gems/fluent-plugin-webhdfs-0.1.0/lib/fluent/plugin/out_webhdfs.rb:109:in `write'
2013-02-12 04:38:39 +0900: /usr/local/lib/ruby/gems/1.9.1/gems/fluentd-0.10.30/lib/fluent/buffer.rb:279:in `write_chunk'
@wyukawa
wyukawa / gist:5097142
Created March 6, 2013 06:17
slf4j error
Exception in thread "main" java.lang.NoSuchMethodError: org.slf4j.spi.LocationAwareLogger.log(Lorg/slf4j/Marker;Ljava/lang/String;ILjava/lang/String;[Ljava/lang/Object;Ljava/lang/Throwable;)V
at org.apache.commons.logging.impl.SLF4JLocationAwareLog.debug(SLF4JLocationAwareLog.java:133)
at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:139)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:205)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:184)
at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:236)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:466)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:452)
at org.apache.hadoop.mapreduce.JobContext.<init>(JobContext.java:80)
at org.apache.hadoop.mapred