Shrijeet shrijeet

  • Redwood City, CA
@shrijeet
shrijeet / jstack_hbase_read_deadlock.java
Created December 11, 2012 20:13
jstack_hbase_read_deadlock.stack
2012-12-11 11:56:08
Full thread dump Java HotSpot(TM) 64-Bit Server VM (14.2-b01 mixed mode):
"IPC Client (1542500044) connection to txa-2.rfiserve.net/172.22.12.2:60000 from hbase" daemon prio=10 tid=0x0000000042217800 nid=0x2ad2 in Object.wait() [0x00007fb7d3e73000]
java.lang.Thread.State: TIMED_WAITING (on object monitor)
at java.lang.Object.wait(Native Method)
at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.waitForWork(HBaseClient.java:459)
- locked <0x00007fb8ddb07fb0> (a org.apache.hadoop.hbase.ipc.HBaseClient$Connection)
at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:504)
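The frame above shows the IPC connection thread parked in Object.wait() with a timeout. A minimal, self-contained sketch of that timed-wait idiom (assumed shape only, with illustrative names; not HBase's actual HBaseClient code):

// Sketch of the wait-with-timeout pattern behind the TIMED_WAITING frame above.
// The field and method names here are illustrative, not HBase's.
final class WaitForWorkSketch {
  private final Object lock = new Object();
  private boolean hasWork = false;

  boolean waitForWork(long maxIdleMs) throws InterruptedException {
    synchronized (lock) {            // jstack reports this monitor as "locked <...>"
      if (!hasWork) {
        lock.wait(maxIdleMs);        // thread state: TIMED_WAITING (on object monitor)
      }
      return hasWork;
    }
  }

  void submitWork() {
    synchronized (lock) {
      hasWork = true;
      lock.notifyAll();              // wakes the connection thread
    }
  }
}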
@shrijeet
shrijeet / delete_blocks.log
Created February 11, 2013 08:21
NN log deleting blocks
2013-02-11 02:40:15,560 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* ask 172.22.4.30:50010 to replicate blk_-8282418489515119773_208956459 to datanode(s) 172.22.4.36:50010
2013-02-11 02:40:19,711 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* NameSystem.addStoredBlock: blockMap updated: 172.22.4.36:50010 is added to blk_-8282418489515119773_208956459 size 1882
2013-02-11 02:52:14,531 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* NameSystem.addStoredBlock: blockMap updated: 172.22.24.37:50010 is added to blk_-8282418489515119773_208956459 size 1882
2013-02-11 02:52:14,531 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* NameSystem.chooseExcessReplicates: (172.22.4.30:50010, blk_-8282418489515119773_208956459) is added to recentInvalidateSets
2013-02-11 02:52:30,141 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* ask 172.22.4.30:50010 to delete blk_-7899761269070348109_200978153 blk_-8553516317821166119_181078974 blk_-6417975954560547521_204368816 blk_-7917102129184696044_160428177 blk_-884761233393181724
@shrijeet
shrijeet / gist:4968275
Created February 16, 2013 19:13
copy table's main
public static void main(String[] args) throws Exception {
  Configuration conf = HBaseConfiguration.create();
  String[] otherArgs =
      new GenericOptionsParser(conf, args).getRemainingArgs();
  // createSubmittableJob is not part of this excerpt; it builds the copy job
  Job job = createSubmittableJob(conf, otherArgs);
  if (job != null) {
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
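createSubmittableJob itself is not shown. A rough sketch of what a CopyTable-style version might look like, assuming the source table name is the first remaining argument and that the usual org.apache.hadoop.hbase.mapreduce classes (TableMapReduceUtil, Import, Scan) are imported; the real tool parses many more options (start/stop rows, timestamps, peer address, families):

// Hypothetical sketch only; not the actual CopyTable implementation.
static Job createSubmittableJob(Configuration conf, String[] args) throws IOException {
  if (args.length < 1) {
    return null;                      // caller exits without submitting anything
  }
  String tableName = args[0];
  Job job = new Job(conf, "copytable_" + tableName);
  job.setJarByClass(CopyTable.class); // CopyTable assumed to be the enclosing class
  Scan scan = new Scan();
  // Import.Importer re-writes each KeyValue it reads into the output table
  TableMapReduceUtil.initTableMapperJob(tableName, scan, Import.Importer.class,
      null, null, job);
  TableMapReduceUtil.initTableReducerJob(tableName, null, job);
  job.setNumReduceTasks(0);           // map-only copy
  return job;
}
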
require 'formula'

class TmuxForIterm2 < Formula
  url 'http://iterm2.googlecode.com/files/tmux-for-iTerm2-20130302.tar.gz'
  sha1 '83d1389eb55b55bc069e0b66a11aa0a8faf9cddd'
  homepage 'http://code.google.com/p/iterm2/wiki/TmuxIntegration'

  depends_on 'libevent'

  def install
export HADOOP_COMMON_HOME=/usr/lib/hadoop
export HADOOP_CONF_DIR=/etc/hadoop/conf
export HADOOP_DATANODE_USER=hdfs
export HADOOP_HDFS_HOME=/usr/lib/hadoop-hdfs
export HADOOP_HOME=/usr/lib/hadoop-0.20-mapreduce
export HADOOP_HOME_WARN_SUPPRESS=true
export HADOOP_IDENT_STRING=hadoop
export HADOOP_IDENT_STRING=hdfs
export HADOOP_JOBTRACKERHA_USER=mapred
export HADOOP_JOBTRACKER_USER=mapred
/etc/init.d/hadoop-hdfs-namenode
/etc/default/hadoop
/etc/default/hadoop-0.20-mapreduce
/etc/default/hadoop-fuse
/etc/default/hadoop-hdfs
/etc/default/hadoop-hdfs-namenode
/etc/default/hadoop-hdfs-secondarynamenode
/usr/lib/hadoop/libexec/hadoop-config.sh
/usr/lib/hadoop/libexec/hadoop-layout.sh
/etc/hadoop/conf/hadoop-env.sh
@shrijeet
shrijeet / npe_beeswax_1.java
Created May 15, 2013 20:13
NPE during query result fetch from history
java.lang.NullPointerException
at com.cloudera.beeswax.BeeswaxServiceImpl$RunningQueryState.access$600(BeeswaxServiceImpl.java:124)
at com.cloudera.beeswax.BeeswaxServiceImpl.doWithState(BeeswaxServiceImpl.java:770)
at com.cloudera.beeswax.BeeswaxServiceImpl.fetch(BeeswaxServiceImpl.java:980)
at com.cloudera.beeswax.api.BeeswaxService$Processor$fetch.getResult(BeeswaxService.java:987)
at com.cloudera.beeswax.api.BeeswaxService$Processor$fetch.getResult(BeeswaxService.java:971)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
@shrijeet
shrijeet / .vimrc
Last active December 17, 2015 20:18 — forked from rocarvaj/.vimrc
" VIM Configuration File
" Description: Optimized for C/C++ development, but useful also for other things.
" Author: Gerhard Gappmeier
"
" set UTF-8 encoding
set enc=utf-8
set fenc=utf-8
set termencoding=utf-8
" disable vi compatibility (emulation of old bugs)
@shrijeet
shrijeet / hive_no_system_exit.patch
Created May 30, 2013 18:38
Do not run taskCleanup if running hive in sequential mode
if (tsk.ifRetryCmdWhenFail()) {
- if (running.size() != 0) {
+ if (running.size() != 0 && executeTasksInParallel()) {
taskCleanup();
}
// in case we decided to run everything in local mode, restore the
@@ -1183,7 +1183,7 @@ public class Driver implements CommandProcessor {
}
SQLState = "08S01";
console.printError(errorMessage);
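The added condition calls executeTasksInParallel(), which is not shown in this excerpt. A plausible sketch, assuming it simply reads Hive's hive.exec.parallel setting from the Driver's HiveConf (named conf here):

// Hypothetical helper assumed by the diff above; the real patch may check more state.
private boolean executeTasksInParallel() {
  // hive.exec.parallel: whether root-stage tasks are launched on separate threads
  return conf.getBoolVar(HiveConf.ConfVars.EXECPARALLEL);
}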
@shrijeet
shrijeet / KeyValue.java
Created September 18, 2013 01:55
KeyValue size
+ 4 // int: Total length of the whole KeyValue.
+ 4 // int: Total length of the key part of the KeyValue.
+ 4 // int: Total length of the value part of the KeyValue.
+ 2 // short: Row key length.
+ key.length // The row key.
+ 1 // byte: Family length.
+ family.length // The family.
+ qualifier.length // The qualifier.
+ 8 // long: The timestamp.
+ 1 // byte: The type of KeyValue.
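The same arithmetic as a small standalone method, assuming byte[] inputs for the row key, family, qualifier and value; the trailing value term is an assumption, since the excerpt above is cut off after the type byte:

// Mirrors the field widths quoted above; purely illustrative, not HBase's KeyValue API.
static long keyValueSize(byte[] key, byte[] family, byte[] qualifier, byte[] value) {
  return 4                   // int: total length of the whole KeyValue
       + 4                   // int: total length of the key part
       + 4                   // int: total length of the value part
       + 2 + key.length      // short: row key length, then the row key bytes
       + 1 + family.length   // byte: family length, then the family bytes
       + qualifier.length    // qualifier bytes (length is implied by the key length)
       + 8                   // long: the timestamp
       + 1                   // byte: the KeyValue type (Put, Delete, ...)
       + value.length;       // the value bytes themselves (assumed; not in the excerpt)
}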