
@okram
Created September 30, 2013 16:12
~/software/aurelius/faunus$ bin/gremlin.sh
         \,,,/
         (o o)
-----oOOo-(_)-oOOo-----
2013-09-30 10:11:48.366 java[29040:1703] Unable to load realm info from SCDynamicStore
gremlin> g = FaunusFactory.open('bin/faunus.properties')
==>faunusgraph[graphsoninputformat->graphsonoutputformat]
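
The banner above shows the graph was opened with GraphSON input and output formats. The actual contents of bin/faunus.properties are not part of this session; the sketch below is only a plausible Faunus 0.4 configuration consistent with that banner and with the output/ job data location seen later, so adjust the keys and paths to your own setup.

# input graph (GraphSON file on HDFS) -- the input file name here is an assumption
faunus.graph.input.format=com.thinkaurelius.faunus.formats.graphson.GraphSONInputFormat
faunus.input.location=graph-of-the-gods.json
# output graph and side-effect data
faunus.graph.output.format=com.thinkaurelius.faunus.formats.graphson.GraphSONOutputFormat
faunus.sideeffect.output.format=org.apache.hadoop.mapreduce.lib.output.TextOutputFormat
faunus.output.location=output
faunus.output.location.overwrite=true
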
gremlin> g.getConf().setClass('mapred.map.output.compression.codec',org.apache.hadoop.io.compress.SnappyCodec,CompressionCodec.class)
10:11:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
10:11:55 WARN snappy.LoadSnappy: Snappy native library not loaded
==>null
gremlin> g.getConf('mapred.map.output.compression')
==>mapred.map.output.compression.codec=org.apache.hadoop.io.compress.SnappyCodec
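
The setClass call mutates the graph's underlying Hadoop Configuration, so the same tuning can be applied from plain Java before any jobs are submitted. Below is a minimal sketch of the equivalent Configuration calls; only the codec key is set in the session above, and the mapred.compress.map.output switch is an added assumption about how map-output compression would actually be turned on (old-API Hadoop property names).

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.SnappyCodec;

public class SnappyMapOutput {
    // Mirror of the Gremlin line above: use Snappy for intermediate map output.
    public static void configure(Configuration conf) {
        conf.setClass("mapred.map.output.compression.codec",
                SnappyCodec.class, CompressionCodec.class);
        // Assumption: this on/off flag is not touched in the session above.
        conf.setBoolean("mapred.compress.map.output", true);
    }
}
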
gremlin> g.V.out.count()
10:12:13 WARN mapreduce.FaunusCompiler: Using the developer Faunus job jar: target/faunus-0.4.0-SNAPSHOT-job.jar
10:12:13 INFO mapreduce.FaunusCompiler: Compiled to 2 MapReduce job(s)
10:12:13 INFO mapreduce.FaunusCompiler: Executing job 1 out of 2: MapSequence[com.thinkaurelius.faunus.mapreduce.transform.VerticesMap.Map, com.thinkaurelius.faunus.mapreduce.transform.VerticesVerticesMapReduce.Map, com.thinkaurelius.faunus.mapreduce.transform.VerticesVerticesMapReduce.Reduce]
10:12:13 INFO mapreduce.FaunusCompiler: Job data location: output/job-0
10:12:13 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
10:12:14 INFO input.FileInputFormat: Total input paths to process : 1
10:12:14 INFO mapred.JobClient: Running job: job_201309301011_0001
10:12:15 INFO mapred.JobClient: map 0% reduce 0%
10:12:21 INFO mapred.JobClient: map 100% reduce 0%
10:12:28 INFO mapred.JobClient: map 100% reduce 33%
10:12:30 INFO mapred.JobClient: map 100% reduce 100%
10:12:30 INFO mapred.JobClient: Job complete: job_201309301011_0001
10:12:30 INFO mapred.JobClient: Counters: 29
10:12:30 INFO mapred.JobClient: com.thinkaurelius.faunus.mapreduce.transform.VerticesMap$Counters
10:12:30 INFO mapred.JobClient: VERTICES_PROCESSED=12
10:12:30 INFO mapred.JobClient: EDGES_PROCESSED=0
10:12:30 INFO mapred.JobClient: Job Counters
10:12:30 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=5352
10:12:30 INFO mapred.JobClient: Launched reduce tasks=1
10:12:30 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
10:12:30 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
10:12:30 INFO mapred.JobClient: Launched map tasks=1
10:12:30 INFO mapred.JobClient: Data-local map tasks=1
10:12:30 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=8381
10:12:30 INFO mapred.JobClient: File Output Format Counters
10:12:30 INFO mapred.JobClient: Bytes Written=0
10:12:30 INFO mapred.JobClient: FileSystemCounters
10:12:30 INFO mapred.JobClient: FILE_BYTES_READ=1713
10:12:30 INFO mapred.JobClient: HDFS_BYTES_READ=2148
10:12:30 INFO mapred.JobClient: FILE_BYTES_WRITTEN=228347
10:12:30 INFO mapred.JobClient: HDFS_BYTES_WRITTEN=1420
10:12:30 INFO mapred.JobClient: File Input Format Counters
10:12:30 INFO mapred.JobClient: Bytes Read=2028
10:12:30 INFO mapred.JobClient: com.thinkaurelius.faunus.mapreduce.transform.VerticesVerticesMapReduce$Counters
10:12:30 INFO mapred.JobClient: EDGES_TRAVERSED=17
10:12:30 INFO mapred.JobClient: Map-Reduce Framework
10:12:30 INFO mapred.JobClient: Reduce input groups=12
10:12:30 INFO mapred.JobClient: Map output materialized bytes=1713
10:12:30 INFO mapred.JobClient: Combine output records=0
10:12:30 INFO mapred.JobClient: Map input records=12
10:12:30 INFO mapred.JobClient: Reduce shuffle bytes=1713
10:12:30 INFO mapred.JobClient: Reduce output records=12
10:12:30 INFO mapred.JobClient: Spilled Records=58
10:12:30 INFO mapred.JobClient: Map output bytes=1647
10:12:30 INFO mapred.JobClient: Total committed heap usage (bytes)=325058560
10:12:30 INFO mapred.JobClient: Combine input records=0
10:12:30 INFO mapred.JobClient: Map output records=29
10:12:30 INFO mapred.JobClient: SPLIT_RAW_BYTES=120
10:12:30 INFO mapred.JobClient: Reduce input records=29
10:12:30 INFO mapreduce.FaunusCompiler: Executing job 2 out of 2: MapSequence[com.thinkaurelius.faunus.mapreduce.util.CountMapReduce.Map, com.thinkaurelius.faunus.mapreduce.util.CountMapReduce.Reduce]
10:12:30 INFO mapreduce.FaunusCompiler: Job data location: output/job-1
10:12:31 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
10:12:31 INFO input.FileInputFormat: Total input paths to process : 1
10:12:31 INFO mapred.JobClient: Running job: job_201309301011_0002
10:12:32 INFO mapred.JobClient: map 0% reduce 0%
10:12:37 INFO mapred.JobClient: map 100% reduce 0%
10:12:46 INFO mapred.JobClient: map 100% reduce 100%
10:12:46 INFO mapred.JobClient: Job complete: job_201309301011_0002
10:12:46 INFO mapred.JobClient: Counters: 27
10:12:46 INFO mapred.JobClient: Job Counters
10:12:46 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=4972
10:12:46 INFO mapred.JobClient: Launched reduce tasks=1
10:12:46 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
10:12:46 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
10:12:46 INFO mapred.JobClient: Launched map tasks=1
10:12:46 INFO mapred.JobClient: Data-local map tasks=1
10:12:46 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=8247
10:12:46 INFO mapred.JobClient: File Output Format Counters
10:12:46 INFO mapred.JobClient: Bytes Written=0
10:12:46 INFO mapred.JobClient: com.thinkaurelius.faunus.mapreduce.util.CountMapReduce$Counters
10:12:46 INFO mapred.JobClient: VERTICES_COUNTED=11
10:12:46 INFO mapred.JobClient: FileSystemCounters
10:12:46 INFO mapred.JobClient: FILE_BYTES_READ=16
10:12:46 INFO mapred.JobClient: HDFS_BYTES_READ=1543
10:12:46 INFO mapred.JobClient: FILE_BYTES_WRITTEN=224275
10:12:46 INFO mapred.JobClient: HDFS_BYTES_WRITTEN=2031
10:12:46 INFO mapred.JobClient: File Input Format Counters
10:12:46 INFO mapred.JobClient: Bytes Read=1420
10:12:46 INFO mapred.JobClient: Map-Reduce Framework
10:12:46 INFO mapred.JobClient: Reduce input groups=1
10:12:46 INFO mapred.JobClient: Map output materialized bytes=16
10:12:46 INFO mapred.JobClient: Combine output records=1
10:12:46 INFO mapred.JobClient: Map input records=12
10:12:46 INFO mapred.JobClient: Reduce shuffle bytes=16
10:12:46 INFO mapred.JobClient: Reduce output records=0
10:12:46 INFO mapred.JobClient: Spilled Records=2
10:12:46 INFO mapred.JobClient: Map output bytes=96
10:12:46 INFO mapred.JobClient: Total committed heap usage (bytes)=333447168
10:12:46 INFO mapred.JobClient: Combine input records=12
10:12:46 INFO mapred.JobClient: Map output records=12
10:12:46 INFO mapred.JobClient: SPLIT_RAW_BYTES=123
10:12:46 INFO mapred.JobClient: Reduce input records=1
==>17
gremlin>
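
The traversal compiles to two MapReduce jobs: VerticesMap/VerticesVerticesMapReduce walks the out-adjacency of the 12 input vertices (EDGES_TRAVERSED=17), and CountMapReduce sums the traversers that arrived at those vertices, which is why the final result (17) matches EDGES_TRAVERSED rather than VERTICES_COUNTED (11). The same pipeline can be built outside the REPL; the following is a rough sketch assuming the Faunus 0.4 Java API (FaunusFactory, FaunusGraph, and a FaunusPipeline exposing V(), out(), count(), and submit()), so verify packages, class names, and method names against the 0.4 javadoc before relying on it.

import com.thinkaurelius.faunus.FaunusFactory;
import com.thinkaurelius.faunus.FaunusGraph;
import com.thinkaurelius.faunus.FaunusPipeline;

public class CountOutNeighbors {
    public static void main(String[] args) throws Exception {
        // Same properties file as the shell session above.
        FaunusGraph g = FaunusFactory.open("bin/faunus.properties");
        // Intended equivalent of gremlin> g.V.out.count(), compiling to the two jobs shown above.
        // Assumption: the pipeline mirrors the Gremlin steps and is launched with submit().
        new FaunusPipeline(g).V().out().count().submit();
    }
}
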