@zmaril
Created March 4, 2013 13:05
Frame exception
gremlin> g = FaunusFactory.open("bin/titan-cassandra-input.properties")
==>faunusgraph[titancassandrainputformat->graphsonoutputformat]
gremlin> g.E.count()
13/03/04 12:58:29 INFO mapreduce.FaunusCompiler: Compiled to 1 MapReduce job(s)
13/03/04 12:58:29 INFO mapreduce.FaunusCompiler: Executing job 1 out of 1: MapSequence[com.thinkaurelius.faunus.mapreduce.transform.EdgesMap.Map, com.thinkaurelius.faunus.mapreduce.util.CountMapReduce.Map, com.thinkaurelius.faunus.mapreduce.util.CountMapReduce.Reduce]
13/03/04 12:58:29 INFO mapreduce.FaunusCompiler: Job data location: output/job-0
13/03/04 12:58:29 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
13/03/04 12:58:29 INFO impl.ConnectionPoolMBeanManager: Registering mbean: com.netflix.MonitoredResources:type=ASTYANAX,name=TitanConnectionPool,ServiceType=connectionpool
13/03/04 12:58:29 INFO mapred.JobClient: Running job: job_local_0002
13/03/04 12:58:29 INFO mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@134eca
13/03/04 12:58:29 INFO mapred.MapTask: io.sort.mb = 100
13/03/04 12:58:29 INFO mapred.MapTask: data buffer = 79691776/99614720
13/03/04 12:58:29 INFO mapred.MapTask: record buffer = 262144/327680
13/03/04 12:58:30 WARN mapred.LocalJobRunner: job_local_0002
java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Frame size (29153359) larger than max length (16384000)!
    at org.apache.cassandra.hadoop.ColumnFamilyRecordReader$StaticRowIterator.maybeInit(ColumnFamilyRecordReader.java:400)
    at org.apache.cassandra.hadoop.ColumnFamilyRecordReader$StaticRowIterator.computeNext(ColumnFamilyRecordReader.java:406)
    at org.apache.cassandra.hadoop.ColumnFamilyRecordReader$StaticRowIterator.computeNext(ColumnFamilyRecordReader.java:324)
    at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143)
    at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138)
    at org.apache.cassandra.hadoop.ColumnFamilyRecordReader.nextKeyValue(ColumnFamilyRecordReader.java:189)
    at com.thinkaurelius.faunus.formats.titan.cassandra.TitanCassandraRecordReader.nextKeyValue(TitanCassandraRecordReader.java:37)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:532)
    at org.apache.hadoop.mapreduce.MapContext.nextKeyValue(MapContext.java:67)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:143)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:212)
Caused by: org.apache.thrift.transport.TTransportException: Frame size (29153359) larger than max length (16384000)!
    at org.apache.thrift.transport.TFramedTransport.readFrame(TFramedTransport.java:137)
    at org.apache.thrift.transport.TFramedTransport.read(TFramedTransport.java:101)
    at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
    at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:378)
    at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:297)
    at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:204)
    at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
    at org.apache.cassandra.thrift.Cassandra$Client.recv_get_range_slices(Cassandra.java:692)
    at org.apache.cassandra.thrift.Cassandra$Client.get_range_slices(Cassandra.java:676)
    at org.apache.cassandra.hadoop.ColumnFamilyRecordReader$StaticRowIterator.maybeInit(ColumnFamilyRecordReader.java:357)
    ... 12 more
13/03/04 12:58:30 INFO mapred.JobClient: map 0% reduce 0%
13/03/04 12:58:30 INFO mapred.JobClient: Job complete: job_local_0002
13/03/04 12:58:30 INFO mapred.JobClient: Counters: 0
13/03/04 12:58:30 ERROR mapreduce.FaunusCompiler: Faunus job error -- remaining MapReduce jobs have been canceled
gremlin>
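The failure is on the client (Hadoop task) side: the get_range_slices response for one batch of rows arrives as a Thrift frame of 29,153,359 bytes, which exceeds the configured maximum of 16,384,000 bytes, so the map task dies and Faunus cancels the remaining jobs. A possible workaround, sketched below, is to raise the Thrift frame/message limits on both the Cassandra server and the Hadoop input side. The Hadoop-side key names, and the assumption that Faunus forwards extra keys from bin/titan-cassandra-input.properties into the job Configuration, are from memory of Cassandra 1.1/Faunus 0.3-era code and should be verified against the versions actually in use.

# Sketch only; verify key names against the Cassandra/Faunus versions in use.
# Server side, conf/cassandra.yaml (restart Cassandra afterwards); whether each
# setting exists depends on the Cassandra version:
thrift_framed_transport_size_in_mb: 64
thrift_max_message_length_in_mb: 64

# Client/Hadoop side, e.g. appended to bin/titan-cassandra-input.properties
# (assumes Faunus copies these keys into the Hadoop Configuration, where
# Cassandra's hadoop ConfigHelper reads them):
cassandra.thrift.framed.size_mb=64
cassandra.thrift.message.max_size_mb=64

If a single very wide row (a high-degree vertex) is the culprit, lowering the number of rows fetched per Thrift call (ConfigHelper's cassandra.range.batch.size, assuming that key is honored here) may also keep each frame under the limit.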