@neilkod
Created June 27, 2011 20:57
pig 0.9 with PIG-1782
Data is being loaded from HBase. I can dump the data to the screen and write it to HDFS without problems, and I am explicitly casting every field to a chararray before inserting back into HBase.
I am trying to store the relation
new_helix_users: {ureg: chararray,helix_id: chararray,last_sess_dt: chararray,anon_map: map[]}
with
STORE new_helix_users INTO 'hbase://user_info_helix'
USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('alias:helix profile:dw.last_sess_dt alias:');
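One workaround worth trying (a sketch only; the casts and field names mirror the schema above, and the comment about map values is an assumption, not something confirmed by this trace) is to project and cast each scalar field explicitly in a FOREACH immediately before the STORE, so HBaseStorage never receives a raw DataByteArray for those columns:

```pig
-- Hypothetical sketch: force each scalar field to chararray right before
-- the STORE. Field names match the new_helix_users schema above.
-- NOTE (assumption): anon_map is passed through unchanged; map values
-- loaded from HBase may still be bytearrays, and if so they could be what
-- triggers the ClassCastException inside HBaseStorage.objToBytes.
casted_users = FOREACH new_helix_users GENERATE
    (chararray) ureg         AS ureg,
    (chararray) helix_id     AS helix_id,
    (chararray) last_sess_dt AS last_sess_dt,
    anon_map                 AS anon_map;

STORE casted_users INTO 'hbase://user_info_helix'
    USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('alias:helix profile:dw.last_sess_dt alias:');
```

Since the exception originates in objToBytes during serialization, a DESCRIBE or DUMP of the relation can help confirm whether the map values (rather than the already-cast scalar fields) are the remaining bytearrays.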
and it throws the error:
Backend error message
---------------------
java.lang.ClassCastException: org.apache.pig.data.DataByteArray cannot be cast to java.lang.String
at org.apache.pig.backend.hadoop.hbase.HBaseStorage.objToBytes(HBaseStorage.java:597)
at org.apache.pig.backend.hadoop.hbase.HBaseStorage.putNext(HBaseStorage.java:547)
at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POStore.getNext(POStore.java:143)
at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POSplit.runPipeline(POSplit.java:254)
at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POSplit.processPlan(POSplit.java:236)
at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POSplit.processPlan(POSplit.java:241)
at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POSplit.getNext(POSplit.java:228)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapReduce$Reduce.runPipeline(PigMapReduce.java:456)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapReduce$Reduce.processOnePackageOutput(PigMapReduce.java:424)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapReduce$Reduce.reduce(PigMapReduce.java:404)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapReduce$Reduce.reduce(PigMapReduce.java:258)
at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:176)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:571)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:413)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
Pig Stack Trace
---------------
ERROR 2997: Unable to recreate exception from backed error: java.lang.ClassCastException: org.apache.pig.data.DataByteArray cannot be cast to java.lang.String
org.apache.pig.backend.executionengine.ExecException: ERROR 2997: Unable to recreate exception from backed error: java.lang.ClassCastException: org.apache.pig.data.DataByteArray cannot be cast to java.lang.String
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.Launcher.getErrorMessages(Launcher.java:221)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.Launcher.getStats(Launcher.java:154)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:342)
at org.apache.pig.PigServer.launchPlan(PigServer.java:1337)
at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1322)
at org.apache.pig.PigServer.execute(PigServer.java:1309)
at org.apache.pig.PigServer.executeBatch(PigServer.java:368)
at org.apache.pig.tools.grunt.GruntParser.executeBatch(GruntParser.java:119)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:180)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:152)
at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:90)
at org.apache.pig.Main.run(Main.java:554)
at org.apache.pig.Main.main(Main.java:109)
================================================================================