Description
It has been found that exporting data from Couchbase to Hadoop can fail when a data value is larger than roughly 1640-1680 bytes. The failure shows up as the following error from the Hadoop connector:
2012-02-16 20:23:25,116 WARN org.apache.hadoop.mapred.Child: Error running child
java.lang.NegativeArraySizeException
at org.apache.hadoop.io.Text.readString(Text.java:401)
at DUMP.readFields(DUMP.java:98)
at com.couchbase.sqoop.mapreduce.db.CouchbaseRecordReader.nextKeyValue(CouchbaseRecordReader.java:219)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:456)
at org.apache.hadoop.mapreduce.MapContext.nextKeyValue(MapContext.java:67)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:143)
at com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1157)
at org.apache.hadoop.mapred.Child.main(Child.java:264)
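For context on the exception itself: a NegativeArraySizeException inside Text.readString generally means the length prefix decoded from the stream came out negative, after which the reader tries to allocate a byte array of that negative size. One plausible way this happens is the reader falling out of alignment with the writer, so a payload byte is interpreted as the length field. The following is a minimal, self-contained sketch of that failure mode using plain java.io (the class and method names here are hypothetical, not the actual Hadoop Text code):

```java
import java.io.ByteArrayInputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.IOException;

public class NegativeLengthDemo {
    // Hypothetical reader following the same pattern as Text.readString:
    // decode a length, then allocate a buffer of that size.
    static byte[] readChunk(DataInput in) throws IOException {
        int len = in.readByte();     // signed read: bytes >= 0x80 decode negative
        byte[] buf = new byte[len];  // throws NegativeArraySizeException if len < 0
        in.readFully(buf);
        return buf;
    }

    public static void main(String[] args) throws IOException {
        // Simulate a misaligned stream: the byte under the cursor is payload
        // data (0xAB), not a length field.
        DataInput in = new DataInputStream(
                new ByteArrayInputStream(new byte[] { (byte) 0xAB }));
        try {
            readChunk(in);
        } catch (NegativeArraySizeException e) {
            System.out.println("caught NegativeArraySizeException");
        }
    }
}
```

The real Text.readString decodes a variable-length (vint) encoded length rather than a raw byte, so any mismatch between how DUMP.write encoded the record and how CouchbaseRecordReader/DUMP.readFields consumes it could yield a negative decoded length in the same way; the size threshold in the 1640-1680 byte range suggests the failure appears once the record crosses some encoding boundary.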