Details
- Type: Bug
- Status: Open
- Priority: Major
- Resolution: Unresolved
- Affects Version/s: 0.98.10
- Fix Version/s: None
- Component/s: None
Description
My HFiles are generated by Spark HBaseBulkLoad. When I read a few of them (or when HBase runs a compaction), I encounter the following exception:
```
Exception in thread "main" java.io.IOException: Requested block is out of range: 77329641, lastDataBlockOffset: 77329641, trailer.getLoadOnOpenDataOffset: 77329641
	at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:396)
	at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.readNextDataBlock(HFileReaderV2.java:734)
	at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.isNextBlock(HFileReaderV2.java:859)
	at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.positionForNextBlock(HFileReaderV2.java:854)
	at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2._next(HFileReaderV2.java:871)
	at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.next(HFileReaderV2.java:891)
	at io.patamon.hbase.test.read.TestHFileRead.main(TestHFileRead.java:49)
```
It looks like `lastDataBlockOffset` is equal to `trailer.getLoadOnOpenDataOffset()`, so the scanner is requesting a next block that starts exactly where the load-on-open section begins. Could anyone help me? Thanks very much.
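For context, the bounds check that produces this message can be sketched as a standalone illustration. This is not HBase's actual code (the real logic lives in `HFileReaderV2.readBlock` and differs in detail); the class and method names here are hypothetical, and only the shape of the range check and the reported offsets are taken from the exception above:

```java
import java.io.IOException;

// Standalone model of the "Requested block is out of range" check.
// A data block offset must lie strictly before the load-on-open section;
// when all three offsets are equal (as in the report), the check fires.
public class BlockRangeCheck {

    static void checkBlockOffset(long dataBlockOffset,
                                 long lastDataBlockOffset,
                                 long loadOnOpenDataOffset) throws IOException {
        if (dataBlockOffset < 0 || dataBlockOffset >= loadOnOpenDataOffset) {
            throw new IOException("Requested block is out of range: "
                + dataBlockOffset
                + ", lastDataBlockOffset: " + lastDataBlockOffset
                + ", trailer.getLoadOnOpenDataOffset: " + loadOnOpenDataOffset);
        }
    }

    public static void main(String[] args) {
        boolean threw = false;
        try {
            // The values from the stack trace: the requested offset equals
            // both lastDataBlockOffset and loadOnOpenDataOffset, so the
            // "next" block would start inside the load-on-open section.
            checkBlockOffset(77329641L, 77329641L, 77329641L);
        } catch (IOException e) {
            threw = true;
            System.out.println(e.getMessage());
        }
        if (!threw) {
            throw new AssertionError("expected the range check to fire");
        }
    }
}
```

With these values the check always throws, which suggests the trailer written for this HFile places the last data block at the very start of the load-on-open section, so any attempt to advance past it fails.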