Details
- Type: Bug
- Resolution: Fixed
- Priority: Major
- Fix Version/s: Goldfish Public Preview, Columnar 1.0.0
- Triage: Untriaged
- Story Points: 0
- Is this a Regression?: No
- Sprint: Analytics Sprint 43, Analytics Sprint 44
Description
Steps to reproduce -
- Create a Capella cluster with 1 bucket containing 1 scope and 10 collections.
- Load 1,048,576 docs of 1 KB each into each of the KV collections.
- Create a Columnar instance.
- Set the following analytics service config -
- unlimited_storage_debug_flags = {
    "cloudStorageCachePolicy": "selective",
    "cloudStorageDiskMonitorInterval": 60,
    "cloudStorageIndexInactiveDurationThreshold": 1,
    "cloudStorageDebugModeEnabled": true,
    "cloudStorageDebugSweepThresholdSize": 1073741824
  }
- Restart the analytics service.
- Wait for the service to come back up.
- Create 1 remote link and 10 remote collections (one remote collection per KV collection).
- Run a few queries.
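The debug flags in the steps above can be written as one well-formed JSON document. A minimal sketch for checking the value before applying it (flag names and values are copied verbatim from this ticket; the schema itself is not documented here and is an assumption):

```python
# Serialize the analytics debug flags from the reproduction steps.
# Names/values are copied from this ticket; the schema is an assumption.
import json

unlimited_storage_debug_flags = {
    "cloudStorageCachePolicy": "selective",
    "cloudStorageDiskMonitorInterval": 60,           # interval in seconds
    "cloudStorageIndexInactiveDurationThreshold": 1,
    "cloudStorageDebugModeEnabled": True,
    "cloudStorageDebugSweepThresholdSize": 1 << 30,  # 1073741824 bytes (1 GiB)
}

print(json.dumps(unlimited_storage_debug_flags, indent=2))
```

Note that in JSON the boolean must be lowercase `true`; a Python-style `True` inside the config value would not parse.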
POST https://svc-da-node-001.uzimc1kyvet62sh.sandbox.nonprod-project-avengers.com:18095/analytics/service
body: {"statement": "SET `compiler.external.field.pushdown` \"false\"; SELECT count(product_reviews) as product_reviews_count, count(avg_rating) as avg_rating_count from iDLQu.eyd0rD1WXYXrWGKJHr9.iQSsVYZy3DrerrOTj3RgiCD6s;", "pretty": "true", "client_context_id": null, "timeout": "300s"}
headers: {'Content-Type': 'application/json', 'Connection': 'close', 'Accept': '*/*'}
error: 500, reason: [{'code': 25000, 'msg': 'Internal error', 'retriable': False}]
{
  "requestID": "0b221a92-9af0-40e8-b0d9-b4a255996659",
  "clientContextID": "null",
  "errors": [
    { "code": 25000, "msg": "Internal error", "retriable": false }
  ],
  "status": "fatal",
  "metrics": {
    "elapsedTime": "772.747573ms",
    "executionTime": "771.857943ms",
    "compileTime": "0ns",
    "queueWaitTime": "0ns",
    "resultCount": 0,
    "resultSize": 0,
    "processedObjects": 0,
    "bufferCacheHitRatio": "0.00%",
    "bufferCachePageReadCount": 0,
    "errorCount": 1
  }
}
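The failing request can be replayed with a short script. A hedged sketch (the host and fully qualified collection name come from this ticket's sandbox environment, and `build_payload`/`run_statement` are illustrative helpers, not part of any shipped tooling):

```python
# Replay the failing Analytics query from this ticket.
# ANALYTICS_URL is this ticket's sandbox node, not a general endpoint.
import json
import urllib.request

ANALYTICS_URL = ("https://svc-da-node-001.uzimc1kyvet62sh"
                 ".sandbox.nonprod-project-avengers.com:18095/analytics/service")

def build_payload(statement: str, timeout: str = "300s") -> bytes:
    # Same body shape as the failing request above.
    return json.dumps({
        "statement": statement,
        "pretty": "true",
        "client_context_id": None,
        "timeout": timeout,
    }).encode()

def run_statement(statement: str) -> dict:
    # While this bug reproduces, the service answers HTTP 500 with
    # error code 25000 ("Internal error").
    req = urllib.request.Request(
        ANALYTICS_URL,
        data=build_payload(statement),
        headers={"Content-Type": "application/json", "Accept": "*/*"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```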
org.apache.hyracks.api.exceptions.HyracksDataException: java.io.IOException: FAILED_TO_UNCOMPRESS(5)
at org.apache.hyracks.api.exceptions.HyracksDataException.create(HyracksDataException.java:70) ~[hyracks-api.jar:1.0.0-2126]
at org.apache.hyracks.api.util.ExceptionUtils.setNodeIds(ExceptionUtils.java:70) ~[hyracks-api.jar:1.0.0-2126]
at org.apache.hyracks.control.nc.Task.run(Task.java:398) ~[hyracks-control-nc.jar:1.0.0-2126]
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
at java.base/java.lang.Thread.run(Thread.java:840) [?:?]
Caused by: java.io.IOException: FAILED_TO_UNCOMPRESS(5)
A few exceptions are also logged as warnings -
2024-06-05T10:26:30.951+00:00 WARN CBAS.buffercache.BufferCache [SAO:JID:0.128:TAID:TID:ANID:ODID:3:0:1:0] Failure while trying to read a page from disk
java.lang.IndexOutOfBoundsException: 35 + 17 > 51
at org.apache.hyracks.storage.common.compression.file.CompressedFileManager.getTotalCompressedSize(CompressedFileManager.java:262) ~[hyracks-storage-common.jar:1.0.0-2126]
at org.apache.hyracks.storage.common.file.CompressedBufferedFileHandle.getPagesTotalSize(CompressedBufferedFileHandle.java:238) ~[hyracks-storage-common.jar:1.0.0-2126]
at org.apache.hyracks.storage.am.lsm.btree.column.cloud.buffercache.read.CloudMegaPageReadContext.getOrCreateStream(CloudMegaPageReadContext.java:187) ~[hyracks-storage-am-lsm-btree-column.jar:1.0.0-2126]
at org.apache.hyracks.storage.am.lsm.btree.column.cloud.buffercache.read.CloudMegaPageReadContext.readFromStream(CloudMegaPageReadContext.java:152) ~[hyracks-storage-am-lsm-btree-column.jar:1.0.0-2126]
at org.apache.hyracks.storage.am.lsm.btree.column.cloud.buffercache.read.CloudMegaPageReadContext.processHeader(CloudMegaPageReadContext.java:121) ~[hyracks-storage-am-lsm-btree-column.jar:1.0.0-2126]
at org.apache.hyracks.storage.common.file.CompressedBufferedFileHandle.read(CompressedBufferedFileHandle.java:62) ~[hyracks-storage-common.jar:1.0.0-2126]
at org.apache.hyracks.storage.common.buffercache.BufferCache.read(BufferCache.java:571) ~[hyracks-storage-common.jar:1.0.0-2126]
at org.apache.hyracks.storage.common.buffercache.BufferCache.tryRead(BufferCache.java:544) ~[hyracks-storage-common.jar:1.0.0-2126]
at org.apache.hyracks.storage.common.buffercache.BufferCache.pin(BufferCache.java:214) ~[hyracks-storage-common.jar:1.0.0-2126]
at org.apache.hyracks.storage.am.lsm.btree.column.cloud.buffercache.read.CloudColumnReadContext.pin(CloudColumnReadContext.java:177) ~[hyracks-storage-am-lsm-btree-column.jar:1.0.0-2126]
at org.apache.hyracks.storage.am.lsm.btree.column.cloud.buffercache.read.CloudColumnReadContext.prepareColumns(CloudColumnReadContext.java:169) ~[hyracks-storage-am-lsm-btree-column.jar:1.0.0-2126]
at org.apache.hyracks.storage.am.lsm.btree.column.impls.btree.ColumnBTreeRangeSearchCursor.doOpen(ColumnBTreeRangeSearchCursor.java:134) ~[hyracks-storage-am-lsm-btree-column.jar:1.0.0-2126]
at org.apache.hyracks.storage.common.EnforcedIndexCursor.open(EnforcedIndexCursor.java:54) ~[hyracks-storage-common.jar:1.0.0-2126]
at org.apache.hyracks.storage.am.btree.impls.DiskBTree.searchDown(DiskBTree.java:138) ~[hyracks-storage-am-btree.jar:1.0.0-2126]
at org.apache.hyracks.storage.am.btree.impls.DiskBTree.search(DiskBTree.java:107) ~[hyracks-storage-am-btree.jar:1.0.0-2126]
at org.apache.hyracks.storage.am.btree.impls.DiskBTree$DiskBTreeAccessor.search(DiskBTree.java:195) ~[hyracks-storage-am-btree.jar:1.0.0-2126]
at org.apache.hyracks.storage.common.util.IndexCursorUtils.open(IndexCursorUtils.java:90) ~[hyracks-storage-common.jar:1.0.0-2126]
at org.apache.hyracks.storage.am.lsm.btree.impls.LSMBTreeRangeSearchCursor.doOpen(LSMBTreeRangeSearchCursor.java:415) ~[hyracks-storage-am-lsm-btree.jar:1.0.0-2126]
at org.apache.hyracks.storage.common.EnforcedIndexCursor.open(EnforcedIndexCursor.java:54) ~[hyracks-storage-common.jar:1.0.0-2126]
at org.apache.hyracks.storage.am.lsm.btree.impls.LSMBTreeSearchCursor.doOpen(LSMBTreeSearchCursor.java:62) ~[hyracks-storage-am-lsm-btree.jar:1.0.0-2126]
at org.apache.hyracks.storage.common.EnforcedIndexCursor.open(EnforcedIndexCursor.java:54) ~[hyracks-storage-common.jar:1.0.0-2126]
at org.apache.hyracks.storage.am.lsm.btree.impls.LSMBTree.search(LSMBTree.java:219) ~[hyracks-storage-am-lsm-btree.jar:1.0.0-2126]
at org.apache.hyracks.storage.am.lsm.common.impls.LSMHarness.search(LSMHarness.java:468) ~[hyracks-storage-am-lsm-common.jar:1.0.0-2126]
at org.apache.hyracks.storage.am.lsm.common.impls.LSMTreeIndexAccessor.search(LSMTreeIndexAccessor.java:118) ~[hyracks-storage-am-lsm-common.jar:1.0.0-2126]
at org.apache.hyracks.storage.am.common.dataflow.IndexSearchOperatorNodePushable.searchAllPartitions(IndexSearchOperatorNodePushable.java:466) ~[hyracks-storage-am-common.jar:1.0.0-2126]
at org.apache.hyracks.storage.am.common.dataflow.IndexSearchOperatorNodePushable.nextFrame(IndexSearchOperatorNodePushable.java:316) ~[hyracks-storage-am-common.jar:1.0.0-2126]
at org.apache.hyracks.dataflow.common.comm.io.AbstractFrameAppender.write(AbstractFrameAppender.java:94) ~[hyracks-dataflow-common.jar:1.0.0-2126]
at org.apache.hyracks.algebricks.runtime.operators.std.EmptyTupleSourceRuntimeFactory$1.open(EmptyTupleSourceRuntimeFactory.java:55) ~[algebricks-runtime.jar:1.0.0-2126]
at org.apache.hyracks.algebricks.runtime.operators.meta.AlgebricksMetaOperatorDescriptor$SourcePushRuntime.initialize(AlgebricksMetaOperatorDescriptor.java:175) ~[algebricks-runtime.jar:1.0.0-2126]
at org.apache.hyracks.api.rewriter.runtime.SuperActivityOperatorNodePushable.lambda$runInParallel$0(SuperActivityOperatorNodePushable.java:233) ~[hyracks-api.jar:1.0.0-2126]
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
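The `IndexOutOfBoundsException: 35 + 17 > 51` indicates the reader computed a compressed-page range (offset 35, length 17) that extends past the recorded compressed size (51), so the decompressor sees a truncated buffer, which matches the `FAILED_TO_UNCOMPRESS(5)` above. A rough stdlib analogy of both failure modes (zlib stands in for the actual compressor; this sketch is not Hyracks code):

```python
# Analogy only: a bounds check of the same shape as "35 + 17 > 51", plus a
# demonstration that a truncated compressed stream cannot be decompressed.
import zlib

page = zlib.compress(b"x" * 1024)

def read_range(buf: bytes, offset: int, length: int) -> bytes:
    # Mirrors the logged bounds failure: offset + length > total size.
    if offset + length > len(buf):
        raise IndexError(f"{offset} + {length} > {len(buf)}")
    return buf[offset:offset + length]

# Decompressing only half of the stream fails with a truncation error,
# analogous to FAILED_TO_UNCOMPRESS on a short read.
try:
    zlib.decompress(page[: len(page) // 2])
except zlib.error as exc:
    print("decompress failed:", exc)
```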
Attachments
Issue Links
- is duplicated by
  - MB-62173 Internal error while disconnecting remote link - Closed