Couchbase Server / MB-62175

Internal error while running analytics query


Details

    Description

      Steps to reproduce:

      1. Create a Capella cluster and create 1 bucket with 1 scope and 10 collections.
      2. Load 1,048,576 docs of 1KB into each of the KV collections.
      3. Create a columnar instance.
      4. Set the following analytics service config:
         unlimited_storage_debug_flags = {
           "cloudStorageCachePolicy": "selective", "cloudStorageDiskMonitorInterval": 60, "cloudStorageIndexInactiveDurationThreshold": 1, "cloudStorageDebugModeEnabled": true, "cloudStorageDebugSweepThresholdSize": 1073741824
           }
      5. Restart the analytics service.
      6. Wait for the service to be up.
      7. Create 1 remote link and 10 remote collections (one remote collection per KV collection).
      8. Run a few queries.
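
      The debug flags from step 4 can be sanity-checked as JSON before applying them. This is a hedged sketch: the flag names and values are the ones from the step above, and whichever config endpoint you apply them to should be verified against your server version's Analytics Settings REST API.

      ```python
      import json

      # unlimited_storage_debug_flags value from the reproduction steps.
      DEBUG_FLAGS = {
          "cloudStorageCachePolicy": "selective",
          "cloudStorageDiskMonitorInterval": 60,
          "cloudStorageIndexInactiveDurationThreshold": 1,
          "cloudStorageDebugModeEnabled": True,
          "cloudStorageDebugSweepThresholdSize": 1073741824,  # 1 GiB
      }

      def flags_as_json() -> str:
          """Serialize the flags to a single JSON string, which is how the
          config value is passed to the service setting."""
          return json.dumps(DEBUG_FLAGS)
      ```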

      POST https://svc-da-node-001.uzimc1kyvet62sh.sandbox.nonprod-project-avengers.com:18095/analytics/service
      body: {"statement": "SET `compiler.external.field.pushdown` \"false\"; SELECT count(product_reviews) as product_reviews_count, count(avg_rating) as avg_rating_count from iDLQu.eyd0rD1WXYXrWGKJHr9.iQSsVYZy3DrerrOTj3RgiCD6s;", "pretty": "true", "client_context_id": null, "timeout": "300s"}
      headers: {'Content-Type': 'application/json', 'Connection': 'close', 'Accept': '*/*'}
      error: 500
      reason: [{'code': 25000, 'msg': 'Internal error', 'retriable': False}]
      {
          "requestID": "0b221a92-9af0-40e8-b0d9-b4a255996659",
          "clientContextID": "null",
          "errors": [
              {
                  "code": 25000,
                  "msg": "Internal error",
                  "retriable": false
              }
          ],
          "status": "fatal",
          "metrics": {
              "elapsedTime": "772.747573ms",
              "executionTime": "771.857943ms",
              "compileTime": "0ns",
              "queueWaitTime": "0ns",
              "resultCount": 0,
              "resultSize": 0,
              "processedObjects": 0,
              "bufferCacheHitRatio": "0.00%",
              "bufferCachePageReadCount": 0,
              "errorCount": 1
          }
      } 
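
      The failing request above can be re-issued with a short script. This is a sketch, not a supported tool: the host is the node from the report, the credentials are placeholders, and the `requests` dependency is an assumption.

      ```python
      # Sketch of re-issuing the failing /analytics/service request.
      ANALYTICS_URL = ("https://svc-da-node-001.uzimc1kyvet62sh"
                       ".sandbox.nonprod-project-avengers.com:18095/analytics/service")

      # Statement copied from the report.
      STATEMENT = (
          'SET `compiler.external.field.pushdown` "false"; '
          "SELECT count(product_reviews) as product_reviews_count, "
          "count(avg_rating) as avg_rating_count "
          "from iDLQu.eyd0rD1WXYXrWGKJHr9.iQSsVYZy3DrerrOTj3RgiCD6s;"
      )

      def build_body(statement: str, timeout: str = "300s") -> dict:
          """Build the request body exactly as sent in the report."""
          return {
              "statement": statement,
              "pretty": "true",
              "client_context_id": None,
              "timeout": timeout,
          }

      if __name__ == "__main__":
          import requests  # third-party: pip install requests
          resp = requests.post(
              ANALYTICS_URL,
              json=build_body(STATEMENT),
              auth=("user", "password"),  # placeholder credentials
              headers={"Connection": "close"},
          )
          print(resp.status_code, resp.text)
      ```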

      org.apache.hyracks.api.exceptions.HyracksDataException: java.io.IOException: FAILED_TO_UNCOMPRESS(5)
      	at org.apache.hyracks.api.exceptions.HyracksDataException.create(HyracksDataException.java:70) ~[hyracks-api.jar:1.0.0-2126]
      	at org.apache.hyracks.api.util.ExceptionUtils.setNodeIds(ExceptionUtils.java:70) ~[hyracks-api.jar:1.0.0-2126]
      	at org.apache.hyracks.control.nc.Task.run(Task.java:398) ~[hyracks-control-nc.jar:1.0.0-2126]
      	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
      	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
      	at java.base/java.lang.Thread.run(Thread.java:840) [?:?]
      Caused by: java.io.IOException: FAILED_TO_UNCOMPRESS(5) 

      A few exceptions are logged as warnings as well:

      2024-06-05T10:26:30.951+00:00 WARN CBAS.buffercache.BufferCache [SAO:JID:0.128:TAID:TID:ANID:ODID:3:0:1:0] Failure while trying to read a page from disk
      java.lang.IndexOutOfBoundsException: 35 + 17 > 51
      	at org.apache.hyracks.storage.common.compression.file.CompressedFileManager.getTotalCompressedSize(CompressedFileManager.java:262) ~[hyracks-storage-common.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.common.file.CompressedBufferedFileHandle.getPagesTotalSize(CompressedBufferedFileHandle.java:238) ~[hyracks-storage-common.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.am.lsm.btree.column.cloud.buffercache.read.CloudMegaPageReadContext.getOrCreateStream(CloudMegaPageReadContext.java:187) ~[hyracks-storage-am-lsm-btree-column.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.am.lsm.btree.column.cloud.buffercache.read.CloudMegaPageReadContext.readFromStream(CloudMegaPageReadContext.java:152) ~[hyracks-storage-am-lsm-btree-column.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.am.lsm.btree.column.cloud.buffercache.read.CloudMegaPageReadContext.processHeader(CloudMegaPageReadContext.java:121) ~[hyracks-storage-am-lsm-btree-column.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.common.file.CompressedBufferedFileHandle.read(CompressedBufferedFileHandle.java:62) ~[hyracks-storage-common.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.common.buffercache.BufferCache.read(BufferCache.java:571) ~[hyracks-storage-common.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.common.buffercache.BufferCache.tryRead(BufferCache.java:544) ~[hyracks-storage-common.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.common.buffercache.BufferCache.pin(BufferCache.java:214) ~[hyracks-storage-common.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.am.lsm.btree.column.cloud.buffercache.read.CloudColumnReadContext.pin(CloudColumnReadContext.java:177) ~[hyracks-storage-am-lsm-btree-column.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.am.lsm.btree.column.cloud.buffercache.read.CloudColumnReadContext.prepareColumns(CloudColumnReadContext.java:169) ~[hyracks-storage-am-lsm-btree-column.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.am.lsm.btree.column.impls.btree.ColumnBTreeRangeSearchCursor.doOpen(ColumnBTreeRangeSearchCursor.java:134) ~[hyracks-storage-am-lsm-btree-column.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.common.EnforcedIndexCursor.open(EnforcedIndexCursor.java:54) ~[hyracks-storage-common.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.am.btree.impls.DiskBTree.searchDown(DiskBTree.java:138) ~[hyracks-storage-am-btree.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.am.btree.impls.DiskBTree.search(DiskBTree.java:107) ~[hyracks-storage-am-btree.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.am.btree.impls.DiskBTree$DiskBTreeAccessor.search(DiskBTree.java:195) ~[hyracks-storage-am-btree.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.common.util.IndexCursorUtils.open(IndexCursorUtils.java:90) ~[hyracks-storage-common.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.am.lsm.btree.impls.LSMBTreeRangeSearchCursor.doOpen(LSMBTreeRangeSearchCursor.java:415) ~[hyracks-storage-am-lsm-btree.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.common.EnforcedIndexCursor.open(EnforcedIndexCursor.java:54) ~[hyracks-storage-common.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.am.lsm.btree.impls.LSMBTreeSearchCursor.doOpen(LSMBTreeSearchCursor.java:62) ~[hyracks-storage-am-lsm-btree.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.common.EnforcedIndexCursor.open(EnforcedIndexCursor.java:54) ~[hyracks-storage-common.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.am.lsm.btree.impls.LSMBTree.search(LSMBTree.java:219) ~[hyracks-storage-am-lsm-btree.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.am.lsm.common.impls.LSMHarness.search(LSMHarness.java:468) ~[hyracks-storage-am-lsm-common.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.am.lsm.common.impls.LSMTreeIndexAccessor.search(LSMTreeIndexAccessor.java:118) ~[hyracks-storage-am-lsm-common.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.am.common.dataflow.IndexSearchOperatorNodePushable.searchAllPartitions(IndexSearchOperatorNodePushable.java:466) ~[hyracks-storage-am-common.jar:1.0.0-2126]
      	at org.apache.hyracks.storage.am.common.dataflow.IndexSearchOperatorNodePushable.nextFrame(IndexSearchOperatorNodePushable.java:316) ~[hyracks-storage-am-common.jar:1.0.0-2126]
      	at org.apache.hyracks.dataflow.common.comm.io.AbstractFrameAppender.write(AbstractFrameAppender.java:94) ~[hyracks-dataflow-common.jar:1.0.0-2126]
      	at org.apache.hyracks.algebricks.runtime.operators.std.EmptyTupleSourceRuntimeFactory$1.open(EmptyTupleSourceRuntimeFactory.java:55) ~[algebricks-runtime.jar:1.0.0-2126]
      	at org.apache.hyracks.algebricks.runtime.operators.meta.AlgebricksMetaOperatorDescriptor$SourcePushRuntime.initialize(AlgebricksMetaOperatorDescriptor.java:175) ~[algebricks-runtime.jar:1.0.0-2126]
      	at org.apache.hyracks.api.rewriter.runtime.SuperActivityOperatorNodePushable.lambda$runInParallel$0(SuperActivityOperatorNodePushable.java:233) ~[hyracks-api.jar:1.0.0-2126]
      	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
      	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
      	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?] 
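
      The IndexOutOfBoundsException message (`35 + 17 > 51`) is the bounds check in `CompressedFileManager.getTotalCompressedSize`: the reader asks for a range of compressed-page offset entries that extends past the end of the on-disk offset file. A toy illustration of that check (not the actual Hyracks code; names and semantics are guesses from the message):

      ```python
      def total_compressed_size(offset_file_size: int, start: int, length: int):
          """Toy version of the failing bounds check: reading `length` bytes
          of offset-table entries at position `start` must stay inside the
          offset file.  A shorter-than-expected file (e.g. an incomplete
          cloud download or an over-aggressive debug sweep) trips it."""
          if start + length > offset_file_size:
              raise IndexError(f"{start} + {length} > {offset_file_size}")
          # ... the real code would read and sum compressed page sizes here ...

      # Values from the logged exception:
      # total_compressed_size(51, 35, 17)  -> IndexError: 35 + 17 > 51
      ```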

      People

        umang.agrawal Umang