Details
- Type: Bug
- Resolution: Fixed
- Priority: Critical
- Affects Version/s: 4.5.0, 4.5.1
- Fix Version/s: None
- Triage: Untriaged
- Is this a Regression?: Unknown
Description
Steps to reproduce
- Create an index with the following definition:
CREATE INDEX `test` ON `default`(`name`)
- Run the following script:
#!/usr/bin/python
from couchbase.bucket import Bucket
from couchbase.n1ql import N1QLQuery

cb = Bucket('couchbase://10.111.151.101/default')

def get_name_length():
    query = N1QLQuery("SELECT name FROM `default` WHERE name IS NOT MISSING")
    for row in cb.n1ql_query(query):
        return len(row['name'])

for x in [10, 20971420]:
    cb.upsert('large', {'name': ("x" * x)})
    length_check = True
    while length_check:
        length = get_name_length()
        if length == x:
            length_check = False
        elif length is None and x == 20971420:
            print "Index is empty at x: {} - length: {}, which is expected.\n Test has passed".format(x, length)
            length_check = False
        elif length < x:
            print "Retrying x: {} - length: {}".format(x, length)
The script loops forever, because the index is never updated.
Problem
The problem here is that the projector does not inform the indexer about the document with the large field:
- Projector logs:
2017-01-18T22:44:04.845+00:00 [Error] WRKR[43<-default<-127.0.0.1:8091 #MAINT_STREAM_TOPIC_e5:11:29:9:92:3b:f1:cc] ##4 TransformRoute: collatejson.outputLen
2017-01-18T22:44:09.064+00:00 [Error] CollateJSONEncode: index field for docid: large (err: collatejson.outputLen)
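The logs above show the secondary-key encoding failing with collatejson.outputLen. A minimal, hypothetical Python model of why the stale entry then survives (MAX_ENCODED_LEN, encode_key and project_upsert are illustrative stand-ins, not Couchbase code): the indexer only changes an entry when the projector sends it a mutation, so silently dropping the oversized key leaves the previous entry in place.

```python
MAX_ENCODED_LEN = 4096  # assumed limit for illustration only

def encode_key(value):
    """Stand-in for collatejson encoding; fails like collatejson.outputLen."""
    encoded = repr(value)
    if len(encoded) > MAX_ENCODED_LEN:
        raise ValueError("collatejson.outputLen")
    return encoded

def project_upsert(index, docid, value):
    """Models the buggy path: on encode failure the mutation is dropped."""
    try:
        index[docid] = encode_key(value)
    except ValueError:
        pass  # bug: no delete is sent, so the old entry remains

index = {}
project_upsert(index, "large", "x" * 10)        # small value: indexed
project_upsert(index, "large", "x" * 20971420)  # oversized: dropped
print(index["large"])  # still the encoding of the small value: stale entry
```

This mirrors what the repro script observes: the query keeps returning the old 10-character value after the 20 MB upsert.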
Expected behaviour
The index should contain no entry at all for the document, and in particular not the stale old value, when the field exceeds the maximum index entry size.
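A sketch of the expected behaviour (an assumption about the fix, not the actual patch; encode_key and MAX_ENCODED_LEN are illustrative stand-ins): when the encoded secondary key exceeds the limit, the projector should send the indexer a delete for that docid instead of silently dropping the mutation, so any stale entry is removed.

```python
MAX_ENCODED_LEN = 4096  # assumed limit for illustration only

def encode_key(value):
    """Stand-in for collatejson encoding; fails like collatejson.outputLen."""
    encoded = repr(value)
    if len(encoded) > MAX_ENCODED_LEN:
        raise ValueError("collatejson.outputLen")
    return encoded

def project_upsert(index, docid, value):
    """Expected behaviour: encode failure removes any stale entry."""
    try:
        index[docid] = encode_key(value)
    except ValueError:
        index.pop(docid, None)  # emit a delete rather than keep the old key

index = {"large": encode_key("x" * 10)}         # entry from the small value
project_upsert(index, "large", "x" * 20971420)  # oversized field arrives
print("large" in index)  # -> False: no stale value left in the index
```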
Issue Links
- relates to MB-22693: Backport MB-22403 to 4.6.2 - Indexer is inconsistent when a field goes from a small value to an extremely large value. (Closed)