Details
- Bug
- Resolution: Duplicate
- Major
- None
- 3.0.3, 4.0.0
- Security Level: Public
- Environment: Three nodes in AWS r3.xlarge; 1 bucket, full eviction, memory quota 100MB
- Untriaged
- No
Description
Problem
While running a script that continually adds documents to the cluster using the add operation, I got the following exception from the Python SDK when mem_used hit the high watermark with full eviction enabled:
Process Process-3:
Traceback (most recent call last):
  File "/usr/lib64/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/lib64/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "./add-forever.py", line 19, in proc
    bucket.add("doc={0}".format(x), doc, format=couchbase.FMT_BYTES)
  File "/usr/local/lib64/python2.7/site-packages/couchbase/bucket.py", line 1388, in _tmpmeth
    return _dst(self, *args, **kwargs)
  File "/usr/local/lib64/python2.7/site-packages/couchbase/bucket.py", line 363, in insert
    persist_to=persist_to, replicate_to=replicate_to)
_NotFoundError_0xD (generated, catch NotFoundError): <Key=u'doc=83838', RC=0xD[The key does not exist on the server], Operational Error, Results=1, C Source=(src/multiresult.c,309)>
I have attached the tcpdump from the test run, which shows that the error is coming from the server.
Request
I would not expect to get a "key does not exist on the server" error for an add operation.
Steps to reproduce
I tried to simplify the test to one node and one thread; however, the issue reproduced less reliably that way.
With the script below and a 100MB bucket quota I was able to reproduce it reliably within 1 minute in AWS.
#!/usr/bin/env python2.7

import os
from multiprocessing import Process

import couchbase
from couchbase.bucket import Bucket

TOTAL_PROC = 4
HOSTNAME = '10.0.0.137'
BUCKET = 'default'
TOTAL_ITEMS = 200000000
SIZE = 1024

def proc(proc_id):
    bucket = Bucket("couchbase://{0}/{1}".format(HOSTNAME, BUCKET))
    for x in xrange(proc_id, TOTAL_ITEMS, TOTAL_PROC):
        doc = bytearray(os.urandom(SIZE))
        bucket.add("doc={0}".format(x), doc, format=couchbase.FMT_BYTES)

if __name__ == '__main__':
    procs = []
    for proc_id in range(TOTAL_PROC):
        p = Process(target=proc, args=[proc_id])
        p.start()
        procs.append(p)

    print "Have {0} processes".format(len(procs))
    try:
        for proc in procs:
            proc.join()
    except:
        for proc in procs:
            proc.terminate()
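As an aside on the workload itself: since the server is under memory pressure near the high watermark, a load generator like the one above would typically retry on transient errors so that only genuine failures (like the spurious NotFoundError reported here) surface. A minimal, self-contained sketch of such a retry helper, independent of the Couchbase SDK (the names `retry_on` and `FakeTempFail` are hypothetical), might look like:

```python
import time

def retry_on(exc_type, attempts=5, delay=0.01):
    """Return a decorator that retries a callable when exc_type is raised."""
    def decorator(fn):
        def wrapper(*args, **kwargs):
            for attempt in range(attempts):
                try:
                    return fn(*args, **kwargs)
                except exc_type:
                    if attempt == attempts - 1:
                        raise  # out of retries: propagate the real failure
                    time.sleep(delay)  # brief pause before retrying
        return wrapper
    return decorator

# Demo: a stub "add" that fails twice with a transient error, then succeeds.
class FakeTempFail(Exception):
    pass

calls = {'n': 0}

@retry_on(FakeTempFail, attempts=5, delay=0)
def flaky_add(key):
    calls['n'] += 1
    if calls['n'] < 3:
        raise FakeTempFail("server busy")
    return "stored {0}".format(key)

print(flaky_add("doc=1"))  # succeeds on the third attempt
```

In the real script, only exceptions the SDK marks as temporary should be retried; retrying a NotFoundError from an add would mask the server-side bug this report is about.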
Logs
Server:
https://s3.amazonaws.com/cb-customers/patrick/collectinfo-2016-02-10T170834-ns_1%4010.0.0.104.zip
https://s3.amazonaws.com/cb-customers/patrick/collectinfo-2016-02-10T170834-ns_1%4010.0.0.105.zip
https://s3.amazonaws.com/cb-customers/patrick/collectinfo-2016-02-10T170834-ns_1%4010.0.0.137.zip
Tcpdump:
add.pcap
Attachments
Issue Links
- duplicates
  - MB-14859 SetWithMeta returns KEY_ENOENT to goxdcr (Closed)