Details
- Type: Bug
- Resolution: Fixed
- Priority: Major
Description
The root cause may be MDS (Multi-Dimensional Scaling, i.e. running services on separate nodes) or something else. Please check that the Kafka connector works as expected in MDS scenarios.
Bug report originally from forum:
https://forums.couchbase.com/t/error-while-using-couchbase-kafka-connector/6598/3
Some of the bug report is reproduced here to help JIRA search.
I am trying to run the latest Couchbase Kafka connector samples from the link below.
https://github.com/couchbase/couchbase-kafka-connector3
However, when I run the Kafka producer, I see the following error.
INFO - Client environment:java.io.tmpdir=/var/folders/4w/xs06ksps51xb5ygrwtrr9xg80000gq/T/
INFO - Client environment:java.compiler=
INFO - Client environment:os.name=Mac OS X
INFO - Client environment:os.arch=x86_64
INFO - Client environment:os.version=10.11.2
INFO - Client environment:user.name=kadhambari
INFO - Client environment:user.home=/Users/kadhambari
INFO - Client environment:user.dir=/Users/kadhambari/Documents/couchbase-kafka-connector/samples/producer
INFO - Initiating client connection, connectString=192.168.244.52 sessionTimeout=4000 watcher=org.I0Itec.zkclient.ZkClient@5f282abb
INFO - Opening socket connection to server 192.168.244.52/192.168.244.52:2181. Will not attempt to authenticate using SASL (unknown error)
INFO - Socket connection established to 192.168.244.52/192.168.244.52:2181, initiating session
INFO - Session establishment complete on server 192.168.244.52/192.168.244.52:2181, sessionid = 0x15262921fcd0002, negotiated timeout = 4000
INFO - zookeeper state changed (SyncConnected)
INFO - Verifying properties
INFO - Property key.serializer.class is overridden to kafka.serializer.StringEncoder
INFO - Property metadata.broker.list is overridden to
INFO - Property serializer.class is overridden to example.SampleEncoder
INFO - Connected to Node 192.168.244.94
ERROR - Error while subscribing to bucket config stream.
Please help me understand what could be causing this issue.
Hence I have hardcoded the IP address. The Couchbase server and Kafka are up and running. I was finally able to connect to Couchbase from Kafka when I ran the Data, Index, and Query services on the same node. However, I still face the same issue when I connect to a cluster where the Data, Index, and Query services run on different nodes. Following is the configuration I currently have for the cluster:
192.168.244.94 - Data
192.168.244.117 - Index/Query

Is there anything else I need to do in order to make it work?
WillGardella replied (12d):
Hi @kadhambari,
The Kafka Connector works with DCP streams, i.e. mutations on documents. I would only expect the Kafka Connector to work against nodes that are running the Data service, not against nodes that have the Data service disabled. Any changes to documents take place in the Data service, so you should not need to hook Kafka up to any nodes that are not running the Data service.
Best,
-Will

kadhambari replied (12d):
Thanks for pointing this out @WillGardella. Yes, it makes complete sense to hook Kafka up to a data node. But if you look at my error log, I had hooked Kafka up to a node (192.168.244.94) where the Data service runs. I also tried connecting Kafka to a node (192.168.244.117) where the Index/Query services were running, to verify whether that was causing the issue. However, I ended up getting the same error in both scenarios.
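The advice in this thread (and the fix tracked in JVMCBC-290) amounts to: in an MDS cluster, a DCP client should restrict itself to nodes advertising the Data (`kv`) service, since only those nodes speak the binary/DCP protocol. The sketch below illustrates that selection. It assumes the cluster manager's `/pools/default` REST response exposes a per-node `services` array; the JSON here is a hand-written, heavily trimmed stand-in mirroring the two-node layout described in this report, not a real server response.

```python
import json

# Trimmed, illustrative stand-in for a /pools/default response,
# using the node layout from this bug report (assumption: the real
# response carries a "services" array per node, among many other fields).
POOLS_DEFAULT = json.loads("""
{
  "nodes": [
    {"hostname": "192.168.244.94:8091",  "services": ["kv"]},
    {"hostname": "192.168.244.117:8091", "services": ["index", "n1ql"]}
  ]
}
""")

def data_service_nodes(pools_default: dict) -> list:
    """Return hostnames of nodes running the Data (kv) service.

    In an MDS cluster only these nodes serve the binary/DCP protocol,
    so a DCP client such as the Kafka connector must target this
    subset only (cf. JVMCBC-290).
    """
    return [n["hostname"]
            for n in pools_default["nodes"]
            if "kv" in n.get("services", [])]

print(data_service_nodes(POOLS_DEFAULT))  # only the Data node qualifies
```

With this filter, 192.168.244.117 (Index/Query only) would never be offered a DCP connection, which is the behavior the linked fixes establish inside the client itself.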
Issue Links
- depends on JVMCBC-290: "In MDS cluster, the client tries to initialize DCP on node, which does not serve binary protocol" (Resolved)
- relates to JVMCBC-370: "DcpConnection.getCurrentState() sends requests to all nodes in MDS setup" (Resolved)