Verify connector works when cluster contains nodes not running all services

Description

The root cause may be MDS (multi-dimensional scaling) or something else. Please verify that the Kafka connector works as expected in MDS scenarios, i.e. against clusters where the data, index, and query services run on different nodes.

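To help set up that verification, the following is an editor's sketch (not part of the original report; the host and credentials are placeholders) that fetches the cluster's REST endpoint /pools/default so the tester can confirm which services each node runs. On Couchbase Server 4.x and later, each entry in the "nodes" array of the response carries a "services" list.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Base64;

// Editor's sketch: dump the cluster topology so you can see which services
// (kv/data, index, n1ql/query) each node runs. Host and credentials are
// placeholders, not values taken from this ticket.
public class ListNodeServices {
    public static void main(String[] args) throws Exception {
        String host = "192.168.244.94";           // any node of the cluster
        String creds = "Administrator:password";  // placeholder credentials

        URL url = new URL("http://" + host + ":8091/pools/default");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("Authorization",
                "Basic " + Base64.getEncoder().encodeToString(creds.getBytes("UTF-8")));

        // The JSON response has a "nodes" array; each node entry includes a
        // "services" list such as ["kv"] or ["index","n1ql"].
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
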
Bug report originally from forum:
https://forums.couchbase.com/t/error-while-using-couchbase-kafka-connector/6598/3

Some of the bug report is reproduced here to help JIRA search.

I am trying to run the latest Couchbase Kafka connector samples from the link below.

https://github.com/couchbase/couchbase-kafka-connector3

However, when I run the Kafka producer, I see the following error:

INFO - Client environment:java.io.tmpdir=/var/folders/4w/xs06ksps51xb5ygrwtrr9xg80000gq/T/
INFO - Client environment:java.compiler=
INFO - Client environment:os.name=Mac OS X
INFO - Client environment:os.arch=x86_64
INFO - Client environment:os.version=10.11.2
INFO - Client environment:user.name=kadhambari
INFO - Client environment:user.home=/Users/kadhambari
INFO - Client environment:user.dir=/Users/kadhambari/Documents/couchbase-kafka-connector/samples/producer
INFO - Initiating client connection, connectString=192.168.244.52 sessionTimeout=4000 watcher=org.I0Itec.zkclient.ZkClient@5f282abb
INFO - Opening socket connection to server 192.168.244.52/192.168.244.52:2181. Will not attempt to authenticate using SASL (unknown error)
INFO - Socket connection established to 192.168.244.52/192.168.244.52:2181, initiating session
INFO - Session establishment complete on server 192.168.244.52/192.168.244.52:2181, sessionid = 0x15262921fcd0002, negotiated timeout = 4000
INFO - zookeeper state changed (SyncConnected)
INFO - Verifying properties
INFO - Property key.serializer.class is overridden to kafka.serializer.StringEncoder
INFO - Property metadata.broker.list is overridden to
INFO - Property serializer.class is overridden to example.SampleEncoder
INFO - Connected to Node 192.168.244.94
ERROR - Error while subscribing to bucket config stream.

Please help me understand what could be causing this issue.

Hence I have hardcoded the IP address. The Couchbase Server and Kafka are both up and running. I was finally able to connect to Couchbase from Kafka when I ran the Data, Index, and Query services on the same node. However, I still face the same issue when I connect to a cluster where the Data, Index, and Query services run on different nodes. The following is the configuration I currently have for the cluster:

192.168.244.94 - Data
192.168.244.117 - Index/Query

Is there anything else I need to do in order to make it work?


WillGardella:
Hi @kadhambari,
The Kafka Connector works with DCP streams, that is, mutations on documents. I would only expect the Kafka Connector to work against nodes that are running the data service, not against nodes where the data service is disabled. Any changes to documents take place in the data service, so you should not actually need to hook Kafka up to any nodes that are not running the data service.
Best,
-Will

kadhambari:
Thanks for pointing this out, @WillGardella. Yes, it makes complete sense to hook Kafka up to a data node.

But if you look at my error log, I have hooked Kafka up to a node (192.168.244.94) where the data service runs. I also tried connecting Kafka to the node (192.168.244.117) where the Index/Query services run, to verify whether that was causing the issue. However, I got the same error in both scenarios.
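
For context, the shape of the sample producer setup discussed in this thread is roughly as follows. This is an editor's sketch, not code from the report: the builder and create(...) method names follow the connector 3.x quickstart from memory and should be checked against samples/producer in the repository; the addresses, bucket, and topic mirror the values mentioned above. The point illustrated is that couchbaseNodes should list the data-service node (192.168.244.94), not the index/query node.

import com.couchbase.kafka.CouchbaseKafkaConnector;
import com.couchbase.kafka.DefaultCouchbaseKafkaEnvironment;

// Editor's sketch of the sample producer configuration under discussion.
// Method names are recalled from the connector 3.x quickstart and may
// differ slightly; consult the samples in the repository for the exact API.
public class SampleProducer {
    public static void main(String[] args) {
        DefaultCouchbaseKafkaEnvironment.Builder builder =
                (DefaultCouchbaseKafkaEnvironment.Builder) DefaultCouchbaseKafkaEnvironment
                        .builder()
                        .kafkaFilterClass("example.SampleFilter")
                        .kafkaValueSerializerClass("example.SampleEncoder")
                        .kafkaTopic("default")
                        .kafkaZookeeperAddress("192.168.244.52")  // ZooKeeper host from the log above
                        .couchbaseNodes("192.168.244.94")         // data-service node, per Will's advice
                        .couchbaseBucket("default")
                        .dcpEnabled(true);
        CouchbaseKafkaConnector connector = CouchbaseKafkaConnector.create(builder.build());
        connector.run();
    }
}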

Environment

None

Gerrit Reviews

None

Release Notes Description

None

Activity


Will Gardella February 12, 2016 at 7:11 PM

This requires a fix in the JVM core library. The fix is expected in the upcoming JVM core maintenance release, planned as 1.2.5, and will then need to be included in the next maintenance release of the Kafka Connector.

Fixed

Created February 5, 2016 at 9:50 PM
Updated February 9, 2017 at 9:46 PM
Resolved February 12, 2016 at 6:37 PM