Details
- Type: New Feature
- Resolution: Fixed
- Priority: Major
- Fix Version: .next
- Security Level: Public
Description
We are implementing a Kafka connector based on the work Sergey has done with DCP and Hadoop.
The basic idea is to write an example application which accepts bucket credentials (URLs, bucket name, password) and streams everything to Kafka.
Each Kafka message represents a key/value pair, where the key matches the Couchbase key and the value aggregates the Couchbase value with additional metadata:
{
  "event": "...",
  "key": "...",
  "expiration": "...",
  "flags": "...",
  "lockTime": "...",
  "cas": "...",
  "content": "..."
}
where content is either a JSON object or a base64-encoded string of the raw data.
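A minimal sketch of how the message value above might be assembled for one DCP event, assuming the connector receives the document body as raw bytes. The class and method names (`DcpEventEnvelope`, `toJson`) are hypothetical, not part of the actual connector; only the JDK is used, so a real implementation would likely substitute a proper JSON library and the Kafka producer API:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Hypothetical helper: serializes one DCP event into the JSON envelope
// described above. Field names mirror the sketch in the description.
public class DcpEventEnvelope {

    public static String toJson(String event, String key, long expiration,
                                long flags, long lockTime, long cas,
                                byte[] rawContent, boolean contentIsJson) {
        // If the document body is already JSON, embed it verbatim;
        // otherwise fall back to a base64 string of the raw bytes.
        String content = contentIsJson
                ? new String(rawContent, StandardCharsets.UTF_8)
                : "\"" + Base64.getEncoder().encodeToString(rawContent) + "\"";
        return String.format(
                "{\"event\":\"%s\",\"key\":\"%s\",\"expiration\":%d,"
                + "\"flags\":%d,\"lockTime\":%d,\"cas\":%d,\"content\":%s}",
                event, key, expiration, flags, lockTime, cas, content);
    }

    public static void main(String[] args) {
        byte[] raw = "raw-bytes".getBytes(StandardCharsets.UTF_8);
        // The Couchbase key would become the Kafka message key;
        // this JSON string would become the Kafka message value.
        System.out.println(toJson("mutation", "user::42",
                0L, 0L, 0L, 123456789L, raw, false));
    }
}
```

In a real producer loop, `key` would be used as the Kafka record key so that all events for one Couchbase document land in the same partition.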