Hi,
We recently ran into an interesting problem: the Couchbase Kafka connector kept sending duplicate events to Kafka until we restarted the connector application. We did not see any problem in the database or in Kafka at the time.
We are using the Couchbase Kafka connector version 4.0.2 in distributed mode to capture events and send them to Kafka, with XDCR enabled on the Couchbase cluster. The cluster runs Couchbase Server Enterprise Edition 6.6.
Our Kafka Connect configuration is as follows:
{
  "name": "name",
  "connector.class": "com.couchbase.connect.kafka.CouchbaseSourceConnector",
  "tasks.max": "3",
  "couchbase.seed.nodes": "nodes",
  "couchbase.bucket": "bucket",
  "couchbase.username": "user",
  "couchbase.password": "pass",
  "couchbase.topic": "topic",
  "couchbase.compression": "ENABLED",
  "couchbase.persistence.polling.interval": "0",
  "couchbase.stream.from": "SAVED_OFFSET_OR_NOW",
  "couchbase.source.handler": "com.couchbase.connect.kafka.handler.source.RawJsonSourceHandler",
  "connection.timeout.ms": "2000",
  "event.filter.class": "com.couchbase.connect.kafka.filter.AllPassFilter",
  "value.converter": "org.apache.kafka.connect.json.JsonConverter",
  "value.converter.schemas.enable": "false",
  "transforms": "ignoreDeletes,deserializeJson",
  "transforms.ignoreDeletes.type": "com.couchbase.connect.kafka.transform.DropIfNullValue",
  "transforms.deserializeJson.type": "com.couchbase.connect.kafka.transform.DeserializeJson"
}
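We understand that Kafka Connect source connectors generally provide at-least-once delivery, so occasional duplicates are expected; what surprised us is that the duplicates continued until a restart. As a stopgap we are considering deduplicating on the consumer side, keyed on document ID plus a revision token such as the Couchbase CAS value. A minimal sketch (the `(key, cas, value)` record shape is our own assumption for illustration, not something the connector emits in that form):

```python
def dedupe(records, seen=None):
    """Yield records whose (key, cas) pair has not been seen before.

    Assumes each record is a (key, cas, value) tuple, where `cas` is a
    monotonically increasing revision token (e.g. a Couchbase CAS value).
    A record is treated as a duplicate if we already saw the same key
    with an equal or newer cas.
    """
    if seen is None:
        seen = {}
    for key, cas, value in records:
        if key in seen and cas <= seen[key]:
            continue  # duplicate or stale re-delivery; drop it
        seen[key] = cas
        yield key, cas, value
```

This only helps a single consumer with in-memory state; a shared store would be needed across consumer instances, and it does not explain why the connector itself re-emitted events.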
Is there any mistake in our configuration? What else should we check? Any help would be appreciated.
Thanks,
Umit