A theme that comes up often in our engagements is Change Data Capture (CDC). As Kafka becomes a standard part of the IT infrastructure, there is a critical need to capture and translate data changes from external data stores, such as RDBMSs and key-value stores, into an event stream for Kafka, and vice versa. There isn't yet a globally accepted, standardized method for CDC to Kafka. The procedure is highly involved and depends on the facilities each external datastore provides. Further complexity arises because the two systems (the source datastore and Kafka) become dependent on each other; for example, how should Kafka downtime be handled? Our experienced developers are happy to have a chat with you if you are interested in the implementation details of such a system.
In this webinar, we approach Kafka-to-Couchbase CDC primarily from a DevOps point of view. We set up a two-way ETL, moving data from Couchbase to Kafka and vice versa, using Kafka Connect and Lenses. Once the ETL is established, we experiment with more advanced CDC scenarios. While we use the command line to produce new messages to Kafka and insert new documents into Couchbase, we implement, verify, and monitor the ETL in real time via Lenses.
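As a rough illustration of the Couchbase-to-Kafka direction, a source connector can be registered with Kafka Connect via a JSON config like the sketch below. This is not the exact config used in the webinar; the property names follow the Couchbase Kafka connector's 4.x naming scheme, and the bucket, topic, and credential values are placeholders you would replace for your own cluster (check the property names against your connector version).

```json
{
  "name": "couchbase-source",
  "config": {
    "connector.class": "com.couchbase.connect.kafka.CouchbaseSourceConnector",
    "tasks.max": "2",
    "couchbase.seed.nodes": "127.0.0.1",
    "couchbase.bucket": "travel-sample",
    "couchbase.username": "admin",
    "couchbase.password": "change-me",
    "couchbase.topic": "couchbase-events"
  }
}
```

Posting this JSON to the Kafka Connect REST API (the `POST /connectors` endpoint) starts tasks that stream Couchbase document mutations into the configured Kafka topic.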
Build end-to-end flows without code
Connect Couchbase to Kafka
Connect Kafka to Couchbase
Experiment with CDC features of the Couchbase connector
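The reverse direction, Kafka to Couchbase, uses the same connector plugin in sink mode. Again, this is a hedged sketch rather than the webinar's exact configuration: property names assume the 4.x Couchbase connector, and the topic, bucket, and credentials are illustrative placeholders.

```json
{
  "name": "couchbase-sink",
  "config": {
    "connector.class": "com.couchbase.connect.kafka.CouchbaseSinkConnector",
    "tasks.max": "2",
    "topics": "orders",
    "couchbase.seed.nodes": "127.0.0.1",
    "couchbase.bucket": "orders-bucket",
    "couchbase.username": "admin",
    "couchbase.password": "change-me"
  }
}
```

With both connectors running, a message produced to the `orders` topic lands as a document in Couchbase, while document changes in the source bucket flow back into Kafka, giving the two-way ETL described above.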