Hi Robin, thank you. Your explanation helped me a lot.
Glad to have helped :)
Very clear and thorough, as always. Thanks!
Great tutorial and very helpful! Thanks for this
Great and clear explanation video.
👏👏👏👏👏👏
Glad you liked it
I don't think this is needed any longer.
Running "confluent-hub install confluentinc/kafka-connect-jdbc:latest" installs both the driver and the connector JAR files automatically.
That's a great point - the 10.0 release of the JDBC Connector now includes the Oracle and MS SQL JDBC drivers. The MySQL one still isn't included, for licensing reasons.
Thanks Robin for all your wonderful videos.
I have a question about the JDBC source connector. I have created the table below:
CREATE TABLE Emp (
ID INT NOT NULL,
Firstname VARCHAR (20),
Lastname VARCHAR (20),
AGE INT NOT NULL,
MonthlySalary Int,
PRIMARY KEY (ID)
);
but in the Query parameter I am using: select cast(CONCAT(Firstname, ' ', Lastname) as VARCHAR(20)) as Name, cast((MonthlySalary * 12) as INT) as AnnualSalary from Emp
Now the connector starts properly, but when I insert data into the table it is not reflected in the topic. Essentially I am manipulating the query to derive different columns and want to sink the result into a target DB. Can I define a schema for this? Is there a way?
Check if your connector is running properly - see ruclips.net/video/1EenWEm-5dg/видео.html
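For what it's worth, when you use the `query` option you don't have to run in `bulk` mode - you can still do incremental loads, as long as the incrementing column is returned by your query (the connector wraps the query and appends a WHERE clause on that column). A sketch of such a config, assuming a connector name of `jdbc-emp-source` and MySQL connection details that are purely illustrative:

```json
{
  "name": "jdbc-emp-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://mysql:3306/demo",
    "connection.user": "connect_user",
    "connection.password": "********",
    "mode": "incrementing",
    "incrementing.column.name": "ID",
    "query": "SELECT ID, CONCAT(Firstname, ' ', Lastname) AS Name, (MonthlySalary * 12) AS AnnualSalary FROM Emp",
    "topic.prefix": "emp-annual-salary"
  }
}
```

Note that `ID` is included in the SELECT list - if the incrementing column isn't in the query's projection, the appended WHERE clause has nothing to filter on and new rows won't be picked up.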
Hi Robin, thanks for your clear and organised content. I'm trying to stream a table containing geospatial data from PostgreSQL (PostGIS enabled) to a Kafka topic. I have installed postgis-jdbc-2.5.0.jar through the Docker runtime. My JDBC source connector is up and running, but it does not write the geometry column to the Kafka topic. Is there anything wrong with my configuration? Thanks.
Hi Evan, I'm not sure how the JDBC Source connector handles geo data. Maybe head over to forum.confluent.io/ and try asking there, and give some details of the source data and what you're seeing in the Kafka topic.
Hi,
This video is great for configuring Kafka Connect, thanks for sharing it with us :)
Just one question: every time I'm getting a full copy of the table after the poll interval. Can I get only the changed data? Is there any configuration for that?
Yes, you set this in the "mode" configuration - if it's "mode":"bulk" you'll see the behaviour that you've observed.
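As a sketch, an incremental configuration might look like the fragment below - assuming your table has an auto-incrementing `id` column and an `updated_at` timestamp column (both column names here are just examples, substitute your own):

```json
{
  "mode": "timestamp+incrementing",
  "incrementing.column.name": "id",
  "timestamp.column.name": "updated_at"
}
```

With this, the connector only fetches rows whose incrementing ID or timestamp has advanced since the last poll, instead of re-reading the whole table.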
Hello Robin, how are you getting the kafkacat command? Are you running it on the Docker host or in a container? I tried with your GitHub code but I am not getting the Confluent web UI on port 9021. With the code in the GitHub demo-scene repo, can I bring up all the services and write data from the DB to Kafka?
kafkacat is included in the docker-compose (github.com/confluentinc/demo-scene/blob/master/oracle-and-kafka/docker-compose.yml), but Confluent Control Center isn't so you'd need to add that container.
If you've got any more questions please head to forum.confluent.io/ and I will try and answer there :)
Where can I get this shirt? :D
Hi Robin, how can I add the DB2 driver to the kafka-connect Docker container? It's so confusing, please explain and help me. Thanks.
Hi Hamed, if this video didn't explain it for you then head over to forum.confluent.io/ and ask there
Hey thanks for the great vid! How can I restart Kafka Connect if I don't use confluent hub cli? It is in a K8S Pod and I seem to be unable to restart it...
To restart Kafka Connect you restart the JVM process. If you're using Docker then you'd bounce the container. I'm not familiar with k8s but I assume it's roughly the same.
If you need more help then head to #containers channel on cnfl.io/slack
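For example (container and deployment names here are just placeholders - adjust to your own setup):

```shell
# If Connect runs as a Docker container, bounce the container:
docker restart kafka-connect

# On Kubernetes, restarting the Deployment recreates the pod
# (assumes a Deployment named kafka-connect):
kubectl rollout restart deployment/kafka-connect
```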
@@rmoff Thank u
Hello can you please do a tutorial, where you can sink kafka topic to mongodb
I'll add it to my todo list :)
For some reason, when querying :8083/connector-plugins the MySQL plugin does not show up, even though the confluent JDBC one does. Both the jar files are in the same volume for the docker container.
Anyone else run into this issue? I am not sure what more I can do...
Here's an example of using the MySQL connector from Docker: github.com/confluentinc/demo-scene/blob/master/kafka-connect-zero-to-hero/docker-compose.yml
Note that you need to add the path to which you're mounting your connector JARs in the Kafka Connect container to the CONNECT_PLUGIN_PATH variable
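As an illustration, the relevant bits of a Docker Compose service definition might look like this (the image tag and paths are examples, not a definitive layout):

```yaml
kafka-connect:
  image: confluentinc/cp-kafka-connect:6.1.0
  environment:
    # Include the mounted folder in the plugin path, alongside the defaults
    CONNECT_PLUGIN_PATH: "/usr/share/java,/usr/share/confluent-hub-components,/data/connect-jars"
  volumes:
    # Host folder containing the driver/connector JARs
    - ./connect-jars:/data/connect-jars
```

If the mounted folder isn't listed in `CONNECT_PLUGIN_PATH`, the worker never scans it, which is why the plugin doesn't appear under /connector-plugins.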
Is /etc/confluent/docker/run mandatory? Shouldn't it start automatically without that command?
It will automatically start, unless you override the command/entrypoint, which is what is shown here.
For more questions/detail, head over to forum.confluent.io/ :)
Hi Robin, I have a question regarding JDBC source connector offset reset. My requirement is that whenever my connector restarts, it should load the entire table again. What configuration property should I add to the connector configuration to achieve this? I searched a lot on the internet and most answers say to rename the existing connector, but I am not satisfied with that approach.
I need to reset the source connector offset to 0 without renaming the connector. How can we do this? Can you help me?
Hi, please post this at forum.confluent.io/ :)
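In case it helps: one known approach is to write a tombstone (NULL value) for the connector's key into the Connect offsets topic while the connector is stopped. The topic name and the key below are illustrative - inspect your own offsets topic first to find the exact key your connector uses:

```shell
# Inspect the offsets topic to find your connector's key
kafkacat -b localhost:9092 -t connect-offsets -C -K'#' \
  -f 'Key: %k\nValue: %s\n'

# Produce a NULL (tombstone) value for that key to clear the stored offset
# (-Z sends an empty value as NULL; the key shown is just an example)
echo '["jdbc-source-emp",{"protocol":"1","table":"Emp"}]#' | \
  kafkacat -b localhost:9092 -t connect-offsets -P -Z -K'#'
```

After restarting the connector it should have no stored offset and re-read the table from the start.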
@@rmoff thanks for the reply. I will post this question
Hey, great tutorial, but I got the following error:
java.lang.NoClassDefFoundError: Could not initialize class oracle.jdbc.OracleDriver
Please help.
Hi, it's a bit difficult to debug this kind of thing here - head over to confluent.io/community/ask-the-community/ instead :)
Is it possible to use Kafka Connect on Docker connected to an external Oracle database, outside the container network? How can I achieve that?
So long as the Docker container can access the external Oracle database this will work. You should just be able to reference your Oracle database with its address/IP. See docs.docker.com/network/ for information about Docker networking.
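As an illustration, the connector just needs a reachable JDBC URL - e.g. the fragment below, where the hostname, port, and service name are placeholders (on Docker Desktop, `host.docker.internal` resolves to the host machine):

```json
{
  "connection.url": "jdbc:oracle:thin:@//host.docker.internal:1521/ORCLPDB1",
  "connection.user": "connect_user",
  "connection.password": "********"
}
```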
Hi Jorge, I have connected to an Oracle database on my local system and am trying to do CDC there. It gives me the full table data once, but not subsequent changes. If you know anything, please help me.
@robin Where can I find the Scala/Java code for this?
I'm not sure which code you're referring to. The Kafka Connect code is part of Apache Kafka, the source for which is on GitHub.
@@rmoff thanks for your reply... source and sink connectors from Confluent Kafka are only configuration based. What if I want to implement the same using Scala/Java code? I am unable to find any code on the internet for MySQL as source and a Kafka topic as target.
@@abhinavkumar-se2fd Why would you want to write it yourself in scala/java? As much fun as it may be, Kafka Connect is built for a reason and that is to do exactly this kind of workload :)
@@rmoff actually I do not want manual intervention. How can I generate connector configuration files dynamically, then?
@@abhinavkumar-se2fd You configure Kafka Connect using REST calls, so from your application that generates the configuration dynamically you would submit this to the Kafka Connect end point.
For more help and discussion, head over to cnfl.io/slack and the #connect channel :)
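As a sketch, creating or updating a connector from a generated config is a single HTTP call (the connector name, host, and connection details below are just examples):

```shell
# PUT to /connectors/<name>/config creates the connector if it doesn't
# exist, or updates it if it does - handy for automation
curl -s -X PUT -H "Content-Type: application/json" \
  http://localhost:8083/connectors/my-jdbc-source/config \
  -d '{
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:mysql://mysql:3306/demo",
        "topic.prefix": "mysql-"
      }'
```

Your application can build that JSON payload however it likes and submit it - no manual intervention needed.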
The connector is created, but when I run "list topics;" in ksqlDB there is no topic.
What's the output of SHOW CONNECTORS ?
Check the Kafka Connect worker log.
If you're still stuck head to #ksqldb on cnfl.io/slack
I have the same issue - there is no topic created after creating the connector.
@@mysaramewafy4301 Is the connector running? Are there any errors in the log? Head to #connect at cnfl.io/slack for more help.
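For checking this outside of ksqlDB, the Kafka Connect REST API reports connector and task state directly (connector name and host below are examples):

```shell
# State of one connector and its tasks, including any error stack trace
curl -s http://localhost:8083/connectors/my-connector/status

# List all connectors with their status in one call
# (the ?expand=status parameter needs a reasonably recent Connect version)
curl -s "http://localhost:8083/connectors?expand=status"
```

A connector can show as RUNNING while its tasks are FAILED, so check the task-level state and trace, not just the connector state.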