Just getting started with Kafka, but this video makes me realise how useful it’s going to be. Great video, thank you
Thanks!
This is great, thank you. I have a question regarding timestamp conversion, but I placed it on the community page.
Very useful! Thanks Robin!
Very interesting series of videos. Very helpful.
A little remark: at 38:58, it seems that the order value being inserted was way higher than the currently displayed maximum (22190899.73 vs 216233.09), and still this value was not updated.
Thanks, this is a good video. You saved me time, bro. 👍
Glad I could help!
Thank you Robin for the great video. If we edit the same CSV file again, all the records get processed again, I think. Can we change this behaviour? I have seen this with the S3 source connector.
Brilliant
Great video!! Will this work for processing a CSV with 1,000,000 records? Would it take less than an hour to save it into an Oracle database?
Thank you so much for this guide! But I've got an issue and I don't know if it's normal. Personally, when I cp a .csv file (same name) back into my unprocessed directory, the file is processed again and the offset goes from 500 to 501, 502, etc. Is this normal? Also, when a file is processed it creates an "orders.csv" subdirectory in the processed directory. Is this due to some update?
Thanks
Just Awesome !!
Thanks, glad you liked it :)
Thanks!
If I run docker-compose up -d, it hangs every time while downloading the Kafka Connect JDBC plugin from Confluent Hub.
Hi, the best place to get help is at www.confluent.io/en-gb/community/ask-the-community/ :)
Hi, is there any connector for a CSV sink? Thanks!
U d champion !!
Great video Robin. QQ - is there a way to publish an event once a file has been completely ingested? Does ksqlDB provide any hooks that might help? I need to ingest a customer file and then send subsets of that file to a number of vendors. I figure I'll have a consumer for each vendor.
I've answered your question over here: forum.confluent.io/t/submit-message-when-csv-has-been-ingested/1658/2
very awesome
I'm a beginner and it's helpful.
Thanks for the comment :) Glad I could help!
Does it support file changes? When the file changes, I want to re-read the file!
The spooldir connector moves files to a new folder once ingested. I don't know if the functionality you describe is available in other connectors. Check out www.confluent.io/hub/streamthoughts/kafka-connect-file-pulse and
www.confluent.io/hub/mmolimar/kafka-connect-fs perhaps.
For any more questions, head to forum.confluent.io/ :)
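For reference, here is a minimal sketch of a SpoolDir CSV source connector config showing that directory handling; the connector class and topic name are the ones from the video, but the connector name and paths below are placeholders for illustration:

# create/update the connector via the Kafka Connect REST API (worker assumed on localhost:8083)
# files matching input.file.pattern are read from input.path, moved to finished.path once
# ingested, or to error.path if parsing fails
curl -X PUT http://localhost:8083/connectors/source-csv-spooldir-00/config \
     -H "Content-Type: application/json" -d '{
       "connector.class": "com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector",
       "topic": "orders_spooldir_00",
       "input.path": "/data/unprocessed",
       "finished.path": "/data/processed",
       "error.path": "/data/error",
       "input.file.pattern": ".*\\.csv",
       "csv.first.row.as.header": "true",
       "schema.generation.enabled": "true"
     }'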
Hi, how can I move data from a SQL Server database running on Windows Server into Kafka?
See rmoff.dev/no-more-silos. Suitable connectors include the JDBC Source connector and Debezium.
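As a rough sketch (not from the video), a JDBC Source connector config for SQL Server could look something like this; the host, database, credentials, table and key column are placeholders, and the SQL Server JDBC driver needs to be available on the Connect worker:

# polls the table and writes new rows (found via the incrementing key) to the topic mssql-orders
curl -X PUT http://localhost:8083/connectors/source-jdbc-mssql-00/config \
     -H "Content-Type: application/json" -d '{
       "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
       "connection.url": "jdbc:sqlserver://mssql-host:1433;databaseName=sales",
       "connection.user": "kafka_connect",
       "connection.password": "********",
       "table.whitelist": "orders",
       "mode": "incrementing",
       "incrementing.column.name": "order_id",
       "topic.prefix": "mssql-"
     }'

Debezium's SQL Server connector instead streams changes from the transaction log (CDC), which is usually the better option if you also need updates and deletes.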
@@rmoff thanks
Hi Robin, great video! I have one requirement: can we send only the file name, not the file content, whenever a new file is created in the directory?
I don't know if there is a connector that does this. You could ask at forum.confluent.io/.
For questions about the connector and Apache Kafka in general please head to confluent.io/community/ask-the-community/
Nice video, do you know if this connector can get the filename?
@drhouse1980 He shows how to get the headers via kafkacat, but the question I have is how to then turn this into a topic that other consumers can subscribe to. For example, after a customer sends a request, you then want to ship out requests to vendors and marry the data up later.
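For what it's worth, a hedged example of inspecting the headers with kcat, assuming the broker is on localhost:9092 and that the connector has written file metadata (e.g. the file name) into the record headers:

# -C consume, -f custom output format: %h prints the record headers, %s the value
kcat -b localhost:9092 -t orders_spooldir_00 -C -f 'Headers: %h\nValue: %s\n'

The headers alone won't fan the data out to other topics, though; for per-vendor topics you would still need something like ksqlDB or a small consumer/producer of your own to route the records.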
Hi @rmoff,
May I know how to get the same CSV file data from an SFTP location which uses key-based authentication?
I can't work out how to specify the key values.
I am sending the request to start the connector from Postman.
HTTP/1.1 405 Method Not Allowed
X-Confluent-Control-Center-Version: 6.2.1
X-Confluent-Control-Session: 96af01d2-6b69-45ea-937c-1f42c8aa7f78
Strict-Transport-Security: max-age=31536000
Content-Length: 0
This is the error I get at 5:20. Any idea how to fix it?
Hi, please post this at forum.confluent.io/ :)
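Not a definitive diagnosis, but the X-Confluent-Control-Center-Version header suggests the request went to Control Center rather than to the Kafka Connect REST API, which would explain the 405. A sketch of submitting the connector with curl instead, assuming the Connect worker is on localhost:8083 as in a typical Docker setup:

# PUT the connector config against the Connect worker (not Control Center on 9021)
curl -X PUT http://localhost:8083/connectors/source-csv-spooldir-00/config \
     -H "Content-Type: application/json" \
     -d @spooldir-config.json   # a file containing the JSON connector config

# then check that it is running
curl -s http://localhost:8083/connectors/source-csv-spooldir-00/status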
This is a really great video. I want to load a huge CSV file into a database every day. Can I use the Kafka CSV connector?
If you just have a CSV file and a database, I don't think adding Kafka in just to do the load would make any sense - there are plenty of database tools to load the CSV file directly.
If you already have the data in a Kafka topic and want to load it into a database then you can use the JDBC Sink connector.
For more questions head over to forum.confluent.io.
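As a hedged sketch of that second case, a JDBC Sink connector config that writes a topic into a database table; the connection URL, credentials, topic and key field below are placeholders:

# requires a JDBC driver for the target database on the Connect worker
curl -X PUT http://localhost:8083/connectors/sink-jdbc-orders-00/config \
     -H "Content-Type: application/json" -d '{
       "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
       "connection.url": "jdbc:postgresql://postgres:5432/demo",
       "connection.user": "kafka_connect",
       "connection.password": "********",
       "topics": "orders_spooldir_00",
       "auto.create": "true",
       "insert.mode": "upsert",
       "pk.mode": "record_value",
       "pk.fields": "order_id"
     }'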
How do I solve this error: "% ERROR: Failed to query metadata for topic orders_spooldir_00: Local: Broker transport failure"?
Hi, please post this at forum.confluent.io/ :)
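Not a definitive answer, but "Local: Broker transport failure" generally means the client can't reach the broker at the address it was given. A quick check, assuming the broker is advertised on localhost:9092:

# list cluster metadata; if this also fails, the -b address / advertised listeners are the likely problem
kcat -b localhost:9092 -L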
Great video and amazing content! Could you please share a repo/link with the code used in this video? :)
Thanks, glad you liked it! The code is here: github.com/confluentinc/demo-scene/tree/master/csv-to-kafka
Realized this is an "old-ish" video... you don't show at any point how you started your kafkacat container. Also, of course, kafkacat has now been replaced/renamed to kcat.