Oh now I know what I'm gonna do this weekend. Thank you
00:00:00 Course outline
00:00:58 Kafka for beginners
00:14:45 Components and architecture
00:29:54 Kafka installation
00:41:27 Kafka CLI and workflows
01:22:09 Installing Kafka using docker-compose
01:37:43 Kafka producer example using Spring Boot
02:02:46 Kafka consumer example using Spring Boot
02:29:48 Kafka serialize & deserialize
02:54:19 Kafka partition
03:10:33 Kafka e2e testing in Spring Boot with Testcontainers
03:33:15 Kafka error handling
04:01:12 Kafka schema registry
Thank you so much buddy ☺️
thanks!
Awesome explanation, sir... please make a video on AWS... waiting for your next video.
AWS is already there; please check out my AWS playlist.
The sign of a good teacher is to make things simple. Very good.
Thank you so much for your efforts.
After finishing this tutorial I now have enough knowledge of Kafka and its associated tools. I am very eager for a tutorial on KTable and KStreams in Kafka.
Kudos.
@JavaTechie, as long as you are there, no issues for us to learn Java Ecosystem tools and software, many thanks.
Thanks Basant😊.. Appreciate your efforts, these are all our previous videos.. will go through one more time.. waiting for new topics 😊.. God bless You!!!!
At a high level it's very good for understanding the flow, but it would be great if you could add a note on the APIs' usage and importance. E.g.: Why are we using ConsumerFactory? Why do we have to use KafkaTemplate? And what other relevant APIs can be considered?
Much needed! Thanks for posting 👏
@javaTechie - I am someone who has hugely benefited from your videos, where you explain everything in detail. I am also a fan member of your channel. One small humble request, if possible: could you kindly show us how to do the Kafka configuration on a Windows machine? I have seen a lot of videos, but most of them are misleading.
Hello Subhra. Thanks for following Java Techie, and I am so happy to see you as a member. Could you please tell me what configuration you are expecting: is it Offset Explorer or the Kafka YAML configuration?
@@Javatechie If you could show how the Kafka and ZooKeeper configuration is done in the Windows command prompt, because the settings would be very different from Mac, right?
Hi @Javatechie. Any plans on making the video on the topic I mentioned, please? 😔
Hello buddy. No, I don't have any plans at this moment because I don't have a Windows OS with me now, but I can suggest you check out my old video, which I did using Windows.
@@Javatechie Thank you, man! Sorry for bothering you so much; could you please provide me with the link?
A million thanks to Basant sir. I always check Java Techie whenever I want to study anything. God bless.
Great Share, Hoping for more like this.
Gold Stuff, what a clear concise explanation, Thanks for your effort towards the community Basant.
Good Job Basant, thank you so much! All your videos are just amazing. Keep providing us with such amazing tutorials
Another worthy tutorial, thanks for your effort as always. There is actually a lot to learn in this video, such as Docker, the types of Kafka, etc. Thank you for your effort, Basant.
Thanks Basant.. appreciate your efforts. Waiting for more videos on KTable and KStreams of Kafka, and on Spring Cloud Stream.
Great video! Please add chapters/timestamps in case someone wants to go back and take a quick look at a particular section.
Noted will update that
Great video! Keep up the fantastic work. By the way, every time you say "why is it crying," it cracks me up! Keep those hilarious moments coming!
Thank you for your word . Keep learning 🤓
What a Course . Best Teacher
Please try to post about performance tuning in Java and SQL side
Thank you for uploading such great content. Can you please make a video on Redis with a real-time example?
Wow thanks for this JavaTechie.
Thank you Basant sir, Love and great regards from Mumbai
Love to study from you bro!❤
1:10:10 Offer letter :)
@JavaTechie great one.
Can you please add timestamps? It would be really useful; for example, we could jump straight to the consumer implementation.
Yes i will add
very useful and informative training on kafka with nice graphic illustrations
Hello @JavaTechie, thank you very much for this course. Please add Angular and React courses as well if possible. We have been waiting so long.
Thank you very much.
Thanks for making this course
Nice video, sir. Please make a video on real-time use cases of Kafka.
Dhanyawad (thank you), Basant sir.
Thank you so much for your efforts.
Very good Video. Need video on Kafka Connect and Connectors.
Hi, I am very happy to have such a wonderful teacher who makes things easy to understand. Thanks a lot.
Glad to hear that
Hi Basant Sir,
Could you please create video content for Kafka Streams, KTables, and Kafka Connect as well? That would be really helpful.
Yes man I will
🙏 Great 5 hours
Very good ! Excellent !
Hi, thank you for this tutorial.
When will you release Part 2 of this series?
You are really great bro. Thanks for the wonderful content.
great video
Thanks a ton Buddy
Thanks for uploading this video 😊
Thanks !!! More JT gold
Thank you for this fantastic tutorial @javatechie. Please can you do a video on Debezium CDC with kinesis data streams, thank you.
great Java Techie
Absolutely fantastic
Thanks for the tutorial!
Could you be so kind as to explain why, in the listener integration test, you do not consume the event within the test and assert that the object sent is equal to the one received? Thanks!
Everything is all well and good, but I noticed that in the error handling part you are using some annotations on the listener's consumeEvent method which you never discussed.
very good
Very informative tutorial, but I have a question. In the Avro lecture it is demonstrated that the producer and consumer are in the same project, and whenever we make a change in the employee.avsc file it regenerates the Employee class in the defined package. Since both producer and consumer are in the same project and use the Employee class from the same package, that will not be an issue. But if the producer and consumer are in different projects, how will the changes we make in the employee.avsc file stay in sync with the Employee class on the consumer side? Assuming I am using Avro and the producer in a single project and the consumer in another project. @Java Techie
Yes, if it's in a different project then there is nothing to worry about. We are not generating the Employee class manually, right? The consumer just needs to run an mvn build and it will create the payload class for you by reading the latest schema.
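For context on that reply: the regeneration on every build is typically wired up with the avro-maven-plugin. The fragment below is an illustrative sketch, not the exact configuration from the video; the plugin version and directory paths are assumptions you would adapt to your own project.

```xml
<!-- Illustrative avro-maven-plugin fragment; version and paths are assumptions. -->
<plugin>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-maven-plugin</artifactId>
  <version>1.11.3</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <goal>schema</goal>
      </goals>
      <configuration>
        <!-- Reads every .avsc file here and generates the Java classes -->
        <sourceDirectory>${project.basedir}/src/main/avro/</sourceDirectory>
        <outputDirectory>${project.basedir}/target/generated-sources/avro/</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With something like this in the consumer project's pom.xml, any `mvn` build regenerates the Employee class from whichever version of the schema that project currently has.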
good
Good content, but one request: please make a full-fledged video on using this in real-time microservices with huge data, and on tracking it.
💓 great course
thanks for everything.
Thank you
Thank you so much. Please will you be updating the Java AWS course any time soon?
Java AWS already available please checkout AWS playlist buddy 🙂
Hi Basant, I want to understand difference between client id and group id. Can you please explain with an example of usage
Make POC on Kafka Stream As well with source code.
Mini project
It's been great, but please do some real-world project-based Kafka content as well; this is just one example.
Partitions in a topic will not have duplicate data, meaning: if a topic has 3 partitions, then a given record will be present in only 1 of the 3 partitions...
Yes, only 1 partition will have that record; it won't be copied into the others.
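To illustrate the reply above: a keyed record is routed to exactly one partition, so the same key always lands in the same place and no partition holds a copy of another partition's records. The sketch below uses a plain hashCode() modulo to stay self-contained; Kafka's real default partitioner uses murmur2 hashing, so treat this as an illustration of the idea, not Kafka's actual algorithm.

```java
import java.util.HashMap;
import java.util.Map;

public class PartitionSketch {

    // A record with a non-null key maps to one deterministic partition.
    // Modulo first, then abs, so Integer.MIN_VALUE hash codes are handled too.
    static int partitionFor(String key, int numPartitions) {
        return Math.abs(key.hashCode() % numPartitions);
    }

    public static void main(String[] args) {
        int partitions = 3;
        Map<Integer, Integer> counts = new HashMap<>();
        for (String key : new String[] {"order-1", "order-2", "order-1"}) {
            int p = partitionFor(key, partitions);
            counts.merge(p, 1, Integer::sum);
            System.out.println(key + " -> partition " + p);
        }
        // "order-1" hits the same partition both times; the record is never
        // duplicated across the other partitions.
    }
}
```

The takeaway matches the comment thread: each record lives in exactly one partition, and keyed records are sticky to their partition.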
Hi Basant, while creating a topic via the binary download you used --bootstrap-server localhost:9092, but with Docker you used zookeeper:2181. Can you please tell me the reason for the difference?
Thank you, Basant. 🎉🎉🎉🎉
Just awesome 😎 thanks
What is the role of the replication factor?
Thanks a lot!.
Thank you
#JavaTechie, I noticed producer config values being printed in my IDE logs every time we threw an exception for an invalid IP address from the consumer.
I guess that is because we are actually 'producing' to the DLT topic from the consumer?
Please advise.
It would have been so great if there were timestamps. Other than that, it's all good
I will add it soon 🤠
Where can I see the code for the above course? The GitHub repo in the description has a different code base.
I might have missed it. Please check the link in the Kafka playlist for now. I will update it soon.
Hey @JavaTechie I really liked your video and it was awesome. It will be very good if you share the link of the pdf. Thanks
I have never seen a crash course that is 4 hours long.
Did you enjoy it or feel bored 😴?
Much needed! If possible, also MySQL or PostgreSQL DB integration with them.
I already covered this use case in the CQRS design pattern video, buddy; please check.
Thank you so much sir 🙏❤️
Can you please share the Git repo? The repo above is about springboot-apache-pulsar.
Please go to the root repo and just filter with Kafka
Hi brother!! In my Offset Explorer, the data sent by the producer is being saved as different characters. Is this due to serialization or something?
No, it's not a serialization issue. Please check at the topic level: you have to choose String as the data format; by default it will be set to byte array.
@@Javatechie ok.. Thanks brother
Properties -> Content Types -> String -> Update; then all the values will change from byte array to String 😎
Hey, hi @Javatechie... I was following along and faced an issue while doing serialize and deserialize; I was getting a serialization exception. After checking a whole lot of things I found out the package name for the Customer class has to be the same as the one defined in the consumer; even if the contents of the class are the same, it wasn't working. I couldn't understand why this was happening. Can you help me with this?
During deserialization on the consumer side, the deserializer uses the type metadata (headers) that were attached to the object during serialization. Use these consumer properties:
spring.kafka.consumer.properties.spring.json.use.type.headers=false
spring.kafka.consumer.properties.spring.json.value.default.type=yourEntityNameWithPackage
Hello Amrit, you need to specify the trusted packages on both the consumer and producer side if the package is different.
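Pulling the advice from this thread together: a consolidated sketch of the consumer-side settings, assuming a Spring Boot application.properties file. The property names follow Spring Kafka's JsonDeserializer; the package and class names are placeholders for your own project, not the ones from the video.

```properties
# Hedged sketch: consumer-side JSON deserialization settings for when the
# producer and consumer keep the DTO in different packages.
# Placeholders: com.yourapp.dto and Customer stand in for your own names.
spring.kafka.consumer.properties.spring.json.trusted.packages=com.yourapp.dto
spring.kafka.consumer.properties.spring.json.use.type.headers=false
spring.kafka.consumer.properties.spring.json.value.default.type=com.yourapp.dto.Customer
```

Ignoring the producer's type headers and pinning a default type means the consumer deserializes into its own class regardless of the producer's package name.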
I am facing errors like connection reset and 'timeout of 18000ms exceeded' in ZooKeeper.
I've tried everything and the issue is still not solved. Please guide.
Please drop an email to javatechie4u@gmail.com
Also, there's no need to use Lombok; you can use records instead.
@Java Techie, can you please help me with an issue in starting the Kafka server? I get no response when I start it.
C:\KAFKA\kafka_2.12-3.7.1>.\bin\windows\kafka-server-start.bat .\config\server.properties
C:\KAFKA\kafka_2.12-3.7.1>
Don't worry, I'll write a blog about it and post it on my Medium. I'll also announce it in a YouTube community post this weekend.
I am trying to practice the same code but am getting an error even after adding trusted packages in the Kafka consumer. The error is 'failed to resolve class name. Class not found.' Can you please help? Thanks.
Do your producer and consumer follow the proper package structure?
Yes. I even tried the second approach of adding it through a config class, but I am still getting the error 'Listener method could not be invoked with the incoming message. Cannot convert from java.lang.String to dto.Customer.' Can you please link the GitHub repo for this code? I will try to copy-paste the code in case I am missing something. Thanks.
I encountered exactly the same 'class not found' error during the consumer and producer example.
Nice tutorial. Where Can I find the source code used in the tutorial?
In video description
In a real-world Spring Boot application, how can we use Kafka? I mean, we have a lot of APIs and REST classes.
Bro, can you make a video on the Jackson API and converting JSON to a Java object and vice versa?
May I know the Part 2 link for Kafka, please?
Part 2 not released yet.
Do we need to know Java to learn from this tutorial???
Yes, Java knowledge is required.
That's the great question I have heard till now 😅
@@jhari4683 I meant to say is Python enough or do we need to know Java as well ... because in my organisation Kafka python is being used...
I hope you got it khari 😅
If you know Python, that's enough too, but you need to find out the integration steps, brother. This course is for Java integration, so I don't think it will help you. But check the first 4 videos; they will give you a complete picture of Kafka internals.
I am trying to download the Kafka file on Windows but every time I get an editable Notepad file. Please guide me on how to download it.
I think each consumer group consumes all the messages from all the partitions, rather than one CG mapping to one partition. Correct me if I'm wrong.
No buddy. Each consumer will listen to one partition if their CG is different.
On Windows, the ZooKeeper run command is not working.
I will set it up on my Windows machine and update you 👍
You can use bat file instead of sh file
Could we please have the PDF you used?
Any chance you could use voice over for your videos? No offense but the accent is too strong for non-india to understand, the captions are wrong too. Otherwise the material presented is excellent.
Why are the fonts so small? It is too difficult to watch.
Can I get that ppt
Anyone can help me how to run zookeeper in windows?
Is this resolved ?
There should be no spaces in the folder/directory name, and keep the directory name simple, without its version. E.g. "kafka" instead of "kafka_2.12-3.8.0", "ApacheKafka" instead of "Apache Kafka".
RabbitMQ also
When is Part 2 coming?
🙏🙂👍
Can you send me the complete Core Java course?
stop reading my mind please !!!
This is very boring. Too much explanation and too little interaction. It would be better if we could start the hands-on part side by side.
Hello @javaTechie: around 1h43m, while using the template to send the message to the topic, I'm getting the below error:
Cannot invoke "org.springframework.kafka.core.KafkaTemplate.send(String, Object)" because "this.template" is null
@javatechie, I get the below error in the Kafka consumer at 02:45:05:
Caused by: org.springframework.messaging.converter.MessageConversionException: failed to resolve class name. Class not found [com.example.kafka_youtube_javatechie.model.User]
at org.springframework.kafka.support.mapping.DefaultJackson2JavaTypeMapper.getClassIdType(DefaultJackson2JavaTypeMapper.java:137) ~[spring-kafka-3.1.4.jar:3.1.4]
in your case, the Customer class is in the same package in both producer and consumer projects.
But in my case, the equivalent (User.java) is in different packages in both producer & consumer.
Can you please help?
Turns out I had to use the following consumer configuration:
Map<String, Object> props = Map.of(
        ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092",
        ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class,
        ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class,
        JsonDeserializer.TRUSTED_PACKAGES, "com.example.kafka_consumer_javatechie.model,com.example.kafka_youtube_javatechie.model",
        JsonDeserializer.USE_TYPE_INFO_HEADERS, false,
        JsonDeserializer.VALUE_DEFAULT_TYPE, "com.example.kafka_consumer_javatechie.model.User");
Hi @mayurnagdev5545, I am getting the same error while following the same code. How did you manage to solve it?
@@2668rajan I am also getting this error.