Kafka Tutorial - Schema Evolution Part 2

  • Published: 17 Oct 2024

Comments • 30

  • @ScholarNest
    @ScholarNest  3 years ago

    Want to learn more Big Data technology courses? You can get lifetime access to our courses on the Udemy platform. Visit the link below for discounts and a coupon code.
    www.learningjournal.guru/courses/

  • @NIKUNJ2578
    @NIKUNJ2578 5 years ago +2

    As Albert Einstein once said, "If you cannot explain something in simple terms, you have not understood it." This tutorial explains Kafka concepts in such simple terms that anybody can understand them. Thank you very much for uploading the video! Subscribed and shared with friends!

  • @DeepakPatel-xl3tr
    @DeepakPatel-xl3tr 1 year ago +1

    great

  • @mohamedmuhad4827
    @mohamedmuhad4827 1 year ago

    How did the consumer work in this example, given that it did not have the latest version of the ClickRecord class?
    The consumer cannot set the newly added properties on the older version of the ClickRecord object, which was generated from the old schema.
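(Avro's schema resolution is what makes this scenario work: the deserializer decodes the message with the writer's schema, fetched by the embedded schema ID, and then projects it onto the consumer's older reader schema, silently dropping fields the old ClickRecord class does not know about. Going the other way, newly added fields must carry defaults. A hedged sketch of what such an evolved schema could look like; the field names are illustrative, not necessarily the tutorial's exact ones:)

```json
{
  "type": "record",
  "name": "ClickRecord",
  "fields": [
    {"name": "session_id", "type": "string"},
    {"name": "browser", "type": ["null", "string"], "default": null},
    {"name": "campaign", "type": ["null", "string"], "default": null}
  ]
}
```

(A consumer built from a version without `campaign` still deserializes newer messages; it simply never sees that field. A consumer built from this version can read older messages because `campaign` defaults to null.)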

  • @drakezen
    @drakezen 6 years ago

    Since you mention Confluent, it would be nice to have some sessions on what Confluent adds compared to Apache Kafka, and also on the usage of Confluent Control Center and how it might be useful for managing Confluent Kafka services.

  • @SeekingHorizon
    @SeekingHorizon 4 years ago

    Do we need Confluent Kafka for the schema registry? How do we achieve it with plain Apache Kafka?

  • @SunilKumar-uw2pf
    @SunilKumar-uw2pf 4 years ago

    Hi Sir,
    I am able to produce multi-schema messages (e.g. Product and Customer) through AvroProducer using multiple producers, but I cannot find any API that helps us consume multi-schema (Product, Customer) messages from the same topic.

  • @harishmi32007
    @harishmi32007 5 years ago

    Thank you! It's very easy to understand what you teach.

  • @foruvasanth
    @foruvasanth 5 years ago

    Do we need to create a new consumer and producer every time the schema changes?

  • @jdang67
    @jdang67 5 years ago

    In my case, the data to be added to the Kafka topic is a complex object with inheritance. Without versioning, a JSON serializer should be enough. What is the best approach for dealing with existing Java classes that use inheritance?

    • @ScholarNest
      @ScholarNest  5 years ago

      Thanks for asking this question. JSON and Avro are the most commonly used approaches. However, Avro does not support inheritance. I checked with JSON, and it works.

  • @karthikkumar12
    @karthikkumar12 7 years ago

    I started off with one video; it was so informative that I ended up going through all the videos (subscribed, of course). Great way to present things. I have a question:
    Is it possible for a string/JSON message in Kafka to be deserialized to Avro by the consumer?
    string or json (non-avro) writing apps -----> serialize ---> bytes ==> KAFKA ==> bytes ---> deserializer --> avro ?
    I have a suggestion as well: can you make a video tutorial on the schema registry and/or Avro data?

    • @ScholarNest
      @ScholarNest  7 years ago

      I already have a video on schema registry and Avro data.

    • @karthikkumar12
      @karthikkumar12 7 years ago

      Thanks for your response. I will search for it.
      By the way, do you think this is possible:
      string or json (non-avro) writing apps -----> serialize ---> bytes ==> KAFKA ==> bytes ---> deserializer --> avro ?

    • @ScholarNest
      @ScholarNest  7 years ago

      Yes, it is possible. However, what are you going to do with the Avro object in the end? I guess you want to store it in Hadoop or some other place. If that's what you are aiming for, Kafka Connect is the most suitable solution. I can see an analogy with a use case where Kafka Connect pulls data from an RDBMS (non-Avro) and sinks it into HDFS (as Avro files).
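(The JSON-to-Avro bridge discussed above boils down to two steps on the consumer side: decode the raw bytes as JSON, then map the resulting fields onto the target record before an Avro library re-serializes it. A toy, stdlib-only Python sketch of that mapping step; the field names and defaults are made up for illustration:)

```python
import json

# Toy target "schema": field name -> default used when the JSON lacks it.
CLICK_FIELDS = {"session_id": None, "browser": "unknown", "channel": "website"}

def json_bytes_to_record(raw: bytes) -> dict:
    """Decode JSON bytes (as they would arrive from Kafka) and project
    them onto the target field set, filling defaults for missing fields."""
    data = json.loads(raw.decode("utf-8"))
    return {name: data.get(name, default) for name, default in CLICK_FIELDS.items()}

msg = b'{"session_id": "s-101", "browser": "Firefox"}'
print(json_bytes_to_record(msg))
# {'session_id': 's-101', 'browser': 'Firefox', 'channel': 'website'}
```

(In practice this projection is exactly the kind of conversion a Kafka Connect converter performs, which is why Connect is the more suitable tool for the RDBMS-to-HDFS case mentioned here.)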

  • @MyTtest
    @MyTtest 4 years ago

    Thank you!
    I watched this video and part one on the same topic, but I am stuck since I need to implement a .NET solution (in C#) for Kafka/Avro... It looks like Confluent does not have Kafka for Windows (unless I use containers or run it in some Linux mode, etc.). It would be great if you had some videos on .NET/C# on the subject (it is really poorly documented...)

  • @rajeshbhupal6679
    @rajeshbhupal6679 7 years ago

    Your explanations are outstanding!!! Thank you so much.
    Sir, while answering one of the subscribers' questions, you mentioned a use case where "Kafka Connect pulls data from an RDBMS (non-Avro) and sinks it into HDFS (as an Avro file)". Is it possible for you to create a video for this use case showing the steps from producer (RDBMS) --> Kafka broker --> consumer (Hadoop HDFS)?

    • @ScholarNest
      @ScholarNest  7 years ago +2

      I will create a Kafka Connect tutorial shortly.

    • @rajeshbhupal6679
      @rajeshbhupal6679 7 years ago

      Thanks!!! May we all know your name? :-)

    • @ScholarNest
      @ScholarNest  7 years ago

      :-)

    • @kristijankontus4846
      @kristijankontus4846 5 years ago

      @@ScholarNest Is there any chance you could do a tutorial on RabbitMQ in the future? Great job here, much appreciated.

  • @杨正云
    @杨正云 7 years ago

    As an earlier topic mentioned, one partition is consumed by just one consumer. In this topic you started two consumers, one for the old schema and one for the new schema, so they definitely consume different partitions. Does this mean I need to add new partitions for new schema changes? New consumers cannot start working if there are no partitions available for them.

    • @杨正云
      @杨正云 7 years ago

      I realize that it is not always true that the number of consumers equals the number of partitions... So if the number of consumers is less than the number of partitions, I think adding new consumers is totally fine.

  • @pheiroz6307
    @pheiroz6307 7 years ago

    Hi, you mentioned that the schema ID is embedded in the message and the consumer/deserializer uses it to fetch the appropriate schema from the registry.
    Q: Can you please tell us how/when the two schemas are registered in the schema registry?

    • @ScholarNest
      @ScholarNest  7 years ago +2

      With the method explained in this example, we don't have to register the schema manually; the serializer takes care of it. The serializer is responsible for registering the new schema and embedding the schema ID in the message.
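(Concretely, the Confluent serializer writes a small frame in front of the Avro payload: one magic byte, 0x00, followed by the 4-byte big-endian schema ID it received from the registry. A minimal stdlib-only Python sketch that picks such a frame apart; the payload bytes below are made up:)

```python
import struct

def parse_confluent_frame(payload: bytes):
    """Split a Confluent-framed Kafka message into (schema_id, avro_bytes).

    Wire format: 1 magic byte (0x00), a 4-byte big-endian schema ID,
    then the Avro-encoded body.
    """
    if payload[0] != 0:
        raise ValueError("not a Confluent-framed message")
    schema_id = struct.unpack(">I", payload[1:5])[0]
    return schema_id, payload[5:]

# Example: a frame carrying schema ID 42 and a dummy Avro body.
frame = b"\x00" + struct.pack(">I", 42) + b"\x02hi"
sid, body = parse_confluent_frame(frame)
print(sid, body)  # 42 b'\x02hi'
```

(On the consumer side, the deserializer reads this ID, fetches the matching schema from the registry, and only then decodes the Avro body, which is how two schema versions can coexist on one topic.)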

  • @matrixlnmi169
    @matrixlnmi169 6 years ago

    Sir, can we have a video on Cassandra?

  • @markcberman
    @markcberman 6 years ago

    Can someone share the GitHub link that is referred to in this video?

    • @ScholarNest
      @ScholarNest  6 years ago

      github.com/LearningJournal/ApacheKafkaTutorials