Build a Scalable Realtime Chat App with Kafka and PostgreSQL

  • Published: 10 Oct 2024

Comments • 92

  • @prashlovessamosa
    @prashlovessamosa 10 months ago +12

    You come up with things that no one else makes.
    You are awesome, Piyush.

  • @CuriousAnonDev
    @CuriousAnonDev 9 months ago +9

    Hey, you are doing great work! A request: can you please continue such tutorials and also teach about scalability, microservices, chat servers with rooms, video calls, deployment on Docker/k8s, etc.?
    Like what software engineering looks like IRL. People on YT are just doing Next.js stuff and I can't understand anything...

  • @abdussamad0348
    @abdussamad0348 7 months ago

    Amazing tutorial, helping us do better engineering. Industry-level standards!!

  • @debasishdutta9073
    @debasishdutta9073 9 months ago +1

    Thanks Piyush, I was waiting for this video. I love your scalable system design videos.

  • @Aitool-r3q
    @Aitool-r3q 7 months ago

    Finally completed this project and gained a lot of knowledge. Thank you Piyush Sir ❤️👍🎉

  • @shravan2891
    @shravan2891 10 months ago +2

    Fresh, unique stuff; no one is teaching this on YouTube.
    Also, can you teach more about Turborepo in detail, like testing, linting, etc. in a Turborepo?

  • @niraz9701
    @niraz9701 10 months ago +13

    Please make a full video about RabbitMQ 🙏

  • @mdmusaddique_cse7458
    @mdmusaddique_cse7458 3 months ago

    Man! So much learning in these 2 videos. Thanks!

  • @aniruddhadas2953
    @aniruddhadas2953 9 months ago

    Just wow 🤩🤩. I have learned something new that no one teaches us. Highly appreciated work. Thank you 🙏

  • @shubhtalk7073
    @shubhtalk7073 1 month ago

    You are the best, Piyush bhaiya ❤

  • @aadarshgurug
    @aadarshgurug 9 months ago

    That is something I was planning to build and had a lot of confusion about; now everything is clear. Thank you brother.

  • @ARSHADKHAN-hc6pb
    @ARSHADKHAN-hc6pb 10 months ago +1

    Love you brother ❤❤❤,
    and my one request is: please make a series on a microservices project, showing how to build a project using this architecture.

  • @harshgautam6260
    @harshgautam6260 9 months ago

    Man, I love how professionally you do your work ❤

  • @souravkumar3553
    @souravkumar3553 5 months ago

    Really something awesome. Practically answers all the system design questions.

  • @shubhtalk7073
    @shubhtalk7073 1 month ago

    Better than paid course ❤

  • @vishalkhoje
    @vishalkhoje 9 months ago +1

    Hi, great work Piyush. Can you please create a video on how we can deploy a Turborepo project, like this scalable realtime chat app, to servers (Vercel)?

  • @PrantikNoor
    @PrantikNoor 10 months ago

    I like your teaching style. ❤

  • @akash-kumar737
    @akash-kumar737 1 month ago

    Thanks man. Learned a lot.
    Love you ❤❤❤

  • @TechSpot56
    @TechSpot56 6 months ago

    Great video, please continue this series.

  • @Support-Phalestine
    @Support-Phalestine 9 months ago +1

    Piyush, please make an introduction video on what Turborepo is; I'm confused by it.

  • @swarajgandhi
    @swarajgandhi 1 month ago

    Very helpful videos Piyush.
    Just have one doubt: how do we retrieve data when the user refreshes the page and we have to fetch the last few messages? Because we can't query the DB in that case.

  • @sagar7929
    @sagar7929 9 months ago

    Awesome course, and thank you so much.
    Make more awesome valuable content with monorepo architecture in Node.js.
    God bless you sir, and thank you once again.

  • @snehasish-bhuin
    @snehasish-bhuin 9 months ago

    Very nice learning ❤🎉 It will definitely have an impact on the community.

  • @FarazAhmad-t6g
    @FarazAhmad-t6g 3 months ago

    Bro is literally creating his own empire in backend mastery.

  • @rajukadel1007
    @rajukadel1007 9 months ago

    Great content!! Keep sharing your experience ❤

  • @krishangopal2475
    @krishangopal2475 8 months ago

    Hi Piyush. Thanks for the amazing video!!!
    Just one question: couldn't we use Kafka directly as a pub/sub instead of using Redis separately, where all the servers and the processing server (running write queries in Postgres) subscribe to the 'MESSAGES' Kafka topic?

  • @mohammedgazi786
    @mohammedgazi786 9 months ago +1

    Bring a video on MVC too, brother.

  • @wahabkazmi7486
    @wahabkazmi7486 1 month ago

    Piyush, you are sending data into the database one by one, not in bulk, right?

  • @abhidevlops
    @abhidevlops 9 months ago

    Awesome content, brother. You could further extend this project ❤

  • @skzahirulislam2820
    @skzahirulislam2820 9 months ago

    Brother, one request: please also make a video on pagination and infinite scroll; this topic is asked a lot in React.js and Node.js interviews. Also explain which one to use when. Please make it in-depth, brother.

  • @shubhasheeshkundu6040
    @shubhasheeshkundu6040 10 months ago +1

    Sir, please make a backend project using microservices.

  • @digitalTechspace
    @digitalTechspace 7 months ago +1

    Why do you use both Redis and Kafka? Can we use Kafka only?

  • @sohanbafna2282
    @sohanbafna2282 5 months ago

    I was curious to know whether Kafka could have replaced Redis here? I am not sure if Redis is required if we are using Kafka. Please let me know your thoughts.

  • @deezwhat2791
    @deezwhat2791 9 months ago

    top notch content

  • @anassaif3181
    @anassaif3181 4 months ago +2

    You earlier said that the consumer is a separate Node.js server, but you defined the consumer in the primary server itself. Why? Is that for the sake of simplicity? If so, how will we get the same Prisma instance if we had a standalone consumer server?

    • @opsingh861
      @opsingh861 3 months ago

      You can just create a new instance of Prisma, and if you are using Turborepo then you can just import it.

    • @anassaif3181
      @anassaif3181 3 months ago

      @@opsingh861 A new instance would probably lead to a new connection, I guess, and probably new migrations... Yeah, Turborepo might be a better option.

  • @ankitpradhan3327
    @ankitpradhan3327 2 months ago

    You are making a new entry in the database every time a new message is produced, and as we know databases have low throughput. So can we run the consumer, or a second consumer, at a certain time interval to store the produced data in the DB?

  • @PrashantSaini-i6j
    @PrashantSaini-i6j 9 months ago

    Keep posting such content please

  • @_034_divyanshusrivastava6
    @_034_divyanshusrivastava6 6 months ago +1

    Brother, please also do the deployment; it causes a lot of problems.

  • @aliimranadil2
    @aliimranadil2 9 months ago

    love it bhai

  • @lakshyasyal
    @lakshyasyal 7 months ago

    This is a beneficial video, but I didn't like the ending. With the try/catch, if the DB crashes or something goes wrong with it, we pause for 1 minute and then restart from the beginning. Can we do something else that would always be better for the DB?
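
A commonly used alternative to the fixed one-minute pause is to retry the failing insert with exponential backoff and only pause the Kafka consumer if all retries fail, so a transient DB hiccup doesn't stall consumption. A minimal sketch in TypeScript (the `withRetry` helper and its defaults are invented for illustration, not taken from the video):

```typescript
// Retry an async operation (e.g. a DB insert) with exponential backoff.
// Invented helper for illustration; not the video's code.
async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 100
): Promise<T> {
  let lastErr: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      // Wait 100ms, 200ms, 400ms, ... before the next attempt.
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastErr;
}
```

In the consumer you would wrap the insert, e.g. `await withRetry(() => prisma.message.create({ data }))` (assumed Prisma usage), and fall back to pausing the consumer only when this still throws.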

  • @akhiltej7q8
    @akhiltej7q8 9 months ago

    Please make more videos as a continuation of this!!

  • @jbrocksfellas4147
    @jbrocksfellas4147 10 months ago

    awesome value

  • @Chanchal_Sen
    @Chanchal_Sen 2 months ago

    You got a new subscriber.

  • @work7066
    @work7066 10 months ago +2

    Why are we using Redis along with Kafka? Can't we simply use Kafka's pub/sub for the two servers to communicate instead of Redis? Can someone please explain the advantages or tradeoffs of doing so?

    • @AryanRaj-td4vs
      @AryanRaj-td4vs 8 months ago

      I'll give an example based on Google Pub/Sub or a storage queue, which I use in place of Kafka. The reason is: if one consumer takes a message and ACKs it, that message is gone from the topic, and other instances with the same subscriber won't get it. Of course we can create a separate subscriber for each instance, but that is a manual process, unlike Redis pub/sub.

  • @shi-nee7966
    @shi-nee7966 9 months ago

    Where is the video Piyush sir mentioned, i.e. the first part of this video?

  • @riyanshbiswas
    @riyanshbiswas 9 months ago

    Just one question: why are you using cloud services for Postgres and Kafka? Isn't using a Docker container locally free and less time-consuming as well?

  • @_PiyushKumar-vv7ui
    @_PiyushKumar-vv7ui 9 months ago

    amazing video

  • @harsh_dagar
    @harsh_dagar 6 months ago

    Hey @piyushgargdev,
    Does Kafka's consumer (on each consume interval) receive data incrementally, or is all the data delivered? If it's everything, how can we handle only the incremental data and not the whole set?

  • @saksham_1612
    @saksham_1612 10 months ago

    Awesome video ❤🎉

  • @physicsakhada592
    @physicsakhada592 7 months ago

    If it is a group chat, then we have to make a different Postgres database to store all the data, like the room ID and all the users of that group.

  • @krishnanand_yadav
    @krishnanand_yadav 1 month ago

    When the messages are consumed, they should be deleted from the topics in Kafka, right? But they are still present there. Is it supposed to be like this?

  • @Dark-nt8hh
    @Dark-nt8hh 1 month ago

    One thing I didn't understand is why you ran the consumer function in the init function of the index.ts file. I couldn't understand the logic behind it. Everything else is top-notch.

  • @Sandbox-coder
    @Sandbox-coder 10 months ago

    One more like... hey mate, I have a query: how does Kafka understand which consumer it needs to send the reply to? I did watch your Kafka video but I'm struggling to understand this... and how can I build the same for a mobile app?

  • @effectsofmp3927
    @effectsofmp3927 10 months ago +1

    First comment 😁

  • @AdityaGhode-l2z
    @AdityaGhode-l2z 3 months ago

    I have one doubt, can anyone please help: where is the complete frontend and deployment part of this project?

  • @arafatislam4648
    @arafatislam4648 9 months ago

    Can you show the deployment process?

  • @mystic_monk55
    @mystic_monk55 10 months ago

    👏👏

  • @nitinjain4519
    @nitinjain4519 9 months ago

    Can I run two databases with Prisma in the same project, like PostgreSQL and MySQL?

  • @sachinsingh2104
    @sachinsingh2104 9 months ago

    Can you help us learn how to deploy monorepo applications?

  • @Ankit-01-01
    @Ankit-01-01 10 months ago

    Make a video on PostgreSQL.

  • @growwithriu
    @growwithriu 9 months ago

    Hi Piyush and everyone, I have a doubt. When there are multiple servers, each of them will be consuming messages from Kafka and writing to Postgres, thereby creating as many message entries in the DB per message as there are servers. Is that desired?
    Spin up one more server on a different port. Send a message. There will be two entries for this message in the DB.

    • @shantanubhise9288
      @shantanubhise9288 9 months ago +1

      Yes, you are correct. If the logic for consuming messages and writing to PostgreSQL is placed directly in the message-receiving event, it can lead to duplicated message entries. In my implementation, I've used Redis for inter-server communication; I have not used Kafka yet. When a message is sent (triggered by the "send" event), I publish it to the "MESSAGES" channel in Redis, and on the "receive" event I broadcast the message to all connected clients.
      Regarding the storage of messages in PostgreSQL, I've introduced a global array named "messageBatch". When a message is sent ("send" event), I push the message into this array. The important aspect is the use of setInterval to periodically process this array (take a copy of messageBatch and empty it to make room for new messages), writing its contents to PostgreSQL. The data is stored successfully.
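
The buffer-and-flush scheme described in this reply can be sketched as follows. This is a simplified model: the `ChatMessage` shape is invented, and the `sink` callback stands in for the actual PostgreSQL write (e.g. a Prisma `createMany`):

```typescript
// Invented shape for illustration; the real app's message schema may differ.
type ChatMessage = { roomId: string; senderId: string; text: string };

class MessageBatcher {
  private buffer: ChatMessage[] = [];

  // `sink` receives each drained batch, e.g. a bulk DB insert.
  constructor(private sink: (batch: ChatMessage[]) => Promise<void>) {}

  add(msg: ChatMessage): void {
    this.buffer.push(msg);
  }

  // Swap the buffer for an empty array BEFORE awaiting the sink, so
  // messages arriving during the DB write land in the next batch.
  async flush(): Promise<void> {
    if (this.buffer.length === 0) return;
    const batch = this.buffer;
    this.buffer = [];
    await this.sink(batch);
  }

  // Periodic flushing, as described with setInterval above.
  start(intervalMs = 5000): ReturnType<typeof setInterval> {
    return setInterval(() => void this.flush(), intervalMs);
  }
}
```

`batcher.start(5000)` wires the periodic flush; swapping the buffer before awaiting the sink is what keeps messages that arrive mid-write from being lost.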

    • @anagnaikgaunekar9081
      @anagnaikgaunekar9081 1 month ago

      This is a valid issue. It won't occur if we put the produceMessageForKafka logic after publishing to Redis instead of putting it in the subscribe handler.

  • @indentCoding
    @indentCoding 9 months ago

    A doubt: you are able to receive messages in Kafka at high velocity because Kafka is built for that, but when you insert into the DB for eachMessage, how does that help? At high message velocity, the eachMessage function will run an insert query for every message, so if you receive 100,000 messages in a 1-2 second interval, your DB will take 100,000 insert operations, which will bring it down. And if that happens, what is the benefit of using Kafka? I understand there will be no downtime because Kafka will still be active, but there should be something that reduces the insert operations on the DB.

    • @xcoder420
      @xcoder420 8 months ago

      Actually, the consumer should be an altogether separate microservice. It would consume the messages in batches and do batch insertion. Let's say we configured the DB to support 10k writes per second; then we'd consume 10k messages at a time and insert them into the DB in one go. This is known as async processing.
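
That batch-consuming service could be sketched like this. Only the decoding helper is runnable here; the trailing comment shows assumed kafkajs/Prisma wiring, and the row shape is invented:

```typescript
// Minimal shape of a Kafka message as delivered by a client such as
// kafkajs (the value is a Buffer, or null for tombstones).
interface KafkaMessage {
  value: Buffer | null;
}

// Invented row shape for illustration.
type MessageRow = { roomId: string; senderId: string; text: string };

// Decode a whole batch of Kafka messages into DB rows in one pass.
function decodeBatch(messages: KafkaMessage[]): MessageRow[] {
  const rows: MessageRow[] = [];
  for (const m of messages) {
    if (m.value === null) continue; // tombstones carry no payload
    rows.push(JSON.parse(m.value.toString("utf8")) as MessageRow);
  }
  return rows;
}

// Assumed kafkajs/Prisma wiring (not code from the video):
//
// await consumer.run({
//   eachBatch: async ({ batch }) => {
//     const rows = decodeBatch(batch.messages);
//     await prisma.message.createMany({ data: rows }); // one bulk insert
//   },
// });
```

One bulk insert per batch turns 10k single-row writes into a single round trip, which is the point of splitting the consumer out as its own service.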

  • @sagarbhatt3346
    @sagarbhatt3346 10 months ago

    Please make the same videos for Python.

  • @catchroniclesbyanik
    @catchroniclesbyanik 9 months ago

    I have one question. Since you introduced Kafka into the project, couldn't we remove Redis from it? Because Redis was being used for pub/sub, which Kafka can do as well.

    • @abdulrehmanjaved-rt8jq
      @abdulrehmanjaved-rt8jq 9 months ago

      You mean we can subscribe the servers to Kafka topics?

    • @abdulrehmanjaved-rt8jq
      @abdulrehmanjaved-rt8jq 9 months ago +1

      Redis supports push-based delivery of messages, meaning messages published to Redis are delivered automatically to subscribers immediately. Kafka supports pull-based delivery, meaning that messages published to Kafka are never distributed directly to consumers; consumers subscribe to topics and ask for messages when they are ready to deal with them.

    • @shantanubhise9288
      @shantanubhise9288 9 months ago

      Yes, one might think of using only Kafka or only Redis. But here we need both.
      We have 2 requirements:
      1. Inter-server communication.
      A message sent by user1 on server1 should be received by all the users on the other servers. Here we can use the Redis pub/sub model. The Redis publisher publishes the message to the channel "MESSAGES", and all the Redis subscribers of this channel receive it, including the server which sent the message. Thus inter-server communication is achieved.
      If we used Kafka here, the Kafka producer would produce the message to the topic "MESSAGES". All the Kafka consumers (on all servers) would belong to the same consumer group, because they have the same groupId. Hence only one consumer would receive the message on the "MESSAGES" topic, and the other Kafka consumers (servers) would not.
      2. Storage of messages in the database.
      Here we can use Kafka. The Kafka producer produces the message to the topic "MESSAGES", and only one Kafka consumer of this topic receives it. That consumer stores it in the database. As I said earlier, all the consumers here have the same groupId, hence only one of them can receive the message.
      If we used Redis here, all the Redis subscribers would receive the messages and store them in the database, resulting in duplicate messages.
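
The group-vs-fan-out behaviour described in this reply can be illustrated with a toy in-memory model (purely illustrative; the `Topic` class and its round-robin dispatch are invented here, and real Kafka partition assignment works differently):

```typescript
// Toy model: Redis-style pub/sub fans out to every subscriber group,
// while Kafka-style consumer groups deliver each message to only ONE
// member per group.
type Handler = (msg: string) => void;

class Topic {
  private groups = new Map<string, { handlers: Handler[]; next: number }>();

  subscribe(groupId: string, handler: Handler): void {
    const g = this.groups.get(groupId) ?? { handlers: [], next: 0 };
    g.handlers.push(handler);
    this.groups.set(groupId, g);
  }

  publish(msg: string): void {
    for (const g of this.groups.values()) {
      // One delivery per group, round-robin across its members.
      const h = g.handlers[g.next % g.handlers.length];
      g.next++;
      h(msg);
    }
  }
}
```

Giving every chat server its own groupId reproduces the Redis-style fan-out; sharing one groupId across consumers gives the single DB writer.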

    • @catchroniclesbyanik
      @catchroniclesbyanik 9 months ago

      Here, all the server instances subscribe to a Redis channel for incoming messages. I think we could simply remove Redis and make every server long-poll Kafka for messages.

    • @abdulrehmanjaved-rt8jq
      @abdulrehmanjaved-rt8jq 9 months ago +1

      @@catchroniclesbyanik Yes, because Kafka also provides a pub/sub mechanism. I think Piyush bhai made the first video just to solve the problem, and this one for full scalability.

  • @rohitpandey4411
    @rohitpandey4411 9 months ago

    Kafka itself has a pub/sub model. Rather than saving data in two places (Redis and Kafka), can we create a data aggregation function that caters to the user messaging service, working alongside yours to handle the DB update queries?

    • @amardeepsingh1168
      @amardeepsingh1168 9 months ago

      We should use either Kafka or Redis, right? Not both. @rohitpandey4411

  • @rishiraj2548
    @rishiraj2548 10 months ago

    🙏👍

  • @HimanshuDhiman-v9c
    @HimanshuDhiman-v9c 7 months ago

    Can I do this with Mongo?

  • @tanvir4748
    @tanvir4748 9 months ago

    Can anyone give me the previous video link?

  • @FaisalKhan-oy4zz
    @FaisalKhan-oy4zz 9 months ago

    When the DB is down or not able to insert a message, will the consumer resume from that message or from the next one?

    • @skzahirulislam2820
      @skzahirulislam2820 9 months ago +2

      From that message, because the message is still stored inside Kafka.

  • @iUmerFarooq
    @iUmerFarooq 9 months ago

    How do I handle real-time notifications in Vue & Node, like FB handles its post notifications? For example, I have an assignment management system and I'm logged in as an admin; when someone uploads/sends a new assignment, the notification should arrive in real time and show in a toast. How can this be done?

  • @ankitkapoordirector6087
    @ankitkapoordirector6087 10 months ago

    Anyone please reply: can I make a chat app using Java networking concepts? Is it possible? Please reply.

    • @xcoder420
      @xcoder420 8 months ago

      Yes. Look into Netty.

  • @Sagarkun
    @Sagarkun 7 months ago

    Didn't enjoy it, bro.

  • @RiskyMeal
    @RiskyMeal 7 months ago +1

    Video Title In English... Video audio in Hindi... No offense... but Bruh... What are you doing?

  • @thebigdanktheory6877
    @thebigdanktheory6877 7 months ago

    Is there any way we can create virtual load to test our application?
    Then I can die peacefully 🎉