If you're wondering how data consistency will work in EDA (we show a success message before the data is fully saved to the database), how the post will become instantly available to others, and what happens when we need validations such as authentication, you're thinking in the right direction!
The short answer is: we'll be using CQRS plus a feed service, along with additional components if needed. These strategies ensure smooth performance and handle scenarios like user authentication and validation without overwhelming the database, even under high traffic.
I'll cover all these topics, including the edge cases you might be thinking about, in upcoming videos of this series. Stay tuned!
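To make the CQRS idea above a bit more concrete, here is a rough, hypothetical sketch (all names are illustrative, not from the video): the write side accepts a command and emits an event immediately, and a separate projector builds the read model (the "feed") from those events.

```javascript
// Minimal CQRS-style sketch: the write side accepts a command and emits an
// event right away; a separate projector updates the read model (the "feed")
// from those events. The in-memory arrays stand in for Kafka and the read DB.

const eventLog = [];          // stands in for Kafka
const feed = [];              // read model served to other users

// Write side: validate, emit event, respond right away (no DB wait).
function handleCreatePost(command) {
  if (!command.authorId || !command.body) {
    return { ok: false, error: "validation failed" };
  }
  const event = { type: "PostCreated", id: eventLog.length + 1, ...command };
  eventLog.push(event);
  return { ok: true, id: event.id };   // success before any "DB" write
}

// Read side: projector consumes events and keeps the feed up to date.
function project() {
  while (feed.length < eventLog.length) {
    const e = eventLog[feed.length];
    if (e.type === "PostCreated") feed.push({ id: e.id, body: e.body });
  }
}

const res = handleCreatePost({ authorId: "u1", body: "hello" });
project();
console.log(res.ok, feed.length); // true 1
```

The point of the split is that the POST handler never blocks on the read-side work; the projector can run on its own schedule (or in its own service) and still make posts visible quickly.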
Brother, it was a real joy to implement this practically. Thank you very much!
A grand salute to you... what a lesson, brother... you have truly opened my eyes🎉🎉
When someone teaches with so much detail and a truly eye-opening perspective, your mind opens wide. Thank you for the course. From now on, my POST API won't have latency issues!
Glad you enjoyed it!
I appreciate the awesome videos you’ve been putting out. I’m really interested in more content that covers production standards and setting up a CI/CD pipeline, especially with ECS and ECR. There are tons of Docker Compose and Kubernetes videos, but I’d love to see more about scaling and auto-scaling with ECS for production cluster setup. It feels like there’s a bit of a gap in that area.
Hi @truth_Taken, thanks for the suggestions. I will definitely make videos on these topics, especially on AWS.
Keep suggesting💖
@@sandeepdev0 ty so much ❤️
Brother, it's not 95% of users, it's the 95th percentile latency.
And nice video, brother!
Thanks
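Since the 95th percentile comes up in this thread, here is a tiny, illustrative sketch (not the video's code) of what p95 latency means: 95% of requests completed at or below that value. It uses the nearest-rank method on some made-up sample latencies.

```javascript
// p95 means: 95% of requests completed at or below this latency.
// Nearest-rank percentile over a list of sample latencies (in ms).
function percentile(samples, p) {
  const sorted = [...samples].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length); // nearest-rank method
  return sorted[rank - 1];
}

const latencies = [12, 15, 18, 20, 22, 25, 30, 35, 40, 120]; // one slow outlier
console.log(percentile(latencies, 50)); // 22  (median request)
console.log(percentile(latencies, 95)); // 120 (the tail the outlier creates)
```

This is why "95th percentile" is the interesting number: the average here looks fine, but the tail shows what the slowest users actually experience.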
Oh my god, this is senior JS dev level content, which is very hard to find on YouTube.
Nice bro keep going
Sandeep sir, it's amazing to see an Indian YouTuber teaching the latest trending tech stacks like Hono.js, Bun, etc. In the future, I'd love to see more such videos, like a full-fledged backend built with Hono.js.
Halfway into the video, and I'm loving this so much. Thank you for this awesome video!
All developers should follow you
Hi, just eager to know: how did you get the test results shown at 0:42?
Using Grafana k6. To learn more about it, you can watch this video: ruclips.net/video/TyktC6sPXn4/видео.html
In that video I have shown how you can test your backend using Grafana k6.
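For anyone curious what such a k6 test looks like, here is a minimal, hypothetical script (the endpoint URL and threshold are placeholders, not from the video). Note it runs under the k6 binary (`k6 run script.js`), not plain Node:

```javascript
// Minimal Grafana k6 load test: 100 virtual users POSTing for 30s,
// failing the run if p95 latency exceeds 200ms. Run with: k6 run script.js
import http from "k6/http";
import { check, sleep } from "k6";

export const options = {
  vus: 100,                                    // virtual users
  duration: "30s",
  thresholds: { http_req_duration: ["p(95)<200"] }, // p95 latency budget
};

export default function () {
  const res = http.post(
    "http://localhost:3000/posts",             // placeholder endpoint
    JSON.stringify({ body: "hello" }),
    { headers: { "Content-Type": "application/json" } }
  );
  check(res, { "status is 201": (r) => r.status === 201 });
  sleep(1);                                    // think time per virtual user
}
```

k6 prints the latency percentiles (p90, p95, etc.) in its summary, which is where numbers like the ones at the start of the video come from.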
Best video ever😊
Thank you for the knowledge ❤
Subscribed just for the next video... please make it soon❤
Very nice!
Eagerly waiting for the Kafka video.
Thank you for this insightful video 💙! As a web developer, I found your explanation on optimizing POST APIs with Kafka incredibly helpful.
Looking forward to more of your content!
Glad it was helpful!❣️
Super excited for part 2
I appreciate your videos, bro. You are the only YouTuber making videos on such topics; these are very useful for up-and-coming developers ❤
Brother, your videos are something different; full support ❤❤
Thanks, Sandeep bhaiya, for making this video 😊
Really great tutorial; eagerly waiting to see more such content from you. Thank you, sir! 🙏
Most welcome 😊
Subscribed, sir!
Bro i was tackling this problem
Thanks, man. Just subscribed and hit the like button; we need more content on backend.
beautiful 😍
waiting for next lectures ☺️
waiting for next video
Hi, first of all, I really liked the video. But at 12:16 you said we will bulk insert all the blogs collected so far; in this case we are giving up on consistency, right? For example, when a user posts a blog, the producer pushes the insert event to Kafka and the user gets a success message, but that specific blog will only be inserted into the DB some time later, since we are waiting for a batch of writes to be collected.
Yes, and there are methods to maintain consistency. I specifically mentioned in this video that we have to ensure consistency to implement this type of backend, and I will cover how to maintain consistency, data accuracy, fault tolerance, retries, etc. in upcoming videos.
But I'm happy you raised this question. Just stay tuned with this series and you will get answers to all your questions.
@@sandeepdev0 When can I expect the next video?
Brother, when is the next video coming?
Next video: ruclips.net/video/hZYKchyRlDQ/видео.htmlsi=_FXwlxcZbMqRKRfq
Next next video: coming soon
Please give the Git repo for this project.
When will we get part 2? We are excited about it.
coming❣️
Great video and explanation! Can you tell how do you test the application metrics ? Do you deploy it somewhere or test it on local ? Also which tool are you using ?
Nice content
Brother, can't we just scale the DB with replication? That could also handle more requests.
Hey brother, can you bring more videos like this?
You deserve a sub ❤
❣️
Sir, how do we test our APIs?
Can you please make one for the other CRUD operations as well?
Covering this in upcoming videos, where we will build a full-stack project with event-driven architecture.
@@sandeepdev0 Thanks. Appreciate the hard work.
Brother, how do you get the traffic details? I mean, how can I check how much traffic my server can handle, and how do I measure the timings, like you showed at the start of the video?
I used Grafana k6 for testing with virtual users.
@@sandeepdev0 Thank you, brother, for your reply. Yes, I saw your video related to Grafana. One more thing: can you please make a video on searching among 1 billion users, like searching for a user by name across 1 billion records?
Wouldn't you need an ack that the data was inserted successfully? What if it failed in the consumer? How would the producer know?
What is the name of the software you use to see the backend server's performance?
It's Grafana.
Bro, how do we do validation, like when there are restrictions on adding a post, or the post / user email already exists? Since the event is processed later, how will it validate?
This is a real concern: the content shared covers only the happy path. A real-world API request goes through multiple proxy calls and data validations, and we need to respond based on the validation result.
Will cover this in upcoming videos,
but the short answer is: using CQRS and feed services.
Bro, in upcoming videos we will cover all possible edge cases; this was just a very basic video.
Short answer for real scenarios: using things like CQRS, feed services, DB sharding, replication, and more.
Everything required for real-world scenarios is coming on this channel💻👉✅
Which font family are you using in the VS Code editor?
Thank you sir ❤
Sir, can we do both in a single application? What are the pros and cons?
What if the server goes down? The messages array will be destroyed and cannot be entered into Kafka! If we use Redis, then with more than one server it will create concurrency-control issues and could add the same messages into the DB three times! Let's say we use Redlock for Redis concurrency control; how can it manage the lock at millisecond granularity? Could you answer this? And please make a video on it! Thanks ❤️
You are asking absolutely the right questions, and I'm happy you are following the video very closely.
The short answer is: 👉CQRS, implemented for data consistency, plus a feed service (to make posts instantly available to other users).
Making a video on it very soon... stay tuned💖
@@sandeepdev0 ok thanks 🤙🏼
Great content sir
Hi sir... what about GET requests? How do we handle 150k GET requests?
That's super easy (it just needs caching), but since we are using EDA in this video, to optimize GET requests we first need to understand data consistency, caching strategies, and a few more things. Will do all this in upcoming videos (after the Kafka crash course).
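As a tiny illustration of the caching idea for GET requests (a hypothetical sketch, not the video's code), repeat reads can be served from an in-memory cache with a TTL instead of hitting the DB every time:

```javascript
// Serve repeat GET reads from an in-memory cache with a TTL instead of the
// DB. (A production setup would use Redis plus an invalidation strategy;
// fakeDbRead is a stand-in for a real database query.)
function makeCache(ttlMs) {
  const store = new Map(); // key -> { value, expires }
  return {
    get(key) {
      const hit = store.get(key);
      if (!hit || hit.expires < Date.now()) return undefined; // miss/expired
      return hit.value;
    },
    set(key, value) {
      store.set(key, { value, expires: Date.now() + ttlMs });
    },
  };
}

let dbReads = 0;
const fakeDbRead = (id) => { dbReads++; return { id, title: "post " + id }; };

const cache = makeCache(60_000);
function getPost(id) {
  let post = cache.get(id);
  if (!post) { post = fakeDbRead(id); cache.set(id, post); } // cache miss
  return post;
}

getPost(1); getPost(1); getPost(1);
console.log(dbReads); // 1  (two of the three requests served from cache)
```

The hard part, as the reply says, is not the cache itself but deciding when cached data is allowed to be stale, which is exactly the consistency question in EDA.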
Which tool did you use to test the API latency?
Grafana k6
@@sandeepdev0 Can you please share the script?
Great video
Hello sir, can you please share your local dev configuration?
I tried this on my Laravel app with a simple API that does only basic validation, has no DB connection, and just returns the result, but it couldn't even handle 100 users and 500 req/min.
So could you describe in a video, or touch on it in a comment, how to configure a local dev setup for this kind of testing?
Thanks, the video was really helpful.
thanks sir ji
Thanks bhai 😊
Make a video comparing WhatsApp, Telegram, and Signal. Also highlight how they work behind the scenes, so viewers get a clear picture.
Good idea👍, will work on this...
Great ❤
Sir, how did you determine that your API can handle this many requests? Which tool did you use?
I use Grafana k6
Suppose some issue occurs on the consumer side while inserting the data; how will the user get to know about it, since you have already returned success from the producer?
If something goes wrong on the consumer side, the fault-tolerance system will do retries (will discuss this in upcoming videos).
Also, we will make data instantly available to users; I have shown this in this video:
ruclips.net/video/hZYKchyRlDQ/видео.htmlsi=sgB6robuCQB9_O_M
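One common shape of that consumer-side fault tolerance is retry with exponential backoff, with a dead-letter queue after the retries run out. Here is an illustrative sketch (names like `withRetries` and the flaky insert are made up for the demo, not from the video):

```javascript
// Consumer-side retry with exponential backoff. After maxRetries the error
// is rethrown, which is where a real system would route the event to a
// dead-letter topic for inspection instead of losing it.
async function withRetries(task, maxRetries = 3, baseDelayMs = 100) {
  for (let attempt = 1; ; attempt++) {
    try {
      return await task();
    } catch (err) {
      if (attempt > maxRetries) throw err;          // -> dead-letter queue
      const delay = baseDelayMs * 2 ** (attempt - 1);
      await new Promise((r) => setTimeout(r, delay)); // back off, then retry
    }
  }
}

// Demo: a flaky "DB insert" that fails twice, then succeeds.
let calls = 0;
const flakyInsert = async () => {
  calls++;
  if (calls < 3) throw new Error("db unavailable");
  return "inserted";
};

withRetries(flakyInsert, 3, 1).then((r) => console.log(r, calls)); // inserted 3
```

The user still got their immediate success response from the producer; retries and dead-lettering are how the system keeps that promise honest on the consumer side.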
100/10 loveddd it
Can we use redis instead of Kafka?
Redis is an in-memory database; Kafka is a distributed event store and stream-processing platform. They solve different problems.
Sir, I want to learn Kafka from scratch. Please suggest a resource; I couldn't find anything decent online.
Can you share the GitHub repo for this code? I'm getting some errors and would like to look at your code.
What about those tasks which need to be synchronous, like sending an OTP? We can't afford the latency there; our consumer might slow down while we are telling the user that the OTP has been sent.
In that case, create a separate queue and assign a priority to each message.
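The "separate queue" idea can be sketched as two queues with a dispatcher that always drains the latency-sensitive one first. This is a toy illustration (Kafka itself has no per-message priority; in practice you would use a dedicated topic with its own fast consumer for OTPs):

```javascript
// Latency-sensitive messages (OTP) go to a high-priority queue that is
// always drained before the bulk queue. Names are illustrative.
const queues = { high: [], low: [] };

function enqueue(msg) {
  (msg.type === "otp" ? queues.high : queues.low).push(msg);
}

function nextMessage() {
  // OTPs first; fall back to bulk work; null when both queues are empty.
  return queues.high.shift() ?? queues.low.shift() ?? null;
}

enqueue({ type: "post", body: "blog #1" });
enqueue({ type: "otp", code: "482913" });
enqueue({ type: "post", body: "blog #2" });

console.log(nextMessage().type); // otp  (jumps ahead of the earlier posts)
console.log(nextMessage().type); // post
```

Separating the queues means a backlog of bulk posts can never sit in front of an OTP, which is the commenter's exact worry.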
I don't know the Kafka coding concepts yet... let me go figure them out first.
A Kafka crash course is coming in a few days
I went and learned Kafka just to understand this video 😂
Good effort, brother👍
@@sandeepdev0 Yes sir, the video feels easy now and I understood it.
Also, sir, if you get time, please build and show a real-time application; we will all understand it even better.
Topics like these are much needed ❣
Hi sir😎, I'm currently starting my Android development career with Kotlin❤. What should I learn to create APIs with Kafka? Please suggest😊
And if I cared that much about my backend, I wouldn't have written it in Node in the first place; I would have chosen Go.
This is a very vague statement, bro. Choosing between Node.js and Go really depends on the type of app you're building and your specific requirements.
Netflix uses Node.js
PayPal uses Node.js
Uber uses Node.js
...and many more companies do as well.
It's not about whether Go is better than Node.js or vice versa. Every language solves different use cases, and the choice between them depends on your application's needs and use cases.
Please make a course
I feel this is going to be a nice video, but I don't speak or understand the language. I wish you could create a blog post on what you're sharing.
Thanks for the suggestion, I'll consider adding blog posts as well!
@@sandeepdev0 I will very much appreciate it
Sir, how are access tokens stored in a frontend like React? If we store them in localStorage, it won't be safe. So what is the correct way of storing them? Please guide.
We store them in cookies, hashed.
If you save them in a session, they are stored on the server.
@@ujjwalsahare8360 Thanks, buddy, for replying. But tell me one thing: there will be the issue of CSRF attacks since we are using cookies. So in that case we need to use a CSRF token as well. Am I correct, or am I missing something?
Yes, you can generate a CSRF token to prevent CSRF attacks. For XSS, you can go with the refresh-token/access-token pattern, so even if an attacker gets the access token, it will expire in a few minutes. And send the refresh token as httpOnly, so it can't be accessed from client-side JS.
❤
But the user is still hitting the API and waiting for the response... the user's API call is waiting for the bulk insert to finish, isn't it? Let's say the consumer waits for 20s or 1000 events before inserting the data into the DB; then 20s is the waiting time for the user.
No, the user gets a response immediately from the producer, while the consumer inserts the data into the DB in bulk.
You may now be wondering: the user gets a response immediately, but the actual data is saved some time later, right? That is also answered, and practically implemented, in this video on how to make data instantly available in EDA:
ruclips.net/video/hZYKchyRlDQ/видео.htmlsi=cJPeLFm_vKno5VfB
But if you are still confused, don't worry: I am planning to do a live stream in a couple of days, where I can explain in detail and you can ask your questions directly.
@@sandeepdev0 Okay, got your point 👍
Now let's say the user calls the API and expects the ID of the resource saved in the DB; how will the system behave in that case? So it completely depends on the API's requirements; it can't be generalized.
How many more videos are there in this series?
A lot of☺️
If we keep Kafka and consistency aside and focus only on latency, wouldn't doing the same bulk-insertion logic at the producer level, instead of using Kafka, also increase latency a lot? Kafka is used for asynchronous tasks and consistency, right? Can you clarify this doubt?
Yes, doing the bulk-insertion logic at the producer level would significantly increase latency, because producers would be blocked waiting for database responses.
Kafka solves this by decoupling the producer from the database, enabling asynchronous handling and smoothing out the load.
Kafka also helps with event ordering and consistency, but a major benefit is that it lets systems scale with low latency by keeping blocking operations off the producer side.
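The consumer-side batching that this decoupling enables can be sketched roughly like this (an illustrative toy: `insertMany` stands in for a real bulk DB insert, and the buffer stands in for events consumed from Kafka). Events flush either when the batch is full or when a timer fires, so no event waits indefinitely:

```javascript
// Buffer incoming events and flush them to the DB in one bulk write,
// triggered by batch size or by a timeout, whichever comes first.
function makeBatcher(insertMany, { maxSize = 1000, maxWaitMs = 5000 } = {}) {
  let buffer = [];
  let timer = null;

  function flush() {
    if (timer) { clearTimeout(timer); timer = null; }
    if (buffer.length === 0) return;
    const batch = buffer;
    buffer = [];
    insertMany(batch); // one bulk write instead of N single inserts
  }

  return {
    add(event) {
      buffer.push(event);
      if (buffer.length >= maxSize) flush();                  // size trigger
      else if (!timer) timer = setTimeout(flush, maxWaitMs);  // time trigger
    },
    flush,
  };
}

const batches = [];
const batcher = makeBatcher((b) => batches.push(b), { maxSize: 3, maxWaitMs: 50 });
["e1", "e2", "e3", "e4"].forEach((e) => batcher.add(e));
batcher.flush(); // force out the leftover "e4"
console.log(batches.map((b) => b.length)); // [ 3, 1 ]
```

The key point from the thread: this waiting happens on the consumer, after the producer has already responded to the user, so batch latency never appears in the user's request time.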
Thank you
Please don't use Copilot while teaching; it hinders the learning experience.
Sure bro, will keep it off in upcoming videos
Subscribing right away; I'm one of those "I'll binge watch it later on the weekend" people. Please give us a heads up.
Couldn't you have told us this earlier? 7:30
😅My bad, I'll keep that in mind next time👍
Bro, in English please!
I think I need to start an English channel as well.
English please
You need DDD (domain-driven design).
Great content