The simplest and most straightforward approach I've seen for a microservices tutorial, and more helpful than the others. Thank you 🖤
Gratitude for the incredible Node.js microservices architecture integrated with RabbitMQ for YouTube! This innovative approach has revolutionized the way we handle our platform's complexity, enabling seamless communication and scalability. With Node.js' agility and RabbitMQ's reliability, we've been empowered to deliver a top-notch user experience while efficiently managing our growing audience. Huge thanks to the brilliant minds behind this architecture for providing us with the tools to thrive in the dynamic world of online content creation. 🚀🙏 #Grateful #NodeJS #RabbitMQ #YouTube
Excellent tutorial, clearly explained with the code and the errors too; very much appreciated. Thanks a lot.
Came here for the message queue, impressed by the Vim skills.
Thanks Vamshi :)
Wow.... Superb Vim ahhh
I love seeing you use vim while doing these tutorials. hahaha. great.
Thanks a lot :)
this channel is underrated.
Thank you Aditya🙂
Wow, thanks. The best on YouTube.
I loved this tutorial! Thanks for sharing, man!
I think this is a great tutorial, thank you.
Thank you very much for sharing such a great tutorial. 👍
I clearly understood how it works. Thanks for your nice explanation.
Thank you Selva 🙂
Awesome tutorial video! I just loved it.
Good Info Man Keep going
Hello @ManoSriram, what you did was mostly right, apart from creating another queue for the response in the same route. The main purpose of using queues is asynchronous behaviour, and defining the response queue at that same spot defeats that purpose.
Even if your queue happens to work fine in a synchronous manner, in a larger, scaled-out architecture you will start receiving wrong responses because acknowledgements can arrive out of order.
I agree with you, this is just a sample or demo application, so I think those problems are negligible.
@@manosriram No doubt, a rather easy and straightforward explanation! Good work, mate.
@ManoSriram: A video worth watching and a simple way to understand microservices with RabbitMQ. Could you please create an application with Kafka as well?
Thank you Pramod, there is already a video on Kafka on my channel. You can check that out.
I don't understand, it's weird: in the /product/buy API the channel consume has no await, but the response still comes back with data from the consumer. Is the code synchronous and blocking?
Can you explain?
You are a life saver.
great stuff mate.
Thank you Drake.
This was very informative. Thanks
Glad it helped 🙂
How does RabbitMQ compare to Kafka when you consider an e-commerce application?
Very informative video, but it would be great if you built a bigger project and explained the pros and cons of RabbitMQ for scalability, how to use it in production, and the security considerations.
Thanks for your time. I will use this project to develop a logging system using the ELK stack.
Thank you, glad this video is useful for you :)
@@manosriram What if our queue goes down? How do we ensure our service's availability?
@@codedestiny6955 Usually the messages in the queue are persisted, and once the queue is back up they will be consumed, which I demonstrated at the end of the video.
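A minimal sketch of how that persistence is usually configured with amqplib, assuming the "ORDER" queue name from the video; both the queue and the individual messages have to be marked durable/persistent for them to survive a broker restart:

```js
const amqp = require("amqplib");

async function publishOrder(order) {
  const connection = await amqp.connect("amqp://localhost");
  const channel = await connection.createChannel();

  // Durable queue: the queue definition itself survives a RabbitMQ restart.
  await channel.assertQueue("ORDER", { durable: true });

  // Persistent message: written to disk, so it is still there when the broker comes back up.
  channel.sendToQueue("ORDER", Buffer.from(JSON.stringify(order)), {
    persistent: true,
  });

  await channel.close();
  await connection.close();
}
```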
Amazing video bro, but there's a small issue.
The first time you hit the route, you get an empty response (at 42:32); only on the next hit does a response come back (at 42:34).
Is this correct? We are actually getting the wrong response for the current hit to that route (i.e. the response of the previous hit).
On the very first hit, the callback returns order before the queue has consumed the message, so it is still undefined and you get no output. By then the queue has consumed the message and assigned it to order, so on the next hit that value comes back, even though it belonged to the earlier request.
So to get the correct output, we need to hit the route twice.
How do we solve this issue? Can we write return res.json(order) inside the consume function itself?
(I tried that: the first hit gave me the correct output, but on the second hit the server crashed with "Error: Can't set headers after they are sent".)
Please correct me if I am wrong.
Yeah, you cannot return inside a consume function, so I tried returning outside of it. I think the problem is that it keeps waiting to consume and you cannot return immediately; maybe we can use some callback that lets us return only after consuming. I will try to do a video on this if I find a suitable solution.
Thank you for watching :)
Yes, sending the request twice returns the response. Any solution for this?
@@manosriram You can simply use await to wait for the consumption first, and then send the response.
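A minimal sketch of that fix, assuming an Express app, an amqplib channel created at startup, and the /product/buy route with the "ORDER" and "PRODUCT" queue names from the video; the route wraps the consume in a Promise and awaits it, so res.json only runs once the order service has actually replied:

```js
// Hypothetical helper: resolves with the next message seen on the reply queue.
// (Fine for a demo with one request at a time; for concurrent requests, match
// replies by correlationId instead, as sketched later in the thread.)
function waitForReply(channel, queue) {
  return new Promise((resolve) => {
    channel.consume(queue, (msg) => {
      channel.ack(msg);
      resolve(JSON.parse(msg.content.toString()));
    });
  });
}

app.post("/product/buy", async (req, res) => {
  // Ask the order service to create the order...
  channel.sendToQueue("ORDER", Buffer.from(JSON.stringify({ ids: req.body.ids })));

  // ...and wait for its reply *before* responding, instead of reading `order`
  // before the consumer callback has had a chance to run.
  const order = await waitForReply(channel, "PRODUCT");
  return res.json(order);
});
```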
Let's say we are deploying this whole application on AWS. How do we configure nginx for the order service? That service doesn't receive any API requests or send responses; it just gets data from the message broker and uses it to create orders, but it still runs on a port to do all of that. For the other services we can upstream to the server IP from the AWS instance and the service's port, but what should the location and proxy_pass be for this one? Please give a suggestion.
You make understanding microservices with Node.js easier.
Wonderful, and thank you very much for the GitHub link.
Hi, can anyone explain why we need to hit the /product/buy API twice to get the correct answer? Also, if I move res.json into the consume method, an error occurs.
I got the concepts, thanks bro.
Thank you, glad it helped.
Sir, I have to add a frontend to this. What will the entry point from the browser be? Can you explain? Thank you.
Hello :) How are you? A query: if I want to add another function called UpdateOrder to the order service, I can add it, but how would I call it from Auth, for example? I see the example only has a single createOrder function that you invoke via the "ORDER" queue. How would it look when you want to call a new UpdateOrder function with incoming parameters?
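One common way to handle this, sketched here under the assumption that everything still flows through the single "ORDER" queue from the video and that the order service's channel is already in scope: include a type field in the message payload and let the consumer dispatch to the right handler. createOrder is the existing function from the video; updateOrder is a hypothetical new one.

```js
// Caller side (e.g. auth or product service): tag the message with the operation you want.
function requestOrderUpdate(channel, orderId, changes) {
  channel.sendToQueue(
    "ORDER",
    Buffer.from(JSON.stringify({ type: "UPDATE_ORDER", payload: { orderId, ...changes } }))
  );
}

// Order service: one consumer, dispatching on the type field.
channel.consume("ORDER", async (msg) => {
  const { type, payload } = JSON.parse(msg.content.toString());

  switch (type) {
    case "CREATE_ORDER":
      await createOrder(payload); // existing handler from the video
      break;
    case "UPDATE_ORDER":
      await updateOrder(payload); // hypothetical new handler
      break;
    default:
      console.warn("Unknown message type:", type);
  }

  channel.ack(msg);
});
```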
Thanks very informative
Can we also use Kafka instead of RabbitMQ?
yes, you can.
just Perfect...
@Mano Sriram I need a little clarification, please. Why did you send data back from the order service to the product queue?
Because the client called the Product API, we have to return a response from that exact API. So after finishing the job in the order service, we send the data back so the Product service can respond to the client.
@@manosriram ok...Thank you so much
At what scale does using a message queue make sense? I know that queues are most helpful under really high traffic, when managing throughput is essential, and they also provide retries. For example, AWS Lambda has its own throughput limit that can be balanced with SQS, Step Functions, etc. How helpful is it at small scale, where I just want to handle a basic CRUD app? What's your view?
Edit: small suggestion, treat your audio in Adobe Audition with a denoise filter and parametric EQ during editing to remove the background noise and it will sound better.
Message queues are most useful in microservices, and microservices are used at scale (huge codebases).
For small-scale apps, a queue can still help if your application is scaling steadily.
Thank you, I have found the audio issue and will fix it from the next video :)
Great tutorial, but why do you have to use Vim to write the code? I find myself wanting to see the full folder structure a lot in the video.
I usually write code in Vim, but I'll try to show the file tree from now on.
I am wondering, can't you make the isAuthenticated function into a separate middleware that isn't in the project directory?
Sure, we could do that.
Thank you for this video, it was very clear.
I want to dockerise these services and use nginx as a reverse proxy. Can you please tell me how to handle the isAuthenticated middleware? Should I make its own container?
I don't think you need a Dockerfile for the middleware. You can just use it without containerising it separately.
@@manosriram Thank you for your response. The solution I found is integrating the middleware into each service. I don't know if it's the best approach or not, but I can't use the middleware separately because I'm using docker-compose to manage my containers.
Big thanks, great video. One question: in an actual production version (I saw your GitHub code), how can each service, which lives in its own separate repo, share the auth service or the isAuthenticated middleware without repeating the code? Thanks.
You can take all the common code and put it in a separate project (called "common", for example), then publish it to npm using npm publish; you can look this up online for more info. Then you can import your common code into any project by running npm install on the module you have already published. It will behave like any other module, such as express.
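A minimal sketch of what such a shared package might look like, assuming a hypothetical package name @myorg/common, the jsonwebtoken library, and a JWT secret supplied via an environment variable:

```js
// @myorg/common/index.js  (hypothetical shared package, published to npm once)
const jwt = require("jsonwebtoken");

// Express middleware shared by every service instead of being copy-pasted.
function isAuthenticated(req, res, next) {
  const header = req.headers.authorization;
  if (!header) return res.status(401).json({ message: "No token provided" });

  jwt.verify(header.split(" ")[1], process.env.JWT_SECRET, (err, user) => {
    if (err) return res.status(401).json({ message: "Invalid token" });
    req.user = user;
    next();
  });
}

module.exports = { isAuthenticated };
```

Each service then just runs npm install @myorg/common and uses const { isAuthenticated } = require("@myorg/common") in its routes, so a fix to the middleware only has to be published once.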
Awesome video. Can you please advise: is it possible to write the microservices in different languages, like PHP and Golang, with each microservice connecting to one Node.js API, e.g. over gRPC?
Yes, it is possible to have different microservices in different languages. Only the API matters.
How do we create a relationship between two microservices, like products and orders?
By raising events! We can use Kafka or RabbitMQ to emit and consume these events, and the products and orders services handle that event data accordingly.
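A minimal sketch of that event-based link with amqplib, assuming a hypothetical fanout exchange named product.events; the product service emits an event and the order service reacts to it:

```js
const amqp = require("amqplib");

// Product service: emit an event whenever a product is bought.
async function emitProductBought(product) {
  const conn = await amqp.connect("amqp://localhost");
  const ch = await conn.createChannel();
  await ch.assertExchange("product.events", "fanout", { durable: false });
  ch.publish("product.events", "", Buffer.from(JSON.stringify({ event: "PRODUCT_BOUGHT", product })));
}

// Order service: subscribe to product events and create orders from them.
async function listenForProductEvents() {
  const conn = await amqp.connect("amqp://localhost");
  const ch = await conn.createChannel();
  await ch.assertExchange("product.events", "fanout", { durable: false });
  const { queue } = await ch.assertQueue("", { exclusive: true }); // broker-named queue for this service
  await ch.bindQueue(queue, "product.events", "");

  ch.consume(queue, (msg) => {
    const { event, product } = JSON.parse(msg.content.toString());
    if (event === "PRODUCT_BOUGHT") {
      // create the order from the event data here
    }
    ch.ack(msg);
  });
}
```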
Your video is very informative, thank you very much. But it does introduce the further problem of dependency among services, and the response essentially waits for the whole saga to be completed. Can you please publish a video about how to mitigate these problems?
Yes, thank you. Will make a video on this.
How do you write from RabbitMQ to MongoDB?
Thank you!
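For anyone with the same question, a minimal sketch of a consumer that persists incoming messages to MongoDB, assuming mongoose, a local MongoDB instance, and a hypothetical Order model:

```js
const amqp = require("amqplib");
const mongoose = require("mongoose");

// Hypothetical model; adjust the schema to whatever the order messages contain.
const Order = mongoose.model("Order", new mongoose.Schema({ products: Array, total: Number }));

async function start() {
  await mongoose.connect("mongodb://localhost:27017/orders");

  const connection = await amqp.connect("amqp://localhost");
  const channel = await connection.createChannel();
  await channel.assertQueue("ORDER", { durable: true });

  channel.consume("ORDER", async (msg) => {
    const data = JSON.parse(msg.content.toString());
    await Order.create(data); // write the message payload to MongoDB
    channel.ack(msg);         // only ack once the write has succeeded
  });
}

start();
```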
How did RabbitMQ know which order to return for which request? I'm imagining a case where hundreds of people are hitting that route and adding items to the order queue; how would the consumer of a new order know which order to return for which request?
Thanks for asking, I am stuck right now doing something like this.
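RabbitMQ itself doesn't match replies to requests; the client has to, typically via the correlationId and replyTo message properties. A minimal sketch under those assumptions, with one reply consumer registered at startup and a map of pending requests keyed by correlationId (queue names are the ones from the video, the rest is hypothetical):

```js
const { v4: uuid } = require("uuid");

const pending = new Map(); // correlationId -> resolve function

// Registered once at startup: routes every reply to the request waiting for it.
async function setupReplyConsumer(channel) {
  const { queue } = await channel.assertQueue("", { exclusive: true }); // private reply queue
  channel.consume(queue, (msg) => {
    const resolve = pending.get(msg.properties.correlationId);
    if (resolve) {
      resolve(JSON.parse(msg.content.toString()));
      pending.delete(msg.properties.correlationId);
    }
    channel.ack(msg);
  });
  return queue;
}

// Called per HTTP request: each request gets its own correlationId,
// so hundreds of concurrent buyers each receive their own order back.
function requestOrder(channel, replyQueue, payload) {
  const correlationId = uuid();
  const reply = new Promise((resolve) => pending.set(correlationId, resolve));
  channel.sendToQueue("ORDER", Buffer.from(JSON.stringify(payload)), {
    correlationId,
    replyTo: replyQueue,
  });
  return reply;
}
```

On the order-service side, the consumer would copy msg.properties.correlationId onto its reply and send it to msg.properties.replyTo, so each reply carries the ID of the request that produced it.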
Awesome content 👍. Just when I was looking for how to connect services with message brokers.
Glad it helped you :)
It's really helpful. Could you please help me achieve the same approach in Python or Flask?
Great
Can you make a video on KafkaJS in a Node microservices app soon, sir? I am waiting to implement it in my project. Thanks!
Sure, will do very very soon :)
Can you make a video on KafkaJS in a Node microservices app?
Sure, will do it in the very near future.
@@manosriram Please do this bro, your channel is going to be my new hangout.
You should show the folder structure, it will help.
Thanks, subscribed too.
Nice man, thanks. Just one clear and simple question, or you could say it's just my doubt: for microservices, we don't need Docker (it isn't compulsory), right? We can just use RabbitMQ, Node.js, and the other stuff, right?
Yes, you can install RabbitMQ locally and connect it to the application.
Thanks.
Please use RTX Voice.
Can you make a video on microservices without RabbitMQ? I'd like to see an API gateway.
Yes, I am thinking of doing a Gateway for microservices very soon. Do subscribe to stay updated :)
Can you help me create microservices? I will send a mail.
Nice, good info.
Can you structure all of this using TypeScript, with the routes, controllers, DB connection, and DB models in separate class files?
Bro, why in the world are you not using Visual Studio Code?
I prefer Vim over VS Code.
@@manosriram why do you prefer Vim?
GitHub repo link, please.
Updated with the GitHub link :)
Please make the same program with Kafka. Pleaseeeeeeeeeee.
I already have a video on Kafka, do check that out 😄
Please use VS Code with a good theme, and also make the file structure visible.
No ESLint, no TS, not real microservices, and RabbitMQ doesn't make any sense here. Why didn't you go for gRPC, Kafka, or something similar? That would be a better fit. I don't wanna be mean, but nobody should take this as an example.
What have you done? You're already being mean. Give your opinion as pros and cons and state your preference.
Where is your pain, bro?
You are teaching people in a very complicated manner.
I know you are a good coder and you are committed to using the Vim editor, which means remembering the syntax as you learn.
But just think: people here are coming to learn something. Why are you making it complicated?