Please include this in the "Production ready server" playlist, as horizontal auto-scaling is also a very useful feature for a production-ready server. Amazing content. Thanks.
Still waiting to see the 1000 container version as promised in the thumbnail .... :)
Such crisp & clear explanation. Thanks a lot.
Man what a perfect and straight explanation. thank you so much for your efforts! you are a legend
this is some production level content 😂
thanks man
Subscribe like share 👍
Your Docker series is awesome, because you covered almost all topics in crash-course mode. Thanks.
Please accept my gratitude for your content; it's definitely helpful ☺️
Great content. Do upload similar content so that we as developers can keep advancing.
Subscribe like share
I couldn't figure it out, and you saved me hours of work. Thanks!!!
Subscribe like share 👍
You're not using nginx here to load balance. You're using Docker to load balance. The nginx container is just a proxy in this setup.
Mann.. this lecture is awesome.. thanks. Alottttt
thank you so much. All you guys are brilliant!!
Great tutorial, bring more videos on how to work with Node.js and, if possible, React and AWS.
Thanks, will do!
Such a clear and precise tutorial. Please make tutorials on Kubernetes for beginners as well.
Subscribe like share
I have been scratching my head the whole day trying to figure out why my nginx wasn't working. I wasn't using docker-compose.yml; instead, I was creating separate containers for each server instance. On top of that, I was editing the main nginx configuration file (/etc/nginx/nginx.conf) instead of changing the virtual host configuration file (/etc/nginx/conf.d/default.conf) as you did.
After watching your video, I figured out two issues. First, my containers were not communicating with each other because I hadn't created a custom network for them. After doing that, everything worked fine with the reverse proxy and load balancing. Second, I still don't understand what the issue was with using an upstream block to specify the servers and a location block with the proxy_pass directive to send requests to those servers in round-robin.
Maybe it's not a good practice to update the main nginx configuration file directly, but I'm not sure. Can you provide me with some resources to read more about nginx configurations? Thanks a lot; this video was a great help! 💕💕
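For anyone stuck on the same upstream question: the main /etc/nginx/nginx.conf in the official image already has "include /etc/nginx/conf.d/*.conf;" inside its http block, so the upstream and server blocks belong in the virtual host file rather than in nginx.conf itself. A minimal sketch, assuming backend containers named api1, api2 and api3 on a shared custom Docker network (the names and port are placeholders, not from the video):

```nginx
# /etc/nginx/conf.d/default.conf — sketch only; container names and port are assumptions
upstream api_backend {
    # Docker's embedded DNS resolves these container names on the shared custom network
    server api1:3000;
    server api2:3000;
    server api3:3000;
}

server {
    listen 80;

    location / {
        # round-robin across the upstream servers is the default behaviour
        proxy_pass http://api_backend;
        proxy_set_header Host $host;
    }
}
```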
Great explanation, thank you.
I was just wondering: if we are running on the same machine, we could use the cluster module as well, right?
Nginx with auto-scaling probably becomes more relevant when using multiple standalone servers, doesn't it?
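On the single-machine point, a minimal sketch of the Node.js cluster module, assuming a plain http server (nothing here is from the video):

```js
// cluster-sketch.js — one worker process per CPU core on a single machine
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isPrimary) {            // use cluster.isMaster on Node < 16
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();                 // workers share the same listening port
  }
} else {
  http
    .createServer((req, res) => res.end(`handled by worker ${process.pid}\n`))
    .listen(3000);
}
```

That scales vertically on one box; the nginx + containers approach in the video is what lets you spread the same app across separate machines.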
Amazing Video :)
Really helped. Just one correction: I believe CI stands for Clean Install. Please correct me if I am wrong.
Correct
Simple and quite useful explanation, thank you.
Great tutorial, sir. As always, you make really good tutorials.
Thank you very much!
Since there's no upstream definition in the nginx conf, we're pretty much stuck with a round-robin type of configuration in this setup, right?
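Mostly, yes — per the nginx docs, when proxy_pass points at a hostname that resolves to several addresses, all of them are used in round-robin fashion anyway. If you want a different method, you can add an explicit upstream block; a sketch, assuming the compose service is called api:

```nginx
# Sketch: an explicit upstream block lets you choose the balancing method
upstream api_servers {
    least_conn;          # pick the server with the fewest active connections
    server api:3000;     # resolved at nginx start-up; reload nginx after scaling so new replicas are picked up
}

server {
    listen 80;
    location / {
        proxy_pass http://api_servers;
    }
}
```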
Awesome, but you didn't talk about the load balancer!! Is it in another video?
Well explained. Thank you!
Thank you very much!
This video helped me a lot.
Superb. Thanks for your hard work.
Awesome explanation. Thank you.
Pure Gold 🪙, Thank you man!
Hi, when I tried setting it up, it only routes between 2 containers instead of all the containers (in my case 5). Why is that?
man thanks for great content
Does this also work with JWT Auth containers? Or will this mess up the JWT auth?
I really liked your command prompt theme. Can you please tell me the theme name?
Great tutorial, but scaling all those apps on the same machine, even in different containers, won't make much of a difference, right?
What a nice lesson!
Thank you man you are the best
Are worker containers limited by server threads? Example: a server with 4 cores / 8 threads / 4 GB RAM --> I can create only 7 worker containers + 1 for nginx?
?
Truly a Gem! ✨✨✨
Wow that was awesome 👌 thank you
Is there a way to create a rule for nginx to fire up more Docker containers based on request demand?
I am getting an error while running docker compose up --scale api-2: curl localhost:3000 does not round-robin alternately between the 2 containers, although both containers are created, as seen in docker ps. What could be the issue? Please help, thank you.
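In case it helps anyone hitting the same thing: the flag syntax compose expects is service=count, and nginx only resolves the service name when it starts, so it may need a restart after scaling. A rough sketch, assuming the services are named api and nginx in docker-compose.yml:

```sh
# scale the (assumed) "api" service to 2 replicas
docker compose up -d --scale api=2

# restart the (assumed) "nginx" service so it re-resolves "api" and sees both replicas
docker compose restart nginx
```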
Great video. Is it possible to configure the reverse proxy to send requests to a server on another machine with a different IP? Like configuring an upstream cluster in nginx?
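It is — the upstream servers don't have to be containers on the same host; they can be plain IP:port pairs on other machines. A sketch with made-up addresses (everything below is a placeholder):

```nginx
# Sketch: proxying to servers on other machines (IPs below are placeholders)
upstream remote_cluster {
    server 192.168.1.10:3000;
    server 192.168.1.11:3000;
    server 192.168.1.12:3000 backup;   # only used when the others are unavailable
}

server {
    listen 80;
    location / {
        proxy_pass http://remote_cluster;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```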
How do we scale up the Nginx container so that there are two replicas of Nginx?
Hi! Great video as usual! Is it normal behavior that you can access the api container by its port without exposing port 3000 in its Dockerfile?
Thank you so much amazing video
May I know what terminal you are using? I would also like a recommendation like that.
oh-my-zsh + spaceship prompt + autocomplete plugin
@mafiacodes thanks a lot
In npm ci, ci stands for clean install.
How can I configure this nginx setup to add a new, distinct service under a different location, for example /new?
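One possible shape for that, assuming a second compose service called new-service listening on port 4000 (both the name and the port are made up for the sketch):

```nginx
server {
    listen 80;

    # existing API behind /
    location / {
        proxy_pass http://api:3000;
    }

    # distinct service behind /new (service name and port are assumptions)
    location /new/ {
        # the trailing slash on proxy_pass strips the /new prefix before forwarding
        proxy_pass http://new-service:4000/;
    }
}
```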
Sir, let's say I have three VPSes, each with a dual-core processor, and each VPS will run two Docker containers (of the same image). How can we load balance and proxy using Docker? Since the VPSes are different, they will have different IPs and container IDs, so how do we link all of them? Do we need another VPS just for nginx to load balance, or will simply putting nginx on VPS 1 do the job?
This is not possible
Thank you
How will it auto-scale?
great content! thanks a lot!
what theme are you using?
good video
Can we use this in a production environment??
Yup
Thank you. How do I scale my MySQL database? I am using a 1 vCPU / 2 GB RAM DigitalOcean droplet, which contains a Node.js API and a MySQL server. It works fine, but after 10-12 days MySQL queries become slower. How do I scale MySQL? I have 50% of RAM free, and only 3-4% of the CPU is used most of the time.
Try using CockroachDB... it is fun.
Great! Thank you ❤ Hey bro, I'm scaling a chatbot where the user's history is preserved. After scaling up, I want the user's second and subsequent requests to go to the host that handled the first request. I can send the hostname to the user and have it sent back, as you showed. Can you guide me on how to map a request to a particular host?
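Not the author, but for sticky routing the usual open-source options are hashing in the upstream or keeping the session in a shared store like Redis (the cookie-based sticky directive is NGINX Plus only). A sketch with ip_hash, assuming the backend service is called api:

```nginx
# Sketch: pin each client IP to the same backend so chat history stays on one container
upstream chatbot_backend {
    ip_hash;             # same client IP -> same upstream server
    server api:3000;
}

server {
    listen 80;
    location / {
        proxy_pass http://chatbot_backend;
    }
}
```

ip_hash breaks down behind another proxy or NAT, so a shared session store is usually the more robust fix.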
Best
Can someone share his zsh theme?
Bro, AWS has fewer tutorials on YouTube, so working with AWS is too hard for anyone who has never used it. Can we have a tutorial about it?
I’ll try to make it
Subscribe like share
I love ❤ you
Wait, wait.. npm ci doesn't stand for "continuous integration", it stands for "clean install". Make a correction, dude.
The only not-so-well-explained part is specifying api in proxy_pass without saying where it comes from or how nginx detects it, since it's a service name, not a load-balancer upstream name.
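For anyone else puzzled by this: api is simply the docker-compose service name, and Docker's embedded DNS resolves it for nginx on the shared network; when the name resolves to several container IPs, nginx uses them round-robin even without an upstream block. A compose sketch of that wiring (service names are assumptions, not necessarily the exact ones from the video):

```yaml
services:
  api:
    build: .            # the Node.js app; no published port needed, nginx reaches it over the internal network
  nginx:
    image: nginx:stable-alpine
    ports:
      - "3000:80"       # only nginx is exposed to the host
    volumes:
      - ./default.conf:/etc/nginx/conf.d/default.conf
    depends_on:
      - api
```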
npm ci = clear & install
Imo continuous integration
Why didn't you use any orchestration tool? Why did you make things this complicated??? @yoursTRULY
You can do that