Very instructive, I liked it. Some honest reviews: this would still work without specifying an app network, as docker-compose creates a default network. Also, no working_dir is necessary, since you have already specified it in the Dockerfile. And it would be nice to have some volumes for data, as for now every time you docker-compose up and down, the data dies.
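To act on the volumes point above: a named volume mounted at Postgres's data directory survives docker-compose down/up cycles. A minimal sketch, assuming the database service is called postgres (the service name and image tag here are assumptions, not the video's exact file):

```yaml
# docker-compose.yml (sketch)
services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_PASSWORD: admin
    volumes:
      # named volume: survives `docker-compose down` (but not `down -v`)
      - pgdata:/var/lib/postgresql/data

# top-level declaration of the named volume
volumes:
  pgdata:
```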
Hi, very informative video! I just had one question, is it normal that the node_modules directory on my host, which is generated from the container, is owned by root?
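A likely explanation: the container runs as root by default, so any files it creates inside a bind mount (like node_modules) show up on the host owned by root. One common workaround, sketched here as an assumption rather than the video's setup, is to run the container as your host user:

```yaml
# docker-compose.yml (sketch)
services:
  app:
    build: .
    # run as the host user's uid:gid so bind-mounted files stay owned by you;
    # UID/GID would need to be exported in your shell, e.g. `export UID GID`
    user: "${UID:-1000}:${GID:-1000}"
```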
I had to change my .env variable db_host to the same as the container name in my docker-compose file. I didn't see you do it here, so how does it work with db_host=localhost for you?
Yes, you're right. I should change that value. It should be postgres instead of localhost for the db_host variable. Let me explain how it works. If there is no port mapping in the docker-compose file, or none passed to the docker run command, the containers remain invisible from the host network's perspective. This means we cannot use the localhost alias to access them. Instead, the containers can talk to each other directly, using their service names.
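The service-name resolution described above can be sketched in a minimal compose file (service and variable names are assumptions based on this thread, not the video's exact files):

```yaml
# docker-compose.yml (sketch)
services:
  app:
    build: .
    environment:
      # "postgres" resolves via Docker's internal DNS
      # to the database container on the shared network
      DB_HOST: postgres
      DB_PORT: 5432
  postgres:
    image: postgres:13
    environment:
      POSTGRES_PASSWORD: admin
```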
We could use the localhost value for the db_host variable (when connecting from the host) if we mapped the ports in the docker-compose file. We would have to add this to the postgres service:
ports:
  - 5432:5432
This is what I was looking for, a dev setup for Nest!
Thank you so much! Please make more videos about NestJS. You teach very well.
Holy focking shite, nice video. MORE WITH DOCKER PLEASE, resources are rare in this area
Nice work! I now understand some things: how to use ARG with a Dockerfile and target with docker-compose, and how to prepare an image for production. Thanks!!!
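A minimal sketch of the ARG + build target combination mentioned above (stage names, base image, and variable names are assumptions, not the video's exact files):

```dockerfile
# Dockerfile (sketch) – multi-stage build
FROM node:16 AS development
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .

FROM node:16 AS production
# ARG is a build-time value; it can be overridden per build
ARG NODE_ENV=production
ENV NODE_ENV=${NODE_ENV}
WORKDIR /app
COPY package*.json ./
RUN npm install --only=production
COPY . .
RUN npm run build
CMD ["node", "dist/main"]
```

The compose file then picks which stage to build via target:

```yaml
# docker-compose.yml (sketch)
services:
  app:
    build:
      context: .
      target: development   # build only up to the "development" stage
```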
Pretty nice 👍
Ah, I see. Thanks for the detailed answers! I am really enjoying the series!
Use a different db for prod and dev 👍🏻
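One way to follow this suggestion is with compose override files; the filenames and database names below are assumptions, just to illustrate the pattern:

```yaml
# docker-compose.override.yml (dev – picked up automatically by `docker-compose up`)
services:
  postgres:
    environment:
      POSTGRES_DB: app_dev

# docker-compose.prod.yml (prod – applied explicitly with:
#   docker-compose -f docker-compose.yml -f docker-compose.prod.yml up)
# services:
#   postgres:
#     environment:
#       POSTGRES_DB: app_prod
```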