Hey people, I made this video as a first step and a base starting point to use docker containers with Laravel. I see many of you are interested and want to know more about how to handle static files, how to get this ready for production and maybe even how to use nginx with this.
If you are curious and want to see the stuff I mentioned above, please like the comment and I can revisit this video in depth if enough people are interested.
😊
Quick note
If you have problems with Caddy, you can easily switch to nginx: just use an nginx image and add an nginx.conf file (see the sketch at the end of this note).
To go to production, set up your Laravel env, check permissions, build the containers, and you're all set.
The more complex your app is, the deeper you will have to dive into configuring the project; it would be impossible for me to cover every case without making a 15-hour video.
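A minimal sketch of that nginx swap in docker-compose, assuming the project layout from the video (the image, paths, and port mapping are assumptions):

nginx:
  image: nginx:alpine
  ports:
    - "80:80"
  volumes:
    # nginx.conf should fastcgi_pass PHP requests to the app service (e.g. app:9000)
    - ./nginx.conf:/etc/nginx/conf.d/default.conf
    - ./laravel-app/public:/var/www/html/public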
This is amazing, the best explanation of a Docker and Laravel setup. Thank you!
Thank you!!!
Solid explanation. Is this suitable for production?
Yes it is. It has everything you need for a production app.
You should only modify the Caddyfile to suit your needs and, of course, have SSL on your site, since you don't want to serve plain HTTP on port 80 in production.
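For illustration, a minimal production Caddyfile sketch; the domain and the app:9000 upstream are assumptions, and Caddy provisions HTTPS automatically once a real domain is used:

example.com {
    # Serve the Laravel public directory and pass PHP to the app container
    root * /var/www/html/public
    php_fastcgi app:9000
    file_server
}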
Please add Reverb; I have a lot of problems with it.
I'll keep that in mind for future videos.
When switching from 'production' to 'development' mode, do I have to change the 'DB_HOST' from db to localhost?
When you are running it locally with php artisan serve, use localhost. When running it in production, the container needs to connect to the database, and since both are on the internal Docker network, you use db.
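In .env terms the two setups look roughly like this (the db and redis names come from the compose services in the video):

# Local development with php artisan serve on the host:
DB_HOST=127.0.0.1
REDIS_HOST=127.0.0.1

# Running inside the Docker network:
DB_HOST=db
REDIS_HOST=redis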
What about the storage folder in production? How would you handle that? Create a volume for storage? Or what is the best way in your opinion? My problem is I don't want to lose the files, even if I rebuild the image.
If you need storage, map a Docker volume in your caddy service to the storage directory of your application, and any uploaded files will be preserved.
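A minimal sketch of that in docker-compose, assuming the app lives at /var/www/html inside both containers:

services:
  app:
    volumes:
      - storage-data:/var/www/html/storage
  caddy:
    volumes:
      - storage-data:/var/www/html/storage

volumes:
  storage-data:

Because storage-data is a named volume, it lives outside the image and survives rebuilds.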
@@tenacity_dev I have followed your instructions, but why do my files return 404 when I access them?
What I have done is:
app:
  volumes:
    - public-data:/var/www/html/storage/app/public
caddy:
  volumes:
    - ./laravel-app/storage/app/public:/var/www/html/storage/app/public
    - storage-volume:/var/www/html/storage
volumes:
  public-data:
  storage-volume:
Is it the same way if I'm using Vue.js as a frontend with Inertia?
I'm not quite sure. I usually add another service to the docker compose file for my UI whenever I use a frontend framework alongside a backend. Try this template out and add any changes you need.
How do I link the public storage? It always returns 404.
It's probably because the permissions need to be set up.
Do the following:
1. Create an entrypoint.sh script in your Laravel directory and execute it from your Dockerfile.
2. In that entrypoint.sh script run php artisan storage:link, and run chmod and chown on the storage directory (see the sketch after this list).
3. Now the storage has been linked to the public dir and you should go into the nginx container and check if it has those files.
4. Navigate to the URL.
If you can't manage to pull this off, tell me and I'll try to make a video on how to set up storage as well. Most people use a UI framework + Laravel (API only), which is why I didn't go any deeper on static files.
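A minimal entrypoint.sh sketch for steps 1 and 2, assuming a www-data web server user and a standard Laravel layout inside the image:

#!/bin/sh
set -e

# Recreate the public/storage symlink inside the container
# (ignore the error if the link already exists)
php artisan storage:link || true

# Make sure the web server user can write to storage and the cache
chown -R www-data:www-data storage bootstrap/cache
chmod -R 775 storage bootstrap/cache

# Hand off to the image's main process (e.g. php-fpm)
exec "$@"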
@@tenacity_dev I use Laravel for both the frontend and backend, haha. I've fixed it myself: storage:link always produces a file named "storage" in the public folder, I don't know why. But when I run storage:link before building the image, and then run storage:link again directly inside the Docker app container, it works. And I don't know why, when I change filesystem_disk to public it still returns 404, but with filesystem_disk=local it works. Sorry for my bad English, but the conclusion is I don't really know how storage:link works in Docker; I just made a few adjustments to make it work.
@@AbdiMayu-c3p That's great. The storage:link command creates a symlink at public/storage pointing to storage/app/public.
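You can check it from inside the app container; the exact target path depends on where the app lives in your image:

ls -l public/storage
# should print something like: public/storage -> /var/www/html/storage/app/public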
@@tenacity_dev But the storage:link command produces a file named 'storage', not a symlinked folder. And hey, I have one request.
Since Laravel is much faster if we use FrankenPHP, can you make a separate video about it?
@@AbdiMayu-c3p Got it, I'll keep that in mind for future videos. I'll have to read up on FrankenPHP first. I know what it is but I need to prepare and learn it before making the video.
Everything was going fine until I checked localhost:8000 and found this error:
could not translate host name "db" to address: nodename nor servname provided, or not known (Connection: pgsql, SQL: select * from "sessions" where "id" = ............... limit 1)
When you are developing locally, go to the .env file and use 127.0.0.1 as the hostname for the database and Redis. When you are running in the Docker environment, use db and redis as the hostnames.
@@tenacity_dev How do you do development in a Docker environment?
@@suwandicahyadi9213 When you want to develop the application, set the HOST env variables for Redis and Postgres to 127.0.0.1 and run the redis and postgres containers from the docker compose stack. Go to your Laravel project, run php artisan serve, and you can then develop. When you are done and want to deploy the containers, SSH to your server, set the HOST env variables to redis and db for their respective services, and run the required optimization commands inside the app container (config caching and so on).
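As a sketch, the usual post-deploy commands run inside the app container look like this (the app service name matches the compose file above; adjust the list to your project):

docker compose exec app php artisan config:cache
docker compose exec app php artisan route:cache
docker compose exec app php artisan view:cache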
Subs👍