Installing and Running Airflow on a small AWS EC2 Instance
- Published: Feb 4, 2025
- Running Apache Airflow on a small AWS EC2 Instance
Many people think Airflow is resource-hungry, but you can install and run it on a machine with only 1 CPU and 2 GB of RAM.
🏆 BECOME A PRO: www.udemy.com/...
👍 Smash the like button to become an Airflow Super Hero!
❤️ Subscribe to my channel to become a master of Airflow
🚨 My Patreon: / marclamberti
The materials: robust-dinosau...
Enjoy 🫶
Fantastic video. Got it up and running in no time. Simple yet comprehensive video that came out just in time to help me with my project.
Happy to help
Hey, great vid Marc! I was able to follow along, just surprised we need Postgres. I found this useful to get me started on my personal project of taking some data and writing it to a SQL Server DB daily. Thanks
This video was such a tremendous help, thank you so much Marc :)
Airflow on Fargate next please! Thank you so much for your work.
This video is a blessing for me. Thank you so much for making this.
Happy to help 🙌
Great work Marc! Please keep it up
Simple and great explanation!!! Thanks a lot
Marc, for simple Airflow projects with only 5 simple DAGs, is it cheaper to use the AWS managed Airflow service or to run Airflow on an EC2 instance?
Thanks for this. I once got Airflow to start on a Micro instance with an external database 😂but once I installed the Databricks operators it was game over from a disk space POV
Fantastic! Thanks Marc!
Thank you!
Great work Marc. Helpful content!
Hey, great vid, thank you very much. I just have a question:
Is the t2.small instance in EC2 free? Or do I have to pay at the end of the usage period?
In cases where you install certain "latest versions" (e.g. Airflow 2.5.0, Python 3.10, etc.), can you mention whether we should stick with the versions shown in the video or look up the actual latest version while installing? Will installing a newer version cause any issues with the rest of the steps?
How do we keep the scheduler always running?
Thanks for the video, Marc. It was unclear though how it differs from running on 4GB or 8GB instances because I didn't see any specific points on memory allocation. Also, I tried to run Airflow in a Docker container on a 2GB ec2 instance, but at some point, the instance stopped responding for some reason. I wasn't able to find out whether Airflow or Docker caused the issue, but for me, it didn't work.
Will webserver run even if we close terminal ?
Also how will we run both webserver and scheduler in background even after closing terminal ?
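One common way to keep both components running after the terminal closes (not covered in the video; this sketch assumes the venv is activated and `airflow` is on the PATH) is to detach them with nohup:

```shell
# Sketch: start both Airflow components detached from the terminal,
# so they keep running after the SSH session ends.
# Assumes an activated venv with `airflow` on the PATH.
nohup airflow webserver > webserver.log 2>&1 &
nohup airflow scheduler > scheduler.log 2>&1 &
# disown   # optional (bash): remove the jobs from the shell's job table
```

For anything beyond a quick test, a process supervisor such as a systemd unit is more robust than nohup.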
Hi Marc, again, thanks for your videos. After running Airflow on an AWS EC2 instance, can I still follow your "Apache Airflow: The Hands-On Guide" course on Udemy without any problems? What happens when I stop the EC2 instance and resume later? Do I need to recreate everything from scratch? How can I avoid that?
Great video! Have you considered making a terraform script so all of these steps could be deployed together?
Nope, but if you wanna partner on that I would be happy 🫶
excellent example
Can you make it with Docker? And Kubernetes?
Hi Marc. Can you give an idea of the monthly costs for this setup? Thanks!
$0.023 per hour, so $201 per year ($16.80 per month)
Prices vary depending on the region you select
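The arithmetic above can be checked in a couple of lines (the hourly rate is the figure quoted in the comment, not live AWS pricing):

```python
# Sketch: verify the quoted t2.small cost figures.
# $0.023/hour is the commenter's rate, not current AWS pricing.
hourly = 0.023
yearly = hourly * 24 * 365   # ~201.48
monthly = yearly / 12        # ~16.79
print(f"${yearly:.2f} per year, ${monthly:.2f} per month")
```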
Hello! Your video helped me a lot! What should I do if I want to change the port number from 8080 to 8081?
If I want to deploy airflow in a production environment with multiple workers for scalability, how should I do it ? I assume with kubernetes?
Sir, please can you tell us how to stop all the Airflow components and restart them with a single command, so that changes can be made easily?
How do we disable the example dags and reduce the clutter in web UI?
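For anyone wondering: the bundled example DAGs are controlled by Airflow's `load_examples` setting; a minimal sketch using the standard environment-variable mapping:

```shell
# Sketch: hide Airflow's bundled example DAGs.
# Equivalent to setting load_examples = False under [core] in airflow.cfg.
export AIRFLOW__CORE__LOAD_EXAMPLES=False
# Restart the webserver and scheduler afterwards for the change to take effect.
```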
Can you make a video on how we should run Airflow on Windows rather than Ubuntu?
Hey, do you cover the SQLite DB in Airflow in your Udemy course?
Beautiful video, thank you!
Two questions:
If I shut down my computer, will the dags still run in the cloud?
Can anyone enter with the public ipv4 dns 12:42 or can only I enter?
correct
@@MarcLamberti Awesome, thank you!
@@MarcLamberti Correct what?
@@MrMadmaggot shutdown pc - resources still run in the cloud.
Can anyone enter... -Security is not my best friend, but this does not look secure at all to me. Anyone with the link can enter for sure.
How is it possible? How can the DAGs run if the instance is shut down? @@UsrU-y8c
I'm facing an issue while mapping the SQLAlchemy connection to Postgres; when I tried what you said, it's not reflected in the corresponding variable.
In your Udemy course, have you covered the SparkSubmitOperator?
not yet
Awesome video. Is it possible to install airflow via cloud9 (it runs on ec2 and can be stopped anytime)?
absolutely
Do you have an azure equivalent video? Thanks!
I have a question: is it important to install PostgreSQL first?
yes
Anyone notice that the written command sequence is different from the tutorial's verbal sequence?
For example (4:25), `python3 -m venv venv` is left out of the written sequence; it's necessary to create the venv directory with the virtual environment.
Then the lib install (sudo apt-get install libpq-dev) comes before the virtual environment activation (source venv/bin/activate). (I don't know if this is a problem.)
Since the EC2 instance works in the tutorial, it might be a good idea to use it to 'fix' the written sequence… or just follow along.
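The order described in the comment above, as a sketch (the package names come from the video; the apt-get line is shown for reference since it needs sudo and network access):

```shell
# Sketch of the setup order described above.
# System package first (for the Postgres client library):
#   sudo apt-get install -y libpq-dev
python3 -m venv venv      # create the venv -- the step missing from the written notes
. venv/bin/activate       # activate it BEFORE installing any Python packages
python -c "import sys; print(sys.prefix)"   # should now point inside ./venv
# Then, with the venv active, install Airflow, e.g.:
#   pip install "apache-airflow[postgres]"
```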
Why, when I run the command `airflow db init`, does it not create the airflow folder?
Hi, where are those dag files stored
python3 -m venv venv is missing from the notion page
The tutorial works, but I have a problem: "The scheduler does not appear to be running." This happened after I closed the SSH connection, but the webserver is still active.
I need to use this Airflow to schedule a periodic task. What should I do? How do I keep the scheduler active?
Thanks
Tried this with azure VMs and got some errors... I had to give up lol
You need a lot of dollars to run Airflow on AWS
Good thing you can have a free trial on Astronomer
Encountered an issue at 10:16 with airflow db init.
Error: psycopg2.errors.InsufficientPrivilege: permission denied for schema public
Solution that worked for me: ALTER DATABASE airflow_db OWNER TO airflow_user;
Note: I am using the latest Airflow, 2.9.1 as of this point.
tysm! ran into the same issue here
In PostgreSQL 16, I had to add:
GRANT ALL ON SCHEMA public TO airflow;
ALTER DATABASE airflow OWNER TO airflow;
GRANT CONNECT ON DATABASE airflow TO airflow;
GRANT USAGE ON SCHEMA public TO airflow;
GRANT CREATE ON SCHEMA public TO airflow;
GRANT ALL PRIVILEGES ON ALL TABLES IN SCHEMA public TO airflow;
GRANT ALL PRIVILEGES ON ALL SEQUENCES IN SCHEMA public TO airflow;
to the SQL he created. It's overkill, but it's just a test.