Organizing Your Airflow Project Code with the Astro CLI
- Published: 18 Sep 2024
- This "Live with Astronomer" session provides an overview of how to use the open-source Astro CLI to quickly get up and running with Airflow and keep your Airflow project code organized. The Astro CLI comes with a pre-baked project structure designed to make managing DAG files, extra scripts, plugins, and packages easy and scalable for your whole team.
Questions covered in the webinar include:
- How do you get started with Airflow using the Astro CLI?
- What directory structure does an Astro CLI project use?
- How do I organize my DAG files and the scripts they call to keep my project scalable and easy to use?
- How do I install Python and OS-level packages in my Airflow environment?
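For reference, a sketch of the directory layout that `astro dev init` scaffolds (folder names here follow the standard Astro CLI template; check the docs linked below for the current layout):

```
.
├── dags/             # DAG definition files
├── include/          # extra scripts and files your DAGs use
├── plugins/          # custom Airflow plugins
├── tests/            # DAG unit tests
├── Dockerfile        # the Astro Runtime image; customize for build-time setup
├── packages.txt      # OS-level (apt) packages, one per line
├── requirements.txt  # Python packages, one per line
└── .env              # local environment variables
```

Running `astro dev start` from the project root builds the image (installing anything listed in packages.txt and requirements.txt) and starts Airflow locally.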
To try an Astro CLI project like the one covered in this webinar, check out our documentation here: docs.astronome...
Thanks to all the Astronomer team
Thanks for watching!
Thank you so much for sharing this with us
My pleasure 😊
Airflow wasn't able to locate my existing Python scripts. I receive this error: ImportError: cannot import name 'weeklyExtract' from 'dags' (unknown location)
Would you mind sharing how you're referencing the script in your code? And where are your Python scripts stored? Typically you'll need to create a sub-folder within the dags folder to store them, and then you can reference them from that path.
Hi, I already have an existing Airflow project. How can I use the Astro CLI to run it?
How can I set up with hive, hadoop, spark, etc?
What are you trying to set up specifically?
@@Astronomer For example, if I use Docker, I can set up Airflow, Hive, Hadoop, and Hue, each with its own Dockerfile and docker-compose. Can I do that on Astro?
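The Astro CLI supports adding extra services to the local environment via a docker-compose.override.yml file in the project root. A hedged sketch (the service name, image, and ports here are illustrative, not a vetted Spark setup):

```yaml
# docker-compose.override.yml in the Astro project root
version: "3.1"
services:
  spark:
    image: bitnami/spark:3
    ports:
      - "8085:8080"   # avoid clashing with the Airflow webserver on 8080
```

With this file present, `astro dev start` brings the extra container up alongside the Airflow webserver, scheduler, and Postgres containers.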
Hi. Thank you for a great video. I have one question: can I somehow start Astro locally inside my existing project, which already follows a different structure? I would very much like to benefit from the convenience of the Astro CLI, but there's no way I want to modify the structure of a project that has been in place for more than 1.5 years :)
Having some issues including apache-airflow-providers-mysql==4.0.0. Is this a known error while generating package metadata?
What kind of errors specifically? MySQL does have some kinks in terms of other packages that need to be installed for it to work properly.
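A common cause of "generating package metadata" failures with the MySQL provider is that its mysqlclient dependency needs OS-level build libraries. One possible fix, assuming a Debian-based Astro Runtime image, is adding the system packages to packages.txt (package names are an assumption; adjust for your base image):

```
# packages.txt -- apt packages installed into the image at build time
build-essential
default-libmysqlclient-dev
pkg-config
```

Then rebuild with `astro dev restart` so the image picks up the new packages.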
The webserver is not starting (port 8080), and I am getting this error:
Airflow is starting up!
Error: there might be a problem with your project starting up. The webserver health check timed out after 1m0s but your project will continue trying to start. Run 'astro dev logs --webserver | --scheduler' for details.
Try again or use the --wait flag to increase the time out
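The commands the error message points at, spelled out (the 5m value is just an example duration):

```
astro dev logs --webserver   # inspect webserver logs for the real failure
astro dev logs --scheduler   # inspect scheduler logs
astro dev start --wait 5m    # give the webserver health check more time
```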
Is there another service running on that port? I recommend running 'docker system prune'.
Hi guys, I was trying the Astro CLI for the first time on my Win11 machine (I know Windows is not the best env for this). It worked the first time, but when I needed to add more requirements and run the command "astro dev restart", it usually breaks, giving the following error:
"Error: error building, (re)creating or starting project containers: error response from daemon: unable to find user astro, no matching entries in passwd file"
It creates the airflow, default, postgres_data, airflow_logs and webserver;
but it says "started" for the postgres-1 container and "starting" for the scheduler, don't know if that helps. Have you guys ever bumped into an error like that? Thanks in advance
Hey, try manually killing the cluster in Docker Desktop and then running astro dev start again. If that doesn't work, let me know!