Organizing Your Airflow Project Code with the Astro CLI

  • Published: Sep 18, 2024
  • This "Live with Astronomer" session provides an overview of how to use the open-source Astro CLI to quickly get up and running with Airflow and keep your Airflow project code organized. The Astro CLI comes with a pre-baked project structure designed to make managing DAG files, extra scripts, plugins, and packages easy and scalable for your whole team.
    Questions covered in the webinar include:
    - How do you get started with Airflow using the Astro CLI?
    - What directory structure does an Astro CLI project use?
    - How do I organize my DAG files and the scripts my DAGs call to keep my project scalable and easy to use?
    - How do I install Python and OS-level packages in my Airflow environment?
    To try an Astro CLI project like the one covered in this webinar, check out our documentation here: docs.astronome...
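As a quick sketch of what the session covers: running astro dev init scaffolds a project with the standard Astro CLI layout. The tree below reflects the CLI's default scaffold at the time of writing; check the linked documentation for your CLI version.

```
astro dev init          # scaffolds a new project in the current directory
.
├── dags/               # DAG files (and sub-folders for helper scripts)
├── include/            # extra files made available inside the containers
├── plugins/            # custom Airflow plugins
├── tests/              # example DAG tests
├── Dockerfile          # Astro Runtime base image; extend for custom setup
├── requirements.txt    # Python packages installed at image build time
└── packages.txt        # OS-level (Debian) packages installed at build time
astro dev start         # builds the image and starts Airflow locally
```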

Comments • 17

  • @openclass4all • 1 year ago +2

    Thanks to all the Astronomer team

  • @sabaokangan • 1 year ago +1

    Thank you so much for sharing this with us

  • @bananaboydan3642 • 9 months ago +1

    Airflow wasn't able to locate my existing Python scripts. I receive this error: ImportError: cannot import name 'weeklyExtract' from 'dags' (unknown location)

    • @Astronomer • 8 months ago

      Would you mind sharing how you're referencing the script in your code? And where are your Python scripts stored? Typically you'll need to create a sub-folder within the dags folder to store them, and then you can reference them from that path.
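To illustrate the answer above: Airflow puts the dags/ folder itself on sys.path, so a helper kept in a sub-folder like dags/scripts/ is imported relative to dags/, not as "from dags import ...", which is likely why the commenter's import failed. The sketch below simulates that layout in a temp directory; the file name weekly_extract.py and the function weeklyExtract mirror the comment and are assumptions, not a fixed convention.

```python
import os
import sys
import tempfile

# Simulate dags/scripts/weekly_extract.py locally to show the import mechanism.
dags = tempfile.mkdtemp()
scripts_dir = os.path.join(dags, "scripts")
os.makedirs(scripts_dir)
open(os.path.join(scripts_dir, "__init__.py"), "w").close()
with open(os.path.join(scripts_dir, "weekly_extract.py"), "w") as f:
    f.write("def weeklyExtract():\n    return 'extracted'\n")

# Airflow does the equivalent of this for the dags/ folder automatically.
sys.path.insert(0, dags)

# So inside a DAG file (dags/my_dag.py) the import is relative to dags/:
from scripts.weekly_extract import weeklyExtract

print(weeklyExtract())  # prints "extracted"
```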

  • @ziedsalhi4503 • 6 months ago

    Hi, I already have an existing Airflow project, so how can I use the Astro CLI to run my project?

  • @nghiepeiendesu7342 • 11 months ago +1

    How can I set it up with Hive, Hadoop, Spark, etc.?

    • @Astronomer • 11 months ago

      What are you trying to set up specifically?

    • @nghiepeiendesu7342 • 11 months ago

      @Astronomer For example, if I use Docker I can set up Airflow, Hive, Hadoop, and Hue, each with its own Dockerfile and docker-compose. But can I do that on Astro?
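For readers with the same question: per Astronomer's documentation, the Astro CLI picks up a docker-compose.override.yml in the project root when running astro dev start, which is how extra local services can sit alongside Airflow. The service name, image tag, and ports below are illustrative assumptions, not a tested Spark setup.

```yaml
# docker-compose.override.yml (project root) — merged into the compose file
# the Astro CLI generates for `astro dev start`. Details are illustrative.
version: "3.1"
services:
  spark:
    image: bitnami/spark:3.5   # hypothetical image/tag for a local Spark node
    ports:
      - "8085:8080"            # avoid clashing with the Airflow webserver's 8080
```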

  • @vladislavzadvornev4548 • 7 months ago

    Hi. Thank you for a great video. I have one question. Can I somehow start Astro locally inside my existing project that already follows a different structure? I would very much like to benefit from the convenience of the Astro CLI, but there's no way I want to modify the structure of a project that has been in place for more than 1.5 years :)

  • @dakadoodle6295 • 1 year ago

    Having some issues including apache-airflow-providers-mysql==4.0.0. Is this a known error while generating package metadata?

    • @Astronomer • 1 year ago

      What kind of errors specifically? MySQL does have some kinks in terms of other packages that need to be installed for it to work properly.
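For context on the answer above: with the Astro CLI, Python dependencies go in requirements.txt and OS-level libraries go in packages.txt, both installed at image build time. The MySQL client library typically needs build tooling and headers; the exact Debian package names below are assumptions and may vary by Astro Runtime version.

```
# requirements.txt
apache-airflow-providers-mysql==4.0.0

# packages.txt  (Debian packages; names are assumptions)
build-essential
default-libmysqlclient-dev
```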

  • @shafiect2151 • 1 year ago

    The webserver is not starting (port 8080), and I am getting this error:
    Airflow is starting up!
    Error: there might be a problem with your project starting up. The webserver health check timed out after 1m but your project will continue trying to start. Run 'astro dev logs --webserver | --scheduler' for details.
    Try again or use the --wait flag to increase the timeout

    • @Astronomer • 1 year ago

      Is there another service running on that port? I recommend running a docker prune
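Two hedged ways to chase down the port clash suggested above (the astro config key is from Astronomer's CLI docs; verify it against your CLI version):

```shell
# See what is already listening on 8080 (macOS/Linux)
lsof -i :8080

# Or move the local Airflow webserver to another port entirely
astro config set webserver.port 8081
astro dev start
```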

  • @dataengineermatheusbudin7011 • 1 year ago

    Hi guys, I was trying the Astro CLI for the first time on my Win11 machine (I know Windows is not the best env for this). It worked the first time, but when I needed to add more requirements and run the command "astro dev restart" it usually breaks, giving the following error:
    "Error: error building, (re)creating or starting project containers: error response from daemon: unable to find user astro, no matching entries in passwd file"
    It creates the airflow, default, postgres_data, airflow_logs and webserver;
    but it says "started" for the postgres-1 container and "starting" for the scheduler, don't know if it helps. Have you guys ever bumped into an error like that? Thanks in advance

    • @Astronomer • 1 year ago

      Hey, try manually killing the cluster in Docker Desktop and then running astro dev start again. If that doesn't work, let me know!