Focus On: The Apache Airflow PythonOperator, all you need in 20 mins!

  • Published: 15 Apr 2021
  • In this video you are going to discover everything about the PythonOperator in Airflow 2.0.
    By the end of the video you will be able to:
    ⏺ Execute a Python function (Amazing isn't it?)
    ⏺ Pass positional arguments to your Python function
    ⏺ Pass keyword arguments to your Python function
    ⏺ Evaluate your arguments at runtime and pass variables
    ⏺ Reduce the number of calls to your database
    ⏺ Make your DAG cleaner
    ⏺ Get the current execution date in Airflow 2.0
    ⏺ Leverage the TaskFlow API to transform your PythonOperator in a mind-blowing way!
    Materials:
    www.notion.so/The-PythonOpera...
    The NEW WAY OF CREATING DAGS:
    • TaskFlow API in Airflo...
    Enjoy ❤️
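The mechanics behind most of the bullet points above are simple: the PythonOperator calls your `python_callable` with whatever you put in `op_args` and `op_kwargs`. A minimal sketch in plain Python, no Airflow install needed; the function `process` and its argument values are invented for illustration, and the commented-out operator shows how they would be wired up:

```python
def process(path, *, chunk_size=100, table="my_table"):
    """Stand-in for the python_callable you hand to PythonOperator."""
    return f"loading {path} into {table} in chunks of {chunk_size}"

# PythonOperator(
#     task_id="process_file",
#     python_callable=process,
#     op_args=["/data/file.csv"],       # positional arguments
#     op_kwargs={"chunk_size": 500},    # keyword arguments
# )
# behaves, at execution time, roughly like:
op_args = ["/data/file.csv"]
op_kwargs = {"chunk_size": 500}
result = process(*op_args, **op_kwargs)
print(result)
```

With the TaskFlow API the same callable becomes a `@task`-decorated function and the arguments are passed as a normal function call.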

Comments • 30

  • @goodmanshawnhuang · 4 days ago

    Great job, thanks for sharing it.

  • @chris0628 · 1 year ago +1

    To the point, no fluff 👍🏽

  • @steliosioannides7128 · 2 years ago

    Thanks Marc for this very useful video! Highly appreciated! Keep it up!!

  • @Leonardo-jv1ls · 2 years ago +1

    Thank you. You've helped me a lot.

  • @andrikramer686 · 3 years ago

    Awesome. Thanks a lot, Marc.

  • @amitjain000 · 3 years ago +1

    Awesome.. thanks for such a beautiful knowledge share.

  • @averychen4633 · 6 months ago

    you are the best

  • @JosePla · 2 years ago

    This is exactly what I was looking for

  • @AbhishekAmeria · 3 years ago

    That was very useful. Thanks :)

  • @vladimirobellini6128 · 3 years ago

    this is great. thanks!

  • @FranVarVar · 1 year ago +1

    Amazing. Thanks Marc!

  • @shivangitomar5557 · 2 years ago

    Very good video!!!

  • @manjunathmani3678 · 3 years ago

    nice man

  • @SpiritOfIndiaaa · 3 years ago

    Excellent, thanks a lot! I have to check an Oracle table for whether a record exists for the current date; only if it exists should the downstream task run. How can I achieve this?
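A hedged sketch of one common answer: Airflow's ShortCircuitOperator skips everything downstream when its callable returns a falsy value, so the check can be a small function around your Oracle query. Here `fetch_count` and `query_oracle` are hypothetical stand-ins for the real database call:

```python
def record_exists_today(run_date, fetch_count):
    """Return True when the table already has a row for run_date.

    fetch_count stands in for the real Oracle call, e.g.
    SELECT COUNT(*) FROM my_table WHERE load_date = :run_date
    """
    return fetch_count(run_date) > 0

# Wiring it up (sketch): ShortCircuitOperator skips all downstream
# tasks when the callable returns a falsy value.
# check = ShortCircuitOperator(
#     task_id="check_record",
#     python_callable=lambda ds=None: record_exists_today(ds, query_oracle),
# )
# check >> downstream_task

# Demo with fake lookups:
print(record_exists_today("2021-04-15", lambda d: 3))  # a row exists -> True
print(record_exists_today("2021-04-15", lambda d: 0))  # no row -> False
```

BranchPythonOperator is the alternative when you want to choose between two branches instead of simply skipping.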

  • @sridhar43299 · 2 years ago +1

    How do you send an object from GCS to email with Airflow? If possible, send me code.

  • @bullandrooster · 2 years ago

    Can I run 2 DAGs with different Python versions and different pandas versions? For now I see that Airflow offers only one Python version at a time, with only predefined dependencies, and all my DAGs must match those specs.
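One hedged answer, assuming Airflow 2.x: the PythonVirtualenvOperator (and, in newer releases, the ExternalPythonOperator) runs a single task in its own interpreter with its own dependencies, so two DAGs do not have to share one pandas version. The operator call below is a sketch with an invented callable name; the runnable part only demonstrates the underlying idea of delegating work to another interpreter:

```python
# Sketch: PythonVirtualenvOperator builds a per-task virtualenv,
# so this task could pin its own pandas and Python version:
#
# PythonVirtualenvOperator(
#     task_id="pandas_old",
#     python_callable=transform,
#     requirements=["pandas==1.1.5"],
#     python_version="3.7",
# )
#
# Underneath, this is just "run the work in another interpreter",
# which plain subprocess can illustrate:
import subprocess
import sys

out = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.version_info[0])"],
    capture_output=True, text=True,
)
print(out.stdout.strip())
```

The KubernetesPodOperator is the heavier option when each task needs a fully separate container image.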

  • @GeandersonLenz · 3 years ago +2

    How do you turn on this 'includes' module?

    • @leomax87 · 2 years ago

      did you resolve it?

  • @brendoaraujo9110 · 2 years ago

    Instead of calling a function from the script, is there an operator that can execute the script as a whole?
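A hedged sketch of one option: the BashOperator runs an arbitrary command, so it can execute a whole script file rather than an imported function. The operator line is shown as a comment (the path is invented); the runnable part mimics what it does with `subprocess`:

```python
# Sketch: BashOperator executes an arbitrary command, so it can run a
# script end to end (the path below is invented):
#
# BashOperator(task_id="run_script", bash_command="python /path/to/script.py")
#
# which amounts to shelling out, as this runnable stand-in shows:
import subprocess
import sys

result = subprocess.run(
    [sys.executable, "-c", "print('script ran')"],
    capture_output=True, text=True,
)
print(result.stdout.strip())
```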

  • @eunheechoi3745 · 9 months ago

    Whenever I first run the airflow test command, it throws an error saying 'sqlalchemy.exc.NoReferencedTableError: Foreign key associated with column 'dag_run_note.user_id' could not find table 'ab_user' with which to generate a foreign key to target column 'id''
    Do you know what it is about, and how to fix it?
    On the second run it doesn't occur...

    • @Astronomer · 9 months ago

      Hmmmmm interesting, if it works on the second run maybe it needs to generate the table first?

  • @BalvanshHeerekar · 2 years ago +1

    Hey, while using includes I get the error - ModuleNotFoundError: No module named 'includes'

    • @leomax87 · 2 years ago

      I'm facing the same error. How can I resolve it?

    • @eldrigeampong8573 · 1 year ago

      @leomax87 You can use the sys module to add the folder that contains your package to Python's import path before importing:
      import sys
      sys.path.append(".")  # adds the current working directory (the dags folder) to the path
      from includes.my_dags.functions import process

  • @valetta6789 · 2 years ago

    I am trying to pull the value from another task in a different folder, however, I see only the current task instance. Is this new behaviour, or am I unable to pass between different directories?

    • @Astronomer · 1 year ago

      So your DAGs are in different folders?

    • @valetta6789 · 1 year ago

      @@Astronomer I have no idea what I was talking about😄
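For anyone landing on this thread: XComs are looked up by DAG id and task id, not by the folder a file lives in, so `xcom_pull` works across directories as long as the ids match. A sketch of the semantics; the dict-based store below is a toy model for illustration, not Airflow's implementation:

```python
# In Airflow, XComs are keyed by dag_id/task_id, not file location:
# ti.xcom_pull(task_ids="extract")                       # same DAG
# ti.xcom_pull(dag_id="other_dag", task_ids="extract")   # another DAG
#
# Toy model of that keyed store:
xcom_store = {}

def xcom_push(dag_id, task_id, value):
    xcom_store[(dag_id, task_id)] = value

def xcom_pull(dag_id, task_id):
    return xcom_store.get((dag_id, task_id))

xcom_push("etl", "extract", {"rows": 42})
print(xcom_pull("etl", "extract"))   # {'rows': 42}
print(xcom_pull("etl", "missing"))   # None
```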

  • @archanam4224 · 2 years ago

    import airflow cannot be resolved
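"import airflow cannot be resolved" is usually an environment problem (the editor or interpreter is not the one Airflow was installed into) rather than a DAG bug. A minimal diagnostic sketch; the install command in the comment uses example version numbers that you should match to your own setup:

```python
# Which interpreter is running, and does it have airflow?
import importlib.util
import sys

print(sys.executable)
spec = importlib.util.find_spec("airflow")
print("airflow installed" if spec else "airflow NOT installed")

# If it is missing, install it into *that* interpreter using the
# official constraints file (versions here are examples only):
#   pip install "apache-airflow==2.3.0" --constraint \
#     "https://raw.githubusercontent.com/apache/airflow/constraints-2.3.0/constraints-3.8.txt"
```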