Cloud Composer - Orchestrating an ETL Pipeline Using Cloud Dataflow

  • Published: 6 Jan 2025

Comments • 16

  • @ShehneelAhmedKhan
    18 days ago

    Great Stuff!
    One question: what should be done if we have to do ETL from Postgres to BigQuery, where Postgres can have 70-80 relational tables? Do we convert each table into a CSV and write different transformations for each CSV?
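
    A minimal sketch (not from the video) of one way to handle that: instead of writing a separate transformation per CSV, a Composer DAG can loop over the table list and generate one extract-and-load branch per table. The connection id, bucket, dataset and table names below are placeholders.

      # Hypothetical fan-out DAG: one Postgres -> GCS -> BigQuery branch per table.
      from datetime import datetime

      from airflow import DAG
      from airflow.providers.google.cloud.transfers.postgres_to_gcs import PostgresToGCSOperator
      from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

      TABLES = ["customers", "orders", "payments"]  # extend to the full 70-80 tables
      BUCKET = "my-staging-bucket"                  # placeholder bucket

      with DAG("postgres_to_bq_fanout", start_date=datetime(2025, 1, 1),
               schedule_interval="@daily", catchup=False) as dag:
          for table in TABLES:
              extract = PostgresToGCSOperator(
                  task_id=f"extract_{table}",
                  postgres_conn_id="postgres_default",     # Airflow connection to Postgres
                  sql=f"SELECT * FROM {table}",
                  bucket=BUCKET,
                  filename=f"staging/{table}/{{{{ ds }}}}.csv",
                  export_format="csv",
              )
              load = GCSToBigQueryOperator(
                  task_id=f"load_{table}",
                  bucket=BUCKET,
                  source_objects=[f"staging/{table}/{{{{ ds }}}}.csv"],
                  destination_project_dataset_table=f"my_dataset.{table}",  # placeholder dataset
                  source_format="CSV",
                  skip_leading_rows=1,
                  autodetect=True,
                  write_disposition="WRITE_TRUNCATE",
              )
              extract >> load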

  • @vigsulagam2294
    1 year ago +1

    Thanks!

  • @pradipfunde2099
    1 year ago +1

    Useful video

  • @naren06938
    2 months ago +1

    Hi bro... I tried a similar fusion project... I got an IAM permission error, even though I also granted the Editor and Owner roles...

  • @ashwinjoshi3331
    1 year ago +2

    Thanks a lot. This is really helpful. One question - here the source is a CSV file. What changes are required if the source is Oracle on-premise data? I actually tried Datastream, but due to a limitation at the client end we were asked to use Dataflow. I tried to check but could not find any specific details. Could you please suggest a reference here from the Oracle connection point of view?

    • @cloudaianalytics6242
      1 year ago +1

      Sure. Please find the link below
      precocityllc.com/blog/configuring-cloud-composer-for-oracle-databases/

    • @ashwinjoshi3331
      1 year ago +1

      Thanks for the link. It's really helpful.
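
    A minimal sketch (not from the video) of the Dataflow side for an on-premise Oracle source, assuming network connectivity from the Dataflow workers to the database and an Oracle JDBC driver available to Beam's cross-language JDBC transform. The URL, credentials and table names below are placeholders, and the destination BigQuery table is assumed to already exist.

      # Hypothetical Beam pipeline: read on-premise Oracle over JDBC, write to BigQuery.
      import apache_beam as beam
      from apache_beam.io.jdbc import ReadFromJdbc
      from apache_beam.options.pipeline_options import PipelineOptions

      options = PipelineOptions(
          runner="DataflowRunner",
          project="my-project",                 # placeholder project
          region="us-central1",
          temp_location="gs://my-bucket/tmp",   # placeholder bucket
      )

      with beam.Pipeline(options=options) as p:
          rows = p | "ReadOracle" >> ReadFromJdbc(
              table_name="HR.EMPLOYEES",        # placeholder Oracle table
              driver_class_name="oracle.jdbc.driver.OracleDriver",
              jdbc_url="jdbc:oracle:thin:@//onprem-host:1521/ORCLPDB1",
              username="etl_user",
              password="etl_password",
          )
          (rows
           | "ToDict" >> beam.Map(lambda row: row._asdict())
           | "WriteBQ" >> beam.io.WriteToBigQuery(
                 "my-project:my_dataset.employees",   # assumes the table already exists
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                 create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER))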

  • @Ar001-hb6qn
    8 months ago +1

    I am unable to create a GCP Composer environment. After around 45 minutes it shows the error "Some of the GKE pods failed to become healthy".
    I have configured the settings and given the necessary access. I am using composer-2.7.0-airflow-2.7.3, but it fails to create the environment. Can you please help with this? Thanks.

    • @cloudaianalytics6242
      2 months ago

      Maybe it's because of a quota shortage in that region; try freeing some up.

  • @dataflorent
    1 year ago +1

    This is very useful, thanks. Can we trigger an Apache Beam pipeline on Dataflow without using a template? I think a template is mandatory with Composer 2.
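
    A minimal sketch (not from the video), assuming the Apache Beam provider is installed in the Composer 2 environment: the Beam operator can submit a plain Python pipeline to Dataflow directly, without a classic or Flex template. The file path, bucket and project id below are placeholders.

      # Hypothetical DAG: submit a plain (non-templated) Beam Python pipeline to Dataflow.
      from datetime import datetime

      from airflow import DAG
      from airflow.providers.apache.beam.operators.beam import BeamRunPythonPipelineOperator
      from airflow.providers.google.cloud.operators.dataflow import DataflowConfiguration

      with DAG("beam_on_dataflow_no_template", start_date=datetime(2025, 1, 1),
               schedule_interval=None, catchup=False) as dag:
          run_pipeline = BeamRunPythonPipelineOperator(
              task_id="run_beam_pipeline",
              py_file="gs://my-bucket/pipelines/etl_pipeline.py",   # plain Beam script, not a template
              runner="DataflowRunner",
              pipeline_options={
                  "temp_location": "gs://my-bucket/tmp",
                  "staging_location": "gs://my-bucket/staging",
              },
              py_requirements=["apache-beam[gcp]"],
              py_interpreter="python3",
              dataflow_config=DataflowConfiguration(
                  job_name="etl-no-template",       # placeholder names
                  project_id="my-project",
                  location="us-central1",
              ),
          )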

  • @Hariharasubramanian-n3o
    1 year ago

    I appreciate your effort. If possible, could you please show step by step how an end user can replicate this: how you create the storage bucket, and where you create and save the script file, so that we can follow along and understand in detail how to achieve the same in a different scenario. Hope you understand.