AWS Tutorials - Using Spark SQL in AWS Glue ETL Job

  • Published: 8 Apr 2023
  • One can use Spark SQL in a Glue ETL job to transform data with a SQL query. A SQL transform can take multiple datasets as inputs and produce a single dataset as output. Learn how to use the SQL transform in an AWS Glue ETL job to build transformations with Spark SQL.
  • Science

Comments • 13

  • @tamasensei550
    4 months ago

    This is really helpful, I just started using AWS Glue recently. Hats off to you, Sir!

  • @baridie2002
    1 year ago +1

    I'm a big fan of your videos, dude! You're so good at explaining things, and the step-by-step instructions really help me understand the topic. Thanks for sharing your knowledge! By the way, can you make a video on how to extract filtered data from a source using SQL queries? That would be awesome. Greetings from Argentina.

  • @Sidrockfitness007
    1 month ago

    Thank you 😇

  • @viniciusfelizatti8103
    2 months ago

    Hi. If my query uses 20 tables, am I supposed to bring in all 20 as inputs when creating the SQL query with Glue visual ETL?

  • @udaynayak4788
    1 year ago +1

    Thank you so much for the detailed video on Spark SQL. Can you share a video on UDFs in AWS Glue with Spark SQL?

  • @arunbhandary4024
    8 months ago

    Can you please make a video on implementing an SCD Type 2 load using AWS Glue?

  • @josemanuelgutierrez4095
    1 year ago +1

    My dear friend, I would appreciate it if one day you could do a video explaining how to send email to non-verified addresses with SES or SNS. In many cases, when I want to add an email address to send info to, I can't, because many people say you need to be out of the sandbox. That's why I don't know if someday you could cover that. Thank you very much, I learn a lot from your videos.

  • @AnishBhola
    8 months ago

    Why are there multiple files in the output bucket? Shouldn't it have been one single Parquet file with all the rows of data? Thanks in advance!

    • @hinalucas
      1 month ago

      It happens because the output is automatically partitioned: Spark writes one file per partition of the DataFrame.