Spark Scenario Based Question | SET Operation Vs Joins in Spark | Using PySpark | LearntoSpark

  • Published: 17 Oct 2024
  • In this video, we will learn when to choose set operators such as intersect, subtract, and union, and when to choose joins such as left semi and left anti. We will work through this with a scenario-based question using PySpark; a minimal sketch of the comparison follows below. Hope this will be helpful in your interview preparation.
    Fb page:
    / learntospark-104523781...
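
    A minimal PySpark sketch of the contrast the video covers (the DataFrames, column names, and app name below are illustrative assumptions, not taken from the video):

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("set-ops-vs-joins").getOrCreate()

        # Two small DataFrames with the same schema (illustrative data).
        df_2019 = spark.createDataFrame([(1, "Alice"), (2, "Bob"), (3, "Carol")], ["id", "name"])
        df_2020 = spark.createDataFrame([(2, "Bob"), (3, "Carol"), (4, "Dan")], ["id", "name"])

        # Set operators compare entire rows and require identical schemas.
        df_2019.intersect(df_2020).show()  # rows present in both
        df_2019.subtract(df_2020).show()   # rows in df_2019 but not in df_2020 (SQL EXCEPT)
        df_2019.union(df_2020).show()      # all rows; duplicates kept unless .distinct() is added

        # Join variants match on chosen key columns and keep only the left side's columns.
        df_2019.join(df_2020, on="id", how="left_semi").show()  # like intersect, but keyed on "id" only
        df_2019.join(df_2020, on="id", how="left_anti").show()  # like subtract, but keyed on "id" only

    The practical difference: set operators need both DataFrames to share one schema and compare whole rows, while left semi/anti joins match on selected key columns and ignore the rest.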

Comments • 14

  • @tanushreenagar3116
    @tanushreenagar3116 8 months ago

    Nice and perfect explanation 👌 sir

  • @bharathkumarreddy2834
    @bharathkumarreddy2834 3 years ago +1

    Bro, you are doing a great job... please also explain the calculation of executors and executor memory, all those things.

  • @nehachopade1742
    @nehachopade1742 4 years ago +1

    I am using the except and intersect operators on a Spark DataFrame in Scala. What do I need to import outside of the main class?

  • @yestaykurmanov1865
    @yestaykurmanov1865 4 years ago +1

    Cool explanation

  • @sajeevkumar667
    @sajeevkumar667 2 years ago +1

    Thank you! Please make a video on Delta table merge, like SCD Type 2.

    • @Soulfulreader786
      @Soulfulreader786 A year ago

      What do we use in industry: SCD, intersect, or anything else?

  • @prabhathkota107
    @prabhathkota107 A year ago

    Very well explained

  • @anveshreddy5905
    @anveshreddy5905 2 years ago

    Superb, good 👍 explanations

  • @mehroosali
    @mehroosali 3 years ago

    good explanation.

  • @maheshk1678
    @maheshk1678 3 years ago

    Can you share the same concept in Spark with Scala?

  • @anishamajumder8647
    @anishamajumder8647 3 years ago +1

    Hi... I have recently installed a Jupyter notebook setup. The issue I'm facing is that after applying intersect and similar operators, I am not able to do a show() on the dataframe. It fails like this: "Py4JJavaError: An error occurred while calling o337.showString."
    Any idea why this happens?

    • @AzarudeenShahul
      @AzarudeenShahul  3 years ago +1

      I guess this issue is with the Spark functions import. Check whether you imported all the required functions and types.
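
      For reference, a minimal sketch of the imports that reply points to, assuming a standard local PySpark notebook (the exact cause of a Py4JJavaError can vary, so this is a starting point rather than a confirmed fix):

          # Common imports to verify in a PySpark notebook session
          # (an assumption about the asker's setup, not the confirmed fix).
          from pyspark.sql import SparkSession
          from pyspark.sql import functions as F   # col, lit, when, ...
          from pyspark.sql import types as T       # StructType, StringType, ...

          spark = SparkSession.builder.master("local[*]").appName("test").getOrCreate()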

    • @anishamajumder8647
      @anishamajumder8647 3 года назад

      @AzarudeenShahul ok, will check.