Big Data Interview For 3-4 YOE | Hadoop, Spark Interview Questions | Mock Interview With Subscriber

  • Published: 15 Jan 2025

Comments • 20

  • @rohitgade2382
    @rohitgade2382 1 year ago +3

    Really talented candidate.

  • @vedhh7727
    @vedhh7727 3 months ago

    To reduce shuffling, we can use reduceByKey.
    In general, prefer narrow transformations; if an unavoidable wide transformation such as a join, groupBy, or distinct is needed, filter the data first and then do the wide transformation to minimize the unwanted data.
    That way, fewer records are shuffled.
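The point above can be sketched in plain Python (not actual Spark code) under made-up data: with a map-side combine, as reduceByKey does, each partition sends one partial result per key across the shuffle boundary instead of every raw record, as groupByKey would.

```python
# Plain-Python sketch of groupByKey vs reduceByKey shuffle volume.
# "partitions" and the (key, value) data are hypothetical.
from collections import defaultdict

partitions = [
    [("a", 1), ("a", 2), ("b", 3)],
    [("a", 4), ("b", 5), ("b", 6)],
]

# groupByKey-style: every raw record crosses the shuffle boundary.
shuffled_group = [rec for part in partitions for rec in part]

# reduceByKey-style: combine locally first, then shuffle partial sums.
def local_combine(part):
    acc = defaultdict(int)
    for k, v in part:
        acc[k] += v
    return list(acc.items())

shuffled_reduce = [rec for part in partitions for rec in local_combine(part)]

print(len(shuffled_group))   # 6 records shuffled
print(len(shuffled_reduce))  # 4 records shuffled (one per key per partition)
```

Both paths produce the same final sums per key; only the amount of data moved differs.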

  • @akbarmunwar5435
    @akbarmunwar5435 2 years ago +1

    Very nice content

  • @prabhathkota107
    @prabhathkota107 2 years ago +1

    Good one

  • @skyfullofstars1634
    @skyfullofstars1634 2 years ago +1

    Sir, I have 2.5 years of experience as a manual test engineer. Rather than automation, I am thinking of switching to big data testing, so I ended up doing a certification course as well. How should I apply for interviews?

  • @mosininamdar9853
    @mosininamdar9853 2 years ago +2

    HDFS for 128 MB and local for 32 MB

  • @Arumugam-fo6vj
    @Arumugam-fo6vj 9 months ago

    Can I attend this mock interview?

  • @Mr_Dattrao_B.Andhori
    @Mr_Dattrao_B.Andhori 4 years ago +1

    Please give me the answer to "100 MB file: read the contents of this file and write it into another file 5 times".
    How can we write it 5 times?

    • @RajeshKumar-we5jb
      @RajeshKumar-we5jb 4 years ago +2

      When you write this file you can use .repartition(5)

    • @cleverstudies
      @cleverstudies  4 years ago +1

      count=5
      while [ $count -gt 0 ]; do
        cat Test/Test.txt >> Test/Test2.txt
        count=$((count-1))
      done

    • @abhilashkr1175
      @abhilashkr1175 3 years ago

      Use the cp command in a shell script

  • @satyanathparvatham4406
    @satyanathparvatham4406 2 years ago

    Default block size in LFS (local file system) is 4 KB

  • @ajinkyahatolkar6518
    @ajinkyahatolkar6518 2 years ago +1

    That guy was smart.

  • @asathish3556
    @asathish3556 3 years ago +1

    Use rank functions to delete duplicate

    • @amolpayghon4308
      @amolpayghon4308 2 years ago

      No bro, use row_id to delete duplicate records
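The window-function approach both replies hint at can be sketched with sqlite3: number the rows within each duplicate group with ROW_NUMBER() and delete everything beyond the first. Window functions need SQLite >= 3.25; the table and column names here are made up for illustration.

```python
# Sketch: delete duplicate rows, keeping one per group,
# using ROW_NUMBER() over the duplicate-defining columns.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE emp (id INTEGER, name TEXT);
    INSERT INTO emp VALUES (1,'a'),(1,'a'),(2,'b'),(2,'b'),(3,'c');
""")

conn.execute("""
    DELETE FROM emp WHERE rowid IN (
        SELECT rowid FROM (
            SELECT rowid,
                   ROW_NUMBER() OVER (PARTITION BY id, name
                                      ORDER BY rowid) AS rn
            FROM emp
        ) WHERE rn > 1
    )
""")

print(conn.execute("SELECT id, name FROM emp ORDER BY id").fetchall())
# [(1, 'a'), (2, 'b'), (3, 'c')]
```

In Spark SQL the same pattern works with a window spec, except there is no implicit rowid, so you filter on the row number in a derived column instead of deleting in place.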

  • @siddhantsaxena1651
    @siddhantsaxena1651 3 years ago +1

    Could you please provide feedback on this interview?

    • @cleverstudies
      @cleverstudies  3 years ago +1

      You will get interview feedback in future videos. Thanks for watching.

  • @ProCrux
    @ProCrux 2 years ago

    How do I apply for a mock interview?

    • @cleverstudies
      @cleverstudies  2 years ago

      Send resume to shareit2904@gmail.com