What is an Apache Parquet file? | Lec-7

  • Published: 2 May 2023
  • In this video I have talked about reading Parquet files in Spark. If you want to optimize your files and processing in Spark, you should have a solid understanding of the Parquet file format. Please ask your doubts in the comment section.
    Directly connect with me on:- topmate.io/manish_kumar25
    Download Parquet Data:- github.com/databricks/Spark-T...
    Download parquet-tools on your local machine to run all the commands below.
    parquet-tools can be installed using pip.
    Run the below command in cmd or a terminal:
    pip install parquet-tools
    Run the below commands inside Python:
    import pyarrow as pa
    import pyarrow.parquet as pq
    parquet_file = pq.ParquetFile(r'C:\Users\nikita\Downloads\Spark-The-Definitive-Guide-master\data\flight-data\parquet\2010-summary.parquet\part-r-00000-1a9822ba-b8fb-4d8e-844a-ea30d0801b9e.gz.parquet')
    parquet_file.metadata
    parquet_file.metadata.row_group(0)
    parquet_file.metadata.row_group(0).column(0)
    parquet_file.metadata.row_group(0).column(0).statistics
    Run the below command in cmd/terminal
    parquet-tools show C:\Users\manish\Downloads\Spark-The-Definitive-Guide-master\data\flight-data\parquet\2010-summary.parquet\part-r-00000-1a9822ba-b8fb-4d8e-844a-ea30d0801b9e.gz.parquet
    parquet-tools inspect (path of your file location as above)
    parquet.apache.org/docs/file-...
    For more queries reach out to me on my below social media handle.
    Follow me on LinkedIn:- / manish-kumar-373b86176
    Follow Me On Instagram:- / competitive_gyan1
    Follow me on Facebook:- / manish12340
    My Second Channel -- / @competitivegyan1
    Interview series Playlist:- • Interview Questions an...
    My Gear:-
    Rode Mic:-- amzn.to/3RekC7a
    Boya M1 Mic-- amzn.to/3uW0nnn
    Wireless Mic:-- amzn.to/3TqLRhE
    Tripod1 -- amzn.to/4avjyF4
    Tripod2:-- amzn.to/46Y3QPu
    camera1:-- amzn.to/3GIQlsE
    camera2:-- amzn.to/46X190P
    Pentab (Medium size):-- amzn.to/3RgMszQ (Recommended)
    Pentab (Small size):-- amzn.to/3RpmIS0
    Mobile:-- amzn.to/47Y8oa4 (You should definitely not buy this)
    Laptop -- amzn.to/3Ns5Okj
    Mouse+keyboard combo -- amzn.to/3Ro6GYl
    21 inch Monitor-- amzn.to/3TvCE7E
    27 inch Monitor-- amzn.to/47QzXlA
    iPad Pencil:-- amzn.to/4aiJxiG
    iPad 9th Generation:-- amzn.to/470I11X
    Boom Arm/Swing Arm:-- amzn.to/48eH2we
    My PC Components:-
    intel i7 Processor:-- amzn.to/47Svdfe
    G.Skill RAM:-- amzn.to/47VFffI
    Samsung SSD:-- amzn.to/3uVSE8W
    WD blue HDD:-- amzn.to/47Y91QY
    RTX 3060Ti Graphic card:- amzn.to/3tdLDjn
    Gigabyte Motherboard:-- amzn.to/3RFUTGl
    O11 Dynamic Cabinet:-- amzn.to/4avkgSK
    Liquid cooler:-- amzn.to/472S8mS
    Antec Prizm FAN:-- amzn.to/48ey4Pj

Comments • 89

  • @manish_kumar_1
    @manish_kumar_1  4 months ago +20

    I said 500 GB in the video by mistake. It is supposed to be 500MB, and when dividing 500/128, we will get 4 partitions.

  • @ArunNair-z3m
    @ArunNair-z3m 4 days ago

    Hi Manish, thanks for such a smooth explanation of not just parquet itself but also things related to it. Kudos to your efforts :D

  • @ankitachauhan6084
    @ankitachauhan6084 23 days ago

    The best explanation! You are a wonderful teacher.

  • @ApoorvaShinde-on4ep
    @ApoorvaShinde-on4ep 20 days ago

    This is so far the best video from which I got in-depth knowledge of parquet, and it is very easy to understand.
    Thank you so much for sharing your knowledge!
    Could you please share the video on parquet optimization?

  • @Shubhamkumar-cq5wt
    @Shubhamkumar-cq5wt 9 months ago +4

    Literally the best and most detailed video on parquet file format on yt. Thank you!

  • @sahillohiya7658
    @sahillohiya7658 8 months ago +1

    I love how in-depth you are going, please keep doing it! We are loving it.

  • @user-qn6ud4hs3b
    @user-qn6ud4hs3b 2 months ago

    Never saw such a detailed video on parquet files; these videos are really valuable. Really appreciate the effort put into making them.

  • @susanthomas223
    @susanthomas223 1 month ago

    Thank you so much for putting in so much time for making this video

  • @dishant_22
    @dishant_22 9 months ago

    This is the best explanation for parquet file format available online. Thanks Manish.

  • @rahulgupta-po4ki
    @rahulgupta-po4ki 9 months ago

    Highly informative and detailed video on parquet. Thanks a lot Manish!

  • @akashprabhakar6353
    @akashprabhakar6353 1 month ago

    Predicate pushdown - Rows filtering, Projection Pruning/Pushdown - Column filtering. Thanks for the session bro!!

  • @krishnasahoo6172
    @krishnasahoo6172 9 months ago +1

    Wow... such clarity... really enjoyed it... I didn't even notice when the video ended!!! Excellent explanation.

  • @shivakrishna1743
    @shivakrishna1743 1 year ago

    Very detailed awesome video!! Thanks

  • @shubhamwaingade4144
    @shubhamwaingade4144 4 months ago +1

    The best explanation!!! Your videos are giving me motivation and inspiration to keep learning spark!

  • @vaibhavmore7936
    @vaibhavmore7936 1 year ago

    Thanks for this Manish! Great Work!

  • @asifquasmi4538
    @asifquasmi4538 4 months ago

    Hats off, Manish. Please keep doing the good work :)

  • @bidyasagarpradhan2751
    @bidyasagarpradhan2751 5 months ago

    Someone asked me in an interview about the internals of the parquet file format and I couldn't answer it. Then I found your video. Now I can explain it easily. Best video on the parquet file format.

  • @alokkumarmohanty8454
    @alokkumarmohanty8454 11 months ago +2

    Hi Manish,
    the parquet file deep-dive class was a classic example of how to present something. If Avro and ORC file format classes could be covered in the same way, it would be really helpful. Nowadays interviewers are asking about those as well.

  • @neelshah8247
    @neelshah8247 3 months ago

    Excellent video. Thank you :)

  • @afjalahamad2465
    @afjalahamad2465 3 months ago

    Really awesome explanation.

  • @deeksha6514
    @deeksha6514 3 months ago

    Thanks for this masterpiece!

  • @lucky_raiser
    @lucky_raiser 1 year ago

    Brother, really enjoyed it, thanks bro

  • @dollykushwah6352
    @dollykushwah6352 10 months ago

    Hello Manish, excellent explanation, hats off to you. When will you release the optimization video on parquet? Eagerly waiting for it.

  • @ashutoshkumarsingh3337
    @ashutoshkumarsingh3337 11 months ago

    what a gem you are

  • @manish_kumar_1
    @manish_kumar_1  1 year ago

    Directly connect with me on:- topmate.io/manish_kumar25

  • @debopower2009
    @debopower2009 10 months ago

    Very nice.

  • @krunalsuthar1420
    @krunalsuthar1420 7 days ago

    Please make a video on ORC and Avro as well.

  • @lakkilakki772
    @lakkilakki772 9 months ago +1

    Hi Manish, great explanation of parquet. I'm using parquet but didn't know about these features that make things fast. How were you able to learn all this? Please suggest any documentation/resources to get a deep understanding like this. You made my day. Thank you 😊

  • @user-mf6cx8xx5d
    @user-mf6cx8xx5d 1 year ago

    Thanks Manish 🙂

  • @pramod3469
    @pramod3469 1 year ago

    Thanks Manish

  • @user-lp3qe9jj3m
    @user-lp3qe9jj3m 10 months ago

    Please make a detailed video on the Avro file format, because I faced challenges when interviewers asked Avro file format questions.

  • @pankajjagdale2005
    @pankajjagdale2005 10 months ago

    Informative, thanks.

  • @wellwisher7333
    @wellwisher7333 1 year ago

    Thanks Sir

  • @Wandering_words_of_INFJ
    @Wandering_words_of_INFJ 7 months ago

    Manish, if we write parquet with the files already sorted in ascending or descending order, then retrieval of data would be faster, right? Because each row group's metadata would have min and max values within a certain range. Please correct me if I am wrong.

  • @prathamesh_a_k
    @prathamesh_a_k 6 months ago

    Nice explanation, brother.

    • @prathamesh_a_k
      @prathamesh_a_k 6 months ago

      Can you make a video on ORC also?

  • @shubhamwaingade4144
    @shubhamwaingade4144 4 months ago

    One doubt: I did not completely understand the logical partitioning; it resembles the file size we can set in Spark config. Please help me understand it.

  • @MCAMadeEasy
    @MCAMadeEasy 2 months ago

    Manish bhai, what about nested JSON?

  • @nileshgodase1007
    @nileshgodase1007 5 months ago

    Please explain nested JSON to DataFrame.

  • @nikhiljain8411
    @nikhiljain8411 14 days ago

    How will one know which 1L records we need to fetch the data from? We would still need to scan the complete file, wouldn't we?
    Kindly explain.

  • @sumitchoubey1284
    @sumitchoubey1284 27 days ago

    Unable to install parquet-tools. Can you help or point me in the right direction?

  • @pankajsolunke3714
    @pankajsolunke3714 1 year ago +1

    Hi Manish sir, thanks for bringing such valuable info. I have a question: how can we handle schema evaluation in parquet?

    • @manish_kumar_1
      @manish_kumar_1  1 year ago

      I didn't get you

    • @mohitdaxini3067
      @mohitdaxini3067 1 year ago

      I think he wants to ask about schema evolution.

    • @sankuM
      @sankuM 1 year ago

      @@manish_kumar_1 I think @pankajsolunke3714 is asking how to handle schema evolution in parquet if we can?

  • @radheshyama448
    @radheshyama448 10 months ago

    😇

  • @avisinha2844
    @avisinha2844 1 year ago +1

    Hello Manish, I really like your videos; thanks for the effort you put in. I have a question: can you please suggest a good tutorial/course to get really good at PySpark? If not a single resource, then what are the various resources we can go through to get good at PySpark coding?

    • @manish_kumar_1
      @manish_kumar_1  1 year ago +3

      You don't need a course. Still, if you want one, you can buy the Udemy course by Prashant Pandey titled PySpark for Beginners. The rest depends on how many problems you want to solve. Solve more problems rather than chasing multiple courses. Practice is the key to success, not the number of courses you have done.

    • @sankuM
      @sankuM 1 year ago

      sparkbyexamples is the RESOURCE we need for practice!! :)

    • @royalkumar7658
      @royalkumar7658 10 months ago

      Where can we practice spark from?

    • @royalkumar7658
      @royalkumar7658 10 months ago

      ​@@manish_kumar_1 where can we practice spark from??

    • @manish_kumar_1
      @manish_kumar_1  10 months ago

      @@royalkumar7658 From LeetCode. Follow the playlist from the start and you will get to know where and how.

  • @navjotsingh-hl1jg
    @navjotsingh-hl1jg 9 months ago

    Brother, why were 4 row groups kept for 500 GB of data, when a block is 128 MB? Can you explain why?

  • @Marcopronto
    @Marcopronto 11 months ago

    Hi Manish,
    In the last video, you said you would explain nested JSON in further videos. Where can I find that?

    • @manish_kumar_1
      @manish_kumar_1  10 months ago

      I have not done yet. I will try to make one soon.

    • @220piyush
      @220piyush 2 months ago

      Brother, please make that video. The industry runs on exactly that.

  • @patilsahab4278
    @patilsahab4278 5 months ago

    Hi bro, does each row group store 128 MB or 128 GB of data? You said 128 MB,
    but for 500 GB you said 4 row groups.
    Are you talking about 500 MB or 500 GB?

  • @royalkumar7658
    @royalkumar7658 10 months ago

    How is NULL written to disk?

  • @AnubhavTayal
    @AnubhavTayal 11 months ago +1

    Hi Manish, thank you for the information. Can you please elaborate on the default value of 128 MB, and how 500 GB of data converts to 4 row groups? Thank you.

    • @manish_kumar_1
      @manish_kumar_1  11 months ago +2

      500 MB, not GB. 500 divided by 128, i.e. 4.
      4 blocks of data will be created. The thing is, we have a default block size of 128 MB in HDFS, and multiple cloud service providers also use the same block size. So let's say you have 140 MB of data; that means one partition will be 128 MB and the next partition will hold just 12 MB of data.

    • @AnubhavTayal
      @AnubhavTayal 11 months ago

      @@manish_kumar_1 thank you so much!

  • @ShekharBhide
    @ShekharBhide 1 year ago

    Sir, the parquet file is not downloading from GitHub.

  • @yogesh9992008
    @yogesh9992008 1 year ago

    Cmd-parquet-tool issue

  • @tnmyk_
    @tnmyk_ 4 months ago

    Where is the nested JSON video? You said you would make a separate video on it in the previous lecture, "how to read json file in PySpark".

  • @rpraveenkumar007
    @rpraveenkumar007 11 months ago

    Hi Manish, what is projection pruning? Unable to find it on Google. Or is it Partition Pruning*? Can you please explain/clarify?

    • @manish_kumar_1
      @manish_kumar_1  11 months ago

      Projection pushdown is where pruning of columns happens. So projection pushdown and projection pruning are the same.

    • @rpraveenkumar007
      @rpraveenkumar007 11 months ago

      @@manish_kumar_1 thanks for clarifying!

  • @dineshboliwar9545
    @dineshboliwar9545 11 months ago

    Sir, please make a short video on how to download and install parquet-tools.

  • @ajaypatil1881
    @ajaypatil1881 8 months ago

    The example of Modi ji for finding age > 18 was the highlight of the video.

  • @aravind5310
    @aravind5310 11 months ago

    Your content is good. Why don't you make videos in English?

    • @manish_kumar_1
      @manish_kumar_1  11 months ago

      I don't know English 😒. Just joking; I may record a session in the future, but not for now.

    • @izahmad90
      @izahmad90 10 months ago

      ruclips.net/video/zM2OAAvJItQ/видео.html&ab_channel=knowledgeEpicenter (We are making videos for those people for whom no one is making videos.)

  • @yogesh9992008
    @yogesh9992008 1 year ago

    Stage failure error show

  • @sankuM
    @sankuM 1 year ago

    Hey @manish_kumar_1, I was able to use the modes (append, overwrite, etc.) using this command:
    df.write.option("header", first_row_is_header) \
    .option("sep", delimiter) \
    .mode("Overwrite") \
    .csv(file_location)
    All other ways of writing return an error on Databricks if the file exists, even if we're trying to append the data..! :| Unsure why this is happening...! :\

    • @manish_kumar_1
      @manish_kumar_1  1 year ago +1

      Same here. Maybe due to the community edition. In a production environment it does work.

    • @sankuM
      @sankuM 1 year ago

      @@manish_kumar_1 oh..okay! Still weird, though!!! I'm yet to try databricks in production..

  • @dineshboliwar9545
    @dineshboliwar9545 11 months ago

    Can anybody help me, please? I can't read a parquet file using the command prompt.

    • @manish_kumar_1
      @manish_kumar_1  11 months ago

      There is no issue. Just read it directly in Databricks. Please watch the video carefully once more.

    • @dineshboliwar9545
      @dineshboliwar9545 11 months ago

      @manish_kumar_1 I did it in Databricks; it's not working from the command prompt.

  • @ajaywade9418
    @ajaywade9418 7 months ago

    21:25 500 GB or 500 MB?

  • @DevSharma_31
    @DevSharma_31 11 months ago

    import pyarrow as pa
    import pyarrow.parquet as pq
    parquet_file = pq.ParquetFile(r'C:\Users\DELL\Desktop\part-r-00000-1a9822ba-b8fb-4d8e-844a-ea30d0801b9e.gz.parquet')
    parquet_file.metadata
    parquet_file.metadata.row_group(0)
    parquet_file.metadata.row_group(0).column(0)
    parquet_file.metadata.row_group(0).column(0).statistics
    Not able to see any output with this file. Not sure why.

  • @ranvijaymehta
    @ranvijaymehta 1 year ago

    Thanks sir