Read Large CSV Files | Read a 10 GB CSV File

  • Published: 25 Oct 2024

Comments • 22

  • @hiteshyadav2298
    @hiteshyadav2298 2 years ago +1

    Nice, I love the way you explain the basics.

    • @JavaShastra
      @JavaShastra  2 years ago

      Thanks for your feedback 😊

  • @tapasranjansahu1742
    @tapasranjansahu1742 4 years ago +1

    Thanks Razan... it's very helpful.

    • @JavaShastra
      @JavaShastra  4 years ago

      Tapas Ranjan Sahu Thanks, this will motivate me to make more such content. Stay tuned.

  • @HimanshuSingh-ti6qw
    @HimanshuSingh-ti6qw 4 months ago

    Brother, how long did it take to read the file completely? Can we do it in under 5 minutes? I was asked this in an interview.

  • @purvidholakia7575
    @purvidholakia7575 3 years ago

    Hi, could you please upload a video on how to export a big SQL table to a CSV file? It would be a great help.
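
    (Not covered in a reply, but here is a minimal sketch of one common approach: stream the table over JDBC with a small fetch size and write rows out as you go. The connection URL, table, and file names are hypothetical, and CSV quoting/escaping is omitted; some drivers, such as MySQL, need a special fetch-size setting to actually stream.)

```java
import java.io.BufferedWriter;
import java.nio.file.Files;
import java.nio.file.Path;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;

public class TableToCsv {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details; any JDBC source works the same way
        try (Connection con = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/testdb", "user", "password")) {
            con.setAutoCommit(false); // some drivers stream only outside auto-commit
            try (PreparedStatement ps = con.prepareStatement("SELECT * FROM big_table");
                 BufferedWriter out = Files.newBufferedWriter(Path.of("export.csv"))) {
                ps.setFetchSize(1_000); // fetch in batches instead of buffering the table
                try (ResultSet rs = ps.executeQuery()) {
                    ResultSetMetaData md = rs.getMetaData();
                    int cols = md.getColumnCount();
                    for (int i = 1; i <= cols; i++) {          // header row
                        out.write(md.getColumnLabel(i));
                        if (i < cols) out.write(",");
                    }
                    out.newLine();
                    while (rs.next()) {                        // one data row at a time
                        for (int i = 1; i <= cols; i++) {
                            String v = rs.getString(i);
                            out.write(v == null ? "" : v);
                            if (i < cols) out.write(",");
                        }
                        out.newLine();
                    }
                }
            }
        }
    }
}
```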

  • @maisoliman631
    @maisoliman631 3 years ago

    How do you plot specific rows from a large CSV file?

  • @gopiravuri6477
    @gopiravuri6477 4 years ago +3

    How about reading large Excel files (.xlsx), say 2 GB?

    • @JavaShastra
      @JavaShastra  4 years ago +1

      We can read it by using Apache POI, but I have never tried it with such a huge size. Need to check 😊
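
      (A minimal sketch of the streaming route, assuming Apache POI 5.x: instead of loading the workbook with XSSFWorkbook, the event/SAX model via XSSFReader parses one cell at a time, so memory stays flat regardless of file size. The file name is hypothetical.)

```java
import org.apache.poi.openxml4j.opc.OPCPackage;
import org.apache.poi.util.XMLHelper;
import org.apache.poi.xssf.eventusermodel.XSSFReader;
import org.apache.poi.xssf.eventusermodel.XSSFSheetXMLHandler;
import org.apache.poi.xssf.usermodel.XSSFComment;
import org.xml.sax.InputSource;
import org.xml.sax.XMLReader;

import java.io.InputStream;

public class LargeXlsxReader {
    public static void main(String[] args) throws Exception {
        // Open the workbook as a zip package; nothing is inflated up front
        try (OPCPackage pkg = OPCPackage.open("big.xlsx")) {
            XSSFReader reader = new XSSFReader(pkg);
            // Callback that receives one cell at a time; rows never accumulate in memory
            XSSFSheetXMLHandler.SheetContentsHandler handler =
                    new XSSFSheetXMLHandler.SheetContentsHandler() {
                        public void startRow(int rowNum) {}
                        public void endRow(int rowNum) {}
                        public void cell(String ref, String value, XSSFComment comment) {
                            System.out.println(ref + " = " + value);
                        }
                    };
            XMLReader parser = XMLHelper.newXMLReader();
            parser.setContentHandler(new XSSFSheetXMLHandler(
                    reader.getStylesTable(), reader.getSharedStringsTable(), handler, false));
            // Parse the first sheet as a SAX stream
            try (InputStream sheet = reader.getSheetsData().next()) {
                parser.parse(new InputSource(sheet));
            }
        }
    }
}
```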

  • @venkatasundarm2913
    @venkatasundarm2913 2 years ago

    How much memory does this code use?
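
    (Not answered in the thread; one rough way to check for yourself is to sample heap usage before and after the read. A minimal sketch using only JDK APIs; the figures are approximate, since GC timing affects them.)

```java
public class HeapCheck {
    public static void main(String[] args) throws Exception {
        Runtime rt = Runtime.getRuntime();
        rt.gc(); // best-effort GC so the baseline is not inflated by old garbage
        long before = rt.totalMemory() - rt.freeMemory();

        // ... run the CSV-reading code here ...

        long after = rt.totalMemory() - rt.freeMemory();
        System.out.printf("Approx. heap growth: %.1f MB%n", (after - before) / 1e6);
    }
}
```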

  • @vishaljain9634
    @vishaljain9634 3 years ago +1

    I want to read a 10 GB file and split it into 10 files. What is the best approach? Please share code if anyone has it.

    • @JavaShastra
      @JavaShastra  3 years ago

      Hi, in Java, if you want to split a file, you need to stream through the entire file and split it as you go. I researched many options but could not find any solution apart from this. Let me know if you have a better one; see the sketch after this thread.

    • @PraveenDeshmukh
      @PraveenDeshmukh A year ago +1

      @@JavaShastra We can use Kafka streaming to read chunks of records and process them.

    • @JavaShastra
      @JavaShastra  A year ago

      Yes, we can, but here we only have files and no Kafka.
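
    (A minimal sketch of the single-pass split mentioned above: stream the file with BufferedReader and roll over to a new output file every N lines. The file names and the LINES_PER_FILE value are hypothetical; tune the chunk size so a 10 GB input yields about 10 parts. Repeating the CSV header in each part is omitted for brevity.)

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class CsvSplitter {
    // Hypothetical chunk size: pick it so a 10 GB input yields ~10 parts
    private static final long LINES_PER_FILE = 10_000_000L;

    public static void main(String[] args) throws IOException {
        long lineCount = 0;
        int part = 0;
        BufferedWriter out = null;
        try (BufferedReader in = Files.newBufferedReader(Path.of("big.csv"))) {
            String line;
            while ((line = in.readLine()) != null) {
                // Roll over to a new output file every LINES_PER_FILE lines
                if (lineCount % LINES_PER_FILE == 0) {
                    if (out != null) out.close();
                    out = Files.newBufferedWriter(Path.of("part-" + part++ + ".csv"));
                }
                out.write(line);
                out.newLine();
                lineCount++;
            }
        } finally {
            if (out != null) out.close();
        }
    }
}
```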

  • @rishiraj2548
    @rishiraj2548 A year ago +1

    Namaskar 🙏😌

  • @aboudaladdin8604
    @aboudaladdin8604 3 years ago +1

    What do you mean by "efficiently"?
    That's the normal reading method!

    • @JavaShastra
      @JavaShastra  3 years ago

      There are different ways to read a file; out of them, this is the best solution.

    • @aboudaladdin8604
      @aboudaladdin8604 3 years ago +4

      @@JavaShastra I thought there would be some technique like threads splitting the large CSV into chunks, or something like that, to deal with a big file like this.

    • @aboudaladdin8604
      @aboudaladdin8604 3 years ago

      @@JavaShastra Anyway, good job.

    • @JavaShastra
      @JavaShastra  3 years ago +2

      @@aboudaladdin8604 Yes, we can do that one. I will try it and come up with a video. Thanks 😊.
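
    (A minimal sketch of that idea using only JDK APIs, as an assumption about what such an approach might look like: Files.lines streams the file lazily and .parallel() fans the lines out to worker threads. The disk read itself stays sequential; only the per-line processing is parallelized. The file name is hypothetical; isBlank requires Java 11+.)

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;

public class ParallelCsvRead {
    public static void main(String[] args) throws IOException {
        // Files.lines streams the file lazily instead of loading 10 GB into memory;
        // .parallel() hands the already-read lines to worker threads for processing
        try (Stream<String> lines = Files.lines(Path.of("big.csv"))) {
            long rows = lines.parallel()
                             .filter(l -> !l.isBlank())
                             .count();
            System.out.println("Rows: " + rows);
        }
    }
}
```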

  • @zouaghinadhem7870
    @zouaghinadhem7870 2 years ago

    Hi, can you help me?