MySQL Tutorial: Efficiently Importing Large CSV Files into MySQL with Python and Pandas

  • Published: 14 Oct 2024

Comments • 21

  • @jongdave2691
    @jongdave2691 1 year ago +2

    This is really helpful, thank you. I wonder, if the data has tons of "nan" values, such as Covid data, will it still work?

    • @AnalyticsExplorers
      @AnalyticsExplorers  1 year ago +1

      Yes, sure, it still works.
      But you must do some preparation before importing: clean the data as much as possible (you can do this in Excel, for example), and make sure every NaN cell contains an empty string, with no spaces or stray characters left behind.
      Best of luck!
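The cleaning step described in this reply can also be sketched in pandas instead of Excel. This is a minimal sketch with hypothetical column names and values standing in for a CSV with missing entries; replacing NaN with Python `None` makes a MySQL driver write NULL instead of the literal string "nan":

```python
import pandas as pd
import numpy as np

# Hypothetical sample frame standing in for Covid-style data with gaps
df = pd.DataFrame({"cases": [100, np.nan, 250], "region": ["A", None, "C"]})

# Replace NaN/NaT with Python None so the database receives NULL,
# not the string "nan"
df = df.astype(object).where(pd.notnull(df), None)

rows = [tuple(r) for r in df.itertuples(index=False, name=None)]
```

Each tuple in `rows` is then safe to pass to an INSERT; rows with missing values come through as `None`.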

  • @sandeepanand3834
    @sandeepanand3834 21 days ago

    How do you insert a DataFrame directly into MySQL without manual pasting?

  • @sanikasunil5440
    @sanikasunil5440 6 months ago

    Bro, I want to import a CSV file containing 60,000 rows into Wix, but there is a limit of 50,000 rows, so I am trying to upload it to MySQL (the external database option). How do I achieve this?

  • @jackrothschild3502
    @jackrothschild3502 11 months ago

    Thank you, but what should I do if there is a large number of columns? Maybe hundreds.

  • @maxchergik2439
    @maxchergik2439 8 days ago

    You know you can insert all the data into MySQL automatically, straight from Python, right?
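As this comment notes, the manual step can be skipped entirely. A minimal sketch using pandas' `DataFrame.to_sql` (the table name and data here are hypothetical; an in-memory SQLite engine is used only so the sketch runs without a MySQL server — for MySQL you would pass a SQLAlchemy URL such as `mysql+mysqlconnector://user:password@host/db` instead):

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical frame; in practice this would come from pd.read_csv(...)
df = pd.DataFrame({"id": [1, 2], "name": ["alice", "bob"]})

# In-memory SQLite engine so the sketch is runnable anywhere;
# for MySQL, swap in a mysql+mysqlconnector:// URL (assumption)
engine = create_engine("sqlite://")

# Write the whole DataFrame as a table in one call
df.to_sql("my_table", engine, if_exists="replace", index=False)

n_written = pd.read_sql("SELECT COUNT(*) AS n FROM my_table", engine)["n"][0]
```

`if_exists="replace"` drops and recreates the table; use `"append"` to add rows to an existing one.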

  • @mahimmittal7466
    @mahimmittal7466 1 year ago

    Seriously, I tried every day, and finally I was able to import my file. Thanks to this video!

  • @ryuhayabusa3540
    @ryuhayabusa3540 1 year ago +2

    Thanks, what if you have millions of rows?

    • @AnalyticsExplorers
      @AnalyticsExplorers  1 year ago

      In your case, you can use the LOAD DATA INFILE statement.
      This video's method is also applicable, but you can perform it multiple times: for example, divide your rows into smaller parts and load each part into your database separately. Hope it helps.
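The chunked approach from this reply can be sketched with pandas' `chunksize` option. The in-memory CSV, chunk size, and table name below are hypothetical, and the `executemany` call is left as a comment because it needs a live MySQL connection:

```python
import io
import pandas as pd

# Hypothetical in-memory CSV standing in for a file with millions of rows
csv_data = io.StringIO("id,value\n" + "\n".join(f"{i},{i * 2}" for i in range(10)))

total_rows = 0
# read_csv(chunksize=...) yields DataFrames of at most `chunksize` rows,
# so each part can be loaded into the database separately
for chunk in pd.read_csv(csv_data, chunksize=4):
    rows = [tuple(r) for r in chunk.itertuples(index=False, name=None)]
    # cursor.executemany("INSERT INTO my_table (id, value) VALUES (%s, %s)", rows)
    total_rows += len(rows)
```

The LOAD DATA INFILE statement mentioned in the reply is the faster server-side alternative, since it reads the file in one pass without round-tripping rows through Python.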

  • @VitorioRazzera
    @VitorioRazzera 1 year ago

    Great content, thanks for sharing. It just helped me a lot here! 👍👍

  • @rajarajanr4353
    @rajarajanr4353 10 months ago

    @Data Analytics - Thanks for sharing it. I tried it, and it works the first time, but the second time it throws the error "'tuple' object is not callable".
    Can you please help?

    • @AnalyticsExplorers
      @AnalyticsExplorers  10 months ago

      Did you change anything in the code before you ran it again?

    • @rajarajanr4353
      @rajarajanr4353 9 months ago

      @@AnalyticsExplorers - I did not change anything. I have found another way to handle this issue, and it's working fine:
      y = []
      for index, row in df1.iterrows():
          x = tuple(row)
          y.append(x)
      y
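The loop in this reply builds a list of plain tuples, which is the shape `executemany` expects. A compact equivalent, with a hypothetical `df1` and table/column names, and the `executemany` call commented out since it needs a live connection:

```python
import pandas as pd

# Hypothetical stand-in for the df1 used in the comment above
df1 = pd.DataFrame({"a": [1, 2], "b": [3, 4]})

# itertuples(name=None) yields plain tuples directly, one per row,
# avoiding the per-row Series overhead of iterrows()
y = list(df1.itertuples(index=False, name=None))

# With mysql-connector-python (assumption), the list can then be
# inserted in one call:
# cursor.executemany("INSERT INTO my_table (a, b) VALUES (%s, %s)", y)
```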

  • @tholetivamsidharreddy6261
    @tholetivamsidharreddy6261 4 months ago

    Where will the file be saved?

  • @pravinkhandale5158
    @pravinkhandale5158 9 months ago

    My program isn't running in SQL.

  • @yaaryvidanpeled8814
    @yaaryvidanpeled8814 9 months ago +1

    A 1.5 GB CSV? Is this the way?

    • @AnalyticsExplorers
      @AnalyticsExplorers  9 months ago

      1.5 GB!! What a size!
      How many rows and columns does your CSV file contain?

    • @ahmarhussain8720
      @ahmarhussain8720 6 months ago

      @@AnalyticsExplorers I have a 50 GB CSV.

  • @tholetivamsidharreddy6261
    @tholetivamsidharreddy6261 4 months ago

    Help me out: where will the txt file be saved?

  • @tholetivamsidharreddy6261
    @tholetivamsidharreddy6261 4 months ago

    Where will the file be stored?