This is really helpful, thank you. I wonder, if the data has tons of "nan" values, such as Covid data, will it still work?
Yes, sure, it still works.
But you must do something before importing: check that the data is cleaned as much as possible (you can do this in Excel, for example), and check that all NaN cells contain an empty string, with no spaces or other characters left behind.
Best of luck!
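If you are cleaning in pandas rather than Excel, a minimal sketch of that NaN cleanup might look like this (the dataframe and column names here are just stand-ins):

```python
import pandas as pd
import numpy as np

# small stand-in dataframe with the kind of NaNs you'd see in Covid data
df = pd.DataFrame({"region": ["A", np.nan, "C"], "cases": [10, 20, np.nan]})

# replace every NaN with an empty string so MySQL receives '' instead of 'nan'
df = df.fillna("")

# strip stray whitespace from text cells so "empty" cells are truly empty
for col in df.columns:
    df[col] = df[col].apply(lambda v: v.strip() if isinstance(v, str) else v)
```

After this, exporting with df.to_csv(...) gives a file with genuinely empty cells instead of literal "nan" strings.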
How do I insert a dataframe directly into MySQL without manual pasting?
Bro, I want to import a CSV file containing 60,000 rows into Wix, but there is a limit of 50,000 rows, so I am trying to upload it to MySQL (the external database option). How do I achieve this?
Thank you, but what should I do if there is a large number of columns? Maybe hundreds?
You know you can insert all the data into MySQL automatically, straight from Python, right?
How?
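A minimal sketch of that kind of direct insert, assuming the mysql-connector-python driver and a hypothetical table called my_table (the connection details and column names are placeholders):

```python
import pandas as pd

# stand-in dataframe; in practice this would come from pd.read_csv(...)
df1 = pd.DataFrame({"name": ["a", "b"], "score": [1, 2]})

# build a parameterized INSERT statement from the dataframe's columns
cols = ", ".join(df1.columns)
placeholders = ", ".join(["%s"] * len(df1.columns))
sql = f"INSERT INTO my_table ({cols}) VALUES ({placeholders})"

# rows as plain tuples, the shape cursor.executemany expects
rows = [tuple(r) for r in df1.itertuples(index=False)]

# hypothetical connection; uncomment with your own credentials
# import mysql.connector
# conn = mysql.connector.connect(host="localhost", user="root",
#                                password="secret", database="mydb")
# cur = conn.cursor()
# cur.executemany(sql, rows)
# conn.commit()
```

The parameterized placeholders keep the insert safe even if cell values contain quotes or commas.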
Seriously, I tried every day, and finally I was able to import my file. Thanks to this video!
Thanks, what if you have millions of rows?
In your case, you can use the LOAD DATA INFILE statement.
The method in this video is also applicable, but you would perform it multiple times: for example, divide your rows into smaller parts and load each part separately into your database. Hope it helps.
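The splitting step can be sketched with pandas' chunksize option, which reads a large CSV as a series of smaller dataframes (the file names here are hypothetical; the small generated CSV just stands in for the big one):

```python
import pandas as pd

# build a tiny stand-in for the big CSV file
pd.DataFrame({"id": range(10)}).to_csv("big.csv", index=False)

# chunksize yields smaller dataframes that can be loaded one at a time,
# which also works around row caps like Wix's 50,000-row limit
parts = []
for i, chunk in enumerate(pd.read_csv("big.csv", chunksize=4)):
    chunk.to_csv(f"part_{i}.csv", index=False)
    parts.append(len(chunk))
```

Each part_N.csv can then be loaded server-side with LOAD DATA INFILE, or inserted with the video's method, one part at a time.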
Great content, thanks for sharing. It just helped me a lot here! 👍👍
@Data Analytics - Thanks for sharing it. I tried it and it works the first time, but the second time it throws the error "'tuple' object is not callable".
Can you please help?
Did you change anything in the code before you ran it again?
@@AnalyticsExplorers - I did not change anything. I have found another way to handle this issue, and it's working fine:
y = []
for index, row in df1.iterrows():
    x = tuple(row)
    y.append(x)
y
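For what it's worth, the same list of tuples can be built in one line with itertuples, which is also faster than iterrows (df1 here is just a stand-in dataframe):

```python
import pandas as pd

df1 = pd.DataFrame({"a": [1, 2], "b": [3, 4]})

# name=None makes itertuples yield plain tuples instead of namedtuples;
# index=False drops the index, matching what tuple(row) gives with iterrows
y = list(df1.itertuples(index=False, name=None))
```

The resulting list of plain tuples is exactly the shape cursor.executemany expects.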
Where will the file be saved?
My program isn't running in SQL.
A 1.5 GB CSV? Is this the way?
1.5 GB!! What a size!
How many rows and columns does your CSV file contain?
@@AnalyticsExplorers I have a 50 GB CSV.
Help me out: where will the txt file be saved?
Where will the file be stored?