Great tutorial, mate. I just had difficulty understanding a few words, but the subtitles made sure I missed none. Thanks a lot.
For Step 3.3, I usually use the following syntax instead of typing '?' for each column. It is helpful for larger datasets with a huge number of columns.
VALUES ({','.join(['?'] * len(df.columns))})
This counts the number of columns and gives you back the required number of '?' placeholders, as shown in the example.
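For anyone who wants to see the trick end to end, here is a minimal sketch of building the placeholders from the DataFrame; the connection details, customers.csv, and the Customers table are placeholders rather than anything from the video.

import pandas as pd
import pyodbc

# All names below are placeholders: adjust the connection details,
# customers.csv, and the Customers table for your own setup.
conn = pyodbc.connect(
    'DRIVER={ODBC Driver 17 for SQL Server};'
    'SERVER=localhost;DATABASE=TestDB;Trusted_Connection=yes;'
)
cursor = conn.cursor()

df = pd.read_csv('customers.csv')

col_names = ', '.join(df.columns)
placeholders = ','.join(['?'] * len(df.columns))  # one '?' per column
sql = f'INSERT INTO Customers ({col_names}) VALUES ({placeholders})'

cursor.executemany(sql, df.values.tolist())
conn.commit()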
Great tip, thanks for sharing.
@@jiejenn My pleasure. I recently had to deal with multiple tables containing at least 60 columns. Typing 60 to 70 '?' was confusing, untidy, and prone to mistakes. Glad I found this trick somewhere on YouTube as well 😃, happy to share it here.
Thank you for the great video.
Great vid! Thanks...This will help me big time!
If you struggle with blank values in your files, use this:
df_data = df[columns]                # keep only the columns you are inserting
newdf = df_data.fillna(' ')          # replace NaN with a blank so the insert doesn't fail
records = newdf.values.tolist()
Great tip
Excellent tutorial. Thank you for sharing!
Thanks a lot, great tutorial!
Do you have a tutorial where we do something like this, but update if the primary key already exists, or insert the entire row if it doesn't?
Very helpful. But what if I have to handle multiple sheets in an Excel (CSV) file? Can you tell me what additions I need to make to this code?
The workflow is a bit different. I will cover that topic in a separate video.
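For reference, a rough sketch of looping over every sheet with pandas; the file name workbook.xlsx, the table MyTable, and the open cursor/conn are assumptions, not part of the video.

import pandas as pd

# sheet_name=None makes read_excel return a dict of {sheet name: DataFrame}
sheets = pd.read_excel('workbook.xlsx', sheet_name=None)

for sheet_name, sheet_df in sheets.items():
    sheet_df = sheet_df.fillna('')
    placeholders = ','.join(['?'] * len(sheet_df.columns))
    sql = f'INSERT INTO MyTable ({", ".join(sheet_df.columns)}) VALUES ({placeholders})'
    cursor.executemany(sql, sheet_df.values.tolist())  # cursor/conn assumed open
conn.commit()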
Hello Jie, I need to parallel-ingest a CSV file with half a million records for an assignment. Should I chunk the file and then follow this video, or do you have some other suggestion? Thanks.
Great video, thanks. But I have a question: how can I import multiple sheets from an Excel document?
I will be covering that topic in a separate video.
@@jiejenn OK, I'll be waiting for that video.
Thank you.
@Jie Jenn, I need your help and it is urgent. I need to know how I can create a script to populate a database. The tools I have to do this are a Python interpreter (Visual Studio Code), MySQL Workbench, and XAMPP. Please help me. Do I need to install any ODBC driver to start with?
Try Google.
Hi Jie,
I got the error "'GETDATA' is not a recognized built-in function name". Any suggestions?
Thanks for the video! Could you help me with a question: how could I restrict the insert to a given number of rows, instead of inserting all of them?
You can use iloc or loc, store the result in another DataFrame, and send it via to_sql.
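A minimal sketch of that idea, capping the insert at a fixed number of rows; the limit, file name, table name, and SQLAlchemy connection URL are all placeholders.

import pandas as pd
from sqlalchemy import create_engine

limit = 1000                      # insert at most this many rows
df = pd.read_csv('data.csv')
subset = df.iloc[:limit]          # keep only the first `limit` rows

# to_sql needs a SQLAlchemy engine; the URL below is a placeholder
engine = create_engine(
    'mssql+pyodbc://user:password@server/database'
    '?driver=ODBC+Driver+17+for+SQL+Server'
)
subset.to_sql('MyTable', engine, if_exists='append', index=False)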
Very nice tutorial, but I am wondering if this is the only/best way to import a large amount of data. In Bash I always use mysqlimport, but does Python also have something like this?
It really depends on the system. I came from MS SQL Server, so for bulk uploads I would usually start with SSIS. For anything else that requires data exchange with a 3rd-party system, I would go the scripting route. The executemany() method, despite being efficient when it comes to memory allocation, is still not a true bulk insert.
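As an aside, with pyodbc you can get much closer to a true bulk insert by enabling fast_executemany; a minimal sketch, assuming cursor, sql, and records come from the earlier steps.

cursor.fast_executemany = True   # pyodbc sends the parameter batches in far fewer round trips
cursor.executemany(sql, records)
conn.commit()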
Hey Jie, do you have any solutions for "TypeError: tuple indices must be integers or slices"? This guide worked for most of my .csv files, but for one particular file, I'm getting the tuple error.
How can you import a CSV file into an already existing table using an SQL query?
For example, suppose you have a Customer table and you want to add data from xyz.csv, and both the Customer table and the CSV file have the same attributes; how do you do that?
Can we use this Python script to append data to the same SQL table when new files arrive?
INSERT INTO appends records to the table you specified.
@@jiejenn Yes, correct. I did it today and it worked! Thank you very much, brother! You are saving lives!
Getting a pypyodbc error, "Data source name not found and no default driver specified". Please help me with this 🙏
Can you import a database in XBRL into Microsoft Access?
I don't have experience with XBRL, so I am not sure, to be honest.
Thanks for this.
Please help: I don't have SQL Server, but I have XAMPP connected to MySQL Workbench. I know I can run the script in MySQL Workbench, but I don't understand the ODBC driver you imported in Visual Studio. Also, the connection to the database is not clear to me.
Maybe try Google.
I followed your code and tried to generate the data in SQL Server by running the SELECT script. However, when I ran the SELECT script again, the data was gone.
It appears that the data is not being stored in the database. Could you please help me?
The script doesn't generate any data; it merely imports a data file into your SQL Server database.
@@jiejenn I'm sorry if I didn't explain it clearly. I used a SELECT script to query the data in MS SQL, and it was successful. However, when I executed the same script for the second time, the queried data disappeared.
@@aianIII Then I don't know if I can help without looking at your script. Perhaps there might be a typo or two in your code somewhere.
Thanks for this
You're welcome.
Bro, when I'm trying to import multiple sheets from an Excel file, it's not working!
Can you please make a tutorial on how to import multiple sheets? Any help would be very helpful.
I will look into it.
I'm getting "FileNotFoundError: [Errno 2] No such file or directory: 'Real-Time_Traffic_Incident_Reports.csv'". What does that mean?
Perhaps this will help www.google.com/search?client=firefox-b-1-d&q=python+FileNotFoundError
Hi man, you're awesome. Can you help me by sending code for inserting/uploading CSV files and images to a database (SQL Server/WAMP Server)?
Hi Jie - When I try to run this code, I get an error "Database Error:
[08001] [Microsoft][ODBC SQL Server Driver][DBNETLIB]SQL Server does not exist or access denied". How can I fix this?
It is difficult to tell based on a very general error message. I would suggest you post your question on Stack Overflow.
@@jiejenn no problem. I was able to combine what I learned in your video with some other guides and it is now fully working! Thanks again for all your great videos!
@@cvillejin I got the same problem. Could you tell me how you solved it? Thanks.
@@yannantso4652 Try changing the driver in the connection string to match the driver installed for your SQL Server version.
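For example, a sketch of a connection string that names the installed driver explicitly; the server and database values are placeholders, and pyodbc.drivers() lists what is available on your machine.

import pyodbc

print(pyodbc.drivers())  # lists the ODBC drivers installed on this machine

conn = pyodbc.connect(
    'DRIVER={ODBC Driver 17 for SQL Server};'  # pick one of the listed drivers
    'SERVER=localhost;'
    'DATABASE=TestDB;'
    'Trusted_Connection=yes;'
)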
👍👍👍
Hi Jie, thanks for making this video. I was wondering if you could implement something like the tqdm library in this same script. That would be very helpful, especially if you're trying to load millions of records.
Hi Wilson. I've never used the tqdm library before, so I don't know if I will be able to help.
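For what it's worth, tqdm just wraps any iterable, so a rough sketch could batch the inserts and show progress; the chunk size and the sql/records/cursor names are assumptions carried over from the earlier steps.

from tqdm import tqdm

chunk_size = 10_000  # arbitrary batch size
for start in tqdm(range(0, len(records), chunk_size), desc='Inserting rows'):
    cursor.executemany(sql, records[start:start + chunk_size])
conn.commit()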
In Step 3 it is giving the following error: NameError: name 'cursor' is not defined.
Make sure the cursor is created first.
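For example, a minimal sketch, assuming conn_string is the connection string built earlier in the video:

import pypyodbc

conn = pypyodbc.connect(conn_string)   # conn_string assumed to be defined already
cursor = conn.cursor()                 # the cursor must exist before cursor.execute(...) is called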
What is the correct syntax for the username and password in conn_string, please?
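Not from the video, but a typical SQL Server connection string with SQL authentication looks roughly like the sketch below; every value is a placeholder.

import pyodbc

conn_string = (
    'DRIVER={ODBC Driver 17 for SQL Server};'
    'SERVER=your_server;'
    'DATABASE=your_database;'
    'UID=your_username;'
    'PWD=your_password;'
)
conn = pyodbc.connect(conn_string)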