How to Bulk Insert Data With Laravel
- Published: Feb 26, 2024
- As programmers, we often need to bulk insert a massive number of records into a database. Some languages and platforms offer built-in tools that make bulk insert a trivial thing. PHP (and Laravel), however, don't have these tools. But we can still efficiently bulk insert data! Let me show you how in this Larabit.
Watch Full Larabits Series on: laracasts.com/bits?teacher=jw...
Watch thousands of videos, track your progress, and participate in a massive Laravel community at Laracasts.com.
Laracasts: laracasts.com
Laracasts Twitter: twitter.com/laracasts
Jeffrey Way Twitter: twitter.com/jeffrey_way
Great speaker and a useful video! Thanks 👍
Truly went all the way with the bulk insert scenarios, good vid!
This is a great larabit. Thanks for the video Jeremy!
Extremely cool, yeah
What a great topic!
That’s amazing, thanks for the video, excellent well done 👍🏽
Great Topic, Thank you
Great content, thanks for sharing valuable information with Laravel developers, thank you very much.
Using DB:: instead of the Model class should speed up the process quite a bit
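A minimal sketch of the difference (the `users` table and its columns here are hypothetical): the query builder skips Eloquent model hydration and event dispatching, which is where much of the per-row overhead lives.

```php
<?php

use Illuminate\Support\Facades\DB;

// Query builder: one multi-row INSERT, no model objects, no events.
DB::table('users')->insert([
    ['name' => 'Ada',   'email' => 'ada@example.com'],
    ['name' => 'Linus', 'email' => 'linus@example.com'],
]);

// For comparison, User::create([...]) in a loop would hydrate a model
// and fire events for every single row.
```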
Good job! Thank you very much!
did not know about infile, thanks!
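For reference, a `LOAD DATA` statement can be issued through the underlying PDO connection. This is only a sketch, assuming a hypothetical `users` table and CSV path, and it requires `local_infile` to be enabled on both the MySQL server and the client connection:

```php
<?php

use Illuminate\Support\Facades\DB;

// Let MySQL stream the CSV into the table itself, bypassing
// per-row round trips from PHP entirely.
DB::connection()->getPdo()->exec(
    "LOAD DATA LOCAL INFILE '/tmp/users.csv'
     INTO TABLE users
     FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
     LINES TERMINATED BY '\n'
     IGNORE 1 LINES
     (name, email)"
);
```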
That's amazing! Thank you
Wooooow great video, super useful, please more like this 🤩
very useful video
Just amazing
Thanks for the video. One note: disabling foreign key checks in a production app can lead to slowdowns or, in the worst case, rows inserted with an id that does not exist in the parent table, so to be safe I would skip disabling foreign key checks.
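For anyone who does reach for this despite the caveat above, Laravel exposes the toggle through the Schema facade; the sketch below assumes the checks are re-enabled immediately after the insert, even if it throws:

```php
<?php

use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Schema;

Schema::disableForeignKeyConstraints();

try {
    // ... bulk insert here ...
    DB::table('orders')->insert($rows); // hypothetical table and $rows
} finally {
    // Always restore the checks, even on failure.
    Schema::enableForeignKeyConstraints();
}
```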
thanks
Fantastic.
Amazing. Please share some ideas about database design for scalable Laravel app.
Thank you for sharing such a valuable and informative video.
This is my first time coming across your channel. I am gonna follow you and share this vid.
I have a question.
Is it suitable to use a transaction and commit for bulk inserting tons of records like this?
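Transactions do work for this, and wrapping the whole import in one can actually help by avoiding a commit per statement. A sketch, assuming a hypothetical `$rows` array and `users` table:

```php
<?php

use Illuminate\Support\Facades\DB;

// One transaction around all chunks: a single commit at the end,
// and either everything lands or nothing does.
DB::transaction(function () use ($rows) {
    foreach (array_chunk($rows, 1000) as $chunk) {
        DB::table('users')->insert($chunk);
    }
});
```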
very good
Hi, this is a great way of inserting for a single table where you don't need to perform any extra logic, but what if we want to run some logic and store the data into different tables? Can you please cover that topic as well? I know it can be handled with queues, but I wasn't able to implement it efficiently; instead I overloaded the database with many mini jobs.
I won't fight you. Thanks 🙂
Great 🔥 What's the name of the VS Code theme?
I think using queues with chunking is also useful in this case, e.g chunking the file and storing 10k records into the database during each queue iteration
If possible I would recommend not using this method, because when queueing you are essentially writing to the DB twice for each chunk: once for the queued job and once for the actual records.
If you are using Redis as the queue driver, then it's an acceptable case.
Sure, I'm talking about the Redis driver, not the DB queue driver.
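A rough sketch of the chunk-per-job idea discussed above (the job class name, `users` table, and 10k chunk size are all illustrative, not from the video):

```php
<?php

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Support\Facades\DB;

// One queued job per chunk of rows.
class InsertChunk implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    public function __construct(private array $rows)
    {
    }

    public function handle(): void
    {
        DB::table('users')->insert($this->rows);
    }
}

// Dispatch one job per 10k-row chunk:
foreach (array_chunk($rows, 10000) as $chunk) {
    InsertChunk::dispatch($chunk);
}
```

Note the caveat from the thread: with the database queue driver, each chunk's payload is written to the jobs table before it is written to the target table, so a Redis-backed queue is the better fit here.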
Can I use bulk insert to get data from api and insert to my database?
What about exporting INSERT statements to a SQL file and using MySQL dump to import the file? Would that be more memory efficient in some cases?
I don't think so; in that case you add one more useless step, writing/creating a new file, and it is not memory efficient because you take the same data and put it into the SQL file. Also, an insert query is limited: you can't insert 1M rows at once by default.
Instead you can immediately insert the prepared data into the database and free the memory. Using queues can also help: you can send the user a notification saying "your data insert is in progress" and then notify them when the process is finished, so the user doesn't wait 20/60/80+ seconds for a response from the server.
Which platform do you use where you just give it that data and say "do it"?
Chunking is way more available than threading. Still better than none at all.
What if we collect the information into an array named $data during the loop and then execute a single database insert query, similar to using Model::query()->insert($data);?
You can end up backing yourself into the corner of exceeding the max_allowed_packet size for the DBMS or depending on the load the DBMS is enduring, you could bring the application to a halt because of row locks. I would batch it into sensible chunks - 100-1000 records at a time depending on your dataset.
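Something like the following keeps each statement comfortably sized (the 500-row batch and `users` table are illustrative; tune the chunk size to your row width and the server's `max_allowed_packet`):

```php
<?php

use Illuminate\Support\Facades\DB;

// Instead of one giant insert($data), batch it so no single
// INSERT statement approaches max_allowed_packet or holds
// row locks for too long.
collect($data)->chunk(500)->each(function ($chunk) {
    DB::table('users')->insert($chunk->all());
});
```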
League/csv has been using generators for a long time now. I don't get why you would use Laravel and then not use a package like league/csv.
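For illustration, a league/csv read loop streams records via an iterator rather than loading the whole file (the path and buffering logic here are hypothetical):

```php
<?php

use League\Csv\Reader;

$csv = Reader::createFromPath('/tmp/users.csv', 'r');
$csv->setHeaderOffset(0); // first row is the header

// getRecords() yields one associative row at a time,
// so memory stays flat regardless of file size.
foreach ($csv->getRecords() as $record) {
    // ... buffer $record into chunks and bulk insert ...
}
```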
You could have used a Job here.
Note to self before watching the full video: PHP generators will be the way to read such big files line by line. Let's see if I am right or wrong.
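That guess is on the right track; a plain-PHP sketch of a line-by-line generator (no framework required, file path hypothetical):

```php
<?php

/**
 * Yield a file one line at a time; only the current line
 * is ever held in memory.
 */
function lines(string $path): \Generator
{
    $handle = fopen($path, 'r');

    try {
        while (($line = fgets($handle)) !== false) {
            yield rtrim($line, "\r\n");
        }
    } finally {
        fclose($handle); // runs even if the consumer stops early
    }
}

foreach (lines('/tmp/users.csv') as $line) {
    // ... parse $line, buffer into chunks, bulk insert ...
}
```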