How to Bulk Insert Data With Laravel

  • Published: Feb 26, 2024
  • As programmers, we often need to bulk insert a massive number of records into a database. Some languages and platforms offer built-in tools that make bulk inserts trivial. PHP (and Laravel), however, doesn't ship with such tools. But we can still bulk insert data efficiently! Let me show you how in this Larabit; a minimal sketch follows below.
    Watch the full Larabits series on: laracasts.com/bits?teacher=jw...
    Watch thousands of videos, track your progress, and participate in a massive Laravel community at Laracasts.com.
    Laracasts: laracasts.com
    Laracasts Twitter: twitter.com/laracasts
    Jeffrey Way Twitter: twitter.com/jeffrey_way
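
    A minimal sketch of the kind of approach the video covers, assuming a hypothetical users.csv file and a users table: read the file lazily with a PHP generator, then insert in fixed-size chunks through the query builder.

        use Illuminate\Support\Facades\DB;

        // Lazily read a large CSV line by line with a generator,
        // so the whole file never sits in memory at once.
        function rows(string $path): Generator
        {
            $handle = fopen($path, 'r');
            $header = fgetcsv($handle);

            while (($line = fgetcsv($handle)) !== false) {
                yield array_combine($header, $line);
            }

            fclose($handle);
        }

        // Buffer rows and insert them 1,000 at a time.
        $buffer = [];

        foreach (rows(storage_path('app/users.csv')) as $row) {
            $buffer[] = $row;

            if (count($buffer) === 1000) {
                DB::table('users')->insert($buffer);
                $buffer = [];
            }
        }

        // Flush whatever is left over.
        if ($buffer !== []) {
            DB::table('users')->insert($buffer);
        }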

Comments • 39

  • @biryuk88 • 4 months ago • +4

    Great speaker and a useful video! Thanks 👍

  • @DirkZz • 4 months ago

    Truly went all the way with the bulk insert scenarios, good vid!

  • @swancompany_inc • 4 months ago

    This is a great larabit. Thanks for the video Jeremy!

  • @user-pr5sl7sw8k • 2 months ago

    Extremely cool, yeah

  • @MT87840 • 4 months ago • +2

    What a great topic!

  • @franciscojunior2141 • 4 months ago

    That’s amazing, thanks for the video. Excellent, well done 👍🏽

  • @yasser.elgammal • 4 months ago

    Great Topic, Thank you

  • @shaikhanuman8012 • 4 months ago

    Great content, thanks for sharing valuable information with Laravel developers. Thank you very much.

  • @OliverKurmis • 3 months ago • +3

    Using DB:: instead of the Model class should speed up the process quite a bit
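
    For context on the speed-up: Eloquent's create() instantiates a model, fires model events, and sets timestamps for every single row, while the query builder issues one multi-row INSERT with none of that overhead. A rough sketch (the users table and $rows data are hypothetical):

        use App\Models\User;
        use Illuminate\Support\Facades\DB;

        // Eloquent: one query plus model bootstrapping, event
        // dispatching, and timestamp handling for every row.
        foreach ($rows as $row) {
            User::create($row);
        }

        // Query builder: a single multi-row INSERT, no per-row overhead.
        DB::table('users')->insert($rows);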

  • @user-vi2fp6dl7b • 4 months ago

    Good job! Thank you very much!

  • @wadecodez • 4 months ago

    did not know about infile, thanks!
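
    The feature referred to is MySQL's LOAD DATA INFILE, which streams a file straight into a table and is typically the fastest bulk-load path. A sketch of invoking it from Laravel (the file path and columns are hypothetical; local_infile must be enabled on the server and PDO::MYSQL_ATTR_LOCAL_INFILE set on the connection):

        use Illuminate\Support\Facades\DB;

        // Hand the statement to the underlying PDO connection.
        DB::connection()->getPdo()->exec(<<<'SQL'
            LOAD DATA LOCAL INFILE '/path/to/users.csv'
            INTO TABLE users
            FIELDS TERMINATED BY ',' ENCLOSED BY '"'
            LINES TERMINATED BY '\n'
            IGNORE 1 LINES
            (name, email)
        SQL);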

  • @BruceEmmanuelSueira • 4 months ago

    That's amazing! Thank you

  • @vic_casanas • 4 months ago

    Wooooow great video, super useful, please more like this 🤩

  • @ParsclickTV • 4 months ago

    very useful video

  • @mouhamaddiop1144 • 4 months ago

    Just amazing

  • @JoseFranciscoIT • 4 months ago • +1

    Thanks for the video. One note: disabling foreign key checks in a production app can lead to slowdowns or, in the worst case, an error from inserting an id that does not exist in the parent table, so for safety I would skip disabling foreign key checks.
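
    For reference, this is the toggle being discussed; Laravel's schema builder wraps it. Re-enabling the checks immediately afterwards, or skipping this entirely in production as suggested, limits the window for orphaned rows:

        use Illuminate\Support\Facades\Schema;

        Schema::disableForeignKeyConstraints();

        // ... bulk inserts run without FK validation here ...

        Schema::enableForeignKeyConstraints();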

  • @GergelyCsermely • 4 months ago

    thanks

  • @gabrielborges1185 • 4 months ago

    Fantastic.

  • @WilliamShrek • 4 months ago

    Amazing. Please share some ideas about database design for scalable Laravel app.

  • @edventuretech • 4 months ago

    Thank you for sharing such a valuable and informative video.
    This is my first time reaching your channel. I am going to follow you and share this vid.
    I have a question:
    Is it suitable to use a transaction and commit for bulk inserting tons of records like this?
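
    On the transaction question: wrapping the whole import in a single transaction makes it all-or-nothing and can reduce per-statement commit overhead, though a very large transaction also grows the undo log, so chunked commits are a common middle ground. A minimal sketch, assuming the rows are already split into $chunks:

        use Illuminate\Support\Facades\DB;

        // Every chunk commits together, or the whole import rolls back.
        DB::transaction(function () use ($chunks) {
            foreach ($chunks as $chunk) {
                DB::table('users')->insert($chunk);
            }
        });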

  • @michalbany5162 • 4 months ago

    very good

  • @shahzadwaris7193 • 4 months ago

    Hi, this is a great way of inserting for a single table, where we don't need any extra processing. But what if we want to perform some processing and store the data in different tables? Can you please cover that topic as well? I know it can be handled with queues, but I wasn't able to implement that efficiently; instead I overloaded the database with many mini jobs.

  • @bryanhalstead • 3 months ago

    I won't fight you. Thanks 🙂

  • @muhamadfikri7263 • 4 months ago • +1

    Great 🔥 What's the name of the VS Code theme?

  • @arthmelikyan • 4 months ago • +2

    I think using queues with chunking is also useful in this case, e.g. chunking the file and storing 10k records into the database during each queue iteration (see the sketch after this thread).

    • @blank001 • 4 months ago

      If possible, I would recommend not using this method, because with queuing you are essentially writing to the DB twice for each job: once for the queued job itself and once for the actual records.
      If you are using Redis, then it's an acceptable case.

    • @arthmelikyan • 4 months ago

      Sure, I mean the Redis queue driver, not the database driver.
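
    A minimal sketch of the queued variant discussed in this thread, assuming the rows() generator from the first sketch, a hypothetical InsertUsersChunk job, and the Redis queue driver (which avoids the double database write mentioned above):

        use Illuminate\Support\LazyCollection;

        // One queued job per 10,000-row chunk; workers insert in parallel.
        // Keep chunks sensible: each job payload carries its rows.
        LazyCollection::make(rows(storage_path('app/users.csv')))
            ->chunk(10_000)
            ->each(fn ($chunk) => InsertUsersChunk::dispatch($chunk->values()->all()));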

  • @docetapedro5007 • 4 months ago

    Can I use bulk insert to fetch data from an API and insert it into my database?

  • @underflowexception • 4 months ago

    What about exporting INSERT statements to a SQL file and using MySQL dump to import the file? Would that be more memory efficient in some cases?

    • @arthmelikyan • 4 months ago

      I don't think so; in that case you add one more unnecessary step, writing a new file, and it is not memory efficient because you put the same data into the SQL file... also, an insert query is limited: you can't insert 1M rows at once by default.
      Instead, you can immediately insert the prepared data into the database and free the memory. Using queues can also help: you can send the user a notification saying "your data import is in progress" and then notify them when the process is finished, so the user doesn't wait 20/60/80+ seconds for a response from the server.

  • @user-vp1fw1uj4d • 3 months ago

    Which platform do you use where you just give it the data and say "do it"?

  • @IndraKurniawan • 4 months ago

    Chunking is way more widely available than threading. Still better than nothing at all.

  • @wadday • 4 months ago

    What if we collect the information into an array named $data during the loop and then execute a single database insert query, similar to Model::query()->insert($data)?

    • @tass2001 • 4 months ago • +1

      You can end up backing yourself into a corner by exceeding the max_allowed_packet size for the DBMS, or, depending on the load the DBMS is under, you could bring the application to a halt because of row locks. I would batch it into sensible chunks: 100-1000 records at a time, depending on your dataset.
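
    A minimal sketch of that batching, assuming the rows are already gathered in $data:

        use Illuminate\Support\Facades\DB;

        // Several small multi-row INSERTs instead of one giant one,
        // staying under max_allowed_packet and holding row locks briefly.
        foreach (array_chunk($data, 1000) as $chunk) {
            DB::table('users')->insert($chunk);
        }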

  • @Fever1984 • 3 months ago

    League/csv has been using generators for a long time now. I don't get why you would use Laravel and then not use a package like league/csv.
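
    For reference, league/csv streams records lazily out of the box (the file and header columns are hypothetical):

        use League\Csv\Reader;

        $csv = Reader::createFromPath(storage_path('app/users.csv'), 'r');
        $csv->setHeaderOffset(0); // first row becomes the array keys

        // getRecords() returns an iterator, so rows stream one at a time.
        foreach ($csv->getRecords() as $record) {
            // $record is an associative array keyed by the header row
        }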

  • @bidhanbaniya7605 • 4 months ago

    You could have used a Job here.

  • @mohammadashrafuddinferdous9347 • 4 months ago • +1

    Note to self before watching the full video: a PHP generator will be the way to read such big files line by line. Let's see if I am right or wrong.
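
    That guess matches the standard pattern: a generator keeps only one line in memory at a time. A bare-bones sketch with a hypothetical file:

        function lines(string $path): Generator
        {
            $handle = fopen($path, 'r');

            // yield hands back one line, then pauses the function
            // until the caller asks for the next one.
            while (($line = fgets($handle)) !== false) {
                yield rtrim($line, "\r\n");
            }

            fclose($handle);
        }

        foreach (lines('huge-file.csv') as $line) {
            // process one line at a time
        }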