PROOF JavaScript is a Multi-Threaded language

  • Published: Aug 3, 2023
  • Learn the basics of parallelism and concurrency in JavaScript by experimenting with Node.js Worker Threads and browser Web Workers.
    #javascript #programming #computerscience
    Upgrade to Fireship PRO fireship.io/pro
    Node.js Worker Threads nodejs.org/api/worker_threads...
    Check out @codewithryan • Node.js is a serious t...

Comments • 530

  • @daedalus5070
    @daedalus5070 9 months ago +1491

    I could feel my brain trying to stop me writing what I knew was an infinite loop but I did it anyway. I trusted you Jeff!
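
    For context, a rough sketch of the loop in question, assuming a browser page (worker.js is a made-up file name): on the main thread it freezes the whole tab, while the same work inside a Web Worker only blocks that worker.

      // main thread: this freezes the page, since rendering and input share this thread
      // while (true) {}

      // the same busy work handed to a worker instead
      const worker = new Worker('worker.js');
      worker.onmessage = (e) => console.log('worker finished:', e.data);

      // worker.js
      // let i = 0;
      // while (i < 1e9) i++;   // heavy loop, but only this worker is blocked
      // postMessage(i);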

    • @ko-Daegu
      @ko-Daegu 9 months ago +23

      0:31 concurrency incorporates parallelism
      what you should have said is asynchronism

    • @lucassilvas1
      @lucassilvas1 9 months ago +73

      @@ko-Daegu who are you talking to, schizo?

    • @hypergraphic
      @hypergraphic 9 months ago +4

      me too!

    • @universaltoons
      @universaltoons 9 months ago +29

      @@ko-Daegu can I have some of what you're having

    • @lengors7327
      @lengors7327 9 months ago +11

      @ko-Daegu you really thought you were being smart with that remark, didn't you? Only problem is that you're wrong

  • @AlecThilenius
    @AlecThilenius 9 months ago +326

    Fun nerd trivia:
    - A single CPU core runs multiple instructions concurrently; the core just guarantees that it will appear AS IF the instructions were run serially within the context of a single thread. This is achieved primarily via instruction pipelining.
    - A single CPU core often executes instructions totally out of order; this is unimaginatively named "Out Of Order (OOO) execution".
    - A single core also executes instructions simultaneously from two DIFFERENT threads, only guaranteeing that each thread will appear AS IF it ran serially, all on the same shared hardware, all in the same core. This is called Hyperthreading.
    And we haven't even gotten to multi-core yet lol. I love your content Jeff, the ending was gold!

    • @ragggs
      @ragggs 9 months ago +19

      in the Spectre and Meltdown era, we like to say “guarantees”

    • @RicardoSilvaTripcall
      @RicardoSilvaTripcall 9 months ago +6

      But in a hyperthreaded system, tasks don't just appear to be executed serially, they actually are executed serially ... the only difference is that the system coordinates the execution of other tasks/threads while waiting for the previous one, which is probably blocked waiting for an I/O response ...
      If you have a 16-core processor with 32 logical processors, it doesn't mean it can execute 32 threads simultaneously ...

    • @ragggs
      @ragggs 9 months ago +2

      @RicardoSilvaTripcall hyperthreads are in many cases parallel by most meaningful definitions, due to interleaved pipelined operations on the CPU and the observability problem of variable-length operations. For an arbitrary pair of operations on two hyperthreads, without specifying what the operations are and the exact CPU and microcode patch level, you cannot say which operation completes first, even if you know the order in which they started.

    • @AlecThilenius
      @AlecThilenius 9 months ago

      @@ragggs Lol! Maybe guarantee* (unless you're Intel)

    • @AlecThilenius
      @AlecThilenius 9 months ago

      @@RicardoSilvaTripcall Uhhhh. No. Sorry.

  • @WolfPhoenix0
    @WolfPhoenix0 9 months ago +66

    That chef analogy about concurrency and parallelism was genius. Makes it SO much easier to understand the differences.

  • @elhaambasheerch7058
    @elhaambasheerch7058 9 months ago +62

    Love to see Jeff going in depth on this channel, would love more videos like this one.

    • @beyondfireship
      @beyondfireship 9 months ago +65

      That's why I made this channel. I've got a long list of ideas.

    • @morezco
      @morezco 9 months ago +1

      @@beyondfireship wonderful. Keep it up

  • @RedlinePostal
    @RedlinePostal 9 months ago +248

    Also, when we say "one core," that means "one core at a *time*" -- computer kernels are concurrent by default, and a program's code will actually be constantly shifting between CPUs as the kernel manages a queue of things for the processor to do. Not too unlike JavaScript's asynchronous system, the kernel will break each program you're running into executable chunks, and it has a way to manage which programs and code get more priority.

    • @orbyfied
      @orbyfied 9 months ago +7

      wouldn't that be kind of inefficient though? it wouldn't be able to take full advantage of the CPU cache, so I hope it does it as rarely as possible

    • @invinciblemode
      @invinciblemode 9 months ago +17

      @orbyfied uhh, different CPU cores use the same L2-L3 cache. L1 cache is per core, but they're small and meant for minor optimisations.

    • @orbyfied
      @orbyfied 9 months ago +6

      L1 is the fastest, so having data available there is pretty significant. It's also grown much in size, to the point that it can basically cache all the memory a longer-running task will need now. If L1 were so insignificant, it wouldn't cause those data desync issues across threads.

    • @LettersAndNumbers300
      @LettersAndNumbers300 9 months ago +1

      Then…why do I only see one core active when running simple Python code…?

    • @jesusmods1
      @jesusmods1 9 months ago +3

      @orbyfied it could be more inefficient if only one process took a whole CPU core for itself during its entire lifetime. The process probably isn't switched between cores, but it is being swapped in and out with others on the same core for the sake of concurrency. Also take into account the hit rate that a cache may have.

  • @JThompson_VI
    @JThompson_VI 9 months ago +96

    Moments like 0:52, the short memorable description of callback functions, is what makes you a great teacher. Thanks man!

    • @kisaragi-hiu
      @kisaragi-hiu 9 months ago +1

      Keep in mind the JS world also calls any higher-order function a "callback" (like the function you'd pass to Array.map), whereas elsewhere, afaik, it only refers to the function you pass to something non-blocking.
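
      A small illustration of the two senses of "callback" being contrasted here: a plain higher-order function that runs synchronously, versus a function handed to a non-blocking API and invoked later.

        // broad JS sense: any higher-order function argument, runs synchronously
        const doubled = [1, 2, 3].map((n) => n * 2); // [2, 4, 6]

        // narrower sense: invoked later, once the non-blocking work is done
        setTimeout(() => console.log('runs after the current call stack empties'), 0);
        console.log(doubled); // logs first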

    • @curlyfryactual
      @curlyfryactual 9 months ago +2

      @kisaragi-hiu a fact that caused me much grief coming into JS from systems level.

  • @ra2enjoyer708
    @ra2enjoyer708 9 months ago +196

    It's a pretty good overview of how much more of a clusterfuck the code becomes once you add workers to it. And it didn't even get to the juice of doing fs/database/stream calls within workers and error handling for all of that.

    • @dan_le_brown
      @dan_le_brown 9 months ago +10

      "Clusterfuck", I had the same word in mind 😭😂

    • @angryman9333
      @angryman9333 9 months ago

      just use Promises, it'll process all your asynchronous functions concurrently (very similar to parallel)

    • @SirusStarTV
      @SirusStarTV 9 months ago +13

      @angryman9333 A Promise will run a user-written function on the main thread, in a blocking manner. An async function is just syntactic sugar for easier creation of promises. Without browser asynchronous APIs or web workers, it doesn't run code in parallel.

    • @platinumsun4632
      @platinumsun4632 9 months ago

      @angryman9333 a what?
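
    To make the point above concrete, a minimal sketch (Node flavoured; ./heavy.js is a made-up file): wrapping CPU work in a Promise does not move it off the main thread, while a worker thread does.

      // still blocks: a Promise executor runs synchronously on the current thread
      const blocked = new Promise((resolve) => {
        let x = 0;
        for (let i = 0; i < 1e9; i++) x += i; // the event loop is stuck right here
        resolve(x);
      });

      // actually parallel: the same loop running inside a worker thread
      const { Worker } = require('worker_threads');
      new Worker('./heavy.js').on('message', (x) => console.log('worker result:', x));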

  • @H-Root
    @H-Root 9 months ago +72

    I am stuck step programmer 😂😂

    • @rizkiaprita
      @rizkiaprita 9 months ago +1

      rule #34 is calling

    • @nullbeyondo
      @nullbeyondo 9 months ago +2

      break;

    • @catswolo421
      @catswolo421 5 months ago

      I would watch out or you'll get multi-threaded

  • @srejonkhan
    @srejonkhan 9 months ago +27

    6:13 To see how all of your cores are being utilized, you can change the graph from 'Overall utilization' to 'Logical processors' just by right-clicking on the graph -> Change graph to -> Logical processors.

  • @BRBS360
    @BRBS360 9 months ago +36

    I'd like to see a video on JavaScript generators and maybe even coroutines.

    • @StiekemeHenk
      @StiekemeHenk 9 months ago

      For sure, this is a really cool thing and I'm not sure how to actually use it.

    • @ko-Daegu
      @ko-Daegu 9 months ago +2

      generics maybe?
      garbage collection in more detail?
      benchmarking against Python code, just to get people triggered?

  • @yss64
    @yss64 9 months ago +5

    Thanks for shouting out Code with Ryan! That channel is criminally underrated

  • @nomadshiba
    @nomadshiba 9 months ago +2

    talking about multi-threading
    data oriented design always helps

  • @dzhaniivanov5837
    @dzhaniivanov5837 9 months ago +1

    I watched a similar video earlier this year, but your way of delivering content is amazing, keep going

  • @user-fed-yum
    @user-fed-yum 9 months ago +2

    That ending was possibly one of your best pranks ever, a new high watermark. Congratulations 😂

  • @7heMech
    @7heMech 9 months ago +1

    Really cool, I actually saw the other video about nodejs taking it up a notch when it came out.

  • @EdgeGaming
    @EdgeGaming 9 months ago +1

    Lots of comments about memorable descriptions, shoutout to the thread summary at 3:30. Your conciseness is excellent.

  • @deneguil-1618
    @deneguil-1618 9 months ago +77

    just a heads up about your CPU: the 12900K doesn't have 8 physical cores, it actually has 16, 8 performance and 8 efficiency cores. The performance cores have hyperthreading enabled but the efficiency cores don't, so you have 24 threads in total

    • @daleryanaldover6545
      @daleryanaldover6545 9 months ago +1

      😮

    • @allesarfint
      @allesarfint 9 months ago +3

      Oh yeah, right. So that's why the CPU didn't go to 100% after using 8 cores.

    • @godnyx117
      @godnyx117 9 months ago +1

      You forgot the: 🤓

    • @adityaanuragi6916
      @adityaanuragi6916 9 months ago +9

      But at 6:57 his CPU did go to 100% with 16

    • @wertrager
      @wertrager 9 months ago

      because hyperthreading is shit

  • @kiprasmel
    @kiprasmel 9 months ago +2

    the `threads` package makes working with threads much more convenient. It also works well with TypeScript.
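
    Roughly what that looks like, as an API sketch from memory (worth checking against the threads package docs; the file names are placeholders):

      // math.worker.js
      const { expose } = require('threads/worker');
      expose({ square: (n) => n * n });

      // main.js
      const { spawn, Thread, Worker } = require('threads');
      (async () => {
        const math = await spawn(new Worker('./math.worker.js'));
        console.log(await math.square(8)); // 64
        await Thread.terminate(math);
      })();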

  • @Ihavetoreturnsomevideotapes
    @Ihavetoreturnsomevideotapes 9 months ago

    ayo, I was learning the event loop and had a bit of confusion about performance between single- and multi-threading, and Jeff just posted the video at the right time.

  • @maxijonson
    @maxijonson 9 months ago +6

    My brain: don't run it
    8 years of programming: don't run it
    the worker thread registering my inputs to the console as I type it: don't run it
    Jeff: run it.
    **RUNS IT**

  • @Bell_420
    @Bell_420 3 months ago

    the cook analogy was great and I now understand

  • @AntonisTzorvas
    @AntonisTzorvas 9 months ago +1

    aside from the outstanding quality, this ending was quite funny and hilarious! keep it up, your content is TOP 🙇🚀

  • @wjlee7003
    @wjlee7003 9 months ago +5

    although it's called concurrent, a scheduler can still only work on one task at a time. It will delegate a certain amount of time to each task and switch between them (context switching). The switch is just fast enough to make it seem truly "concurrent". If a task takes longer than the delegated time, the scheduler will still switch away and come back to it to finish.
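
    The same idea can be faked inside a single JS thread: one rough analogy to time slicing (a sketch, not how the OS scheduler actually works) is to do work in slices and yield back to the event loop between them.

      async function countInSlices(total, sliceSize = 1e6) {
        let i = 0;
        while (i < total) {
          const end = Math.min(i + sliceSize, total);
          while (i < end) i++;                     // one "time slice" of work
          await new Promise((r) => setTimeout(r)); // yield so other tasks can run
        }
        return i;
      }
      countInSlices(1e8).then((n) => console.log('done', n));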

  • @AlexEscalante
    @AlexEscalante 9 months ago +1

    ¡Wow! Just yesterday I was watching some videos about worker threads because I will use them to speed up the UI in my current development 😄

  • @robertjif6337
    @robertjif6337 9 months ago +1

    Thanks, now I know what script I should include in my svgs

  • @TeaBroski
    @TeaBroski 9 months ago +3

    It's like you read my client's requirements and came in to support

  • @ahmad-murery
    @ahmad-murery 9 months ago +28

    It would be nice if you right-clicked on the CPU graph and chose *Change graph to > Logical processors*, so we can see each thread separately.
    Thanks!

    • @crackwitz
      @crackwitz 9 months ago +1

      less useful than you might think. The operating system's scheduler may bounce a thread around on any number of cores; it doesn't make it faster but spreads the utilization around.

    • @ahmad-murery
      @ahmad-murery 9 months ago

      @@crackwitz Do you mean that we will not see each core graph plotting one thread?

  • @timschannel247
    @timschannel247 25 days ago

    Yes Yes Yes, and exactly extra Yes! Thank you Bro for this contribution! You took the words right out of my head! Best Regards!

  • @boris---
    @boris--- 9 months ago +2

    Task Manager --> Performance tab --> CPU --> Right click on graph --> Change graph to --> Logical Processors

  • @Quamsi
    @Quamsi 9 months ago +34

    I have had hours-long lectures in college-level programming classes on the differences between concurrency and parallelism, and the first 3 minutes of this video did a better job of explaining it. Shout out to my bois running the US education system for wasting my money and my time 💀

    • @maskettaman1488
      @maskettaman1488 9 months ago +11

      It's probably not their fault you failed to understand something so simple. Literally 1 minute on google would have cleared up any misunderstanding you had

    • @TechBuddy_
      @TechBuddy_ 9 months ago

      @maskettaman1488 if you have to pay to study, and then you have to sell yourself to a tech corp to learn something, that is not that great of a system and it should not exist IMHO

    • @Quamsi
      @Quamsi 9 months ago +6

      @maskettaman1488 lmao I'm not saying I misunderstood it, I'm saying Fireship is much more concise and still gets all the relevant information across compared to college, despite the fact that I don't have to pay Fireship anything

  • @frankdearr2772
    @frankdearr2772 5 months ago

    great topic, thanks 👍

  • @adaliszk
    @adaliszk 9 months ago

    You can also pass initial data without needing to message the thread to start working; however, I feel like that is better used for initialization, like connecting to a database.
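
    Presumably this refers to the workerData option in Node's worker_threads; a minimal sketch (the file name and connection string are made-up placeholders):

      // main.js
      const { Worker } = require('worker_threads');
      new Worker('./db-worker.js', {
        workerData: { dbUrl: 'postgres://localhost/example' },
      });

      // db-worker.js: the data is available immediately, no message round trip needed
      const { workerData } = require('worker_threads');
      console.log('connecting to', workerData.dbUrl);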

  • @subratarudra2745
    @subratarudra2745 7 months ago

    Amazing🔥

  • @markopolo2224
    @markopolo2224 9 months ago

    man I've been wanting something about workers for so long

  • @HenokWehibe
    @HenokWehibe 8 months ago

    Just brilliant

  • @JeremyThille
    @JeremyThille 8 months ago

    Niiiice we have the exact same machine! (And thanks for the video!)

  • @abhijay_hm
    @abhijay_hm 9 months ago +1

    with the amount of time I've spent on this video because of the while loop, even the algorithm knows who my favourite youtuber is

  • @lionbryce10101
    @lionbryce10101 9 months ago +1

    Woulda been cool if you set it to show core usage on taskmgr

  • @vforsh
    @vforsh 9 months ago

    Wow, love this tick-tock snippet

  • @avsync-live
    @avsync-live 9 months ago +37

    Little-known fact: you can also do DOM-related operations on another thread. You have to serve it from a separate origin, use the Origin-Agent-Cluster header, and load the script in an iframe. But you can still communicate with it using postMessage, and avoid thread blocking with large binary transfers by using chunking. This is great for stuff that involves video elements and cameras.
    I use it to move canvas animations (that include video textures) off the UI thread, and for calculating motion vectors of webcams.

    • @knoopx
      @knoopx 9 months ago +1

      that looks handy! thanks for sharing

    • @among-us-99999
      @among-us-99999 9 months ago +1

      that might just help with a few of my projects

    • @matheusvictor9629
      @matheusvictor9629 9 months ago +1

      do you have any examples on github?

    • @avsync-live
      @avsync-live 9 months ago

      @@matheusvictor9629 yes

    • @andrewmcgrail2276
      @andrewmcgrail2276 9 months ago

      Sounds very interesting! I have a project where I think this would be useful.
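
    A stripped-down sketch of the pattern described above (the cross-origin URL is a placeholder, and the embedded page has to be served with the Origin-Agent-Cluster: ?1 header to get its own process):

      // parent page: hand work to the cross-origin frame with postMessage
      const frame = document.querySelector('iframe#dom-worker');
      frame.contentWindow.postMessage({ type: 'start' }, 'https://worker.example.com');
      window.addEventListener('message', (e) => console.log('from frame:', e.data));

      // script inside the frame: do the heavy DOM / video work, then report back
      window.addEventListener('message', (e) => {
        e.source.postMessage({ type: 'done' }, e.origin);
      });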

  • @olharAgudo
    @olharAgudo 9 months ago

    Awesome video ending

  • @zeta_meow_meow
    @zeta_meow_meow 3 months ago

    seeing my CPU throttle and core usage rise in real time was impressive :)

  • @wusluf
    @wusluf 9 months ago

    Adding more cores might still provide gains in a VM scenario, depending on the hypervisor. As long as your VM isn't provisioned all physical cores, the hypervisor is at liberty to utilize more cores, even up to all physical cores, for a short amount of time, resulting in increased performance for bursting tasks

  • @Zumito
    @Zumito 9 months ago +1

    And remember, don't make promises you can't keep

  • @DranKof
    @DranKof 9 months ago +2

    I tried the while loop thing and somehow my computer became sentient. Y'all should try that out.

  • @junama
    @junama 9 months ago +16

    Good video!
    Next time, change the CPU graph with a right click to see each thread's graph.
    Hope it helps!

    • @LedimLPMore
      @LedimLPMore 9 months ago

      Wow, didn't know that. Thanks!

  • @JeremyAndersonBoise
    @JeremyAndersonBoise 9 months ago

    Spawning workers in Node is not new, but support for web workers in browsers is comparatively new. Good shit man.

  • @Bossslime
    @Bossslime 9 months ago +1

    I remember when I first learned workers, I didn't realize I could use a separate js file, so I wrote all of my code in a string. It was just a giant string that I coded with no IDE help. That was fun.
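
    For reference, the code-in-a-string approach still works without a separate file by way of a Blob URL (a browser sketch):

      // build a worker from source code held in a string
      const src = `
        onmessage = (e) => postMessage(e.data * 2);
      `;
      const url = URL.createObjectURL(new Blob([src], { type: 'application/javascript' }));
      const worker = new Worker(url);
      worker.onmessage = (e) => console.log(e.data); // 42
      worker.postMessage(21);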

  • @gr.4380
    @gr.4380 9 months ago

    love how you tell us to leave a comment if it's locked like we can even do that

  • @TonyAlcast
    @TonyAlcast 9 months ago

    I'm still amazed at how you find such accurate images as the one at 0:32 🤔

  • @DumbledoreMcCracken
    @DumbledoreMcCracken 9 months ago +1

    each value should be a random value, and you should sum them at the end to ensure the compiler / interpreter does not optimize all the work away because it detected that you never used the values

    • @ra2enjoyer708
      @ra2enjoyer708 9 months ago

      Pretty sure the compiler won't be able to optimize away side effects like this, since the worker and the main thread only interact indirectly through events on a message channel.
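
    Something along these lines is what the suggestion amounts to (a sketch of the worker-side code):

      // keep the work observable so the engine can't treat the loop as dead code
      let sum = 0;
      for (let i = 0; i < 1e8; i++) {
        sum += Math.random(); // random values can't be constant-folded away
      }
      postMessage(sum);       // using the result keeps the loop "live"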

  • @ninjaasmoke
    @ninjaasmoke 9 months ago

    people watching on phone:
    “that level of genjutsu doesn’t work on me”

  • @sobeeeeer
    @sobeeeeer 9 months ago

    mindblowing intro

  • @Rebel101
    @Rebel101 9 months ago

    Epic!
    It's FLAT!

  • @HedleyLuna
    @HedleyLuna 9 months ago +1

    I did use this back in 2018. I don't know how much it has improved, but error handling was painful. Also, when you call postMessage(), V8 will serialize your message, meaning big payloads will kill any advantage you want. And also, remember that functions are not serializable. On the UI side, I completely killed my Three.js app in production when I tried to offload some of its work to other threads :D
    Apart from that, you should NEVER share data between threads; that's an anti-pattern.
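
    To make the serialization point concrete, a sketch (the worker file name is a placeholder):

      const worker = new Worker('render-worker.js');

      // copied via structured clone: fine for small payloads, slow for huge ones
      worker.postMessage({ vertices: new Array(1e6).fill(0) });

      // functions are not cloneable: this line would throw a DataCloneError
      // worker.postMessage({ onDone: () => {} });

      // large binary data is better transferred than copied (ownership moves over)
      const buf = new Float32Array(1e6).buffer;
      worker.postMessage(buf, [buf]); // buf is now detached on this side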

  • @nullternative
    @nullternative 9 months ago +5

    I just recently experimented with OffscreenCanvas, handling rendering on a separate worker thread. Pretty cool.
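
    For anyone curious, the handoff looks roughly like this (the worker file name is assumed):

      // main thread: hand control of the canvas to a worker
      const canvas = document.querySelector('canvas');
      const offscreen = canvas.transferControlToOffscreen();
      const worker = new Worker('draw-worker.js');
      worker.postMessage({ canvas: offscreen }, [offscreen]);

      // draw-worker.js: render without touching the main thread
      onmessage = ({ data }) => {
        const ctx = data.canvas.getContext('2d');
        ctx.fillRect(0, 0, 100, 100);
      };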

  • @VileEnd
    @VileEnd 9 months ago +1

    Love it, we are already doing that with our Lambdas - cause why not use the vCores when you got them 😍

  • @NithinJune
    @NithinJune 9 months ago

    I wish you showed the CPU usage on each logical processor in Task Manager instead of the overview

  • @4541047
    @4541047 9 months ago

    You are a youtube genius man

  • @NoFailer
    @NoFailer 9 months ago

    I executed the while-loop on the orange youtube and I couldn't change the volume.... Thanks.

  • @yassinesafraoui
    @yassinesafraoui 9 months ago

    That trick to force us to hear the sponsor block could only come from you 🤣🤣🤣

  • @coolingjam
    @coolingjam 9 months ago

    The one time I look up something, fireship uploads a video about it lol

  • @akashrajpurohit97
    @akashrajpurohit97 9 months ago

    6:42 bro really doubled it and gave it to the next thread

  • @mrcjm
    @mrcjm 9 months ago

    Ending is the moment you are glad you watched it on a mobile device

  • @CC-1.
    @CC-1. 9 months ago +1

    0:15 I already know this and am already using it
    a Blob to create a new Worker, and I use at most 4 to 8, one for each core
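
    Sizing that kind of pool to the machine can also be done at runtime; a small sketch (pool-worker.js is a made-up file name):

      // one worker per logical core, capped so low-end devices aren't overwhelmed
      const poolSize = Math.min(navigator.hardwareConcurrency || 4, 8);
      const pool = Array.from({ length: poolSize }, () => new Worker('pool-worker.js'));
      pool.forEach((w, i) => w.postMessage({ chunk: i }));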

  • @eformance
    @eformance 9 months ago

    Hyperthreading generally gives a 30% bump in performance; your test demonstrated that handily.

  • @mateuszabramek7015
    @mateuszabramek7015 9 months ago

    Exactly. I don't know why I keep hearing otherwise

  • @MegaMech
    @MegaMech 9 months ago +1

    A single x86 core can actually run more than one command at a time. And the n64 can run 1.5 commands at a time when it uses a branch delay slot.

  • @Malthael134
    @Malthael134 9 months ago +1

    Just in time for my new browser game🎉

  • @DanFigueras
    @DanFigueras 9 months ago

    Day 5, I'm still stuck with the window open. I tried leaving the house and coming back in. Rick is still singing.

  • @dan-cj1rr
    @dan-cj1rr 9 months ago

    No clue if this could be an interesting video, but teach us how to deploy to different environments (e.g. testing, production); as a junior I never know what this implies. Also show us tools to handle it. Thanks :)

  • @tinahalder8416
    @tinahalder8416 9 months ago +27

    In Python, handling race conditions is easy:
    use a Queue and a Lock 😊

    • @dan_le_brown
      @dan_le_brown 9 months ago +2

      I achieved something similar in TS, but rather than locking the queue, I ensured that the jobs that could cause a race condition had a predictable unique ID. By predictable, I mean a transaction reference/nonce...

    • @techtutorial9050
      @techtutorial9050 9 months ago +1

      Well, multiprocessing is much more mature than worker threads, since multiprocessing has been the primary method for concurrency in Python, but for JS it's always been async.
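
    The closest JavaScript analog is to keep shared state owned by a single thread and serialize updates through messages rather than locking; a sketch with Node worker_threads (./job.js is a placeholder):

      // main.js: the counter lives here, so updates can never race
      const { Worker } = require('worker_threads');
      let counter = 0;
      for (let i = 0; i < 4; i++) {
        const w = new Worker('./job.js');
        w.on('message', (delta) => { counter += delta; }); // applied one at a time
      }

      // job.js: report results instead of mutating shared memory
      const { parentPort } = require('worker_threads');
      parentPort.postMessage(1);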

  • @user-kt1qj2ok7e
    @user-kt1qj2ok7e 9 months ago

    That's hilarious

  • @RajitRoy_NR
    @RajitRoy_NR 9 months ago +1

    What are some of the useful libraries which help or use workers? Like Partytown or Comlink

  • @landscapesandmotion
    @landscapesandmotion 9 months ago

    Elixir is faster than I thought and getting faster with the new JIT compiler improvements.

  • @Xe054
    @Xe054 9 months ago

    Fireship, the "S" sounds in your video sound really harsh. Consider using a de-esser plugin or a regular compressor plugin and your stuff will sound fantastic. Cheers.

  • @s0up1e
    @s0up1e 9 months ago

    So weird, this was an interview question yesterday.

  • @dave6012
    @dave6012 9 months ago

    Mr Jeff, will you do one on creating a WebSocket server in Node.js?

  • @wlockuz4467
    @wlockuz4467 9 months ago +2

    I thought worker threads were virtual threads. You learn something new every day!

    • @shimadabr
      @shimadabr 9 months ago

      Aren't they? My understanding is that they are threads managed by the runtime, which in turn is responsible for allocating the appropriate number of real threads on the OS.

  • @jimbowers1298
    @jimbowers1298 9 months ago +2

    UNLIMITED VIEW TIMES!! AWESOME!! What a great video!

  • @BoloH.
    @BoloH. 9 months ago +4

    I once made a volume rendering thingie with Three.JS and it really, REALLY benefited from Web Workers, especially interpolation between Z slices.

    • @xinaesthetic
      @xinaesthetic 9 months ago +1

      Hang on… wouldn’t a volume-renderer in three.js be doing things like interpolation between z-slices in the fragment shader? Could certainly see workers being useful for some data processing (although texture data still needs to be pushed to the gpu in the main thread). Care to elucidate? Was it maybe interpolating XYZ over time, like with fMRI data or something? That would certainly benefit…

  • @soulofjack7294
    @soulofjack7294 9 months ago

    thanks for hanging

  • @DamonMedekMusic
    @DamonMedekMusic 9 months ago +1

    I've used the web worker API to filter multiple arrays at once, and it's okay, but it is very unintuitive to use and could definitely be improved upon. Ideally for multiple DOM manipulations at once too, not just data processing.

    • @Steel0079
      @Steel0079 9 months ago

      Now use the web worker where webpack is involved XD

    • @thecoolnewsguy
      @thecoolnewsguy 9 months ago

      @Steel0079 Vite is the future

  • @SinanWP
    @SinanWP 9 months ago

    7:40 I knew the joke was coming from a mile away
    nice one 😂😂😂😂

  • @jack171380
    @jack171380 9 months ago +1

    I wonder if there are things like mutex locks to help with the synchronisation of shared resources?
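
    There is a low-level equivalent: a SharedArrayBuffer plus Atomics can be used to build one. A sketch meant for worker code (Atomics.wait is not allowed on the browser main thread):

      // the lock lives at index 0 of a shared Int32Array: 0 = free, 1 = held
      const lock = new Int32Array(new SharedArrayBuffer(4));

      function acquire() {
        while (Atomics.compareExchange(lock, 0, 0, 1) !== 0) {
          Atomics.wait(lock, 0, 1);  // sleep until someone releases
        }
      }

      function release() {
        Atomics.store(lock, 0, 0);
        Atomics.notify(lock, 0, 1);  // wake one waiter
      }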

  • @ashishkkrishna
    @ashishkkrishna 9 months ago

    It's been a whole day and I'm still stuck on this page lol.

  • @boomshakalaka656
    @boomshakalaka656 9 months ago

    He definitely used an AI voice tool for the intro of this video

  • @victorpinasarnault9135
    @victorpinasarnault9135 9 months ago

    I saw this video of Code with Ryan.

  • @Greediium
    @Greediium 2 months ago +2

    I'M STILL STUCK OVER HERE, HELP!?!?!?!?
    MY PC WON'T SHUT DOWN, IT'S BEEN 5 MONTHS...
    Keep up the great work, love your vids!

  • @joebgallegos
    @joebgallegos 9 months ago

    I recently did a little side project where I needed to use a worker in a web app. The gist of the project: given a winning lottery number, how many “quick picks” or random tickets would it take to finally hit it?

  • @dovanminhan
    @dovanminhan 9 months ago

    Hi from Vietnam, where the kitchen image was taken.

  • @debarkamondal6406
    @debarkamondal6406 9 months ago

    Dude, he is hilarious

  • @riendlyf
    @riendlyf 9 months ago

    Can you cover native threads vs green threads?

  • @RegalWK
    @RegalWK 9 months ago +5

    Every async thing you do goes to the microtask or task queue, and every single one of them is executed on a different thread; once it's done, it comes back as a message to the microtask queue or task queue. When the call stack is empty, the event loop first takes from the microtask queue onto the call stack, and later from the task queue.
    Web workers, once they are done, also go back to the main thread's call stack

    • @RegalWK
      @RegalWK 9 months ago

      JS is still one thread

    • @alexanderpedenko6669
      @alexanderpedenko6669 9 months ago

      Where did you find this info? As far as I know, each thread has its own event loop, where micro and macro tasks execute

    • @RegalWK
      @RegalWK 9 months ago

      @alexanderpedenko6669 when you make some async operation like promises or timers (setTimeout, setInterval), the JS engine (V8 / whatever is in Node) notices that it's an async operation and delegates it to the proper web/Node API, which is written in C++. There its code executes, and once it's done those APIs return the result of that operation to some queue, and later the event loop moves it to the main thread
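
    The queue-ordering part is easy to see directly:

      setTimeout(() => console.log('task queue'), 0);
      Promise.resolve().then(() => console.log('microtask queue'));
      console.log('call stack');
      // prints: call stack, microtask queue, task queue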

  • @xcat4775
    @xcat4775 9 months ago

    "and I'll try to troubleshoot from there"

  • @PlayWithNiz
    @PlayWithNiz 9 months ago +1

    I'm thinking out loud here, but have a genuine question: could you use workers combined with something like husky to do all pre-commit/push/etc. checks at once?
    For example, I may have a large unit/integration test suite, followed by a large e2e test suite, along with code quality checks and so on... all of which are run in sequence, potentially taking upwards of a few minutes to complete.
    Could workers be used to run these jobs together at once?

    • @ra2enjoyer708
      @ra2enjoyer708 9 months ago

      E2E will bottleneck regardless, because of the quadrillion OS APIs it has to interact with on start, the majority of them synchronous.
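
    In practice those checks are separate CLI processes, so the parallelism would more likely come from spawning them concurrently than from worker threads. A rough sketch of what a pre-commit script could do (the npm script names are placeholders for whatever the project actually runs):

      const { exec } = require('child_process');
      const { promisify } = require('util');
      const run = promisify(exec);

      // kick everything off at once and fail if any job fails
      Promise.all([run('npm run lint'), run('npm run test:unit'), run('npm run test:e2e')])
        .then(() => process.exit(0))
        .catch((err) => { console.error(err.stdout || err.message); process.exit(1); });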

  • @AnwarulIslamYT
    @AnwarulIslamYT 9 months ago +1

    JavaScript is referred to as a high-level, single-threaded, garbage-collected, interpreted or JIT-compiled, prototype-based, multi-paradigm, dynamic language with a non-blocking event loop

    • @LedimLPMore
      @LedimLPMore 9 months ago

      And you can still program with multiple threads... 😂

  • @SeanJonesYT
    @SeanJonesYT 9 months ago

    It's been 4 hours and my computer has now caught fire and is playing the Interstellar theme song, help!

  • @adarshsaurabh7871
    @adarshsaurabh7871 9 months ago

    Well last party was fun😂