Why does JavaScript's fetch make me wait TWICE?

  • Published: Nov 23, 2024

Comments • 512

  • @hyperprotagonist
    @hyperprotagonist 2 месяца назад +418

    Damn this video is everywhere. Algorithm is on your side.

    • @tomontheinternet
      @tomontheinternet  2 месяца назад +20

      Yeah. I don’t know why. It had 100 views after the first 24 hours and then it exploded.

    • @hyperprotagonist
      @hyperprotagonist 2 месяца назад +7

      @@tomontheinternet well, I’m glad it did. You just earned another subscriber 🎉

    • @cdruc
      @cdruc 2 месяца назад +10

      @@tomontheinternet because it's great! it's straight to the point and answers something *many* people were wondering but never took the time to find out why/how it happens :D

    • @TheBuilder
      @TheBuilder Месяц назад

      @@tomontheinternet It's a strange, easy to understand question with a simple to the point thumbnail.

    • @etexas
      @etexas Месяц назад

      @@tomontheinternet I got this video recommended to me as well just now

  • @theyayaa
    @theyayaa 2 месяца назад +824

    Before you attack this man, please, AWAIT...he is a good man

    • @abdelrahmanhafez990
      @abdelrahmanhafez990 2 месяца назад +35

      I can tell he's a good man almost right away, but I needed to await to see what his content looks like.

    • @d.156
      @d.156 2 месяца назад +29

      .then(() => "I agree")

    • @antonpieper
      @antonpieper 2 месяца назад +3

      @@theyayaa nah, I had to Text decode his stream

    • @ArturDani
      @ArturDani 2 месяца назад +5

      not just once but twice, AWAIT... AWAIT...

    • @r-i-ch
      @r-i-ch 2 месяца назад +18

      Do you Promise?

  • @Ligma_Shlong
    @Ligma_Shlong 2 месяца назад +596

    tldw: the first await just waits for the headers, then the second waits for the body, because it's parsing a response stream (bytes come in incrementally), not the entire payload at once
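
    To illustrate what this comment describes, here is a minimal sketch of the two awaits (the URL is a placeholder):

    // Hypothetical endpoint; any JSON API behaves the same way.
    const response = await fetch("https://example.com/api/data");

    // First await resolved: only the status line and headers have arrived.
    console.log(response.status, response.headers.get("content-type"));

    // The body is still a stream; .json() resolves once it has fully
    // arrived and been parsed.
    const data = await response.json();
    console.log(data);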

    • @tomontheinternet
      @tomontheinternet  2 месяца назад +193

      That’s right Ligma Schlong!

    • @sulaimanshabbir
      @sulaimanshabbir 2 месяца назад +2

      ❤❤

    • @waldolemmer
      @waldolemmer 2 месяца назад +26

      Thanks, Ligma Shlong. It's what I suspected, but the HTTP hint at the start threw me off

    • @rakha8812
      @rakha8812 2 месяца назад +26

      Thank you Ligma Shlong!

    • @meyegui
      @meyegui 2 месяца назад +19

      Ligma Shlong da real MVP

  • @TheBswan
    @TheBswan 2 месяца назад +238

    This is great. Love the node server with no framework, and the UI with code snippets during demo. Learned two new things (headers arrive before body + how simple streaming responses can be) in 6 minutes. Really nice job here, thank you!

    • @tomontheinternet
      @tomontheinternet  2 месяца назад +14

      Glad you like that. I really enjoy how much you can do without pulling in libraries, especially for demonstration apps.

    • @thomasle100
      @thomasle100 2 месяца назад +7

      @@tomontheinternet I agree! And we can do a lot with vanilla JS in the frontend without frameworks :)

    • @beaucranston9586
      @beaucranston9586 2 месяца назад +1

      Don't ACTUALLY stream data like this. This was just a demonstration about how HTTP works. If you need live stream data, use a pipe (websocket, grpc, etc). Still a great video though.

    • @vryaboshapko
      @vryaboshapko Месяц назад +2

      > headers arrive before body
      This was quite a pain in the neck for PHP developers back when I had to deal with it 😵‍💫 Quite the flashbacks.

    • @Kriszzzful
      @Kriszzzful 25 дней назад +1

      @@vryaboshapko story time?

  • @shableep
    @shableep Месяц назад +13

    Summary:
    The first await: When you call fetch, it returns a promise that resolves as soon as the server responds with headers. This happens quickly, before the full response body is received. At this point, you only have access to the response metadata (like status code and headers).
    The second await: To get the actual response body content, you need to call a method like response.json(). This method also returns a promise, because the body content might still be streaming in from the server.
    The response body can be much larger than the headers and, in some cases, might take significant time to fully download.
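
    The same two-step shape, sketched with explicit .then() calls so the two promises are visible (placeholder URL):

    fetch("https://example.com/api/data")
      .then((response) => {
        // First promise resolved: status and headers only.
        console.log("status:", response.status);
        return response.json(); // second promise: the parsed body
      })
      .then((data) => console.log("body:", data));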

  • @mjdev-i1p
    @mjdev-i1p 2 месяца назад +366

    dude is doing the lord's work fighting against JS-Terminators

  • @ColinRichardson
    @ColinRichardson 2 месяца назад +49

    Just to note, you don't actually send your headers until line 34 of your server code, the first instance of `res.write()`. If you want to be sure you send the headers before the first body write, you will want `res.flushHeaders()`; at that point the headers are locked in and available for sending down the wire. `res.write()` ensures this internally as well, but you are doing that 2 ms after finalising your headers in your setInterval.
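
    A minimal Node sketch of the behaviour described here (not the video's actual server; the port and chunk contents are made up):

    import { createServer } from "node:http";

    createServer((req, res) => {
      res.writeHead(200, { "Content-Type": "text/plain" });
      res.flushHeaders(); // headers go out now, before any body bytes

      let count = 0;
      const timer = setInterval(() => {
        res.write(`chunk ${count++}\n`); // body bytes, sent incrementally
        if (count === 5) {
          clearInterval(timer);
          res.end();
        }
      }, 1000);
    }).listen(3000);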

    • @tomontheinternet
      @tomontheinternet  2 месяца назад +13

      Thank you. I didn’t know that. Makes sense!

    • @ColinRichardson
      @ColinRichardson 2 месяца назад +9

      @@tomontheinternet You can check this yourself by making your interval be say 10 seconds, and then you will notice that your "got response" console log is a lot later than "got it instantly".

    • @ivandamyanov
      @ivandamyanov Месяц назад

      This basically means that without flushHeaders, the first response is only sent when the first part of the body is written and that's when the headers will be included, right?

    • @ColinRichardson
      @ColinRichardson Месяц назад +6

      @@ivandamyanov correct, delaying it until the body is written gives you time to add (and potentially remove) headers right up until the last possible moment, i.e. until the body starts being sent. Once the body is being sent, all opportunity to modify the headers is gone. They are already sent... too late, better luck next time.
      But sometimes it's advantageous to send the headers early (with flushHeaders), while the body is still not ready to be sent. Though those cases are few and far between.
      The only times I can think of off the top of my head: one was using multipart image-replace, which was a dirty way of doing webcam streams, and the other was sending a large zip file, where we purposely added a delay so the recipient client could read some custom headers and decide to abort the connection before we actually started the zipping process, so as not to waste server resources.
      I am sure there are more examples, but I am getting old and those are the ones that come to mind right now.

    •  16 дней назад

      @@ivandamyanov `writeHead()` only fills in a property on the response object (it adds values to its headers collection, or whatever type that property is).
      `Response.flushHeaders()` and `Response.write()` are what actually trigger sending the response.

  • @charithaJayabahu
    @charithaJayabahu 2 месяца назад +19

    Bro went deep but explained everything crystal clearly. Hope you make more videos like this. Hats off.

  • @UliTroyo
    @UliTroyo 2 месяца назад +51

    Dang, what an illustrative demo.

  • @blarvinius
    @blarvinius 2 месяца назад +63

    I love this kind of DETAILED explanation ❤

  • @69k_gold
    @69k_gold 2 месяца назад +54

    This actually opens the way to a super cool optimization. If the status code is not 2XX, there's a chance some APIs return an entire web page (like Microsoft Azure with 503 errors). Waiting to retrieve that entire self-sufficient HTML document is such a waste of time. Instead, you can discard it and carry on with your other tasks.
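
    A sketch of that idea, assuming a placeholder URL: inspect the status from the first await and cancel the body stream instead of downloading the error page.

    const response = await fetch("https://example.com/api/report");

    if (!response.ok) {
      // Discard the (possibly huge) error page without reading it.
      await response.body?.cancel();
      throw new Error(`Request failed with status ${response.status}`);
    }

    const data = await response.json();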

    • @Georgggg
      @Georgggg 2 месяца назад +4

      I doubt that would give you anything in practice. The rule of thumb is 14 kB per request, which will be sent in one TCP round trip anyway, regardless of whether it is 1 kB or 10 kB.

    • @micheloe
      @micheloe 2 месяца назад +8

      ​@@Georgggg Ethernet maximum transmission unit (MTU) is 1500 bytes. With TCP on top you get 1460 bytes, which is roughly 1.4kB and usually what you see flying around on the internet as well. That is nowhere near the 14k you mention. Where did you get that number from? There is something called jumbo frames, which ups the MTU to 9000 bytes, but that usually doesn't extend to the internet.

    • @eneg_
      @eneg_ 2 месяца назад +2

      Perhaps the difference lies in b vs B

    • @callowaysutton
      @callowaysutton Месяц назад

      @@micheloe he's talking about the TCP slow-start algorithm: an initial window of 10 segments is sent first and doubled each round trip until something drops

  • @풍월상신
    @풍월상신 2 месяца назад +9

    '텀' wasn't just a designed logo, it was intentional Hangul... That's lovely.
    And this video was helpful for me. Thanks 텀.

    • @sivasankaransomaskanthan8264
      @sivasankaransomaskanthan8264 2 месяца назад +1

      what does that mean ?

    • @풍월상신
      @풍월상신 2 месяца назад +5

      @@sivasankaransomaskanthan8264 Hangul is the Korean alphabet and writing system.
      'ㅌ' is pronounced like 'T', while 'ㅓ' is like 'O' and 'ㅁ' is 'M'.
      Altogether, '텀' is 'Tom' in Hangul.

    • @sivasankaransomaskanthan8264
      @sivasankaransomaskanthan8264 2 месяца назад +2

      @@풍월상신 Thanks for the answer. Now I feel like I have learned to read the Korean language. That's cool

    • @elliotwaite
      @elliotwaite Месяц назад

      @@풍월상신 It also looks like the letters T O M going around clock-wise (but the ㅓ is the T and the ㅌ is the M).

    • @tomontheinternet
      @tomontheinternet  Месяц назад +1

      A lucky coincidence!

  • @LearnWebCode
    @LearnWebCode 2 месяца назад +2

    You're an amazing teacher; thanks for the great explanation and focusing on the "why" behind things.

  • @pranksterboss139
    @pranksterboss139 2 месяца назад +7

    Definitely didn't know this before watching this video, I just accepted that awaiting JSON is what you had to do. Nice video!

  • @appwala3728
    @appwala3728 2 месяца назад +10

    This is great, we need more deep knowledge like this. It's a request for you, sir.

  • @Pawansoni432
    @Pawansoni432 Месяц назад

    wow! never knew learning streams could be that easy! and today I got the idea of why we await two times for fetch: first the headers arrive, followed by the body.

  • @homesynthesis
    @homesynthesis Месяц назад +1

    For await feels incredibly cursed but I'm glad I now know about it and can't wait to use it as much as humanly possible

  • @RusuTraianCristian
    @RusuTraianCristian 18 дней назад

    The browser and server already do the heavy lifting of streaming the big piece of data in small chunks, incrementally (thus the 'streaming').
    So in the frontend, all you need to do is create a 'for await...of' loop (because it can await things) and build up a string/content from the incoming data. I've been doing this for years, because I've had many circumstances where we had to display things as they come in, rather than waiting for the whole thing to process and then BANG, fill up the screen.
    It's a nice, informative/educational video nonetheless. Well done!
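
    A minimal sketch of that incremental-display approach (placeholder URL; assumes response.body is async-iterable, as in Node 18+ and recent browsers, otherwise fall back to response.body.getReader()):

    const response = await fetch("https://example.com/api/stream");
    const decoder = new TextDecoder();
    let text = "";

    for await (const chunk of response.body) {
      text += decoder.decode(chunk, { stream: true });
      console.log("so far:", text); // or update the DOM here
    }
    text += decoder.decode(); // flush any bytes still buffered in the decoder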

  • @JonLynchIsAlive
    @JonLynchIsAlive 2 месяца назад +1

    This is actually a great explanation of the need for the second promise. This made me think of the fact that some developers will make a "HEAD" request to see if a resource is available. But it could just be done with that initial response using headers alone.

    • @jpisello
      @jpisello Месяц назад +1

      Using the HEAD request will be more efficient for the server, though, since it won't actually send the body (contents) of the requested URI. (Also, the requester _does_ actually get the content in a GET/POST request, whether your code reads it or not.) So altogether, if you just need to know whether a resource exists, use HEAD.
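
      A small sketch of the HEAD approach (placeholder URL):

      const response = await fetch("https://example.com/files/report.pdf", {
        method: "HEAD",
      });
      console.log("exists:", response.ok);
      console.log("size:", response.headers.get("content-length")); // may be null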

    • @JonLynchIsAlive
      @JonLynchIsAlive Месяц назад

      @@jpisello true!

  • @rayzecor
    @rayzecor Месяц назад

    I really appreciated the pride you have for this simple and small demonstration. The ending was also funny

  • @ariyoujahan9662
    @ariyoujahan9662 27 дней назад +2

    Wow, it was absolutely amazing.
    I need to check out your other videos (this was the first one).
    Hope I've finally found an advanced JS YouTube channel.

  • @danielghirasim2544
    @danielghirasim2544 2 месяца назад +15

    Very cool explanation, thank you! I am also getting attacked by killer robots when I code

  • @ogyct
    @ogyct Месяц назад

    Thanks for the video. It's good to see not only how to do things, but also why. With deep understanding and good examples.

  • @AurelioPita
    @AurelioPita 2 месяца назад +4

    Very well explained. We need more advanced content like this.

  • @quinndirks5653
    @quinndirks5653 Месяц назад

    Great video! I have been doing javascript for awhile, but this is the first time I've seen someone show an example of streaming data in from a fetch request... I didn't even know it was possible. Very cool!

  • @ayoubalfurjani4531
    @ayoubalfurjani4531 23 дня назад

    I like the simple explanation and the overall vibes.

  • @cjnewbs
    @cjnewbs Месяц назад

    Only really figured this out a couple of months ago when discussing it with a work colleague (I don’t write much JS at all). This is the thing that annoys me most about JS tutorials that explain promises: they explain what the await does (or the .then) and then always give the example using fetch, but won’t explain why there are 2 of them. I’m sat there like “did I miss something?!?”
    This video should be required watching for learning JS. Good job!

  • @paperC_CSGO
    @paperC_CSGO 2 месяца назад

    Really like videos like this, going into detail on specific, small things in web development. So many other videos out there are general overview courses, but for us junior devs these types of videos are gold to expand our knowledge

  • @imadetheuniverse4fun
    @imadetheuniverse4fun 2 месяца назад +20

    i don't know if the abrupt cutoff in the middle of endorsing your channel was intentional, but it was perfect imo. subbed! :P

  • @dazealex
    @dazealex 25 дней назад

    Nice stuff! Of course, as a Go programmer, I don't deal with nor look too deep into JS, but this was still useful and nicely illustrated. Give us more!

  • @AmitabhSuman
    @AmitabhSuman Месяц назад

    Supraaawesome!
    The interface, the code and the excitement with which you explain this. Thank you!

  • @r1konTheAutomator
    @r1konTheAutomator Месяц назад

    This is my first time finding your channel. I really like listening to you talk lol - but nothing beats the outro. I almost spit coffee out of my face when it cut off at the very end. I looked away from the screen totally expecting "like, share, subscribe, patreon, etc" and got the "please distract these things, I really need he--" genius 🤣 You got a sub from me just on that.

  • @ApplyIT2021
    @ApplyIT2021 2 месяца назад +3

    Practical explanation is really amazing...

  • @douglasgabriel99
    @douglasgabriel99 2 месяца назад +3

    Already knew that, but the way this video explained it was so entertaining! Really great work!

  • @jamesdenmark1396
    @jamesdenmark1396 Месяц назад

    20 years of using JavaScript, and this is the first time I've learned this, thank you

  • @a_maxed_out_handle_of_30_chars
    @a_maxed_out_handle_of_30_chars 2 месяца назад +13

    simple and to the point, thank you :)

  • @sourav_kd
    @sourav_kd 2 месяца назад

    This is the first time I've come across someone explaining this topic! This could also be a great interview question. Thanks a lot. You gained a subscriber here.

  • @MAK_007
    @MAK_007 2 месяца назад

    love these types of videos showcasing things practically. thanks for the video. great work

  • @ThisAintMyGithub
    @ThisAintMyGithub 2 месяца назад +2

    Wow, I had no idea and I've been using Node for 5 years. Great video!

  • @exactzero
    @exactzero 2 месяца назад +1

    Simple, to the point but detailed. Subscribed.

  • @mkrzyzowski
    @mkrzyzowski 22 дня назад

    This is the first explanation that made me understand this. Thank you!

  • @TENNISMANIAC144
    @TENNISMANIAC144 2 месяца назад

    Thank you! FWIW, I bailed on the video early because as soon as you explained it at a high level I completely got it, but I'm commenting to hopefully balance out the impact of not watching the whole video on your performance metrics.

  • @callowaysutton
    @callowaysutton Месяц назад +1

    Oh that's pretty sweet, so you could get the content length of the body and make a nice little loading bar too, even if everything hasn't been received yet
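
    A sketch of that progress idea (placeholder URL); note that Content-Length may be missing on chunked or compressed responses, so the percentage is best-effort:

    const response = await fetch("https://example.com/big-file");
    const total = Number(response.headers.get("content-length")) || 0;
    let received = 0;

    for await (const chunk of response.body) {
      received += chunk.length;
      if (total) {
        console.log(`progress: ${Math.round((received / total) * 100)}%`);
      }
    }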

  • @eddienubes
    @eddienubes Месяц назад

    the outro is golden, thank you for the quality vid!

  • @tomaspecl1082
    @tomaspecl1082 2 месяца назад +3

    I don't use JavaScript, but I was using a Rust library that has basically the same structure of awaiting twice. I did not understand why it did that; I could only think that they made the decoding asynchronous, but that would be so weird. Now I think I know why they did the same thing.

  • @kal9421
    @kal9421 Месяц назад

    What I know is that response.json() can return an error if the JSON is malformed (same for JSON.parse), and having it in promise form helps us remember to catch the rejection if it fails for some reason. Anyway, I loved this video and I learned a new thing, thank you

  • @hackytech7494
    @hackytech7494 2 месяца назад

    Thank you very much for this video, learned a lot. I was constantly bothered by the thought of why I have to await two times, but now I know. Thanks to you

  • @realderek
    @realderek Месяц назад

    Great explanation. Seems very obvious now but I wasn’t thinking about the headers and body not coming in at the same time.

  • @cherubin7th
    @cherubin7th 23 дня назад +1

    Lifting one of the greatest mysteries ever.

  • @radvilardian740
    @radvilardian740 Месяц назад

    Hello, I came back for the closing scene only, still not forgetting the "give me 1000 dollars" though.

  • @MichaelChanCY
    @MichaelChanCY 2 месяца назад

    Wow! Your demonstration is perfect! Great work!

  • @antonpieper
    @antonpieper 2 месяца назад +3

    I didn't know you could create a TextDecoder and read the stream, very cool. I would have probably used SSE before this video

  • @victorlongon
    @victorlongon 23 дня назад

    Such a nice video. Very beautiful style and clear explanation! ❤

  • @PradeepMahato007
    @PradeepMahato007 28 дней назад

    This is informative and well explained. Thank you, Tom !! 🙂

  • @teraformerr
    @teraformerr 2 месяца назад

    very good and informative video, asked myself this week why we need to await 2 times

  • @itayperry8852
    @itayperry8852 Месяц назад

    Cool!! Very interesting to see that! I loved the ending with the decoder ♥️
    Thank you :)

  • @alibekcs
    @alibekcs Месяц назад

    Never knew streaming could be that easy. Amazing video!

  • @saravanasai2391
    @saravanasai2391 24 дня назад

    Hey, that is a great explanation. I understood that we wait for the complete response body to be transferred, but I was wondering when the response body gets converted into valid JSON. I know that conversion takes some time in the case of a huge payload.

  • @pabloenriquegorga4222
    @pabloenriquegorga4222 2 месяца назад

    New here, I clicked on the video and watched till the end. I understood the concept when you showed the documentation, but it was a great surprise to see the last part of the video, the TextDecoder part and the `for await` part, that was awesome. Please do another video about that part, I wonder what else it can be used for.
    By the way, new subscriber here too!

  • @antoinelb8509
    @antoinelb8509 2 месяца назад +7

    So why is there no second await with axios? (Maybe both are combined?)

    • @JagaSantagostino
      @JagaSantagostino 2 месяца назад +5

      Axios doesn’t use fetch internally, it predates fetch even existing by a few years; it is based on XMLHttpRequest

    • @Georgggg
      @Georgggg 2 месяца назад +3

      because it's a better API than native fetch

  • @testermobile834
    @testermobile834 Месяц назад +1

    Tom, can you make a video on the vocoder theme, the terminal UI change you did, & also how you edit the videos?

  • @hacktor_92
    @hacktor_92 2 месяца назад

    so... to my understanding, `response.json()` waits for the server's data stream to finish before doing the actual decode. Now I'm wondering if we can stream JSON content and progressively decode it (and render it in the DOM) as it comes in. I'm thinking of it from the perspective of huge data tables.
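
    Progressively decoding one huge JSON document needs an incremental parser, but a common workaround is newline-delimited JSON (one object per line), which can be parsed and rendered row by row as it streams in. A sketch, assuming a placeholder URL and a hypothetical renderRow() function:

    const response = await fetch("https://example.com/api/rows.ndjson");
    const decoder = new TextDecoder();
    let buffer = "";

    for await (const chunk of response.body) {
      buffer += decoder.decode(chunk, { stream: true });
      const lines = buffer.split("\n");
      buffer = lines.pop(); // keep the trailing partial line for the next chunk
      for (const line of lines) {
        if (line.trim()) renderRow(JSON.parse(line)); // renderRow is hypothetical
      }
    }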

  • @heyiammahadev
    @heyiammahadev 2 месяца назад +5

    Is there a chance that the server sends 200 first, but something happens while sending the body and a server error or bad request occurs?

    • @ShortFilmVD
      @ShortFilmVD 2 месяца назад +4

      Yes, that's why it's important to validate the body of the response and not rely on the http status header alone.
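
      A sketch of what that validation looks like in practice (placeholder URL): a 200 status does not guarantee an intact body, so treat parsing it as something that can still fail.

      const response = await fetch("https://example.com/api/data");
      let data;
      try {
        data = await response.json(); // rejects if the body is cut off or not valid JSON
      } catch (err) {
        console.error("body failed even though the status was OK:", err);
      }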

    • @hehimselfishim
      @hehimselfishim Месяц назад

      @@ShortFilmVD interesting, does this issue happen in axios as well? genuinely curious, never experienced this before.

    • @ShortFilmVD
      @ShortFilmVD Месяц назад

      @@hehimselfishim I don't think so, but I'd have to take a look at the code - it's been a while since I've used Axios.
      The main time this would happen is with a connection drop that doesn't get re-established in time for TCP (or QUIC nowadays) to finish the transfer properly. These days it's a pretty rare occurrence.

  • @HarshdeepSaiyam
    @HarshdeepSaiyam Месяц назад

    The text decoder was soo cool man 😍

  • @ugotworms
    @ugotworms 2 месяца назад

    ...just finished watching Lost, cool to see John Locke is coding now! Seriously though, good explanation, I never gave thought to why I had to await the `.json()` call, just figured it was some backwards-compatibility quirk..

  • @RioChandra
    @RioChandra 15 дней назад

    amazing, simple and easy to understand about stream

  • @hellotherenameishere
    @hellotherenameishere Месяц назад

    Haha that ending is so self-aware, easy sub 👍

  • @micheledibe
    @micheledibe Месяц назад

    Today I feel like more of a senior developer than yesterday.
    Well done video

  • @hi12167pies
    @hi12167pies 2 месяца назад

    awesome vid! i always wondered this but never bothered to actually find out why.

  • @foraminifera7001
    @foraminifera7001 2 месяца назад +2

    What is the theme of your terminal? I really like it :) Is it something like OhMyPosh or a Neovim theme?

  • @ericisawesome476
    @ericisawesome476 Месяц назад

    I was just wondering about this the other day, great video

  • @bige2899
    @bige2899 2 месяца назад

    I love this kind of videos, great explanation

  • @nimitsavant3127
    @nimitsavant3127 2 месяца назад

    amazing video 🥑 What are you using to record this?

  • @kusuma865
    @kusuma865 2 дня назад

    Doing God's work, loved the explanation and how simple it was to understand

  • @rabbyhossain6150
    @rabbyhossain6150 Месяц назад

    that's cool. I've used fetch many times, but I thought the first promise actually fetched everything. Now I know it's just the headers.

  • @IamProcrastinatingRightNow
    @IamProcrastinatingRightNow Месяц назад

    After opening MDN, I thought, oh, that makes sense. Alright, I can quit the video early... Didn't find anything else to watch, so I watched till the end, with one eye looking in another window for something new to watch. But that streaming JSON was pretty cool. Worth watching the rest. Glad I stayed.

  • @yuvaleliav3709
    @yuvaleliav3709 Месяц назад

    normal youtubers: please subscribe
    this random guy: can you please distract the killer robots in this random picture while I finish my sente

  • @N_N_N
    @N_N_N 2 месяца назад

    Short and to the point, great video.

  • @louislecouturier
    @louislecouturier 2 месяца назад

    I always wondered ! Thanks for the answer and the examples !

  • @pixiedev
    @pixiedev 2 месяца назад

    I haven't thought about that. nice man simple clear crisp details 👌

  • @SigSeg-V
    @SigSeg-V Месяц назад +18

    0:15 who's Jason?

  • @andreidmt-asd
    @andreidmt-asd Месяц назад

    Nailed the background music.

    • @dvdrtrgn
      @dvdrtrgn 27 дней назад

      Background music is as pointless as a disco ball in this context

  • @hitdong
    @hitdong 2 месяца назад

    waaaait a second! it's 텀! wow. I'm surprised as a Korean. I thought I saw an illusion, so I had to rewind and play it again. LOL nice to meet you 텀

  • @gabrielruffo6439
    @gabrielruffo6439 7 дней назад

    What happens if the server returning the body goes down while providing the response body byte by byte?

  • @iulikdev
    @iulikdev 2 месяца назад

    You explained it very well. Nice. A long time ago I had a hard time understanding this.

  • @95sita
    @95sita Месяц назад

    Very educative. By the way, which IDE are you using in this video? Thanks in advance.

  • @dennischen2922
    @dennischen2922 2 месяца назад +6

    Your Neovim setup looks gorgeous, could you share your dot files?

    • @tomontheinternet
      @tomontheinternet  2 месяца назад +1

      github.com/tom-on-the-internet/dotfiles
      Enjoy!

  • @jamashe
    @jamashe Месяц назад +1

    Could you share your neovim config? It looks pretty nice.
    And thanks.

  • @colorfulmoth
    @colorfulmoth 2 месяца назад +1

    thank you kind sir for this byte sized knowledge stream

  • @nicholas_obert
    @nicholas_obert 2 месяца назад +1

    How does JS know when the async for loop is finished? Does it check whether the "body" property is bound to a promise? Or maybe the "body" property is a Promise type and it gets converted into a blob type (or whatever JS type that is) only after it has been awaited so that typeof(response.body) is a promise and typeof(await response.body) is the actual body?

    • @msc8382
      @msc8382 2 месяца назад +1

      I believe they've implemented something similar to co-routines. This is the event loop in javascript. I suspect that the stack can be frozen and moved away from the event loop. In such a case, the await operation would move the suspended stack from the event loop, and the new promise is added to the end of the queue for promises.
      The event loop also takes promises from this queue if the event loop queue itself is empty. This means there are two levels of event handling: client events, timeouts and main JavaScript on one side, and promises on the other. Eventually, the event loop processes the new promise, which has an association to the earlier stack due to the await operation.
      That's how the promise would cause the stack to be recovered and unfrozen, now with the result available.
      In normal co-routine implementations, you would have multiple threads where one main thread manages the order and locking of the operational threads. In an earlier version of Unity game engine for example, the co-routine implementation was used because then you could interact with the game engine from the main thread, which was the only real way to use it at the time. Real parallelisation can be hard, if you don't know about CAS operations.
      I've never personally understood why javascript implementations aren't multi-threaded. But I'm fairly happy with their async implementations. Could be worse.
      Perhaps this was not the answer you were looking for, but I think it should help you answer your own question. For loops or not, promises are promises, handled the same no matter what operation they represent. I can imagine that they've added more types of data to the promises hidden in the engine, such as iterator information. As long as they use a single thread or CAS operations, it's fairly safe to find the next key that wasn't processed yet. I've personally programmed in C++11, where this was first released. I also use async a lot in JavaScript.
      PS: I create a lot of async arrow functions that use the bound scope of the underlying function to compute and hold information until an operation completes. The underlying function does complete and releases that part of the stack; the scope itself is preserved due to the capture mechanics around closures. I find it an easy way to return signal registers or execute an async operation inline while letting big or long call stacks be released. The benefit is that you still get the full call stack trace during errors, because the connection is detected via the capturing, even though the actual underlying stack has been released. I've never had performance issues on my web interfaces using this technique, and no corruption of data over time either. This was probably a bit vague, sorry for that. I've never seen anybody use async operations the way I do. To be exact: I do it so that all async operations complete as soon as possible, and by returning from the underlying function immediately, the event loop resumes the main events first. This ensures a reactive user interface regardless of what happens on the async queue. In my practical experience, it often means that input processing is delayed, but all HTML input and CSS stays fully reactive. That's how I want my interfaces to be.
      Cheers

    • @thomasle100
      @thomasle100 2 месяца назад +1

      hello!
      You can check my other response. I guess the async for loop is just syntactic sugar on top of a reader. You can get the reader by calling getReader() on response.body. On the reader, you have a read() method that returns a promise. The resolved object contains 2 properties: "done", which is a boolean, and a chunk of data. So you have to keep calling read() until done is true.
      Now imagine you create an async generator: you await the result of read() and then yield each chunk of data. You'll get an async iterator that you can use with the for await loop syntax. I think that's what is done in the source code, but I didn't check.
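
      A sketch of the reader loop described above, wrapped in an async generator so it can be consumed with for await...of (placeholder URL):

      async function* chunks(stream) {
        const reader = stream.getReader();
        try {
          while (true) {
            const { done, value } = await reader.read();
            if (done) return;
            yield value; // one Uint8Array chunk at a time
          }
        } finally {
          reader.releaseLock();
        }
      }

      const response = await fetch("https://example.com/api/stream");
      for await (const chunk of chunks(response.body)) {
        console.log("got", chunk.length, "bytes");
      }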

    • @nicholas_obert
      @nicholas_obert 2 месяца назад

      Makes sense

  • @Blue-bb9ro
    @Blue-bb9ro 2 месяца назад

    As a web dev, very interesting stuff, thanks 🙏

  • @AnasImloul
    @AnasImloul 22 дня назад

    In HTTP, we have a Content-Length header that indicates the number of bytes in the response body. Given that the await in the first fetch operation retrieves only the headers, does this imply that the Content-Length header cannot be set correctly before the entire body is streamed?

    • @AnasImloul
      @AnasImloul 22 дня назад

      Shouldn't the Content-Length header be specified by the server before it begins streaming the response? In your use case, you used res.end() to signal the end of the response body, which means that until res.end() is called, there’s no way to determine the Content-Length value.
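
      That matches how HTTP handles it: a response whose size isn't known up front is typically sent without Content-Length, using chunked transfer encoding instead, so the client may simply never see a length. A small client-side sketch (placeholder URL):

      const response = await fetch("https://example.com/api/stream");

      // Often null for streamed/chunked responses; only bodies whose size
      // is known before sending carry a Content-Length header.
      const length = response.headers.get("content-length");
      console.log(length ?? "no Content-Length (body is streamed/chunked)");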

  • @tiltMOD
    @tiltMOD 2 месяца назад

    This is beyond informative! Thank you!

  • @0xc0ffee_
    @0xc0ffee_ Месяц назад

    Very interested in your nvim configuration. It looks very clean :) Working on remodelling mine

  • @danielgilleland8611
    @danielgilleland8611 2 месяца назад +1

    ... which raises the question: what if all you want is the header information? Can you then send a cancellation after getting the response, so that the server can dispense with sending the rest of the body?

    • @andrewcraig8177
      @andrewcraig8177 2 месяца назад +3

      Use the HEAD method instead of GET
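
      Another option is to abort after inspecting the headers, so the transfer is torn down instead of downloaded. A sketch (placeholder URL):

      const controller = new AbortController();
      const response = await fetch("https://example.com/big-report", {
        signal: controller.signal,
      });

      console.log("type:", response.headers.get("content-type"));
      controller.abort(); // closes the connection; the body is never read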

  • @kaisersakhi4239
    @kaisersakhi4239 Месяц назад

    Great video! Which font are you using?

  • @johan7999
    @johan7999 2 месяца назад

    Thank you for teaching me something new today!

  • @dopetag
    @dopetag 2 месяца назад

    This is pure gold.

  • @capability-snob
    @capability-snob 2 месяца назад

    It's a bit of a shame that when promises came to JavaScript from E, we didn't also get the eventual send operator.

  • @filipesommer8253
    @filipesommer8253 2 месяца назад

    Very interesting. I wonder if there would be a way to only need one await if we knew the response was a non-streaming request?