Damn this video is everywhere. Algorithm is on your side.
Yeah. I don’t know why. It had 100 views after the first 24 hours and then it exploded.
@@tomontheinternet well, I’m glad it did. You just earned another subscriber 🎉
@@tomontheinternet because it's great! it's straight to the point and answers something *many* people were wondering but never took the time to find out why/how it happens :D
@@tomontheinternet It's a strange, easy-to-understand question with a simple, to-the-point thumbnail.
@@tomontheinternet I got this video recommended to me as well just now
Before you attack this man, please, AWAIT...he is a good man
I can tell he's a good man almost right away, but I needed to await to see what his content looks like.
.then(() => "I agree")
@@theyayaa nah, I had to Text decode his stream
not just once but twice, AWAIT... AWAIT...
Do you Promise?
tldw: the first await just waits for the headers, then the second waits for the body, because it's parsing a response stream (bytes come in incrementally), not the entire payload at once
That’s right Ligma Schlong!
❤❤
Thanks, Ligma Shlong. It's what I suspected, but the HTTP hint at the start threw me off
Thank you Ligma Shlong!
Ligma Shlong da real MVP
Summary:
The first await: When you call fetch, it returns a promise that resolves as soon as the server responds with headers. This happens quickly, before the full response body is received. At this point, you only have access to the response metadata (like status code and headers).
The second await: To get the actual response body content, you need to call a method like response.json(). This method also returns a promise, because the body content might still be streaming in from the server.
The response body can be much larger than the headers and, in some cases, might take significant time to fully download.
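A minimal sketch of those two awaits in code (the endpoint URL is hypothetical; run it inside an async function or a module with top-level await):

```
// First await: resolves as soon as the status line and headers arrive.
const response = await fetch("/api/data"); // hypothetical endpoint

console.log(response.status);                      // metadata, available now
console.log(response.headers.get("content-type")); // also available now

// Second await: resolves once the full body has arrived and been parsed.
const data = await response.json();
console.log(data);
```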
dude is doing the lord's work fighting against JS-Terminators
haha
This is great. Love the node server with no framework, and the UI with code snippets during demo. Learned two new things (headers arrive before body + how simple streaming responses can be) in 6 minutes. Really nice job here, thank you!
Glad you like that. I really enjoy how much you can do without pulling in libraries, especially for demonstration apps.
@@tomontheinternet I agree! And we can do a lot with vanilla JS in the frontend without frameworks :)
Don't ACTUALLY stream data like this. This was just a demonstration about how HTTP works. If you need live stream data, use a pipe (websocket, grpc, etc). Still a great video though.
> headers arrive before body
This thing was quite a big pain in the neck for PHP developers back when I had to deal with it 😵💫 Just a portion of flashbacks.
@@vryaboshapko story time?
Just to note, you don't actually send your headers until line 34 of your server code, the first instance of `res.write()`. If you wanted to be sure you send the headers before the first body write, you would want `res.flushHeaders()`; at that point the headers are locked and available for sending down the wire. `res.write()` ensures this internally also, but you are doing that 2ms after finalising your headers in your setInterval.
Thank you. I didn’t know that. Makes sense!
@@tomontheinternet You can check this yourself by making your interval, say, 10 seconds; then you will notice that your "got response" console log is a lot later than "got it instantly".
This basically means that without flushHeaders, the first response is only sent when the first part of the body is written and that's when the headers will be included, right?
@@ivandamyanov correct. Delaying it till the writing of the body allows you time to add (and potentially remove) headers from the header section until the last possible moment, e.g. as the body is about to be sent. Once the body is being sent, all opportunity to modify the headers is gone. They are already sent.. Too late.. better luck next time.
But sometimes it's advantageous to send the headers early (with flushHeaders), while the body is still not ready to be sent. Though these cases are few and far between.
The only times I can think of off the top of my head: one was using multipart image replace (multipart/x-mixed-replace), which was a dirty way of doing webcam streams, and the other was sending a large zip file, where we purposely added a delay so the recipient client could read some custom headers and decide to abort the connection before we actually started the zipping process, so as not to waste server resources..
I am sure there are more examples, but I am getting old and those are the ones that come to mind right now..
@@ivandamyanov `writeHead()` only fills in a property of the object (adds values to `Response.Headers[]` or whatever type that property is).
`Response.flushHeaders()` and `Response.write()` actually trigger sending the response object.
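A minimal Node sketch of the difference (no framework; the route and timings are illustrative, not the video's actual code):

```
const http = require("node:http");

http.createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "text/plain" });
  // Without this call, the headers would only hit the wire with the
  // first res.write() below, i.e. one second from now.
  res.flushHeaders();

  let count = 0;
  const interval = setInterval(() => {
    res.write(`chunk ${count}\n`); // body bytes, sent after the headers
    count += 1;
    if (count === 5) {
      clearInterval(interval);
      res.end();
    }
  }, 1000);
}).listen(3000);
```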
Bro went deep but explained everything crystal clearly. Hope you make more videos like this. Hats off.
Dang, what an illustrative demo.
I really appreciated the pride you have for this simple and small demonstration. The ending was also funny
This actually opens the way to a super cool optimization. If the status code is not 2XX, there's a chance some APIs return an entire web page (like Microsoft Azure with 503 errors). Waiting to retrieve that entire self-sufficient HTML document is such a waste of time. Instead, you can flush it out and carry on with your other tasks.
I doubt that would give you anything in practice. The rule of thumb is 14 kB per request, which will be sent in one TCP round trip anyway, regardless of whether it is 1 kB or 10 kB.
@@Georgggg Ethernet maximum transmission unit (MTU) is 1500 bytes. With TCP on top you get 1460 bytes, which is roughly 1.4kB and usually what you see flying around on the internet as well. That is nowhere near the 14k you mention. Where did you get that number from? There is something called jumbo frames, which ups the MTU to 9000 bytes, but that usually doesn't extend to the internet.
Perhaps the difference lies in b vs B
@@micheloe he's talking about the TCP slow start algorithm: 10 packets are sent first, then the window doubles each round trip until something drops.
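(For reference, the arithmetic behind that rule of thumb: the initial congestion window on modern stacks is 10 segments per RFC 6928, and 10 × ~1460 bytes of TCP payload per segment ≈ 14.6 kB, which is roughly the 14 kB that can be in flight before the server has to wait for the first ACK.)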
Really great demo and explanation, Tom. I have mostly used Axios for my projects, but when I started using fetch, this was the very first question on my mind: why the hell do I have to wait twice to get my JSON data?
Thanks for this!!!
I love this kind of DETAILED explanation ❤
This is great, we need more deep knowledge like this. It's a request for you, sir.
Definitely didn't know this before watching this video, I just accepted that awaiting JSON is what you had to do. Nice video!
For await feels incredibly cursed but I'm glad I now know about it and can't wait to use it as much as humanly possible
You're an amazing teacher; thanks for the great explanation and focusing on the "why" behind things.
Oh that's pretty sweet, so you could get the content length of the body and make a nice little loading bar too, even if everything hasn't been received yet
'텀' wasn't just a designed logo, it was intentional Hangul.. That's lovely.
And this video was helpful for me. Thanks 텀.
what does that mean ?
@@sivasankaransomaskanthan8264 Hangul is the Korean alphabet and writing system.
'ㅌ' is like 'T' in pronunciation, while 'ㅓ' is like 'O' and 'ㅁ' is 'M'.
Altogether, '텀' is 'Tom' in Hangul.
@@풍월상신 Thanks for the answer. Now I feel like I've learned to read the Korean language. That's cool.
@@풍월상신 It also looks like the letters T O M going around clock-wise (but the ㅓ is the T and the ㅌ is the M).
A lucky coincidence!
Very well explained. We need more advanced content like this.
wow! never knew learning streams could be that easy! and today I got the idea of why we await two times for the fetch: first the headers arrive, followed by the body.
Great video! I have been doing javascript for awhile, but this is the first time I've seen someone show an example of streaming data in from a fetch request... I didn't even know it was possible. Very cool!
The browser and server already do the heavy lifting of streaming and parsing the big piece of data in small chunks, incrementally (thus the 'streaming').
So in the frontend, all you need to do is create a 'for await...of' loop (because it can 'await' things) and build up a string/content from the incoming data. I've been doing this for years because I have many circumstances where we had to display things as they come in, rather than waiting for the whole thing to process and then BANG, fill up the screen.
It's a nice, informative/educational piece of video nonetheless. Well done!
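A sketch of that pattern, assuming a runtime where `response.body` is async-iterable (Node 18+ and recent browsers; `render` is a hypothetical display function, and the endpoint is illustrative):

```
const response = await fetch("/api/stream"); // hypothetical endpoint
const decoder = new TextDecoder();

let content = "";
for await (const chunk of response.body) {
  // Each chunk is a Uint8Array; { stream: true } handles multi-byte
  // characters that are split across chunk boundaries.
  content += decoder.decode(chunk, { stream: true });
  render(content); // hypothetical: update the UI with what we have so far
}
content += decoder.decode(); // flush any bytes the decoder buffered
```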
This is my first time finding your channel. I really like listening to you talk lol - but nothing beats the outro. I almost spit coffee out of my face when it cut off at the very end. I looked away from the screen totally expecting "like, share, subscribe, patreon, etc" and got the "please distract these things, I really need he--" genius 🤣 You got a sub from me just on that.
This is actually a great explanation of the need for the second promise. This made me think of the fact that some developers will make a "HEAD" request to see if a resource is available. But it could just be done with that initial response using headers alone.
Using the HEAD request will be more efficient for the server, though, since it won't actually send the body (contents) of the requested URI. (Also, the requester _does_ actually get the content in a GET/POST request, whether your code reads it or not.) So altogether, if you just need to know whether a resource exists, use HEAD.
@@jpisello true!
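A sketch of that existence check with HEAD (the URL is illustrative):

```
// Only status + headers come back; no body is transferred.
const res = await fetch("/files/report.pdf", { method: "HEAD" }); // hypothetical URL

if (res.ok) {
  console.log("exists, size:", res.headers.get("content-length"));
} else {
  console.log("not available:", res.status);
}
```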
Already knew that, but the way this video explained it was so entertaining! Really great work!
i don't know if the abrupt cutoff in the middle of endorsing your channel was intentional, but it was perfect imo. subbed! :P
Intentional!
Supraaawesome!
The interface, the code and the excitement with which you explain this. Thank you!
Thanks for the video. It's good to see not only how to do things, but also why. With deep understanding and good examples.
Wow, it was absolutely amazing.
I need to check out your other videos (this was the first one)
Hope I'm finally finding some advanced JS RUclips channel.
This is the first time I've come across someone explaining this topic! This could also be a great interview question. Thanks a lot. You gained a subscriber here.
20 years of using JavaScript, and this is the first time I've learned this. Thank you!
I like the simple explanation and the overall vibes.
Nice stuff! Of course, as a Go programmer, I don't deal with JS or look too deeply into it, but this was still useful and nicely illustrated. Give us more!
Really like videos like this, going into detail on specific, small things in web development. So many other videos out there are general overview courses, but for us junior devs these types of videos are gold to expand our knowledge.
Only really figured this out a couple months ago when discussing it with a work colleague. (I don't write much JS at all.) This is the thing that annoys me most about JS tutorials that explain promises. They explain what the await does (or the .then) and then will always give the example using fetch, but won't explain why there's 2 of them; I'm sat there like "did I miss something?!?"
This video should be required watching for learning js. Good job!
Practical explanation is really amazing...
Wow, I had no idea and I've been using Node for 5 years. Great video!
0:15 who's Jason?
BA DUM TSSS
😂😂😂
the outro is golden, thank you for the quality vid!
Simple, to the point but detailed. Subscribed.
love these types of videos showcasing things practically. thanks for the video. great work
What I know is that response.json can return an error if the JSON is malformed (same for JSON.parse), and having it in promise form helps us remember to catch the promise if it fails for some reason. Anyway, I loved this video and I learned a new thing, thank you.
I don't use JavaScript, but I was using a Rust library that has basically the same structure of awaiting twice. I did not understand why it did that; I could only think that they made the decoding asynchronous, but that would be so weird. Now I think I know why they did the same thing.
Very cool explanation, thank you! I am also getting attacked by killer robots when I code
Glad I’m not alone
simple and to the point, thank you :)
This is the first explanation that made me understand this. Thank you!
Never knew streaming could be that easy. Amazing video!
Great explanation. Seems very obvious now but I wasn’t thinking about the headers and body not coming in at the same time.
Lifting one of the greatest mysteries ever.
Wow! Your demonstration is perfect! Great work!
Thank you! Fwiw, I bailed on the video early because as soon as you explained it at a high level I completely got it, but I'm commenting to hopefully balance out the impact on performance of not watching the whole video.
Today I feel like more of a senior developer than yesterday.
well done video
Cool!! Very interesting to see that! I loved the ending with the decoder ♥️
Thank you :)
Such a nice video. Very beautiful style and clear explanation! ❤
I was just wondering about this the other day, great video
Hello, I came back for the closing scene only, still not forgetting the "give me 1000 dollars" though.
Tom, can you make a video on the vocoder theme / terminal UI change you did, and also how you edit the videos?
Hey, that is a great explanation. I understood that we wait for the complete response body to be transferred, but I was wondering when the response body is converted into valid JSON. I know that conversion takes some time in the case of a huge payload.
So why is there no second await in Axios? (maybe both are combined?)
Axios doesn't use fetch internally; it predates fetch even existing by a few years. It is based on XMLHttpRequest.
Because it's a better API than native fetch.
Thank you very much for this video, I learned a lot. I was constantly being disturbed by the thought of why I have to await two times, but now I know. Thanks to you!
The text decoder was soo cool man 😍
Haha that ending is so self-aware, easy sub 👍
very good and informative video, asked myself this week why we need to await 2 times
What was that ending?
normal youtubers: please subscribe
this random guy: can you please distract the killer robots in this random picture while I finish my sente
Your Neovim setup looks gorgeous, could you share your dot files?
github.com/tom-on-the-internet/dotfiles
Enjoy!
In HTTP, we have a Content-Length header that indicates the number of bytes in the response body. Given that the await in the first fetch operation retrieves only the headers, does this imply that the Content-Length header cannot be set correctly before the entire body is streamed?
Shouldn't the Content-Length header be specified by the server before it begins streaming the response? In your use case, you used res.end() to signal the end of the response body, which means that until res.end() is called, there’s no way to determine the Content-Length value.
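That's essentially right: if the server starts writing before it knows the total size, it cannot send Content-Length, so HTTP/1.1 falls back to `Transfer-Encoding: chunked` (Node does this automatically when you call `res.write()` without having set a length). A sketch of checking for that on the client (the endpoint is hypothetical):

```
const response = await fetch("/api/stream"); // hypothetical endpoint

const length = response.headers.get("content-length");
if (length !== null) {
  console.log(`body is ${length} bytes`); // size known up front
} else {
  // Typically Transfer-Encoding: chunked; total size unknown until the end.
  console.log("streamed response, length unknown");
}
```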
awesome vid! i always wondered this but never bothered to actually find out why.
Amazing, simple, and easy-to-understand explanation of streams.
waaaait a second! it's 텀! wow. I'm surprised, as a Korean. I thought I was seeing an illusion, so I had to rewind and play it again. LOL nice to meet you 텀
This is informative and well explained. Thank you, Tom !! 🙂
How does JS know when the async for loop is finished? Does it check whether the "body" property is bound to a promise? Or maybe the "body" property is a Promise type and it gets converted into a blob type (or whatever JS type that is) only after it has been awaited so that typeof(response.body) is a promise and typeof(await response.body) is the actual body?
I believe they've implemented something similar to co-routines. This is the event loop in JavaScript. I suspect that the stack can be frozen and moved away from the event loop. In such a case, the await operation would move the suspended stack off the event loop, and the new promise is added to the end of the queue for promises.
The event loop also takes promises from this queue if the event loop queue itself is empty. This means there are two levels of event handling: client events, timeouts, and main JavaScript on one level, and promises on the other. Eventually, the event loop processes the new promise, which has an association with the earlier stack due to the await operation.
That's how the promise would cause the stack to be recovered and unfrozen, now with the result available.
In normal co-routine implementations, you would have multiple threads where one main thread manages the order and locking of the operational threads. In an earlier version of the Unity game engine, for example, the co-routine implementation was used because then you could interact with the game engine from the main thread, which was the only real way to use it at the time. Real parallelisation can be hard if you don't know about CAS operations.
I've never personally understood why JavaScript implementations aren't multi-threaded. But I'm fairly happy with their async implementations. Could be worse.
Perhaps this was not the answer you were looking for, but I think it should help you answer your own question. for loops or not, promises are promises, handled the same no matter what operation they represent. I can imagine that they've added more types of data to the promises hidden in the engine, such as iterator information. As long as they use a single thread or CAS operations, it's fairly safe to find the next key that wasn't processed yet. I've personally programmed in C++11, where this was first released. I also use async a lot in JavaScript.
PS: I create a lot of async arrow functions that use the bound scope of the underlying function to compute and hold information until an operation completes. The underlying function does complete and releases that part of the stack. The scope itself is preserved due to the capture mechanics around variable functions. I find it an easy way to return signal registers or execute an async operation inline while letting big or long call stacks be released. The benefit is that you still get the full call stack trace during errors, because the engine detects the connection due to the capturing, but the actual underlying stack has been released. I've never had performance issues on my web interfaces using this technique, and no corruption of data over time either. This was probably a bit vague, and sorry for that. I've never seen anybody use async operations the way I do. To be exact: I do it so that all async operations complete as soon as possible, and by returning from the underlying function immediately, the event loop resumes the main events first. This ensures a reactive user interface regardless of what happens on the async queue. In my practical experience, it often means that input processing is delayed, but all HTML input and CSS stays fully reactive. That's how I want my interfaces to be.
Cheers
hello !
You can check my other response. I guess the async for loop is just syntactic sugar on top of a reader. You can get the reader by calling the method getReader on the response. The reader has a method read that returns a promise. The resolved object contains 2 properties: "done", which is a boolean, and a data chunk. So you have to keep calling the read method until the done property is true.
So now imagine you create an async generator: you await the result of read and then you yield each chunk of data. You'll get an async iterator that you can use with the for await loop syntax. I think that's what is done in the source code, but I didn't check.
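A sketch of that wrapper, using the documented `getReader()`/`read()` API (the generator itself is just an illustration of what the engine might be doing; the endpoint is hypothetical):

```
async function* chunks(response) {
  const reader = response.body.getReader();
  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done) return;
      yield value; // value is a Uint8Array chunk
    }
  } finally {
    reader.releaseLock();
  }
}

const response = await fetch("/api/stream"); // hypothetical endpoint
for await (const chunk of chunks(response)) {
  console.log("received", chunk.byteLength, "bytes");
}
```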
Makes sense
Short and to the point, great video.
Is there a chance that the server sends 200 first, but something happens while sending the body and a server error or bad request occurs?
Yes, that's why it's important to validate the body of the response and not rely on the HTTP status alone.
@@ShortFilmVD interesting, does this issue happen in axios as well? genuinely curious, never experienced this before.
@@hehimselfishim I don't think so, but I'd have to take a look at the code - it's been a while since I've used Axios.
The main time this would happen is with a connection drop that doesn't get re-established in time for TCP (or QUIC nowadays) to finish the transfer properly. These days it's a pretty rare occurrence.
I love this kind of video, great explanation.
new here, i clicked on the video and i watched till the end. I understood the concept when you showed the documentation, but it was a great surprise to see the last part of the video, the "textDecode" part and the "for await", that was awesome. Please do another video about that part, I wonder what else it can be used for.
by the way, new subscriber here too!
Using it in a catchy way that is not the standard form. You should be handling the response using a then() method to help resolve the promise returned by the fetch. You don't need to use an additional await.
'then' sucks because nested code is ugly and hard to read, maintain, and debug. It is an anti-pattern. Google 'javascript refactoring then to await'.
What happens if the server goes down while providing the response body byte by byte?
You explained it very well. Nice. A long time ago I had a hard time understanding this.
that's cool. I used "fetch" many times, but I thought the first promise actually fetched everything. Now I know it's just the header content.
It's a bit of a shame that when promises came to JavaScript from E, we didn't also get the eventual send operator.
Very interesting. I wonder if there would be a way to only need one await if we knew the response was non-streaming?
After opening MDN, I thought, oh, that makes sense. Alright, I can quit the video early... Didn't find anything else to watch, so I watched till the end, with one eye looking in another window for something new to watch. But that streaming JSON was pretty cool. Worth watching the rest. Glad I stayed.
Very interested in your nvim configuration. It looks very clean :) Working on remodelling mine
I always wondered ! Thanks for the answer and the examples !
I didn't know you could create a TextDecoder and read the stream, very cool. I would probably have used SSE before this video.
Doing God's work, loved the explanation and how simple it was to understand
Learned something. I always thought the first fetch was resolved as soon as the connection is established. Didn't know it resolves when the headers have arrived.
Loved this man. I learned something really great.
so... to my understanding, `response.json` waits for the server's data stream to finish up before doing the actual decode. Now I'm wondering if we can stream JSON content and progressively decode it (and render it in the DOM) while it's incoming. I'm thinking of it from the perspective of huge data tables.
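You can, though `JSON.parse` needs a complete document, so progressive decoding usually relies on a line-oriented format. A sketch using newline-delimited JSON (NDJSON), assuming the server emits one row object per line and the runtime supports iterating `response.body` (`appendRow` is a hypothetical DOM helper, and the endpoint is illustrative):

```
const response = await fetch("/api/rows.ndjson"); // hypothetical endpoint
const decoder = new TextDecoder();
let buffer = "";

for await (const chunk of response.body) {
  buffer += decoder.decode(chunk, { stream: true });
  const lines = buffer.split("\n");
  buffer = lines.pop(); // keep the trailing partial line for the next chunk
  for (const line of lines) {
    if (line.trim() === "") continue;
    appendRow(JSON.parse(line)); // hypothetical: add one table row to the DOM
  }
}
if (buffer.trim() !== "") appendRow(JSON.parse(buffer)); // last line, if any
```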
You can create a reusable method that handles the fetching and JSON parsing in one line.
Here’s an example:
```
const fetchJson = async (url) => (await fetch(url)).json();
```
Then:
```
const result = await fetchJson('/api/endpoint/json');
```
... which begs the question: what if all you want is the header information? Can you then send a cancellation token after getting the response so that the server can dispense with sending the rest of the body?
Use the HEAD method instead of GET
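And if you do need a GET but want to bail out after inspecting the headers, `AbortController` is fetch's cancellation token; a sketch (the URL and condition are illustrative):

```
const controller = new AbortController();
const response = await fetch("/big/resource", { signal: controller.signal }); // hypothetical URL

if (response.headers.get("content-type") !== "application/json") {
  // Stop downloading the body; the connection is torn down.
  controller.abort();
} else {
  const data = await response.json();
  console.log(data);
}
```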
What is this theme of your terminal? I really like it) Is this something like OhMyPosh or a Neovim theme?
As a web dev, very interesting stuff, thanks 🙏
Could you share your neovim config? It looks pretty nice.
And thanks.
I have an unrelated question.
I have looked into your dotfiles and cannot find where you set it up so that highlighted text becomes bold ( as seen in 0:30 ).
Is that a manual config or part of your colorscheme?
Thanks :)
...just finished watching Lost, cool to see John Locke is coding now! Seriously though, good explanation, I never gave thought to why I had to await the `.json()` call, just figured it was some backwards-compatibility quirk..
great explanation on a very interesting topic, subscribed!
I think what is missing here might be a proper alternative to parse the JSON in a streaming way.