Best to think of these concepts as a pipe with water flowing through it. Bandwidth is the max amount of water that can flow through it per unit time if the pipe was completely full, throughput is the more practical measurement that measures the actual amount of water flowing through the pipe per unit time
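The pipe analogy can be put in numbers. A minimal sketch (all figures made up for illustration) showing that throughput is the measured flow and can never exceed bandwidth, the theoretical maximum:

```python
# Pipe analogy in numbers (illustrative values):
# bandwidth = max flow if the pipe were completely full;
# throughput = what actually flowed through, as measured.
bandwidth_mbps = 100.0   # capacity of the link (the full pipe)
throughput_mbps = 62.5   # actual measured transfer rate

utilization = throughput_mbps / bandwidth_mbps
print(f"utilization = {utilization:.0%}")  # throughput <= bandwidth always
```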
Okay so latency is how long it takes someone to fly to a destination and back again. Throughput is how many passengers the airline can transport from A to B per unit time
Throughput is not a "time", I guess; as you note afterwards, it is a rate
Very good explanation for someone who had too many questions about this subject. Amazing!!
Thank you very much for the explanation. I had trouble understanding it, and you just simplified it. Thank you very much.
Simple yet best. You elaborate in a very simple manner. Liked it
I have just got excellent clarity, Thank you.
A clear and concise explanation. Thank you
Best Explanation!!!
Thanks for this amazing video sir!
This is great. Thanks.
Clear explanation. 10/10
Great explanation
What do you use to give online lectures? Which software and devices?
Can we call throughput the "speed of the network", or is it something else 🤔
super explanation 🤗
Thank you sir for explaining this concept
Throughput is the rate at which data hits the server, not a time
good explanation
Latency = Amount of Delay
Throughput = Amount of Data
Low Latency - High Throughput (Ideal)
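The summary above can be sketched in code. A hypothetical example (numbers invented) showing that latency and throughput are separate dimensions, because a server handling many requests concurrently can have high throughput even with noticeable per-request latency (this is Little's law, throughput = in-flight requests / latency):

```python
# Illustrative numbers: latency (delay) and throughput (rate) are independent knobs.
latency_per_request = 0.2    # seconds from request to response (amount of delay)
concurrent_requests = 100    # requests in flight at once (pipelining)

# Little's law: completed requests per second = in-flight / per-request latency
throughput = concurrent_requests / latency_per_request
print(throughput)  # 500 requests/sec despite 200 ms latency per request
```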
What is the response time??
thank you bro, useful
That's why I love Indian guys' tutorials :D
bruh, many of my Indian lecturers have a YouTube channel and they are really great, unlike other lecturers
Can I call latency the response time?
No. Latency and response time are different.
They are vaguely defined
When a client makes a request, often the request is not immediately handled (or attended to). The request spends some time waiting for various reasons (e.g., the server/API is busy working on other requests, a context switch to another process, a garbage collection pause, etc.). This waiting time is called "latency".
Once the request is taken up by the server/API, some time is required to process it (perform calculations/processing, database calls, etc.). This time is called "processing time".
"Response time" is the sum of these two. Therefore =>
Response time = latency + processing time
"Response time" is what is experienced by the client. It is the total time elapsed from when the client makes the request till it gets the complete response from the server/API.
Hope this clarifies the difference between "latency" and "response time".
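The breakdown above (response time = latency + processing time) can be simulated. A minimal sketch, where the sleeps stand in for queue wait and server work and the durations are arbitrary:

```python
import time

def handle_request():
    # Simulated wait before the server picks up the request ("latency"):
    wait_start = time.perf_counter()
    time.sleep(0.05)  # pretend the server is busy with other requests
    latency = time.perf_counter() - wait_start

    # Simulated work once the request is taken up ("processing time"):
    work_start = time.perf_counter()
    time.sleep(0.10)  # pretend we do calculations / database calls
    processing = time.perf_counter() - work_start
    return latency, processing

latency, processing = handle_request()
response_time = latency + processing  # the total the client experiences
print(f"latency={latency:.3f}s processing={processing:.3f}s response={response_time:.3f}s")
```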
Nice vid
I guess you messed up the throughput definition.
Throughput = (number of successfully transmitted packets) / (time taken to transmit all the packets)
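To make that formula concrete, a tiny sketch with invented numbers:

```python
# Throughput = successfully transmitted packets / time taken (illustrative numbers).
packets_sent = 1000
packets_lost = 50        # failed transmissions do not count toward throughput
duration_seconds = 4.0   # time taken to transmit all the packets

successful = packets_sent - packets_lost
throughput = successful / duration_seconds  # packets per second
print(f"{throughput:.1f} packets/sec")
```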
Thank you very much
great
Thanks!
👍
That is not throughput. Totally incorrect.
Incorrect explanation. The time taken by the request to get the data within the server is the wait time and is part of latency calculations. Throughput is the number of requests /total time.