Idea

  • Latency is the time it takes for a packet of data to travel from its source to its destination.
  • In networking, it is generally measured as the time from when the user makes a request to when the response arrives back at the user.
  • On the first request (for example, for the first 14 KB of a page), latency is longer because it includes a DNS lookup, a TCP handshake, and, for secure connections, the TLS negotiation.
  • Subsequent requests have lower latency because the connection to the server is already established.
  • Latency can be measured one-way (for example, the time it takes to send a request for a resource) or as a round trip (from the browser’s request for a resource to the moment the requested resource arrives at the browser).
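A minimal sketch of round-trip measurement: timing how long a TCP connection takes to be established approximates the network round trip to a server (the host, port, and function name here are illustrative assumptions, not from the original notes):

```python
import socket
import time

def measure_connect_latency(host: str, port: int) -> float:
    """Return the round-trip TCP connect time to (host, port) in milliseconds.

    This captures only the TCP handshake; a first HTTP request would also
    pay for a DNS lookup and, on HTTPS, the TLS negotiation.
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass  # connection established; handshake round trip complete
    return (time.perf_counter() - start) * 1000
```

For example, `measure_connect_latency("example.com", 443)` would report only the connect time; a subsequent request reusing that open connection (keep-alive) skips this cost entirely, which is why later requests show lower latency.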

Context

  • Performance optimization
  • Web performance