Latency (delay) is defined here as the time that a packet spends travelling between sender and receiver.
As far as I can understand, the definition above is made for IP packets. Can we say that latency includes the retransmission time for missing frames at the data link layer? Or does this definition assume there are no missing frames?
Is it possible to define latency at the application level? Say we have an application A that uses TCP to send messages to a remote application. Since TCP is used, missing segments will be retransmitted, so the latency of a message from A would include the retransmission time of those missing segments.
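To illustrate what I mean by application-level latency, here is a minimal sketch (the host, port, and message are hypothetical, and it measures round-trip rather than one-way time). Because TCP retransmits lost segments transparently, any retransmission delay would show up in this measurement without the application doing anything special:

```python
import socket
import time

# Hypothetical peer running the remote counterpart of application A
HOST, PORT = "remote.example.com", 9000
MESSAGE = b"hello"

with socket.create_connection((HOST, PORT)) as sock:
    start = time.monotonic()
    sock.sendall(MESSAGE)          # TCP may retransmit lost segments under the hood
    reply = sock.recv(4096)        # blocks until the application-level reply arrives
    elapsed = time.monotonic() - start
    print(f"application-level round-trip latency: {elapsed * 1000:.1f} ms")
```

Is it correct to call the measured `elapsed` value the latency of an A message, retransmissions included?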