Wim Hellenthal
05/20/2010 3:16 AM
post55204
Dear readers,
I have a small TCP server that waits for commands and sends back responses. One of the commands sends back two
response values in two separate send calls. When measuring the response delay in the client application (on Win32), I
notice the following: the first response is received without any notable delay (< 1 ms), but the second response arrives
with a delay of about 200 ms. This delay is quite consistent.
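For reference, the setup can be sketched as a minimal, self-contained example (Python used here for brevity; the port choice, payloads, and helper names are illustrative, not my actual application). The server answers one command with two separate send calls, and the client times each response:

```python
import socket
import threading
import time

HOST = "127.0.0.1"

def recv_exact(sock, n):
    """Read exactly n bytes from the socket (or fewer if the peer closes)."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            break
        buf += chunk
    return buf

def run_server(ready):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((HOST, 0))              # port 0: let the OS pick a free port
    srv.listen(1)
    ready["port"] = srv.getsockname()[1]
    ready["event"].set()
    conn, _ = srv.accept()
    conn.recv(64)                    # wait for the command
    conn.sendall(b"RESPONSE-1\n")    # first response value
    conn.sendall(b"RESPONSE-2\n")    # second response value, separate send call
    conn.close()
    srv.close()

ready = {"event": threading.Event()}
threading.Thread(target=run_server, args=(ready,), daemon=True).start()
ready["event"].wait()

cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cli.connect((HOST, ready["port"]))
cli.sendall(b"CMD\n")

t0 = time.monotonic()
first = recv_exact(cli, 11)          # each response is 11 bytes long
t1 = time.monotonic()
second = recv_exact(cli, 11)
t2 = time.monotonic()
cli.close()

print(f"first after {(t1 - t0) * 1000:.1f} ms, "
      f"second after {(t2 - t1) * 1000:.1f} ms")
```

On my setup it is the gap between the two timestamps that sits at roughly 200 ms, even though the server issues both sends back to back.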
A while ago I stumbled into the same behaviour when sending telnet requests from a Windows application whose responses
came back in multiple IP packets (I used a sniffer to watch the traffic): the first fragment of the telnet response
arrived with a low delay, while the second fragment again had a delay of about 200 ms.
So the main question is: is this some kind of TCP/IP stack problem, or am I misusing the stack in some way?
All help is appreciated.
Thanks, Wim