I've been working on Visual C++ code for my Serial Wombat GPIO expander project. It connects via UART and uses 8-byte packets for communication: each transaction requires 8 bytes to be sent and an 8-byte response to be processed. Latency here is paramount, because the faster you can process packets, the faster you can take measurements or drive IO. More data per interrupt doesn't help, because the protocol is effectively half-duplex. I'm using an FTDI adapter under Windows 10 with the default drivers installed when you plug it in. For the serial receive, I set up the timeouts as follows:
COMMTIMEOUTS timeouts = { 0 };
timeouts.ReadIntervalTimeout = MAXDWORD;   // with both read totals at 0, ReadFile returns
timeouts.ReadTotalTimeoutConstant = 0;     // immediately with whatever bytes are buffered
timeouts.ReadTotalTimeoutMultiplier = 0;
timeouts.WriteTotalTimeoutConstant = 10;   // writes get 10 ms plus 1 ms per byte
timeouts.WriteTotalTimeoutMultiplier = 1;
SetCommTimeouts(hSerial, &timeouts);       // apply the settings to the open port handle
This makes RX never wait for bytes to come in; ReadFile just returns whatever is already available. Then I call a non-blocking single-byte read repeatedly until I have all 8 bytes of the response:
// Returns the next received byte, or -1 if nothing is waiting.
int readByteNonBlocking(HANDLE hSerial)
{
    uint8_t buffer[1];
    DWORD dwBytesRead = 0;
    BOOL result = ReadFile(hSerial, buffer, 1, &dwBytesRead, NULL);
    if (result && dwBytesRead > 0)
    {
        return buffer[0];
    }
    return -1;
}
By default, I see roughly the same 16 ms delay you're seeing when I watch the bus on a logic analyzer. However, if you go into the Windows Device Manager, choose Properties for the FTDI COM port, and go to Port Settings and then Advanced, you get this gem:
[Screen capture: FTDI Advanced Settings dialog for the COM port]
That 16 ms is the FTDI driver's default Latency Timer: the chip buffers incoming bytes and only flushes a partially filled buffer to the host once the timer expires. Change the Latency Timer to 1 ms to maximize responsiveness. I looked at the result on the logic analyzer, and it works exactly as expected.
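If you'd rather not depend on a Device Manager setting that has to be repeated on every machine, FTDI's D2XX library exposes the same knob programmatically via FT_SetLatencyTimer. A minimal sketch, assuming ftd2xx.h and the D2XX driver are installed and the adapter is device index 0 (both assumptions on my part):

// Hedged sketch: set the FTDI latency timer to 1 ms through D2XX.
// Device index 0 is an assumption; enumerate properly in real code.
#include <windows.h>
#include "ftd2xx.h"

bool setLatencyTo1ms()
{
    FT_HANDLE ftHandle;
    if (FT_Open(0, &ftHandle) != FT_OK)  // open the first FTDI device
    {
        return false;
    }
    FT_STATUS status = FT_SetLatencyTimer(ftHandle, 1);  // set latency timer to 1 ms
    FT_Close(ftHandle);
    return status == FT_OK;
}

Note that the D2XX and VCP interfaces can't have the same device open simultaneously, so this would run before you open the COM port with CreateFile.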