I'm writing a multiplayer game for Windows Phone 7, and I need events to happen at the same time for every player. My approach at the moment is to broadcast, in advance, the time at which I want each event to take place, and to rely on the phones' clocks being reasonably accurate.
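For illustration, this is roughly how I dispatch an event once I have a target time, assuming I can somehow estimate the server-minus-phone clock offset (which is the subject of this question). The sketch is in Java rather than my actual C#/Silverlight code, and the names (`scheduleAt`, `serverMinusLocalMs`) are just placeholders:

```java
import java.util.Timer;
import java.util.TimerTask;

public class EventScheduler {

    // Fire an action at a broadcast event time expressed in server-clock
    // milliseconds, given an estimate of (server clock - phone clock).
    // Names and signature are illustrative, not a real API.
    static void scheduleAt(long serverEventTimeMs,
                           double serverMinusLocalMs,
                           final Runnable action) {
        // Convert the server-clock instant into the phone's clock.
        long localEventTimeMs = (long) (serverEventTimeMs - serverMinusLocalMs);
        long delayMs = Math.max(0L, localEventTimeMs - System.currentTimeMillis());
        new Timer().schedule(new TimerTask() {
            @Override public void run() { action.run(); }
        }, delayMs);
    }
}
```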
The trouble is, I've seen situations where a phone's clock is out by a couple of seconds. So what I'd like to do is estimate how far the phone's clock differs from the server's. Of course there's network latency to take into account, particularly since the only network protocol open to me is HTTP.
So my question is: does anybody know of an algorithm I can use to estimate the difference between the client's and server's clocks, to within about 100 ms?
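To make the question concrete, the best idea I've had so far is an NTP-style midpoint estimate: timestamp the request locally before and after, have the server return its clock, and assume the server read its clock halfway through the round trip. A sketch (again in Java), assuming a hypothetical `/time` endpoint that returns the server's epoch time in milliseconds as plain text:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class ClockOffsetProbe {

    // One probe: timestamp the request locally on the way out and back,
    // and assume the server read its clock halfway through the round trip.
    // Returns {offsetMs, rttMs}, where offset = serverClock - phoneClock.
    static double[] sampleOffset(String timeUrl) throws Exception {
        long t0 = System.currentTimeMillis();                 // local send time
        HttpURLConnection conn =
                (HttpURLConnection) new URL(timeUrl).openConnection();
        long serverTimeMs;
        BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()));
        try {
            serverTimeMs = Long.parseLong(in.readLine().trim());
        } finally {
            in.close();
        }
        long t1 = System.currentTimeMillis();                 // local receive time
        double offsetMs = serverTimeMs - (t0 + t1) / 2.0;     // midpoint estimate
        return new double[] { offsetMs, t1 - t0 };
    }
}
```

The catch is that a single probe is only as good as that round trip's symmetry, which is why I suspect I need the statistical treatment below.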
From my days as a maths undergraduate, I seem to remember a statistical model for exactly this situation: repeatedly sampling a value assumed to be a constant plus an error term (here, the latency) drawn from some distribution. Does anybody know the model I mean, and does it actually apply here?
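In case it helps frame the statistics: my naive plan would be to take several such samples and keep the one with the smallest round-trip time, since each estimate's error is bounded by half that sample's RTT. Presumably the model I half-remember would combine the samples in a more principled way. Building on `sampleOffset` from the sketch above:

```java
// Take several probes and keep the one with the smallest round-trip
// time: each estimate's error is at most half that sample's RTT, so
// the fastest round trip is the least contaminated. A median over the
// best few samples would be a more robust variant.
static double estimateOffset(String timeUrl, int samples) throws Exception {
    double bestOffsetMs = 0;
    double bestRttMs = Double.MAX_VALUE;
    for (int i = 0; i < samples; i++) {
        double[] s = ClockOffsetProbe.sampleOffset(timeUrl); // {offsetMs, rttMs}
        if (s[1] < bestRttMs) {
            bestRttMs = s[1];
            bestOffsetMs = s[0];
        }
    }
    return bestOffsetMs;
}
```

So a call like `estimateOffset("http://myserver.example/time", 10)` would, I hope, get me within the 100 ms I'm after on a reasonable connection, but I'd welcome a pointer to the proper theory.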