In a web farm environment, should we base the system date/time on the web servers or the database server?

Assuming there are a number of load-balanced web servers in a web farm, is it safe for the application code to use the app/web server's clock to get the current date/time, or should we leave this responsibility to the database server?

Is there a chance that the machine date/time settings on the servers in the web farm could be out of sync?

If date/time is the responsibility of the DBMS, how will this strategy work if we have load-balanced, clustered databases?

Incomprehensible answered 14/5, 2010 at 8:22 Comment(2)
It rather depends on what you want to use the time for. If you only need it to the nearest minute, that's a different question from needing it to the millisecond.Milline
Rather use System.DateTime.UtcNow - this will avoid any issues you might encounter if the machines aren't all set to the same timezone.Sporogenesis
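For illustration, a minimal sketch of that suggestion (assuming a .NET application; the time zone id below is only an example): store and compare timestamps in UTC, and convert to a local time zone only for display.

```csharp
using System;

class ClockExample
{
    static void Main()
    {
        // Capture timestamps in UTC so the stored value does not depend on
        // each web server's local time zone setting.
        DateTime createdAtUtc = DateTime.UtcNow;

        // Convert to a display time zone only at the presentation layer.
        // The time zone id here is an assumption for illustration.
        TimeZoneInfo tz = TimeZoneInfo.FindSystemTimeZoneById("Central European Standard Time");
        DateTime createdAtLocal = TimeZoneInfo.ConvertTimeFromUtc(createdAtUtc, tz);

        Console.WriteLine($"Stored (UTC):    {createdAtUtc:o}");
        Console.WriteLine($"Displayed local: {createdAtLocal:o}");
    }
}
```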

You should have a Time Server (-:)

Seriously, the first approach is to make sure all servers use a protocol such as NTP to sync their clocks. That will leave you with a known worst-case deviation. Only if that deviation is larger than you can tolerate (unlikely in a web app) will you need to engineer something special. Usually the database will be OK, but if that is clustered then you may need to appoint a dedicated server as the keeper of time.

But note that your accuracy will always be bound by the maximum lag of the network.
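As a rough sketch of the "keeper of time" idea (assuming SQL Server and ADO.NET; the connection string is a placeholder), the application can ask the database for the current UTC time instead of reading the local clock:

```csharp
using System;
using Microsoft.Data.SqlClient; // System.Data.SqlClient on older .NET stacks

class DatabaseClock
{
    static void Main()
    {
        // Sketch: treat the database server as the single source of "now".
        // Server and database names here are placeholders.
        using var conn = new SqlConnection("Server=db01;Database=App;Integrated Security=true");
        conn.Open();

        using var cmd = new SqlCommand("SELECT GETUTCDATE();", conn);
        DateTime dbUtcNow = (DateTime)cmd.ExecuteScalar();

        Console.WriteLine($"Database UTC time: {dbUtcNow:o}");
    }
}
```

Note that with a load-balanced or clustered database the query may be answered by different nodes, so those nodes still need their own clocks kept in sync with each other.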

Visually answered 14/5, 2010 at 8:34 Comment(0)

It's best just to have the same time set on all the servers so you don't have to worry about it; otherwise there is always confusion about where the time comes from.

If the clocks on the servers are regularly set from a time server, they should stay within about 100 ms of each other, which is probably good enough, though obviously it depends on what exactly you are trying to do.
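One rough way to check that the servers really are within that sort of tolerance (a sketch, assuming SQL Server; server and database names are placeholders) is to compare the web server's UTC clock against the database server's and correct for half the round trip:

```csharp
using System;
using Microsoft.Data.SqlClient;

class SkewCheck
{
    static void Main()
    {
        // Rough check of how far this web server's clock has drifted from
        // the database server's clock.
        using var conn = new SqlConnection("Server=db01;Database=App;Integrated Security=true");
        conn.Open();

        DateTime sentUtc = DateTime.UtcNow;
        using var cmd = new SqlCommand("SELECT GETUTCDATE();", conn);
        DateTime dbUtc = (DateTime)cmd.ExecuteScalar();
        DateTime receivedUtc = DateTime.UtcNow;

        // Subtract half the round trip as a crude network-latency correction.
        TimeSpan roundTrip = receivedUtc - sentUtc;
        TimeSpan skew = dbUtc - sentUtc - TimeSpan.FromTicks(roundTrip.Ticks / 2);

        Console.WriteLine($"Approximate skew vs. database: {skew.TotalMilliseconds:F1} ms");
    }
}
```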

Willywillynilly answered 14/5, 2010 at 8:29 Comment(0)
