From .NET documentation:
DateTime.Ticks Property
The value of this property represents the number of 100-nanosecond intervals that have elapsed since 12:00:00 midnight, January 1, 0001 (0:00:00 UTC on January 1, 0001, in the Gregorian calendar), which represents DateTime.MinValue. It does not include the number of ticks that are attributable to leap seconds.
PHP has no direct equivalent, but the closest building block is time():
time
Returns the current time measured in the number of seconds since the Unix Epoch (January 1 1970 00:00:00 GMT).
microtime()
similarly returns the time in seconds, with the microseconds after the decimal point, so it has greater precision. For historical reasons it returns a string by default, but if you pass true as the first argument you get a plain float:
rr-@burza:~$ php -r 'echo microtime(true);'
1434193280.3929
So all you have to do is scale the value returned by time() or microtime() by a constant factor. One caveat: DateTime.Ticks counts from January 1, 0001, while time() counts from January 1, 1970, so scaling alone gives you ticks since the Unix epoch; to get an actual DateTime.Ticks value you must also add the 621355968000000000 ticks that elapsed between those two dates.
According to Wikipedia, a nanosecond is equal to 1/1000 of a microsecond, or 1/1000000000 of a second. So 100 nanoseconds is 100/1000000000 = 1/10000000 of a second, i.e. one .NET tick = 1/10000000 second, i.e. one second = 10000000 .NET ticks. Thus you need to multiply the value returned by time() or microtime() by 10000000, like this:
microtime(true) * 10000000
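Putting the scaling and the epoch offset together, here is a minimal sketch of the full conversion. The offset constant is the documented value of `new DateTime(1970, 1, 1).Ticks` in .NET; the function and variable names are my own:

```php
<?php
// Ticks elapsed between 0001-01-01T00:00:00 UTC and the Unix epoch
// (1970-01-01T00:00:00 UTC); this is DateTime(1970,1,1).Ticks in .NET.
const TICKS_AT_UNIX_EPOCH = 621355968000000000;

// One second = 10,000,000 ticks of 100 ns each.
const TICKS_PER_SECOND = 10000000;

// Convert a Unix timestamp in seconds (int or float) to DateTime.Ticks.
function unixSecondsToDotNetTicks(float $unixSeconds): int
{
    return (int) round($unixSeconds * TICKS_PER_SECOND) + TICKS_AT_UNIX_EPOCH;
}

echo unixSecondsToDotNetTicks(microtime(true)), PHP_EOL;
```

Note that a 64-bit float carries only about 15–16 significant digits, so the multiplied result is accurate to within a few ticks (i.e. sub-microsecond), which matches the precision microtime() gives you in the first place. This also requires a 64-bit PHP build, since the result does not fit in a 32-bit integer.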