I have a class that uses java.util.Date
to create a date object and calls getTime()
to get the current time in milliseconds.
The Java documentation says that getTime()
returns milliseconds, and that is exactly what I see on my machine.
However, when I deploy my application on another server, the same getTime()
call returns the timestamp in seconds.
e.g.
- value on server: 1350054625
- value on local: 1350054625000
I am wondering how this is possible; I ran the same code locally again and once more got the timestamp in milliseconds.
Below is the relevant part of the code:
String longTime = new Long((new Date().getTime())).toString();
if(log.isDebugEnabled())log.debug("LAST_FEED_TIME will be " + longTime + " stored.");
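A minimal check (a sketch, not part of the original code) would be to print both values side by side and run it on both machines, to see whether getTime() itself ever disagrees with System.currentTimeMillis():

import java.util.Date;

public class TimeCheck {
    public static void main(String[] args) {
        // On a healthy JRE both lines print the same 13-digit millisecond value;
        // a 10-digit value here would point at the JRE, not at the application code.
        System.out.println("new Date().getTime():       " + new Date().getTime());
        System.out.println("System.currentTimeMillis(): " + System.currentTimeMillis());
    }
}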
Can you try using System.currentTimeMillis() instead? – Riot
Try new Date().getTime() and run it on your server with the same JVM. – Monomer
If System.currentTimeMillis() is correct but the Date is wrong, then the value used to construct the Date was also incorrect. I.e. the bug is not in Date but in how it was constructed. – Riot
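To illustrate that point: new Date(long) expects epoch milliseconds, so a hypothetical construction from a seconds value (for example, a timestamp read from an external feed) produces exactly the pair of values shown in the question:

import java.util.Date;

public class ConstructionBug {
    public static void main(String[] args) {
        long epochSeconds = 1350054625L;             // seconds, as an external source might supply
        Date wrong = new Date(epochSeconds);         // constructor expects MILLISECONDS
        Date right = new Date(epochSeconds * 1000L); // correct conversion
        System.out.println(wrong.getTime());         // 1350054625    -> an instant in January 1970
        System.out.println(right.getTime());         // 1350054625000 -> the intended instant
    }
}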
String longTime = String.valueOf(System.currentTimeMillis()); – Maimonides
Then the problem is with java.util.Date at the server. This boils down to the server-side JRE or, theoretically but not very likely, to another JAR on the classpath that (should I say maliciously?) defines a broken java.util.Date. – Maimonides
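To check the shadowed-class theory, one diagnostic sketch is to ask the JVM where java.util.Date comes from; for the genuine JRE class both calls below print null, because core classes are loaded by the bootstrap class loader and carry no code source:

import java.util.Date;

public class WhereIsDate {
    public static void main(String[] args) {
        // null class loader + null code source => the real bootstrap java.util.Date;
        // a non-null location would reveal a JAR that shadows it.
        System.out.println(Date.class.getClassLoader());
        System.out.println(Date.class.getProtectionDomain().getCodeSource());
    }
}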