Could you please shed some light on how to obtain the correct epoch time in milliseconds, both for the default system timezone and for a given timezone?
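For reference, the approach I assume to be the straightforward one (since an epoch value is zone-independent by definition) would be something like the following; the class name is just for illustration:

import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class EpochMillis {
    public static void main(String[] args) {
        // An Instant is the same moment everywhere, so no zone is needed
        // to obtain epoch milliseconds.
        System.out.println(Instant.now().toEpochMilli());

        // Capturing the current time in a specific zone first and then
        // converting back to an Instant should give the same epoch millis.
        System.out.println(ZonedDateTime.now(ZoneId.of("GMT+3")).toInstant().toEpochMilli());
    }
}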
Given
1. TimeZone: GMT+3
2. The following code snippet:
import java.time.*;

public class Main {
    public static void main(String[] args) {
        // Current wall-clock time, interpreted as if it were UTC
        System.out.println(LocalDateTime
                .now()
                .atZone(ZoneOffset.UTC)
                .toInstant()
                .toEpochMilli()
        );
        // Current wall-clock time, interpreted as GMT+3
        System.out.println(LocalDateTime
                .now()
                .atZone(ZoneOffset.of("+3"))
                .toInstant()
                .toEpochMilli()
        );
        System.out.println(System.currentTimeMillis());
    }
}
3. Output:
1444158955508
1444148155508
1444148155508
4. The JavaDoc for System.currentTimeMillis(), which says that the returned value will be the difference, measured in milliseconds, between the current time and midnight, January 1, 1970 UTC.
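As a sanity check on point 4, I would expect the value to be completely unaffected by the default timezone. A minimal sketch of that expectation (TimeZone.setDefault is used here only to flip the default at runtime):

import java.util.TimeZone;

public class ZoneIndependence {
    public static void main(String[] args) {
        // System.currentTimeMillis() counts milliseconds since
        // 1970-01-01T00:00:00Z and should ignore the default zone entirely.
        TimeZone.setDefault(TimeZone.getTimeZone("GMT+3"));
        long withGmt3 = System.currentTimeMillis();

        TimeZone.setDefault(TimeZone.getTimeZone("UTC"));
        long withUtc = System.currentTimeMillis();

        // Expected: only the few milliseconds elapsed between the two calls,
        // not the 3-hour offset between the zones.
        System.out.println(withUtc - withGmt3);
    }
}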
So, why:
- is the output of the LocalDateTime at GMT+3 the same as that of System.currentTimeMillis(), although the docs for System.currentTimeMillis() mention UTC?
- does the output of the LocalDateTime at UTC differ from System.currentTimeMillis(), although the docs for System.currentTimeMillis() mention UTC?
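For what it's worth, the first output is exactly 3 hours (10,800,000 ms) ahead of the other two, which matches the GMT+3 offset. A minimal sketch of my current reading, assuming LocalDateTime.now() merely captures the wall clock of the default zone without any zone information:

import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneOffset;

public class WallClock {
    public static void main(String[] args) {
        // LocalDateTime.now() reads the wall clock in the default zone
        // (GMT+3 here) and carries no zone of its own.
        LocalDateTime wallClock = LocalDateTime.now();

        // Declaring that wall-clock reading to be UTC selects an instant
        // 3 hours later than the actual current instant.
        long asUtc = wallClock.atZone(ZoneOffset.UTC).toInstant().toEpochMilli();
        long now = Instant.now().toEpochMilli();

        System.out.println(asUtc - now); // roughly 10,800,000 on a GMT+3 machine
    }
}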