Say we have the following code that creates a date:
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

SimpleDateFormat sdf = new SimpleDateFormat( "dd/MM/yyyy" );
sdf.setTimeZone( TimeZone.getTimeZone( "UTC" ) ); // we know the date being parsed is UTC
Date bizDate = sdf.parse( "12/12/2015" ); // milliseconds: 1449878400000
log.info( "ms: {}", bizDate.getTime() );
log.info( "date: {}", bizDate );
... // save to db
If that code runs on a JVM set to UTC against an Oracle DB in UTC, I get:
JVM params: -Duser.timezone=UTC
millisecond: 1449878400000
date: Sat Dec 12 00:00:00 UTC 2015
in oracle db: 2015-Dec-12 00:00:00 // zero time
For a JVM not set to UTC (e.g. SGT), I get:
JVM params: none (default timezone is SGT or UTC+8:00)
millisecond: 1449878400000
date: Sat Dec 12 08:00:00 SGT 2015
in oracle db: 2015-Dec-12 08:00:00 // plus 8 hours
Notice that both have the same milliseconds, yet they were inserted into the DB differently.
My questions are:
Does the JDBC standard say that Date objects are adjusted before insertion? Please cite your source.
If JDBC really does adjust Date objects before inserting them into the DB when the JVM's timezone is not UTC, why was it designed that way? I feel that makes things more confusing; I was expecting it to insert the value as-is. Imagine creating a date from milliseconds (e.g. new Date( 1449878400000L )): it will be stored differently depending on the JVM's timezone, and you may have no information about the timezone your code will run in. Or imagine your code running on multiple JVMs set to different timezones.
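To make the concern concrete, here is a small self-contained sketch (the class name and formatter pattern are mine) showing that the very same millisecond value reads as different wall-clock times depending on the timezone used to render it:

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class SameMillisDifferentWallClock {
    public static void main(String[] args) {
        // One fixed instant: 2015-12-12T00:00:00Z
        Date d = new Date(1449878400000L);

        SimpleDateFormat utc = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        utc.setTimeZone(TimeZone.getTimeZone("UTC"));

        SimpleDateFormat sgt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        sgt.setTimeZone(TimeZone.getTimeZone("Asia/Singapore"));

        // Same instant, different wall-clock renderings:
        System.out.println(utc.format(d)); // 2015-12-12 00:00:00
        System.out.println(sgt.format(d)); // 2015-12-12 08:00:00
    }
}
```

A driver that stores the local wall-clock fields (rather than the instant) will therefore persist different values on differently-configured JVMs, which is exactly what the two runs above show.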
How do I prevent JDBC from adjusting the date when the JVM's timezone is set to anything other than UTC? I'm using iBATIS, so I may not have direct access to PreparedStatements.
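If I did have access to the PreparedStatement, I believe the standard JDBC way would be the Calendar overload of setTimestamp, which tells the driver which timezone to use when computing the wall-clock fields it sends. A sketch (the table and column names are made up; whether iBATIS lets a custom type handler make this call is something I'd still need to check):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.Timestamp;
import java.util.Calendar;
import java.util.TimeZone;

public class UtcInsertSketch {
    // Bind the value using a UTC calendar so the JVM default zone
    // does not influence what the driver sends to the database.
    static void insertUtc(Connection conn, long epochMillis) throws Exception {
        Calendar utcCal = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
        try (PreparedStatement ps =
                conn.prepareStatement("INSERT INTO biz (biz_date) VALUES (?)")) {
            // The driver uses utcCal, not the JVM default timezone,
            // to construct the stored date/time fields.
            ps.setTimestamp(1, new Timestamp(epochMillis), utcCal);
            ps.executeUpdate();
        }
    }
}
```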
I set the SimpleDateFormat's timezone to UTC because I want to treat the parsed date as UTC (or another timezone, as required). This would not have been a problem had there been no such requirement. Now it seems I need to adjust the Date to reverse what JDBC is doing before inserting it.
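The reverse adjustment I'm considering would look roughly like this: shift the instant by the default zone's offset so that the driver's local-time conversion lands back on the intended UTC wall-clock. This is only a sketch of the idea (and feels fragile, since offsets can vary with DST), which is part of why I'm asking:

```java
import java.util.Date;
import java.util.TimeZone;

public class ReverseAdjust {
    // Shift the instant backwards by the default zone's offset at that
    // moment, so that formatting it in the default zone shows the
    // original UTC wall-clock time.
    static Date compensate(Date utcDate) {
        int offsetMillis = TimeZone.getDefault().getOffset(utcDate.getTime());
        return new Date(utcDate.getTime() - offsetMillis);
    }

    public static void main(String[] args) {
        TimeZone.setDefault(TimeZone.getTimeZone("Asia/Singapore"));
        Date bizDate = new Date(1449878400000L); // 2015-12-12T00:00:00Z
        Date shifted = compensate(bizDate);      // instant moved back 8 hours
        // Rendered in the (SGT) default zone, shifted now reads midnight,
        // which is what a wall-clock-storing driver would then persist.
        System.out.println(shifted);
    }
}
```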