I am trying to get the local time in Spark (Scala), but it keeps returning UTC. I am using java.time.LocalDateTime to get the current timestamp:
java.sql.Timestamp.valueOf(DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSSSSS").format(LocalDateTime.now))
LocalDateTime returns local time in the Spark shell, but in my application code it gives UTC:
val time: LocalDateTime = LocalDateTime.now
How do I get the current local time? The output is in UTC, and I need to change the zone.
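One likely cause is that the cluster JVM's default time zone is UTC, while the Spark shell runs with the local machine's default. A minimal sketch, assuming that is the case; "Asia/Kolkata" is only a placeholder for your actual zone (note the lowercase "yyyy" in the pattern, since uppercase "YYYY" is the week-based year and gives wrong results around New Year):

```scala
import java.sql.Timestamp
import java.time.{LocalDateTime, ZoneId}
import java.time.format.DateTimeFormatter

// LocalDateTime.now (no argument) uses the JVM's default zone, which on a
// cluster is often UTC. Passing an explicit ZoneId makes the result
// independent of the JVM default.
val zone = ZoneId.of("Asia/Kolkata") // placeholder: substitute your zone
val local: LocalDateTime = LocalDateTime.now(zone)

// "yyyy" (year-of-era), not "YYYY" (week-based year).
val fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSSSSS")
val formatted = fmt.format(local)

// Timestamp.valueOf accepts the "yyyy-mm-dd hh:mm:ss[.f...]" form,
// so the string above parses directly.
val ts = Timestamp.valueOf(formatted)
```

If the timestamps are consumed inside Spark SQL itself, the session-level setting spark.sql.session.timeZone may also be relevant, since it controls how Spark renders timestamp columns.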
Regarding Timestamp: that class is poorly designed and long outdated. Instead use a class from java.time, the modern Java date and time API. – Expansible
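Following that advice, a small sketch of the java.time-only approach. Instant captures the moment independently of any zone, and conversion to wall-clock time happens only at the edge; "Europe/Paris" is just an example zone:

```scala
import java.time.{Instant, ZoneId, ZonedDateTime}

// An Instant is a zone-independent point on the timeline.
val now: Instant = Instant.now()

// Convert to a ZonedDateTime only when a human-readable local time is
// needed; the zone id here is an example, not a recommendation.
val inZone: ZonedDateTime = now.atZone(ZoneId.of("Europe/Paris"))
```

This keeps the stored value unambiguous and moves the zone decision to display time, which avoids the "works in the shell, differs on the cluster" mismatch entirely.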