I am using Spark 2.1.0, Scala 2.11.8, and Oracle 12c. When I load a Spark DataFrame into Oracle, the timestamps get converted. The timezone on the server is EDT; I am changing it to Asia/Kolkata because the date in the first record is not valid in EDT (DST kicks in at that particular hour). Below are the default-timezone code and the DataFrame:
import java.util.TimeZone
val tz= TimeZone.getTimeZone("Asia/Kolkata")
TimeZone.setDefault(tz)
//Create a DataFrame to Load Data
df3.show
+---+----------+--------------------+
| id| something|              dateee|
+---+----------+--------------------+
|  1|Date ---- |2017-03-12 02:02:...|
|  2|Date ---- |2017-02-12 02:02:...|
|  3|Date ---- |2017-01-12 02:02:...|
|  4|Date ---- |2017-08-01 11:21:...|
+---+----------+--------------------+
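To illustrate why the first record's timestamp is problematic in EDT, here is a minimal, Spark-free sketch using only `java.time` (the zone IDs and the sample timestamp are taken from the question; everything else is for illustration). It shows that 2017-03-12 02:02 does not exist as a local time in America/New_York, and that the same instant interpreted in Asia/Kolkata maps to the shifted value seen in the DB:

```scala
import java.time._

// The first record's wall-clock time, 2017-03-12 02:02, falls inside the
// US DST "spring forward" gap: in America/New_York the clock jumps from
// 02:00 straight to 03:00 on that date, so this local time never exists.
val local = LocalDateTime.of(2017, 3, 12, 2, 2, 0)
val ny = ZoneId.of("America/New_York")
val gap = ny.getRules.getValidOffsets(local) // empty list => non-existent local time
println(s"Valid offsets in New York for $local: $gap")

// The same wall-clock time is perfectly valid in Asia/Kolkata (no DST).
val kolkata = ZoneId.of("Asia/Kolkata")
val asIst = local.atZone(kolkata)

// Viewing that instant from New York shows the 10.5-hour shift seen in the DB:
println(asIst.withZoneSameInstant(ny)) // 2017-03-11T15:32-05:00[America/New_York]
```

This matches the first DB row (3/11/2017 3:32 PM): the instant itself is preserved, only its wall-clock rendering changes with the zone.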
When I run
df3.write.mode("append").jdbc(Url, "test_timestamp", dbProp)
the data loaded into the DB is:
1   Date ----   3/11/2017 3:32:00 PM
2   Date ----   2/11/2017 3:32:00 PM
3   Date ----   1/11/2017 3:32:00 PM
4   Date ----   8/1/2017 1:51:33 AM
Table Schema -
Name                                      Null?    Type                        
----------------------------------------- -------- ------------------------
id                                                 NUMBER(38)                  
something                                          VARCHAR2(100)               
dateee                                             DATE                        
I want the timestamps to be loaded exactly as they appear in the DataFrame.
The conversion does not happen when I do not use java.util.TimeZone to set a different default time zone.
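The shift is consistent with how `java.sql.Timestamp` (which Spark's JDBC writer uses for timestamp columns) behaves: it stores an epoch instant, and its wall-clock reading depends on the JVM default time zone. A minimal sketch (sample value taken from the question; the rest is illustrative only):

```scala
import java.util.TimeZone
import java.sql.Timestamp

// Parse the wall-clock string with Asia/Kolkata as the JVM default:
TimeZone.setDefault(TimeZone.getTimeZone("Asia/Kolkata"))
val ts = Timestamp.valueOf("2017-03-12 02:02:00") // interpreted as IST
println(ts) // 2017-03-12 02:02:00.0

// The same Timestamp object, rendered with a New York default,
// shifts by 10.5 hours -- the value that ends up in the Oracle table:
TimeZone.setDefault(TimeZone.getTimeZone("America/New_York"))
println(ts) // 2017-03-11 15:32:00.0
```

So the epoch instant written over JDBC is fixed; what changes between the DataFrame display and the DB is the time zone used to render it.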