A timestamp column loses precision when I query the same Hive Metastore table through Spark SQL instead of Hive.
The table description looks like this:
col_name     data_type  comment
id           bigint     null
name         string     null
joined_time  timestamp  null
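For reproducibility, the table could have been created with DDL along these lines (just a sketch: only the column names and types come from the description above; the table storage format is an assumption):

    // Hypothetical DDL matching the schema above, issued through a HiveContext.
    // STORED AS TEXTFILE is an assumption, not part of my actual setup.
    sqlContext.sql(
      """CREATE TABLE IF NOT EXISTS employees (
        |  id BIGINT,
        |  name STRING,
        |  joined_time TIMESTAMP
        |) STORED AS TEXTFILE""".stripMargin)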
Using HiveQL, I get joined_time values with millisecond precision.
HiveQL results:
select * from employees;
1   foo 2016-07-04 02:12:10.0
2   bar 2016-07-04 02:12:10.0
With spark-sql, however, I lose precision: the values are cut off after the minutes. For example:
val result = sqlContext.sql("select * from employees")
result.show()
1  foo 2016-07-04 02:12:...
2  bar 2016-07-04 02:12:...
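To show how I would check whether the underlying values really lose precision or only the printed output does, here is a sketch that pulls the rows to the driver and prints the raw java.sql.Timestamp objects (it assumes the same sqlContext and employees table as above):

    import java.sql.Timestamp

    // Sketch: collect the rows and print the Timestamp objects directly,
    // bypassing show()'s column rendering. Assumes the table above.
    val rows = sqlContext.sql("select id, name, joined_time from employees").collect()
    rows.foreach { row =>
      val ts = row.getAs[Timestamp]("joined_time") // Timestamp.toString keeps sub-second digits
      println(s"${row.getLong(0)}\t${row.getString(1)}\t$ts")
    }

Is Spark SQL actually truncating the timestamp, or is this only how the result is rendered? How can I get the full millisecond precision back in the Spark SQL output?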