I have a dataframe in spark:
column1 | column2
--------|--------
a       | 1
b       | 2
Both column1 and column2 are of type string.
How can I convert column2 from string to big int?
You simply need to cast the column to bigint or to long (they are the same type in Spark):
val df = sc
    .parallelize(Seq(("a", "1"), ("b", "2")))
    .toDF("A", "B")
df.printSchema
root
 |-- A: string (nullable = true)
 |-- B: string (nullable = true)
df.withColumn("B", 'B cast "bigint").printSchema
or
df.withColumn("B", 'B cast "long").printSchema
root
 |-- A: string (nullable = true)
 |-- B: long (nullable = true)
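
If you prefer not to use the symbol syntax ('B), a roughly equivalent sketch uses col and an explicit LongType instead. This assumes you build your own SparkSession (named spark here) rather than relying on the shell's sc:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col
import org.apache.spark.sql.types.LongType

// Hypothetical local session; adjust master/appName to your environment.
val spark = SparkSession.builder()
  .appName("cast-example")
  .master("local[*]")
  .getOrCreate()
import spark.implicits._

// Same sample data as above, both columns start as strings.
val df = Seq(("a", "1"), ("b", "2")).toDF("A", "B")

// col("B").cast(LongType) does the same thing as 'B cast "bigint" / "long".
val casted = df.withColumn("B", col("B").cast(LongType))
casted.printSchema
// root
//  |-- A: string (nullable = true)
//  |-- B: long (nullable = true)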
