Creating a new column
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
data = spark.createDataFrame([(1, 2, 5), (3, 4, 9)], ['Col_1', 'Col_2', 'Col_3'])
data.show()
+-----+-----+-----+
|Col_1|Col_2|Col_3|
+-----+-----+-----+
|    1|    2|    5|
|    3|    4|    9|
+-----+-----+-----+
tmp_str = "F.col('Col_1')"
print(type(tmp_str))
data = data.withColumn('Col_11', tmp_str)
AssertionError                            Traceback (most recent call last)
<command-2932446311694149> in <module>
     46 print(type(tmp_str))
     47 print(tmp_str)
---> 48 data = data.withColumn('Col_11', tmp_str)
     49 data.show()
     50 

AssertionError: col should be Column
I gave a simple condition here, but my real expression is more complex. I know I could use expr, but then I would have to rewrite the whole expression in SQL syntax. Is there an implicit that converts such a string into a Column? Is there any way to pass tmp_str as a string but have it evaluated to a Column value?
