I am using spark-sql-2.4.1v. Depending on the value of the code column, I need to apply a lookup to the given value columns and replace them with their mapped values, as shown below.
Sample data:
import spark.implicits._   // needed for toDF on a local collection

val data = List(
  ("20", "score", "school", "2018-03-31", 14, 12),
  ("21", "score", "school", "2018-03-31", 13, 13),
  ("22", "rate", "school", "2018-03-31", 11, 14),
  ("21", "rate", "school", "2018-03-31", 13, 12)
)
val df = data.toDF("id", "code", "entity", "date", "value1", "value2")
df.show
+---+-----+------+----------+------+------+
| id| code|entity|      date|value1|value2|
+---+-----+------+----------+------+------+
| 20|score|school|2018-03-31|    14|    12|
| 21|score|school|2018-03-31|    13|    13|
| 22| rate|school|2018-03-31|    11|    14|
| 21| rate|school|2018-03-31|    13|    12|
+---+-----+------+----------+------+------+
import org.apache.spark.sql.functions
import org.apache.spark.sql.functions.{col, when}
import org.apache.spark.sql.types.DoubleType

val resultDs = df
  .withColumn("value1",
    when(col("code").isin("rate"), functions.callUDF("udfFunc", col("value1")))
      .otherwise(col("value1").cast(DoubleType))
  )
udfFunc maps values as follows:
11->a
12->b
13->c
14->d
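For completeness, here is a minimal sketch of how udfFunc could be registered; the real definition is not included in this post, so the lookup map, the null fallback, and the String return type are assumptions:

// "spark" is the existing SparkSession used to build df above.
// Hypothetical lookup table mirroring the mapping listed in the question.
val lookup = Map(11 -> "a", 12 -> "b", 13 -> "c", 14 -> "d")

// Register "udfFunc": returns the mapped String for a known value,
// or null when the value is not in the map (an assumption).
spark.udf.register("udfFunc", (v: Int) => lookup.get(v).orNull)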
Expected output:
+---+-----+------+----------+------+------+
| id| code|entity|      date|value1|value2|
+---+-----+------+----------+------+------+
| 20|score|school|2018-03-31|    14|    12|
| 21|score|school|2018-03-31|    13|    13|
| 22| rate|school|2018-03-31|     a|    14|
| 21| rate|school|2018-03-31|     c|    12|
+---+-----+------+----------+------+------+
But the actual output is:
+---+-----+------+----------+------+------+
| id| code|entity|      date|value1|value2|
+---+-----+------+----------+------+------+
| 20|score|school|2018-03-31|  null|    12|
| 21|score|school|2018-03-31|  null|    13|
| 22| rate|school|2018-03-31|     a|    14|
| 21| rate|school|2018-03-31|     c|    12|
+---+-----+------+----------+------+------+
why "otherwise" condition is not working as expected. any idea what is wrong here ??