I'm writing a Spark app in Scala with the following data:
+----------+--------------------+
|        id|                data|
+----------+--------------------+
|    id1   |[AC ED 00 05 73 7...|
|    id2   |[CF 33 01 61 88 9...|
+----------+--------------------+
The schema shows:
root
 |-- id: string (nullable = true)
 |-- data: binary (nullable = true)
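For a minimal reproduction, a DataFrame with the same shape can be built like this (the byte values here are placeholders, not my real payload):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("repro").master("local[*]").getOrCreate()
import spark.implicits._

// Toy rows with the same schema: id is a string, data is a binary column.
// The bytes are dummy values standing in for the real serialized payload.
val df = Seq(
  ("id1", Array[Byte](1, 2, 3)),
  ("id2", Array[Byte](4, 5, 6))
).toDF("id", "data")

df.printSchema()  // id: string (nullable = true), data: binary (nullable = true)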
I want to convert this DataFrame into a Map, with id as the key and data as the value.
I have tried:
df.as[(String, BinaryType)].collect.toMap
but I got the following error:
java.lang.UnsupportedOperationException: No Encoder found for org.apache.spark.sql.types.BinaryType
- field (class: "org.apache.spark.sql.types.BinaryType", name: "_2")
- root class: "scala.Tuple2"
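
From the error it looks like .as[...] expects a Scala/JVM type that has an Encoder, not the catalog type org.apache.spark.sql.types.BinaryType. My guess is that a binary column maps to Array[Byte], so something like the following sketch is what I'm aiming for (untested); is this the right approach?

import spark.implicits._

// Use the JVM type Array[Byte] for the binary column instead of the
// Spark SQL DataType BinaryType, so an implicit Encoder is available.
val result: Map[String, Array[Byte]] =
  df.as[(String, Array[Byte])].collect().toMap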