I am trying to solve a Spark serialization issue with HashMaps in Java, following the approach from Save Spark Dataframe into Elasticsearch - Can’t handle type exception.
Now I am hitting the following issue:
java.lang.ClassCastException: com.spark.util.umf.MyKryoRegistrator cannot be cast to org.apache.spark.serializer.Serializer
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:259)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
at org.apache.spark.SparkContext.&lt;init&gt;(SparkContext.scala:270)
at org.apache.spark.api.java.JavaSparkContext.&lt;init&gt;(JavaSparkContext.scala:61)
at com.spark.util.umf.MyMain.main(MyMain.java:46)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:480)
15/10/16 01:47:22 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: com.spark.util.umf.MyKryoRegistrator cannot be cast to org.apache.spark.serializer.Serializer)
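If I read the trace correctly, Spark is trying to instantiate the class named in spark.serializer as an org.apache.spark.serializer.Serializer while it creates the SparkEnv, and my registrator is not a Serializer.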
I create my Kryo registrator as follows:
import java.io.Serializable;

import org.apache.spark.serializer.KryoRegistrator;

import com.esotericsoftware.kryo.Kryo;

public class MyKryoRegistrator implements KryoRegistrator, Serializable {

    @Override
    public void registerClasses(Kryo kryo) {
        // Product POJO associated to a product Row from the DataFrame
        kryo.register(MyRecord.class);
    }
}
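Since the underlying problem is about serializing HashMaps, I assume the registrator could also register the map class explicitly (java.util.HashMap is just my guess at the concrete type involved; a sketch only):

import java.io.Serializable;
import java.util.HashMap;

import org.apache.spark.serializer.KryoRegistrator;

import com.esotericsoftware.kryo.Kryo;

public class MyKryoRegistrator implements KryoRegistrator, Serializable {

    @Override
    public void registerClasses(Kryo kryo) {
        // Product POJO associated to a product Row from the DataFrame
        kryo.register(MyRecord.class);
        // Assumption: the maps being serialized are plain java.util.HashMap instances
        kryo.register(HashMap.class);
    }
}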
Main method:
public static void main(String[] args) {
    SparkConf sConf = new SparkConf().setAppName("SparkTestJob");
    sConf.set("spark.driver.allowMultipleContexts", "true");
    //Kryo kryo = new Kryo();
    //kryo.setDefaultSerializer(MyRecord.class);
    //my.registerClasses(kryo);
    // The line below is the one the ClassCastException seems to point at:
    sConf.set("spark.serializer", "com.spark.util.umf.MyKryoRegistrator");
    [...]
}
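For comparison, my understanding from the Spark docs is that the serializer and the registrator go into two separate properties: spark.serializer names a Serializer subclass (KryoSerializer for Kryo), while the custom registrator is passed via spark.kryo.registrator. A minimal sketch of what I believe the wiring should look like (the registrator FQN is from my own package):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class MyMain {
    public static void main(String[] args) {
        SparkConf sConf = new SparkConf().setAppName("SparkTestJob");
        // The serializer itself: Spark's Kryo-backed Serializer implementation
        sConf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
        // The registrator only registers classes with Kryo; it is configured separately
        sConf.set("spark.kryo.registrator", "com.spark.util.umf.MyKryoRegistrator");
        JavaSparkContext jsc = new JavaSparkContext(sConf);
    }
}

Is that the right split, or is something else causing the ClassCastException?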