When I run spark-submit, I get the following error:
Exception in thread "main" org.apache.spark.sql.AnalysisException: Failed to find data source: avro. Avro is built-in but external data source module since Spark 2.4. Please deploy the application as per the deployment section of "Apache Avro Data Source Guide".
But everything works fine in the IDE.
The jar is built with sbt-assembly.
build.sbt looks like this:
val sparkVersion = "2.4.3"
val jacksonVersion = "2.8.7"
dependencyOverrides ++= Seq(
  "com.fasterxml.jackson.core" % "jackson-core" % jacksonVersion,
  "com.fasterxml.jackson.core" % "jackson-databind" % jacksonVersion,
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % jacksonVersion
)

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-avro" % sparkVersion,
  "io.confluent" % "kafka-avro-serializer" % "5.0.1",
  "org.apache.avro" % "avro" % "1.8.2"
)

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs@_*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
I tried it with Scala versions 2.11.12 and 2.12.8.
The job looks like this (imports shown for completeness):

import org.apache.spark.sql.SaveMode
import spark.implicits._  // for toDF

Seq(1, 2, 3).toDF("id")
  .write
  .format("avro")
  .mode(SaveMode.Overwrite)
  .save("testavro")
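For reference, the deployment section of the Avro guide that the error message points to says the module can also be supplied at launch time instead of being bundled. A sketch of what that would look like (the main class name is hypothetical; the artifact suffix would be _2.12 for Scala 2.12):

```
spark-submit \
  --packages org.apache.spark:spark-avro_2.11:2.4.3 \
  --class com.example.MyJob \
  myjob-assembly.jar
```

I would prefer to keep everything in the assembly jar, but if the --packages route works it would at least confirm the problem is in how the jar is assembled.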