How to use the spark-avro package to read an Avro file from spark-shell?

I'm trying to use the spark-avro package as described in the Apache Avro Data Source Guide.

When I run the following command in spark-shell:

val df = spark.read.format("avro").load("~/foo.avro")

I get an error:

java.util.ServiceConfigurationError: org.apache.spark.sql.sources.DataSourceRegister: Provider org.apache.spark.sql.avro.AvroFileFormat could not be instantiated
  at java.util.ServiceLoader.fail(ServiceLoader.java:232)
  at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
  at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
  at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
  at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
  at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
  at scala.collection.Iterator$class.foreach(Iterator.scala:891)
  at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
  at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
  at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
  at scala.collection.TraversableLike$class.filterImpl(TraversableLike.scala:247)
  at scala.collection.TraversableLike$class.filter(TraversableLike.scala:259)
  at scala.collection.AbstractTraversable.filter(Traversable.scala:104)
  at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:630)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:194)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
  ... 49 elided
Caused by: java.lang.NoSuchMethodError: org.apache.spark.sql.execution.datasources.FileFormat.$init$(Lorg/apache/spark/sql/execution/datasources/FileFormat;)V
  at org.apache.spark.sql.avro.AvroFileFormat.<init>(AvroFileFormat.scala:44)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
  at java.lang.Class.newInstance(Class.java:442)
  at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
  ... 62 more

I've tried different versions of the org.apache.spark:spark-avro_2.12 package (2.4.0, 2.4.1, and 2.4.2) with Spark 2.4.1, but none of them worked.

I start my spark-shell with the following command:

spark-shell --packages org.apache.spark:spark-avro_2.12:2.4.0

Verlaverlee answered 26/4, 2019 at 18:16 Comment(0)

tl;dr Spark 2.4.x+ provides built-in support for reading and writing Apache Avro data, but the spark-avro module is external and not included in spark-submit or spark-shell by default. Make sure the Scala version of the --packages coordinate (e.g. 2.12) matches the Scala version your spark-shell was built against.


The reason for the exception is that your spark-shell comes from a Spark distribution built against Scala 2.11.12, while --packages specifies a dependency built for Scala 2.12 (the _2.12 suffix in org.apache.spark:spark-avro_2.12:2.4.0).

Use --packages org.apache.spark:spark-avro_2.11:2.4.0 and you should be fine.
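
To double-check which Scala version your Spark build uses, look at the spark-shell startup banner, or query it from the shell itself (the output shown here assumes a 2.11 build):

scala> util.Properties.versionString
res0: String = version 2.11.12

With the matching artifact, starting the shell and loading an Avro file should then work (the path below is a placeholder):

spark-shell --packages org.apache.spark:spark-avro_2.11:2.4.0

val df = spark.read.format("avro").load("/path/to/foo.avro")
df.show()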

Zandra answered 26/4, 2019 at 18:16 Comment(0)

Just in case someone is interested: for PySpark on Python 2.7 and Spark 2.4.3, the package below works.

bin/pyspark --packages org.apache.spark:spark-avro_2.11:2.4.3
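
Once the shell is up, reading Avro uses the same data source API as in Scala; a minimal sketch (the path is a placeholder):

df = spark.read.format("avro").load("/path/to/foo.avro")
df.show()
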
Bartizan answered 13/8, 2019 at 16:23 Comment(0)

One more thing I noticed when I had the same issue: it runs fine the first time and shows the error thereafter. So clear the cache by adding an rm command to the Dockerfile. That was sufficient in my case.
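
As a sketch of what that could look like: --packages downloads are cached by Ivy, by default under the user's home directory, so a hypothetical Dockerfile line could be:

# wipe the Ivy cache that --packages populates (paths assume the root user)
RUN rm -rf /root/.ivy2/cache /root/.ivy2/jars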

Skyward answered 10/3, 2020 at 9:51 Comment(0)
