java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument
I am trying to run this grpc-java example locally; the corresponding proto file is here. When I try to run it, it throws the following exception from here:

Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;Ljava/lang/Object;)V
    at io.grpc.ServiceDescriptor.validateMethodNames(ServiceDescriptor.java:129)
    at io.grpc.ServiceDescriptor.<init>(ServiceDescriptor.java:83)
    at io.grpc.ServiceDescriptor.<init>(ServiceDescriptor.java:51)
    at io.grpc.ServiceDescriptor$Builder.build(ServiceDescriptor.java:219)
    at io.grpc.examples.helloworld.GreeterGrpc.getServiceDescriptor(GreeterGrpc.java:251)
    at io.grpc.examples.helloworld.GreeterGrpc$GreeterImplBase.bindService(GreeterGrpc.java:84)
    at io.grpc.internal.AbstractServerImplBuilder.addService(AbstractServerImplBuilder.java:125)
    at io.grpc.internal.AbstractServerImplBuilder.addService(AbstractServerImplBuilder.java:63)
    at com.cw.predictive.HelloWorldServer.start(HelloWorldServer.java:56)
    at com.cw.predictive.HelloWorldServer.main(HelloWorldServer.java:92)

This is my pom.xml, as mentioned here:

<dependencies>
    <dependency>
        <groupId>io.grpc</groupId>
        <artifactId>grpc-netty</artifactId>
        <version>1.1.2</version>
    </dependency>
    <dependency>
        <groupId>io.grpc</groupId>
        <artifactId>grpc-protobuf</artifactId>
        <version>1.1.2</version>
    </dependency>
    <dependency>
        <groupId>io.grpc</groupId>
        <artifactId>grpc-stub</artifactId>
        <version>1.1.2</version>
    </dependency>
</dependencies>
Giraffe answered 13/2, 2017 at 14:27 Comment(0)
Please add the following dependencies to your project:
<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>23.6-jre</version>
</dependency>
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpcore</artifactId>
    <version>4.4.8</version>
</dependency>
Palisade answered 24/12, 2017 at 8:10 Comment(5)
I have this problem, and this doesn't fix it, even with the later versions that are now current; I also put these before the grpc dependencies, in case that matters. My app doesn't seem to use the httpcore artifact at all. I have tried lots of things, and I'm stumped. – Brunner
I had the same issue and solved it by updating com.google.inject:guice from 4.1.0 to 4.2.2. I realized that guice was pulling in guava 19.0, which lacks this checkArgument overload. Updating guice solved the problem for me. You can check whether another library is dragging in an old guava with "mvn dependency:tree" and update it; see the sketch after these comments. – Dakotadal
This saved my day. Looking at this, it looks like an issue with an outdated version. – Klina
@Dakotadal how did you figure out guice was using the incorrect guava version? – Profluent
Yup, thanks for this; I added the dependencies and it worked. I think I need to make sure all my Maven modules use the same versions, which I should declare in the properties of a parent pom. – Ice
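
As the comments above suggest, the usual way to diagnose this is to find which library is dragging in an old Guava and then pin a single version for the whole build. A minimal sketch, assuming Maven and the versions mentioned in this thread (adapt to your own build):

mvn dependency:tree -Dincludes=com.google.guava

If an outdated Guava shows up (e.g. 19.0 via guice 4.1.0), force one version in the parent pom:

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>23.6-jre</version>
        </dependency>
    </dependencies>
</dependencyManagement>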

For anyone else getting this error on EMR: in order to use the s3a address scheme for S3 files, I needed the S3AFileSystem jar, which didn't seem to be available on the EMR 6.7 / Hadoop 3.2.1 cluster; without it I received the error Class org.apache.hadoop.fs.s3a.S3AFileSystem not found. To fix that, I included --packages org.apache.hadoop:hadoop-aws:3.2.1 in the spark-submit arguments, and then I started getting the error noted in the OP's question. Downgrading to 3.2.0 fixed this problem, but then I hit another bug, NoSuchMethodError: SemaphoredDelegatingExecutor, while writing files to S3. I solved that by upgrading instead to 3.2.2: --packages org.apache.hadoop:hadoop-aws:3.2.2. After that I was able to read and write s3a addresses on EMR 6.7.
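
For reference, a minimal sketch of the spark-submit invocation described above (my_job.py is a hypothetical placeholder for your own application):

# hadoop-aws 3.2.2 is the version that resolved the conflict on EMR 6.7 in this answer;
# my_job.py is a hypothetical script that reads and writes s3a:// paths
spark-submit --packages org.apache.hadoop:hadoop-aws:3.2.2 my_job.py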

Chimerical answered 10/7, 2022 at 19:13 Comment(0)

I had this issue while trying to read a file from an S3 bucket using Apache Spark (PySpark). I'm using Spark 3.3.0, downloaded and installed via Homebrew on macOS.

I solved my problem by updating some jars in my Spark jars folder, as @irukeru said in the comment above.

List of jars that I have updated:

Just download and copy these new jars to your Spark jars folder; in my case it was /usr/local/Cellar/apache-spark/3.3.0/libexec/jars.
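
A minimal sketch of the general approach, assuming the culprit is the outdated Guava bundled with Spark (the 30.1.1-jre version is borrowed from the answer below; the exact old filename may differ on your install):

cd /usr/local/Cellar/apache-spark/3.3.0/libexec/jars
# remove the old Guava copy shipped with Spark (filename may differ, e.g. guava-14.0.1.jar)
rm -f guava-14.0.1.jar
# fetch a newer Guava from Maven Central
curl -O https://repo1.maven.org/maven2/com/google/guava/guava/30.1.1-jre/guava-30.1.1-jre.jar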

Hying answered 20/7, 2022 at 17:16 Comment(0)

With pyspark 3.3.0, the following combination of versions worked for me:

from pyspark import SparkConf
from pyspark.sql import SparkSession

# Extra packages that Spark resolves from Maven Central at session startup
packages = [
    'org.apache.hadoop:hadoop-aws:3.3.1',
    'com.google.guava:guava:30.1.1-jre',
    'org.apache.httpcomponents:httpcore:4.4.14',
    'com.google.inject:guice:4.2.2',
    'com.google.inject.extensions:guice-servlet:4.2.2',
]

conf = SparkConf().setAll([
    ('spark.jars.packages', ','.join(packages)),
    ...
])

spark = SparkSession.builder.config(conf=conf).getOrCreate()

PS: this is the preferred way of installing jar packages in Spark

Floss answered 14/4, 2023 at 8:48 Comment(0)
