java.lang.NoSuchMethodError: scala.Predef$.refArrayOps
I have the following class:

import scala.util.{Success, Failure, Try}


class MyClass {

  def openFile(fileName: String): Try[String] = {
    Failure(new Exception("some message"))
  }

  def main(args: Array[String]): Unit = {
    openFile(args.head)
  }

}

Which has the following unit test:

class MyClassTest extends org.scalatest.FunSuite {

  test("pass inexistent file name") {
    val myClass = new MyClass()
    assert(myClass.openFile("./noFile").failed.get.getMessage == "Invalid file name")
  }

}

When I run sbt test I get the following error:

java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
        at org.scalatest.tools.FriendlyParamsTranslator$.translateArguments(FriendlyParamsTranslator.scala:174)
        at org.scalatest.tools.Framework.runner(Framework.scala:918)
        at sbt.Defaults$$anonfun$createTestRunners$1.apply(Defaults.scala:533)
        at sbt.Defaults$$anonfun$createTestRunners$1.apply(Defaults.scala:527)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.immutable.Map$Map1.foreach(Map.scala:109)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
        at scala.collection.AbstractTraversable.map(Traversable.scala:105)
        at sbt.Defaults$.createTestRunners(Defaults.scala:527)
        at sbt.Defaults$.allTestGroupsTask(Defaults.scala:543)
        at sbt.Defaults$$anonfun$testTasks$4.apply(Defaults.scala:410)
        at sbt.Defaults$$anonfun$testTasks$4.apply(Defaults.scala:410)
        at scala.Function8$$anonfun$tupled$1.apply(Function8.scala:35)
        at scala.Function8$$anonfun$tupled$1.apply(Function8.scala:34)
        at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
        at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
        at sbt.std.Transform$$anon$4.work(System.scala:63)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
        at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
        at sbt.Execute.work(Execute.scala:235)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
        at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
        at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
[error] (test:executeTests) java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;

Build definitions:

version := "1.0"

scalaVersion := "2.12.0"

// https://mvnrepository.com/artifact/org.scalatest/scalatest_2.11
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "3.0.0"

I can't figure out what causes this. My class and unit test seem simple enough. Any ideas?

Jeana answered 30/10, 2016 at 12:16 Comment(5)
Can you share your build definition as well? – Thecla
I confirmed your class methods work as expected in a standard Scala REPL. Must be an issue with the sbt build definition. – Vale
This specific error happens when you use Scala 2.11 JAR files in a Scala 2.12 project. ScalaTest is cross compiled for Scala 2.11 and Scala 2.12, so you can avoid this error by leveraging the SBT %% operator, as indicated in the accepted answer. See my answer to learn more about the SBT %% operator and cross compilation, topics all Scala programmers must understand to avoid headaches. – Phonotypy
For those using Spark, it also matters which Scala version is in the runtime where you submit. And for those using AWS EMR specifically, it uses 2.11 (at least for EMR 5.x.x) even though 2.12 is also compatible with Spark 2.4.x. – Ellamaeellan
#75947949 – Lucilius

scalatest_2.11 is the ScalaTest artifact built only for Scala 2.11.x. Write libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test" (note the %%) instead to pick the correct artifact automatically, and switch to Scala 2.11.8 until scalatest_2.12 is released (it should be very soon). See http://www.scala-sbt.org/0.13/docs/Cross-Build.html for more.
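The %% operator simply appends the Scala binary-version suffix to the artifact name for you. As an illustrative sketch (the explicit form is written out here only for comparison; versions follow the ones in this answer):

```scala
// build.sbt sketch: with scalaVersion := "2.11.8", these two lines resolve the same artifact
libraryDependencies += "org.scalatest" %  "scalatest_2.11" % "3.0.0" % "test" // suffix written by hand
libraryDependencies += "org.scalatest" %% "scalatest"      % "3.0.0" % "test" // suffix derived from scalaVersion
```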

Caprice answered 30/10, 2016 at 14:15 Comment(1)
ScalaTest is now available for Scala 2.12. – Hluchy

I had an SDK in Global Libraries with a different version of Scala (IntelliJ IDEA).
File -> Project Structure -> Global Libraries -> remove the SDK -> rebuild. That fixed the exception for me.

Pestilent answered 2/10, 2017 at 7:57 Comment(4)
Another thing that helped in addition to this: reimporting the Maven module (if you're using that). Also check Project Structure -> Problems, which may indicate a reference to an invalid/outdated Scala SDK library. This happened to me after extended troubleshooting and trying different Scala versions. – Ebba
Worked like a charm. Also add the Scala SDK again as a global dependency. – Jorum
Does it mean we are not allowed to have multiple SDKs in Global Libraries? The use case is that I have different projects with different Scala versions: 2.11 / 2.12. What is the proper way to handle such cases? – Unmoving
UPD: https://mcmap.net/q/121854/-java-lang-nosuchmethoderror-scala-predef-refarrayops – Unmoving

I use IntelliJ, and I just imported the project again: close the open project and import it as a Maven or SBT project. Note: I selected "Import Maven projects automatically". The error disappeared.

Logrolling answered 17/5, 2017 at 21:35 Comment(1)
Or check your Scala version; it should be the same as in your pom.xml or sbt settings. – Logrolling

This error occurs when you use a Scala JAR file that was compiled with Scala 2.11 in a Scala 2.12 project.

Scala libraries are generally cross compiled against different versions of Scala, so a separate JAR file is published to Maven for each Scala version. For example, ScalaTest 3.2.3 publishes separate JAR files to Maven for Scala 2.10, 2.11, 2.12, and 2.13, as you can see here.

Lots of Spark programmers will run into this error when they attach a JAR file that was compiled with Scala 2.11 to a cluster that's running Scala 2.12. See here for a detailed guide on how to migrate Spark projects from Scala 2.11 to Scala 2.12.

As the accepted answer mentions, the SBT %% operator should be used when specifying Scala dependencies, so that the library artifact matching your project's Scala version is grabbed automatically. The %% operator won't help you, however, if the library doesn't publish a JAR file for the Scala version you need. Look at the Spark releases, for example:

(screenshot: Spark release artifacts on Maven)

This build.sbt file will work because there is a Scala 2.12 release for Spark 3.0.1:

scalaVersion := "2.12.12"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.0.1"

This code will not work because there isn't a Scala 2.11 release for Spark 3.0.1:

scalaVersion := "2.11.12"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.0.1"

You can cross compile your project and build JAR files for different Scala versions if your library dependencies are also cross compiled. Spark 2.4.7 is cross compiled with Scala 2.11 and Scala 2.12, so you can cross compile your project with this code:

scalaVersion := "2.11.12"
crossScalaVersions := Seq("2.11.12", "2.12.10")
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.7"

The sbt +assembly command will build two JAR files for your project, one compiled with Scala 2.11 and another compiled with Scala 2.12. Libraries that release multiple JAR files follow a similar cross-compilation workflow.

Phonotypy answered 2/12, 2020 at 14:20 Comment(0)

In my experience, if you still get errors after matching the ScalaTest version and Scala version in build.sbt, you have to consider the actual Scala version running on your machine. You can check it by running $ scala and reading the banner:

Welcome to Scala 2.12.1 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_121). Type in expressions for evaluation. Or try :help.

The Scala version there (e.g. 2.12.1 here) needs to match the one in build.sbt.
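A related detail: the part that has to match (and the suffix sbt's %% appends) is the binary version, i.e. the first two components of the full version string. A tiny illustrative sketch (the version literal is only an example):

```scala
// Sketch: derive the binary version (the %% artifact suffix) from a full version string
object BinaryVersion extends App {
  val full = "2.12.1" // e.g. what the REPL banner above reports
  val binary = full.split('.').take(2).mkString(".")
  println(binary) // prints "2.12", so %% would resolve e.g. scalatest_2.12
}
```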

Letterpress answered 21/3, 2017 at 6:5 Comment(0)

In my case, the Spark version made it incompatible. Changing to Spark 2.4.0 worked for me.

Severini answered 28/4, 2019 at 1:17 Comment(1)
AWS EMR 5.x.x clusters always use Scala 2.11 for spark-submit jobs, even though Scala 2.12 should be a compatible option with Spark 2.4.x. – Ellamaeellan

This was happening to me in Databricks. The problem was the same as noted in the previous answers: an incompatibility between the Spark and Scala versions. For Databricks, I had to change the cluster's Databricks Runtime Version. The default was Scala 2.11 / Spark 2.4.5; bump this up to at least Scala 2.12 / Spark 3.0.0.

Click Clusters > Cluster_Name > Edit > DataBricks Runtime Version


Outlive answered 1/10, 2020 at 18:52 Comment(1)
Still happening in December 2020. – Faris

When you use Spark, Hadoop, Scala, and Java together, some incompatibilities arise, so use versions that are compatible with one another. I use Spark 2.4.1, Hadoop 2.7, Java 9.0.1, and Scala 2.11.12, which are compatible with each other.

Roadwork answered 15/4, 2019 at 9:39 Comment(1)
Important to note that starting with Spark 2.4.2 the default distribution is compiled with Scala 2.12; before that, 2.11 is used by default. So if you hit this error and you are using 2.11 dependencies in your project, make sure your Spark installation is also built with 2.11. – Weka

Try adding the following line to your build.sbt

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"

Your build.sbt should then look like this:

libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.1"

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"

With this, the error was solved for me.
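A comment below asks about Maven; there the %% shorthand doesn't exist, so the Scala binary suffix has to be written into the artifactId by hand. A hedged sketch (coordinates mirror the sbt lines above; adjust the suffix to your project's Scala version):

```xml
<!-- pom.xml sketch: the _2.12 suffix is part of the artifactId and must match your Scala binary version -->
<dependency>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest_2.12</artifactId>
  <version>3.0.1</version>
  <scope>test</scope>
</dependency>
```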

Obduce answered 24/12, 2016 at 13:52 Comment(2)
What is the difference between libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.1" and libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"? – Micelle
How can this be handled in a Maven project? – Micelle

In the Eclipse IDE, the project tends to be preselected with the "Latest 2.12 bundle (dynamic)" Scala installation. If your Scala project is not actually using 2.12 and you attempt to run it through the IDE, this issue will manifest itself.

I've also noticed that rebuilding my Eclipse project with the sbt command "eclipse with-source" has the side effect of resetting the project's Scala installation back to the 2.12 setting (even though my build.sbt file is configured for a 2.11 version of Scala). So be on the lookout for both of those scenarios.

Event answered 9/1, 2018 at 22:50 Comment(0)

In my case, I had a project JAR dependency that depended on a different version of Scala. It was found under Project Structure -> Modules -> (selected project) -> Dependencies tab. Everything else in the project and its libraries lined up on one Scala version (2.12), but that other JAR was hiding a transitive dependency on an older version (2.11).

Tarrant answered 15/11, 2020 at 5:38 Comment(0)

This happens if you use Scala 2.12 libraries with Scala 2.11, or vice versa.

If cleaning up the pom.xml with the right packages didn't work, it is because the old JARs are already downloaded locally.

Delete the ~/.m2/ folder and reload the packages.
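A less drastic variant of the step above is to delete only the stale Scala artifacts instead of all of ~/.m2 (the path assumes Maven's default local-repository layout; the groupIds shown are just examples):

```shell
# Sketch: remove only the cached Scala-related artifacts, not the whole local repo
# (default layout: ~/.m2/repository/<groupId path>/<artifactId>/<version>)
rm -rf "$HOME/.m2/repository/org/scala-lang" "$HOME/.m2/repository/org/scalatest"
```

Maven will re-download the artifacts matching the coordinates now in pom.xml on the next build.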

Haldes answered 23/6, 2022 at 14:49 Comment(0)

For anyone encountering this issue with Scala in AWS Glue 3:

Glue 3 uses Spark 3 and Scala 2.12, and as the other answers indicate, you can't use Scala 2.11 JARs on a cluster running Scala 2.12.

So if you were like me and were trying to use an extra JAR file compiled with Scala 2.11 (which worked in earlier versions of Glue), you will now get this error and will need to rebuild or change your JAR to one built with Scala 2.12 for use with Glue 3.

See the AWS migration recommendations: https://docs.aws.amazon.com/glue/latest/dg/migrating-version-30.html

Kristine answered 15/9, 2022 at 19:45 Comment(0)
