Why "could not find implicit" error in Scala + Intellij + ScalaTest + Scalactic but not from sbt

I have the following code, which works 100% from sbt (executing sbt test) but throws a compilation error in IntelliJ IDEA.

import org.scalatest.{BeforeAndAfter, FunSuite, GivenWhenThen}

class SimpleTest extends FunSuite with GivenWhenThen with BeforeAndAfter {
  test("Simple Test") {
    Given("Why this error?")
    assert("ok" === "ok")
  }
}

The error is:

Error:(5, 10) could not find implicit value for parameter pos: org.scalactic.source.Position
    Given("Why this error?")
Error:(5, 10) not enough arguments for method Given: (implicit pos: org.scalactic.source.Position)Unit.
Unspecified value parameter pos.
    Given("Why this error?")
Error:(6, 11) could not find implicit value for parameter prettifier: org.scalactic.Prettifier
    assert("ok" === "ok")
Error:(6, 11) macro applications do not support named and/or default arguments
    assert("ok" === "ok")
Error:(6, 11) not enough arguments for macro method assert: (implicit prettifier: org.scalactic.Prettifier, implicit pos: org.scalactic.source.Position)org.scalatest.Assertion.
Unspecified value parameters prettifier, pos.
    assert("ok" === "ok")
Error:(4, 23) could not find implicit value for parameter pos: org.scalactic.source.Position
  test("Simple Test") {

After refreshing and reloading the project as suggested, I get:

Error:(6, 11) exception during macro expansion: 
java.lang.NoSuchMethodError: org.scalactic.BooleanMacro.genMacro(Lscala/reflect/api/Exprs$Expr;Ljava/lang/String;Lscala/reflect/api/Exprs$Expr;)Lscala/reflect/api/Exprs$Expr;
    at org.scalatest.AssertionsMacro$.assert(AssertionsMacro.scala:34)
    assert("ok" === "ok")

I am using:

IntelliJ IDEA 2016.3.2
Build #IU-163.10154.41, built on December 21, 2016

scalaVersion := "2.11.0",
"org.scalactic" %% "scalactic" % "3.0.1" % "test",
"org.scalatest" %% "scalatest" % "3.0.1" % "test"

Notes:
- Using File -> Invalidate Caches / Restart does not fix the problem.
- Example that reproduces the error: https://github.com/angelcervera/idea-dependencies-bug

Gwenette answered 10/1, 2017 at 9:2 Comment(8)
These problems typically occur when you have updated your build.sbt but have not refreshed your IDEA project. Can you try to do so? – Thermocouple
@Edmondo1984 Now there are a lot of errors related to macros. :) "File -> Invalidate Caches / Restart" was necessary. It is crazy. – Gwenette
@Edmondo1984 Updated. A different error appears after File -> Invalidate Caches / Restart. – Gwenette
No, not (just) invalidate caches and restart. Go into your build.sbt and refresh the project (IntelliJ asks you to do so in the upper right corner, unless you have deliberately clicked "ignore" in the past, in which case you need to find that option). – Paradox
You do not need to invalidate and restart; you need to regenerate your .iml file from your build.sbt. This is done from the SBT tab on the right of your IDE by clicking refresh. – Thermocouple
@Edmondo1984 Yes, I did that before invalidating. I am going to close the IDE, remove all folders and config files related to the IDE, and import again. Fingers crossed. – Gwenette
@Edmondo1984 No luck after deleting. I found a bug in IDEA that may be related: youtrack.jetbrains.com/issue/SCL-10912 – Gwenette
@Edmondo1984 I think I found the problem. It is an IDEA bug. They say they are working on it. Check my answer. – Gwenette

Workarounds are at the bottom of this answer. ;)

This problem is related to a set of known IntelliJ IDEA bugs, including youtrack.jetbrains.com/issue/SCL-10912.

The problem is that there are dependencies in the project that use, in their test scope, other versions of ScalaTest and Scalactic.

IntelliJ IDEA is mixing the compile scope and the test scope, while sbt handles them correctly. The IntelliJ IDEA team said in the bug report that they are working on this.

My workaround, at the moment, has been to move to the same older version that the other libraries are using for testing.

Notes:

@justin-kaeser is assigned and working on a fix. Thanks!

There have been a lot of improvements related to the Scala plugin in the latest previews.

Example to reproduce the error: https://github.com/angelcervera/idea-dependencies-bug

A few workarounds:

  1. Remove the problematic dependencies from Project Structure -> Modules.
  2. Exclude the conflicting libraries in sbt (see the build.sbt sketch below).
  3. Use the same version as the other libraries.
  4. Try the latest EAP: https://www.jetbrains.com/idea/nextversion/
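
For illustration, a minimal build.sbt sketch of workarounds 2 and 3 (the spark-core dependency and the excluded artifact names are assumptions borrowed from the answers below; adapt them to whatever your own dependency tree shows):

libraryDependencies ++= Seq(
  // Workaround 2: exclude the old ScalaTest/Scalactic that a transitive
  // dependency drags in (the artifact names here are assumptions).
  ("org.apache.spark" %% "spark-core" % "2.1.0" % "provided")
    .exclude("org.scalatest", "scalatest_2.11")
    .exclude("org.scalactic", "scalactic_2.11"),

  // Workaround 3: declare a single ScalaTest/Scalactic version yourself.
  "org.scalactic" %% "scalactic" % "3.0.1" % "test",
  "org.scalatest" %% "scalatest" % "3.0.1" % "test"
)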
Gwenette answered 10/1, 2017 at 12:7 Comment(2)
Thanks, that was quick help. – Philpott
I removed the scalatest 2.0.5 version from Project -> Libraries and it worked. – Hawser

It's possible that some dependencies are transitively including incompatible versions of Scalactic or ScalaTest in the compile scope, which are also included in the test scope.

You can check this in the Project Structure under Project Settings / Modules / Dependencies tab, and analyze it more closely with the sbt-dependency-graph plugin.

sbt does, however, perform dependency evictions which IntelliJ does not (issue); this can cause additional problems when compiling from the IDE. If sbt-dependency-graph shows that the conflicting versions are evicted, then it is probably an instance of this issue.
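
As a concrete sketch of that check (the plugin coordinates and version are an assumption; take the ones from the sbt-dependency-graph README that match your sbt release), enable the plugin in project/plugins.sbt:

// project/plugins.sbt
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")

Then run sbt dependencyTree and look for ScalaTest/Scalactic entries marked as evicted; those are the versions that IntelliJ may still be placing on the compile classpath.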

Workaround: when you find the offending transitive dependency, exclude it from the root dependency in your build.sbt. For example:

"org.apache.spark" %% "spark-core" % "2.1.0" % "provided" exclude("org.scalatest", "scalatest_2.11")
Farah answered 25/1, 2017 at 17:54 Comment(6)
Hi @justin-kaeser. Both libraries are in test scope (% "test"), so they must NOT be included in the compile scope. It is definitely related to the bug. – Gwenette
Your direct dependencies are, but another dependency may be including them transitively. Have you checked whether there are any other versions of the same dependencies on the classpath? – Farah
Yes, of course. Other dependencies use ScalaTest 2 in their test scope, but those transitive dependencies should not affect my test scope. Why is IDEA not using sbt to resolve dependencies instead of doing it its own way? IMO, if it works in sbt as expected and not in IDEA, then it is a bug in IDEA. What do you think? We can continue the conversation in the bug: youtrack.jetbrains.com/issue/SCL-10912 – Gwenette
Commenting here for the benefit of anybody with similar problems: IDEA is using sbt to resolve the dependencies, but uses its own compiler that doesn't evict dependencies with lower version numbers. We are also working on better sbt integration, so that builds can be delegated directly to an sbt process. – Farah
You are right, we can comment here. Anyway, I added all the bugs I know of that are related to this behaviour to the accepted answer. As an IntelliJ IDEA Ultimate user I am really happy to know that JetBrains is working to solve this problem. Thanks @justin-kaeser, I hope to see it fixed soon. – Gwenette
Added a URL to GitHub with a simple example: github.com/angelcervera/idea-dependencies-bug – Gwenette

Not sure if this was an IDE bug, but for me upgrading the IDE to the latest version did not prove to be of any help. After wasting a few hours, here is my approach to resolving this error, which states the following:

could not find implicit value for parameter prettifier: org.scalactic.Prettifier

Solution:

In IntelliJ press Ctrl+Alt+Shift+S -> Modules -> Dependencies and search for org.scalactic:3.0.0.jar (test scope); most probably there will also be another version, such as 2.x.x, in compile scope. Right-click the 2.x.x entry, select Edit, choose the 3.0.0 version in compile scope, and apply the new settings.

P.S. Depending on your case there may be only one entry, but make sure you use 3.0.0 in compile scope to get rid of this weird error.
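
A build-side alternative to editing the module settings by hand (a sketch, not part of the original answer) is to force a single version in build.sbt, which IntelliJ can pick up on the next project refresh:

// build.sbt: force one Scalactic/ScalaTest version for every configuration.
// 3.0.0 matches the version mentioned above; adjust it to your own setup.
dependencyOverrides += "org.scalactic" %% "scalactic" % "3.0.0"
dependencyOverrides += "org.scalatest" %% "scalatest" % "3.0.0"

This keeps the fix in the build definition, so coworkers do not have to repeat the Project Structure changes after each import.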
Reed answered 1/6, 2017 at 13:45 Comment(6)
Yes, that is the first workaround in the accepted answer: remove the problematic dependencies from Project Structure -> Modules. – Gwenette
This may be treated as a more detailed solution for newbies like me. ;) – Reed
My recommendation is to use the second one, to avoid modifying the IDE every time you or your coworkers import the project. – Gwenette
But in our case it is not a direct dependency; it comes in as part of some other dependencies, Spark Job Server to be specific. – Reed
Yes, in my case it was also the crazy dependency set from Spark. But you can exclude dependencies, even transitive ones. You can use the dependency-tree plugin to see where the problem comes from. Another option is to try the IDEA EAP version, which has a lot of Scala-related changes: jetbrains.com/idea/nextversion I hope my comments help. – Gwenette
Sure I will. And thank you so much for sharing this knowledge. – Reed

Also make sure your project JDK is set to JDK 8. Scala is not compatible with JDK 11, which is now the default in IntelliJ.

The same also happened with Maven.

I had a project where everything worked fine. After the latest IntelliJ upgrade it forgot the JDK setting. I did all the steps in the answers, but none of them helped. As a last resort, I reinstalled IntelliJ from scratch and checked out a clean repo (no .idea folder or .iml files), and... it didn't help. Then, while setting up the project again, I noticed JDK 11. It rang a bell; I added JDK 8, and there you go, tests are green again.
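
As a minimal sketch (assuming the project builds with sbt; this is an extra guard, not part of the fix above), the build itself can be made to fail fast when it is not running on Java 8, so the JDK mismatch is caught immediately:

// build.sbt: abort early unless the build runs on Java 8.
initialize := {
  val _ = initialize.value
  val current = sys.props("java.specification.version")
  require(current == "1.8", s"Java 1.8 is required to build this project; found $current")
}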

Ainslie answered 11/9, 2019 at 10:10 Comment(0)

I had a similar issue.

For me, the simplest way to solve it was just removing the .idea folder and re-importing the project.

Gynaecomastia answered 12/2, 2018 at 9:37 Comment(0)

As mentioned in issue 170, it can be an issue with a mix-up of the spark-testing-base dependency.

Make sure you are not mixing dependency versions.

I had the following dependencies:

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.1.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.1.0",
  "org.apache.spark" % "spark-streaming_2.11" % "2.1.0",
  "org.apache.spark" % "spark-mllib_2.11" % "2.1.0",
  "com.holdenkarau" %% "spark-testing-base" % "2.1.0_0.8.0" % "test",
  "org.scalatest" % "scalatest_2.11" % "2.1.0" % "test",
  "edu.stanford.nlp" % "stanford-corenlp" % "3.8.0",
  "edu.stanford.nlp" % "stanford-corenlp" % "3.8.0" classifier "models"
)

And when I tried to run the test classes, I was getting:

Error:(32, 14) could not find implicit value for parameter pos: org.scalactic.source.Position
  test("lsi"){
Error:(32, 14) not enough arguments for method test: (implicit pos: org.scalactic.source.Position)Unit.
Unspecified value parameter pos.
  test("lsi"){
..........

Then I changed the dependencies to:

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.2.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.2.0",
  "org.apache.spark" % "spark-streaming_2.11" % "2.2.0",
  "org.apache.spark" % "spark-mllib_2.11" % "2.2.0",
  "com.holdenkarau" %% "spark-testing-base" % "2.2.0_0.8.0" % "test",
  "org.scalatest" % "scalatest_2.11" % "2.2.2" % "test",
  "edu.stanford.nlp" % "stanford-corenlp" % "3.8.0",
  "edu.stanford.nlp" % "stanford-corenlp" % "3.8.0" classifier "models"
)

I re-imported my project (as clean and package didn't work), and the test classes passed.

Weigand answered 18/3, 2018 at 5:43 Comment(0)

The change below in the sbt file solved the compilation issue in IntelliJ:

"org.apache.spark" %% "spark-core" % "2.1.0" % "provided" exclude("org.scalatest", "scalatest_2.11")

The code inspection in the editor still shows:

No implicit arguments of type org.scalactic.Prettifier.

However, tests work in IntelliJ after the above fix.

Falconer answered 22/2, 2021 at 11:30 Comment(0)
