How to exclude jar in final sbt assembly plugin
I need to exclude Spark and test dependencies from my final assembly jar. I tried to use the "provided" scope, but it did not work.

libraryDependencies ++= Seq("org.apache.spark" % "spark-core_2.11" % "2.0.1" % "provided")

and then ran sbt assembly.

Please help me resolve this issue.
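[Editor's note] For reference, a minimal build.sbt sketch of how the "provided" scope is usually written (the scope keyword is "provided", not "provider"; the %% operator selects the Scala-version-suffixed artifact automatically):

```scala
// build.sbt (sketch): "provided" keeps spark-core on the compile classpath
// but out of the jar produced by `sbt assembly`
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.1" % "provided"
)
// dependencies in the "test" scope are already left out of the assembly
```

One trade-off to be aware of: provided dependencies are not on the runtime classpath, so sbt run will typically no longer find the Spark classes; many Spark projects re-add provided dependencies to the run task for local testing.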

Nikolenikoletta answered 27/1, 2017 at 12:30 Comment(0)
Use the assemblyExcludedJars option of the assembly plugin, filtering either by exact file name or with contains:

assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  cp filter { f =>
    f.data.getName.contains("spark-core") ||
    f.data.getName == "spark-core_2.11-2.0.1.jar"
  }
}
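[Editor's note] A variant of the same idea, not from the original answer: instead of matching file names one by one, filter on the module organization that sbt attaches to classpath entries. This is a sketch and assumes the moduleID.key attribute is present on the entries of assembly's fullClasspath:

```scala
// Sketch: exclude every classpath jar whose organization is org.apache.spark,
// covering spark-core, spark-sql, etc. in one predicate
assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  cp filter { entry =>
    entry.get(moduleID.key).exists(_.organization == "org.apache.spark")
  }
}
```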
Etch answered 27/1, 2017 at 12:42 Comment(4)
It is working perfectly. I have hundreds of jars; is there a better way to exclude all related dependencies of a particular artifact/group?Nikolenikoletta
Isn't the disjunction superfluous there?Nowise
amazing, thank you so much you made my day. I was struggling with that for so long.Aurore
This does not workCombes
I don't think || works. Instead use:

assemblyExcludedJars in assembly := {
  var cp = (fullClasspath in assembly).value
  cp = cp filter { f => f.data.getName.contains("spark-core") }
  cp = cp filter { f => f.data.getName.contains("spark-core_2.11-2.0.1.jar") }
  cp
}
Shamekashameless answered 22/8, 2018 at 15:20 Comment(1)
Works perfectly fine with the logical OR (||). It's a Boolean expression after all.Intensive
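[Editor's note] The comment is right: || inside a single filter is a union of the two conditions, whereas chaining two filter calls (as in the answer above) intersects them, so the chained version can never exclude more jars than either condition alone. A runnable plain-Scala illustration with hypothetical jar names standing in for classpath entries:

```scala
// Hypothetical jar names, standing in for assembly classpath entries
val jars = List(
  "spark-core_2.11-2.0.1.jar",
  "spark-sql_2.11-2.0.1.jar",
  "myapp_2.11-1.0.jar"
)

// The accepted answer's predicate: a jar matches if EITHER condition holds
val excluded = jars.filter { name =>
  name.contains("spark-core") || name == "spark-core_2.11-2.0.1.jar"
}

// Two chained filters require BOTH conditions to hold (logical AND)
val chained = jars
  .filter(_.contains("spark-core"))
  .filter(_.contains("spark-core_2.11-2.0.1.jar"))

println(excluded) // List(spark-core_2.11-2.0.1.jar)
println(chained)  // List(spark-core_2.11-2.0.1.jar)
```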