(Spark) object {name} is not a member of package org.apache.spark.ml

I'm trying to run a self-contained Scala application on Apache Spark, based on the ML pipeline example here: http://spark.apache.org/docs/latest/ml-pipeline.html

Here's my complete code:

import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.linalg.{Vector, Vectors}
import org.apache.spark.ml.param.ParamMap
import org.apache.spark.sql.Row

object mllibexample1 {
  def main(args: Array[String]) {
    val spark = SparkSession
      .builder()
      .master("local[*]")
      .appName("logistic regression example 1")
      .getOrCreate()


    val training = spark.createDataFrame(Seq(
      (1.0, Vectors.dense(0.0, 1.1, 0.1)),
      (0.0, Vectors.dense(2.0, 1.0, -1.0)),
      (0.0, Vectors.dense(2.0, 1.3, 1.0)),
      (1.0, Vectors.dense(0.0, 1.2, -0.5))
    )).toDF("label", "features")

    val lr = new LogisticRegression()

    println("LogisticRegression parameters:\n" + lr.explainParams() + "\n")

    lr.setMaxIter(100)
      .setRegParam(0.01)

    val model1 = lr.fit(training)

    println("Model 1 was fit using parameters: " + model1.parent.extractParamMap)
  }
}

Dependencies in build.sbt:

name := "example"
version := "1.0.0"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "2.0.1",
    "org.apache.spark" %% "spark-sql" % "2.0.1",
    "org.apache.spark" %% "spark-mllib-local" % "2.0.1",
    "com.github.fommil.netlib" % "all" % "1.1.2"
  )

However, after running the program in the sbt shell, I got the following errors:

[info] Compiling 1 Scala source to /dataplatform/example/target/scala-2.11/classes...
[error] /dataplatform/example/src/main/scala/mllibexample1.scala:1: object classification is not a member of package org.apache.spark.ml
[error] import org.apache.spark.ml.classification.LogisticRegression
[error]                            ^
[error] /dataplatform/example/src/main/scala/mllibexample1.scala:3: object param is not a member of package org.apache.spark.ml
[error] import org.apache.spark.ml.param.ParamMap
[error]                            ^
[error] /dataplatform/example/src/main/scala/mllibexample1.scala:8: not found: value SparkSession
[error]     val spark = SparkSession
[error]                 ^
[error] /dataplatform/example/src/main/scala/mllibexample1.scala:22: not found: type LogisticRegression
[error]     val lr = new LogisticRegression()

I can successfully run this code in the Spark interactive shell. Did I miss something in the *.sbt file?

Thanks, Bayu

Fetiparous answered 27/10, 2016 at 10:7 Comment(1)
possible duplicate of https://mcmap.net/q/1089433/-mllib-dependency-error - Chouest

You missed an MLlib dependency:

"org.apache.spark" %% "spark-mllib" % "2.0.1"

spark-mllib-local alone is not enough: it only contains the local linear algebra types (org.apache.spark.ml.linalg, which is why your linalg import compiled fine), while the pipeline packages such as classification and param live in spark-mllib.
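
For reference, a minimal build.sbt for the snippet in the question, assuming the versions stay as posted (spark-mllib pulls in spark-core and spark-sql transitively, so listing them explicitly is optional but harmless):

name := "example"
version := "1.0.0"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "2.0.1",
    "org.apache.spark" %% "spark-sql" % "2.0.1",
    "org.apache.spark" %% "spark-mllib" % "2.0.1",
    "com.github.fommil.netlib" % "all" % "1.1.2"
  )

Note that the code itself also needs import org.apache.spark.sql.SparkSession; the interactive shell predefines spark for you, which is why the snippet runs there but fails under sbt with "not found: value SparkSession".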

Haftarah answered 27/10, 2016 at 10:8 Comment(1)
I get this when I try to import it: [warn] :: org.apache.spark#spark-mllib_2.12;2.0.1: not found - Kimbell
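
The _2.12 warning in the comment above is a different problem: Spark 2.0.x was only published for Scala 2.10 and 2.11, so no spark-mllib_2.12 artifact exists for version 2.0.1. Assuming an sbt build, pinning the Scala version lets the %% operator resolve the _2.11 build:

scalaVersion := "2.11.8"  // Spark did not ship Scala 2.12 artifacts until 2.4.x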

I had the same issue in a Maven Scala project.

Adding the Maven dependency below resolved it:

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.11</artifactId>
            <version>2.0.2</version>
        </dependency>
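
The _2.11 suffix in the artifactId encodes the Scala binary version, so it must match the Scala version the project compiles against; sbt's %% operator appends that suffix automatically, while in Maven it is written out by hand.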
Feodore answered 20/11, 2017 at 0:45 Comment(0)
