Can I run Spark unit tests within Eclipse?
We recently moved from Scalding to Spark. I used Eclipse with the Scala IDE for Eclipse to write code and tests, and the tests ran fine with Twitter's JobTest class: any class using JobTest was automatically available to run as a Scala unit test within Eclipse. I've lost that ability now. The Spark test cases run perfectly under sbt, but the Eclipse run configuration for these tests lists 'none applicable'.

Is there a way to run Spark unit tests within Eclipse?

Bosco answered 21/5, 2015 at 1:42

I think this same approach, shown here in Java, would also work in Scala. Basically, create a SparkContext with the master set to "local", then build and run unit tests as normal. Be sure to stop the SparkContext when the test is finished.

I have this working with Spark 1.0.0, but not with a newer version.

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;

import static org.junit.Assert.assertEquals;

public class Test123 {
  static JavaSparkContext sparkCtx;

  @BeforeClass
  public static void sparkSetup() {
    // Run Spark in-process ("local" master); no cluster is needed for unit tests
    SparkConf conf = new SparkConf();
    sparkCtx = new JavaSparkContext("local", "test", conf);
  }

  @AfterClass
  public static void sparkTeardown() {
    // Stop the context so later test classes can create their own
    sparkCtx.stop();
  }

  @Test
  public void integrationTest() {
    JavaRDD<String> logRawInput = sparkCtx.parallelize(Arrays.asList(
        "data1",
        "data2",
        "garbage",
        "data3"));
    // Force evaluation with an action so the test actually exercises Spark
    assertEquals(4, logRawInput.count());
  }
}
Subsume answered 25/8, 2015 at 20:19 Comment(1)
The code for doing this in Scala is similar, and works in Spark 1.5.x. – Logo
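For reference, a minimal Scala sketch of the same pattern, assuming ScalaTest (FunSuite with BeforeAndAfterAll); the suite name and test data here are illustrative, not from the original answer:

import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, FunSuite}

class SparkLocalSuite extends FunSuite with BeforeAndAfterAll {
  // In-process context, mirroring the Java version above
  private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    val conf = new SparkConf().setMaster("local").setAppName("test")
    sc = new SparkContext(conf)
  }

  override def afterAll(): Unit = {
    // Always stop the context so other suites can create their own
    sc.stop()
  }

  test("parallelize and count") {
    val input = sc.parallelize(Seq("data1", "data2", "garbage", "data3"))
    assert(input.count() == 4)
  }
}

With the ScalaTest plugin installed alongside the Scala IDE, a suite like this should be runnable directly from Eclipse's Run As menu, which addresses the original question.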
