Spark Java and the classpath
I'm trying to get started with http://www.sparkjava.com/, a small Java web framework. The instructions say to add it as a Maven dependency (done), but when I mvn package, I get a NoClassDefFoundError for spark/Route.

I assume this is from Spark not being in my classpath. How can I add it? Would it go in pom.xml?

EDIT: Sorry, here is my pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
     xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.bernsteinbear.myapp</groupId>
  <artifactId>myapp</artifactId>
  <packaging>jar</packaging>
  <version>1.0-SNAPSHOT</version>
  <name>myapp</name>
  <url>http://maven.apache.org</url>
  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>com.sparkjava</groupId>
      <artifactId>spark-core</artifactId>
      <version>1.1</version>
    </dependency>
  </dependencies>
</project>

EDIT: Trace

λ chaos myapp → java -cp target/myapp-1.0-SNAPSHOT.jar com.bernsteinbear.myapp.App
Exception in thread "main" java.lang.NoClassDefFoundError: spark/Route
Caused by: java.lang.ClassNotFoundException: spark.Route
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)

aaaand the source (the example from the homepage):

λ chaos myapp → cat src/main/java/com/bernsteinbear/myapp/App.java
/**
 * Hello world!
 *
 */

package com.bernsteinbear.myapp;
import spark.*;
import static spark.Spark.*;

public class App {

    public static void main(String[] args) {
        get(new Route("/hello") {
            @Override
            public Object handle(Request request, Response response) {
                return "Hello World!";
            }
        });
    }

}
Turpitude answered 27/9, 2013 at 16:13 Comment(3)
please share your pom.xml – Topping
The pom.xml looks right from a Maven perspective (assuming Spark itself doesn't have a bug). Can you share the stack trace and some of your source? Have you tried the examples from the source? – Assault
@Assault here you are. Example from the front page. – Turpitude
What works for me to run it:

mvn package
mvn exec:java -Dexec.mainClass="com.your.class.with.main.method"
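
To avoid typing the main class every time, exec-maven-plugin can also be configured in the pom (a sketch; the plugin version and main class here are assumptions — adjust them for your project):

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>exec-maven-plugin</artifactId>
      <version>1.2.1</version>
      <configuration>
        <!-- main class launched by `mvn exec:java` -->
        <mainClass>com.bernsteinbear.myapp.App</mainClass>
      </configuration>
    </plugin>
  </plugins>
</build>
```

With this in place, a plain mvn exec:java works without the -Dexec.mainClass flag.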
Qadi answered 6/11, 2013 at 20:54 Comment(0)
I was facing the same problem when trying to deploy the application to Heroku. I added the following to my pom.xml. This plugin copies the Maven dependencies into target/dependency, where they can be put on your runtime classpath.

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-dependency-plugin</artifactId>
            <version>2.4</version>
            <executions>
                <execution>
                    <id>copy-dependencies</id>
                    <phase>package</phase>
                    <goals><goal>copy-dependencies</goal></goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

After this you can run your application with

java -cp target/classes:"target/dependency/*" com.bernsteinbear.myapp.App

(On Windows, use ; instead of : as the classpath separator.)

Indicate answered 2/4, 2014 at 2:42 Comment(0)
Ok, so mvn package itself did not throw the exception; it happened at execution. The JAR produced by Maven must not contain everything required to run the app. (You can unzip the JAR if you're curious exactly what it contains.) So now it's a matter of either bundling the Maven dependencies into your packaged JAR (I wouldn't necessarily recommend bothering with that yet), or simply including the additional JARs on your runtime classpath (which on Unix looks like -cp a.jar:b.jar:...). I suspect the spark-dependencies module has all the missing dependencies. (Unfortunately the readme is not very clear on this.)

Assuming the spark-dependencies module is sufficient, you'd just do:

java -cp target/myapp-1.0-SNAPSHOT.jar:lib/jetty-webapp-7.3.0.v20110203.jar:lib/log4j-1.2.14.jar:lib/slf4j-api-1.6.1.jar:lib/servlet-api-3.0.pre4.jar:lib/slf4j-log4j12-1.6.1.jar com.bernsteinbear.myapp.App

Note you have to get the paths right. This is assuming the spark-dependencies zip file is unzipped to a lib folder.

If that still doesn't do it, or for additional information or to give feedback, you might also ping the author directly.
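
If you'd rather build a single self-contained JAR than manage the classpath by hand, the maven-shade-plugin can bundle all runtime dependencies into the artifact. A sketch, not tested against this project — the plugin version is an assumption:

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.2</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals><goal>shade</goal></goals>
          <configuration>
            <transformers>
              <!-- write Main-Class into the manifest so the jar is runnable with java -jar -->
              <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                <mainClass>com.bernsteinbear.myapp.App</mainClass>
              </transformer>
            </transformers>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

After mvn package you would then run the result with java -jar target/myapp-1.0-SNAPSHOT.jar.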

Assault answered 1/10, 2013 at 20:35 Comment(9)
Hmm. How do I figure out the "jetty-webapp..." string? – Turpitude
The JARs inside spark-dependencies. Just unzip the zip file and take a look. I'm sorry -- I just corrected the link to spark-dependencies in case that was holding you up. – Assault
I've updated the answer with the exact JAR names (as of the current spark-dependencies). – Assault
Is there any way to create a complete independent JAR? – Turpitude
Yes, using Maven. See github.com/perwendel/spark/blob/master/pom.xml for the dependencies you'd need to copy over, and this Community Wiki answer on how to incorporate dependencies into a JAR using Maven: #575094 – Assault
Why are dependencies not recursively computed? Like in Ruby gems, it just knows what to bundle... – Turpitude
Good question. They are recursively computed, but not necessarily recursively included in the built package unless you follow the SO answer I linked above. Maven has a notion of different dependency scopes -- for things like compile time, test time, run time -- so you can include only what you need where you need it. Actually "copy[ing] over" the dependencies as I suggested may be overkill; I haven't dug into the Spark pom enough to know what it exports. You'll have to do some experimentation. – Assault
Ah, thank you. How can I get Maven to compile for a specific Java version? Spark's not compatible with 7, apparently... – Turpitude
Set the source and target elements in the compiler plugin configuration: maven.apache.org/plugins/maven-compiler-plugin/examples/… – Assault
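
For reference, the compiler settings mentioned in the last comment look like this in the pom (a sketch; the plugin version and the 1.6 level are only examples):

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.1</version>
      <configuration>
        <!-- compile for a specific Java language and bytecode level -->
        <source>1.6</source>
        <target>1.6</target>
      </configuration>
    </plugin>
  </plugins>
</build>
```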
