Multiple Spark servers in a single JVM

Is there any way to run multiple instances of a SparkJava server in the same JVM? I am using it in "plugin" software, and depending on external circumstances multiple instances of my plugin may be started up, which then causes

java.lang.IllegalStateException: This must be done before route mapping has begun
at spark.SparkBase.throwBeforeRouteMappingException(SparkBase.java:256)
at spark.SparkBase.port(SparkBase.java:101)
at com.foo.bar.a(SourceFile:59)

Looking at the code, it seems to be heavily built around static fields, so I am thinking about a classloader trick, or working with SparkServerFactory while somehow eliminating SparkBase.
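
Roughly, the pattern in my plugin looks like this (a simplified sketch with a hypothetical class name, not the actual code):

import static spark.Spark.get;
import static spark.Spark.port;

public class MyPlugin {
    public void start(int listenPort) {
        port(listenPort);                        // fine for the first plugin instance
        get("/hello", (req, res) -> "Hello");    // route mapping begins here
        // A second plugin instance calling port(...) now fails with the
        // IllegalStateException above, because the static state is shared.
    }
}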

Horten answered 3/1, 2017 at 20:52 Comment(3)
Probably those instances are using the same port number. When you start a Spark instance it has to run on its own port. – Horsefly
No, it happens when you try to initialize a second one, regardless of whether the port is the same or not. The first initialization sets a flag to true, and from that point on most of the configuration methods are locked down. – Horten
I had the same issue when running integration tests with SparkJava. As a workaround, make sure to set forkCount=1/reuseForks=false for the Maven Failsafe plugin so that each test class executes in its own JVM process (see maven.apache.org/surefire/maven-surefire-plugin/examples/…) – Amarillo

From Spark 2.5 you can use ignite():

http://sparkjava.com/news.html#spark25released

Example:

import static spark.Service.ignite;

import spark.Service;

public static void main(String[] args) {
    igniteFirstSpark();
    igniteSecondSpark();
}

static void igniteSecondSpark() {
    Service http = ignite(); // second, independent server on the default port 4567

    http.get("/basicHello", (q, a) -> "Hello from port 4567!");
}

static void igniteFirstSpark() {
    Service http = ignite()  // each ignite() creates its own embedded Jetty server
                      .port(8080)
                      .threadPool(20);

    http.get("/configuredHello", (q, a) -> "Hello from port 8080!");
}

I personally initialize them something like this:

import spark.Service;

public static void main(String[] args) {
    Service service1 = Service.ignite().port(8080).threadPool(20);
    Service service2 = Service.ignite().port(8081).threadPool(10);
}

I recommend reading about how to use those services outside your main method, which I think would be a great fit here.
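
For example, something along these lines (a minimal sketch; the HelloServer name is just illustrative):

import spark.Service;

public class HelloServer {

    private final Service http;

    public HelloServer(int port) {
        // each ignite() call creates an independent embedded server instance
        http = Service.ignite().port(port);
        http.get("/hello", (req, res) -> "Hello from port " + port + "!");
    }

    public void stop() {
        http.stop(); // stops only this instance's server
    }
}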

Madrigal answered 3/4, 2017 at 4:15 Comment(2)
Is there a way to make them all run on one port? – Stereoscopy
One solution I see is to reuse the Service instance. – Stereoscopy
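
Reusing one Service instance for both route groups (the single-port case asked about in the comments) would look roughly like this minimal sketch, with illustrative paths:

import spark.Service;

public static void main(String[] args) {
    Service http = Service.ignite().port(8080);

    // both route groups are registered on the same instance, so one port serves both
    http.get("/first/hello", (req, res) -> "Hello from the first group!");
    http.get("/second/hello", (req, res) -> "Hello from the second group!");
}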

The trick is to ignore the external static shell around Spark implemented in spark.Spark and to work directly with the internal spark.webserver.SparkServer. There are some obstacles in the code that require workarounds, e.g. spark.webserver.JettyHandler is not public, so you can't instantiate it from your code, but you can extend it with your own class placed in that package and make it public.

So the solution is along these lines:

SimpleRouteMatcher routeMatcher1 = new SimpleRouteMatcher();
routeMatcher1.parseValidateAddRoute("get '/foo'", "*/*", wrap("/foo", "*/*", (req, res) -> "Hello World 1"));

MatcherFilter matcherFilter1 = new MatcherFilter(routeMatcher1, false, false);
matcherFilter1.init(null);
PublicJettyHandler handler1 = new PublicJettyHandler(matcherFilter1);
SparkServer server1 = new SparkServer(handler1);

new Thread(() -> {
    server1.ignite("0.0.0.0", 4567, null, null, null, null, "/META-INF/resources/", null,
            new CountDownLatch(1), -1, -1, -1);
}).start();

And you need to duplicate the wrap method in your codebase:

protected RouteImpl wrap(final String path, String acceptType, final Route route) {
    if (acceptType == null) {
        acceptType = "*/*";
    }
    RouteImpl impl = new RouteImpl(path, acceptType) {
        @Override
        public Object handle(Request request, Response response) throws Exception {
            return route.handle(request, response);
        }
    };
    return impl;
}
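
The PublicJettyHandler used above is not part of Spark; it is the package-local subclass mentioned earlier. A minimal sketch, assuming JettyHandler's constructor takes the servlet Filter (check the signature in your Spark version):

package spark.webserver;

import javax.servlet.Filter;

// Only purpose: widen the visibility of the package-private JettyHandler
// so it can be constructed from application code.
public class PublicJettyHandler extends JettyHandler {
    public PublicJettyHandler(Filter filter) {
        super(filter);
    }
}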

This seems to be a viable workaround if you need multiple Spark servers in your app.

Horten answered 18/1, 2017 at 21:28 Comment(0)

I had this problem when running unit tests with Spark. To fix it, I modified the pom.xml file to set forkCount=1 and reuseForks=false:

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>${surefire.version}</version>
            <dependencies>
                <dependency>
                    <groupId>org.junit.platform</groupId>
                    <artifactId>junit-platform-surefire-provider</artifactId>
                    <version>${junit.platform.version}</version>
                </dependency>
            </dependencies>
            <configuration>
                <forkCount>1</forkCount>
                <reuseForks>false</reuseForks>
            </configuration>
        </plugin>
Xray answered 13/6, 2018 at 13:52 Comment(1)
I couldn't get your suggestion to work as is – I got "Execution default-test of goal org.apache.maven.plugins:maven-surefire-plugin:2.22.0:test failed: java.lang.ClassNotFoundException". But with the "dependencies" section removed it works. Thanks – Gatefold
