Does Java Spark provide any support for dependency injection or IoC containers?

Coming from .NET, I am well versed in the support that micro web frameworks such as NancyFX and Web API have for IoC containers.

In similar Ruby frameworks such as Sinatra (which NancyFX is based on), dependency injection also appears to be possible.

From what I see, because Java Spark applications run from a main method, it doesn't seem like you can pass in your dependencies or an IoC container.

import static spark.Spark.get;

public class HelloWorld {
    public static void main(String[] args) {
        get("/hello", (req, res) -> "Hello World");
    }
}

I have a hard time understanding how a framework like this could be useful without supporting this.

If this framework doesn't, is there another lightweight framework (Spring is not lightweight from what I remember, but maybe things have changed) that does support this?

Bolometer answered 19/8, 2015 at 4:14
Spring can be used as a lightweight DI container by configuring beans in XML and not using libraries outside the core. Although I've not used it personally, there's also the @Inject annotation if that looks like something up your alley. – Lamarckian
You can try the Pippo web framework. It has support for Spring, Guice and Weld CDI. – Lathe

Spring can easily be integrated with Spark, for example:

public interface Spark {

    /**
     * Adds filters, routes, exceptions, websockets and others.
     */
    void register();

}

import java.util.ArrayList;
import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.CommandLineRunner;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class SparkConfiguration {

    @Autowired(required = false)
    private List<Spark> sparks = new ArrayList<>();

    @Bean
    CommandLineRunner sparkRunner() {
        return args -> sparks.forEach(Spark::register);
    }

}

import static spark.Spark.get;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

@Component
public class HelloSpark implements Spark {

    @Autowired
    private HelloWorldService helloWorldService;

    @Override
    public void register() {
        get("/hello", (request, response) -> helloWorldService.hello());
    }

}
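
The HelloWorldService referenced above is not shown in the answer. A minimal sketch of what it might look like, assuming an ordinary Spring-managed bean whose hello() result is returned by the /hello route:

import org.springframework.stereotype.Service;

@Service
public class HelloWorldService {

    // Returned to the client by the /hello route registered in HelloSpark
    public String hello() {
        return "Hello World from Spring";
    }

}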

You can find more at https://github.com/pmackowski/spring-boot-spark-java

Tripodic answered 6/11, 2015 at 22:57

It is quite easy to use Guice with Java Spark. Basically, you need to extend SparkFilter as follows in order to create the Guice injector:

import javax.servlet.FilterConfig;
import javax.servlet.ServletException;

import com.google.inject.Guice;
import com.google.inject.Injector;

import spark.servlet.SparkApplication;
import spark.servlet.SparkFilter;

public class SparkGuiceFilter extends SparkFilter {

    private Injector injector = null;

    @Override
    protected SparkApplication[] getApplications(final FilterConfig filterConfig) throws ServletException {
        final SparkApplication[] applications = super.getApplications(filterConfig);

        // Create the injector lazily; MainModule is the application's own Guice module
        if (this.injector == null) {
            this.injector = Guice.createInjector(new MainModule());
        }

        // Inject members (e.g. @Inject fields) into every Spark application instance
        if (applications != null && applications.length != 0) {
            for (SparkApplication application : applications) {
                this.injector.injectMembers(application);
            }
        }

        return applications;
    }
}

Then you need a web.xml, and you have to run your Spark application as a regular WAR using Jetty or any other servlet container:

<web-app xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://java.sun.com/xml/ns/javaee"
         xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_3_0.xsd"
         version="3.0">

    <filter>
        <filter-name>SparkGuiceFilter</filter-name>
        <filter-class>com.devng.spark.guice.SparkGuiceFilter</filter-class>
        <init-param>
            <param-name>applicationClass</param-name>
            <param-value>com.devng.spark.SparkApp</param-value>
        </init-param>
    </filter>
    <filter-mapping>
        <filter-name>SparkGuiceFilter</filter-name>
        <url-pattern>/*</url-pattern>
    </filter-mapping>
</web-app>
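
The com.devng.spark.SparkApp referenced in the web.xml above is not shown here. A hypothetical sketch, assuming a HelloService bound in MainModule:

package com.devng.spark;

import static spark.Spark.get;

import javax.inject.Inject;

import spark.servlet.SparkApplication;

public class SparkApp implements SparkApplication {

    // Populated by SparkGuiceFilter via injector.injectMembers(this);
    // HelloService is a hypothetical binding from MainModule
    @Inject
    private HelloService helloService;

    @Override
    public void init() {
        get("/hello", (req, res) -> helloService.hello());
    }
}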

However, there are some limitations with this approach. You cannot use request or session scopes with Guice this way. If you don't need them, you are good to go. Otherwise, you need to integrate the Guice Servlet Extensions and add the GuiceFilter to your web.xml, as described in the official Guice documentation. You also need to make sure that you use the same injector instance in the GuiceFilter and the SparkGuiceFilter; to do that, you need to define a GuiceServletContextListener in your web.xml, as described here.
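
To illustrate the last point, here is a hedged sketch of such a listener, assuming the shared injector is exposed through a static field that SparkGuiceFilter reuses instead of creating its own (class and field names are illustrative):

import com.google.inject.Guice;
import com.google.inject.Injector;
import com.google.inject.servlet.GuiceServletContextListener;
import com.google.inject.servlet.ServletModule;

public class MyGuiceServletConfig extends GuiceServletContextListener {

    // Single shared injector; SparkGuiceFilter should return this instance
    // instead of calling Guice.createInjector() itself
    public static final Injector INJECTOR =
            Guice.createInjector(new MainModule(), new ServletModule());

    @Override
    protected Injector getInjector() {
        return INJECTOR;
    }
}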

You can find a fully working example on my GitHub here: https://github.com/devng/demo/tree/master/sparkjava-guice

Discourtesy answered 25/4, 2017 at 11:21

I'm actually experimenting with Spark and Guice, and as far as I can see, dependency injection with the two is very simple, at least as of today (August 2017).

All you have to do is the following:

import static spark.Spark.get;

import com.google.inject.Guice;
import com.google.inject.Injector;

public class MySparkApplication {

    public static void main(String[] args) {
        // An injector with no modules is enough for just-in-time bindings
        Injector injector = Guice.createInjector();
        SomeClass someClass = injector.getInstance(SomeClass.class);

        get("/hello", (req, res) -> someClass.getSomeString());
    }
}
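
SomeClass is not defined in the answer. A minimal sketch that relies on Guice's just-in-time bindings (no module is needed for concrete classes), with a hypothetical collaborator:

import com.google.inject.Inject;

// Hypothetical collaborator; Guice can instantiate concrete classes with a
// public no-arg constructor without any explicit module
class SomeDependency {
    String describe() {
        return "built by Guice";
    }
}

public class SomeClass {

    private final SomeDependency dependency;

    @Inject // Guice resolves SomeDependency when creating SomeClass
    public SomeClass(SomeDependency dependency) {
        this.dependency = dependency;
    }

    public String getSomeString() {
        return "Hello from " + dependency.describe();
    }
}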

It actually seems to be as easy as that. I just followed the Guice Getting Started guide, and it works. When I run Spark and open http://localhost:4567/hello in my browser, the string returned from my method is displayed.

Teach answered 12/8, 2017 at 10:4

I've been working with Spark lately, and it does not include an IoC provider out of the box. However, you can easily include Spring or Guice core, and that makes for a lightweight solution.

All you need to do is add the dependency to Maven and start using it, as shown below.
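
A sketch of the Guice dependency, for example (the version is illustrative; check Maven Central for the latest release):

<dependency>
    <groupId>com.google.inject</groupId>
    <artifactId>guice</artifactId>
    <version>5.1.0</version>
</dependency>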

As an alternative, you could look at Ninja, which is a full-stack framework and includes Guice and JPA/Hibernate out of the box.

Grease answered 19/10, 2015 at 14:32

I've been working on my own IoC setup with Guice. It works fine without much code ;) Link: https://github.com/Romain-P/SparkJava-JFast

No Guice module is required by default; classes annotated with @Binding are detected automatically.

public class Main {
    public static void main(String[] args) {
        /* pass a class as the argument; its package is scanned recursively for bindings */
        Injector injector = SparkApplication.init(Application.class);
        injector.getInstance(Application.class).initialize();
    }
}

@Binding
public class Application {
    @Inject Service http;
    @Inject HelloController helloController;

    public void initialize() {
        http.port(8080);
        http.get("/hello", (req, res) -> helloController.hello());
    }
}

@Binding
public class HelloController {
    @Inject HelloService service;

    public Object hello() {
        //business logic
        return service.hello();
    }
}

@Binding
@Slf4j
public class HelloService {
    public Object hello() {
        log.info("hello");
        return new Object();
    }
}

Riggle answered 25/10, 2017 at 9:15

I think the neutrino framework fits your requirement.

Disclaimer: I am the author of the neutrino framework.

What is the neutrino framework

It is a Guice-based dependency injection framework for Apache Spark, designed to relieve the serialization work during development.

Unlike older DI solutions for Spark, which only apply DI on the driver, neutrino provides the ability to use DI to generate objects on the executors and even control their scope there.

For details, please refer to the neutrino readme file.

Limitation

Since this framework uses Scala macros to generate some classes, the Guice modules and the logic for wiring them together need to be written in Scala. Other classes can be written in Java.

Kinder answered 7/5, 2022 at 15:13
