Connection pooling in Spark Java framework
The Java Spark framework uses embedded Jetty as a web server. Jetty supports connection pooling with tools such as HikariCP and provides configuration options in XML files. However, according to these posts, Spark does not allow configuring the embedded Jetty. There are plenty of examples using Spark, but they either do not use a database or use DriverManager to connect to it.

Is it possible to configure connection pooling via datasource and JNDI in Spark? If so, how?

Koball answered 23/3, 2017 at 10:45 Comment(6)
Have you figured it out? I am facing the same issue. – Buttonhole
No, I haven't. I have put a bounty on the question. – Koball
There is a thread on GitHub about this feature request. If you are good at Java, have a look at it. – Buttonhole
I looked at the thread (the one you mentioned in these posts) and the commit; it provides flexibility for the threads used by embedded Jetty via JettyServerFactory. It does not solve the connection pooling issue. – Bullfight
@JanBodnar I opened an issue on GitHub: github.com/perwendel/spark/issues/833 – Buttonhole
OK, I will follow it; if they finish the request, I will post an answer. – Koball
I configured pooling in Spark Java with HikariCP for MariaDB. I did not use Jetty; I used Apache Tomcat instead. Here are some code snippets:

src/main/resources/mysql-connection.properties

dataSourceClassName=com.mysql.jdbc.jdbc2.optional.MysqlDataSource
dataSource.url=<url>
dataSource.user=<user>
dataSource.password=<password>
dataSource.cachePrepStmts=true
dataSource.prepStmtCacheSize=100
dataSource.prepStmtCacheSqlLimit=2048
dataSource.useServerPrepStmts=true
maximumPoolSize=10

com/example/app/DataSourceFactory.java

import java.io.IOException;
import java.util.Properties;

import javax.sql.DataSource;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

public final class DataSourceFactory {

    static final Logger LOG = LoggerFactory.getLogger(DataSourceFactory.class);

    // volatile is required for double-checked locking to work correctly
    private static volatile DataSource mySQLDataSource;

    private DataSourceFactory() {
    }

    // returns javax.sql.DataSource, creating the pool lazily on first call
    public static DataSource getMySQLDataSource() {
        if (mySQLDataSource == null) {
            synchronized (DataSourceFactory.class) {
                if (mySQLDataSource == null) {
                    mySQLDataSource = getDataSource("mysql-connection.properties");
                }
            }
        }
        return mySQLDataSource;
    }

    // creates the DataSource from a properties file on the classpath
    private static DataSource getDataSource(String configurationProperties) {
        Properties conf = new Properties();
        try {
            conf.load(DataSourceFactory.class.getClassLoader().getResourceAsStream(configurationProperties));
        } catch (IOException e) {
            LOG.error("Can't locate database configuration", e);
        }
        HikariConfig config = new HikariConfig(conf);
        HikariDataSource dataSource = new HikariDataSource(config);
        LOG.info("DataSource[" + configurationProperties + "] created " + dataSource);
        return dataSource;
    }

}

WebContent/WEB-INF/web.xml

<?xml version="1.0" encoding="UTF-8"?>
<web-app xmlns="http://xmlns.jcp.org/xml/ns/javaee" 
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
    xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee                              
    http://xmlns.jcp.org/xml/ns/javaee/web-app_3_1.xsd" version="3.1">

    <display-name>My Spark App</display-name>
    <filter>
        <filter-name>SparkFilter</filter-name>
        <filter-class>spark.servlet.SparkFilter</filter-class>
        <init-param>
            <param-name>applicationClass</param-name>
            <param-value>com.example.app.MySparkApp</param-value>
            <!-- MySparkApp implements spark.servlet.SparkApplication -->
        </init-param>
    </filter>
    <filter-mapping>
        <filter-name>SparkFilter</filter-name>
        <url-pattern>/*</url-pattern>
    </filter-mapping>
    <listener>
        <listener-class>com.example.app.AppServletContextListener</listener-class>
    </listener>
</web-app>
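The web.xml above references com.example.app.MySparkApp as the SparkFilter's applicationClass. A minimal sketch of what that class could look like (the route shown is illustrative, not from the original answer):

```java
import static spark.Spark.get;

import spark.servlet.SparkApplication;

// Referenced from web.xml as the SparkFilter's applicationClass;
// init() is called by the filter to register routes.
public class MySparkApp implements SparkApplication {

    @Override
    public void init() {
        // hypothetical route; real routes would use the pooled DataSource
        get("/ping", (req, res) -> "pong");
    }
}
```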

com/example/app/AppServletContextListener.java

import java.sql.SQLException;

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.zaxxer.hikari.HikariDataSource;

public class AppServletContextListener implements ServletContextListener {

    static final Logger LOG = LoggerFactory.getLogger(AppServletContextListener.class);

    @Override
    public void contextInitialized(ServletContextEvent arg0) {
        LOG.info("contextInitialized...");
    }

    @Override
    public void contextDestroyed(ServletContextEvent arg0) {
        LOG.info("contextDestroyed...");
        try {
            // unwrap the HikariDataSource so the pool can be closed cleanly
            if (DataSourceFactory.getMySQLDataSource() != null) {
                DataSourceFactory.getMySQLDataSource().unwrap(HikariDataSource.class).close();
            }
        } catch (SQLException e) {
            LOG.error("Problem closing HikariCP pool", e);
        }
    }

}

And finally, you can obtain a pooled java.sql.Connection by calling DataSourceFactory.getMySQLDataSource().getConnection().
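For example, a Spark route could borrow a connection from the pool with try-with-resources, which returns the connection to the pool when the block exits (the route path, table, and columns below are illustrative, not from the original answer):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

import static spark.Spark.get;

public class UserRoutes {

    public static void main(String[] args) {
        // hypothetical route; the connection is returned to the pool on close
        get("/users/:id", (req, res) -> {
            try (Connection con = DataSourceFactory.getMySQLDataSource().getConnection();
                 PreparedStatement ps = con.prepareStatement("SELECT name FROM users WHERE id = ?")) {
                ps.setLong(1, Long.parseLong(req.params(":id")));
                try (ResultSet rs = ps.executeQuery()) {
                    return rs.next() ? rs.getString("name") : "not found";
                }
            }
        });
    }
}
```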

Showiness answered 11/4, 2017 at 8:10 Comment(3)
Hi, thanks for your answer. I know that it is possible with Tomcat. The question is how to do it with embedded Jetty, which is the default for Spark. – Koball
Is this talking about apache-spark? – Consumptive
@Consumptive sparkjava.com – Buttonhole
It should be possible, but you must:

  1. configure the pool
  2. keep a static reference to the datasource
  3. find where to start/connect to the database and, more importantly, find a way to close/release the pool

Points 1) and 2) could be as simple as:

static final HikariDataSource datasource = new HikariDataSource(new HikariConfig());

I don't know if Sparkjava supports start/stop events, so I can't help you there.
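Absent framework stop events, one possible workaround (a sketch, assuming a plain JVM deployment and a hypothetical hikari.properties on the classpath) is a JVM shutdown hook that closes the pool:

```java
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

public class Pool {

    // points 1) and 2): configure the pool and keep a static reference;
    // "/hikari.properties" is an assumed classpath resource
    static final HikariDataSource DATASOURCE =
            new HikariDataSource(new HikariConfig("/hikari.properties"));

    public static void main(String[] args) {
        // point 3): release the pool when the JVM shuts down
        Runtime.getRuntime().addShutdownHook(new Thread(DATASOURCE::close));
        // ... start Spark routes here ...
    }
}
```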

Optionally, you could try http://jooby.org (I'm the author) with the jdbc module:

{
  use(new Jdbc());

  get("/db", () -> {
    DataSource ds = require(DataSource.class);
  });
}

The Jdbc module gives you a HikariDataSource with sensible default options, but at the same time you can customize the pool via the application.conf file.

There are a couple of higher-level jdbc modules too, such as Hibernate, Ebean, jOOQ, and Jdbi.

Tomfool answered 18/4, 2017 at 15:22 Comment(0)