Java spark framework enable logging
Asked Answered
E

4

11

I'm building a Java application with the Spark framework, embedded Jetty, and the Handlebars template engine. But when I get a 500 Internal Server Error, the console doesn't say anything. I have added the dependencies listed here to my pom.xml: http://sparkjava.com/documentation.html#add-a-logger — but it does not print all exceptions/errors (such as 500 errors).

Here are my pom.xml dependencies:

<dependencies>

    <!-- FRAMEWORK:     Spark -->
    <dependency>
        <groupId>com.sparkjava</groupId>
        <artifactId>spark-core</artifactId>
        <version>2.5</version>
    </dependency>

    <!-- TEMPLATES:     Handlebars -->
    <dependency>
        <groupId>com.sparkjava</groupId>
        <artifactId>spark-template-handlebars</artifactId>
        <version>2.3</version>
    </dependency>

    <!-- DB-MAPPING:    sql2o -->
    <dependency>
        <groupId>org.sql2o</groupId>
        <artifactId>sql2o</artifactId>
        <version>1.5.4</version>
    </dependency>

    <!-- DRIVERS: sqlite-->
    <dependency>
        <groupId>org.xerial</groupId>
        <artifactId>sqlite-jdbc</artifactId>
        <version>3.8.11.2</version>
    </dependency>

    <!-- LOGGER:        slf4j -->
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-simple</artifactId>
        <version>1.7.21</version>
    </dependency>

</dependencies>

How can I enable all logging for Spark?

Enteritis answered 22/7, 2016 at 13:56 Comment(1)
You can configure log4j to capture logs normally.Ballistic
S
2

Use log4j to provide a logging implementation. Without one, you have no way to see why you are getting an internal server error.

http://logging.apache.org/log4j/2.x/
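
If you go the log4j 2 route, note that the Spark web framework logs through SLF4J, so you also need an SLF4J-to-log4j binding (log4j-slf4j-impl) on the classpath. A minimal log4j2.xml on the classpath might look like the sketch below (the level and pattern are illustrative, not required values):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
  <Appenders>
    <!-- Write everything to the console -->
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <!-- DEBUG is verbose; raise to INFO once things work -->
    <Root level="debug">
      <AppenderRef ref="Console"/>
    </Root>
  </Loggers>
</Configuration>
```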

Sublease answered 22/7, 2016 at 16:43 Comment(0)
E
17

To enable logging, just add the following dependency to your project:

<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <version>1.7.21</version>
</dependency>
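
One caveat, assuming you stay with slf4j-simple: it logs at INFO level and above by default. To see debug output too, you can lower its threshold through its own configuration properties, e.g. in a simplelogger.properties file on the classpath (the values below are illustrative):

```properties
# simplelogger.properties — configuration for slf4j-simple
# Lower the default threshold from info to debug
org.slf4j.simpleLogger.defaultLogLevel=debug
# Prefix each log line with a timestamp
org.slf4j.simpleLogger.showDateTime=true
```

The same properties can also be passed as JVM system properties, e.g. `-Dorg.slf4j.simpleLogger.defaultLogLevel=debug`.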

You can also register a catch-all Spark exception handler to log uncaught exceptions:

Spark.exception(Exception.class, (exception, request, response) -> {
    // Print the full stack trace of any unhandled exception (i.e. what
    // would otherwise surface as a silent 500) to stderr
    exception.printStackTrace();
});
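
If you would rather send that stack trace to a logger than to stderr, a small stdlib-only helper can render the Throwable as a String first (StackTraceUtil is a hypothetical name for illustration; the resulting string can be passed to any SLF4J logger call inside the handler above):

```java
import java.io.PrintWriter;
import java.io.StringWriter;

public class StackTraceUtil {

    // Render a Throwable's full stack trace as a String so it can be
    // handed to a logger instead of being printed straight to stderr.
    public static String asString(Throwable t) {
        StringWriter sw = new StringWriter();
        t.printStackTrace(new PrintWriter(sw));
        return sw.toString();
    }

    public static void main(String[] args) {
        String trace = asString(new IllegalStateException("boom"));
        System.out.println(trace.contains("IllegalStateException: boom")); // prints true
    }
}
```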
Elkeelkhound answered 23/11, 2016 at 23:56 Comment(0)
E
-1

Not sure if this question is about Apache Spark's (or Hadoop's) built-in logging, but if that's the case, setting the log level on the SparkContext helped me.

sc.setLogLevel("ERROR");

Possible options are ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, and WARN.

https://spark.apache.org/docs/2.2.0/api/java/org/apache/spark/SparkContext.html#setLogLevel-java.lang.String-

Ellipsoid answered 1/10, 2018 at 0:14 Comment(0)
C
-3

Have you added a log4j properties file? Have a look at this documentation.

Configuring Logging Spark uses log4j for logging. You can configure it by adding a log4j.properties file in the conf directory. One way to start is to copy the existing log4j.properties.template located there.

Cowen answered 22/7, 2016 at 14:5 Comment(3)
I'm using the 2.5 version, and the website says to add slf4j: sparkjava.com/documentation.html#add-a-loggerEnteritis
The documentation you're referring to is for Spark the data processing framework, not Spark the web framework. Yes, it's an annoying namespace conflict. ;)Selfpropulsion
This does not seem to answer the question. This is for Spark Java and not Apache spark :) sparkjava.com/documentation.html#how-do-i-enable-loggingLheureux

© 2022 - 2024 — McMap. All rights reserved.