Stackdriver logging with SLF4J not logging properly in Scala
I am using Logback with SLF4J for Stackdriver logging, following the example from Google Cloud. My application is written in Scala and runs on a Dataproc cluster on GCP.

My logback.xml has the following contents:

<configuration>
  <appender name="CLOUD" class="com.google.cloud.logging.logback.LoggingAppender">
    <!-- Optional : filter logs at or above a level -->
    <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
      <level>INFO</level>
    </filter>
    <log>application.log</log> <!-- Optional : default java.log -->
    <resourceType>gae_app</resourceType> <!-- Optional : default: auto-detected, fallback: global -->
    <enhancer>com.company.customer.utils.MyLoggingEnhancer</enhancer> <!-- Optional -->
    <flushLevel>WARN</flushLevel> <!-- Optional : default ERROR -->
  </appender>

  <root level="info">
    <appender-ref ref="CLOUD" />
  </root>
</configuration>

and MyLoggingEnhancer is:

package com.company.customer.utils

import com.google.cloud.logging.LogEntry.Builder
import com.google.cloud.logging.LoggingEnhancer

class MyLoggingEnhancer extends LoggingEnhancer {

  override def enhanceLogEntry(builder: Builder): Unit = {
    builder
      .addLabel("test-label-1", "test-value-1") // this label also does not show up in Stackdriver
  }
}

I construct the logger in my classes like this:

import org.slf4j.{Logger, LoggerFactory}

private val LOGGER: Logger = LoggerFactory.getLogger(this.getClass)

and log messages from the logger object like this:

LOGGER.info("Spark Streaming Started for MyApplication")

The issue is that resourceType and logName are not set correctly on the Stackdriver logs: logName is automatically set to yarn-userlogs and resourceType to cloud_dataproc_cluster. I want resourceType = global and logName = MyApp, so that I can filter MyApp logs under the global hierarchy.

I tried removing this line from logback.xml:

<resourceType>gae_app</resourceType>

and adding this line instead:

<resourceType>global</resourceType>

but no luck. The label test-label-1 that I add in the LoggingEnhancer does not show up in Stackdriver either. Any help is highly appreciated.

My JSON payload in Stackdriver is:

{
insertId: "gn8clokwpjqptwo5h"
jsonPayload: {
    application: "application_1568634817510_0189"
    class: "com.MyClass"
    container: "container_e01_1568634817510_0189_01_000001"
    container_logname: "stderr"
    filename: "application_1568634817510_0189.container_e01_1568634817510_0189_01_000001.stderr"
    message: "Spark Streaming Started for MyApplication"
}
labels: {
    compute.googleapis.com/resource_id: "1437319101399877659"
    compute.googleapis.com/resource_name: "e-spark-w-2"
    compute.googleapis.com/zone: "us"
}
logName: "projects/myProject/logs/yarn-userlogs"
receiveTimestamp: "2019-10-01T05:25:16.044579001Z"
resource: {
    labels: {
        cluster_name: "shuttle-spark"
        cluster_uuid: "d1557db6-72ee-4873-a276-4bd4ea0e89bb"
        project_id: "MyProjectId"
        region: "us"
    }
    type: "cloud_dataproc_cluster"
}
severity: "INFO"
timestamp: "2019-10-01T05:25:10Z"
}
Asked by Tennessee on 1/10/2019 at 7:13 — Comments (5)
Did you get any solution for this? — Cyclothymia
@VikasVats not yet through SLF4J. Currently I am logging via com.google.cloud.logging.Logging directly. I get the logger object this way: logger: Logging = LoggingOptions.getDefaultInstance.getService — Tennessee
My issue is with the LoggingEnhancer. I am not able to see the added fields in Stackdriver. — Cyclothymia
My issue was the same. — Tennessee
Thanks. With this approach, I think I would have to write log-entry code in each class? I want a common configuration that adds metadata to all application logs. Do you have any idea? — Cyclothymia
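The workaround mentioned in the comments above, bypassing SLF4J/Logback and writing entries with the Cloud Logging client directly, lets you set the log name, resource type, and labels per entry. A minimal sketch (the MyApp log name and the label values are placeholders, and application-default credentials are assumed to be available on the cluster):

```scala
import java.util.Collections

import com.google.cloud.MonitoredResource
import com.google.cloud.logging.{LogEntry, Logging, LoggingOptions, Severity}
import com.google.cloud.logging.Payload.StringPayload

object DirectCloudLogging {
  // Client built from application-default credentials.
  private val logging: Logging = LoggingOptions.getDefaultInstance.getService

  def logInfo(message: String): Unit = {
    val entry: LogEntry = LogEntry
      .newBuilder(StringPayload.of(message))
      .setSeverity(Severity.INFO)
      .setLogName("MyApp") // shows up as projects/<project-id>/logs/MyApp
      .setResource(MonitoredResource.newBuilder("global").build()) // force the "global" resource type
      .addLabel("test-label-1", "test-value-1") // labels set here appear on the entry
      .build()

    logging.write(Collections.singleton(entry))
  }
}
```

Since each entry carries its own log name and monitored resource, this sidesteps the Dataproc agent, which ingests container stderr/stdout as yarn-userlogs regardless of the Logback configuration. The downside, as noted in the comments, is that it is per-entry code rather than a central configuration.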
