Logging from Java

Are your Java logs boring? Visit our blog to learn powerful and simple techniques to make them valuable!

Purpose of good Java Logging

Java logs are quite complex to handle, mainly because of stack traces. Stack traces are split across multiple lines, which makes them difficult to associate with the original log event:

//4 events generated when only one is expected!
Exception in thread "main" java.lang.NullPointerException
        at com.example.myproject.Book.getTitle(Book.java:16)
        at com.example.myproject.Author.getBookTitles(Author.java:25)
        at com.example.myproject.Bootstrap.main(Bootstrap.java:14)

By asking your logging library to log in JSON, you will:

  • Ensure that the stack_trace is properly wrapped inside a single LogEvent
  • Ensure that all the attributes of a log event are properly extracted (severity, logger name, thread name, etc.)
  • Gain access to MDCs (Mapped Diagnostic Contexts), attributes you can attach to any log event
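
As an illustration, the NullPointerException above would then arrive as a single JSON event, roughly like the sketch below. The exact field names vary from one library to another, the message is illustrative, and scope is an example MDC attribute:

```json
{
  "@timestamp": "2017-02-08T14:22:05.103Z",
  "level": "ERROR",
  "logger_name": "com.example.myproject.Bootstrap",
  "thread_name": "main",
  "message": "Unable to load the book titles",
  "stack_trace": "java.lang.NullPointerException\n\tat com.example.myproject.Book.getTitle(Book.java:16)\n\tat com.example.myproject.Author.getBookTitles(Author.java:25)\n\tat com.example.myproject.Bootstrap.main(Bootstrap.java:14)",
  "scope": "prod30"
}
```

One event, one document: the stack trace travels inside the stack_trace attribute instead of being split into four separate lines.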

How do we do it? We'll show you some examples with Log4j2, Logback, and Log4j in the next sections.

Log from java using Log4j2

Source-code integration compatibility

The source-code integration does not work with the Log4j2 library until issue LOG4J2-1694 is resolved by the community.

Step 1 - Get the dependency

Assuming you are using Maven or a similar build tool, add the following dependencies to your configuration file.

<dependencies>
  <dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-core</artifactId>
    <version>2.7</version>
  </dependency>
  <dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-core</artifactId>
    <version>2.8.3</version>
  </dependency>
  <dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.8.3</version>
  </dependency>
  <dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-annotations</artifactId>
    <version>2.8.3</version>
  </dependency>
</dependencies>

Step 2 - Create the appender layout

Once the library is in your classpath, you can attach the following layout to any appender:

 <JSONLayout compact="true" eventEol="true" properties="true"/>

Step 3 - Create the appender

Now attach this layout to an appender. The following configuration writes JSON-formatted logs to a local file:

<?xml version="1.0" encoding="UTF-8"?>
<Configuration>
    <Appenders>
        <File name="json_file" fileName="logs/appname_json.log">
            <JSONLayout compact="true" eventEol="true" properties="true"/>
        </File>
    </Appenders>
    <Loggers>
        <Root level="TRACE">
            <AppenderRef ref="json_file" />
        </Root>
    </Loggers>
</Configuration>
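
If you prefer to send the events directly over the network, Log4j2 also ships a Socket appender that can carry the same JSON layout over TCP. Here is a sketch; the host and port are illustrative placeholders to replace with your own endpoint:

```xml
<!-- Sketch: the Socket appender carries the same JSON layout over TCP.
     Host and port are illustrative placeholders. -->
<Appenders>
    <Socket name="json_tcp" host="api.logmatic.io" port="10514" protocol="TCP">
        <JSONLayout compact="true" eventEol="true" properties="true"/>
    </Socket>
</Appenders>
```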

Step 4 - Send the log file through a log-shipper

Follow the instructions of the log shipper you use.

This is what you will get on your platform:

A sample with Log4J2


Log from java using Slf4j & Logback

JSON logging with Slf4J Logback

Step 1 - Add the JSON library

The JSON library we selected for Logback is logstash-logback-encoder. One advantage is that it is available in the main Maven repository.

To add it to your classpath, simply add the following dependencies (version 4.5.1 in this example):

<dependency>
  <groupId>net.logstash.logback</groupId>
  <artifactId>logstash-logback-encoder</artifactId>
  <version>4.5.1</version>
</dependency>

<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-classic</artifactId>
  <version>1.1.3</version>
</dependency>

Step 2 - Create the appender

Now that the library is set up, you can log JSON to your console or send it straight over TCP to Logmatic.io, as shown in the example below:

<appender name="JSON" class="ch.qos.logback.core.ConsoleAppender">
  <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
</appender>

<appender name="JSON_TCP" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
  <remoteHost>api.logmatic.io</remoteHost>
  <port>10514</port>
  <keepAliveDuration>1 minute</keepAliveDuration>
  <encoder class="net.logstash.logback.encoder.LogstashEncoder">
    <customFields>{"logmaticKey":"<your_api_key>"}</customFields>
  </encoder>
</appender>

<root level="debug">
  <appender-ref ref="JSON_TCP" />
  <appender-ref ref="JSON" />
</root>

Don't forget to replace <your_api_key> with the API key generated for your account.

Using Rsyslog/Syslog-ng/Nxlog with Logback

Sending logs through Rsyslog, Syslog-NG, or NXLog is also a good option.

As with Log4j2, another good way to ship these logs is to write them to a local file:

<configuration>
(....)
  <timestamp key="byDay" datePattern="yyyyMMdd'T'HHmmss"/>

  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>~/logs/log-${byDay}.log</file>
    <append>true</append>
    <encoder>
      <Pattern>%d{yyyy-MM-dd_HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</Pattern>
    </encoder>
  </appender>
(....)
  <root level="debug">
    <appender-ref ref="FILE" />
  </root>
</configuration>

and then send it to Logmatic.io with an official log agent such as Rsyslog, Syslog-NG, or NXLog.
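
For instance, here is a minimal Rsyslog sketch that tails such a file and forwards it over TCP. The file path, tag, and endpoint are placeholders to adapt to your own setup:

```
# Load the file-tailing module
module(load="imfile")

# Tail the log files produced by the Logback FileAppender
input(type="imfile"
      File="/path/to/logs/log-*.log"
      Tag="myapp")

# Forward everything over TCP (target and port are placeholders)
action(type="omfwd" target="api.logmatic.io" port="10514" protocol="tcp")
```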

All the information related to the machine and the process will then come in a standardized format.

Log from java using Log4j

Logging to a remote server in JSON can be difficult with Log4j. That's why we advise you to use SLF4J, which ships with a module called log4j-over-slf4j, and then use Logback to produce the JSON format.

It allows log4j users to migrate existing applications to SLF4J without changing a single line of code but simply by replacing the log4j.jar file with log4j-over-slf4j.jar, as described below.

Step 1 - Build a bridge to logback using SLF4J

To use log4j-over-slf4j in your own application, the first step is to locate and then to replace log4j.jar with log4j-over-slf4j.jar.
Note that you still need an SLF4J binding and its dependencies for log4j-over-slf4j to work properly.

In most situations, replacing a jar file is all it takes in order to migrate from log4j to SLF4J.

<dependency>
	<groupId>org.slf4j</groupId>
	<artifactId>log4j-over-slf4j</artifactId>
	<version>1.7.13</version>
</dependency>

<dependency>
  <groupId>net.logstash.logback</groupId>
  <artifactId>logstash-logback-encoder</artifactId>
  <version>4.5.1</version>
</dependency>

<dependency>
	<groupId>ch.qos.logback</groupId>
	<artifactId>logback-classic</artifactId>
	<version>1.1.3</version>
</dependency>

Watch out

As a result of this migration, log4j configuration files will no longer be picked up, so you need to migrate your log4j.properties file to logback.xml. The log4j translator might be of help.

Once that is done, just edit your logback.xml as shown in the Logback section above, and voilà!

Using Rsyslog/Syslog-ng/Nxlog with Log4j

As with Log4j2 and Logback, one way to send these logs to Logmatic.io is to write them to a local file:

<appender name="fileAppender" class="org.apache.log4j.FileAppender">
  <param name="File" value="/logs/log4j.log" />
  <param name="Append" value="true" />
  <layout class="org.apache.log4j.PatternLayout">
    <param name="ConversionPattern" value="%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n" />
  </layout>
</appender>

and then send it to Logmatic.io with an official log agent such as Rsyslog, Syslog-NG, or NXLog.

All the information related to the machine and the process will then come in a standardized format.

Enrich your log events with valuable attributes!

Logging is great: it tells developers and administrators what is happening at specific moments in time. However, you must always remember to decorate log events with contextual attributes.

Using the Key/Value parser

The Key/Value parser extracts any <key>=<value> pattern recognized in any log event.

To enrich your log events in Java, you can re-write messages in your code and introduce <key>=<value> sequences.

For instance, if you have:

logger.info("Emitted 1001 messages during the last 93 seconds for customer scope prod30");

You can easily change it to:

logger.info("Emitted quantity=1001 messages during the last durationInMs=93180 ms for customer scope=prod30");
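
If you enrich messages like this in many places, a small helper can keep the key=value formatting consistent. This is a hypothetical sketch; the KV class and kv method are ours for illustration, not part of any library:

```java
// Hypothetical helper for consistent key=value formatting in log messages.
// The class and method names are illustrative, not a library API.
public class KV {

    /** Formats a key=value pair that a key/value parser can extract. */
    public static String kv(String key, Object value) {
        return key + "=" + value;
    }

    public static void main(String[] args) {
        // Rebuild the enriched message from the example above.
        System.out.println("Emitted " + kv("quantity", 1001)
                + " messages during the last " + kv("durationInMs", 93180)
                + " ms for customer " + kv("scope", "prod30"));
    }
}
```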

With the Key/Value parser enabled, Logmatic.io will automatically extract each pair from your final JSON document:

{
	//...
	"message" : "Emitted quantity=1001 messages during the last durationInMs=93180 ms for customer scope=prod30",
	"scope" : "prod30",
	"durationInMs" : 93180,
	"quantity" : 1001
	//...
}

So you can exploit scope as a field, and durationInMs & quantity as metrics.
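
To make the mechanism concrete, here is a simplified sketch of how such key/value extraction can work. This is not Logmatic.io's actual parser, just an illustration built on a basic regular expression:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Simplified sketch of key=value extraction -- not the real parser.
public class KeyValueExtractor {

    // Matches word=value pairs; a real parser also handles quoting,
    // separators, and type inference.
    private static final Pattern PAIR = Pattern.compile("(\\w+)=([\\w.-]+)");

    public static Map<String, String> extract(String message) {
        Map<String, String> fields = new LinkedHashMap<>();
        Matcher m = PAIR.matcher(message);
        while (m.find()) {
            fields.put(m.group(1), m.group(2));
        }
        return fields;
    }

    public static void main(String[] args) {
        System.out.println(extract(
            "Emitted quantity=1001 messages during the last durationInMs=93180 ms for customer scope=prod30"));
    }
}
```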

A fantastic way to troubleshoot faster is filtering by customer scope, algorithm, platform, or any other classifying attribute related to your application's business.

MDC (Mapped Diagnostic Context)

Another option to enrich your logs is to use Java's MDC (Mapped Diagnostic Contexts).

If you use Logback, this would give the following Java code:

...
MDC.put("scope", "prod30");
logger.info("Emitted 1001 messages during the last 93 seconds");
...
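
With a JSON encoder such as the LogstashEncoder, the MDC entry then shows up as a first-class attribute of the event, roughly like this sketch (field names vary by encoder configuration):

```json
{
  "message": "Emitted 1001 messages during the last 93 seconds",
  "level": "INFO",
  "thread_name": "main",
  "scope": "prod30"
}
```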

Only string types are admitted

MDCs are great, but only string types are allowed. Therefore, providing numerical values for metrics through MDCs would be a bad idea.

This is what it would look like in the Logmatic.io platform:

Example of Java's MDC


Highly efficient troubleshooting with Logmatic.io

Now that all your log events are sent in JSON, you will want to activate a few fields:

  • logger or logger_name : usually the name of the Java class that emitted the event
  • level : the severity level (INFO, WARNING, ERROR, etc.)
  • thread_name : the name of the thread that logged the event

Please refer to Manage Fields & Metrics if you don't yet know how to define fields.

Here are a few views you may be interested in:

Severity share of voice

A share of voice graph split by level:

Share of severity


Severity pie charts over hostnames

Pie charts showing severity across multiple hosts, so sysadmins can see the health of the running servers:

Severity ratios per hostname


At any moment, you are able to focus on a specific facet of your application by doing a search. The following example shows how we select a specific class:

Selecting a specific class


Severity levels for each class

This is probably one of the most useful and efficient ways for developers to troubleshoot. Select the Breakdown view, then choose logger_name as the first field and level as the second.

You can then observe the health of each class at any moment in time and over one or multiple hostnames:

Severity for each Java class


Since every piece of our graphs is clickable, you can easily check which log events are delivering warnings or errors, and focus on them with the contextual panel:

Using the explore view and a leaderboard to troubleshoot java errors

