Spring Boot and Graylog

Graylog is a log management platform (available in a free and an enterprise edition) for aggregating and analyzing log data. It is built on top of Elasticsearch and MongoDB (the server itself is written in Java).
It can display messages and histograms.

To send log data to a Graylog server through an HTTP appender with Log4j 2, I used the GELF layout with the following configuration:

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="debug" monitorInterval="30">
	<Properties>
		<Property name="LOG_PATTERN">
			%d{yyyy-MM-dd HH:mm:ss.SSS} %5p ${hostName}
			--- [%15.15t] %-40.40c{1.} : %m%n%ex
		</Property>
	</Properties>
	<Appenders>
		<Console name="ConsoleAppender" target="SYSTEM_OUT" follow="true">
			<PatternLayout pattern="${LOG_PATTERN}" />
		</Console>

        <!--  Appender for GRAYLOG -->     
		<Http name="Http" url="http://xxx.yyy.zz.aa/gelf" >		
			<GelfLayout host="localhostGraylog" compressionType="OFF" includeStacktrace="true">
				<KeyValuePair key="environment" value="myappGraylog" />
			</GelfLayout>
		</Http>
		
	</Appenders>

	<Loggers>
		<Logger name="com.celinio.app" level="debug"
			additivity="false">
			<AppenderRef ref="ConsoleAppender" />
 			<AppenderRef ref="Http" /> 
		</Logger>

		<!-- Spring Boot -->
		<Logger name="org.springframework.boot" level="debug"
			additivity="false">
		        <AppenderRef ref="ConsoleAppender" />
 			<AppenderRef ref="Http" /> 
		</Logger>

		<Logger name="org.hibernate" level="debug" additivity="false">
			<AppenderRef ref="ConsoleAppender" />
 			<AppenderRef ref="Http" /> 
		</Logger>

		<Root level="debug">
			<AppenderRef ref="ConsoleAppender" />
 			<AppenderRef ref="Http" /> 
		</Root>
	</Loggers>
</Configuration>
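With this configuration, anything logged through the Log4j 2 API goes to the console and, as a GELF message sent over HTTP, to the Graylog server. A minimal sketch of a class producing such messages (the class itself is only an illustration):

package com.celinio.app;

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

/**
 * Illustration only: this logger is matched by the "com.celinio.app" logger declared
 * above, so messages go to both the ConsoleAppender and the Http (GELF) appender.
 */
public class GraylogDemo {

	private static final Logger LOGGER = LogManager.getLogger(GraylogDemo.class);

	public static void main(String[] args) {
		LOGGER.info("Hello Graylog, sent through the Log4j 2 HTTP appender");
		LOGGER.debug("Debug messages are sent too, since the level is debug");
	}
}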

I had to use the latest version of Log4j 2 at the time (2.11.2) and I also had to comment out the Spring Boot starter for Log4j 2:

<properties>
        ...
    	<log4j.version>2.11.2</log4j.version>
</properties>
<dependencies>
        ...
	<!-- 
	<dependency>
		<groupId>org.springframework.boot</groupId>
		<artifactId>spring-boot-starter-log4j2</artifactId>
	</dependency>
	 -->
	<dependency>
		<groupId>org.apache.logging.log4j</groupId>
		<artifactId>log4j-api</artifactId>	
		<version>${log4j.version}</version>
	</dependency>
	<dependency>
		<groupId>org.apache.logging.log4j</groupId>
		<artifactId>log4j-core</artifactId>
		<version>${log4j.version}</version>
	</dependency>
</dependencies>

Spring Boot version: 1.5.21.RELEASE

[Spring Batch] Order of execution of the reader, the processor and the writer

It seems to me that the order of execution of the reader, the processor and the writer, inside a step and a chunk, is not always clear in everybody’s mind (including mine when I started with Spring Batch).

So I made the following 2 tables that should help – I hope – clarify things.

1) First scenario : we read, process and write all data at once. There is no commit interval defined.

Here is an excerpt of a job with one step. The reader reads the data from a datasource such as a database.


<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns:b="http://www.springframework.org/schema/batch"
	xmlns="http://www.springframework.org/schema/beans"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://www.springframework.org/schema/batch
		http://www.springframework.org/schema/batch/spring-batch-2.1.xsd
		http://www.springframework.org/schema/beans
		http://www.springframework.org/schema/beans/spring-beans-3.0.xsd">

	...

	<b:job id="doSomething" incrementer="idIncrementer">
		<b:step id="mainStep">
			<b:tasklet>
				<b:chunk reader="doSomethingReader" processor="doSomethingProcessor"
					writer="doSomethingWriter" chunk-completion-policy="defaultResultCompletionPolicy" />
			</b:tasklet>
		</b:step>
	</b:job>

</beans>

If the total number of lines (items) returned from the database is 6, then here is how Spring Batch will process each item :

Order of execution with defaultResultCompletionPolicy

Execution order | Reader   | Processor | Writer                       | Transactions
1               | 1st item |           |                              | T1
2               | 2nd item |           |                              | T1
3               | 3rd item |           |                              | T1
4               | 4th item |           |                              | T1
5               | 5th item |           |                              | T1
6               | 6th item |           |                              | T1
7               |          | 1st item  |                              | T1
8               |          | 2nd item  |                              | T1
9               |          | 3rd item  |                              | T1
10              |          | 4th item  |                              | T1
11              |          | 5th item  |                              | T1
12              |          | 6th item  |                              | T1
13              |          |           | The 6 items at the same time | T1

In that configuration, there is a single transaction. The items are all written at once.

2) Second scenario: we define a chunk size of 4 items, which means there will be a commit every 4 items.


<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns:b="http://www.springframework.org/schema/batch"
	xmlns="http://www.springframework.org/schema/beans"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://www.springframework.org/schema/batch
		http://www.springframework.org/schema/batch/spring-batch-2.1.xsd
		http://www.springframework.org/schema/beans
		http://www.springframework.org/schema/beans/spring-beans-3.0.xsd">

	...

	<b:job id="doSomething" incrementer="idIncrementer">
		<b:step id="mainStep">
			<b:tasklet>
				<b:chunk reader="doSomethingReader" processor="doSomethingProcessor"
					writer="doSomethingWriter" commit-interval="4" />
			</b:tasklet>
		</b:step>
	</b:job>

</beans>

Then chunk processing will occur in the following order (assuming there are 6 items):

Order of execution with chunk processing

Execution order | Reader   | Processor | Writer                              | Transactions
1               | 1st item |           |                                     | T1
2               | 2nd item |           |                                     | T1
3               | 3rd item |           |                                     | T1
4               | 4th item |           |                                     | T1
5               |          | 1st item  |                                     | T1
6               |          | 2nd item  |                                     | T1
7               |          | 3rd item  |                                     | T1
8               |          | 4th item  |                                     | T1
9               |          |           | The first 4 items, at the same time | T1
10              | 5th item |           |                                     | T2
11              | 6th item |           |                                     | T2
12              |          | 5th item  |                                     | T2
13              |          | 6th item  |                                     | T2
14              |          |           | The last 2 items, at the same time  | T2

There are 2 transactions, one for each chunk.
That means if there is a problem with the 6th item, the first 4 items (1st chunk) will already have been processed and committed.
A rollback will occur for all items of the 2nd chunk (items 5 and 6).
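
To observe this ordering, it is enough to log a line in each of the three components. Here is a minimal sketch of the beans referenced in the configuration above (the String item type and the hard-coded list of 6 items are only there for the example):

package com.celinio.batch;

import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;

/**
 * With commit-interval="4", the output shows 4 read() calls, then 4 process() calls,
 * then one write() call with 4 items, followed by the same pattern for the last 2 items.
 */
public class DoSomethingReader implements ItemReader<String> {

	private final Iterator<String> items =
			Arrays.asList("1st", "2nd", "3rd", "4th", "5th", "6th").iterator();

	public String read() {
		String item = items.hasNext() ? items.next() : null; // returning null ends the step
		System.out.println("read    : " + item);
		return item;
	}
}

class DoSomethingProcessor implements ItemProcessor<String, String> {

	public String process(String item) {
		System.out.println("process : " + item);
		return item;
	}
}

class DoSomethingWriter implements ItemWriter<String> {

	public void write(List<? extends String> items) {
		System.out.println("write   : " + items); // the whole chunk is written at once
	}
}

These three classes would then be declared as the doSomethingReader, doSomethingProcessor and doSomethingWriter beans.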

Read a properties file external to a webapp

Externalizing part of the configuration of a webapp in a .properties file located outside of that webapp (outside the WAR, the EAR …) is a frequent requirement. I initially thought that the PropertyPlaceholderConfigurer class, provided by the Spring Framework as an implementation of the BeanFactoryPostProcessor interface, would help me get the values of the properties defined in that external file.
The properties are pulled into the application context definition file.
The typical sample code used to achieve this is the following :


<bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">  
    <property name="locations">
      <list>
          <value>file:d:/temp/toto/myFile.properties</value>  
      </list>
     </property>
</bean>

<bean id="smtpserverurl" class="com.vendor.SomeSmtpServer">  
    <property name="URL" value="${smtpserver.url}"/>  
</bean>  

or :

<context:property-placeholder location="file:d:/temp/toto/myFile.properties" />

<bean id="smtpserverurl" class="com.vendor.SomeSmtpServer">  
    <property name="URL" value="${smtpserver.url}"/>  
</bean>  

And the properties file :


smtpserver.url=smtp.gmail.com

It is a good solution as long as you do not need dynamic reloading of the properties, i.e. picking up changed property values without restarting the application.
I found out that, in that case, the Apache Commons Configuration library is a good alternative, because it supports dynamic reloading of the properties.
And it is as easy to use as this:


package com.celinio.readprops;
import java.io.File;
import java.util.Iterator;

import org.apache.commons.configuration.ConfigurationException;
import org.apache.commons.configuration.PropertiesConfiguration;
import org.apache.commons.configuration.reloading.FileChangedReloadingStrategy;
import org.apache.log4j.Logger;

/**
 * This class reads properties that are stored in a file outside the webapp
 * @author Celinio Fernandes
 *
 */
public class ReadExternalPropertiesUtil {
		
	/**
	 * Logger for this class
	 */
	private static final Logger logger = Logger.getLogger(ReadExternalPropertiesUtil.class);

	private static final String PROPERTIES_FILE = "config.properties";
	private static final String ENV_VAR = "PROPERTIES_APP";
	public static String propertiesFolder;
	public static PropertiesConfiguration propertiesConfig  = null;
...

FileChangedReloadingStrategy strategy  = null;
strategy  = new FileChangedReloadingStrategy();
strategy.setRefreshDelay(reloadInterval);
propertiesConfig.setReloadingStrategy(strategy);
propertiesConfig.setDelimiterParsingDisabled(true);

...

The path to the folder containing the properties file is given through an environment variable (the ENV_VAR constant, PROPERTIES_APP).
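
For reference, here is a minimal, self-contained sketch of the same idea (the 30-second delay and the smtpserver.url key are only examples):

package com.celinio.readprops;

import java.io.File;

import org.apache.commons.configuration.ConfigurationException;
import org.apache.commons.configuration.PropertiesConfiguration;
import org.apache.commons.configuration.reloading.FileChangedReloadingStrategy;

/**
 * Sketch: load config.properties from the folder given by the PROPERTIES_APP
 * environment variable and reload it automatically whenever it changes on disk.
 */
public class ExternalPropertiesDemo {

	public static void main(String[] args) throws ConfigurationException {
		String folder = System.getenv("PROPERTIES_APP");
		PropertiesConfiguration config =
				new PropertiesConfiguration(new File(folder, "config.properties"));

		FileChangedReloadingStrategy strategy = new FileChangedReloadingStrategy();
		strategy.setRefreshDelay(30000); // check the file's last modification date at most every 30 s
		config.setReloadingStrategy(strategy);

		// The returned value reflects the file on disk (once the refresh delay has expired).
		System.out.println(config.getString("smtpserver.url"));
	}
}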

It works great. The only problem I met was when I tested the reload interval (the delay before the configuration file’s last modification date is checked again, to avoid hitting the disk on every property lookup): it did not behave as I expected. If I set the reload interval to 30 seconds, for instance, and modify the properties several times during that delay, the new property values are displayed right away. I would have expected to keep getting the old values until the delay period expired.
The source code for this sample webapp is available at http://code.google.com/p/celinio/source/browse/#svn%2Ftrunk%2Fpropreloadable

Securing a GWT application with AOP

AOP can be a good solution to protect the business methods in an application developed with (or without) GWT.
If we have a method called displayCustomerData(…) and we want to make sure only a certain category of users can call it, for instance the category MANAGER, then AOP can come to the rescue.

I used AspectJ and Spring AOP to implement that. The tight integration of AOP in Spring should come as no surprise, since the AspectJ project lead, Adrian Colyer, is also one of the main Spring committers.

0) First, add the needed dependencies with Maven :

<dependency>
	<groupId>org.springframework</groupId>
	<artifactId>spring-aop</artifactId>
	<version>3.0.6.RELEASE</version>
</dependency>

<!-- ASPECTJ dependencies -->
<!-- ******************** -->			
<dependency>
	<groupId>org.aspectj</groupId> 
	<artifactId>aspectjrt</artifactId>
	<version>1.6.12</version> 
</dependency>
							
<dependency>
	<groupId>org.aspectj</groupId>
	<artifactId>aspectjweaver</artifactId>
	<version>1.6.12</version>
</dependency>	

1) Then create your own annotation

package com.myproject.aop;
import java.io.Serializable;

public enum RoleEnumDto implements Serializable {
	MANAGER, SUPERVISER, BASIC_USER;
}

package com.myproject.aop;

import java.lang.annotation.ElementType;
import java.lang.annotation.Inherited;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

@Retention(RetentionPolicy.RUNTIME)
@Target({ ElementType.METHOD })
@Inherited
public @interface AuthorizedRoles {

	RoleEnumDto[] value();
}


2) Then enable the @AspectJ support and declare the aspect as a bean in the Spring configuration


<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:aop="http://www.springframework.org/schema/aop" 
    xsi:schemaLocation="http://www.springframework.org/schema/beans   
    http://www.springframework.org/schema/beans/spring-beans-3.0.xsd     
       http://www.springframework.org/schema/aop  
       http://www.springframework.org/schema/aop/spring-aop-3.0.xsd ">	
	
	<!-- Enable the @AspectJ support. -->
	<aop:aspectj-autoproxy/>

	<bean id="checkAuthorizedRoles" class="com.myproject.aop.CheckAuthorizedRoleAspect"   />

</beans>

3) Then develop the aspect, which is a class annotated with @Aspect.


package com.myproject.aop;

import java.lang.annotation.Annotation;
import java.lang.reflect.Method;
import java.util.Arrays;
import java.util.List;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpSession;

import org.apache.log4j.Logger;
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;
import org.springframework.core.annotation.Order;
import org.aspectj.lang.annotation.Pointcut;
import org.aspectj.lang.reflect.MethodSignature;
import org.springframework.web.context.request.RequestAttributes;
import org.springframework.web.context.request.RequestContextHolder;
import org.springframework.web.context.request.ServletRequestAttributes;


@Order(0)
@Aspect
public class CheckAuthorizedRoleAspect {
	
private static final Logger log = Logger.getLogger(CheckAuthorizedRoleAspect.class);
	
/**
 * The pointcut expression where to bind the advice
 */
@Pointcut("  @annotation(com.myproject.AuthorizedRoles)")
private void businessMethods() {		
	log.debug("Entering businessMethods()");		
}// the pointcut signature
	
	
 @Before("businessMethods()")
 public void checkAuthorizedAdvice(JoinPoint joinPoint, AuthorizedRoles authorizedRoles) throws    ForbiddenAccessException {
  log.debug("Entering checkAuthorizedAdvice()");
  log.debug(joinPoint.getSignature().getName());

  // Do some role checking 
  // 1. Proceed the method if the user has the required role
  if (...) {
          return pjp.proceed();
  //2. Throw an exception if not.
  else {
  throw new ForbiddenAccessException ("The user is not allowed to call this method");
  }
 }
	 		 
}

The @Order(0) annotation is useful when a join point can be intercepted by several aspects: the lower the value, the higher the precedence of the aspect.
The annotation AuthorizedRoles is passed as a parameter to the checkAuthorizedAdvice(…) method, which is the advice.
The pointcut expression matches all methods annotated with the AuthorizedRoles annotation.
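
As an illustration of that ordering, a second aspect intercepting the same join points could be given a higher order value, so that the security check always runs first. This audit aspect is not part of the project, it is only an example (it would also need to be declared as a bean, like checkAuthorizedRoles above):

package com.myproject.aop;

import org.apache.log4j.Logger;
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;
import org.springframework.core.annotation.Order;

/**
 * Illustration only: with @Order(1), this aspect runs after
 * CheckAuthorizedRoleAspect (@Order(0)) on the same join points.
 */
@Order(1)
@Aspect
public class AuditLoggingAspect {

	private static final Logger log = Logger.getLogger(AuditLoggingAspect.class);

	@Before("@annotation(com.myproject.aop.AuthorizedRoles)")
	public void logCall(JoinPoint joinPoint) {
		log.info("Calling " + joinPoint.getSignature().toShortString());
	}
}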

4) Finally, add the annotation to the business method you want to restrict access to. This business method can be called by the implementation of an RPC service, for instance:


@Override
@Transactional(rollbackFor = SomeException.class)
@AuthorizedRoles( { RoleEnumDto.MANAGER, RoleEnumDto.SUPERVISER })
public Customer displayCustomerData(long id) throws NotFoundException {
		...
}

Pretty easy and pretty powerful.