Wednesday, May 20, 2009

Testing private methods using Reflection

- What about testing private methods?
- Hmm... you can't, they're private, dummy.
- And what about using reflection to make them accessible?

Because we don't have access to private methods, we generally don't test them. This can be a weakness in your testing strategy: private methods are usually a very sensitive part of your code. I have seen a lot of developers change the visibility of their methods from private to protected just to be able to test them. That's a bad practice. Don't change the visibility of your code for testing purposes.

A solution consists of using reflection to make these private methods accessible for testing.

I really don't like using reflection in my application code, because if someone refactors the code, for example by renaming a method, and forgets to update the reflective call, you will get a nasty runtime exception. If an operation can be performed without reflection, it is preferable to avoid it.

Test code is not application code. Your tests do not go into production, which is why I'm not afraid of using reflection in my test classes.

Here's an example :

MyClass.java

public class MyClass {

    private String myPrivateMethod(Long id) {
        //Do something private
        return "SomeString_" + id;
    }
}

MyClassTest.java

import java.lang.reflect.Method;

import static org.junit.Assert.*;
import org.junit.Test;

public class MyClassTest {

    private MyClass underTest;

    @Test
    public void testMyPrivateMethod() throws Exception {

        underTest = new MyClass();

        Class[] parameterTypes = new Class[1];
        parameterTypes[0] = java.lang.Long.class;

        Method m = underTest.getClass().getDeclaredMethod("myPrivateMethod", parameterTypes);
        m.setAccessible(true);

        Object[] parameters = new Object[1];
        parameters[0] = 5569L;

        String result = (String) m.invoke(underTest, parameters);

        //Do your assertions
        assertNotNull(result);
    }
}

Sunday, May 17, 2009

Integration testing with maven 2.0

One thing to keep in mind is that unit tests are not integration tests.

The characteristics of unit tests are :
- They test your code in isolation
- They must be fast because they have to be run a lot of times

The problem with unit tests is that even if they cover a lot of your code, you can still get errors at integration time, when all the pieces are put together. That's why you must also create integration tests.

The characteristics of integration tests are:
- They test all the pieces of your code together.
- They are pretty slow, because they are often run inside a container or a Spring context and they rely on a real database or real web services.

Integration tests are slow, so they must not be run in the same phase as unit tests. Unit tests should run in the maven test phase, while integration tests should run in the maven integration-test phase. The problem is that maven doesn't help you with that out of the box.

The trick is to customize the maven surefire plugin. You have to write your integration tests in a specific package and exclude them from the execution of the maven test phase. Then, you have to bind the execution of your integration tests to the maven integration-test phase.

Here's how you do that :
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
        <excludes>
            <exclude>**/integration/*Test.java</exclude>
        </excludes>
    </configuration>
    <executions>
        <execution>
            <id>integration-tests</id>
            <phase>integration-test</phase>
            <goals>
                <goal>test</goal>
            </goals>
            <configuration>
                <skip>false</skip>
                <excludes>
                    <exclude>none</exclude>
                </excludes>
                <includes>
                    <include>**/integration/*Test.java</include>
                </includes>
            </configuration>
        </execution>
    </executions>
</plugin>
A common thing to do is to start a web server like Jetty in the pre-integration-test phase using the maven-jetty plugin.
<plugin>
    <groupId>org.mortbay.jetty</groupId>
    <artifactId>maven-jetty-plugin</artifactId>
    <version>6.1.10</version>
    <configuration>
        <contextPath>/yourContextPath</contextPath>
    </configuration>
    <executions>
        <execution>
            <id>start-jetty</id>
            <phase>pre-integration-test</phase>
            <goals>
                <goal>run-exploded</goal>
            </goals>
            <configuration>
                <scanIntervalSeconds>0</scanIntervalSeconds>
                <daemon>true</daemon>
            </configuration>
        </execution>
    </executions>
</plugin>

You can even populate your database in the pre-integration-test phase with the dbunit-maven-plugin if you need specific test data in your database.
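
For example, a dbunit-maven-plugin execution bound to pre-integration-test could look roughly like the sketch below. Treat it as a starting point only: the coordinates, the operation goal and the parameter names are from memory, and the driver, URL, credentials and dataset path are placeholders you will have to adapt.

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>dbunit-maven-plugin</artifactId>
    <configuration>
        <!-- Placeholder connection settings -->
        <driver>oracle.jdbc.driver.OracleDriver</driver>
        <url>jdbc:oracle:thin:@host:1521:dbname</url>
        <username>user</username>
        <password>password</password>
    </configuration>
    <executions>
        <execution>
            <id>populate-db</id>
            <phase>pre-integration-test</phase>
            <goals>
                <goal>operation</goal>
            </goals>
            <configuration>
                <!-- CLEAN_INSERT wipes the tables and loads the dataset before the integration tests run -->
                <type>CLEAN_INSERT</type>
                <src>src/test/resources/dataset.xml</src>
            </configuration>
        </execution>
    </executions>
</plugin>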

Maybe maven 3.0 will improve the way integration testing fits into the build lifecycle. A good solution would be something similar to the test phase, with a dedicated src/it directory where you would put all your integration tests.

Updated on July 23, 2009 :

Another solution is to use the Maven Failsafe Plugin. The Failsafe Plugin is a fork of the Surefire plugin designed to run integration tests.

By default, the Surefire plugin executes **/Test*.java, **/*Test.java, and **/*TestCase.java test classes. The Failsafe plugin will look for **/IT*.java, **/*IT.java, and **/*ITCase.java
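
For reference, wiring in the Failsafe plugin usually just means binding its goals to the right phases. Here's a minimal sketch (I've left out the version; pick the one that matches your maven setup, and adjust the includes if your integration tests don't follow the default naming):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-failsafe-plugin</artifactId>
    <executions>
        <execution>
            <goals>
                <!-- Runs the tests during integration-test and fails the build later, during verify -->
                <goal>integration-test</goal>
                <goal>verify</goal>
            </goals>
        </execution>
    </executions>
</plugin>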

Take a look at this article on Sonatype Blog.

Saturday, May 16, 2009

A simple example of the State Design Pattern

Here's the formal definition of the state design pattern :

The State Pattern allows an object to alter its behavior when its internal state changes. The object will appear to change its class.


The UML diagram :


If you don't understand this obscure definition or the UML diagram, don't worry. I made a simple example for you. The state pattern is just a clean way for an object to partially change its type at runtime.

Let's take a pizza store. A pizza store cooks pizzas, bakes them and delivers them to its clients. Our pizza is the context object that holds a state.

So here's how you would do it the old-fashioned way:


Pizza.class

public class Pizza {

    public final static int COOKED = 0;
    public final static int BAKED = 1;
    public final static int DELIVERED = 2;

    private String name;

    int state = COOKED;

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public int getState() {
        return state;
    }

    public void setState(int state) {
        this.state = state;
    }

    public void bake() throws Exception {

        if(state == COOKED) {
            System.out.print("Baking the pizza...");
            state = BAKED;
        }
        else if(state == BAKED) {
            throw new Exception("Can't bake a pizza already baked");
        }
        else if(state == DELIVERED) {
            throw new Exception("Can't bake a pizza already delivered");
        }
    }

    public void deliver() throws Exception {

        if(state == COOKED) {
            throw new Exception("Can't deliver a pizza not baked yet");
        }
        else if(state == BAKED) {
            System.out.print("Delivering the pizza...");
            state = DELIVERED;
        }
        else if(state == DELIVERED) {
            throw new Exception("Can't deliver a pizza already delivered");
        }
    }
}

The problem with this implementation is that things get messy as soon as you have a lot of states. Moreover, adding a new state is not that simple.

Let's see the refactored example using the State Design Pattern.

First, we have to write the state interface. This interface describes the different transitions.


PizzaState.class

public interface PizzaState {

    void bake() throws Exception;

    void deliver() throws Exception;
}

Then, we refactor our Pizza object with our new state interface.


Pizza.class

public class Pizza {

    PizzaState cookedState;
    PizzaState bakedState;
    PizzaState deliveredState;

    private String name;

    private PizzaState state;

    public Pizza() {
        cookedState = new CookedPizzaState(this);
        bakedState = new BakedPizzaState(this);
        deliveredState = new DeliveredPizzaState(this);

        //State initialization
        state = cookedState;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public PizzaState getState() {
        return state;
    }

    public void setState(PizzaState state) {
        this.state = state;
    }

    public void bake() throws Exception {
        this.state.bake();
    }

    public void deliver() throws Exception {
        this.state.deliver();
    }

    public PizzaState getCookedState() {
        return cookedState;
    }

    public PizzaState getBakedState() {
        return bakedState;
    }

    public PizzaState getDeliveredState() {
        return deliveredState;
    }
}

And last but not least, we write the state implementations.


CookedPizzaState.class

public class CookedPizzaState implements PizzaState {

    private Pizza pizza;

    public CookedPizzaState(Pizza pizza) {
        this.pizza = pizza;
    }

    public void bake() throws Exception {
        System.out.print("Baking the pizza...");
        pizza.setState(pizza.getBakedState());
    }

    public void deliver() throws Exception {
        throw new Exception("Can't deliver a pizza not baked yet");
    }
}

You still have to write BakedPizzaState.class and the DeliveredPizzaState.class.
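
As a sketch, the baked state mirrors the BAKED branch of the old-fashioned implementation: baking again is an error, and delivering moves the pizza to the delivered state.

BakedPizzaState.class

public class BakedPizzaState implements PizzaState {

    private Pizza pizza;

    public BakedPizzaState(Pizza pizza) {
        this.pizza = pizza;
    }

    public void bake() throws Exception {
        //Same rule as the old implementation: a baked pizza cannot be baked again
        throw new Exception("Can't bake a pizza already baked");
    }

    public void deliver() throws Exception {
        System.out.print("Delivering the pizza...");
        pizza.setState(pizza.getDeliveredState());
    }
}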

The state design pattern is one of those you need to know and master. It can help you in complex situations.
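
To see the pattern in action, here's a small driver class. It is not part of the original example, and it assumes DeliveredPizzaState rejects both transitions, like the DELIVERED branch of the old-fashioned implementation.

PizzaDemo.class

public class PizzaDemo {

    public static void main(String[] args) throws Exception {
        Pizza pizza = new Pizza();
        pizza.setName("Margherita");

        pizza.bake();    //Prints "Baking the pizza..." and switches to the baked state
        pizza.deliver(); //Prints "Delivering the pizza..." and switches to the delivered state

        try {
            pizza.deliver();
        } catch (Exception e) {
            //The delivered state rejects any further transition
            System.out.println(e.getMessage());
        }
    }
}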

Enterprise Integration Patterns with Apache Camel 2.0

Apache Camel is an open source message routing framework based on the Enterprise Integration Patterns described in the book of the same name by Gregor Hohpe and Bobby Woolf.

Apache Camel can easily work with almost any kind of transport or messaging model, such as HTTP, JMS, CXF, POP3 and many others.

Camel lets you implement the Enterprise Integration Patterns for routing and mediation rules in either a Java-based DSL (Domain Specific Language), via Spring configuration files, or via the Scala DSL.

Personally, I recommend the Java DSL, which may be a bit confusing at first sight but is very powerful in the end. The benefit of the Java DSL is that your IDE can smart-complete your code as you start typing, rather than making you mess around with buckets of XML.

Let's try an example. We have a CSV file containing candidates for a job. Each line holds one candidate's characteristics (name, age and size).

Our goal is to process this file, determine whether each candidate is suitable for the job, and then send the result as XML to a JMS queue.

First, let's create our CSV file named testfile.csv in the src/test/resources directory with the content below :

Georges,14,168
Alain,58,175
Jean,67,168

To start our Camel example, we have to create a route by extending the RouteBuilder class.

CandidateRouteBuilder.java
package com.jdechmann.proto.camel.route;

import java.util.List;

import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.dataformat.xstream.XStreamDataFormat;

import com.jdechmann.proto.vo.Candidate;
import com.thoughtworks.xstream.XStream;

public class CandidateRouteBuilder extends RouteBuilder {

    @Override
    public void configure() throws Exception {

        XStream xstream = new XStream();
        xstream.processAnnotations(Candidate.class);

        XStreamDataFormat xStreamDataFormat = new XStreamDataFormat();
        xStreamDataFormat.setXStream(xstream);

        //Intercept the exceptions
        onException(Exception.class)
            .handled(true)
            //Convert the object to XML
            .marshal(xStreamDataFormat)
            //Send the result to a JMS queue
            .to("jms:queue.candidate.rejected");

        //THE ROUTE STARTS HERE

        //Consume from CSV files
        from("file:src/test/resources/?fileName=testfile.csv")
            //Unmarshal CSV files. The resulting message contains a List<List<String>>
            .unmarshal().csv()
            //Split the message into a number of pieces
            .split(body(List.class))
            //Convert each line into a Candidate object
            .convertBodyTo(Candidate.class)
            //Process the candidates
            .process(new CandidateProcessor())
            //Convert the object to XML
            .marshal(xStreamDataFormat)
            //Send the result to a JMS queue
            .to("jms:queue.candidate.selected");
    }
}

The Spring configuration file tells Camel which package to scan for RouteBuilder classes and configures the JMS broker. You will need an ActiveMQ server running for this example to work.

applicationContext.xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:context="http://www.springframework.org/schema/context"
       xmlns:camel="http://camel.apache.org/schema/spring"
       xsi:schemaLocation="
           http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans-2.5.xsd
           http://www.springframework.org/schema/context
           http://www.springframework.org/schema/context/spring-context-2.5.xsd
           http://camel.apache.org/schema/spring
           http://camel.apache.org/schema/spring/camel-spring.xsd">

    <bean id="camelTracer" class="org.apache.camel.processor.interceptor.Tracer">
        <property name="traceOutExchanges" value="true" />
    </bean>

    <bean id="traceFormatter" class="org.apache.camel.processor.interceptor.DefaultTraceFormatter">
        <property name="showOutBody" value="true" />
        <property name="showOutBodyType" value="true" />
    </bean>

    <camelContext id="camel" xmlns="http://camel.apache.org/schema/spring">
        <package>com.jdechmann.proto.camel.route</package>
    </camelContext>

    <bean id="jms" class="org.apache.activemq.camel.component.ActiveMQComponent">
        <property name="brokerURL" value="tcp://localhost:61616"/>
    </bean>

</beans>
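
If you don't want to install a standalone broker just to try the example, one option is to start an embedded ActiveMQ broker from code before launching the route. This is only a sketch using ActiveMQ's BrokerService API; it assumes activemq-core is on your classpath.

import org.apache.activemq.broker.BrokerService;

public class EmbeddedBroker {

    public static void main(String[] args) throws Exception {
        //Non-persistent broker listening on the same URL as the one declared in applicationContext.xml
        BrokerService broker = new BrokerService();
        broker.setPersistent(false);
        broker.addConnector("tcp://localhost:61616");
        broker.start();

        System.out.println("Embedded ActiveMQ broker started on tcp://localhost:61616");
    }
}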

Then we have to create our Java business object.

Candidate.java
package com.jdechmann.proto.vo;

import com.thoughtworks.xstream.annotations.XStreamAlias;

@XStreamAlias("candidate")
public class Candidate {

    private String name;
    private int age;
    private int size;

    public String getName() {
        return name;
    }
    public void setName(String name) {
        this.name = name;
    }
    public int getAge() {
        return age;
    }
    public void setAge(int age) {
        this.age = age;
    }
    public int getSize() {
        return size;
    }
    public void setSize(int size) {
        this.size = size;
    }

    @Override
    public String toString() {
        return "Candidate - " + " Name: " + name +
            " Age: " + age + " Size: " + size;
    }
}

Here's the processor, which has to implement the Camel Processor interface.
This class throws an exception if the candidate does not fit the job.

CandidateProcessor.java
package com.jdechmann.proto.camel.route;

import org.apache.camel.Exchange;
import org.apache.camel.Processor;

import com.jdechmann.proto.vo.Candidate;

/**
 * Process a Candidate object.
 * Throw an exception if the candidate does not
 * match the criteria.
 */
public class CandidateProcessor implements Processor {

    @Override
    public void process(Exchange exchange) throws Exception {
        Candidate candidate = exchange.getIn().getBody(Candidate.class);

        if(candidate.getAge() > 60 || candidate.getSize() < 160)
            throw new Exception("Candidate refused " + candidate.toString());
        else
            System.out.println("Candidate accepted " + candidate.toString());
    }
}

In order to convert the message body into a Candidate object, you need to tell Camel where to find your converter class. Camel looks in the classpath for a file named TypeConverter in the META-INF/services/org/apache/camel/ directory. The TypeConverter file must contain the name of your converter package (com.jdechmann.proto.camel.converter in our case).
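
So in our case the TypeConverter file (typically placed under src/main/resources/META-INF/services/org/apache/camel/ in a Maven layout; the exact resource directory is an assumption about your project structure) contains a single line:

com.jdechmann.proto.camel.converter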

CandidateConverter.java
package com.jdechmann.proto.camel.converter;

import java.util.List;

import org.apache.camel.Converter;
import com.jdechmann.proto.vo.Candidate;

@Converter
public class CandidateConverter {

    @Converter
    public Candidate toCandidate(List<String> personArray) {

        Candidate candidate = new Candidate();
        candidate.setName(personArray.get(0));
        candidate.setAge(Integer.valueOf(personArray.get(1)));
        candidate.setSize(Integer.valueOf(personArray.get(2)));

        return candidate;
    }
}

Finally, here's our Main class to start our route.

Main.java
package com.jdechmann.proto;

import org.apache.camel.CamelContext;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class Main {

    public static void main(String[] args) throws Exception {
        ApplicationContext context =
            new ClassPathXmlApplicationContext("applicationContext.xml");

        //Starting the camel context
        CamelContext camel = (CamelContext) context.getBean("camel");
        camel.start();
    }
}

If you're using maven, you will need to add the following dependencies to your POM file.
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-spring</artifactId>
<version>2.0-M1</version>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-core</artifactId>
<version>2.0-M1</version>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-jms</artifactId>
<version>2.0-M1</version>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-csv</artifactId>
<version>2.0-M1</version>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-xstream</artifactId>
<version>2.0-M1</version>
</dependency>
If you want to see the route traces, add log4j in your dependencies
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.14</version>
</dependency>
And add a log4j.properties in your classpath

# Set root category priority to DEBUG and its only appender to CONSOLE.
log4j.rootCategory=DEBUG, CONSOLE

# CONSOLE is set to be a ConsoleAppender using a PatternLayout.
log4j.appender.CONSOLE=org.apache.log4j.ConsoleAppender
log4j.appender.CONSOLE.layout=org.apache.log4j.PatternLayout
log4j.appender.CONSOLE.layout.ConversionPattern=- %m%n

You can download the code of this example from this URL

Monday, May 11, 2009

Validation of nested properties with Oval 1.3x

When I used OVal for the first time, I didn't realize right away that it doesn't validate nested properties.

At the beginning of the project I was working on, I had simple objects with simple types like String or int. But my objects evolved in a more complex way, with nested objects inside them. I was very disappointed when I ran my OVal validator and didn't get the outcome I expected.

I found the solution on this blog and customized it a little bit.

The solution consists of creating an annotation and a custom validator class which wraps the OVal validator.

Here's the implementation.

First, we have to create the annotation.
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface ValidateNestedProperty {

}
The second step is to wrap the OVal validator into our own Validator class like this :
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.List;

import net.sf.oval.ConstraintViolation;

/**
 * The default OVal implementation does not validate nested properties.
 *
 * @author Julien Dechmann
 */
public class CustomOValValidator {

    private net.sf.oval.Validator validator;

    public CustomOValValidator() {
        validator = new net.sf.oval.Validator();
    }

    /**
     * Process the validation
     * @param objectToValidate
     */
    public List<ConstraintViolation> validate(Object objectToValidate) {
        return doValidate(objectToValidate, new ArrayList<ConstraintViolation>());
    }

    private List<ConstraintViolation> doValidate(Object target, List<ConstraintViolation> errors) {

        List<ConstraintViolation> violations = validator.validate(target);

        if(violations != null)
            errors.addAll(violations);

        Field[] fields = getFields(target);

        for (Field field : fields) {
            ValidateNestedProperty validate = field.getAnnotation(ValidateNestedProperty.class);

            if(validate != null) {
                if (!field.isAccessible()) {
                    field.setAccessible(true);
                }

                Object nestedProperty;

                try {
                    nestedProperty = field.get(target);
                } catch (Exception ex) {
                    throw new RuntimeException("Reflection error", ex);
                }

                if(nestedProperty != null)
                    doValidate(nestedProperty, errors);
            }
        }
        return errors;
    }

    /**
     * Return the fields declared by the target object's class
     * @param target
     * @return the declared fields
     */
    public static Field[] getFields(Object target) {
        Class<?> clazz = target.getClass();
        return clazz.getDeclaredFields();
    }
}

Let's write a simple example.

NestedBusinessObject.java :

import net.sf.oval.constraint.NotEmpty;
import net.sf.oval.constraint.NotNull;

/**
 * Nested business object
 */
public class NestedBusinessObject {

    @NotNull
    @NotEmpty
    private String property;

    public String getProperty() {
        return property;
    }
    public void setProperty(String property) {
        this.property = property;
    }
}
Here's the business object we want to validate :

BusinessObject.java
import net.sf.oval.constraint.NotEmpty;
import net.sf.oval.constraint.NotNull;

public class BusinessObject {

    @NotNull
    private int id;

    @NotNull
    @NotEmpty
    private String name;

    @ValidateNestedProperty
    private NestedBusinessObject nestedBusinessObject;

    public int getId() {
        return id;
    }
    public void setId(int id) {
        this.id = id;
    }
    public String getName() {
        return name;
    }
    public void setName(String name) {
        this.name = name;
    }
    public NestedBusinessObject getNestedBusinessObject() {
        return nestedBusinessObject;
    }
    public void setNestedBusinessObject(NestedBusinessObject nestedBusinessObject) {
        this.nestedBusinessObject = nestedBusinessObject;
    }
}

And the JUnit 4 test class :
import java.util.List;
import net.sf.oval.ConstraintViolation;

import org.junit.Before;
import org.junit.Test;

public class BusinessObjectValidationTest {

    private BusinessObject businessObject;

    @Before
    public void before() {
        businessObject = new BusinessObject();
        businessObject.setId(3455);
        businessObject.setName("My business object");

        //Since the property of the nestedBusinessObject
        //is empty, the validation of the businessObject
        //should return a ConstraintViolation
        NestedBusinessObject nestedBusinessObject = new NestedBusinessObject();
        nestedBusinessObject.setProperty("");

        businessObject.setNestedBusinessObject(nestedBusinessObject);
    }

    /**
     * Test with the OVal validator
     */
    @Test
    public void testValidationWithOvalValidator() {
        net.sf.oval.Validator validator = new net.sf.oval.Validator();
        List<ConstraintViolation> violations = validator.validate(businessObject);

        System.out.println("testValidationWithOvalValidator " +
            "- Number of errors: " + violations.size());
    }

    /**
     * Test with the custom validator
     */
    @Test
    public void testValidationWithCustomValidator() {
        CustomOValValidator validator = new CustomOValValidator();
        List<ConstraintViolation> violations = validator.validate(businessObject);

        System.out.println("testValidationWithCustomValidator " +
            "- Number of errors: " + violations.size());
    }
}

The output of the execution of this test is :
testValidationWithOvalValidator - Number of errors: 0
testValidationWithCustomValidator - Number of errors: 1
The custom validator found 1 error, whereas the plain OVal validator found none.

Sunday, May 10, 2009

Good unit testing with JMock 2

Sometimes it can be difficult to unit test objects like services that can return multiple errors. A good unit testing strategy is to test every path of your code. JMock can help you do that by mocking your dependencies and scripting what they return. You don't need to write your mock objects by hand anymore.

Here's an example.

Account.java :
package com.jmock.vo;

public class Account {

    private String id;
    private String name;
    private boolean activated;

    public String getId() {
        return id;
    }
    public void setId(String id) {
        this.id = id;
    }
    public String getName() {
        return name;
    }
    public void setName(String name) {
        this.name = name;
    }
    public boolean isActivated() {
        return activated;
    }
    public void setActivated(boolean activated) {
        this.activated = activated;
    }
}

AccountServicesImpl.java is the class under test.

AccountServicesImpl.java :
package com.jmock.services.impl;

import org.springframework.beans.factory.annotation.Autowired;

import com.jmock.dao.AccountDAO;
import com.jmock.general.ErrorConst;
import com.jmock.services.exception.AccountServicesException;
import com.jmock.vo.Account;

public class AccountServicesImpl {

    @Autowired
    private AccountDAO accountDAO; //Spring injected

    public Account getAccount(String id) throws AccountServicesException {
        Account account = accountDAO.selectAccount(id);

        if(account == null)
            throw new AccountServicesException(ErrorConst.ERROR_ACCOUNT_UNKNOWN);

        if(!account.isActivated())
            throw new AccountServicesException(ErrorConst.ERROR_ACCOUNT_INACTIVE);

        return account;
    }
}

As you can see, there are three scenarios to implement:
- The normal test case
- The account unknown test case
- The account inactive test case
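
The AccountDAO, AccountServicesException and ErrorConst classes are not shown here. If you want to run the example, minimal sketches consistent with how they are used could look like this (the error id type and values are assumptions):

AccountDAO.java :
package com.jmock.dao;

import com.jmock.vo.Account;

public interface AccountDAO {
    Account selectAccount(String id);
}

AccountServicesException.java :
package com.jmock.services.exception;

public class AccountServicesException extends Exception {

    //The error id type is an assumption; adapt it to your own error handling
    private final int errorId;

    public AccountServicesException(int errorId) {
        this.errorId = errorId;
    }

    public int getErrorId() {
        return errorId;
    }
}

ErrorConst.java :
package com.jmock.general;

public class ErrorConst {
    public static final int ERROR_ACCOUNT_UNKNOWN = 1;
    public static final int ERROR_ACCOUNT_INACTIVE = 2;
}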

And here's the test class.

AccountServicesTest.java :

import static org.junit.Assert.assertEquals;
import static org.junit.Assert.fail;

import java.lang.reflect.Field;

import org.jmock.Expectations;
import org.jmock.Mockery;
import org.jmock.integration.junit4.JMock;
import org.jmock.integration.junit4.JUnit4Mockery;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;

import com.jmock.dao.AccountDAO;
import com.jmock.general.ErrorConst;
import com.jmock.services.exception.AccountServicesException;
import com.jmock.services.impl.AccountServicesImpl;
import com.jmock.vo.Account;

@RunWith(JMock.class)
public class AccountServicesTest {

    //Mock context
    private Mockery context;

    private AccountDAO accountDAO;

    private AccountServicesImpl underTest;

    @Before
    public void before() throws Exception {
        context = new JUnit4Mockery();
        accountDAO = context.mock(AccountDAO.class);

        underTest = new AccountServicesImpl();

        //We use reflection to inject the mock into the private accountDAO field
        Field accountDAOField = underTest.getClass().getDeclaredField("accountDAO");
        accountDAOField.setAccessible(true);
        accountDAOField.set(underTest, accountDAO);
    }

    @Test
    public void testGetAccount() {
        final Account account = new Account();
        account.setId("124110002055");
        account.setActivated(true);

        context.checking(new Expectations() {{
            oneOf (accountDAO).selectAccount(with(account.getId()));
            will(returnValue(account));
        }});

        try {
            underTest.getAccount(account.getId());
        } catch (AccountServicesException ex) {
            ex.printStackTrace();
            fail("No exception expected");
        }
    }

    @Test
    public void testGetUnknownAccount() {
        final Account account = new Account();
        account.setId("124110002055");
        account.setActivated(true);

        context.checking(new Expectations() {{
            oneOf (accountDAO).selectAccount(with(account.getId()));
            will(returnValue(null));
        }});

        try {
            underTest.getAccount(account.getId());
        } catch (AccountServicesException ex) {
            assertEquals(ErrorConst.ERROR_ACCOUNT_UNKNOWN, ex.getErrorId());
        }
    }

    @Test
    public void testGetInactiveAccount() {
        final Account account = new Account();
        account.setId("124110002055");
        account.setActivated(false);

        context.checking(new Expectations() {{
            oneOf (accountDAO).selectAccount(with(account.getId()));
            will(returnValue(account));
        }});

        try {
            underTest.getAccount(account.getId());
        } catch (AccountServicesException ex) {
            assertEquals(ErrorConst.ERROR_ACCOUNT_INACTIVE, ex.getErrorId());
        }
    }
}

Here are the dependencies you need to add to your pom.xml
<dependency>
<groupId>org.jmock</groupId>
<artifactId>jmock</artifactId>
<version>2.5.1</version>
</dependency>
<dependency>
<groupId>org.jmock</groupId>
<artifactId>jmock-junit4</artifactId>
<version>2.5.1</version>
</dependency>

Friday, May 8, 2009

Using the iBATOR eclipse plugin for iBATIS code generation

This tutorial explains how to use the iBATOR eclipse plugin 1.2.0 for iBATIS.

iBATOR is a code generator for iBATIS. iBATOR introspects one or more database tables and generates the iBATIS artifacts needed to access those tables.

iBATOR will generate the following artifacts:

- SqlMap XML files (the XML files iBATIS uses to map objects to data).
- Java model classes (matching the fields of the tables).
- DAO classes (implementing the CRUD operations).

You will still need to hand code SQL and objects for custom queries, or stored procedures.

But what happens to your custom SQL queries when you re-run iBATOR? Don't worry: any hand-coded additions to generated Java classes or SqlMap files will remain undisturbed, because the Eclipse plugin merges the Java and XML files.

Let's start,

1. First of all, install the ibator plugin for eclipse using the update site http://ibatis.apache.org/tools/ibator

2. Create a simple Java project named ibator. If you have multiple projects, it's a good practice to isolate the iBATOR generation part. Since I use iBATOR in a pretty big financial project, I have multiple maven modules such as project-persistence, project-core, project-server, etc. I generate the DAO classes in the persistence project and the domain objects in the core project. If, like me, you are writing custom plugins for iBATOR, you can put these classes in the ibator project without polluting your other projects.

3. Right click on your new ibator project and if the plugin is correctly installed, you'll see the entry 'Add iBATOR to the build path'. Click on it.

4. Create the configuration file ibatorConfig.xml

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE ibatorConfiguration
    PUBLIC "-//Apache Software Foundation//DTD Apache iBATIS Ibator Configuration 1.0//EN"
    "http://ibatis.apache.org/dtd/ibator-config_1_0.dtd">

<ibatorConfiguration>
    <ibatorContext
        id="ibatorContext"
        targetRuntime="Ibatis2Java5"
        defaultModelType="flat">

        <ibatorPlugin type="org.apache.ibatis.ibator.plugins.SerializablePlugin"/>

        <jdbcConnection
            driverClass="oracle.jdbc.driver.OracleDriver"
            connectionURL="jdbc:oracle:thin:@host:1521:dbname"
            userId="user"
            password="password">
        </jdbcConnection>

        <javaModelGenerator targetPackage="com.yourcompany.vo"
            targetProject="yourEclipseProject">
            <property name="enableSubPackages" value="true" />
            <property name="trimStrings" value="true" />
        </javaModelGenerator>

        <sqlMapGenerator targetPackage="com.yourcompany.dao.ibatis.maps"
            targetProject="yourEclipseProject">
            <property name="enableSubPackages" value="true" />
        </sqlMapGenerator>

        <daoGenerator type="SPRING"
            targetPackage="com.yourcompany.dao"
            implementationPackage="com.yourcompany.dao.ibatis.impl"
            targetProject="yourEclipseProject">
            <property name="enableSubPackages" value="true" />
            <property name="methodNameCalculator" value="extended" />
        </daoGenerator>

        <table tableName="MY_TABLE" domainObjectName="YourObject" />

    </ibatorContext>
</ibatorConfiguration>

I use some special properties in this file. To fully understand the configuration file, see the online documentation.

5. Create a build.xml file at the root of your ibator project.

<project default="runIbator">

    <target name="runIbator">
        <eclipse.convertPath resourcepath="ibator/resources/ibatorConfig.xml" property="thePath"/>
        <ibator.generate configfile="${thePath}" />
    </target>

</project>

6. Create a lib directory and put your ojdbc jar in it.

7. Create an external tool to run iBATOR. Go to Run > External Tools > External Tools Configurations... and create a new Ant build configuration. In the Main tab, select your build.xml file as the buildfile. Add your ojdbc jar file in the Classpath tab. In the JRE tab, tick 'Run in the same JRE as the workspace'.

8. Run the tool.

Another way would be to use the 'Generate Ibatis artifacts' entry of the project's right-click menu instead of an external tool. The problem is that, in that case, you don't have any control over the classpath. Since I create custom plugins for iBATOR, I need to add these classes to the classpath.

To conclude, iBATOR is a great tool once you manage to make it work. It can save you a lot of time at the beginning of a project by generating all the DAOs and domain objects.