Article Series: Migrating Spring Applications to Java EE 6 – Part 3

May 10, 2012

Developer Topics, Java EE

Bert Ertman and Paul Bakker of Luminis have written a multi-part article that covers migrating Spring Applications to Java EE 6 technology. It also includes sample code and a working project. In the series Paul and Bert discuss the rationale for migrating your applications from Spring to Java EE 6 and show you real examples of upgrading the web UI, replacing the data access layer, migrating AOP to CDI interceptors, migrating JMX, and how to deal with JDBC templates; as an added bonus they will demonstrate how to perform integration tests of your Java EE 6 application using Arquillian.

Introduction

This is the third part of an article series about migrating from Spring to Java EE. The first part is a general introduction explaining reasons to migrate and an overview of the step-by-step migration approach. In the second article we explained how to migrate the web layer of a Spring application to Java EE while still using Spring components in other layers of the application. To do so we showed how to run Java EE and Spring side-by-side.

In this article we will discuss migrating the DAO layer, AOP and JMX. The article series comes with an example project hosted on GitHub. The example project is the Spring Pet Clinic application, which is migrated to Java EE step by step. The articles use code snippets from the example project, but to get a better grip on the code we advise looking at the full code.

Migrating the DAO layer

In this article we will migrate the data access layer to EJB-based JPA. In the previous article we already discussed the reasons to migrate. In the case of data access we often find applications that are using an outdated persistence solution. Before JPA became widely adopted there were several ORM solutions to choose from, such as TopLink, Kodo and JDO. This made choosing hard, and teams sometimes ended up with a myriad of different ORM implementations that all have to be maintained. Luckily these times are behind us now that JPA is widespread and accepted by almost everybody as the persistence default for the Java platform. Another problem with using one of these outdated frameworks is that they are often no longer supported and it’s hard to find developers with knowledge of them.

In the first two articles of this series we already suggested migrating the ORM solution within Spring first, before transitioning to Java EE. Spring has decent support for JPA, and the programming model is very similar to Java EE. This article will not discuss migrating old frameworks to JPA because there are just too many different scenarios to make that relevant. The example Pet Clinic application was already using JPA, so we just have to migrate it from Spring JPA to EJB. This is fairly easy because the programming models are very similar, and better yet, EJBs need even less configuration.

Oh my God, did you just say EJB? Yes, we did, and for all you EJB’ophobes out there it is about time to realize that the 2004 rhetoric about EJB being an ivory tower component model is no longer true. Sure, Entity Beans were a mess back in the day, but the current incarnation of EJBs should be considered the crown jewels of Java EE 6. They are lightweight, convention-over-configuration, POJO-based, transactional components bundling all the goodness of Java EE in just a couple of annotations. The programming model is straightforward and simple, and they need even less configuration than your typical Spring bean. EJBs are ideal for meeting non-functional requirements, and we will introduce them in the migration solution to control the data access in a transactional way.

Let’s get started on migrating more code!

Step 1 – Migrate the DAO code to EJB

Spring uses two annotations to create a transactional bean.

@Repository to make a class a bean (or use XML instead).

@Transactional to make the bean transactional.

In EJB we just have to use the @Stateless annotation. A Stateless Session Bean is transactional by default. In a Stateless Session Bean we can use @PersistenceContext to get a managed EntityManager. Spring implemented this too, so in our Pet Clinic example the code looks the same.

@Stateless
public class EntityManagerClinic implements Clinic {
  @PersistenceContext
  private EntityManager em;
   ….

In the Pet Clinic DAO there is also the use of Spring’s @Transactional annotation on methods to override the default transaction behavior. In EJB we use the @TransactionAttribute annotation for this purpose. In the example some transactions are marked as read-only; this doesn’t work in Spring when using JPA anyway, so we just remove the annotation.
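As a minimal sketch of what this looks like (illustrative only; the method and transaction attribute shown here are not taken from the Pet Clinic code), a business method that should always run in its own transaction would be annotated like this:

@Stateless
public class EntityManagerClinic implements Clinic {

  @PersistenceContext
  private EntityManager em;

  // Overrides the default REQUIRED attribute for this method only
  @TransactionAttribute(TransactionAttributeType.REQUIRES_NEW)
  public void storeVisit(Visit visit) {
    em.persist(visit);
  }
}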

The rest of the code remains unchanged!

Step 2 – Remove JPA configuration

Spring needs quite a lot of XML configuration to work with JPA, and we can remove most of it. Remember that there is still some code that uses JDBC templates in the project, so we can’t remove all data access related code just yet.

First of all we can remove the setup of the EntityManagerFactory and the transaction manager configuration. Because the JDBC templates also use transactions, however, we still need a basic transaction manager.

<!-- JPA EntityManagerFactory -->
<bean id="entityManagerFactory"
  class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean"
  p:dataSource-ref="dataSource">
    <property name="jpaVendorAdapter">
      <bean   
        class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter"
        p:database="${jpa.database}" p:showSql="${jpa.showSql}"/>
    </property>
    <property name="persistenceXmlLocation  
      value="classpath:META-INF/jpa-persistence.xml"/>
</bean>

<!-- Transaction manager for a single JPA EntityManagerFactory (alternative to JTA) -->
<bean id="transactionManager"
     class="org.springframework.orm.jpa.JpaTransactionManager"
     p:entityManagerFactory-ref="entityManagerFactory"/>
<bean id="clinic"
  class="org.springframework.samples.petclinic.dao.jpa.EntityManagerClinic"/>

Replace all of the above with this single bean definition:

<bean id="transactionManager"
class="org.springframework.jdbc.datasource.DataSourceTransactionManager" p:dataSource="dataSource"/>

This transaction manager and other data access configuration will also be removed in the future as soon as we start migrating the JDBC templates.

Step 3 – Create a data source

The Pet Clinic was using a direct connection to the database. This is ok for testing but far from ideal for production. Instead we should use a data source that is managed by the application server. Note that this is also the preferred way of working in Spring, but many applications, including the Pet Clinic example, don’t do this. Create a data source on your application server and name it petclinic.
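If you prefer to avoid server-specific configuration tooling, Java EE 6 also offers the standard @DataSourceDefinition annotation as an alternative. The sketch below is illustrative only; the class name, driver class, URL and credentials are assumptions (HSQLDB, as used by the original sample) and are not part of the migrated project:

@Singleton
@Startup
@DataSourceDefinition(
    name = "java:app/petclinic",                  // portable JNDI name; adjust references accordingly
    className = "org.hsqldb.jdbc.JDBCDataSource", // assumed HSQLDB data source class
    url = "jdbc:hsqldb:mem:petclinic",            // assumed in-memory database URL
    user = "sa",
    password = "")
public class PetClinicDataSourceConfig {
    // No code needed; the annotation registers the data source at deployment time
}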

Step 4 – Fix persistence.xml

In the Pet Clinic application the JPA configuration is partly in the standardized persistence.xml configuration file, and partly in Spring configuration. Because we removed the Spring configuration we need to add some information to the persistence.xml. The file should contain the following persistence-unit configuration.

<persistence-unit name="PetClinic">
        <provider>org.hibernate.ejb.HibernatePersistence</provider>
        <jta-data-source>petclinic</jta-data-source>

       <mapping-file>META-INF/orm.xml</mapping-file>
<exclude-unlisted-classes>true</exclude-unlisted-classes>
        <properties>
            <property name="hibernate.hbm2ddl.auto" value="create"/>
        </properties>
</persistence-unit>

Note that we are now using container-managed transactions and a data source. During development we can also use automated table creation from Hibernate (or any other provider).

Step 5 – Remove Spring bean bootstrap

In the previous article we used the Seam Spring module to bootstrap a Spring container and publish Spring beans in our Java EE 6 code using CDI. Since Clinic is no longer a Spring bean, we don’t need to do this anymore.

Remove the following code from WebApplicationContextProducer:

@Produces
@SpringBean
Clinic clinic;

Step 6 (optional) – Improve the DAO

Although we successfully migrated the Spring DAO to EJB it can still be improved, but this is fully optional. You could have done the same improvements while still using Spring.

Use the TypedQuery API instead of Query. For example, rewrite this line:

return this.em.createQuery("SELECT vet FROM Vet vet ORDER BY vet.lastName, vet.firstName").getResultList();

as follows:

return this.em.createQuery("SELECT vet FROM Vet vet ORDER BY vet.lastName, vet.firstName", Vet.class).getResultList();

Step 7 (optional) – Replace Entity XML configuration by annotations

In the Pet Clinic application entities are mapped in orm.xml. This is fine and has the benefit of keeping your “domain” classes free of technology-specific annotations, but in most cases this benefit doesn’t outweigh the ease of use of annotations.
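As an illustration of the annotation-based alternative, a mapped entity could look roughly like this (a sketch only; the actual table and column names are defined in the Pet Clinic orm.xml and may differ):

@Entity
@Table(name = "vets")
public class Vet {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Integer id;

    @Column(name = "first_name")
    private String firstName;

    @Column(name = "last_name")
    private String lastName;

    // getters and setters omitted
}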

Step 8 – Remove Spring ORM dependency

We are not using any Spring libraries for JPA any more, so we can remove the Maven dependency org.springframework.orm from the POM.

Migrating JDBC templates

Step 1 – Use JDBC Templates in Java EE

In general, JDBC Templates are a poor solution. They don’t offer enough abstraction to work with different databases because you use plain SQL in queries, and there is no real ORM mapping, which results in quite a lot of boilerplate code. There are some valid reasons to use JDBC Templates however, such as the need for database-specific query features. Such features can often be reached through JPA, or through more advanced features of the underlying provider such as Hibernate, and we would advise using those instead of JDBC Templates. If you have already invested heavily in JDBC Templates, however, it doesn’t always make sense to migrate them to a provider-specific JPA implementation. In case you don’t want to migrate your JDBC Templates you can use the following approach.

JDBC Templates do not require a Spring container. A JDBC Template can be constructed from a DataSource, which of course is a standard Java interface. In a Spring application you would use dependency injection to inject a configured DataSource and create a JDBC Template from code. We can do the same in Java EE.

public class JdbcClinicReporting implements ClinicReporting {
   @Resource(mappedName = "petclinic")
   DataSource ds;

   @PostConstruct
   public void setup() {
       simpleJdbcTemplate = new SimpleJdbcTemplate(ds);
   }

   private SimpleJdbcTemplate simpleJdbcTemplate;

   public int countVisitsAtDate(Date date) {
       return simpleJdbcTemplate.queryForInt("select count(*) from visits where
visit_date = ?", date);
   }
}

Using the @Resource annotation we can inject a DataSource in code. Next we use a @PostConstruct method to construct a SimpleJdbcTemplate when a new instance of JdbcClinicReporting is created.

Step 2 – Improving the code

The only downside of this code is that there is a hard dependency on the JNDI name of the data source. This breaks when the JNDI name of the data source is changed. The code can be improved by putting this configuration in a single place using a CDI producer and a CDI qualifier.

public class JdbcClinicReporting implements ClinicReporting {
    @Inject @PetClinicDS
    DataSource ds;

    @PostConstruct
    public void setup() {
        simpleJdbcTemplate = new SimpleJdbcTemplate(ds);
        System.out.println("creating jdbc template using provider");
    }

    private SimpleJdbcTemplate simpleJdbcTemplate;

    public int countVisitsAtDate(Date date) {
        return simpleJdbcTemplate.queryForInt("select count(*) from visits where visit_date = ?", date);
    }
}

public class DataSourceProducer {
    @Produces @PetClinicDS
    @Resource(mappedName = "petclinic")
    DataSource ds;
}

@Retention(RetentionPolicy.RUNTIME)
@Qualifier
public @interface PetClinicDS {
}

By using the producer class, the configuration of the JNDI name of the data source is now in a single place, while multiple classes can use this data source in a semi type-safe way.

Step 3 – Removing configuration

Now that we migrated the JDBC Templates from Spring to CDI the project doesn’t contain any Spring code related to data access any more. This means we can remove the Spring datasource and transaction configuration and the corresponding Maven dependencies.

Remove both applicationContext-datasource.xml and applicationContext-jpa.xml. Also remove the corresponding Spring context and Spring bean from the WebApplicationContextProducer class.

The project is a lot simpler now. While Spring needs a lot of XML configuration to set up datasources and transaction management, this is not necessary when using EJB because of its convention-over-configuration mechanism.

Step 4 – Removing dependencies

Because we are still using some Spring APIs (the JDBC Templates) we can’t completely get rid of all the data access related Maven dependencies. This is not a big problem, as long as the dependency is small and doesn’t pull in many other dependencies. Because we are not using any Spring ORM functionality any more we can replace the org.springframework.orm dependency by org.springframework.jdbc.
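Assuming the project keeps using the SpringSource EBR-style artifact names that appear elsewhere in the POM, the replacement dependency would look roughly like this (the group id and version property are assumptions, not copied from the project):

<dependency>
  <groupId>org.springframework</groupId>
  <artifactId>org.springframework.jdbc</artifactId>
  <version>${spring.version}</version>
</dependency>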

Migrating aspects

Most Spring applications use some form of AOP. Of course the Spring transaction management is based on this, but many applications also use some custom AOP code. This comes in two flavors: full-blown AspectJ and Spring AOP. AspectJ has many advanced features such as mixins and introductions. These features can be very powerful, but they are also dangerous; especially in the hands of the unskilled they can produce all sorts of weird side effects that are hard to debug. Furthermore they can easily obscure your code and become a maintenance problem. To use these features you would also need to add the AspectJ compiler to your build process. Most Spring applications don’t need those advanced features however and use Spring AOP instead. Spring AOP offers a simplified way of using AspectJ, but only on Spring beans and with more limited functionality. For most applications this is just good enough, because it gives you the power of implementing cross-cutting code without the disadvantages of using AspectJ directly. Java EE has a very similar way of implementing cross-cutting concerns with CDI interceptors. We can easily migrate all Spring AOP aspects of the application to CDI interceptors. Let’s take the UsageLogAspect as an example.

Step 1 – Refactor the Aspect code

First of all we remove the @Aspect and @Before annotations from the code and replace them by an @Interceptor on top of the class and an @AroundInvoke on top of the method. Because the way parameters are retrieved in an aspect and in an interceptor differs slightly, we have to modify the method as follows.

@AroundInvoke
    public Object logNameRequest(InvocationContext ctx) throws Exception {
        if (this.namesRequested.size() > this.historySize) {
            this.namesRequested.remove(0);
        }

        Object[] parameters = ctx.getParameters();
        if(parameters.length > 0 && parameters[0] instanceof String) {
            this.namesRequested.add((String)parameters[0]);
        }

        System.out.println("Now we have " + namesRequested.size() + " names stored");

        return ctx.proceed();
    }

As noted in the original Pet Clinic code, this is not a very useful or scalable interceptor. It’s fine as an example however, and will get you underway with migrating your own, hopefully more useful, aspects.

Step 2 – Apply the interceptor to methods

In Spring you use an AspectJ pointcut definition in the aspect to define which methods should be intercepted. In Java EE we do this in a more declarative way. This can be a disadvantage in some cases, but it increases code readability in most cases.
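For comparison, the Spring version selects the methods to intercept with an AspectJ pointcut expression along these lines (illustrative only; the exact expression in the original UsageLogAspect may differ):

@Aspect
public class UsageLogAspect {

    // Intercept any findOwners(String) call and bind its argument to 'name'
    @Before("execution(* findOwners(String)) && args(name)")
    public void logNameRequest(String name) {
        // record the requested name, as the original aspect does
    }
}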

First we create a new interceptor binding annotation.

@Retention(RetentionPolicy.RUNTIME)
@InterceptorBinding
public @interface CountUsage {}

Now we apply this annotation both to the interceptor class and to the method that should be intercepted.

UsageLogInterceptor
@Interceptor
@CountUsage
@javax.inject.Singleton
public class UsageLogInterceptor {

EntityManagerClinic
@CountUsage
public Collection findOwners(String lastName) {

Step 3 – Enable the interceptor

The last step is to enable the interceptor. Interceptors are disabled by default in Java EE; you have to enable them explicitly in the beans.xml file. This may look inconvenient, but it gives you the possibility to enable and disable interceptors depending on the deployment. For example, we could disable the interceptor during development.

beans.xml
<interceptors>
    <class>example.interceptors.UsageLogInterceptor</class>
</interceptors>
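For reference, a complete beans.xml with the CDI 1.0 schema declaration would look roughly like this (only the interceptor class name comes from the project; the surrounding <beans> element is standard boilerplate):

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://java.sun.com/xml/ns/javaee"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://java.sun.com/xml/ns/javaee
                           http://java.sun.com/xml/ns/javaee/beans_1_0.xsd">
    <interceptors>
        <class>example.interceptors.UsageLogInterceptor</class>
    </interceptors>
</beans>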

That’s it! We can do the same for the other interceptors. As you can see, interceptors in Java EE are more declarative, which improves code readability but can be more restrictive.

Migrating JMX

JMX enables managing an application using generic management tools such as JConsole. All you have to do as a developer is expose data and management methods as JMX beans. In Spring this is done with a combination of XML configuration and annotations in the beans that should be exported as JMX beans. In Java EE we can do the same with a few lines of code.

Step 1 – Writing a JMXBean qualifier

Let’s start by extending the UsageLogInterceptor so it exposes its data to JMX. Unfortunately there is no standard annotation to do so in Java EE, but we can easily add this ourselves by writing a CDI qualifier. We can use a member in the annotation to specify the name of the JMX bean; alternatively we default to the class name. If you use members in a qualifier annotation you would normally need to match the values exactly at the injection points, so you can use values as filters. You can prevent this filtering by using the @Nonbinding annotation on the member.

@Retention(RetentionPolicy.RUNTIME)
@Qualifier
public @interface JMXBean {
  @Nonbinding String objectName() default "";
}

Now add this on top of the UsageLogInterceptor. JMX also requires the class to implement an interface whose name ends with MBean.

UsageLogInterceptor

@Interceptor
@CountUsage
@Singleton
@JMXBean(objectName = "UsageLog")
public class UsageLogInterceptor implements UsageLogMBean {
...

UsageLogMBean

public interface UsageLogMBean {
    public String[] getNames();
}

Step 2 – Register JMX beans

Next we need a bootstrap mechanism to register all JMX beans with the JMX container. There are several ways to implement this, including writing a CDI extension, which would be the most reusable way. In this case we will use an EJB however, because it is the simplest way. The EJB specification allows us to create application startup hooks, and that’s what we will use to bootstrap the JMX beans.

@Startup
@Singleton
public class JmxBeanExtension  {
    @Inject @JMXBean Instance<Object> jmxBeans;
    private MBeanServer mBeanServer;

    @PostConstruct
    public void exportBeans() throws Exception {
        mBeanServer = ManagementFactory.getPlatformMBeanServer();

        for (Object bean : jmxBeans) {
            String annotationValue = bean.getClass().getAnnotation(JMXBean.class).objectName();
            ObjectName objectName;
            if (annotationValue.equals("")) {
                // Default to the class name; an ObjectName needs the domain:key=value format
                objectName = new ObjectName(bean.getClass().getPackage().getName()
                        + ":type=" + bean.getClass().getSimpleName());
            } else {
                objectName = new ObjectName(annotationValue + ":type=" + bean.getClass().getName());
            }

            mBeanServer.registerMBean(bean, objectName);
            System.out.println("Registered " + objectName);
        }
    }
}

Let’s see what’s happening here. First of all we inject all beans annotated with the @JMXBean qualifier. Because we don’t require the beans to implement a specific interface, they are injected as plain Object types. Normally CDI would not allow an injection point with multiple matching beans, but we can override this behavior by using the Instance interface.

Because the EJB is a @Startup @Singleton and the exportBeans method is annotated @PostConstruct, this method will be called by the container after deployment is completed but before the application can be accessed by any client. In the exportBeans method we obtain the platform MBeanServer using the JMX API and register each @JMXBean with it. This looks like a lot more code than its Spring counterpart, but in Spring you do more or less the same in XML.

Step 3 – Cleaning up configuration

We don’t need the Spring configuration for aspects any more, so we can simply remove the aop.xml file.

Cleaning up

There is no Spring code in our Pet Clinic any more! This means we can remove some more dependencies and the last pieces of configuration. Because we are still using JDBC Templates we can’t remove all dependencies, but that’s a choice we made. In the next article we will show an alternative way of migrating JDBC Templates.

Remove jdbc.properties, log4j.properties and the following Maven dependencies:

org.springframework.context
org.springframework.oxm
org.springframework.aspects
com.springsource.org.aspectj.weaver
com.springsource.slf4j.api
com.springsource.slf4j.org.apache.commons.logging
com.springsource.slf4j.log4j  
com.springsource.org.apache.log4j
com.springsource.org.apache.commons.dbcp
com.springsource.org.apache.commons.pool
com.springsource.org.hsqldb
com.springsource.org.jdom
com.springsource.javax.xml.bind
org.springframework.test
seam-spring-core

Unfortunately the Spring JDBC module that we need for JDBC templates is not very modular: it pulls in a lot of other dependencies that we don’t actually need. This is not a big problem, but it does make our WAR file bigger than necessary. Even so, our POM file looks a lot cleaner now!

A cleaner POM is not really a benefit for day-to-day development, but it is an important benefit when looking at the complexity of the solution as a whole. This is important when, for example, new team members join the team.

Integration tests

We did cheat a little in migrating this code. The original Spring code included integration tests based on the Spring test framework to test the DAO code. We completely ignored those and obviously they break (they don’t even compile after removing the dependencies). We will make up for this in the next article in the series by introducing the Arquillian test framework. When migrating real applications you should definitely write tests as part of the migration, not just after migration is completed. For the sake of readability of the article series we decided to discuss this step separately however.

Was it worth the trouble?

We fully migrated the Spring code to Java EE (minus the integration tests). This was quite a lot of work, so it’s very reasonable to ask whether it was all worth the trouble.

The obvious answer is that it really depends on your specific situation. Remember the scenario we introduced in the first part of the article? This was an old Spring application using outdated frameworks within Spring. Using many frameworks within Spring is very typical for an application built five or six years ago. Although the Spring framework has preserved backwards compatibility reasonably well, migrating away from frameworks used within Spring can be very painful (but necessary). If you are in this situation it’s hardly any more work to migrate away from Spring as well in the process of upgrading. This gives you the advantage of a simpler code base (including dependency management and configuration) and, most importantly, a standards-based solution.

If you are in the situation where you are on the latest and greatest Spring technology with a team that has a lot of Spring experience and knowledge, it’s a whole different question. Migration would be quite easy because the programming models are so similar these days, and it would give you the benefit of having a standards-based solution. The benefits on a technical level are very limited however, while your team needs to be trained in using Java EE. Whether this is worth the investment is a difficult question with a different answer for each situation. You should probably not do any kind of migration when there are no problems with the current solution. Keep in mind however that Spring is not a standard, and this alone is a valid reason to consider migration if the investment is reasonably small.

What’s next?

In the next article of this series we will write about integration testing, an alternative way of JDBC Template migration, and an alternative non-component based web approach.

- Bert Ertman & Paul Bakker

About the authors

Bert Ertman (@BertErtman) is a Fellow at Luminis in the Netherlands and a Sun/Oracle recognized Java Champion. Besides his day-job he is a JUG Leader for the Netherlands Java User Group (3500 members).

Paul Bakker (@pbakker) is a senior developer at Luminis Technologies in the Netherlands and a contributor on the JBoss Seam, Arquillian, and Forge projects.

Both authors have extensive experience in building enterprise applications using a variety of technologies, ranging from pre-J2EE and J2EE to Spring and modern Java EE. They have been discussing the use of new enterprise technologies regularly and have had endless discussions on Spring vs. Java EE. Currently, they believe that both new, green-field enterprise applications and large-scale maintenance migrations on legacy apps are best done using Java EE 6 technology. The authors have been evangelizing Java EE (6) technology at various conferences around the world, including J-Fall, Jfokus, Devoxx, and JavaOne. This series of articles is based upon their well-received JavaOne 2011 presentation titled “Best Practices for Migrating Spring to Java EE 6”.


Disclaimer: The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of Red Hat, Inc. Examples of analysis performed within this article are only examples. Assumptions made within the analysis are not reflective of the position of Red Hat, Inc.



7 Comments on “Article Series: Migrating Spring Applications to Java EE 6 – Part 3”

  1. phillips1021 Says:

    Very much appreciate these articles. Spring has worked very well for my development and my team has excellent Spring knowledge. I appreciated that Java EE 6 has greatly improved and is the Java standard. But your comment “The benefits on a technical level are very limited however, while your team needs to be trained in using Java EE.” hits the nail on the head.

    I’m still looking for a specific list of technical benefits that I can present to management to justify spending time and money on training our Java programmers on Java EE 6 and changing our development technologies to Java EE 6.

    The argument that Java EE 6 is easier to develop with and is the Java standard isn’t sufficient.


  2. Oli Says:

    Is there a concept for the access of multiple datasources at request time in JEE 6? What I mean is that in my scenario I only know which datasource to use when my service method is executed. A passed attribute identifies a datasource by naming convention. Because I have to do this I can not use JTA and CMT but BMT. My decision was to use JdbcTemplate but also I have to use TransactionTemplate. Is it true that for my use case JEE 6 has no solution? All example have the datasource defined in xml or annotation which too hard for my case.


  3. Jmac Says:

    Parts 1 and 2 were available in PDF. Is there a PDF version for part 3?


  4. Tan Nguyen Says:

    “the 2004 rhetorics about EJB being an ivory tower component model is no longer true”: a key issue back then was the burdensome architecture required to support EJB, I think. Can you expand on how these rhetorics have been addressed? in particular in terms of the low level protocols like IIOP, the EJB containers, etc. Any changes there? or does JEE6 just creates a new wrapper to facilitate newer programming models like injection. Thanks.

