Empower your CSS in your Maven build

July 8th, 2012 No comments

People who know me also know I'm interested in GUIs: that means I sometimes have to get my hands dirty and dig deep into stylesheets (even though I have no graphical skill whatsoever). When that happens, I always have questions about how best to factor out styles. In this regard, the various CSS versions are lacking, because they were not meant to be managed by engineers. A recent trend is to generate CSS from a source, which brings interesting properties such as nesting, variables, mixins, inheritance and others. Two examples of this trend are LESS and SASS.

A not-so-quick example can be found just below.

Given that I'm an engineer, the only requirement I have regarding those technologies is that I can generate the final CSS during my build in an automated and reproducible way. After a quick search, I became convinced that wro4j was the tool for the job. Here are the reasons why, in a simple use case.

Maven plugin

Wro4j includes a Maven plugin. In order to call it in the build, just add it to your POM:

<plugin>
	<groupId>ro.isdc.wro4j</groupId>
	<artifactId>wro4j-maven-plugin</artifactId>
	<version>1.4.6</version>
	<executions>
		<execution>
			<goals>
				<goal>run</goal>
			</goals>
			<phase>generate-resources</phase>
		</execution>
	</executions>
</plugin>

You’re allergic to Maven? No problem, a command-line tool is also provided, free of charge.

Build time generation

For obvious performance reasons, I favor build-time generation over runtime generation. But if you prefer the latter, there's a Java EE filter ready for just that.

Configurability

Since wro4j's original strategy is runtime generation, the default configuration files are meant to be located inside the archive. However, this can easily be tweaked by setting a few tags in the POM:

<configuration>
	<wroManagerFactory>ro.isdc.wro.maven.plugin.manager.factory.ConfigurableWroManagerFactory</wroManagerFactory>
	<wroFile>${basedir}/src/main/config/wro.xml</wroFile>
	<extraConfigFile>${basedir}/src/main/config/wro.properties</extraConfigFile>
</configuration>

The final blow

The real reason to praise wro4j is that LESS generation is only a small part of its features: for wro4j, it's just one processor among others. You only have to look at the long list of available processors (pre- and post-) to want to use them ASAP. For example, wro4j also wraps a JAWR processor (a bundling and minifying product I blogged about some time ago).

Once you get there, however, a whole new universe opens before your eyes, since you get processors for:

  • Decreasing JavaScript files size by minifying them with Google, YUI, JAWR, etc.
  • Decreasing CSS files size by minifying them
  • Minimizing the number of requests by merging your JavaScript files into a single file
  • Minimizing the number of requests by merging your CSS files into a single file
  • Processing LESS CSS
  • Processing SASS CSS
  • Analyzing your JavaScript with LINT

The list is virtually endless; you should really have a look. Besides, you can bridge to your favorite tool by writing your own processor if need be.

Quick how-to

In order to set up wro4j real quick, here are some ready-to-use snippets:

  • The Maven POM, as seen above
    <plugin>
    	<groupId>ro.isdc.wro4j</groupId>
    	<artifactId>wro4j-maven-plugin</artifactId>
    	<version>1.4.6</version>
    	<configuration>
    		<wroManagerFactory>ro.isdc.wro.maven.plugin.manager.factory.ConfigurableWroManagerFactory</wroManagerFactory>
    		<wroFile>${basedir}/src/main/config/wro.xml</wroFile>
    		<extraConfigFile>${basedir}/src/main/config/wro.properties</extraConfigFile>
    		<targetGroups>all</targetGroups>
    		<cssDestinationFolder>${project.build.directory}/${project.build.finalName}/style/</cssDestinationFolder>
    		<jsDestinationFolder>${project.build.directory}/${project.build.finalName}/script/</jsDestinationFolder>
    		<contextFolder>${basedir}/src/main/webapp/</contextFolder>
    		<ignoreMissingResources>false</ignoreMissingResources>
    	</configuration>
    	<executions>
    		<execution>
    			<goals>
    				<goal>run</goal>
    			</goals>
    			<phase>generate-resources</phase>
    		</execution>
    	</executions>
    </plugin>

    Note the use of the ConfigurableWroManagerFactory. It makes adding and removing processors a breeze: just update the following file.

  • The wro.properties file, which lists the processors that should be executed on the source files. Here, we generate CSS from LESS, resolve imports to obtain a single file and minify both CSS (with JAWR) and JavaScript:
    preProcessors=lessCss,cssImport
    postProcessors=cssMinJawr,jsMin
  • The wro.xml configuration file, which tells wro4j how to merge CSS and JS files. In our case, styles.css will be merged into all.css and global.js into all.js (see the HTML snippet after this list for how pages reference the result).
    <?xml version="1.0" encoding="UTF-8"?>
    <groups xmlns="http://www.isdc.ro/wro">
      <group name="all">
        <css>/style/styles.css</css>
        <js>/script/global.js</js>
      </group>
    </groups>
  • Finally, a LESS example can be found just below.
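
Once the build has run, pages simply reference the generated bundles; given the destination folders configured above, the links would look something like this (a sketch, assuming the files are served from the webapp root):

<link rel="stylesheet" type="text/css" href="style/all.css" />
<script type="text/javascript" src="script/all.js"></script>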

Conclusion

There are several readily available optimization paths for webapps. Some provide alternative ways to define your CSS, some minify your CSS, some your JS, others merge those files together, etc. It would be a shame to ignore them, because they can be a real asset in heavy-load scenarios. wro4j provides an easy way to wrap all those operations into a reproducible build.

LESS
@default-font-family: Helvetica, Arial, sans-serif;

@default-radius: 8px;

@default-color: #5B83AD;
@dark-color: @default-color - #222;
@light-color: @default-color + #222;

.bottom-left-rounded-corners (@radius: @default-radius) {
	.rounded-corners(0, 0, @radius, 0)
}

.bottom-right-rounded-corners (@radius: @default-radius) {
	.rounded-corners(0, 0, 0, @radius)
}

.bottom-rounded-corners (@radius: @default-radius) {
	.rounded-corners(0, 0, @radius, @radius)
}

.top-left-rounded-corners (@radius: @default-radius) {
	.rounded-corners(@radius, 0, 0, 0)
}

.top-right-rounded-corners (@radius: @default-radius) {
	.rounded-corners(0, @radius, 0, 0)
}

.top-rounded-corners (@radius: @default-radius) {
	.rounded-corners(@radius, @radius, 0, 0)
}

.rounded-corners(@top-left: @default-radius, @top-right: @default-radius, @bottom-left: @default-radius, @bottom-right: @default-radius) {
	-moz-border-radius: @top-left @top-right @bottom-right @bottom-left;
	-webkit-border-radius: @top-left @top-right @bottom-right @bottom-left;
	border-radius: @top-left @top-right @bottom-right @bottom-left;
}

.default-border {
	border: 1px solid @dark-color;
}

.no-bottom-border {
	border-bottom: 0;
}

.no-left-border {
	border-left: 0;
}

.no-right-border {
	border-right: 0;
}

body {
	font-family: @default-font-family;
	color: @default-color;
}

h1 {
	color: @dark-color;
}

a {
	color: @dark-color;
	text-decoration: none;
	font-weight: bold;

	&:hover {
		color: black;
	}
}

th {
	color: white;
	background-color: @light-color;
}

td, th {
	.default-border;
	.no-left-border;
	.no-bottom-border;
	padding: 5px;

	&:last-child {
		.no-right-border;
	}
}

tr:first-child {

	th:first-child {
		.top-left-rounded-corners;
	}

	th:last-child {
		.top-right-rounded-corners;
	}

	th:only-child {
		.top-rounded-corners;
	}
}

tr:last-child {

	td:first-child {
		.bottom-left-rounded-corners;
	}

	td:last-child {
		.bottom-right-rounded-corners;
	}

	td:only-child {
		.bottom-rounded-corners;
	}
}

thead tr:first-child th, thead tr:first-child th {
	border-top: 0;
}

table {
	.rounded-corners;
	.default-border;
	border-spacing: 0;
	margin: 5px;
}
Generated CSS
.default-border {
	border: 1px solid #39618b;
}

.no-bottom-border {
	border-bottom: 0;
}

.no-left-border {
	border-left: 0;
}

.no-right-border {
	border-right: 0;
}

body {
	font-family: Helvetica, Arial, sans-serif;
	color: #5b83ad;
}

h1 {
	color: #39618b;
}

a {
	color: #39618b;
	text-decoration: none;
	font-weight: bold;
}

a:hover {
	color: black;
}

th {
	color: white;
	background-color: #7da5cf;
}

td,th {
	border: 1px solid #39618b;
	border-left: 0;
	border-bottom: 0;
	padding: 5px;
}

td:last-child,th:last-child {
	border-right: 0;
}

tr:first-child th:first-child {
	-moz-border-radius: 8px 0 0 0;
	-webkit-border-radius: 8px 0 0 0;
	border-radius: 8px 0 0 0;
}

tr:first-child th:last-child {
	-moz-border-radius: 0 8px 0 0;
	-webkit-border-radius: 0 8px 0 0;
	border-radius: 0 8px 0 0;
}

tr:first-child th:only-child {
	-moz-border-radius: 8px 8px 0 0;
	-webkit-border-radius: 8px 8px 0 0;
	border-radius: 8px 8px 0 0;
}

tr:last-child td:first-child {
	-moz-border-radius: 0 0 0 8px;
	-webkit-border-radius: 0 0 0 8px;
	border-radius: 0 0 0 8px;
}

tr:last-child td:last-child {
	-moz-border-radius: 0 0 8px 0;
	-webkit-border-radius: 0 0 8px 0;
	border-radius: 0 0 8px 0;
}

tr:last-child td:only-child {
	-moz-border-radius: 0 0 8px 8px;
	-webkit-border-radius: 0 0 8px 8px;
	border-radius: 0 0 8px 8px;
}

thead tr:first-child th,thead tr:first-child th {
	border-top: 0;
}

table {
	-moz-border-radius: 8px 8px 8px 8px;
	-webkit-border-radius: 8px 8px 8px 8px;
	border-radius: 8px 8px 8px 8px;
	border: 1px solid #39618b;
	border-spacing: 0;
	margin: 5px;
}
Categories: JavaEE Tags: ,

Easier JPA with Spring Data JPA

July 1st, 2012 2 comments

Database access in Java went through several stages:

  • at first, pure JDBC
  • proprietary frameworks
  • standards such as EJB Entities and JDO
  • OpenSource frameworks such as Hibernate and EclipseLink (known as TopLink at the time)

When JPA was finally released, it seemed my wishes had come true. At last, there was a standard, coming from the trenches, to access databases in Java.

Unfortunately, JPA didn't hold its promises when compared to Hibernate: for example, you don't have Query by Example. Even worse, in its first version, JPA didn't provide simple features like Criteria, so that even simple queries had to be implemented through JPQL and thus achieved with String concatenation. IMHO, this completely defeated the purpose of ORM.

JPA2 to the rescue

At last, JPA2 supplies something usable in a real-world application. And yet, I feel there's still a lot of boilerplate code needed to write a simple CRUD DAO:

public class JpaDao<E, PK> {

    @PersistenceContext
    private EntityManager em;

    private Class<E> managedClass;

    public JpaDao(Class<E> managedClass) {

        this.managedClass = managedClass;
    }

    public void persist(E entity) {

        em.persist(entity);
    }

    public void remove(E entity) {

        em.remove(entity);
    }

    public E findById(PK id) {

        return em.find(managedClass, id);
    }
}

Some would (and do) object that in such a use case, there's no need for a DAO: the EntityManager just needs to be injected into the service class and used directly. This may be a relevant point of view, but only as long as there are no queries: as soon as you go beyond that, you need to separate data access from business logic.

Boilerplate code in JPA2

Two simple use cases highlight the useless boilerplate code in JPA 2: @NamedQuery and simple criteria queries. In the first case, you have to get a handle on the named query through the entity manager, then set potential parameters like so:

Query query = em.createNamedQuery("Employee.findHighestPaidEmployee");
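
For completeness, executing it and retrieving the result requires a cast, since the untyped Query API is used (a minimal sketch; a parameterized named query would additionally need setParameter() calls):

Employee highestPaidEmployee = (Employee) query.getSingleResult();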

In the second, you have to implement your own query with the CriteriaBuilder:

CriteriaBuilder builder = em.getCriteriaBuilder();

CriteriaQuery<Person> query = builder.createQuery(Person.class);

Root<Person> fromPerson = query.from(Person.class);

return em.createQuery(query.select(fromPerson)).getResultList();
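
And that's only a "find all". For comparison with the derived queries shown later, adding a mere WHERE clause, for example on the first name, requires yet more plumbing (a sketch, assuming the Person entity has a firstName attribute):

CriteriaBuilder builder = em.getCriteriaBuilder();

CriteriaQuery<Person> query = builder.createQuery(Person.class);

Root<Person> fromPerson = query.from(Person.class);

query.select(fromPerson).where(builder.equal(fromPerson.get("firstName"), "John"));

return em.createQuery(query).getResultList();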

IMHO, these lines of code bring nothing to the table and just clutter our own code. Luckily, some time ago, I found the Hades project, a product based on this very conclusion that writes this kind of simple code for you.

Spring Data JPA

Given the fate of some excellent open source projects, Hades fared much better, since it has been brought into the Spring ecosystem under the name Spring Data JPA. Out of the box, Spring Data JPA (SDJ) provides DAOs with advanced CRUD features. For example, the following interface can be used as-is:

public interface EmployeeRepository extends JpaRepository<Employee, Long> {}

Given some Spring magic, an implementation will be provided at runtime with the following methods:

  • void deleteAllInBatch()
  • void deleteInBatch(Iterable<Employee> entities)
  • List<Employee> findAll()
  • List<Employee> findAll(Sort sort)
  • void flush()
  • <S extends Employee> List<S> save(Iterable<S> entities)
  • Employee saveAndFlush(Employee entity)
  • Page<Employee> findAll(Pageable pageable)
  • Iterable<Employee> findAll(Sort sort)
  • long count()
  • void delete(Long id)
  • void delete(Iterable<? extends Employee> entities)
  • void delete(Employee entity)
  • void deleteAll()
  • boolean exists(Long id)
  • Iterable<Employee> findAll()
  • Iterable<Employee> findAll(Iterable<Long> ids)
  • Employee findOne(Long id)
  • <S extends Employee> Iterable<S> save(Iterable<S> entities)
  • <S extends Employee> S save(S entity)

Yes, SDJ provides you with a generic DAO, like so many frameworks around, but here the wiring into the underlying implementation is handled by the framework, free of charge. For those who don't need them all and prefer the strict minimum, you can also use the following strategy, where you pick only the methods you need from the list above (and use the @RepositoryDefinition annotation):

@RepositoryDefinition(domainClass = Employee.class, idClass = Long.class)
public interface EmployeeRepository {

    long count();

    Employee save(Employee employee);
}

It sure is nice, but the best is yet to come. Remember the two use cases above that we had to write on our own? The first is simply handled by adding a method matching the unqualified query name to the interface, like so:

@RepositoryDefinition(domainClass = Employee.class, idClass = Long.class)
public interface EmployeeRepository {

    ...

    Employee findHighestPaidEmployee();
}
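
As a reminder, this relies on a named query declared on the entity itself; a sketch of what such a declaration might look like (the JPQL, and the salary property it assumes, are illustrative):

@Entity
@NamedQuery(name = "Employee.findHighestPaidEmployee",
            query = "SELECT e FROM Employee e WHERE e.salary = (SELECT MAX(emp.salary) FROM Employee emp)")
public class Employee {

    // Properties
}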

The second use case, finding all employees, is provided by the JPA repository. But let's pretend for a second we have a WHERE clause, for example on the first name. SDJ is capable of deriving simple queries from the method name:

@RepositoryDefinition(domainClass = Employee.class, idClass = Long.class)
public interface EmployeeRepository {

    ...

    List<Employee> findByFirstName(String firstName);
}

We only had to code an interface and its methods: no implementation code nor metamodel generation was involved! Don't worry, if you need to implement some complex queries, SDJ lets you wire in your own implementation.

Conclusion

If you're already a Spring user, Spring Data JPA really (really!) is a must. If you're not, you're welcome to test it and see its added value for yourself. IMHO, SDJ is one of the reasons Java EE has not killed Spring yet: it bridged the injection part, but the boilerplate code is still around every corner.

This article is not a how-to but a teaser to get you into SDJ. You can find the sources for this article here, in Maven/Eclipse format.

To go further:

For those who aren't into JPA yet, there's Spring Data JDBC; for those who are well beyond that (think Big Data), there's Spring Data Hadoop. Check out all the Spring Data projects!

Categories: Java Tags: , ,

How to test code that uses Envers

June 24th, 2012 3 comments

Envers is a Hibernate module that can be configured to automatically audit changes made to your entities. Each audited entity is thus associated with a list of revisions, each revision capturing the state of the entity when a change occurs. There is, however, an obstacle I came across while I was "unit testing" my DAO, and that's what I want to share to keep others from falling into the same pit.

First, let's have an overview of the few steps needed to use Envers:

  • Annotate your entity with the @Audited annotation:
    @Entity
    @Audited
    public class Person {
    
        // Properties
    }
  • Register the Envers AuditEventListener in your Hibernate SessionFactory through Spring:
    <bean id="sessionFactory" class="org.springframework.orm.hibernate3.annotation.AnnotationSessionFactoryBean">
        <property name="dataSource" ref="dataSource" />
        <property name="packagesToScan" value="ch.frankel.blog.envers.entity" />
        <property name="hibernateProperties">
            <props>
                <prop key="hibernate.dialect">org.hibernate.dialect.H2Dialect</prop>
            </props>
        </property>
        <property name="schemaUpdate" value="true" />
        <property name="eventListeners">
            <map>
                <entry key="post-insert" value-ref="auditListener" />
                <entry key="post-update" value-ref="auditListener" />
                <entry key="post-delete" value-ref="auditListener" />
                <entry key="pre-collection-update" value-ref="auditListener" />
                <entry key="pre-collection-remove" value-ref="auditListener" />
                <entry key="post-collection-recreate" value-ref="auditListener" />
            </map>
        </property>
    </bean>
    
    <bean id="auditListener" class="org.hibernate.envers.event.AuditEventListener" />
  • Configure the Hibernate transaction manager as your transaction manager. Note that auditing won't be triggered if you use another transaction manager (DataSourceTransactionManager comes to mind):
    <bean id="transactionManager" class="org.springframework.orm.hibernate3.HibernateTransactionManager">
            <property name="sessionFactory" ref="sessionFactory" />
    </bean>
  • Now it's time to create your test class (a possible implementation of the getPersonHistory() method used in it is sketched right after this list):
    @ContextConfiguration("classpath:spring-persistence.xml")
    @TransactionConfiguration(defaultRollback = false)
    public class PersonDaoImplTest extends AbstractTransactionalTestNGSpringContextTests {
    
        @Autowired
        private PersonDao personDao;
    
        @BeforeMethod
        protected void setUp() {
    
            // Populate database
        }
    
        @Test
        public void personShouldBeAudited() {
    
            Person person = personDao.get(1L);
    
            person.setFirstName("Jane");
    
            List<Person> history = personDao.getPersonHistory(1L);
    
            assertNotNull(history);
            assertFalse(history.isEmpty());
            assertEquals(history.size(), 1);
        }
    }
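
The getPersonHistory() method called in the test is not provided by Envers out of the box; a possible DAO implementation based on Envers' AuditReader might look like this (a sketch, assuming the DAO has access to the Hibernate SessionFactory):

public List<Person> getPersonHistory(Long id) {

    AuditReader reader = AuditReaderFactory.get(sessionFactory.getCurrentSession());

    List<Person> history = new ArrayList<Person>();

    // Replay every revision of the entity, oldest first
    for (Number revision : reader.getRevisions(Person.class, id)) {
        history.add(reader.find(Person.class, id, revision));
    }

    return history;
}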

Strangely, when you execute the previous test class, the test method fails when checking that the list is not empty: it is empty, meaning there's no revision associated with the entity. Moreover, nothing shows up in the log. However, the revision does show up in the audit table at the end of the test (provided you didn't clear the table after its execution).

Comes the dreaded question: why? Well, it seems Hibernate post-event listeners are only called when the transaction is committed. In our case, that fits: the transaction is committed by Spring after method completion, while our test tries to assert inside the method.

In order for our test to pass, we have to manually manage a transaction inside our method to commit the update to the database.

@Test
public void personShouldBeAuditedWhenUpdatedWithManualTransaction() {

    PlatformTransactionManager txMgr = applicationContext.getBean(PlatformTransactionManager.class);

    // A new transaction is required, the wrapping transaction is for Envers
    TransactionStatus status = txMgr.getTransaction(new DefaultTransactionDefinition(PROPAGATION_REQUIRES_NEW));

    Person person = personDao.get(1L);

    person.setFirstName("Jane");

    txMgr.commit(status);

    List<Person> history = personDao.getPersonHistory(1L);

    assertNotNull(history);
    assertFalse(history.isEmpty());
    assertEquals(history.size(), 1);
}

On one hand, the test passes and the log shows the SQL commands accordingly. On the other hand, the cost is the additional boilerplate code needed to make it pass.

Of course, one could (should?) question the need to test the feature in the first place. Since it's functionality brought by a library, the reasoning could be that if you don't trust the library, you shouldn't use it at all. In my case, it was the first time I used Envers, so there's no denying I had to build trust between me and the library. Yet, even with trusted libraries, I do test specific cases: for example, when using Hibernate, I create test classes to verify that complex queries get me the right results. As such, auditing qualifies as a complex use case whose misbehaviors I want to be aware of as soon as possible.

You’ll find the sources for this article here, in Maven/Eclipse format.

Categories: Java Tags: , ,

Vagrant your Drupal

June 17th, 2012 No comments

In one of my recent posts, I described how I used VMware to create a Drupal instance I could play with before deploying updates to morevaadin.com. Then, at Devoxx France, I attended a session where the speaker detailed how he set up a whole infrastructure for after-work training sessions with Vagrant.

Meanwhile, a little turn of fate put me in charge of some Drupal projects and I had to get better at it… fast. I got my hands on the Definitive Guide to Drupal 7, which talks about using Drupal with Vagrant. That was one sign too many: I decided to take this opportunity to manage my own Drupal infrastructure automatically. These are the steps I followed and the lessons I learned. Note my host OS is Windows 7 :-)

Download VirtualBox

Oracle's VirtualBox is the virtualization product used by Vagrant. Go to their download page and take your pick.

Download Vagrant

Vagrant's download page is here. Once it's installed on your system, you should put the bin directory on your PATH.

Now, get the Ubuntu Lucid Lynx box ready with:

vagrant box add lucid32 http://files.vagrantup.com/lucid32.box

This downloads the box to your %USER_HOME%/.vagrant.d/boxes directory, under the lucid32 folder (at least on Windows).

Get Drupal Vagrant project

Download the current version of Drupal Vagrant and extract it in the directory of your choice. Edit the Vagrantfile according to the following guidelines:

config.vm.box = "lucid32" # reference the box added above
...
config.vm.network :hostonly, "33.33.33.10" # create a machine with this IP

Then, boot the virtual machine with vagrant up and let Vagrant take care of everything (booting the VM, getting the necessary applications, configuring them all, etc.).
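
From the directory containing the Vagrantfile, the day-to-day cycle then boils down to a few commands (a quick sketch):

vagrant up      # boot and provision the VM
vagrant ssh     # log into the guest (see below for Windows)
vagrant halt    # shut the VM down when you're done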

Update your hosts file to have the two following domains point to 33.33.33.10:

33.33.33.10	drupal.vbox.local
33.33.33.10	dev-site.vbox.local

At the end of the process (it could be a lengthy one), browsing from your host system to http://drupal.vbox.local/install.php should get you the familiar Drupal installation screen. You're on!

SSH into the virtual box

Now it's time to get into the guest system with vagrant ssh.

If you're on Windows, here comes the hard part. Since there's no SSH utility out of the box, you have to get one. Personally, I used PuTTY. Note that the VM uses an SSH key for authentication and, unfortunately, the format of Vagrant's provided key is not compatible with PuTTY, so we have to use PuTTYgen to convert %USER_HOME%/.vagrant.d/insecure_private_key into a format PuTTY is able to use. When done, connect to the system with PuTTY (finally).

Conclusion

All in all, this approach works all right, although Drush is present in /usr/share/drush but doesn't seem to work (Git is installed and works fine).

Note: I recently stumbled upon this other Drupal cookbook, but it cannot be used as-is. Better DevOps people than me can probably fix it.

Categories: Development Tags: , ,

Ways of comparing Date objects in Java

June 3rd, 2012 3 comments

Let's face it, comparing Date objects in Java is complex as well as error-prone. This is the case in production code, but also in test code, where we regularly need to create Date objects that point at a specific instant in time, to serve as a reference in comparisons.

The good ol’ deprecated way

In test code, I’ve no qualms about using deprecated methods. So, I used the old Date constructor to initialize dates:

Date date = new Date(112, 5, 3);

Pro: it's concise. Con: it really isn't intuitive, and you need a good knowledge of the Java API to know that the first parameter is the year minus 1900 and the second is the zero-based month index (0 for January). It's almost a surprise to learn the last parameter is simply… the day of the month.

The canonical way

Calendar was introduced in Java 1.1 to dissociate an instant in time (the date) from its view in a specific referential (the calendar). The following snippet is a naive way to obtain the same result as above.

Calendar calendar = Calendar.getInstance();

calendar.set(YEAR, 2012);
calendar.set(MONTH, JUNE);
calendar.set(DAY_OF_MONTH, 3);

Not only is it more verbose, it's also a mistake: hours, minutes and the rest are not 0 (they depend on the exact creation time of the calendar), so using equals() here will return false. Here's the correct code:

Calendar calendar = Calendar.getInstance();

calendar.set(YEAR, 2012);
calendar.set(MONTH, JUNE);
calendar.set(DAY_OF_MONTH, 3);
calendar.set(HOUR_OF_DAY, 0);
calendar.set(MINUTE, 0);
calendar.set(SECOND, 0);
calendar.set(MILLISECOND, 0);

It defeats the purpose of brevity, to say the least ;-)

Apache Commons Lang

Apache Commons has been providing various utility libraries that help develop in Java for ages. One such library is Apache Commons Lang, which aims at providing code that deserves to be part of the Java API.

In our case, the DateUtils class lets us shorten the previous code while keeping its readability:

Calendar calendar = Calendar.getInstance();

calendar.set(YEAR, 2012);
calendar.set(MONTH, JUNE);
calendar.set(DAY_OF_MONTH, 3);
calendar = DateUtils.truncate(calendar, DAY_OF_MONTH);

Even better, DateUtils lets us work directly on Date objects, making the following alternative also possible:

Date date = new Date();

date = DateUtils.setYears(date, 2012);
date = DateUtils.setMonths(date, JUNE);
date = DateUtils.setDays(date, 3);
date = DateUtils.truncate(date, DAY_OF_MONTH);

Note that it leaves its parameters untouched, enforcing the immutability dear to Functional Programming proponents. Pro: we keep working with the standard Java API. Con: none. And yet, wouldn't a full-fledged DSL feel somewhat more right?
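
As a side note, when the goal is only to compare two dates at a given precision, DateUtils can also spare us the truncated reference date altogether; a small sketch (truncatedEquals() is only available in recent Commons Lang versions):

Date date = new Date(112, 5, 3);
Date now = new Date();

// true if both dates fall on the same calendar day, whatever their time parts
boolean sameDay = DateUtils.isSameDay(date, now);

// the same check, at an arbitrary precision
boolean sameDayToo = DateUtils.truncatedEquals(date, now, DAY_OF_MONTH);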

Joda Time

The final option is to use Joda Time, which aims at being a replacement for Date and Calendar. It also spawned JSR-310, "a new and improved date and time API for Java", which should be part of Java 8 (it was originally scheduled for Java 7). A whole article (or even a mini-guide) could be dedicated to Joda Time. For our present concern, the following snippet can advantageously replace our original one:

DateMidnight dm = new DateMidnight(2012, 6, 3);

Back to square one, it seems: clear and concise. And yet, the parameters are self-explanatory: no real need to regularly check the JavaDoc to see how to initialize the year. Besides, the semantics of the class name are clear. Finally, the toDate() method lets us bridge to the standard Java API.
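
For instance, bridging back to the legacy API and comparing on the Joda Time side might look like this (a quick sketch):

DateMidnight dm = new DateMidnight(2012, 6, 3);

// bridge back to java.util.Date when an existing interface requires it
Date date = dm.toDate();

// comparisons stay readable on the Joda Time side
boolean isBefore = dm.isBefore(new DateMidnight(2012, 7, 14));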

Conclusion

The conclusion is your own to draw. As for myself, I regularly use Apache Commons Lang, but I'm leaning toward Joda Time these days.

The code is available here as an Eclipse/Maven project archive.

Note: if you need to work with business days, I've recently been made aware of ObjectLab Kit. I haven't used it yet, and feedback is welcome.

Categories: Java Tags:

Database unit testing with DBUnit, Spring and TestNG

June 3rd, 2012 4 comments

I really like Spring, so I tend to use its features to the fullest. However, in some dark corners of its philosophy, I tend to disagree with some of its assumptions. One such assumption is the way database testing should work. In this article, I will explain how to configure your projects to make Spring Test and DBUnit play nice together in a multi-developer environment.

Context

My basic need is to be able to test some complex queries: before integration tests, I have to validate that those queries get me the right results. These are not unit tests per se, but let's treat them as such. To achieve this, I have been using a framework named DBUnit for a while. Although it hasn't been maintained since late 2010, I haven't found a replacement yet (be my guest for proposals).

I also have some constraints:

  • I want to use TestNG for all my test classes, so that new developers don't have to think about which test framework to use
  • I want to be able to use Spring Test, so that I can inject my test dependencies directly into the test class
  • I want to be able to see for myself the database state at the end of any of my test, so that if something goes wrong, I can execute my own queries to discover why
  • I want every developer to have their own isolated database instance/schema

Considering the last point, our organization lets each developer have a dedicated Oracle schema for those "unit tests".

Basic set up

Spring provides the AbstractTestNGSpringContextTests class out of the box. In turn, this means we can apply TestNG annotations as well as @Autowired on child classes. It also means we have access to the underlying applicationContext, but I prefer not to use it (and don't need to in any case).

The structure of such a test would look like this:

@ContextConfiguration(locations = "classpath:persistence-beans.xml")
public class MyDaoTest extends AbstractTestNGSpringContextTests {

    @Autowired
    private MyDao myDao;

    @Test
    public void whenXYZThenTUV() {
        ...
    }
}

Readers familiar with Spring and TestNG shouldn’t be surprised here.

Bringing in DBUnit

DbUnit is a JUnit extension targeted at database-driven projects that, among other things, puts your database into a known state between test runs. [...] DbUnit has the ability to export and import your database data to and from XML datasets. Since version 2.0, DbUnit can also work with very large datasets when used in streaming mode. DbUnit can also help you to verify that your database data match an expected set of values.

DBUnit being a JUnit extension, you're expected to extend the provided parent class org.dbunit.DBTestCase. In my context, I had to redefine some setup and teardown operations to fit into the Spring inheritance hierarchy. Luckily, the DBUnit developers thought about that and offer relevant documentation.

Among the different strategies available, my taste tends toward the CLEAN_INSERT and NONE operations on setup and teardown respectively. This way, I can check the database state directly if my test fails. This updates my test class like so:

@ContextConfiguration(locations = {"classpath:persistence-beans.xml", "classpath:test-beans.xml"})
public class MyDaoTest extends AbstractTestNGSpringContextTests {

    @Autowired
    private MyDao myDao;

    @Autowired
    private IDatabaseTester databaseTester;

    @BeforeMethod
    protected void setUp() throws Exception {

        // Get the XML and set it on the databaseTester
        // Optional: get the DTD and set it on the databaseTester

        databaseTester.setSetUpOperation(DatabaseOperation.CLEAN_INSERT);
        databaseTester.setTearDownOperation(DatabaseOperation.NONE);
        databaseTester.onSetup();
    }

    @Test
    public void whenXYZThenTUV() {
        ...
    }
}
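
The two comments in setUp() stand for the actual dataset loading; one possible way to do it (a sketch, assuming a flat XML dataset named dataset.xml at the root of the test classpath):

IDataSet dataSet = new FlatXmlDataSetBuilder().build(getClass().getResourceAsStream("/dataset.xml"));

databaseTester.setDataSet(dataSet);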

Per-user configuration with Spring

Of course, we need a specific Spring configuration file to inject the databaseTester. As an example, here is one:

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.springframework.org/schema/beans
                        http://www.springframework.org/schema/beans/spring-beans-3.0.xsd">

        <bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
            <property name="location" value="${user.name}.database.properties" />
        </bean>

        <bean name="dataSource" class="org.springframework.jdbc.datasource.SingleConnectionDataSource">
             <property name="driverClass" value="oracle.jdbc.driver" />
             <property name="username" value="${db.username}" />
             <property name="password" value="${db.password}" />
             <property name="url" value="jdbc:oracle:thin:@<server>:<port>/${db.schema}" />
        </bean>

        <bean name="databaseTester" class="org.dbunit.DataSourceDatabaseTester">
            <constructor-arg ref="dataSource" />
        </bean>
</beans>

However, there's more to it than meets the eye. Notice the databaseTester has to be fed a datasource. Since a requirement is to have one database per developer, there are basically two options: either use an in-memory database, or use the same database as in production and provide one such database schema per developer. I tend toward the latter solution (when possible), since it decreases the differences between the testing environment and the production environment.

Thus, in order for each developer to use their own schema, I use Spring's ability to resolve Java system properties at runtime: each developer is characterized by a different user.name. Then, I configure a PropertyPlaceholderConfigurer that looks for a ${user.name}.database.properties file, which will look like this:

db.username=myusername1
db.password=mypassword1
db.schema=myschema1

This lets me achieve my goal of each developer using their own Oracle schema. If you want to use this strategy, do not forget to provide a specific database.properties file for the Continuous Integration server.

Uh oh?

Finally, the whole testing chain is configured all the way to the database tier. Yet, when the previous test is run, everything is fine (or not), but when checking the database, it looks untouched. Strangely enough, if you load some XML dataset and assert against it during the test, it does behave accordingly: this bears all the symptoms of a transaction issue. In fact, when you look closely at Spring's documentation, everything becomes clear: Spring's vision is that the database should be left untouched by running tests, in complete contradiction to DBUnit's. This is achieved by simply rolling back all changes at the end of the test by default.

In order to change this behavior, the only thing to do is to annotate the test class with @TransactionConfiguration(defaultRollback = false). Note this doesn't prevent us from specifying, on a case-by-case basis, methods that shouldn't affect the database state, with the @Rollback annotation.

The test class becomes:

@ContextConfiguration(locations = {"classpath:persistence-beans.xml", "classpath:test-beans.xml"})
@TransactionConfiguration(defaultRollback=false)
public class MyDaoTest extends AbstractTestNGSpringContextTests {

    @Autowired
    private MyDao myDao;

    @Autowired
    private IDatabaseTester databaseTester;

    @BeforeMethod
    protected void setUp() throws Exception {

        // Get the XML and set it on the databaseTester
        // Optional: get the DTD and set it on the databaseTester

        databaseTester.setSetUpOperation(DatabaseOperation.CLEAN_INSERT);
        databaseTester.setTearDownOperation(DatabaseOperation.NONE);
        databaseTester.onSetup();
    }

    @Test
    public void whenXYZThenTUV() {
        ...
    }
}

Conclusion

Though Spring's and DBUnit's views on database testing are opposed, Spring's configuration versatility lets us make it fit our needs (and benefit from DI). Of course, other improvements are possible: pushing common code up into a parent test class, etc.

Categories: Java Tags: , , ,

Arquillian on legacy servers

May 27th, 2012 No comments

In most contexts, when something doesn't work, you just Google the error and you're basically done. One good thing about working for organizations that lag behind technology-wise is that it's generally more challenging and you're bound to be creative. Me, I'm stuck on JBoss 5.1 EAP, but that doesn't stop me from trying to use modern approaches to software engineering. In the quality domain, one such attempt is to provide my developers with a way to test their code in-container. Since we are newcomers to the EJB3 realm, that means they will need true integration testing.

Given the stacks available at the time of this writing, I've decided on Arquillian, which seems to be the best (and only?) tool to achieve this. With this framework (as well as my faithful TestNG), I was set to test Java EE components on JBoss 5.1 EAP. This article describes how to do just that (as well as the pitfalls I had to overcome).

Arquillian basics

Arquillian basically provides a way for developers to manage the lifecycle of an application server and deploy a Java EE artifact to it, in an automated way and integrated with your favorite testing engine (read TestNG – or JUnit if you must). Arquillian's architecture is based on a generic engine plus adapters for specific application servers. If no adapter is available for your application server, that's tough luck. If you're feeling playful, try searching for a WebSphere adapter… then develop one. The first difficulty was that at the time of my work, there was no JBoss EAP 5.1 adapter, but I read on the web that it sits somewhere between a 5.0 GA and a 6.0 GA: a lengthy trial-and-error process took place (now there's at least some documentation, thanks to the Arquillian guys hearing my plea on Twitter – thanks!).

Server type and version are not enough to pick the right adapter; you'll also need to choose how you will interact with the application server:

  • Embedded: you download all dependencies, provide a configuration and presto, Arquillian magically creates a running embedded application server. Pro: you don't have to install the app server on each of your developers' computers; cons: configuration may well be a nightmare, and there may be huge differences with the final platform, defeating the purpose of integration testing.
  • Remote: use an existing and running application server. Choose wisely between a dedicated server for all devs (who will share the same configuration, but also the same resources) and a single app server per dev (who said maintenance nightmare?).
  • Managed: same as remote, but tests will start and stop the server. Better suited for one app server per dev.
  • Local: I’ve found traces of this one, even though it seems to be buried deep. This seems to be the same as managed, but it uses the filesystem instead of the wire to do its job (starting/stopping by calling scripts, deploying by copying archives to the expected location, …), thus the term local.

Each adapter can be configured according to your needs through the arquillian.xml file. For Maven users, it goes at the root of src/test/resources. Don't worry, each adapter's configuration parameters are aptly documented. Mine looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<arquillian xmlns="http://jboss.org/schema/arquillian"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://jboss.org/schema/arquillian http://jboss.org/schema/arquillian/arquillian_1_0.xsd">
    <container qualifier="jboss" default="true">
        <configuration>
            <property name="javaHome">${jdk.home}</property>
            <property name="jbossHome">${jboss.home}</property>
            <property name="httpPort">8080</property>
            <property name="rmiPort">1099</property>
            <property name="javaVmArguments">-Xmx512m -Xmx768m -XX:MaxPermSize=256m
                -Djava.net.preferIPv4Stack=true
                -Djava.util.logging.manager=org.jboss.logmanager.LogManager
                -Djava.endorsed.dirs=${jboss.home}/lib/endorsed
                -Djboss.boot.server.log.dir=${jboss.home}
                -Dsun.rmi.dgc.client.gcInterval=3600000 -Dsun.rmi.dgc.server.gcInterval=3600000
            </property>
        </configuration>
    </container>
</arquillian>

Finally, the test class itself has to extend org.jboss.arquillian.testng.Arquillian (or if you stick with JUnit, use @RunWith). The following snippet shows an example of a TestNG test class using Arquillian:

public class WebIT extends Arquillian {

    @Test
    public void testMethod() { ... }

    @Deployment
    public static WebArchive createDeployment() { ... }
}

Creating the archive

Arquillian's tactic regarding in-container testing is to deploy only what you need. As such, it comes with a tool named ShrinkWrap that lets you package only the relevant parts of what is to be tested.

For example, the following snippet creates an archive named web-test.war, bundling the ListPersonServlet class and the web deployment descriptor. More advanced uses would include libraries.

WebArchive archive = ShrinkWrap.create(WebArchive.class, "web-test.war")
    .addClass(ListPersonServlet.class)
    .setWebXML(new File("src/main/webapp/WEB-INF/web.xml"));

The Arquillian framework will look for a static method annotated with @Deployment that returns such an archive, in order to deploy it to the application server (through the adapter).
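
Put together with the test class skeleton above, the deployment method might look like this (a straight combination of the two previous snippets):

@Deployment
public static WebArchive createDeployment() {

    return ShrinkWrap.create(WebArchive.class, "web-test.war")
        .addClass(ListPersonServlet.class)
        .setWebXML(new File("src/main/webapp/WEB-INF/web.xml"));
}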

Tip: use the toString() method on the archive to see what it contains if you have errors.

Icing on the Maven cake

Even if you're no Maven fan, I think this point deserves some attention. IMHO, integration tests shouldn't be part of the regular build process, since they are by nature more fragile: a simple configuration error on the application server and your build fails without a real cause.

Thus, I would recommend putting your tests in a separate module that is not called by its parent.

Besides, I also use the Maven Failsafe plugin instead of the Surefire one, so that I get a clean separation of unit tests and integration tests, even across separate modules, and thus different metrics for both (in Sonar, for example). In order to get that separation out of the box, just make sure your test class names end with IT (as in Integration Test). Note that integration tests are bound to the integration-test lifecycle phase, which runs just before verify and install.

Here comes JBoss EAP

Now comes the hard part: actual deployment on JBoss 5.1 EAP. This stage will probably last for a long, long time and involve a lot of going back and forth. In order to be as productive as possible, the first thing to do is to locate the logs. If you kept the above configuration, they are in the JBoss <profile>/log directory. If you feel the logs are too verbose for your taste (I did), just change the root priority from ${jboss.server.log.threshold} to INFO in the <profile>/conf/jboss-log4j.xml file.

On my machine, I kept getting JVM bind port errors. Thus, I changed the guilty connector port in the server.xml file (FYI, it was the AJP connector and JBoss stopped complaining when I changed 8009 to 8010).

One last step for EAP is to disable security. Since it’s enterprise oriented, security is enabled by default in EAP: in testing contexts, I couldn’t care less about authenticating to deploy my artifacts. Open the <profile>/deploy/profileservice-jboss-beans.xml configuration file and search for “comment this list to disable auth checks for the profileservice”. Do as you’re told :-) Alternatively, you could also walk the hard path and configure authentication: detailed instructions are available on the adapter page.

Getting things done, on your own

Until this point, we more or less followed instructions found here and there. Now we have to get our hands dirty and use some of our gray matter.

  • The first thing to address is a strange java.lang.IllegalStateException when launching a test. Strangely enough, this is caused by Arquillian missing some libraries that have to be ShrinkWrapped along with your real code. In my case, I had to add the following snippet to my web archive:
    MavenDependencyResolver resolver = DependencyResolvers.use(MavenDependencyResolver.class);
    archive.addAsLibraries(
        resolver.artifact("org.jboss.arquillian.testng:arquillian-testng-container:1.0.0.Final")
                .artifact("org.jboss.arquillian.testng:arquillian-testng-core:1.0.0.Final")
                .artifact("org.testng:testng:6.5.2").resolveAsFiles());
  • The next error is much more vicious and comes from Arquillian's inner workings.
    java.util.NoSuchElementException
        at java.util.LinkedHashMap$LinkedHashIterator.nextEntry(LinkedHashMap.java:375)
        at java.util.LinkedHashMap$KeyIterator.next(LinkedHashMap.java:384)
        at org.jboss.arquillian.container.test.spi.util.TestRunners.getTestRunner(TestRunners.java:60)

    When you look at the code, you see Arquillian uses the Service Provider feature (for more info, see here). But much to my chagrin, it doesn't configure which implementation the org.jboss.arquillian.container.test.spi.TestRunner service should use and thus fails miserably. We have to create such a configuration manually: a META-INF/services/org.jboss.arquillian.container.test.spi.TestRunner file whose only content is org.jboss.arquillian.testng.container.TestNGTestRunner (for such is the power of the Service Provider). Don't forget to package it along with the archive to have any chance at success:

    archive.addAsResource(new File("src/test/resources/META-INF/services"), "META-INF/services");

Update [28th May]: the two points above can be abandoned if you use the correct Maven dependency (the Arquillian BOM). Check the POMs in the attached project.

In the end, everything should work fine except for a final log message in the test:

Shutting down server: default
Writing server thread dump to /path/to/jboss-as/server/default/log/threadDump.log

This means Arquillian cannot shut down the server because it can't authenticate. This would have no consequence whatsoever, but it marks the test as failed and thus needs to be corrected. Edit the <profile>/conf/props/jmx-console-users.properties file and uncomment the admin = admin line.

Conclusion

The previous steps took me about half a week's work spread over a week (it seems I'm more productive when not working full-time, as my brain launches some background threads to solve problems). This was not easy, but I'm proud to have beaten the damn thing. Note that a couple of proprietary configuration settings were omitted from this article. In conclusion, Arquillian seems to be a nice in-container testing framework, but it still needs some polish around the corners: I think using TestNG may be the culprit here.

You can find the sources for this article here, in Eclipse/Maven format.

EJB3 façade over Spring services

May 20th, 2012 No comments

As a consultant, you seldom get to voice your opinions regarding the technologies used by your customers, and it's even more extraordinary when you're heard. My current context belongs to the usual case: I'm stuck with Java 6 running on JBoss 5.1 EAP with no chance of moving forward in the near future (and I consider myself happy, since a year and a half ago it was Java 5 with JOnAS 4). Sometimes others wonder if I'm working in a museum, but I see myself more as an archaeologist than a curator.

Context

I recently got my hands on an application that had to be migrated from a proprietary framework to more perennial technologies. The application consists of a web front office and a Swing back office. The key difficulty was to make the Swing part communicate with the server part, since both live in two different network zones separated by a firewall (with some open ports for RMI and HTTP). Moreover, our Security team requires that such communications be secured.

The hard choice

The following factors played a part in my architecture choice:

  • My skills: I have much more experience in Spring than in EJB3
  • My team's skills, more oriented toward Swing
  • Reusing as much as possible the existing code or at least interfaces
  • Existing requirements regarding web services:
  • Web services security is implemented through the reverse proxy, and its reliability is not the best I’ve ever seen (to put it mildly)
  • Web services applications have to be located on dedicated infrastructure
  • Mature EJB culture
  • Available JAAS LoginModule for secure EJB calls and web-services from Swing

Now, it basically boils down to exposing either Spring web services over HTTP or EJB3. In the first case, the cons include no experience with Spring remoting, performance (or even reliability) issues, and deployment on different servers, thus more complex packaging for the dev team. In the second case, they include a slightly higher complexity (yes, EJB3 is easier than EJB 2.1, but still), a higher ramp-up time and me not being able to properly support my team when a difficulty arises.

In the end, I decided to use Spring services to the fullest, but to put them behind an EJB3 façade. That may seem strange, but I think I get the best of both worlds: EJB3 skills are kept to a bare minimum (transactions will be managed by Spring), while the technology gets me directly through the reverse proxy. I'm open to suggestions and arguments for other solutions given the above factors, but the quicker the better :-)

Design

To keep the design as simple as possible, each Spring service will have one and only one EJB3 façade, which will delegate calls to the underlying service. Most IDEs are able to take care of the boilerplate delegating code (hell, you can even use Project Lombok with @Delegate – I'm considering it).

On the class level, the design is standard EJB — a business interface implemented by a stateless session bean — with the added Spring implementation behind it, to which calls are delegated.

On the module level, this means we will need a somewhat convoluted packaging for the service layer:

  • A Business Interfaces module
  • A Spring Implementations module
  • An EJB3 module, including remote interfaces and session beans (thanks to the Maven EJB plugin, it will produce two different artifacts)

How-to

Finally, developing the EJB3 façade and injecting it with Spring beans is ridiculously simple. The magic lies in the Java EE 5 @Interceptors annotation on top of the session bean class, which references the SpringBeanAutowiringInterceptor class; it kicks in Spring injection of every referenced dependency after instantiation (as well as after activation).

The only dependency in our case is the delegate Spring bean, which has to be annotated with the usual @Autowired.

import javax.ejb.Stateless;
import javax.interceptor.Interceptors;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.ejb.interceptor.SpringBeanAutowiringInterceptor;

import ch.frankel.blog.ejb.spring.service.client.RandomGeneratorService;

@Stateless
@Interceptors(SpringBeanAutowiringInterceptor.class)
public class RandomGeneratorBean implements RandomGeneratorService {

    @Autowired
    private ch.frankel.blog.ejb.spring.service.RandomGeneratorService delegate;

    @Override
    public int generateNumber(int lowerLimit, int upperLimit) {

        return delegate.generateNumber(lowerLimit, upperLimit);
    }
}
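
For reference, the delegate on the Spring side is a plain bean from the Spring implementations module; a minimal sketch of a class implementing the ch.frankel.blog.ejb.spring.service.RandomGeneratorService interface (the class name and the naive algorithm are illustrative assumptions, and whether the bean is declared in XML or via annotations is up to you):

public class RandomGeneratorServiceImpl implements RandomGeneratorService {

    @Override
    public int generateNumber(int lowerLimit, int upperLimit) {

        // naive implementation, for illustration only
        return lowerLimit + new Random().nextInt(upperLimit - lowerLimit + 1);
    }
}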

In order for this to work, we have to use a specific Spring configuration file, which references the Spring application context defined in our services module and activates annotation configuration.

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xmlns:context="http://www.springframework.org/schema/context"
  xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
    http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.1.xsd">

  <context:annotation-config />

  <bean class="org.springframework.context.support.ClassPathXmlApplicationContext">
    <constructor-arg>
      <list>
        <value>classpath:ejb-service.xml</value>
      </list>
    </constructor-arg>
  </bean>

</beans>

Warning: it’s mandatory to name this file beanRefContext.xml and to make it available at the root of the EJB JAR (and of course to set the Spring service module as a dependency).

Conclusion

Sometimes, you have to make interesting architectural choices that are pertinent only in your own context. In those cases, it's good to know somebody has paved the road for you: such is the case with an EJB3 façade over Spring services.

Categories: JavaEE Tags: ,

Quick evaluation of Twitter Bootstrap

May 13th, 2012 5 comments

I must admit I suck at graphical design. It's one of the reasons that led me to Flex and Vaadin in the first place: out of the box, you get an application that is pleasing to the eye.

Using one of these technologies is not possible (nor relevant) in every context, and since I've got a strong interest in UI, I regularly have a look at other alternatives for clean-looking applications. The technology I studied this week is Twitter Bootstrap.

Bootstrap is a lightweight client-side framework that comes (among other things) with pre-styled "components" and a standard layout. Both are completely configurable, and you can download the customized result here. From a technical point of view, Bootstrap offers a CSS file and an optional JavaScript file that relies on jQuery (both are also available in minified flavors). I'm no big fan of fancy client-side JavaScript, so what follows focuses on what you can do with plain CSS.

Layout

Bootstrap offers a 940px-wide canvas, divided into 12 columns. What you do with those columns is up to you: most of the time, you'll probably want to group some of them. It's easily achieved by using simple CSS classes. In fact, in Bootstrap, everything is done with selectors. For example, the following snippet offers a main column that has twice the width of its left and right sidebars:

<div class="row">
    <div class="span3">Left sidebar</div>
    <div class="span6">Main</div>
    <div class="span3">Right sidebar</div>
</div>

One provided feature worth mentioning is responsive design: in essence, the layout adapts itself to the screen, an important advantage considering the fragmentation of today's user agents.

Components

Whatever application you’re developing, chances are high that you’ll need some components, like buttons, menus, fields (in forms), tabs, etc.

Through CSS classes, Bootstrap offers a bunch of components. For existing HTML elements, it applies a visually appealing style to them. For components HTML lacks (such as menus and tabs), it tweaks the rendering of standard HTML.
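
As a small illustration, here is what styling a button and a tab bar looks like with Bootstrap 2 class names (assuming the Bootstrap stylesheet is loaded on the page):

<a class="btn btn-primary" href="#">Save</a>

<ul class="nav nav-tabs">
    <li class="active"><a href="#home">Home</a></li>
    <li><a href="#profile">Profile</a></li>
</ul>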

Conclusion

I tried to remake some pages of More Vaadin with Bootstrap and, all in all, the result is not unpleasing to the eye. In cases where I can't use Vaadin, I think I would be a strong proponent of Bootstrap when creating an application from scratch.

You can find the sources of my attempt here.

Categories: Development Tags: ,

Specification by Example review

This review is about Specification by Example by Gojko Adzic, published by Manning.

Facts

  • 18 chapters, 254 pages, $29.21
  • This book covers Specification by Example (you could have guessed it from the title). In effect, SBE is a way to build the right software (for the customers), as opposed to building the software right (which is our trade as engineers).

Specification by Example is a set of process patterns that facilitate change in software products to ensure that the right product is delivered efficiently.

Pros

  • Not all methods are adequate for every context; each hint described is put into context.
  • Do's are detailed all right, with their respective benefits, but don'ts are also characterized, along with the reasons why you shouldn't go down their path.
  • Full of real-life examples, with people and teams talking about what they did and why.
  • 6 case studies

Cons

  • Missing: a tooling section explaining how to put these ideas into action.

Conclusion

As software engineers, we are at the lower end of the software creation chain, and it shows. We focus on building the software right… Yet we fail to address problems that find their origin in the upper part of the chain; agile methodologies are meant to prevent this. This book stems from a simple observation: in most projects, specifications, tests and software are anything but synchronized. It lists a collection of agile recipes to remedy that.

Whether you're a business analyst, a project manager or just a technical person who wants to build better software, this book is for you. In any case, you'll want to keep it within arm's reach.

Categories: Book review Tags: ,