Archive

Posts Tagged ‘spring’

Final release of Integration Testing from the Trenches

March 1st, 2015 2 comments
Writing a book is a journey. At the beginning of the journey, you mostly know where you want to go, but have only a vague notion of the way to get there and of the time it will take. I’ve finally released the paperback version of Integration Testing from the Trenches on Amazon, and that means this specific journey is at an end.

The book starts with a very generic discussion about testing and continues by defining Integration Testing in contrast to Unit Testing. The next chapter compares the respective merits of JUnit and TestNG. It is followed by a complete description of how to make a design testable: what works for Unit Testing also works for Integration Testing. Testing in software relies on automation, so the specific usage of the Maven build tool – as well as Gradle – in regard to Integration Testing is described. Dependencies on external resources make integration tests more fragile, so faking those resources makes the tests more robust. Those resources include databases, the file system, SOAP and REST web services, etc. The most important dependency in any application is the container. The last chapters are dedicated to the Spring framework – including Spring MVC – and Java EE.

In this journey, I also dared ask Josh Long of Spring fame and Aslak Knutsen, team lead of the Arquillian project, to write a foreword to the book – and I’ve been delighted to have them both answer positively. Thank you guys!

I’ve also talked on the subject at some JUGs and European conferences: JavaDay Kiev, Joker, Agile Tour London and JUG Lyon – and will again at JavaLand, DevIt, TopConf Romania and GeeCon. I hope that by doing so, Integration Testing will be used more effectively on projects and with a bigger ROI.

Should you want to go further, the book is available in multiple formats:

  1. A paperback version on Amazon for $49.99
  2. Electronic versions for Mac, Kindle and plain old PDF. The pricing here is more open, starting from $21.10 with a suggested price of $31.65. Note you can get it in all formats to read on all your devices.

If you’re already a reader and you like it, please feel free to recommend it. If you don’t, I welcome your feedback in the comments section. Of course, if you’re neither, I encourage you to get the book and see for yourself!


Improving the Vaadin 4 Spring project with a simpler MVP

January 18th, 2015 No comments

I’ve been using the Vaadin 4 Spring library on my current project, and this has been a very pleasant experience. However, in the middle of the project, a colleague of mine decided to “improve the testability”. The intention was laudable, though the project already tried to implement the MVP pattern (please check this article for more detailed information). Instead of correcting the mistakes here and there, he refactored the whole codebase using the provided MVP module… IMHO, this was a huge mistake. In this article, I’ll try to highlight what bugs me in the existing implementation, and propose an alternative solution.

The existing MVP implementation consists of a single class. Here it is, abridged for readability purposes:

public abstract class Presenter<V extends View> {

    @Autowired
    private SpringViewProvider viewProvider;

    @Autowired
    private EventBus eventBus;

    @PostConstruct
    protected void init() {
        eventBus.subscribe(this);
    }

    public V getView() {
        V result = null;
        Class<?> clazz = getClass();
        if (clazz.isAnnotationPresent(VaadinPresenter.class)) {
            VaadinPresenter vp = clazz.getAnnotation(VaadinPresenter.class);
            result = (V) viewProvider.getView(vp.viewName());
        }
        return result;
    }
    // Other plumbing code
}

This class is quite opinionated and suffers from the following drawbacks:

  1. It relies on field auto-wiring, which makes it extremely hard to unit test Presenter classes. As proof, the provided test class is not a unit test but an integration test.
  2. It relies solely on component scanning, which prevents explicit dependency injection.
  3. It enforces the implementation of the View interface, whether required or not. When not using the Navigator, it makes the implementation of an empty enterView() method mandatory.
  4. It takes responsibility for creating the View from the view provider.
  5. It couples the Presenter and the View through its @VaadinPresenter annotation, preventing a single Presenter from handling different View implementations.
  6. It requires explicitly calling the init() method of the Presenter, as a @PostConstruct annotation on a superclass is not called when the subclass has one of its own.
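To make drawback #1 concrete: with field auto-wiring, a plain unit test can only hand the class a test double through reflection, as in this self-contained sketch (the class and field names are made up for illustration):

```java
import java.lang.reflect.Field;

public class FieldInjectionDemo {

    // Imagine @Autowired on this field: no constructor or setter exposes it
    static class FieldInjectedPresenter {
        private String viewProvider;

        String describe() {
            return "wired with " + viewProvider;
        }
    }

    public static void main(String[] args) throws Exception {
        FieldInjectedPresenter presenter = new FieldInjectedPresenter();
        // The test has no choice but to break encapsulation via reflection
        Field field = FieldInjectedPresenter.class.getDeclaredField("viewProvider");
        field.setAccessible(true);
        field.set(presenter, "a stub");
        System.out.println(presenter.describe());
    }
}
```

With constructor injection, the same test would be a one-line `new` call.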

I’ve developed an alternative class that tries to address the previous points – and is also simpler:

public abstract class Presenter<T> {

    private final T view;
    private final EventBus eventBus;

    public Presenter(T view, EventBus eventBus) {
        Assert.notNull(view);
        Assert.notNull(eventBus);
        this.view = view;
        this.eventBus = eventBus;
        eventBus.subscribe(this);
    }
    // Other plumbing code
}

This class makes every subclass easily unit-testable, as the following snippets prove:

public class FooView extends Label {}

public class FooPresenter extends Presenter<FooView> {

    public FooPresenter(FooView view, EventBus eventBus) {
        super(view, eventBus);
    }

    @EventBusListenerMethod
    public void onNewCaption(String caption) {
        getView().setCaption(caption);
    }
}

public class PresenterTest {

    private FooPresenter presenter;
    private FooView fooView;
    private EventBus eventBus;

    @Before
    public void setUp() {
        fooView = new FooView();
        eventBus = mock(EventBus.class);
        presenter = new FooPresenter(fooView, eventBus);
    }

    @Test
    public void should_manage_underlying_view() {
        String message = "anymessagecangohere";
        presenter.onNewCaption(message);
        assertEquals(message, fooView.getCaption());
    }
}

The same Integration Test as for the initial class can also be handled, using explicit dependency injection:

public class ExplicitPresenter extends Presenter<FooView> {

    public ExplicitPresenter(FooView view, EventBus eventBus) {
        super(view, eventBus);
    }

    @EventBusListenerMethod
    public void onNewCaption(String caption) {
        getView().setCaption(caption);
    }
}

@Configuration
@EnableVaadin
public class ExplicitConfig {

    @Autowired
    private EventBus eventBus;

    @Bean
    @UIScope
    public FooView fooView() {
        return new FooView();
    }

    @Bean
    @UIScope
    public ExplicitPresenter fooPresenter() {
        return new ExplicitPresenter(fooView(), eventBus);
    }
}

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = ExplicitConfig.class)
@VaadinAppConfiguration
public class ExplicitPresenterIT {

    @Autowired
    private ExplicitPresenter explicitPresenter;

    @Autowired
    private EventBus eventBus;

    @Test
    public void should_listen_to_message() {
        String message = "message_from_explicit";
        eventBus.publish(this, message);
        assertEquals(message, explicitPresenter.getView().getCaption());
    }
}

Last but not least, this alternative also lets you use auto-wiring and component scanning if you feel like it! The only difference is that it enforces constructor auto-wiring instead of field auto-wiring (in my eyes, this counts as a plus, albeit a little more verbose):

@UIScope
@VaadinComponent
public class FooView extends Label {}

@UIScope
@VaadinComponent
public class AutowiredPresenter extends Presenter<FooView> {

    @Autowired
    public AutowiredPresenter(FooView view, EventBus eventBus) {
        super(view, eventBus);
    }

    @EventBusListenerMethod
    public void onNewCaption(String caption) {
        getView().setCaption(caption);
    }
}

@ComponentScan
@EnableVaadin
public class ScanConfig {}

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = ScanConfig.class)
@VaadinAppConfiguration
public class AutowiredPresenterIT {

    @Autowired
    private AutowiredPresenter autowiredPresenter;

    @Autowired
    private EventBus eventBus;

    @Test
    public void should_listen_to_message() {
        String message = "message_from_autowired";
        eventBus.publish(this, message);
        assertEquals(message, autowiredPresenter.getView().getCaption());
    }
}

The good news is that this module is now part of the vaadin4spring project on GitHub. If you need MVP for your Vaadin Spring application, you’re just a click away!


Spring profiles or Maven profiles?

January 4th, 2015 2 comments

Deploying on different environments requires configuration: e.g. the database URL must be set for each dedicated environment. In most – if not all – Java applications, this is achieved through a .properties file, loaded through the appropriately-named Properties class. During development, there’s no reason not to use the same configuration system, e.g. to use an embedded h2 database instead of the production one.
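As a refresher, loading such a file boils down to a few lines; the property keys below are made up for illustration:

```java
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.util.Properties;

public class DataSourceConfig {

    // Load connection settings; in a real application the Reader would wrap
    // a per-environment file, e.g. database.properties on the classpath
    public static Properties load(Reader reader) throws IOException {
        Properties properties = new Properties();
        properties.load(reader);
        return properties;
    }

    public static void main(String[] args) throws IOException {
        String content = "jdbc.url=jdbc:h2:mem:dev\njdbc.username=sa";
        Properties properties = load(new StringReader(content));
        System.out.println(properties.getProperty("jdbc.url"));
    }
}
```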

Unfortunately, Java EE applications generally fall outside this usage, as good practice on deployed environments (i.e. all environments save the local developer machine) is to use a JNDI datasource instead of a local connection. Even Tomcat and Jetty – which implement only a fraction of the Java EE Web Profile – provide this nifty and useful feature.

As an example, let’s take the Spring framework. In this case, two datasource configuration fragments have to be defined:

  • For deployed environments, one that specifies the JNDI location lookup
  • For local development (and testing), one that configures a connection pool around a direct database connection

A simple properties file cannot manage this kind of switch; one has to use the build system. Shameless self-promotion: a detailed explanation of this setup for integration testing purposes can be found in my book, Integration Testing from the Trenches.

With the Maven build system, switching between configurations is achieved through so-called profiles at build time. Roughly, a Maven profile is a portion of a POM that can be enabled (or not). For example, the following profile snippet replaces Maven’s standard resource directory with a dedicated one.

<profile>
  <id>dev</id>
  <build>
    <resources>
      <resource>
        <directory>profile/dev</directory>
        <includes>
          <include>**/*</include>
        </includes>
      </resource>
    </resources>
  </build>
</profile>

Activating one or more profiles is as easy as using the -P switch with their ids on the command line when invoking Maven. The following command will activate the dev profile (provided it is set in the POM):

mvn package -Pdev

Now, let’s add a simple requirement: as I’m quite lazy, I want to exert the minimum effort possible to package the application along with its final production release configuration. This translates into making the production configuration – i.e. the JNDI fragment – the default one, and using the development fragment explicitly when necessary. Seasoned Maven users know how to implement that: create a profile and configure it to be active by default.

<profile>
  <id>dev</id>
  <activation>
    <activeByDefault>true</activeByDefault>
  </activation>
  ...
</profile>

Icing on the cake, profiles can even be set in Maven settings.xml files. Seems too good to be true? Well, very seasoned Maven users know that as soon as a single profile is explicitly activated, the default profile is deactivated. Previous experiences have taught me that because profiles are so easy to implement, they are used (and overused), so that the default one easily gets lost in the process. For example, in one such job, a profile was used on the Continuous Integration server to set some properties for the release in a dedicated settings file. In order to keep the right configuration, one has to a) know about the sneaky profile, b) know it will break the default profile, and c) explicitly set the not-default-anymore profile.

Additional details about the dangers of Maven profiles for building artifacts can be found in this article.

Another drawback of this global approach is the tendency toward over-fragmentation of the configuration files. I prefer coarse-grained configuration files, each dedicated to a layer or a use-case. For example, I’d like to declare at least the datasource, the transaction manager and the entity manager in the same file, possibly along with the different repositories.

Enter Spring profiles. As opposed to Maven profiles, Spring profiles are activated at runtime. I’m not sure whether this is a good or a bad thing, but the implementation makes real default configurations possible, with the help of @Conditional annotations (see my previous article for more details). That way, the wrapper-around-the-connection bean gets created when the dev profile is activated; when it’s not, the JNDI lookup bean gets created instead. This kind of configuration is implemented in the following snippet:

@Configuration
public class MyConfiguration {

    @Bean
    @Profile("dev")
    public DataSource dataSource() throws Exception {
        org.apache.tomcat.jdbc.pool.DataSource dataSource = new org.apache.tomcat.jdbc.pool.DataSource();
        dataSource.setDriverClassName("org.h2.Driver");
        dataSource.setUrl("jdbc:h2:file:~/conditional");
        dataSource.setUsername("sa");
        return dataSource;
    }

    @Bean
    @ConditionalOnMissingBean(DataSource.class)
    public DataSource fakeDataSource() {
        JndiDataSourceLookup dataSourceLookup = new JndiDataSourceLookup();
        return dataSourceLookup.getDataSource("java:comp/env/jdbc/conditional");
    }
}

In this context, profiles are just a way to activate specific beans, the real magic is achieved through the different @Conditional annotations.

Note: it is advised to create a dedicated annotation to avoid String typos, to be more refactoring-friendly and to improve search capabilities on the code:

@Retention(RUNTIME)
@Target({TYPE, METHOD})
@Profile("dev")
public @interface Development {}
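To see why Spring can still detect the profile behind such a dedicated annotation, here is a self-contained sketch of meta-annotation resolution; the @Profile below is a local stand-in for Spring’s own annotation, redefined so the snippet runs without any dependency:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

public class MetaAnnotationDemo {

    // Local stand-in for Spring's @Profile, to keep the sketch self-contained
    @Retention(RetentionPolicy.RUNTIME)
    @Target({ElementType.TYPE, ElementType.METHOD})
    @interface Profile { String value(); }

    // The dedicated annotation: the "dev" literal now lives in a single place
    @Retention(RetentionPolicy.RUNTIME)
    @Target({ElementType.TYPE, ElementType.METHOD})
    @Profile("dev")
    @interface Development {}

    @Development
    static class DevOnlyBean {}

    public static void main(String[] args) {
        // Spring resolves meta-annotations in a similar spirit: look for
        // @Profile on the annotations of the annotated element
        Profile profile = DevOnlyBean.class.getAnnotation(Development.class)
                .annotationType()
                .getAnnotation(Profile.class);
        System.out.println(profile.value());
    }
}
```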

Now, this approach has some drawbacks as well. The most obvious problem is that the final archive will contain extra libraries – those that are used exclusively for development. This is readily apparent when one uses Spring Boot. One such extra library is the h2 database, a whopping 1.7 MB jar file. There are two main counterarguments to this:

  • First, if you’re concerned about a couple of additional MB, then your main issue is probably not on the software side, but on the disk management side. Perhaps a virtualization layer such as VMware or Xen could help?
  • Then, if need be, you can still configure the build system to streamline the produced artifact.

The second drawback of Spring profiles is that, along with extra libraries, the development configuration will be packaged into the final artifact as well. To be honest, when I first stumbled upon this approach, this was a no-go. Then, as usual, I thought more and more about it, and came to the following conclusion: there’s nothing wrong with that. Packaging the development configuration has no consequence whatsoever, whether it is set through XML or JavaConfig. Think about this: once an archive has been created, it is considered sealed, even when the application server explodes it for deployment purposes. It is considered very bad practice to do anything on the exploded archive in any case. So what would be the reason not to package the development configuration along? The only reason I can think of is to be clean, from a theoretical point of view. Me being a pragmatist, I think the advantages of using Spring profiles are far greater than this drawback.

In my current project, I created a single configuration fragment with all the beans that depend on the environment: the datasource and the Spring Security authentication provider. For the latter, the production configuration uses an internal LDAP, while the development bean provides an in-memory provider.

So on one hand, we’ve got Maven profiles, which have definite issues but which we are familiar with; on the other hand, we’ve got Spring profiles, which are brand new and hurt our natural inclination, but get the job done. I’d suggest giving them a try: I did, and am so far happy with them.


Optional dependencies in Spring

December 21st, 2014 6 comments

I’m a regular Spring framework user and I think I know the framework pretty well, but it seems I’m always stumbling upon something useful I didn’t know about. At Devoxx, I learned that you could express optional dependencies using Java 8’s new Optional type. Note that before Java 8, optional dependencies could be auto-wired using @Autowired(required = false), but then you had to check for null.
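The two styles can be contrasted outside of any Spring context with a self-contained sketch (the HelloService interface is reduced to a single method for illustration):

```java
import java.util.Optional;

public class OptionalDemo {

    interface HelloService {
        String sayHello();
    }

    // Pre-Java 8 style: the dependency may be null and must be checked
    static String greetNullable(HelloService service) {
        return service == null ? "no service" : service.sayHello();
    }

    // Java 8 style: the type itself documents that the dependency is optional
    static String greetOptional(Optional<HelloService> service) {
        return service.map(HelloService::sayHello).orElse("no service");
    }

    public static void main(String[] args) {
        HelloService service = () -> "Hello!";
        System.out.println(greetNullable(null));
        System.out.println(greetOptional(Optional.of(service)));
    }
}
```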

How good is that? Well, I can think of a million use-cases, but here are some off the top of my head:

  • Prevent usage of infrastructure dependencies, depending on the context. For example, in a development environment, one wouldn’t need to send metrics to a MetricRegistry
  • Provide defaults when required infrastructure dependencies are not provided, e.g. an h2 datasource
  • The same could be done in a testing environment.
  • etc.

The implementation is very straightforward:

@ContextConfiguration(classes = OptionalConfiguration.class)
public class DependencyPresentTest extends AbstractTestNGSpringContextTests {

    @Autowired
    private Optional<HelloService> myServiceOptional;

    @Test
    public void should_return_hello() {
        String sayHello = null;
        if (myServiceOptional.isPresent()) {
            sayHello = myServiceOptional.get().sayHello();
        }
        assertNotNull(sayHello);
        assertEquals(sayHello, "Hello!");
    }
}

At this point, not only does the code compile fine, but the dependency is resolved at runtime. Either the OptionalConfiguration contains the HelloService bean – and the above test succeeds – or it doesn’t – and the test fails.

This pattern is very elegant, and I suggest you add it to your bag of available tools.


Avoid conditional logic in @Configuration

December 7th, 2014 2 comments

Integration Testing Spring applications mandates creating small, dedicated configuration fragments and assembling them either during normal runs of the application or during tests. Even in the latter case, different fragments can be assembled in different tests.

However, this practice doesn’t handle the use-case where I want to use the application in two different environments. As an example, I might want to use a JNDI datasource in deployed environments and a direct connection when developing on my local machine. Assembling different fragment combinations is not possible, as I want to run the application in both cases, not test it.

My only requirement is that the default should use the JNDI datasource, while activating a flag – a profile – should switch to the direct connection. The Pavlovian reflex in this case would be to add a simple condition in the @Configuration class:

@Configuration
public class MyConfiguration {

    @Autowired
    private Environment env;

    @Bean
    public DataSource dataSource() throws Exception {
        if (env.acceptsProfiles("dev")) {
            org.apache.tomcat.jdbc.pool.DataSource dataSource = new org.apache.tomcat.jdbc.pool.DataSource();
            dataSource.setDriverClassName("org.h2.Driver");
            dataSource.setUrl("jdbc:h2:file:~/conditional");
            dataSource.setUsername("sa");
            return dataSource;
        }
        JndiDataSourceLookup dataSourceLookup = new JndiDataSourceLookup();
        return dataSourceLookup.getDataSource("java:comp/env/jdbc/conditional"); 
    }
}

Starting to use this kind of flow control statement is the beginning of the end, as it will lead to adding more control flow statements in the future, which will in turn lead to a tangled mess of spaghetti configuration, and ultimately to an unmaintainable application.

Spring Boot offers a nice alternative to handle this use-case with different flavors of @ConditionalXXX annotations. Using them has the following advantages while doing the job: they are easy to use, readable and limited. While the latter point might seem to be a drawback, it’s the biggest asset IMHO (not unlike Maven plugins). Code is powerful, and with great power must come great responsibility, something that is hardly possible during the course of a project with deadlines and pressure from the higher-ups. That’s the main reason one of my colleagues advocates XML over JavaConfig: with XML, you’re sure there won’t be any abuse while the project runs its course.

But let’s stop the philosophy and get back to @ConditionalXXX annotations. Basically, putting such an annotation on a @Bean method will invoke this method and register the bean in the factory only when a dedicated condition is met. There are many of them; here are some important ones:

  • Dependent on Java version, newer or older – @ConditionalOnJava
  • Dependent on a bean present in the factory – @ConditionalOnBean, and its opposite, dependent on a bean not being present – @ConditionalOnMissingBean
  • Dependent on a class present on the classpath – @ConditionalOnClass, and its opposite @ConditionalOnMissingClass
  • Whether it’s a web application or not – @ConditionalOnWebApplication and @ConditionalOnNotWebApplication
  • etc.

Note that the whole list of existing conditions can be browsed in Spring Boot’s org.springframework.boot.autoconfigure.condition package.

With this information, we can migrate the above snippet to a more robust implementation:

@Configuration
public class MyConfiguration {

    @Bean
    @Profile("dev")
    public DataSource dataSource() throws Exception {
        org.apache.tomcat.jdbc.pool.DataSource dataSource = new org.apache.tomcat.jdbc.pool.DataSource();
        dataSource.setDriverClassName("org.h2.Driver");
        dataSource.setUrl("jdbc:h2:file:~/localisatordb");
        dataSource.setUsername("sa");
        return dataSource;
    }

    @Bean
    @ConditionalOnMissingBean(DataSource.class)
    public DataSource fakeDataSource() {
        JndiDataSourceLookup dataSourceLookup = new JndiDataSourceLookup();
        return dataSourceLookup.getDataSource("java:comp/env/jdbc/conditional");
    }
}

The configuration is now neatly separated into two different methods: the first method will be called only when the dev profile is active, while the second will be called when the first one is not, hence when the dev profile is not active.

Finally, the best thing about this feature is that it is easily extensible, as it depends only on the @Conditional annotation and the Condition interface (which are part of Spring proper, not Spring Boot).
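A real extension implements Spring’s Condition interface, whose matches() method receives a ConditionContext and type metadata. The self-contained sketch below mimics that contract with simplified stand-in types, just to show the registration logic behind @Conditional:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;
import java.util.function.Supplier;

public class ConditionalDemo {

    // Simplified stand-in for Spring's Condition: the real matches() receives
    // a ConditionContext and AnnotatedTypeMetadata instead of a profile set
    interface Condition {
        boolean matches(Set<String> activeProfiles);
    }

    static class OnDevProfile implements Condition {
        public boolean matches(Set<String> activeProfiles) {
            return activeProfiles.contains("dev");
        }
    }

    static class MiniBeanFactory {
        private final Map<String, Object> beans = new HashMap<>();
        private final Set<String> activeProfiles;

        MiniBeanFactory(String... profiles) {
            activeProfiles = new HashSet<>(Arrays.asList(profiles));
        }

        // Mirrors @Conditional: register the bean only when the condition matches
        void registerIf(Condition condition, String name, Supplier<Object> supplier) {
            if (condition.matches(activeProfiles)) {
                beans.put(name, supplier.get());
            }
        }

        boolean contains(String name) {
            return beans.containsKey(name);
        }
    }

    public static void main(String[] args) {
        MiniBeanFactory dev = new MiniBeanFactory("dev");
        dev.registerIf(new OnDevProfile(), "dataSource", Object::new);
        System.out.println(dev.contains("dataSource"));

        MiniBeanFactory prod = new MiniBeanFactory();
        prod.registerIf(new OnDevProfile(), "dataSource", Object::new);
        System.out.println(prod.contains("dataSource"));
    }
}
```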

Here’s a simple example in Maven/IntelliJ format for you to play with. Have fun!


Metrics, metrics everywhere

November 16th, 2014 No comments

With DevOps, metrics are starting to be among the non-functional requirements any application has to bring into scope. Before going further, there are several comments I’d like to make:

  1. Metrics are not only about non-functional stuff. Many metrics represent very important KPIs for the business. For example, for an e-commerce shop, the business needs to know how many customers leave the checkout process, and on which screen. True, there are several solutions to achieve this, though they are all web-based (Google Analytics comes to mind) and metrics might also be required for different architectures. And having all metrics in the same backend means they can be correlated easily.
  2. Metrics, as any other NFR (e.g. logging and exception handling), should be designed and managed upfront and not pushed in as an afterthought. How do I know that? Well, one of my last projects focused on functional requirements only, and only in the end did project management realize NFRs were important. Trust me when I say it was gory – and it cost much more than if it had been designed in the early phases of the project.
  3. Metrics have an overhead. However, without metrics, it’s not possible to increase performance. Just accept that and live with it.

The inputs are the following: the application is Spring MVC-based and metrics have to be aggregated in Graphite. We will start by using the excellent Metrics project: not only does it get the job done, its documentation is of very high quality and it’s available under the friendly OpenSource Apache v2.0 license.

That said, let’s imagine a “standard” base architecture to manage those components.

First, though Metrics offers a Graphite endpoint, this requires configuration in each environment and makes things harder, especially on developers’ workstations. To manage this, we’ll send metrics to JMX and introduce jmxtrans as a middle component between JMX and Graphite. As every JVM provides JMX services, this requires no configuration when none is needed – and has no impact on performance.

Second, as developers, we usually enjoy developing everything from scratch in order to show off how good we are – or sometimes because we didn’t browse the documentation. My point of view as a software engineer is that I’d rather not reinvent the wheel and focus on the task at hand. Actually, Spring Boot already integrates with Metrics through the Actuator component. However, it only provides a GaugeService – to send unique values – and a CounterService – to increment/decrement values. This might be good enough for FRs but not for NFRs, so we might want to tweak things a little.

The flow would be designed like this: Code > Spring Boot > Metrics > JMX > Graphite

The starting point is to create an aspect, as performance metrics are a cross-cutting concern:

@Aspect
public class MetricAspect {

    private final MetricSender metricSender;

    @Autowired
    public MetricAspect(MetricSender metricSender) {
        this.metricSender = metricSender;
    }

    @Around("execution(* ch.frankel.blog.metrics.ping..*(..)) || execution(* ch.frankel.blog.metrics.dice..*(..))")
    public Object doBasicProfiling(ProceedingJoinPoint pjp) throws Throwable {
        StopWatch stopWatch = metricSender.getStartedStopWatch();
        try {
            return pjp.proceed();
        } finally {
            Class clazz = pjp.getTarget().getClass();
            String methodName = pjp.getSignature().getName();
            metricSender.stopAndSend(stopWatch, clazz, methodName);
        }
    }
}

The only thing out of the ordinary is the usage of auto-wiring, as aspects don’t seem to be able to be the target of explicit wiring (yet?). Also notice the aspect itself doesn’t interact with the Metrics API; it only delegates to a dedicated component:

public class MetricSender {

    private final MetricRegistry registry;

    public MetricSender(MetricRegistry registry) {
        this.registry = registry;
    }

    private Histogram getOrAdd(String metricsName) {
        Map<String, Histogram> registeredHistograms = registry.getHistograms();
        Histogram registeredHistogram = registeredHistograms.get(metricsName);
        if (registeredHistogram == null) {
            Reservoir reservoir = new ExponentiallyDecayingReservoir();
            registeredHistogram = new Histogram(reservoir);
            registry.register(metricsName, registeredHistogram);
        }
        return registeredHistogram;
    }

    public StopWatch getStartedStopWatch() {
        StopWatch stopWatch = new StopWatch();
        stopWatch.start();
        return stopWatch;
    }

    private String computeMetricName(Class clazz, String methodName) {
        return clazz.getName() + '.' + methodName;
    }

    public void stopAndSend(StopWatch stopWatch, Class clazz, String methodName) {
        stopWatch.stop();
        String metricName = computeMetricName(clazz, methodName);
        getOrAdd(metricName).update(stopWatch.getTotalTimeMillis());
    }
}

The sender does several interesting things (but with no state):

  • It returns a new StopWatch for the aspect to pass back after method execution
  • It computes the metric name depending on the class and the method
  • It stops the StopWatch and sends the time to the MetricRegistry
  • Note it also lazily creates and registers a new Histogram with an ExponentiallyDecayingReservoir instance. The default behavior is to provide a UniformReservoir, which keeps data forever and is not suitable for our needs.

The final step is to tell the Metrics API to send data to JMX. This can be done in one of the configuration classes, preferably the one dedicated to metrics, using the @PostConstruct annotation on the desired method.

@Configuration
public class MetricsConfiguration {

    @Autowired
    private MetricRegistry metricRegistry;

    @Bean
    public MetricSender metricSender() {
        return new MetricSender(metricRegistry);
    }

    @PostConstruct
    public void connectRegistryToJmx() {
        JmxReporter reporter = JmxReporter.forRegistry(metricRegistry).build();
        reporter.start();
    }
}

The JConsole should look like the following screenshot. Icing on the cake, all default Spring Boot metrics are also available.

Sources for this article are available in Maven “format”.



Using exceptions when designing an API

August 31st, 2014 2 comments

Many know the tradeoff of using exceptions while designing an application:

  • On one hand, using try-catch blocks nicely segregates regular code from exception-handling code
  • On the other hand, using exceptions has a definite performance cost for the JVM

Every time I’ve faced this quandary, I’ve ruled in favor of the former, because “premature optimization is evil”. However, this week has shown me that exception handling is a very serious decision when designing an API.

I’ve been working on improving the performance of our application and I’ve noticed many silent catches coming from the Spring framework (with the help of the excellent dynaTrace tool). The guilty lines come from the RequestContext.initContext() method:

if (this.webApplicationContext.containsBean(REQUEST_DATA_VALUE_PROCESSOR_BEAN_NAME)) {
    this.requestDataValueProcessor = this.webApplicationContext.getBean(
            REQUEST_DATA_VALUE_PROCESSOR_BEAN_NAME, RequestDataValueProcessor.class);
}

Looking at the JavaDocs, it is clear that this method (and the lines above) is called each time the Spring framework handles a request. For web applications under heavy load, this means a lot! I provided a pass-through implementation of the RequestDataValueProcessor and patched one node of the cluster. After running more tests, we noticed response times were on average 5% faster on the patched node compared to the un-patched one. This is not my point, however.

Should an exception be thrown when the bean is not present in the context? I think not… as the above snippet confirms. Other situations e.g. injecting dependencies, might call for an exception to be thrown, but in this case, it has to be the responsibility of the caller code to throw it or not, depending on the exact situation.

There are plenty of viable alternatives to throwing exceptions:

Returning null
This means the intent of the code is not explicit without looking at the JavaDocs, making it the worst option on our list
Returning an Optional
This makes the intent explicit compared to returning null. Of course, this requires Java 8
Returning Guava’s Optional
For those of us who are not fortunate enough to have Java 8
Returning one’s own Optional
If you don’t use Guava and prefer to embed your own copy of the class instead of relying on an external library
Returning a Try
Cook up something like Scala’s Try, which wraps either (hence its old name – Either) the returned bean or an exception. In this case, however, the exception is not thrown but used like any other object – hence there will be no performance problem
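The last option can be sketched in a few lines of plain Java (a toy version; Scala’s real Try also offers map, flatMap, recovery combinators, etc.):

```java
import java.util.function.Supplier;

// Minimal sketch of a Scala-like Try: the exception is captured as a value
// and never thrown during normal flow
public abstract class Try<T> {

    public static <T> Try<T> of(Supplier<T> supplier) {
        try {
            return new Success<>(supplier.get());
        } catch (RuntimeException e) {
            return new Failure<>(e);
        }
    }

    public abstract boolean isSuccess();
    public abstract T get();

    static final class Success<T> extends Try<T> {
        private final T value;
        Success(T value) { this.value = value; }
        public boolean isSuccess() { return true; }
        public T get() { return value; }
    }

    static final class Failure<T> extends Try<T> {
        private final RuntimeException exception;
        Failure(RuntimeException exception) { this.exception = exception; }
        public boolean isSuccess() { return false; }
        public T get() { throw exception; }
    }

    public static void main(String[] args) {
        Try<Integer> parsed = Try.of(() -> Integer.parseInt("42"));
        System.out.println(parsed.isSuccess());
        Try<Integer> failed = Try.of(() -> Integer.parseInt("oops"));
        System.out.println(failed.isSuccess());
    }
}
```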

Conclusion: when designing an API, one should really keep using exceptions for exceptional situations only.

As for the current situation, Spring’s BeanFactory class lies at the center of a web of dependencies, and its multiple getBean() method implementations cannot easily be replaced with one of the above options without forfeiting backward compatibility. One solution, however, would be to provide additional getBeanSafe() methods (or a more relevant name) using one of the above options, and then replace usages of their original counterparts step by step inside the Spring framework.
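To illustrate what such a getBeanSafe() method could look like, here is a sketch against a minimal stand-in interface. SimpleBeanFactory and MapBeanFactory are made-up names for this example; they only mimic the containsBean()/getBean() pair, not Spring’s actual BeanFactory.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.NoSuchElementException;
import java.util.Optional;

// Stand-in for a bean factory: getBean() throws on a missing bean,
// while the proposed getBeanSafe() encodes absence in the return type.
interface SimpleBeanFactory {

    boolean containsBean(String name);

    Object getBean(String name);

    // Backward-compatible addition: no exception for a missing bean
    default Optional<Object> getBeanSafe(String name) {
        return containsBean(name) ? Optional.of(getBean(name)) : Optional.empty();
    }
}

// Map-backed implementation, just enough to exercise the default method
class MapBeanFactory implements SimpleBeanFactory {

    private final Map<String, Object> beans = new HashMap<>();

    public void register(String name, Object bean) {
        beans.put(name, bean);
    }

    @Override
    public boolean containsBean(String name) {
        return beans.containsKey(name);
    }

    @Override
    public Object getBean(String name) {
        if (!beans.containsKey(name)) {
            throw new NoSuchElementException("No bean named " + name);
        }
        return beans.get(name);
    }
}
```

Because the new method is a pure addition, existing getBean() callers would be left untouched, which is exactly the backward-compatibility constraint described above.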

 


Spring configuration modularization for Integration Testing

July 27th, 2014

Object-Oriented Programming advocates modularization in order to build small and reusable components. There are other benefits to it, however: in the case of the Spring framework, modularization enables Integration Testing – the ability to test the system, or parts of it, including its assembly configuration.

Why is it so important to test the system assembled with its final configuration? Let’s take a simple example: the making of a car. Unit Testing the car would be akin to testing every nut and bolt separately, while Integration Testing it would be like driving the car on a circuit. By testing only the car’s components separately, selling the assembled car is a huge risk, as nothing guarantees it will behave correctly in real-life conditions.

Now that we have asserted that Integration Testing is necessary to guarantee an adequate level of internal quality, it’s time to enable Integration Testing with the Spring framework. Integration Testing is based on the notion of the SUT (System Under Test). Defining the SUT means defining the boundaries between what is tested and its dependencies. In nearly all cases, test setup will require providing some kind of test double for each required dependency. Configuring those test doubles can only be achieved by modularizing the Spring configuration, so that they can replace dependent beans located outside the SUT.


Fig. 1 – Sample bean dependency diagram

Spring’s DI configuration comes in 3 different flavors: XML – the legacy way, autowiring, and the newest one, JavaConfig. We’ll have a look at how modularization can be achieved for each flavor. Modularization of mixed DI configurations can be deduced from each separate entry.

Autowiring

Autowiring is an easy way to assemble Spring applications. It is achieved through the use of either @Autowired or @Inject. Let’s quickly dispose of autowiring: as injection is implicit, there’s no easy way to modularize the configuration. Applications using autowiring will just have to migrate to another DI flavor to allow for Integration Testing.

XML

XML is the legacy way to inject dependencies, but it is still widely in use. Consider the following monolithic XML configuration file:

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:jee="http://www.springframework.org/schema/jee"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
                           http://www.springframework.org/schema/jee http://www.springframework.org/schema/jee/spring-jee.xsd">

  <jee:jndi-lookup id="dataSource" jndi-name="jdbc/MyDataSource" />

  <bean id="productRepository" class="ProductRepository">
    <property name="dataSource" ref="dataSource" />
  </bean>

  <bean id="orderRepository" class="OrderRepository">
    <property name="dataSource" ref="dataSource" />
  </bean>

  <bean id="orderService" class="OrderService">
    <property name="productRepository" ref="productRepository" />
    <property name="orderRepository" ref="orderRepository" />
  </bean>
</beans>
At this point, Integration Testing orderService is not as easy as it should be. In particular, we need to:

  • Download the application server
  • Configure the server for the jdbc/MyDataSource data source
  • Deploy all classes to the server
  • Start the server
  • Stop the server after the test(s)

Of course, all the previous tasks have to be automated! Though not impossible thanks to tools such as Arquillian, this is contrary to the KISS principle. Overcoming this problem and making our life (as well as test maintenance) easier in the process requires both tooling and design. On the tooling side, we’ll be using a local database – usually an in-memory one, e.g. H2. On the design side, this requires separating our beans by creating two different configuration fragments: one solely dedicated to the data source to be faked, the other for the beans constituting the SUT.

Then, we’ll use a Maven classpath trick: Maven puts the test classpath in front of the main classpath when executing tests. This way, files found on the test classpath “override” similarly-named files on the main classpath. Let’s create two configuration file fragments:

  • The “real” JNDI datasource, as in the monolithic configuration

    <jee:jndi-lookup id="dataSource" jndi-name="jdbc/MyDataSource" />

  • The fake datasource

    <bean id="dataSource" class="org.apache.tomcat.jdbc.pool.DataSource">
        <property name="driverClassName" value="org.h2.Driver" />
        <property name="url" value="jdbc:h2:~/test" />
        <property name="username" value="sa" />
        <property name="maxActive" value="1" />
    </bean>

    Note we are using a Tomcat datasource object; this requires the org.apache.tomcat:tomcat-jdbc:jar library on the test classpath. Also note the maxActive property, which reflects the maximum number of connections to the database. It is advised to always set it to 1 in test scenarios, so that connection pool exhaustion bugs can be caught as early as possible.
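The advice about maxActive = 1 can be illustrated with a toy pool in plain Java (no Tomcat classes involved; all names here are mine): with a single connection available, any code path that forgets to return its connection makes the very next borrow fail, so a leak surfaces in the first test run instead of under production load.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Toy connection pool illustrating why a size of 1 exposes leaks early:
// a single un-returned connection exhausts the pool immediately.
public class ToyPool {

    private final Deque<String> available = new ArrayDeque<>();

    public ToyPool(int maxActive) {
        for (int i = 0; i < maxActive; i++) {
            available.push("connection-" + i);
        }
    }

    public String borrow() {
        if (available.isEmpty()) {
            throw new IllegalStateException("Pool exhausted: a connection probably leaked");
        }
        return available.pop();
    }

    public void giveBack(String connection) {
        available.push(connection);
    }
}
```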

The final layout is the following:

Fig. 2 – Project structure for Spring XML configuration Integration Testing

  1. JNDI datasource
  2. Other beans
  3. Fake datasource

The final main-config.xml file looks like:

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">

  <import resource="classpath:datasource-config.xml" />
  <import resource="classpath:sut-config.xml" />
</beans>
Such a structure is the foundation for enabling Integration Testing.

JavaConfig

JavaConfig is the most recent way to configure Spring applications, bringing both compile-time safety (as with autowiring) and explicit configuration (as with XML).

The above datasource fragments can be “translated” into Java as follows:

  • The “real” JNDI datasource as in the monolithic configuration
    @Configuration
    public class DataSourceConfig {
    
        @Bean
        public DataSource dataSource() throws Exception {
            Context ctx = new InitialContext();
            return (DataSource) ctx.lookup("jdbc/MyDataSource");
        }
    }
  • The fake datasource
    @Configuration
    public class FakeDataSourceConfig {
    
        @Bean
        public DataSource dataSource() {
            org.apache.tomcat.jdbc.pool.DataSource dataSource = new org.apache.tomcat.jdbc.pool.DataSource();
            dataSource.setDriverClassName("org.h2.Driver");
            dataSource.setUrl("jdbc:h2:~/test");
            dataSource.setUsername("sa");
            dataSource.setMaxActive(1);
            return dataSource;
        }
    }

However, two problems appear when using JavaConfig.

  1. It’s not possible to use the same classpath trick with an import as with XML previously, as Java forbids having two (or more) classes with the same qualified name loaded by the same classloader (which is the case with Maven). Therefore, JavaConfig configuration fragments shouldn’t explicitly import other fragments, but should leave the fragment assembly responsibility to their users (the application or the tests), so that names can be different, e.g.:
    @ContextConfiguration(classes = {MainConfig.class, FakeDataSourceConfig.class})
    public class SimpleDataSourceIntegrationTest extends AbstractTestNGSpringContextTests {
    
        @Test
        public void should_check_something_useful() {
            // Test goes there
        }
    }
  2. The main configuration fragment uses the datasource bean from the other configuration fragment. This requires the former to hold a reference to the latter, which is obtained by using the @Autowired annotation (one of the few relevant usages of it).
    @Configuration
    public class MainConfig {
    
        @Autowired
        private DataSource dataSource;
    
        // Other beans go there. They can use dataSource!
    }

Summary

In this article, I showed how Integration Testing against a fake data source can be achieved by modularizing a monolithic Spring configuration into different configuration fragments, either in XML or in JavaConfig.

However, the realm of Integration Testing – with Spring or without – is vast. Should you want to go further, I’ll hold a talk on Integration Testing at Agile Tour London on Oct. 24th and at Java Days Kiev on Oct. 17th-18th.

This article is an abridged version of a part of the Spring chapter of Integration Testing from the Trenches. Have a look at it; there’s even a free sample chapter!

Integration Testing from the Trenches


Easier Spring version management

July 6th, 2014

Earlier on, Spring migrated from a monolithic approach – the whole framework in a single artifact – to a modular one – beans, context, test, etc. – so that one could decide to use only the required modules. This modularity came at a cost, however: in the Maven build configuration (or the Gradle one, for that matter), one had to specify the version of each module used.



<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    ...
    <dependencies>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-webmvc</artifactId>
            <version>4.0.5.RELEASE</version>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-jdbc</artifactId>
            <version>4.0.5.RELEASE</version>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-test</artifactId>
            <version>4.0.5.RELEASE</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
</project>

Of course, professional Maven users would improve this POM with the following:



<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    ...
    <properties>
        <spring.version>4.0.5.RELEASE</spring.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-webmvc</artifactId>
            <version>${spring.version}</version>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-jdbc</artifactId>
            <version>${spring.version}</version>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-test</artifactId>
            <version>${spring.version}</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
</project>

There has been a more concise way to achieve the same since version 3.2.6, though: a BOM-typed POM (see the documentation section on the import scope).



<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    ...
    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.springframework</groupId>
                <artifactId>spring-framework-bom</artifactId>
                <type>pom</type>
                <version>4.0.5.RELEASE</version>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-webmvc</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-jdbc</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>
</project>

Note that Spring’s BOM only sets versions, not scopes; this has to be done in each user POM.

Spring very recently released the Spring IO Platform, which also includes a BOM. This BOM includes not only Spring dependencies but also third-party libraries.



<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    ...
    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>io.spring.platform</groupId>
                <artifactId>platform-bom</artifactId>
                <type>pom</type>
                <version>1.0.0.RELEASE</version>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-webmvc</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-jdbc</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.testng</groupId>
            <artifactId>testng</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>
</project>

There’s one single problem with the Spring IO Platform’s BOM: there’s no simple mapping from the BOM version to the declared dependency versions. For example, the BOM’s 1.0.0.RELEASE maps to Spring 4.0.5.RELEASE.



The right bean at the right place

June 22nd, 2014

Among the different customers I worked for, I noticed a widespread misunderstanding regarding the use of Spring contexts in Spring MVC.

Basically, you have two kinds of contexts, in a parent-child relationship:

  • The main context is where service beans are hosted. By convention, it is spawned from the /WEB-INF/applicationContext.xml file, but this location can be changed by using the contextConfigLocation context parameter. Alternatively, one can use the AbstractAnnotationConfigDispatcherServletInitializer; in this case, configuration classes should be part of the array returned by the getRootConfigClasses() method.
  • Web context(s) are where Spring MVC dispatcher servlet configuration beans and controllers can be found. Each is spawned from a [servlet-name]-servlet.xml file or, if using the JavaConfig above, comes from the classes returned by the getServletConfigClasses() method.
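For reference, here is what the split could look like with the JavaConfig initializer mentioned above; RootConfig and WebConfig are placeholder names standing for one’s own @Configuration classes:

```java
import org.springframework.web.servlet.support.AbstractAnnotationConfigDispatcherServletInitializer;

// RootConfig and WebConfig are placeholders for your own @Configuration classes
public class WebAppInitializer extends AbstractAnnotationConfigDispatcherServletInitializer {

    @Override
    protected Class<?>[] getRootConfigClasses() {
        // Main (parent) context: services, repositories, data source
        return new Class<?>[] { RootConfig.class };
    }

    @Override
    protected Class<?>[] getServletConfigClasses() {
        // Web (child) context: controllers, view resolvers, etc.
        return new Class<?>[] { WebConfig.class };
    }

    @Override
    protected String[] getServletMappings() {
        return new String[] { "/" };
    }
}
```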

As in every parent-child relationship, there’s a catch:

Beans from the child contexts can access beans from the parent context, but not the opposite.

That makes sense if you picture it: I want my controllers to be injected with services, but not the other way around (it would be a funny idea to inject controllers into services). Besides, you could have multiple Spring dispatcher servlets, each with its own web context, all sharing the same main context as parent. When it goes beyond controllers and services, one should decide in which context a bean belongs. For some, that’s pretty evident: view resolvers, message sources and the like go into the web context; for others, one has to spend some time thinking about it.

A good rule of thumb to decide in which context a bean should go is the following: if you had multiple servlets (even if you do not), what would you want them to share, and what not?

Note: this way of thinking should not be tied to your application itself, as otherwise you’d probably end up sharing message sources in the main application context, which is a (really) bad idea.

This modularization lets you put the right bean at the right place, promoting bean reusability.
