Guava is a heavyweight library and I would like this to change

January 12th, 2014 4 comments

Google Guava is a useful library that offers many different but unrelated features.

However, this article is not about those features but about the decision to offer a single heavyweight Uber JAR for all of them. From Google’s point of view, providing an Uber library for all projects makes sense: “Hey guys, just add this dependency and it will meet your every requirement”. From my point of view, however, it just makes my applications heavier.

Cohesion is at the root of good software development. Many framework providers, such as Spring and JBoss, release nicely cohesive packages and manage dependencies between them through a dependency management tool. What is strange is that most of Guava’s features are not coupled together, so having a single library is not mandatory. Even stranger, Guava was previously released as different JARs, but Google stopped doing that with version r03.

I have found no solution besides creating separate JARs and handling dependencies myself, then storing them in a Maven enterprise repository. Since these tasks would be required for each release, I never found the ROI interesting enough. The easiest way would be for Google to do that at build time.
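If Google ever split the library, consuming only what you need could look like the following pom.xml fragment. The artifact names are pure speculation on my part; no such artifacts exist today:

```xml
<dependencies>
    <!-- Hypothetical fine-grained artifacts instead of the single guava JAR -->
    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava-collections</artifactId>
        <version>16.0</version>
    </dependency>
    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava-io</artifactId>
        <version>16.0</version>
    </dependency>
</dependencies>
```

A BOM-style `dependencyManagement` import could then keep all the fragments at the same version, just as Spring and JBoss do.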

Dear Google engineers, if by chance you happen to stumble upon this article, I’d be very grateful if you’d consider it.

Categories: Java Tags: ,

Solr, as a Spring Data module

December 1st, 2013 No comments

At the end of October, I attended a three-day Solr training. It was very interesting, in light of the Elasticsearch talk I had attended mid-year. As an old Spring Data fan, when I found out Spring Data offered a Solr module, I jumped at the chance to try it.

In this case, I’m well aware that an abstraction layer over Solr doesn’t mean we can easily change the underlying datastore: Solr is designed as an inverted index, while other Spring Data modules are much more specialized (JPA, MongoDB, etc.). However, there are still some advantages of using Spring Data over Solr:

  • Quick prototyping, when you don’t need the whole nine yards
  • Use Solr when most of the team already knows Spring Data but not Solr
  • Spring integration from the ground up
  • Last but not least, you can easily switch between the embedded Solr and the standalone one. Spring Data nicely wraps both the SolrJ API and HTTP behind its own API. If you think this is not a relevant use-case because it will never happen, think about integration testing
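Regarding that last point, swapping the HTTP server for an embedded one is a one-bean change. A sketch, assuming the Spring Data Solr 1.x EmbeddedSolrServerFactory and a Solr home available on the classpath (both are assumptions on my part):

```java
@Bean
public SolrOperations solrTemplate() throws Exception {
    // Boots Solr inside the JVM, e.g. for integration tests,
    // instead of pointing at http://localhost:8983/solr
    EmbeddedSolrServerFactory factory = new EmbeddedSolrServerFactory("classpath:solr");
    return new SolrTemplate(factory);
}
```

The repository code stays identical; only this bean definition differs between the test and production contexts.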

After having initialized Solr with relevant data, here are the steps you have to go through to start developing:

  1. The initial step is to configure the Spring context with the underlying Solr. Spring Data Solr requires a bean named solrTemplate of type SolrOperations. Let’s use JavaConfig for this:
    @Configuration
    @EnableSolrRepositories("ch.frankel.blog.springdata.solr.repository")
    public class JavaConfig {
        @Bean
        protected HttpSolrServerFactoryBean solrServerFactory() {
    
            HttpSolrServerFactoryBean factory = new HttpSolrServerFactoryBean();
    
            factory.setUrl("http://localhost:8983/solr");
    
            return factory;
        }
    
        @Bean
        public SolrOperations solrTemplate() throws Exception {
    
            return new SolrTemplate(solrServerFactory().getObject());
        }
    }
  2. Create the managed entity. All fields stored in Solr have to be annotated with @Field. If the entity attribute name is different from Solr field name, @Field accepts a value to override the name:
    public class Loan {
        @Field
        private String id;
    
        @Field("gov_type")
        private String governmentType;
    
        // Getters and setters
    }
  3. Create the repository interface: either inherit from SolrRepository<Bean,ID> or SolrCrudRepository<Bean,ID>. Note the former only provides the count() method.
    public interface LoanRepository extends SolrRepository<Loan, String> {}
  4. Add query methods to the former interface:
    • Simple query methods benefit from Spring Data parsing: List<Loan> findByLoanType(String) will translate into ?q=loan_type:...
    • More advanced queries (or methods whose names do not follow the pattern) should be annotated with @Query: @Query("state_name:[* TO *]") will find all loans that have a value for state_name. The annotation can bind as many parameters as necessary, through the use of ?x where x is the parameter index in the method’s parameter list.
    • Facets are also handled through the @Facet annotation, which takes the field names the facet should use. This changes the method’s return type, though, to a FacetPage, and it is up to the calling method (probably located in the service layer) to handle that.

The final class should look something like this, which is awesome considering all the capabilities it provides:

public interface LoanRepository extends SolrRepository<Loan, String> {

    List<Loan> findByLoanType(String loan);

    List<Loan> findByTitleContaining(String title);

    @Query("state_name:[* TO *]")
    List<Loan> findLocalized();

    @Query("*:*")
    @Facet(fields = "loan_type")
    FacetPage<Loan> findAllLoanTypes(Pageable page);
}
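As for handling that FacetPage in the service layer, here is a sketch against the Spring Data Solr 1.x API. The page size and the printing are my own choices for illustration; this fragment needs a Spring context providing the repository, so it is not runnable on its own:

```java
// Service-layer sketch: iterate over the facet results returned by the repository
FacetPage<Loan> loans = loanRepository.findAllLoanTypes(new PageRequest(0, 10));

for (Page<FacetFieldEntry> facetPage : loans.getFacetResultPages()) {
    for (FacetFieldEntry entry : facetPage) {
        // entry.getValue() is the facet term, entry.getValueCount() its document count
        System.out.println(entry.getValue() + ": " + entry.getValueCount());
    }
}
```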

The entire project source code is available in IntelliJ/Maven format, complete with tests. Note that you’ll need a running Solr instance with a specific Solr schema and documents in it. Both the schema and the document file (federal.xml) are provided, but applying them is not automated: you’ll need to overwrite the instance’s schema and post the document file to it (with the classic java -jar post.jar command line).

To go further:

Categories: Java Tags: ,

Spreading some JavaFX love

November 24th, 2013 1 comment

I’m not a big fan of JavaFX: version 1 was just a huge failure, and investing in a fat-client architecture in 2013 means either that you have very specific needs or that you are completely out of your mind. Nevertheless, I wanted to write about fat-client testing in “Integration testing from the trenches”, and JavaFX is the Java way for fat-client GUIs.

So I was searching for some easily available application I could take as an example when I stumbled upon the Ensemble samples project from Oracle itself. Sources are available, so I thought this could be a good match. Nothing could be further from the truth; the project is plagued by pitfalls:

  1. It seems the good people at Oracle had to choose between promoting NetBeans and JavaFX and their priority was on the former. Sources are available in NetBeans format. Tough luck if you don’t use it as your IDE…
  2. NetBeans uses Ant as its underlying build tool. Not only are Java sources and binary resources stored in the same directories, but you also have to read carefully through the build file to tweak some things.
  3. The code itself distinguishes between running from sources or from a JAR. In the former case, it uses a text file listing all the desired samples to include, but this file is generated by the Ant build, and if it’s not present (or empty), you get a not-so-nice NullPointerException.
  4. Even worse, though the JavaFX runtime is included in JRE 7, it isn’t on the classpath.

I think that putting such obstacles in the way of developers is hardly the thing to do to promote a technology. In order to fix those problems, I decided to tackle this head-on and created a Maven project. I changed no code; I just moved resources around to fit the standard Maven project structure and provided the required text file.

The whole project is on GitHub, you’re welcome to give it a try!

Categories: Java Tags:

Integrate Spring JavaConfig with legacy configuration

November 9th, 2013 1 comment

The application I’m working on now uses Spring both by parsing XML Spring configuration files in pre-determined locations and by scanning for annotation-based autowiring. I’ve already stated my stance on autowiring previously; this article only concerns itself with how I could use Spring JavaConfig without migrating the whole existing codebase in a single big refactoring.

This is easily achieved by scanning the package where the JavaConfig class is located in the legacy Spring XML configuration file:

<ctx:component-scan base-package="ch.frankel.blog.spring.javaconfig.config" />

However, we may need to inject beans created through the old methods into our new JavaConfig file. In order to achieve this, we need to autowire those beans into JavaConfig:

@Configuration
public class JavaConfig {

    @Autowired
    private FooDao fooDao;

    @Autowired
    private FooService fooService;

    @Autowired
    private BarService barService;

    @Bean
    public AnotherService anotherService() {

        return new AnotherService(fooDao);
    }

    @Bean
    public MyFacade myFacade() {

        return new MyFacade(fooService, barService, anotherService());
    }
}
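For the sake of illustration, the legacy side of the configuration might define those beans in XML along these lines (the class names are hypothetical; only the bean names matter for the autowiring above):

```xml
<bean id="fooDao" class="ch.frankel.blog.spring.javaconfig.FooDao" />

<bean id="fooService" class="ch.frankel.blog.spring.javaconfig.FooService">
    <constructor-arg ref="fooDao" />
</bean>

<bean id="barService" class="ch.frankel.blog.spring.javaconfig.BarService" />
```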

You can find complete sources for this article (including tests) in IDEA/Maven format.

Thanks to Josh Long and Gildas Cuisinier for the pointers on how to achieve this.

Categories: Java Tags: ,

My case against autowiring

November 3rd, 2013 8 comments

Autowiring is a particular kind of wiring, where dependencies are not injected explicitly but managed implicitly by the container. This article tries to provide some relevant information regarding the disadvantages of autowiring. Although Spring is taken as an example, the same reasoning applies to JavaEE’s CDI.

Autowiring basics

Autowiring flavors

Autowiring comes in different flavors:

  • Autowiring by type means Spring will look for available beans that match the type used in the setter. For example, for a bean having a setCar(Car car) method, Spring will look for a bean of type Car (or one of its subtypes) in the bean context.
  • Autowiring by name means Spring’s search strategy is based on bean names. For the previous example, Spring will look for a bean named car regardless of its type. This both requires the bean to be named (no anonymous beans here) and implies that the developer is responsible for any type mismatch (e.g. if the car bean is of type Plane).

Autowiring is also possible for constructor injection.

Autowiring through XML configuration

Let’s first dispel some misconceptions: autowiring does not imply annotations. In fact, autowiring has been available through XML for ages and can be enabled with the following syntax.

<bean autowire="byType" class="some.Clazz" />

This means Spring will try to find a bean of the right type to fill each setter-based dependency.

Alternative autowire parameters include:

  • byName: instead of matching by type, Spring will use bean names
  • constructor: constructor injection, restricted to by-type matching
  • autodetect: uses constructor injection, but falls back to by-type if no adequate constructor is found
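To make the by-type strategy concrete, here is a toy sketch (Car and Garage are made-up classes): with autowire="byType", Spring would call the setter below with whatever single bean in the context is assignable to Car.

```java
class Car {}

public class Garage {

    private Car car;

    // Matched by type: any single bean assignable to Car qualifies.
    // With autowire="byName", Spring would instead look up a bean named "car".
    public void setCar(Car car) {
        this.car = car;
    }

    public Car getCar() {
        return car;
    }
}
```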

Autowiring through annotations configuration

Available annotations are the following:

Annotation                                                Description          Source
org.springframework.beans.factory.annotation.@Autowired   Wires by type        Spring 2.5+
javax.annotation.@Resource                                Wires by name        JavaEE 5+
javax.inject.@Inject                                      Wires by type        JavaEE 6+
javax.inject.@Qualifier                                   Narrows type wiring  JavaEE 6+

Dependency Injection is all about decoupling your classes from one another to make them more testable. Autowiring through annotations strongly couples annotated classes to the annotation libraries. Admittedly, this is worse for Spring’s @Autowired than for the others, as the latter are generally provided by application servers.

Why is it so bad after all?

Autowiring is all about simplifying dependency injection, and although it may seem seductive at first, it’s not maintainable in real-life projects.

Explicit dependency injection means, guess what, explicitly defining a single bean to match the required dependency. Even better, if Java configuration is used, it is validated at compile time.

On the contrary, autowiring is about expressing constraints on all available beans in the Spring context so that one and only one matches the required dependency. In effect, you delegate wiring to Spring, tasking it with finding the right bean for you. This means that by adding beans to your context, you run the risk of providing more than one match if your previous constraints do not apply to the new beans. The larger the project, the larger the team (or even worse, the number of teams), the larger the context, the higher the risk.

For example, imagine you crafted a really nice service with a single implementation, which you need to inject somewhere. After a while, someone notices your service but creates an alternative implementation for some reason. Just declaring this new bean in the context, along with the former, will break container initialization! Worse, it may be done in an entirely different part of the application, with seemingly no link to the former. At this point, good luck analyzing the reason for the bug.
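Sketched in XML (all class names made up), the failure scenario needs nothing more than this. Neither service bean references the other, yet the last declaration alone is enough to make the client’s by-type wiring ambiguous and fail at startup:

```xml
<!-- The bean wired by type on the service's interface -->
<bean autowire="byType" class="some.Client" />

<!-- The original, single implementation -->
<bean id="niceService" class="some.NiceServiceImpl" />

<!-- Declared months later, elsewhere: two beans now match the
     Client setter's type, and container initialization fails -->
<bean id="alternativeNiceService" class="some.AlternativeNiceServiceImpl" />
```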

Note that autowiring by name is even worse, since bean names have a probability of collision, even across different type hierarchies.

Autowiring is bad in Spring, but it is even worse in CDI, where every class on the classpath is a candidate for injection. By the way, would any CDI guru reading this post care to explain why autowiring is the only way of doing DI there? That would be really enlightening.

Summary

In this post, I tried to explain what autowiring is. It can be used all right, but now you should be aware of its cons. IMHO, you should only use it for prototyping or quick-and-dirty proofs of concept, anything that can be discarded after a single use. If really needed, prefer wiring by type over wiring by name, as at least matching doesn’t depend on a string.

Categories: Java Tags: , ,

Spring method injection with Java Configuration

October 27th, 2013 No comments

Last week, I described how a Rich Model Object could be used with Spring using Spring’s method injection from an architectural point of view.

What is missing, however, is how to use method injection with my new preferred method for configuration, JavaConfig. My starting point is the following, using both autowiring (<shudder>) and method injection.

public abstract class TopCaller {

    @Autowired
    private StuffService stuffService;

    public SomeBean newSomeBean() {

        return newSomeBeanBuilder().with(stuffService).build();
    }

    public abstract SomeBeanBuilder newSomeBeanBuilder();
}

Migrating to Java Config requires the following steps:

  1. Updating the caller structure to allow for constructor injection
  2. For method injection, provide an implementation for the abstract method in the configuration class
  3. That’s all…
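Step 1’s updated caller could look like the sketch below. The StuffService, SomeBean and SomeBeanBuilder stubs are minimal assumptions of mine, added only so the snippet is self-contained; only the TopCaller shape matters:

```java
// Minimal stubs (assumptions), just enough to make TopCaller compile
class StuffService {}

class SomeBean {
    final StuffService stuffService;
    SomeBean(StuffService stuffService) { this.stuffService = stuffService; }
}

class SomeBeanBuilder {
    private StuffService stuffService;
    SomeBeanBuilder with(StuffService stuffService) {
        this.stuffService = stuffService;
        return this;
    }
    SomeBean build() { return new SomeBean(stuffService); }
}

// The reworked caller: constructor injection replaces @Autowired,
// while the abstract method is left for the configuration class to implement
public abstract class TopCaller {

    private final StuffService stuffService;

    protected TopCaller(StuffService stuffService) {
        this.stuffService = stuffService;
    }

    public SomeBean newSomeBean() {
        return newSomeBeanBuilder().with(stuffService).build();
    }

    public abstract SomeBeanBuilder newSomeBeanBuilder();
}
```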
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Scope;

@Configuration
public class JavaConfig {

    @Bean
    public TopCaller topCaller() {

        return new TopCaller(stuffService()) {

            @Override
            public SomeBeanBuilder newSomeBeanBuilder() {

                return someBeanBuilder();
            }
        };
    }

    @Bean
    public StuffService stuffService() {

        return new StuffService();
    }

    @Bean
    @Scope(BeanDefinition.SCOPE_PROTOTYPE)
    public SomeBeanBuilder someBeanBuilder() {

        return new SomeBeanBuilder();
    }
}
Categories: Java Tags: , ,

Rich Domain Objects and Spring Dependency Injection are compatible

October 20th, 2013 3 comments

I’m currently working in an environment where most developers are Object-Oriented fanatics. Given that we develop in Java, I think that is a good thing, save for the fanatics part. In particular, I’ve run across a deeply-entrenched meme that states that modeling Rich Domain Objects and using Spring dependency injection at the same time is not possible. Not only is this completely false, it reveals a lack of knowledge of Spring’s features, one I’ll try to correct in this article.

However, my main point is not about Spring but about whatever paradigm one holds most dear, be it Object-Oriented Programming, Functional Programming, Aspect-Oriented Programming or whatever. Those are only meant to give desired properties to software: unit-testability, readability, whatever… So one should always focus on those properties rather than on the way of getting them. Remember:

When the wise man points at the moon, the idiot looks at the finger.

Back to the problem at hand. The basic idea is to have some bean keep a reference to some service in order to do some stuff (notice the real-world notions behind this idea…) so you could call:

someBean.doStuff()

The following is exactly what not to do:

public class TopCaller {

    @Autowired
    private StuffService stuffService;

    public SomeBean newSomeBean() {

        return new SomeBeanBuilder().with(stuffService).build();
    }
}

This design should be anathema to developers promoting unit testing, as the call to new() deeply couples the TopCaller and SomeBeanBuilder classes. There’s no way to stub this dependency in a test; it prevents testing the newSomeBean() method in isolation.

What you need is to inject SomeBeanBuilder prototypes into the TopCaller singleton. This is method injection, and it is possible in Spring with the help of lookup methods (I’ve already blogged about that some time ago and you should probably have a look at it).

public abstract class TopCaller {

    @Autowired
    private StuffService stuffService;

    public SomeBean newSomeBean() {

        return newSomeBeanBuilder().with(stuffService).build();
    }

    public abstract SomeBeanBuilder newSomeBeanBuilder();
}

With the right configuration, Spring will take care of providing a different SomeBeanBuilder each time the newSomeBeanBuilder() method is called. With this new design, we changed the strong coupling between TopCaller and SomeBeanBuilder to a soft coupling, one that can be stubbed in tests and allow for unit-testing.
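For reference, the “right configuration” could be the following XML, using Spring’s lookup-method element (the bean ids are assumptions on my part):

```xml
<bean id="topCaller" class="some.TopCaller">
    <!-- Spring generates a subclass overriding the abstract method
         to return the referenced bean on each call -->
    <lookup-method name="newSomeBeanBuilder" bean="someBeanBuilder" />
</bean>

<!-- prototype scope, so each lookup returns a fresh builder -->
<bean id="someBeanBuilder" class="some.SomeBeanBuilder" scope="prototype" />
```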

In the current design, it seems the only reason for SomeBeanBuilder to exist is to pass the StuffService from the TopCaller to SomeBean instances. There is no need to keep it with method injection.

There are two different possible improvements:

  1. Given our “newfound” knowledge, inject:
    • method newSomeBean() instead of newSomeBeanBuilder() into TopCaller
    • StuffService directly into SomeBean

  2. Keep StuffService as a TopCaller attribute and pass it every time doStuff() is invoked

I would favor the second option, since I frown upon a Domain Object keeping references to singleton services, unless there are many parameters to pass at every call. As always, there is no single best choice, only contextual ones.

Also, I would use explicit dependency management instead of some automagic stuff, but that’s another debate.

I hope this piece proved beyond any doubt that Spring does not prevent a Rich Domain Model, far from it. As a general rule, know the tools you use and go their way instead of following some path just for the sake of it.

getCaller() hack

October 13th, 2013 2 comments

Disclaimer: the following is a devious hack; it should only be used if you know what you’re doing.

As developers, we should only call public APIs. However, the Java language cannot differentiate between a public API and private stuff: as soon as a class and one of its methods are public, we can reference the former and call the latter. Therefore, we are exposed to the Dark Side of the Force, and sometimes tempted to use it.

A good example of this terrible temptation is the sun.reflect.Reflection.getCallerClass(int) method. As its name implies, this evil piece returns the class that called your current code, letting you tailor your code’s behavior depending on the calling class. The Dark Side can be seductive indeed!

This way, we can check whether some caller has some property and accordingly let it invoke our stuff or not. For example, let’s design a piece of code that can be invoked only from “privileged” code. Such privileges could include: being a specific class, coming from a specific package, or being annotated with a specific annotation. The following code uses the third option.

import java.lang.annotation.Retention;
import java.lang.annotation.Target;

import static java.lang.annotation.ElementType.TYPE;
import static java.lang.annotation.RetentionPolicy.RUNTIME;

@Retention(RUNTIME)
@Target(TYPE)
public @interface Privileged {}

Now, checking for this annotation’s existence in the calling code is achieved like this:

import sun.reflect.Reflection;

public class Restricted {

    public Restricted() {

        int i = 2;

        while (true) {

            Class<?> caller = Reflection.getCallerClass(i++);

            if (caller == null) {

                throw new SecurityException();
            }

            if (caller.getAnnotation(Privileged.class) != null) {

                break;
            }
        }
    }
}
  1. The int parameter in the getCallerClass() method refers to the number of frames up the calling stack. Index 0 is Reflection.getCallerClass() itself, while index 1 is our restricted constructor, so the real work should begin at 2.
  2. The algorithm searches up the stack until either the whole stack has been browsed without finding privileged code, or the matching annotation is found. In the former case, throw an exception; in the latter, do the job as expected.
  3. Of course, this algorithm shouldn’t be hard-coded but factored out into an aspect and injected at some determined pointcut (perhaps a @Restricted annotation?)

Ok, but what use-case does this solve? Well, I’ve always thought about the requirements Hibernate (or JPA) puts on your design. One such requirement is to provide a no-arg constructor for the framework to instantiate. After that, it can use setters or inject attributes directly. Unfortunately, this conflicts with my desire to provide immutable objects that take all mandatory parameters in the constructor. A basic solution would be to provide both constructors but document them accordingly (/** Do not use: this one is for Hibernate. */). However, to really enforce this rule, you’d probably need to use getCallerClass() with a check on the org.hibernate package.
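The documented-constructor workaround might look like the following (Account is a made-up example class):

```java
public class Account {

    private String owner;

    /** Do not use: this one is for Hibernate. */
    Account() {}

    // The constructor meant for application code: all mandatory state up front
    public Account(String owner) {
        this.owner = owner;
    }

    public String getOwner() {
        return owner;
    }
}
```

Making the no-arg constructor package-private at least hides it from other packages; only a getCallerClass()-style check could restrict it to Hibernate itself.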

Beware that Oracle intends to remove this method in the next JDK. Again, do not use this at home, but it opens some interesting perspectives, doesn’t it? [...evil laugh...]

Download the above code in IntelliJ/Maven format here.

Categories: Java Tags:

A dive into the Builder pattern

October 6th, 2013 8 comments

The Builder pattern has been described in the Gang of Four “Design Patterns” book:

The builder pattern is a design pattern that allows for the step-by-step creation of complex objects using the correct sequence of actions. The construction is controlled by a director object that only needs to know the type of object it is to create.


A common implementation of the Builder pattern uses a fluent interface, with the following caller code:

Person person = new PersonBuilder().withFirstName("John").withLastName("Doe").withTitle(Title.MR).build();

This code snippet can be enabled by the following builder:

public class PersonBuilder {

    private Person person = new Person();

    public PersonBuilder withFirstName(String firstName) {

        person.setFirstName(firstName);

        return this;
    }

    // Other methods along the same model
    // ...

    public Person build() {

        return person;
    }
}

The job of the Builder is achieved: the Person instance is well-encapsulated and only the build() method finally returns the built instance. This is usually where most articles stop, pretending to have covered the subject. Unfortunately, some cases may arise that need deeper work.

Let’s say we need some validation handling for the final Person instance, e.g. the lastName attribute is mandatory. To provide this, we could easily check whether the attribute is null in the build() method and throw an exception accordingly.

public Person build() {

    if (person.getLastName() == null) {

        throw new IllegalStateException("Last name cannot be null");
    }

    return person;
}

Sure, this resolves our problem. Unfortunately, the check happens only at runtime, as developers calling our code will find out (much to their chagrin). To go all the way to a true DSL, we have to update our design, a lot. We should enforce the following caller code:

Person person1 = new PersonBuilder().withFirstName("John").withLastName("Doe").withTitle(Title.MR).build(); // OK

Person person2 = new PersonBuilder().withFirstName("John").withTitle(Title.MR).build(); // Doesn't compile

We have to update our builder so that its methods return either a valid builder, which provides the build() method, or an invalid one, which lacks it, as in the following diagram. Note the initial PersonBuilder class is kept as the entry point, so the calling code doesn’t have to cope with Valid-/InvalidPersonBuilder if it doesn’t want to.


This may translate into the following code:

public class PersonBuilder {

    private Person person = new Person();

    public InvalidPersonBuilder withFirstName(String firstName) {

        person.setFirstName(firstName);

        return new InvalidPersonBuilder(person);
    }

    public ValidPersonBuilder withLastName(String lastName) {

        person.setLastName(lastName);

        return new ValidPersonBuilder(person);
    }

    // Other methods, but NO build() methods
}

public class InvalidPersonBuilder {

    private Person person;

    public InvalidPersonBuilder(Person person) {

        this.person = person;
    }

    public InvalidPersonBuilder withFirstName(String firstName) {

        person.setFirstName(firstName);

        return this;
    }

    public ValidPersonBuilder withLastName(String lastName) {

        person.setLastName(lastName);

        return new ValidPersonBuilder(person);
    }

    // Other methods, but NO build() methods
}

public class ValidPersonBuilder {

    private Person person;

    public ValidPersonBuilder(Person person) {

        this.person = person;
    }

    public ValidPersonBuilder withFirstName(String firstName) {

        person.setFirstName(firstName);

        return this;
    }

    // Other methods

    // Look, ma! I can build
    public Person build() {

        return person;
    }
}

This is a huge improvement, as now developers can know at compile time whether their built object is valid.

The next step is to imagine more complex use-cases:

  1. Builder methods have to be called in a certain order. For example, a house should have foundations, a frame and a roof. Building the frame requires having built foundations, as building the roof requires the frame.
  2. Even more complex, some steps are dependent on previous steps (e.g. having a flat roof is only possible with a concrete frame)

The exercise is left to interested readers. Links to proposed implementations are welcome in the comments.
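As a starting point for the first use-case, here is one possible sketch of mine (all names hypothetical): each step returns a type exposing only the next legal step, so calling withRoof() before withFrame() simply doesn’t compile.

```java
public class HouseBuilder {

    // Entry point: foundations come first, and only a FrameStep comes out
    public static FrameStep withFoundations(String foundations) {
        return new FrameStep(foundations);
    }

    public static class FrameStep {
        private final String foundations;
        FrameStep(String foundations) { this.foundations = foundations; }
        public RoofStep withFrame(String frame) {
            return new RoofStep(foundations, frame);
        }
    }

    public static class RoofStep {
        private final String foundations;
        private final String frame;
        RoofStep(String foundations, String frame) {
            this.foundations = foundations;
            this.frame = frame;
        }
        // The roof is the last step, so it returns the finished House
        public House withRoof(String roof) {
            return new House(foundations, frame, roof);
        }
    }

    public static class House {
        private final String description;
        House(String foundations, String frame, String roof) {
            this.description = foundations + "/" + frame + "/" + roof;
        }
        public String describe() { return description; }
    }
}
```

The second use-case (a flat roof only on a concrete frame) would push this further, e.g. by having withFrame() variants return different step types.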

There’s one flaw with our design: just calling the withLastName() method is enough to qualify our builder as valid, so passing null defeats our design’s purpose, and checking for null values at runtime wouldn’t fit our compile-time strategy. The Scala language’s features enable an enhancement to this design called the type-safe builder pattern.

Summary

  1. In real-life software, the Builder pattern is not as easy to implement as the quick examples found here and there suggest
  2. Less is more: creating an easy-to-use DSL is (very) hard
  3. Scala makes complex builder implementations easier on their designers than Java does
Categories: Development Tags: , ,

Learning Vaadin 7 is out!

September 29th, 2013 No comments

Hey everyone! No code nor rant in this week’s article, just very good news: Learning Vaadin 7 has been published!

This new edition of Learning Vaadin describes of course what is new in Vaadin 7, as well as the changes from v6 to v7, but also introduces some great additional stuff:

  • Added a section about JavaEE integration: Servlet API, CDI and JPA
  • Added a section about the new connector architecture: reworked GWT wrapping and added JavaScript wrapping
  • Moved and updated the stuff that went from add-ons to the core platform, the best example being Server Push
  • Added a section on the XML-based GUI feature provided by the Clara add-on – IMHO, this should be integrated into the core sooner or later
  • Upgraded versions of GateIn (for portals) and GlassFish (for OSGi)
  • Replaced the Cloud Foundry PaaS with the Jelastic PaaS, as the latter is part of the Vaadin offering
  • After having evaluated some Spring add-ons, completely dropped the Spring section, as I found none of them satisfying. Spring integration is on the roadmap, however, so stay tuned on morevaadin.com; I’ll probably want to play with it when it’s launched
  • Updated Maven integration – it’s not as easy as it used to be
  • And much much more!

You can get Learning Vaadin 7 through a variety of channels, including Packt, Amazon, Barnes & Noble or Safari Books Online.

Categories: Java Tags: ,