Archive

Posts Tagged ‘maven’

Polyglot everywhere – part 1

April 12th, 2015

This is the era of polyglot! Proponents of this practice spread the word that you have to choose the language best adapted to the problem at hand. And with a single team dedicated to a microservice, this might make sense.

My pragmatic side tells me it means that developers get to choose the language they are developing with and don’t care how it will be maintained when they go away… On the other hand, my shiny-loving side just wants to try – albeit in a more controlled environment, such as this blog!

Introduction

In this three-part series, I’ll try to go polyglot on a project:

  • The first part is about the build system
  • The second part will be about the server side
  • The final part will be about the client-side

My example will use a Vaadin project built with Maven and using a simple client-side extension. You can follow the project on Github.

Polyglot Maven

Though it may have gone largely unnoticed, Maven can talk many different languages since version 3.3.1, thanks to an improved extension mechanism. In the end, the system is quite easy:

  • Create a .mvn folder at the root of your project
  • Create an extensions.xml file inside it
  • Set the type of language you’d like to use:

    <extensions>
        <extension>
            <groupId>io.takari.polyglot</groupId>
            <artifactId>polyglot-yaml</artifactId>
            <version>0.1.8</version>
        </extension>
    </extensions>

    Here, I set the build “language” as YAML.

In the end, the translation from XML to YAML is very straightforward:

modelVersion: 4.0.0
groupId: ch.frankel.blog.polyglot
artifactId: polyglot-example
packaging: war
version: 1.0.0-SNAPSHOT

dependencies:
    - { groupId: com.vaadin, artifactId: vaadin-spring, version: 1.0.0.beta2 }

build:
    plugins:
        - artifactId: maven-compiler-plugin
          version: 3.1
          configuration:
            source: 1.8
            target: 1.8
        - artifactId: maven-war-plugin
          version: 2.2
          configuration:
            failOnMissingWebXml: false

The only problem I had was with the YAML syntax itself: just make sure to align the elements of the plugin with the plugin declaration (e.g. align version with artifactId).
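To make the pitfall concrete, here is a sketch of the misalignment – not taken from the actual project – where the keys of a list item must line up with the first key after the dash, otherwise the YAML fails to parse:

build:
    plugins:
        - artifactId: maven-compiler-plugin
            version: 3.1      # wrong: indented past artifactId, the file no longer parses
        - artifactId: maven-war-plugin
          version: 2.2        # right: aligned with artifactId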

Remember to check the POM on Github with each new part of the series!


Better developer-to-developer collaboration with Bintray

March 29th, 2015

I recently got interested in Spring Social, and as part of my learning path, I tried to integrate their GitHub module, which is still in incubator mode. Unfortunately, this module seems to have been left behind, and it depends on an old version of the core module. Since I use the latest version of the core, Maven has to resolve a single version to put into the WEB-INF/lib folder of the WAR package – and the result doesn’t work so well at runtime.

The following diagram shows this situation:

Dependencies original situation

 

I could have excluded the old version from the transitive dependencies, but I’m lazy and Maven doesn’t make it easy (yet). Instead, I decided to just upgrade the Github module to the latest version and install it in my local repository. That proved to be quite easy as there was no incompatibility with the newest version of the core – I even created a pull request. This is the updated situation:

Dependencies final situation

Unfortunately, if I now decide to distribute this version of my application, nobody will be able to either build or run it, since only I have the “patched” (latest) version of the GitHub module available in my local repo. I could distribute the updated sources along with it, but that would mean you would have to build and install them into your local repo before using my app.

Bintray to the rescue! Bintray is a binary repository, able to host any kind of binaries: jars, wars, debs, anything. It is hosted online, and free for open-source projects, which nicely suits my use case. Here is how I uploaded my artifact to Bintray.

Create an account
Bintray makes it quite easy to create such an account, using one of the available authentication providers – GitHub, Twitter or Google+. Alternatively, one can create an old-style account with a password.
Create an artifact
Once authenticated, an artifact needs to be created. Select your default Maven repository; it can be found at https://bintray.com//maven. Then click on the big Add New Package button located on the right border. On the page that opens, fill in the required information. The package can be named whatever you want; I chose to use the Maven artifact identifier: spring-social-github.
Create a version
Files can only be added to a version, so a version needs to be created first. On the package detail page, click on the New Version link (second column, first line). On the page that opens, fill in the version name. Note that snapshots are not accepted, and this is only checked through the -SNAPSHOT suffix. I chose to use 1.0.0.BUILD.
Upload files
Once the version is created, files can finally be uploaded. In the top bar, click the Upload Files button. Drag and drop all desired files – the main JAR and the POM of course, but this can also include source and javadoc JARs. Notice the Target Repository Path field: it should be set to the logical path of the Maven artifact, with groupId, artifactId and version separated by slashes. For example, my use case resolves to org/springframework/social/spring-social-github/1.0.0.BUILD. Note that instead of filling this field, you can wait for the files to be uploaded: Bintray will detect the upload, analyze the POM and propose to set it automatically – if this fits, and it probably does, just accept the proposal.
Publish
Uploading files is not enough, as those files are temporary until publication. A big notice warns about it: just click on the Publish link located on the right border.

At this point, you only need to add the Bintray repository to the POM.

<repositories>
    <repository>
        <id>bintray</id>
        <url>http://dl.bintray.com/nfrankel/maven</url>
        <releases>
            <enabled>true</enabled>
        </releases>
    </repository>
</repositories>
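Once the repository is declared, the patched module can be consumed like any other dependency – a minimal sketch using the coordinates from above:

<dependency>
    <groupId>org.springframework.social</groupId>
    <artifactId>spring-social-github</artifactId>
    <version>1.0.0.BUILD</version>
</dependency>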

Spring profiles or Maven profiles?

January 4th, 2015

Deploying on different environments requires configuration, e.g. database URL(s) must be set on each dedicated environment. In most – if not all – Java applications, this is achieved through a .properties file, loaded through the appropriately-named Properties class. During development, there’s no reason not to use the same configuration system, e.g. to use an embedded h2 database instead of the production one.

Unfortunately, Java EE applications generally fall outside this usage, as the good practice on deployed environments (i.e. all environments save the local developer machine) is to use a JNDI datasource instead of a local connection. Even Tomcat and Jetty – which implement only a fraction of the Java EE Web Profile – provide this nifty and useful feature.

As an example, let’s take the Spring framework. In this case, two datasource configuration fragments have to be defined:

  • For deployed environment, one that specifies the JNDI location lookup
  • For local development (and test), one that configures a connection pool around a direct database connection

A simple properties file cannot manage this kind of switch; one has to use the build system. Shameless self-promotion: a detailed explanation of this setup for integration testing purposes can be found in my book, Integration Testing from the Trenches.

With the Maven build system, switching between configurations is achieved through so-called profiles at build time. Roughly, a Maven profile is a portion of a POM that can be enabled (or not). For example, the following profile snippet replaces Maven’s standard resource directory with a dedicated one.

<profiles>
    <profile>
        <id>dev</id>
        <build>
            <resources>
                <resource>
                    <directory>profile/dev</directory>
                    <includes>
                        <include>**/*</include>
                    </includes>
                </resource>
            </resources>
        </build>
    </profile>
</profiles>

Activating one or more profiles is as easy as using the -P switch with their ids on the command line when invoking Maven. The following command will activate the dev profile (provided it is defined in the POM):

mvn package -Pdev

Now, let’s add a simple requirement: as I’m quite lazy, I want to exert the minimum effort possible to package the application along with its final production release configuration. This translates into making the production configuration, i.e. the JNDI fragment, the default one, and using the development fragment explicitly when necessary. Seasoned Maven users know how to implement that: create a production profile and configure it to be the default.


  dev
  
    true
  
  ...

Icing on the cake, profiles can even be set in Maven settings.xml files. Seems too good to be true? Well, very seasoned Maven users know that as soon as a single profile is explicitly activated, the default profile is deactivated. Previous experiences have taught me that because profiles are so easy to implement, they get used (and overused), so the default one easily gets lost in the process. For example, at one such job, a profile was used on the Continuous Integration server to set some release properties in a dedicated settings file. In order to keep the right configuration, one has to a) know about the sneaky profile, b) know it will break the default profile, and c) explicitly set the not-default-anymore profile.

Additional details about the dangers of Maven profiles for building artifacts can be found in this article.

Another drawback of this global approach is the tendency toward over-fragmentation of the configuration files. I prefer to have coarse-grained configuration files, each dedicated to a layer or a use case. For example, I’d like to declare at least the datasource, the transaction manager and the entity manager in the same file, possibly along with the different repositories.

Enter Spring profiles. As opposed to Maven profiles, Spring profiles are activated at runtime. I’m not sure whether this is a good or a bad thing, but the implementation makes real default configurations possible, with the help of @Conditional annotations (see my previous article for more details). That way, the wrapper-around-the-connection bean gets created when the dev profile is activated, and the JNDI lookup bean when it is not. This kind of configuration is implemented in the following snippet:

import javax.sql.DataSource;

import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;
import org.springframework.jdbc.datasource.lookup.JndiDataSourceLookup;

@Configuration
public class MyConfiguration {

    // Local connection pool around an embedded h2 database, created only when the dev profile is active
    @Bean
    @Profile("dev")
    public DataSource dataSource() throws Exception {
        org.apache.tomcat.jdbc.pool.DataSource dataSource = new org.apache.tomcat.jdbc.pool.DataSource();
        dataSource.setDriverClassName("org.h2.Driver");
        dataSource.setUrl("jdbc:h2:file:~/conditional");
        dataSource.setUsername("sa");
        return dataSource;
    }

    // JNDI lookup for deployed environments, created only when no other DataSource bean exists
    @Bean
    @ConditionalOnMissingBean(DataSource.class)
    public DataSource fakeDataSource() {
        JndiDataSourceLookup dataSourceLookup = new JndiDataSourceLookup();
        return dataSourceLookup.getDataSource("java:comp/env/jdbc/conditional");
    }
}

In this context, profiles are just a way to activate specific beans; the real magic is achieved through the different @Conditional annotations.
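For completeness – this is standard Spring behavior rather than anything specific to the project above – the active profile itself is selected at runtime through the spring.profiles.active property. In a WAR deployment, one common way is a context parameter in web.xml:

<context-param>
    <param-name>spring.profiles.active</param-name>
    <param-value>dev</param-value>
</context-param>

The same property can also be passed as a JVM system property (-Dspring.profiles.active=dev) or as an environment variable.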

Note: it is advisable to create a dedicated annotation to avoid String typos, be more refactoring-friendly and improve search capabilities on the code.

@Retention(RUNTIME)
@Target({TYPE, METHOD})
@Profile("dev")
public @interface Development {}
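The custom annotation can then replace the string-based @Profile on the bean definition – a minimal sketch reusing the dev datasource from the snippet above:

@Bean
@Development // equivalent to @Profile("dev"), but typo-proof and easier to search for
public DataSource dataSource() throws Exception {
    org.apache.tomcat.jdbc.pool.DataSource dataSource = new org.apache.tomcat.jdbc.pool.DataSource();
    dataSource.setDriverClassName("org.h2.Driver");
    dataSource.setUrl("jdbc:h2:file:~/conditional");
    dataSource.setUsername("sa");
    return dataSource;
}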

Now, this approach has some drawbacks as well. The most obvious problem is that the final archive will contain extra libraries, those used exclusively for development. This is readily apparent when one uses Spring Boot. One such extra library is the h2 database, a whopping 1.7 MB jar file. There are two main counterarguments to this:

  • First, if you’re concerned about a couple of additional Mb, then your main issue is probably not on the software side, but on the disk management side. Perhaps a virtual layer such as VMWare or Xen could help?
  • Then, if need be, you can still configure the build system to slim down the produced artifact.

The second drawback of Spring profiles is that, along with extra libraries, the development configuration will be packaged into the final artifact as well. To be honest, when I first stumbled upon this approach, it was a no-go. Then, as usual, I thought more and more about it and came to the following conclusion: there’s nothing wrong with that. Packaging the development configuration has no consequence whatsoever, whether it is set through XML or JavaConfig. Think about it: once an archive has been created, it is considered sealed, even when the application server explodes it for deployment purposes. It is considered very bad practice to touch the exploded archive in any case. So what would be the reason not to package the development configuration along? The only reason I can think of is: to be clean, from a theoretical point of view. Being a pragmatist, I think the advantages of using Spring profiles far outweigh this drawback.

In my current project, I created a single configuration fragment with all beans that depend on the environment: the datasource and the Spring Security authentication provider. For the latter, the production configuration uses an internal LDAP, while the development bean provides an in-memory provider.

So on one hand, we’ve got Maven profiles, which have definite issues but which we are familiar with; on the other hand, we’ve got Spring profiles, which are brand new and go against our habits but get the job done. I suggest giving them a try: I did, and so far I’m happy with them.


Easier Spring version management

July 6th, 2014

Some time ago, Spring migrated from a monolithic approach – the whole framework in a single artifact – to a modular one – beans, context, test, etc. – so that one could decide to use only the required modules. This modularity came at a cost, however: in the Maven build configuration (or the Gradle one for that matter), one had to specify the version of each module used.

<project>
    ...
    <dependencies>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-webmvc</artifactId>
            <version>4.0.5.RELEASE</version>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-jdbc</artifactId>
            <version>4.0.5.RELEASE</version>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-test</artifactId>
            <version>4.0.5.RELEASE</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
</project>

Of course, professional Maven users would improve this POM with the following:

<project>
    ...
    <properties>
        <spring.version>4.0.5.RELEASE</spring.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-webmvc</artifactId>
            <version>${spring.version}</version>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-jdbc</artifactId>
            <version>${spring.version}</version>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-test</artifactId>
            <version>${spring.version}</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
</project>

There’s an more concise way to achieve the same through a BOM-typed POM (see section on scope import) since version 3.2.6 though.



    ...
    
        
            
                org.springframework
                spring-framework-bom
                pom
                4.0.5.RELEASE
                import
            
        
    
    
        
            org.springframework
            spring-webmvc
        
        
            org.springframework
            spring-jdbc
        
        
            org.springframework
            spring-test
            test
        
    

Note that Spring’s BOM only sets version but not scope, this has to be done in each user POM.

Spring very recently released the Spring IO Platform, which also includes a BOM. This BOM not only includes Spring dependencies but also third-party libraries.

<project>
    ...
    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>io.spring.platform</groupId>
                <artifactId>platform-bom</artifactId>
                <type>pom</type>
                <version>1.0.0.RELEASE</version>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-webmvc</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-jdbc</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.testng</groupId>
            <artifactId>testng</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>
</project>

There’s one single problem with Spring IO platform’s BOM, there’s no simple mapping from the BOM version to declared dependencies versions. For example, the BOM’s 1.0.0.RELEASE maps to Spring 4.0.5.RELEASE.



Scala on Android and stuff: lessons learned

June 1st, 2014

I’ve been playing role-playing games since I was eleven, and my posse and I still play once or twice a year. Recently, they decided to play Earthdawn again, a game we hadn’t played in more than 15 years! That triggered my desire to create an application to roll all those strangely-shaped dice. And to combine the useful with the pleasant, I decided to use technologies I’m not really familiar with: the Scala language, the Android platform and the Gradle build system.

The first step was to design a simple and generic die-rolling API in Scala, and that was the subject of one of my former articles. The second step was to build upon this API to construct something more specific to Earthdawn and design the GUI. Here’s the write-up of my musings during this development.

Here’s the a general component overview:

Component overview

Reworking the basics

After much internal debate, I finally changed the return type of the roll method from (Rollable[T], T) to simply T, following a relevant comment on Reddit. I concluded that it’s up to the caller to keep hold of the die itself and return it if needed. That’s what I did in the Earthdawn domain layer.

Scala specs

Using Scala meant I also dived into the Scala Specs 2 framework. Specs 2 offers a Behavior-Driven way of writing tests, as well as integration with JUnit through runners and with Mockito through traits. Test instances can be initialized separately in a dedicated class, isolated from all others.

My biggest challenge was to configure the Maven Surefire plugin to execute Specs 2 tests:

<plugin>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.17</version>
    <configuration>
        <argLine>-Dspecs2.console</argLine>
        <includes>
            <include>**/*Spec.java</include>
        </includes>
    </configuration>
</plugin>

Scala + Android = Scaloid

Though perhaps surprising at first, running Scala on Android is feasible enough. Normal Java is compiled to bytecode and then converted to Dalvik-compatible Dex files. Since Scala is also compiled to bytecode, the same process can be applied. Not only can Scala easily be ported to Android, frameworks to do so are available online: the one that seemed the most mature was Scaloid.

Scaloid’s most important feature is to eschew the traditional declarative XML-based layout in favor of a Scala-based Domain-Specific Language, with the help of implicits:

val layout = new SLinearLayout {
  SButton("Click").<<.Weight(1.0f).>>
}

Scaloid also offers:

  • Lifecycle management
  • Implicit conversions
  • Trait-based hierarchy
  • Improved getters and setters
  • etc.

If you want to do Scala on Android, this is the project you have to look at!

Some more bitching about Gradle

I’m not known to be a big fan of Gradle – to say the least. The bigest reason however, is not because I think Gradle is bad but because using a tool based on its hype level is the worst reason I can think of.

I used Gradle for the API project, and I must admit it is more concise than Maven. For example, instead of adding the whole maven-scala-plugin XML snippet, it’s enough to tell Gradle to use the Scala plugin with:

apply plugin: 'scala'

However, the biggest advantage is that Gradle keeps the state of the project, so unnecessary tasks are not executed. For example, if the code didn’t change, compilation will not run again.

Now to the things that are – to put in politically correct way, less than optimal:

  • First, interestingly enough, Gradle does not output test results in the console. For a Maven user, this is somewhat unsettling. But even without this flaw, I’m afraid any potential user is interested in the test output. Yet, this can be remedied with some Groovy code:
    test {
        onOutput { descriptor, event ->
            logger.lifecycle("Test: " + descriptor + ": " + event.message )
        }
    }
  • Then, as I wanted to install the resulting package into my local Maven repository, I had to add the Maven plugin: easy enough… This also required Maven coordinates – quite expected. But why am I allowed to install without executing the test phase? This is not only disturbing; I cannot accept any rationale that allows installing without testing.
  • Neither of the previous points proved to be a showstopper, however. For the Scala-Android project, you might think I just needed to apply both the scala and android plugins and be done with it. Well, this is wrong! It seems that despite Android being showcased as the use case for Gradle, the scala and android plugins are not compatible. This was the end of my Gradle adventure, probably for the foreseeable future.

Testing with ease and Genymotion

The build process – i.e. transforming every class file into Dex, packaging them into an .apk and signing it – takes ages. It would be even worse when using the emulator from Google’s Android SDK. But rejoice: Genymotion is an advanced Android emulator that is not only very fast but also easy as pie to use.

Instead of doing an adb install, installing an APK on Genymotion can be achieved by just drag’n’dropping it onto the emulated device. Even better, it doesn’t require uninstalling the old version first, and it launches the application directly. Easy as pie, I told you!
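For comparison, the traditional route – with app-debug.apk as a hypothetical file name – would be:

# -r replaces an already-installed version instead of failing on reinstall
adb install -r app-debug.apk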

Conclusion

I now have a working Android application, complete with tests and a repeatable build. It’s not much, but it gets the job done, and it taught me some new stuff I didn’t know before. I can only encourage you to do the same: pick a small application and develop it with languages/tools/platforms you don’t use in your day-to-day job. In the end, you will have learned stuff and have a working application. Doesn’t it make you feel warm inside?

 


Stop the f… about Gradle

July 21st, 2013

Stop the f… about #Spring & #Hibernate migrating to #Gradle. Repeat after me: “my project do NOT have the same requirements” #Maven

This was my week’s hate tweet, and I take full responsibility for every character in it. While that may seem like a troll, Twitter is not really the place to have a good-natured debate with factual arguments, so here is the follow up.

Before going into full-blown rhetoric mode, let me first say that despite popular belief, I’m open to shiny new things. For example, despite being a Vaadin believer – a stateful server-side technology – I’m also interested in AngularJS – its exact opposite. I’m also in favor of TestNG over JUnit, and so on. I even went as far as attending a Gradle session at Devoxx France! So please, hear my arguments out, and only then think them over.

So far, I’ve heard only two arguments in favor of Gradle:

  1. It’s flexible (implying Maven is not)
  2. Spring and Hibernate use it

Both are facts, let’s go in detail over each of them to check why those are not arguments.

Gradle is flexible

There’s no denying that Gradle is flexible: I mean, it’s Groovy with a build DSL. Let us go further: how is flexibility achieved? My take is that it comes from the following.

  • Providing very fine-grained – almost atomic operations: compilation, copying, moving, packaging, etc.
  • Allowing lists of those operations to be defined as tasks: packaging a JAR would mean copying class and resource files into a dedicated folder and zipping it
  • Enabling dependencies between those: packaging is dependent on compilation

If you look at it closely, what I said can be applied to Gradle, of course, but also to Ant! Yes, both operate at the same level of granularity.

Now, the problem lies in that Gradle proponents claim Gradle is flexible as if flexibility were a desirable quality. Let me say it: flexibility is not a quality for a build tool. I would even say it is a big disadvantage. It’s the same as having your broken arm put in a cast. That’s a definite lack of flexibility (to say the least), but it’s for your own good. The cast prevents you from moving in a painful way, just as inflexible build tools (such as Maven) make strange things definitely expensive.

If you need to do something odd over and over because of your specific context, it’s because of a recurring requirement. In Maven, you’d create a plugin to address it and be done with it. If it’s a one-shot requirement, it’s probably not a requirement but a quirk. Rethink the way you do it; it’s a smell that something is definitely fishy.

Spring and Hibernate both use Gradle

That one really makes me laugh: because some frameworks chose a build tool, we should just use theirs, no questions asked? Did you really check why they migrated in the first place?

I won’t even look at the Hibernate case, because it annoys me to no end to read arguments such as “I personally hate…” or “…define the build and directories the way that seemed to make sense to me”. That’s no good reason to change a build tool (but a real display of inflated ego in the latter case).

For Spring, well… Groovy is just in their strategic path and has been for years. SpringSource Tool Suite fully supports Groovy, so I guess using Gradle is a way to spread Groovy love all over the world. A more technical reason I’ve heard is that Spring must be compatible with different Java versions and Maven cannot address that. I’m too lazy to check for myself, but even if that’s true, it has only a slight chance of applying to your current project.

Gradlew is one cool feature

The only feature I know of that I currently lack – and would definitely love to have – is to pin the build engine version once and for all, to be able to run my build 10 years from now if I need to. It’s amazing how many software products reach their first maintenance cycle after years and are at a loss building from sources. Believe me, it happened to me (in a VB environment), but I had the fortune of having a genius at hand, something I unfortunately cannot always count on.

In Gradle parlance, this feature is achieved through something known as the Gradle Wrapper. Using it downloads the build engine itself, so you can put it into your version control. Too bad nobody ever raised this as an argument :-) though this is not enough to make me want to migrate.
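For the record – a minimal sketch, not from any project discussed here – pinning the version means declaring a wrapper task, generating the scripts once and committing them:

// build.gradle: the version below is a hypothetical pinned version
task wrapper(type: Wrapper) {
    gradleVersion = '1.7'
}

Running gradle wrapper then produces gradlew, gradlew.bat and the wrapper jar; from that point on, everybody builds with ./gradlew build and gets the exact same Gradle version.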

Note: while writing this, I searched for a port of this feature to Maven and found maven-wrapper. Any feedback?

Wrap-up

TL;DR:

  • Gradle is just Ant with Groovy instead of XML
  • Your context is (probably) different from those of frameworks which are using Gradle

Both points lead to only one logical conclusion: there’s no real reason to use Gradle. Stop the f… about it and go back to developing the next Google Search; there’s much more value in that!


Maven between different environments

July 7th, 2013

As a consultant, I find myself in different environments in need of different configurations. One such configuration is the Maven settings file. This file is very important, for it governs such things as servers, mirrors and proxies. When you have a laptop, switching from customer configuration to home configuration and vice versa every time you change place quickly becomes a bore. When you have to handle more than one customer, it escalates into a nightmarish and tangled configuration mess.

In a former environment, colleagues handled Eclipse ini file switching, a very similar concern, by having a dedicated .bat overwrite the reference file. I heard a colleague of mine did exactly the same for the Maven settings file. It does the job, but it is not portable, it is more than slightly intrusive, and it has something I cannot quite put my finger on that does not “fit”.

As IT people are, I’m lazy but idealist, so I scratched my head to handle this problem in a way I would deem more elegant. I think I may have found one, through Maven command native CLI. If you run mvn --help, you’ll get plenty of CLI options: go to the -s section.

 -s,--settings <arg>           Alternate path for the user
                               settings file

This means Maven lets you use settings files other than ~/.m2/settings.xml. So you can create settings-cust.xml, set all the configuration needed for a given customer and run mvn -s ~/.m2/settings-cust.xml.
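As an illustration, such a customer-specific settings file could look like the following – a minimal sketch with a hypothetical proxy host, not an actual configuration:

<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
    <proxies>
        <proxy>
            <id>customer-proxy</id>
            <active>true</active>
            <protocol>http</protocol>
            <host>proxy.customer.example</host>
            <port>8080</port>
        </proxy>
    </proxies>
</settings>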

And since I’m really lazy, I just added the following snippet in my ~/.bash_profile to make my life even easier:

alias mvncust='mvn -s ~/.m2/settings-cust.xml'

Now, I just need to run mvncust to run Maven with all relevant configuration for this environment. And it is compatible with other options!

The only drawback I’ve found so far is that I have to explicitly set the settings file in Eclipse’s m2e, but that doesn’t bother me much since I have a dedicated Eclipse instance (for configuration and plugins) for each of my environments.


Re-use your test classes across different projects

August 25th, 2012

Sometimes, you need to reuse your test classes across different projects. Here are two use cases that I know of:

  • Utility classes that create relevant domain objects used in different modules
  • Database test classes (and resources) that need to be run in the persistence project as well as the integration test project

Since I’ve seen more than my share of misuses, this article aim to provide an elegant solution once and for all.

Creating the test artifact

First, we have to use Maven: I know not everyone is a Maven fanboy, but it gets the work done – and in our case, it does so easily. Then, we configure the JAR plugin to attach tests. This will compile test classes, copy test resources and package them into an attached test artifact.

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <version>2.2</version>
      <executions>
        <execution>
          <goals>
            <goal>test-jar</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>

The test artifact is stored side by side with the main artifact once deployed in the repository. Note that the configured test-jar goal is bound by default to the package phase, so the test JAR is installed and deployed along with the main artifact.

Using the test artifact

The newly created test artifact can be declared as a dependency of a project with the following snippet:

<dependency>
  <groupId>ch.frankel.blog.foo</groupId>
  <artifactId>foo</artifactId>
  <version>1.0.0</version>
  <type>test-jar</type>
  <scope>test</scope>
</dependency>

The type has to be test-jar instead of simply jar in order for Maven to pick the attached artifact and not the main one. Also note that although you could configure the dependency with a classifier instead of a type, the current documentation warns about possible bugs and favors the type configuration.
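For reference, the classifier-based alternative mentioned above would look like this – shown only for comparison, since the type-based declaration is the recommended one:

<dependency>
  <groupId>ch.frankel.blog.foo</groupId>
  <artifactId>foo</artifactId>
  <version>1.0.0</version>
  <classifier>tests</classifier>
  <scope>test</scope>
</dependency>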



Empower your CSS in your Maven build

July 8th, 2012

People who know me also know I have an interest in GUIs: that means I sometimes have to get my hands dirty and dig deep into stylesheets (even though I have no graphical skill whatsoever). When that happens, I always have questions about how to best factorize styles. In this regard, the different CSS versions are lacking, because they were not meant to be managed by engineers. A recent trend is to generate CSS from a source, which brings interesting properties such as nesting, variables, mixins, inheritance and others. Two examples of this trend are LESS and SASS.

A not-so-quick example can be found just below.

Given that I’m an engineer, the only requirement I have regarding those technologies is that I can generate the final CSS during my build in an automated and reproductible way. After a quick search, I became convinced to use wro4j. Here are the reasons why, in a simple use-case.

Maven plugin

Wro4j includes a Maven plugin. In order to call it in the build, just add it to your POM:

<plugin>
	<groupId>ro.isdc.wro4j</groupId>
	<artifactId>wro4j-maven-plugin</artifactId>
	<version>1.4.6</version>
	<executions>
		<execution>
			<goals>
				<goal>run</goal>
			</goals>
			<phase>generate-resources</phase>
		</execution>
	</executions>
</plugin>

You’re allergic to Maven? No problem, a command-line tool is also provided, free of charge.

Build time generation

For obvious performance reasons, I favor build-time generation over runtime generation. But if you prefer the latter, there’s a Java EE filter ready just for that.
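As an illustration – a minimal sketch assuming wro4j’s standard servlet filter and a /wro/* mapping, not taken from the project above – the runtime setup boils down to a web.xml declaration:

<filter>
	<filter-name>wroFilter</filter-name>
	<filter-class>ro.isdc.wro.http.WroFilter</filter-class>
</filter>
<filter-mapping>
	<filter-name>wroFilter</filter-name>
	<url-pattern>/wro/*</url-pattern>
</filter-mapping>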

Configurability

Since wro4j’s original strategy is runtime generation, the default configuration files are meant to be inside the archive. However, this can easily be tweaked by setting some tags in the POM:

<configuration>
	<wroManagerFactory>ro.isdc.wro.maven.plugin.manager.factory.ConfigurableWroManagerFactory</wroManagerFactory>
	<wroFile>${basedir}/src/main/config/wro.xml</wroFile>
	<extraConfigFile>${basedir}/src/main/config/wro.properties</extraConfigFile>
</configuration>

The final blow

The real reason to praise wro4j is that LESS generation is only a small part of its features: for wro4j, it’s just one processor among others. You only have to look at the long list of available processors (pre- and post-) to want to use them ASAP. For example, wro4j also wraps a JAWR processor (a bundling and minifying product I blogged about some time ago).

Once you get there, however, a whole new universe opens before your eyes, since you get processors for:

  • Decreasing JavaScript files size by minifying them with Google, YUI, JAWR, etc.
  • Decreasing CSS files size by minifying them
  • Minimizing the number of requests by merging your JavaScript files into a single file
  • Minimizing the number of requests by merging your CSS files into a single file
  • Processing LESS CSS
  • Processing SASS CSS
  • Analyzing your JavaScript with LINT

The list is virtually endless; you should really have a look. Besides, you can bridge your favorite tool by writing your own processor if need be.

Quick how-to

In order to set up wro4j real quick, here are some ready-to-use snippets:

  • The Maven POM, as seen above
    <plugin>
    	<groupId>ro.isdc.wro4j</groupId>
    	<artifactId>wro4j-maven-plugin</artifactId>
    	<version>1.4.6</version>
    	<configuration>
    		<wroManagerFactory>ro.isdc.wro.maven.plugin.manager.factory.ConfigurableWroManagerFactory</wroManagerFactory>
    		<wroFile>${basedir}/src/main/config/wro.xml</wroFile>
    		<extraConfigFile>${basedir}/src/main/config/wro.properties</extraConfigFile>
    		<targetGroups>all</targetGroups>
    		<cssDestinationFolder>${project.build.directory}/${project.build.finalName}/style/</cssDestinationFolder>
    		<jsDestinationFolder>${project.build.directory}/${project.build.finalName}/script/</jsDestinationFolder>
    		<contextFolder>${basedir}/src/main/webapp/</contextFolder>
    		<ignoreMissingResources>false</ignoreMissingResources>
    	</configuration>
    	<executions>
    		<execution>
    			<goals>
    				<goal>run</goal>
    			</goals>
    			<phase>generate-resources</phase>
    		</execution>
    	</executions>
    </plugin>

    Note the use of the ConfigurableWroManagerFactory. It makes adding and removing processors a breeze by updating the following file.

  • The wro.properties processors file, which lists the processing steps to execute on the source files. Here, we generate CSS from LESS, resolve imports to have a single file and minify both CSS (with JAWR) and JavaScript:
    preProcessors=lessCss,cssImport
    postProcessors=cssMinJawr,jsMin
  • The wro.xml configuration file, to tell wro4j how to merge CSS and JS files. In our case, styles.css will be merged into all.css and global.js into all.js.
    <groups xmlns="http://www.isdc.ro/wro">
      <group name="all">
        <css>/style/styles.css</css>
        <js>/script/global.js</js>
      </group>
    </groups>
  • Finally, a LESS example can be found just below.

Conclusion

There are several readily available paths to optimization for webapps. Some provide alternative ways to define your CSS, some minify your CSS, some your JS, others merge those files together, etc. It would be a shame to ignore them, because they can be a real asset in heavy-load scenarios. wro4j provides an easy way to wrap all those operations into a reproducible build.

LESS
@default-font-family: Helvetica, Arial, sans-serif;

@default-radius: 8px;

@default-color: #5B83AD;
@dark-color: @default-color - #222;
@light-color: @default-color + #222;

.bottom-left-rounded-corners (@radius: @default-radius) {
	.rounded-corners(0, 0, @radius, 0)
}

.bottom-right-rounded-corners (@radius: @default-radius) {
	.rounded-corners(0, 0, 0, @radius)
}

.bottom-rounded-corners (@radius: @default-radius) {
	.rounded-corners(0, 0, @radius, @radius)
}

.top-left-rounded-corners (@radius: @default-radius) {
	.rounded-corners(@radius, 0, 0, 0)
}

.top-right-rounded-corners (@radius: @default-radius) {
	.rounded-corners(0, @radius, 0, 0)
}

.top-rounded-corners (@radius: @default-radius) {
	.rounded-corners(@radius, @radius, 0, 0)
}

.rounded-corners(@top-left: @default-radius, @top-right: @default-radius, @bottom-left: @default-radius, @bottom-right: @default-radius) {
	-moz-border-radius: @top-left @top-right @bottom-right @bottom-left;
	-webkit-border-radius: @top-left @top-right @bottom-right @bottom-left;
	border-radius: @top-left @top-right @bottom-right @bottom-left;
}

.default-border {
	border: 1px solid @dark-color;
}

.no-bottom-border {
	border-bottom: 0;
}

.no-left-border {
	border-left: 0;
}

.no-right-border {
	border-right: 0;
}

body {
	font-family: @default-font-family;
	color: @default-color;
}

h1 {
	color: @dark-color;
}

a {
	color: @dark-color;
	text-decoration: none;
	font-weight: bold;

	&:hover {
		color: black;
	}
}

th {
	color: white;
	background-color: @light-color;
}

td, th {
	.default-border;
	.no-left-border;
	.no-bottom-border;
	padding: 5px;

	&:last-child {
		.no-right-border;
	}
}

tr:first-child {

	th:first-child {
		.top-left-rounded-corners;
	}

	th:last-child {
		.top-right-rounded-corners;
	}

	th:only-child {
		.top-rounded-corners;
	}
}

tr:last-child {

	td:first-child {
		.bottom-left-rounded-corners;
	}

	td:last-child {
		.bottom-right-rounded-corners;
	}

	td:only-child {
		.bottom-rounded-corners;
	}
}

thead tr:first-child th, thead tr:first-child th {
	border-top: 0;
}

table {
	.rounded-corners;
	.default-border;
	border-spacing: 0;
	margin: 5px;
}
Generated CSS
.default-border {
	border: 1px solid #39618b;
}

.no-bottom-border {
	border-bottom: 0;
}

.no-left-border {
	border-left: 0;
}

.no-right-border {
	border-right: 0;
}

body {
	font-family: Helvetica, Arial, sans-serif;
	color: #5b83ad;
}

h1 {
	color: #39618b;
}

a {
	color: #39618b;
	text-decoration: none;
	font-weight: bold;
}

a:hover {
	color: black;
}

th {
	color: white;
	background-color: #7da5cf;
}

td,th {
	border: 1px solid #39618b;
	border-left: 0;
	border-bottom: 0;
	padding: 5px;
}

td:last-child,th:last-child {
	border-right: 0;
}

tr:first-child th:first-child {
	-moz-border-radius: 8px 0 0 0;
	-webkit-border-radius: 8px 0 0 0;
	border-radius: 8px 0 0 0;
}

tr:first-child th:last-child {
	-moz-border-radius: 0 8px 0 0;
	-webkit-border-radius: 0 8px 0 0;
	border-radius: 0 8px 0 0;
}

tr:first-child th:only-child {
	-moz-border-radius: 8px 8px 0 0;
	-webkit-border-radius: 8px 8px 0 0;
	border-radius: 8px 8px 0 0;
}

tr:last-child td:first-child {
	-moz-border-radius: 0 0 0 8px;
	-webkit-border-radius: 0 0 0 8px;
	border-radius: 0 0 0 8px;
}

tr:last-child td:last-child {
	-moz-border-radius: 0 0 8px 0;
	-webkit-border-radius: 0 0 8px 0;
	border-radius: 0 0 8px 0;
}

tr:last-child td:only-child {
	-moz-border-radius: 0 0 8px 8px;
	-webkit-border-radius: 0 0 8px 8px;
	border-radius: 0 0 8px 8px;
}

thead tr:first-child th,thead tr:first-child th {
	border-top: 0;
}

table {
	-moz-border-radius: 8px 8px 8px 8px;
	-webkit-border-radius: 8px 8px 8px 8px;
	border-radius: 8px 8px 8px 8px;
	border: 1px solid #39618b;
	border-spacing: 0;
	margin: 5px;
}

Why Eclipse WTP doesn’t publish libraries when using m2e

April 29th, 2012

Lately, I noticed my libraries weren’t published to Tomcat when I used a Maven project in Eclipse, even though it used standard war packaging. Since I mostly use Vaadin, I didn’t care much: I manually copied the single vaadin-x.y.z.jar to the deployed WEB-INF/lib and was done with it.

Then I realized it happened on two different instances of Eclipse, and since I used three different libraries while writing Develop Vaadin apps with Scala, I wanted to correct the problem. At first, I blamed it on the JRebel plugin I had recently installed; then I investigated further and found out I needed a special Maven connector that was present in neither of my Eclipse instances: the WTP connector. I had installed this kind of connector back when Maven integration was done by Sonatype’s m2eclipse, but had forgotten about it a while ago. Now the integration is called m2e, but I have to go the whole nine yards again…

The diagnosis can be hard, but the solution is very simple.

Go to the Windows -> Preferences menu and from this point on go to Maven -> Discovery.

Click on Open Catalog.

Search for “wtp”, select Maven Integration for WTP and click on Finish. Restart and be pleased. Note that most Maven plugin integrations in Eclipse can be resolved the same way.

For a summary of my thoughts on the current state of m2e, please see here.
