Posts Tagged ‘scala’

Forget the language, what matters is the tooling

November 1st, 2015 No comments

Not one week passes without my stumbling upon a post claiming language X is superior to all others, that it offers you things you cannot do in other languages, makes your kitchenware shine brighter and sometimes even returns lost love. I wouldn’t mind these claims, because some features really open my Java developer’s mind to what my current language lacks, but in general such posts are just bashing another language – usually Java.

For those who love to bitch, here’s a quote that might be of interest:

There are only two kinds of programming languages: those people always bitch about and those nobody uses.

— Bjarne Stroustrup

That said, my current situation spawned some thinking. I was trying to migrate the Android application I develop in my spare time to Kotlin, using Android Studio, for which JetBrains provides a migration tool through the Kotlin plugin. The process is quite straightforward, requiring only minor adjustments in some files. This made me realize that the language is not the most important thing – the tooling is. You can have the best language in the world – and it seems each person has his own personal definition of “best” – but if the tooling is lacking, it amounts to just nothing.

Take Scala, for example. I don’t pretend to be an expert in Scala, but I know enough to know it’s very powerful. However, if you don’t have a tool that can handle advanced language features, such as implicit parameters, you’re in for a lot of trouble: you’d better have an advanced IDE to display where they come from. To go beyond mere languages, the same can be said about technologies such as Dependency Injection – whether achieved through Spring, CDI or AOP aspects.

Another fitting example is XML. It might not be everyone’s opinion, but XML is still very much used in the so-called enterprise world. However, beyond a hundred lines and a few namespaces, XML becomes quite hard to read without help. Enter Eclipse or XMLSpy and presto, XML files can be displayed in a very handy tree-like representation.

Conversely, successful languages (and technologies) come with tooling. Look around and see for yourself. Still, I don’t pretend to know which is the cause and which the consequence: are languages successful because of their tooling, or are tools built around successful languages? Perhaps it’s both…

Previously, I didn’t believe Kotlin and its brethren had much chance of success. Given that JetBrains is behind both Kotlin and IntelliJ IDEA, I now believe Kotlin might have a very bright future ahead.


Scala on Android and stuff: lessons learned

June 1st, 2014 3 comments

I’ve played role-playing games since I was eleven, and my posse and I still play once or twice a year. Recently, they decided to play Earthdawn again, a game we hadn’t played for more than 15 years! That triggered my desire to create an application to roll all those strangely-shaped dice. And to combine the useful with the pleasant, I decided to use technologies I’m not really familiar with: the Scala language, the Android platform and the Gradle build system.

The first step was to design a simple and generic die-rolling API in Scala, which was the subject of a former article. The second step was to build upon this API to construct something more specific to Earthdawn and to design the GUI. Here’s the write-up of my musings during this development.

Here’s a general component overview:

Component overview

Reworking the basics

After much internal debate, I finally changed the return type of the roll method from (Rollable[T], T) to simply T, following a relevant comment on reddit. I concluded that it’s up to the caller to get hold of the die itself and return it if needed. That’s what I did in the Earthdawn domain layer.
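In code, the change boils down to this (a simplified sketch, not the project’s actual source; the class name is an assumption):

```scala
trait Rollable[T] {
  // Before: def roll: (Rollable[T], T), the rolled object came back with the result
  // After: only the result is returned; the caller already holds the die
  def roll: T
}

// A caller that still wants both can simply pair them itself:
class SixSidedDie extends Rollable[Int] {
  private val random = new scala.util.Random
  override def roll: Int = random.nextInt(6) + 1
}
```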

Scala specs

Using Scala meant I also dived into the Specs2 framework. Specs2 offers a Behavior-Driven way of writing tests, as well as integration with JUnit through runners and with Mockito through traits. Test instances can be initialized separately in a dedicated class, isolated from all others.

My biggest challenge was to configure the Maven Surefire plugin to execute Specs2 tests:
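A configuration along these lines does the trick (a reconstruction from memory, so treat the include pattern as an assumption): Surefire matches compiled class names, so the .java pattern also catches Scala specifications run through the JUnit runner.

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <includes>
      <!-- Specs2 classes end in Spec; Surefire matches the compiled .class files -->
      <include>**/*Spec.java</include>
    </includes>
  </configuration>
</plugin>
```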


Scala + Android = Scaloid

Though it may seem surprising at first, running Scala on Android is perfectly feasible. Normal Java is compiled to bytecode, which is then converted to Dalvik-compatible Dex files. Since Scala is also compiled to bytecode, the same process can be applied. Not only can Scala easily be ported to Android, frameworks to do so are available online: the one that seemed the most mature was Scaloid.

Scaloid’s most important feature is to eschew the traditional declarative XML-based layout in favor of a Scala-based Domain-Specific Language built on implicits:

val layout = new SLinearLayout {
  STextView("Hello") // child widgets are added through the DSL
}

Scaloid also offers:

  • Lifecycle management
  • Implicit conversions
  • Trait-based hierarchy
  • Improved getters and setters
  • etc.

If you want to do Scala on Android, this is the project you have to look at!

Some more bitching about Gradle

I’m not known to be a big fan of Gradle – to say the least. The biggest reason, however, is not that I think Gradle is bad, but that choosing a tool based on its hype level is the worst reason I can think of.

I used Gradle for the API project, and I must admit it is more concise than Maven. For example, instead of adding the whole maven-scala-plugin XML snippet, it’s enough to tell Gradle to use it with:

apply plugin: 'scala'

The biggest advantage, however, is that Gradle keeps the state of the project, so that unnecessary tasks are not executed. For example, if the code didn’t change, compilation will not be executed again.

Now to the things that are – to put it in a politically correct way – less than optimal:

  • First, interestingly enough, Gradle does not output test results to the console. For a Maven user, this is somewhat unsettling. But even without this flaw, I’m afraid any potential user is interested in the test output. Yet, this can be remedied with some Groovy code:
    test {
        onOutput { descriptor, event ->
            logger.lifecycle("Test: " + descriptor + ": " + event.message)
        }
    }
  • Then, as I wanted to install the resulting package into my local Maven repository, I had to add the Maven plugin: easy enough. This also required Maven coordinates – quite expected. But why am I allowed to install without executing the test phase? This is not only disturbing: I cannot accept any rationale that allows installing without testing.
  • Neither of the previous points proved to be a show-stopper, however. For the Scala-Android project, you might think I just needed to apply both the scala and android plugins and be done with it. Well, this is wrong! Despite Android being showcased as the use-case for Gradle, the scala and android plugins are not compatible. This was the end of my Gradle adventure, probably for the foreseeable future.

Testing with ease and Genymotion

The build process, i.e. transforming every class file into Dex, packaging them into an .apk and signing it, takes ages. It would be even worse when using the emulator from Google’s Android SDK. But rejoice: Genymotion is an advanced Android emulator that is not only very fast but also easy as pie to use.

Instead of doing an adb install, installing an apk on Genymotion can be achieved by just drag’n’dropping it onto the emulated device. Even better, it doesn’t require uninstalling the old version first, and it launches the application directly. Easy as pie, I told you!


I now have a working Android application, complete with tests and a repeatable build. It’s not much, but it gets the job done, and it taught me some stuff I didn’t know previously. I can only encourage you to do the same: pick a small application and develop it with languages/tools/platforms that you don’t use in your day-to-day job. In the end, you will have learned something and have a working application. Doesn’t it make you feel warm inside?



Dead simple API design for Dice Rolling

May 18th, 2014 No comments

I wanted to create a small project where I could achieve results fairly quickly with technologies I never (or rarely) use. At the Mix-IT conference, I realized the little Scala I had learned had been quickly forgotten, and I wanted to give Gradle a try, despite my regular bitching about it. Since my role-playing crew wants to play Earthdawn (we stopped some 20 years ago), I decided to create a dice roller app in Scala, running on Android (all of my friends have Android devices) and built with Gradle (I promised it at Devoxx).

I soon realized that there was a definite dice-rolling API that could be isolated from the rest. Rolling a die in Earthdawn has a definite quirk – if you roll the highest result, you re-roll and add the new roll to the result, and so on until you roll something other than the highest – but the basics of rolling a die are similar in every system. It was time to design an extensible API. By extensible, I mean something I could re-use in every RPG system.

I’d already tried to create such an API, and the root of it is the roll() method signature. Before, I used 2 methods in conjunction:
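From memory, the old design looked roughly like this (a reconstruction; method and class names are assumptions):

```scala
// The stateful, two-method design: roll() mutates state, result reads it back
trait Rollable[T] {
  def roll(): Unit // step 1: computes and stores the result internally
  def result: T    // step 2: only meaningful after roll() has been called
}

class StatefulDie(sides: Int) extends Rollable[Int] {
  private val random = new scala.util.Random
  private var last = 0 // the mutable state the new design gets rid of
  override def roll(): Unit = { last = random.nextInt(sides) + 1 }
  override def result: Int = last
}
```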



This was a big mistake, as implementations required state handling! This created plenty of problems, among them:

  • creating instances each time I needed to roll
  • calling 2 different methods in the right order, also known as temporal coupling

This time, having learned from my mistake, I replaced that with the following:
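The replacement collapses the two calls into a single, stateless method (a sketch; the result type is discussed next):

```scala
trait Rollable[T] {
  def roll: T // one stateless call replaces the roll()/result pair
}
```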


This time, implementation can (and will) do without state.

The next design decision concerns the result type. I’m not really sure about this point, but I formerly returned only the result. This time, I returned the result as well as the object itself, so I can pass both along and let users know which die was rolled as well as the result. This is possible without creating a new class structure, thanks to Scala’s out-of-the-box Tuple2. My final interface looks like:

trait Rollable[T] {
  def roll: (Rollable[T], T)
}

As I’m aiming at RPGs, I just need the number of sides as a parameter (to be able to create those strange 12- and 20-sided dice). A naive implementation is very straightforward:

import java.security.SecureRandom

class Die(val sides: Int) extends Rollable[Int] {

  val random = new SecureRandom

  override def roll: (Die, Int) = (this, random.nextInt(sides) + 1)
}

Seems good enough. However, how can we test this design? If I want to build upon it, I need to be able to set desired results: in this case, this is not cheating, it’s faking! As it turns out, I need to pass the random generator as a constructor parameter (but without a getter).

import java.util.Random

class Die(val sides: Int, random: Random) extends Rollable[Int] {

  override def roll: (Die, Int) = (this, random.nextInt(sides) + 1)
}

That’s better, but only marginally. With this code, I need to pass the random parameter each time I create a new instance. A slightly better option would be to add a constructor with a default SecureRandom instance. However, what if the next Java version offers an even better Random implementation? Or if the API user prefers to rely on an external secure entropy source? He would still have to pass the new improved random for each call – back to square one. Fortunately, Scala offers a nice language feature called implicits. With implicits, API users only have to reference the random generator once in a file to use it everywhere. The improved design now looks like:

import java.security.SecureRandom
import java.util.Random

class Die(val sides: Int)(implicit random: Random) extends Rollable[Int] {
  override def roll: (Die, Int) = (this, random.nextInt(sides) + 1)
}

object SecureDie {
  implicit val random: Random = new SecureRandom
}

Callers then just need to import the provided random, or their own, and call the constructor with the desired number of sides:

import SecureDie.random // or import MyQuantumRandomGenerator.random

val die = new Die(6)

The final step is to create Scala objects (singletons) to offer a convenient API. This is only possible because we designed our classes without state:

import SecureDie.random

object d3 extends Die(3)
object d4 extends Die(4)
object d6 extends Die(6)
object d8 extends Die(8)
object d10 extends Die(10)
object d12 extends Die(12)
object d20 extends Die(20)
object d100 extends Die(100)

Users now just need to call d6.roll to roll dice!
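Assembled, the whole API fits in a few lines (repeated here as a single runnable sketch of the snippets above):

```scala
import java.security.SecureRandom
import java.util.Random

trait Rollable[T] {
  def roll: (Rollable[T], T)
}

object SecureDie {
  implicit val random: Random = new SecureRandom
}

class Die(val sides: Int)(implicit random: Random) extends Rollable[Int] {
  override def roll: (Die, Int) = (this, random.nextInt(sides) + 1)
}

import SecureDie.random

object d6 extends Die(6)
object d20 extends Die(20)

// Rolling returns both the die and the result, as a Tuple2
val (die, result) = d6.roll
```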

With a simple domain, I showed how one of Scala’s most basic features can really help achieve a clean design. Results are available on Github.

In the next article, I will detail how I got Scala to run on Android and the pitfalls I stumbled upon. Spoiler: there will be some Gradle involved…


A dive into the Builder pattern

October 6th, 2013 11 comments

The Builder pattern has been described in the Gang of Four “Design Patterns” book:

The builder pattern is a design pattern that allows for the step-by-step creation of complex objects using the correct sequence of actions. The construction is controlled by a director object that only needs to know the type of object it is to create.

A common implementation of the Builder pattern uses a fluent interface, enabling the following caller code:

Person person = new PersonBuilder().withFirstName("John").withLastName("Doe").withTitle(Title.MR).build();

This code snippet can be enabled by the following builder:

public class PersonBuilder {

    private Person person = new Person();

    public PersonBuilder withFirstName(String firstName) {
        person.setFirstName(firstName);
        return this;
    }

    // Other methods along the same model
    // ...

    public Person build() {
        return person;
    }
}
The job of the Builder is done: the Person instance is well-encapsulated and only the build() method finally returns the built instance. This is usually where most articles stop, pretending to have covered the subject. Unfortunately, some cases need deeper work.

Let’s say we need some validation of the final Person instance, e.g. the lastName attribute is mandatory. To provide this, we could easily check whether the attribute is null in the build() method and throw an exception accordingly:

public Person build() {

    if (person.getLastName() == null) {
        throw new IllegalStateException("Last name cannot be null");
    }

    return person;
}

Sure, this resolves our problem. Unfortunately, the check happens at runtime, as developers calling our code will find out (much to their chagrin). To go all the way to a true DSL, we have to update our design – a lot. We should enforce the following caller code:

Person person1 = new PersonBuilder().withFirstName("John").withLastName("Doe").withTitle(Title.MR).build(); // OK

Person person2 = new PersonBuilder().withFirstName("John").withTitle(Title.MR).build(); // Doesn't compile

We have to update our builder so that its methods return either the builder itself, or an invalid builder that lacks the build() method, as in the following diagram. Note the initial PersonBuilder class is kept as the entry point, so the calling code doesn’t have to cope with Valid-/InvalidPersonBuilder if it doesn’t want to.

This may translate into the following code:

public class PersonBuilder {

    private Person person = new Person();

    public InvalidPersonBuilder withFirstName(String firstName) {
        person.setFirstName(firstName);
        return new InvalidPersonBuilder(person);
    }

    public ValidPersonBuilder withLastName(String lastName) {
        person.setLastName(lastName);
        return new ValidPersonBuilder(person);
    }

    // Other methods, but NO build() method
}

public class InvalidPersonBuilder {

    private Person person;

    public InvalidPersonBuilder(Person person) {
        this.person = person;
    }

    public InvalidPersonBuilder withFirstName(String firstName) {
        person.setFirstName(firstName);
        return this;
    }

    public ValidPersonBuilder withLastName(String lastName) {
        person.setLastName(lastName);
        return new ValidPersonBuilder(person);
    }

    // Other methods, but NO build() method
}

public class ValidPersonBuilder {

    private Person person;

    public ValidPersonBuilder(Person person) {
        this.person = person;
    }

    public ValidPersonBuilder withFirstName(String firstName) {
        person.setFirstName(firstName);
        return this;
    }

    // Other methods

    // Look, ma! I can build
    public Person build() {
        return person;
    }
}
This is a huge improvement, as developers now know at compile time whether their built object is valid.

The next step is to imagine more complex use-cases:

  1. Builder methods have to be called in a certain order. For example, a house should have foundations, a frame and a roof. Building the frame requires having built foundations, as building the roof requires the frame.
  2. Even more complex, some steps are dependent on previous steps (e.g. having a flat roof is only possible with a concrete frame)

The exercise is left to interested readers. Links to proposed implementations welcome in comments.

There’s one flaw in our design: just calling the withLastName() method is enough to qualify our builder as valid, so passing null defeats the design’s purpose, and checking for null at runtime wouldn’t fit our compile-time strategy. Scala’s language features enable an enhancement to this design called the type-safe builder pattern.
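As an appetizer, here is my own minimal sketch of such a type-safe builder in Scala (simplified; not the canonical implementation): a phantom type parameter tracks whether the mandatory field has been set, and build only compiles once it has.

```scala
sealed trait BuildState
sealed trait Complete extends BuildState
sealed trait Incomplete extends BuildState

case class Person(firstName: String, lastName: String)

class PersonBuilder[S <: BuildState] private (
    firstName: String, lastName: String) {

  def withFirstName(fn: String): PersonBuilder[S] =
    new PersonBuilder[S](fn, lastName)

  // Setting the mandatory field upgrades the phantom type to Complete
  def withLastName(ln: String): PersonBuilder[Complete] =
    new PersonBuilder[Complete](firstName, ln)

  // Compiles only when the compiler can prove S is Complete
  def build(implicit ev: S =:= Complete): Person =
    Person(firstName, lastName)
}

object PersonBuilder {
  def apply() = new PersonBuilder[Incomplete]("", "")
}

// PersonBuilder().withFirstName("John").build // does not compile: S is still Incomplete
val person = PersonBuilder().withFirstName("John").withLastName("Doe").build
```

Here the invalid call is rejected by the compiler itself, with no Valid-/InvalidPersonBuilder class explosion.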


  1. In real-life software, the builder pattern is not as easy to implement as the quick examples found here and there suggest
  2. Less is more: creating an easy-to-use DSL is (very) hard
  3. Scala makes complex builder implementations easier for designers than Java does

On the merits of verbosity and the flaws of expressiveness

September 8th, 2013 2 comments

Java is too verbose! Who hasn’t stumbled upon such a rant on the Internet? And on the guy bragging about [insert expressive language here], which will soon replace Java because it is much more concise: it can replace those 10 lines of Java code with a one-liner. Ah, the power!

Unfortunately, in order to correlate conciseness with power (and verbosity with lack of power), those people take many shortcuts that, once put into perspective, make no sense at all. This article aims to surgically deconstruct such shortcuts to expose their weaknesses. And because I like a factual debate – and because there are not only trolls on the Internet – this post is linked to Andy Petrella’s different point of view.

Verbose is not bad

Having more data than strictly necessary helps prevent message corruption in indirect communication. Here are two simple but accurate examples:

  • In network engineering, there’s the concept of a DBit, or Delivery Confirmation Bit. It is just a summary of all the 0s and 1s of the current sequence, and it guarantees the delivered packet was transmitted correctly, despite not-so-adequate network quality.
  • In real life, bank accounts have a 2-digit control key (at least in most European countries I know of). Likewise, it serves to avoid wiring funds to an erroneous account.

It’s exactly the same for “verbose” languages: verbosity decreases the probability of misunderstanding.

More concise is not (necessarily) better

Claiming that a one-liner is better than 10 lines implies that shorter is better: this is sadly not always the case.

Let us take a code structure commonly found in Java:

public List<Product> getValidProducts(List<Product> products) {

    List<Product> validProducts = new ArrayList<Product>();

    for (Product product : products) {

        if (product.isValid()) {
            validProducts.add(product);
        }
    }

    return validProducts;
}

In this case, we have a list of Product and want to filter out invalid ones.

An easy way to reduce verbosity is to set everything on the same line:

public List<Product> getValidProducts(List<Product> products) { List<Product> validProducts = new ArrayList<Product>(); for (Product product : products) { if (product.isValid()) { validProducts.add(product); } } return validProducts;}

Not concise enough? Let’s rename our variables and method obfuscated-style:

public List<Product> g(List<Product> s) { List<Product> v = new ArrayList<Product>(); for (Product p : s) { if (p.isValid()) { v.add(p); } } return v;}

We drastically reduced our code verbosity – and we coded everything in a single line! Who said Java was too verbose?

Expressiveness implies implicitness

I have already written about the dangers of implicitness: the same arguments apply to programming.

Let us rewrite our previous snippet à la Java 8:

public List<Product> getValidProducts(List<Product> products) {

    return -> p.isValid()).collect(Collectors.toList());
}


Much more concise and expressive. But it requires you, me and everybody (sorry, I couldn’t resist) to have a good knowledge of the API. The same goes for Scala operators and so on. And here comes my old friend, context:

  • The + operator is known to practically everyone on the planet
  • The ++ operator is known to practically every software developer
  • The Elvis operator is probably known to Groovy developers, but can be inferred and remembered quite easily by most programmers
  • While the /: operator (aka foldLeft()) requires Scala and FP knowledge

So from top to bottom, each operator requires more and more knowledge, thus reducing more and more the number of people able to read your code. As you write code to be read, readability becomes the prime factor of code quality.
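To make the last point concrete, compare the symbolic and the named forms of the same fold (both standard Scala):

```scala
val numbers = List(1, 2, 3, 4)

// Symbolic form: you must already know that /: means foldLeft
val sum1 = (0 /: numbers)(_ + _)

// Named form: readable by a far larger audience
val sum2 = numbers.foldLeft(0)(_ + _)
```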

Thus, the larger the audience you want to address, the more verbose your code should be. And yes, since you will use neither advanced features nor perfect expressiveness, seasoned developers will probably think it’s just average code, and you will not be praised for your skills… But ego is not the reason why you code, is it?


In this article, I tried to show that “less is more” does not apply to programming, with these points:

  • Verbosity prevents communication mistakes
  • Spaces and tabs, though not needed by the compiler/interpreter, do improve readability
  • Expressiveness requires you and your readers share some common and implicit understanding

I will be satisfied if this post made you consider that perhaps verbosity is not so bad after all.

Now is the time to read Andy’s post, if you haven’t already, and make up your own mind.


Devoxx France 2013 – Day 2

March 29th, 2013 3 comments

Object and Functions, conflict without a cause by Martin Odersky

The aim of Scala is to merge the features of Object-Oriented and Functional Programming. The first popular OOP language was Simula in 1967, aimed at simulations; the second one was Smalltalk, for GUIs. Why did OOP become popular? Only because of the things you could do, not because of its individual features (like encapsulation). Before OOP, the data structure was well known, with an unbounded number of operations; with OOP, the number of operations is fixed but the number of implementations is unbounded. Though it is possible to apply procedural languages (such as C) to the fields of simulation and GUIs, it is too cumbersome to develop with them in real-life projects.

FP has advantages over OOP, but none of them is enough to lead to mainstream adoption (remember it has been around for 50 years). What can spark this adoption is the complexity of making OOP applications multicore- and cloud-computing ready. Requirements for these scopes include:

  • parallel
  • reactive
  • distributed

In each of these, mutable state is a huge liability. Shared mutable state and concurrent threads lead to non-determinism. To avoid this, just avoid mutable state 🙂 or at least reduce it.

The essence of FP is to concentrate on transformation of immutable values instead of stepwise updates of a single mutable data structure.

In Scala, the .par member turns a collection into a parallel collection. But then, you have to go functional and forego any side-effects. With Future and Promise, non-blocking code is also possible, but it is hard to write (and read!), while Scala’s for-expression syntax is an improvement. It also makes parallel calls very easy.

Objects are not to be put away: in fact, they are not about imperative programming, they are about modularization. There is no module system (yet) that is on par with OOP. Using FP and OOP together can feel like sitting between two chairs; bridging the gap requires letting go of some luggage first.

Objects are characterized by state, identity and behavior
Grady Booch

It would be better to focus on behavior…

Ease development of offline applications in Java with GWT by Arnaud Tournier

HTML5 opens new capabilities that were previously the domain of native applications (local storage, etc.). However, it is not stable and mature yet: know that this will have a direct impact on development costs.

GWT is a tool of choice for developing complex Java applications leveraging HTML5 features. A module called “elemental” completes the missing features. Moreover, the JSNI API makes it possible to use JavaScript directly. In GWT, one develops in Java and a compiler transforms the Java code into JavaScript instead of bytecode. The generated code is compatible with most modern browsers.

Mandatory features for offline include the application cache and local storage. The application cache is a way for browsers to store files locally for offline use. It is based on a manifest file, which has to be referenced by the desired HTML pages (in the <html> tag). A cache management API is provided to listen to cache-related events. GWT already manages resources: we only need to provide a linker class to generate the manifest file that includes the wanted resources. Integration of the cache API is achieved through usual JSNI usage [the necessary code is not user-friendly… in fact, it is quite gory].

Local storage is a feature that stores user data on the client side. Several standards are available: WebSQL, IndexedDB, LocalStorage, etc. Unfortunately, only the latter is truly cross-browser; it is based on a key-value map of strings. Unlike the application cache, there’s an out-of-the-box GWT wrapper around local storage. Since stored objects are strings and we run client-side, JSON is the serialization mechanism of choice. Beware that the standard mandates a maximum of 5MB of storage (though some browsers provide more).

We want:

  1. Offline authentication
  2. A local database to be able to run offline
  3. JPA features for developers
  4. Transparent data synch when coming online again for users

Offline authentication is not a real problem: since local storage is not secured, we just have to store the password hash. Get a SHA-1 Java library and GWT takes care of the rest.

SQL capabilities are a bigger issue, with many incomplete solutions. sql.js is a JavaScript port of SQLite that provides limited SQL capabilities. As for integration, it’s back to JSNI again [* sigh *]. You will be responsible for developing a high-level API to ease its usage, as you have to talk either to a true JPA backend or to local storage. Note that JBoss Errai proposes a JPA implementation to resolve this (unfortunately, it is not ready for production use – yet).

State synchronization between client and server is the final problem. It can be separated into 3 levels of ascending complexity: read-only, read-add and read-add-delete-update. For now, sync has to be done manually; only the process itself is generic. In the last case, there are no rules, only different conflict-resolution strategies. What is mandatory is to have causality-relation data (see Lamport timestamps).

The conclusion is that developing offline applications today is a real burden, with a large mismatch between possible HTML5 capabilities and existing tools.

Comparing JVM web frameworks by Matt Raible

Starting with JVM web framework history: it all began with PHP 1.0 in 1995. In the J2EE world, Struts replaced proprietary frameworks in 2001.

Are there too many Java web frameworks? Consider Vaadin, MyFaces, Struts 2, Wicket, Play!, Stripes, Tapestry, RichFaces, Spring MVC, Rails, Sling, Grails, Flex, PrimeFaces, Lift, etc.

And now, with the SOFEA architecture, there are again many frameworks on the client side: Backbone.js, AngularJS, HTML5, etc. But “traditional” frameworks are still relevant because of client-side development limitations, including development speed and performance issues.

In order to make a relevant decision when faced with a choice, first set your goals, then evaluate each option against them. Pick your best option, then re-assess your goals. Maximizers try to make the best possible choice, satisficers try to find the first suitable choice. Note that the former are generally more unhappy than the latter.

Here is a proposed typology (non-exhaustive):

  • Pure web – Apache: Wicket, Struts, Spring, Tapestry, Click
  • Pure web – GWT: SmartGWT, GXT, Vaadin, Errai
  • Pure web – JSF: Mojarra (RI), MyFaces, Tomahawk, IceFaces, RichFaces, PrimeFaces
  • Pure web – Miscellaneous: Spring MVC, Stripes, RIFE, ZK
  • Full stack: Rails, Grails, Play!, Lift, Spring Roo, Seam
  • SOFEA – API: RESTEasy, Jersey, CXF, vert.x, Dropwizard
  • SOFEA – JavaScript MVC: Backbone.js, Batman.js, JavaScript MVC, Ember.js, Sprout Core, Knockout.js, AngularJS

The above matrix with fine-grained criteria is fine, but you probably have to create your own, with your own weight for each criterion. There are many ways to tweak the comparison: you can assign finer-grained grades, compare performance, lines of code, etc. Most of the time, you are influenced by your peers and by people who have used such and such framework. Interestingly enough, performance-oriented tests show that most of the time, bottlenecks appear in the database.

  • For full stack, choose by language
  • For pure web, Spring MVC, Struts 2, Vaadin, Wicket, Tapestry, PrimeFaces. Then, eliminate further by books, job trends, available skills (i.e. LinkedIn), etc.

Fun facts: a great thing going for Grails and Spring MVC is backward compatibility. On the opposite side, Play! is the first framework whose community revived a legacy version.

Conclusion: web frameworks are not the productivity bottleneck (administrative tasks are, as shown in the JRebel productivity report); make up your own mind, and be neither a maximizer (things change too quickly) nor a picker.

Puzzlers and curiosities by Guillaume Tardif & Eric Lefevre-Ardant

[Interesting presentation on self-reference, in art and code. Impossible to resume in written form! Wait for the Parleys video…]

Exotic data structures, beyond ArrayList, HashMap & HashSet by Sam Bessalah

If all you have is a hammer, everything looks like a nail

In some cases, a problem can be solved more easily by using the right data structure instead of the one we know best. These 4 “exotic” data structures are worth knowing:

  1. Skip lists are ordered data sets. The benefit of skip lists over array lists is that every operation (insertion, removal, contains, retrieval, ranges) is in O(log n). This is achieved by adding extra levels acting as “express” lines. On the JVM, it is even faster thanks to region localization. The type is non-blocking (thread-safe) and has been included since Java 6 with ConcurrentSkipListMap and ConcurrentSkipListSet. The former is ideal for cache implementations.
  2. Tries are ordered trees. Whereas traditional trees have a complexity of O(log n), tries have constant time complexity whatever the depth. A specialized kind of trie is the Hash Array Mapped Trie (HAMT), a functional data structure for fast computations. Scala offers Ctrie, a concurrent trie.
  3. Bloom filters are probabilistic data structures designed to answer very fast whether an element belongs to a set. There are no false negatives: the filter accurately reports when an element does not belong to the structure. On the contrary, false positives are possible: it may return true when it is not the case. To reduce the probability of false positives, one can choose an optimal hash function (cryptographic functions are best suited) in order to avoid collisions between hashed values. To go further, one can add hash functions; the trade-off is memory consumption.
    Because of collisions, you cannot remove elements from a Bloom filter. To achieve removal, you can enhance the filter with counting, storing the number of elements at each specific location.
  4. Count Min Sketches are advanced Bloom filters. It is designed to work best when working with highly uncorrelated, unstructured data. Heavy hitters are based on Count Min Sketches.
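As a quick illustration of the skip-list structures mentioned above, here's a small Scala sketch using the JDK classes (the values are made up for the example):

```scala
import java.util.concurrent.{ConcurrentSkipListMap, ConcurrentSkipListSet}

// A ConcurrentSkipListSet keeps its elements sorted:
// add, remove and contains all run in O(log n), without locking
val set = new ConcurrentSkipListSet[Int]()
List(42, 7, 19, 3).foreach(set.add)
println(set) // elements come back in sorted order: [3, 7, 19, 42]

// A ConcurrentSkipListMap additionally supports cheap range queries,
// which is what makes it a good fit for cache-like structures
val map = new ConcurrentSkipListMap[Int, String]()
map.put(1, "one"); map.put(5, "five"); map.put(10, "ten")
println(map.headMap(6)) // entries with keys strictly below 6: {1=one, 5=five}
```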

Scala cheatsheet part 1 – collections

November 11th, 2012 2 comments

As a follow-up to point 4 of my previous article, here's a first little cheatsheet on the Scala collections API. As in Java, knowing the API well is a big step toward writing code that is more relevant, productive and maintainable. Collections play such an important part in Scala that mastering their API is a big step toward mastering the language itself.

Type inference

In Scala, collections are typed, which means you have to be extra careful with element types. Fortunately, constructors and companion object factories have the ability to infer the type by themselves (most of the time). For example:

scala>val countries = List("France", "Switzerland", "Germany", "Spain", "Italy", "Finland")
countries: List[java.lang.String] = List(France, Switzerland, Germany, Spain, Italy, Finland)

Now, the countries value is of type List[String] since all elements of the collections are String.

As a corollary, if you create an empty collection without explicitly setting its type, you'll get a collection typed with Nothing.

scala>val empty = List()
empty: List[Nothing] = List()

scala> 1 :: empty
res0: List[Int] = List(1)

scala> "1" :: empty
res1: List[java.lang.String] = List(1)

Adding a new element to the empty list returns a new list, typed according to the added element. The same goes when an element of a different type is added to an already-typed collection: the result is typed with the closest common supertype (here, Any).

scala> 1 :: countries
res2: List[Any] = List(1, France, Switzerland, Germany, Spain, Italy, Finland)

Default immutability

In Functional Programming, mutable state is banished in favor of “pure” functions. Scala being both Object-Oriented and Functional in nature, it offers both mutable and immutable collections under the same names but in different packages: scala.collection.mutable and scala.collection.immutable. For example, Set and Map are found in both packages (interestingly enough, there's a scala.collection.immutable.List but a scala.collection.mutable.MutableList). By default, the collections imported in scope are the immutable ones, through the scala.Predef object (which is imported implicitly).
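A minimal sketch of the difference (the value names are mine, for illustration):

```scala
// The default Set in scope is scala.collection.immutable.Set
val immutableSet = Set(1, 2, 3)
val bigger = immutableSet + 4   // returns a NEW set...
// ...while immutableSet is still Set(1, 2, 3)

// The mutable variant has to be referenced explicitly
val mutableSet = scala.collection.mutable.Set(1, 2, 3)
mutableSet += 4                 // modifies the set in place
```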

The collections API

The heart of the matter lies in the APIs themselves. Beyond the expected methods also found in Java (like size() and indexOf()), Scala brings to the table a unique functional approach to collections.

Filtering and partitioning

Scala collections can be filtered so that they return:

  • either a new collection that retains only the elements satisfying a predicate (filter())
  • or those that do not (filterNot())

Both take a function that accepts an element as a parameter and returns a Boolean. The following example returns a collection that only retains countries whose name has more than 6 characters.

scala> countries.filter(_.length > 6)
res3: List[java.lang.String] = List(Switzerland, Germany, Finland)

Additionally, the same function type can be used to partition the original collection into a pair of two collections, one that satisfies the predicate and one that doesn’t.

scala> countries.partition(_.length > 6)
res4: (List[java.lang.String], List[java.lang.String]) = (List(Switzerland, Germany, Finland),List(France, Spain, Italy))

Taking, dropping and splitting

  • Taking from a collection means returning a collection that keeps only the first n elements of the original one.
    scala> countries.take(2)
    res5: List[java.lang.String] = List(France, Switzerland)
  • Dropping from a collection consists of returning a collection that keeps all but the first n elements of the original one.
    scala> countries.drop(2)
    res6: List[java.lang.String] = List(Germany, Spain, Italy, Finland)
  • Splitting a collection consists of returning a pair of two collections, the first containing the elements before the specified index, the second those after.
    scala> countries.splitAt(2)
    res7: (List[java.lang.String], List[java.lang.String]) = (List(France, Switzerland),List(Germany, Spain, Italy, Finland))

Scala also offers takeRight(Int) and dropRight(Int) variant methods that do the same but start with the end of the collection.

Additionally, the takeWhile(f: A => Boolean) and dropWhile(f: A => Boolean) variants respectively take and drop elements sequentially (starting from the left) while the predicate is satisfied.
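For instance, reusing the countries list from above (redefined here so the snippet stands alone), note how dropWhile keeps Finland even though it satisfies the predicate, because the scan stops at the first failing element:

```scala
val countries = List("France", "Switzerland", "Germany", "Spain", "Italy", "Finland")

// takeWhile keeps elements from the left while the predicate holds,
// stopping at the first failure (Spain, whose length is 5)
val taken = countries.takeWhile(_.length > 5)
// taken == List("France", "Switzerland", "Germany")

// dropWhile drops that same prefix and keeps everything else,
// including later elements that satisfy the predicate (Finland)
val dropped = countries.dropWhile(_.length > 5)
// dropped == List("Spain", "Italy", "Finland")
```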


Grouping

Scala collection elements can be grouped into key/value pairs according to a defined key. The following example groups countries by their name's first character.

scala> countries.groupBy(_.head)
res8: scala.collection.immutable.Map[Char,List[java.lang.String]] = Map(F -> List(France, Finland), S -> List(Switzerland, Spain), G -> List(Germany), I -> List(Italy))

Set algebra

Three methods are available in the set algebra domain:

  • union (::: and union())
  • difference (diff())
  • intersection (intersect())

Those are pretty self-explanatory.
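For illustration, here's a small sketch with made-up lists. One detail worth knowing: on List, union is equivalent to ::: and keeps duplicates; only on Set does it behave as a true set union.

```scala
val a = List("France", "Germany", "Italy")
val b = List("Italy", "Spain")

val u = a union b      // List(France, Germany, Italy, Italy, Spain) — concatenation, duplicates kept
val d = a diff b       // List(France, Germany) — elements of a not in b
val i = a intersect b  // List(Italy) — elements present in both
```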


Mapping

The map(f: A => B) method returns a new collection of the same length as the original one, whose elements have had a function applied to them.

For example, the following returns a new collection where each name is reversed.

scala> countries.map(_.reverse)
res9: List[String] = List(ecnarF, dnalreztiwS, ynamreG, niapS, ylatI, dnalniF)


Folding

Folding is the operation of, starting from an initial value, repeatedly applying a function to a pair composed of the accumulator and the element under scrutiny. As such, it can be used like map above if the accumulator is a collection, like so:

scala> countries.foldLeft(List[String]())((list, x) => x.reverse :: list)
res10: List[String] = List(dnalniF, ylatI, niapS, ynamreG, dnalreztiwS, ecnarF)

Alternatively, you can provide another type of accumulator, like a string, to get different results:

scala> countries.foldLeft("")((concat, x) => concat + x.reverse)
res11: java.lang.String = ecnarFdnalreztiwSynamreGniapSylatIdnalniF


Zipping

Zipping creates a list of pairs from a list of single elements. There are two variants:

  • zipWithIndex() pairs each element with its index, like so:
    scala> countries.zipWithIndex
    res12: List[(java.lang.String, Int)] = List((France,0), (Switzerland,1), (Germany,2), (Spain,3), (Italy,4), (Finland,5))

    Note: zipping with the index is very useful when you iterate but still need a reference to the index. It saves you from declaring a variable outside the iteration and incrementing it inside.

  • Additionally, you can also zip two lists together:
    scala> countries.zip(List("Paris", "Bern", "Berlin", "Madrid", "Rome", "Helsinki"))
    res13: List[(java.lang.String, java.lang.String)] = List((France,Paris), (Switzerland,Bern), (Germany,Berlin), (Spain,Madrid), (Italy,Rome), (Finland,Helsinki))

Note that the original collections don't need to have the same size: the returned collection's size is the minimum of the sizes of the two original collections.

The reverse operation is also available as the unzip() method, which returns a pair of lists when given a list of pairs. unzip3() does the same with a list of triples.
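A short sketch of unzip (the pairs are made up for the example):

```scala
val pairs = List(("France", "Paris"), ("Switzerland", "Bern"), ("Germany", "Berlin"))

// unzip splits a list of pairs back into a pair of lists
val (names, capitals) = pairs.unzip
// names    == List("France", "Switzerland", "Germany")
// capitals == List("Paris", "Bern", "Berlin")
```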


Conclusion

I've written this article as a simple fact-oriented cheat sheet, so you can use it as such. In the coming months, I'll try to add more cheatsheets like this one.



My view on Coursera’s Scala courses

November 5th, 2012 2 comments

I've spent my last 7 weeks trying to follow Martin Odersky's Scala course on the Coursera platform.

In doing so, my intent was to widen my approach on Functional Programming in general, and Scala in particular. This article sums up my personal thoughts about this experience.

Time, time and time

First, the course is quite time-consuming! The course card advises 5 to 7 hours of personal work per week, and that's a minimum. Developers already familiar with Scala will probably need less, but those with no prior experience will have to invest at least as much.

Given that I followed the course alongside my normal work, I can assure you it can be challenging. Other people who followed the course confirmed this assessment.

Functional Programming

I believe the course allowed me to put the following Functional Programming principles into practice:

  • Immutable state
  • Recursion

Each assignment was checked for code quality, specifically for mutable state. Since mutable variables in Scala have to be declared with the var keyword, the check was easy to enforce.
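To make both principles concrete, here's a sketch of my own (not taken from the course assignments) showing the same sum written first with mutable state, then with an immutable, tail-recursive helper:

```scala
import scala.annotation.tailrec

// Imperative style: a var mutated inside a loop —
// exactly the kind of code the assignment checks flagged
def sumImperative(xs: List[Int]): Int = {
  var acc = 0
  for (x <- xs) acc += x
  acc
}

// Functional style: no var in sight, the accumulator
// travels as a parameter of a tail-recursive helper
def sumFunctional(xs: List[Int]): Int = {
  @tailrec
  def loop(rest: List[Int], acc: Int): Int = rest match {
    case Nil       => acc
    case x :: tail => loop(tail, acc + x)
  }
  loop(xs, 0)
}
```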


Algorithms

I must admit I received only the barest formal computer programming education. I've picked up the tricks of the trade through hard-won experience, and thus have only basic algorithmic skills.

The Scala course clearly required strong skills in this area, and I'm afraid I couldn't complete some assignments because of these gaps.

Area of improvement

Since I won't be coding any library or framework in Scala anytime soon, I feel my next area of improvement should be the Scala collections API as a whole.

I found I lacked knowledge of this API during my assignments, and I do think improving it will let me write better Scala applications in the future.

What’s next

At the beginning, I aimed for a 10/10 grade on all assignments, but in the end I only achieved it in about half of them. Some reasons for this have been given above. To be frank, it bothers the student part of me… but the more mature part sees it as a way to improve. I won't be able to go much further for now, since Devoxx takes place the following week in Antwerp (Belgium). I'll try to write about the conferences I attend, or I'll see you there: in the latter case, don't miss my hands-on lab on Vaadin 7!


Why I enrolled in an online Scala course

September 23rd, 2012 2 comments

When I heard that the Coursera online platform offered free Scala courses, I jumped at the opportunity.

Here are some reasons why:

  • Over the years, I've been slowly convinced that whatever language you program in professionally, learning new languages is an asset, as it changes the way you design your code.
    For example, the excellent LambdaJ library gave me an excellent overview of how functional programming can be leveraged to ease manipulation of collections in Java.
  • Despite my pessimistic predictions about how far Scala can penetrate mainstream enterprises, I still see the language as a strong asset in small companies with high-level developers. I do not wish to be left out of this potential market.
  • The course is offered by Martin Odersky himself, meaning I learn directly from the language's creator. I guess one cannot hope for a better teacher.
  • Being a developer means you have to keep in touch with the way the world is going. And the current trend is a revolution every day. Think about how nobody spoke about Big Data 3 or 4 years ago, or how, at the time, you developed your UI layer with Flex. Amazing how things are changing, faster and faster. You'd better keep pace…
  • The course is free!
  • There's the course itself, of course, but there are also weekly assignments, each of which is graded. These assignments fill a gap found in most online courses, where you lose motivation with each passing day: the regular challenge ensures a longer commitment.
  • Given that some of my colleagues have also enrolled, there's some level of (friendly) competition. This is most motivating and pushes each of us not only to find a solution, but to spend time looking for the most elegant one.
  • The final reason is that I'm a geek. I love learning new concepts and new ways of doing things. In this case, the concept is functional programming and the way is Scala. Other alternatives are available: Haskell, Lisp, Erlang or F#. For me, Scala has the advantage of running on the JVM.

In conclusion, I cannot recommend enough that you do likewise; there are so many reasons to choose from!

Note: I also just stumbled upon this Git kata, an area where I also have considerable progress to make.


Java, Scala, complexity and aardvarks

December 4th, 2011 2 comments

This week saw another flame war from part of the Scala crowd, this time directed at Stephen Colebourne, the man behind Joda-Time.

The article in question can be found here, and Stephen’s answer here.

To be frank, I tend to agree with Stephen's position, but for very different reasons. Now, if you're a Scala user, you basically have two options:

  • either you react as some did before, telling me I'm too stupid or too lazy to really learn the language, and stop reading at this point. In that case, I'm afraid there's nothing I can say apart from “please, don't harm me”, because it has the feeling of a religious war coming from supposedly scientific-minded people;
  • or you can read on, and we'll debate like educated people.

Truth is, I'm interested in Scala. I program in Scala for personal projects. So far, I've gotten the hang of traits (what I would give to have them in Java) and closures, and I've understood some of the generics concepts (as long as they're not too complex). I plan to dive into the collections API next, to make better use of closures.

I think that Scala, like EJB2, was designed by smart, even brilliant people, but with a different experience than mine.

In real life, projects are staffed with developers of heterogeneous levels: good developers, average developers and bad developers. And please don't blame the HR department, the CEO or the company's global strategy: it's just a Gaussian distribution, as in any population.

In effect, that makes Scala a no-go in most contexts. Take Java as an example: what made it so successful, even with all its shortcomings? The write-once, run-anywhere motto? Perhaps, but IMHO the first and foremost reason behind Java's success is the shift of responsibility for memory management away from developers (as in C/C++) to the runtime. Before, the worse the developer, the higher the probability of a memory-management bug, but Java changed all this: it's not perfect, but it's much simpler.

Like Freddy Mallet, I'm convinced a technology has to be simple to become mainstream, and Scala has not taken this road. As a consequence, it's fated to stay in a niche market… Stating this should raise no objection from anybody; it's just a fact.

Note: aardvarks will be a point for a future blog post.
