a little madness

A man needs a little madness, or else he never dares cut the rope and be free – Nikos Kazantzakis


Archive for November, 2008

Maven – Pain = Gradle?

Being in the continuous integration game, it’s part of my job to keep an eye on build tools and technologies. Occasionally I hear of something interesting enough to try out, so when I next start a small project I’ll give it a go.

This time it is the turn of Gradle. From the Gradle website:

Gradle is a build system which provides:

  • A very flexible general purpose build tool like Ant.
  • Switchable, build-by-convention frameworks a la Maven (for Java and Groovy projects). But we never lock you
    in!
  • Powerful support for multi-project builds.
  • Powerful dependency management (based on Apache Ivy).
  • Full support for your existing Maven or Ivy repository infrastructure.
  • Support for transitive dependency management without the need for remote repositories and pom.xml
    or ivy.xml files (optional).
  • Ant tasks as first class citizens.
  • Groovy build scripts.

Build tools that leverage scripting languages such as Groovy, Ruby and Python are all the rage. This is undoubtedly a useful feature, but so common these days that it is not a differentiating factor. After all, just adding a more concise way to write procedural build scripts is not a big win. The focus needs to be on making builds as declarative as possible.

The current king of declarative builds in the Java world is clearly Maven. However, as I have said in a previous post, the current implementation of Maven leaves a lot to be desired. Still, the Maven idea of build-by-convention is a good one if it can be achieved in a flexible way. This, then, is what attracts me to Gradle: its specific goal of providing build-by-convention without the lock-in.

Installation

To begin, I set myself the lofty goal of writing a “Hello, World” build script. This gets me to the point where I have a working gradle installation. As I already had a JDK installed (the only external dependency), installation was as simple as:

$ wget http://dist.codehaus.org/gradle/gradle-0.4-all.zip
$ unzip gradle-0.4-all.zip
...
$ export GRADLE_HOME="$(pwd)/gradle-0.4"
$ export PATH="$PATH:$GRADLE_HOME/bin"
$ gradle -v
Gradle 0.4
Gradle buildtime: Tuesday, September 16, 2008 9:20:38 AM CEST
Groovy 1.5.5
Java 1.6.0_06
JVM 10.0-b22
JVM Vendor: Sun Microsystems Inc.
OS Name: Linux

Gradle build files are named “build.gradle”, so next I created a trivial example as follows:

createTask('hello') {
    println 'Hello, world!'
}

The above is normal Groovy source code, executed in an environment provided by Gradle. It defines a single task, which is Gradle's equivalent of a target (almost like the combination of an Ant task and target). Executing this task gives:

$ gradle -q hello
Hello, world!
$

Note that the -q flag suppresses some gradle output.
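
Tasks can also depend on one another, which is how larger builds get wired together. The sketch below is my own extension of the example above, not something from the Gradle docs; the dependsOn argument is my reading of the 0.4 API, so treat it as an assumption:

// My own sketch: 'goodbye' declares a dependency on 'hello',
// so running 'goodbye' should execute 'hello' first.
createTask('hello') {
    println 'Hello, world!'
}

createTask('goodbye', dependsOn: 'hello') {
    println 'Goodbye, world!'
}

If I have the syntax right, running gradle -q goodbye would print both messages, in dependency order.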

A Simple Java Project

To test Gradle’s claims of build-by-convention, I next put it to work on a simple Java project. Gradle’s build-by-convention support is implemented as “plugins” for different languages, with Java and Groovy plugins provided out of the box. The project to be built is a Javadoc doclet, so the build just needs to compile the Java source files and package them into a single jar. To make life a little interesting, the project does not fit Gradle’s default conventions in two ways:

  1. It should not be named after the containing directory.
  2. The source is located under “src/java”, not “src/main/java”.

These are truly simple customisations — so you would expect them to be easily configured. And indeed they are, as shown in the build.gradle file:

name = archivesBaseName = 'com.zutubi.xmlrpc.doclet'
version = '0.1'

usePlugin('java')
sourceCompatibility = 1.5
targetCompatibility = 1.5
srcDirNames = ['java']

The first line customises the project and jar file names, and the last line is used to override the default location for Java source files. With this build file in place, I can build the jar as follows:

$ gradle -q libs
$ ls build
classes com.zutubi.xmlrpc.doclet-0.1.jar reports test-classes test-results

That was pleasantly simple! The Java plugin also gives me a bunch of other tasks for free:

$ gradle -qt
**************************************************
Project :
Task :archive_jar [:test]
Task :clean []
Task :compile [:resources]
Task :dists [:libs]
Task :eclipse [:eclipseCp, :eclipseProject]
Task :eclipseClean []
Task :eclipseCp []
Task :eclipseProject []
Task :eclipseWtpModule []
Task :init []
Task :javadoc []
Task :libs [:archive_jar, :test]
Task :resources [:init]
Task :test [:testCompile]
Task :testCompile [:testResources]
Task :testResources [:compile]
Task :upload [:uploadDists, :uploadLibs]
Task :uploadDists [:dists]
Task :uploadInternalLibs [:libs]
Task :uploadLibs [:libs]
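
These conventional tasks also look like natural hooks for project-specific additions. As a quick sketch of my own (not taken from the Gradle docs, and assuming the same dependsOn syntax as before), a custom task could be hung off the plugin-provided libs task so the jar is always built first:

// A hypothetical extra task: it depends on the plugin's 'libs' task,
// so the jar should be built before the message is printed.
createTask('announce', dependsOn: 'libs') {
    println 'com.zutubi.xmlrpc.doclet jar is built and ready.'
}

Running gradle -q announce should then drive the whole compile/test/jar chain before printing the message, assuming again that I have the 0.4 task wiring right.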

Conclusion

So far, Gradle looks very promising. The example above is too simple to judge how it would fare on a larger, more challenging build, but it shows the basics are right: at least the simple case is simple. I hope to give it a try on a larger code base soon.

At the moment the Gradle project is still in its infancy, so I’ll be keeping a keen eye on its development. Indeed, if it can achieve its stated goals it will become a build tool to be reckoned with, and a tough competitor for Maven.



Pulse 2.0 RC: The Docs Are In

I’ve just pushed up version 2.0.16 of Pulse, the first 2.0 build that can be considered an RC. One of the criteria for getting to RC was, of course, updating the documentation: a large task due to the many changes in 2.0.

This gives me a chance to highlight a cool new feature in 2.0: built-in documentation for all forms. Now when you are configuring your Pulse server, there is no need to leave the UI to look up the docs: they are all there in a slide-out panel. The help UI also sports tooltips for the most commonly-needed pointers, and includes examples to illustrate field values. The shiny new online manual has a screenshot showing the configuration UI complete with help panel.

This is not the sort of feature that makes headlines, but it sure as heck is useful when you are doing admin tasks. Reducing time spent fighting configuration is a key goal of Pulse, and this is another piece of the puzzle. And let’s face it, any time saved on admin can be spent doing something more interesting!

Continuous Integration Myth: The Build Must Never Break

Continuing the theme, another misguided idea is that your CI build must never break. I think the real issue here is that this idea focuses on the wrong thing: it’s not really broken builds that matter, it’s why they break and what you do about it.

Broken Builds Are Information

If your build never breaks, then it doesn’t necessarily tell you much. Saying an always-green build implies your product works is analogous to saying that high test coverage implies your code is fully tested. Green builds are good, but how informative they are is highly dependent on their quality. I particularly like Paul Duvall’s characterisation of always-green builds as a possible case of Continuous Ignorance.

When a build breaks, however, you know something is wrong¹. This is a concrete indicator that your build has some value: it has found a problem.

Broken Builds != Broken Windows

So a broken build gives you information; the key thing now is how you react to it. Teams in tune with CI react to broken builds by:

  1. Fixing them fast: to reduce the impact on the team, and in the knowledge that the cheapest time to make the fix is now.
  2. Learning from the failure: finding ways to prevent it in the future.

Teams that don’t deal with broken builds suffer “Broken Windows” syndrome. This is not because the build broke, however, but because the team didn’t respond as they should have.

What About the Cost?

I’m not saying that a broken build is all good: after all, productivity is easily lost while the build is red. In Pulse we have even added features like personal builds specifically to help reduce the frequency of build failures. However, aiming for completely unbreakable builds also has a cost. All solutions must in the end rely on complete serialisation of checkins and builds. I see this as analogous to pessimistic locking: you take a productivity hit on every commit just in case something breaks. Not to mention that:

  1. For many larger projects this is simply impractical due to the combination of commit frequency and build time.
  2. There is still no guarantee of a green build thanks to the existence of non-deterministic bugs.

We advocate an optimistic approach: by using local smoke testing and optional personal builds on each checkin, the vast majority of problems are found. This leaves only the much less frequent chance of a failure due to a logical merge conflict² or non-determinism. The great thing about this approach is that it doesn’t enforce any heavyweight overhead such as extra branching or serialisation on the process; you can pretty much work as you always have.

Conclusion

So, put your effort into a high quality build and listen to what it is telling you. Don’t get hung up on absolute guarantees of unbreakability: with the right response to broken builds you’re better off with an optimistic approach.


¹ OK, I am glossing over spurious failures here, but I would like to deal with that in another post.
² As opposed to a textual merge conflict, which must be fixed prior to checking in.

Pulse 2.0 Beta Rolls On; Subversion 1.5 Working Copy Support

Today we released Pulse 2.0.15. The release notes for this build show that the beta phase is running hot now: 29 improvements and fixes since 2.0.14, which itself was released just 9 days ago! Thanks to our beta users: the feedback has been great lately, allowing us to nail several minor bugs in the past week.

The key new feature in this build is an upgrade to SvnKit 1.2, which supports Subversion 1.5 working copies. I know several Subversion-based projects that are keen users of personal builds have been waiting for this one.

We’ve also finally updated the dashboard view to match the new 2.0 browse view:

[Screenshot: the new dashboard layout]

Just like the browse view, you can now view projects in their template hierarchy if you like, as well as expand and collapse groups. The new format also uses a lot less vertical space per project, so you can fit more on one screen. And as always, you are able to filter the groups and projects you see on your dashboard to suit your preference.

So, what are you waiting for — go and get it!

Continuous Integration Myth: Build Server == CI

Possibly the most common myth around Continuous Integration (CI) is the idea that if you have an automated build server, then you are doing CI. In fact, a build server is just a tool to help you achieve part of CI: automated builds on an independent machine. This is not even the most important aspect of CI, which is really all about fast feedback. Even with a build server, you aren’t doing CI if:

  • Developers don’t check in regularly (aim for daily or better).
  • The build is too slow.
  • You have no automated test suite or one with insufficient coverage.
  • Broken builds are ignored.

When you are starting out, getting the build automated and a build server running are great first steps. But don’t forget to tackle the simultaneously more difficult and more rewarding part: cultural change. The team needs to see the value of fast feedback and be dedicated to improving its quality and frequency.

