Archive for the ‘Continuous Integration’ Category

Wiki Syntax for Your Commit Comments

Wednesday, January 24th, 2007

Pulse has strong SCM integration, which includes showing changelist information for changes between builds. For some time, Pulse has had the ability to transform the commit comments that it displays to insert links to external systems. For example, you could link the text “bug 123” to the summary for that bug in your issue tracking software. This is neat, as you can jump straight from Pulse to any external tool you choose to link to.

As part of Pulse 1.2, we have beefed up this system. One of our customers wanted to be able to strip redundant information from their commit comments as shown in Pulse. This led us to add the ability to transform commit comments in more general ways than just linking. Now, you can specify an arbitrary regular expression to match and corresponding replacement text. The cool thing is that this mechanism gives you the power to do all sorts of useful transformations on your commit messages. It soon occurred to me that one very useful thing to do would be to support a Wiki-like syntax for your commit messages! For example, you can add a transformer to render text in bold when it is surrounded by asterisks:

Expression: \*([\w ]+)\*
Replacement: <b>$1</b>

The text between the asterisks is captured into group 1, and the replacement wraps this group with <b> tags. Now when viewing changes in Pulse, the important bits stand out:

Of course, we can go much further. Linking is a good thing, so why not turn anything that looks like a link into one? And anything that looks like an email address into a mailto: link? Here goes:

Expression: http://[\w./]+
Replacement: <a href="$0">$0</a>

Expression: [\w]+@[\w.]+
Replacement: <a href="mailto:$0">$0</a>

Now whenever someone references a useful URL, it is but a click away from the Pulse web UI:

We have really just scratched the surface here. The mechanism is flexible enough to do all sorts of weird, wonderful and (hopefully) useful things! Give Pulse a try and go nuts!

New in Pulse 1.2: Integration with Fisheye/Trac/ViewVC

Thursday, January 11th, 2007

A great new feature we have introduced into Pulse 1.2 is tighter integration with change viewers. We have added simpler configuration for common viewers such as Fisheye, Trac and ViewVC (formerly ViewCVS), with custom configuration also possible for other systems. We have also deepened the integration by adding more direct links from Pulse back to your change viewer.

Now, when viewing changelist information in Pulse, you can directly access further information in your change viewer:

  • View the changelist itself by clicking on the revision wherever it appears in the Pulse UI.
  • Download or view the contents of any file as it was at that revision.
  • Jump directly to the diff view in your change viewer to see what changes were made to a file.

The shot below shows how these links are presented for each file:

This feature is simple but incredibly useful. Figuring out why a build is broken often boils down to seeing what has changed. With change viewer integration, this information is at your fingertips. There is no need to change context. Enjoy!

UnitTest++: The New Choice for C++ Unit Testing?

Monday, December 18th, 2006

In an earlier post on C++ Unit Testing Frameworks, I came across a relatively new framework by the name of UnitTest++. At first glance, this framework appealed to me for a couple of reasons:

  • Unlike most of the other frameworks, it is relatively recent and under active development.
  • One of the originators of the project is the author of the best comparison of C++ unit testing libraries online, so the experience of reviewing several other frameworks should inform the design of this one.

So, I’ve decided to take a closer look. I’ll start in this post with the basics: how do we write tests, fixtures and suites in UnitTest++? These are the fundamentals of a unit testing library, and should be very simple to use.

First, we need the UnitTest++ distribution. It is available as a simple tarball from SourceForge. Unpacking the tarball gives a basic structure with build files at the top level, and child docs and src directories. Building the library itself, on Linux at least, requires a simple make:

jsankey@shiny:~/tools/UnitTest++$ make
src/AssertException.cpp
src/Test.cpp

Creating libUnitTest++.a library…
src/tests/Main.cpp
src/tests/TestAssertHandler.cpp

Linking TestUnitTest++…
Running unit tests…
Success: 162 tests passed.
Test time: 0.31 seconds.

The primary output is libUnitTest++.a at the top level. This, along with the header files under src (excluding src/tests), forms the redistributables needed to build against UnitTest++ in your own project. It is a little awkward that neither binary distributions nor a “dist” (or similar) Makefile target are available. However, the source tree is so simple that it is not hard to extract what you need.

Armed with the library, the next step is to create our first test case, and run it. UnitTest++ makes use of macros to simplify creating a new test case. It could hardly be easier:

#include "UnitTest++.h"

TEST(MyTest)
{
    CHECK(true);
}

int main(int, char const *[])
{
    return UnitTest::RunAllTests();
}

A test case is created using the TEST macro, which takes the case name as an argument. The macro adds the test case to a global list of cases automatically. The body of the test utilises the CHECK macro to assert conditions under test. Various CHECK* macros are available for common cases. Finally, to actually run the test, we call UnitTest::RunAllTests(). This runs all cases using a default reporter that prints a result summary to standard output:

jsankey@shiny:~/repo/utpp$ ./utpp
Success: 1 tests passed.
Test time: 0.00 seconds.

RunAllTests returns the number of failed cases, so using this as the program exit code works well. If we change the check to CHECK(false), we get a failure report:

jsankey@shiny:~/repo/utpp$ ./utpp
utpp.cpp(9): error: Failure in MyTest: false
FAILURE: 1 out of 1 tests failed (1 failures).
Test time: 0.00 seconds.

The next step is to create a test fixture, which allows us to surround our test cases with shared setup/teardown code. This is achieved in UnitTest++ by building upon standard C++ construction/destruction semantics. To create a fixture, you just create a standard C++ struct. The setup and teardown code go in the struct constructor and destructor respectively. Let’s illustrate how this works:

#include <iostream>
#include <string>
#include "UnitTest++.h"

struct MyFixture
{
    std::string testData;

    MyFixture() :
        testData("my test data")
    {
        std::cout << "my setup" << std::endl;
    }

    ~MyFixture()
    {
        std::cout << "my teardown" << std::endl;
    }
};

TEST_FIXTURE(MyFixture, MyTestCase)
{
    std::cout << testData << std::endl;
}

int main(int, char const *[])
{
    return UnitTest::RunAllTests();
}

Instead of the TEST macro, we use TEST_FIXTURE to create a test case that uses the fixture struct. The example is artificial, but serves to illustrate the order in which the functions are called. Also of interest is how members of the fixture struct are referenced directly by name within the test case. Under the covers, the TEST_FIXTURE macro derives a type from MyFixture, making this possible. Running this new program gives the following:

jsankey@shiny:~/repo/utpp$ ./utpp
my setup
my test data
my teardown
Success: 1 tests passed.
Test time: 0.01 seconds.

The setup and teardown wrap execution of the test case, which has simple access to the data in the fixture. By leveraging construction/destruction, the fixture code is both familiar and concise.

The final step is to organise test cases into suites. UnitTest++ again uses macros to simplify the creation of suites. You simply wrap the tests in the SUITE macro:

#include <iostream>
#include "UnitTest++.h"

SUITE(SuiteOne)
{
    TEST(TestOne)
    {
        std::cout << "SuiteOne::TestOne" << std::endl;
    }

    TEST(TestTwo)
    {
        std::cout << "SuiteOne::TestTwo" << std::endl;
    }
}

SUITE(SuiteTwo)
{
    TEST(TestOne)
    {
        std::cout << "SuiteTwo:TestOne" << std::endl;
    }
}

int main(int, char const *[])
{
    return UnitTest::RunAllTests();
}

As shown above, it is possible to have two tests of the same name in different suites. This illustrates the first function of suites: namespacing. Running the above gives:

jsankey@shiny:~/repo/utpp$ ./utpp
SuiteOne::TestOne
SuiteOne::TestTwo
SuiteTwo:TestOne
Success: 3 tests passed.
Test time: 0.01 seconds.

Suites also have another function: they allow you to easily run a group of related tests. We can change our main function to only run SuiteOne (note we also need to include TestReporterStdout.h):

int main(int, char const *[])
{
    UnitTest::TestReporterStdout reporter;
    return UnitTest::RunAllTests(reporter,
                                 UnitTest::Test::GetTestList(),
                                 "SuiteOne",
                                 0);
}

Running this new main will only execute SuiteOne:

jsankey@shiny:~/repo/utpp$ ./utpp
SuiteOne::TestOne
SuiteOne::TestTwo
Success: 2 tests passed.
Test time: 0.00 seconds.

So there you have it, a taste of the basics in UnitTest++. The most appealing thing about this library is its simplicity: you can tell that the authors have made an effort to keep the construction of cases, fixtures and suites as easy as possible. This lets you get on with writing the actual test code. In this overview I have not explored all of the details, most notably the various CHECK macros that test for equality, exceptions and so on. However, as it stands UnitTest++ is quite a simple framework, and there is not a whole lot more to it. You may need more features than you currently get out of the box, but UnitTest++ is young and, I expect, still growing. The simplicity also makes it an easy target for customisation, which is important given the diversity of C++ environments. I’ll be keeping an eye on UnitTest++ as it evolves, and recommend you take a look yourself.
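For completeness, here is a quick sketch of a few of those CHECK macros in use, based on the macro names in the UnitTest++ distribution (treat the exact argument orders as indicative rather than gospel):

#include <stdexcept>
#include <vector>
#include "UnitTest++.h"

TEST(CheckVariants)
{
    CHECK(1 + 1 == 2);                           // plain boolean condition
    CHECK_EQUAL(4, 2 + 2);                       // equality; both values reported on failure
    CHECK_CLOSE(3.14, 3.14159, 0.01);            // floating point comparison with tolerance
    std::vector<int> empty;
    CHECK_THROW(empty.at(0), std::out_of_range); // expression must throw the given type
}

As with the plain CHECK, a failure reports the file, line and test name, so a glance at the output takes you straight to the problem.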

AJAX Goodness in Pulse 1.2

Thursday, December 14th, 2006

Pulse has always used a bit of AJAX (and plain old JavaScript) here and there to make the interface more responsive. For example, there are plenty of instances where you can test new configuration before you save it, without leaving the configuration form (a huge time saver when configuring!). We also try to avoid gratuitous use of AJAX, which seems to be popping up all over the place as the hype takes its toll. However, in Pulse 1.2 we found some key places to introduce AJAX to give users that warm and fuzzy feeling.

My personal favourite is a new widget to customise the columns in build results tables. These tables are used to summarise the most important build information throughout the Pulse UI. Over time, our customers have requested several new pieces of information to be shown in the tables. Adding them all for everyone would lead to information overload, not to mention the required screen real estate. The obvious solution was to make the table columns customisable. This is a prime case where a rich client-side UI is far more usable than a “click-refresh-click-refresh…” approach. The widget we came up with is simple: a bunch of checkboxes to choose the columns to show, and the ability to drag and drop the columns to reorder them:

Using it is a snap, and it just Feels Good. Everything happens client-side until you apply, at which point the changes take effect via an AJAX refresh of the underlying page.

Another prime candidate for AJAXification was the views for browsing working copies and build artifacts. We already had a treeview in place for browsing directories (e.g. during the setup wizard), and with some work adapted it to these views:

I cannot tell you how much faster it is to browse around using these views! The page only loads what is needed when you first hit it, and drilling down is much, much easier.

Pulse Continuous Integration Server 1.2 Goes Beta

Wednesday, December 13th, 2006

Well, we’re pretty pumped today. The latest major release of Pulse has gone beta, and been promoted to zutubi.com! Many thanks to the customers who rode the bleeding edge of the Pulse 1.2 Early Access Program: your feedback has been invaluable. Now you have a kick-arse build server in return :).

Pulse 1.2 is packed with new features, and dozens of those little improvements that just Make Life Better. The list includes:

  • Personal Builds: The headline feature for 1.2, personal builds allow you to submit your changes directly to Pulse for testing before you commit them.
  • Reports: Each Pulse project now has its own “reports” page, which displays build data for the project visually.
  • Change Viewers: easily integrate Pulse with change viewers such as Fisheye, P4Web, Trac and ViewVC. Use custom settings to integrate with other viewers.
  • Commit Message Transformers: control how your commit messages appear in Pulse. Link them to your bug tracker, or highlight important information.
  • Customisable Build Columns: choose the fields to view for build results, and reorder them using drag and drop!
  • AJAX-powered browsing: browse your working copies and captured artifacts using a dynamic tree view.
  • “Broken since” Support: when a test has been failing for multiple builds, it is displayed differently. The build where it first failed is just a click away!
  • Windows System Tray Notification: a new Windows client, “Stethoscope”, allows you to see your project status at a glance.
  • Customisable Notifications: you can now override the default notification messages (email or Jabber) by creating your own notification templates.
  • Automatic Agent Upgrades: when the Pulse master server is upgraded, it will automatically upgrade all agent servers.
  • Much, much more: dozens of other minor features and improvements.

I’ve talked a bit about personal builds, and how they combine with distributed builds to make Pulse a must-have development tool. I’ll also post about some of the other new features, and the cool things they let you do.

For now, check out Pulse at zutubi.com. Give it a try for 30 days for free: you’ll be hooked ;).

Article: Reducing the Impact of Broken Builds

Tuesday, December 5th, 2006

The “build” represents the current status of any software project, and as such reflects the health, vitality and progress of the project. In this article we first review some of the impacts of a broken build. Those already familiar with the negative impacts of broken builds may wish to skip this lead-in. There follows an analysis of various techniques to reduce the frequency and impact of broken builds. These techniques vary from the optimistic to the preventative, and from lightweight to quite intensive.

Read the full article at zutubi.com.

Pulse Continuous Integration Server 1.2 M3

Tuesday, December 5th, 2006

We just keep on punching out the milestones on 1.2 ;). New features in this puppy include:

  • Customisable columns: customise build tables with drag and drop.
  • Manual release support: configure Pulse to prompt for release build properties.
  • Post stage actions: hook in after each agent build to clean up/reboot agents.
  • P4Web support: built-in linking to P4Web.
  • LDAP group integration: manage permissions through LDAP.
  • Broken since support for tests: differentiate new and existing test failures.
  • Executable projects: run custom build scripts without dropping down to XML.

See the early access page for M3 packages and full details.

Get More Out Of Your Continuous Integration Server With Personal Builds

Friday, November 17th, 2006

The headline new feature in Pulse 1.2 is personal builds. A personal build is a build of the current state of your working copy on the Pulse server. This allows you to test your changes before you commit them to version control. The most obvious advantage of this is that you don’t have to taint your shared source base with untested code to get a CI build. You test first, then commit when you are happy. However, there are also some less obvious advantages:

  • Multiplatform testing: Pulse supports distributing a build across multiple agents in parallel. This allows you to easily test on multiple platforms. Couple this with personal builds, and you can easily test code on platforms other than your preferred development platform while you develop. No need to struggle with slow builds over a networked file system, or to manually move the code about to test.
  • More efficient resource usage: you can submit a personal build to your Pulse build farm and still have your development machine free for other tasks. You can also make use of free developer machines as Pulse agents, but that is for another post :).
  • Full reporting: just like every other build, Pulse extracts and reports interesting information for personal builds via a rich web interface. This beats digging through hundreds of lines of build logs and scratching up test reports to find the relevant info. Pulse can also be configured to send you notifications when personal builds complete, just like a CI build. You can even customise both the information Pulse extracts from the logs and the format of the notification messages.
  • Build history: Pulse will remember your recent personal builds (as many as you choose) and keep the results available for browsing. This makes it possible to refer back to an earlier issue without scrolling frantically in your console buffer (if you’re lucky enough to have the output at all!).

Personal builds are just one way in which we are making Pulse do more than your regular continuous integration server. From the start, we have seen Pulse as not just a server that sits on the sideline, but as a tool that you can leverage during everyday development. This is why we have a strong focus on the developer in Pulse: every developer has their own account with a configurable dashboard and very flexible notification settings. Adding personal builds to the mix expands on what you can do with Pulse as you develop, and it is just one of a suite of tools we have added, or plan to add, to Pulse in the near future.

Anyhow, I hope I’ve piqued your interest in the idea. If so, check out the Early Access Page for Pulse 1.2, and enjoy!

Article: Automated Releases

Tuesday, November 14th, 2006

In this article, we look into automating the release process. We begin by reviewing the benefits of automated releases. We then take a look at common steps involved in the automation process, and some of the challenges they may present.

Read the full article at zutubi.com.

Pulse 1.2 M2

Tuesday, November 14th, 2006

Pulse 1.2 M2 has been released! This is the second milestone in the 1.2 series. New features include:

  • Project groups: manage projects by organising them into groups.
  • Change Viewers: an easier, more powerful way to link changelists and files to external viewers such as Fisheye.
  • Commit message transformers: powerful tools for transforming commit messages, for linking to external tools or highlighting details.
  • Improved remote API: new functions for managing projects, users and agents.
  • Bootstrap improvements: realtime output and the ability to cancel during bootstrapping.
  • All-in-one packaging: download agent and tools packages from your Pulse server.
  • Simpler tools configuration: configure the personal build client without editing any files.

See the early access page for M2 packages and full details.