Software craftsmanship roundtable

Last night was the second London Software Craftsmanship roundtable. It was a great session with lots of lively discussion. We covered a wide variety of topics and I came away with loads of things I need to follow up on.

It’s amazing: as soon as you start discussing things with other developers you pick up lots of different ideas, tools and technologies you’d not previously explored. These sessions are proving to be a great way to find out about some of the cool stuff people are doing. So what did we discuss?

*phew* I think that covers it. If I missed anything, let me know below.

So, are you coming to the next one? What should we discuss?

 

Book Chain Relaunch

The book swapping site I run – www.BookChain.co.uk – has relaunched!

What Is It?

Book Chain is a free book swapping site. Send books you’ve read to others; get sent books you want in return.

How Does It Work?

Once you’ve signed up you can start adding books you’ve read to your library. This lets other users know the books you’re willing to share. As soon as someone asks for one of your books, we’ll let you know. Each book you send earns credit; the more credit you earn, the more books you can receive.

Once you’ve earned credit, add books you want to your wanted list. Then other users can send you books you want to read!

What’s New?

The biggest difference is the move to new servers (more on that later). This has made the whole site much faster, as well as helping expose some horrible performance bugs.

Book Chain also integrates more closely with Facebook. You can now “like” books on Book Chain to let your Facebook friends know which books you’re willing to share.

To celebrate the relaunch we’re running a competition. One lucky person that posts a book between now and the end of the year will win £50 in Amazon vouchers! So sign up today and start swapping books!

New Hardware

After a long time running on a shared server, over the last couple of weeks I’ve moved Book Chain into the cloud with Elastic Hosts. Amazingly, for roughly the same price I was paying for my own Tomcat instance on a shared server, I now get my own virtual Ubuntu box, with way more memory and CPU than I had before! And not only do I get more bang for buck, with Elastic Hosts I can scale my hardware up and down as I need to.

How Much Better Is It?

Well, as you can see from the red line in the performance graph below, since the 7th of November (after a bit of a shaky start while I ironed out some issues) mean response times have plummeted from anywhere between 2 and 4 seconds down to consistently under 100ms. Some of this was me fixing some absolutely horrific bugs (more on that in a future post!) – but much of the improvement was simply from moving to better hardware.

Book Chain performance over the last month

Code coverage with unit & integration tests

On a pet project recently I set out to build automated UI (integration) tests as well as the normal unit tests. I wanted to get all of this integrated into my Maven build, with code coverage reports so I could get an idea of areas with insufficient test coverage. Rather than just publish the source code for the project, I’ve put together a simple example to demonstrate how I got all this set up; so if you’re looking to integrate Maven, JUnit, WebDriver (now part of Selenium) and Emma – read on to find out how I went about it.

First off, all the source code for this is available on github: https://github.com/activelylazy/coverage-example. I’ll show key snippets, but obviously there’s lots of detail omitted that (hopefully) isn’t relevant.

The Example App

Rather than break with tradition, the example application is a simple, if slightly contrived, hello world.

How It Works

The start page is a simple link to the hello world page:

<h1>Example app</h1>
<p>See the <a id="messageLink" href="helloWorld.html">message</a></p>

The hello world page just displays the message:

<h1>Example app</h1>
<p id="message"><c:out value="${message}"/></p>

The hello world controller renders the view, passing in the message:

public class HelloWorldController extends ParameterizableViewController {
    // Our message factory
    private MessageFactory messageFactory;
    @Override
    protected ModelAndView handleRequestInternal(HttpServletRequest request,
        HttpServletResponse response) throws Exception {
        // Get the success view
        ModelAndView mav = super.handleRequestInternal(request, response);
        // Add our message
        mav.addObject("message",messageFactory.createMessage());
        return mav;
    }
    @Autowired
    public void setMessageFactory(MessageFactory messageFactory) {
        this.messageFactory = messageFactory;
    }
}

Finally the MessageFactory simply returns the hard-coded message:

public String createMessage() {
    return "Hello world";
}

The unit test

We define a simple unit test to verify that the MessageFactory behaves as expected:

// Runs with Spring's test runner so the message factory can be autowired
// (assumed here – the context configuration is omitted; see the github project)
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class MessageFactoryTest {
    // The message factory, injected by Spring
    private MessageFactory messageFactory;
    @Test
    public void testCreateMessage() {
        assertEquals("Hello world",messageFactory.createMessage());
    }
    @Autowired
    public void setMessageFactory(MessageFactory messageFactory) {
        this.messageFactory = messageFactory;
    }
}

Build

A basic Maven POM file is sufficient to build this and run the unit test. At this point we have a working app, with a unit test for the core functionality (such as it is), that we can build and run.

<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>helloworld</artifactId>
    <packaging>war</packaging>
    <version>1.0-SNAPSHOT</version>
    <name>helloworld Maven Webapp</name>
    <build>
        <finalName>helloworld</finalName>
    </build>
    <dependencies>
        ...omitted...
    </dependencies>
</project>

Code Coverage

Now let’s integrate Emma so we can get some code coverage reports. First we define a new Maven profile; this allows us to control whether or not we use Emma on any given build.

<profile>
    <id>with-emma</id>
    <build>
        <plugins>
            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>emma-maven-plugin</artifactId>
                <inherited>true</inherited>
                <executions>
                    <execution>
                        <id>instrument</id>
                        <phase>process-test-classes</phase>
                        <goals>
                            <goal>instrument</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</profile>

This simply invokes the “instrument” goal during the Maven “process-test-classes” phase; i.e. once our class files have been compiled, Emma instruments them. We can run this by invoking Maven with the new profile:

mvn clean install -Pwith-emma

Once the build has completed, we can run Emma to generate code coverage reports:

On Windows:

java -cp %USERPROFILE%/.m2/repository/emma/emma/2.0.5312/emma-2.0.5312.jar emma report -r xml,html -in coverage.ec -in target/coverage.em

On Linux:

java -cp ~/.m2/repository/emma/emma/2.0.5312/emma-2.0.5312.jar emma report -r xml,html -in coverage.ec -in target/coverage.em

We can now view the HTML coverage report in coverage/index.html. At this point, it shows we have 50% test coverage (by classes). MessageFactory is fully covered, but the HelloWorldController doesn’t have any tests at all.

Integration Test

To test our controller and JSP, we’ll use WebDriver to create a simple integration test; this is a JUnit test that happens to launch a browser.

public class HelloWorldIntegrationTest {
    // The webdriver
    private static WebDriver driver;
    @BeforeClass
    public static void initWebDriver() {
        driver = new FirefoxDriver();
    }
    @AfterClass
    public static void stopSeleniumClient() {
        try {
            driver.close();
            driver.quit();
        } catch( Throwable t ) {
            // Catch error & log, not critical for tests
            System.err.println("Error stopping driver: "+t.getMessage());
            t.printStackTrace(System.err);
        }
    }
    @Test
    public void testHelloWorld() {
        // Start from the homepage
        driver.get("http://localhost:9080/helloworld/");
        HomePage homePage = new HomePage(driver);
        HelloWorldPage helloWorldPage = homePage.clickMessageLink();
        assertEquals("Hello world",helloWorldPage.getMessage());
    }
}

The @BeforeClass and @AfterClass methods simply start Web Driver before the tests run and shut it down (closing the browser window) once they’ve finished.

The test itself starts by navigating to the homepage with a hard-coded URL.

We then initialise our Web Driver page object for the homepage. This encapsulates all the details of how the page works, allowing the test to interact with the page functionally, without worrying about the mechanics (which elements to use etc).

Next we use the homepage object to click the “message” link; this navigates to the hello world page.

Finally we confirm that the message shown on the hello world page is what we expect.

Note: I’m using page objects to separate test specification (what to do) from test implementation (how to do it). For more on why this is important see keeping tests from being brittle.

Homepage

The homepage object is pretty simple:

public HelloWorldPage clickMessageLink() {
    driver.findElement(By.id("messageLink")).click();
    return new HelloWorldPage(driver);
}
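
For reference, the rest of the page object is little more than a constructor that stores the driver. This is a minimal sketch – the class in the github project may do slightly more (for example, checking it really is on the homepage):

public class HomePage {
    private final WebDriver driver;

    public HomePage(WebDriver driver) {
        this.driver = driver;
    }

    // Click the "message" link and return the page object for the page we land on
    public HelloWorldPage clickMessageLink() {
        driver.findElement(By.id("messageLink")).click();
        return new HelloWorldPage(driver);
    }
}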

HelloWorldPage

The hello world page is equally simple:

public String getMessage() {
    return driver.findElement(By.id("message")).getText();
}

Running the Integration Test

To run the integration test during our Maven build we need to make a few changes. First, we need to exclude integration tests from the unit test phase:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    ...
    <configuration>
        ...
        <excludes>
            <exclude>**/*IntegrationTest.java</exclude>
            <exclude>**/common/*</exclude>
        </excludes>
    </configuration>
</plugin>

Then we define a new profile, so we can optionally run integration tests:

<profile>
    <id>with-integration-tests</id>
    <build>
        <plugins>
            <plugin>
                <groupId>org.mortbay.jetty</groupId>
                <artifactId>maven-jetty-plugin</artifactId>
                <version>6.1.22</version>
                <configuration>
                    <scanIntervalSeconds>5</scanIntervalSeconds>
                    <stopPort>9966</stopPort>
                    <stopKey>foo</stopKey>
                    <connectors>
                        <connector implementation="org.mortbay.jetty.nio.SelectChannelConnector">
                            <port>9080</port>
                            <maxIdleTime>60000</maxIdleTime>
                        </connector>
                    </connectors>
                </configuration>
                <executions>
                    <execution>
                        <id>start-jetty</id>
                        <phase>pre-integration-test</phase>
                        <goals>
                            <goal>run</goal>
                        </goals>
                        <configuration>
                            <daemon>true</daemon>
                        </configuration>
                    </execution>
                    <execution>
                        <id>stop-jetty</id>
                        <phase>post-integration-test</phase>
                        <goals>
                            <goal>stop</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.5</version>
                <inherited>true</inherited>
                <executions>
                    <execution>
                        <id>integration-tests</id>
                        <phase>integration-test</phase>
                        <goals>
                            <goal>test</goal>
                        </goals>
                        <configuration>
                            <excludes>
                                <exclude>**/common/*</exclude>
                            </excludes>
                            <includes>
                                <include>**/*IntegrationTest.java</include>
                            </includes>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</profile>

This may look complex, but really we’re just configuring Jetty to run while the integration tests execute, then configuring how to run the integration tests themselves.

The jetty plugin’s configuration block sets up the server: the port it listens on and the stop port/key used to shut it down.

The “start-jetty” execution starts Jetty (as a daemon, so the build carries on) during the “pre-integration-test” phase of the Maven build.

The “stop-jetty” execution stops Jetty again during the “post-integration-test” phase.

Finally, we use the maven-surefire-plugin again, this time bound to the “integration-test” phase of the build and configured to run only our integration test classes.

We can run this build with:

mvn clean install -Pwith-emma -Pwith-integration-tests

This will build everything, run the unit tests, build the war, fire up Jetty to host the war, run our integration tests (you’ll see a Firefox window pop up while they run), then shut down Jetty. Because the war is built with instrumented classes, Emma also tracks code coverage while the integration tests run.

We can now build our application, running unit tests and integration tests, and gather combined code coverage reports. If we re-run the Emma report and check code coverage we now see we have 100% test coverage – the controller is now covered too, via the integration test.

Issues

What are the outstanding issues with this, and what further extensions could be made?

  • The build produces an instrumented WAR – this means you need to run a second build, without Emma, to get a production-ready build.
  • The integration test hard-codes the port that Jetty is configured to start on, meaning the tests can’t be run directly within Eclipse. It is possible to pass the port in, defaulting to, say, 8080, so that integration tests run seamlessly within Eclipse as well as via the Maven build (see the sketch after this list).
  • When running on your build server you probably don’t want Firefox popping up at random (if X is even installed), so running xvfb is a good idea. It is possible to set up Maven to start & stop xvfb before & after the integration tests.
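
As a rough sketch of the second point: the test can read the port from a system property and fall back to a default when run from the IDE. The property name test.server.port below is just an assumption – the Jetty connector and the surefire plugin in the pom would need to use the same property (surefire can pass it through via its systemPropertyVariables configuration):

// A sketch only: the same test as above, but the base URL is built from a
// system property ("test.server.port" is an assumed name, not something the
// example project defines) with a default for running inside Eclipse.
public class HelloWorldIntegrationTest {
    private static final String BASE_URL =
            "http://localhost:" + System.getProperty("test.server.port", "8080") + "/helloworld/";
    // The webdriver
    private static WebDriver driver;
    @BeforeClass
    public static void initWebDriver() {
        driver = new FirefoxDriver();
    }
    @AfterClass
    public static void stopWebDriver() {
        driver.quit();
    }
    @Test
    public void testHelloWorld() {
        // Start from the homepage, wherever the server happens to be listening
        driver.get(BASE_URL);
        HomePage homePage = new HomePage(driver);
        HelloWorldPage helloWorldPage = homePage.clickMessageLink();
        assertEquals("Hello world", helloWorldPage.getMessage());
    }
}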

First company coding dojo

Last month we ran our first company coding dojo – this was only open to company staff, but attendance was good (around a dozen people).

For those that have never heard of it, a coding dojo – based on the idea of a martial arts dojo – is an opportunity for programmers to improve their skills. This means getting a group of developers together, round a big screen, to work through a problem. Everything is pair programmed, with one “driver” and one “co-pilot”. Every so often the pair is changed: the driver returns to the audience, the co-pilot becomes the driver and a new co-pilot steps up. That way everyone gets a turn writing code, while the rest of the group provide advice (no matter how unwelcome).

For the first dojo we tackled a problem in Scala – this was the first time using Scala for most people, so a lot of time was spent learning the language. But thanks to Daniel Korzekwa, Kingsley Davies & DJ everyone got to grips with the language and we eventually got a solution! The session was a lot of fun, with a lot of heated discussion – but everyone felt they learned something.


Afterwards, in true agile style, we ran a quick retrospective. The lessons learned showed the dojo had been an interesting microcosm of development – we made the same mistakes we so often see in the day job! For example, we knew we should start with a design and went as far as getting a whiteboard, but failed to actually do any design. This led to repeated rework as the final design emerged, slowly, from numerous rewrites. One improvement for next time: do just-in-time design.

We also set out to do proper test-first TDD. However, as so often happens, this degenerated into code-first development, with tests run occasionally and passing rarely. It was interesting to see how quickly a group of experienced developers can fall out of the habit of TDD. Our retrospective highlighted that next time we should always write the tests first, and take “baby steps” – doing the simplest thing that could possibly make the test pass.

Overall it was a great session and very enjoyable – it was fascinating to see the impact of ignoring “best practices” on something small where the results are so much more immediate.