Devel::Cover recommendations... or maybe not?

Pete Sergeant pete at clueball.com
Thu Mar 15 09:53:25 GMT 2007


On Thu, Mar 15, 2007 at 09:17:59AM +0000, Matt Wilson wrote:
> > - Some configuration might be needed to run all the tests.
> Surely that's the point, though? Write your unit tests explicitly so
> that your configurations are going to achieve maximum coverage.

Currently I'm enjoying having a dedicated test-engineer at my disposal.
I wrote a simple test harness that reads .yaml files, and these
translate into calls against my API.

He can't quite hit 100% coverage using this, but with a few helper test
classes to cover the last few paths, I'm sorted.
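For anyone curious what that data-driven harness looks like, here's a
minimal sketch of the dispatch loop, assuming the .yaml fixtures have
already been parsed into Perl structures (the method name, argument
keys, and api_call() itself are all hypothetical stand-ins, not my
actual API):

```perl
use strict;
use warnings;
use Test::More;

# Hypothetical API under test.
sub api_call {
    my ( $method, %args ) = @_;
    return $args{a} + $args{b} if $method eq 'add';
    die "unknown method '$method'";
}

# What one parsed .yaml fixture might translate into: each entry
# names a method, its arguments, and the expected result.
my @cases = (
    { call => 'add', args => { a => 1, b => 2 }, expect => 3 },
    { call => 'add', args => { a => 0, b => 0 }, expect => 0 },
);

# The harness walks the fixtures and turns each one into an API
# call plus an assertion.
for my $case (@cases) {
    is( api_call( $case->{call}, %{ $case->{args} } ),
        $case->{expect},
        "$case->{call} returns $case->{expect}" );
}

done_testing();
```

The nice property is that the test engineer only writes data, never
Perl, so adding an edge case is just another fixture entry.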

We realised a while ago in our team that getting 100% test coverage was
HARD but that the result was often worth it. Perhaps you have to rewrite
some code to make it more Devel::Cover friendly (mostly expanding out ||
and && statements), and perhaps you have to occasionally tie yourself in
knots trying to hit certain failure conditions (especially file-system
error ones), but knowing that everything is meant to have 100% coverage
is a handy way of highlighting what new tests need to be written if you
add new functionality.
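To illustrate the ||-expansion point: Devel::Cover counts the implicit
branch inside a condensed `$a || $default` expression, and hitting both
sides of it can be fiddly. Rewriting it as an explicit if makes each
branch a separately coverable statement. A toy example (the function
names are made up for the sketch):

```perl
use strict;
use warnings;

# Condensed form: the || hides a branch that Devel::Cover will
# still report on, but which is awkward to exercise deliberately.
sub port_condensed {
    my ($config) = @_;
    return $config->{port} || 8080;
}

# Expanded form: the true and false cases are explicit lines, so
# the coverage report shows exactly which one a test missed.
sub port_expanded {
    my ($config) = @_;
    if ( $config->{port} ) {
        return $config->{port};
    }
    return 8080;
}

print port_expanded( { port => 9000 } ), "\n";    # 9000
print port_expanded( {} ),               "\n";    # 8080
```

Same behaviour either way; the expanded version just makes the missing
test obvious in the report.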

This approach has worked /really/ well for us - we've had a grand total
of two (I think) bugs found by the large-ish external test team, and we
find developing new functionality a doddle because we've got such an
extensive suite of acceptance tests (and a fair number of regression
tests).

Whether this would work without the test-engineer devising bizarre
edge-case acceptance tests, I don't know - but in all, our approach is:
if we have 100% coverage on our developer unit tests, and all the test
engineer's acceptance tests pass, we're probably virtually bug free, and
that's mostly true. A large helping of egos in the team in terms of "ew,
your code doesn't have full coverage or docs, what were you thinking?"
seems to help too :-)

+Pete



More information about the london.pm mailing list