Deploying perl code

Leo Lapworth leo at
Thu Jul 24 17:01:39 BST 2014

On 24 July 2014 16:25, David Cantrell <david at> wrote:

> What tools do you use for:
> * deploying code to multiple servers;
>   * and multiple environments - eg live, staging, ...
>     so eg we have,, ...

We use a combination of:

minicpan + CPAN::Mini::Inject

All under version control in its own repo.
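The mirror-plus-inject workflow above might look something like this sketch. The paths, author ID, and the `Local::Deploy` dist are made-up placeholders; `minicpan` comes from CPAN::Mini and `mcpani` from CPAN::Mini::Inject:

```shell
# Build/refresh a local CPAN mirror (paths are illustrative)
minicpan -l /srv/minicpan -r http://www.cpan.org/

# Inject an in-house distribution into the mirror
mcpani --add --module Local::Deploy --authorid LOCAL \
       --modversion 0.01 --file Local-Deploy-0.01.tar.gz
mcpani --inject
```

Clients then install against the local mirror instead of the public CPAN, so every box sees the same module versions.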

Then build to:

/opt/perl_VERSION_dev (under its own version control)

which is rsynced to 'dev' machines

/opt/perl_VERSION_live (under its own version control)
is built by recording what was installed into dev and
then mimicking that (a yes/no prompt for each branch;
it works well but is complicated).
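The rsync step to the dev machines could be sketched roughly as follows. The host names, Perl version, and the loop itself are assumptions for illustration, not the actual setup:

```shell
# Push the built dev tree out to each dev box (hosts are hypothetical)
for host in dev1.example.com dev2.example.com; do
    rsync -az --delete /opt/perl_5.20_dev/ "$host:/opt/perl_5.20_dev/"
done
```

`--delete` keeps the remote tree an exact copy of the build, so stale modules don't linger after an upgrade.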

> * updated/added dependencies
>   * both CPANish modules and possibly OS packages such as libraries
>     that those modules wrap around

Puppet for OS packages, and... well, everything except Perl
and HTML/image deployment.
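Managing the OS-level libraries that XS modules wrap is a one-resource job in Puppet. A minimal sketch, using a one-off `puppet apply` with a hypothetical package (`libxml2-dev`, the Debian library XML::LibXML builds against):

```shell
# Declare the system library as a managed package on a Debian box
puppet apply -e "package { 'libxml2-dev': ensure => installed }"
```

In a real setup this would live in a manifest under version control rather than a one-liner, but the resource declaration is the same.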

> * database structure updates;

Thankfully we don't make enough schema changes for this to be an issue.

> I'm looking for tools that will make it easy to go from a bunch of code
> in a release branch on github to an updated bunch of servers, with
> minimal downtime. If it matters we're using Debian.
> Is this the sort of thing that puppet and chef are for? If you've used
> them are they as awesome as the hype makes out, or will they just push
> me into the same murderous rage as our current bunch of incomplete shell
> scripts do?

Puppet 3 with 'stdlib' is great: you can implement one feature at a time,
so you don't have to switch everything over at once. It IS software, but
don't think you can escape that, and it's WAY better than
dozens of little shell scripts. We now have 99.9% of our setup in there; we
even configure Debian desktops with it at work.
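Trying out Puppet with stdlib on a single box is cheap, which is what makes the one-feature-at-a-time migration workable. A sketch (the package names are arbitrary examples):

```shell
# Install the stdlib module, then use its ensure_packages helper ad hoc
puppet module install puppetlabs-stdlib
puppet apply -e "ensure_packages(['git', 'rsync'])"
```

Once a feature proves out like this, it can be folded into the proper manifests and rolled to the rest of the fleet.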

I'm just working on upgrading the metacpan puppet config here...

If I were starting our work config again from scratch I might look at:

A combination of: - or Pinto ( )
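For the Pinto route, the basic workflow is a private, versioned CPAN-style repository that pins exact dist versions. A sketch using Pinto's CLI; the repo path, `DateTime` target, and in-house tarball name are illustrative, and serving the repo to cpanm via a `file://` mirror URL is an assumption about the default stack layout:

```shell
# Create a Pinto repository and stock it (paths/targets are examples)
pinto --root=/srv/pinto init
pinto --root=/srv/pinto pull DateTime                  # pull from CPAN, pinning the version
pinto --root=/srv/pinto add Local-Deploy-0.01.tar.gz   # add an in-house dist

# Install on a target box strictly from the private repo
cpanm --mirror file:///srv/pinto --mirror-only DateTime
```

`--mirror-only` stops cpanm falling back to the public CPAN, so installs are reproducible from the pinned repo alone.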

Heading to tech talk now



More information about the mailing list