Perl linked list segfault

Andy Wardley abw at wardley.org
Wed Nov 4 07:55:26 GMT 2009


I've got some code that's making Perl segfault.

I'm creating a linked list using array references as nodes.  The first
element of each array ref holds some data; the second holds a reference
to the next node.  Think "cons" lists.

     my ($n, $token, $last);
     my $max = 100_000_000;            # 100 million nodes
     while (++$n < $max) {
         $token = ["token $n"];        # node: [data, next]
         $last->[1] = $token if $last;
         $last = $token;
     }

Using the above code I can create a linked list of 100 million nodes and
everything works just fine (assuming you don't mind twiddling your thumbs
for a few minutes).

However, if I also stuff the nodes into a container list then Perl segfaults
at cleanup time, either when the tokens go out of scope or during global
cleanup.

     my (@tokens, $token, $last, $n);
     my $max = 30_000;                 # plenty to trigger the crash
     while (++$n < $max) {
         $token = ["token $n"];
         $last->[1] = $token if $last;
         $last = $token;
         push(@tokens, $token);        # add this line -> BOOM!
     }

In this case Perl will reliably segfault with a mere 30,000 nodes.

If I just push them onto @tokens and don't create the linked list, then
that also works fine.  It's the combination of linked list + container
list that farks things up.
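
For reference, the push-only variant is just this, and it runs clean:

     while (++$n < $max) {
         push(@tokens, ["token $n"]);  # no next-pointer, no crash
     }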

I've tested this on my MacBook with Perl 5.8, 5.10.0, 5.10.1 and 5.11.1,
and with 5.10 on Linux.  They all fail somewhere between the 25k and 40k
mark.

Now that I'm convinced it's a real bug, I'm at a bit of a loss as to how
to debug it.  The -D flags I've tried (most of them, in various
combinations) don't offer much help, and I've got no core dump to
analyse (can anyone explain why this doesn't dump core?).  The only
thing I'm sure of is that the SEGV happens during memory cleanup.
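
(My best guess on the missing core dump: the shell's default core file
size limit is usually zero, so nothing gets written.  An untested sketch
that raises the limit from inside the script, using BSD::Resource as
just one way to do the equivalent of "ulimit -c unlimited":)

     # untested: raise the core size limit so the SEGV leaves a core
     use BSD::Resource;
     setrlimit(RLIMIT_CORE, RLIM_INFINITY, RLIM_INFINITY)
         or warn "couldn't raise core limit: $!";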

I'll perlbug it when I get a moment, but I was hoping to be able to
investigate it further myself.

Any suggestions gratefully received.

A






