GNU bug report logs - #7597
multi-threaded sort can segfault (unrelated to the sort -u segfault)


Package: coreutils;

Reported by: Jim Meyering <jim <at> meyering.net>

Date: Thu, 9 Dec 2010 12:11:01 UTC

Severity: normal

Tags: fixed

Done: Assaf Gordon <assafgordon <at> gmail.com>

Bug is archived. No further changes may be made.



Message #29 received at submit <at> debbugs.gnu.org (full text, mbox):

From: Jim Meyering <jim <at> meyering.net>
To: Paul Eggert <eggert <at> cs.ucla.edu>
Cc: Chen Guo <chen.guo.0625 <at> gmail.com>, bug-coreutils <at> gnu.org,
	DJ Lucas <dj <at> linuxfromscratch.org>, coreutils <at> gnu.org
Subject: Re: bug#7597: multi-threaded sort can segfault (unrelated to the sort
	-u segfault)
Date: Sun, 12 Dec 2010 16:41:07 +0100
Paul Eggert wrote:
> Sorry for botching the NEWS and the change log.  To help
> make amends, how about if I add a test case for that?

That would be welcome.  Thanks.

> I'm thinking of the 2nd test case in
> <http://lists.gnu.org/archive/html/bug-coreutils/2010-12/msg00043.html>,
> namely this one:
>
> gensort -a 10000 > gensort-10k
> for i in $(seq 2000); do printf '% 4d\n' $i; src/sort -S 100K \
>   --parallel=2 gensort-10k > j; test $(wc -c < j) = 1000000 || break; done

That sounds good, assuming it triggers the bug reliably for you.
I was hoping to find a way to reproduce it without relying on gensort,
but won't object if you want to do that.
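For what it's worth, a gensort-free input could be generated with plain awk
along these lines (only a sketch: the file name `random-10k` and the record
format are my own, not gensort's fixed 100-byte records, so any byte-count
check would need adjusting):

```shell
# Sketch: generate 10000 pseudo-random 80-character lines without gensort.
# srand(42) fixes the seed so runs are repeatable on a given awk.
awk 'BEGIN {
  srand(42)
  for (i = 0; i < 10000; i++) {
    s = ""
    for (j = 0; j < 10; j++)
      s = s sprintf("%08x", int(rand() * 2^31))
    print s
  }
}' > random-10k
```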

I ran the above a few times on a 6-core i7 970 and it would usually
fail only 2 or 3 times out of 2000.  For one trial, it failed
only once, and that was on the 1932nd iteration.

Interestingly, when I run enough of those 2-process jobs in parallel to keep
all 12 nominal processors busy, I get 80-100 failures in 10,000 trials.

cat <<\EOF > sort-test
#!/bin/bash
exp_output=$1
t=$(mktemp) || exit 2
src/sort --parallel=2 -S 100k in > "$t" || exit 3
cmp "$t" "$exp_output" || exit 4
rm -f "$t"
exit 0
EOF
chmod a+x sort-test

i.e., by running that little script via GNU parallel:

    export TMPDIR=/dev/shm/z; mkdir $TMPDIR
    gensort -a 1000 in; sort in > exp; seq 10000 \
      | parallel --eta -j $(($(nproc)/2)) ./sort-test exp

This shows how many of the 10,000 runs failed (a failing run exits
before removing its temp file, so each failure leaves one behind):

    ls -1 $TMPDIR/tmp.* | wc -l
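If one would rather not rely on leftover temp files, GNU parallel's
--joblog can tally failures directly. A sketch, assuming the run above
was started with `parallel --joblog sort.log ...` (field 7 of the log,
"Exitval", holds each job's exit status):

```shell
# Sketch: count jobs with a nonzero exit status in a GNU parallel job log.
# NR > 1 skips the header line; $7 is the Exitval column.
awk 'NR > 1 && $7 != 0' sort.log | wc -l
```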

I'm not suggesting to use GNU parallel in the test,
but if you have a few spare cores or systems, it does
make it easier to run large parallel tests in an attempt
to find something more reliable.

I noticed that decreasing the input size to 1000 lines
had little effect on the number of failures, but obviously
makes the test less expensive.

> Or if you have a better one in mind, please let me know.
>
> There are also some test cases I need to add for the
> (unrelated) sort-compression bug, which is next on my
> list of coreutils bugs to look at.

It would be great to fix that for 8.8, too,
but don't worry if you don't get to it soon.
I can always make an 8.9 not long afterward.
Perhaps very few people use --compress-program=PROG,
or this is due to a latent problem that is showing up only now.






