GNU bug report logs -
#78476
GNU 'factor' problems with 128-bit word
Message #20 received at 78476 <at> debbugs.gnu.org (full text, mbox):
Paul Eggert <eggert <at> cs.ucla.edu> writes:
> if (n < (wide_uint) FIRST_OMITTED_PRIME * FIRST_OMITTED_PRIME)
> return true;
> was incorrect.
> Why is it incorrect?
Because when you run it on a machine with 128-bit unsigned long (or
whatever type is being used for the word size), N can be 155 and 155
is not prime.
I thought you said that test was wrong, so I asked what was wrong with
it and why you made it return false.
Now I understand that you made the test return false in order to mask an
underlying bug. Please don't do that!
The change is questionable only because it hurts performance when the
word size exceeds 64 bits. It's not questionable on correctness
grounds.
I cannot tell for sure that your change cannot cause other problems.
Say, an infinite loop? (I will not lose sleep over that until somebody
creates a computer with native 128-bit multiply hardware.)
However, what's a good way to debug it?
As a last resort, compare the execution for a working base type with one
which fails? When do they diverge? Why do they diverge?
I am too busy to take care of this. Sorry.
The current factor.c code is becoming insanely complex. Constructs that
mask bugs triggered on hypothetical systems do not help with managing
complexity.
Preferably, this problem should be fully understood. If it can be
argued that it is a real bug which affects real systems, it needs a
clean fix.
--
Torbjörn
Please encrypt, key id 0xC8601622