GNU bug report logs - #7325
new test failure due to non-portability of printf formats like %05.3s


Package: coreutils

Reported by: Jim Meyering <jim <at> meyering.net>

Date: Wed, 3 Nov 2010 18:56:02 UTC

Severity: normal

Done: Pádraig Brady <P <at> draigBrady.com>

Bug is archived. No further changes may be made.


From: Jim Meyering <jim <at> meyering.net>
To: Pádraig Brady <P <at> draigBrady.com>
Cc: Paul Eggert <eggert <at> CS.UCLA.EDU>, 7325 <at> debbugs.gnu.org
Subject: bug#7325: new test failure due to non-portability of printf formats like %05.3s
Date: Mon, 08 Nov 2010 16:31:21 +0100
Pádraig Brady wrote:

> On 08/11/10 14:33, Jim Meyering wrote:
>> Looks like I got very lucky here and hit a number of nanoseconds
>> that happened to be a multiple of 100,000:
>>
>>     $ for i in $(seq 1000); do touch -d '1970-01-01 18:43:33.5000000000' 2; t=$(stat -c "%.W %.X %.Y %.Z" 2); test $(echo "$t"|wc -c) -lt 57 && echo "$t"; done
>>     0.000000 63813.500000 63813.500000 1289224045.731146
>>     0.0000 63813.5000 63813.5000 1289224047.8224
>>     [Exit 1]
>>
>> I realize this is due to the way the precision estimation
>> heuristic works.  Wondering if there's a less-surprising
>> way to do that.
>
> You could snap to milli, micro, nano,
> though that would just mean it would
> happen less often.
>
>> Now, I'm thinking that this variable precision feature would be better
>> if it were somehow optional, rather than the default for %.X.
>> Consistency/reproducibility are more important here.
>
> You could touch -d '0.123456789' stat.prec.test
> at program start, but that wouldn't always work.
> A non-writable dir, or a disparity between read and
> write support for time stamp resolutions. :(
>
> You could sample X preexisting files/dirs on the
> same file system, and stop when Y have not increased
> in precision. That, combined with snapping to milli, micro, nano,
> would usually work, though that's starting to
> get too hacky IMHO while still not being general.

I agree.
The only possibility I can imagine is to use some sort of statfs-like
interface that would provide file system time stamp resolution.
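
For illustration only: nothing like this exists today, and the names below
are invented, but such an interface would only need to report something
along these lines:

    /* Purely hypothetical sketch -- no such call exists.  A statfs-like
       interface could expose the file system's time stamp granularity,
       letting stat pick a precision without guessing from sample values.  */
    struct fs_timestamp_info
    {
      long timestamp_granularity_ns;  /* e.g. 1 on ext4, 1000000000 on ext2 */
    };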

> I guess we're back to doing 9 by default for %.Y
> and using %#.Y to mean auto precision?

The default of 9 seems far less likely to cause subtle misuse.
Letting the "#" in %#.Y mean auto precision sounds appropriate.
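
To make that concrete, here is a rough sketch (not coreutils source; the
function and its arguments are made up) of the two behaviours under
discussion: plain %.Y always printing nine fractional digits, and %#.Y
meaning auto precision, here modelled as the snap-to-milli/micro/nano
variant mentioned above:

    #include <stdio.h>

    /* Rough sketch only, not coreutils code.  auto_precision == 0 models
       the proposed default for %.Y (always nine digits); nonzero models
       %#.Y (snap to 9, 6, 3 or 0 digits based on trailing zeros).  */
    static void
    print_timestamp (long sec, long nsec, int auto_precision)
    {
      int prec = 9;
      if (auto_precision)
        {
          if (nsec == 0)
            prec = 0;
          else if (nsec % 1000000 == 0)
            prec = 3;   /* only millisecond resolution present */
          else if (nsec % 1000 == 0)
            prec = 6;   /* only microsecond resolution present */
        }

      if (prec == 0)
        printf ("%ld\n", sec);
      else
        {
          long scale = 1;                   /* 10^(9-prec) */
          for (int i = prec; i < 9; i++)
            scale *= 10;
          printf ("%ld.%0*ld\n", sec, prec, nsec / scale);
        }
    }

    int
    main (void)
    {
      print_timestamp (63813, 500000000, 0);   /* 63813.500000000 */
      print_timestamp (63813, 500000000, 1);   /* 63813.500 */
      return 0;
    }

With a fixed default, scripts that compare stat output get reproducible
field widths; the auto behaviour stays available for anyone who opts in
with "#".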



