GNU bug report logs - #65491
[PATCH] Improve performance allocating vectors
Message #64 received at 65491 <at> debbugs.gnu.org:
On 16 Sep 2023, at 18:17, Eli Zaretskii <eliz <at> gnu.org> wrote:
>> The latter one completely broke the 32-bit build --with-wide-int, most
>> probably because the last argument to XUNTAG is frequently a pointer
>> to a 64-bit type, where uintptr_t is only 32-bit wide.
>
> No, that's not it: the reason is that the _first_ argument is a 64-bit
> data type, and then casting XLP(a) to uintptr_t causes the warning,
> because uintptr_t is a 32-bit type.
Let's see if I understand this correctly, as I can't try that configuration myself.
In your configuration, Lisp_Object is a 64-bit integer (or a struct containing one).
LISP_WORD_TAG(type) is also a 64-bit integer with tag bits at the top (USE_LSB_TAG=0).
The old XUNTAG was:
#define XUNTAG(a, type, ctype) ((ctype *) \
   ((char *) XLP (a) - LISP_WORD_TAG (type)))
so that we get a subtraction of a pointer and a very large 64-bit number, which results in just the pointer.
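
To make that concrete, here is a small standalone model of the arithmetic as I understand it; EMACS_UINT, VALBITS=61, and the 32-bit stand-in for the pointer are illustrative assumptions, not the real lisp.h definitions:

#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>

/* Assumed layout: 32-bit pointers, 64-bit Lisp words, tag bits at the
   top of the word (USE_LSB_TAG=0).  */
typedef uint64_t EMACS_UINT;
typedef uint32_t ptr_bits;          /* stands in for a 32-bit pointer */
#define VALBITS 61                  /* assumed: tag in the top 3 bits */
#define LISP_WORD_TAG(type) ((EMACS_UINT) (type) << VALBITS)

int
main (void)
{
  ptr_bits p = 0x08047f30;                  /* some object address */
  EMACS_UINT word = p | LISP_WORD_TAG (5);  /* tagged Lisp word */

  /* Old-XUNTAG-style untagging: subtract the very large 64-bit tag.
     All of its set bits are above bit 31, so the low 32 bits, i.e. the
     pointer, come out unchanged.  */
  EMACS_UINT diff = word - LISP_WORD_TAG (5);
  printf ("%#" PRIx32 " %#" PRIx64 "\n", p, diff);  /* both 0x8047f30 */
  return 0;
}
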
The new XUNTAG is:
#define XUNTAG(a, type, ctype) ((ctype *) \
   ((uintptr_t) XLP (a) - LISP_WORD_TAG (type)))
so you get a warning from what, conversion of a 64-bit number to (ctype *)?
Does changing the definition to
#define XUNTAG(a, type, ctype) \
  ((ctype *) ((uintptr_t) XLP (a) - (uintptr_t) LISP_WORD_TAG (type)))
help? (That is, cast the LISP_WORD_TAG return value to uintptr_t.)
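
In the same toy model as above (again with assumed widths, and an explicit 32-bit stand-in for uintptr_t so it behaves the same on any host), the proposed cast would make the subtraction happen entirely at pointer width:

#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>

typedef uint64_t EMACS_UINT;
typedef uint32_t host_uintptr;      /* stands in for a 32-bit uintptr_t */
#define VALBITS 61                  /* assumed: tag in the top 3 bits */
#define LISP_WORD_TAG(type) ((EMACS_UINT) (type) << VALBITS)

int
main (void)
{
  EMACS_UINT word = UINT64_C (0x08047f30) | LISP_WORD_TAG (5);

  /* Proposed XUNTAG: cast both operands to the 32-bit uintptr_t.  The
     tag lives entirely above bit 31, so it truncates to 0, the
     subtraction stays 32-bit, and no 64-bit intermediate reaches the
     final (ctype *) cast.  */
  host_uintptr untagged =
    (host_uintptr) word - (host_uintptr) LISP_WORD_TAG (5);
  printf ("%#" PRIx32 "\n", untagged);      /* 0x8047f30 */
  return 0;
}
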