GNU bug report logs -
#19993
25.0.50; Unicode fonts defective on Windows
Message #53 received at 19993 <at> debbugs.gnu.org (full text, mbox):
On Sat, Mar 07, 2015 at 10:18:25AM +0200, Eli Zaretskii wrote:
> > > (set-fontset-font "fontset-default" '(#x1d400 . #x1d7ff) "Symbola")
> >
> > I do not follow. What is going on now? Are you saying that it should
> > NOT work out-of-the-box?
>
> On the slim chance that you'd like this to work for you, and didn't
> yet figure it out, I described what worked for me.
It would be nice if there were a recipe that works for everyone.
(After this, one could make it the default. ;-)
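For concreteness, the setting quoted above can be written out as follows (a sketch; it assumes the Symbola font is installed on the system):

```elisp
;; Assumes the Symbola font is installed.
;; Map the Mathematical Alphanumeric Symbols block (U+1D400..U+1D7FF)
;; to Symbola in the default fontset.
(set-fontset-font "fontset-default" '(#x1d400 . #x1d7ff) "Symbola")
```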
But the major hurdle is that the semantics of fontsets are completely
undocumented. After your suggestions, I think I have arrived at a
description which does not contradict anything I have seen:
=======================================================
When Emacs wants to show a character using a fontset:
• Emacs looks in the fontset and finds the font specifications associated
with this character.
• Emacs checks which Unicode Subset contains the given character.
(What if it is not unique???)
• From the fonts matching the font specifications, Emacs picks those
which have this Unicode Subset “identified” within the font.
• From these, Emacs chooses one (which?).
Emacs uses this procedure for two fontsets: the currently enabled one and
the default fontset. If neither of the two resulting fonts supports the
given character, a HEX representation is shown instead.
=======================================================
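One way to probe this description is to ask a fontset directly what it associates with a given character; `fontset-font' is a standard Emacs function, though whether its answer reflects the font actually chosen is exactly the open question above:

```elisp
;; Query which font spec the default fontset associates with a character.
;; (fontset-font FONTSET CHARACTER) returns the font spec, or nil if none.
(fontset-font "fontset-default" #x1d400)

;; Interactively, `C-u C-x =' (describe-char) on a character in a buffer
;; shows, among other things, the font Emacs actually used to display it.
```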
Is it similar to what actually happens? (I’m not asking about the
implementation, just whether there is a functional equivalence.)
Ilya