GNU bug report logs -
#19993
25.0.50; Unicode fonts defective on Windows
Message #50 received at 19993 <at> debbugs.gnu.org (full text, mbox):
On Sat, Mar 07, 2015 at 10:18:25AM +0200, Eli Zaretskii wrote:
> > Date: Fri, 6 Mar 2015 14:13:51 -0800
> > From: Ilya Zakharevich <ilya <at> math.berkeley.edu>
> > Cc: 19993 <at> debbugs.gnu.org
> >
> > On Fri, Mar 06, 2015 at 11:12:17PM +0200, Eli Zaretskii wrote:
> > > This will set up the default fontset to use Symbola for the
> > > Mathematical Alphanumeric Symbols block:
> > >
> > > (set-fontset-font "fontset-default" '(#x1d400 . #x1d7ff) "Symbola")
> >
> > I do not follow. What is going on now? Are you saying that it should
> > NOT work out-of-the-box?
>
> On the slim chance that you'd like this to work for you, and didn't
> yet figure it out, I described what worked for me.
>
> Also, others who read this discussion, now or in the future, might
> benefit from this information.
Thanks for clarifying this. However, my question still remains unanswered…
> > And what fontset-default has to do with this discussion? It is not
> > used by Emacs, right?
>
> AFAICT, it _is_ used. At least if I evaluate the above in "emacs -Q",
> and then type characters in the range, they are displayed using
> Symbola, instead of showing a box with hex code, per glyphless
> character display.
I see: I mixed it up with “standard fontset”. Thanks.
Ilya
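[Editor's note: for readers who find this thread later, the recipe Eli describes can be dropped into an init file as follows. This is a minimal sketch, assuming the Symbola font is installed on the system; the range and fontset name are taken from the message above.]

```elisp
;; Map the Mathematical Alphanumeric Symbols block (U+1D400..U+1D7FF)
;; to Symbola in the default fontset, as discussed in this thread.
;; "fontset-default" is the fallback fontset Emacs consults when the
;; current fontset has no entry for a character.
(set-fontset-font "fontset-default" '(#x1d400 . #x1d7ff) "Symbola")
```

To check which font Emacs actually selected for a given character, put point on it and run `M-x describe-char`; the output names the font used for display.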