1F37D ; text ; L2 ; none ; w # V7.0 (🍽) FORK AND KNIFE WITH PLATE
so you should look at the "Plain" section instead.
OK, looking in the Plain section under text-vs on macOS 11.6.5 in Safari,
I see an emoji glyph there too for U+1F37D. This is the Plain section,
and U+1F37D is at the end of the top row in this image. There are a lot
of emoji shown in that section, but fewer than in the emojiFont section.
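As a quick sanity check on the data line quoted above, Python's unicodedata module can confirm the codepoint's name (note it only exposes the character name, not the text-vs-emoji default presentation property, which lives in the Unicode emoji data files):

```python
import unicodedata

cp = "\U0001F37D"  # the codepoint from the data line above

# The name should match the comment in the data file
print(unicodedata.name(cp))  # FORK AND KNIFE WITH PLATE
```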
Howard> Can emacs be configured to display these lone codepoints via my emoji font?
Howard> I gather that's what using the 'symbol script does but also includes more.
Howard> Can I (or emacs out-of-the-box) be more selective in the call to
Howard> set-fontset-font or some other api?
Yes. Try:
(set-fontset-font t #x1f37d
'("Apple Color Emoji" . "iso10646-1") nil 'prepend)
For a range of codepoints, replace #x1f37d with something like
'(#x1f37d . #x1f3aa)
Thanks, doing these definitely gets me closer to where I'd like to be:
(set-fontset-font t '(#x1F170 . #x1F6F3) '("Apple Color Emoji" . "iso10646-1") nil 'prepend)
(set-fontset-font t '(#x2139 . #x3299) '("Apple Color Emoji" . "iso10646-1") nil 'prepend)
I'm still confused as to why the above works but this didn't:
(set-fontset-font t 'emoji '("Apple Color Emoji" . "iso10646-1") nil 'prepend)
And as I look at script-representative-chars, emoji is defined to be (emoji 127744 128512),
which I think means the hex range #x1F300 - #x1F600, so shouldn't it include #x1F37D?
Or does it not because the default presentation is text? And if so, how is that
factored into the 'emoji script symbol passed to set-fontset-font? I don't see
how that's defined other than as this range. And when I specify a range
directly I get my pretty glyph displayed.
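A quick check of the arithmetic in the question (in Python, not Emacs Lisp): the decimal endpoints from script-representative-chars do convert to the hex range mentioned, and U+1F37D does fall inside it.

```python
# Convert the decimal endpoints from script-representative-chars to hex
print(hex(127744), hex(128512))  # 0x1f300 0x1f600

# U+1F37D lies inside that range
print(0x1F300 <= 0x1F37D <= 0x1F600)  # True
```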
I donʼt think we should follow what the Mac does when it contradicts
what Unicode is telling us.
I certainly agree with this. I see that
says:
• only fully-qualified emoji zwj sequences should be generated by keyboards and other user input devices.
and, working through the definition of a fully-qualified emoji,
a lone U+1F37D is not fully qualified.
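To make that concrete: under UTS #51, a character whose default presentation is text (like U+1F37D) needs VARIATION SELECTOR-16 (U+FE0F) appended to request emoji presentation and become fully qualified. A minimal sketch in Python:

```python
base = "\U0001F37D"   # FORK AND KNIFE WITH PLATE, default presentation: text
vs16 = "\uFE0F"       # VARIATION SELECTOR-16: request emoji presentation
fully_qualified = base + vs16

print(len(base))             # 1  (a lone, unqualified character)
print(len(fully_qualified))  # 2  (base character + variation selector)
```

That second codepoint is exactly the "extra invisible character" discussed below: the fully-qualified sequence is two characters in the buffer, even though it renders as one glyph.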
If I understand Emacs's state correctly, insert-char is doing the right thing
because it's just inserting a character. I think I'm picking an emoji, but
I'm not really; I'm picking a single character (in this case U+1F37D).
A later Emacs will have an emoji input method that would
be like a real emoji picker that lets me insert a proper fully-qualified sequence.
Howard> And I'll add, if that's displayed equivalently I'd prefer it, because I wouldn't
Howard> have to deal with "extra invisible characters" after the glyph when
Howard> using emacs editing commands (unless this is different behavior in 29
Howard> than in 28 when I add the variation selector character).
Those characters get composed, so they get treated as a single
unit. They really donʼt cause any problems.
Well, C-f and C-b seem to move point between them, which is somewhat startling.
Modulo `use-default-font-for-symbols'
Howard> FWIW this variable set to t for me which I think is the default.
I meant you should try setting it to nil.
In an emacs -Q session, in the *scratch* buffer, I inserted a lone U+1F37D.
Toggling use-default-font-for-symbols had no effect on its display,
even after I did:
(set-fontset-font t 'emoji '("Apple Color Emoji" . "iso10646-1") nil 'prepend)