Package: emacs;
Reported by: Ewan <ewan <at> etown.dev>
Date: Sun, 24 Aug 2025 06:07:02 UTC
Severity: normal
Merged with 79297
Message #88 received at 79298 <at> debbugs.gnu.org (full text, mbox):
From: Eli Zaretskii <eliz <at> gnu.org>
To: Ewan <ewan <at> etown.dev>
Cc: 79298 <at> debbugs.gnu.org
Subject: Re: bug#79298: patch: full color in windows terminal
Date: Sun, 14 Sep 2025 15:03:52 +0300
> Date: Sat, 06 Sep 2025 09:19:40 +0000
> From: Ewan <ewan <at> etown.dev>
> Cc: Eli Zaretskii <eliz <at> gnu.org>, 79298 <at> debbugs.gnu.org
>
> > I might be wrong, but I believe the change is for master (Emacs
> > 31.1).  So pls modify etc/NEWS instead of etc/NEWS.30.
>
> Ahh, thank you - updated patches attached.
>
> ...-minimal: system cursor, VT sequences just for color.
> ...-full: system or Emacs-drawn cursor, complete migration to VT sequences.

Thanks.  For now, let's focus on the "minimal" patch only.  I'm also
ignoring the documentation for the time being.  My plan is to arrive at
the agreed-upon initial patch for the code, then try it locally and fix
any problems, then finalize it (including the docs).

So the comments below are intended to (a) slightly clean up the design
and implementation; and (b) avoid unnecessary changes.

> +;; W32 uses different color indexes than standard
> (defvar w32-tty-standard-colors
>   '(("black" 0 0 0 0)
>     ("blue" 1 0 0 52480) ; MediumBlue
> @@ -43,56 +42,133 @@
>     ("lightmagenta" 13 65535 0 65535) ; Magenta
>     ("yellow" 14 65535 65535 0) ; Yellow
>     ("white" 15 65535 65535 65535))
> -"A list of VGA console colors, their indices and 16-bit RGB values.")
> +  "A list of VGA console colors, their indices and 16-bit RGB values.")
> +
> +;; When using VT sequences for color, use xterm-like indices
> +(defvar w32-tty-virtual-terminal-base-colors
> +  '(("black" 0 0 0 0)
> +    ("red" 1 45568 8704 8704) ; FireBrick

What is the rationale for using different RGB components in the VT
mode?  The RGB components of the 16 colors were determined by comparing
X and PC colors on the same screen.  I presume that the VT mode uses the
same colors, so I don't understand why we need different definitions.
Can you explain?

> (defun terminal-init-w32console ()
>   "Terminal initialization function for w32 console."
>   ;; Share function key initialization with w32 gui frames
>   (x-setup-function-keys (selected-frame))
>   ;; Set terminal and keyboard encodings to the current OEM codepage.
>   (let ((oem-code-page-coding
> -         (intern (format "cp%d" (w32-get-console-codepage))))
> -        (oem-code-page-output-coding
> -         (intern (format "cp%d" (w32-get-console-output-codepage))))
> -        oem-cs-p oem-o-cs-p)
> -    (setq oem-cs-p (coding-system-p oem-code-page-coding))
> -    (setq oem-o-cs-p (coding-system-p oem-code-page-output-coding))
> -    (when oem-cs-p
> -      (set-keyboard-coding-system oem-code-page-coding)
> -      (set-terminal-coding-system
> -       (if oem-o-cs-p oem-code-page-output-coding oem-code-page-coding))
> -      ;; Since we changed the terminal encoding, we need to repeat
> -      ;; the test for Unicode quotes being displayable.
> -      (startup--setup-quote-display)))
> -  (let* ((colors w32-tty-standard-colors)
> -         (color (car colors)))
> -    (tty-color-clear)
> -    (while colors
> -      (tty-color-define (car color) (cadr color) (cddr color))
> -      (setq colors (cdr colors)
> -            color (car colors))))
> -  (clear-face-cache)
> -  ;; Figure out what are the colors of the console window, and set up
> -  ;; the background-mode correspondingly.
> -  (let* ((screen-color (get-screen-color))
> -         (bg (cadr screen-color))
> -         (descr (tty-color-by-index bg))
> -         r g b bg-mode)
> -    (setq r (nth 2 descr)
> -          g (nth 3 descr)
> -          b (nth 4 descr))
> -    (if (< (+ r g b) (* .6 (+ 65535 65535 65535)))
> -        (setq bg-mode 'dark)
> -      (setq bg-mode 'light))
> -    (set-terminal-parameter nil 'background-mode bg-mode))
> -  (tty-set-up-initial-frame-faces)
> +         (intern (format "cp%d" (w32-get-console-codepage))))
> +        (oem-code-page-output-coding
> +         (intern (format "cp%d" (w32-get-console-output-codepage))))

In the interests of making the patch smaller and easier to review, could
you either avoid changing whitespace, or produce the patch using the
--ignore-space-change command-line switch?
AFAIU, most of the above are whitespace changes (TABs to SPCs, right?),
but the code is basically left intact, by and large.

> --- a/src/term.c
> +++ b/src/term.c
> @@ -73,7 +73,6 @@ static void clear_tty_hooks (struct terminal *terminal);
>  static void set_tty_hooks (struct terminal *terminal);
>  static void dissociate_if_controlling_tty (int fd);
>  static void delete_tty (struct terminal *);
> -
>  #endif /* !HAVE_ANDROID */

There are quite a few changes that add or remove empty lines; please
avoid that, to make the patch smaller and more to-the-point.

> @@ -346,10 +345,10 @@ tty_hide_cursor (struct tty_display_info *tty)
>  {
>    if (tty->cursor_hidden == 0)
>      {
> -      tty->cursor_hidden = 1;
>  #ifdef WINDOWSNT
>        w32con_hide_cursor ();
>  #else
> +      tty->cursor_hidden = 1;
>        OUTPUT_IF (tty, tty->TS_cursor_invisible);

Is this change of order important?  If so, why?  If it isn't important,
please leave the original order intact, again to make the patch smaller
and easier to grasp.  Same in other places where AFAICT you've modified
the code for stylistic reasons, without actually changing it.

> @@ -2264,18 +2263,42 @@ tty_setup_colors (struct tty_display_info *tty, int mode)
>      default:
>        tty_default_color_capabilities (tty, 0);
>        break;
> -    case 8: /* 8 standard ANSI colors */
> -      tty->TS_orig_pair = "\033[0m";
> +    case 8: /* 8 standard ANSI colors */
> +      tty->TS_orig_pair = "\x1b[0m";
> +      tty->TS_set_foreground = "\x1b[3%dm";
> +      tty->TS_set_background = "\x1b[4%dm";
>  #ifdef TERMINFO
> -      tty->TS_set_foreground = "\033[3%p1%dm";
> -      tty->TS_set_background = "\033[4%p1%dm";
> -#else
> -      tty->TS_set_foreground = "\033[3%dm";
> -      tty->TS_set_background = "\033[4%dm";
> +      tty->TS_set_foreground = "\x1b[3%p1%dm";
> +      tty->TS_set_background = "\x1b[4%p1%dm";
> +#endif
> +#ifdef WINDOWSNT
> +      tty->TS_orig_pair = "\x1b[39m\x1b[49m";
> +      tty->TS_set_foreground = "\x1b[%lum";
> +      tty->TS_set_background = "\x1b[%lum";

I see you've consistently replaced \033 with \x1b.
Any reason for that?  If it's possible to use \033 as in the original
code, it will again make the patch smaller and easier to read.

Also, WINDOWSNT and TERMINFO are mutually exclusive, so please use
#elif to show that.

>  {
> -    tty->TS_set_foreground = "\033[%?%p1%{8}%<%t3%p1%d%e38;2;%p1%{65536}%/%d;%p1%{256}%/%{255}%&%d;%p1%{255}%&%d%;m";
> -    tty->TS_set_background = "\033[%?%p1%{8}%<%t4%p1%d%e48;2;%p1%{65536}%/%d;%p1%{256}%/%{255}%&%d;%p1%{255}%&%d%;m";
> +    tty->TS_set_foreground = "\x1b[%?%p1%{8}%<%t3%p1%d%e38;2;%p1%{65536}%/%d;%p1%{256}%/%{255}%&%d;%p1%{255}%&%d%;m";
> +    tty->TS_set_background = "\x1b[%?%p1%{8}%<%t4%p1%d%e48;2;%p1%{65536}%/%d;%p1%{256}%/%{255}%&%d;%p1%{255}%&%d%;m";
>      tty->TN_max_colors = 16777216;
>  }

This AFAIU also merely replaces \033 with \x1b.

> +#define SEQMAX 256 /* Arbitrary upper limit on VT sequence size */

GNU coding conventions frown on arbitrary limits.  At the very least,
let's make sure the SSPRINTF macro bails out if it needs to produce
more than SEQMAX bytes, and let's have an eassert there in case it
bails out early because the caller wanted more than that -- so that any
such problems could be caught early.

> +/* For debugging */
> +static void
> +vt_seq_error (char *seq)

Please explain in a comment how this is supposed to be used for
debugging.

> +      turn_on_face (f, face_id);
> +      WriteConsole (cur_screen, conversion_buffer,
> +                    coding->produced, &r, NULL);

Why do you use WriteConsole here, but WriteConsoleA elsewhere?  Is
there a difference?  If not, let's be consistent.

> static void
> w32con_update_begin (struct frame * f)
> {
> +  current_tty = FRAME_TTY (f);
> +
> +  if (!w32_use_virtual_terminal_sequences
> +      && current_tty->TN_max_colors > 16)
> +    {
> +      tty_setup_colors (current_tty, 16);
> +      safe_calln (Qw32con_set_up_initial_frame_faces);
> +    }

Why is this needed?  This will call to Lisp each redisplay cycle, which
is expensive and will slow down redisplay.  Why do we need to set up
colors and faces each redisplay cycle?
We never needed this before.

> +/* returns the pixel value for the given index into VT base color map */
> +static unsigned long pixel_cache[16];
> +static unsigned long
> +get_pixel (unsigned long index)
> +{
> +  unsigned int i = (unsigned int) index;
> +  if (i > 15) return 0;
> +  if (i == 0 || pixel_cache[i] > 0)
> +    return pixel_cache[i];
> +
> +  Lisp_Object pix = safe_calln (Qw32con_get_pixel, make_ufixnum (i));
> +  pixel_cache[i] = (unsigned long) XUFIXNUM (pix);
> +  return pixel_cache[i];
> +}

This should be implemented in C to avoid a costly call into Lisp (each
time we need to use a different color!).  Reusing existing code is
important, but speed of redisplay trumps that by a large margin.

> +  DEFVAR_BOOL ("w32-use-virtual-terminal-sequences",
> +               w32_use_virtual_terminal_sequences,
> +    doc: /* If non-nil w32 console uses terminal sequences for some output processing.
> +This variable is set automatically based on the capabilities of the terminal.
> +It determines the number and indices of colors used for faces in the terminal.
> +If the terminal cannot handle VT sequences, the update hook triggers recomputation of faces.
> +See `w32con-set-up-initial-frame-faces', which should be called after setting this variable
> +manually in a running session.  */);
> +  w32_use_virtual_terminal_sequences = 0;

Do we want to support setting this from Lisp or by the user (in
addition to setting it automatically at startup)?  If yes, does setting
it require some initialization?

Thanks again for working on this.