GNU bug report logs - #20154
25.0.50; json-encode-string is too slow for large strings


Package: emacs

Reported by: Dmitry Gutov <dgutov <at> yandex.ru>

Date: Fri, 20 Mar 2015 14:27:01 UTC

Severity: normal

Found in version 25.0.50

Done: Dmitry Gutov <dgutov <at> yandex.ru>

Bug is archived. No further changes may be made.


From: Eli Zaretskii <eliz <at> gnu.org>
To: Dmitry Gutov <dgutov <at> yandex.ru>
Cc: 20154 <at> debbugs.gnu.org
Subject: bug#20154: 25.0.50; json-encode-string is too slow for large strings
Date: Sun, 22 Mar 2015 20:32:38 +0200
> Date: Sun, 22 Mar 2015 20:26:37 +0200
> From: Dmitry Gutov <dgutov <at> yandex.ru>
> CC: 20154 <at> debbugs.gnu.org
> 
> The question of "why encode everything again" comes down to programmer's 
> convenience, and to not re-implementing parts of the JSON encoder.
> 
> At least until `json-encode' has a way to pass an already-encoded string 
> verbatim, how else would you encode an alist like
> 
>        `(("file_data" .
>           ((,full-path . (("contents" . ,file-contents)
>                           ("filetypes" . ,file-types)))))
>          ("filepath" . ,full-path)
>          ("line_num" . ,line-num)
>          ("column_num" . ,column-num))
> 
> to JSON, except by encoding everything again?
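
For reference, a minimal sketch of how an alist like the one quoted
above is typically serialized with `json-encode' from json.el; the
bindings below are placeholder values, not taken from the original
client code:

  (require 'json)

  (let* ((full-path "/tmp/example.el")          ; placeholder values
         (file-contents "(message \"hello\")")  ; stands in for the large,
         (file-types '("elisp"))                ; frequently re-encoded string
         (line-num 1)
         (column-num 1))
    ;; `json-encode' walks the whole structure and calls
    ;; `json-encode-string' on every string value, so the cost is
    ;; dominated by the size of `file-contents' on each request.
    (json-encode
     `(("file_data" .
        ((,full-path . (("contents" . ,file-contents)
                        ("filetypes" . ,file-types)))))
       ("filepath" . ,full-path)
       ("line_num" . ,line-num)
       ("column_num" . ,column-num))))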

Caveat: I'm probably missing something simple here, so excuse me in
advance for asking stupid questions.

You said you need to encode everything on every keystroke, so I was
wondering why you couldn't encode just the new keystroke, and append
the result to what you already encoded earlier.  Then send everything
to the server, as it expects.  The problem is in encoding, not in
sending.
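
A hypothetical sketch of that append-only idea, assuming new text is
only ever added at the end of the string; the helper names below are
illustrative and not part of json.el:

  (require 'json)

  (defvar cached-encoded-body ""
    "JSON-escaped body of the text already sent, without the quotes.")

  (defun encode-fragment (fragment)
    "Return FRAGMENT escaped for JSON, minus the surrounding quotes."
    ;; `json-encode-string' wraps its result in double quotes; strip
    ;; them so escaped fragments can be concatenated.
    (substring (json-encode-string fragment) 1 -1))

  (defun append-keystroke (inserted-text)
    "Escape only INSERTED-TEXT and splice it onto the cached body."
    (setq cached-encoded-body
          (concat cached-encoded-body (encode-fragment inserted-text)))
    ;; Re-wrap in quotes to get a complete JSON string literal.
    (concat "\"" cached-encoded-body "\""))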






