GNU bug report logs - #20154: 25.0.50; json-encode-string is too slow for large strings
Reported by: Dmitry Gutov <dgutov <at> yandex.ru>
Date: Fri, 20 Mar 2015 14:27:01 UTC
Severity: normal
Found in version 25.0.50
Done: Dmitry Gutov <dgutov <at> yandex.ru>
Bug is archived. No further changes may be made.
> Date: Fri, 20 Mar 2015 17:20:25 +0200
> From: Dmitry Gutov <dgutov <at> yandex.ru>
> CC: 20154 <at> debbugs.gnu.org
>
> On 03/20/2015 05:03 PM, Eli Zaretskii wrote:
>
> > Yes, I could. What's your point, though?
>
> That if asking the question takes the same time as doing the profiling
> yourself, the latter would be more efficient. I don't really mind, just
> puzzled.
I have other things on my plate while I read email. If my advice
bothers you, I can shut up in the future.
> > json-encode-char and mapconcat take most of the time, so it seems.
>
> So it does. But here's an alternative implementation I tried:
>
> (defun json-encode-big-string (str)
>   (with-temp-buffer
>     (insert str)
>     (goto-char (point-min))
>     (while (re-search-forward "[\"\\/\b\f\n\r\t]\\|[^ -~]" nil t)
>       (replace-match (json-encode-char (char-after (match-beginning 0)))
>                      t t))
>     (format "\"%s\"" (buffer-string))))
>
> It takes 0.15s here, which is still too long.
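[Editorial aside: a quick way to get that sort of timing is the benchmark-run macro. This is only a sketch; the size and contents of the test string below are assumptions, not the data used in the report.]

(require 'json)
(require 'benchmark)

;; Build a largish string containing characters that need escaping,
;; then time one call of the buffer-based encoder quoted above.
;; benchmark-run returns (ELAPSED-SECONDS GC-RUNS GC-ELAPSED).
(let ((str (mapconcat #'identity
                      (make-list 100000 "text with \"quotes\" and\n")
                      "")))
  (benchmark-run 1
    (json-encode-big-string str)))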
I suggest rewriting json-encode-char; it does a lot of unnecessary
work, starting with the call to encode-char (which was needed in
Emacs 22 and before, but not anymore). The call to rassoc is also
redundant, since your regexp already covers that.
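[Editorial aside: a minimal sketch of the kind of simplification being suggested, not the change that was eventually installed. It handles the short JSON escapes directly and falls back to \uXXXX, with no encode-char call, since character codes in Emacs 23 and later are already Unicode code points. The function name is hypothetical.]

(defun my-json-escape-char (char)
  "Return the JSON escape sequence for CHAR.
Assumes the caller already knows CHAR needs escaping, e.g. because
it was matched by a regexp like the one in the quoted function."
  (pcase char
    (?\" "\\\"")
    (?\\ "\\\\")
    (?/  "\\/")
    (?\b "\\b")
    (?\f "\\f")
    (?\n "\\n")
    (?\r "\\r")
    (?\t "\\t")
    ;; Anything else (e.g. a control character) gets the generic
    ;; \uXXXX form; CHAR is already a Unicode code point, so no
    ;; encode-char call is needed (sufficient for BMP characters).
    (_   (format "\\u%04x" char))))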