GNU bug report logs -
#71295
29.3; url-retrieve-synchronously does not timeout if initial connection hangs
Message #11 received at 71295 <at> debbugs.gnu.org (full text, mbox):
On Sat, Jun 1, 2024, at 15:38, Eli Zaretskii wrote:
> The timeout cannot work when Emacs is stuck in a system call trying to
> initiate a network connection. That is blocking, unless you use
> url-retrieve asynchronously. The timeout starts working after the
> connection is established, so it basically protects against too slow
> downloading of data after the initial connection.
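The asynchronous pattern Eli describes can be sketched like this (this is my own illustration, not code from the thread; `my-url-retrieve-with-watchdog` is a hypothetical name): start the retrieval with `url-retrieve`, and arm a watchdog timer that tears down the connection if the callback has not fired in time. Because the connection is opened asynchronously, Emacs is never stuck inside the connect system call, so the timer can actually run.

```elisp
;; Sketch only: asynchronous retrieval with a watchdog timer.
(require 'url)

(defun my-url-retrieve-with-watchdog (url callback timeout)
  "Fetch URL asynchronously; call CALLBACK with the response buffer.
If nothing has arrived after TIMEOUT seconds, kill the connection
and call CALLBACK with nil instead."
  (let* ((done nil)
         (buf (url-retrieve url
                            (lambda (&rest _)
                              (setq done t)
                              (funcall callback (current-buffer)))
                            nil t)))
    (run-at-time timeout nil
                 (lambda ()
                   (unless done
                     ;; Tear down the stuck connection so the process
                     ;; and its buffer do not linger.
                     (when (buffer-live-p buf)
                       (let ((proc (get-buffer-process buf)))
                         (when proc (delete-process proc)))
                       (kill-buffer buf))
                     (funcall callback nil))))))
```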
That makes sense, although it's not what I would have guessed from the docstring.
> I think it's a bug in Spacemacs that it attempts to solve connectivity
> problems this way.
What would be a reasonable alternative? This is the same approximate method used by, e.g., request.el. See https://github.com/tkf/emacs-request/blob/01e338c335c07e4407239619e57361944a82cb8a/request.el#L767-L775.
Is the idea just to use url-retrieve directly? I hacked together a quick function that doesn't seem to hang:
(defun url-retrieve-synchronously-but-dont-hang
    (url &optional silent inhibit-cookies timeout)
  (with-timeout (timeout
                 (error "Timed out retrieving %S after waiting for %s seconds"
                        url timeout))
    (let* (done
           (buf (url-retrieve url
                              (lambda (&rest _)
                                ;; Record the response buffer once the
                                ;; retrieval finishes.
                                (setq done (current-buffer)))
                              nil silent inhibit-cookies)))
      (when buf
        (let ((proc (get-buffer-process buf)))
          ;; Drain output until the connection closes.
          (while (accept-process-output proc))
          done)))))
This is probably rather naive of me, but I guess now I'm wondering why url-retrieve-synchronously actually sets url-asynchronous to nil. Is there a good reason not to use :nowait when it is available? It seems like it would be useful to have a wrapper around url-retrieve that just "does what I mean" here.
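For what it's worth, here is a sketch of what such a "do what I mean" wrapper might look like. It assumes (as I understand the url-http code) that leaving `url-asynchronous` at its default of t makes the backend pass `:nowait` to `make-network-process` where the platform supports it, so the initial connect no longer blocks and `with-timeout` can interrupt it. `my-url-retrieve-dwim` is just an illustrative name:

```elisp
;; Hypothetical wrapper: synchronous interface, non-blocking connect.
(require 'url)

(defun my-url-retrieve-dwim (url timeout &optional silent inhibit-cookies)
  "Fetch URL synchronously, signaling an error after TIMEOUT seconds."
  (let ((url-asynchronous t)  ; keep the default: do not force a blocking connect
        (done nil))
    (with-timeout (timeout
                   (error "Timed out retrieving %S after %s seconds"
                          url timeout))
      (let ((buf (url-retrieve url
                               (lambda (&rest _)
                                 (setq done (current-buffer)))
                               nil silent inhibit-cookies)))
        (when buf
          ;; Poll for output until the callback has run.
          (while (and (not done)
                      (accept-process-output
                       (get-buffer-process buf) 0.1)))
          done)))))
```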
This bug report was last modified 1 year and 12 days ago.