GNU bug report logs -
#61052
[PATCH] download: Add url-fetch/xz-file.
Reported by: Hilton Chain <hako <at> ultrarare.space>
Date: Wed, 25 Jan 2023 09:09:01 UTC
Severity: normal
Tags: moreinfo, patch
Done: Hilton Chain <hako <at> ultrarare.space>
Bug is archived. No further changes may be made.
[Message part 1 (text/plain, inline)]
Your bug report
#61052: [PATCH] download: Add url-fetch/xz-file.
which was filed against the guix-patches package, has been closed.
The explanation is attached below, along with your original report.
If you require more details, please reply to 61052 <at> debbugs.gnu.org.
--
61052: https://debbugs.gnu.org/cgi/bugreport.cgi?bug=61052
GNU Bug Tracking System
Contact help-debbugs <at> gnu.org with problems
[Message part 2 (message/rfc822, inline)]
On Sun, 26 Feb 2023 03:25:24 +0800,
Tobias Geerinckx-Rice wrote:
>
> Hi Hilton,
>
> I agree with Ludo' and also wonder if a generic
> ‘url-fetch/compressed-file’ wouldn't be better. There are closure
> arguments to be made for this xz-only approach. I don't know if
> they're convincing. Cluebats welcome.
>
> (I was going to bring up ‘url-fetch/tarbomb’ as an example, but it
> doesn't actually handle anything besides gzip! Madness.)
>
> On 2023-01-25 10:07, Hilton Chain wrote:
> > + (setenv "XZ_OPT"
> > + (string-join (%xz-parallel-args)))
>
> Why set this kluge…
>
> > + (invoke (string-append #+xz "/bin/unxz")
> > + #$file-name)
>
> …when we have full control over xz's arguments?
>
> Kind regards,
>
> T G-R
>
> Sent from a Web browser. Excuse or enjoy my brevity.
Sorry for the long delay...
Yes, I would prefer a generic approach. But currently I don't have a
use case for this url-fetch/xz-file or anything more generic, so I'll
close the issue for now.
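For reference, the two styles discussed above — the XZ_OPT environment-variable kluge versus passing the flag directly to the xz invocation — are equivalent from xz's point of view. A minimal sketch (file names are illustrative; assumes `xz` is on PATH):

```shell
set -e
printf 'hello\n' > demo.txt
xz --keep demo.txt        # produces demo.txt.xz, keeps demo.txt
rm demo.txt

# Variant 1: the XZ_OPT kluge — xz reads extra options from the environment.
XZ_OPT='-T0' xz --decompress --keep demo.txt.xz
rm demo.txt

# Variant 2: pass the parallelism flag directly, since we control the
# command line anyway.  This is what Tobias suggests.
xz --decompress -T0 demo.txt.xz
cat demo.txt
```

Both variants decompress `demo.txt.xz` with threading enabled (`-T0` means "as many threads as cores"); the second avoids mutating the process environment.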
[Message part 3 (message/rfc822, inline)]
* guix/download.scm (url-fetch/xz-file): New variable.
---
guix/download.scm | 43 +++++++++++++++++++++++++++++++++++++++++++
1 file changed, 43 insertions(+)
diff --git a/guix/download.scm b/guix/download.scm
index 2e9ecb43fc..cce62c4185 100644
--- a/guix/download.scm
+++ b/guix/download.scm
@@ -41,6 +41,7 @@ (define-module (guix download)
(url-fetch* . url-fetch)
url-fetch/executable
url-fetch/tarbomb
+ url-fetch/xz-file
url-fetch/zipbomb
download-to-store))
@@ -602,6 +603,48 @@ (define tar
#:graft? #f
#:local-build? #t)))
+(define* (url-fetch/xz-file url hash-algo hash
+ #:optional name
+ #:key (system (%current-system))
+ (guile (default-guile)))
+ "Similar to 'url-fetch' but decompress the xz file at URL as the result.
+This is mainly used for adding xz-compressed patches to an origin definition."
+ (define file-name
+ (match url
+ ((head _ ...)
+ (basename head))
+ (_
+ (basename url))))
+ (define xz
+ (module-ref (resolve-interface '(gnu packages compression)) 'xz))
+
+ (mlet %store-monad ((drv (url-fetch* url hash-algo hash
+ (or name (basename file-name ".xz"))
+ #:system system
+ #:guile guile))
+ (guile (package->derivation guile system)))
+ ;; Take the xz file, and simply decompress it.
+ ;; Use ungrafted xz so that the resulting file doesn't depend on whether
+ ;; grafts are enabled.
+ (gexp->derivation (or name file-name)
+ (with-imported-modules '((guix build utils))
+ #~(begin
+ (use-modules (guix build utils))
+ (setenv "XZ_OPT"
+ (string-join (%xz-parallel-args)))
+
+ (copy-file #$drv #$file-name)
+ (make-file-writable #$file-name)
+ (invoke (string-append #+xz "/bin/unxz")
+ #$file-name)
+
+ (copy-file (basename #$file-name ".xz")
+ #$output)))
+ #:system system
+ #:guile-for-build guile
+ #:graft? #f
+ #:local-build? #t)))
+
(define* (url-fetch/zipbomb url hash-algo hash
#:optional name
#:key (system (%current-system))
base-commit: 718223c58c20fa066527fb30da2b5dccca82913f
--
2.39.1