GNU bug report logs - #70175
[PATCH] gnu: llama-cpp: support OpenBLAS for faster prompt processing


Package: guix-patches;

Reported by: John Fremlin <john <at> fremlin.org>

Date: Thu, 4 Apr 2024 03:49:01 UTC

Severity: normal

Tags: patch

Done: Christopher Baines <mail <at> cbaines.net>

Bug is archived. No further changes may be made.


From: help-debbugs <at> gnu.org (GNU bug Tracking System)
To: Christopher Baines <mail <at> cbaines.net>
Cc: tracker <at> debbugs.gnu.org
Subject: bug#70175: closed ([PATCH] gnu: llama-cpp: support OpenBLAS for
 faster prompt processing)
Date: Fri, 05 Apr 2024 11:36:02 +0000
[Message part 1 (text/plain, inline)]
Your message dated Fri, 05 Apr 2024 12:35:02 +0100
with message-id <871q7kghzt.fsf <at> cbaines.net>
and subject line Re: [bug#70175] [PATCH] gnu: llama-cpp: support OpenBLAS for faster prompt processing
has caused the debbugs.gnu.org bug report #70175,
regarding [PATCH] gnu: llama-cpp: support OpenBLAS for faster prompt processing
to be marked as done.

(If you believe you have received this mail in error, please contact
help-debbugs <at> gnu.org.)


-- 
70175: https://debbugs.gnu.org/cgi/bugreport.cgi?bug=70175
GNU Bug Tracking System
Contact help-debbugs <at> gnu.org with problems
[Message part 2 (message/rfc822, inline)]
From: John Fremlin <john <at> fremlin.org>
To: guix-patches <at> gnu.org
Cc: John Fremlin <john <at> fremlin.org>
Subject: [PATCH] gnu: llama-cpp: support OpenBLAS for faster prompt processing
Date: Wed,  3 Apr 2024 23:46:25 -0400
OpenBLAS is recommended by https://github.com/ggerganov/llama.cpp

Change-Id: Iaf6f22252da13e2d6f503992878b35b0da7de0aa
---
 gnu/packages/machine-learning.scm | 5 ++++-
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/gnu/packages/machine-learning.scm b/gnu/packages/machine-learning.scm
index 225bff0ca2..ea3674ce3e 100644
--- a/gnu/packages/machine-learning.scm
+++ b/gnu/packages/machine-learning.scm
@@ -542,6 +542,8 @@ (define-public llama-cpp
       (build-system cmake-build-system)
       (arguments
        (list
+        #:configure-flags
+        '(list "-DLLAMA_BLAS=ON" "-DLLAMA_BLAS_VENDOR=OpenBLAS")
         #:modules '((ice-9 textual-ports)
                     (guix build utils)
                     ((guix build python-build-system) #:prefix python:)
@@ -576,8 +578,9 @@ (define-public llama-cpp
               (lambda _
                 (copy-file "bin/main" (string-append #$output "/bin/llama")))))))
       (inputs (list python))
+      (native-inputs (list pkg-config))
       (propagated-inputs
-       (list python-numpy python-pytorch python-sentencepiece))
+       (list python-numpy python-pytorch python-sentencepiece openblas))
       (home-page "https://github.com/ggerganov/llama.cpp")
       (synopsis "Port of Facebook's LLaMA model in C/C++")
       (description "This package provides a port to Facebook's LLaMA collection

base-commit: 1441a205b1ebb610ecfae945b5770734cbe8478c
-- 
2.41.0
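For readers building llama.cpp outside Guix, the two `#:configure-flags` in the patch correspond roughly to the following manual CMake invocation. This is a sketch only: it assumes a C/C++ toolchain, CMake, pkg-config, and OpenBLAS development headers are already installed, and it targets the llama.cpp tree as of this patch (upstream later renamed these options, so newer checkouts may expect different flag names).

```
# Clone the upstream repository named in the commit message.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp

# Configure with the same BLAS flags the Guix patch passes to
# cmake-build-system via #:configure-flags.
cmake -B build \
      -DLLAMA_BLAS=ON \
      -DLLAMA_BLAS_VENDOR=OpenBLAS

# Build; prompt processing should then use OpenBLAS-accelerated GEMM.
cmake --build build --parallel
```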



[Message part 3 (message/rfc822, inline)]
From: Christopher Baines <mail <at> cbaines.net>
To: John Fremlin via Guix-patches via <guix-patches <at> gnu.org>
Cc: John Fremlin <john <at> fremlin.org>, 70175-done <at> debbugs.gnu.org
Subject: Re: [bug#70175] [PATCH] gnu: llama-cpp: support OpenBLAS for faster
 prompt processing
Date: Fri, 05 Apr 2024 12:35:02 +0100
[Message part 4 (text/plain, inline)]
John Fremlin via Guix-patches via <guix-patches <at> gnu.org> writes:

> OpenBLAS is recommended by https://github.com/ggerganov/llama.cpp
>
> Change-Id: Iaf6f22252da13e2d6f503992878b35b0da7de0aa
> ---
>  gnu/packages/machine-learning.scm | 5 ++++-
>  1 file changed, 4 insertions(+), 1 deletion(-)

Looks good to me, I tweaked the commit message a bit and pushed this to
master as d8a63bbcee616f224c10462dbfb117ec009c50d8.

Chris
[signature.asc (application/pgp-signature, inline)]



