[bug#71219] gnu: llama-cpp: Update configure flags for shared library build.

Message ID 6c6fc91b2febb791d50aac5e9a16a220090eb31d.1716786995.git.atai@atai.org
State New
Series [bug#71219] gnu: llama-cpp: Update configure flags for shared library build.

Commit Message

Andy Tai May 27, 2024, 5:16 a.m. UTC
* gnu/packages/machine-learning.scm (llama-cpp)
  [arguments]<#:configure-flags>: Add CMake configure flag to force
  position-independent code generation from the C compiler for the
  shared library build.

Change-Id: I7c4bc219a22aa9a949e811b340c7cf745b176d14
---
 gnu/packages/machine-learning.scm | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)


base-commit: 0f3a25a25e212bfa8ab9db37d267fb260a087e5d
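
For reference, CMAKE_POSITION_INDEPENDENT_CODE=TRUE is the standard CMake
variable for compiling every target as position-independent code (typically
by adding -fPIC), which object code must be when it is linked into a shared
library.  Below is a minimal sketch of the resulting #:configure-flags,
showing only the flags visible in the hunk further down; any further flags
in the actual package definition are elided:

  (arguments
   (list
    #:configure-flags
    #~'("-DCMAKE_POSITION_INDEPENDENT_CODE=TRUE" ;have CMake add -fPIC
        "-DLLAMA_BLAS=ON"
        "-DLLAMA_BLAS_VENDOR=OpenBLAS"
        "-DLLAMA_NATIVE=OFF")))                  ;no '-march=native'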
  

Comments

Andy Tai June 6, 2024, 5:52 p.m. UTC | #1
patch passes Guix QA
https://qa.guix.gnu.org/issue/71219
  
Christopher Baines June 6, 2024, 6:12 p.m. UTC | #2
Andy Tai <atai@atai.org> writes:

> patch passes Guix QA
> https://qa.guix.gnu.org/issue/71219

It does, but the build is still failing.

The status is "Succeeding" overall because the package was failing to
build before, so the situation isn't worse.
  
Andy Tai July 10, 2024, 4 a.m. UTC | #3
patch passed QA
https://qa.guix.gnu.org/issue/71219
  

Patch

diff --git a/gnu/packages/machine-learning.scm b/gnu/packages/machine-learning.scm
index a385ddc18c..398b42f203 100644
--- a/gnu/packages/machine-learning.scm
+++ b/gnu/packages/machine-learning.scm
@@ -541,7 +541,8 @@  (define-public llama-cpp
       (build-system cmake-build-system)
       (arguments
        (list
-        #:configure-flags #~'("-DLLAMA_BLAS=ON"
+        #:configure-flags #~'("-DCMAKE_POSITION_INDEPENDENT_CODE=TRUE"
+                              "-DLLAMA_BLAS=ON"
                               "-DLLAMA_BLAS_VENDOR=OpenBLAS"
 
                               "-DLLAMA_NATIVE=OFF" ;no '-march=native'