| Field | Value |
|---|---|
| Message ID | 6c6fc91b2febb791d50aac5e9a16a220090eb31d.1716786995.git.atai@atai.org |
| State | New |
| Series | [bug#71219] gnu: llama-cpp: Update configure flags for shared library build. |
patch passes Guix QA https://qa.guix.gnu.org/issue/71219
Andy Tai <atai@atai.org> writes:

> patch passes Guix QA
> https://qa.guix.gnu.org/issue/71219

It does, but the build is still failing. The status is "Succeeding" overall because the package was failing to build before, so the situation isn't worse.
```
diff --git a/gnu/packages/machine-learning.scm b/gnu/packages/machine-learning.scm
index a385ddc18c..398b42f203 100644
--- a/gnu/packages/machine-learning.scm
+++ b/gnu/packages/machine-learning.scm
@@ -541,7 +541,8 @@ (define-public llama-cpp
     (build-system cmake-build-system)
     (arguments
      (list
-      #:configure-flags #~'("-DLLAMA_BLAS=ON"
+      #:configure-flags #~'("-DCMAKE_POSITION_INDEPENDENT_CODE=TRUE"
+                            "-DLLAMA_BLAS=ON"
                             "-DLLAMA_BLAS_VENDOR=OpenBLAS"
                             "-DLLAMA_NATIVE=OFF" ;no '-march=native'
```
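For context, passing `-DCMAKE_POSITION_INDEPENDENT_CODE=TRUE` at configure time initializes the `POSITION_INDEPENDENT_CODE` property on every target, so GCC/Clang compile with `-fPIC`, which lets static archives be linked into a shared library. A minimal sketch of the equivalent `CMakeLists.txt` fragment (the target and file names here are illustrative, not llama.cpp's actual build files):

```cmake
# Illustrative fragment; target/file names are hypothetical.
cmake_minimum_required(VERSION 3.13)
project(example C CXX)

# Equivalent to passing -DCMAKE_POSITION_INDEPENDENT_CODE=TRUE on the
# command line: all targets created below default to -fPIC on GCC/Clang.
set(CMAKE_POSITION_INDEPENDENT_CODE TRUE)

add_library(ggml STATIC ggml.c)       # static archive, still compiled as PIC
add_library(llama SHARED llama.cpp)   # shared library can now link the archive
target_link_libraries(llama PRIVATE ggml)
```

Without the PIC flag, linking a non-PIC static archive into a `SHARED` library typically fails at link time with a relocation error on x86-64, which is the usual motivation for this configure flag.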