[bug#72471,00/24] Update PyTorch and dependencies

Message ID 20240804220847.15842-1-david.elsing@posteo.net

Message

David Elsing Aug. 4, 2024, 9:53 p.m. UTC
Hello,

this patch series updates python-pytorch to version 2.4.0 and some of
its dependencies and dependent packages.

I noticed that on Linux, upstream does not support 32-bit architectures
anyway, so I set supported-systems to x86_64-linux and aarch64-linux,
but tested only the former.
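
For readers unfamiliar with the field being referred to: supported-systems
is a regular field of the Guix package record, taking a list of system
strings. A minimal sketch of the relevant part of the python-pytorch
definition (all other fields elided, not the actual package source):

    (package
      (name "python-pytorch")
      ;; ...
      ;; Upstream only supports 64-bit architectures on Linux.
      (supported-systems '("x86_64-linux" "aarch64-linux")))

Packages restricted this way are skipped by the build farm and by
'guix build' on other systems rather than failing mid-build.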

For onnx and onnx-optimizer, I referenced the patches in
https://issues.guix.gnu.org/65630 and therefore added a "Co-authored-by"
line for Andy Tai. Is that correct to do?

r-torch still depends on version 2.0.1 of PyTorch, so it is kept.

Cheers,
David

David Elsing (24):
  gnu: asmjit: Update to commit 062e69c.
  gnu: Remove python-typing-extensions-4.10.
  gnu: python-optree: Update to 0.11.0.
  gnu: flatbuffers-next: Update to 24.3.25.
  gnu: pthreadpool: Update to commit 560c60d.
  gnu: cpuinfo: Update to commit 05332fd.
  gnu: nnpack: Depend on python-peachpy only for x86_64.
  gnu: oneapi-dnnl: Update to 3.5.3.
  gnu: fbgemm: Update to 0.7.0.
  gnu: xnnpack: Update to commit 08f1489.
  gnu: Add xnnpack-for-r-torch.
  gnu: python-nbval: Update to 0.11.0.
  gnu: Add python-parameterized-next.
  gnu: Remove onnx-optimizer-for-torch2.
  gnu: Remove onnx-for-torch2.
  gnu: onnx: Update to 1.16.2.
  gnu: onnx-optimizer: Update to 0.3.19.
  gnu: gloo: Update to commit 81925d1.
  gnu: cpp-httplib: Update to 0.16.0.
  gnu: python-pytorch: Update to 2.4.0.
  gnu: python-torchvision: Update to 0.19.0.
  gnu: python-lightning-utilities: Update to 0.11.6.
  gnu: python-torchmetrics: Update to 1.4.1.
  gnu: python-pytorch-lightning: Update to commit 2064887.

 gnu/local.mk                                  |    3 -
 gnu/packages/check.scm                        |   22 +
 gnu/packages/cpp.scm                          |    8 +-
 gnu/packages/machine-learning.scm             |  778 +++--
 gnu/packages/parallel.scm                     |   16 +-
 .../onnx-optimizer-system-library.patch       |   60 +-
 .../patches/onnx-shared-libraries.patch       |   18 +-
 .../patches/onnx-skip-model-downloads.patch   |   16 +-
 .../patches/onnx-use-system-googletest.patch  |   57 -
 .../patches/python-pytorch-fix-codegen.patch  |   26 +-
 .../patches/python-pytorch-runpath.patch      |   19 +-
 .../python-pytorch-system-libraries.patch     |  122 +-
 .../python-pytorch-without-kineto.patch       |   10 +-
 .../patches/xnnpack-remove-broken-tests.patch |  337 ---
 .../patches/xnnpack-system-libraries.patch    | 2660 -----------------
 gnu/packages/python-build.scm                 |   12 -
 gnu/packages/python-check.scm                 |   27 +-
 gnu/packages/python-xyz.scm                   |    9 +-
 gnu/packages/serialization.scm                |    4 +-
 19 files changed, 641 insertions(+), 3563 deletions(-)
 delete mode 100644 gnu/packages/patches/onnx-use-system-googletest.patch
 delete mode 100644 gnu/packages/patches/xnnpack-remove-broken-tests.patch
 delete mode 100644 gnu/packages/patches/xnnpack-system-libraries.patch

Comments

Ludovic Courtès Sept. 6, 2024, 9:54 a.m. UTC | #1
Hi David,

David Elsing <david.elsing@posteo.net> skribis:

> this patch series updates python-pytorch to version 2.4.0 and some of
> its dependencies and dependent packages.
>
> I noticed that on Linux, upstream does not support 32-bit architectures
> anyway, so I set supported-systems to x86_64-linux and aarch64-linux,
> but tested only the former.
>
> For onnx and onnx-optimizer, I referenced the patches in
> https://issues.guix.gnu.org/65630 and therefore added a "Co-authored-by"
> line for Andy Tai. Is that correct to do?

Yes.
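
[For reference, the trailer goes at the very end of the commit message,
after the GNU-style changelog entry. A hypothetical example following the
series' own commit conventions; the name and address below are
placeholders, not the actual co-author's address:

    gnu: onnx: Update to 1.16.2.

    * gnu/packages/machine-learning.scm (onnx): Update to 1.16.2.

    Co-authored-by: Jane Doe <jane@example.org>
]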

> r-torch still depends on version 2.0.1 of PyTorch, so it is kept.

Makes sense.

I applied the whole series (thanks, ‘mumi am’!), rebased on top of
‘master’, and confirmed that r-torch and python-pytorch build fine on
x86_64-linux (since qa.guix is lagging behind).

Pushed as 571c605f17481e8c606c876e04129d99632bc2ec.

Thanks for all the work!

Ludo’.