Differentiate, compile, and transform NumPy code.

JAX is Autograd and XLA, brought together for high-performance numerical computing, including large-scale machine learning research. With its updated version of Autograd, JAX can automatically differentiate native Python and NumPy functions. It can differentiate through loops, branches, recursion, and closures, and it can take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation) via grad as well as forward-mode differentiation, and the two can be composed arbitrarily to any order.
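A minimal sketch of the composable differentiation described above, assuming JAX is importable as `jax` (the example is generic JAX usage, not specific to this Guix package):

```python
import math
import jax
import jax.numpy as jnp

# A smooth scalar function built from jax.numpy primitives.
def f(x):
    return jnp.tanh(x) ** 2

df = jax.grad(f)      # reverse-mode differentiation (backpropagation)
d2f = jax.grad(df)    # transforms compose: second derivative...
d3f = jax.grad(d2f)   # ...and third, "derivatives of derivatives"

# Forward-mode: jax.jvp evaluates f and a directional derivative together.
y, tangent = jax.jvp(f, (1.0,), (1.0,))

# df(1.0) should equal the analytic derivative 2*tanh(1)*(1 - tanh(1)**2).
print(float(df(1.0)), math.tanh(1.0))
```

Because `grad`, `jvp`, and friends are ordinary function transformations, they can be nested to any order, as in `d3f` above.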


Package: python-jaxlib
Version: 0.4.20
Channel: guix-science
Definition: guix-science/packages/python.scm
Build status: 🚧
Home page: https://github.com/google/jax
Source
Installation command:
guix install python-jaxlib