Differentiate, compile, and transform NumPy code.
JAX is Autograd and XLA, brought together for high-performance numerical computing, including large-scale machine learning research. With its updated version of Autograd, JAX can automatically differentiate native Python and NumPy functions. It can differentiate through loops, branches, recursion, and closures, and it can take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation) via `grad` as well as forward-mode differentiation, and the two can be composed arbitrarily to any order.
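As a quick illustration (a minimal sketch; the function `f` and the sample inputs are invented for this example), reverse-mode `grad` composes with itself to any order and with forward-mode `jax.jvp`:

```python
import jax
import jax.numpy as jnp

def f(x):
    # Plain Python control flow: grad traces with concrete values,
    # so branches like this differentiate without special constructs.
    return jnp.sin(x) * x ** 2 if x > 0 else -x

df = jax.grad(f)      # reverse-mode (backpropagation)
ddf = jax.grad(df)    # a derivative of a derivative...
dddf = jax.grad(ddf)  # ...composed to any order

print(df(1.0), ddf(1.0), dddf(1.0))

# Forward mode: jvp returns f(x) and its directional derivative together.
primal_out, tangent_out = jax.jvp(f, (1.0,), (1.0,))
```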
| Package | python-jaxlib |
|---|---|
| Version | 0.4.20 |
| Channel | guix-science |
| Home page | https://github.com/google/jax |
| Source | — |
| Installation command | `guix install python-jaxlib` |
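Because python-jaxlib lives in the guix-science channel rather than in Guix proper, that channel must be enabled before the installation command above can find the package. Below is a minimal `~/.config/guix/channels.scm` sketch; the repository URL is assumed to be the channel's public GitHub home, and the channel introduction (authentication commit and signing key) is omitted, so consult the channel's own documentation for the authoritative snippet.

```scheme
;; ~/.config/guix/channels.scm -- minimal sketch; see assumptions above.
(cons (channel
        (name 'guix-science)
        ;; Assumed public repository URL for the guix-science channel.
        (url "https://github.com/guix-science/guix-science.git"))
      %default-channels)
```

After running `guix pull`, `guix install python-jaxlib` resolves against the newly added channel.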