https://enzyme.mit.edu/

# Enzyme Automatic Differentiation Framework

## Enzyme Overview

The Enzyme project is a tool for performing reverse-mode automatic differentiation (AD) of statically analyzable LLVM IR. This allows developers to use Enzyme to automatically create gradients of their source code without much additional work.

```c
double foo(double);

double grad_foo(double x) {
    return __enzyme_autodiff(foo, x);
}
```

By differentiating code after optimization, Enzyme is able to create substantially faster derivatives than existing tools that differentiate programs before optimization.

## Components

Enzyme is composed of four pieces:

* An optional preprocessing phase which performs minor transformations that tend to be helpful for AD.
* A new interprocedural type analysis that deduces the underlying types of memory locations.
* An activity analysis that determines which instructions or values can impact the derivative computation (common in existing AD systems).
* An optimization pass which creates any required derivative functions, replacing calls to __enzyme_autodiff with the generated functions.

## More resources

For more information on Enzyme, please see:

* The Enzyme getting started guide
* The Enzyme mailing list for any questions
* Previous talks

## Citing Enzyme

To cite Enzyme, please cite the following:

```bibtex
@incollection{enzymeNeurips,
  title     = {Instead of Rewriting Foreign Code for Machine Learning, Automatically Synthesize Fast Gradients},
  author    = {Moses, William S. and Churavy, Valentin},
  booktitle = {Advances in Neural Information Processing Systems 33},
  year      = {2020},
}
```

The original Enzyme paper is also available as a preprint on arXiv.