Some highlights on Source-to-Source Adjoint AD

Abstract: Algorithmic Differentiation (AD) provides the analytic derivatives of functions given as programs. Adjoint AD, which computes gradients, is similar to back-propagation in Machine Learning. AD researchers study strategies to overcome the difficulties of adjoint AD and bring it closer to its theoretical efficiency. To promote fruitful exchanges between back-propagation and adjoint AD, we present three of these strategies and give our view of their interest and current status.
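As the abstract notes, adjoint AD computes gradients in a way analogous to back-propagation: a forward sweep evaluates the program, then a reverse sweep propagates "adjoint" sensitivities from the output back to the inputs. A minimal illustrative sketch of this idea, using a tape and operator overloading (a hypothetical example for clarity; the paper itself is about source-to-source transformation, which generates new differentiated source code instead):

```python
# Illustrative tape-based reverse-mode (adjoint) AD sketch.
# Hypothetical minimal example -- the paper discusses source-to-source
# transformation of programs, not the operator overloading shown here.

tape = []  # records operations in evaluation order


class Var:
    def __init__(self, value, parents=()):
        self.value = value      # result from the forward sweep
        self.parents = parents  # (operand, local partial derivative) pairs
        self.adjoint = 0.0      # sensitivity accumulated in the reverse sweep
        tape.append(self)

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])


def gradient(output):
    # Reverse sweep: walk the tape backwards, propagating each node's
    # adjoint to its operands -- the adjoint analogue of back-propagation.
    output.adjoint = 1.0
    for v in reversed(tape):
        for operand, partial in v.parents:
            operand.adjoint += v.adjoint * partial


# f(x, y) = x*y + x, evaluated at x = 3, y = 2
x, y = Var(3.0), Var(2.0)
f = x * y + x
gradient(f)
# x.adjoint == 3.0 (= y + 1), y.adjoint == 3.0 (= x)
```

The reverse sweep visits operations in the opposite order of their evaluation, which is exactly the control-flow reversal that source-to-source adjoint AD must reconstruct statically in the generated code.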
Document type: Conference paper

Cited literature: 17 references

https://hal.inria.fr/hal-01655085
Contributor: Laurent Hascoet
Submitted on: Monday, December 4, 2017 - 4:07:22 PM
Last modified on: Thursday, January 11, 2018 - 4:16:39 PM

File: StAdMl.pdf (produced by the author(s))

Identifiers

  • HAL Id: hal-01655085, version 1

Citation

Laurent Hascoët. Some highlights on Source-to-Source Adjoint AD. NIPS 2017 - workshop on The future of gradient-based machine learning software & techniques, Dec 2017, Long Beach, California, United States. pp.1-5. ⟨hal-01655085⟩
