Algorithmic Differentiation (AD) provides the analytic derivatives of functions given as programs. Adjoint AD, which computes gradients, is similar to Back Propagation in Machine Learning. AD researchers study strategies to overcome the difficulties of adjoint AD, to get closer to its theoretical efficiency. To promote fruitful exchanges...
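To illustrate the adjoint (reverse-mode) AD referred to throughout the entries below, here is a minimal tape-based sketch in Python. It is purely illustrative: the toy function and every name in it are hypothetical, not taken from any of the works listed here.

```python
import math

# Minimal tape-based reverse-mode AD (illustrative only).
tape = []  # adjoint-propagation closures, recorded in execution order

class Var:
    def __init__(self, value):
        self.value = value
        self.grad = 0.0

def add(a, b):
    out = Var(a.value + b.value)
    def backward():                      # d(a+b)/da = d(a+b)/db = 1
        a.grad += out.grad
        b.grad += out.grad
    tape.append(backward)
    return out

def mul(a, b):
    out = Var(a.value * b.value)
    def backward():                      # d(a*b)/da = b, d(a*b)/db = a
        a.grad += b.value * out.grad
        b.grad += a.value * out.grad
    tape.append(backward)
    return out

def sin(a):
    out = Var(math.sin(a.value))
    def backward():                      # d(sin a)/da = cos(a)
        a.grad += math.cos(a.value) * out.grad
    tape.append(backward)
    return out

# f(x, y) = x*y + sin(x); analytic gradient is (y + cos(x), x)
x, y = Var(2.0), Var(3.0)
f = add(mul(x, y), sin(x))
f.grad = 1.0                             # seed the output adjoint
for backward in reversed(tape):          # reverse sweep over the tape
    backward()

assert abs(x.grad - (3.0 + math.cos(2.0))) < 1e-12
assert abs(y.grad - 2.0) < 1e-12
```

One forward sweep records the tape, one reverse sweep propagates adjoints; this single-reverse-sweep cost, independent of the number of inputs, is the theoretical efficiency that adjoint AD research tries to approach.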
-
December 4, 2017 (v1) Conference paper
Uploaded on: March 25, 2023
-
November 18, 2023 (v1) Journal article
Data-flow reversal is at the heart of source-transformation reverse algorithmic differentiation (reverse ST-AD), arguably the most efficient way to obtain gradients of numerical models. However, when the model implementation language uses garbage collection (GC), for instance in Java or Python, the notion of address that is needed for data-flow...
Uploaded on: January 17, 2024
-
September 12, 2016 (v1) Conference paper
Algorithmic Differentiation (AD) has become one of the most powerful tools to improve our understanding of the Earth System. While AD has been used by the ocean and atmospheric circulation modeling community for almost 20 years, it is relatively new in the ice sheet modeling community. The Ice Sheet System Model (ISSM) is a C++, object-oriented,...
Uploaded on: March 25, 2023
-
February 20, 2018 (v1) Journal article
As Automatic Differentiation (AD) usage is spreading to larger and more sophisticated applications, problems arise for codes that use several programming languages. This work describes the issues involved in interoperability between languages and focuses on the main issue, which is parameter passing. It describes the architecture of a source...
Uploaded on: December 4, 2022
-
June 2016 (v1) Conference paper
Checkpointing is a classical technique to mitigate the overhead of adjoint Algorithmic Differentiation (AD). In the context of source transformation AD with the Store-All approach, checkpointing reduces the peak memory consumption of the adjoint, at the cost of duplicate runs of selected pieces of the code. Checkpointing is vital for long...
Uploaded on: March 25, 2023
-
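The storage/recomputation trade-off behind checkpointing, as described in the entry above, can be sketched on a toy iteration (the step function and all names are hypothetical, and the adjoint of each step is written by hand rather than generated):

```python
import math

# Checkpointing sketch: the primal iterates x <- sin(x) + 1 for n
# steps, and the adjoint sweep needs every intermediate state.
# Instead of storing all n states (Store-All), we store one
# checkpoint every c steps and recompute the missing states during
# the reverse sweep: duplicate runs of code segments buy a lower
# peak memory consumption.

def step(x):                # one primal step
    return math.sin(x) + 1.0

def step_adjoint(x, xb):    # adjoint of one step: d(step)/dx = cos(x)
    return math.cos(x) * xb

def adjoint_with_checkpoints(x0, n, c):
    ckpts = {}
    x = x0
    for i in range(n):                  # forward sweep, checkpoints only
        if i % c == 0:
            ckpts[i] = x                # state *entering* step i
        x = step(x)
    xb = 1.0                            # seed: d(x_n)/d(x_n) = 1
    for i in range(n - 1, -1, -1):      # reverse sweep
        base = (i // c) * c
        x = ckpts[base]
        for _ in range(i - base):       # recompute x_i from the checkpoint
            x = step(x)
        xb = step_adjoint(x, xb)
    return xb                           # = d(x_n)/d(x_0)
```

A Store-All adjoint of the same iteration returns the identical value; checkpointing only changes whether an intermediate state is retrieved from memory or recomputed.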
2016 (v1) Journal article
The computation of gradients via the reverse mode of algorithmic differentiation is a valuable technique in modeling many science and engineering applications. This technique is particularly efficient when implemented as a source transformation, as it may use static data-flow analysis. However, some features of the major programming languages...
Uploaded on: March 25, 2023
-
April 2, 2022 (v1) Publication
Uploaded on: February 22, 2023
-
2021 (v1) Journal article
This paper presents our work toward correct and efficient automatic differentiation of OpenMP parallel worksharing loops in forward and reverse mode. Automatic differentiation is a method to obtain gradients of numerical programs, which are crucial in optimization, uncertainty quantification, and machine learning. The computational cost to...
Uploaded on: December 4, 2022
-
August 29, 2022 (v1) Conference paper
This paper presents a novel combination of reverse mode automatic differentiation and formal methods, to enable efficient differentiation of (or backpropagation through) shared-memory parallel loops. Compared to the state of the art, our approach can reduce the need for atomic updates or private data copies during the parallel derivative...
Uploaded on: February 22, 2023
-
September 12, 2016 (v1) Conference paper
As AD usage is spreading to larger and more sophisticated applications, problems arise for codes that use several programming languages. Many AD tools have been designed with one application language in mind. Only a few use an internal representation that promotes language-independence, at least conceptually. When faced with the problem of...
Uploaded on: March 25, 2023
-
July 2016 (v1) Conference paper
Checkpointing is a classical strategy to reduce the peak memory consumption of the adjoint. Checkpointing is vital for long run-time codes, which is the case of most MPI parallel applications. However, for MPI codes this question has always been addressed by ad-hoc hand manipulations of the differentiated code, and with no formal assurance of...
Uploaded on: March 25, 2023
-
September 14, 2015 (v1) Conference paper
Efficient Algorithmic Differentiation of Fixed-Point loops requires a specific strategy to avoid explosion of memory requirements. Among the strategies documented in the literature, we have selected the one introduced by B. Christianson. This method features original mechanisms such as repeated access to the trajectory stack or duplicated...
Uploaded on: March 25, 2023
-
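The two-phase fixed-point adjoint selected in the entry above can be sketched on a scalar toy problem. This shows the general scheme in the spirit of Christianson's method, not the paper's implementation; the contraction F and every name are hypothetical.

```python
import math

# Phase 1 solves the primal fixed point x* = F(x*, p); phase 2
# iterates the adjoint fixed point  w = ybar + dF/dx(x*, p) * w.
# Only the converged state x* is needed by the adjoint iteration,
# not the whole primal trajectory, which avoids the memory explosion.

def F(x, p):     return 0.5 * math.cos(x) + p   # |dF/dx| <= 0.5: contraction
def dF_dx(x, p): return -0.5 * math.sin(x)
def dF_dp(x, p): return 1.0

def fixed_point_adjoint(p, ybar=1.0, tol=1e-14):
    x = 0.0
    while abs(F(x, p) - x) > tol:                 # phase 1: primal
        x = F(x, p)
    w = 0.0
    while abs(ybar + dF_dx(x, p) * w - w) > tol:  # phase 2: adjoint
        w = ybar + dF_dx(x, p) * w
    return x, dF_dp(x, p) * w                     # x*, d(x*)/dp

x_star, pbar = fixed_point_adjoint(0.3)
# cross-check against the implicit-function-theorem value
exact = dF_dp(x_star, 0.3) / (1.0 - dF_dx(x_star, 0.3))
assert abs(pbar - exact) < 1e-10
```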
June 23, 2020 (v1) Journal article
A computational fluid dynamics code is differentiated using algorithmic differentiation (AD) in both tangent and adjoint modes. The two novelties of the present approach are (1) the adjoint code is obtained by letting the AD tool Tapenade invert the complete layer of message passing interface (MPI) communications, and (2) the adjoint code...
Uploaded on: December 4, 2022
-
September 12, 2016 (v1) Conference paper
Development of a capable algorithmic differentiation (AD) tool requires large developer effort to provide the various flavors of derivatives, to experiment with the many AD model variants, and to apply them to the candidate application languages. Considering the relatively small size of the academic teams that develop AD tools, collaboration...
Uploaded on: March 25, 2023
-
2020 (v1) Journal article
A computational fluid dynamics code relying on a high-order spatial discretization is differentiated using algorithmic differentiation (AD). Two unsteady test cases are considered: a decaying incompressible viscous shear layer and an inviscid compressible flow around a NACA 0012 airfoil. Both tangent and adjoint modes of AD are explored in the...
Uploaded on: December 4, 2022
-
April 8, 2019 (v1) Journal article
Algorithmic differentiation tools can automate the adjoint transformation of parallel message-passing codes [23] using the AMPI library. Nevertheless, a non-trivial and manual step after the differentiation is the initialisation of the seed and the retrieval of the output values from the differentiated code. Ambiguities in seeding occur in ...
Uploaded on: December 4, 2022
-
2016 (v1) Journal article
Algorithmic differentiation (AD) by source-transformation is an established method for computing derivatives of computational algorithms. Static data-flow analysis is commonly used by AD tools to determine the set of active variables, that is, variables that are influenced by the program input in a differentiable way and have a differentiable...
Uploaded on: March 25, 2023
-
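The activity analysis mentioned in the entry above can be illustrated with a toy forward propagation over straight-line code. The representation is hypothetical and much simplified: real source-transformation tools run a full static data-flow analysis and also combine this "varied" half with a backward "useful" half.

```python
# A variable is active (here, in the "varied" sense) if it depends on
# an independent input in a differentiable way. Statements are
# (lhs, rhs_vars, differentiable) triples in straight-line order.

def active_variables(statements, independents):
    active = set(independents)
    for lhs, rhs_vars, differentiable in statements:
        if differentiable and active & set(rhs_vars):
            active.add(lhs)       # lhs now carries a derivative
        else:
            active.discard(lhs)   # lhs overwritten by a passive value
    return active

stmts = [
    ("t", ["x"], True),       # t = x*x    : differentiable in x
    ("n", ["x"], False),      # n = int(x) : kills differentiability
    ("y", ["t", "c"], True),  # y = t + c  : active through t
]
assert active_variables(stmts, {"x"}) == {"x", "t", "y"}
```

Variables found passive need no derivative counterpart, which is one way static analysis makes source-transformation adjoints efficient.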
October 2023 (v1) Journal article
In the framework of fusion energy research, divertor design and model calibration based on plasma edge codes currently rely either on manual iterative tuning of parameters, or alternatively on parameter scans. This, combined with the complex and computationally expensive nature of plasma edge codes, makes these procedures extremely cumbersome....
Uploaded on: January 17, 2024
-
2015 (v1) Journal article
The information on sensitivity provided by derivatives is indispensable in many fields of science. In numerical analysis, computing the accurate value of the derivatives of a function can be a challenge. The classical Finite Differences (FD) method is simple to implement when estimating the value of a derivative. However, it remains...
Uploaded on: March 25, 2023
-
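The limitation of Finite Differences hinted at in the entry above can be shown with a small accuracy experiment (not from the paper itself; the analytic derivative, here written by hand as cos, stands in for an AD-produced one):

```python
import math

# Forward finite differences face a truncation/round-off trade-off
# in the step size h, whereas the analytic (AD) derivative is exact
# to machine precision and has no step size to tune.

f, dfdx = math.sin, math.cos
x = 1.0
exact = dfdx(x)

errors = {}
for h in (1e-2, 1e-5, 1e-8, 1e-11):
    fd = (f(x + h) - f(x)) / h          # forward difference
    errors[h] = abs(fd - exact)
    print(f"h={h:.0e}  FD error = {errors[h]:.2e}")
# Shrinking h first reduces the truncation error, then round-off in
# the subtraction f(x+h) - f(x) takes over and the error grows again.
```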
2016 (v1) Journal article
Uploaded on: March 25, 2023
-
September 2, 2024 (v1) Journal article
Automatic differentiation is a popular technique for computing derivatives of computer programs. While automatic differentiation has been successfully used in countless engineering, science, and machine learning applications, it can nevertheless produce surprising results. In this paper, we categorize problematic usages of...
Uploaded on: September 11, 2024
-
September 16, 2024 (v1) Conference paper
Checkpointing is a cornerstone of data-flow reversal in adjoint algorithmic differentiation. Checkpointing is a storage/recomputation trade-off that can be applied at different levels, one of which is the call tree. We are looking for good placements of checkpoints onto the call tree of a given application, to reduce run time and memory...
Uploaded on: September 11, 2024
-
June 5, 2022 (v1) Conference paper
We illustrate the benefits of Algorithmic Differentiation (AD) for the development of aerodynamic flow simulation software. In refining the architecture of the elsA CFD solver, developed jointly by ONERA and Safran, we consider AD as a key technology to cut development costs of some derivatives of interest, namely the tangent, adjoint, and...
Uploaded on: December 3, 2022
-
March 7, 2023 (v1) Journal article
Uploaded on: January 17, 2024