Mihail Mihov

GSoC 2024 Contributor

email: mihovmihailp@gmail.com

Completed project: Add support for consteval and constexpr functions in clad

In mathematics and computer algebra, automatic differentiation (AD) is a set of techniques for numerically evaluating the derivative of a function specified by a computer program. It is an alternative to symbolic differentiation and numerical differentiation (the method of finite differences). Clad is built on Clang, which provides the necessary facilities for code transformation. The library can differentiate non-trivial functions, find partial derivatives in trivial cases, and has good unit test coverage.

C++ provides the specifiers constexpr and consteval to allow compile-time evaluation of functions. constexpr declares a possibility: the function is evaluated at compile time if possible and at runtime otherwise. consteval makes compile-time evaluation mandatory: every call to the function must produce a compile-time constant.

The aim of this project is to ensure that the generated derivative follows the same semantics as the primal function: if the primal can be evaluated at compile time (because of a constexpr or consteval specifier), the generated derivative should carry the same specifier so that it, too, can be evaluated at compile time. This enables Clad to demonstrate the benefits of performing automatic differentiation directly in the C++ frontend, taking advantage of Clang's infrastructure.
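Below is a minimal sketch of how this looks from the user's side, assuming the standard clad::differentiate/execute workflow; the primal function and the exact compile-time contexts in which the generated derivative can be used are illustrative, not the project's actual test code.

```cpp
// Minimal sketch: a constexpr primal function differentiated with Clad.
#include "clad/Differentiator/Differentiator.h"
#include <cstdio>

// Primal function marked constexpr: it may be evaluated at compile time.
constexpr double square(double x) { return x * x; }

int main() {
  // Clad generates the derivative of `square` with respect to `x`.
  // With the constexpr/consteval support added in this project, the generated
  // derivative is expected to carry the primal's specifier, so it can also be
  // evaluated at compile time where the context allows it.
  auto d_square = clad::differentiate(square, "x");

  // d/dx (x * x) = 2x, so this prints 6 for x = 3.
  printf("%.1f\n", d_square.execute(3.0));
  return 0;
}
```

Clad runs as a Clang plugin, so a sketch like this would typically be compiled with something like `clang++ -std=c++17 -fplugin=<path-to-clad-plugin>` (the plugin path and include flags depend on the installation).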

Project Proposal: URL

Mentors: Vaibhav Thakkar, Petro Zaritskyi, Vassil Vassilev

Presentations



Constexpr and consteval support in Clad, Slides, Team Meeting, 3 September 2024
Clad Constexpr Consteval Support Project Introduction, Slides, Team Meeting, 22 May 2024