Related Tutorials

Aaron Jomy (CERN); Vassil Vassilev (Princeton/CERN), Efficient Python Programming (2024-08-20).

Sunho Kim (De Anza College, Cupertino); Lang Hames (Apple); Vassil Vassilev (Princeton/CERN), Building Programming Language Infrastructure With LLVM Components (2023-07-17).

  • LLVM and Clang are foundational tools for building research and production programming languages. The libraries are efficient and composable, providing many opportunities to build research language infrastructure. The tutorial is organized into three major parts:

  1. Introduction, design principles, library layering (~45m) – Provides an overview of LLVM and Clang.

  2. Just-in-Time Infrastructure (~60m) – Introduces new capabilities of ORCv2, such as speculative JITing and re-optimization, using a simple language called Kaleidoscope (see the LLJIT sketch below).

  3. Incremental compilation apt for dynamic programming languages (~45m) – Outlines how to use Clang as a library to enable basic on-demand use of C/C++ from Python by building a C++ interpreter which connects to the Python interpreter. A video recording is available.

Upon completion of the tutorials, researchers will know how to set up various LLVM components and use them to quickly bootstrap their research projects.
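
As a taste of the ORC APIs that part 2 builds on, here is a minimal LLJIT sketch. It is not taken from the tutorial: the IR string and the symbol name `add1` are illustrative, and it assumes a recent LLVM (15+) where `lookup` returns an `ExecutorAddr`.

```cpp
// Minimal ORCv2/LLJIT sketch: parse a tiny IR module, add it to the JIT,
// look up the symbol, and call it like a normal function.
#include "llvm/ExecutionEngine/Orc/LLJIT.h"
#include "llvm/ExecutionEngine/Orc/ThreadSafeModule.h"
#include "llvm/IR/LLVMContext.h"
#include "llvm/IRReader/IRReader.h"
#include "llvm/Support/MemoryBuffer.h"
#include "llvm/Support/SourceMgr.h"
#include "llvm/Support/TargetSelect.h"

static const char *AddIR = R"(
  define i32 @add1(i32 %x) {
  entry:
    %r = add i32 %x, 1
    ret i32 %r
  }
)";

int main() {
  llvm::InitializeNativeTarget();
  llvm::InitializeNativeTargetAsmPrinter();

  // Parse the textual IR into a module.
  auto Ctx = std::make_unique<llvm::LLVMContext>();
  llvm::SMDiagnostic Err;
  auto M = llvm::parseIR(llvm::MemoryBufferRef(AddIR, "add1"), Err, *Ctx);

  // Create the JIT and add the module to its main JITDylib.
  auto JIT = llvm::cantFail(llvm::orc::LLJITBuilder().create());
  llvm::cantFail(JIT->addIRModule(
      llvm::orc::ThreadSafeModule(std::move(M), std::move(Ctx))));

  // Look up the JIT'd symbol and call it.
  auto Sym = llvm::cantFail(JIT->lookup("add1"));
  auto *Add1 = Sym.toPtr<int (*)(int)>();
  return Add1(41) == 42 ? 0 : 1;
}
```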

Simeon Ehrig, Game of Life on GPU Using Cling-CUDA (2021-11-09).

  • This tutorial demonstrates some of the functionality of Cling-CUDA and Jupyter Notebooks and gives an idea of what you can do with C++ in a web browser. The example shows a typical simulation-and-analysis workflow; the simulation runs Conway’s Game of Life on a GPU (a CPU-side sketch of the update step follows below).
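
For orientation, here is a minimal CPU-side C++ sketch of one Game of Life generation. It is illustrative only: the notebook runs an equivalent kernel on the GPU through Cling-CUDA, and the grid layout and periodic boundaries here are assumptions.

```cpp
#include <vector>

// Advance the grid by one generation. 1 = alive, 0 = dead.
std::vector<int> step(const std::vector<int>& grid, int width, int height) {
  std::vector<int> next(grid.size(), 0);
  for (int y = 0; y < height; ++y) {
    for (int x = 0; x < width; ++x) {
      int neighbors = 0;
      for (int dy = -1; dy <= 1; ++dy)
        for (int dx = -1; dx <= 1; ++dx) {
          if (dx == 0 && dy == 0) continue;
          int nx = (x + dx + width) % width;    // periodic boundaries
          int ny = (y + dy + height) % height;
          neighbors += grid[ny * width + nx];
        }
      int alive = grid[y * width + x];
      // Standard rules: a live cell survives with 2 or 3 neighbors,
      // a dead cell becomes alive with exactly 3 neighbors.
      next[y * width + x] = (alive && (neighbors == 2 || neighbors == 3)) ||
                            (!alive && neighbors == 3);
    }
  }
  return next;
}
```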

Garima Singh, Floating-Point Error Estimation Using Automatic Differentiation with Clad (2021-08-21).

  • Clad provides a built-in floating-point error estimation framework that can automatically annotate code with error estimation information. The framework also lets users write their own error models and use them to generate error estimates. The aim of this tutorial is to demonstrate building a simple custom error model and using it in conjunction with Clad’s error estimation framework.
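
As a rough illustration of the built-in framework, the sketch below uses Clad's `clad::estimate_error` entry point. The exact signature of the generated function (and hence the `execute` call) is an assumption that may differ across Clad versions; the custom error models covered in the tutorial are not shown here.

```cpp
// A minimal sketch of Clad's built-in error estimation (the execute()
// signature below is an assumption and may vary between Clad versions).
#include "clad/Differentiator/Differentiator.h"
#include <iostream>

double func(double x, double y) { return x * y + x - y; }

int main() {
  // Generates a derived function that computes the adjoints of func and
  // accumulates an estimate of its floating-point error alongside them.
  auto df = clad::estimate_error(func);

  double dx = 0, dy = 0, fp_error = 0;
  df.execute(2.0, 3.0, &dx, &dy, fp_error);
  std::cout << "estimated floating-point error: " << fp_error << "\n";
}
```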

Ioana Ifrim, Interactive Automatic Differentiation With Clad and Jupyter Notebooks (2021-08-20).

  • xeus-cling provides a Jupyter kernel for C++ with the help of the C++ interpreter cling and xeus, a native implementation of the Jupyter protocol. Within the xeus-cling framework, Clad can enable automatic differentiation (AD) so that users can automatically generate C++ code that computes the derivatives of their functions. In mathematical optimization, the Rosenbrock function is a non-convex function used as a performance test problem; this tutorial shows how to compute the function’s derivatives using either Clad’s forward mode or reverse mode.
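
To make the workflow concrete, here is a small sketch using Clad's documented `clad::differentiate` (forward mode) and `clad::gradient` (reverse mode) entry points; the sample point is arbitrary and the code must be compiled with the Clad plugin loaded.

```cpp
// Differentiating the Rosenbrock function with Clad's forward and reverse
// modes (compile with the Clad plugin, e.g. -fplugin=clad.so).
#include "clad/Differentiator/Differentiator.h"
#include <iostream>

// Rosenbrock function: f(x, y) = (1 - x)^2 + 100 * (y - x^2)^2
double rosenbrock(double x, double y) {
  return (1 - x) * (1 - x) + 100 * (y - x * x) * (y - x * x);
}

int main() {
  // Forward mode: derivative with respect to one chosen parameter.
  auto df_dx = clad::differentiate(rosenbrock, "x");
  // Reverse mode: the whole gradient from a single generated function.
  auto grad = clad::gradient(rosenbrock);

  double x = 1.5, y = 2.0, dx = 0, dy = 0;
  grad.execute(x, y, &dx, &dy);

  std::cout << "df/dx (forward) = " << df_dx.execute(x, y) << "\n";
  std::cout << "grad f (reverse) = (" << dx << ", " << dy << ")\n";
}
```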

Ioana Ifrim, How to Execute Gradients Generated by Clad on a CUDA GPU (2021-08-20).

  • Clad provides automatic differentiation (AD) for C/C++ and works without modifying the original (legacy) code. Because typical AD applications are computationally demanding, they can benefit greatly from parallel implementations on graphics processing units (GPUs). This tutorial showcases how to first use Clad to obtain a function’s gradient and then how to schedule the function’s execution on the GPU.
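
The host-side half of that workflow can be sketched as follows. The `gauss` function is an assumption rather than the tutorial's example; `dump()` prints the Clad-generated gradient source, which the tutorial then executes inside a CUDA kernel on the device.

```cpp
// Host-side sketch: generate a gradient with Clad and inspect the produced
// code; the tutorial then runs the generated gradient in a __global__ CUDA
// kernel instead of calling execute() on the CPU.
#include "clad/Differentiator/Differentiator.h"
#include <cmath>

double gauss(double x, double mu, double sigma) {
  double t = (x - mu) / sigma;
  return std::exp(-0.5 * t * t);
}

int main() {
  // Let Clad generate the reverse-mode gradient of gauss.
  auto grad = clad::gradient(gauss);
  grad.dump();  // print the generated gauss_grad source

  // Evaluated on the CPU here; the tutorial schedules this work on the GPU.
  double dx = 0, dmu = 0, dsigma = 0;
  grad.execute(1.0, 0.0, 1.0, &dx, &dmu, &dsigma);
  return 0;
}
```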