Group Members

Staff

David J Lange

Research Staff
email: David.Lange@princeton.edu

  • Scientific Staff with Lawrence Livermore National Laboratory (2002-2016).
  • Postdoc with Lawrence Livermore National Laboratory (1999-2002).
  • PhD Physics, University of California, Santa Barbara (1999).
  • B.A. Physics, College of William and Mary (1993).

Vassil Vassilev

Research Staff
email: vvasilev@cern.ch

  • Various positions at CERN and FNAL (2010-2017).
  • PhD Computer Science, University of Plovdiv "Paisii Hilendarski", Plovdiv, Bulgaria (2015).
  • MSc Software Technologies, University of Plovdiv "Paisii Hilendarski", Plovdiv, Bulgaria (2010).
  • BSc Informatics, University of Plovdiv "Paisii Hilendarski", Plovdiv, Bulgaria (2009).

Ioana Ifrim

Research Staff
email: ioana.ifrim@cern.ch

  • Research Fellowship in Generative Deep Learning, CERN (2018-2020).
  • Research student with the Morphological Computation and Learning Lab, Imperial College, and the Biologically Inspired Robotics Laboratory, University of Cambridge (2016-2018).
  • MPhil Advanced Computer Science, University of Cambridge (2018).
  • BSc Computer Science with Robotics, King’s College London (2017).

Students

Garima Singh

Add Numerical Differentiation Support in Clad
email: garimasingh0028@gmail.com

Education: B. Tech in Information Technology, Manipal Institute of Technology, Manipal, India

Project description: In mathematics and computer algebra, automatic differentiation (AD) is a set of techniques to numerically evaluate the derivative of a function specified by a computer program. Automatic differentiation is an alternative to symbolic differentiation and numerical differentiation (the method of finite differences). Clad is based on Clang, which provides the necessary facilities for code transformation. The AD library can differentiate non-trivial functions, find partial derivatives for trivial cases, and has good unit test coverage. In several cases, due to various limitations, it is either inefficient or impossible to differentiate a function. For example, clad cannot differentiate declared-but-not-defined functions; in that case, it issues an error. Instead, clad should fall back to its future numerical differentiation facilities.
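
A minimal sketch of how clad's compile-time API is typically used, and of the declared-but-not-defined case the project targets (the function names are illustrative and the build command is only indicative):

    // Build with the clad plugin loaded into clang, e.g.
    //   clang++ -fplugin=/path/to/clad.so example.cpp
    #include "clad/Differentiator/Differentiator.h"
    #include <cstdio>

    double f(double x, double y) { return x * x * y; } // body visible: clad can transform it

    double g(double x); // declared but not defined: clad cannot see the body and
                        // currently issues an error; the project would make it fall
                        // back to numerical (finite-difference) differentiation.

    int main() {
      auto df_dx = clad::differentiate(f, "x");               // generate df/dx at compile time
      std::printf("df/dx(3,4) = %g\n", df_dx.execute(3, 4));  // prints 24
      return 0;
    }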

Project Proposal: URL

Mentors: Vassil Vassilev, Alexander Penev

Past Projects

Floating point error evaluation with Clad

Project description: Floating-point errors are a testament to the finite nature of computing, and the predominance of floating-point numbers in real-valued computation makes them impossible to ignore. Floating-point computations depend heavily on the working precision, and in most cases very high-precision calculation is not only impractical but also very inefficient. One therefore has to resort to lower-precision computing, which in turn is quite prone to errors. These errors can lead to inaccurate and sometimes catastrophic results; hence, it is imperative to estimate them accurately. This project aims to use Clad, a source transformation AD tool for C++ implemented as a plugin for the C++ compiler Clang, to develop a generic error estimation framework that is not bound to a particular error approximation model. It will allow users to select their preferred estimation logic and automatically generate functions augmented with code for the specified error estimator.
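
The kind of precision loss such an estimator has to quantify can be seen with a few lines of plain C++ (this only illustrates the problem, it is not the clad error-estimation API):

    #include <cstdio>

    int main() {
      float  a = 1.000001f, b = 1.000000f;
      double A = 1.000001,  B = 1.000000;
      float  diff_f = a - b;   // subtracting nearly equal floats: catastrophic cancellation
      double diff_d = A - B;   // double-precision reference value
      std::printf("float : %.10g\n", diff_f);
      std::printf("double: %.10g\n", diff_d);
      std::printf("error : %.3g\n", (double)diff_f - diff_d);  // non-negligible absolute error
      return 0;
    }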

Project Proposal: URL

Mentors: Vassil Vassilev, David Lange

Parth Arora

Add support for differentiating functor objects in clad
email: partharora99160808@gmail.com

Education: B.Tech in Computer Science, USICT, Guru Gobind Singh Indraprastha University, New Delhi, India

Project description: Clad supports differentiating functions, but direct differentiation of functors and lambda expressions is missing. Many computations are modelled using functors, and both functors and lambda expressions are becoming increasingly relevant in modern C++. This project aims to add support for directly differentiating functors and lambda expressions in clad.
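
A sketch of the kind of usage the project aims to enable; passing the functor or lambda object directly to clad::differentiate is the intended interface, not necessarily the final design:

    #include "clad/Differentiator/Differentiator.h"
    #include <cstdio>

    struct Experiment {
      double scale = 3.0;
      double operator()(double x, double y) { return scale * x * y; }
    };

    int main() {
      Experiment E;
      // Differentiate the functor's operator() w.r.t. x by passing the object itself.
      auto d_E = clad::differentiate(E, "x");
      std::printf("dE/dx(4,5) = %g\n", d_E.execute(4, 5));     // expected 15 (= scale * y)

      // The same is desired for lambda expressions:
      auto lam = [](double x, double y) { return x * x + y; };
      auto d_lam = clad::differentiate(lam, "x");
      std::printf("dlam/dx(4,5) = %g\n", d_lam.execute(4, 5)); // expected 8 (= 2x)
      return 0;
    }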

Project Proposal: URL

Mentors: Vassil Vassilev, David Lange

Baidyanath Kundu

Utilize second order derivatives from Clad in ROOT
email: kundubaidya99@gmail.com

Education: B. Tech in Computer Science and Engg., Manipal Institute of Technology, Manipal, India

Project description: ROOT is a framework for data processing, born at CERN, at the heart of research in high-energy physics. ROOT has a Clang-based C++ interpreter, Cling, and integrates with the automatic differentiation plugin Clad to provide a flexible automatic differentiation facility. TFormula is a ROOT class which bridges compiled and interpreted code. This project aims to add second-order derivative support in TFormula using clad::hessian. The PR that added support for gradients in ROOT serves as a reference and can be accessed here.
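
A minimal sketch of clad::hessian on a free function; the TFormula integration wraps the same machinery, and the exact execute() signature varies across clad versions:

    #include "clad/Differentiator/Differentiator.h"
    #include <cstdio>

    double f(double x, double y) { return x * x * y + y * y; }

    int main() {
      auto hess = clad::hessian(f);   // generate code for the 2x2 Hessian
      double H[4] = {0, 0, 0, 0};     // row-major d2f/dxi dxj
      hess.execute(2, 3, H);          // fill H at (x, y) = (2, 3)
      std::printf("[%g %g; %g %g]\n", H[0], H[1], H[2], H[3]);
      // Expected: [6 4; 4 2]  (d2f/dx2 = 2y, d2f/dxdy = 2x, d2f/dy2 = 2)
      return 0;
    }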

Project Proposal: URL

Mentors: Vassil Vassilev, Ioana Ifrim

Purva Chaudhari

Reduce boost dependencies in CMSSW
email: purva.chaudhari02@gmail.com

Education: Computer Science, Vishwakarma Institute of Technology

Project description: This project aims to reduce CMSSW technical debt by finding and replacing boost dependencies that have an equivalent solution in standard C++. Reducing boost dependencies helps us create more lightweight boost Clang modules in preparation for C++20. It also reduces the number of headers that need to be adapted before C++20 Clang modules can be used.
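
An illustrative example of the kind of substitution involved, replacing a boost utility with its direct standard-library equivalent (the Track type is made up for the example):

    // Before:
    //   #include <boost/shared_ptr.hpp>
    //   boost::shared_ptr<Track> t = boost::make_shared<Track>();
    // After (C++11 and later):
    #include <memory>

    struct Track { int id = 0; };

    int main() {
      std::shared_ptr<Track> t = std::make_shared<Track>();  // same semantics, no boost header
      return t->id;
    }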

Project Proposal: URL

Mentors: Vassil Vassilev, David Lange

This could be you!

See openings for more info
email: vvasilev@cern.ch

Former students

Ajay Uppili Arasanipalai

Modernize the LLVM “Building A JIT” Tutorial Series
email: aua2@illinois.edu

Education: University of Illinois at Urbana-Champaign, Grainger College of Engineering

Project description: The LLVM JIT API has changed many times over the years. However, the official tutorials have failed to keep up. This project aims to update the official “Building a JIT” tutorials to use the latest version of the OrcJIT API and add new content that might be relevant to new LLVM users interested in writing their own JIT compilers.
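
A minimal LLJIT example in the spirit of the updated tutorials; the symbol-lookup call differs slightly between LLVM versions, as noted in the comments:

    #include "llvm/ExecutionEngine/Orc/LLJIT.h"
    #include "llvm/ExecutionEngine/Orc/ThreadSafeModule.h"
    #include "llvm/IRReader/IRReader.h"
    #include "llvm/Support/SourceMgr.h"
    #include "llvm/Support/TargetSelect.h"
    #include "llvm/Support/raw_ostream.h"

    using namespace llvm;
    using namespace llvm::orc;

    static ThreadSafeModule makeAddOneModule() {
      auto Ctx = std::make_unique<LLVMContext>();
      SMDiagnostic Err;
      auto M = parseIR(MemoryBufferRef("define i32 @addone(i32 %x) {\n"
                                       "  %r = add i32 %x, 1\n"
                                       "  ret i32 %r\n"
                                       "}\n",
                                       "addone-module"),
                       Err, *Ctx);
      return ThreadSafeModule(std::move(M), std::move(Ctx));
    }

    int main() {
      InitializeNativeTarget();
      InitializeNativeTargetAsmPrinter();

      auto JIT = cantFail(LLJITBuilder().create());    // build a JIT with default settings
      cantFail(JIT->addIRModule(makeAddOneModule()));  // hand it some IR

      auto Sym = cantFail(JIT->lookup("addone"));      // materialize and look up the symbol
      int (*AddOne)(int) = Sym.toPtr<int(int)>();      // recent LLVM; older: (int(*)(int))Sym.getAddress()
      outs() << "addone(41) = " << AddOne(41) << "\n";
      return 0;
    }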

Final Report:

Mentors: Lang Hames, Vassil Vassilev

Vaibhav Garg

Enable Modules on Windows, GSoC 2020
email: gargvaibhav64@gmail.com

Education: Computer Science, Birla Institute of Technology and Science, Pilani, India

Project description: ROOT has several features that interact with libraries and require implicit header inclusion. This can be triggered by reading or writing data on disk, or by user actions at the prompt. Exposing the full shared library descriptors to the interpreter at runtime translates into an increased memory footprint. ROOT’s exploratory programming concepts allow implicit and explicit runtime shared library loading, which requires the interpreter to load the library descriptor. Re-parsing the descriptors’ content has a noticeable effect on runtime performance. C++ Modules are designed to minimize reparsing of the same header content by providing an efficient on-disk representation of C++ code. C++ Modules have already been implemented for Unix and OS X systems, and it is expected that with the next release of ROOT, C++ Modules will be the default on OS X. This project aims to extend C++ Modules support to Windows by implementing solutions compatible with the UNIX baseline, and to present corresponding performance results.
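
For context, a small example of the kind of Clang module map that describes such an on-disk module; the module and header names below are illustrative:

    // module.modulemap -- declares which headers form one Clang module,
    // so the compiler can build and reuse an efficient on-disk representation
    // instead of re-parsing the headers each time.
    module "Hist" {
      requires cplusplus
      header "TH1.h"
      header "TH2.h"
      export *
    }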

Final Report: GSoC 2020 Archive

Mentors: Vassil Vassilev, Bertrand Bellenot

Lucas Camolezi

Reduce boost dependence in CMSSW, GSoC 2020
email: camolezi@usp.br

Education: Computer Engineering, University of São Paulo, Brazil

Project description: This project aims to find and reduce boost dependencies in CMSSW. Modern C++ introduced many features that were previously available only through boost packages, so some boost code can be replaced with equivalent C++ standard library features. Using standard features is good practice, and this project will move the CMSSW codebase in that direction.
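
One more illustrative substitution of this kind, replacing boost::lexical_cast for simple string/number conversions with standard-library calls:

    // Before:
    //   #include <boost/lexical_cast.hpp>
    //   int run = boost::lexical_cast<int>(runStr);
    // After (C++11 and later):
    #include <string>

    int main() {
      std::string runStr = "42";
      int run = std::stoi(runStr);            // throws on malformed input, like lexical_cast
      std::string back = std::to_string(run); // number -> string
      return run - std::stoi(back);           // 0
    }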

Final Report: GSoC 2020 Archive

Mentors: Vassil Vassilev, David Lange

Roman Shakhov

Extend clad to compute Jacobians, GSoC 2020
email: r.intval@gmail.com

Education: Mathematics and Computer Science, Voronezh State University, Russia

Project description: In mathematics and computer algebra, automatic differentiation (AD) is a set of techniques to numerically evaluate the derivative of a function specified by a computer program. Automatic differentiation is an alternative to symbolic differentiation and numerical differentiation (the method of finite differences). Clad is based on Clang, which provides the necessary facilities for code transformation. The AD library can differentiate non-trivial functions, find partial derivatives for trivial cases, and has good unit test coverage. Currently, clad does not provide an easy way to compute Jacobians.
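
A sketch of the clad::jacobian interface this project worked on; the function shown is illustrative and the exact execute() signature varies across clad versions:

    #include "clad/Differentiator/Differentiator.h"
    #include <cstdio>

    // A vector-valued function: outputs are written into `res`.
    void f(double x, double y, double *res) {
      res[0] = x * y;   // f0
      res[1] = x + y;   // f1
    }

    int main() {
      auto jac = clad::jacobian(f);
      double res[2] = {0, 0};
      double J[4] = {0, 0, 0, 0};   // row-major 2x2 Jacobian d f_i / d x_j
      jac.execute(3, 4, res, J);    // evaluate at (x, y) = (3, 4)
      std::printf("[%g %g; %g %g]\n", J[0], J[1], J[2], J[3]);
      // Expected: [4 3; 1 1]
      return 0;
    }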

Final Report: Poster

Mentors: Vassil Vassilev, Alexander Penev