Group Members


David J Lange

Research Staff

  • Scientific Staff with Lawrence Livermore National Laboratory (2002-2016).
  • Postdoc with Lawrence Livermore National Laboratory (1999-2002).
  • PhD Physics, University of California, Santa Barbara (1999).
  • B.A. Physics, College of William and Mary (1993).

Vassil Vassilev

Research Staff

  • Various positions at CERN and FNAL (2010-2017).
  • PhD Computer Science, University of Plovdiv "Paisii Hilendarski", Plovdiv, Bulgaria (2015).
  • MSc Software Technologies, University of Plovdiv "Paisii Hilendarski", Plovdiv, Bulgaria (2010).
  • BSc Informatics, University of Plovdiv "Paisii Hilendarski", Plovdiv, Bulgaria (2009).

Ioana Ifrim

Research Staff

  • Research Fellowship in Generative Deep Learning, CERN (2018-2020).
  • Research student in the Morphological Computation and Learning Lab, Imperial College London, and the Biologically Inspired Robotics Laboratory, University of Cambridge (2016-2018).
  • MPhil Advanced Computer Science, University of Cambridge (2018).
  • BSc Computer Science with Robotics, King’s College London (2017).


Garima Singh

Floating point error evaluation with Clad

Education: B. Tech in Information Technology, Manipal Institute of Technology, Manipal, India

Project description: Floating-point errors are a direct consequence of the finite precision of computer arithmetic, and real-valued computation relies predominantly on floating-point numbers. The accuracy of such computations depends on the working precision, and very high precision is often impractical or inefficient. One must then resort to lower-precision computing, which is prone to rounding errors. These errors can produce inaccurate, and sometimes catastrophic, results, so it is important to estimate them accurately. This project uses Clad, a source-transformation AD tool for C++ implemented as a plugin for the Clang compiler, to develop a generic error estimation framework that is not bound to a particular error approximation model. It allows users to select their preferred estimation logic and automatically generates functions augmented with code for the specified error estimator.

Project Proposal: URL

Mentors: Vassil Vassilev, David Lange

This could be you!

See openings for more info

Former students

Vaibhav Garg

Enable Modules on Windows, GSoC 2020

Education: Computer Science, Birla Institute of Technology and Science, Pilani, India

Project description: Several ROOT features interact with libraries and require implicit header inclusion. This can be triggered by reading or writing data on disk, or by user actions at the prompt. Exposing the full shared library descriptors to the interpreter at runtime translates into an increased memory footprint. ROOT’s exploratory programming concepts allow implicit and explicit runtime shared library loading, which requires the interpreter to load the library descriptor; re-parsing the descriptors’ content has a noticeable effect on runtime performance. C++ Modules are designed to minimize re-parsing of the same header content by providing an efficient on-disk representation of C++ code. C++ Modules support has already been implemented for Unix and OS X, and with the next release of ROOT, C++ Modules are expected to become the default on OS X. This project aims to extend C++ Modules support to Windows by implementing solutions compatible with the UNIX baseline, and to present the corresponding performance results.
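The on-disk representation is driven by a Clang module map that groups headers into named modules, so the interpreter can load a prebuilt module instead of re-parsing headers. A hypothetical fragment (module and header names are illustrative, not ROOT's actual layout) might look like:

```
// module.modulemap — Clang builds one on-disk module per entry,
// so these headers are parsed once instead of at every inclusion.
module ROOT_Core {
  module TObject { header "TObject.h" export * }
  module TClass  { header "TClass.h"  export * }
  export *
}
```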

Final Report: GSoC 2020 Archive

Mentors: Vassil Vassilev, Bertrand Bellenot

Lucas Camolezi

Reduce boost dependence in CMSSW, GSoC 2020

Education: Computer Engineering, University of São Paulo, Brazil

Project description: The goal of this project is to find and reduce Boost dependencies in CMSSW. Modern C++ standards introduced many features that were previously available only through Boost libraries, so some Boost code can be replaced with equivalent C++ standard library features. Using standard features is good practice, and this project moves the CMSSW codebase in that direction.

Final Report: GSoC 2020 Archive

Mentors: Vassil Vassilev, David Lange

Roman Shakhov

Extend clad to compute Jacobians, GSoC 2020

Education: Mathematics and Computer Science, Voronezh State University, Russia

Project description: In mathematics and computer algebra, automatic differentiation (AD) is a set of techniques to numerically evaluate the derivative of a function specified by a computer program. AD is an alternative to symbolic differentiation and numerical differentiation (the method of finite differences). Clad is based on Clang, which provides the necessary facilities for code transformation. The AD library can differentiate non-trivial functions, find partial derivatives for trivial cases, and has good unit test coverage. Currently, clad does not provide an easy way to compute Jacobian matrices; this project adds that capability.

Final Report: Poster

Mentors: Vassil Vassilev, Alexander Penev