Commit bf7d11e

New page added for Research Area "AD"
- also linked to research page - (SEO Optimized) this will help populate CR website in Google Search results for "Automatic Differentiation" - Descriptions taken from/verified from Presentations/Papers found on CR website. Rephrased for better SEO ranking (no duplication from original content on PDFs/Papers/Presentations)
1 parent e04970e commit bf7d11e

File tree: 2 files changed (+154, −1 lines)
_pages/automatic_differentiation.md

Lines changed: 153 additions & 0 deletions
@@ -0,0 +1,153 @@
---
title: "Compiler Research Research Areas"
layout: gridlay
excerpt: "Automatic differentiation (AD) is a powerful technique for evaluating the
derivatives of mathematical functions in C++, offering significant advantages
over traditional differentiation methods."
sitemap: true
permalink: /automatic_differentiation
---

## Automatic differentiation

Automatic differentiation (AD) is a technique for evaluating the derivatives
of mathematical functions in C++, offering a number of advantages over
traditional differentiation methods. By leveraging the principles of Automatic
Differentiation, programmers can efficiently calculate partial derivatives of
functions, opening up a range of applications in scientific computing and
machine learning.
20+
### Understanding Differentiation in Computing
21+
22+
Differentiation in calculus is the process of finding the rate of change of
23+
one quantity with respect to another. There are several principles and
24+
formulas for differentiation, such as Sum Rule, Product Rule, Quotient Rule,
25+
Constant Rule, and Chain Rule. For Automatic Differentiation, the Chain Rule
26+
of differential calculus is of special interest.
27+
28+
Within the context of computing, there are various methods for
29+
differentiation:
30+

- **Manual Differentiation**: This consists of manually applying the rules of
  differentiation to a given function. While straightforward, it can be
  tedious and error-prone, especially for complex functions.

- **Numerical Differentiation**: This method approximates the derivatives
  using finite differences. It is relatively simple to implement, but it can
  suffer from numerical instability and inaccuracy in its results.

- **Symbolic Differentiation**: This approach uses symbolic manipulation to
  compute derivatives analytically. It provides accurate results but can lead
  to lengthy expressions for large computations. It is also limited to
  closed-form expressions; that is, it cannot handle control flow such as
  loops and conditionals.
44+
- **Automatic Differentiation (AD)**: Automatic Differentiation is a highly
45+
efficient technique that computes derivatives of mathematical functions by
46+
applying differentiation rules to every arithmetic operation in the code.
47+
Automatic Differentiation can be used in two modes:
48+
49+
- Forward Mode: calculates derivatives with respect to a single variable, and
50+
51+
- Reverse Mode: calculates gradients with respect to all inputs
52+
simultaneously.
53+
54+

### Automatic Differentiation in C++

Automatic Differentiation implementations are based on Operator Overloading or
Source Code Transformation. C++ allows operator overloading, making it
possible to implement Automatic Differentiation: the derivative of a function
can be evaluated at the same time as the function itself. Automatic
Differentiation exploits the fact that every computer calculation consists of
elementary mathematical operations and functions, and by applying the chain
rule recurrently, partial derivatives of arbitrary order can be computed
accurately. Some of its highlights:

- Automatic Differentiation can calculate derivatives without any additional
  precision loss.

- It is not confined to closed-form expressions.

- It can take derivatives of algorithms involving conditionals, loops, and
  recursion.

- It works without generating inefficiently long expressions.

### Automatic Differentiation Implementation with Clad - a Clang Plugin

Implementing Automatic Differentiation from the ground up can be challenging.
However, several C++ libraries and tools are available to simplify the
process. The Compiler Research Group has been working on [Clad], a C++ library
that enables Automatic Differentiation using the LLVM compiler infrastructure.
It is implemented as a plugin for the Clang compiler.

[Clad] operates on the Clang AST (Abstract Syntax Tree) and is capable of
performing C++ Source Code Transformation. When Clad is given the C++ source
code of a mathematical function, it can automatically generate C++ code for
computing the derivatives of that function. Clad has comprehensive coverage of
the latest C++ features and a well-rounded fallback and recovery system in
place.

**Clad's Key Features**:

- Support for both Forward Mode and Reverse Mode Automatic Differentiation.

- Support for differentiation of built-in C input arrays, built-in C/C++
  scalar types, functions with an arbitrary number of inputs, and functions
  that return a single value.

- Support for loops and conditionals.

- Support for generation of single derivatives, gradients, Hessians, and
  Jacobians.

- Integration with CUDA for GPU programming.

- Integration with Cling and ROOT for high-energy physics data analysis.

### Basics of using Clad

Clad provides five API functions:

- `clad::differentiate` to use Forward Mode Automatic Differentiation.
- `clad::gradient` to use Reverse Mode Automatic Differentiation.
- `clad::hessian` to construct a Hessian matrix using a combination of Forward
  Mode and Reverse Mode Automatic Differentiation.
- `clad::jacobian` to construct a Jacobian matrix using Reverse Mode Automatic
  Differentiation.
- `clad::estimate_error` to calculate the floating-point error of the
  requested program using Reverse Mode Automatic Differentiation.

These API functions label an existing function for differentiation and return
a functor object that contains the generated derivative, which can be called
by using its `.execute` method.
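A minimal sketch of how these calls fit together, based on the API described
above. This is illustrative only: it must be compiled with the Clad plugin
loaded into Clang (see the Clad documentation for the exact build flags), so
it will not build as a plain C++ program.

```cpp
#include "clad/Differentiator/Differentiator.h"
#include <iostream>

double f(double x, double y) { return x * x + y; }

int main() {
    // Label f for Forward Mode differentiation with respect to x;
    // Clad generates the derivative code at compile time and returns
    // a functor wrapping it.
    auto df = clad::differentiate(f, "x");

    // Call the generated derivative: df/dx = 2x, so 6 at x = 3.
    std::cout << df.execute(3.0, 4.0) << "\n";
}
```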

[Benchmarks] show that Clad is significantly faster than conventional
Numerical Differentiation methods, producing Hessians roughly 450x (~dim/25
times) faster. [General benchmarks] demonstrate a 3378x speedup with Clad
compared to Numerical Differentiation based on central differences.

For more information on Clad, please view:

- [Clad - Github Repository](https://github.com/vgvassilev/clad)

- [Clad - ReadTheDocs](https://clad.readthedocs.io/en/latest/)

- [Clad - Video Demo](https://www.youtube.com/watch?v=SDKLsMs5i8s)

- [Clad - PDF Demo](https://indico.cern.ch/event/808843/contributions/3368929/attachments/1817666/2971512/clad_demo.pdf)

- [Clad - Automatic Differentiation for C++ Using Clang - Slides](https://indico.cern.ch/event/1005849/contributions/4227031/attachments/2221814/3762784/Clad%20--%20Automatic%20Differentiation%20in%20C%2B%2B%20and%20Clang%20.pdf)

- [Automatic Differentiation in C++ - Slides](https://compiler-research.org/assets/presentations/CladInROOT_15_02_2020.pdf)

[Clad]: https://compiler-research.org/clad/

[Benchmarks]: https://compiler-research.org/assets/presentations/CladInROOT_15_02_2020.pdf

[General benchmarks]: https://indico.cern.ch/event/1005849/contributions/4227031/attachments/2221814/3762784/Clad%20--%20Automatic%20Differentiation%20in%20C%2B%2B%20and%20Clang%20.pdf

_pages/research.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -90,7 +90,7 @@ only improves performance but also simplifies code development and debugging
 processes, offering a more efficient alternative to static binding methods.

-[Automatic Differentiation ↗]: https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2020/p2072r0.pdf
+[Automatic Differentiation ↗]: https://compiler-research.org/automatic_differentiation

 [Interactive C++]: https://blog.llvm.org/posts/2020-12-21-interactive-cpp-for-data-science/
```
