Commit 29dd71c

Author: Matthieu Lemerre
Commit message: Upd
Parent: 02e62c2

File tree: 1 file changed (+3, -100 lines)


papers/2024-oopsla-typedc-dependent-nominal-physical-type-system.md

@@ -2,113 +2,16 @@
# All detailed paper info is in _data/publication.yml
layout: paper
parent: Publications
-title: "PLDI'24 Research Paper"
+title: "OOPSLA'2024 Research Paper"
categories: new publication
-date: 2024-04-22
-nav_order: 2024-04-22
+date: 2024-08-28
+nav_order: 2024-08-28
---

## Context

Software analyzers and compilers have a lot in common: they both have to read and
understand source code in order to **prove facts about it** and **transform it** into
equivalent code. While the goal of an analyzer is to prove facts (correctness, safety...)
about the source, it often transforms the code through rewrites to make the analysis easier.
As an example, rewriting `e + e` into `2*e` makes it obvious that the value is even.
Symmetrically, the goal of a compiler is code transformation, but compilers often
run analyses on the code to perform optimizations. For instance, many compilers
remove variables which are never read by the program: this is possible thanks to
a liveness analysis.

With this in mind, it seems compilers and analyzers could be written using the same
core library of analyses and program transformations.
A problem that creeps up in both cases is known as the **phase-ordering problem**:
in what order should we run transformations and analyses? Should we start by analyzing
the code, and use that information to transform it? Or should we first transform the
code and then run the analysis on the simplified version? Should we alternate between
transformation and analysis passes?

In practice, the best precision is obtained by running transformations and analyses
simultaneously. Fortunately, **abstract interpretation** is well-suited to fusing
different analyses together. In abstract interpretation, each analysis is viewed as
a domain, and all domains share a common signature/interface. This allows running
multiple analyses in parallel (using a product of the relevant domains) and having
them collaborate (domains can query other domains to see if they can prove a property).
## Example

Below is an example of compiling a source program to SSA using our technique (the
labels in the nodes are just different names for each node). Our analyzer is capable
of eliminating the dead else branch inside the loop. Doing so requires simultaneously
performing **numerical analysis** (to learn that `z` is even), some **syntactic
transformations** (to learn that `F(j + z%2)` is `F(j)`), optimistic **global value
numbering** (to learn that `i = j`), and **dead code elimination** so that no analysis
takes the else branch (which would break all those properties).

<img src="/assets/publications/imgs/2024-pldi-full-example.svg"
style="width:700px; display:block; margin-left:auto; margin-right:auto">
## Contributions

Our paper shows the following novel results:
- A standard abstract interpretation framework can be turned into a
compiler: create a domain that is a **free algebra** of the domain signature (i.e.
a domain where each domain operation is a constructor creating an expression); then the analysis
result **can be used to construct a new program**. Different abstract domain signatures correspond to different languages: the classical domain signature corresponds to imperative programs expressed as a control-flow graph, and we provide an SSA domain signature corresponding to programs in SSA form.
- **Functors can implement compiler passes**.
A functor is just a function that builds a new abstract domain on top of abstract
domains received as arguments. Functors are modular: they can be proved independently
and then combined into a full compilation chain. Functor soundness and completeness
imply forward and backward simulation between source and compiled program, respectively.

Here is a small example of a sound and complete functor transformation:
replacing a ternary operator with explicit control-flow jumps.

<img src="/assets/publications/imgs/2024-pldi-transformation-example.svg"
style="width:450px; display:block; margin-left:auto; margin-right:auto">

- **Compiling to SSA recovers missing context and improves numerical analysis precision**.
We describe a functor for compiling a small imperative language to SSA.
This allows performing a numerical analysis on the SSA form, which leverages
variable immutability to store information on expressions.
This is always more precise than direct numerical analysis, while adding only a constant overhead.

In particular, this domain can analyze compiled code with the same precision as the source
(when compilation corresponds, e.g., to transformation into three-address code).
The usual precision loss resulting from compiling large expressions into
instruction sequences with multiple intermediate variables is recovered thanks to our SSA-based analysis.

Here are a few examples of assertions this domain can prove:
- Propagate information across statements:
```c
c = (y >= 0);
if (c) {
  assert(y >= 0);
}
```
- Learn from related variables:
```c
y = x + 1;
z = x * x;
if (2 <= y && y <= 5) {
  assert(1 <= x && x <= 4);
  assert(1 <= z && z <= 16);
}
```
- Increase precision of the numeric abstraction: even though intervals
can't represent `x != 0`, using our domain with an interval abstraction can prove this:
```c
if (x != 0) {
  assert(x != 0);
}
```

## Further information

- Read the **paper**: either the [published version](/assets/publications/pdfs/2024-pldi-compiling-with-abstract-interpretation.pdf) or the [version with appendices](/assets/publications/pdfs/2024-pldi-compiling-with-abstract-interpretation-with-appendices.pdf)
- Download the [software artifact](https://doi.org/10.5281/zenodo.10895582) from Zenodo to try out our example analyzer and explore the code.
- To appear at the [Programming Language Design and Implementation (PLDI) 2024 conference](https://pldi24.sigplan.org/)
