Optimization Overview

Most of the complexity in modern compilers is in the optimizer.

When should we perform optimizations?

Optimizations can be applied at several points in the compilation pipeline; here we will look at optimizations on an IR with the following grammar:

P --> S P | S
S --> id := id op id
       | id := op id
       | id := id
       | push id
       | id := pop
       | if id relop id goto L
       | L:
       | jump L
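
As a purely illustrative sketch of this grammar, the Python snippet below classifies a statement string into one of the productions above. It assumes statements are whitespace-separated strings whose operands are identifiers (possibly primed, like L') or integer constants; the names OPD, FORMS, and classify are my own and not part of the notes.

  import re

  # Purely illustrative: recognize the statement forms of the grammar above.
  # Statements are assumed to be whitespace-separated strings.
  OPD = r"[A-Za-z_]\w*'?|\d+"       # identifier (possibly primed) or integer constant
  FORMS = [                         # most specific forms first
      ("condjump", rf"if ({OPD}) \S+ ({OPD}) goto ({OPD})"),   # if id relop id goto L
      ("jump",     rf"jump ({OPD})"),
      ("label",    rf"({OPD}):"),
      ("push",     rf"push ({OPD})"),
      ("pop",      rf"({OPD}) := pop"),
      ("binop",    rf"({OPD}) := ({OPD}) \S+ ({OPD})"),        # id := id op id
      ("unop",     rf"({OPD}) := \S+ ({OPD})"),                # id := op id
      ("copy",     rf"({OPD}) := ({OPD})"),                    # id := id
  ]

  def classify(stmt):
      """Return the name of the grammar production the statement matches."""
      for kind, pattern in FORMS:
          if re.fullmatch(pattern, stmt.strip()):
              return kind
      raise ValueError(f"not a statement of the IR grammar: {stmt!r}")

  # classify("w := t + x")        == "binop"
  # classify("if i < 10 goto L")  == "condjump"
  # classify("L:")                == "label"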

A basic block is a maximal sequence of instructions with no labels (except possibly at the first instruction) and no jumps (except possibly as the last instruction).

Idea:

Once we reach the start of a basic block, we are guaranteed to execute all instructions in the BB. Furthermore, the only way into the basic block is through the first statement.

Consider the basic block:

1. L:
2.   t := 2*x
3.   w := t + x
4.   if w > 0 goto L'
(3) executes only after (2), so at (3) we know that t holds 2*x; exploiting facts like this is the essence of local optimization (here, (3) could be rewritten as w := 3 * x).
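
The definition also suggests a simple way to split a linear list of IR statements into basic blocks: a new block starts at every label, and a block ends after every jump or conditional jump. Below is a minimal Python sketch under the same string-based representation as above; the helper names are mine, not from the notes.

  def is_label(stmt):
      # "L:" forms from the grammar can only start a basic block
      return stmt.strip().endswith(":")

  def ends_block(stmt):
      # control leaves the block after a jump or a conditional jump
      s = stmt.strip()
      return s.startswith("jump ") or s.startswith("if ")

  def basic_blocks(stmts):
      """Partition a list of IR statements into basic blocks (lists of statements)."""
      blocks, current = [], []
      for s in stmts:
          if is_label(s) and current:        # a label starts a new block
              blocks.append(current)
              current = []
          current.append(s)
          if ends_block(s):                  # a jump ends the current block
              blocks.append(current)
              current = []
      if current:
          blocks.append(current)
      return blocks

  # The example above is a single basic block:
  # basic_blocks(["L:", "t := 2*x", "w := t + x", "if w > 0 goto L'"])
  # == [["L:", "t := 2*x", "w := t + x", "if w > 0 goto L'"]]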

A control-flow graph is a directed graph with basic blocks as nodes and an edge from block A to block B if execution can pass from the last instruction of A to the first instruction of B (for example, A ends with a jump to the label that starts B, or execution falls through from the end of A into B).

Example control-flow graph:

BB1:
  x := 1
  i := 1

BB1 --> BB2

BB2:
L:
  x := x * x
  i := i + 1
  if i < 10 goto L

BB2 --> BB2   (back edge: branch taken when i < 10)

BB2 --> BB3   (fall-through when i >= 10; BB3 is the block following the loop)

The body of a method (or procedure) can be represented as a control-flow graph. There is one initial node (entry node). All "return" nodes are terminal.
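
Continuing the same illustrative representation, the sketch below computes the edge relation of a control-flow graph from a list of basic blocks (each a list of statement strings, as produced by basic_blocks above). The function name and details are my own.

  def successors(blocks):
      """Map each block index to the indices of the blocks control can reach next."""
      # A label can only appear as the first statement of a block.
      label_to_block = {b[0].strip().rstrip(":"): i
                        for i, b in enumerate(blocks)
                        if b[0].strip().endswith(":")}
      edges = {}
      for i, b in enumerate(blocks):
          last, succ = b[-1].strip(), []
          if last.startswith("jump "):            # unconditional jump: target only
              succ.append(label_to_block[last.split()[-1]])
          elif last.startswith("if "):            # conditional jump: target and fall-through
              succ.append(label_to_block[last.split()[-1]])
              if i + 1 < len(blocks):
                  succ.append(i + 1)
          elif i + 1 < len(blocks):               # otherwise control falls through
              succ.append(i + 1)
          edges[i] = succ
      return edges

  # For the example (BB1, BB2, and BB3 = the code after the loop, in that order):
  # successors(blocks) == {0: [1], 1: [1, 2], 2: []}
  # i.e. BB1 --> BB2, BB2 --> BB2, BB2 --> BB3.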

Optimization seeks to improve a program's resource utilization: for example execution time (the most common goal), code size, or memory usage.

Optimization should not alter what the program computes: the transformed program must produce the same answers and preserve observable behavior.

For languages like C, there are typically three granularities of optimization:

1. Local optimizations: apply to a basic block in isolation
2. Global optimizations: apply to a control-flow graph (a single method body) in isolation
3. Inter-procedural optimizations: apply across method boundaries

Production compilers do all of these. In general, local optimizations are the easiest to implement and inter-procedural optimizations are the hardest (a small local example is sketched below).
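
As a small, purely illustrative example of a local optimization, the sketch below performs constant folding (with simple constant propagation) inside a single basic block, still assuming whitespace-separated statement strings. The function fold_constants is my own illustration, not an algorithm prescribed by the notes.

  import operator

  OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

  def fold_constants(block):
      """Inside one basic block, rewrite 'x := a op b' as 'x := c' when both
      operands are (or are known to hold) integer constants."""
      consts, out = {}, []                   # consts: variable -> known constant value (str)
      for stmt in block:
          parts = stmt.split()
          if len(parts) == 5 and parts[1] == ":=":          # x := a op b
              dst, a, op, b = parts[0], parts[2], parts[3], parts[4]
              a, b = consts.get(a, a), consts.get(b, b)     # propagate known constants
              if op in OPS and a.lstrip("-").isdigit() and b.lstrip("-").isdigit():
                  val = str(OPS[op](int(a), int(b)))        # fold at compile time
                  consts[dst] = val
                  out.append(f"{dst} := {val}")
                  continue
              consts.pop(dst, None)                         # dst no longer holds a known constant
              out.append(f"{dst} := {a} {op} {b}")
          else:
              if len(parts) >= 2 and parts[1] == ":=":      # any other assignment kills dst
                  consts.pop(parts[0], None)
              out.append(stmt)
      return out

  # fold_constants(["x := 1 + 2", "y := x * 3", "push y"])
  # == ["x := 3", "y := 9", "push y"]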

Criteria for evaluating an optimization:

1. It must be safe: the meaning of the program is preserved
2. It should improve the program by a measurable amount on realistic inputs
3. Its cost, in compilation time and in implementation effort, must be justified by that improvement

In practice, a conscious decision is often made not to implement the fanciest optimization known. Why? Such optimizations tend to be hard to implement, costly in compilation time, and of relatively little benefit on typical programs.

Current state of the art: the goal is "maximum benefit for minimum cost".