The discovery of efficient probabilistically checkable proofs for NP,
and their surprising connection to hardness of approximation, has
greatly enhanced our understanding of the approximability of many
optimization problems. Yet this research has essentially stayed
on problem-specific results. As a result, some fundamental questions
about the behaviour of optimization problems as a whole have
remained unresolved. One fundamental source of difficulty is that
to understand the behaviour of an entire class, we must develop
tools that capture the exact behaviour of infinitely many problems at once.
There are other barriers too, posed by classic results such as
Rice's theorem and Ladner's theorem.
Using the notion of constraint satisfaction problems, we develop a
framework for studying the approximability of infinite classes of
optimization problems that circumvents the above-mentioned
barriers. For each of these classes, we establish succinct classification
theorems which partition all problems into finitely many sub-classes such
that the approximability of any problem is completely determined by
the sub-class it belongs to. That is, we develop approximation algorithms
and matching hardness results that uniformly apply to all problems
within a sub-class. The result is surprising because our framework is
general enough to capture an entire spectrum of important optimization
problems, including s-t min cut, max cut, max 3SAT, max clique, vertex
cover, hitting set with bounded size sets, deleting minimum number of
edges to make a graph bipartite, deleting minimum number of clauses to
make a 2CNF formula satisfiable, and nearest codeword. We will show
that, within the framework of constraint satisfaction, these results
allow us to resolve some important questions about optimization problems.
This is based on joint work with Madhu Sudan, Luca Trevisan and David
Williamson.