**SPEAKER:** Ignacio Grossmann

**ABSTRACT:**

This seminar deals with the theory of reformulations and the numerical solution of generalized disjunctive programming (GDP) problems, which are expressed in terms of Boolean and continuous variables and involve algebraic constraints, disjunctions, and propositional logic statements. We propose a framework to generate alternative MINLP formulations for convex nonlinear GDPs that lead to stronger relaxations, generalizing the seminal work by Egon Balas (1988) for linear disjunctive programs. For convex nonlinear GDPs we define an operation, equivalent to a basic step for linear disjunctive programs, that takes a disjunctive set to another one with fewer conjuncts. We show that the strength of the relaxations increases as the number of conjuncts decreases, leading to a hierarchy of relaxations. We prove that the tightest of these relaxations allows, in theory, the solution of the convex GDP problem as an NLP problem. We present a guide for the generation of strong relaxations without incurring an exponential increase in the size of the reformulated MINLP. We apply the proposed theory for generating strong relaxations to a dozen convex GDPs, which are solved with an NLP-based branch-and-bound method. Compared to the reformulation based on the hull relaxation, the computational results show that the proposed reformulations yield significant improvements in the predicted lower bounds, which in turn translate into a smaller number of nodes in the branch-and-bound enumeration.
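To make the effect of a basic step concrete, here is a small numeric sketch (ours, not from the talk) in one dimension, where the hull relaxation of a disjunction of intervals is simply the interval spanning their union. Relaxing each disjunction separately and intersecting the relaxations can be much weaker than first applying a basic step (intersecting the disjunctions term by term, which reduces the number of conjuncts) and relaxing the single resulting disjunction:

```python
def hull(intervals):
    """Convex hull of a union of 1D intervals [lo, hi]; empty terms removed."""
    intervals = [iv for iv in intervals if iv is not None]
    return (min(lo for lo, _ in intervals), max(hi for _, hi in intervals))

def intersect(a, b):
    """Intersection of two intervals, or None if it is empty."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo <= hi else None

# Two disjunctions in conjunction: (D1) AND (D2)
D1 = [(0.0, 1.0), (4.0, 5.0)]
D2 = [(0.0, 1.0), (2.0, 3.0)]

# Relax each disjunction, then intersect the relaxations:
cnf_relaxation = intersect(hull(D1), hull(D2))

# Basic step first: intersect term by term, THEN relax the single
# resulting disjunction (fewer conjuncts, more disjuncts):
basic_step = [intersect(a, b) for a in D1 for b in D2]
tighter_relaxation = hull(basic_step)

print(cnf_relaxation)      # (0.0, 3.0)
print(tighter_relaxation)  # (0.0, 1.0) -- strictly tighter
```

Here only the term (0, 1) survives the pairwise intersections, so the relaxation after the basic step is exact, while the conjunction of hulls still spans (0, 3). In higher dimensions each intersection introduces new disaggregated variables and constraints, which is why the talk's rules for limiting problem size matter.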

We next address the extension of
the above ideas to the solution of nonconvex GDPs that involve bilinear and
concave terms. In order to solve these nonconvex problems with a spatial branch-and-bound
method, a *convex GDP relaxation*
is obtained by using suitable under- and over-estimating functions of the nonconvex
constraints, such as the convex envelopes proposed by McCormick (1976). In order to predict tighter lower bounds on
the global optimum, we exploit the hierarchy of relaxations for linear GDP
problems. A family of tighter reformulations is obtained by performing a
sequence of basic steps on the original disjunctive set. Since each
intersection usually creates new variables and constraints, we describe some general rules
to limit the size of the problem. We illustrate
the application of these ideas with linear relaxations of several bilinear and
concave GDPs related to the optimization of process systems, demonstrating the
computational savings that can be achieved with the tighter lower bounds. Finally,
we discuss extensions to nonconvex problems involving nonlinear convex under- and
over-estimating functions.
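For readers unfamiliar with the McCormick envelopes referenced above, the following sketch (function names are ours) spells out the four planes that give the convex and concave envelopes of a bilinear term w = x·y over a box, and checks numerically that they sandwich x·y:

```python
def mccormick_under(x, y, xL, xU, yL, yU):
    """Pointwise max of the two under-estimating planes (convex envelope of x*y)."""
    return max(xL * y + yL * x - xL * yL,
               xU * y + yU * x - xU * yU)

def mccormick_over(x, y, xL, xU, yL, yU):
    """Pointwise min of the two over-estimating planes (concave envelope of x*y)."""
    return min(xU * y + yL * x - xU * yL,
               xL * y + yU * x - xL * yU)

# Check on a grid that under <= x*y <= over holds over the whole box.
xL, xU, yL, yU = -1.0, 2.0, 0.5, 3.0
n = 20
for i in range(n + 1):
    for j in range(n + 1):
        x = xL + (xU - xL) * i / n
        y = yL + (yU - yL) * j / n
        under = mccormick_under(x, y, xL, xU, yL, yU)
        over = mccormick_over(x, y, xL, xU, yL, yU)
        assert under - 1e-9 <= x * y <= over + 1e-9, (x, y)
```

The envelopes are exact at the corners of the box (e.g. at (xU, yU) the lower estimate equals xU·yU), which is what a spatial branch-and-bound method exploits: bisecting the box shrinks the gap between the relaxation and the bilinear term.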
