The advent of computational science has unveiled large classes of nonlinear
optimization problems where derivatives of the objective and/or constraints are
unavailable. Often, these problems are posed as blackbox optimization problems,
but rarely is this by necessity. In this talk, we report on our experience
extracting additional structure on problems consisting of both blackbox and
algebraic or otherwise known components. Examples include nonlinear
least-squares and calibration problems, as well as problems in which
derivatives are known for some of the nonlinear constraints or with respect to
a subset of the variables. In each case, we use
quadratic surrogates to model both the blackbox and algebraic components to
obtain new greybox optimization methods.
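To illustrate the greybox idea in miniature, the sketch below builds a quadratic interpolation surrogate of a blackbox residual while keeping an algebraic residual exact. This is an illustrative assumption, not the authors' method: `r_alg`, `r_bb`, the sampling stencil, and the step size `h` are all hypothetical placeholders.

```python
import numpy as np

def r_alg(x):
    # Known algebraic residual: derivatives available in closed form.
    return x[0] + 2.0 * x[1] - 1.0

def r_bb(x):
    # Stand-in for a blackbox residual (evaluated, never differentiated).
    return np.sin(x[0]) * x[1] - 0.5

def quad_surrogate(f, center, h=1e-2):
    """Fit a quadratic model of f around `center` by interpolating
    1 + 2n + n(n-1)/2 sample points (center, axis steps, cross steps)."""
    center = np.asarray(center, float)
    n = len(center)
    pts, vals = [], []
    def add(p):
        pts.append(p); vals.append(f(p))
    add(center.copy())
    for i in range(n):
        for s in (+h, -h):
            p = center.copy(); p[i] += s; add(p)
    for i in range(n):
        for j in range(i + 1, n):
            p = center.copy(); p[i] += h; p[j] += h; add(p)
    # Quadratic basis: 1, the displacements, and their pairwise products.
    def basis(p):
        d = np.asarray(p, float) - center
        cross = [d[i] * d[j] for i in range(n) for j in range(i, n)]
        return np.concatenate(([1.0], d, cross))
    A = np.array([basis(p) for p in pts])
    coef, *_ = np.linalg.lstsq(A, np.array(vals), rcond=None)
    return lambda p: basis(p) @ coef

def greybox_model(x, q_bb):
    # Surrogate of the least-squares objective: the algebraic residual is
    # kept exact, only the blackbox residual is replaced by its model q_bb.
    return r_alg(np.asarray(x, float)) ** 2 + q_bb(x) ** 2
```

The point of the split is that `greybox_model` stays exact in the components whose structure is known, so only the blackbox piece pays the usual model-error price of derivative-free surrogates.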
