Optimality conditions and duality theory for minimizing sums of the largest eigenvalues of symmetric matrices

M. L. Overton, R. S. Womersley

Research output: Contribution to journal › Article › peer-review

Abstract

The sum of the largest eigenvalues of a symmetric matrix is a nonsmooth convex function of the matrix elements. Max characterizations for this sum are established, giving a concise characterization of the subdifferential in terms of a dual matrix. This leads to a very useful characterization of the generalized gradient of the following convex composite function: the sum of the largest eigenvalues of a smooth symmetric matrix-valued function of a set of real parameters. The dual matrix provides the information required either to verify first-order optimality conditions at a point or to generate a descent direction for the eigenvalue sum from that point, splitting a multiple eigenvalue if necessary. Connections with the classical literature on sums of eigenvalues and on eigenvalue perturbation theory are discussed. Sums of the eigenvalues that are largest in absolute value are also addressed.
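For context, the following is a sketch of the standard max characterization (due to Ky Fan) underlying the abstract's "max characterizations" and "dual matrix"; the symbols f_k, V, W, and I_k are introduced here for illustration and do not necessarily follow the paper's notation. For a real symmetric n-by-n matrix A with eigenvalues \lambda_1(A) \ge \dots \ge \lambda_n(A),

    f_k(A) = \sum_{i=1}^{k} \lambda_i(A)
           = \max \{ \operatorname{tr}(V^T A V) : V \in \mathbb{R}^{n \times k},\ V^T V = I_k \}
           = \max \{ \operatorname{tr}(A W) : W = W^T,\ 0 \preceq W \preceq I,\ \operatorname{tr} W = k \},

so f_k is a pointwise maximum of functions that are linear in A and is therefore convex. Its subdifferential at A is the set of maximizing dual matrices,

    \partial f_k(A) = \{ W = W^T : 0 \preceq W \preceq I,\ \operatorname{tr} W = k,\ \operatorname{tr}(A W) = f_k(A) \}.

When \lambda_k(A) > \lambda_{k+1}(A) this set reduces to the single matrix W = V V^T, where the columns of V are orthonormal eigenvectors for the k largest eigenvalues, and f_k is differentiable at A; otherwise the subdifferential is a nontrivial convex set, reflecting the nonsmoothness at a multiple eigenvalue.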

Original language: English (US)
Pages (from-to): 321-357
Number of pages: 37
Journal: Mathematical Programming
Volume: 62
Issue number: 1-3
State: Published - Feb 1993

Keywords

  • convex composite optimization
  • generalized gradient
  • maximum eigenvalue
  • nonsmooth optimization
  • sum of eigenvalues

ASJC Scopus subject areas

  • Software
  • General Mathematics

