### Abstract

The sum of the largest eigenvalues of a symmetric matrix is a nonsmooth convex function of the matrix elements. Max characterizations for this sum are established, giving a concise characterization of the subdifferential in terms of a dual matrix. This leads to a useful characterization of the generalized gradient of the following convex composite function: the sum of the largest eigenvalues of a smooth symmetric matrix-valued function of a set of real parameters. The dual matrix provides the information required either to verify first-order optimality conditions at a point or to generate a descent direction for the eigenvalue sum from that point, splitting a multiple eigenvalue if necessary. Connections with the classical literature on sums of eigenvalues and eigenvalue perturbation theory are discussed. Sums of the largest eigenvalues in the absolute value sense are also addressed.
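The max characterization mentioned in the abstract is Ky Fan's classical result: the sum of the k largest eigenvalues of a symmetric matrix A equals the maximum of tr(VᵀAV) over all n×k matrices V with orthonormal columns, attained at the eigenvectors of the k largest eigenvalues. The sketch below (a numerical illustration, not code from the paper; `sum_largest_eigs` and the random test matrices are our own constructions) checks this characterization, the associated dual matrix U = VVᵀ with tr(UA) equal to the eigenvalue sum, and the convexity of the function via the midpoint inequality:

```python
import numpy as np

def sum_largest_eigs(A, k):
    """Sum of the k largest eigenvalues of a symmetric matrix A."""
    w = np.linalg.eigvalsh(A)  # eigenvalues in ascending order
    return float(np.sum(w[-k:]))

rng = np.random.default_rng(0)
n, k = 6, 2
M = rng.standard_normal((n, n))
A = (M + M.T) / 2  # random symmetric test matrix

# Fan's max characterization: f_k(A) = max { tr(V^T A V) : V^T V = I_k },
# attained at V = eigenvectors of the k largest eigenvalues.
w, Q = np.linalg.eigh(A)
V_opt = Q[:, -k:]
assert np.isclose(np.trace(V_opt.T @ A @ V_opt), sum_largest_eigs(A, k))

# Any other orthonormal frame gives a value no larger.
V_rand, _ = np.linalg.qr(rng.standard_normal((n, k)))
assert np.trace(V_rand.T @ A @ V_rand) <= sum_largest_eigs(A, k) + 1e-12

# Dual matrix: U = V V^T satisfies tr(U A) = f_k(A), tr U = k,
# and its eigenvalues lie in [0, 1].
U = V_opt @ V_opt.T
assert np.isclose(np.trace(U @ A), sum_largest_eigs(A, k))
assert np.isclose(np.trace(U), k)
assert np.all(np.linalg.eigvalsh(U) <= 1 + 1e-12)

# Convexity: f_k satisfies the midpoint inequality.
N = rng.standard_normal((n, n))
B = (N + N.T) / 2
lhs = sum_largest_eigs((A + B) / 2, k)
rhs = (sum_largest_eigs(A, k) + sum_largest_eigs(B, k)) / 2
assert lhs <= rhs + 1e-12
```

With k = 1 this reduces to the familiar Rayleigh-quotient characterization of the maximum eigenvalue; the dual matrix U is the object the paper uses to state optimality conditions and construct descent directions.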

| Original language | English (US) |
|---|---|
| Pages (from-to) | 321-357 |
| Number of pages | 37 |
| Journal | Mathematical Programming |
| Volume | 62 |
| Issue number | 1-3 |
| DOIs | https://doi.org/10.1007/BF01585173 |
| State | Published - Feb 1993 |

### Keywords

- convex composite optimization
- generalized gradient
- maximum eigenvalue
- nonsmooth optimization
- sum of eigenvalues

### ASJC Scopus subject areas

- Applied Mathematics
- Mathematics (all)
- Safety, Risk, Reliability and Quality
- Management Science and Operations Research
- Software
- Computer Graphics and Computer-Aided Design
- Computer Science (all)

## Fingerprint

Research topics of 'Optimality conditions and duality theory for minimizing sums of the largest eigenvalues of symmetric matrices'.

## Cite this

Optimality conditions and duality theory for minimizing sums of the largest eigenvalues of symmetric matrices. *Mathematical Programming*, *62*(1-3), 321-357. https://doi.org/10.1007/BF01585173