Local Convergence of Gradient Descent-Ascent for Training Generative Adversarial Networks

Evan Becker, Parthe Pandit, Sundeep Rangan, Alyson K. Fletcher

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Generative Adversarial Networks (GANs) are a popular formulation for training generative models of complex high-dimensional data. The standard method for training GANs is a gradient descent-ascent (GDA) procedure on a minimax optimization problem. This procedure is hard to analyze in general due to the nonlinear nature of the dynamics. We study the local dynamics of GDA for training a GAN with a kernel-based discriminator. Our convergence analysis is based on a linearization of the nonlinear dynamical system that describes the GDA iterations, under an isolated-points model assumption from [2]. The analysis brings out the effect of the learning rates, the regularization, and the bandwidth of the kernel discriminator on the local convergence rate of GDA. Importantly, we show phase transitions that indicate when the system converges, oscillates, or diverges. We also provide numerical simulations that verify our claims. A full version with complete proofs is available on arXiv [3].
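The convergence/divergence phase transition described in the abstract can be illustrated on a classical toy problem (not taken from the paper): simultaneous GDA on the bilinear objective f(x, y) = xy − (λ/2)y², where λ is an assumed ridge regularizer on the "discriminator" variable y. All parameter values below are illustrative assumptions.

```python
import numpy as np

def gda(eta, lam, steps=500, x0=1.0, y0=1.0):
    """Simultaneous gradient descent-ascent on
    f(x, y) = x*y - (lam/2)*y**2 (toy example, not the paper's model).
    Returns the final iterate norm."""
    x, y = x0, y0
    for _ in range(steps):
        gx = y                              # df/dx
        gy = x - lam * y                    # df/dy
        x, y = x - eta * gx, y + eta * gy   # descend in x, ascend in y
    return np.hypot(x, y)

# Without regularization the iterate norm grows by sqrt(1 + eta^2)
# per step, so plain GDA spirals outward (diverges).
print(gda(eta=0.1, lam=0.0))

# With enough damping (here lam > eta), the linearized map is a
# contraction and GDA spirals inward (converges).
print(gda(eta=0.1, lam=0.5))
```

The transition between the two regimes is governed by the spectrum of the linearized update map, which is the kind of local analysis, in a far richer kernel-discriminator setting, that the paper carries out.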

Original language: English (US)
Title of host publication: Conference Record of the 57th Asilomar Conference on Signals, Systems and Computers, ACSSC 2023
Editors: Michael B. Matthews
Publisher: IEEE Computer Society
Pages: 892-896
Number of pages: 5
ISBN (Electronic): 9798350325744
DOIs
State: Published - 2023
Event: 57th Asilomar Conference on Signals, Systems and Computers, ACSSC 2023 - Pacific Grove, United States
Duration: Oct 29 2023 – Nov 1 2023

Publication series

Name: Conference Record - Asilomar Conference on Signals, Systems and Computers
ISSN (Print): 1058-6393

Conference

Conference: 57th Asilomar Conference on Signals, Systems and Computers, ACSSC 2023
Country/Territory: United States
City: Pacific Grove
Period: 10/29/23 – 11/1/23

ASJC Scopus subject areas

  • Signal Processing
  • Computer Networks and Communications
