TY - GEN
T1 - A Permutation-Equivariant Neural Network Architecture For Auction Design
AU - Rahme, Jad
AU - Jelassi, Samy
AU - Bruna, Joan
AU - Weinberg, S. Matthew
N1 - Publisher Copyright:
Copyright © 2021, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.
PY - 2021
Y1 - 2021
N2 - Designing an incentive compatible auction that maximizes expected revenue is a central problem in Auction Design. Theoretical approaches to the problem have hit some limits in recent decades, and analytical solutions are known for only a few simple settings. Building on the success of deep learning, a new approach was recently proposed by Duetting et al. (2019) in which the auction is modeled by a feed-forward neural network and the design problem as a learning problem. However, the architectures used in that work are general purpose and do not take advantage of any structure the solution might possess. For example, symmetric auctions are known to be optimal in many settings of interest, and near-optimal quite generally (Daskalakis and Weinberg 2012; Kothari et al. 2019a), yet previous architectures do not recover this structure (even in settings where it is known to exist). In this work, we construct a neural architecture that is capable of perfectly recovering the optimal symmetric mechanism. We further demonstrate that permutation-equivariant architectures are not only capable of recovering previous results but also have better generalization properties.
AB - Designing an incentive compatible auction that maximizes expected revenue is a central problem in Auction Design. Theoretical approaches to the problem have hit some limits in recent decades, and analytical solutions are known for only a few simple settings. Building on the success of deep learning, a new approach was recently proposed by Duetting et al. (2019) in which the auction is modeled by a feed-forward neural network and the design problem as a learning problem. However, the architectures used in that work are general purpose and do not take advantage of any structure the solution might possess. For example, symmetric auctions are known to be optimal in many settings of interest, and near-optimal quite generally (Daskalakis and Weinberg 2012; Kothari et al. 2019a), yet previous architectures do not recover this structure (even in settings where it is known to exist). In this work, we construct a neural architecture that is capable of perfectly recovering the optimal symmetric mechanism. We further demonstrate that permutation-equivariant architectures are not only capable of recovering previous results but also have better generalization properties.
UR - http://www.scopus.com/inward/record.url?scp=85129950522&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85129950522&partnerID=8YFLogxK
U2 - 10.1609/aaai.v35i6.16711
DO - 10.1609/aaai.v35i6.16711
M3 - Conference contribution
AN - SCOPUS:85129950522
T3 - 35th AAAI Conference on Artificial Intelligence, AAAI 2021
SP - 5664
EP - 5672
BT - 35th AAAI Conference on Artificial Intelligence, AAAI 2021
PB - Association for the Advancement of Artificial Intelligence
T2 - 35th AAAI Conference on Artificial Intelligence, AAAI 2021
Y2 - 2 February 2021 through 9 February 2021
ER -