Improving quantitative studies of international conflict: A conjecture

Nathaniel Beck, Gary King, Langche Zeng

Research output: Contribution to journal › Article › peer-review

Abstract

We address a well-known but infrequently discussed problem in the quantitative study of international conflict: Despite immense data collections, prestigious journals, and sophisticated analyses, empirical findings in the literature on international conflict are often unsatisfying. Many statistical results change from article to article and specification to specification. Accurate forecasts are nonexistent. In this article we offer a conjecture about one source of this problem: The causes of conflict, theorized to be important but often found to be small or ephemeral, are indeed tiny for the vast majority of dyads, but they are large, stable, and replicable wherever the ex ante probability of conflict is large. This simple idea has an unexpectedly rich array of observable implications, all consistent with the literature. We directly test our conjecture by formulating a statistical model that includes its critical features. Our approach, a version of a "neural network" model, uncovers some interesting structural features of international conflict and, as one evaluative measure, forecasts substantially better than any previous effort. Moreover, this improvement comes at little cost, and it is easy to evaluate whether the model is a statistical improvement over the simpler models commonly used.
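The abstract's central claim is that the causes of conflict are near zero for most dyads but large wherever the ex ante probability of conflict is high, and that a flexible model such as a neural network can capture this kind of interactive structure where a simple linear specification cannot. As a purely illustrative sketch (not the authors' actual model or data), the following trains a one-hidden-layer network on synthetic dyad data in which two hypothetical risk covariates matter only jointly, mimicking effects concentrated in high-risk cases:

```python
import math
import random

random.seed(0)

# Hypothetical synthetic dyad data: conflict is likely only when BOTH
# risk factors are high, so each covariate's marginal effect is tiny
# over most of the sample but large in the high-risk region.
def make_data(n=400):
    data = []
    for _ in range(n):
        x1, x2 = random.random(), random.random()
        p = 0.9 if (x1 > 0.6 and x2 > 0.6) else 0.05
        data.append(((x1, x2), 1 if random.random() < p else 0))
    return data

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

class TinyNet:
    """One-hidden-layer network trained by stochastic gradient descent."""

    def __init__(self, hidden=4, lr=0.5):
        self.lr = lr
        self.w1 = [[random.gauss(0, 0.5) for _ in range(2)] for _ in range(hidden)]
        self.b1 = [0.0] * hidden
        self.w2 = [random.gauss(0, 0.5) for _ in range(hidden)]
        self.b2 = 0.0

    def forward(self, x):
        h = [sigmoid(sum(w * xi for w, xi in zip(ws, x)) + b)
             for ws, b in zip(self.w1, self.b1)]
        out = sigmoid(sum(w * hj for w, hj in zip(self.w2, h)) + self.b2)
        return h, out

    def train(self, data, epochs=200):
        for _ in range(epochs):
            for x, y in data:
                h, out = self.forward(x)
                d_out = out - y  # cross-entropy gradient at the output
                for j, hj in enumerate(h):
                    d_h = d_out * self.w2[j] * hj * (1 - hj)
                    self.w2[j] -= self.lr * d_out * hj
                    for i in range(2):
                        self.w1[j][i] -= self.lr * d_h * x[i]
                    self.b1[j] -= self.lr * d_h
                self.b2 -= self.lr * d_out

net = TinyNet()
net.train(make_data())

# The fitted net assigns a much higher conflict probability to the
# high-risk corner of covariate space than to the low-risk bulk.
_, p_high = net.forward((0.9, 0.9))
_, p_low = net.forward((0.1, 0.1))
print(round(p_high, 2), round(p_low, 2))
```

A logistic regression fit to the same data would spread the interaction across two additive coefficients and understate risk in the corner; the hidden layer lets the predicted probability be near flat over most dyads and sharply elevated where both factors are high, which is the pattern the conjecture describes.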

Original language: English (US)
Pages (from-to): 21-35
Number of pages: 15
Journal: American Political Science Review
Volume: 94
Issue number: 1
State: Published - Mar 2000

ASJC Scopus subject areas

  • Sociology and Political Science
  • Political Science and International Relations
