sbv IMPROVER diagnostic signature challenge: Scoring strategies

Raquel Norel, Erhan Bilal, Nathalie Conrad-Chemineau, Richard Bonneau, Alberto de la Fuente, Igor Jurisica, Daniel Marbach, Pablo Meyer, J. Jeremy Rice, Tamir Tuller, Gustavo Stolovitzky

Research output: Contribution to journal › Article › peer-review

Abstract

Evaluating the performance of computational methods for analyzing high throughput data is an integral component of model development and critical to progress in computational biology. In collaborative competitions, model performance evaluation is crucial to determine the best performing submission. Here we present the scoring methodology used to assess 54 submissions to the IMPROVER Diagnostic Signature Challenge. Participants were tasked with classifying patients' disease phenotype based on gene expression data in four disease areas: Psoriasis, Chronic Obstructive Pulmonary Disease, Lung Cancer, and Multiple Sclerosis. We discuss the criteria underlying our choice of the three scoring metrics used to assess the performance of the submitted models. The statistical significance of the difference in performance between individual submissions and classification tasks varied according to these different metrics. Accordingly, we consider an aggregation of these three assessment methods and present the approaches considered for aggregating the rankings and ultimately determining the final overall best performer.
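
As an illustration of the rank-aggregation idea described in the abstract, the sketch below ranks hypothetical submissions under three separate metrics and combines them by mean rank. The metric names, scores, and the mean-rank rule are assumptions for demonstration only; the paper itself discusses several aggregation approaches, and this is not presented as the challenge's actual scoring code.

```python
# Minimal sketch: aggregate per-metric rankings into one overall ranking.
# Teams, metric names, and scores below are hypothetical.
from statistics import mean

# Per-metric scores for each submission (higher is assumed better).
scores = {
    "team_A": {"metric_1": 0.91, "metric_2": 0.88, "metric_3": 0.95},
    "team_B": {"metric_1": 0.89, "metric_2": 0.93, "metric_3": 0.90},
    "team_C": {"metric_1": 0.86, "metric_2": 0.85, "metric_3": 0.97},
}

def rank_by_metric(scores, metric):
    """Return {team: rank} for one metric, where rank 1 is the best score."""
    ordered = sorted(scores, key=lambda team: scores[team][metric], reverse=True)
    return {team: rank for rank, team in enumerate(ordered, start=1)}

metrics = ["metric_1", "metric_2", "metric_3"]
per_metric_ranks = {m: rank_by_metric(scores, m) for m in metrics}

# Aggregate: average each team's rank across the metrics;
# lower mean rank indicates better overall performance.
mean_ranks = {
    team: mean(per_metric_ranks[m][team] for m in metrics)
    for team in scores
}

for team in sorted(mean_ranks, key=mean_ranks.get):
    print(f"{team}: mean rank = {mean_ranks[team]:.2f}")
```

Mean rank is only one way to combine metrics; the paper weighs alternative aggregation strategies before determining the overall best performer.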

Original language: English (US)
Pages (from-to): 208-216
Number of pages: 9
Journal: Systems Biomedicine
Volume: 1
Issue number: 4
DOIs
State: Published - 2014

Keywords

  • Crowdsourcing
  • Diagnostic signature
  • Gene expression
  • Improver
  • Molecular classification
  • Peer-review

ASJC Scopus subject areas

  • Genetics(clinical)
  • Genetics
  • Biochemistry
  • Biotechnology
  • Cell Biology
  • Medicine (miscellaneous)
