Machine collaboration

Qingfeng Liu, Yang Feng

Research output: Contribution to journal › Article › peer-review

Abstract

We propose a new ensemble framework for supervised learning, called machine collaboration (MaC), which uses a collection of possibly heterogeneous base learning methods (hereafter, base machines) for prediction tasks. Unlike bagging and stacking (parallel, independent frameworks) and boosting (a sequential, top-down framework), MaC is a circular and recursive learning framework. Its circular, recursive nature lets the base machines pass information around the cycle and update their structures and parameters accordingly. A theoretical result on the risk bound of the MaC estimator shows that this circular and recursive feature can help MaC reduce risk via a parsimonious ensemble. We conduct extensive experiments on MaC using both simulated data and 119 benchmark real datasets. The results demonstrate that in most cases, MaC performs significantly better than several other state-of-the-art methods, including classification and regression trees, neural networks, stacking, and boosting.
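To make the contrast with parallel (stacking) and sequential (boosting) frameworks concrete, the following is a minimal, hypothetical sketch of a circular, recursive ensemble loop. It is not the authors' MaC algorithm; it assumes, backfitting-style, that each base machine repeatedly refits to the residual left by the other machines' current predictions, cycling over the machines for several rounds.

```python
# Hypothetical sketch of a circular, recursive ensemble loop (NOT the
# published MaC algorithm). Two toy base machines: a constant (mean)
# predictor and a one-dimensional least-squares line.

def fit_mean(x, r):
    """Base machine 1: fit a constant to the residual r."""
    c = sum(r) / len(r)
    return lambda xi: c

def fit_line(x, r):
    """Base machine 2: fit a simple least-squares line to the residual r."""
    n = len(x)
    mx = sum(x) / n
    mr = sum(r) / n
    sxx = sum((xi - mx) ** 2 for xi in x) or 1.0
    b = sum((xi - mx) * (ri - mr) for xi, ri in zip(x, r)) / sxx
    a = mr - b * mx
    return lambda xi: a + b * xi

def circular_ensemble(x, y, rounds=5):
    """Cycle over the base machines; each refits to what the others leave."""
    machines = [fit_mean, fit_line]
    fitted = [lambda xi: 0.0 for _ in machines]  # start from zero predictors
    for _ in range(rounds):                      # circular passes
        for k, fit in enumerate(machines):
            # Prediction of every machine except machine k.
            partial = [sum(f(xi) for j, f in enumerate(fitted) if j != k)
                       for xi in x]
            residual = [yi - pi for yi, pi in zip(y, partial)]
            fitted[k] = fit(x, residual)         # machine k updates in place
    return lambda xi: sum(f(xi) for f in fitted)

x = [0.0, 1.0, 2.0, 3.0]
y = [1.0, 3.0, 5.0, 7.0]            # y = 1 + 2x exactly
pred = circular_ensemble(x, y)
print(pred(1.5))                    # recovers 1 + 2(1.5) = 4.0
```

The key structural difference from boosting is that the loop revisits every machine and lets it *revise* its fit given the others' current state, rather than freezing each learner after a single top-down pass.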

Original language: English (US)
Article number: e661
Journal: Stat
Volume: 13
Issue number: 1
DOIs
State: Published - 2024

Keywords

  • boosting
  • CART
  • deep neural network
  • ensemble learning
  • machine learning
  • regression

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
