Abstract
We propose a new ensemble framework for supervised learning, called machine collaboration (MaC), which uses a collection of possibly heterogeneous base learning methods (hereafter, base machines) for prediction tasks. Unlike bagging and stacking (parallel, independent frameworks) and boosting (a sequential, top-down framework), MaC is a circular and recursive learning framework. Its circular, recursive structure lets the base machines pass information around the cycle and update their structures and parameters accordingly. A theoretical risk bound for the MaC estimator shows that this circular and recursive feature can reduce risk via a parsimonious ensemble. We conduct extensive experiments on MaC using both simulated data and 119 benchmark real datasets. The results demonstrate that in most cases, MaC performs significantly better than several other state-of-the-art methods, including classification and regression trees, neural networks, stacking, and boosting.
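The abstract describes the circular, recursive flow of information among base machines but does not spell out the algorithm. As a minimal sketch only, the following hypothetical Python code illustrates one way such a cycle could look: each base machine is refit, in turn, on the residual left by the other machines' current predictions, and the loop repeats so information travels around the circle. The function names, the residual-passing scheme, and the choice of base machines are all assumptions for illustration, not the paper's actual MaC procedure.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.tree import DecisionTreeRegressor


def circular_ensemble_fit(X, y, machines, n_rounds=3):
    """Hypothetical sketch of a circular, recursive ensemble (NOT the
    paper's MaC algorithm): in each round, every machine is refit on the
    residual remaining after the other machines' current predictions."""
    n = len(X)
    preds = [np.zeros(n) for _ in machines]
    for _ in range(n_rounds):
        for i, machine in enumerate(machines):
            # Residual after removing the other machines' contributions,
            # so information from the rest of the cycle reaches machine i.
            residual = y - (sum(preds) - preds[i])
            machine.fit(X, residual)
            preds[i] = machine.predict(X)
    return machines


def circular_ensemble_predict(X, machines):
    """Combine the fitted machines' contributions additively."""
    return sum(machine.predict(X) for machine in machines)


# Example usage with two heterogeneous base machines (a tree and a
# ridge regression), echoing the abstract's heterogeneous-machine idea.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X[:, 0] ** 2 + X[:, 1]
machines = circular_ensemble_fit(
    X, y, [DecisionTreeRegressor(max_depth=3, random_state=0), Ridge()]
)
y_hat = circular_ensemble_predict(X, machines)
```

Passing residuals around the cycle repeatedly is what distinguishes this sketch from one-shot stacking, where each base machine is fit once and independently.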
| Original language | English (US) |
| --- | --- |
| Article number | e661 |
| Journal | Stat |
| Volume | 13 |
| Issue number | 1 |
| DOIs | |
| State | Published - 2024 |
Keywords
- CART
- boosting
- deep neural network
- ensemble learning
- machine learning
- regression
ASJC Scopus subject areas
- Statistics and Probability
- Statistics, Probability and Uncertainty