TY - JOUR
T1 - An improved algorithm for incremental extreme learning machine
AU - Song, Shaojian
AU - Wang, Miao
AU - Lin, Yuzhang
N1 - Funding Information:
This work was supported by the National Natural Science Foundation of China [grant number 51767005] and the Natural Science Foundation of Guangxi Province [grant number 2016GXNSFAA380327].
Publisher Copyright:
© 2020, © 2020 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group.
PY - 2020/1/1
Y1 - 2020/1/1
N2 - Incremental extreme learning machine (I-ELM) randomly obtains the input weights and the hidden-layer neuron biases during the training process. Some hidden nodes in the ELM play only a minor role in the network output, which may increase the network complexity and even reduce the stability of the network. To avoid this issue, this paper proposes an enhanced method for the I-ELM, referred to as the improved incremental extreme learning machine (II-ELM). At each learning step of the original I-ELM, an additional offset k is added to the hidden-layer output matrix before the output weights for the new hidden node are computed, and the existence of the offset k is analysed. Compared with several improved ELM algorithms, the advantages of the II-ELM in training time, forecasting accuracy, and stability are verified on several benchmark datasets from the UCI database.
AB - Incremental extreme learning machine (I-ELM) randomly obtains the input weights and the hidden-layer neuron biases during the training process. Some hidden nodes in the ELM play only a minor role in the network output, which may increase the network complexity and even reduce the stability of the network. To avoid this issue, this paper proposes an enhanced method for the I-ELM, referred to as the improved incremental extreme learning machine (II-ELM). At each learning step of the original I-ELM, an additional offset k is added to the hidden-layer output matrix before the output weights for the new hidden node are computed, and the existence of the offset k is analysed. Compared with several improved ELM algorithms, the advantages of the II-ELM in training time, forecasting accuracy, and stability are verified on several benchmark datasets from the UCI database.
KW - Extreme learning machine
KW - incremental algorithm
KW - random hidden nodes
KW - single-hidden layer feedforward neural networks
UR - http://www.scopus.com/inward/record.url?scp=85084797371&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85084797371&partnerID=8YFLogxK
U2 - 10.1080/21642583.2020.1759156
DO - 10.1080/21642583.2020.1759156
M3 - Review article
AN - SCOPUS:85084797371
SN - 2164-2583
VL - 8
SP - 308
EP - 317
JO - Systems Science and Control Engineering
JF - Systems Science and Control Engineering
IS - 1
ER -