An improved algorithm for incremental extreme learning machine

Shaojian Song, Miao Wang, Yuzhang Lin

Research output: Contribution to journal › Review article › peer-review


The incremental extreme learning machine (I-ELM) randomly obtains the input weights and the hidden-layer neuron biases during the training process. Some hidden nodes play only a minor role in the network output, which may increase the network complexity and even reduce the stability of the network. To avoid this issue, this paper proposes an enhanced method for the I-ELM, referred to as the improved incremental extreme learning machine (II-ELM). At each learning step of the original I-ELM, an additional offset k is added to the hidden-layer output matrix before the output weight for the new hidden node is computed, and the existence of such an offset k is analysed. Compared with several improved ELM algorithms, the advantages of the II-ELM in training time, forecasting accuracy, and stability are verified on several benchmark datasets from the UCI database.
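The step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the standard I-ELM recurrence (a new random sigmoid node per step, output weight chosen by least squares against the current residual), with the II-ELM modification of adding a constant offset k to the new node's output vector before that weight is computed. The function name `ii_elm` and the choice of k are illustrative.

```python
import numpy as np

def ii_elm(X, y, max_nodes=100, k=0.1, seed=0):
    """Sketch of an II-ELM-style training loop (assumed form, see lead-in).

    Each step adds one random hidden node, offsets its output vector by k
    (the II-ELM modification), computes the least-squares output weight
    against the current residual, and updates the residual.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    e = y.astype(float).copy()        # residual error, starts at the target
    nodes = []                        # (input weights, bias, output weight)
    for _ in range(max_nodes):
        w = rng.uniform(-1.0, 1.0, d)             # random input weights
        b = rng.uniform(-1.0, 1.0)                # random hidden bias
        h = 1.0 / (1.0 + np.exp(-(X @ w + b)))    # sigmoid node output
        h = h + k                                 # II-ELM: offset the output
        beta = (e @ h) / (h @ h)                  # least-squares output weight
        e = e - beta * h                          # update residual
        nodes.append((w, b, beta))
    return nodes, e

# Toy usage: fit a 1-D function and watch the residual shrink.
X = np.linspace(0.0, 1.0, 100).reshape(-1, 1)
y = np.sin(2.0 * np.pi * X).ravel()
nodes, residual = ii_elm(X, y, max_nodes=100, k=0.1)
```

Because each output weight minimises the residual along the new node's (offset) output direction, the residual norm is non-increasing in the number of nodes; the offset's role, per the abstract, is to keep individual nodes from contributing negligibly to the output.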

Original language: English (US)
Pages (from-to): 308-317
Number of pages: 10
Journal: Systems Science and Control Engineering
Issue number: 1
State: Published - Jan 1 2020


Keywords

  • Extreme learning machine
  • incremental algorithm
  • random hidden nodes
  • single-hidden layer feedforward neural networks

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Control and Optimization
  • Artificial Intelligence

