TY - GEN
T1 - Team of Tiny ANNs
T2 - 2nd IEEE International Conference on Artificial Intelligence, ICAI 2022
AU - Younis, Hamad
AU - Hassan, Muhammad
AU - Younis, Shahzad
AU - Shafique, Muhammad
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Deep neural networks (DNNs) have recently achieved enormous success in various image recognition tasks. However, training large DNN models is computationally expensive and memory intensive. A natural approach is therefore to compress and accelerate the network without significantly diminishing the model's performance. In this paper, we propose a rapid and accurate method of training a neural network with a small computation time and few parameters. Features are extracted using the Discrete Wavelet Transform (DWT). A voting-based classifier comprising a team of tiny artificial neural networks is proposed. The proposed classifier combines the classification votes from the different sub-bands (models) to obtain the final class label, thus achieving a classification accuracy similar to that of a standard neural network architecture. Experiments were conducted on the benchmark MNIST and EMNIST datasets. On the MNIST dataset, the trained models achieve a highest accuracy of 93.16% for original images and 90.44% for Low-Low (LL) sub-band images. On the EMNIST dataset, accuracies of 90.13% for original images and 87.40% for LL sub-band images were obtained, respectively.
AB - Deep neural networks (DNNs) have recently achieved enormous success in various image recognition tasks. However, training large DNN models is computationally expensive and memory intensive. A natural approach is therefore to compress and accelerate the network without significantly diminishing the model's performance. In this paper, we propose a rapid and accurate method of training a neural network with a small computation time and few parameters. Features are extracted using the Discrete Wavelet Transform (DWT). A voting-based classifier comprising a team of tiny artificial neural networks is proposed. The proposed classifier combines the classification votes from the different sub-bands (models) to obtain the final class label, thus achieving a classification accuracy similar to that of a standard neural network architecture. Experiments were conducted on the benchmark MNIST and EMNIST datasets. On the MNIST dataset, the trained models achieve a highest accuracy of 93.16% for original images and 90.44% for Low-Low (LL) sub-band images. On the EMNIST dataset, accuracies of 90.13% for original images and 87.40% for LL sub-band images were obtained, respectively.
KW - Artificial Neural Network
KW - Deep Learning
KW - Discrete Wavelet Transform
KW - Distributed Learning
KW - Ensemble Learning
KW - Handwritten Digit Recognition
UR - http://www.scopus.com/inward/record.url?scp=85130899877&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85130899877&partnerID=8YFLogxK
U2 - 10.1109/ICAI55435.2022.9773451
DO - 10.1109/ICAI55435.2022.9773451
M3 - Conference contribution
AN - SCOPUS:85130899877
T3 - 2nd IEEE International Conference on Artificial Intelligence, ICAI 2022
SP - 52
EP - 57
BT - 2nd IEEE International Conference on Artificial Intelligence, ICAI 2022
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 30 March 2022 through 31 March 2022
ER -