TY - GEN
T1 - How Does Positional Information Affect Convolutional Neural Networks? [Konumsal Bilgi Evrisimsel Sinir Aglarini Nasil Etkiler?]
AU - Saritas, Erdi
AU - Ekenel, Hazim Kemal
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
AB - The recent success of Transformers, including in the field of image processing, has attracted the attention of researchers. Some researchers have designed purely Transformer-based or Convolutional Neural Network-based models, while others have combined them into hybrid models. A significant number of hybrid-model studies have examined sub-model design and/or the applicability of self-attention in Convolutional Neural Networks. However, position embedding, another contribution of Transformers, has received much less attention. In this study, the effect of position information on Convolutional Neural Networks is analyzed. The experiments show that the use of position information affects performance. On the AgeDB-30, CALFW, and LFW test sets, models that incorporate position information in different ways surpass the model without position information, achieving 95.12%, 93.95%, and 99.52% accuracy, respectively.
KW - Convolutional Neural Network
KW - Transformers
UR - http://www.scopus.com/inward/record.url?scp=85177549585&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85177549585&partnerID=8YFLogxK
U2 - 10.1109/UBMK59864.2023.10286763
DO - 10.1109/UBMK59864.2023.10286763
M3 - Conference contribution
AN - SCOPUS:85177549585
T3 - UBMK 2023 - Proceedings: 8th International Conference on Computer Science and Engineering
SP - 526
EP - 531
BT - UBMK 2023 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 8th International Conference on Computer Science and Engineering, UBMK 2023
Y2 - 13 September 2023 through 15 September 2023
ER -