Effect on model performance of regularization methods


BUDAK C., MENÇİK V., ASKER M. E.

Dicle Üniversitesi Mühendislik Fakültesi Mühendislik Dergisi, vol. 12, no. 5, pp. 757-765, 2021 (Peer-Reviewed Journal)

  • Publication Type: Article / Full Article
  • Volume: 12 Issue: 5
  • Publication Date: 2021
  • DOI: 10.24012/dumf.1051352
  • Journal Name: Dicle Üniversitesi Mühendislik Fakültesi Mühendislik Dergisi
  • Journal Indexes: TR DİZİN (ULAKBİM)
  • Page Numbers: pp. 757-765
  • Abdullah Gül University Affiliated: Yes

Abstract

Artificial neural networks with numerous parameters are tremendously powerful machine learning systems. Nonetheless, overfitting is a crucial problem in such networks. Maximizing model accuracy and minimizing loss are important for reducing intra-class differences while maintaining sensitivity to those differences. In this study, the effects of overfitting for different model architectures on the Wine dataset were investigated using Dropout, AlphaDropout, GaussianDropout, batch normalization, layer normalization, activity regularization, and L1 and L2 regularization, as well as the change in the loss function when these methods were combined. Combinations that performed well were examined on different datasets using the same model. The binary cross-entropy loss function was used as the performance measurement metric. According to the results, the combination of layer normalization and activity regularization showed better training and testing performance than the other combinations.
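The abstract does not reproduce the authors' code. The sketch below is a minimal illustration, assuming TensorFlow/Keras and scikit-learn, of the best-performing combination the abstract reports (layer normalization with activity regularization) on a binarized Wine task trained with binary cross-entropy. The layer sizes, the L2 factor, and the train/test split are illustrative assumptions, not values taken from the paper.

```python
# Illustrative sketch only: layer normalization + L2 activity
# regularization on a binarized Wine classification task.
# Architecture and hyperparameters are assumptions, not the paper's.
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
import tensorflow as tf
from tensorflow.keras import layers, regularizers

# Load the Wine dataset and reduce it to a binary task
# (class 0 vs. the rest) so binary cross-entropy applies.
X, y = load_wine(return_X_y=True)
y = (y == 0).astype("float32")
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

# Dense blocks pairing an L2 activity regularizer with
# LayerNormalization, the combination the study found effective.
model = tf.keras.Sequential([
    layers.Input(shape=(X_train.shape[1],)),
    layers.Dense(64, activation="relu",
                 activity_regularizer=regularizers.l2(1e-4)),
    layers.LayerNormalization(),
    layers.Dense(32, activation="relu",
                 activity_regularizer=regularizers.l2(1e-4)),
    layers.LayerNormalization(),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",  # metric used in the paper
              metrics=["accuracy"])
model.fit(X_train, y_train, epochs=50, batch_size=16,
          validation_data=(X_test, y_test), verbose=0)
print(model.evaluate(X_test, y_test, verbose=0))
```

Swapping the regularization components (e.g., Dropout, AlphaDropout, GaussianDropout, batch normalization, or L1 kernel regularizers) into the same model is how a comparison like the study's could be set up, with training and test binary cross-entropy tracked for each combination.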