Re-exploring the Kayseri Culture Route by Using Deep Learning for Cultural Heritage Image Classification


Kevseroğlu Ö., Kurban R.

AICCONF '24: Cognitive Models and Artificial Intelligence Conference (Association for Computing Machinery, NY, US), İstanbul, Türkiye, 25 May 2024, pp.196-201

  • Publication Type: Conference Paper / Full Text Conference Paper
  • DOI Number: 10.1145/3660853.3660913
  • City of Publication: İstanbul
  • Country of Publication: Türkiye
  • Page Numbers: pp.196-201
  • Abdullah Gül University Affiliated: Yes

Abstract

The categorization of images captured during the documentation of architectural structures is a crucial aspect of preserving cultural heritage in digital form. Dealing with a large volume of images makes this categorization process laborious and time-consuming, often leading to errors. Introducing automatic techniques to aid in sorting would streamline this process, enhancing the efficiency of digital documentation. Proper classification of these images facilitates improved organization and more effective searches using specific terms, thereby aiding in the analysis and interpretation of the heritage asset. This study primarily focuses on applying deep learning techniques, specifically the SqueezeNet convolutional neural network (CNN), to classify images of architectural heritage. The effectiveness of training these networks from scratch versus fine-tuning pre-existing models is examined. In this study, we concentrate on identifying significant elements within images of buildings of architectural heritage significance along the Kayseri Culture Route. Since no suitable datasets for network training were found, a new dataset was created. Transfer learning enables pre-trained convolutional neural networks to be adapted to specific image classification tasks. In the experiments, a classification accuracy of 99.8% was achieved using SqueezeNet, suggesting that the technique can substantially enhance the digital documentation of architectural heritage.
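
To illustrate the transfer-learning approach the abstract describes, below is a minimal sketch (not the authors' actual code) of fine-tuning an ImageNet-pre-trained SqueezeNet on a custom image dataset with PyTorch and torchvision. The dataset path, class count, and hyperparameters are hypothetical placeholders chosen for illustration.

    # Minimal transfer-learning sketch (assumed, not from the paper):
    # fine-tune a pre-trained SqueezeNet on a custom heritage-image dataset.
    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader
    from torchvision import datasets, models, transforms

    NUM_CLASSES = 10  # hypothetical number of heritage element categories

    # Standard ImageNet preprocessing expected by the pre-trained weights
    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    # Hypothetical folder layout: one subdirectory per class
    train_set = datasets.ImageFolder("data/kayseri_heritage/train", preprocess)
    train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

    # Load SqueezeNet pre-trained on ImageNet and replace its classifier head
    # (a 1x1 convolution) to match the new number of classes
    model = models.squeezenet1_1(weights=models.SqueezeNet1_1_Weights.DEFAULT)
    model.classifier[1] = nn.Conv2d(512, NUM_CLASSES, kernel_size=1)
    model.num_classes = NUM_CLASSES

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    model.to(device)

    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    # Short fine-tuning loop; all layers are updated (full fine-tuning)
    for epoch in range(5):
        model.train()
        for images, labels in train_loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
        print(f"epoch {epoch + 1}: loss {loss.item():.4f}")

Replacing only the final 1x1 convolution adapts the classifier head to the new classes while reusing the pre-trained feature extractor, which is the usual torchvision recipe for SqueezeNet transfer learning; whether the paper froze the backbone or fine-tuned all layers is an assumption here.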