eTNT: Enhanced TextNetTopics with Filtered LDA Topics and Sequential Forward / Backward Topic Scoring Approaches


Voskergian D., Jayousi R., Bakir-Gungor B.

International Journal of Advanced Computer Science and Applications, vol.15, no.7, pp.1135-1144, 2024 (ESCI)

  • Publication Type: Article
  • Volume: 15 Issue: 7
  • Publication Date: 2024
  • Doi Number: 10.14569/ijacsa.2024.01507110
  • Journal Name: International Journal of Advanced Computer Science and Applications
  • Journal Indexes: Emerging Sources Citation Index (ESCI), Scopus, Compendex, Index Islamicus, INSPEC
  • Page Numbers: pp.1135-1144
  • Keywords: machine learning, text classification, topic modeling, topic scoring
  • Abdullah Gül University Affiliated: Yes

Abstract

TextNetTopics is a novel topic-modeling-based approach to text classification that selects topics, rather than individual words, as features for training a machine learning algorithm. However, one key limitation of TextNetTopics is its scoring component, which evaluates each topic in isolation and ranks topics accordingly, ignoring potential relationships between them. In addition, the chosen topics may contain redundant or irrelevant features, inflating the feature set and introducing noise that can degrade overall model performance. To address these limitations and improve classification performance, this study introduces eTNT, an enhancement of TextNetTopics. eTNT integrates two novel scoring approaches, Sequential Forward Topic Scoring (SFTS) and Sequential Backward Topic Scoring (SBTS), which capture topic interactions by assessing sets of topics jointly rather than one topic at a time. Moreover, it incorporates a filtering component that improves topic quality and discriminative power by removing non-informative features from each topic using Random Forest feature importance values. Together, these components streamline topic selection and improve classifier efficiency for text classification. Results on the WOS-5736, LitCovid, and MultiLabel datasets demonstrate the superior effectiveness of eTNT compared to its counterpart, TextNetTopics.
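The abstract describes SFTS as a greedy procedure that scores sets of topics jointly instead of ranking each topic in isolation. As a rough illustration of that idea (not the authors' implementation), the sketch below performs sequential forward selection over named topics with a pluggable set-level scoring function; the topic names, the toy coverage-based scorer, and the size penalty are all hypothetical stand-ins for a real classifier-based score. SBTS would run the mirror image: start from the full topic set and greedily drop the topic whose removal helps (or hurts least).

```python
def sfts(topic_names, score, max_topics=None):
    """Greedy Sequential Forward Topic Scoring (sketch).

    Start from an empty set and repeatedly add the topic whose inclusion
    most improves the joint score of the selected set; stop when no
    remaining topic improves it (or when max_topics is reached).
    """
    selected = []
    remaining = list(topic_names)
    best_score = float("-inf")
    while remaining and (max_topics is None or len(selected) < max_topics):
        candidate, cand_score = max(
            ((t, score(selected + [t])) for t in remaining),
            key=lambda pair: pair[1],
        )
        if cand_score <= best_score:
            break  # adding any further topic would not improve the set
        selected.append(candidate)
        remaining.remove(candidate)
        best_score = cand_score
    return selected, best_score


# Toy example: hypothetical topics (word sets) and a hypothetical scorer
# that rewards coverage of "discriminative" words and penalizes set size.
topics = {
    "T1": {"virus", "vaccine"},
    "T2": {"vaccine", "dose"},
    "T3": {"stock", "pizza"},   # irrelevant topic: contributes nothing
}
informative = {"virus", "vaccine", "dose"}

def coverage_score(names):
    covered = set().union(*(topics[n] for n in names))
    return len(covered & informative) - 0.1 * len(names)

selected, best = sfts(list(topics), coverage_score)
# T1 and T2 together cover all informative words; T3 is never added.
```

In eTNT the set-level score would come from cross-validating a classifier on the union of the selected topics' features, which is far costlier per evaluation than this toy scorer but follows the same greedy loop.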