A noise-aware feature selection approach for classification

Sabzekar M., AYDIN Z.

SOFT COMPUTING, vol.25, no.8, pp.6391-6400, 2021 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 25 Issue: 8
  • Publication Date: 2021
  • Doi Number: 10.1007/s00500-021-05630-7
  • Journal Name: SOFT COMPUTING
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, Applied Science & Technology Source, Compendex, Computer & Applied Sciences, INSPEC, zbMATH
  • Page Numbers: pp.6391-6400
  • Keywords: Feature selection, Importance degree, Noisy data, Sequential backward search
  • Abdullah Gül University Affiliated: Yes


This paper utilizes a noise-aware version of support vector machines (SVM) for feature selection. Combining this method with sequential backward search (SBS), we propose a new algorithm for removing irrelevant features. Although SVM-based feature selection methods in the literature have produced acceptable results, noisy samples and outliers may degrade the performance of the SVM and, consequently, of the feature selection method. Recently, we proposed the relaxed-constraints SVM (RSVM), which handles noisy data and outliers. Each training sample in RSVM is assigned a degree of importance computed with the fuzzy c-means clustering method, so noisy samples and outliers receive lower importance degrees. Moreover, RSVM has more relaxed constraints, which further reduce the effect of noisy samples. Feature selection increases the accuracy of many machine learning applications by eliminating noisy and irrelevant features. In the proposed RSVM-SBS feature selection algorithm, noisy data therefore have little influence on which irrelevant features are eliminated. Experimental results on real-world data verify that RSVM-SBS outperforms other SVM-based feature selection approaches.
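The combination described above can be sketched as follows. This is a minimal illustration, not the authors' RSVM: the classifier here is a hypothetical weighted nearest-centroid scorer standing in for RSVM, and the per-sample weights `w` play the role of the importance degrees (in the paper these come from fuzzy c-means). SBS greedily removes, at each step, the feature whose elimination hurts the weighted score least, so low-weight noisy samples contribute little to the removal decisions.

```python
# Sketch of SBS feature selection with sample importance weights.
# The evaluator is a weighted nearest-centroid classifier, a hypothetical
# stand-in for RSVM; weights w approximate the fuzzy c-means importance
# degrees described in the paper (noisy samples get low weight).

def weighted_centroid_score(X, y, w, feats):
    """Weighted training accuracy of a nearest-centroid classifier
    restricted to the feature indices in `feats`."""
    classes = sorted(set(y))
    cent = {}
    for c in classes:
        idx = [i for i, yi in enumerate(y) if yi == c]
        tot = sum(w[i] for i in idx)
        # Importance-weighted centroid of class c over the selected features.
        cent[c] = [sum(w[i] * X[i][f] for i in idx) / tot for f in feats]
    correct = 0.0
    for i, xi in enumerate(X):
        pred = min(classes, key=lambda c: sum(
            (xi[f] - cf) ** 2 for f, cf in zip(feats, cent[c])))
        if pred == y[i]:
            correct += w[i]
    return correct / sum(w)

def sbs(X, y, w, n_keep):
    """Sequential backward search: start from all features and greedily
    drop the one whose removal least decreases the weighted score."""
    feats = list(range(len(X[0])))
    while len(feats) > n_keep:
        drop = max(feats, key=lambda f: weighted_centroid_score(
            X, y, w, [g for g in feats if g != f]))
        feats.remove(drop)
    return feats

# Toy data: feature 0 separates the classes, feature 1 is irrelevant,
# and the last sample is mislabeled noise given a low importance degree.
X = [[0.0, 5.0], [0.1, 1.0], [1.0, 4.0], [0.9, 0.0], [1.0, 3.0]]
y = [0, 0, 1, 1, 0]
w = [1.0, 1.0, 1.0, 1.0, 0.1]
print(sbs(X, y, w, 1))  # keeps the informative feature: [0]
```

Because the mislabeled sample carries a weight of only 0.1, it barely affects the class centroids or the score, so the search still discards the irrelevant feature; with uniform weights a noisy point would pull the decision toward whichever feature happens to fit it.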