International Journal of Intelligent Systems and Applications (IJISA)

ISSN: 2074-904X (Print), ISSN: 2074-9058 (Online)

Published By: MECS Press

IJISA Vol.11, No.1, Jan. 2019

Ensemble Feature Selection Algorithm

Full Text (PDF, 773KB), PP.24-31




Index Terms

Feature selection; ensemble; library; benchmark; datasets; subset; model; algorithm


In this paper, we propose a new feature selection algorithm based on ensemble selection. To generate the library of models, each model is trained using a single feature, so each model in the library represents one feature. Ensemble construction then returns a well-performing subset of models and, with it, the well-performing subset of features associated with those models. The proposed approaches are evaluated on eight benchmark datasets, and the results demonstrate their effectiveness.
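The procedure the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the dataset, the choice of shallow decision trees as base models, the validation split, and the greedy (Caruana-style) forward-selection loop are all illustrative assumptions.

```python
# Sketch of ensemble-selection-based feature selection: build a library with
# one model per feature, then greedily select models on validation accuracy;
# the indices of the selected models are the selected features.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def ensemble_feature_selection(X, y, n_select=3, seed=0):
    X_tr, X_val, y_tr, y_val = train_test_split(
        X, y, test_size=0.3, random_state=seed)

    # Library of models: each model is trained on exactly one feature.
    library = []
    for j in range(X.shape[1]):
        m = DecisionTreeClassifier(max_depth=2, random_state=seed)
        m.fit(X_tr[:, [j]], y_tr)
        library.append((j, m.predict(X_val[:, [j]])))

    # Greedy forward ensemble selection: at each step, add the model whose
    # inclusion maximizes the majority-vote accuracy on the validation set.
    selected = []
    vote_sum = np.zeros(len(y_val))
    for _ in range(n_select):
        best_j, best_acc, best_pred = None, -1.0, None
        for j, pred in library:
            if j in selected:
                continue
            vote = (vote_sum + pred) / (len(selected) + 1)
            acc = accuracy_score(y_val, (vote >= 0.5).astype(int))
            if acc > best_acc:
                best_j, best_acc, best_pred = j, acc, pred
        selected.append(best_j)
        vote_sum += best_pred
    return selected

# Toy data: only features 0 and 1 carry signal; features 2 and 3 are noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 4))
y = ((X[:, 0] + X[:, 1]) > 0).astype(int)
chosen = ensemble_feature_selection(X, y, n_select=2)
print(chosen)
```

On such data the greedy loop tends to pick the informative features, since the single-feature models built on noise columns perform near chance on the validation split.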

Cite This Paper

Yassine AKHIAT, Mohamed CHAHHOU, Ahmed ZINEDINE, "Ensemble Feature Selection Algorithm", International Journal of Intelligent Systems and Applications (IJISA), Vol.11, No.1, pp.24-31, 2019. DOI: 10.5815/ijisa.2019.01.03

