Binary Multi-Verse Optimization (BMVO) Approaches for Feature Selection

Authors

Hans, R. and Kaur, H.

DOI:

https://doi.org/10.9781/ijimai.2019.07.004

Keywords:

Machine Learning, Feature Selection, K-Nearest Neighbors, Binary Multi-Verse Optimization

Abstract

Multi-Verse Optimization (MVO) is one of the most recent meta-heuristic optimization algorithms; it imitates the multi-verse theory in physics and models the interaction among universes. In problem domains such as feature selection, the solutions are constrained to binary values, viz. 0 and 1. In this regard, this paper proposes binary versions of the MVO algorithm with two prime aims: first, to remove redundant and irrelevant features from the dataset, and second, to achieve better classification accuracy. The proposed binary versions use transformation functions to map the continuous MVO algorithm to its binary counterparts. For the experiments, 21 diverse datasets have been used to compare the Binary MVO (BMVO) approaches with binary versions of existing meta-heuristic algorithms. The results show that the proposed BMVO approaches outperform the compared algorithms in terms of both the number of selected features and classification accuracy.
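The core mechanism described in the abstract, mapping a continuous MVO universe to a binary feature mask and scoring that mask with a wrapper classifier, can be illustrated as follows. This is a minimal sketch, not the authors' implementation: it assumes the sigmoid (S-shaped) transfer function and the error-versus-subset-size wrapper fitness with a K-Nearest Neighbors classifier that are common in binary metaheuristic feature selection; the helper names (s_shaped_transfer, binarize, wrapper_fitness) and the alpha weight are illustrative.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def s_shaped_transfer(x):
    # Sigmoid maps each continuous position component to a probability in (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def binarize(universe, rng):
    # Convert a continuous MVO "universe" into a binary feature mask (1 = feature kept).
    probs = s_shaped_transfer(universe)
    return (rng.random(universe.shape) < probs).astype(int)

def wrapper_fitness(mask, X, y, alpha=0.99):
    # Assumed wrapper fitness: weighted sum of KNN classification error and subset-size ratio.
    if mask.sum() == 0:
        return 1.0                                # penalize empty feature subsets
    knn = KNeighborsClassifier(n_neighbors=5)
    acc = cross_val_score(knn, X[:, mask.astype(bool)], y, cv=5).mean()
    return alpha * (1.0 - acc) + (1.0 - alpha) * mask.sum() / mask.size

# Toy usage on synthetic data with 10 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = (X[:, 0] + X[:, 3] > 0).astype(int)           # only features 0 and 3 are informative
universe = rng.normal(size=10)                    # one continuous candidate solution
mask = binarize(universe, rng)
print(mask, wrapper_fitness(mask, X, y))
```

In a full BMVO run, this binarization and evaluation would be applied to every universe at every iteration, while the continuous positions themselves are updated by MVO's white-hole/black-hole exchange and wormhole operators.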

Published

2020-03-01

How to Cite

Hans, R. and Kaur, H. (2020). Binary Multi-Verse Optimization (BMVO) Approaches for Feature Selection. International Journal of Interactive Multimedia and Artificial Intelligence, 6(1), 91–106. https://doi.org/10.9781/ijimai.2019.07.004