2022
[1] "A Systematic Review of Ensemble Techniques for Software Defect and Change Prediction", e-Informatica Software Engineering Journal, vol. 16, no. 1, p. 220105, 2022.
DOI: 10.37190/e-Inf220105
Authors
Megha Khanna
Abstract
Background: The use of ensemble techniques has steadily gained popularity in several software quality assurance activities. These aggregated classifiers have proven superior to their constituent base models. Though ensemble techniques have been widely used in key areas such as Software Defect Prediction (SDP) and Software Change Prediction (SCP), the current state of the art concerning the use of these techniques warrants scrutiny.
Aim: The study aims to assess, evaluate and uncover possible research gaps with respect to the use of ensemble techniques in SDP and SCP.
Method: This study conducts an extensive literature review of 77 primary studies on the basis of the category, application, rules of formulation, performance, and possible threats of the proposed/utilized ensemble techniques.
Results: Ensemble techniques were primarily categorized on the basis of the similarity, aggregation, relationship, diversity, and dependency of their base models. They were also found effective in several applications, such as serving as the learning algorithm for developing SDP/SCP models and addressing the class imbalance issue (a minimal illustrative sketch follows the abstract).
Conclusion: The results of the review ascertain the need for more studies that propose, assess, validate, and compare various categories of ensemble techniques for diverse applications in SDP/SCP, such as transfer learning and online learning.
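The snippet below is a minimal, self-contained sketch (not taken from the paper or any of the reviewed studies) of two common homogeneous ensemble categories discussed in the review, bagging and boosting, used as learning algorithms for a defect-prediction task. The synthetic imbalanced dataset, the scikit-learn estimators, and all parameter values are assumptions chosen purely for illustration.

```python
# Minimal sketch (illustrative only): bagging and boosting ensembles
# trained on a synthetic, imbalanced stand-in for a defect dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic module-level data: the defective class is the minority (~10%).
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.9, 0.1], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, test_size=0.3, random_state=42)

# Homogeneous ensembles: bagged decision trees and boosted decision stumps.
models = {
    "bagging": BaggingClassifier(n_estimators=50, random_state=42),
    "boosting": AdaBoostClassifier(n_estimators=50, random_state=42),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    # AUC is reported rather than accuracy because the classes are imbalanced.
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(f"{name} AUC: {auc:.3f}")
```

Resampling-based variants, for example pairing SMOTE with such ensembles, are among the class-imbalance treatments the review surveys.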
Keywords
Ensemble learning, Software change prediction, Software defect prediction, Software quality, Systematic review
References
1. N.E. Fenton and N. Ohlsson. Quantitative analysis of faults and failures in a complex software system. IEEE Transactions on Software Engineering , Volume 26(8):797–814, Aug 2000.
2. A.G. Koru and J. Tian. Comparing high-change modules and modules with the highest measurement values in two large-scale open-source products. IEEE Transactions on Software Engineering , Volume 31(8):625–642, Aug 2005.
3. S. Lessmann, B. Baesens, C. Mues and S. Pietsch. Benchmarking classification models for software defect prediction: A proposed framework and novel findings. IEEE Transactions on Software Engineering , Volume 34(4):485–496, May 2008.
4. N. Seliya, T.M. Khoshgoftaar and V.J. Hulse. Predicting faults in high assurance software. In 2010 IEEE 12th International Symposium on High Assurance Systems Engineering , pages 26–34. IEEE, Nov 2010.
5. R. Malhotra and M. Khanna. An exploratory study for software change prediction in object-oriented systems using hybridized techniques. Automated Software Engineering , Volume 24(3):673–717, Sep 2017.
6. A.G. Koru and H. Liu. “Identifying and characterizing change-prone classes in two large-scale open-source products”. Journal of Systems and Software , Volume 80(1):63–73, Jan 2007.
7. D. Romano and M. Pinzger. Using source code metrics to predict change-prone java interfaces. In 2011 27th IEEE International Conference on Software Maintenance (ICSM) , pages 303–312. IEEE, Sep 2011.
8. E. Giger, M. Pinzger and H.C. Gall. Can we predict types of code changes? An empirical analysis. In 2012 9th IEEE Working Conference on Mining Software Repositories (MSR) , IEEE, pages 217–226, June 2012.
9. M.O. Elish and M. Al Khiaty. A suite of metrics for quantifying historical changes to predict future change-prone classes in object-oriented software. Journal of Software: Evolution and Process , Volume 25(5):407–437, May 2013.
10. R. Malhotra. A systematic review of machine learning techniques for software fault prediction. Applied Soft Computing , Volume 27:504–518, Feb 2015.
11. R. S. Wahono. A systematic literature review of software defect prediction: research trends, datasets, methods and frameworks. Journal of Software Engineering , Volume 1(1):1–16, Apr 2015.
12. A. Idri, M. Hosni and A. Abran. Systematic literature review of ensemble effort estimation. Journal of Systems and Software , Volume 118:151–175, Aug 2016.
13. R. Malhotra and M. Khanna. Software change prediction: A systematic review and future guidelines. e-Informatica Software Engineering Journal , Volume 13(1):227–259, 2019.
14. G. Catolino and F. Ferrucci. An extensive evaluation of ensemble techniques for software change prediction. Journal of Software: Evolution and Process , Volume 31(9):e2156, Sep 2019.
15. X. Zhu, Y. He, L. Cheng, X. Jia, and L. Zhu. Software change-proneness prediction through combination of bagging and resampling methods. Journal of Software: Evolution and Process , Volume 30(12):e2111, Oct 2018.
16. L. Rokach. Ensemble methods for classifiers. In Data Mining and Knowledge Discovery Handbook , Springer, Boston, MA, pages 957–980, 2005.
17. L.I. Kuncheva and C.J. Whitaker. Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy. Machine Learning , Volume 51(2):181–207, May 2003.
18. L. Jonsson, M. Borg, D. Broman, K. Sandahl, S. Eldh and P. Runeson. Automated bug assignment: Ensemble-based machine learning in large scale industrial contexts. Empirical Software Engineering , Volume 21(4):1533–1578, Aug 2016.
19. S.S. Rathore and S. Kumar. Linear and non-linear heterogeneous ensemble methods to predict the number of faults in software systems. Knowledge-Based Systems , Volume 119:232–256, Mar 2017.
20. R. Malhotra and M. Khanna. Particle swarm optimization-based ensemble learning for software change prediction. Information and Software Technology , Volume 102:65–84, Oct 2018.
21. M. Re and G. Valentini. Ensemble methods: A review. In Advances in Machine Learning and Data Mining for Astronomy , Chapman & Hall/CRC Data Mining and Knowledge Discovery Series, pages 563–594, 2012.
22. V. Bolón-Canedo and A. Alonso-Betanzos. Ensembles for feature selection: A review and future trends. Information Fusion , Volume 52:1–12, Dec 2019.
23. M. Galar, A. Fernandez, E. Barrenechea, H. Bustince and F. Herrera. A review on ensembles for the class imbalance problem: bagging-, boosting-, and hybrid-based approaches. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews) , Volume 42(4):463–484, Aug 2011.
24. D. Radjenović, M. Heričko, R. Torkar and A. Živkovič. Software fault prediction metrics: A systematic literature review. Information and Software Technology , Volume 55(8):1397–1418, Aug 2013.
25. C. Catal. Software fault prediction: A literature review and current trends. Expert systems with applications , Volume 38(4):4626–4636, Apr 2011.
26. S. Hosseini, B. Turhan and D. Gunarathna. A systematic literature review and meta-analysis on cross project defect prediction. IEEE Transactions on Software Engineering , Volume 45(2):111–147, Nov 2017.
27. R. Malhotra, M. Khanna and R.R. Raje. On the application of search-based techniques for software engineering predictive modeling: A systematic review and future directions. Swarm and Evolutionary Computation , Volume 32:85–109, Feb 2017.
28. R. Malhotra and M. Khanna. Threats to validity in search-based predictive modelling for software engineering. IET Software , Volume 12(4):293–305, June 2018.
29. C. Catal and B. Diri. A systematic review of software fault prediction studies. Expert systems with applications , Volume 36(4):7346–7354, May 2009.
30. T. Hall, S. Beecham, D. Bowes, D. Gray and S. Counsell. A systematic literature review on fault prediction performance in software engineering. IEEE Transactions on Software Engineering , Volume 38(6):1276–1304, Oct 2011.
31. R. Malhotra and A.J. Bansal. Software change prediction: a literature review. International Journal of Computer Applications in Technology , Volume 54(4):240–256, 2016.
32. B.A. Kitchenham, D. Budgen and P. Brereton. Evidence-based software engineering and systematic reviews . Volume 4, CRC Press, 2015.
33. J. Wen, S. Li, Z. Lin, Y. Hu and C. Huang. Systematic literature review of machine learning based software development effort estimation models. Information and Software Technology , Volume 54(1):41–59, Jan 2012.
34. Y. Jiang, B. Cukic and T. Menzies. Fault prediction using early lifecycle data. In 18th IEEE International Symposium on Software Reliability Engineering (ISSRE'07) , pages 237–246. IEEE, Nov 2007.
35. Y. Ma, L. Guo, and B. Cukic. A statistical framework for the prediction of fault-proneness. In Advances in Machine Learning Applications in Software Engineering , pages 237–263. IGI Global, 2007.
36. H. Jia, F. Shu, Y. Yang and Q. Wang. Predicting fault-prone modules: A comparative study. In International Conference on Software Engineering Approaches for Offshore and Outsourced Development , pages 45–59. Springer, Berlin, Heidelberg, July 2009.
37. T.M. Khoshgoftaar, P. Rebours and N. Seliya. Software quality analysis by combining multiple projects and learners. Software Quality Journal , Volume 17(1):25–49, Mar 2009.
38. T. Mende and R. Koschke. Revisiting the evaluation of defect prediction models. In 5th International Conference on Predictor Models in Software Engineering , pages 1–10, May 2009.
39. C. Seiffert, T.M. Khoshgoftaar and J. Van Hulse. Improving software-quality predictions with data sampling and boosting. IEEE Transactions on Systems, Man, and Cybernetics-Part A: Systems and Humans , Volume 39(6):1283–1294, Sep 2009.
40. E. Arisholm, L.C. Briand and E.B. Johannessen. A systematic and comprehensive investigation of methods to build and evaluate fault prediction models. Journal of Systems and Software , Volume 83(1):2–17, Jan 2010.
41. Y. Liu, T.M. Khoshgoftaar and N. Seliya. Evolutionary optimization of software quality modeling with multiple repositories. IEEE Transactions on Software Engineering , Volume 36(6):852–864, May 2010.
42. J. Zheng. Cost-sensitive boosting neural networks for software defect prediction. Expert Systems with Applications , Volume 37(6):4537–4543, Jun 2010.
43. A.T. Mısırlı, A.B. Bener and B. Turhan. An industrial case study of classifier ensembles for locating software defects. Software Quality Journal , Volume 19(3):515–536, Sep 2011.
44. Y. Peng, G. Kou, G. Wang, W. Wu and Y. Shi. Ensemble of software defect predictors: an AHP-based evaluation method. International Journal of Information Technology and Decision Making , Volume 10(01):187–206, Jan 2011.
45. N. Seliya and T.M. Khoshgoftaar. The use of decision trees for cost-sensitive classification: an empirical study in software quality prediction. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery , Volume 1(5):448–459, Sep 2011.
46. K. Gao, T.M. Khoshgoftaar and A. Napolitano. A hybrid approach to coping with high dimensionality and class imbalance for software defect prediction. In 2012 11th international conference on machine learning and applications , IEEE, pages 281–288, Dec 2012.
47. Z. Sun, Q. Song and X. Zhu. Using coding-based ensemble learning to improve software defect prediction. IEEE Transactions on Systems, Man, and Cybernetics-Part C (Applications and Reviews) , Volume 42(6):1806–1817, Dec 2012.
48. S. Wang, L.L. Minku and X. Yao. Online class imbalance learning and its applications in fault detection. International Journal of Computational Intelligence and Applications , Volume 12(4):1340001, Dec 2013.
49. S. Wang and X. Yao. Using class imbalance learning for software defect prediction. IEEE Transactions on Reliability , Volume 62(2):434–443, Apr 2013.
50. A. Kaur and A. Kaur. Performance analysis of ensemble learning for predicting defects in open source software. In 2014 international conference on advances in computing, communications and informatics (ICACCI) , IEEE, pages 219–225, Sep 2014.
51. A. Panichella, R. Oliveto and A. De Lucia. Cross-project defect prediction models: L’union fait la force. In Software Evolution Week-IEEE Conference on Software Maintenance, Reengineering, and Reverse Engineering (CSMR-WCRE) , IEEE, pages 164–173, Feb 2014.
52. D. Rodriguez, I. Herraiz, R. Harrison, J. Dolado and J.C. Riquelme. Preliminary comparison of techniques for dealing with imbalance in software defect prediction. In 18th International Conference on Evaluation and Assessment in Software Engineering , pages 1–10, May 2014.
53. V. Suma, T.P. Pushphavathi and V. Ramaswamy. An approach to predict software project success based on random forest classifier. In Proceedings of the 48th Annual Convention of Computer Society of India-Vol II , Springer, Cham, pages 329–336, 2014.
54. L. Chen, B. Fang, Z. Shang and Y. Tang. Negative samples reduction in cross-company software defects prediction. Information and Software Technology , Volume 62:67–77, Jun 2015.
55. M.O. Elish, H. Aljamaan and I. Ahmad. Three empirical studies on predicting software maintainability using ensemble methods. Soft Computing , Volume 19(9):2511–2524, Sep 2015.
56. S. Hussain, J. Keung, A.A. Khan and K.E. Bennin. Performance evaluation of ensemble methods for software fault prediction: An experiment. In Proceedings of the ASWEC 2015 24th Australasian Software Engineering Conference , pages 91–95, Sep 2015.
57. I.H. Laradji, M. Alshayeb and L. Ghouti. Software defect prediction using ensemble learning on selected features. Information and Software Technology , Volume 58:388–402, Feb 2015.
58. E. Rubinić, G. Mauša and T.G. Grbac. Software defect classification with a variant of NSGA-II and simple voting strategies. In International Symposium on Search Based Software Engineering , Springer, Cham, pages 347–353, Sep 2015.
59. M.J. Siers and M.Z. Islam. Software defect prediction using a cost sensitive decision forest and voting, and a potential solution to the class imbalance problem. Information Systems , Volume 51:62–71, Jul 2015.
60. G. Li and S. Wang. Oversampling boosting for classification of imbalanced software defect data. In 2016 35th Chinese Control Conference (CCC) , IEEE, pages 4149–4154, Jul 2016.
61. R. Malhotra. An empirical framework for defect prediction using machine learning techniques with Android software. Applied Soft Computing , Volume 49:1034–1050, Dec 2016.
62. D. Ryu, O. Choi and J. Baik. Value-cognitive boosting with a support vector machine for cross-project defect prediction. Empirical Software Engineering , Volume 21(1):43–71, Feb 2016.
63. J. Petrić, D. Bowes, T. Hall, B. Christianson and N. Baddoo. Building an ensemble for software defect prediction based on diversity selection. In Proceedings of the 10th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement , pages 1–10, Sep 2016.
64. T. Wang, Z. Zhang, X. Jing and L. Zhang. Multiple kernel ensemble learning for software defect prediction. Automated Software Engineering , Volume 23(4):569–590, Dec 2016.
65. T. Wang, Z. Zhang, X. Jing and Y. Liu. Non-negative sparse-based SemiBoost for software defect prediction. Software Testing, Verification and Reliability , Volume 26(7):498–515, Nov 2016.
66. X. Xia, D. Lo, S.J. Pan, N. Nagappan and X. Wang. Hydra: Massively compositional model for cross-project defect prediction. IEEE Transactions on software Engineering , Volume 42(10):977–998, Nov 2016.
67. H. Alsawalqah, H. Faris, I. Aljarah, L. Alnemer and N. Alhindawi. Hybrid SMOTE-ensemble approach for software defect prediction. In Computer Science On-line Conference , Springer, Cham, pages 355–366, Apr 2017.
68. D. Di Nucci, F. Palomba, R. Oliveto and A. De Lucia. Dynamic selection of classifiers in bug prediction: An adaptive method. IEEE Transactions on Emerging Topics in Computational Intelligence , Volume 1(3):202–212, May 2017.
69. L. Kumar, S. Misra and S.K. Rath. An empirical analysis of the effectiveness of software metrics and fault prediction model for identifying faulty classes. Computer Standards and Interfaces , Volume 53:1–32, Aug 2017.
70. R. Malhotra and M. Khanna. An empirical study for software change prediction using imbalanced data. Empirical Software Engineering , Volume 22(6):2806–2851, Dec 2017.
71. D. Ryu, J.I. Jang and J. Baik. A transfer cost-sensitive boosting approach for cross-project defect prediction. Software Quality Journal , Volume 25(1):235–272, Mar 2017.
72. C.W. Yohannese, T. Li, M. Simfukwe and F. Khurshid. Ensembles based combined learning for improved software fault prediction: A comparative study. In 2017 12th International Conference on Intelligent Systems and Knowledge Engineering (ISKE) , IEEE, pages 1–6, Nov 2017.
73. A. Agrawal and R.K. Singh. Empirical validation of OO metrics and machine learning algorithms for software change proneness prediction. In Towards Extensible and Adaptable Methods in Computing , Springer, Singapore, pages 69–84, 2018.
74. D. Bowes, T. Hall and J. Petrić. Software defect prediction: do different classifiers find the same defects?. Software Quality Journal , Volume 26(2):525–552, Jun 2018.
75. L. Chen, B. Fang, Z. Shang and Y. Tang. Tackling class overlap and imbalance problems in software defect prediction. Software Quality Journal , Volume 26(1):97–125, Jun 2018.
76. S.A. El-Shorbagy, W.M. El-Gammal and W.M. Abdelmoez. Using SMOTE and heterogeneous stacking in ensemble learning for software defect prediction. In Proceedings of the 7th International Conference on Software and Information Engineering , ACM, pages 44–47, May 2018.
77. R. Malhotra and A. Bansal. Investigation of various data analysis techniques to identify change prone parts of an open source software. International Journal of System Assurance Engineering and Management , Volume 9(2):401–426, Apr 2018.
78. R. Mousavi, M. Eftekhari and F. Rahdari. Omni-ensemble learning (OEL): utilizing over-bagging, static and dynamic ensemble selection approaches for software defect prediction. International Journal on Artificial Intelligence Tools , Volume 27(6):1850024, Sep 2018.
79. S. Moustafa, M.Y. ElNainay, N. El Makky and M.S. Abougabal. Software bug prediction using weighted majority voting techniques. Alexandria engineering journal , Volume 57(4):2763–2774, Dec 2018.
80. H. Tong, B. Liu and S. Wang. Software defect prediction using stacked denoising autoencoders and two-stage ensemble learning. Information and Software Technology , Volume 96:94–111, Apr 2018.
81. Y. Zhang, D. Lo, X. Xia and J. Sun. Combined classifier for cross-project defect prediction: an extended empirical study. Frontiers of Computer Science , Volume 12(2):280–296, 2018.
82. A. Ali, M. Abu-Tair, J. Noppen, S. McClean, Z. Lin and I. McChesney. Contributing features-based schemes for software defect prediction. In International Conference on Innovative Techniques and Applications of Artificial Intelligence , Springer, Cham, pages 350–361, Dec 2019.
83. J.R. Campos, E. Costa and M. Vieira. Improving failure prediction by ensembling the decisions of machine learning models: A case study. IEEE Access , Volume 7:177661–177674, Dec 2019.
84. L. Gong, S. Jiang and L. Jiang. An improved transfer adaptive boosting approach for mixed-project defect prediction. Journal of Software: Evolution and Process , Volume 31(10):e2172, Oct 2019.
85. H. He, X. Zhang, Q. Wang, J. Ren, J. Liu, X. Zhao and Y. Cheng. Ensemble MultiBoost based on RIPPER classifier for prediction of imbalanced software defect data. IEEE Access , Volume 7:110333–110343, Aug 2019.
86. L. Kumar, S. Lal, A. Goyal and N.B. Murthy. Change-proneness of object-oriented software using combination of feature selection techniques and ensemble learning techniques. In Proceedings of the 12th Innovations on Software Engineering Conference (formerly known as India Software Engineering Conference) , pages 350–361, Feb 2019.
87. Z. Li, X.Y. Jing, X. Zhu, H. Zhang, B. Xu and S. Ying. Heterogeneous defect prediction with two-stage ensemble learning. Automated Software Engineering , Volume 26(3):599–651, 2019.
88. R. Li, L. Zhou, S. Zhang, H. Liu, X. Huang and Z. Sun. Software defect prediction based on ensemble learning. In Proceedings of the 2019 2nd International Conference on Data Science and Information Technology , pages 1–6, Jul 2019.
89. R. Malhotra and S. Kamal. An empirical study to investigate oversampling methods for improving software defect prediction using imbalanced data. Neurocomputing , Volume 343:120–140, May 2019.
90. R. Malhotra and M. Khanna. Dynamic selection of fitness function for software change prediction using particle swarm optimization. Information and Software Technology , Volume 112:51–67, Aug 2019.
91. H. Tong, B. Liu and S. Wang. Kernel spectral embedding transfer ensemble for heterogeneous defect prediction. IEEE Transactions on Software Engineering , Volume 14(8):1–21, Sep 2019.
92. H.D. Tran, L.T.M. Hanh and N.T. Binh. Combining feature selection, feature learning and ensemble learning for software fault prediction. In 2019 11th International Conference on Knowledge and Systems Engineering (KSE) , pages 1–8, Oct 2019.
93. T. Zhou, X. Sun, X. Xia, B. Li and X. Chen. Improving defect prediction with deep forest. Information and Software Technology , Volume 114:204–216, Oct 2019.
94. R. Abbas, F.A. Albalooshi and M. Hammad. Software Change Proneness Prediction Using Machine Learning. In 2020 International Conference on Innovation and Intelligence for Informatics, Computing and Technologies (3ICT) , IEEE, pages 1–7, Dec 2020.
95. H. Aljamaan and A. Alazba. Software defect prediction using tree-based ensembles. In Proceedings of the 16th ACM International Conference on Predictive Models and Data Analytics in Software Engineering , ACM, pages 1–10, Nov 2020.
96. A.A. Ansari, A. Iqbal and B. Sahoo. Heterogeneous Defect Prediction Using Ensemble Learning Technique. In Artificial Intelligence and Evolutionary Computations in Engineering Systems , Springer, Singapore, pages 283–293, 2020.
97. M. Banga and A. Bansal. Proposed software faults detection using hybrid approach. Security and Privacy , p.e103, Jan 2020.
98. E. Elahi, S. Kanwal and A.N. Asif. A new Ensemble approach for Software Fault Prediction. In 2020 17th International Bhurban Conference on Applied Sciences and Technology (IBCAST) , IEEE, pages 407–412, Jan 2020.
99. L. Goel, M. Sharma, S.K. Khatri and D. Damodaran. Defect Prediction of Cross Projects Using PCA and Ensemble Learning Approach. In Micro-Electronics and Telecommunication Engineering , Springer, Singapore, pages 307–315, 2020.
100. T.T. Khuat and M.H. Le. Evaluation of Sampling-Based Ensembles of Classifiers on Imbalanced Data for Software Defect Prediction Problems. SN Computer Science , Volume 1(2):1–16, Mar 2020.
101. R. Malhotra and J. Jain. Handling imbalanced data using ensemble learning in software defect prediction. In 2020 10th International Conference on Cloud Computing, Data Science & Engineering (Confluence) , IEEE, pages 300–304, Jan 2020.
102. S.K. Pandey, R.B. Mishra and A.K. Tripathi. BPDET: An effective software bug prediction model using deep representation and ensemble learning techniques. Expert Systems with Applications , Volume 144:113085, Apr 2020.
103. S.S. Rathore and S. Kumar. An empirical study of ensemble techniques for software fault prediction. Applied Intelligence , Volume 51(6):3615–3644, Jun 2021.
104. A.A. Saifan and L. Abu-wardih. Software Defect Prediction Based on Feature Subset Selection and Ensemble Classification. ECTI Transactions on Computer and Information Technology (ECTI-CIT) , Volume 14(2):213–228, Oct 2020.
105. F. Yucalar, A. Ozcift, E. Borandag and D. Kilinc. Multiple-classifiers in software quality engineering: Combining predictors to improve software fault prediction ability. Engineering Science and Technology, an International Journal , Volume 23(4):938–950, Aug 2020.
106. L. Rokach. Taxonomy for characterizing ensemble methods in classification tasks: A review and annotated bibliography. Computational statistics & data analysis , Volume 53(12):4046–4072, Oct 2009.
107. G. Brown. Ensemble Learning. Encyclopedia of Machine Learning , 312, 2010.
108. L. Rokach and O. Sagi. Ensemble learning: A survey. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery , Volume 8(4):e1249, 2018.
109. A.J. Sharkey. Types of multinet system. In International Workshop on Multiple Classifier Systems , Springer, Berlin, Heidelberg, pages 108–117, June 2002.
110. H. He and E.A. Garcia. Learning from imbalanced data. IEEE Transactions on knowledge and data engineering , Volume 21(9):1263–1284, Jun 2009.
111. M. Tan, L. Tan, S. Dara and C. Mayeux. Online defect prediction for imbalanced data. In 2015 IEEE/ACM 37th IEEE International Conference on Software Engineering , IEEE, vol. 2, pages 99–108, May 2015.
112. T. Fawcett. An introduction to ROC analysis. Pattern recognition letters , Volume 27(8):861–874, Jun 2006.
113. T. Menzies, A. Dekhtyar, J. Distefano and J. Greenwald. Problems with precision: A response to "Comments on 'Data mining static code attributes to learn defect predictors'". IEEE Transactions on Software Engineering , Volume 33(9):637–640, Aug 2007.
114. J. Derrac, S. García, D. Molina and F. Herrera. A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm and Evolutionary Computation , Volume 1(1):3–18, Mar 2011.
115. T.G. Dietterich. Ensemble methods in machine learning. In International workshop on multiple classifier systems , Springer, Berlin, Heidelberg, pages 1–15, June 2000.
116. E. Frank, M.A. Hall and I.H. Witten. The WEKA Workbench. Online Appendix for “Data Mining: Practical Machine Learning Tools and Techniques”. Morgan Kaufmann, Fourth Edition, 2016.
117. T.D. Cook, D.T. Campbell and A. Day. Quasi-experimentation: Design & analysis issues for field settings . Vol. 351, Boston: Houghton Mifflin, 1979.
118. W. Fu, V. Nair and T. Menzies. Why is differential evolution better than grid search for tuning defect predictors?. arXiv preprint arXiv:1609.02613. , 2016.
119. C. Tantithamthavorn, S. McIntosh, A.E. Hassan and K. Matsumoto. The impact of automated parameter optimization on defect prediction models. IEEE Transactions on Software Engineering , Volume 45(7):683–711, Jan 2018.
120. S. Omri and C. Sinz. Deep Learning for Software Defect Prediction: A Survey. In Proceedings of the IEEE/ACM 42nd International Conference on Software Engineering Workshops , pages 209–214, June 2020.
121. E.N. Akimova, A.Y. Bersenev, A.A. Deikov, K.S. Kobylkin, A.V. Konygin, I.P. Mezentsev and V.E. Misilov. A Survey on Software Defect Prediction Using Deep Learning. Mathematics , Volume 9(11):1180, Jan 2021.