Variable selection in classical statistical procedures such as predictive discriminant analysis (PDA) not only identifies the key predictor variables that best separate the groups but also improves prediction or classification accuracy. Because the dependent variable in PDA is categorical, the technique lends itself to a wide range of prediction problems in higher education. However, the predictive validity of a predictive discriminant function (PDF) in the context of academic prediction is best evaluated by the relevance of the selected key predictors to the construct underlying the PDF, judged against what the PDF solution is intended to measure. Existing variable selection procedures used in PDA do not provide for this, so the final key predictors obtained by PDA in academic prediction often lack a basic quality desired in a criterion measure: relevance. A simple approach is proposed for validating the relevance of the key predictors to the construct underlying the PDF relative to what the PDF solution is intended to measure. The approach is based on a modified splitting of the historical sample and profiling of the selected key predictors. When applied to four training samples drawn from the same population, the description of the final selected subset of key predictors was relevant to what the PDF solution is intended to measure.
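The abstract does not specify the procedure in code; the following is a minimal Python sketch, under stated assumptions, of the general workflow it describes: PDA-style variable selection repeated over several training samples drawn from one historical sample, followed by profiling of the predictors selected in each sample. The simulated data, the stratified splitting, and scikit-learn's SequentialFeatureSelector are illustrative stand-ins for the paper's modified splitting and selection procedure, not the authors' method.

```python
# Minimal sketch (illustrative assumptions, not the authors' method):
# repeat discriminant-analysis variable selection on several training
# samples from one historical sample, then profile the selected predictors.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import StratifiedShuffleSplit

# Hypothetical historical sample: 8 candidate predictors, categorical outcome.
X, y = make_classification(n_samples=400, n_features=8, n_informative=3,
                           n_redundant=2, n_classes=2, random_state=0)
predictors = [f"x{j}" for j in range(X.shape[1])]

# Four training samples from the same population (repeated stratified splits
# stand in here for the paper's modified splitting of the historical sample).
splitter = StratifiedShuffleSplit(n_splits=4, train_size=0.7, random_state=1)

profiles = []
for train_idx, _ in splitter.split(X, y):
    lda = LinearDiscriminantAnalysis()
    selector = SequentialFeatureSelector(lda, n_features_to_select=3,
                                         direction="forward", cv=5)
    selector.fit(X[train_idx], y[train_idx])
    chosen = [p for p, keep in zip(predictors, selector.get_support()) if keep]
    profiles.append(chosen)

# Profile the selected key predictors: variables chosen in every training
# sample are the candidates whose relevance can then be judged against what
# the PDF solution is intended to measure.
for k, chosen in enumerate(profiles, start=1):
    print(f"training sample {k}: {chosen}")
consistent = set.intersection(*map(set, profiles))
print("selected in all four samples:", sorted(consistent))
```

In this sketch the "profile" is simply the list of predictors chosen in each training sample and their intersection; the paper's own profiling step would additionally compare the selected subset's description with the construct the PDF solution is intended to measure.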