PROPOSING THE PROCESS OF DEVELOPING A SCALE FOR STUDENTS’ COMPETENCY: A STRUCTURAL EQUATION MODELING APPROACH

Thanh Trung Tạ


Abstract

As teaching shifts from content-based to student-centered approaches, developing and standardizing a student competency assessment framework is an important task for ensuring the quality of education. Although many recent studies have developed frameworks for assessing students' competencies using various methods, few have demonstrated the reliability and validity of those frameworks or tools. Accordingly, this article presents a research design for standardizing a student competency assessment framework that comprises four stages: (1) developing a theoretical scale; (2) preliminary qualitative research using the Delphi method; (3) preliminary quantitative research using exploratory factor analysis (EFA); and (4) formal quantitative research using confirmatory factor analysis (CFA). The study further demonstrates that this process can satisfy rigorous standards for testing the reliability and validity of a student assessment framework. The proposed design can help education researchers standardize complex competency frameworks and help teachers select behavioral indicators for assessing student learning outcomes.
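The two statistical stages of the proposed process, EFA in the preliminary quantitative study and CFA in the formal quantitative study, can be prototyped with standard open-source tooling before a full survey administration. The sketch below is illustrative only and is not taken from the article: it simulates responses to six hypothetical Likert-type indicators (sc1-sc3, cm1-cm3) loading on two assumed constructs, computes Cronbach's alpha for each subscale, checks sampling adequacy (KMO) and Bartlett's test, extracts an EFA solution with the third-party factor_analyzer package, and fits a two-factor CFA with the third-party semopy package to obtain fit indices such as CFI and RMSEA.

```python
# Illustrative prototype of stages (3) EFA and (4) CFA of the proposed process.
# Item names, constructs, and the simulated data are hypothetical; real questionnaire
# responses would be used in practice. Requires: numpy, pandas, factor_analyzer, semopy.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo
from semopy import Model, calc_stats

# --- Simulate responses to six hypothetical indicators on two assumed constructs ---
rng = np.random.default_rng(42)
n = 300  # hypothetical pilot sample size
f1, f2 = rng.normal(size=n), rng.normal(size=n)
data = pd.DataFrame({
    # "sc" items: hypothetical self-assessment construct; "cm" items: hypothetical communication construct
    **{f"sc{i}": 0.8 * f1 + rng.normal(scale=0.6, size=n) for i in range(1, 4)},
    **{f"cm{i}": 0.8 * f2 + rng.normal(scale=0.6, size=n) for i in range(1, 4)},
})

# --- Internal consistency (Cronbach's alpha) for each subscale ---
def cronbach_alpha(items: pd.DataFrame) -> float:
    k = items.shape[1]
    return k / (k - 1) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))

print("alpha sc:", round(cronbach_alpha(data[["sc1", "sc2", "sc3"]]), 3))
print("alpha cm:", round(cronbach_alpha(data[["cm1", "cm2", "cm3"]]), 3))

# --- Stage (3): exploratory factor analysis on the pilot data ---
chi2, p = calculate_bartlett_sphericity(data)   # Bartlett's test of sphericity
_, kmo_total = calculate_kmo(data)              # Kaiser-Meyer-Olkin sampling adequacy
print(f"Bartlett p = {p:.4f}, overall KMO = {kmo_total:.3f}")

efa = FactorAnalyzer(n_factors=2, rotation="promax")
efa.fit(data)
print("EFA pattern loadings:\n", pd.DataFrame(efa.loadings_, index=data.columns).round(2))

# --- Stage (4): confirmatory factor analysis on the formal-study data ---
cfa_spec = """
SelfAssessment =~ sc1 + sc2 + sc3
Communication  =~ cm1 + cm2 + cm3
"""
cfa = Model(cfa_spec)
cfa.fit(data)             # the formal study would normally use a new, larger sample
fit = calc_stats(cfa)     # chi-square, CFI, TLI, RMSEA, ...
print(fit[["chi2", "CFI", "TLI", "RMSEA"]].round(3))
```

In actual use, the simulated DataFrame would be replaced with collected questionnaire responses, the number of factors would follow the theoretical scale from stage (1), and decisions would rest on commonly used cutoffs (for example, alpha of at least 0.7, loadings of at least 0.5, CFI of at least 0.9, and RMSEA of at most 0.08) rather than on this toy output.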


 
