Designing an Internal Evaluation Analysis Software for Basic Sciences Educational Groups

authors:

Mohammad Rezaei 1, Mansour Rezaei 2, *, Lida Memar Eftekhari 3

1 Department of Medical Physics and Medical Engineering, Faculty of Medicine, Kermanshah University of Medical Sciences, Kermanshah, Iran
2 Biostatistics Department, Social Development and Health Promotion Research Center, Kermanshah University of Medical Sciences, Kermanshah, Iran
3 Secretary of the Education Research Committee, Center for Studies and Development of Medical Science Education (EDC), Kermanshah University of Medical Sciences, Kermanshah, Iran

how to cite: Rezaei M, Rezaei M, Memar Eftekhari L. Designing an Internal Evaluation Analysis Software for Basic Sciences Educational Groups. Educ Res Med Sci. 2022;11(2):e133936. https://doi.org/10.5812/erms-133936.

Abstract

Background:

Internal evaluations (IEs) are associated with various obstacles, including their time-consuming nature, high study costs, and extensive data analysis, which deter some educational groups (EGs) from conducting them. Therefore, it is essential to provide a suitable solution to overcome these obstacles.

Objectives:

This study aimed to design and use Internal Evaluation Analysis Software for basic science educational groups (IESEG) at Kermanshah University of Medical Sciences (KUMS).

Methods:

This study attempted to implement all the procedures necessary to conduct the IE of basic sciences EGs as a dedicated software program. For this purpose, after preparing the necessary standard procedures, the MATLAB programming environment and its graphical interface were used for implementation. The designed software, IESEG, was evaluated by comparing the results obtained for several EGs using the traditional (manual) approach and the new software approach.

Results:

The results showed that, compared to the manual method, calculation errors were reduced to zero when the IE process was performed with the software. In addition, the designed software could draw the graphs and related tables needed to display the evaluation results.

Conclusions:

The use of the designed software, IESEG, creates a systematic, uniform, and regular method of conducting IE. In addition, speeding up the IE and providing an accurate analysis are benefits that increase the capabilities of the internal evaluation committees of EGs.

1. Background

Higher education is essential in training specialized human resources, producing knowledge, and providing technical services (1, 2). On the other hand, challenges such as the rapid development of higher education institutions, responding to social demand, the development of information technology, and the knowledge-oriented economy have focused attention on the quality of higher education institutions. Despite limited resources, maximum efficiency and effectiveness should be realized (3-7). These challenges require the university system to be responsible and accountable, rethinking its structure, mission, goals, functions, and processes (8-10). Therefore, a quality system based on scientific methods is an inevitable necessity.

Consequently, quality assessment of educational processes is also emphasized (11, 12). These goals can be achieved through internal and external evaluation, and the first step in this field is internal evaluation (IE) by the educational groups (EGs) of universities (13, 14). IE is the collection of appropriate and up-to-date information from faculty members, students, and alumni about the constituent factors of a higher education unit in order to judge its quality and plan improvements to the current situation (15). Hence, as an essential part of the educational evaluation process, program implementation evaluation reveals the weaknesses of a program and provides practical solutions for its success.

National and international experiences indicate that the IE process, especially at the level of the EG, can play a significant role as one of the effective mechanisms for guaranteeing university quality (16-18). An IE procedure provides a systematic way to determine a program's strengths and weaknesses and how well it achieves its goals. However, the IE process faces many implementation problems, and for this reason, some EGs today refuse to conduct IE. Hakimi et al. analyzed IE plans and showed that the time-consuming nature of the evaluation plan and the lack of evaluation software are among the reasons for the reluctance to conduct IE (19).

To conduct the IE of EGs, forms and questionnaires must be collected and categorized. In the next step, the collected information must be evaluated through extensive manual analysis. In addition, the required information must be entered into statistical software such as SPSS in several separate stages, which takes time before the necessary statistical analysis can be performed. This method increases the likelihood of calculation errors. Therefore, using accurate methods and tools to perform all the calculations and analyses needed for the IE of EGs is essential.

2. Objectives

In the present study, an attempt was made to design and implement a software program that performs all calculations related to the IE of EGs in order to eliminate manual analysis, reduce the IE process time, and make optimal use of human resources. The research objective was to design and use Internal Evaluation Analysis Software for basic science educational groups (IESEG) at Kermanshah University of Medical Sciences (KUMS).

3. Methods

In this study, standard criteria and indicators were prepared for the IE of basic sciences EGs. The forms and questionnaires prepared by the quality assessment center of Tehran University were used for this program, along with the data extraction and analysis instructions, quality requirements, and criteria designed by the same center.

In the next step, a software program was prepared to follow all the instructions for data extraction and analysis of the results. First, the intended program was written as code, and a draft of the program was designed; testing this draft led to the preparation of its graphical interface. MATLAB version R2009a was used for both purposes (coding and developing the program).

The coding of the program and the preparation of its graphical interface were designed in five parts (a minimal sketch follows the list):

- In the first part, the initial data from the questionnaires are loaded.

- The second part extracts the results from the primary data, calculates the indicators, criteria, and investigated factors, and stores them in separate files.

- The third part draws the diagrams and reports the results.

- The fourth part presents the results.

- The fifth part prints all the obtained results.
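
The article does not publish the program's source code, so the following MATLAB fragment is only a minimal sketch of how these five parts might be organized. The file names, grouping sizes, and the helper function groupMeans are illustrative assumptions, not the authors' implementation.

```matlab
% Minimal sketch of the five-part IESEG pipeline (all names are hypothetical).
function ieseg_pipeline(inputFile)
    % Part 1: load the raw questionnaire data (assumed spreadsheet layout:
    % one row per respondent, one column per item).
    raw = xlsread(inputFile);

    % Part 2: compute indicator, criterion, and factor scores and store
    % them in separate files (group sizes are placeholders).
    indicators = mean(raw, 1);                % one score per questionnaire item
    criteria   = groupMeans(indicators, 5);   % e.g., five indicators per criterion
    factors    = groupMeans(criteria, 4);     % e.g., four criteria per factor
    save('criteria.mat', 'criteria');
    save('factors.mat', 'factors');

    % Part 3: draw the diagrams used in the report.
    figure;
    subplot(1, 2, 1); bar(criteria); title('Criteria quality coefficients');
    subplot(1, 2, 2); bar(factors);  title('Factor quality coefficients');

    % Part 4: present the results numerically on screen.
    fprintf('Factor %d: quality coefficient %.2f\n', [1:numel(factors); factors]);

    % Part 5: print (export) the report.
    print('-dpdf', 'ieseg_report.pdf');
end

function m = groupMeans(x, n)
    % Average consecutive groups of n scores into the next aggregation level.
    x = x(1:floor(numel(x) / n) * n);   % drop any incomplete trailing group
    m = mean(reshape(x, n, []), 1);
end
```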

In addition, the graphical interface included menus for loading files, saving results, and viewing explanations about the designed software; another menu contained instructions for using the software. Overall, IESEG consisted of importing data, analyzing the results, drawing graphs, and preparing a final report.

To test the prepared program, the IE was conducted at KUMS in the EGs of three colleges: (1) medical physics and engineering; (2) radiology; and (3) public health.

Then, IESEG was used to analyze these three EGs. The results and analyses of the IE plans carried out in the above three groups were obtained in two modes, manually according to the previous traditional procedure and with the prepared software, and then compared. The criterion for comparing the manual mode with the software approach was the accuracy and precision of the extracted evaluation results relative to the data extraction and analysis instructions.

4. Results

This work was carried out in two phases. In the first phase, a suitable program code was prepared to extract and analyze the required data. Then, the graphical interface of the designed software was organized and used (Figure 1).

Figure 1. The graphical interface of the designed software (IESEG, for internal evaluation).

As stated in the methods section, the graphical interface was designed in five sections. In the first section, the initial data related to the questionnaires were loaded. The second section was associated with extracting the results from the primary data; here, the indicators, criteria, and investigated factors were calculated and stored in separate files. Drawing the required diagrams and reporting the results were performed in the third section. In the fourth section, the results were displayed in two graphs, and a table was created to summarize them (Figure 1).

The quality coefficients of the evaluated factors were drawn in the right graph, and the quality coefficients of the evaluated criteria were illustrated in the left graph. In the accompanying table, the results were reported again in another form: the average quality coefficient of the criteria related to each factor is given separately, together with its standard deviation and quality level. Finally, all the results were printed in the fifth part of the graphical interface.
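
The article reports only the displayed quantities, so the following MATLAB fragment is a hedged sketch of how such a per-factor summary might be computed. The sample coefficients, the three level labels, and the cutoff values are invented for illustration and are not the published thresholds.

```matlab
% Hypothetical summary of criterion quality coefficients under each factor.
criteriaPerFactor = {[0.71 0.64 0.80], [0.55 0.49], [0.88 0.91 0.79 0.84]};
levels  = {'Undesirable', 'Relatively desirable', 'Desirable'};
cutoffs = [0.50 0.75];   % assumed boundaries between the three quality levels

for f = 1:numel(criteriaPerFactor)
    c  = criteriaPerFactor{f};
    mu = mean(c);
    % Report mean, standard deviation, and the verbal quality level.
    fprintf('Factor %d: mean %.2f, SD %.2f, level: %s\n', ...
            f, mu, std(c), levels{1 + sum(mu >= cutoffs)});
end
```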

In the second phase of this study, IEs were performed in three EGs from three different colleges of KUMS. The desired results of the IE of these three EGs were extracted both manually (traditionally) and with the designed software, and the results of the two methods were compared. Data were rechecked whenever the results differed between the two methods. In all such cases, the accuracy of the results obtained with the software was confirmed, and the error was traced to the manual calculation method.
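
As an illustration of this verification step, a few lines of MATLAB suffice to flag the disagreements that were then rechecked; the coefficient vectors and the rounding tolerance below are hypothetical.

```matlab
% Hypothetical comparison of manually extracted vs. software-extracted values.
manual   = [0.71 0.63 0.80 0.55];                  % manually computed coefficients
software = [0.71 0.64 0.80 0.55];                  % IESEG output for the same criteria
mismatch = find(abs(manual - software) > 0.005);   % tolerance for manual rounding
if isempty(mismatch)
    disp('All results agree.');
else
    % Print each disagreeing criterion for rechecking.
    fprintf('Recheck criterion %d: manual %.2f vs. software %.2f\n', ...
            [mismatch; manual(mismatch); software(mismatch)]);
end
```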

Additionally, the software is more accurate than the traditional manual method because of its high reproducibility. The approval of the managers of these three EGs (Dr. Karim Khoshgard, Dr. Mohamad Rasol Tawhidinia, and Dr. Behzad Karamimatin), who used this software, confirmed the precision and appropriateness of IESEG for IE analysis.

5. Discussion

Today, the higher education system faces challenges that require it to be responsible and accountable (8-10). There are different solutions for investigating and resolving or reducing these challenges, and an IE in a sub-sector of an educational system is one of them (13, 14). In EGs, an IE should be performed every few years (every three to five years). Obstacles such as the time-consuming nature of IE and the possibility of mistakes in extracting its results have caused this critical process to be disrupted (20). This problem can be attributed to the difficulties of collecting the primary data. Using online questionnaires is one way to solve this problem (21-25).

Conducting IE also requires an appropriate analysis of the primary data to extract the necessary results. Many people refrain from performing these evaluations because the process is time-consuming and highly error-prone. Since this process is done traditionally and manually, the likelihood of mistakes and errors is very high (25).

In the present study, software was designed to perform IE and thereby solve the existing problems. In this program, all the instructions necessary for extracting the results and analyzing them were implemented step by step in the form of program code. The designed software can thus perform all the stages of IE data extraction automatically.
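
The Tehran University scoring rules themselves are not reproduced in the article, so the fragment below only illustrates the kind of step the software automates; the 1-5 Likert scale, the dummy response matrix, and the linear normalization are assumptions.

```matlab
% Illustrative automated extraction: normalize raw Likert responses into
% per-item quality coefficients on a 0-1 scale (scale and data are dummies).
responses  = randi([1 5], 40, 12);     % 40 respondents x 12 questionnaire items
itemScores = mean(responses, 1);       % average response per item
quality    = (itemScores - 1) / 4;     % map the 1-5 scale onto [0, 1]
fprintf('Item %2d: quality coefficient %.2f\n', [1:numel(quality); quality]);
```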

An IE was conducted in the three EGs at three different colleges to test the software's efficiency. For this purpose, the necessary data were extracted manually and traditionally from the prepared IE forms in the three groups and compared with the data extracted by the designed software. Where the results of the two methods differed, the necessary investigations were conducted to identify the error. According to the results, the manual method is error-prone, whereas the designed software is error-free. Zero error is to be expected if the relevant instructions are defined correctly; in any case, using the designed software reduced the error to zero.

Another critical issue is time consumption. Traditional and manual methods require several days or even months to extract the necessary data from the prepared questionnaires. The designed software extracts all the data in a few seconds, owing to the high speed of computer processing.

Other features of this software are drawing graphs and preparing the necessary tables from the obtained results through a user-friendly interface. In addition, since the program is implemented as source code in the MATLAB programming environment, the code remains accessible, and users can apply their desired changes to it.

In the traditional method of IE, many indicators, criteria, and factors are calculated manually, which is laborious and error-prone, and EG managers spend much time on this job. The use of this software can both prevent such errors and create less trouble for the performers of IE. As a result, EG managers have more time for their educational, research, and management tasks.

5.1. Limitations

In this study, the same questionnaire was used for all educational groups, although some questionnaire items should be corrected for each educational group. This issue was a limitation of the study and of the designed software: whenever the questionnaire is modified before the IE of an EG, the necessary corrections must also be made in the software for that particular group.

5.2. Conclusions

The use of the designed software, IESEG, creates a systematic, uniform, and regular method of conducting internal evaluations. In addition, the IE can be sped up, and its results can be analyzed more accurately, which will enhance the capabilities of the IE committees of the EGs.


References

1. Gibbons M. Higher education relevance in the 21st century. Washington, USA: World Bank; 1998.

2. Alvesson M, Benner M. Higher Education in the Knowledge Society: Miracle or Mirage? In: Frost J, Hattke F, Reihlen M, editors. Multi-Level Governance in Universities: Strategy, Structure, Control. Vol. 47. New York, USA: Springer Cham; 2016. p. 75-91. https://doi.org/10.1007/978-3-319-32678-8_4.

3. Edwards R, Raggatt P, Small N. The learning society: challenges and trends. Oxfordshire, UK: Routledge; 2013. https://doi.org/10.4324/9781315004662.

4. Van Dusen GC. The Virtual Campus: Technology and Reform in Higher Education. ASHE-ERIC Higher Education Report, Vol. 25, No. 5. Washington, USA: The George Washington University; 1997.

5. Green D. What Is Quality in Higher Education? London, UK: Society for Research into Higher Education; 1994.

6. Marginson S. The worldwide trend to high participation higher education: dynamics of social stratification in inclusive systems. High Educ. 2016;72(4):413-34. https://doi.org/10.1007/s10734-016-0016-x.

7. Altbach PG, Gumport PJ, Berdahl RO. American higher education in the twenty-first century: Social, political, and economic challenges. Maryland, USA: JHU Press; 2011.

8. Ehrlich T. Civic responsibility and higher education. California, USA: Greenwood Publishing Group; 2000.

9. Houston D. Rethinking quality and improvement in higher education. Qual Assur Educ. 2008;16(1):61-79. https://doi.org/10.1108/09684880810848413.

10. Lotz-Sisitka H, Wals AE, Kronlid D, McGarry D. Transformative, transgressive social learning: rethinking higher education pedagogy in times of systemic global dysfunction. Curr Opin Environ Sustain. 2015;16:73-80. https://doi.org/10.1016/j.cosust.2015.07.018.

11. Cheng YC, Tam WM. Multi-models of quality in education. Qual Assur Educ. 1997;5(1):22-31. https://doi.org/10.1108/09684889710156558.

12. Sachdeva R, Douglas PS. Quality Improvement Interventions to Improve Appropriateness of Imaging Studies: Necessary, But Are They Sufficient? Circ Cardiovasc Qual Outcomes. 2016;9(1):2-4. [PubMed ID: 26733587]. https://doi.org/10.1161/circoutcomes.115.002510.

13. Nevo D. School evaluation: internal or external? Stud Educ Eval. 2001;27(2):95-106. https://doi.org/10.1016/s0191-491x(01)00016-5.

14. Nevo D. The Conceptualization of Educational Evaluation: An Analytical Review of the Literature. Rev Educ Res. 1983;53(1):117-28. https://doi.org/10.3102/00346543053001117.

15. Sonnichsen R. High Impact Internal Evaluation: A Practitioner's Guide to Evaluating and Consulting Inside Organizations. Washington, USA: SAGE Publications; 2000. https://doi.org/10.4135/9781483328485.

16. Mathison S. What do we know about internal evaluation? Eval Program Plann. 1991;14(3):159-65. https://doi.org/10.1016/0149-7189(91)90051-h.

17. Love A. Internal evaluation. Washington, USA: SAGE Publications; 1991. https://doi.org/10.4135/9781412984546.

18. Bazargan A. Internal evaluation as an approach to revitalize university systems: the case of the Islamic Republic of Iran. High Educ Policy. 2000;13(2):173-80. https://doi.org/10.1016/s0952-8733(99)00024-0.

19. Hakimi A, Abedi Z, Dadashian F. Increasing Energy and Material Consumption Efficiency by Application of Material and Energy Flow Cost Accounting System (Case Study: Turbine Blade Production). Sustainability. 2021;13(9):4832. https://doi.org/10.3390/su13094832.

20. Bazargan A. Problems of Organising and Reporting Internal and External Evaluation in Developing Countries: The Case of Iran. Qual High Educ. 2007;13(3):207-14. https://doi.org/10.1080/13538320701800126.

21. van Gelder MM, Bretveld RW, Roeleveld N. Web-based questionnaires: the future in epidemiology? Am J Epidemiol. 2010;172(11):1292-8. [PubMed ID: 20880962]. https://doi.org/10.1093/aje/kwq291.

22. Ekman A, Dickman PW, Klint A, Weiderpass E, Litton JE. Feasibility of using web-based questionnaires in large population-based epidemiological studies. Eur J Epidemiol. 2006;21(2):103-11. [PubMed ID: 16518678]. https://doi.org/10.1007/s10654-005-6030-4.

23. Denscombe M. Web-Based Questionnaires and the Mode Effect. Soc Sci Comput Rev. 2006;24(2):246-54. https://doi.org/10.1177/0894439305284522.

24. Gunn H. Web-based Surveys: Changing the Survey Process. First Monday. 2002;7(12). https://doi.org/10.5210/fm.v7i12.1014.

25. Fox J, Murray C, Warm A. Conducting research using web-based questionnaires: Practical, methodological, and ethical considerations. Int J Soc Res Methodol. 2003;6(2):167-80. https://doi.org/10.1080/13645570210142883.