Validation of the Persian Version of the Engagement in E-Learning Scale in Students of the School of Nursing and Midwifery in Iran

authors:

Shahrzad Sanjari 1, *, Mohammad Reza Mohammadi Soleimani 2, 3

1 PhD in Reproductive Health, Department of Midwifery, Nursing and Midwifery School, Jiroft University of Medical Sciences, Jiroft, Iran
2 Assistant Professor, Department of Psychology, Kerman Branch, Islamic Azad University, Kerman, Iran
3 Neurology Research Center, Kerman University of Medical Sciences, Kerman, Iran

how to cite: Sanjari S, Mohammadi Soleimani M R. Validation of the Persian Version of the Engagement in E-Learning Scale in Students of the School of Nursing and Midwifery in Iran. Middle East J Rehabil Health Stud. 2023;10(3):e134881. https://doi.org/10.5812/mejrh-134881.

Abstract

Background:

Engagement in e-learning among Iranian students is unfavorable, and there is no valid Persian scale to assess its status.

Objectives:

This study was conducted to validate the Persian version of the Engagement in E-Learning Scale (EELS) among students of nursing and midwifery schools in Iran.

Methods:

In this cross-sectional study, validation was conducted on 1014 students from 51 universities of medical sciences in Iran. The samples were selected using the cluster sampling method in 2022. The scale was translated into Persian by the forward-backward method. Face validity, content validity, and construct validity were evaluated. Face validity was assessed through qualitative interviews with the participants and by calculating the impact score of each item. Content validity was assessed using the content validity ratio and the content validity index. Exploratory and confirmatory factor analyses were used for construct validity. Convergent validity was assessed by the average variance extracted (AVE) and composite reliability (CR). Concurrent validity was checked by calculating the correlation between the Lee scale and the Educational Engagement Questionnaire of Schaufeli’s study. Reliability was evaluated by calculating Cronbach's alpha and test-retest reliability. The data were analyzed using SPSS version 18 and LISREL version 8.8. The significance level was set at 0.05.

Results:

In this study, 579 (66.86%), 569 (65.7%), and 679 (78.41%) of the students were under the age of 22 years, women, and pursuing an undergraduate or associate degree, respectively. Based on the face validity results, the items were revised. The content validity ratio of the questionnaire items was estimated between 0.76 and 1, and the content validity index was estimated at 0.79. According to the factor analysis, four factors were extracted: (1) psychological motivation; (2) management and effective communication; (3) cognitive problem-solving; and (4) peer collaboration. Regarding convergent validity, AVE values were greater than 0.5, and CR values were greater than 0.7. The concurrent validity coefficient was 0.61. Cronbach's alpha for the whole scale was 0.95.

Conclusions:

Our findings showed that the Persian version of the EELS is valid and reliable for measuring the engagement of nursing and midwifery students in e-learning, and this scale can help improve academic engagement in online classes in nursing and midwifery schools.

1. Background

At the end of 2019, the world was confronted with the coronavirus pandemic, which started in China and spread rapidly to other countries (1). With the spread of coronavirus disease 2019 (COVID-19), restrictions were imposed in many countries (2). These restrictions disrupted many citizens’ daily routines (3). One measure taken to control this disease was the closure of schools and universities, which, according to scientific evidence, has decreased the number of cases and deaths (4, 5). Estimates indicate that universities and schools were closed in more than 100 countries, affecting more than 1 billion students (6).

The closure of schools, universities, and educational institutions changed education from the conventional system to a virtual, online framework (7, 8). Online education refers to learning through the World Wide Web (9), and despite its benefits, it can never be a substitute for face-to-face education (7). This shift can have long-term consequences for students’ life expectancy and physical and mental health, as well as academic failure or dropout among poor and disabled students (10-12). The sudden shift from face-to-face to online education has significantly affected students, and many are confused about academic success (13). The novelty of online education and the constant change of existing technologies have exacerbated this confusion (14). In face-to-face education, students are more involved in learning processes than in online education because online education faces challenges regarding internet access, student-to-student communication, student-teacher communication, and teachers’ skills (15).

Academic engagement can be understood and defined as the interaction between attention and commitment (16). Active engagement in educational settings is essential for learners’ academic achievement (17). Lack of academic engagement is associated with academic failure and academic fatigue in students, leading to reduced academic achievement and, eventually, dropout (18). Dropping out of school leads to higher unemployment, lower welfare and life satisfaction, increased crime, and poorer health (19-22). Academic engagement is more important in online education than in any other educational system because, in online education, students are primarily responsible for their own learning (23). In fact, in such a system, the achievement of educational goals depends on the active involvement of learners in educational environments. However, some studies showed that learners’ academic involvement in online education is non-existent or very low in many countries (24).

The results of Mahdavi and Rahimi showed that the academic engagement of Iranian students is relatively unfavorable (25). Iranian students have little interest in participating in classrooms and participate in classrooms with a feeling of boredom. They are less likely to discuss class content with their friends (25). In Iran, some studies mentioned the lack of interaction, discussion, access, and involvement in cooperative learning among students as the challenges of virtual education (26). Similarly, the feeling of the superficiality of online education, along with the lack of motivation, proper interaction between students and professors, and active participation of students in learning, were mentioned as the limitations of virtual education in Iran (27-29).

There are few tools for assessing online academic engagement (30, 31), all of which were developed before online education became ubiquitous. Therefore, given the various changes during the coronavirus outbreak, such as the expansion of online education instead of face-to-face education, these tools cannot measure academic engagement correctly because face-to-face communication between students and teachers was very limited or impossible during the COVID-19 era. There is a lack of research on online education, and scale development and validation can lead to research advances in online education (30). A scale has been developed in South Korea to measure engagement in e-learning (32). Due to cultural and linguistic differences, using scales in different societies requires re-validation (33).

The tool developed in South Korea is the only scale introduced for online courses during the COVID-19 pandemic and has been used in numerous articles, and various methods have been used to validate it. Additionally, there are similarities in midwifery education between Iran and South Korea, such as the use of a national entrance exam, ethical principles based on cultural values, shared goals in professional skills, improvement of health levels, lifelong learning, evidence-based learning, and provision of services based on the highest available standards (34). Therefore, this scale is suitable for assessing academic engagement in Iran.

Iran was severely affected by COVID-19, and the spread of the virus continued despite the restrictions (35, 36). In Iran, since the outbreak of COVID-19, education in universities has been delivered online. According to reports, academic engagement among students has been very unfavorable, and there is no estimate of its status because of the lack of suitable scales. Therefore, providing a valid scale for measuring academic engagement in online education is very important to help researchers and planners produce transparent statistics on academic engagement.

2. Objectives

This study aimed to validate the Persian version of the Engagement in E-Learning Scale (EELS).

3. Methods

3.1. Study Design

This cross-sectional study was conducted in 2022 on 1014 nursing and midwifery students of medical sciences universities across Iran. After the objectives of the study were explained, informed consent was obtained from participants by telephone; confidentiality and anonymity were maintained, participation was voluntary, and participants had the right to withdraw from the study.

3.2. Sampling Method

The cluster sampling method was used: ten schools were randomly selected from all nursing and midwifery schools of medical universities. Next, the contact numbers and student codes were obtained by referring to the schools' education departments. The target sample (nursing and midwifery students) was randomly selected from each selected school at different levels of education based on the validation phase. After contacting these students and obtaining informed consent, the questionnaire link was sent to them via WhatsApp, e-mail, or SMS.

3.3. Measurement

This research used three questionnaires: a demographic information questionnaire, the EELS, and the Educational Engagement Questionnaire of Schaufeli’s study (EEQSS).

Demographic Characteristics Questionnaire: The questionnaire on demographic characteristics included questions concerning gender, age, and education status.

EELS: This questionnaire was designed in 2019 by Lee et al. based on a systematic review of related studies (32). In the first stage, Lee extracted 48 primary items. After evaluating content validity and construct validity, a 24-item scale on a 5-point Likert scale (never = 1, rarely = 2, sometimes = 3, often = 4, and always = 5) with six dimensions was obtained: (1) psychological motivation (6 items); (2) peer collaboration (5 items); (3) cognitive problem-solving (5 items); (4) interactions with instructors (2 items); (5) community support (3 items); and (6) learning management (3 items). On this scale, the total score ranges from 24 to 120, and a higher score indicates greater engagement of students in e-learning. Lee et al. reported Cronbach's alpha values above 0.7 for the whole scale and its dimensions (32). Therefore, the developers reported this scale as a valid and reliable instrument for assessing engagement in e-learning among students (32).
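For illustration only, the following is a minimal scoring sketch in Python (not the software used in this study), assuming item responses are coded 1 - 5 as described above; the function name is hypothetical.

```python
# Minimal, illustrative scoring sketch for the 24-item EELS (assumes responses
# are coded never = 1 ... always = 5, as described above).

def score_eels(responses):
    """responses: a list of 24 Likert ratings, each an integer from 1 to 5."""
    if len(responses) != 24 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("Expected 24 item ratings between 1 and 5")
    return sum(responses)  # possible total: 24 (all 'never') to 120 (all 'always')

# Example: a student answering 'often' (4) on every item scores 96 out of 120.
print(score_eels([4] * 24))  # -> 96
```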

EEQSS: This questionnaire was created by Schaufeli et al. in 1996 to measure the level of students' engagement in academic activities. This questionnaire has 17 items that include three subscales: Vigor (6 questions), dedication (5 questions), and absorption (6 questions) (37). The minimum scale score was 17, and the maximum was 56. Concurrent validity results showed a negative correlation between EEQSS and the burnout scale. Schaufeli et al. obtained the overall reliability of the scale at 0.73. Similarly, Cronbach's alpha was calculated as 0.78, 0.91, and 0.73 for the dimensions of vigor, dedication, and absorption, respectively. Momeni and Radmehr reported Cronbach's alpha reliability as 0.76 for the whole scale and 0.78, 0.8, and 0.67 for the vigor, dedication, and absorption dimensions, respectively (38).

3.4. Forward-Backward Translation

First, the original questionnaire was translated into Persian by two researchers fluent in both languages (Persian and English). Disagreements were resolved by discussion between the two translators. Next, the agreed version was given to two independent translators fluent in English and Persian for back translation (for translation from Persian to English). Another meeting was held with the translators to reach an "agreement on the reverse translation of the questions." Subsequently, an expert group (fields of educational technology, midwifery, nursing, and psychometrics) compared the main and translated backward scales to correct the ambiguities (Figure 1).

Figure 1. Diagram of the scale translation process

After the scale translation, validity (face validity, content validity, construct validity, and convergent validity) and reliability (internal consistency and stability) were checked.

3.5. Statistical Analysis

This research used descriptive statistics indicators, such as frequency and frequency percentage. Data analysis was performed using SPSS version 18 and LISREL version 8.8. The significance level was considered at 0.05.

3.5.1. Face Validity

Face validity was checked in two stages (qualitative and quantitative). The qualitative stage was conducted through face-to-face interviews with ten participants from the target community regarding the simplicity, comprehensibility, and relevance of the items. The quantitative step was performed by calculating the impact score of each item based on the following formula:

Impact score = Frequency (%) × Importance

The minimum value of the impact score for accepting each item was considered 1.5 (39).
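As a rough illustration, the sketch below computes an item's impact score in Python under a commonly used operationalization, treating "frequency" as the proportion of respondents who rated the item 4 or 5 and "importance" as the mean rating; these details are an assumption, since the formula above does not spell them out.

```python
import numpy as np

def item_impact_score(ratings):
    """Impact score for one item from its 5-point ratings (1-5).

    Assumed operationalization (a common convention, not spelled out above):
    frequency = proportion of respondents rating the item 4 or 5,
    importance = mean rating across respondents.
    """
    ratings = np.asarray(ratings, dtype=float)
    frequency = np.mean(ratings >= 4)   # proportion, 0-1
    importance = ratings.mean()         # mean rating, 1-5
    return frequency * importance

# Hypothetical ratings from ten students; items scoring >= 1.5 are retained.
ratings = [5, 4, 4, 3, 5, 4, 2, 5, 4, 4]
print(item_impact_score(ratings))  # 0.8 * 4.0 = 3.2 -> retained
```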

3.5.2. Content Validity

Considering that only educational technology experts had been involved in the original scale, the research team decided to assess content validity with nursing and midwifery experts in addition to educational technology experts. In this stage, the items were evaluated by ten experts (three in educational technology, two in midwifery, two in nursing, and three in psychometrics) to determine the content validity ratio (CVR) and content validity index (CVI). For the CVR, items were evaluated on a 3-part Likert scale of “essential,” “useful but not necessary,” and “unnecessary;” a CVR value above 0.62 is acceptable based on the Lawshe table (40). For the CVI, the items were evaluated based on the mean of three criteria of simplicity, specificity, and clarity on a 4-point Likert scale (1 = lowest score to 4 = highest score); a score above 0.79 is acceptable (41, 42). Accordingly, a CVR value less than 0.62 (based on the scores of ten experts) and a CVI value less than 0.79 were considered as criteria for removing items (43, 44).
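As an illustration, the sketch below computes Lawshe's CVR and an item-level CVI in Python; the example ratings are hypothetical, and the CVI convention used (proportion of experts rating the item 3 or 4) is a common one rather than a detail stated above.

```python
def cvr(n_essential, n_experts):
    """Lawshe's content validity ratio: (n_e - N/2) / (N/2)."""
    return (n_essential - n_experts / 2) / (n_experts / 2)

def item_cvi(ratings):
    """Item-level CVI: proportion of experts rating the item 3 or 4
    on the 4-point scale (a common convention)."""
    return sum(r >= 3 for r in ratings) / len(ratings)

# Ten experts, nine of whom marked the item "essential":
print(cvr(9, 10))                                 # 0.8 -> above the 0.62 cut-off
# Hypothetical 4-point ratings by the same ten experts:
print(item_cvi([4, 4, 3, 4, 3, 4, 4, 2, 4, 3]))   # 0.9 -> above the 0.79 cut-off
```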

3.5.3. Construct Validity

Exploratory factor analysis (EFA) was used to determine construct validity. First, the Kaiser-Meyer-Olkin (KMO) measure and Bartlett's test were used to determine the adequacy of the sample. A KMO value greater than 0.5 was considered acceptable (45); since KMO was > 0.8, the sample size was sufficient, and since the P-value of Bartlett's test was < 0.001, factor analysis was justified. Afterward, latent factors were extracted by principal component analysis with varimax rotation (46). Factors with an eigenvalue greater than one were considered the main factors (33, 47). The recommended sample size for EFA is 5 - 20 participants per item (48); at this stage, 480 subjects were selected as the sample. Confirmatory factor analysis (CFA) was used to confirm the extracted factors. Coleman reported that a sample size of 200 is appropriate for CFA (49); therefore, 220 people were selected (20 additional participants to allow for sample dropout).
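The analyses above were run in SPSS and LISREL; purely for illustration, the sketch below reproduces the same EFA steps (Bartlett's test, KMO, factor extraction with varimax rotation) using the Python factor_analyzer package, whose "principal" extraction approximates the principal component approach described here. The input file and column layout are assumptions.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

responses = pd.read_csv("eels_responses.csv")    # hypothetical file, one column per item

chi2, p_value = calculate_bartlett_sphericity(responses)   # factor analysis justified if p < 0.001
_, kmo_total = calculate_kmo(responses)                     # sampling adequacy; > 0.5 (ideally > 0.8)

fa = FactorAnalyzer(n_factors=4, rotation="varimax", method="principal")
fa.fit(responses)
eigenvalues, _ = fa.get_eigenvalues()            # retain factors with eigenvalue > 1
loadings = pd.DataFrame(fa.loadings_, index=responses.columns)
print(loadings[loadings.abs() > 0.5].round(2))   # items defining each factor load above 0.5
```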

3.5.4. Convergent Validity

Convergent validity was assessed through CFA. The average variance extracted (AVE; optimal value > 0.5) and composite reliability (CR; optimal value > 0.7) were used for the convergent validity analysis (50).
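For illustration, the sketch below shows the standard formulas for these two indices, computed from standardized CFA loadings (the loadings in the example are hypothetical):

```python
import numpy as np

def ave_and_cr(std_loadings):
    """AVE and CR for one factor from its standardized CFA loadings.
    AVE = mean of squared loadings; CR = (sum of loadings)^2 /
    ((sum of loadings)^2 + sum of error variances)."""
    lam = np.asarray(std_loadings, dtype=float)
    errors = 1.0 - lam ** 2                 # item error variances
    ave = np.mean(lam ** 2)
    cr = lam.sum() ** 2 / (lam.sum() ** 2 + errors.sum())
    return ave, cr

# Hypothetical standardized loadings for a six-item factor:
print(ave_and_cr([0.80, 0.79, 0.82, 0.83, 0.79, 0.82]))  # AVE ~0.65, CR ~0.92
```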

3.5.5. Concurrent Validity

At this stage, our questionnaire was distributed to 100 participants along with EEQSS (37), and the correlation between the two tests was assessed.
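In practice, this amounts to a Pearson correlation between the two total scores; a minimal sketch with hypothetical data follows.

```python
from scipy.stats import pearsonr

# Hypothetical total scores for the same participants on the two instruments.
eels_totals = [96, 84, 110, 72, 65, 101, 88, 93, 77, 105]
eeqss_totals = [48, 40, 52, 33, 30, 50, 41, 46, 35, 51]

r, p = pearsonr(eels_totals, eeqss_totals)
print(round(r, 2), p < 0.05)   # a significant positive r supports concurrent validity
```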

3.5.6. Reliability

The required sample size for internal consistency was determined based on the eigenvalues; accordingly, after conducting the exploratory factor analysis and calculating the eigenvalues, a sample of 100 people was used (51). Internal consistency was evaluated with Cronbach’s alpha coefficient and the correlation of each item with the total score; the split-half reliability method was also used. To evaluate stability reliability, 30 participants completed the research questionnaire and completed it again after two weeks, and the correlation coefficient between the two administrations was evaluated (52). A Cronbach's alpha of at least 0.7 was considered acceptable (43).
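For illustration, the sketch below implements the three reliability checks in Python (Cronbach's alpha, split-half reliability, and test-retest correlation); the odd/even split and the Spearman-Brown correction are common choices rather than details reported above.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha; 'items' is an (n_respondents, n_items) array."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

def split_half(items):
    """Split-half reliability (odd vs. even items) with the Spearman-Brown correction."""
    items = np.asarray(items, dtype=float)
    half1 = items[:, 0::2].sum(axis=1)
    half2 = items[:, 1::2].sum(axis=1)
    r = np.corrcoef(half1, half2)[0, 1]
    return 2 * r / (1 + r)

def test_retest(scores_t1, scores_t2):
    """Stability reliability: correlation between the two administrations."""
    return np.corrcoef(scores_t1, scores_t2)[0, 1]
```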

4. Results

Of the 1032 questionnaires distributed across the five sampling stages, 166 students were excluded from the study due to unwillingness to participate or failure to meet the research criteria. The demographic characteristics of the participants in the validation stages are shown in Table 1. Of the 866 remaining students, 579 (66.86%) were under 22 years old, 569 (65.7%) were women, and 679 (78.41%) were undergraduate or associate students.

Table 1.

Demographic Characteristics of the Participants a

| Stage | Age < 22 y | Age > 22 y | Female | Male | Undergraduate or Associate Students | Postgraduate and Ph.D. Students | Total | Missing |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Face validity | 6 (60) | 4 (40) | 7 (70) | 3 (30) | 6 (60) | 4 (40) | 10 | 0 |
| EFA | 284 (65.9) | 147 (34.1) | 287 (66.6) | 144 (33.7) | 347 (81.3) | 84 (19.4) | 480 | 49 |
| CFA | 142 (68.3) | 66 (31.7) | 135 (64.9) | 73 (35.09) | 161 (77.4) | 47 (22.6) | 220 | 12 |
| Concurrent validity | 67 (69.79) | 29 (30.21) | 59 (61.46) | 37 (38.54) | 68 (70.83) | 28 (29.17) | 100 | 4 |
| Reliability (internal consistency) | 61 (62.8) | 32 (33) | 61 (62.80) | 32 (34.4) | 73 (78.5) | 20 (21.5) | 100 | 7 |
| Reliability (stability) | 19 (67.9) | 9 (32.1) | 20 (71.4) | 8 (29.62) | 24 (88.9) | 4 (14.2) | 30 | 2 |

a Values are presented as No. (%).

4.1. Face Validity

First, the items on the scale were modified based on interviews with the participants, and then, the impact scores of the items were obtained. All values were above 1.5; therefore, no items were removed at this stage (44).

4.2. Content Validity

The CVR values obtained for the 24-item scale ranged from 0.76 to 1. In addition, the CVI value was 0.79. Due to acceptable CVR and CVI values, no items were removed from the scale (30).

4.3. EFA and CFA

According to the results of factor extraction, all extraction coefficients were greater than 0.6; therefore, no item was removed from the questionnaire (53). In this analysis, using varimax rotation, four factors with eigenvalues higher than one were extracted, explaining a total of 57.13% of the variance (factor 1: 17.02%; factor 2: 15.27%; factor 3: 13.24%; and factor 4: 11.59%).

In the rotated component matrix, items with loading values above 0.5 form a dimension in each column. Accordingly, items 1 - 6 were placed in one dimension based on the loadings in column one (i1 = 0.80, i2 = 0.79, i3 = 0.82, i4 = 0.83, i5 = 0.79, and i6 = 0.82); based on the content of these items, this dimension was named psychological motivation. Items 18 - 24 were placed in one dimension based on the loadings in column two (i18 = 0.73, i19 = 0.71, i20 = 0.73, i21 = 0.70, i22 = 0.72, i23 = 0.71, and i24 = 0.73); this dimension was named management and effective communication.

Items 12 - 16 were placed in one dimension based on the loadings in column three (i12 = 0.80, i13 = 0.81, i14 = 0.78, i15 = 0.77, and i16 = 0.78); this dimension was named cognitive problem-solving. Items 7 - 11 were placed in one dimension based on the loadings in column four (i7 = 0.72, i8 = 0.71, i9 = 0.72, i10 = 0.74, and i11 = 0.75); this dimension was named peer collaboration. Thus, the first factor was psychological motivation, with six items; the second was management and effective communication, with seven items; the third was cognitive problem-solving, with five items; and the fourth was peer collaboration, with five items. Item 17 was not included in any dimension.

The goodness-of-fit indices in Table 2 were within or near the acceptable ranges, indicating that the proposed model fits the data reasonably well. Figure 2 displays the factor loadings of the four factors. Consequently, the confirmatory factor analysis results support the 4-factor model.

Table 2.

Goodness-of-Fit Measures for Engagement in the E-Learning Scale

| Index | SRMR (< 0.1) | RMSEA (< 0.1) | CFI (> 0.9) | GFI (> 0.9) | AGFI (> 0.85) | CMIN/DF (< 3) |
| --- | --- | --- | --- | --- | --- | --- |
| Value | 0.041 | 0.055 | 0.95 | 0.87 | 0.84 | 1.66 |
Figure 2. Confirmatory factor analysis results

4.4. Convergent Validity

AVE values greater than 0.5 and CR values greater than 0.7 are acceptable. Therefore, the scale has convergent validity (Table 3).

Table 3.

Average Variance Extract Values and Composite Reliability Values

| Factor | AVE | CR |
| --- | --- | --- |
| Factor 1 | 0.528 | 0.873 |
| Factor 2 | 0.500 | 0.872 |
| Factor 3 | 0.528 | 0.848 |
| Factor 4 | 0.534 | 0.851 |

4.5. Concurrent Validity

The correlation coefficient between the two scales was 0.61 (P = 0.001). Therefore, the scale has concurrent validity.

4.6. Reliability

The estimated Cronbach's alpha for the whole scale was 0.95; the coefficients were 0.91, 0.87, 0.88, and 0.86 for the first, second, third, and fourth factors, respectively. The split-half reliability correlation was estimated at 0.87. The correlation between each item and the total score was significant at the 0.05 level. For stability reliability, the correlation between the two test administrations was calculated as 0.82.

5. Discussion

One of the major challenges for universities is the lack of knowledge sharing among students (54). Universities use knowledge sharing to help increase the efficiency and effectiveness of their community. This cross-sectional study provided detailed information on the validity and reliability of the EELS in students. Lee et al.'s questionnaire was developed with the financial support of the Ministry of Public Education, and its content validity, convergent validity, divergent validity, and construct validity were confirmed (with a sample of 737 students) (32). In addition, the developers reported its reliability at a suitable level. On the other hand, a review of the citations of this article showed that many studies published on online courses in the last two years have used this tool to measure academic engagement (202 articles posted on reputable sites, such as PubMed, tandfonline, and Springer) (55-59). Therefore, this tool was used in this research to provide a reliable instrument for assessing academic engagement.

The analysis covered a broad range of aspects of the scale, from the construct of the questionnaire (for which exploratory and confirmatory factor analyses were used) to its content validity calculations, CVI, and CVR, and in all cases, it provided very satisfactory results. Reliability was 0.95 for 24 items, and the EFA results were satisfactory. Regarding the CFA, the root mean square error of approximation (RMSEA), goodness-of-fit index (GFI), and comparative fit index (CFI) values were satisfactory, and the factor loadings were all statistically significant. This finding is consistent with the study by Lee et al. (32). The Persian version had a significant correlation with Schaufeli's academic engagement scale, as did Lee's original scale; therefore, the two instruments measure the same concept, and the Lee scale measures academic engagement well.

It should be noted that concurrent validity has not been investigated in the Lee scale. The AVE values were greater than 0.5, and the CR values were greater than 0.7. This indicates the convergent validity of the scale and is consistent with the original version (32).

The final questionnaire included 24 items and four factors: (1) psychological motivation; (2) effective management and communication; (3) cognitive problem-solving; and (4) peer collaboration (Appendix 1). The first dimension, "psychological motivation," includes questions about learning, enjoyment, stimulating interest, course functionality, satisfaction with the course, learning expectations, and motivation. This dimension corresponds to the dimension presented in the main questionnaire (32). It can be said that motivation is a prerequisite for learning, and the richest educational programs will not be useful in the absence of motivation (60). Academic engagement in online classes is no exception; psychological motivation is essential and can increase academic engagement in online classes.

The second dimension, "management and effective communication," includes questions regarding asking questions, belonging to the community, connection with peers, interaction with peers, self-directed study, managing one's own learning, and managing one's own learning schedule. As a possible explanation, it can be said that communication increases academic performance, and students who have communication skills establish positive relationships with their classmates and teachers and create a suitable environment for learning (61). That is why communication is essential in online engagement. Moreover, management skills improve students' motivation to learn; such students do not postpone their assignments and take control of their work processes (62). Therefore, management skills are essential in academic engagement. This dimension comprises items from the dimensions of "interactions with instructors," "community support," and "learning management." In explaining this combination, we can highlight the differences between education in Iran and South Korea.

In comparing the educational systems of South Korea and Iran, differences in admission could be noted. In South Korea, the admission of students to nursing and midwifery is based on an entrance exam and an evaluation of interest and ability to communicate (63). In addition, coordination between goals and content in the educational program, providing lessons in line with creative and critical thinking, human relations, working in multicultural societies, and using evidence-based knowledge are distinctive features of the South Korean nursing education program. Teaching methods and the use of a comprehensive evaluation approach are also among the differences between nursing and midwifery education in South Korea and Iran (34, 64).

Another explanation is the speed and infrastructure of the internet. South Korea has one of the fastest and cheapest internet services in the world, with an average internet speed of 28.6 Mbps (65-68). Iran, meanwhile, ranks 107th in the world with an average internet connection speed of 4.7 Mbps, and internet access is expensive for citizens (69-72). In some regions of Iran, there is no proper internet infrastructure. The weakness of the internet and the lack of access to it in Iran have led to lower participation of students in classrooms compared to South Korea. The involvement of students in online classes largely depends on how the facilities of online platforms are used (73). The lack of internet infrastructure, low internet speed, and poor antenna coverage create many limitations in using online platforms' features in online classes (74). These problems are more visible in Iran due to internet outages, low internet speed, and filtering.

The third dimension, "cognitive problem-solving," encompasses questions concerning asking questions, deriving an idea, applying knowledge, analyzing knowledge, judging the value of information, and approaching a new perspective. This dimension is aligned with the dimension presented in the main questionnaire (32). It can be said that learning based on problem-solving leads to deep learning and is effective in the teaching process, in which students collaboratively analyze educational issues and reflect on their experiences. The cooperation of professors and students in solving academic problems plays an essential role in the teaching-learning process and results in improved personal learning skills (75).

The fourth dimension, "peer collaboration," includes questions on requesting help, collaborative problem-solving, responding to questions, collaborative learning, and collaborative assignments. This dimension is aligned with the dimension presented in the main questionnaire (32). It can be said that class participation can facilitate learning. However, in online learning, students must simultaneously "assess themselves," "set goals," "provide strategies to achieve those goals," and be concerned about their learning and progress (76). Therefore, the participation of students in online classes helps them meet these challenges, and this dimension was important in this research. To assess the internal reliability of the scale, this study examined the correlation between the total test score and each item, as well as Cronbach's alpha values for the scale and its dimensions ((1) psychological motivation; (2) effective management and communication; (3) cognitive problem-solving; and (4) peer collaboration), which were all above 0.7. Additionally, the correlation between the total score and each item was significant, indicating good internal reliability. These findings are consistent with the results of the Korean version (32). Moreover, this study confirmed that the scale has stable reliability by establishing a significant relationship between the two scale scores (test-retest), although the stability reliability of the Korean version has not been investigated.

One of the strengths of the present study was the selection of a large sample that increased the generalizability of the results. The second strength of the study was the use of cluster sampling methods from all universities in the country. Thus, considering the nature of sampling, it can be said that the random state and maximum variety of samples were maintained.

Among the limitations of this research is that it was validated only in nursing and midwifery students; caution should be taken in generalizing the results to other students, and it is suggested to validate this scale among other student groups. On the other hand, 78.41% of the participants were undergraduate or associate degree students, which limits the applicability of the results to postgraduate students. Therefore, it is suggested to validate this scale among postgraduate students in future research.

Our results supported the appropriate validity and reliability of the scale. This scale can help develop targeted interventions and improve student participation in e-learning by identifying the extent of student engagement in e-learning. The results of this study can also be used in designing online courses and evaluating the effectiveness of this teaching method.

Acknowledgements

References

  • 1.

    Khan M, Adil SF, Alkhathlan HZ, Tahir MN, Saif S, Khan M, et al. COVID-19: A Global Challenge with Old History, Epidemiology and Progress So Far. Molecules. 2020;26(1):39. [PubMed ID: 33374759]. [PubMed Central ID: PMC7795815]. https://doi.org/10.3390/molecules26010039.

  • 2.

    Zajenkowski M, Jonason PK, Leniarska M, Kozakiewicz Z. Who complies with the restrictions to reduce the spread of COVID-19?: Personality and perceptions of the COVID-19 situation. Pers Individ Dif. 2020;166:110199. [PubMed ID: 32565591]. [PubMed Central ID: PMC7296320]. https://doi.org/10.1016/j.paid.2020.110199.

  • 3.

    Lee J. Mental health effects of school closures during COVID-19. Lancet Child Adolesc Health. 2020;4(6):421. [PubMed ID: 32302537]. [PubMed Central ID: PMC7156240]. https://doi.org/10.1016/S2352-4642(20)30109-7.

  • 4.

    Sahu P. Closure of Universities Due to Coronavirus Disease 2019 (COVID-19): Impact on Education and Mental Health of Students and Academic Staff. Cureus. 2020;12(4):e7541. https://doi.org/10.7759/cureus.7541.

  • 5.

    Abdollahi E, Haworth-Brockman M, Keynan Y, Langley JM, Moghadas SM. Simulating the effect of school closure during COVID-19 outbreaks in Ontario, Canada. BMC Med. 2020;18(1):230. [PubMed ID: 32709232]. [PubMed Central ID: PMC7378981]. https://doi.org/10.1186/s12916-020-01705-8.

  • 6.

    Onyema EM, Eucheria NC, Obafemi FA, Sen S, Atonye FG, Sharma A, et al. Impact of Coronavirus Pandemic on Education. J Educ Pract. 2020;11(13):108-21. https://doi.org/10.7176/jep/11-13-12.

  • 7.

    Milosievski M, Zemon D, Stojkovska J, Popovski K. Learning Online: Problems and Solutions. 2020. Available from: https://www.unicef.org/northmacedonia/stories/learning-online-problems-and-solutions.

  • 8.

    Moawad RA. Online Learning during the COVID- 19 Pandemic and Academic Stress in University Students. Revista Romaneasca pentru Educatie Multidimensionala. 2020;12(1 Suppl 2):100-7. https://doi.org/10.18662/rrem/12.1sup2/252.

  • 9.

    Appana S. A Review of Benefits and Limitations of Online Learning in the Context of the Student, the Instructor and the Tenured Faculty. Int J E-Learn. 2008;7(1):5-22.

  • 10.

    Donohue JM, Miller E. COVID-19 and School Closures. JAMA. 2020;324(9):845-7. [PubMed ID: 32745182]. https://doi.org/10.1001/jama.2020.13092.

  • 11.

    Masonbrink AR, Hurley E. Advocating for Children During the COVID-19 School Closures. Pediatrics. 2020;146(3):e20201440. [PubMed ID: 32554517]. https://doi.org/10.1542/peds.2020-1440.

  • 12.

    Meckler L, Natanson H. ‘A lost generation’: Surge of research reveals students sliding backward, most vulnerable worst affected. 2020. Available from: https://www.washingtonpost.com/education/students-falling-behind/2020/12/06/88d7157a-3665-11eb-8d38-6aea1adb3839_story.html.

  • 13.

    Parveen MK, Hannan MJ, Hasan MS, Nandy A. Undergraduate Medical Education in Bangladesh during Coronavirus Disease 2019: Scope and Limitations. 2021. Available from: https://www.researchsquare.com/article/rs-141843/v1.

  • 14.

    Khan MA, Vivek V, Nabi MK, Khojah M, Tahir M. Students’ Perception towards E-Learning during COVID-19 Pandemic in India: An Empirical Study. Sustainability. 2021;13(1):57. https://doi.org/10.3390/su13010057.

  • 15.

    Rajalingam S, Kanagamalliga S, Karuppiah N, Puoza JC. Peer Interaction Teaching-Learning Approaches for Effective Engagement of Students in Virtual Classroom. J Eng Educ Transform. 2021;34:425-32. https://doi.org/10.16920/jeet/2021/v34i0/157191.

  • 16.

    Durbeej N, Abrahamsson N, Papadopoulos FC, Beijer K, Salari R, Sarkadi A. Outside the norm: Mental health, school adjustment and community engagement in non-binary youth. Scand J Public Health. 2021;49(5):529-38. [PubMed ID: 31868564]. https://doi.org/10.1177/1403494819890994.

  • 17.

    Jessani NS, Valmeekanathan A, Babcock C, Ling B, Davey-Rothwell MA, Holtgrave DR. Exploring the evolution of engagement between academic public health researchers and decision-makers: from initiation to dissolution. Health Res Policy Syst. 2020;18(1):15. [PubMed ID: 32039731]. [PubMed Central ID: PMC7011533]. https://doi.org/10.1186/s12961-019-0516-0.

  • 18.

    Wilkins CH, Alberti PM. Shifting Academic Health Centers From a Culture of Community Service to Community Engagement and Integration. Acad Med. 2019;94(6):763-7. [PubMed ID: 30893063]. [PubMed Central ID: PMC6538435]. https://doi.org/10.1097/ACM.0000000000002711.

  • 19.

    Gyönös E. Early School Leaving: Reasons and Consequences. Theoretical and Applied Economics. 2011;18(11(564)):43-52.

  • 20.

    De Ridder KA, Pape K, Johnsen R, Westin S, Holmen TL, Bjorngaard JH. School dropout: a major public health challenge: a 10-year prospective study on medical and non-medical social insurance benefits in young adulthood, the Young-HUNT 1 Study (Norway). J Epidemiol Community Health. 2012;66(11):995-1000. [PubMed ID: 22315238]. https://doi.org/10.1136/jech-2011-200047.

  • 21.

    Rumberger RW. Dropping Out: Why Students Drop Out of High School and What Can Be Done About It. Cambridge, MA: Harvard University Press; 2011. https://doi.org/10.4159/harvard.9780674063167.

  • 22.

    Belfield CR, Levin HM. The Price We Pay: Economic and Social Consequences of Inadequate Education. Washington: Brookings Institution Press; 2007.

  • 23.

    Meurer JR, Whittle JC, Lamb KM, Kosasih MA, Dwinell MR, Urrutia RA. Precision Medicine and Precision Public Health: Academic Education and Community Engagement. Am J Prev Med. 2019;57(2):286-9. [PubMed ID: 31326012]. https://doi.org/10.1016/j.amepre.2019.03.010.

  • 24.

    Ahmed SM, Neu Young S, DeFino MC, Kerschner JE. Measuring institutional community engagement: Adding value to academic health systems. J Clin Transl Sci. 2019;3(1):12-7. [PubMed ID: 31402986]. [PubMed Central ID: PMC6676498]. https://doi.org/10.1017/cts.2019.373.

  • 25.

    Mahdavi B, Rahimi H. [The Predicting of Students' Creativity and Academic Engagement based on Classroom Management Components (Study Case: Students in University of Kashan)]. Res Med Educ. 2021;13(3):30-41. Persian. https://doi.org/10.52547/rme.13.3.30.

  • 26.

    Qorbanpour Lafamajan A. [A study of students' lived experience of virtual education during the epidemic of COVID-19]. Rooyesh-e-Ravanshenasi Journal. 2021;10(8):33-44. Persian.

  • 27.

    Moosavi S, Gholamnejad H, Hassan Shiri F, Ghofrani Kelishami F, Raoufi S. [Challenges of Virtual education During the Pandemic of COVID-19: A Qualitative Research]. Iran J Nurs. 2022;35(135):94-105. Persian. https://doi.org/10.32598/ijn.35.135.3030.

  • 28.

    Ebrahimi M, Alishah F, Zamanipour F. [Identify and analyze the opportunities and challenges of students' virtual education]. New Educ Approaches. 2021;16(2):15-32. Persian. https://doi.org/10.22108/nea.2022.129442.1646.

  • 29.

    Tehrani H, Afzal Aghaei M, Salehian M, Taghipour A, Latifnejad Roudsari R, Karimi FZ. [Explaining the perception and experience of faculty members of Mashhad University of Medical Sciences of virtual education during the COVID-19 epidemic]. J Torbat Heydariyeh Univ Med Sci. 2022;10(1):48-63. Persian.

  • 30.

    Dixson MD. Measuring Student Engagement in the Online Course: The Online Student Engagement Scale (OSE). Online Learn. 2015;19(4). https://doi.org/10.24059/olj.v19i4.561.

  • 31.

    Bigatel PM, Williams V. Measuring Student Engagement in an Online Program. Online Journal of Distance Learning Administration. 2015;18(2).

  • 32.

    Lee J, Song HD, Hong A. Exploring Factors, and Indicators for Measuring Students’ Sustainable Engagement in e-Learning. Sustainability. 2019;11(4):985. https://doi.org/10.3390/su11040985.

  • 33.

    Sanjari S, Amir Fakhraei A, Mohammidi Soleimani MR, Alidousti K. Validation of the Slade Fear of Childbirth Scale for Pregnancy in a Sample of Iranian Women: A Crosssectional Study. Crescent J Med Biol Sci. 2022;9(3):138-46. https://doi.org/10.34172/cjmb.2022.24.

  • 34.

    Gudarzi A, Borzou R, Molavi Vardanjani M, Cheraghi F. [Comparison of Iran and South Korea's undergraduate nursing education]. J Nurs Educ. 2020;9(2):75-88. Persian.

  • 35.

    Sohrabi C, Alsafi Z, O'Neill N, Khan M, Kerwan A, Al-Jabir A, et al. World Health Organization declares global emergency: A review of the 2019 novel coronavirus (COVID-19). Int J Surg. 2020;76:71-6. [PubMed ID: 32112977]. [PubMed Central ID: PMC7105032]. https://doi.org/10.1016/j.ijsu.2020.02.034.

  • 36.

    Alandijany TA, Faizo AA, Azhar EI. Coronavirus disease of 2019 (COVID-19) in the Gulf Cooperation Council (GCC) countries: Current status and management practices. J Infect Public Health. 2020;13(6):839-42. [PubMed ID: 32507401]. [PubMed Central ID: PMC7256540]. https://doi.org/10.1016/j.jiph.2020.05.020.

  • 37.

Schaufeli WB, Salanova M, González-romá V, Bakker AB. The Measurement of Engagement and Burnout: A Two Sample Confirmatory Factor Analytic Approach. J Happiness Stud. 2002;3(1):71-92. https://doi.org/10.1023/a:1015630930326.

  • 38.

    Momeni K, Radmehr F. [Prediction of academic engagement Based on Self-Efficacy and Academic Self-handicapping in Medical Students]. Res Med Educ. 2019;10(4):41-50. Persian. https://doi.org/10.29252/rme.10.4.41.

  • 39.

    Abdollahipour F, Alizadeh Zarei M, Akbar Fahimi M, Karamali Esmaeili S. [Study of Face and Content Validity of the Persian Version of Behavior Rating Inventory of Executive Function, Preschool Version]. J Rehabil. 2016;17(1):10-7. Persian. https://doi.org/10.20286/jrehab-170110.

  • 40.

    Abbaspoor Z, Javadifar N, Miryan M, Abedi P. Psychometric properties of the Iranian version of mindful eating questionnaire in women who seeking weight reduction. J Eat Disord. 2018;6:33. [PubMed ID: 30410760]. [PubMed Central ID: PMC6214170]. https://doi.org/10.1186/s40337-018-0220-4.

  • 41.

    Lawshe CH. A Quantitative Approach to Content Validity. Pers Psychol. 1975;28(4):563-75. https://doi.org/10.1111/j.1744-6570.1975.tb01393.x.

  • 42.

    Sanjari S, Rafati F, Amirfakhraei A, Mohamade Solymane MR, Karimi Afshar E. [Evaluation of Factor Structure and Validation of Electronic form of CAQ Fear of Delivery Questionnaire in Pregnant Women]. Health Psychol. 2021;10(38):57-70. Persian. https://doi.org/10.30473/hpj.2021.53031.4830.

  • 43.

    Taber KS. The Use of Cronbach’s Alpha When Developing and Reporting Research Instruments in Science Education. Res Sci Educ. 2018;48(6):1273-96. https://doi.org/10.1007/s11165-016-9602-2.

  • 44.

    Sanjari S, Mohammidi Soleimani MR, Keramat A. Development and Validation of an Electronic Scale for Sexual Violence Experiences in Iranian Women. Crescent J Med Biol Sci. 2023;10(1):27-35. https://doi.org/10.34172/cjmb.2023.05.

  • 45.

    Li N, Huang J, Feng Y. Construction and confirmatory factor analysis of the core cognitive ability index system of ship C2 system operators. PLoS One. 2020;15(8):e0237339. [PubMed ID: 32833969]. [PubMed Central ID: PMC7446803]. https://doi.org/10.1371/journal.pone.0237339.

  • 46.

    Sharif Nia H, Haghdoost AA, Ebadi A, Soleimani MA, Yaghoobzadeh A, Abbaszadeh A, et al. [Psychometric Properties of the King Spiritual Intelligence Questionnaire (KSIQ) in Physical Veterans of Iran-Iraq Warfare]. J Mil Med. 2015;17(3):145-53. Persian.

  • 47.

    Rostami F, Owaysee Osquee H, Mahdavi F, Dousti S. Development of a New Psychometric Assessment Tool for Predicting Hepatitis B Virus Infection in Pregnant Women. Int J Women's Health Reprod Sci. 2020;8(3):297-302. https://doi.org/10.15296/ijwhr.2020.48.

  • 48.

    Varmazyar S, Mortazavi SB, Arghami S, Hajizadeh E. [Determination of the Validity and Reliability of Bus Drivers' Behaviour Questionnaire in Tehran in 2012: Exploratory and Confirmatory Factor Analysis]. J Rafsanjan Univ Med Sci. 2014;13(3):235-48. Persian.

  • 49.

    Coleman DA. Service brand identity: definition, measurement, dimensionality and influence on brand performance [dissertation]. Birmingham: University of Birmingham; 2011.

  • 50.

    Guo Y, Ma K, Guo L, Dong X, Yang C, Wang M, et al. Development and psychometric appraisal of Head Nurse Research Leadership Scale. Nurs Open. 2023;10(5):3378-87. [PubMed ID: 36622948]. [PubMed Central ID: PMC10077399]. https://doi.org/10.1002/nop2.1592.

  • 51.

    Yurdugül H. Minimum sample size for cronbach’s coefficient alpha: A monte-carlo study. Hacettepe Üniversitesi Eğitim Fakültesi Dergis. 2008;35:397-405.

  • 52.

    Arastoo AA, Montazeri A, Abdolalizadeh M, Ghasemzadeh R, Ahmadi K, Azizi A. [Psychometric properties of Persian version of the Work Ability Index questionnaire]. Payesh. 2013;12(5):535-43. Persian.

  • 53.

    Maskey R, Fei J, Nguyen HO. Use of exploratory factor analysis in maritime research. Asian J Shipp Logist. 2018;34(2):91-111. https://doi.org/10.1016/j.ajsl.2018.06.006.

  • 54.

    Salimi G, Haidari E, Keshavarzi F. [The explanation of the relationship between the students` engagement and knowledge sharing behavior at University: the role of attitude to knowledge sharing as the mediator variable]. Academic Librarianship and Information Research. 2014;47(4):351-74. Persian. https://doi.org/10.22059/jlib.2013.51130.

  • 55.

    Lee GG, Kang DY, Kim MJ, Hong HG, Martin SN. University students’ perceptions of remote laboratory courses necessitated by COVID-19: differences in emergent teaching strategies at a Korean university. Asia Pac Educ Rev. 2023. https://doi.org/10.1007/s12564-023-09837-1.

  • 56.

    Al Mamun MA, Lawrie G. Student-content interactions: Exploring behavioural engagement with self-regulated inquiry-based online learning modules. Smart Learn Environ. 2023;10(1). https://doi.org/10.1186/s40561-022-00221-x.

  • 57.

    Hampton D, Hardin-Fanning F, Culp-Roche A, Hensley A, Wilson JL. Promotion of Student Engagement Through the Application of Good Practices in Nursing Online Education. Nurs Adm Q. 2023;47(2):E12-20. [PubMed ID: 36728081]. https://doi.org/10.1097/NAQ.0000000000000556.

  • 58.

    Richards S. Faculty Perception of Student Engagement in Online Anatomy Laboratory Courses During the COVID-19 Pandemic. Med Sci Educ. 2023;33(2):465-80. [PubMed ID: 37251200]. [PubMed Central ID: PMC9990041]. https://doi.org/10.1007/s40670-023-01762-7.

  • 59.

    Ma X, Jiang M, Nong L. The effect of teacher support on Chinese university students' sustainable online learning engagement and online academic persistence in the post-epidemic era. Front Psychol. 2023;14:1076552. [PubMed ID: 36794084]. [PubMed Central ID: PMC9922889]. https://doi.org/10.3389/fpsyg.2023.1076552.

  • 60.

    Salajegheh M, Hoseiny Shavoun A. [The Role of motivation in learning]. J Med Educ Dev. 2018;13(2):172-4. Persian.

  • 61.

    Ahmadi MS, Hatami HR, Ahadi H, Asadzadeh H. [A Study of the Effect of Communication Skills Training on the Female Students’ Self-efficacy and Achievement]. J New Approaches Educ Adm. 2014;4(16):105-18. Persian.

  • 62.

    Habibikaleybar R, Bahadorikhosroshahi J. [The effectiveness training of self-management skills with academic alienation, academic persecution and bullying high school students]. Educ Strategy Med Sci. 2016;9(5):371-80. Persian.

  • 63.

    Kwon SH, Park MH, Kim HS. Education, Role, and Prospects of Advanced Practice Nurses in Hospice and Palliative Care in South Korea. Korean J Hosp Palliat Care. 2021;24(1):1-12. https://doi.org/10.14475/jhpc.2021.24.1.1.

  • 64.

    Kang SJ, Kim IS. Development of the Korean Nursing Profession with Changes in its Legal Basis. Int J Nurs Clin Pract. 2016;3(1):165. https://doi.org/10.15344/2394-4978/2016/165.

  • 65.

    Lee K, Kim Y, Palvia P. Information Technology Issues in South Korea. In: Palvia P, Ghosh J, Jacks T, Serenko A, Turan AH, editors. World Scientific-Now Publishers Series in Business: Volume 17: The World IT Project: Global Issues in Information Technology. 17. Singapore: World Scientific Publishing; 2020. p. 407-19. https://doi.org/10.1142/9789811208645_0032.

  • 66.

    Kiio AM, Kohsuwan P. The Effects of Service Fairness on Customer Recovery Satisfaction and Loyalty towards Internet Services. Hum Behavi Dev Soc. 2020;21(4):35-46.

  • 67.

    Soumya B, Vani K. COVID-19 and Digital Inclusion. J Dev Policy Rev. 2020;1(1):167-74.

  • 68.

    Han S, Lim H, Noh H, Shin HJ, Kim GW, Lee YH. Videotelephony-assisted medical direction to improve emergency medical service. Am J Emerg Med. 2020;38(4):754-8. [PubMed ID: 31227420]. https://doi.org/10.1016/j.ajem.2019.06.023.

  • 69.

    Nasr M, Zolfaghari H, Houmansadr A, Ghafari A. MassBrowser: Unblocking the Censored Web for the Masses, by the Masses. Network and Distributed Systems Security (NDSS) Symposium 2020. 23-26 February 2020; San Diego, CA, USA. 2020.

  • 70.

    VanderSloot B. Enhancing System Transparency, Trust, and Privacy with Internet Measurement [thesis]. Ann Arbor, MI: University of Michigan; 2020.

  • 71.

    Rahmani H. Integrated Wirelessly Powered Solutions for Medical Implants and Internet of Things [dissertation]. Los Angeles, CA: University of California; 2020.

  • 72.

    Li F. Understanding and Circumventing Deployed Traffic Differentiation Practices [dissertation]. Boston, MA: Northeastern University; 2020.

  • 73.

    Vezne R, Yildiz Durak H, Atman Uslu N. Online learning in higher education: Examining the predictors of students' online engagement. Educ Inf Technol (Dordr). 2023;28(2):1865-89. [PubMed ID: 35967825]. [PubMed Central ID: PMC9360686]. https://doi.org/10.1007/s10639-022-11171-9.

  • 74.

    Hollister B, Nair P, Hill-Lindsay S, Chukoskie L. Engagement in Online Learning: Student Attitudes and Behavior During COVID-19. Front Educ. 2022;7:851019. https://doi.org/10.3389/feduc.2022.851019.

  • 75.

    Mansoori S, Abedini-Baltork M, Lashkari H, Bagheri S. [Effectiveness of Problem-Based Learning on Student's Academic Performance: A quasi-experimental study]. Res Med Educ. 2017;9(1):1-8. Persian. https://doi.org/10.18869/acadpub.rme.9.1.8.

  • 76.

    Winters FI, Alexander PA. Peer collaboration: the relation of regulatory behaviors to learning with hypermedia. Instr Sci. 2011;39(4):407-27. https://doi.org/10.1007/s11251-010-9134-5.