Design and Evaluation of a Performance Dashboard for the Faculty of Allied Medical Sciences, AJA University of Medical Sciences: A Protocol for a Mixed Methods Study

authors:

Sohrab Almasi 1, Nahid Mehrabi 1,*, Mahdi Ghorbani 2

1 Department of Health Information Technology, School of Paramedical Sciences, AJA University of Medical Sciences, Tehran, Iran
2 Assistant Professor of Hematology and Transfusion Medicine, Department of Medical Laboratory Sciences, School of Allied Medical Sciences, AJA University of Medical Sciences, Tehran, Iran

how to cite: Almasi S, Mehrabi N, Ghorbani M. Design and Evaluation of a Performance Dashboard for the Faculty of Allied Medical Sciences, AJA University of Medical Sciences: A Protocol for a Mixed Methods Study. Shiraz E-Med J. 2023;24(12):e137592. https://doi.org/10.5812/semj-137592.

Abstract

Background:

Faculties, as educational systems, comprise various educational groups, faculty members, researchers, students, and administrative staff. The management of data records related to the performance and activities of the faculty and its members leads to better monitoring, identification of weaknesses and strengths, and, ultimately, promotion of the faculty's performance. Dashboards are data management tools that can be used for monitoring and evaluating a faculty's performance.

Objectives:

This study aims to develop a protocol for the design of a faculty performance dashboard with a sequential mixed methods approach.

Methods:

This cross-sectional study will be conducted in the Faculty of Allied Medical Sciences, AJA University of Medical Sciences, in 2023. A mixed methods study with a sequential (qualitative and then quantitative) design will be conducted in four phases. First, all resources related to performance dashboards will be reviewed to identify the dashboard's operational requirements. Second, the requirements of the software will be determined by qualitative (interview) and then quantitative (Delphi) methods. In this phase, 8 people will be interviewed in the qualitative step, and thematic analysis will be used to analyze the data. For the quantitative step, a 2-round Delphi technique will be conducted with 21 purposively selected individuals. Data analysis for the quantitative step will be conducted in SPSS v. 22 using descriptive statistics, including mode, median, mean, and percentage of agreement. Third, software coding will be performed in the C# programming language in Visual Studio. Finally, 15 faculty members and managers will be selected by purposive sampling to evaluate the software. In this phase, the qualitative method and then the quantitative method will be used for software evaluation. In the qualitative step, the think-aloud protocol will be used to evaluate usability, and in the quantitative step, users' satisfaction with the dashboard software will be assessed using a questionnaire whose validity and reliability have been confirmed previously (Cronbach's alpha: 0.94). The data will be analyzed using descriptive statistics in SPSS v. 21.

Results:

The protocol identifies four phases that should be followed when designing, developing, and evaluating a performance dashboard for monitoring and managing faculty performance.

Conclusions:

The final product of this study is a dashboard for monitoring, evaluating performance, and managing resources at the faculty level.

1. Background

A faculty, as an educational system, consists of various educational groups, faculty members, researchers, students, and administrative staff. Each faculty member contributes to different areas, namely teaching, research, and management (1). A faculty also hosts different types of conferences and conventions; the data related to these activities and to the participation of faculty members in them are facts and information resulting from academic endeavors (2). Faculty data refer to the information linked with the academic performance of professors and lecturers, such as details of academic services and contributions, completed courses, the number of annual research publications, and the number of committees of which each faculty member is a member (1, 3).

Although the collection, management, and reporting of faculty data are crucial for each faculty member and the institution itself as a complete establishment, numerous gaps exist in this area (4). While a faculty member may be involved in several activities, most of these activities are not documented and recognized because the university lacks a central system for effectively recording these data and presenting a comprehensive report of such activities and performance feedback (5).

Currently, different independent systems host faculty data (6). The lack of communication between these systems leaves the data locked in isolated silos (7). Retrieving data from multiple systems is often a manual and difficult process for representatives and faculty members. Since these data are not analyzed or merged, their trends and inter-relationships cannot be exploited, which is a lost opportunity to discover information and extract knowledge (8).

Currently, data recording is inadequate in most higher education institutions, and automation for recording and sharing data between different systems is only partial. Therefore, faculty members and managers have to spend considerable time and effort on manual data entry to gather or track the details of academic activities and assessments (9). Although the manual entry of data is unavoidable in some cases, automation and interoperability between systems can prevent duplicate data recording. In addition, faculty members may lack the time and skills to perform statistical analyses on data (e.g., finding correlations) and extrapolate valuable interpretations, targeted feedback, or practical complementary objectives (10).

As a data management tool, dashboards are one of the most effective and renowned forms of data objectification (10, 11). A dashboard can be defined as a visualization tool that makes it possible to acquire awareness, find trends, plan, and make realistic comparisons, typically within a simple and functional user interface. By accumulating data from multiple sources, a dashboard presents a comprehensive summary of important information that faculty members can assimilate at a glance (11). Performance dashboards enable organizations to measure, monitor, and manage business performance more effectively (12, 13). They are built on the foundations of business intelligence and data integration infrastructure and are used for monitoring, analysis, and management (12).

Developing a faculty performance dashboard is useful for quickly and easily sharing information about faculty members' performance with them in a way that helps them better understand the data (14). Observing and interpreting the data presented in large tables and lengthy reports are exhausting and time-consuming tasks for faculty members. In other words, a dashboard, if designed appropriately, can help faculty members spot their strengths and areas of progress and identify the trends and steps necessary for improvement (15).

Based on the researchers' explorations, there are substantial gaps in the reporting and management of faculty data. Therefore, it seems necessary to develop a comprehensive dashboard for monitoring and evaluating the performance of educational groups, faculty members, students, and other faculty staff, as well as for monitoring the performance of the faculty in various fields, such as education, research, cultural and student affairs, resource management, and development and technology.

2. Methods

This study will be carried out using a sequential mixed methods design. In a sequential design (qualitative and then quantitative), the data collection and analysis of one component take place after the data collection and analysis of the other component and depend on the latter's outcomes (16). Mixed methods research combines closed-ended response data (quantitative) and open-ended personal data (qualitative) (17).

This protocol study is registered in the Open Science Framework (OSF) registries database (DOI: 10.17605/OSF.IO/J326S).

2.1. Setting

The research setting is the Faculty of Allied Medical Sciences of the AJA University of Medical Sciences in 2023 (from 20 April 2023 to 21 November 2023). The study has obtained ethical approval and will be conducted in four phases (Figure 1 and Table 1).

Figure 1. The phases of the mixed methods study
Table 1. Summary of the Phases of the Protocol Study and the Goals, Outputs, Methods, and Time of Each Phase

Phase 1: Extracting the key performance indicators of the faculty and the capabilities of the performance dashboard
- Goal: Identification of functional and non-functional requirements of the performance dashboard and performance indicators
- Output: Functional and non-functional requirements of the performance dashboard; key performance indicators of the faculty
- Method/Technique: Systematic review / Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA)
- Time: 20 April 2023 to 5 June 2023

Phase 2: Requirements of the performance dashboard from the perspective of users
- Goal: Identification of the requirements of the performance dashboard from the perspective of users
- Output: Functional and non-functional requirements of the performance dashboard; key performance indicators of the faculty
- Method/Technique (2.1): Qualitative / Semi-structured interview; 22 June 2023 to 22 July 2023
- Method/Technique (2.2): Quantitative / Delphi technique (questionnaire); 22 July 2023 to 22 August 2023

Phase 3: Software development
- Goal: Software production
- Output: Performance dashboard software
- Method/Technique: -
- Time: 22 August 2023 to 22 October 2023

Phase 4: Evaluation of the performance dashboard
- Goal: Evaluation of user satisfaction
- Output: Usability and user satisfaction evaluation of the dashboard software
- Method/Technique (4.1): Qualitative / Think-aloud; 22 October 2023 to 6 November 2023
- Method/Technique (4.2): Quantitative / Questionnaire; 6 November 2023 to 21 November 2023

2.1.1. Phase 1: Identification of Functional and Non-functional Requirements of the Performance Dashboard and Performance Indicators of the Faculty Through a Systematic Review

This phase aims to extract the key performance indicators of the faculty, as well as the capabilities of the performance dashboard. The data search and extraction steps were previously performed based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklist (18). The search was performed using a combination of keywords, namely ("dashboard"[TIAB] OR "whiteboard"[TIAB]) AND ("Quality Indicators, Health Care"[Mesh] OR "Quality Indicators"[TIAB] OR "Key performance indicators"[TIAB]) AND ("faculty"[TIAB] OR "university"[TIAB]), in PubMed, ScienceDirect, Web of Science, Scopus, and Google Scholar from 1 to 20 April 2023. The articles were selected based on the inclusion/exclusion criteria in terms of study design and whether or not they assessed performance dashboards at the level of a faculty or university.

The inclusion criteria were as follows: (1) English articles published in peer-reviewed journals or conferences with available full text; (2) articles on performance indicators and functionalities of performance dashboards at the level of a faculty or university; (3) articles published up to 20 April 2023.

The exclusion criteria were as follows: (1) review articles, case reports, case studies or study protocols, letters to the editor, correspondences, and conference papers whose full text was absent or inaccessible; (2) papers that merely designed a performance dashboard at the level of a faculty or university.

For paper selection, three authors (SA, NM, and MGH) checked the titles and abstracts of the papers and removed the irrelevant papers. For eligibility assessment, the papers were independently checked by the mentioned authors. The bibliography check was then conducted by one of the authors (SA).

In data extraction, the indicators are divided into 5 groups, including education, research, cultural and student affairs, resource management, and development and technology (Appendix 1 in the Supplementary File).
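Purely for illustration, the five indicator groups named above could be represented in the dashboard's C# code as follows; the enum values mirror the groups listed in the text, while the KeyPerformanceIndicator class and its fields are assumptions introduced here and are not items defined in the protocol.

```csharp
// Hypothetical data model for the five indicator groups extracted in Phase 1.
public enum IndicatorGroup
{
    Education,
    Research,
    CulturalAndStudentAffairs,
    ResourceManagement,
    DevelopmentAndTechnology
}

// Assumed shape of a single key performance indicator record.
public class KeyPerformanceIndicator
{
    public string Name { get; set; }
    public IndicatorGroup Group { get; set; }
    public string Unit { get; set; }
}
```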

2.1.2. Phase 2: Requirements of the Performance Dashboard from the Perspective of Users

This phase is conducted in two steps. First, a qualitative study is conducted to identify the requirements of the performance dashboard software. The inclusion/exclusion criteria for the participants in this phase are given below.

The inclusion criteria were as follows: (1) being among the main users of the dashboard; (2) willingness to participate in the study; (3) having at least 5 years of work experience.

The exclusion criterion was as follows: unwillingness to continue cooperation at any stage of the research.

For this purpose, 8 educational group directors and faculty directors are selected by purposive sampling for interviews. The average duration of each interview will be 30 minutes. Semi-structured interviews will continue until data saturation. At this stage, after coordinating with the interviewees and obtaining their informed consent, each interview is recorded using an electronic audio recorder and then transcribed verbatim in Microsoft Word. The questions raised in the interviews are related to the functional and non-functional requirements of the dashboard, as well as the performance preferences of users (Appendix 1 in the Supplementary File). After transcription, the interviews are subjected to code extraction and then thematic analysis, involving 6 phases: (1) familiarizing oneself with the data; (2) generating initial codes; (3) searching for themes; (4) reviewing themes; (5) defining and naming themes; and (6) reporting the themes (19).

In the second step, a questionnaire is designed to identify the key performance indicators of the faculty using the 2-round Delphi technique. Twenty individuals are purposively selected among academic members, educational group directors, and faculty directors. In the first round of the Delphi technique, a questionnaire with 3-choice questions (disagree, no opinion, and agree) and an open-ended question at the end of each section is completed. In this way, the participants can state whether they think anything should be added to the questionnaire for the second round of the Delphi technique. In the second round, the proposed indicators are added and subjected to a poll. For data analysis, items with > 75% agreement are accepted, those with 50 - 75% agreement enter the second round of Delphi, and those with < 50% agreement are omitted from the questionnaire.
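For illustration only, the decision rule described above can be expressed as the following C# sketch; the type and method names (DelphiDecision, Classify) are assumptions introduced here and are not part of the protocol's analysis plan, which uses SPSS.

```csharp
// Hypothetical sketch of the Delphi decision rule described above:
// > 75% agreement -> accept; 50-75% -> second round; < 50% -> omit.
public enum DelphiDecision { Accept, SecondRound, Omit }

public static class DelphiRules
{
    public static DelphiDecision Classify(int agreeCount, int totalResponses)
    {
        // Percentage of participants who chose "agree" for the item.
        double agreementPercent = 100.0 * agreeCount / totalResponses;

        if (agreementPercent > 75) return DelphiDecision.Accept;
        if (agreementPercent >= 50) return DelphiDecision.SecondRound;
        return DelphiDecision.Omit;
    }
}
```

For example, an item rated "agree" by 16 of 20 panelists (80%) would be accepted, whereas an item rated "agree" by 12 of 20 (60%) would enter the second round.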

2.1.3. Phase 3: Software Development

Microsoft Visual Studio 2019, the ASP.NET Core MVC 3.1 framework, and the C# programming language are used for server-side coding. The user interface of the software is designed using HTML, jQuery, CSS, and JavaScript. Finally, Microsoft SQL Server is used for designing tables and managing the database.
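As a minimal sketch of how such a stack could be wired together, an ASP.NET Core MVC controller serving dashboard data might look like the example below; the IIndicatorRepository service and FacultyKpi view model are hypothetical names introduced here and are not defined in the protocol.

```csharp
// Illustrative ASP.NET Core MVC controller; all names other than the
// framework types (Controller, IActionResult, Task) are hypothetical.
using Microsoft.AspNetCore.Mvc;
using System.Collections.Generic;
using System.Threading.Tasks;

public class FacultyKpi
{
    public string Name { get; set; }
    public string Group { get; set; }  // e.g., education, research, resource management
    public double Value { get; set; }
}

public interface IIndicatorRepository
{
    Task<IReadOnlyList<FacultyKpi>> GetCurrentIndicatorsAsync();
}

public class DashboardController : Controller
{
    private readonly IIndicatorRepository _repository;

    public DashboardController(IIndicatorRepository repository)
    {
        _repository = repository;
    }

    // GET /Dashboard: renders the dashboard view with the current KPI values
    // retrieved from the data layer (e.g., SQL Server via the repository).
    public async Task<IActionResult> Index()
    {
        IReadOnlyList<FacultyKpi> indicators = await _repository.GetCurrentIndicatorsAsync();
        return View(indicators);
    }
}
```

In a real implementation, the repository would query the SQL Server tables mentioned above, and the returned indicators would be rendered by the HTML/jQuery front end.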

2.1.4. Phase 4: Evaluation of the Performance Dashboard

In this phase, the qualitative method and then the quantitative method are used to evaluate the software. In the qualitative method, the think-aloud protocol will be used to evaluate usability, and in the quantitative method, users' satisfaction with the dashboard software is assessed using a questionnaire. In this phase, 15 academic members and managers of the faculty who are dashboard software users are chosen. The inclusion/exclusion criteria for the participants in this phase are mentioned below.

The inclusion criteria were as follows: (1) having at least 5 years of work experience; (2) being an academic staff member of an educational group or a faculty manager; (3) being among the main users of the dashboard; (4) willingness to participate in the study.

The exclusion criterion was as follows: unwillingness to continue cooperation at any stage of the research.

Think-aloud or concurrent verbalization was borrowed from cognitive psychology (20). In this method, users think aloud while performing a set of specified tasks (21); in other words, they verbalize anything that crosses their minds during the task performance (20). An advantage of this method is that it enables the collection of insights into the difficulties that participants encounter while using the system/product (22). In this method, users are asked to express their suggestions and comments regarding the dashboard while working with the software.

The 20-question Dashboard Assessment Usability Model (DATUS) questionnaire, scored on a five-point Likert scale (1 = completely disagree; 5 = completely agree), will be used to evaluate user satisfaction with the dashboard software. In addition, two open-ended questions are presented to the participants so that they can express their viewpoints and recommendations. This questionnaire evaluates the dimensions of satisfaction (4 questions), effectiveness (2 questions), efficiency (2 questions), operability (5 questions), learnability (4 questions), user interface aesthetics (1 question), appropriate recognizability (1 question), and accessibility (1 question).
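As a sketch of how the dimension-level results could be summarized, the mean item score per dimension could be computed as below; this helper is an assumption for illustration only, since the protocol specifies descriptive statistics in SPSS rather than custom code.

```csharp
// Hypothetical helper for summarizing five-point Likert responses (1-5)
// by questionnaire dimension (e.g., satisfaction, effectiveness, efficiency).
using System.Collections.Generic;
using System.Linq;

public static class LikertScoring
{
    // responsesByDimension maps a dimension name to all item scores (1-5)
    // collected for that dimension across respondents.
    public static Dictionary<string, double> DimensionMeans(
        Dictionary<string, List<int>> responsesByDimension)
    {
        return responsesByDimension.ToDictionary(
            pair => pair.Key,
            pair => pair.Value.Average());
    }
}
```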

The validity and reliability of the questionnaire have been confirmed previously (Cronbach's alpha for reliability: 0.94) (23). In the final step, the data are presented in tables using descriptive statistics such as frequency and percentage. Data analysis is conducted in SPSS v. 21.

3. Results

In the first phase, which has already been conducted, performance indicators and dashboard features were extracted from the relevant articles selected based on the inclusion and exclusion criteria.

The performance indicators were divided into 5 areas: education, research, cultural and student affairs, resource management, and development and technology. Each area has its own performance indicators. The performance dashboard features were divided into performance monitoring, evaluation, and resource management. In the second phase of the study, a questionnaire and an interview guide were prepared to identify users' needs based on the information extracted from the first phase.

4. Discussion

This study aimed to develop a protocol for the design of a faculty performance dashboard for monitoring, evaluation, and resource management at the faculty level. The steps used for developing this dashboard can provide a basis for designing better performance dashboards for other colleges or universities.

Due to the importance of information in organizations such as universities, it is essential to trace the flow and dimensions of information. The lack of proper management of information resources can impede the achievement of organizational goals and confuse employees when they work with information sources; this leads to redundant work in different departments, retrieval of similar information, and, finally, the flow of duplicate information into organizational databases, which requires extra time and cost to reuse (24, 25). The establishment and use of comprehensive information resources play a strategic role in the qualitative development of universities and their transformation into pioneering organizations. These measures also play a substantial role in achieving the strategic goals of the university (26). The information obtained from an information system provides a powerful management tool in the higher education system (27). Because they provide timely and accurate information, dashboards are considered powerful systems for fulfilling the informational needs of organizations, including universities, and for handling large amounts of organizational data (28).

Performance evaluation is among the capabilities of the faculty performance dashboard; it is a process through which the performance of employees is formally and regularly assessed at certain intervals. Evaluation of the performance of academic members refers to the regular assessment of their educational/research activities and determining to what extent the goals of the educational system are achieved according to predetermined criteria (29, 30). Performance monitoring refers to the real-time observation of the faculty's key performance indicators (29, 30). Faculty resource management encompasses being informed of the status quo of human resources and equipment (29, 30).

In the present study, first, the qualitative method and then the quantitative method are used in the data collection process. In the dashboard evaluation phase, a combination of qualitative and quantitative methods is used.

Studies illustrate the growing importance of mixed methods research for many health disciplines, ranging from nursing to epidemiology (31, 32). Mixed methods approaches require not only the skills of the individual quantitative and qualitative methods but also a skill set to bring the two methods/datasets/findings together in the most appropriate way (31).

Mixed methods research can provide a plethora of advantages for researchers and practitioners who try to gain a more comprehensive and nuanced understanding of their research topic. By offering a richer and deeper data set that can capture the diversity and complexity of the research phenomenon, mixed methods research can enable the triangulation or corroboration of the data or results from different sources or methods, thus increasing the validity or trustworthiness of the research (33). Additionally, it can allow for the exploration or explanation of the findings from one approach with the data or results from another approach, thereby enhancing the interpretation or understanding of the research (32). In the current study, the qualitative method (think-aloud) and a questionnaire will be used in the dashboard evaluation phase. Generally, questionnaires are the most commonly used tools for usability evaluation due to the simplicity of data analysis (34). However, a combination of qualitative and quantitative approaches is suggested to appropriately measure the usability of technologies (34).

Based on another study, the use of the think-aloud protocol for usability evaluation allows participants to share their real-time experience with using the app and stimulates verbal expression of this experience, which is more difficult to achieve using traditional stand-alone usability testing (35). Finally, it can be acknowledged that both quantitative and qualitative methods play a significant role in technology development and progress. While quantitative methods have some advantages, such as cost-effectiveness and higher suitability for studies with a large sample size, qualitative methods (e.g., think-aloud) provide details about problems to which quantitative methods do not commonly apply (36). Additionally, qualitative data analysis of user behaviors and routines and a variety of other types of information are essential to delivering a product that actually fits into the users' needs or desires (37).

Despite the strengths of this study, we may face some challenges while conducting its various phases. For example, in the interview and questionnaire phases, the participants may refuse full cooperation in completing the questionnaire or conducting the interviews due to their busy work schedules. We will try to distribute a considerable number of questionnaires among users to obviate this challenge. During the implementation phase, the designed software may not integrate well with other organizational systems, thus interfering with information exchange. This challenge will be addressed by writing the software code in object-oriented programming languages.

4.1. Conclusions

A faculty, as an educational system, comprises various educational groups, faculty members, researchers, students, and administrative staff. The management of data records related to the performance and activities of the faculty and its members leads to better monitoring, identification of weaknesses and strengths, and, ultimately, improvement of the faculty's performance. Dashboards should be embedded in educational processes, with attention to the ways these tools are integrated into educational systems and workflows. In fact, a dashboard is a data management tool that can be used for monitoring and evaluating a faculty's performance. The final product of this study is a dashboard for monitoring, evaluating performance, and managing resources at the faculty level.

Acknowledgements

References

  • 1.

    Reymert I, Thune T. Task complementarity in academic work: a study of the relationship between research, education and third mission tasks among university professors. J Technol Transf. 2022;48(1):331-60. https://doi.org/10.1007/s10961-021-09916-8.

  • 2.

    Sridhar S, Dias B, Sequeira AH. Measuring Faculty Productivity - A Conceptual Review. SSRN Electronic Journal. 2010;1(1). https://doi.org/10.2139/ssrn.2285279.

  • 3.

    Norcini J, Anderson MB, Bollela V, Burch V, Costa MJ, Duvivier R, et al. 2018 Consensus framework for good assessment. Med Teach. 2018;40(11):1102-9. [PubMed ID: 30299187]. https://doi.org/10.1080/0142159X.2018.1500016.

  • 4.

    Bland CJ, Wersal L, VanLoy W, Jacott W. Evaluating faculty performance: a systematically designed and assessed approach. Acad Med. 2002;77(1):15-30. [PubMed ID: 11788318]. https://doi.org/10.1097/00001888-200201000-00006.

  • 5.

    Moore S, Kuol N. Students evaluating teachers: exploring the importance of faculty reaction to feedback on teaching. Teach High Educ. 2007;10(1):57-73. https://doi.org/10.1080/1356251052000305534.

  • 6.

Philibert I, Lieh-Lai M, Miller R, Potts JR 3rd, Brigham T, Nasca TJ. Scholarly activity in the next accreditation system: moving from structure and process to outcomes. J Grad Med Educ. 2013;5(4):714-7. [PubMed ID: 24455034]. [PubMed Central ID: PMC3886487]. https://doi.org/10.4300/JGME-05-04-43.

  • 7.

    Lewis KO, Baker RC. The development of an electronic educational portfolio: an outline for medical education professionals. Teach Learn Med. 2007;19(2):139-47. [PubMed ID: 17564541]. https://doi.org/10.1080/10401330701332219.

  • 8.

    Lewis PJ, Chertoff JD. Developing and Implementing a Web-Based Departmental Faculty Scholarly and Service Activity Database. J Am Coll Radiol. 2017;14(5):671-4. [PubMed ID: 28017268]. https://doi.org/10.1016/j.jacr.2016.10.015.

  • 9.

Collins J, Amis EJ, Beauchamp NJ, Norbash AM, Meltzer CC; Society of Chairs of Academic Radiology Departments. A guide to the external review of an academic radiology department. Acad Radiol. 2014;21(3):400-6. [PubMed ID: 24507427]. https://doi.org/10.1016/j.acra.2013.11.020.

  • 10.

    Hora MT, Bouwma-Gearhart J, Park HJ. Data driven decision-making in the era of accountability: Fostering faculty data cultures for learning. Rev High Ed. 2017;40(3):391-426. https://doi.org/10.1353/rhe.2017.0013.

  • 11.

    Almasi S, Rabiei R, Moghaddasi H, Vahidi-Asl M. Emergency Department Quality Dashboard; a Systematic Review of Performance Indicators, Functionalities, and Challenges. Arch Acad Emerg Med. 2021;9(1). e47. [PubMed ID: 34405145]. [PubMed Central ID: PMC8366462]. https://doi.org/10.22037/aaem.v9i1.1230.

  • 12.

    Eckerson WW. Performance Dashboards: Measuring, Monitoring, and Managing Your Business. New Jersey, USA: John Wiley & Sons; 2010.

  • 13.

    Rabiei R, Almasi S. Requirements and challenges of hospital dashboards: a systematic literature review. BMC Med Inform Decis Mak. 2022;22(1):287. [PubMed ID: 36348339]. [PubMed Central ID: PMC9644506]. https://doi.org/10.1186/s12911-022-02037-8.

  • 14.

    Faiola AJ, Srinivas P, Doebbeling BN. A ubiquitous situation-aware data visualization dashboard to reduce ICU clinician cognitive load. 17th International Conference on E-health Networking, Application & Services (HealthCom). Boston, MA, USA. IEEE; 2015. p. 439-42.

  • 15.

    Yilmaz Y, Carey R, Chan TM, Bandi V, Wang S, Woods RA, et al. Developing a dashboard for faculty development in competency-based training programs: a design-based research project. Can Med Educ J. 2021;12(4):48-64. [PubMed ID: 34567305]. [PubMed Central ID: PMC8463237]. https://doi.org/10.36834/cmej.72067.

  • 16.

    Schoonenboom J, Johnson RB. How to Construct a Mixed Methods Research Design. Kolner Z Soz Sozpsychol. 2017;69(Suppl 2):107-31. [PubMed ID: 28989188]. [PubMed Central ID: PMC5602001]. https://doi.org/10.1007/s11577-017-0454-1.

  • 17.

Johnson RB, Onwuegbuzie AJ, Turner LA. Toward a Definition of Mixed Methods Research. J Mix Methods Res. 2007;1(2):112-33. https://doi.org/10.1177/1558689806298224.

  • 18.

    Moher D, Liberati A, Tetzlaff J, Altman DG, Prisma Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 2009;6(7). e1000097. [PubMed ID: 19621072]. [PubMed Central ID: PMC2707599]. https://doi.org/10.1371/journal.pmed.1000097.

  • 19.

    Nowell LS, Norris JM, White DE, Moules NJ. Thematic Analysis. Int J Qual Methods. 2017;16(1):160940691773384. https://doi.org/10.1177/1609406917733847.

  • 20.

    Ericsson KA, Simon HA. Protocol Analysis, revised edition: Verbal Reports as Data. Cambridge, MA, USA: MIT Press; 1984.

  • 21.

    Fernandez A, Insfran E, Abrahão S. Usability evaluation methods for the web: A systematic mapping study. Inf Softw Technol. 2011;53(8):789-817. https://doi.org/10.1016/j.infsof.2011.02.007.

  • 22.

    Schall MC, Cullen L, Pennathur P, Chen H, Burrell K, Matthews G. Usability Evaluation and Implementation of a Health Information Technology Dashboard of Evidence-Based Quality Indicators. Comput Inform Nurs. 2017;35(6):281-8. [PubMed ID: 28005564]. https://doi.org/10.1097/CIN.0000000000000325.

  • 23.

    Silva Antunes RSD. DATUS: Dashboard Assessment Usability Model: A case study with student dashboards [master's thesis]. Lisbon, Portugal: Instituto Universitario de Lisboa; 2020.

  • 24.

    Holsapple C, Jones K. Exploring secondary activities of the knowledge chain. Knowl Process Manag. 2005;12(1):3-31. https://doi.org/10.1002/kpm.219.

  • 25.

    Zhuge H. A knowledge flow model for peer-to-peer team knowledge sharing and management. Expert Syst Appl. 2002;23(1):23-30. https://doi.org/10.1016/s0957-4174(02)00024-6.

  • 26.

    Tahvildarzadeh M, Moghaddasi H, Hosseini M. A Framework for Quality Management of University Educational Information: A Review Study. Research and Development in Medical Education. 2017;6(1):3-11. https://doi.org/10.15171/rdme.2017.002.

  • 27.

    Levina EY, Mustafina GM, Nigmetzyanova VM, Galiyev RM, Chalkina NA, Ashmarina SI, et al. Improving the Information System of University Management. Rev Eur Stud. 2014;7(1). https://doi.org/10.5539/res.v7n1p109.

  • 28.

Few S. Information Dashboard Design: Displaying Data for At-a-glance Monitoring. California, USA: Analytics Press; 2013.

  • 29.

Muntean M, Sabau G, Bologa AR, Surcel T, Florea A. Performance Dashboards for Universities. Proceedings of the 2nd International Conference on Manufacturing Engineering, Quality and Production Systems. Constantza, Romania; 2010.

  • 30.

    Carey R, Wilson G, Bandi V, Mondal D, Martin LJ, Woods R, et al. Developing a dashboard to meet the needs of residents in a competency-based training program: A design-based research project. Can Med Educ J. 2020;11(6):e31-45. [PubMed ID: 33349752]. [PubMed Central ID: PMC7749685]. https://doi.org/10.36834/cmej.69682.

  • 31.

    Wasti SP, Simkhada P, van Teijlingen ER, Sathian B, Banerjee I. The Growing Importance of Mixed-Methods Research in Health. Nepal J Epidemiol. 2022;12(1):1175-8. [PubMed ID: 35528457]. [PubMed Central ID: PMC9057171]. https://doi.org/10.3126/nje.v12i1.43633.

  • 32.

    Zhang W, Watanabe-Galloway S. Using mixed methods effectively in prevention science: designs, procedures, and examples. Prev Sci. 2014;15(5):654-62. [PubMed ID: 23801237]. https://doi.org/10.1007/s11121-013-0415-5.

  • 33.

    Zhang W, Creswell J. The use of "mixing" procedure of mixed methods in health services research. Med Care. 2013;51(8):e51-7. [PubMed ID: 23860333]. https://doi.org/10.1097/MLR.0b013e31824642fd.

  • 34.

    Almasi S, Bahaadinbeigy K, Ahmadi H, Sohrabei S, Rabiei R. Usability Evaluation of Dashboards: A Systematic Literature Review of Tools. Biomed Res Int. 2023;2023:9990933. [PubMed ID: 36874923]. [PubMed Central ID: PMC9977530]. https://doi.org/10.1155/2023/9990933.

  • 35.

van den Haak MJ, de Jong MDT. Exploring two methods of usability testing: concurrent versus retrospective think-aloud protocols. International Professional Communication Conference. Aachen, Germany. IEEE; 2003. 3 pp.

  • 36.

    Cooke L, Cuddihy E. Using eye tracking to address limitations in think-aloud protocol. International Professional Communication Conference. Aachen, Germany. IEEE; 2005. p. 653-8.

  • 37.

    Sousa VEC, Dunn Lopez K. Towards Usable E-Health. A Systematic Review of Usability Questionnaires. Appl Clin Inform. 2017;8(2):470-90. [PubMed ID: 28487932]. [PubMed Central ID: PMC6241759]. https://doi.org/10.4338/ACI-2016-10-R-0170.