Digital Cognitive Tests for Dementia Screening: A Systematic Review

authors:

Masoud Amanzadeh 1, Mahnaz Hamedan 2, Alireza Mohammadnia 3, Abdollah Mahdavi 2, *

1 Department of Health Information Management, School of Medicine, Ardabil University of Medical Sciences, Ardabil, Iran
2 Health Information Management, Department of Health Information Management, School of Medicine, Ardabil University of Medical Sciences, Ardabil, Iran
3 School of Medicine, Ardabil University of Medical Sciences, Ardabil, Iran

how to cite: Amanzadeh M, Hamedan M, Mohammadnia A, Mahdavi A. Digital Cognitive Tests for Dementia Screening: A Systematic Review. Shiraz E-Med J. 2023;24(6):e137241. https://doi.org/10.5812/semj-137241.

Abstract

Context:

The number of people with dementia is increasing dramatically. With the outbreak of the COVID-19 pandemic, digital screening tests can play a significant role in the remote and timely detection of people with dementia. This study aimed to review digital cognitive tests for dementia screening.

Methods:

We searched Web of Science, ProQuest, PubMed, Scopus, and Cochrane using related terms such as “dementia,” “mobile,” “digital,” “computer,” and “cognitive assessment,” which yielded 1,348 articles. Titles, abstracts, and full texts were screened to select the relevant articles based on inclusion/exclusion criteria. Study characteristics and digital test features, such as diagnostic performance and deployment platform, were extracted from the selected articles. The risk of bias and reporting quality of the included studies were evaluated.

Results:

Out of 1,348 identified articles, 32 were eligible for inclusion. We categorized the digital cognitive tests into three groups based on their deployment platforms: (1) mobile-based screening tests (59.4%), (2) desktop-based screening tests (28.1%), and (3) web-based screening tests (12.5%).

Conclusions:

Digital cognitive tests, especially mobile-based screening tests, facilitate the timely diagnosis of dementia. The development of AI-based screening tests and the use of technologies such as virtual reality and chatbots promise a bright future for the early detection of dementia.

1. Context

As the population ages, the number of people with dementia is increasing dramatically, imposing an enormous health and economic burden on societies (1, 2). More than 55 million people live with dementia worldwide, with nearly 10 million new cases every year. As the proportion of older people in the population is increasing in nearly every country, this number is expected to rise to 78 million in 2030 and 139 million in 2050 (3). Timely diagnosis of dementia is important for treating and managing the disease. Early detection provides access to the right services and support and helps people manage their condition, plan for the future, and live well with dementia (4-7).

Cognitive assessment is one of the methods for the early detection of dementia (8-10). Various tests, such as the Mini-Mental State Examination (MMSE) and the Montreal Cognitive Assessment (MoCA), have been developed to assess cognitive function and screen for dementia. These tests are widely used because they are non-invasive, efficient, and cost-effective (7, 8, 11-14). Studies have shown that cognitive tests have good diagnostic accuracy for detecting dementia (15, 16), and paper-based tests are therefore used routinely in healthcare organizations (17). Conventional paper-based cognitive tests detect dementia accurately but have some limitations; most importantly, their administration, scoring, interpretation, and documentation require considerable time from the healthcare provider (2, 18, 19).

Digital technology is transforming traditional pencil-and-paper approaches to cognitive testing into more objective, efficient, and sensitive methods. Digital or computerized cognitive tests have several advantages, such as enhanced scoring accuracy, immediate automated scoring and interpretation, easy access to the tests by healthcare providers, the availability of several alternative forms, and the possibility of self-administration (19-21). With the outbreak of the COVID-19 pandemic and the accompanying social restrictions, such as social distancing measures, remote service delivery has accelerated (22). In this situation, digital screening tests for assessing cognitive function can play a significant role in the remote and timely identification of people with dementia, and they can be used both by healthcare providers and by individuals themselves. Therefore, this study aimed to review digital cognitive tests for dementia screening.

2. Methods

2.1. Data Resources and Search Strategy

This systematic review was conducted according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) principles (23). We searched five electronic databases (ProQuest, Cochrane, PubMed, Scopus, and Web of Science) for relevant articles published from inception to June 2022. The search terms were categorized into three groups (Table 1). To combine them, we used the OR operator within each group and the AND operator between the groups, as illustrated in the sketch after Table 1.

Table 1. Search Terms

| Group | Query | Operator |
| --- | --- | --- |
| Group 1 (technology) | Computer* OR Cell phone* OR Mobile OR Handheld OR Application* OR Health OR m-health OR android OR iPad* OR iPhone* OR Mobile device OR Phone* OR App OR PDA OR Smart phone* OR Smartphone* OR Tablet* OR Cellular phone* OR Telephone* OR Internet OR Software OR Electronic* OR Digital OR CDSS OR Clinical decision support system OR CAD OR Computer aided OR Computer assisted OR Decision support | AND |
| Group 2 (disease) | Dementia OR Cognitive dysfunction OR Cognitive impairment* OR Cognitive decline* OR Neurocognitive disorder | AND |
| Group 3 (screening) | Assessment OR Screening OR Diagnosis OR Detect* OR Identify OR Identification OR Test OR Battery OR Batteries OR Tool | |
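
To make the Boolean logic in Table 1 concrete, the following minimal Python sketch assembles a combined query string (OR within each group, AND between groups). The term lists are abbreviated, and the exact wildcard and field-tag syntax differs between ProQuest, Cochrane, PubMed, Scopus, and Web of Science, so this is an illustration rather than the exact strings submitted to the databases.

```python
# Illustrative sketch of the Boolean logic in Table 1 (abbreviated term lists;
# actual syntax and field tags vary by database).
technology = ["Computer*", "Mobile", "Smartphone*", "Tablet*", "Digital", "Software"]
disease = ["Dementia", "Cognitive impairment*", "Cognitive decline*", "Neurocognitive disorder"]
screening = ["Assessment", "Screening", "Diagnosis", "Detect*", "Test"]


def or_block(terms):
    """Join one group of synonyms with OR and wrap it in parentheses."""
    return "(" + " OR ".join(terms) + ")"


# OR within each group, AND between the groups (Section 2.1).
query = " AND ".join(or_block(group) for group in (technology, disease, screening))
print(query)
```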

2.2. Inclusion and Exclusion Criteria

To select the relevant articles, inclusion and exclusion criteria were defined. The inclusion criteria were peer-reviewed original articles with available full text that addressed dementia diagnosis, used digital tests for dementia diagnosis, and reported the psychometric characteristics of the measure, including reliability and validity indices. Non-English articles, review articles, studies on other cognitive disorders, and articles that used only paper-based tests were excluded. After eliminating duplicate studies, two raters (MA and AM) screened the titles, abstracts, and full texts of the articles against these criteria and selected the relevant articles. Disagreements between the raters were resolved by the third author (MH).

2.3. Data Extraction

Two investigators (MA and AM) independently extracted data from the studies. The extracted data included: (1) sample size, (2) test name, (3) country of the study, (4) type of disease, (5) administration time, (6) diagnostic accuracy, (7) test validity, (8) type of platform, and (9) cognitive domains of the test. The third author (MH) resolved any disagreements between the raters.

2.4. Risk of Bias and Quality Assessment

Two raters (MA and AM) independently assessed the potential risk of bias in the selected studies using the QUADAS-2 (Quality Assessment of Diagnostic Accuracy Studies) tool, which includes four key domains: (1) patient selection, (2) index test, (3) reference standard, and (4) flow and timing (24). In addition, the quality of the studies was assessed with an ad hoc scale (Table 2) designed by adapting the STARD statement (Standards for Reporting of Diagnostic Accuracy) and the scale used by Chan et al. (25, 26). This scale includes eight domains: (1) study population, (2) selection of participants, (3) procedures of data collection, (4) reference standard, (5) cognitive domains, (6) evaluated diseases, (7) test reliability, and (8) diagnostic performance. The score of each domain ranges from 0 to 3, giving a total score between 0 and 24. Two raters independently evaluated the quality of the selected studies using this scale, and any disagreements were resolved by the third author (MH).

Table 2. Quality Assessment Scale

| Domains | Details | Scores |
| --- | --- | --- |
| Study population | Definition of the study population and details of participant recruitment | 0 = No data; 1 = Poor; 2 = Moderate; 3 = Strong |
| Selection of participants | Sampling | 0 = No data; 1 = ≤ 50 participants; 2 = 50 to 200 participants; 3 = ≥ 200 participants |
| Procedures of data collection | Explanation of the digital cognitive test and procedures of data collection | 0 = No data; 1 = Poor; 2 = Moderate; 3 = Strong |
| Reference standard | Explanation of the reference standard and its rationale | 0 = No data; 1 = Poor; 2 = Moderate; 3 = Strong |
| Cognitive domains | Assessment of cognitive domains, including memory, attention, language, executive functions, orientation, and calculation | 0 = No data; 1 = 1 domain; 2 = 2 domains; 3 = ≥ 3 domains |
| Evaluated diseases | Diseases evaluated by the digital cognitive test, including dementia, MCI, Alzheimer’s disease, and other types of dementia | 0 = No data; 1 = 1 type of disease; 2 = 2 types of disease; 3 = ≥ 3 types of disease |
| Reliability of test | Number of standard cognitive tests used to measure reliability | 0 = No data; 1 = 1 test; 2 = 2 tests; 3 = ≥ 3 tests |
| Diagnostic performance | Calculation methods of diagnostic performance (criteria such as sensitivity, specificity, and accuracy) | 0 = No data; 1 = 1 criterion; 2 = 2 criteria; 3 = ≥ 3 criteria |
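
Stated in equation form, the scoring rule of Table 2 (a restatement of the rule above, not an additional formula taken from the included studies) assigns each study a domain score $s_i \in \{0, 1, 2, 3\}$ for each of the eight domains, so that

$$\text{Total quality score} = \sum_{i=1}^{8} s_i, \qquad 0 \le \text{Total quality score} \le 24.$$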

3. Results

3.1. Study Selection

Figure 1 illustrates the study selection process. A total of 1,348 articles were identified from electronic database searching. After reviewing the titles and abstracts of the articles and excluding 1,270 duplicates and irrelevant articles, 78 studies were obtained for further eligibility assessment. After reviewing the full text of the articles, 32 were eventually selected.

3.2. Study Characteristics

The 32 included articles were published between 1994 and 2021. The United States contributed 14 articles, Japan three, Korea two, and other countries one each. The studies included a total of 38,429 participants, with an age range of 50 to 85 years. Twenty-three studies evaluated the diagnostic performance of digital tests for patients with dementia, 22 for patients with mild cognitive impairment (MCI), 8 for patients with Alzheimer’s disease, and one for patients with vascular dementia. The risk of bias in the included articles was evaluated with QUADAS-2 (Appendices 1 and 2 in the Supplementary File). Seven studies (21.8%) were rated as having a high risk of bias for flow and timing, 6 studies (18.7%) for the index test, and 2 studies (6.2%) for the reference standard.

Different platforms were used to create digital dementia screening tests. Thus, based on the operating platforms, we classified the studies into three groups: (1) Mobile-based screening tests, (2) desktop-based screening tests, and (3) web-based screening tests.

3.3. Mobile-based Screening Tests

Nineteen articles investigated mobile-based screening tests (Table 3). Eight tests were developed from existing neuropsychological tests, and nine were new, innovative cognitive tests. The CAMCI test was originally desktop-based and was later adapted as a mobile-based test, and the CADi2 test is a revised version of the CADi.

Table 3. Mobile-Based Screening Tests

| Study | Test | Time, min | Participants | Diagnostic Performance | Quality Score |
| --- | --- | --- | --- | --- | --- |
| (27) | CANTAB-PAL | 8 | N = 58 (AD = 19, MCI = 17, HC = 22) | ACC = 81.0% | 15 |
| (28) | BHA | 10 | N = 347 (HC = 185, MCI = 99, D = 42, normal with concerns = 21) | SN = 100%, SP = 85% (D vs. HC); SN = 84%, SP = 85% (MCI vs. HC) | 23 |
| (29) | e-CT | 2 | N = 325 (HC = 112, MCI = 129, AD = 84) | SN = 70.4%, SP = 78.7% (MCI vs. HC); SN = 86.1%, SP = 91.7% (AD vs. HC) | 18 |
| (30) | VSM | ND | N = 55 (HC = 21, MCI = 34) | ACC = 91.8%, SN = 89.0%, SP = 94.0% | 19 |
| (31) | dCDT | ND | N = 231 (HC = 175, AD = 29, VaD = 27) | AUC = 91.52% (HC vs. D); AUC = 76.94% (AD vs. VaD) | 19 |
| (32) | iVitality | ND | N = 151 (mean age = 57.3 years) | Moderate correlation with conventional tests, ρ = 0.3 - 0.5 (P < 0.001) | 15 |
| (33) | CAMCI | 20 | N = 263 (patients with cognitive concerns = 130, HC = 133) | SN = 80%, SP = 74% | 21 |
| (34) | CST | ND | N = 215 (HC = 104, AD = 84, MCI = 27) | ACC = 96% (D vs. HC); ACC = 91% (HC, MCI, AD, D) | 23 |
| (35) | IDEA-IADL | ND | N = 3011 | AUC = 79%, SN = 84.8%, SP = 58.4% | 19 |
| (36) | EC-Screen | 5 | N = 243 (HC = 126, MCI = 54, D = 63) | AUC = 90%, SN = 83%, SP = 83% | 21 |
| (37) | MCS | ND | N = 23 (HC = 9, age 81.78 ± 4.77; D = 14, age 72.55 ± 9.95) | Differentiated control and dementia groups with statistical significance (P < 0.05) | 12 |
| (38) | BrainCheck | 20 | N = 586 (HC = 398, D = 188), over the age of 49 | SN = 81%, SP = 94% | 21 |
| (39) | CADi2 | 10 | AD = 27 | ACC = 83%, SN = 85%, SP = 81% | 21 |
| (40) | CADi | 10 | N = 222 | SN = 96%, SP = 77% | 19 |
| (41) | CCS | 3 | N = 60 (D = 40, HC = 20) | AUC = 94%, SN = 94%, SP = 60% | 17 |
| (42) | ICA | 5 | N = 230 (HC = 95, MCI = 80, mild AD = 55) | AUC = 81% (MCI); AUC = 88% (D and AD) | 19 |
| (43) | mSTS-MCI | ND | N = 181 (HC = 107, MCI = 74) | Significant correlations with MoCA | 17 |
| (44) | SATURN | 12 | N = 60 (D = 23, HC = 37) | 83% reported that SATURN was easy to use | 19 |
| (45) | eSAGE | 10 - 15 | N = 66 (HC = 21, MCI = 24, D = 21), 50 years of age or over | SN = 71%, SP = 90% | 19 |

In 17 studies, the results of the mobile-based tests were compared with those of paper-based tests to measure validity. The MoCA (nine studies) and the MMSE (six studies) were the most frequently used paper-based reference tests.

According to the quality assessment results (Appendix 3 in the Supplementary File), the quality scores of the studies ranged from 12 to 23. The studies on the BHA, CST, CAMCI, EC-Screen, BrainCheck, and CADi2 tests received the highest quality scores, in descending order.

The diagnostic performance of the mobile-based tests in dementia screening was evaluated using various statistical criteria. Sensitivity and specificity were reported in 11 studies. The BHA and CADi tests had the highest sensitivity, with 100% and 96%, respectively, and the e-CT and eSAGE tests had the lowest sensitivity, with 70.4% and 71%, respectively. Regarding specificity, the VSM and BrainCheck tests had the highest values at 94%, and the IDEA-IADL and CCS tests had the lowest, with 58.4% and 60%, respectively. Five studies reported area under the receiver operating characteristic curve (AUC) values; the CCS (94%) and IDEA-IADL (79%) tests had the highest and lowest AUC, respectively. Four studies reported correct classification accuracy, with the CST (96%) and CANTAB-PAL (81%) tests showing the highest and lowest accuracy, respectively.
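
For reference, the criteria reported above follow the standard definitions from diagnostic test methodology (stated here for convenience, not taken from any individual included article). With TP, FP, TN, and FN denoting true positives, false positives, true negatives, and false negatives relative to the reference standard,

$$\mathrm{SN} = \frac{TP}{TP + FN}, \qquad \mathrm{SP} = \frac{TN}{TN + FP}, \qquad \mathrm{ACC} = \frac{TP + TN}{TP + TN + FP + FN},$$

and the AUC is the area under the receiver operating characteristic curve, i.e., the probability that a randomly selected affected individual receives a more abnormal test score than a randomly selected unaffected individual.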

3.4. Desktop-Based Screening Tests

Nine articles examined desktop-based screening tests (Table 4). Four tests were new, innovative cognitive tests, and four were developed from existing neuropsychological tests; one was the Brazilian version of the CANS-MCI test (46). The validity of the desktop-based tests was evaluated in four of the studies.

Table 4. Desktop-Based Screening Tests

| Study | Test | Time, min | Participants | Diagnostic Performance | Quality Score |
| --- | --- | --- | --- | --- | --- |
| (46) | CANS-MCI-BR | 30 | N = 97 (HC = 41, MCI = 35, AD = 21) | SN = 0.81, SP = 0.73 | 21 |
| (47) | C-ABC | 5 | N = 701 (HC = 134, MCI = 145, D = 422) | AUC = 0.910, 0.874, and 0.882 (in the 50s, 60s, and 70 - 85 age groups) | 21 |
| (48) | CANS-MCI | 30 | N = 310 (older adults) | SN = 0.89, SP = 0.73 | 20 |
| (49) | Cogstate | 10 | N = 263 | SN = 0.78, SP = 0.90 | 19 |
| (50) | MoCA-CC | ND | N = 181 (HC = 85, MCI = 96) | AUC = 0.97, SN = 95.8%, SP = 87.1% | 19 |
| (51) | MicroCog | 30 - 45 | N = 102 (HC = 50, MCI = 52) | SN = 98%, SP = 83% | 18 |
| (52) | VSM | ND | N = 66 (HC = 29, MCI = 10, D = 27) | SN = 70.0%, SP = 76.0% (HC vs. MCI); SN = 93.0%, SP = 85.0% (HC vs. D) | 17 |
| (53) | dCDT | ND | N = 163 (HC = 35, MCI = 69, AD = 59) | ACC = 91.42% (HC vs. AD); ACC = 83.69% (HC vs. MCI) | 16 |
| (54) | dTDT | ND | N = 187 (HC = 67, D = 56, MCI = 64) | AUC = 0.90, SN = 0.86, SP = 0.82 (HC vs. D); AUC = 0.77, SN = 0.56, SP = 0.83 (HC vs. MCI) | 15 |

According to the quality assessment (Appendix 4 in the Supplementary File), the scores of the studies ranged from 15 to 21. The studies on the C-ABC, CANS-MCI-BR, CANS-MCI, MoCA-CC, and Cogstate tests obtained the highest quality scores, in descending order.

In the nine studies, the diagnostic performance of the desktop-based tests in dementia screening was investigated mainly by sensitivity and specificity. The MicroCog and MoCA-CC tests had the highest sensitivity, with 98% and 95.8%, respectively, and the dTDT and Cogstate tests had the lowest sensitivity, with 56% and 78%, respectively. The Cogstate test had the highest specificity (90%), and the CANS-MCI and CANS-MCI-BR tests had the lowest (73%). Three studies reported AUC values: MoCA-CC (97%), dTDT (90%), and C-ABC (88.86%).

3.5. Web-Based Screening Tests

Four articles described web-based screening tests (Table 5). The MITSI-L test was based on a paper-based test called the LASSI-L, and its reported classification accuracy was 85.3%. The CNS-VS test was originally a desktop application that was converted to a web-based test; it has been translated into more than 50 languages. Its sensitivity and specificity were 90% and 85% for MCI and 90% and 94% for dementia. The Mindstreams and Co-Wis tests were new, innovative tests. According to the qualitative assessment results (Appendix 5 in the Supplementary File), the quality scores ranged from 17 to 20.

Table 5. Web-Based Screening Tests

| Study | Test | Time, min | Participants | Diagnostic Performance | Quality Score |
| --- | --- | --- | --- | --- | --- |
| (55) | CNS-VS | 30 | N = 178 (HC = 89, MCI = 36, D = 53) | SN = 0.90, SP = 0.85 (for MCI); SN = 0.90, SP = 0.94 (for D) | 20 |
| (56) | MITSI-L | 8 | N = 98 (HC = 64, MCI = 34) | ACC = 85.3% | 17 |
| (57) | Co-Wis | 10 | N = 113 | Significant correlation with the SNSB-II test | 17 |
| (58) | Mindstreams | 45 - 60 | N = 52 (HC = 22, MCI = 27) | Differentiated control and MCI groups with statistical significance | 17 |

4. Discussion

We examined 32 digital cognitive tests for dementia screening. The studies differed regarding test design, the number of participants, and the aim.

The cognitive tests were designed for mobile, desktop, and web platforms. Our examination indicated that mobile-based tests have increased significantly in recent years compared with other platforms, which Koo and Vizer also acknowledged in their study (10). Recent advances in technology, the spread of smartphones, and the unique capabilities of this technology, such as portability, ease of access, and user-friendliness, have led to most tests being designed for mobile phones. Mobile technologies allow the elderly to access these tests outside medical centers, even at home or in the workplace (45), which improves accessibility and usability. In addition, the touch screens of smartphones and tablets facilitate and accelerate data entry, making it easy for the elderly to perform these tests even with limited computer skills (59). Because of these advantages, touch technology has also been used in some desktop-based tests to facilitate data entry (46, 47, 52). A digital pen was used in two tests (53, 54).

The validity of dementia screening tests is an essential component of their acceptability (60). Accordingly, validity was measured in nearly 72% of the studies; Tsoy et al. obtained similar results in their review (61). To measure validity, the researchers compared the digital tests with conventional paper-based tests, most frequently the MoCA and the MMSE. In most cases, the results showed a high correlation between the digital tests and the standard paper-based tests.
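
As a minimal, hypothetical sketch of how such concurrent validity is typically quantified, the snippet below computes a Spearman rank correlation between simulated digital test scores and paper-based MoCA scores; it is illustrative only and does not reproduce any analysis from the included studies.

```python
# Hypothetical concurrent-validity check: Spearman correlation between a
# digital test score and a paper-based reference (e.g., MoCA). Simulated data.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 120                                      # hypothetical number of participants
moca = rng.integers(10, 31, size=n)          # paper-based MoCA scores (0 - 30 scale)
digital = moca + rng.normal(0, 3, size=n)    # digital score tracking the MoCA with noise

rho, p_value = spearmanr(digital, moca)
print(f"Spearman rho = {rho:.2f}, P = {p_value:.3g}")
```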

The reviewed tests were often developed from paper-based neuropsychological tests; only 38% were new and innovative. Digitized versions of existing cognitive tests appear to be more acceptable to physicians and perceived as more reliable than entirely new tests. The electronic version of a paper-based test can overcome the limitations of the paper version and offers various advantages, including ease of access, increased test usage, faster administration, reduced costs, automatic score calculation, and immediate access to test results (44, 56, 62). However, converting paper-based tests to electronic form can yield different results, because digitization fundamentally changes how the test is administered, especially for self-administered tests, and this can affect the results obtained. Therefore, appropriate assessments and investigations in this field are recommended. Ruggeri et al. obtained different results from the electronic and paper versions of a test and stated that, even when a paper test is directly translated, mobile-based tests require training and the development of new standards, because they must accommodate an elderly population with varying skills and familiarity with mobile technologies (63).

The length of administration is an important factor that significantly affects a test's efficiency, effectiveness, and acceptability (64). According to the results, the administration time of 38% of the reviewed tests was less than 10 minutes. However, in addition to administration time, other factors, such as diagnostic performance and the number of cognitive domains covered, also influence the efficiency of a test. For example, the administration time of the e-CT test was 2 minutes, but it assessed only one cognitive domain and had low diagnostic accuracy (29).

Most of the reviewed tests, especially the mobile-based ones, were self-administered. Self-administration requires less examiner involvement in administering and scoring the test, which makes cognitive assessment easier and more accessible. Moreover, if integrated with health information systems, self-administered tests can support telecare and the delivery of recommendations from healthcare providers (65).

The diagnostic performance of a test is one of the main determinants of its usefulness in dementia screening, and tests with high sensitivity and specificity are more acceptable. In disease screening, sensitivity is generally more important than specificity, so tests with high sensitivity may be more suitable for dementia screening; for example, the BHA test, with 100% sensitivity, appears to be a strong candidate (28). The methods and quality of the studies can affect the reported results. To account for this, we examined the studies from various aspects, including the number of participants and cognitive domains. Some studies reporting high diagnostic performance also received favorable quality scores (28, 50, 51).

Many studies evaluated the diagnostic accuracy of digital tests for patients with MCI. Some of these tests, such as the BHA, obtained acceptable results even for the diagnosis of MCI (28). However, in all of these cases, the diagnostic accuracy was lower for MCI than for dementia. Developing digital tests that can diagnose MCI could be very effective for the early detection and better management of cognitive disorders, especially dementia; consequently, research in this area has continued, particularly in recent years.

In some reviewed studies, the authors used virtual reality and machine learning techniques for cognitive testing (30, 42, 53). The application of new technologies, such as virtual reality, the Internet of Things (IoT), and chatbots, along with the development of intelligent cognitive tests, offers numerous opportunities.

4.1. Conclusions

Digital cognitive tests, especially self-administered mobile-based tests, can effectively facilitate the screening and timely diagnosis of dementia. These tests can play an important role in remote cognitive assessment and the diagnosis of dementia during the COVID-19 pandemic and similar situations. In addition, digital cognitive tests can contribute to the successful implementation of national dementia screening programs.

Diagnostic performance, administration time, ease of use, especially for the elderly and people with low computer and health literacy, ease of access, and the ability to communicate with healthcare centers and receive advice from healthcare providers are important factors that influence the acceptability and efficiency of digital tests. Therefore, in developing digital tests, attention must be paid to these factors.

References

  • 1.

    Livingston G, Huntley J, Sommerlad A, Ames D, Ballard C, Banerjee S, et al. Dementia prevention, intervention, and care: 2020 report of the Lancet Commission. Lancet. 2020;396(10248):413-46. [PubMed ID: 32738937]. [PubMed Central ID: PMC7392084]. https://doi.org/10.1016/S0140-6736(20)30367-6.

  • 2.

    Zygouris S, Gkioka M, Moraitou D, Teichmann B, Tsiatsos T, Papagiannopoulos S, et al. Assessing the Attitudes of Greek Nurses Toward Computerized Dementia Screening. J Alzheimers Dis. 2020;78(4):1575-83. [PubMed ID: 33185598]. [PubMed Central ID: PMC7836064]. https://doi.org/10.3233/JAD-200666.

  • 3.

    Dementia. World Health Organization; 2022. Available from: https://www.who.int/news-room/fact-sheets/detail/dementia#:~:text=Worldwide%2C%20around%2055%20million%20people,and%20139%20million%20in%202050.

  • 4.

    Herrero ES. Semantically steered clinical decision support system. Universidad del País Vasco-Euskal Herriko Unibertsitatea; 2014.

  • 5.

    Brodaty H, Low LF, Gibson L, Burns K. What is the best dementia screening instrument for general practitioners to use? Am J Geriatr Psychiatry. 2006;14(5):391-400. [PubMed ID: 16670243]. https://doi.org/10.1097/01.JGP.0000216181.20416.b2.

  • 6.

    Higashino S, Tsutsui T, Otaga M. Building a Population Based Cognitive Screening System to Improve the Early Diagnosis of Dementia in Japan. Int J Integr Care. 2017;17(5):51. https://doi.org/10.5334/ijic.3354.

  • 7.

    Huo Z, Lin J, Bat BKK, Chan JYC, Tsoi KKF, Yip BHK. Diagnostic accuracy of dementia screening tools in the Chinese population: a systematic review and meta-analysis of 167 diagnostic studies. Age Ageing. 2021;50(4):1093-101. [PubMed ID: 33625478]. https://doi.org/10.1093/ageing/afab005.

  • 8.

    Shamsollah A, Farhadinasab A, Noorbakhsh S. Comparison of clinical dementia rating scale and clock drawing test in elderly without dementia. J Geronto. 2017;1(3):57-67. https://doi.org/10.18869/acadpub.joge.1.3.57.

  • 9.

    Milne A. Dementia screening and early diagnosis: The case for and against. Health Risk Soc. 2010;12(1):65-76. https://doi.org/10.1080/13698570903509497.

  • 10.

    Koo BM, Vizer LM. Mobile Technology for Cognitive Assessment of Older Adults: A Scoping Review. Innov Aging. 2019;3(1):igy038. [PubMed ID: 30619948]. [PubMed Central ID: PMC6312550]. https://doi.org/10.1093/geroni/igy038.

  • 11.

    Borson S, Scanlan J, Brush M, Vitaliano P, Dokmak A. The mini-cog: a cognitive 'vital signs' measure for dementia screening in multi-lingual elderly. Int J Geriatr Psychiatry. 2000;15(11):1021-7. [PubMed ID: 11113982]. https://doi.org/10.1002/1099-1166(200011)15:11<1021::aid-gps234>3.0.co;2-6.

  • 12.

    Aiello EN, Gramegna C, Esposito A, Gazzaniga V, Zago S, Difonzo T, et al. The Montreal Cognitive Assessment (MoCA): updated norms and psychometric insights into adaptive testing from healthy individuals in Northern Italy. Aging Clin Exp Res. 2022;34(2):375-82. [PubMed ID: 34313961]. [PubMed Central ID: PMC8847194]. https://doi.org/10.1007/s40520-021-01943-7.

  • 13.

    Morris JC. Clinical dementia rating: a reliable and valid diagnostic and staging measure for dementia of the Alzheimer type. Int Psychogeriatr. 1997;9 Suppl 1:173-6. discussion 177-8. [PubMed ID: 9447441]. https://doi.org/10.1017/s1041610297004870.

  • 14.

    Hendry K, Green C, McShane R, Noel-Storr AH, Stott DJ, Anwer S, et al. AD-8 for detection of dementia across a variety of healthcare settings. Cochrane Database Syst Rev. 2019;3(3). CD011121. [PubMed ID: 30828783]. [PubMed Central ID: PMC6398085]. https://doi.org/10.1002/14651858.CD011121.pub2.

  • 15.

    Tsoi KK, Chan JY, Hirai HW, Wong SY, Kwok TC. Cognitive Tests to Detect Dementia: A Systematic Review and Meta-analysis. JAMA Intern Med. 2015;175(9):1450-8. [PubMed ID: 26052687]. https://doi.org/10.1001/jamainternmed.2015.2152.

  • 16.

    Suarez-Araujo CP, Garcia Baez P, Cabrera-Leon Y, Prochazka A, Rodriguez Espinosa N, Fernandez Viadero C, et al. A Real-Time Clinical Decision Support System, for Mild Cognitive Impairment Detection, Based on a Hybrid Neural Architecture. Comput Math Methods Med. 2021;2021:5545297. [PubMed ID: 34257699]. [PubMed Central ID: PMC8257364]. https://doi.org/10.1155/2021/5545297.

  • 17.

    Alzheimer's Association. 2019 Alzheimer's disease facts and figures. Alzheimers Dement. 2019;15(3):321-87. https://doi.org/10.1016/j.jalz.2019.01.010.

  • 18.

    Gualtieri CT. Dementia screening using computerized tests. J Insur Med. 2004;36(3):213-27. [PubMed ID: 15495437].

  • 19.

    Robillard JM, Lai JA, Wu JM, Feng TL, Hayden S. Patient perspectives of the experience of a computerized cognitive assessment in a clinical setting. Alzheimers Dement (N Y). 2018;4:297-303. [PubMed ID: 30090850]. [PubMed Central ID: PMC6077833]. https://doi.org/10.1016/j.trci.2018.06.003.

  • 20.

    Zygouris S, Tsolaki M. New Technologies and Neuropsychological Evaluation of Older Adults. Virtual and Augmented Reality: Concepts, Methodologies, Tools, and Applications. IGI Global; 2018. p. 1762-79.

  • 21.

    Parsons TD, McMahan T, Kane R. Practice parameters facilitating adoption of advanced technologies for enhancing neuropsychological assessment paradigms. Clin Neuropsychol. 2018;32(1):16-41. [PubMed ID: 28590154]. https://doi.org/10.1080/13854046.2017.1337932.

  • 22.

    Sathish R, Manikandan R, Silvia Priscila S, Sara BV, Mahaveerakannan R. A Report on the Impact of Information Technology and Social Media on Covid–19. 3rd International Conference on Intelligent Sustainable Systems (ICISS). IEEE; 2020. p. 224-30.

  • 23.

    Moher D, Liberati A, Tetzlaff J, Altman DG, Prisma Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Ann Intern Med. 2009;151(4):264-9. W64. [PubMed ID: 19622511]. https://doi.org/10.7326/0003-4819-151-4-200908180-00135.

  • 24.

    Whiting P, Rutjes A, Westwood M, Mallett S, Leeflang M, Reitsma H, et al. Updating QUADAS: Evidence to inform the development of QUADAS-2. 2014. Available from: http://www.bris.ac.uk/media-library/sites/quadas/migrated/documents/quadas2reportv4.pdf.

  • 25.

    Chan JYC, Kwong JSW, Wong A, Kwok TCY, Tsoi KKF. Comparison of Computerized and Paper-and-Pencil Memory Tests in Detection of Mild Cognitive Impairment and Dementia: A Systematic Review and Meta-analysis of Diagnostic Studies. J Am Med Dir Assoc. 2018;19(9):748-756 e5. [PubMed ID: 29921507]. https://doi.org/10.1016/j.jamda.2018.05.010.

  • 26.

    Bossuyt PM, Reitsma JB, Bruns DE, Gatsonis CA, Glasziou PP, Irwig LM, et al. The STARD statement for reporting studies of diagnostic accuracy: explanation and elaboration. Ann Intern Med. 2003;138(1):W1-12. [PubMed ID: 12513067]. https://doi.org/10.7326/0003-4819-138-1-200301070-00012-w1.

  • 27.

    Junkkila J, Oja S, Laine M, Karrasch M. Applicability of the CANTAB-PAL computerized memory test in identifying amnestic mild cognitive impairment and Alzheimer's disease. Dement Geriatr Cogn Disord. 2012;34(2):83-9. [PubMed ID: 22922741]. https://doi.org/10.1159/000342116.

  • 28.

    Possin KL, Moskowitz T, Erlhoff SJ, Rogers KM, Johnson ET, Steele NZR, et al. The Brain Health Assessment for Detecting and Diagnosing Neurocognitive Disorders. J Am Geriatr Soc. 2018;66(1):150-6. [PubMed ID: 29355911]. [PubMed Central ID: PMC5889617]. https://doi.org/10.1111/jgs.15208.

  • 29.

    Wu YH, Vidal JS, de Rotrou J, Sikkes SAM, Rigaud AS, Plichart M. Can a tablet-based cancellation test identify cognitive impairment in older adults? PLoS One. 2017;12(7). e0181809. [PubMed ID: 28742136]. [PubMed Central ID: PMC5524401]. https://doi.org/10.1371/journal.pone.0181809.

  • 30.

    Zygouris S, Giakoumis D, Votis K, Doumpoulakis S, Ntovas K, Segkouli S, et al. Can a virtual reality cognitive training application fulfill a dual role? Using the virtual supermarket cognitive training application as a screening tool for mild cognitive impairment. J Alzheimers Dis. 2015;44(4):1333-47. [PubMed ID: 25428251]. https://doi.org/10.3233/JAD-141260.

  • 31.

    Davoudi A, Dion C, Amini S, Tighe PJ, Price CC, Libon DJ, et al. Classifying Non-Dementia and Alzheimer's Disease/Vascular Dementia Patients Using Kinematic, Time-Based, and Visuospatial Parameters: The Digital Clock Drawing Test. J Alzheimers Dis. 2021;82(1):47-57. [PubMed ID: 34219737]. [PubMed Central ID: PMC8283934]. https://doi.org/10.3233/JAD-201129.

  • 32.

    Jongstra S, Wijsman LW, Cachucho R, Hoevenaar-Blom MP, Mooijaart SP, Richard E. Cognitive Testing in People at Increased Risk of Dementia Using a Smartphone App: The iVitality Proof-of-Principle Study. JMIR Mhealth Uhealth. 2017;5(5). e68. [PubMed ID: 28546139]. [PubMed Central ID: PMC5465383]. https://doi.org/10.2196/mhealth.6939.

  • 33.

    Tierney MC, Naglie G, Upshur R, Moineddin R, Charles J, Jaakkimainen RL. Feasibility and validity of the self-administered computerized assessment of mild cognitive impairment with older primary care patients. Alzheimer Dis Assoc Disord. 2014;28(4):311-9. [PubMed ID: 24614274]. https://doi.org/10.1097/WAD.0000000000000036.

  • 34.

    Dougherty JJ, Cannon RL, Nicholas CR, Hall L, Hare F, Carr E, et al. The computerized self test (CST): an interactive, internet accessible cognitive screening test for dementia. J Alzheimers Dis. 2010;20(1):185-95. [PubMed ID: 20164591]. https://doi.org/10.3233/JAD-2010-1354.

  • 35.

    Paddick SM, Yoseph M, Gray WK, Andrea D, Barber R, Colgan A, et al. Effectiveness of App-Based Cognitive Screening for Dementia by Lay Health Workers in Low Resource Settings. A Validation and Feasibility Study in Rural Tanzania. J Geriatr Psychiatry Neurol. 2021;34(6):613-21. [PubMed ID: 32964799]. [PubMed Central ID: PMC8600584]. https://doi.org/10.1177/0891988720957105.

  • 36.

    Chan JYC, Wong A, Yiu B, Mok H, Lam P, Kwan P, et al. Electronic Cognitive Screen Technology for Screening Older Adults With Dementia and Mild Cognitive Impairment in a Community Setting: Development and Validation Study. J Med Internet Res. 2020;22(12). e17332. [PubMed ID: 33337341]. [PubMed Central ID: PMC7775823]. https://doi.org/10.2196/17332.

  • 37.

    Zorluoglu G, Kamasak ME, Tavacioglu L, Ozanar PO. A mobile application for cognitive screening of dementia. Comput Methods Programs Biomed. 2015;118(2):252-62. [PubMed ID: 25481217]. https://doi.org/10.1016/j.cmpb.2014.11.004.

  • 38.

    Groppell S, Soto-Ruiz KM, Flores B, Dawkins W, Smith I, Eagleman DM, et al. A Rapid, Mobile Neurocognitive Screening Test to Aid in Identifying Cognitive Impairment and Dementia (BrainCheck): Cohort Study. JMIR Aging. 2019;2(1). e12615. [PubMed ID: 31518280]. [PubMed Central ID: PMC6715071]. https://doi.org/10.2196/12615.

  • 39.

    Onoda K, Yamaguchi S. Revision of the Cognitive Assessment for Dementia, iPad version (CADi2). PLoS One. 2014;9(10). e109931. [PubMed ID: 25310860]. [PubMed Central ID: PMC4195614]. https://doi.org/10.1371/journal.pone.0109931.

  • 40.

    Onoda K, Hamano T, Nabika Y, Aoyama A, Takayoshi H, Nakagawa T, et al. Validation of a new mass screening tool for cognitive impairment: Cognitive Assessment for Dementia, iPad version. Clin Interv Aging. 2013;8:353-60. [PubMed ID: 23569368]. [PubMed Central ID: PMC3615850]. https://doi.org/10.2147/CIA.S42342.

  • 41.

    Scanlon L, O'Shea E, O'Caoimh R, Timmons S. Usability and Validity of a Battery of Computerised Cognitive Screening Tests for Detecting Cognitive Impairment. Gerontology. 2016;62(2):247-52. [PubMed ID: 26113397]. https://doi.org/10.1159/000433432.

  • 42.

    Kalafatis C, Modarres MH, Apostolou P, Marefat H, Khanbagi M, Karimi H, et al. Validity and Cultural Generalisability of a 5-Minute AI-Based, Computerised Cognitive Assessment in Mild Cognitive Impairment and Alzheimer's Dementia. Front Psychiatry. 2021;12:706695. [PubMed ID: 34366938]. [PubMed Central ID: PMC8339427]. https://doi.org/10.3389/fpsyt.2021.706695.

  • 43.

    Park JH, Jung M, Kim J, Park HY, Kim JR, Park JH. Validity of a novel computerized screening test system for mild cognitive impairment. Int Psychogeriatr. 2018;30(10):1455-63. [PubMed ID: 29923471]. https://doi.org/10.1017/S1041610218000923.

  • 44.

    Bissig D, Kaye J, Erten-Lyons D. Validation of SATURN, a free, electronic, self-administered cognitive screening test. Alzheimers Dement (N Y). 2020;6(1). e12116. [PubMed ID: 33392382]. [PubMed Central ID: PMC7771179]. https://doi.org/10.1002/trc2.12116.

  • 45.

    Scharre DW, Chang SI, Nagaraja HN, Vrettos NE, Bornstein RA. Digitally translated Self-Administered Gerocognitive Examination (eSAGE): relationship with its validated paper version, neuropsychological evaluations, and clinical assessments. Alzheimers Res Ther. 2017;9(1):44. [PubMed ID: 28655351]. [PubMed Central ID: PMC5488440]. https://doi.org/10.1186/s13195-017-0269-3.

  • 46.

    Memoria CM, Yassuda MS, Nakano EY, Forlenza OV. Contributions of the Computer-Administered Neuropsychological Screen for Mild Cognitive Impairment (CANS-MCI) for the diagnosis of MCI in Brazil. Int Psychogeriatr. 2014:1-9. [PubMed ID: 24806666]. https://doi.org/10.1017/S1041610214000726.

  • 47.

    Noguchi-Shinohara M, Domoto C, Yoshida T, Niwa K, Yuki-Nozaki S, Samuraki-Yokohama M, et al. A new computerized assessment battery for cognition (C-ABC) to detect mild cognitive impairment and dementia around 5 min. PLoS One. 2020;15(12). e0243469. [PubMed ID: 33306697]. [PubMed Central ID: PMC7732101]. https://doi.org/10.1371/journal.pone.0243469.

  • 48.

    Tornatore JB, Hill E, Laboff JA, McGann ME. Self-administered screening for mild cognitive impairment: initial validation of a computerized test battery. J Neuropsychiatry Clin Neurosci. 2005;17(1):98-105. [PubMed ID: 15746489]. [PubMed Central ID: PMC1559991]. https://doi.org/10.1176/jnp.17.1.98.

  • 49.

    Darby DG, Pietrzak RH, Fredrickson J, Woodward M, Moore L, Fredrickson A, et al. Intraindividual cognitive decline using a brief computerized cognitive screening test. Alzheimers Dement. 2012;8(2):95-104. [PubMed ID: 22404851]. https://doi.org/10.1016/j.jalz.2010.12.009.

  • 50.

    Yu K, Zhang S, Wang Q, Wang X, Qin Y, Wang J, et al. Development of a computerized tool for the chinese version of the montreal cognitive assessment for screening mild cognitive impairment. Int Psychogeriatr. 2014:1-7. [PubMed ID: 25362894]. https://doi.org/10.1017/S1041610214002269.

  • 51.

    Green RC, Green J, Harrison JM, Kutner MH. Screening for cognitive impairment in older individuals. Validation study of a computer-based test. Arch Neurol. 1994;51(8):779-86. [PubMed ID: 8042926]. https://doi.org/10.1001/archneur.1994.00540200055017.

  • 52.

    Maki Y, Yoshida H, Yamaguchi H. Computerized visuo-spatial memory test as a supplementary screening test for dementia. Psychogeriatrics. 2010;10(2):77-82. [PubMed ID: 20738811]. https://doi.org/10.1111/j.1479-8301.2010.00320.x.

  • 53.

    Binaco R, Calzaretto N, Epifano J, McGuire S, Umer M, Emrani S, et al. Machine Learning Analysis of Digital Clock Drawing Test Performance for Differential Classification of Mild Cognitive Impairment Subtypes Versus Alzheimer's Disease. J Int Neuropsychol Soc. 2020;26(7):690-700. [PubMed ID: 32200771]. https://doi.org/10.1017/S1355617720000144.

  • 54.

    Robens S, Heymann P, Gienger R, Hett A, Muller S, Laske C, et al. The Digital Tree Drawing Test for Screening of Early Dementia: An Explorative Study Comparing Healthy Controls, Patients with Mild Cognitive Impairment, and Patients with Early Dementia of the Alzheimer Type. J Alzheimers Dis. 2019;68(4):1561-74. [PubMed ID: 30909229]. https://doi.org/10.3233/JAD-181029.

  • 55.

    Gualtieri CT, Johnson LG. Neurocognitive testing supports a broader concept of mild cognitive impairment. Am J Alzheimers Dis Other Demen. 2005;20(6):359-66. [PubMed ID: 16396441]. https://doi.org/10.1177/153331750502000607.

  • 56.

    Curiel RE, Crocco E, Rosado M, Duara R, Greig MT, Raffo A, et al. A Brief Computerized Paired Associate Test for the Detection of Mild Cognitive Impairment in Community-Dwelling Older Adults. J Alzheimers Dis. 2016;54(2):793-9. [PubMed ID: 27567839]. [PubMed Central ID: PMC5610962]. https://doi.org/10.3233/JAD-160370.

  • 57.

    Song SI, Jeong HS, Park JP, Kim JY, Bai DS, Kim GH, et al. A Study of the Effectiveness Verification of Computer-Based Dementia Assessment Contents (Co-Wis): Non-Randomized Study. Appl Sci. 2020;10(5):1579. https://doi.org/10.3390/app10051579.

  • 58.

    Doniger GM, Jo MY, Simon ES, Crystal HA. Computerized cognitive assessment of mild cognitive impairment in urban African Americans. Am J Alzheimers Dis Other Demen. 2009;24(5):396-403. [PubMed ID: 19700670]. https://doi.org/10.1177/1533317509342982.

  • 59.

    Thabtah F, Mampusti E, Peebles D, Herradura R, Varghese J. A Mobile-Based Screening System for Data Analyses of Early Dementia Traits Detection. J Med Syst. 2019;44(1):24. [PubMed ID: 31828523]. https://doi.org/10.1007/s10916-019-1469-0.

  • 60.

    Aldridge VK, Dovey TM, Wade A. Assessing Test-Retest Reliability of Psychological Measures. Eur Psychol. 2017;22(4):207-18. https://doi.org/10.1027/1016-9040/a000298.

  • 61.

    Tsoy E, Zygouris S, Possin KL. Current State of Self-Administered Brief Computerized Cognitive Assessments for Detection of Cognitive Disorders in Older Adults: A Systematic Review. J Prev Alzheimers Dis. 2021;8(3):267-76. [PubMed ID: 34101783]. [PubMed Central ID: PMC7987552]. https://doi.org/10.14283/jpad.2021.11.

  • 62.

    Rodriguez-Salgado AM, Llibre-Guerra JJ, Tsoy E, Penalver-Guia AI, Bringas G, Erlhoff SJ, et al. A Brief Digital Cognitive Assessment for Detection of Cognitive Impairment in Cuban Older Adults. J Alzheimers Dis. 2021;79(1):85-94. [PubMed ID: 33216033]. [PubMed Central ID: PMC8216130]. https://doi.org/10.3233/JAD-200985.

  • 63.

    Ruggeri K, Maguire A, Andrews JL, Martin E, Menon S. Are We There Yet? Exploring the Impact of Translating Cognitive Tests for Dementia Using Mobile Technology in an Aging Population. Front Aging Neurosci. 2016;8:21. [PubMed ID: 27014053]. [PubMed Central ID: PMC4794643]. https://doi.org/10.3389/fnagi.2016.00021.

  • 64.

    Rentz DM, Dekhtyar M, Sherman J, Burnham S, Blacker D, Aghjayan SL, et al. The Feasibility of At-Home iPad Cognitive Testing For Use in Clinical Trials. J Prev Alzheimers Dis. 2016;3(1):8-12. [PubMed ID: 26998469]. [PubMed Central ID: PMC4795477]. https://doi.org/10.14283/jpad.2015.78.

  • 65.

    Sabbagh MN, Boada M, Borson S, Chilukuri M, Dubois B, Ingram J, et al. Early Detection of Mild Cognitive Impairment (MCI) in Primary Care. J Prev Alzheimers Dis. 2020;7(3):165-70. [PubMed ID: 32463069]. https://doi.org/10.14283/jpad.2020.21.