Systematic Review

Systematic Review of Instruments to Assess Computational Thinking in Early Years of Schooling

by Lina Marcela Ocampo 1,*, Milena Corrales-Álvarez 1, Sergio Augusto Cardona-Torres 1 and María Zapata-Cáceres 2
1 Faculty of Engineering, Universidad del Quindío, Armenia 630001, Colombia
2 Department of Computer Science and Statistics, Universidad Rey Juan Carlos, Móstoles, 28933 Madrid, Spain
* Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(10), 1124; https://doi.org/10.3390/educsci14101124
Submission received: 14 July 2024 / Revised: 9 September 2024 / Accepted: 11 September 2024 / Published: 16 October 2024
(This article belongs to the Special Issue Measuring Children’s Computational Thinking Skills)

Abstract

Computational thinking (CT) is considered a key competence in today’s digital era. It is an emerging construct that relates to critical thinking and creativity, and research on its assessment is still being consolidated. This systematic review aims to analyze studies that have used CT assessment instruments with children and adolescents aged 4 to 16 years in order to identify which variables they assess and their psychometric properties. The search and analysis were carried out following the PRISMA statement protocol, analyzing 50 articles published between 2006 and March 2024. An increase in the publication of CT measurement instruments is observed, with 54% of them supported by evidence of validity and 88% by evidence of reliability; construct validity predominates, followed by content and criterion validity. China leads in the number of publications, while Asia and Europe concentrate most of the research. The contribution from South America is minimal, evidencing the lack of participation from Central and South American countries in this field of study.

1. Introduction

Computational thinking (CT) has been established as a key skill in today’s digital era [1,2], relating to critical thinking and creativity [3]. It is considered an emerging construct that is still being consolidated from theoretical, operational, and measurement perspectives. The lack of consensus on its operationalization means that it is implemented in various ways in the student learning process and hinders the development of assessment methodologies and tools [4].
The literature shows a diversity of approaches aimed at fostering the development of CT through integration strategies in the educational environment. At the same time, it highlights the importance of deepening the understanding of teaching and assessment processes from the early years of schooling [5].
In the article “Computational Thinking” [6], the conceptual foundations and fundamental characteristics of CT are established; it is conceived as a process that involves problem-solving, system design, and understanding human behavior through the basic principles of computer science. Román-González’s work [7] outlines CT definitions from different perspectives: generic, operational, psychological-cognitive, and educational-curricular. It also presents an operational definition, stating that CT involves the ability to approach problems using the fundamental principles of computing and the logic of programming languages. Likewise, Tang and collaborators [8] propose a classification of definitions based on two aspects: the first focuses on characteristics linked to programming and computational concepts, while the second refers to the skills and competencies students must acquire in specific areas of knowledge and the ability to effectively tackle problems.
In the educational field, there are various frameworks for introducing CT into school curricula, including the three-dimensional framework by Karen Brennan and Mitchel Resnick [9] and the CSTA Standards Task Force [10]. These provide a structure that guides the teaching of this construct. This integration can be transversal, incorporating it into various curricular areas, or independent, assigning it as an autonomous subject or study area. This has shown the need to implement assessment methodologies and tools with validity and reliability [4] that can adapt to various purposes and uses [11]. Additionally, some research concludes that there are few agreements on the strategies to evaluate it, and few instruments have psychometric properties for use with young children [8,9,12,13,14,15].
As CT is a recent field of study, there are various assessment proposals in different formats, involving students in different types of activities and educational contexts [14]. These include questionnaires [13,16,17,18], task-based tests [19], coding tests [20], observation [21], and one-on-one interviews [22]. On the other hand, Poulakis and Politis [23], based on their systematic review findings, classify CT assessment approaches into three categories: (1) assessment through specific programming environments, (2) evaluation criteria and psychometric instruments, and (3) multiple assessment methods, consisting of project portfolios, participant observation, and artifact-based student interviews.
Various studies express the need for multiple or complementary assessment systems [24,25] that address both cognitive and non-cognitive aspects of CT to obtain a comprehensive view of student learning. Grover and Pea [24] analyzed the use of an assessment system encompassing multiple strategies to measure the growth of algorithmic thinking skills and their transfer, as well as non-cognitive aspects, such as perceptions of computer science and communication of CT ideas.
Román and collaborators [25] also emphasize the importance of employing more than one tool to obtain a comprehensive assessment of CT. They classify assessments into several types: diagnostic assessments to identify areas for improvement, summative assessments to measure final performance, formative-iterative assessments to adjust teaching during the process, data mining assessments to analyze learning patterns, skill transfer assessments to evaluate practical application, perception assessments to understand attitudes toward CT, and vocabulary assessments to evaluate the mastery of technical terms related to the construct.
This systematic review aims to analyze research that has used CT assessment instruments with children and adolescents aged 4 to 16 years in order to identify which variables they assess and their psychometric properties.

2. Materials and Methods

The search and analysis of the literature were carried out considering the protocol of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA in Supplementary Materials) statement [26,27]. The review protocol includes eligibility criteria, sources of information and search strategy, study selection, data extraction, and information elements.
The following research questions were posed, associated with both bibliometric and interest variables:
Q1. What bibliometric characteristics are identified in the analyzed articles?
Q2. Which instruments to measure CT have been used in their original version or have adjustments?
Q3. What are the ages targeted by the CT assessment instruments?
Q4. What is the number of items and most commonly used response options?
Q5. What are the elements related to CT evaluated in the analyzed instruments?
Q6. What psychometric properties are evidenced in the instruments, and what methods were used for the analysis of psychometric properties?
Through the analysis of the articles, these questions are answered by identifying for the bibliometric variables: the source of information, the year of publication, common words in the article title and keywords, the co-citation network of authors, and the country where the study was conducted; and regarding the interest variables: the instruments used, the population under study, the number of items and response options, the theoretical basis used, the elements related to CT evaluated, the evidence of reliability and validity, the methods used to determine validity, and the reliability coefficients implemented. Finally, techniques for data analysis and visualization were applied.
a. Eligibility Criteria
Research articles published in scientific journals from January 2006 to March 2024 were included. Articles that assess the construct using instruments in children and adolescents aged 4 to 16 years were considered. Articles addressing other topics or evaluating CT using a tool other than an instrument (surveys, observation rubrics, interviews) were excluded. Duplicate articles, editorials, reviews, books, abstracts, and conference proceedings were also excluded.
b. Databases, Information Sources, and Search Strategy
Research articles were searched in Scopus and Web of Science (WoS). Selection and extraction were carried out in parallel by two researchers during March 2024. Based on the review objective and research questions, CT was established as the axial category, and evaluation and the population under study as complementary categories. Synonyms for the terms were searched, and the following search string was obtained: “Computational Thinking” (Topic) AND assessment OR evaluat* OR instrument OR measur* OR apprais* OR scale OR test (Topic) AND “elementary school” OR “primary school” OR “primary education” OR “elementary education” OR kindergarten OR “early childhood education” (Topic) AND Article (Document Types). The search strategy was documented transparently and reproducibly to allow others to replicate the process.
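As an illustration of how this search string could be assembled and encoded for each database, the following Python sketch reproduces the documented concept groups; the field tags (TS= for WoS, TITLE-ABS-KEY and DOCTYPE(ar) for Scopus) are assumptions about the usual topic-search syntax and should be verified against each platform’s documentation.

```python
# Sketch: assemble the documented search string for the two databases.
# The three concept groups come from the review protocol; the field tags
# (TS= for WoS, TITLE-ABS-KEY for Scopus) and DOCTYPE(ar) are assumptions
# about the usual syntax, not taken from the original article.

ct_terms = ['"Computational Thinking"']
assessment_terms = ["assessment", "evaluat*", "instrument", "measur*",
                    "apprais*", "scale", "test"]
population_terms = ['"elementary school"', '"primary school"', '"primary education"',
                    '"elementary education"', "kindergarten", '"early childhood education"']

def or_group(terms):
    """Join the synonyms of one concept with OR and wrap them in parentheses."""
    return "(" + " OR ".join(terms) + ")"

topic_query = " AND ".join(or_group(g) for g in (ct_terms, assessment_terms, population_terms))

wos_query = f"TS={topic_query}"                                 # Web of Science topic search
scopus_query = f"TITLE-ABS-KEY{topic_query} AND DOCTYPE(ar)"    # Scopus, limited to articles

print(wos_query)
print(scopus_query)
```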
c. Study Selection
When searching WoS and Scopus, 398 articles were found for the systematic literature review. After removing duplicates (148 articles), 250 articles were obtained for analysis. Applying the inclusion and exclusion criteria, 171 articles were discarded, leading to a sample of 42 studies. Additionally, 8 articles that met the inclusion criteria of this review, and were mostly referenced in the 42 articles obtained in the search, were included directly, resulting in a total of 50 articles comprising this review. The flow diagram for literature selection is presented in Figure 1.
d. Data Extraction and Information Elements
Once the selection process was completed, all 50 articles were thoroughly analyzed, extracting data of interest for the present study [28], including the following elements: title and research objective, year of publication, abstract, keywords, country where the research was conducted, publishing journal, name of the instrument for assessing CT, elements related to CT evaluated, theoretical basis for instrument development, ages or school level of evaluated students, number of items, type of item responses, and evidence of reliability and validity.
e. Bibliometric Analysis and Data Visualization
Based on the 50 articles comprising the sample, an analysis was conducted for each study variable. The analysis approached the information based on: (1) Bibliometric variables: information source, year of publication, article title, keywords, authors, and country where the research was conducted; and (2) Variables of interest: population under study, number of items, response options, authors providing theoretical basis for the instrument, elements related to CT evaluated, evidence of reliability and validity, and methods used to determine them. VOSviewer [29] was used for bibliometric analysis.
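To illustrate the kind of counting that underlies keyword co-occurrence maps such as the one produced here, a minimal sketch is shown below; the keyword lists are hypothetical placeholders, and VOSviewer additionally normalizes terms and clusters the resulting network.

```python
from collections import Counter
from itertools import combinations

# Hypothetical keyword lists, one per article, standing in for the titles and
# keywords extracted from the 50 studies.
articles = [
    ["computational thinking", "assessment", "primary school"],
    ["computational thinking", "validity", "test"],
    ["assessment", "primary school", "test"],
]

term_counts = Counter()
pair_counts = Counter()
for keywords in articles:
    unique_terms = sorted(set(keywords))
    term_counts.update(unique_terms)
    pair_counts.update(combinations(unique_terms, 2))  # unordered pairs in the same article

# Keep terms above a minimum frequency, analogous to the threshold of 5 used for Figure 4.
MIN_OCCURRENCES = 2
frequent_terms = {t for t, n in term_counts.items() if n >= MIN_OCCURRENCES}
links = {pair: n for pair, n in pair_counts.items() if set(pair) <= frequent_terms}

print(frequent_terms)
print(links)  # co-occurrence counts, the raw input for a map such as Figure 4
```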

3. Results

3.1. Analysis of Bibliometric Variables

Information Source. The search was conducted in two databases, WoS and Scopus. Fifty articles were identified for this systematic literature review, and both sources contributed approximately equal numbers of articles.
Year of publication. Article selection was conducted from 2006 to March 2024 (18 years). Although 2006 was the starting year, the search revealed that the first publication related to assessment instruments appeared in 2017 with two articles [7,30], and in 2023, the highest number of publications was reached, with fourteen articles [18,31,32,33,34,35,36,37,38,39,40,41,42,43], as evidenced in Figure 2.
Title of the article and keywords. The analysis of article titles and keywords revealed that the words “computational”, “thinking”, “test”, “education”, “primary”, and “school” are most frequently repeated, as shown in Figure 3. These results confirm that the articles are related to the objective set for this review.
In Figure 4, a map based on the words from the titles and keywords of the articles is shown. For this, terms that appeared at least 5 times out of a total of 70 words were considered. The graph reveals 5 groups, composed of 32, 15, 11, 9, and 3 elements, respectively. Group 1 includes words such as “assessment”, “validity”, “tool”, and “effect”, among others (links 62, total link strength 655).
Co-citation of authors. Figure 5 refers to the co-citation network of authors. Authors who appeared most frequently were Román-González, M. [7,17,44,45,46], Zapata-Cáceres, M. [17,46,47,48,49], and Martín-Barroso, E. [17,46,47,48,49], each in 5 articles. El-Hamamsy, L. [46,47,49] appeared in 3 articles, and Butz, M. [44] appeared in one article. Four groups were identified: the first group with 6 elements is led by El-Hamamsy, L., and Zapata-Cáceres, M. (links 9, total link strength 14). The second group with 6 elements is led by Butz, M. (links 6, total link strength 6). The third group with 5 elements is led by Román-González, M. (links 16, total link strength 17). The fourth group with 3 elements includes Martín-Barroso, E. (links 62, total link strength 72). Additionally, a publication authored by Román-González, M., Zapata-Cáceres, M., El-Hamamsy, L., and Martín-Barroso, E. was identified [46].
Country where the study was conducted. The country with the most publications is China with nineteen articles, followed by Spain and the United States with eight each, Turkey with four, and Switzerland with two publications. Finally, Germany, Austria, France, Indonesia, Italy, the Netherlands, Portugal, the Czech Republic, and Uruguay each have one publication. Regarding continents, Asia and Europe have the highest representation, as shown in Figure 6.

3.2. Analysis of Variables of Interest

In the review of the 50 articles, a total of 22 different instruments for assessing CT were identified, which are presented in Table 1.
Furthermore, in 11 articles, adjustments were made to some of these instruments, such as translation, expert judgment, pilot testing, and/or verification of psychometric properties. This results in a total of 33 different instruments for measuring CT skills. Table 2 presents the instruments used as a basis in the research and the adjustments made.
It was found that the instruments BCTt [17], CTt [7], CTS [67], and TechCheck [13] are the most referenced in other research for assessing CT, either directly or with adjustments.
Furthermore, several studies used questions from different years and countries of the Bebras Tasks to construct instruments that allowed them to measure CT skills of interest for the study [19,34,36,59,74,75]. Additionally, one article uses questions from a repository at Peking University [76] to assess this construct.
It should be noted that several studies used or adjusted more than one instrument. Relkin, Johnson, and Bers [31] used five instruments in their research (TechCheck, TechCheck-k, TechCheck-1, TechCheck-2, TechCheck-PreK); El-Hamamsy et al. [46] adjusted two instruments (BCTt and cCTt); Ma et al. [65] adjusted two instruments (CTt and CTS).
Target population. In the various reviewed studies, age ranges varied between 4 and 16 years. Table 3 shows the ages at which the 33 identified instruments were applied. Appendix A presents the age table for the 50 analyzed articles.
In Figure 7, the age distribution of the 50 studies included in the review and the 33 studies where distinct instruments to assess CT were identified is shown. It can be observed that there is a greater availability of instruments in the age range of 8 to 11 years.
Number of Items. From the reviewed studies, it was identified that the “CT Practices Test”, used in two studies [32,62], has the highest number of items, with a total of forty-seven. On the other hand, both the “Assessment through a card-based game” [19] and the “Early Assessment” [56] have the fewest items, each with nine in total.
Response Options. Regarding the distribution of response formats used in the instruments, multiple-choice is the most implemented format, used in 68% of the reviewed articles, followed by the 5-point Likert scale in 20% of the articles. Open-ended questions combined with multiple-choice and fill-in-the-blank short responses are each used in 3% of the articles. Additionally, the response format was not specified in 6% of the cases. It is important to note that two instruments have two different response formats: the CT questionnaire [72], which includes fill-in-the-blank and short responses, and the Computational Thinking Test using Bebras Problems [36], which includes both open-ended and multiple-choice questions.
Theoretical Foundation. Figure 8 presents the authors and/or associations that have contributed to the construction of the instruments identified in the review, providing theoretical frameworks, definitions, and taxonomies. The most widely used model is the “3D” model proposed by Brennan and Resnick [9].
Skills, attitudes, and/or computational concepts evaluated. The most commonly evaluated skills identified are abstraction, debugging and evaluation, problem-solving, algorithms, and algorithmic thinking; likewise, the most commonly evaluated computational concepts are sequences, loops, and conditionals. Additionally, attitudes such as cooperation and creativity are also considered. The frequency with which these categories are evaluated is shown in Figure 9.
In most of the research studies, skills are evaluated, followed by concepts, attitudes, and perspectives. Some studies also evaluate skills and concepts, or skills and attitudes, while only one study evaluates skills, concepts, and perspectives, as shown in Figure 10.
Table 4 presents the skills, concepts, perspectives, and attitudes assessed in the 33 different instruments identified for evaluating the CT.
Evidence of reliability and validity, and methods used to determine them. Table 5 details the 50 articles, along with the instrument used, and also provides information on the validity and reliability evidence mentioned in the articles.
It was found that 54% of the reviewed articles have evidence of validity, and 88% present evidence of reliability. Among the validity types, construct validity was the most common, reported in 21 articles, followed by content validity in 17 articles and criterion validity in 13 articles. Two articles referenced other types of validity: face validity [62] and argument-based validity [56]. Additionally, 23 articles were identified without validity testing. Regarding reliability, Cronbach’s alpha was used in 39 articles, followed by KR-20 in 2 articles, and Spearman-Brown, McDonald’s omega, split-half, Cohen’s kappa, and empirical reliability in 1 article each. However, 12 articles do not present evidence of reliability.
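For reference, Cronbach’s alpha summarizes the internal consistency of a set of items, and KR-20 is the same coefficient applied to dichotomous (0/1) items. A minimal sketch of the computation on a hypothetical respondents-by-items score matrix:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items score matrix.

    alpha = k / (k - 1) * (1 - sum of item variances / variance of total scores).
    Applied to dichotomous (0/1) items, this coincides with KR-20.
    """
    k = scores.shape[1]                                   # number of items
    item_variances = scores.var(axis=0, ddof=1).sum()     # sum of per-item variances
    total_variance = scores.sum(axis=1).var(ddof=1)       # variance of the total score
    return k / (k - 1) * (1 - item_variances / total_variance)

# Hypothetical responses of six students to four dichotomous items.
responses = np.array([
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 1, 1],
    [1, 0, 1, 1],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
])
print(round(cronbach_alpha(responses), 3))
```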

4. Discussion

The aim of this systematic review was to analyze research that has utilized CT assessment instruments in children and adolescents aged between 4 and 16 years in order to identify the variables they assess and their psychometric properties.
Through the search strategy, 42 articles were identified, and an additional 8 articles were directly included, totaling 50 articles for analysis. Although the first definition of CT was presented in 2006, the first instruments to measure this construct were found starting in 2017 [7,30]. The year 2023 recorded the highest number of these instruments, with fourteen articles [18,31,32,33,34,35,36,37,38,39,40,41,42,43], indicating a growing interest in the scientific community in measuring CT skills.
Most instruments used multiple-choice as the response format, followed by a 5-point Likert scale. In terms of item count, two instruments had nine items [19,56] and one had forty-seven [32]. China had the highest number of publications with nineteen articles, followed by Spain and the United States with eight articles each, Turkey with four, and Switzerland with two publications. Asia and Europe are the continents where most research is conducted. Only one article from South America was found in this systematic literature review [72].
The most referenced authors were Marcos Román-González [7,17,44,45,46], María Zapata-Cáceres [17,46,47,48,49], and Estefanía Martín-Barroso [17,46,47,48,49], appearing in five articles each. Laila El-Hamamsy [46,47,49], was mentioned in three different articles. Furthermore, for the design of most instruments analyzed in this review, the three-dimensional (3D) model proposed by Brennan and Resnick [9] was the most used.
To determine validity and reliability, it was necessary to specify the methods used, which can be addressed through classical test theory (CTT) and item response theory (IRT). CTT relies on methods that evaluate test quality by measuring internal consistency and validity (content, criterion, and construct). In contrast, IRT offers a more advanced approach by considering individual item and participant characteristics, allowing for a more precise estimation of assessed skills and a more sensitive performance assessment. Integrating CTT and IRT provides a more comprehensive and reliable test assessment, facilitating informed decision-making in various educational and professional contexts.
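As a compact reference for this distinction (a sketch using standard psychometric notation rather than any specific instrument from the review), CTT decomposes an observed score into a true score plus error and defines reliability from their variances, whereas a two-parameter logistic IRT model expresses the probability of answering item i correctly as a function of a latent ability and item parameters:

```latex
% Classical test theory: observed score X = true score T + error E,
% with reliability defined as the proportion of true-score variance.
X = T + E, \qquad \rho_{XX'} = \frac{\sigma^2_T}{\sigma^2_X}

% Two-parameter logistic (2PL) IRT model for item i:
% a_i = discrimination, b_i = difficulty, \theta = latent CT ability.
P_i(\theta) = \frac{1}{1 + e^{-a_i(\theta - b_i)}}
```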
The review protocol posed questions to deepen the discussion:
Which CT assessment instruments have been used in their original version or with adjustments? Among the 50 articles analyzed, 22 instruments designed to evaluate CT were identified, some of which have been used in other studies in their original version, while 11 articles applied adjustments such as translation, expert judgment, pilot testing, and verification of psychometric properties. Notably, the CTS [67] served as a reference in six investigations, while the BCTt [17], TechCheck [13], and CTt [7] were used in four investigations each. This underscores the importance of these instruments in the field of CT assessment. Additionally, the use of Bebras Tasks was noted in six articles as a basis for generating instruments and evaluating the CT skills of interest in the research.
It was also observed that several studies used or adjusted more than one instrument. For example, Relkin, Johnson, and Bers [31] used five instruments (TechCheck, TechCheck-k, TechCheck-1, TechCheck-2, and TechCheck-PreK); El-Hamamsy, Zapata-Cáceres, Marcelino, and colleagues [46] adjusted two instruments (BCTt and cCTt), and Ma H. et al. [65] adjusted two instruments (CTt and CTS).
It is important to note that using an unadapted instrument may run counter to the need expressed by several authors that instruments should be contextually relevant [77] and that methodologies and tools must possess psychometric properties [4].
What are the age ranges targeted by the CT assessment instruments? In the analysis of the 50 reviewed articles, a total of 33 different instruments designed to assess CT in children within the age range of interest, spanning from 4 to 16 years, were identified.
The instrument with the broadest age range is the “Computational Thinking Test for Beginners” (BCTt), which covers ages from 5 to 12 years [17]. It is followed by the CTt, which covers ages from 10 to 16 years [7]; in [66], the CTt was used in a population aged 7 to 12 years.
It was observed that the age range with the highest availability of instruments ranges from 8 to 11 years. In contrast, for 4-year-olds, only two instruments were found: TechCheck in its TechCheck-PreK version [31] and the CT questionnaire [72]. For the 14, 15, and 16-year-old population, only one instrument, CTt [7], was identified.
What elements related to CT are assessed in the analyzed instruments? Most of the analyzed instruments assess CT from the cognitive component. The findings align with previous reviews [15,78,79,80,81,82], where the most commonly assessed skills include abstraction (13 instruments), problem-solving (11 instruments), and algorithmic thinking (13 instruments), along with specific computational concepts such as algorithms (14 instruments), sequences (22 instruments), and loops and conditionals (24 instruments each). This is consistent with the findings of [78,83,84,85,86].
The majority of investigations assess skills (37 instruments), followed by concepts (32 instruments), attitudes (7 instruments), and computational perspectives (2 instruments). Additionally, 17 articles were identified where both skills and concepts were assessed, 7 where skills and attitudes were assessed, and only one investigation assessed skills, concepts, and perspectives. This finding highlights the need to propose comprehensive approaches to assess CT in children and adolescents aged 4 to 16 years [24,25,86] and the fact that disconnected tests do not allow for the measurement of computational perspectives [25].
What psychometric properties are evidenced in the instruments? Analyzing the psychometric properties evidenced in the 50 articles regarding validity and reliability, it was found that 54% of the articles have evidence of validity, and 88% presented evidence of reliability, with construct validity predominating, followed by content and criterion validity. Additionally, two articles referenced other types of validity: face validity [62] and argument-based validity [56]. However, 12% of the articles provided no evidence of either validity or reliability. These results contrast with the findings in [8], which report that many CT assessments lack evidence of reliability and validity.
What methods were used for analyzing psychometric properties? Statistical methods such as correlation, exploratory factor analysis (EFA), and confirmatory factor analysis (CFA) were employed to determine validity in most cases. Regarding reliability, 78% of the articles used Cronbach’s Alpha coefficient for evaluation. However, 24% of the articles did not provide evidence of having tested this psychometric property.
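To illustrate the EFA step, the sketch below extracts approximate loadings from the item correlation matrix via a principal-component decomposition; this is a simplified stand-in for the factor-analytic procedures used in the reviewed studies, and the response data are hypothetical.

```python
import numpy as np

def approximate_loadings(scores: np.ndarray, n_factors: int) -> np.ndarray:
    """Approximate factor loadings from the item correlation matrix.

    Eigenvectors of the correlation matrix are scaled by the square roots of
    their eigenvalues (a principal-component extraction), a common first
    approximation to exploratory factor analysis before rotation.
    """
    corr = np.corrcoef(scores, rowvar=False)             # items-by-items correlation matrix
    eigenvalues, eigenvectors = np.linalg.eigh(corr)     # returned in ascending order
    top = np.argsort(eigenvalues)[::-1][:n_factors]      # indices of the largest eigenvalues
    return eigenvectors[:, top] * np.sqrt(eigenvalues[top])

# Hypothetical continuous item scores of eight students on five items.
rng = np.random.default_rng(0)
responses = rng.normal(size=(8, 5))
print(approximate_loadings(responses, n_factors=2).round(2))
```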
The evaluation of CT has experienced significant growth in recent years. However, there are still studies that use assessment instruments without reported psychometric validity. Additionally, current instruments tend to focus on specific skills such as abstraction, debugging, and problem-solving, as well as on CT-related concepts such as sequences, loops, and conditionals, while neglecting key components such as non-cognitive skills, which are also essential for holistic development.
There are few assessment instruments aimed at children between the ages of 4 and 6, a critical stage for cognitive and non-cognitive development, early detection of skills, and identification of areas for improvement. The lack of specific tools for this age group limits the possibility of establishing a solid foundation for future learning. Therefore, the design of appropriate instruments for this age range would help promote educational processes that prepare children to face the educational challenges of the future.
Given these limitations, several lines of future research could be drawn to improve the understanding and evaluation of CT. Among them, the development of assessment instruments adapted to different educational and demographic contexts to ensure equity in evaluation is essential. Likewise, conducting longitudinal studies to understand how CT skills develop and transform over time would provide a deeper insight into their evolution.
Furthermore, the integration of non-cognitive skills into evaluations will offer a more comprehensive view of student development and comparing pedagogical approaches could help identify educational practices that optimize CT learning in various contexts. Additionally, analyzing teacher training is key, as teachers are the main facilitators of CT in the classroom, making it essential to investigate how they are prepared and updated in this field.
Finally, research on the perception of CT among students, teachers, and society will allow for adjustments in pedagogical strategies to increase acceptance and motivation towards its learning, thus contributing to a more effective implementation of computational thinking in educational systems.

5. Conclusions

The systematic review analyzed 50 research articles focusing on CT assessment instruments indexed in the WoS and Scopus databases from 2006 to March 2024. It was identified that, although the number of instruments to assess this construct has increased in recent years, in some cases the contextual relevance and psychometric properties are not adequately considered. The review delineated the assessed skills, concepts, perspectives, and attitudes associated with CT.
The search strategy was documented transparently and reproducibly to allow others to replicate the process. Bibliometric and research interest variables were defined based on the research questions posed. Through the bibliometric variables, it was determined that research aimed at evaluating CT through an instrument increased in 2023. China has conducted the most studies in this area, followed by the United States and Spain. In the Latin American context, only one study was found. Authors such as Brennan and Resnick, Marina Bers, Özgen Korkmaz, and Marcos Román-González served as foundational figures in instrument construction. The most frequently used or adapted instruments in research include the CTS, BCTt, CTt, and TechCheck. Additionally, a significant number of instruments incorporate questions from Bebras Tasks of various years and countries. Instrument adjustments included item translation, expert judgment, pilot testing, and determining psychometric properties.
The age groups with the highest number of instruments for evaluating CT are between 8 and 11 years, contrasting with only two instruments identified for 4-year-olds and a single instrument for 14 to 16-year-olds. Regarding assessed skills, abstraction, problem-solving, evaluation and debugging, algorithms, and algorithmic thinking were identified as the most common, followed by sequences, loops, and conditional concepts. Some studies also evaluated computational perspectives and certain attitudes. This finding underscores the need for a comprehensive assessment of CT.
Most instruments provide evidence of content, construct, or criterion validity, with reliability predominantly assessed through Cronbach’s Alpha coefficient.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/educsci14101124/s1. Reference [87] is cited in the supplementary materials.

Author Contributions

Authorship recognition in this article is based on the contribution of each author: methodology, conceptualization, research, writing, review, and editing, L.M.O. and M.C.-Á.; conceptualization, research, writing, review, and editing, S.A.C.-T. and M.Z.-C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Universidad del Quindío, Code Reference 100016837, Research Project “El pensamiento computacional: proyecto formativo para su desarrollo y evaluación en estudiantes de grado cuarto y quinto de básica primaria” No. 1187. Call No. 14 of 2022.

Conflicts of Interest

The authors declare that there is no conflict of interest that could influence the work presented in this article.

Appendix A

Table A1. Distribution of ages for the 50 analyzed articles.
Columns: Year; Article; Instrument Used and/or Adjusted; Age (4 to 16 years, with an X marking each age at which the instrument was applied).
2017Assessing elementary students’ computational thinking in everyday reasoning and robotics programming [30]Instrument with emphasis on robotics programming XX
2017Which cognitive abilities underlie computational thinking? Criterion validity of the Computational Thinking Test [7]CTt XXXXXXX
2018Extending the nomological network of computational thinking with non-cognitive factors [45]CTt XXXXXXX
2019Developing scientific literacy-based teaching materials to improve students’ computational thinking skills [60]Integrated CT competence test XX
2019Educational Robotics in Primary School: Measuring the Development of Computational Thinking Skills with the Bebras Tasks [74]Bebras Tasks XXX
2020Can computational thinking be improved by using a methodology based on metaphors and scratch to teach computer programming to children? [58]PCNT XXXX
2020Computational thinking through unplugged activities in early years of Primary Education [75]Bebras Category Instrument X
2020Formation of computational identity through computational thinking perspectives development in programming learning: A mediation analysis among primary school students [54]Scale of CT perspectives XXXX
2020TechCheck: Development and Validation of an Unplugged Assessment of Computational Thinking in Early Childhood Education [13]TechCheck XXXXX
2020Computational thinking test for beginners: Design and content validation [17]BCTt XXXXXXXX
2020Development and Predictive Validity of the Computational Thinking Disposition Questionnaire [53]Questionnaire of CT disposition XX
2021Cognitive abilities and computational thinking at age 5: Evidence for associations to sequencing and symbolic number comparison [72]CT questionnaireXXX
2021Collaborative Game-Based Environment and Assessment Tool for Learning Computational Thinking in Primary School: A Case Study [48]BCTt XXXXXXXX
2021Development and Validation of Computational Thinking Assessment of Chinese Elementary School Students [50]CTA-CES XXXXX
2021Does learning to code influence cognitive skills of elementary school children? Findings from a randomized experiment [71]CT questionnaire XX
2021Effect of Scratch on computational thinking skills of Chinese primary school students [68]CTS XX
2021Item response analysis of computational thinking practices: Test characteristics and students’ learning abilities in visual programming contexts [62]CT practices test XXX
2021Promoting pupils’ computational thinking skills and self-efficacy: a problem-solving instructional approach [65]CTt XX
Self-Efficacy Scale for CT XX
2021Computational Thinking Test for Lower Primary students: Design principles, content validation, and pilot testing [52]CTtLP XX
2021Design and validation of learning trajectory-based assessments for computational thinking in upper elementary grades [56]Early assessment XXX
2021TechCheck-K- A Measure of Computational Thinking for Kindergarten Children [63]TechCheck-K XX
2021Measuring coding ability in young children relations to computational thinking, creative thinking, and working memory [19]Assessment through a card-based game XX
2021A principled approach to designing computational thinking concepts and practices assessments for upper elementary grades [55]Assessment of computational thinking concepts XXX
2021Computational Thinking Evaluation Tool Development for Early Childhood Software Education [59]CT Test Using Bebras Tasks X
2022A cognitive definition of computational thinking in primary education [44]CTt XXX
2022Adaptation into turkish of the computational thinking test for primary school students [73]TechCheck XXX
2022Comparing the psychometric properties of two primary school Computational Thinking (CT) assessments for grades 3 and 4: The Beginners’ CT test (BCTt) and the competent CT test (cCTt) [46]BCTt XXX
cCTt XXX
2022Development and Validation of the Computational Thinking Test for Elementary School Students (CTT-ES): Correlate CT Competency With CT Disposition [51] CTT-ES XX
2022Effect of Unplugged Programming Teaching Aids on Children’s Computational Thinking and Classroom Interaction: with Respect to Piaget’s Four Stages Theory [76]CT capacity test XXX
2022Exploring the characteristics of an optimal design of non-programming plugged learning for developing primary school students’ computational thinking in mathematics [69]Perception questionnaire adapted from CTS XX
2022The competent Computational Thinking Test: Development and Validation of an Unplugged Computational Thinking Test for Upper Primary School [49]cCTt XXX
2022The Effect of Robotics-Based Storytelling Activities on Primary School Students’ Computational Thinking [64]BCTt XX
2022Unplugged or plugged-in programming learning: A comparative experimental study [61]Test of computational concepts mastery XX
2022Validating a computational thinking concepts test for primary education using item response theory: An analysis of students’ responses [57]CT Concepts Test Instrument for Primary Education Based on an ECD Approach XXXX
2023A Normative Analysis of the TechCheck Computational Thinking Assessment [31]TechCheckXXXXXX
2023Computational Literacy: Unplugged musical activities around Bebras International Challenge [36] Computational Thinking Test using Bebras Problems XXXXX
2023Computational thinking in primary school: effects of student and school characteristics [37]TechCheck XXXXX
2023Developing and Testing a Design-Based Learning Approach to Enhance Elementary Students’ Self-Perceived Computational Thinking [38]CTS XX
2023Development and validation of a computational thinking test for lower primary school students [39]CTtLP XXXXX
2023Effect of Reverse Engineering Pedagogy on Primary School Students’ Computational Thinking Skills in STEM Learning Activities [40]CTS XX
2023Effects of robotics STEM camps on rural elementary students’ self-efficacy and computational thinking [41]Survey adapted from CTS XXX
2023Effects of Scratch-Based Activities on 4th-Grade Students’ Computational Thinking Skills [42]BCTt XX
2023Exploring the underlying cognitive process of computational thinking in primary education [43]CTtLP XXXXXX
2023Monitoring cognitive development through the assessment of computational thinking practices: A longitudinal intervention on primary school students [32]CT practices test XXXXX
2023Possibilities of diagnosing the level of development of students’ computational thinking and the influence of alternative methods of teaching mathematics on their results [33]Didactic CT test XXX
2023The effect of an unplugged coding course on primary school students’ improvement in their computational thinking skills [34]CTST XXXX
2023Think together, design together, code together: the effect of augmented reality activity designed by children on the computational thinking skills [35]TechCheck XX
2023Unravelling the underlying mechanism of computational thinking: The mediating role of attitudinal beliefs between personality and learning performance [18]CTtLP XXXX
2024A Bebras Computational Thinking (ABC-Thinking) program for primary school: Evaluation using the competent computational thinking test [47]cCTt XXX
2024The effect on computational thinking and identified learning aspects: Comparing unplugged smartGames with SRA-Programming with tangible or On-screen output [66]CTt XXXXXX
Total: 2 (age 4), 8 (age 5), 10 (age 6), 14 (age 7), 24 (age 8), 34 (age 9), 38 (age 10), 27 (age 11), 12 (age 12), 3 (age 13), 2 (age 14), 2 (age 15), 2 (age 16).

References

  1. Durak, H.Y.; Saritepeci, M. Analysis of the relation between computational thinking skills and various variables with the structural equation model. Comput. Educ. 2018, 116, 191–202. [Google Scholar] [CrossRef]
  2. Li, Y.; Schoenfeld, A.H.; Disessa, A.A.; Graesser, A.C.; Benson, L.C.; English, L.D.; Duschl, R.A. Computational Thinking Is More about Thinking than Computing. J. STEM Educ. Res. 2020, 3, 1–18. [Google Scholar] [CrossRef] [PubMed]
  3. Lye, S.Y.; Koh, J.H.L. Review on teaching and learning of computational thinking through programming: What is next for K-12? Comput. Hum. Behav. 2014, 41, 51–61. [Google Scholar] [CrossRef]
  4. De la Fuente, H.A.; García, A.P. Evaluación del Pensamiento Computacional en Educación Primaria. Rev. Interuniv. Investig. Tecnol. Educ. 2017, 3, 25–39. [Google Scholar] [CrossRef]
  5. Luo, F.; Israel, M.; Gane, B. Elementary Computational Thinking Instruction and Assessment: A Learning Trajectory Perspective. ACM Trans. Comput. Educ. 2022, 22, 1–26. [Google Scholar] [CrossRef]
  6. Wing, J.M. Computational Thinking. 2006. Available online: https://dl.acm.org/doi/pdf/10.1145/1118178.1118215 (accessed on 4 April 2024).
  7. Román-González, M.; Pérez-González, J.-C.; Jiménez-Fernández, C. Which cognitive abilities underlie computational thinking? Criterion validity of the Computational Thinking Test. Comput. Hum. Behav. 2017, 72, 678–691. [Google Scholar] [CrossRef]
  8. Tang, X.; Yin, Y.; Lin, Q.; Hadad, R.; Zhai, X. Assessing computational thinking: A systematic review of empirical studies. Comput. Educ. 2020, 148, 103798. [Google Scholar] [CrossRef]
  9. Brennan, K.; Resnick, M. New Frameworks for Studying and Assessing the Development of Computational Thinking. In American Educational Research Association. 2012, pp. 135–160. Available online: https://scratched.gse.harvard.edu/ct/files/AERA2012.pdf (accessed on 18 May 2023).
  10. CSTA K-12 Computer Science Standards; CSTA. Association for Computing Machinery: New York, NY, USA, 2011.
  11. Clarke-Midura, J.; Lee, V.R.; Shumway, J.F.; Silvis, D.; Kozlowski, J.S.; Peterson, R. Designing formative assessments of early childhood computational thinking. Early Child. Res. Q. 2023, 65, 68–80. [Google Scholar] [CrossRef]
  12. Grover, S.; Pea, R. Computational Thinking in K-12: A Review of the State of the Field. Educ. Res. 2013, 42, 38–43. [Google Scholar] [CrossRef]
  13. Relkin, E.; de Ruiter, L.; Bers, M.U. TechCheck: Development and Validation of an Unplugged Assessment of Computational Thinking in Early Childhood Education. J. Sci. Educ. Technol. 2020, 29, 482–498. [Google Scholar] [CrossRef]
  14. Clarke-Midura, J.; Silvis, D.; Shumway, J.F.; Lee, V.R.; Kozlowski, J.S. Developing a kindergarten computational thinking assessment using evidence-centered design: The case of algorithmic thinking. Comput. Sci. Educ. 2021, 31, 117–140. [Google Scholar] [CrossRef]
  15. Cutumisu, M.; Adams, C.; Lu, C. A Scoping Review of Empirical Research on Recent Computational Thinking Assessments. J. Sci. Educ. Technol. 2019, 28, 651–676. [Google Scholar] [CrossRef]
  16. El-Hamamsy, L.; Zapata-Cáceres, M.; Martín-Barroso, E.; Mondada, F.; Zufferey, J.D.; Bruno, B.; Román-González, M. The competent Computational Thinking test (cCTt): A valid, reliable and gender-fair test for longitudinal CT studies in grades 3–6. arXiv 2023, arXiv:2305.19526. Available online: http://arxiv.org/abs/2305.19526 (accessed on 4 April 2024).
  17. Zapata-Caceres, M.; Martin-Barroso, E.; Roman-Gonzalez, M. Computational thinking test for beginners: Design and content validation. In Proceedings of the 2020 IEEE Global Engineering Education Conference (EDUCON), Porto, Portugal, 27–30 April 2020; pp. 1905–1914. [Google Scholar] [CrossRef]
  18. Zhang, S.; Wong, G.K.W. Unravelling the underlying mechanism of computational thinking: The mediating role of attitudinal beliefs between personality and learning performance. J. Comput. Assist. Learn. 2023, 40, 902–918. [Google Scholar] [CrossRef]
  19. Wang, L.; Geng, F.; Hao, X.; Shi, D.; Wang, T.; Li, Y. Measuring coding ability in young children: Relations to computational thinking, creative thinking, and working memory. Curr. Psychol. 2021, 42, 8039–8050. [Google Scholar] [CrossRef]
  20. Kanaki, K.; Kalogiannakis, M. Assessing Algorithmic Thinking Skills in Relation to Age in Early Childhood STEM Education. Educ. Sci. 2022, 12, 380. [Google Scholar] [CrossRef]
  21. Kotsopoulos, D.; Floyd, L.; Dickson, B.A.; Nelson, V.; Makosz, S. Noticing and Naming Computational Thinking During Play. Early Child. Educ. J. 2021, 50, 699–708. [Google Scholar] [CrossRef]
  22. Na, C.; Clarke-Midura, J. Assessing Young Children’s Computational Thinking Using Cognitive Diagnostic Modeling. In Proceedings of the 17th International Conference of the Learning Sciences—ICLS 2023, Montreal, Canada, 10–15 June 2023; pp. 672–679. [Google Scholar]
  23. Poulakis, E.; Politis, P. Research on E-Learning and ICT in Education; Springer International Publishing: Cham, Switzerland, 2021. [Google Scholar] [CrossRef]
  24. Grover, S.; Pea, R.D. “Systems of Assessments” for Deeper Learning of Computational Thinking in K-12. ResearchGate. 2015, pp. 1–11. Available online: https://www.researchgate.net/publication/275771253 (accessed on 7 April 2024).
  25. Román-González, M.; Moreno-León, J.; Robles, G. Combining Assessment Tools for a Comprehensive Evaluation of Computational Thinking Interventions. In Computational Thinking Education; Springer: Singapore, 2019; pp. 79–98. [Google Scholar] [CrossRef]
  26. Tricco, A.C.; Lillie, E.; Zarin, W.; O’Brien, K.K.; Colquhoun, H.; Levac, D.; Moher, D.; Peters, M.D.J.; Horsley, T.; Weeks, L.; et al. PRISMA extension for scoping reviews (PRISMA-ScR): Checklist and Explanation. Ann. Intern. Med. 2018, 169, 467–473. [Google Scholar] [CrossRef] [PubMed]
  27. Urrutia, G.; Bonfill, X. PRISMA declaration: A proposal to improve the publication of systematic reviews and meta-analyses. Med. Clin. 2010, 135, 507–511. [Google Scholar] [CrossRef]
  28. Sánchez-Serrano, S.; Pedraza-Navarro, I.; Donoso-González, M. How to conduct a systematic review under PRISMA protocol? Uses and fundamental strategies for its application in the educational field through a practical case study. Bordon. Rev. Pedagog. 2022, 74, 51–66. [Google Scholar] [CrossRef]
  29. Van Eck, N.J.; Waltman, L. VOSviewer Manual; Univeristeit Leiden: Leiden, The Netherlands, 2013. [Google Scholar]
  30. Chen, G.; Shen, J.; Barth-Cohen, L.; Jiang, S.; Huang, X.; Eltoukhy, M. Assessing elementary students’ computational thinking in everyday reasoning and robotics programming. Comput. Educ. 2017, 109, 162–175. [Google Scholar] [CrossRef]
  31. Relkin, E.; Johnson, S.K.; Bers, M.U. A Normative Analysis of the TechCheck Computational Thinking Assessment. Educ. Technol. Soc. 2023, 26, 118–130. [Google Scholar] [CrossRef]
  32. Kong, S.-C.; Wang, Y.-Q. Monitoring cognitive development through the assessment of computational thinking practices: A longitudinal intervention on primary school students. Comput. Hum. Behav. 2023, 145, 107749. [Google Scholar] [CrossRef]
  33. Bryndová, L.; Bártek, K.; Klement, M. Possibilities of Diagnosing the Level of Development of Students’ Computational Thinking and the Influence of Alternative Methods of Teaching Mathematics on Their Results. Ad Alta J. Interdiscip. Res. 2023, 13, 45–51. [Google Scholar] [CrossRef]
  34. Dağ, F.; Şumuer, E.; Durdu, L. The effect of an unplugged coding course on primary school students’ improvement in their computational thinking skills. J. Comput. Assist. Learn. 2023, 39, 1902–1918. [Google Scholar] [CrossRef]
  35. Arslanoğlu, İ.I.; Kert, S.B.; Tonbuloğlu, İ. Think together, design together, code together: The effect of augmented reality activity designed by children on the computational thinking skills. Educ. Inf. Technol. 2024, 29, 8493–8522. [Google Scholar] [CrossRef]
  36. Durán, C.M.S.; Fernández, C.M.G.; Galán, A.A. Computational Literacy: Unplugged musical activities around Bebras International Challenge. Rev. Educ. Distancia 2023, 23, 1–33. [Google Scholar] [CrossRef]
  37. Küçükaydın, M.A.; Çite, H. Computational thinking in primary school: Effects of student and school characteristics. Educ. Inf. Technol. 2023, 29, 5631–5649. [Google Scholar] [CrossRef]
  38. Li, X.; Xie, K.; Vongkulluksn, V.; Stein, D.; Zhang, Y. Developing and Testing a Design-Based Learning Approach to Enhance Elementary Students’ Self-Perceived Computational Thinking. J. Res. Technol. Educ. 2023, 55, 344–368. [Google Scholar] [CrossRef]
  39. Zhang, S.; Wong, G.K.W. Development and validation of a computational thinking test for lower primary school students. Educ. Technol. Res. Dev. 2023, 71, 1595–1630. [Google Scholar] [CrossRef]
  40. Liu, X.; Wang, X.; Xu, K.; Hu, X. Effect of Reverse Engineering Pedagogy on Primary School Students’ Computational Thinking Skills in STEM Learning Activities. J. Intell. 2023, 11, 36. [Google Scholar] [CrossRef]
  41. Shang, X.; Jiang, Z.; Chiang, F.-K.; Zhang, Y.; Zhu, D. Effects of robotics STEM camps on rural elementary students’ self-efficacy and computational thinking. Educ. Technol. Res. Dev. 2023, 71, 1135–1160. [Google Scholar] [CrossRef]
  42. Piedade, J.; Dorotea, N. Effects of Scratch-Based Activities on 4th-Grade Students’ Computational Thinking Skills. Inform. Educ. 2023, 22, 499–523. [Google Scholar] [CrossRef]
  43. Zhang, S.; Wong, G.K. Exploring the underlying cognitive process of computational thinking in primary education. Think. Ski. Creat. 2023, 48, 101314. [Google Scholar] [CrossRef]
  44. Tsarava, K.; Moeller, K.; Román-González, M.; Golle, J.; Leifheit, L.; Butz, M.V.; Ninaus, M. A cognitive definition of computational thinking in primary education. Comput. Educ. 2022, 179, 104425. [Google Scholar] [CrossRef]
  45. Román-González, M.; Pérez-González, J.-C.; Moreno-León, J.; Robles, G. Extending the nomological network of computational thinking with non-cognitive factors. Comput. Hum. Behav. 2018, 80, 441–459. [Google Scholar] [CrossRef]
  46. El-Hamamsy, L.; Zapata-Cáceres, M.; Marcelino, P.; Bruno, B.; Zufferey, J.D.; Martín-Barroso, E.; Román-González, M. Comparing the psychometric properties of two primary school Computational Thinking (CT) assessments for grades 3 and 4: The Beginners’ CT test (BCTt) and the competent CT test (cCTt). Front. Psychol. 2022, 13, 1082659. [Google Scholar] [CrossRef] [PubMed]
  47. Zapata-Cáceres, M.; Marcelino, P.; El-Hamamsy, L.; Martín-Barroso, E. A Bebras Computational Thinking (ABC-Thinking) program for primary school: Evaluation using the competent computational thinking test. Educ. Inf. Technol. 2024, 1–30. [Google Scholar] [CrossRef]
  48. Zapata-Caceres, M.; Martin-Barroso, E.; Roman-Gonzalez, M. Collaborative Game-Based Environment and Assessment Tool for Learning Computational Thinking in Primary School: A Case Study. IEEE Trans. Learn. Technol. 2021, 14, 576–589. [Google Scholar] [CrossRef]
  49. El-Hamamsy, L.; Zapata-Cáceres, M.; Barroso, E.M.; Mondada, F.; Zufferey, J.D.; Bruno, B. The Competent Computational Thinking Test: Development and Validation of an Unplugged Computational Thinking Test for Upper Primary School. J. Educ. Comput. Res. 2022, 60, 1818–1866. [Google Scholar] [CrossRef]
  50. Li, Y.; Xu, S.; Liu, J. Development and Validation of Computational Thinking Assessment of Chinese Elementary School Students. J. Pac. Rim Psychol. 2021, 15, 183449092110102. [Google Scholar] [CrossRef]
  51. Tsai, M.-J.; Chien, F.P.; Lee, S.W.-Y.; Hsu, C.-Y.; Liang, J.-C. Development and Validation of the Computational Thinking Test for Elementary School Students (CTT-ES): Correlate CT Competency With CT Disposition. J. Educ. Comput. Res. 2022, 60, 1110–1129. [Google Scholar] [CrossRef]
  52. Zhang, S.; Wong, G.K.W.; Pan, G. Computational Thinking Test for Lower Primary Students: Design Principles, Content Validation, and Pilot Testing. In Proceedings of the TALE 2021—2021 IEEE International Conference on Engineering, Technology & Education, Wuhan, China, 5–8 December 2021; pp. 345–352. [Google Scholar] [CrossRef]
  53. Jong, M.S.-Y.; Geng, J.; Chai, C.S.; Lin, P.-Y. Development and predictive validity of the computational thinking disposition questionnaire. Sustainability 2020, 12, 4459. [Google Scholar] [CrossRef]
  54. Kong, S.C.; Wang, Y.Q. Formation of computational identity through computational thinking perspectives development in programming learning: A mediation analysis among primary school students. Comput. Hum. Behav. 2020, 106, 106230. [Google Scholar] [CrossRef]
  55. Basu, S.; Rutstein, D.W.; Xu, Y.; Wang, H.; Shear, L. A principled approach to designing computational thinking concepts and practices assessments for upper elementary grades. Comput. Sci. Educ. 2021, 31, 169–198. [Google Scholar] [CrossRef]
  56. Gane, B.D.; Israel, M.; Elagha, N.; Yan, W.; Luo, F.; Pellegrino, J.W. Design and validation of learning trajectory-based assessments for computational thinking in upper elementary grades. Comput. Sci. Educ. 2021, 31, 141–168. [Google Scholar] [CrossRef]
  57. Kong, S.-C.; Lai, M. Validating a computational thinking concepts test for primary education using item response theory: An analysis of students’ responses. Comput. Educ. 2022, 187, 104562. [Google Scholar] [CrossRef]
  58. Pérez-Marín, D.; Hijón-Neira, R.; Bacelo, A.; Pizarro, C. Can computational thinking be improved by using a methodology based on metaphors and scratch to teach computer programming to children? Comput. Hum. Behav. 2020, 105, 105849. [Google Scholar] [CrossRef]
  59. Lee, K.; Cho, J. Computational Thinking Evaluation Tool Development for Early Childhood Software Education. JOIV Int. J. Inform. Vis. 2021, 5, 313. [Google Scholar] [CrossRef]
  60. Fakhriyah, F.; Masfuah, S.; Mardapi, D. Developing scientific literacy-based teaching materials to improve students’ computational thinking skills. J. Pendidik. IPA Indones. 2019, 8, 482–491. [Google Scholar] [CrossRef]
  61. Sigayret, K.; Tricot, A.; Blanc, N. Unplugged or plugged-in programming learning: A comparative experimental study. Comput. Educ. 2022, 184, 104505. [Google Scholar] [CrossRef]
  62. Kong, S.C.; Wang, Y.Q. Item response analysis of computational thinking practices: Test characteristics and students’ learning abilities in visual programming contexts. Comput. Hum. Behav. 2021, 122, 106836. [Google Scholar] [CrossRef]
  63. Relkin, E.; Bers, M. TechCheck-K: A measure of computational thinking for kindergarten children. In Proceedings of the 2021 IEEE Global Engineering Education Conference (EDUCON), Vienna, Austria, 21–23 April 2021; pp. 1696–1702. [Google Scholar] [CrossRef]
  64. Tengler, K.; Kastner-Hauler, O.; Sabitzer, B.; Lavicza, Z. The Effect of Robotics-Based Storytelling Activities on Primary School Students’ Computational Thinking. Educ. Sci. 2022, 12, 10. [Google Scholar] [CrossRef]
  65. Ma, H.; Zhao, M.; Wang, H.; Wan, X.; Cavanaugh, T.W.; Liu, J. Promoting pupils’ computational thinking skills and self-efficacy: A problem-solving instructional approach. Educ. Technol. Res. Dev. 2021, 69, 1599–1616. [Google Scholar] [CrossRef]
  66. Fanchamps, N.; van Gool, E.; Slangen, L.; Hennissen, P. The effect on computational thinking and identified learning aspects: Comparing unplugged smartGames with SRA-Programming with tangible or On-screen output. Educ. Inf. Technol. 2023, 29, 2999–3024. [Google Scholar] [CrossRef]
  67. Korkmaz, Ö.; Çakir, R.; Özden, M.Y. A validity and reliability study of the computational thinking scales (CTS). Comput. Hum. Behav. 2017, 72, 558–569. [Google Scholar] [CrossRef]
  68. Jiang, B.; Li, Z. Effect of Scratch on computational thinking skills of Chinese primary school students. J. Comput. Educ. 2021, 8, 505–525. [Google Scholar] [CrossRef]
  69. Wang, J.; Zhang, Y.; Hung, C.-Y.; Wang, Q.; Zheng, Y. Exploring the characteristics of an optimal design of non-programming plugged learning for developing primary school students’ computational thinking in mathematics. Educ. Technol. Res. Dev. 2022, 70, 849–880. [Google Scholar] [CrossRef]
  70. Tran, Y. Computational Thinking Equity in Elementary Classrooms: What Third-Grade Students Know and Can Do. J. Educ. Comput. Res. 2019, 57, 3–31. [Google Scholar] [CrossRef]
  71. Özcan, M.; Çetinkaya, E.; Göksun, T.; Kisbu-Sakarya, Y. Does learning to code influence cognitive skills of elementary school children? Findings from a randomized experiment. Br. J. Educ. Psychol. 2021, 91, 1434–1455. [Google Scholar] [CrossRef]
  72. Gerosa, A.; Koleszar, V.; Tejera, G.; Gómez-Sena, L.; Carboni, A. Cognitive abilities and computational thinking at age 5: Evidence for associations to sequencing and symbolic number comparison. Comput. Educ. Open 2021, 2, 100043. [Google Scholar] [CrossRef]
  73. Küçükaydın, M.A.; Akkanat, Ç. Adaptation Into Turkish of the Computational Thinking Test for Primary School Students. Probl. Educ. 21st Century 2022, 80, 765–776. [Google Scholar] [CrossRef]
  74. Chiazzese, G.; Arrigo, M.; Chifari, A.; Lonati, V.; Tosto, C. Educational robotics in primary school: Measuring the development of computational thinking skills with the bebras tasks. Informatics 2019, 6, 43. [Google Scholar] [CrossRef]
  75. del Olmo-Muñoz, J.; Cózar-Gutiérrez, R.; González-Calero, J.A. Computational thinking through unplugged activities in early years of Primary Education. Comput. Educ. 2020, 150, 103832. [Google Scholar] [CrossRef]
  76. Zhan, Z.; He, W.; Yi, X.; Ma, S. Effect of Unplugged Programming Teaching Aids on Children’s Computational Thinking and Classroom Interaction: With Respect to Piaget’s Four Stages Theory. J. Educ. Comput. Res. 2022, 60, 1277–1300. [Google Scholar] [CrossRef]
  77. Haseski, H.I.; Ilic, U.; Tugtekin, U. Defining a New 21st Century Skill-Computational Thinking: Concepts and Trends. Int. Educ. Stud. 2018, 11, 29. [Google Scholar] [CrossRef]
  78. Corrales-Álvarez, M.; Ocampo, L.M.; Cardona-Torres, S.A. Instruments for Evaluating Computational Thinking: A Systematic Review. TecnoLógicas 2024, 27, 2950. [Google Scholar] [CrossRef]
  79. Lu, C.; Macdonald, R.; Odell, B.; Kokhan, V.; Epp, C.D.; Cutumisu, M. A scoping review of computational thinking assessments in higher education. J. Comput. High. Educ. 2022, 34, 416–461. [Google Scholar] [CrossRef]
  80. Ezeamuzie, N.O.; Leung, J.S.; Ting, F.S. Unleashing the Potential of Abstraction From Cloud of Computational Thinking: A Systematic Review of Literature. J. Educ. Comput. Res. 2022, 60, 877–905. [Google Scholar] [CrossRef]
  81. Tikva, C.; Tambouris, E. Mapping Computational Thinking through Programming in K-12 Education: A Conceptual Model Based on a Systematic Literature Review. Comput. Educ. 2020, 162, 104083. [Google Scholar] [CrossRef]
  82. Sun, L.; Hu, L.; Zhou, D. The bidirectional predictions between primary school students’ STEM and language academic achievements and computational thinking: The moderating role of gender. Think. Ski. Creat. 2022, 44, 101043. [Google Scholar] [CrossRef]
  83. Ezeamuzie, N.O.; Leung, J.S.C. Computational Thinking Through an Empirical Lens: A Systematic Review of Literature. J. Educ. Comput. Res. 2022, 60, 481–511. [Google Scholar] [CrossRef]
  84. Bakala, E.; Gerosa, A.; Hourcade, J.P.; Tejera, G. Preschool children, robots, and computational thinking: A systematic review. Int. J. Child-Comput. Interact. 2021, 29, 100337. [Google Scholar] [CrossRef]
  85. McCormick, K.I.; Hall, J.A. Computational thinking learning experiences, outcomes, and research in preschool settings: A scoping review of literature. Educ. Inf. Technol. 2022, 27, 3777–3812. [Google Scholar] [CrossRef]
  86. Román-González, M.; Moreno-León, J.; Robles, G. Computational Thinking Education; Springer: Singapore, 2019. [Google Scholar] [CrossRef]
  87. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, n71. [Google Scholar] [CrossRef]
Figure 1. Literature Selection Flow Diagram. Source: Own elaboration following the PRISMA Methodology [26].
Figure 2. Distribution over time of the number of publications.
Figure 3. Word cloud of titles.
Figure 4. Map of title and keywords.
Figure 5. Co-citation Network.
Figure 6. Country where the study was conducted.
Figure 7. Age distribution targeted by the instruments.
Figure 8. Distribution of authors and/or associations that have contributed to the construction of the instruments.
Figure 9. Skills, concepts, and attitudes evaluated.
Figure 10. Distribution of assessed skills, concepts, perspectives, and attitudes.
Table 1. Identified instruments.
# | Article | Instrument Used | Citation
1 | Computational thinking test for beginners: Design and content validation | BCTt | [17]
2 | The competent Computational Thinking Test: Development and Validation of an Unplugged Computational Thinking Test for Upper Primary School | cCTt | [49]
3 | Development and Validation of Computational Thinking Assessment of Chinese Elementary School Students | CTA-CES | [50]
4 | The effect of an unplugged coding course on primary school students’ improvement in their computational thinking skills | CTST | [34]
5 | Which cognitive abilities underlie computational thinking? Criterion validity of the Computational Thinking Test | CTt | [7]
6 | Development and Validation of the Computational Thinking Test for Elementary School Students (CTT-ES): Correlate CT Competency With CT Disposition | CTT-ES | [51]
7 | Computational Thinking Test for Lower Primary students: Design principles, content validation, and pilot testing | CTtLP | [52]
8 | Development and Predictive Validity of the Computational Thinking Disposition Questionnaire | Questionnaire of CT disposition | [53]
9 | Formation of computational identity through computational thinking perspectives development in programming learning: A mediation analysis among primary school students | Scale of CT perspectives | [54]
10 | Measuring coding ability in young children relations to computational thinking, creative thinking, and working memory | Assessment through a card-based game | [19]
11 | A principled approach to designing computational thinking concepts and practices assessments for upper elementary grades | Assessment of computational thinking concepts | [55]
12 | Design and validation of learning trajectory-based assessments for computational thinking in upper elementary grades | Early assessment | [56]
13 | Assessing elementary students’ computational thinking in everyday reasoning and robotics programming | Instrument with emphasis on robotics programming | [30]
14 | Validating a computational thinking concepts test for primary education using item response theory: An analysis of students’ responses | Instrument for testing CT concepts | [57]
15 | Can computational thinking be improved by using a methodology based on metaphors and scratch to teach computer programming to children? | PCNT | [58]
16 | Computational Thinking Evaluation Tool Development for Early Childhood Software Education | Computational Thinking Test | [59]
17 | Developing scientific literacy-based teaching materials to improve students’ computational thinking skills | Integrated CT competence test | [60]
18 | Unplugged or plugged-in programming learning: A comparative experimental study | Test of computational concepts mastery | [61]
19 | Item response analysis of computational thinking practices: Test characteristics and students’ learning abilities in visual programming contexts | CT practices test | [62]
20 | Possibilities of diagnosing the level of development of students’ computational thinking and the influence of alternative methods of teaching mathematics on their results | Didactic CT test | [33]
21 | TechCheck: Development and Validation of an Unplugged Assessment of Computational Thinking in Early Childhood Education | TechCheck | [13]
22 | TechCheck-K- A Measure of Computational Thinking for Kindergarten Children | TechCheck-K | [63]
Table 2. Referenced instruments used without adaptation and with adaptation.
Base Instrument | Article Where the Instrument Is Used | Adjustment Evidence | No Adjustment | Comments
Adjustment Evidence sub-columns: Translate | Expert Judgment | Pilot Testing | Psychometric Properties
BCTt [17]
[42] XThe instrument is applied in its original version and translated into Portuguese.
[46] XX The psychometric properties of two instruments are compared: the Beginner Computational Thinking Test (BCTt), developed for grades 1–6, and the Competent Computational Thinking Test (cCTt), validated for grades 3–4.
[48] XThe instrument is applied in its original version.
[64] XThe instrument is applied in its original version.
CTt [7]
[45] XThe instrument is applied in its original version.
[65]X XX It is translated into Chinese, and the reliability is analyzed.
[44] XIt is translated into German, and the number of items in the instrument is reduced.
[66] X The instrument is applied in its original version, and its internal consistency is determined.
cCTt [49]
[46] XX The psychometric properties of two instruments are compared: the Beginner Computational Thinking Test (BCTt), developed for grades 1–6, and the Competent Computational Thinking Test (cCTt), validated for grades 3–4.
[47] XThe instrument is applied in its original version.
CTS [67] (The original article where the instrument appears is not part of the 50 articles in the review.)
[38]X X It is translated into Chinese.
[65]X XX It is translated into Chinese.
[40]X XX It is translated into Chinese, adjusted, and simplified for comprehension and application.
[41] XIt is adjusted to the age range of the study.
[68]X X It is translated into Chinese and expressions are adjusted.
[69]XX X It is translated into Chinese and adjusted to reading levels by a research and teaching team.
CTtLP [52]
[18] X The instrument is applied in its original version. CFA is conducted.
[39] XXX The psychometric properties of the instrument are validated.
[43] XThe instrument is applied in its original version.
CT questionnaire [70] (The original article where the instrument appears is not part of the 50 articles in the review.)
[71] XThe instrument is applied in its original version.
[72] XIt is adjusted to the age range of the study.
CT practices test [62][32] X A partial matrix sampling approach was used to distribute this comprehensive test into multiple shorter forms with fewer items, aiming to reduce the testing burden on students.
TechCheck [13]
[31] XThe instrument is applied in its original version.
[35] XThe instrument is applied in its original version.
[37] XThe translation performed by one of the authors is used [73].
[73]XXXX The test items were translated into Turkish.
TechCheck-K [63]
[31] XThe instrument is applied in its original version.
Table 3. Age distribution for each instrument.
Year | Article | Instrument Used and/or Adjusted | Age
Age columns (years): 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16
2017Assessing elementary students’ computational thinking in everyday reasoning and robotics programming [30]Instrument with emphasis on robotics programming XX
2017Which cognitive abilities underlie computational thinking? Criterion validity of the Computational Thinking Test [7]CTt XXXXXXX
2019Developing scientific literacy-based teaching materials to improve students’ computational thinking skills [60]Integrated CT competence test XX
2020Can computational thinking be improved by using a methodology based on metaphors and scratch to teach computer programming to children? [58]PCNT XXXX
2020Formation of computational identity through computational thinking perspectives development in programming learning: A mediation analysis among primary school students [54]Scale of CT perspectives XXXX
2020TechCheck: Development and Validation of an Unplugged Assessment of Computational Thinking in Early Childhood Education [13]TechCheckXXXXX
2020Computational thinking test for beginners: Design and content validation [17]BCTtXXXXXXXX
2020Development and Predictive Validity of the Computational Thinking Disposition Questionnaire [53]Questionnaire of CT disposition XX
2021Development and Validation of Computational Thinking Assessment of Chinese Elementary School Students [50]CTA-CES XXXXX
2021Effect of Scratch on computational thinking skills of Chinese primary school students [68]CTS XX
2021Item response analysis of computational thinking practices: Test characteristics and students’ learning abilities in visual programming contexts [62]CT practice test XXX
2021Promoting pupils’ computational thinking skills and self-efficacy: a problem-solving instructional approach [65]CTt XX
Self-Efficacy Scale for CT XX
2021Computational Thinking Test for Lower Primary students: Design principles, content validation, and pilot testing [52]CTtLP XX
2021Design and validation of learning trajectory-based assessments for computational thinking in upper elementary grades [56]Early assessment XXX
2021TechCheck-K- A Measure of Computational Thinking for Kindergarten Children [63]TechCheck-KXX
2021Measuring coding ability in young children relations to computational thinking, creative thinking, and working memory [19]Assessment through a card-based gameXX
2021A principled approach to designing computational thinking concepts and practices assessments for upper elementary grades [55]Assessment of computational thinking concepts XXX
2021Computational Thinking Evaluation Tool Development for Early Childhood Software Education [59]CT Test Using Bebras TasksX
2022Adaptation into Turkish of the computational thinking test for primary school students [73]TechCheck XXX
2022Comparing the psychometric properties of two primary school Computational Thinking (CT) assessments for grades 3 and 4: The Beginners’ CT test (BCTt) and the competent CT test (cCTt) [46]BCTt XXX
cCTt XXX
2022Development and Validation of the Computational Thinking Test for Elementary School Students (CTT-ES): Correlate CT Competency With CT Disposition [51] CTT-ES XX
2022Exploring the characteristics of an optimal design of non-programming plugged learning for developing primary school students’ computational thinking in mathematics [69]Perception Questionnaire Adapted from CTS XX
2022The competent Computational Thinking Test: Development and Validation of an Unplugged Computational Thinking Test for Upper Primary School [49]cCTt XXX
2022Unplugged or plugged-in programming learning: A comparative experimental study [61]Test of computational concepts mastery XX
2022Validating a computational thinking concepts test for primary education using item response theory: An analysis of students’ responses [57]CT Concepts Test Instrument for Primary Education Based on an ECD Approach XXXX
2023Developing and Testing a Design-Based Learning Approach to Enhance Elementary Students’ Self-Perceived Computational Thinking [38]CTS XX
2023Development and validation of a computational thinking test for lower primary school students [39]CTtLP XXXXX
2023Effect of Reverse Engineering Pedagogy on Primary School Students’ Computational Thinking Skills in STEM Learning Activities [40]CTS XX
2023Monitoring cognitive development through the assessment of computational thinking practices: A longitudinal intervention on primary school students [32]CT practices test XXXXX
2023Possibilities of diagnosing the level of development of students’ computational thinking and the influence of alternative methods of teaching mathematics on their results [33]Didactic CT test XXX
2023The effect of an unplugged coding course on primary school students’ improvement in their computational thinking skills [34]CTST XXXX
2023Unravelling the underlying mechanism of computational thinking: The mediating role of attitudinal beliefs between personality and learning performance [18]CTtLP XXXX
2024The effect on computational thinking and identified learning aspects: Comparing unplugged smartGames with SRA-Programming with tangible or On-screen output [66]CTt XXXXXX
Total | 5 | 5 | 7 | 15 | 22 | 26 | 21 | 8 | 2 | 1 | 1 | 1 (ages 5–16)
Table 4. Assessed skills, concepts, perspectives, and attitudes.
Year | Citation | Skills | Concepts | Perspectives | Attitudes
Sub-columns: Abstraction | Algorithmic Thinking | Critical Thinking | Logical Thinking | Algorithms | Coding | Problem-Solving | Decomposition | Modularization | Debugging and Evaluation | Generalization | Remix and Reuse | Representation | Pattern Recognition | Sequences | Directions | Loops | Conditionals | Control Structures | Events | Parallelism | Operators | Data | Expressing | Connecting | Questioning | Cooperativity | Creativity | Sensitivity | Self-efficacy | Inclination
2017[30] xx
2017[7] x x xx
2019[60] x x xx
2020[58] x x x xx xxxxxxx
2020[17] x xx
2020[53] xx xxx
2020[54] xxx
2020[13] x xx x x
2021[55] x x xx xx
2021[59]x x x x
2021[52] xxxx
2021[56] x x x x xx
2021[50]x x xx x x x
2021[68] xx x xx
2021[62]xx xx x
2021[19] xx x xx
2021[65] xx x xxxx xx
2021[63] x xx x x
2022[73] x xx x x
2022[46] x xx
2022[51]x x x xx
2022[69]xxx xx x
2022[49] X xx
2022[61] xx xx
2022[57] x xx
2023[38] xx x xx
2023[39] xx x
2023[40] xx x xx
2023[32]xx xx x
2023[33]xx x x xx
2023[34]x x x xx
2023[18] x x xx
2024[66] xx
Total8850109106610335113315153231122254111
Table 5. Articles associated with the instruments and the evidence of validity and reliability.
Year | Article | Instrument Used and/or Adjusted | Validity | Reliability
Validity sub-columns: Content | Construct | Criterion | Other | No evidence. Reliability sub-column: Coefficient
2017Assessing elementary students’ computational thinking in everyday reasoning and robotics programming [30]Instrument with emphasis on robotics programming XCronbach’s alpha
2017Which cognitive abilities underlie computational thinking? Criterion validity of the Computational Thinking Test [7]CTt X Cronbach’s alpha
2018Extending the nomological network of computational thinking with non-cognitive factors [45]CTtX X Cronbach’s alpha
2019Developing scientific literacy-based teaching materials to improve students’ computational thinking skills [60]Integrated CT competence testX XCronbach’s alpha
2019Educational Robotics in Primary School: Measuring the Development of Computational Thinking Skills with the Bebras Tasks [74]Bebras Tasks XNo evidence
2020Can computational thinking be improved by using a methodology based on metaphors and scratch to teach computer programming to children? [58]PCNT XNo evidence
2020Computational thinking through unplugged activities in early years of Primary Education [75]Bebras Category Instrument XNo evidence
2020Formation of computational identity through computational thinking perspectives development in programming learning: A mediation analysis among primary school students [54]Scale of CT perspectivesXX Cronbach’s alpha
2020TechCheck: Development and Validation of an Unplugged Assessment of Computational Thinking in Early Childhood Education [13]TechCheckX X Cronbach’s alpha
2020Computational thinking test for beginners: Design and content validation [17]BCTtX X Cronbach’s alpha
2020Development and Predictive Validity of the Computational Thinking Disposition Questionnaire [53]Questionnaire of CT disposition X Cronbach’s alpha
2021Cognitive abilities and computational thinking at age 5: Evidence for associations to sequencing and symbolic number comparison [72]CT questionnaire XCronbach’s alpha
2021Collaborative Game-Based Environment and Assessment Tool for Learning Computational Thinking in Primary School: A Case Study [48]BCTt XCronbach’s alpha
2021Development and Validation of Computational Thinking Assessment of Chinese Elementary School Students [50]CTA-CESXXX Cronbach’s alpha
2021Does learning to code influence cognitive skills of elementary school children? Findings from a randomized experiment [71]CT questionnaire XNo evidence
2021Effect of Scratch on computational thinking skills of Chinese primary school students [68]CTS X Cronbach’s alpha
2021Item response analysis of computational thinking practices: Test characteristics and students’ learning abilities in visual programming contexts [62]CT practices testXX X Empirical
2021Promoting pupils’ computational thinking skills and self-efficacy: a problem-solving instructional approach [65]CTt XCronbach’s alpha
Self-Efficacy Scale for CT XCronbach’s alpha
2021Computational Thinking Test for Lower Primary students: Design principles, content validation, and pilot testing [52]CTtLPXXX Cronbach’s alpha
2021Design and validation of learning trajectory-based assessments for computational thinking in upper elementary grades [56]Early assessment X Cohen’s kappa
2021TechCheck-K- A Measure of Computational Thinking for Kindergarten Children [63]TechCheck-K X No evidence
2021Measuring coding ability in young children relations to computational thinking, creative thinking, and working memory [19]Assessment through a card-based gameXXX Cronbach’s alpha
2021A principled approach to designing computational thinking concepts and practices assessments for upper elementary grades [55]Assessment of computational thinking concepts X Cronbach’s alpha
2021Computational Thinking Evaluation Tool Development for Early Childhood Software Education [59]CT Test Using Bebras TasksX No evidence
2022A cognitive definition of computational thinking in primary education [44]CTt XCronbach’s alpha
2022Adaptation into Turkish of the computational thinking test for primary school students [73]TechCheck XKR-20; Split-half; Spearman-Brown
2022Comparing the psychometric properties of two primary school Computational Thinking (CT) assessments for grades 3 and 4: The Beginners’ CT test (BCTt) and the competent CT test (cCTt) [46]BCTt X Cronbach’s alpha
cCTt X Cronbach’s alpha
2022Development and Validation of the Computational Thinking Test for Elementary School Students (CTT-ES): Correlate CT Competency With CT Disposition [51] CTT-ES XX Cronbach’s alpha
2022Effect of Unplugged Programming Teaching Aids on Children’s Computational Thinking and Classroom Interaction: with Respect to Piaget’s Four Stages Theory [76]CT capacity test XCronbach’s alpha
2022Exploring the characteristics of an optimal design of non-programming plugged learning for developing primary school students’ computational thinking in mathematics [69]Perception Questionnaire Adapted from CTSX XCronbach’s alpha
2022The competent Computational Thinking Test: Development and Validation of an Unplugged Computational Thinking Test for Upper Primary School [49]cCTtXXX Cronbach’s alpha
2022The Effect of Robotics-Based Storytelling Activities on Primary School Students’ Computational Thinking [64]BCTt XNo evidence
2022Unplugged or plugged-in programming learning: A comparative experimental study [61]Test of computational concepts mastery XMcDonald’s ω
2022Validating a computational thinking concepts test for primary education using item response theory: An analysis of students’ responses [57]CT Concepts Test Instrument for Primary Education Based on an ECD ApproachXX Cronbach’s alpha
2023A Normative Analysis of the TechCheck Computational Thinking Assessment [31]TechCheck XCronbach’s alpha
2023Computational Literacy: Unplugged musical activities around Bebras International Challenge [36] Computational Thinking Test using Bebras Problems XCronbach’s alpha
2023Computational thinking in primary school: effects of student and school characteristics [37]TechCheck XCronbach’s alpha
2023Developing and Testing a Design-Based Learning Approach to Enhance Elementary Students’ Self-Perceived Computational Thinking [38]CTS X Cronbach’s alpha
2023Development and validation of a computational thinking test for lower primary school students [39]CTtLPXXX Cronbach’s alpha
2023Effect of Reverse Engineering Pedagogy on Primary School Students’ Computational Thinking Skills in STEM Learning Activities [40]CTS X Cronbach’s alpha
2023Effects of robotics STEM camps on rural elementary students’ self-efficacy and computational thinking [41]Survey adapted from CTS XCronbach’s alpha
2023Effects of Scratch-Based Activities on 4th-Grade Students’ Computational Thinking Skills [42]BCTtX X Cronbach’s alpha
2023Exploring the underlying cognitive process of computational thinking in primary education [43]CTtLP XCronbach’s alpha
2023Monitoring cognitive development through the assessment of computational thinking practices: A longitudinal intervention on primary school students [32]CT practices testXX Cronbach’s alpha
2023Possibilities of diagnosing the level of development of students’ computational thinking and the influence of alternative methods of teaching mathematics on their results [33]Didactic CT test XX No evidence
2023The effect of an unplugged coding course on primary school students’ improvement in their computational thinking skills [34]CTSTX KR-20
2023Think together, design together, code together: the effect of augmented reality activity designed by children on the computational thinking skills [35]TechCheck XNo evidence
2023Unravelling the underlying mechanism of computational thinking: The mediating role of attitudinal beliefs between personality and learning performance [18]CTtLP X XCronbach’s alpha
2024A Bebras Computational Thinking (ABC-Thinking) program for primary school: Evaluation using the competent computational thinking test [47]cCTt X Cronbach’s alpha
2024The effect on computational thinking and identified learning aspects: Comparing unplugged smartGames with SRA-Programming with tangible or On-screen output [66]CTt XCronbach’s alpha
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
