Systematic Design and Rapid Development of Motion-Based Touchless Games for Enhancing Students’ Thinking Skills
Figure 1: Proposed game design and development stages.
Figure 2: Gesture difficulty levels.
Figure 3: Gesture tangible cards (examples).
Figure 4: Measuring students’ social skills (students’ answers).
Figure 5: Results from the accomplishment of the evaluation forms’ criteria.
Figure A1: 5-point Likert scale game evaluation form.
Abstract
1. Introduction
2. Students Are Game Designers and Developers
3. The Proposed Approach
3.1. Proposed Game Design and Development Stages
- An evaluation rubric [25] for the design artefacts (storyboard and game design document).
- A 5-point Likert scale game evaluation form, which contained 13 criteria based on common design heuristics and two (2) open questions about the pros and cons of the game (Appendix A).
3.2. Timeline and Deliverables
4. Evaluation Study
4.1. Context
- Were the students’ thinking skills enhanced as a result of the approach?
- Was the proposed educational approach implemented successfully and as planned in the school environments?
- Was the proposed educational approach appreciated by the participants?
4.2. Participants
- 45.45% to 68.18% of them had excellent grades (18–20/20) in the four STEM lessons (Physics, Mathematics, Chemistry and Computer Science).
- Most of them (55.71%) had zero to little previous experience in playing Kinect games and zero to little participation in other game design/development activities.
- All participants had previous experience in Scratch; however, only 17.24% of them declared that they were confident in it (levels 4 and 5 on the 5-point Likert scale).
- The students’ primary initial goal was to learn programming. Their answers to the question “What was/were your reason(s) for enrolling in this program?” were grouped and are presented in Table 2.
4.3. Data Collection
- (a)
- Measure the students’ computational thinking (CT) competence level on the following six (6) CT concepts: flow control, abstraction, user interactivity, synchronisation, parallelism and logic. With the aid of the Scrape tool and the Dr. Scratch evaluation rubric, the overall CT score per project was calculated by adding up the partial scores for each CT concept. Projects with up to 6 points are considered to demonstrate Basic CT, projects with 7 to 12 points are rated as Developing, and projects with more than 12 points are rated as Proficient [26].
- (b)
- Confirm whether the students followed common best practices in programming. These practices concerned the avoidance of duplicated scripts (two scripts formed by the same blocks, where only the parameters or values of the blocks vary), incorrect names (when the default names of new sprites are kept, e.g., Sprite1, Sprite2), dead code (parts of programs that are never executed) and uninitialised attributes (when objects’ attributes are not correctly initialised).
- (c)
- Confirm whether the students deeply understood the CT programming concepts, by using a wide variety of programming commands and avoiding leaving any “dead code”.
- (d)
- Confirm whether the students understood the connections among body interaction, space and the abstract representation of angles and geometry concepts, by adding complex gestures into the gameplay.
- The positive feelings about the proposed approach.
- The level of satisfaction for specific factors/components of the approach.
- The thoughts about time management and difficulty level (open-ended questions).
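The Dr. Scratch-style scoring described in point (a) above can be sketched as follows. This is a minimal illustration, not the actual Dr. Scratch implementation; the per-concept 0–3 scale and all names are assumptions, while the Basic (≤6), Developing (7–12) and Proficient (>12) thresholds come from the text:

```python
# Sketch (not the actual Dr. Scratch tool): sum per-concept scores
# (assumed 0-3 each) and map the total to the CT levels described above.
CT_CONCEPTS = ["flow_control", "abstraction", "user_interactivity",
               "synchronisation", "parallelism", "logic"]

def ct_level(concept_scores: dict) -> tuple:
    """Return (total CT score, CT level) for one Scratch project."""
    total = sum(concept_scores.get(c, 0) for c in CT_CONCEPTS)
    if total <= 6:
        level = "Basic"
    elif total <= 12:
        level = "Developing"
    else:
        level = "Proficient"
    return total, level

# Example: a project scoring 2 on each of the six concepts
print(ct_level({c: 2 for c in CT_CONCEPTS}))  # (12, 'Developing')
```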
5. Results
5.1. Were the Students’ Thinking Skills Enhanced as a Result of the Approach? (RQ1)
- (a)
- Regarding the existence of the six (6) computational thinking (CT) concepts in students’ games.
- (b)
- Students followed common best practices in programming: As shown in Table 6, students managed to follow the proposed best practices and avoided common programming mistakes such as duplicated scripts, incorrect sprite names, dead code and uninitialised sprite attributes.
- (c)
- Complexity of the produced games: Quantitative analysis supported by the Scrape tool showed that students deeply understood the CT concepts, as they used a wide variety of programming commands (per game, on average: 966 programming commands, 156 scripts, 33 sprites and 43 sprite costumes). In addition, each group of students created, on average, 13.09 previous versions of their game. Finally, according to the researcher’s field notes, students used logic, flow control and abstraction techniques to express their thoughts, to debug/update their games and to make decisions about the proper body joints and gesture algorithms.
- (d)
- Understanding the connections among body interaction, space and the abstract representation of angles and geometry concepts, by creating complex gestures for user interaction: The analysis of the NUI gestures embedded in the games showed that 7/11 games (63.64%) included gestures requiring the execution of 2 or 3 postures, and 10/11 games (90.91%) were coded to track 4 to 9 different body joints. For example, one group simulated a climbing gesture by tracking 7 body joints for better accuracy and requiring the player to perform 2 different body postures; another group tracked 9 different body joints in their game. In addition, 7/11 games (63.64%) included more than one gesture (2 to 4 gestures).
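A multi-posture gesture such as the climbing example can be thought of as a small state machine over per-frame joint data. The sketch below is illustrative only: the students’ games were built in Scratch with Kinect2Scratch, and all joint names, coordinate conventions and predicates here are assumptions:

```python
# Sketch of posture-sequence gesture detection (hypothetical names and
# coordinates; smaller y means higher on screen in this convention).

def posture_hands_up(joints: dict) -> bool:
    """True when both hands are above the head."""
    return (joints["left_hand"][1] < joints["head"][1] and
            joints["right_hand"][1] < joints["head"][1])

def posture_hands_down(joints: dict) -> bool:
    """True when both hands are below the hip."""
    return (joints["left_hand"][1] > joints["hip"][1] and
            joints["right_hand"][1] > joints["hip"][1])

class GestureRecogniser:
    """Recognises a gesture as an ordered sequence of postures."""
    def __init__(self, postures):
        self.postures = postures   # list of posture predicates
        self.step = 0              # index of the next posture to match

    def update(self, joints) -> bool:
        """Feed one frame of joint data; True when the sequence completes."""
        if self.postures[self.step](joints):
            self.step += 1
            if self.step == len(self.postures):
                self.step = 0
                return True
        return False

# A 2-posture "raise then lower the arms" gesture, similar in structure
# to the students' 2-3-posture climbing gesture.
climb = GestureRecogniser([posture_hands_up, posture_hands_down])
frames = [
    {"left_hand": (0, 1), "right_hand": (0, 1), "head": (0, 2), "hip": (0, 5)},
    {"left_hand": (0, 6), "right_hand": (0, 6), "head": (0, 2), "hip": (0, 5)},
]
print([climb.update(f) for f in frames])  # [False, True]
```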
5.2. Was the Proposed Educational Approach Implemented Successfully and as Planned in the School Environments? (RQ2)
5.3. Was the Proposed Educational Approach Appreciated by the Participants? (RQ3)
- The positive feelings for the proposed process.
- The level of satisfaction for specific components of the proposed approach.
- The thoughts about time management and difficulty level (open-ended questions).
- Positive feelings for the proposed process: Based on the students’ answers to the final questionnaire: (i) 96.56% of them liked the project, (ii) 72.41% answered that they were motivated by the fact that they had to create a Kinect game and (iii) 93.10% would encourage their schoolmates to participate in similar initiatives. The teachers’ answers were consistent with the students’ (both teachers selected the “strongly agree” option).
- Level of satisfaction with specific factors/components of the approach (strongly agree and agree): Even though students and teachers had no previous experience with this kind of project, the majority of the students were satisfied with:
- The learning resources, the project duration and the project structure (69%).
- The guidance given by the CS teachers and the classmates (75.86%).
- The support and encouragement offered by the principal researcher (93%).
Regarding the game-making process, 75.81% of students strongly agreed or agreed that:
- They gained a good understanding of concepts/principles in this field.
- They learned to apply the principles of this program (follow specific steps and practices), in order to create their own Kinect game in a systematic process.
Both teachers were also satisfied with the above factors/components of the approach.
- Their thoughts about time management and difficulty level
6. Discussion and Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
Appendix A
References
- Hava, K.; Cakir, H. A systematic review of literature on students as educational computer game designers. In Proceedings of the EdMedia: World Conference on Educational Media and Technology, Waynesville, NC, USA, 20 June 2017; Johnston, J., Ed.; Association for the Advancement of Computing in Education (AACE): Washington, DC, USA, 2017. [Google Scholar]
- Kafai, Y.B.; Burke, Q. Constructionist gaming: Understanding the benefits of making games for learning. Educ. Psychol. 2015, 50, 313–334. [Google Scholar] [CrossRef] [PubMed]
- Ke, F. An implementation of design-based learning through creating educational computer games: A case study on mathematics learning during design and computing. Comput. Educ. 2014, 73, 26–39. [Google Scholar] [CrossRef]
- Denner, J.; Werner, L.; Ortiz, E. Computer games created by middle school girls: Can they be used to measure understanding of computer science concepts? Comput. Educ. 2012, 58, 240–249. [Google Scholar] [CrossRef]
- Grover, S.; Basu, S. Measuring Student Learning in Introductory Block-Based Programming: Examining Misconceptions of Loops, Variables, and Boolean Logic. In Proceedings of the ACM SIGCSE Technical Symposium on Computer Science Education, Seattle, WA, USA, 8–11 March 2017. [Google Scholar]
- Hsu, H.J. The potential of kinect in education. Int. J. Inf. Educ. Technol. 2011, 1, 365–370. [Google Scholar] [CrossRef]
- Kinesthetic Learning (Wikipedia). Available online: https://en.wikipedia.org/wiki/Kinesthetic_learning (accessed on 10 November 2017).
- Spatial Ability (Wikipedia). Available online: https://en.wikipedia.org/wiki/Spatial_ability (accessed on 10 November 2017).
- Wai, J.; Lubinski, D.; Benbow, C.P. Spatial ability for STEM domains: Aligning over 50 years of cumulative psychological knowledge solidifies its importance. J. Educ. Psychol. 2009, 101, 817–835. [Google Scholar] [CrossRef]
- Scratch MIT. Available online: http://scratch.mit.edu/ (accessed on 24 November 2016).
- Howell, S. Kinect2Scratch (Version 2.5) [Computer Software]. Available online: http://scratch.saorog.com (accessed on 24 November 2016).
- Aleem, S.; Capretz, L.; Ahmed, F. Game development software engineering process life cycle: A systematic review. J. Softw. Eng. Res. Dev. 2016, 4, 1–30. [Google Scholar] [CrossRef]
- GameStar Mechanic (GSM)-Getting-Started-Teacher Pack 2011. Available online: https://sites.google.com/a/elinemedia.com/gsmlearningguide/getting-started-teacher-pack (accessed on 30 November 2016).
- Bass, K.M.; Dahl, I.H.; Panahandeh, S. Designing the game: How a project-based media production program approaches STEAM career readiness for underrepresented young adults. J. Sci. Educ. Technol. 2016, 25, 1009–1024. [Google Scholar] [CrossRef]
- Schell, J. The Art of Game Design: A Book of Lenses; Kaufmann, M., Ed.; Elsevier: Amsterdam, The Netherlands, 2008. [Google Scholar]
- The 35 Gamification Mechanics Toolkit v2.0. Available online: http://www.epicwinblog.net/2013/10/the-35-gamification-mechanics-toolkit.html (accessed on 5 November 2016).
- Mueller, F.; Gibbs, M.R.; Vetere, F.; Edge, D. Supporting the creative game design process with exertion cards. In Proceedings of the CHI ’14 Conference on Human Factors in Computing Systems; ACM: New York, NY, USA, 2014. [Google Scholar]
- Hornecker, E. Creative idea exploration within the structure of a guiding framework: The card brainstorming game. In Proceedings of the Fourth International Conference on Tangible, Embedded, and Embodied Interaction, Cambridge, MA, USA, 24–27 January 2010. [Google Scholar]
- Ke, F.; Im, T. A case study on collective cognition and operation in team-based computer game design by middle-school children. Int. J. Technol. Design Educ. 2014, 24, 187–201. [Google Scholar] [CrossRef]
- Baytak, A.; Land, S.M. An investigation of the artifacts and process of constructing computers games about environmental science in a fifth grade classroom. Educ. Technol. Res. Dev. 2011, 59, 765–782. [Google Scholar] [CrossRef]
- Mouza, C.; Pan, Y.; Pollock, L.; Atlas, J.; Harvey, T. Partner4CS: Bringing computational thinking to middle school through game design. In Proceedings of the Stanford University FabLearn Conference on Creativity and Fabrication, Palo Alto, CA, USA, 25–26 October 2014. [Google Scholar]
- Kafai, Y.; Vasudevan, V. Constructionist gaming beyond the screen: Middle school students’ crafting and computing of touchpads, board games, and controllers. In Proceedings of the Workshop in Primary and Secondary Computing Education, London, UK, 9–11 November 2015. [Google Scholar]
- Feng, C.Y.; Chen, M.P. The effects of goal specificity and scaffolding on programming performance and self-regulation in game design. Br. J. Educ. Technol. 2014, 45, 285–302. [Google Scholar] [CrossRef]
- Evaluation Criteria—Storyboards or Design Documents. Available online: http://www.write4web.com/wp-content/uploads/2011/10/My.Rubric.storyboard_design.pdf (accessed on 24 November 2016).
- Dr. Scratch (Analyze your Scratch projects here!). Available online: http://www.drscratch.org/ (accessed on 10 May 2016).
- Moreno-León, J.; Robles, G. Analyze Your Scratch Projects with Dr. Scratch and Assess Your Computational Thinking Skills; AMS: Amsterdam, The Netherlands, 2015. [Google Scholar]
- Happy Analysing (Only Works with Scratch 1.x). Available online: http://happyanalyzing.com/ (accessed on 12 September 2016).
- Gravestock, P.; Gregor-Greenleaf, E. Student Course Evaluations: Research, Models and Trends; Higher Education Quality Council of Ontario: Toronto, ON, USA, 2008; Available online: http://www.heqco.ca/en-CA/Research/Research%20Publications/Pages/Home.aspx (accessed on 10 December 2016).
- Farzaneh, N.; Nejadansari, D. Students’ attitude towards using cooperative learning for teaching reading comprehension. Theory Prac. Lang. Stud. 2014, 4, 287–292. [Google Scholar] [CrossRef]
- Leal, A.; Ferreira, D. Learning programming patterns using games. Int. J. Inf. Commun. Technol. Educ. 2014, 12, 23–34. [Google Scholar] [CrossRef]
- Kuruvada, P.; Asamoah, D.A.; Dalal, N.; Kak, S. The Use of Rapid Digital Game Creation to Learn Computational Thinking. 2010. Available online: http://arxiv.org/pdf/1011.4093.pdf (accessed on 12 November 2017).
- Wu, M.L.; Richards, K. Facilitating computational thinking through game design. In International Conference on Technologies for E-Learning and Digital Entertainment; Springer: Heidelberg/Berlin, Germany, 2011. [Google Scholar]
- Scaffidi, C.; Chambers, C. Skill progression demonstrated by users in the scratch animation environment. Int. J. Hum.-Comput. Interact. 2012. [Google Scholar] [CrossRef]
- Williams, L.; Wiebe, E.; Yang, K.; Ferzli, M.; Miller, C. In support of pair programming in the introductory computer science course. Comput. Sci. Educ. 2002, 12, 197–212. [Google Scholar] [CrossRef]
| Stage | Timeline (Weeks) | Deliverables |
|---|---|---|
| Stage 0: Introduction | 1 | |
| Stage 1: Understanding NUI | 1 | |
| Stage 2: Design a Kinect Game | 2 | |
| Stage 3: Developing a Kinect Game | 2–3 | |
| Stage 4: Playtesting and Evaluation | 2 | |
| Total | 8–9 | |
| Initial Students’ Goals Classification | Number of Students |
|---|---|
| Learn programming | 15/22 |
| Create games | 13/22 |
| Understand the game design process | 6/22 |
| Learn Scratch | 7/22 |
| Get familiarised/interact with Kinect | 5/22 |
| Have fun | 5/22 |
| Enhance cooperation skills | 2/22 |
| Research Questions (RQ) | Research Assessment Tools |
|---|---|
| RQ1: Were the students’ thinking skills enhanced as a result of the approach? | For computational thinking skills: |
| RQ2: Was the proposed educational approach implemented successfully and as planned in the school environments? | |
| RQ3: Was the proposed educational approach appreciated by the participants? | |
| Game Design Documents | Storyboards | Scratch Games |
|---|---|---|
| CT Programming Concepts | Basic | Developing | Proficient |
|---|---|---|---|
| Flow control | 7/11 | 4/11 games with “Repeat-Until” commands | |
| Abstraction | 11/11 | | |
| User interactivity | 11/11 | | |
| Synchronisation | 2/11 | 9/11 games with “Wait-Until” commands | |
| Parallelism | 11/11 | | |
| Logic | 2/11 | 9/11 games had more than one condition inside the If block. | |
| Students Should Avoid | Detected In |
|---|---|
| Duplicated scripts | 9.06% (156/1721 scripts) |
| Incorrect sprite names | 9.09% (33/363 sprites) |
| Dead code | 0.35% (6/1721 scripts) |
| Uninitialised sprite attributes | 1.10% (4/363 sprites) |
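Two of the checks reported above could be approximated by a small lint pass over a project’s sprites and scripts. This is a hypothetical sketch: the simplified project structure and function names are assumptions, not the Scrape tool the study actually used, and the opcodes are illustrative:

```python
import re
from collections import Counter

# Simplified project model (not the real Scratch file format): each
# sprite has a name and a list of scripts, each script a list of opcodes.
DEFAULT_NAME = re.compile(r"^Sprite\d+$")

def incorrect_names(sprites):
    """Sprites whose default editor name (Sprite1, Sprite2, ...) was kept."""
    return [s["name"] for s in sprites if DEFAULT_NAME.match(s["name"])]

def duplicated_scripts(sprites):
    """Scripts whose opcode sequence appears more than once in the project."""
    counts = Counter(tuple(script) for s in sprites for script in s["scripts"])
    return [list(seq) for seq, n in counts.items() if n > 1]

project = [
    {"name": "Player",  "scripts": [["whenflagclicked", "forever", "move"]]},
    {"name": "Sprite2", "scripts": [["whenflagclicked", "forever", "move"]]},
]
print(incorrect_names(project))     # ['Sprite2']
print(duplicated_scripts(project))  # [['whenflagclicked', 'forever', 'move']]
```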
| Evaluation Criteria for Storyboards and Design Documents | Excellent (A) | OK (B) | Weak (C) | Excellent (A) (%) | OK (B) (%) | Weak (C) (%) |
|---|---|---|---|---|---|---|
| Design describes or shows the objective, purpose and scope | 4/11 | 7/11 | 0/11 | 45.45% | 36.36% | 18.18% |
| Design describes or shows the reader who the intended audience is | 10/11 | 1/11 | 0/11 | 90.91% | 9.09% | 0.00% |
| Design describes any media planned for the project | 3/11 | 3/11 | 5/11 | 27.27% | 72.73% | 0.00% |
| Design has a logical organisation, effective structure and guide for the reader | 4/11 | 5/11 | 2/11 | 45.45% | 36.36% | 18.18% |
| Design is succinct and clear | 4/11 | 5/11 | 2/11 | 45.45% | 36.36% | 18.18% |
| Design depicts a clear ending | 4/11 | 5/11 | 2/11 | 45.45% | 36.36% | 18.18% |
| Design has few mechanical defects | 3/11 | 6/11 | 2/11 | 45.45% | 36.36% | 18.18% |
| Students’ Thoughts about the Program | Teachers’ Thoughts about the Program |
|---|---|
| “It was nice. We tried something new and different!” | 1st teacher’s comments: “Within the current curriculum and timetable (1 h per week for the CS course), the approach seems to enhance the students’ programming skills and to motivate them, not only because they used the Kinect camera but also due to the ‘game’ element. It worked really well as an initial step for students who are interested in Computer Science and in the future use of more advanced authoring tools. Through these kinds of projects, students understand the development stages, the required distribution of work, and the process and results of playtesting. It was a constructive and appreciable experience.” 2nd teacher’s comments: “A pleasant, productive and educational process. It certainly won the students’ interest.” |
| “Outstanding experience and I’ll do that again.” | |
| “It was a very useful learning program.” | |
| “I’m very pleased because I’ve learned a lot about the Kinect camera and new potentials of Scratch.” | |
| “I gained experience from this Kinect process and I’m feeling more confident in building any kind of game.” | |
| “It was a beautiful experience due to the fact that we learned how to follow a process for building Kinect games, we cooperated and we interacted with our classmates.” | |
| “I’m very pleased because I’ve learned a lot about the games’ functionality and the required scripts for programming.” | |
| “A beautiful experience. I believe that I improved my CT skills.” | |
| “The program motivates someone to work even more in this area.” | |
| “I’ve learned how a game works, using variables, blocks, loops etc., and at the same time it was fun and interesting.” | |
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Altanis, I.; Retalis, S.; Petropoulou, O. Systematic Design and Rapid Development of Motion-Based Touchless Games for Enhancing Students’ Thinking Skills. Educ. Sci. 2018, 8, 18. https://doi.org/10.3390/educsci8010018