Educational Dashboards for Smart Learning: Review of Case Studies

  • Conference paper
  • First Online:
Emerging Issues in Smart Learning

Part of the book series: Lecture Notes in Educational Technology (LNET)

Abstract

An educational dashboard is a display that visualizes the results of educational data mining in a useful way. Educational data mining and visualization techniques allow teachers and students to monitor and reflect on their online teaching and learning behavior patterns. Previous literature has included such information in dashboards to support students’ self-knowledge, self-evaluation, self-motivation, and social awareness. Further, educational dashboards are expected to support smart learning environments, in that students receive personalized, automatically generated information in real time based on the log files of the Learning Management System (LMS). In this study, we reviewed ten case studies that deal with the development and evaluation of such tools for supporting students and teachers through educational data mining techniques and visualization technologies. A conceptual framework based on Few’s principles of dashboard design and Kirkpatrick’s four-level evaluation model was developed to review the educational dashboards. Ultimately, this study is expected to assess the current state of educational dashboard development and to suggest an evaluative tool for judging whether a dashboard functions properly, both pedagogically and visually.
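
To make the data pipeline concrete: the sketch below aggregates hypothetical LMS log records into a few per-student indicators (login counts, time on task, forum posts) and renders one of them as a simple chart, the kind of processing an educational dashboard performs on LMS log files. It is a minimal sketch, not the method of any reviewed dashboard; the event names, columns, libraries (pandas, matplotlib), and sample data are illustrative assumptions.

```python
# Minimal sketch (not from the paper): aggregating hypothetical LMS event logs
# into per-student dashboard indicators. All names and sample data are assumed.
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical LMS event log: one row per logged action.
logs = pd.DataFrame({
    "student_id": ["s01", "s01", "s02", "s02", "s02", "s03"],
    "event": ["login", "view_resource", "login", "post_forum", "view_resource", "login"],
    "duration_min": [0, 12, 0, 8, 25, 0],
})

# Roll raw events up into the indicators a dashboard widget might display:
# login frequency, total time on task, and forum participation per student.
indicators = logs.groupby("student_id").agg(
    logins=("event", lambda e: (e == "login").sum()),
    time_on_task_min=("duration_min", "sum"),
    forum_posts=("event", lambda e: (e == "post_forum").sum()),
)
print(indicators)

# Render one indicator as a simple chart, as a dashboard might.
indicators["time_on_task_min"].plot(kind="bar", title="Time on task (minutes)")
plt.tight_layout()
plt.show()
```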

References

  • J. Bayer, H. Bydžovská, J. Géryk, T. Obšívač, and L. Popelínský, “Predicting drop-out from social behaviour of students,” in Proceedings of the 5th International Conference on Educational Data Mining (EDM 2012), 2012, pp. 103-109.

  • C. Romero, P. G. Espejo, A. Zafra, J. R. Romero, and S. Ventura, “Web usage mining for predicting final marks of students that use Moodle courses,” Computer Applications in Engineering Education, vol. 21, pp. 135-146, 2013.

  • E. J. Lauría and J. Baron, “Mining Sakai to Measure Student Performance: Opportunities and Challenges in Academic Analytics.”

  • N. Thai-Nghe, L. Drumond, A. Krohn-Grimberghe, and L. Schmidt-Thieme, “Recommender system for predicting student performance,” Procedia Computer Science, vol. 1, pp. 2811-2819, 2010.

  • N. Bousbia and I. Belamri, “Which Contribution Does EDM Provide to Computer-Based Learning Environments?,” in Educational Data Mining, ed: Springer, 2014, pp. 3-28.

  • K. Verbert, E. Duval, J. Klerkx, S. Govaerts, and J. L. Santos, “Learning Analytics Dashboard Applications,” American Behavioral Scientist, 2013.

  • L. Ali, M. Hatala, D. Gašević, and J. Jovanović, “A qualitative evaluation of evolution of a learning analytics tool,” Computers & Education, vol. 58, pp. 470-489, 2012.

  • O. Scheuer and C. Zinn, “How did the e-learning session go? The Student Inspector,” Frontiers in Artificial Intelligence and Applications, vol. 158, p. 487, 2007.

  • A. Essa and H. Ayad, “Student success system: risk analytics and data visualization using ensembles of predictive models,” in Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, 2012, pp. 158-161.

  • K. E. Arnold and M. D. Pistilli, “Course signals at Purdue: Using learning analytics to increase student success,” in Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, 2012, pp. 267-270.

  • R. Mazza and C. Milani, “Gismo: a graphical interactive student monitoring tool for course management systems,” in Technology Enhanced Learning Conference, Milan, 2004.

  • D. Leony, A. Pardo, L. de la Fuente Valentín, D. S. de Castro, and C. D. Kloos, “GLASS: a learning analytics visualization tool,” in Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, 2012, pp. 162-163.

  • S. Dawson, A. Bakharia, and E. Heathcote, “SNAPP: Realising the affordances of real-time SNA within networked learning environments,” in Proceedings of the 7th International Conference on Networked Learning, 2010, pp. 125-133.

  • K. Upton and J. Kay, “Narcissus: group and individual models to support small group work,” in User Modeling, Adaptation, and Personalization, ed: Springer, 2009, pp. 54-65.

  • J. L. Santos, K. Verbert, S. Govaerts, and E. Duval, “Addressing learner issues with StepUp!: an Evaluation,” in Proceedings of the Third International Conference on Learning Analytics and Knowledge, 2013, pp. 14-22.

  • S. Govaerts, K. Verbert, E. Duval, and A. Pardo, “The student activity meter for awareness and self-reflection,” in CHI’12 Extended Abstracts on Human Factors in Computing Systems, 2012, pp. 869-884.

  • “W. W. Eckerson, Performance dashboards: Measuring, monitoring, and managing your business, 2nd ed.: Wiley, 2010.

  • V. Podgorelec and S. Kuhar, “Taking advantage of education data: Advanced data analysis and reporting in virtual learning environments,” Electronics and Electrical Engineering, vol. 114, pp. 111-116, 2011.

  • S. Malik, Enterprise dashboards: Design and best practices for IT: Wiley, 2005.

  • E. Duval, “Attention please!: learning analytics for visualization and recommendation,” in Proceedings of the 1st International Conference on Learning Analytics and Knowledge, 2011, pp. 9-17.

  • S. K. Card, J. D. Mackinlay, and B. Shneiderman, Readings in information visualization: using vision to think: Morgan Kaufmann, 1999.

  • R. Nisbet, J. Elder, and G. Miner, Handbook of statistical analysis and data mining applications: Academic Press, 2009.

  • S. Few, Information dashboard design: Displaying data for at-a-glance monitoring, 2nd ed. Burlingame, CA: Analytics Press, 2013.

  • S. Few, Now you see it: simple visualization techniques for quantitative analysis: Analytics Press, 2009.

  • M. R. Endsley, Designing for situation awareness: An approach to user-centered design: CRC Press, 2012.

  • K. L. Gustafson and R. M. Branch, Survey of instructional development models, 4th ed. New York: ERIC Clearinghouse on Information and Technology, 2002.

  • W. Dick, L. Carey, and J. O. Carey, The systematic design of instruction, 6th ed. Boston, MA: Allyn and Bacon, 2005.

  • W. Horton, Evaluating e-learning. Alexandria: ASTD (American Society for Training & Development), 2001.

  • T. C. Reeves, L. Benson, D. Elliott, M. Grant, D. Holschuh, B. Kim, et al., “Usability and Instructional Design Heuristics for E-Learning Evaluation,” 2002.

  • J. Brill and Y. Park, “Evaluating online tutorials for university faculty, staff, and students: The contribution of just-in-time online resources to learning and performance,” International Journal on E-Learning, vol. 10, pp. 5-26, 2011.

  • D. L. Kirkpatrick and J. D. Kirkpatrick, Evaluating training programs: The four levels, 3rd ed. San Francisco: Berrett-Koehler, 2006.

  • O. Scheuer and C. Zinn, “How did the e-learning session go? The Student Inspector,” in 13th International Conference on Artificial Intelligence in Education, Amsterdam, The Netherlands, 2007.

  • G. J. Houben, G. McCalla, F. Pianesi, and M. Zancanaro, Eds., User Modeling, Adaptation, and Personalization: Springer, 2009.

  • Blackboard Inc., Blackboard & Montgomery County Community College: Solution focus, student analytics.

Author information

Correspondence to Yeonjeong Park.

Appendix 1. An Evaluation Framework for Educational Dashboards

Criteria, sub-categories, and indexes:

1. Reaction
   Goal-orientation
     1. A dashboard identifies goals related to the specific information presented.
     2. A dashboard helps users monitor goal-related activities.
   Information usefulness
     3. A dashboard displays the information that users want to know.
     4. A dashboard includes essential information only.
   Visual effectiveness
     5. A dashboard consists of visual elements.
     6. A dashboard fits on a single computer screen.
     7. A dashboard presents visual information that users can scan at a glance.
     8. Visual elements in a dashboard are arranged for rapid perception.
   Appropriation of visual representation
     9. A dashboard includes proper graphic representations.
     10. Graphs in a dashboard appropriately represent the scales and units.
     11. A dashboard delivers information in a concise, direct, and clear manner.
     12. A dashboard uses appropriate pre-attentive attributes, such as form, color, spatial position, and motion.
     13. A dashboard displays information correctly on both desktop computers and mobile devices.
   User friendliness
     14. A dashboard is easy to access.
     15. A dashboard is customized to users’ context.
     16. A dashboard has intuitive interfaces and menus that are easy to use.
     17. A dashboard allows users to explore additional information that is embedded or hidden on the single page.

2. Learning
   Understanding
     18. A user understands what the visual information in a dashboard implies.
   Reflection
     19. A user understands what the statistical information in a dashboard implies.

3. Behavior
   Learning motivation
     23. A user is motivated to engage in learning as he/she reviews the dashboard.
     24. A user makes plans for his/her own learning based on the information in a dashboard.
   Behavioral change
     25. A user manages his/her learning activities based on a dashboard.
     26. A user makes changes in his/her learning patterns as he/she monitors the information in a dashboard.

4. Result
   Performance improvement
     27. A dashboard helps users achieve their learning goals.
     28. A dashboard enhances users’ academic achievement.
   Competency development
     29. A dashboard enhances users’ self-management skills.
     30. A dashboard enhances users’ social values and networking competency.
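
The appendix specifies the checklist above but not a scoring procedure. As a minimal illustrative sketch, the snippet below encodes the criterion-to-index grouping and averages ratings per criterion, assuming a reviewer assigns each index a rating on a 1-5 scale; the rating scale, the summarize helper, and the example ratings are hypothetical additions, not part of the framework itself.

```python
# Illustrative only: the appendix defines the checklist, not a scoring scheme.
# The 1-5 rating scale and the helper below are assumptions for this sketch.
from statistics import mean

# Index numbers grouped by criterion, as listed in Appendix 1.
FRAMEWORK = {
    "Reaction": list(range(1, 18)),   # indexes 1-17
    "Learning": [18, 19],
    "Behavior": [23, 24, 25, 26],
    "Result": [27, 28, 29, 30],
}

def summarize(ratings):
    """Average the 1-5 ratings of the indexes belonging to each criterion.

    `ratings` maps index number -> rating; unrated indexes are skipped.
    """
    summary = {}
    for criterion, indexes in FRAMEWORK.items():
        scored = [ratings[i] for i in indexes if i in ratings]
        summary[criterion] = round(mean(scored), 2) if scored else None
    return summary

# Hypothetical ratings for one dashboard under review.
example_ratings = {1: 4, 2: 5, 3: 4, 5: 3, 18: 4, 19: 3, 23: 2, 27: 3}
print(summarize(example_ratings))   # e.g. {'Reaction': 4.0, 'Learning': 3.5, ...}
```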

Copyright information

© 2015 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Yoo, Y., Lee, H., Jo, IH., Park, Y. (2015). Educational Dashboards for Smart Learning: Review of Case Studies. In: Chen, G., Kumar, V., Kinshuk, Huang, R., Kong, S. (eds) Emerging Issues in Smart Learning. Lecture Notes in Educational Technology. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-44188-6_21
