CI 5330 Section 002: Learning Analytics in the Knowledge Age

Instructor Information

  • Bodong Chen, Assistant Professor
  • Email: chenbd@umn.edu
  • Phone: (773) 850-1032
  • Office: 1954 Buford Avenue, 210B LES, St. Paul, MN 55108
  • Office Hours: By Appointment

Course Description

Overview

Learning analytics is a nascent field of research that aspires to "turn educational data into actionable knowledge, in order to optimize learning and the environments in which it occurs." This course aims to provide a general, non-technical survey of learning analytics, as well as its application in various educational contexts. In particular, we will discuss foundations of the field, explore new forms of assessment, become acquainted with popular data mining techniques, review learning analytics tools and cases, and design and develop new analytic tools ourselves, all with an emphasis on emergent competencies in the knowledge age. Additional support will be provided for students interested in pursuing specific issues in any of these areas. Overall, this will be a great course for getting a broad overview of the field of learning analytics.

Audience

The course is designed for a broad audience. All graduate students interested in learning analytics and its application in specific educational areas (e.g., STEM, literacies, life-long learning) are welcome.

Prerequisites: None, but some prior knowledge of learning theories, assessment, and/or data science is recommended.

Objectives

By the end of the course, students should:

  1. Understand the logic of analytics;
  2. Identify and describe key epistemological, pedagogical, ethical, and technical factors related to the design of learning analytics;
  3. Be familiar with the basics of finding, cleaning, and using educational data;
  4. Understand some of the popular data mining techniques, including predictive models, text analysis, relationship mining, and social networks;
  5. Develop beginning skills necessary to plan and design learning analytics;
  6. Be able to apply data analytic skills in their own research.

Course Design

This is a Knowledge Building course, which means all participants (including the instructor) collectively produce ideas and knowledge as a community in order to solve authentic learning analytics problems (see the Scardamalia and Bereiter chapter in the Week 1 readings for an explanation of Knowledge Building). Our top-level goal in this course will be to work as a knowledge building team, exploring the capacity of learning analytics to support growth in learning in different domains. This overarching goal will be interwoven throughout the course, and we will advance it through analysis of readings, case studies, and innovative design.

Course Timeline

The first seven weeks are designed to provide an introduction to the field of learning analytics, including its roots, basic logic, data mining techniques, and case studies. These weeks feature both theoretical discussion, with emphasis on the assumptions underlying analytics tools and projects, and hands-on learning activities. During the process, students will form working groups (WGs) around emergent design problems in different contexts. Students will also sign up for one of the five "themes" representing key research areas in the field of learning analytics (see detailed class schedule below) to form special interest groups (SIGs).

The second part of the course features five themes, each led by a corresponding SIG. Each SIG is expected to take the lead on its theme: designing classroom activities, presenting key ideas, and facilitating discussion. Each SIG will meet with the instructor one week in advance to finalize its course plan. In the meantime, each WG will keep advancing its design.

The class will use the final weeks to further advance our designs and synthesize what we have learned. Each WG will present its work to the class, and together we will reflect on our designs and explore ways to improve them further.

Supporting Environments

  • Online Learning Environment: Knowledge Forum (KF)
    • Register for an account
    • Sign up for the course using access code: "last"
  • Use #LAUMN when you post on social media (e.g., Twitter, Facebook, Google+); send the instructor your RSS feed if you blog. Social pulses related to the course will be aggregated on a Netvibes dashboard.

Course Evaluation

Parameters

  • Group- and Individual-Assessment: Students will be assessed both individually and as a group (SIG and WG)
  • Teacher- and Peer-Assessment: Students will be assessed both by the instructor and by peers, based on their personal growth and contributions to the community

Grading

  • Class participation, 15%
  • Online participation, 15%
  • SIG presentation (group), 20%
  • WG project artifact (group), 20%
  • WG presentation (group), 15%
  • Reflection essay or portfolio, 15%

Participation means taking an active and constructive part in both online and offline discussions. Evaluation will be based on numeric metrics exported from Knowledge Forum as well as qualitative assessment of one's contributions to discussions.

Two group presentations (the SIG and WG presentations) will be peer-assessed: when one group presents, the other groups will evaluate the presentation using a given rubric. Students in the same group receive the same score.

A WG project artifact could be a design document, a research plan, or a functioning prototype, depending on the problem the WG chooses to tackle. (Each WG should come up with a tentative project proposal to discuss with the instructor by the end of Week 11.)

Final assignment: Students may choose between writing a reflective essay (not exceeding 2,000 words, excluding references) and preparing a portfolio Knowledge Forum note reflecting on their journey through the course. Deadline: May 15, 2015.
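
To make the weighting concrete, here is a minimal sketch in R (one of our course tools) of how the components combine into a final score; the component scores below are hypothetical placeholders.

```r
# Final grade as a weighted sum of the six components listed above.
# The scores are hypothetical placeholders, not real data.
scores  <- c(class = 90, online = 85, sig = 88, artifact = 92, wg = 87, final = 95)
weights <- c(class = .15, online = .15, sig = .20, artifact = .20, wg = .15, final = .15)
sum(scores * weights)  # weights total 1.0
#> [1] 89.55
```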


Class Schedule

  • Special Topics: Learning Analytics
  • Classroom: LES R250, St Paul campus
  • Time: Thursdays, 05:30 - 08:10 PM

Overview

| Week/Date       | Topic                                      | Important Notes                              |
|-----------------|--------------------------------------------|----------------------------------------------|
| Week 1, Jan 22  | Introduction                               | Intake survey                                |
| Week 2, Jan 29  | Learning Analytics: A Brief Overview       |                                              |
| Week 3, Feb 5   | What to Assess? "New Competencies"         |                                              |
| Week 4, Feb 12  | Explore Hidden Assumptions                 | Meet with Simon Knight; class starts at 5pm  |
| Week 5, Feb 19  | Educational Data Mining                    | SIG signup; WG signup                        |
| Week 6, Feb 26  | Cases and Examples of Learning Analytics   | Meet with Prof. George Karypis               |
| Week 7, Mar 5   | Data Wrangling Hands-on                    | Meet with Stian Haklev                       |
| Week 8, Mar 12  | Learning and Knowledge Growth (theme 1)    |                                              |
| Week 9, Mar 19  | Spring Break; LAK Conference               | NO CLASS                                     |
| Week 10, Mar 26 | Social Networks (theme 2)                  |                                              |
| Week 11, Apr 2  | Mining of Text and Discourse (theme 3)     | Tentative WG proposal due                    |
| Week 12, Apr 9  | Temporality in Learning (theme 4)          |                                              |
| Week 13, Apr 16 | AERA Conference; Group/Individual Study    |                                              |
| Week 14, Apr 23 | Prediction and Intervention (theme 5)      |                                              |
| Week 15, Apr 30 | WG Presentations                           |                                              |
| Week 16, May 7  | WG Presentations                           | Exit survey; final assignment due by May 15  |

Week 1: Introduction

Readings

  • Scardamalia, M. and Bereiter, C. (2003). Knowledge building. In Guthrie, J. W., editor, Encyclopedia of education, volume 17, pages 1370–1373. Macmillan Reference, New York, NY, 2nd edition.
  • Woolley, A., et al. (2015, January). Why Some Teams Are Smarter Than Others. New York Times.

Learning Activities

  • Complete the intake survey
  • Get familiar with Knowledge Forum (KF)
  • KF Discussion
    1. Introduce yourself and tell people why you're here!
    2. Discuss learning analytics research and projects you are aware of

Week 2: Learning Analytics: A Brief Overview

Readings

Activities

  • KF Discussion: Discuss readings in KF
  • Start planning the final analytics project

Week 3: What to Assess: "New Competencies" in the Knowledge Age

Readings

Activities

Week 4: Explore Hidden Assumptions: Epistemology, Pedagogy, Assessment, and Learning across Levels

Readings

Activities

  • Virtual meeting with our guest speaker, Simon Knight, the Open University
  • KF discussion

Week 5: Educational Data Mining: An Overview

Readings

Activities

  • SIG and WG signup; 2-3 students per group
  • KF discussion

Week 6: Cases and Examples of Learning Analytics

Readings

Activities

  • Guest speaker TBD
  • KF Discussion

Week 7: Data Wrangling Hands-on

Readings: None

Tools

  • R
  • Google Refine
  • Tableau

Activities

  • Watch Tony Hirst's talk
  • Meet with our guest speaker Stian Haklev, University of Toronto
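
To preview the kind of work we will do in this session, here is a minimal data-wrangling sketch in R; the log file name and its columns are hypothetical stand-ins for whatever dataset we end up using.

```r
# Load a hypothetical activity log exported as CSV, with columns:
# student, timestamp, action. (File name and columns are placeholders.)
log <- read.csv("kf_log.csv", stringsAsFactors = FALSE)

# Parse timestamps and drop rows that fail to parse.
log$time <- as.POSIXct(log$timestamp, format = "%Y-%m-%d %H:%M:%S")
log <- log[!is.na(log$time), ]

# Label each row with its week, then count each student's actions per week.
log$week <- format(log$time, "%Y-%U")
actions_per_week <- aggregate(action ~ student + week, data = log, FUN = length)
head(actions_per_week)
```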

Week 8: Learning and Knowledge Growth (theme 1)

Suggested Readings

  • Schwarz, C. V., Reiser, B. J., Davis, E. A., Kenyon, L., Acher, A., Fortus, D., Shwartz, Y., Hug, B., and Krajcik, J. (2009). Developing a learning progression for scientific modeling: Making scientific modeling accessible and meaningful for learners. Journal of Research in Science Teaching, 46(6):632–654.
  • Bull, S. and Kay, J. (2010). Open learner models. In Nkambou, R., Bourdeau, J., and Mizoguchi, R., editors, Advances in Intelligent Tutoring Systems, chapter 15, pages 318–338. Springer.
  • Desmarais, M. C., & Baker, R. S. J. d. (2011). A review of recent advances in learner and skill modeling in intelligent learning environments. User Modeling and User-Adapted Interaction, 22(1-2), 9–38. doi:10.1007/s11257-011-9106-8

Activities: To be designed by SIG 1
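
For a concrete flavor of the skill modeling surveyed in Desmarais and Baker (2011), here is a minimal sketch of Bayesian Knowledge Tracing in R; the parameter values are illustrative, not fitted to data.

```r
# One step of Bayesian Knowledge Tracing (BKT): update the probability
# that a student knows a skill, given one observed response.
# Parameter values here are illustrative, not fitted.
bkt_update <- function(p_know, correct, p_slip = 0.1, p_guess = 0.2, p_learn = 0.15) {
  # Posterior probability of knowing the skill, given the observed response.
  p_obs <- if (correct) {
    p_know * (1 - p_slip) / (p_know * (1 - p_slip) + (1 - p_know) * p_guess)
  } else {
    p_know * p_slip / (p_know * p_slip + (1 - p_know) * (1 - p_guess))
  }
  # Account for the chance of learning between practice opportunities.
  p_obs + (1 - p_obs) * p_learn
}

# Trace mastery over a hypothetical response sequence (1 = correct).
p <- 0.3
for (r in c(1, 0, 1, 1)) p <- bkt_update(p, r == 1)
p
```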

Week 9: Spring Break; NO CLASS

Week 10: Social Networks (theme 2)

Suggested Readings

  • Haythornthwaite, C. (1996). Social network analysis: An approach and technique for the study of information exchange. Library & Information Science Research, 18(4):323–342.
  • Grunspan, D. Z., Wiggins, B. L., & Goodreau, S. M. (2014). Understanding Classrooms through Social Network Analysis: A Primer for Social Network Analysis in Education Research. CBE-Life Sciences Education, 13(2), 167–178. doi:10.1187/cbe.13-08-0162
  • Oshima, J., Oshima, R., and Matsuzawa, Y. (2012). Knowledge Building Discourse Explorer: a social network analysis application for knowledge building discourse. Educational Technology Research and Development, 60(5):903–921.
  • Chen, B., Chen, X., & Xing, W. (2015). Twitter Archeology of Learning Analytics and Knowledge Conferences. Paper to be presented at the 2015 Learning Analytics and Knowledge Conference.

Activities: To be designed by SIG 2

Suggested Tools

  • Social Networks Adapting Pedagogical Practice (SNAPP)
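
As a small taste of network analysis before SIG 2's session, here is a minimal sketch using the igraph package in R; the who-replied-to-whom data are hypothetical.

```r
# Build a directed network from a hypothetical edge list of discussion replies.
library(igraph)

replies <- data.frame(
  from = c("Ann", "Ben", "Cai", "Ann", "Dee"),
  to   = c("Ben", "Cai", "Ann", "Dee", "Ann")
)
g <- graph_from_data_frame(replies, directed = TRUE)

# Simple centrality measures of the kind discussed in the SNA readings.
degree(g, mode = "in")  # who receives the most replies
betweenness(g)          # who brokers between others
plot(g)                 # quick visual inspection of the network
```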

Week 11: Mining of Text and Discourse (theme 3)

Suggested Readings

  • Rohrer, R., Ebert, D., and Sibert, J. (1998). The shape of Shakespeare: visualizing text using implicit surfaces. In Proceedings of IEEE Symposium on Information Visualization, pages 121–129. IEEE Comput. Soc.
  • Rosé, C. P., Wang, Y.-C., Cui, Y., Arguello, J., Stegmann, K., Weinberger, A., and Fischer, F. (2008). Analyzing collaborative learning processes automatically: Exploiting the advances of computational linguistics in computer-supported collaborative learning. International Journal of Computer-Supported Collaborative Learning, 3(3):237–271.
  • Optional
    • Shermis, M. D. (2014). State-of-the-art automated essay scoring: Competition, results, and future directions from a United States demonstration. Assessing Writing, 20, 53–76.
    • Simsek, D., Buckingham Shum, S., Sandor, A., De Liddo, A., and Ferguson, R. (2013). Xip dashboard: visual analytics from automated rhetorical parsing of scientific metadiscourse. In 1st International Workshop on Discourse-Centric Learning Analytics.

Activities: To be designed by SIG 3

Suggested Tools

  • ManyEyes
  • LightSIDE
  • RapidMiner
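
Before SIG 3's session, here is a minimal text-analysis sketch in base R, counting term frequencies in a handful of hypothetical discussion notes.

```r
# Tokenize a few hypothetical discussion notes and count term frequencies.
notes <- c("Our theory explains the data",
           "We need better evidence for this theory")

tokens <- tolower(unlist(strsplit(notes, "[^a-zA-Z]+")))
tokens <- tokens[nchar(tokens) > 0]

# Drop a few common stopwords before counting.
stopwords <- c("the", "we", "for", "this", "our")
freq <- sort(table(tokens[!tokens %in% stopwords]), decreasing = TRUE)
freq
```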

Week 12: Temporality in Learning (theme 4)

Suggested Readings

  • Reimann, P. (2009). Time is precious: Variable- and event-centred approaches to process analysis in CSCL research. International Journal of Computer-Supported Collaborative Learning, 4(3):239–257.
  • Kinnebrew, J., Segedy, J., and Biswas, G. (2014). Analyzing the temporal evolution of students’ behaviors in open-ended learning environments. Metacognition and Learning, 9(2):187–215.
  • Magnusson, M. S. (2000). Discovering hidden time patterns in behavior: T-patterns and their detection. Behavior Research Methods, Instruments, & Computers, 32(1):93–110.
  • Chen, B. and Resendes, M. (2014). Uncovering what matters: Analyzing transitional relations among contribution types in knowledge-building discourse. In Proceedings of the Fourth International Conference on Learning Analytics And Knowledge - LAK ’14, pages 226–230, New York, New York, USA. ACM Press.

Activities: To be designed by SIG 4
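
To illustrate the kind of temporal analysis Chen and Resendes (2014) describe, here is a minimal sketch in R that tabulates transitions between coded contribution types; the coded sequence is hypothetical.

```r
# A hypothetical sequence of coded contribution types in a discussion.
codes <- c("question", "theory", "evidence", "theory", "question", "theory")

# Pair each contribution with the one that follows it, then tabulate.
transitions <- table(from = head(codes, -1), to = tail(codes, -1))
transitions

# Row-normalize to get empirical transition probabilities.
prop.table(transitions, margin = 1)
```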

Week 13: AERA; Group/Individual Study

Week 14: Prediction and Intervention (theme 5)

Suggested Readings

  • Pardos, Z.A., Baker, R.S.J.d., San Pedro, M.O.C.Z., Gowda, S.M., Gowda, S.M. (2013). Affective states and state tests: Investigating how affect throughout the school year predicts end of year learning outcomes. In Proceedings of the 3rd International Conference on Learning Analytics and Knowledge.
  • Baker, R. S. J. d., D’Mello, S. K., Rodrigo, M. M. T., & Graesser, A. C. (2010). Better to be frustrated than bored: The incidence, persistence, and impact of learners’ cognitive–affective states during interactions with three different computer-based learning environments. International Journal of Human-Computer Studies, 68(4), 223–241. doi:10.1016/j.ijhcs.2
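
For a concrete starting point on this theme, here is a minimal predictive-modeling sketch in R, fitting a logistic regression on simulated (hypothetical) activity data.

```r
# Simulate a hypothetical dataset: early-course activity counts and a
# pass/fail outcome. None of this is real course data.
set.seed(1)
d <- data.frame(
  logins = rpois(100, 20),
  posts  = rpois(100, 5)
)
d$passed <- rbinom(100, 1, plogis(-3 + 0.1 * d$logins + 0.2 * d$posts))

# Fit a logistic regression predicting the outcome from activity.
model <- glm(passed ~ logins + posts, data = d, family = binomial)
summary(model)

# Predicted probability of passing for a new (hypothetical) student.
predict(model, newdata = data.frame(logins = 15, posts = 2), type = "response")
```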