CI 5330 Section 002: Learning Analytics in the Knowledge Age

Instructor Information

  • Bodong Chen, Assistant Professor
  • Email:
  • Phone: (773) 850-1032
  • Office: 1954 Buford Avenue, 210B LES, St. Paul, MN 55108
  • Office Hours: By Appointment

Course Description


Learning analytics is a nascent field of research that aspires to "turn educational data into actionable knowledge, in order to optimize learning and the environments in which it occurs." This course provides a general, non-technical survey of learning analytics and its application in various educational contexts. In particular, we will discuss foundations of the field, explore new forms of assessment, become acquainted with popular data mining techniques, review learning analytics tools and cases, and design/develop new analytic tools ourselves---all with emphasis on emergent competencies in the knowledge age. Additional support will be provided for students interested in pursuing specific issues in any of these areas. Overall, this course offers a broad overview of the field of learning analytics.


The course is designed for a broad audience. All graduate students interested in learning analytics and its application in specific educational areas (e.g., STEM, literacies, life-long learning) are welcome.

Prerequisites: None, but some prior knowledge of learning theories, assessment, and/or data science is recommended.


By the end of the course, students should:

  1. Understand the logic of analytics;
  2. Identify and describe key epistemological, pedagogical, ethical, and technical factors related to the design of learning analytics;
  3. Be familiar with the basics of finding, cleaning, and using educational data;
  4. Understand some of the popular data mining techniques, including predictive models, text analysis, relationship mining, and social networks;
  5. Develop beginning skills necessary to plan and design learning analytics;
  6. Be able to apply data analytic skills in their own research.

Course Design

This is a Knowledge Building course, which means all participants (including the instructor) collectively produce ideas and knowledge as a community in order to solve authentic learning analytics problems \footnote{See this article for an explanation of Knowledge Building: Scardamalia, M. and Bereiter, C. (2003). Knowledge building. In Guthrie, J. W., editor, Encyclopedia of education, volume 17, pages 1370–1373. Macmillan Reference, New York, NY, 2 edition.}. Our top-level goal will be to work as a knowledge building team, exploring the capacity of learning analytics to support growth in learning across domains. This overarching goal will be interwoven throughout the course, and we will advance it through analysis of readings, case studies, and innovative design.

Course Timeline

The first seven weeks are designed to provide an introduction to the field of learning analytics, including its roots, basic logic, data mining techniques, and case studies. These weeks feature both theoretical discussion, with emphasis on the assumptions underlying analytics tools and projects, and hands-on learning activities. During the process, students will form working groups (WGs) around emergent design problems in different contexts. Students will also sign up for one of the five "themes" representing key research areas in the field of learning analytics (see detailed class schedule below) to form special interest groups (SIGs).

The second part of the course features five themes, each led by a corresponding SIG. Each SIG is expected to take the lead on its theme---designing classroom activities, presenting key ideas, and facilitating discussion. Each SIG will meet with the instructor one week in advance to finalize their course plan. In the meantime, each WG will keep advancing its design.

The class will use the final weeks to further advance our designs and synthesize our work. Each WG will present its work to the class. Together we will reflect on our designs and explore ways to improve them further.

Supporting Environments

  • Online Learning Environment: Knowledge Forum (KF)
    • Register for an account
    • Sign up for the course using access code: "last"
  • Use #LAUMN when you post on social media (e.g., Twitter, Facebook, Google+); send the instructor your RSS feed if you blog. Social pulses related to the course will be aggregated on a netvibes dashboard

Course Evaluation


  • Group- and Individual-Assessment: Students will be assessed both individually and as a group (SIG and WG)
  • Teacher- and Peer-Assessment: Students will be assessed both by the instructor and by peers, based on their personal growth and contributions to the community


  • Class participation, 15%
  • Online participation, 15%
  • SIG presentation (group), 20%
  • WG project artifact (group), 20%
  • WG presentation (group), 15%
  • Reflection essay or portfolio, 15%

Participation involves active and constructive participation in online and offline discussions. Evaluation will be based on both numeric metrics exported from Knowledge Forum and qualitative assessment of one's contribution to discussions.

Two group presentations (i.e., SIG and WG presentations) will be peer-assessed: when one group presents, the other groups will evaluate the presentation using a given rubric. Students in the same group receive the same score.

A WG project artifact could be a design document, a research plan, or a functioning prototype, depending on the problem the WG chooses to tackle. (Each WG should develop a tentative project proposal to discuss with the instructor by the end of Week 11.)

Final assignment: Students may choose between writing a reflective essay (not exceeding 2,000 words, excluding references) and preparing a portfolio Knowledge Forum note reflecting on their journey in the course. Deadline: May 15, 2015.


Class Schedule

  • Special Topics: Learning Analytics
  • Classroom: LES R250, St Paul campus
  • Time: Thursdays, 05:30 - 08:10 PM


| Week/Date      | Topic                                     | Important Notes                              |
|----------------|-------------------------------------------|----------------------------------------------|
| Week 1, Jan 22 | Introduction                              | Intake survey                                |
| Week 2, Jan 29 | Learning Analytics: A Brief Overview      |                                              |
| Week 3, Feb 5  | What to Assess? "New Competencies"        |                                              |
| Week 4, Feb 12 | Explore Hidden Assumptions                | Meet with Simon Knight; class starts at 5pm  |
| Week 5, Feb 19 | Educational Data Mining                   | SIG signup; WG signup                        |
| Week 6, Feb 26 | Cases and Examples of Learning Analytics  | Meet with Prof. George Karypis               |
| Week 7, Mar 5  | Data Wrangling Hands-on                   | Meet with Stian Haklev                       |
| Week 8, Mar 12 | Learning and Knowledge Growth (theme 1)   |                                              |
| Week 9, Mar 19 | Spring Break; LAK Conference; NO CLASS    |                                              |
| Week 10, Mar 26| Social Networks (theme 2)                 |                                              |
| Week 11, Apr 2 | Mining of Text and Discourse (theme 3)    | Tentative WG proposal due                    |
| Week 12, Apr 9 | Temporality in Learning (theme 4)         |                                              |
| Week 13, Apr 16| AERA Conference; Group/Individual Study   |                                              |
| Week 14, Apr 23| Prediction and Intervention (theme 5)     |                                              |
| Week 15, Apr 30| WG Presentations                          |                                              |
| Week 16, May 7 | WG Presentations                          | Exit Survey; Final assignment due by May 15  |

Week 1: Introduction


Readings

  • Scardamalia, M. and Bereiter, C. (2003). Knowledge building. In Guthrie, J. W., editor, Encyclopedia of education, volume 17, pages 1370–1373. Macmillan Reference, New York, NY, 2 edition.
  • Woolley, A., et al. (2015, January). Why Some Teams Are Smarter Than Others. New York Times.

Learning Activities

  • Complete intake survey
  • Get familiar with Knowledge Forum (KF)
  • KF Discussion
    1. Introduce yourself and tell people why you're here!
    2. Discuss learning analytics research and projects you are aware of

Week 2: Learning Analytics: A Brief Overview



  • KF Discussion: Discuss readings in KF
  • Start planning the final analytics project

Week 3: What to Assess: "New Competencies" in the Knowledge Age



Week 4: Explore Hidden Assumptions: Epistemology, Pedagogy, Assessment, and Learning across Levels



  • Virtual meeting with our guest speaker, Simon Knight, Open University
  • KF discussion

Week 5: Educational Data Mining: An Overview



  • SIG and WG signup; 2-3 students per group
  • KF discussion

Week 6: Cases and Examples of Learning Analytics



  • Guest speaker TBD
  • KF Discussion

Week 7: Data Wrangling Hands-on

Readings: None


  • R
  • Google Refine
  • Tableau


  • Watch Tony Hirst's talk
  • Meet with our guest speaker Stian Haklev, University of Toronto
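For students who prefer scripting over point-and-click tools, the kind of cleaning these tools perform can be sketched in a few lines of code. The following Python sketch (using toy data invented for illustration, not actual course data) normalizes header names, trims stray whitespace, and standardizes several spellings of "missing":

```python
import csv
import io

# Toy export resembling a messy LMS log: inconsistent capitalization,
# stray whitespace, and multiple spellings of "missing".
raw = """student, week ,score
 alice,1,85
BOB , 2 ,
carol,2,N/A
dave,3,72
"""

MISSING = {"", "na", "n/a", "null"}

def clean(text):
    """Parse a messy CSV export into tidy records."""
    rows = []
    reader = csv.DictReader(io.StringIO(text))
    # Normalize header names: strip whitespace, lowercase.
    reader.fieldnames = [f.strip().lower() for f in reader.fieldnames]
    for rec in reader:
        student = rec["student"].strip().lower()
        week = int(rec["week"].strip())
        score_raw = rec["score"].strip().lower()
        score = None if score_raw in MISSING else float(score_raw)
        rows.append({"student": student, "week": week, "score": score})
    return rows

rows = clean(raw)
print(rows[0])  # {'student': 'alice', 'week': 1, 'score': 85.0}
```

The same steps translate directly to R's `read.csv` plus `trimws`, or to facet-and-transform operations in Google Refine.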

Week 8: Learning and Knowledge Growth (theme 1)

Suggested Readings

  • Schwarz, C. V., Reiser, B. J., Davis, E. A., Kenyon, L., Acher, A., Fortus, D., Shwartz, Y., Hug, B., and Krajcik, J. (2009). Developing a learning progression for scientific modeling: Making scientific modeling accessible and meaningful for learners. Journal of Research in Science Teaching, 46(6):632–654.
  • Bull, S. and Kay, J. (2010). Open learner models. In Nkambou, R., Bourdeau, J., and Mizoguchi, R., editors, Advances in Intelligent Tutoring Systems, chapter 15, pages 318–338. Springer.
  • Desmarais, M. C., & Baker, R. S. J. d. (2011). A review of recent advances in learner and skill modeling in intelligent learning environments. User Modeling and User-Adapted Interaction, 22(1-2), 9–38. doi:10.1007/s11257-011-9106-8

Activities: To be designed by SIG 1

Week 9: Spring Break; NO CLASS

Week 10: Social Networks (theme 2)

Suggested Readings

  • Haythornthwaite, C. (1996). Social network analysis: An approach and technique for the study of information exchange. Library & Information Science Research, 18(4):323–342.
  • Grunspan, D. Z., Wiggins, B. L., & Goodreau, S. M. (2014). Understanding Classrooms through Social Network Analysis: A Primer for Social Network Analysis in Education Research. CBE-Life Sciences Education, 13(2), 167–178. doi:10.1187/cbe.13-08-0162
  • Oshima, J., Oshima, R., and Matsuzawa, Y. (2012). Knowledge Building Discourse Explorer: a social network analysis application for knowledge building discourse. Educational Technology Research and Development, 60(5):903–921.
  • Chen, B., Chen, X., & Xing, W. (2015). Twitter Archeology of Learning Analytics and Knowledge Conferences. Paper to be presented at the 2015 Learning Analytics and Knowledge Conference.
  • Social Networks Adapting Pedagogical Practice

Activities: To be designed by SIG 2

Suggested Tools
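Dedicated tools (e.g., Gephi, UCINET) handle the heavy lifting, but the basic metrics are simple enough to compute by hand. As a flavor of what social network analysis of classroom interaction looks like, here is a minimal pure-Python sketch (with hypothetical students and reply ties) that computes degree centrality:

```python
from collections import defaultdict

# Hypothetical reply ties in a class discussion forum: an edge (a, b)
# means student a replied to student b at least once.
edges = [
    ("amy", "ben"), ("amy", "cal"), ("ben", "cal"),
    ("cal", "dee"), ("dee", "amy"), ("eva", "cal"),
]

def degree_centrality(edges):
    """Undirected degree centrality: ties / (n - 1), the standard SNA normalization."""
    neighbors = defaultdict(set)
    for a, b in edges:
        neighbors[a].add(b)
        neighbors[b].add(a)
    n = len(neighbors)
    return {node: len(nbrs) / (n - 1) for node, nbrs in neighbors.items()}

centrality = degree_centrality(edges)
# cal is tied to amy, ben, dee, and eva -> 4 / 4 = 1.0
print(max(centrality, key=centrality.get))  # cal
```

A high-centrality student in a reply network is a candidate "broker" of discussion, which is the kind of pattern the readings above examine at scale.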

Week 11: Mining of Text and Discourse (theme 3)

Suggested Readings

  • Rohrer, R., Ebert, D., and Sibert, J. (1998). The shape of Shakespeare: visualizing text using implicit surfaces. In Proceedings of IEEE Symposium on Information Visualization, pages 121–129. IEEE Comput. Soc.
  • Rose, C. P., Wang, Y.-C., Cui, Y., Arguello, J., Stegmann, K., Weinberger, A., and Fischer, F. (2008). Analyzing collaborative learning processes automatically: Exploiting the advances of computational linguistics in computer-supported collaborative learning. International Journal of Computer-Supported Collaborative Learning, 3(3):237–271.
  • Optional
    • Shermis, M. D. (2014). State-of-the-art automated essay scoring: Competition, results, and future directions from a United States demonstration. Assessing Writing, 20, 53–76.
    • Simsek, D., Buckingham Shum, S., Sandor, A., De Liddo, A., and Ferguson, R. (2013). Xip dashboard: visual analytics from automated rhetorical parsing of scientific metadiscourse. In 1st International Workshop on Discourse-Centric Learning Analytics.

Activities: To be designed by SIG 3

Suggested Tools

  • ManyEyes
  • LightSIDE
  • RapidMiner
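Before reaching for these tools, it helps to see that the simplest text-mining step, turning free text into term counts, fits in a few lines. A minimal Python sketch (toy discussion notes invented for illustration):

```python
import re
from collections import Counter

# Two hypothetical discussion-forum notes.
notes = [
    "Our model predicts dropout from weekly activity data.",
    "The data suggest the model misses late-joining students.",
]

# A tiny illustrative stopword list; real tools ship much larger ones.
STOPWORDS = {"our", "the", "from", "a", "an", "of"}

def term_counts(texts):
    """Tokenize, lowercase, drop stopwords, and count terms across texts."""
    counts = Counter()
    for text in texts:
        tokens = re.findall(r"[a-z]+", text.lower())
        counts.update(t for t in tokens if t not in STOPWORDS)
    return counts

counts = term_counts(notes)
print(counts.most_common(2))  # [('model', 2), ('data', 2)]
```

Tools like LightSIDE and RapidMiner build on exactly this representation, adding weighting (e.g., TF-IDF), n-grams, and classifiers on top.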

Week 12: Temporality in Learning (theme 4)

Suggested Readings

  • Reimann, P. (2009). Time is precious: Variable- and event-centred approaches to process analysis in CSCL research. International Journal of Computer-Supported Collaborative Learning, 4(3):239–257.
  • Kinnebrew, J., Segedy, J., and Biswas, G. (2014). Analyzing the temporal evolution of students’ behaviors in open-ended learning environments. Metacognition and Learning, 9(2):187–215.
  • Magnusson, M. S. (2000). Discovering hidden time patterns in behavior: T-patterns and their detection. Behavior Research Methods, Instruments, & Computers, 32(1):93–110.
  • Chen, B. and Resendes, M. (2014). Uncovering what matters: Analyzing transitional relations among contribution types in knowledge-building discourse. In Proceedings of the Fourth International Conference on Learning Analytics And Knowledge - LAK ’14, pages 226–230, New York, New York, USA. ACM Press.

Activities: To be designed by SIG 4
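One recurring move in this theme (e.g., in Chen and Resendes above) is to treat a discussion as a sequence of coded contributions and examine transitions between consecutive codes. A minimal sketch of that first step, using a hypothetical coded thread:

```python
from collections import Counter

# A hypothetical discussion thread, each contribution coded with a
# discourse move (codes invented for illustration).
moves = ["question", "idea", "idea", "evidence", "question", "idea"]

# Count first-order transitions between consecutive contributions.
transitions = Counter(zip(moves, moves[1:]))

print(transitions[("question", "idea")])  # 2
```

Real analyses compare such transition counts against what chance would predict (e.g., via lag-sequential analysis) rather than reading raw frequencies directly.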

Week 13: AERA; Group/Individual Study

Week 14: Prediction and Intervention (theme 5)

Suggested Readings

  • Pardos, Z.A., Baker, R.S.J.d., San Pedro, M.O.C.Z., Gowda, S.M., Gowda, S.M. (2013). Affective states and state tests: Investigating how affect throughout the school year predicts end of year learning outcomes. In Proceedings of the 3rd International Conference on Learning Analytics and Knowledge.
  • Baker, R. S. J. d., D’Mello, S. K., Rodrigo, M. M. T., & Graesser, A. C. (2010). Better to be frustrated than bored: The incidence, persistence, and impact of learners’ cognitive–affective states during interactions with three different computer-based learning environments. International Journal of Human-Computer Studies, 68(4), 223–241. doi:10.1016/j.ijhcs.2009.12.003

Activities: To be designed by SIG 5
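The prediction studies above rest on a common core: fit a model mapping behavioral features to an outcome, then use it to flag at-risk students. A minimal sketch of that idea, a logistic regression trained by plain gradient descent on toy data (features and numbers invented for illustration, not drawn from any study):

```python
import math

# Toy data: (weekly logins, forum posts) -> 1 if the student passed.
X = [(1, 0), (2, 1), (8, 5), (9, 4), (3, 0), (7, 6)]
y = [0, 0, 1, 1, 0, 1]

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# Fit weights w and bias b by stochastic gradient descent on the logistic loss.
w = [0.0, 0.0]
b = 0.0
lr = 0.1
for _ in range(2000):
    for (x1, x2), target in zip(X, y):
        p = sigmoid(w[0] * x1 + w[1] * x2 + b)
        err = p - target
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

def predict(x1, x2):
    """Predict pass (True) / fail (False) for a new student."""
    return sigmoid(w[0] * x1 + w[1] * x2 + b) >= 0.5
```

In practice the interesting questions start where this sketch stops: which features to engineer, how to validate on held-out students, and what intervention a positive prediction should trigger.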

Weeks 15 and 16: WG Presentations and Reflection


  • WGs present their group projects
  • Reflection and celebration!

Final individual reflection assignment due by May 15, 2015.



Grading and Transcripts

The University utilizes plus and minus grading on a 4.000 cumulative grade point scale in accordance with the following:

  • A 4.000 - Represents achievement that is outstanding relative to the level necessary to meet course requirements
  • A- 3.667
  • B+ 3.333
  • B 3.000 - Represents achievement that is significantly above the level necessary to meet course requirements
  • B- 2.667
  • C+ 2.333
  • C 2.000 - Represents achievement that meets the course requirements in every respect
  • C- 1.667
  • D+ 1.333
  • D 1.000 - Represents achievement that is worthy of credit even though it fails to meet fully the course requirements
  • S Represents achievement that is satisfactory, which is equivalent to a C- or better.

This means that the grade that you have earned in this course based on the percentage scale above will then be documented on your transcript according to this 4.000 scale and letter grade, not as a percentage.

For additional information about grades, please refer to:

Definition of Grades

  • A - achievement that is outstanding relative to the level necessary to meet course requirements.
  • B - achievement that is significantly above the level necessary to meet course requirements.
  • C - achievement that meets the course requirements in every respect.
  • D - achievement that is worthy of credit even though it fails to meet fully the course requirements.
  • S - achievement that is satisfactory, which is equivalent to a C- or better (achievement required for an S is at the discretion of the instructor but may be no lower than equivalent to a C-)
  • F (or N) - Represents failure (or no credit) and signifies that the work was either (1) completed but at a level of achievement that is not worthy of credit or (2) was not completed and there was no agreement between the instructor and the student that the student would be awarded an I (see also I).

Incomplete Grades

The grade of "I" is not a regular University grade and cannot be given without special arrangements under very unusual circumstances. It cannot be given merely to extend the time allowed to complete course requirements. If family or personal emergency requires that your attention be diverted from the course and that more time than usual is needed to complete course work, arrangements should be made with the instructor of the course before the quarter ends and consent obtained for receiving an "Incomplete" or "I" grade. These arrangements should be made as soon as the need for an "I" can be anticipated. A written agreement should be prepared indicating when the course assignment will be completed. Normally an "Incomplete" grade for a course should be removed within one quarter of its receipt.

University Technology Support Services

Need help with common campus technology issues? Students can get help with general computer, Internet, and network issues in a variety of ways as described on the Office of Information Technology (OIT) help and support page. Please note that individual instructors cannot help you with issues or problems with your personal computer, network, or Internet connection.

Student Conduct Code

The University seeks an environment that promotes academic achievement and integrity, that is protective of free inquiry, and that serves the educational mission of the University. Similarly, the University seeks a community that is free from violence, threats, and intimidation; that is respectful of the rights, opportunities, and welfare of students, faculty, staff, and guests of the University; and that does not threaten the physical or mental health or safety of members of the University community. As a student at the University you are expected to adhere to Board of Regents Policy: Student Conduct Code. To review the Student Conduct Code, please see: Note that the conduct code specifically addresses disruptive classroom conduct, which means "engaging in behavior that substantially or repeatedly interrupts either the instructor's ability to teach or student learning. The classroom extends to any setting where a student is engaged in work toward academic credit or satisfaction of program-based requirements or related activities."

Scholastic Dishonesty

Academic dishonesty, including plagiarism, in any portion of the academic work for a course shall be grounds for receiving a grade of F or N for the entire course. You are expected to do your own academic work and cite sources as necessary. Failing to do so is scholastic dishonesty. Scholastic dishonesty means plagiarizing; cheating on assignments or examinations; engaging in unauthorized collaboration on academic work; taking, acquiring, or using test materials without faculty permission; submitting false or incomplete records of academic achievement; acting alone or in cooperation with another to falsify records or to obtain dishonestly grades, honors, awards, or professional endorsement; altering, forging, or misusing a University academic record; or fabricating or falsifying data, research procedures, or data analysis. (Student Conduct Code: If it is determined that a student has cheated, he or she may be given an "F" or an "N" for the course, and may face additional sanctions from the University. For additional information, please see: The Office for Student Conduct and Academic Integrity has compiled a useful list of Frequently Asked Questions pertaining to scholastic dishonesty: If you have additional questions, please clarify with your instructor for the course. Your instructor can respond to your specific questions regarding what would constitute scholastic dishonesty in the context of a particular class, e.g., whether collaboration on assignments is permitted, requirements and methods for citing sources, etc.

Appropriate Student Use of Class Notes and Course Materials

Taking notes is a means of recording information but more importantly of personally absorbing and integrating the educational experience. However, broadly disseminating class notes beyond the classroom community or accepting compensation for taking and distributing classroom notes undermines instructor interests in their intellectual work product while not substantially furthering instructor and student interests in effective learning. Such actions violate shared norms and standards of the academic community. For additional information, please see:

Sexual Harassment

"Sexual harassment" means unwelcome sexual advances, requests for sexual favors, and/or other verbal or physical conduct of a sexual nature. Such conduct has the purpose or effect of unreasonably interfering with an individual's work or academic performance or creating an intimidating, hostile, or offensive working or academic environment in any University activity or program. Such behavior is not acceptable in the University setting. For additional information, please consult Board of Regents Policy:

Equity, Diversity, Equal Opportunity, and Affirmative Action

The University provides equal access to and opportunity in its programs and facilities, without regard to race, color, creed, religion, national origin, gender, age, marital status, disability, public assistance status, veteran status, sexual orientation, gender identity, or gender expression. For more information, please consult Board of Regents Policy:

Disability Accommodations

The University of Minnesota is committed to providing equitable access to learning opportunities for all students. Disability Services (DS) is the campus office that collaborates with students who have disabilities to provide and/or arrange reasonable accommodations. If you have, or think you may have, a disability (e.g., mental health, attentional, learning, chronic health, sensory, or physical), please contact DS at 612-626-1333 to arrange a confidential discussion regarding equitable access and reasonable accommodations. If you are registered with DS and have a current letter requesting reasonable accommodations, please contact your instructor as early in the semester as possible to discuss how the accommodations will be applied in the course. For more information, please see the DS website,

Mental Health and Stress Management

As a student you may experience a range of issues that can cause barriers to learning, such as strained relationships, increased anxiety, alcohol/drug problems, feeling down, difficulty concentrating and/or lack of motivation. These mental health concerns or stressful events may lead to diminished academic performance and may reduce your ability to participate in daily activities. University of Minnesota services are available to assist you. You can learn more about the broad range of confidential mental health services available on campus via the Student Mental Health website:

Academic Freedom and Responsibility: for courses that do not involve students in research

Academic freedom is a cornerstone of the University. Within the scope and content of the course as defined by the instructor, it includes the freedom to discuss relevant matters in the classroom. Along with this freedom comes responsibility. Students are encouraged to develop the capacity for critical judgment and to engage in a sustained and independent search for truth. Students are free to take reasoned exception to the views offered in any course of study and to reserve judgment about matters of opinion, but they are responsible for learning the content of any course of study for which they are enrolled.\footnote{Language adapted from the American Association of University Professors "Joint Statement on Rights and Freedoms of Students"} Reports of concerns about academic freedom are taken seriously, and there are individuals and offices available for help. Contact the instructor, the Department Chair, your adviser, the associate dean of the college, or the Vice Provost for Faculty and Academic Affairs in the Office of the Provost.


Additional Bibliography

  • Baker, R. S. J. d., Gowda, S. M., and Corbett, A. (2011). Automatically detecting a student’s preparation for future learning: Help use is key. In Proceedings of the 4th International Conference on Educational Data Mining, pages 179–188.
  • Baker, R. S., Hershkovitz, A., Rossi, L. M., Goldstein, A. B., and Gowda, S. M. (2013). Predicting Robust Learning With the Visual Form of the Moment-by-Moment Learning Curve. Journal of the Learning Sciences, 22(4):639–666.
  • Bienkowski, M., Feng, M., and Means, B. (2012). Enhancing Teaching and Learning Through Educational Data Mining and Learning Analytics: An issue brief.
  • Blikstein, P. (2011). Using learning analytics to assess students’ behavior in open-ended programming tasks. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge - LAK ’11, page 110, New York, New York, USA. ACM Press.
  • Burstein, J., Marcu, D., Andreyev, S., and Chodorow, M. (2001). Towards automatic classification of discourse elements in essays. In Proceedings of the 39th annual Meeting on Association for Computational Linguistics, pages 98–105. Association for Computational Linguistics.
  • Calvo, R. A. and D’Mello, S. (2010). Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications. IEEE Transactions on Affective Computing, 1(1):18–37.
  • Chiu, M. M. (2008). Flowing Toward Correct Contributions During Group Problem Solving: A Statistical Discourse Analysis. Journal of the Learning Sciences, 17(3):415–463.
  • Coffrin, C., Corrin, L., de Barba, P., and Kennedy, G. (2014). Visualizing patterns of student engagement and performance in MOOCs. In Proceedings of the Fourth International Conference on Learning Analytics And Knowledge - LAK ’14, pages 83–92, New York, New York, USA. ACM Press.
  • D’Mello, S., Olney, A., and Person, N. (2010). Mining Collaborative Patterns in Tutorial Dialogues. Journal of Educational Data Mining, 2(1):1–37.
  • Dyke, G., Kumar, R., Ai, H., and Rosé, C. P. (2012). Challenging assumptions: Using sliding window visualizations to reveal time-based irregularities in CSCL processes. In van Aalst, J., Thompson, K., Jacobson, M. J., and Reimann, P., editors, The future of learning: Proceedings of the 10th international conference of the learning sciences (ICLS 2012) - Volume 1, Full Papers, pages 363–370. ISLS, Sydney, Australia.
  • Ferguson, R. (2012) Learning analytics: drivers, developments and challenges. International Journal of Technology Enhanced Learning (IJTEL), 4 (5/6), 304-317.
  • Gobert, J. D., Sao Pedro, M., Raziuddin, J., and Baker, R. S. (2013). From Log Files to Assessment Metrics: Measuring Students’ Science Inquiry Skills Using Educational Data Mining. Journal of the Learning Sciences, 22(4):521–563.
  • Halatchliyski, I., Hecking, T., Gohnert, T., and Hoppe, H. U. (2013). Analyzing the flow of ideas and profiles of contributors in an open learning community. In Proceedings of the Third International Conference on Learning Analytics and Knowledge - LAK ’13, page 66, New York, USA. ACM Press.
  • Halevy, A., Norvig, P., and Pereira, F. (2009). The unreasonable effectiveness of data. Intelligent Systems, IEEE, 24(2):8–12.
  • Haythornthwaite, C., de Laat, M., and Dawson, S. (2013). Introduction to the Special Issue on Learning Analytics. American Behavioral Scientist, 57(10):1371–1379.
  • Holloway, T., Bozicevic, M., and Borner, K. (2007). Analyzing and visualizing the semantic coverage of Wikipedia and its authors. Complexity, 12(3):30–40.
  • Howley, I., Mayfield, E., and Rose, C. P. (2013). Linguistic analysis methods for studying small groups.
  • Lee, V. R., & Drake, J. (2013). Quan