
LAK'13: 2013 International Conference on Learning Analytics and Knowledge

Fullname: Proceedings of the Third International Conference on Learning Analytics and Knowledge
Editors: Erik Duval; Xavier Ochoa; Dan Suthers; Katrien Verbert
Location: Leuven, Belgium
Dates: 2013-Apr-08 to 2013-Apr-12
Publisher: ACM
Standard No: ISBN 978-1-4503-1785-6; hcibib: LAK13
Papers: 47
Pages: 300
  1. Reflections on learning analytics
  2. Visualization to support awareness and reflection
  3. Social network analysis and visualization
  4. Communication and collaboration
  5. Discourse analytics
  6. Behavior analysis
  7. Affect analytics
  8. Predictive analytics
  9. Sequence analytics
  10. MOOCs
  11. Assessment
  12. Supporting teachers
  13. Challenges
  14. Analytic architectures
  15. Design briefings
  16. Panels
  17. Workshops

Reflections on learning analytics

Learning analytics as a "middle space" (pp. 1-4)
  Dan Suthers; Katrien Verbert
Learning Analytics, an emerging field concerned with analyzing the vast data "given off" by learners in technology-supported settings to inform educational theory and practice, has from its inception taken a multidisciplinary approach that integrates studies of learning with technological capabilities. In this introduction to the Proceedings of the Third International Learning Analytics & Knowledge Conference, we discuss how Learning Analytics must function in the "middle space" where learning and analytic concerns meet. Dialogue in this middle space involves diverse stakeholders from multiple disciplines with various conceptions of the agency and nature of learning. We hold that a singularly unified field is neither possible nor even desirable if we are to leverage the potential of this diversity, but progress is possible if we support "productive multivocality" between the diverse voices involved, facilitated by appropriate use of boundary objects. We summarize the submitted papers and contents of these Proceedings to characterize the voices and topics involved in the multivocal discourse of Learning Analytics.
Multidisciplinarity vs. Multivocality, the case of "learning analytics" (pp. 5-13)
  Nicolas Balacheff; Kristine Lund
In this paper, we consider an analysis of the TeLearn archive, of the Grand Challenges from the STELLAR Network of Excellence, of two Alpine Rendez-Vous 2011 workshops, and of research conducted in the Productive Multivocality initiative in order to discuss the notions of multidisciplinarity, multivocality and interdisciplinarity. We use this discussion as a springboard for addressing the term "Learning Analytics" and its relation to "Educational Data Mining". Our goal is to launch a debate on the extent to which the different disciplines involved in the TEL community can be integrated on methodological and theoretical levels.

Visualization to support awareness and reflection

Addressing learner issues with StepUp!: an evaluation (pp. 14-22)
  Jose Luis Santos; Katrien Verbert; Sten Govaerts; Erik Duval
This paper reports on our research on the use of learning analytics dashboards to support awareness, self-reflection, sensemaking and impact for learners. So far, little research has been done to evaluate such dashboards with students and to assess their impact on learning. In this paper, we present the results of an evaluation study of our dashboard, called StepUp!, and the extent to which it addresses the issues and needs of our students. Through brainstorming sessions with our students, we identified and prioritized learning issues and needs. In a second step, we deployed StepUp! for one month and evaluated the extent to which our dashboard addresses the issues and needs identified earlier in different courses. The results show that our tool has potentially higher impact for students working in groups on a shared topic than for students working individually on different topics.
Live interest meter: learning from quantified feedback in mass lectures (pp. 23-27)
  Verónica Rivera-Pelayo; Johannes Munk; Valentin Zacharias; Simone Braun
There is currently little or no support for speakers to learn by reflection when addressing a large audience, as in mass lectures, virtual courses or conferences. Reliable feedback from the audience could improve personal skills and work performance. To address this shortcoming, we have developed the Live Interest Meter App (LIM App), which supports the gathering, aggregation and visualization of feedback. The application allows audience members to easily provide and quantify their feedback through a simple meter. We conducted several experimental tests to investigate the acceptance and perceived usefulness of the LIM App, and a user study in an academic setting to inform its further development. The results of the study illustrate the potential of the LIM App in such scenarios. The main findings show the need to motivate students to use the application, the readiness of presenters to learn retrospectively, and distraction as the main concern of end users.

Social network analysis and visualization

Considering formal assessment in learning analytics within a PLE: the HOU2LEARN case (pp. 28-32)
  Eleni Koulocheri; Michalis Xenos
Personal Learning Environments are used more and more by the academic community. They can coexist with formal courses as a communication and collaboration channel. In this paper, an application of learning analytics to HOU2LEARN, a Personal Learning Environment set up by the Hellenic Open University, is discussed. The present phase of the research focuses on social network analysis as a branch of learning analytics, alongside the formal grading system. Since this is ongoing research, this paper presents preliminary results from a study of the correlation between social network metrics and formal grades, using a test-case course, PLH42.
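The correlation step described here can be pictured in a few lines of network code. A minimal sketch, assuming a toy interaction graph and invented grades (nothing below is HOU2LEARN data):

```python
import networkx as nx
from scipy.stats import pearsonr

# Hypothetical interaction network: an edge a -> b means learner a
# commented on or rated learner b's contribution.
G = nx.DiGraph([("ana", "bob"), ("bob", "ana"), ("ana", "cat"),
                ("cat", "bob"), ("dan", "ana"), ("dan", "cat")])

grades = {"ana": 8.5, "bob": 7.0, "cat": 9.0, "dan": 6.0}  # invented

# In-degree centrality: how much a learner's work attracts responses.
centrality = nx.in_degree_centrality(G)
students = sorted(grades)
r, p = pearsonr([centrality[s] for s in students],
                [grades[s] for s in students])
print(f"Pearson r = {r:.2f} (p = {p:.2f})")
```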
Visualizing social learning ties by type and topic: rationale and concept demonstrator (pp. 33-37)
  Bieke Schreurs; Chris Teplovs; Rebecca Ferguson; Maarten de Laat; Simon Buckingham Shum
Social Learning Analytics (SLA) are designed to support students learning through social networks and reflective practitioners engaged in informal learning through a community of practice. This short paper reports work in progress to develop SLA motivated specifically by Networked Learning Theory, drawing on the related concepts and tools of Social Network Analysis and Social Capital Theory, which provide complementary perspectives on the structure and content of such networks. We propose that SLA based on these perspectives need models and visualizations capable of showing not only the usual SNA metrics, but also the types of social tie forged between actors and topic-specific subnetworks. We describe a technical implementation demonstrating this approach, which extends the Network Awareness Tool by automatically populating it with data from the social learning platform SocialLearn. The result is the ability to visualize relationships between people who interact around the same topics.

Communication and collaboration

Analysis of collaborative writing processes using revision maps and probabilistic topic models (pp. 38-47)
  Vilaythong Southavilay; Kalina Yacef; Peter Reimann; Rafael A. Calvo
The use of cloud computing writing tools, such as Google Docs, by students to write collaboratively provides unprecedented data about the progress of writing. This data can be exploited to gain insights on how learners' collaborative activities, ideas and concepts are developed during the process of writing. Ultimately, it can also be used to provide support to improve the quality of the written documents and the writing skills of learners involved. In this paper, we propose three visualisation approaches and their underlying techniques for analysing the writing processes behind a document written by a group of authors: (1) the revision map, which summarises the text edits made at the paragraph level over the time of writing; (2) the topic evolution chart, which uses probabilistic topic models, especially Latent Dirichlet Allocation (LDA) and its extension DiffLDA, to extract topics and follow their evolution during the writing process; and (3) the topic-based collaboration network, which allows a deeper analysis of topics in relation to author contribution and collaboration, using our novel algorithm DiffATM in conjunction with a DiffLDA-related technique. These models are evaluated to examine whether the automatically discovered topics accurately describe the evolution of the writing process. We illustrate how these visualisations are used with real documents written by groups of graduate students.
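As a rough illustration of the topic-evolution idea, the sketch below fits plain LDA to successive snapshots of a document and prints each revision's topic mixture; the DiffLDA and DiffATM extensions go further, and the revision texts here are invented:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical document snapshots saved at three points in time.
revisions = [
    "climate data collection sensors",
    "climate data collection sensors analysis statistics",
    "sensors statistics visualization results discussion",
]

vec = CountVectorizer()
X = vec.fit_transform(revisions)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

# Topic mixture per revision: the rows trace how topics grow or fade
# as the text is rewritten -- the "topic evolution chart" idea.
for t, mix in enumerate(lda.transform(X)):
    print(f"revision {t}: {mix.round(2)}")
```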
Learning analytics for online discussions: a pedagogical model for intervention with embedded and extracted analytics (pp. 48-56)
  Alyssa Friend Wise; Yuting Zhao; Simone Nicole Hausknecht
This paper describes an application of learning analytics that builds on an existing research program investigating how students contribute and attend to the messages of others in online discussions. We present a pedagogical model that translates the concepts and findings of the research program into guidelines for practice, together with analytics with which students and instructors can assess their discussion participation. The analytics are both embedded in the learning environment and extracted from it, allowing for integrated and reflective metacognitive activity. The pedagogical intervention is based on the principles of (1) integration, (2) diversity of metrics, (3) agency, (4) reflection, (5) parity and (6) dialogue. Details of an initial implementation of this approach and preliminary findings are described. Initial results strongly support the value of student-teacher dialogue around the analytics. In contrast, instructor parity in analytics use did not seem as important to students as was expected. Analytics were reported as useful in validating invisible discussion activity, but at times triggered emotionally-charged responses.
Understanding promotions in a case study of student blogging (pp. 57-65)
  Bjorn Levi Gunnarsson; Richard Alterman
Promoting blog content is a social activity; it is a means of communicating one student's appreciation of another student's work. This paper explores the feasibility of using student promotions of content in a blogosphere to identify quality content, and the implications for instructors. We show that students actively and voluntarily promote content, use promotion data to select which posts to read, and identify quality material with considerable accuracy. We explore the benefits of knowing which students are good and poor predictors of quality content, and what instructors can do with this information in terms of feedback and guidance.

Discourse analytics

Analyzing the flow of ideas and profiles of contributors in an open learning community (pp. 66-74)
  Iassen Halatchliyski; Tobias Hecking; Tilman Göhnert; H. Ulrich Hoppe
This paper provides an introduction to the scientometric method of main path analysis and its application to detecting idea flows in an online learning community, using data from Wikiversity. We see this as a step forward in adapting and adopting network analysis techniques for analyzing the evolution of artifacts in knowledge building communities. The analysis steps are presented in detail, including the description of a tool environment ("workbench") designed for flexible use by non-computer experts. By defining directed acyclic graphs, the meaningful interconnectedness of learning resources is made accessible to analysis that takes into account the temporal sequence of their creation during a collaborative process. The potential of the method is elaborated for analyzing the overall learning process of a community as well as the individual contributions of the participants.
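Main path analysis has several variants; the sketch below implements the common search path count (SPC) weighting on a toy revision DAG, which may or may not match the workbench's exact configuration:

```python
import networkx as nx

# Toy DAG of artifact revisions (older -> newer); real input would come
# from Wikiversity page histories.
G = nx.DiGraph([("v1", "v2"), ("v1", "v3"), ("v2", "v4"),
                ("v3", "v4"), ("v4", "v5")])

def n_paths(node, nbrs_fn):
    # number of paths from `node` to the ends reachable via nbrs_fn
    nbrs = list(nbrs_fn(node))
    return 1 if not nbrs else sum(n_paths(m, nbrs_fn) for m in nbrs)

for u, v in G.edges:
    # SPC weight: (source-to-u path count) x (v-to-sink path count)
    G[u][v]["spc"] = n_paths(u, G.predecessors) * n_paths(v, G.successors)

# Greedy main path: start at a source, always follow the heaviest edge.
node = next(n for n in G if G.in_degree(n) == 0)
path = [node]
while G.out_degree(node):
    node = max(G.successors(node), key=lambda m: G[node][m]["spc"])
    path.append(node)
print(path)
```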
Epistemology, pedagogy, assessment and learning analytics (pp. 75-84)
  Simon Knight; Simon Buckingham Shum; Karen Littleton
There is a well-established literature examining the relationships between epistemology (the nature of knowledge), pedagogy (the nature of learning and teaching), and assessment. Learning Analytics (LA) is a new assessment technology and should engage with this literature, since it has implications for when and why different LA tools might be deployed. This paper discusses these issues, relating them to an example construct, epistemic beliefs -- beliefs about the nature of knowledge -- which analytics grounded in pragmatic, sociocultural theory might be well placed to explore. This example is particularly interesting given the role of epistemic beliefs in the everyday knowledge judgements students make in their information processing. Traditional psychological approaches to measuring epistemic beliefs have parallels with high-stakes testing regimes; this paper outlines an alternative LA approach for epistemic beliefs which might be readily applied to other areas of interest. Such sociocultural approaches afford an opportunity for engaging LA directly in high-quality pedagogy.
An evaluation of learning analytics to identify exploratory dialogue in online discussions (pp. 85-93)
  Rebecca Ferguson; Zhongyu Wei; Yulan He; Simon Buckingham Shum
Social learning analytics are concerned with the process of knowledge construction as learners build knowledge together in their social and cultural environments. One of the most important tools employed during this process is language. In this paper we take exploratory dialogue, a joint form of co-reasoning, to be an external indicator that learning is taking place. Using techniques developed within the field of computational linguistics, we build on previous work using cue phrases to identify exploratory dialogue within online discussion. Automatic detection of this type of dialogue is framed as a binary classification task that labels each contribution to an online discussion as exploratory or non-exploratory. We describe the development of a self-training framework that employs discourse features and topical features for classification by integrating both cue-phrase matching and k-nearest neighbour classification. Experiments with a corpus constructed from the archive of a two-day online conference show that our proposed framework outperforms other approaches. A classifier developed using the self-training framework is able to make useful distinctions between the learning dialogue taking place at different times within an online conference as well as between the contributions of individual participants.
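A minimal sketch of the seeding-plus-classification idea: cue phrases label a few posts, and a k-nearest-neighbour model labels the rest. The cue list, posts, and single pass are illustrative stand-ins for the richer self-training framework described above:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier

CUES = ("because", "i think", "what if", "do you agree")  # illustrative cues

posts = ["I think this fails because the sample is tiny",
         "what if we pool both datasets, do you agree?",
         "see everyone tomorrow",
         "the pooled model might overfit, I think"]

# Seed labels from cue-phrase matching (1 = exploratory, 0 = not).
seed_labels = [int(any(c in p.lower() for c in CUES)) for p in posts[:3]]

vec = TfidfVectorizer()
X = vec.fit_transform(posts)
knn = KNeighborsClassifier(n_neighbors=1).fit(X[:3], seed_labels)
print(knn.predict(X[3:]))  # predicted label for the unseeded post
```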

Behavior analysis

Towards the development of multimodal action based assessment (pp. 94-101)
  Marcelo Worsley; Paulo Blikstein
In this paper, we describe multimodal learning analytics techniques for understanding and identifying expertise as students engage in a hands-on building activity. Our techniques leverage process-oriented data and demonstrate how this temporal data can be used to study student learning. The proposed techniques offer useful insights into how to segment and analyze gesture- and action-based data generally, and may also be useful for other sources of process-rich data. Using this approach, we uncover new ideas about how experts engage in building activities. Finally, a primary objective of this work is to motivate additional research and development in the area of authentic, automated, process-oriented assessments.
Multimodal learning analytics (pp. 102-106)
  Paulo Blikstein
New high-frequency data collection technologies and machine learning analysis techniques could offer new insights into learning, especially in tasks in which students have ample space to generate unique, personalized artifacts, such as a computer program, a robot, or a solution to an engineering challenge. To date most of the work on learning analytics and educational data mining has focused on online courses or cognitive tutors, in which the tasks are more structured and the entirety of interaction happens in front of a computer. In this paper, I argue that multimodal learning analytics could offer new insights into students' learning trajectories, and present several examples of this work and its educational application.
Toward collaboration sensing: applying network analysis techniques to collaborative eye-tracking data (pp. 107-111)
  Bertrand Schneider; Sami Abu-El-Haija; Jim Reesman; Roy Pea
In this paper we describe preliminary applications of network analysis techniques to eye-tracking data. In a previous study, the first author conducted a collaborative learning experiment in which subjects had access (or not) to a gaze-awareness tool: their task was to learn from neuroscience diagrams in a remote collaboration. In the treatment group, they could see the gaze of their partner displayed on the screen in real-time. In the control group, they could not. Dyads in the treatment group achieved a higher quality of collaboration and a higher learning gain. In this paper, we describe how network analysis techniques can further illuminate these results, and contribute to the development of 'collaboration sensing'. More specifically, we describe two contributions: first, one can use networks to visualize and explore eye-tracking data. Second, network metrics can be computed to interpret the properties of the graph. We conclude with comments on implementing this approach for formal learning environments.
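One way to realize the first contribution (networks built from gaze data) is to turn a fixation sequence into a weighted transition graph and read off standard metrics; the AOI names and sequence below are hypothetical:

```python
import networkx as nx

# Areas of interest (AOIs) fixated in order by one dyad member.
fixations = ["axon", "soma", "axon", "dendrite", "soma", "axon"]

G = nx.DiGraph()
for a, b in zip(fixations, fixations[1:]):
    # weight = how often gaze moved from AOI a to AOI b
    w = G.get_edge_data(a, b, {"weight": 0})["weight"]
    G.add_edge(a, b, weight=w + 1)

print(nx.density(G))            # graph-level property
print(nx.degree_centrality(G))  # node-level metrics per AOI
```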
Inferring higher level learning information from low level data for the Khan Academy platform (pp. 112-116)
  Pedro J. Muñoz-Merino; José A. Ruipérez Valiente; Carlos Delgado Kloos
Processing low-level educational data in the form of user events and interactions, and converting it into information about the learning process that is both meaningful and interesting, presents a challenge. In this paper, we propose a set of high-level learning parameters relating to total use, efficient use, activity time distribution, gamification habits, and exercise-making habits, and provide the measures to calculate them by processing low-level data. We apply these parameters and measures in a real physics course with more than 100 students using the Khan Academy platform at Universidad Carlos III de Madrid. We show how these parameters can be meaningful and useful for understanding the learning process, based on the results from this experience.
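Two of the proposed parameter families (total use and activity time distribution) reduce to simple aggregations over an event log. A sketch with invented events and binning:

```python
import pandas as pd

events = pd.DataFrame({  # hypothetical Khan-Academy-style event log
    "student": ["s1", "s1", "s1", "s2", "s2"],
    "ts": pd.to_datetime(["2013-04-08 09:05", "2013-04-08 09:20",
                          "2013-04-09 21:10", "2013-04-08 10:00",
                          "2013-04-08 10:30"]),
})

total_use = events.groupby("student").size()   # events per student
by_hour = events.groupby(["student",           # time-of-day profile
                          events["ts"].dt.hour]).size()
print(total_use, by_hour, sep="\n")
```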

Affect analytics

Affective states and state tests: investigating how affect throughout the school year predicts end of year learning outcomes (pp. 117-124)
  Zachary A. Pardos; Ryan S. J. D. Baker; Maria O. C. Z. San Pedro; Sujith M. Gowda; Supreeth M. Gowda
In this paper, we investigate the correspondence between student affect in a web-based tutoring platform throughout the school year and learning outcomes at the end of the year, on a high-stakes mathematics exam. The relationships between affect and learning outcomes have been previously studied, but not in a manner that is both longitudinal and finer-grained. Affect detectors are used to estimate student affective states based on post-hoc analysis of tutor log-data. For every student action in the tutor the detectors give us an estimated probability that the student is in a state of boredom, engaged concentration, confusion, and frustration, and estimates of the probability that they are exhibiting off-task or gaming behaviors. We ran the detectors on two years of log-data from 8th grade student use of the ASSISTments math tutoring system and collected corresponding end of year, high stakes, state math test scores for the 1,393 students in our cohort. By correlating these data sources, we find that boredom during problem solving is negatively correlated with performance, as expected; however, boredom is positively correlated with performance when exhibited during scaffolded tutoring. A similar pattern is unexpectedly seen for confusion. Engaged concentration and frustration are both associated with positive learning outcomes, surprisingly in the case of frustration.
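The aggregation-and-correlation step can be sketched as follows; the detector outputs and test scores are fabricated, and the affect detectors themselves are separate trained models not shown here:

```python
import pandas as pd

log = pd.DataFrame({  # hypothetical detector output per student action
    "student": ["s1", "s1", "s2", "s2", "s3", "s3"],
    "context": ["problem", "scaffold"] * 3,
    "p_boredom": [0.7, 0.2, 0.1, 0.4, 0.5, 0.3],
})
scores = pd.Series({"s1": 610, "s2": 740, "s3": 655})  # test scores

# Mean boredom per student, split by tutoring context -- the sign of
# the correlation reportedly differed between these contexts.
by_ctx = log.pivot_table(index="student", columns="context",
                         values="p_boredom", aggfunc="mean")
print(by_ctx.corrwith(scores))
```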
An eye-tracking study of notational, informational, and emotional aspects of learning analytics representations (pp. 125-134)
  Ravi Vatrapu; Peter Reimann; Susan Bull; Matthew Johnson
This paper presents an eye-tracking study of notational, informational, and emotional aspects of nine different notational systems (Skill Meters, Smilies, Traffic Lights, Topic Boxes, Collective Histograms, Word Clouds, Textual Descriptors, Table, and Matrix) and three different information states (Weak, Average, and Strong) used to represent students' learning. Findings from the eye-tracking study show that higher emotional activation was observed for the metaphorical notations of traffic lights and smilies, and for collective representations. Mean view time was higher for representations of the "average" informational learning state. Qualitative analysis of the think-aloud comments and post-study interviews shows that student participants reflected on the meaning-making opportunities and action-taking possibilities afforded by the representations. Implications for the design and evaluation of learning analytics representations and discourse environments are discussed.

Predictive analytics

What can we learn from Facebook activity?: using social learning analytics to observe new media literacy skills (pp. 135-144)
  June Ahn
Social media platforms such as Facebook are now a ubiquitous part of everyday life for many people. New media scholars posit that the participatory culture encouraged by social media gives rise to new forms of literacy skills that are vital to learning. However, there have been few attempts to use analytics to understand the new media literacy skills that may be embedded in an individual's participation in social media. In this paper, I collect raw activity data that was shared by an exploratory sample of Facebook users. I then utilize factor analysis and regression models to show (a) how Facebook members' online activity coalesces into distinct categories of social media behavior and (b) how these participatory behaviors correlate with and predict measures of new media literacy skills. The study demonstrates the use of analytics to understand the literacies embedded in people's social media activity. The implications speak to the potential of social learning analytics to identify and predict new media literacy skills from data streams in social media platforms.
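A compact sketch of the two-stage analysis (factor analysis, then regression), with fabricated activity counts and literacy scores standing in for the Facebook and survey data:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
# Rows = users; columns = counts of Facebook actions (e.g. posts,
# likes, comments, shares) -- hypothetical raw activity features.
activity = rng.poisson(lam=[5, 20, 8, 3], size=(50, 4))

# (a) reduce raw activity to two latent participation factors
factors = FactorAnalysis(n_components=2, random_state=0).fit_transform(activity)

# (b) regress a new-media-literacy score on those factors
literacy = rng.normal(size=50)  # placeholder outcome measure
model = LinearRegression().fit(factors, literacy)
print(model.coef_, model.score(factors, literacy))
```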
Improving retention: predicting at-risk students by analysing clicking behaviour in a virtual learning environment (pp. 145-149)
  Annika Wolff; Zdenek Zdrahal; Andriy Nikolov; Michal Pantucek
One of the key interests for learning analytics is how it can be used to improve retention. This paper focuses on work conducted at the Open University (OU) on predicting students who are at risk of failing their module. The Open University is one of the world's largest distance learning institutions. Since tutors do not interact face to face with students, it can be difficult for them to identify and respond to struggling students in time to resolve the difficulty. Predictive models have been developed and tested using historic Virtual Learning Environment (VLE) activity data combined with other data sources, for three OU modules. This has revealed that it is possible to predict student failure by looking for changes in users' activity in the VLE, compared against their own previous behaviour or against that of students who can be categorised as having similar learning behaviour. More focused analysis of these modules applying the GUHA (General Unary Hypotheses Automaton) method of data analysis has also yielded some early promising results for creating accurate hypotheses about students who fail.
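The "change against one's own previous behaviour" signal can be illustrated with a simple baseline comparison; the threshold and click counts below are illustrative, not the OU's actual models:

```python
# Weekly VLE click counts per student (invented).
weekly_clicks = {
    "s1": [40, 35, 38, 36, 4],   # sudden disengagement
    "s2": [12, 15, 11, 14, 13],  # steady
}

def at_risk(clicks, drop=0.5):
    # Flag when the latest week falls far below the student's own norm.
    baseline = sum(clicks[:-1]) / len(clicks[:-1])
    return clicks[-1] < drop * baseline

for s, c in weekly_clicks.items():
    print(s, "at risk" if at_risk(c) else "ok")
```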
Open academic analytics initiative: initial research findings (pp. 150-154)
  Eitel J. M. Lauría; Erik W. Moody; Sandeep M. Jayaprakash; Nagamani Jonnalagadda; Joshua D. Baron
This paper describes results of research performed by the Open Academic Analytics Initiative, an ongoing research project aimed at developing an early-detection system for college students at academic risk, using data mining models trained on student personal and demographic data as well as course management data. We report initial findings on the predictive performance of those models, their portability across pilot programs at different institutions, and the results of interventions applied in those pilots.

Sequence analytics

Interpreting data mining results with linked data for learning analytics: motivation, case study and directions (pp. 155-164)
  Mathieu d'Aquin; Nicolas Jay
Learning Analytics by nature relies on computational information processing activities intended to extract from raw data some interesting aspects that can be used to obtain insights into the behaviours of learners, the design of learning experiences, etc. There is a large variety of computational techniques that can be employed, all with interesting properties, but it is the interpretation of their results that really forms the core of the analytics process. In this paper, we look at a specific data mining method, namely sequential pattern extraction, and we demonstrate an approach that exploits available linked open data for this interpretation task. Indeed, we show through a case study relying on data about students' enrolment in course modules how linked data can be used to provide a variety of additional dimensions through which the results of the data mining method can be explored, providing, at interpretation time, new input into the analytics process.
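Sequential pattern extraction in its simplest form counts ordered module pairs that recur across students; a toy sketch (real miners such as PrefixSpan generalize this, and the linked-data enrichment of the patterns is not shown):

```python
from itertools import combinations
from collections import Counter

enrolments = [  # hypothetical ordered module codes per student
    ["M101", "M201", "M305"],
    ["M101", "M202", "M305"],
    ["M101", "M201", "M202"],
]

patterns = Counter()
for seq in enrolments:
    # every pair of modules taken in that order by this student
    patterns.update(set(combinations(seq, 2)))

min_support = 2  # keep patterns shared by at least two students
print([p for p, n in patterns.items() if n >= min_support])
```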
Nanogenetic learning analytics: illuminating student learning pathways in an online fraction game (pp. 165-169)
  Taylor Martin; Ani Aghababyan; Jay Pfaffman; Jenna Olsen; Stephanie Baker; Philip Janisiewicz; Rachel Phillips; Carmen Petrick Smith
A working understanding of fractions is critical to student success in high school and college math. Therefore, an understanding of the learning pathways that lead students to this working understanding is important for educators to provide optimal learning environments for their students. We propose the use of microgenetic analysis techniques including data mining and visualizations to inform our understanding of the process by which students learn fractions in an online game environment. These techniques help identify important variables and classification algorithms to group students by their learning trajectories.

MOOCs

Deconstructing disengagement: analyzing learner subpopulations in massive open online courses (pp. 170-179)
  René F. Kizilcec; Chris Piech; Emily Schneider
As MOOCs grow in popularity, the relatively low completion rates of learners have been a central criticism. This focus on completion rates, however, reflects a monolithic view of disengagement that does not allow MOOC designers to target interventions or develop adaptive course features for particular subpopulations of learners. To address this, we present a simple, scalable, and informative classification method that identifies a small number of longitudinal engagement trajectories in MOOCs. Learners are classified based on their patterns of interaction with video lectures and assessments, the primary features of most MOOCs to date.
   In an analysis of three computer science MOOCs, the classifier consistently identifies four prototypical trajectories of engagement. The most notable of these comprises learners who stay engaged through the course without taking assessments. These trajectories also form a useful framework for comparing learner engagement between different course structures or instructional approaches. We compare learners in each trajectory and course across demographics, forum participation, video access, and reports of overall experience. These results inform a discussion of future interventions, research, and design directions for MOOCs. Potential improvements to the classification mechanism are also discussed, including the introduction of more fine-grained analytics.
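The trajectory classification can be sketched by encoding each learner's state per course period and clustering the label sequences; the encoding, data, and choice of k below are illustrative:

```python
import numpy as np
from sklearn.cluster import KMeans

# Per-period state: 0 = absent, 1 = viewed lectures only, 2 = assessed.
trajectories = np.array([  # hypothetical learners x 5 course periods
    [2, 2, 2, 2, 2],   # completing
    [1, 1, 1, 1, 1],   # auditing: watches videos, skips assessments
    [2, 2, 1, 0, 0],   # disengaging
    [0, 1, 0, 1, 0],   # sampling
    [2, 2, 2, 1, 2],
    [0, 0, 1, 0, 0],
])

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(trajectories)
print(km.labels_)  # prototypical trajectory assigned to each learner
```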
The pairing of lecture recording data with assessment scores: a method of discovering pedagogical impact (pp. 180-184)
  Negin Mirriahi; Shane Dawson
Web technologies, such as lecture recordings, have the capacity to capture and store massive amounts of data on individuals' online behavior. Such data can provide insight into student learning processes and the relationship between online trace data and academic performance, alerting educators when intervention may be required or when their learning activities may need to be adjusted. This paper discusses how data captured from students' use of lecture recordings accessed through a Collaborative Lecture Annotation System (CLAS), when aggregated and correlated with assessment data, can help educators evaluate the impact of the recordings on their students' learning. Such information can help inform and alert educators to when adjustments may be required to their pedagogical approach.
MOOCs and the funnel of participation (pp. 185-189)
  Doug Clow
Massive Open Online Courses (MOOCs) are growing substantially in numbers, and also in interest from the educational community. MOOCs offer particular challenges for what is becoming accepted as mainstream practice in learning analytics.
   Partly for this reason, and partly because of the relative newness of MOOCs as a widespread phenomenon, there is not yet a substantial body of literature on the learning analytics of MOOCs. However, one clear finding is that drop-out/non-completion rates are substantially higher than in more traditional education.
   This paper explores these issues, and introduces the metaphor of a 'funnel of participation' to reconceptualise the steep drop-off in activity, and the pattern of steeply unequal participation, which appear to be characteristic of MOOCs and similar learning environments. Empirical data to support this funnel of participation are presented from three online learning sites: iSpot (observations of nature), Cloudworks ('a place to share, find and discuss learning and teaching ideas and experiences'), and openED 2.0, a MOOC on business and management that ran from 2010 to 2012. Implications of the funnel for MOOCs, formal education, and learning analytics practice are discussed.

Assessment

What different kinds of stratification can reveal about the generalizability of data-mined skill assessment models (pp. 190-194)
  Michael A. Sao Pedro; Ryan S. J. D. Baker; Janice D. Gobert
When validating assessment models built with data mining, generalization is typically tested at the student level, where models are tested on new students. This approach, though, may fail to find cases where model performance suffers if aspects of those cases relevant to prediction are not well represented. We explore this here by testing whether scientific inquiry skill models built and validated for one science topic can predict skill demonstration for new students and a new science topic. Test cases were chosen using two methods: student-level stratification, and stratification based on the number of trials run during students' experimentation. We found that the predictive performance of the models differed on each test set, revealing limitations that would have been missed with student-level validation alone.
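The two test-set constructions being compared can be expressed directly with scikit-learn splitters; the data and the few-vs-many cut-off below are synthetic:

```python
import numpy as np
from sklearn.model_selection import GroupKFold, StratifiedKFold

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))           # inquiry-behaviour features
y = rng.integers(0, 2, size=40)        # skill demonstrated? (0/1)
student = np.repeat(np.arange(10), 4)  # 10 students x 4 clips each
n_trials = rng.integers(1, 9, size=40) # trials run per clip

# (a) student-level: a student never appears in both train and test
for tr, te in GroupKFold(n_splits=5).split(X, y, groups=student):
    pass  # fit and evaluate the skill model here

# (b) trial-count stratification: balance few- vs many-trial cases
few_vs_many = (n_trials > 4).astype(int)
for tr, te in StratifiedKFold(n_splits=5).split(X, few_vs_many):
    pass  # fit and evaluate the skill model here
```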
Assessing students' performance using the learning analytics enriched rubrics (pp. 195-199)
  Ioannis Dimopoulos; Ourania Petropoulou; Symeon Retalis
The assessment of students' performance in e-learning environments is a challenging and demanding task for teachers. Focusing on this challenge, a new assessment tool, called the Learning Analytics Enriched Rubric (LAe-R), is presented in this paper. LAe-R is based on the concept of assessment rubrics, a very popular assessment technique in education. LAe-R contains "enriched" criteria and grading levels that are associated with data extracted from the analysis of learners' interaction and learning behavior in an e-learning environment. LAe-R has been developed as a plug-in for the Moodle learning management system. Via an example, we show how LAe-R can be used by teachers and students.
Model-driven assessment of learners in open-ended learning environments (pp. 200-204)
  James R. Segedy; Kirk M. Loretz; Gautam Biswas
Open-ended learning environments (OELEs) provide students with opportunities to take part in authentic and complex problem-solving tasks. However, many students struggle to succeed in such complex learning endeavors. Without support, these students often use system tools incorrectly and adopt suboptimal learning strategies. Providing adaptive support to students in OELEs poses significant challenges, though, and relatively few OELEs provide it. This paper presents the initial development of a systematic approach for interpreting and evaluating learner behaviors in OELEs, called model-driven assessment, which uses a model of the cognitive and metacognitive processes important for completing the open-ended learning task. The model provides a means for both classifying and assessing students' learning behaviors while using the system. An evaluation of the analysis technique is presented in the context of Betty's Brain, an OELE designed to help middle school students learn about science.
Formative assessment and learning analytics (pp. 205-209)
  Dirk T. Tempelaar; André Heck; Hans Cuypers; Henk van der Kooij; Evert van de Vrie
Learning analytics seeks to enhance the learning process through systematic measurements of learning related data, and informing learners and teachers of the results of these measurements, so as to support the control of the learning process. Learning analytics has various sources of information, two main types being intentional and learner activity related metadata [1]. This contribution aims to provide a practical application of Shum and Crick's theoretical framework [1] of a learning analytics infrastructure that combines learning dispositions data with data extracted from computer-based, formative assessments. The latter data component is derived from one of the educational projects of ONBETWIST, part of the SURF program 'Testing and Test Driven Learning'.

Supporting teachers

STEMscopes: contextualizing learning analytics in a K-12 science curriculum (pp. 210-219)
  Carlos Monroy; Virginia Snodgrass Rangel; Reid Whitaker
In this paper, we discuss a scalable approach for integrating learning analytics into an online K-12 science curriculum. A description of the curriculum and the underlying pedagogical framework is followed by a discussion of the challenges to be tackled as part of this integration. We also include examples of data visualization based on real student and teacher data. With more than one million students and fifty thousand teachers using the curriculum, a massive and rich dataset is continuously updated. This repository depicts teacher and student usage of an inquiry-based science program, and offers exciting opportunities to leverage research to improve both teaching and learning. The growing dataset, with more than a hundred million items of activity in six months, also poses technical challenges such as data storage and complex aggregation and analysis, with broader implications for pedagogy, big data, and learning.
Supporting action research with learning analytics (pp. 220-229)
  A. L. Dyckhoff; V. Lukarov; A. Muslim; M. A. Chatti; U. Schroeder
Learning analytics tools should be useful, i.e., they should be usable and provide the functionality for reaching the goals attributed to learning analytics. This paper seeks to unite learning analytics and action research. Based on this, we investigate how the multitude of questions that arise during technology-enhanced teaching and learning can systematically be mapped to sets of indicators. We examine which questions are not yet supported and propose concepts for indicators that have a high potential of positively influencing teachers' didactical considerations. Our investigation shows that many questions from teachers cannot be answered with currently available research tools. Furthermore, few learning analytics studies report on measuring impact. We describe which effects learning analytics should have on teaching and discuss how this could be evaluated.
A case study inside virtual worlds: use of analytics for immersive spaces (pp. 230-234)
  Vanessa Camilleri; Sara de Freitas; Matthew Montebello; Paul McDonagh-Smith
In this paper we describe case studies of the use of virtual worlds in corporate training as well as in higher education. For higher education in particular, we describe how a virtual world constructed using the platform Avaya Live Engage is used as an immersive environment with pre-service teachers undergoing a 1-year teacher training program, and how the data analytics collected in-world are being used to monitor and direct content development. We base our studies on the initial hypothesis that 3D immersive environments are highly engaging and offer an experience that goes beyond 'traditional' online education. We want to combine different analysis methods to obtain empirical evidence of the students' engagement with the 3D space in ways that can help us design the learning experience accompanying the learners on their journey. In this paper we describe the research methods we use for the study and give an overview of the information we can collect from the in-world analytics. We also propose how these analytics can be used for a predictive model, with the intention of refocusing the virtual world experience to match learner needs.

Challenges

Issues, challenges, and lessons learned when scaling up a learning analytics intervention (pp. 235-239)
  Steven Lonn; Stephen Aguilar; Stephanie D. Teasley
This paper describes an intra-institutional partnership between a research team and a technology service group that was established to facilitate the scaling up of a learning analytics intervention. Our discussion focuses on the benefits and challenges that arose from this partnership in order to provide useful information for similar partnerships developed to support scaling up learning analytics interventions.
An evaluation of policy frameworks for addressing ethical considerations in learning analytics (pp. 240-244)
  Paul Prinsloo; Sharon Slade
Higher education institutions have collected and analysed student data for years, with their focus largely on reporting and management needs. A range of institutional policies exist which broadly set out the purposes for which data will be used and how data will be protected. With the advent of learning analytics, the uses to which student data is put have expanded rapidly. Generally, though, the policies setting out institutional use of student data have not kept pace with this change.
   Institutional policy frameworks should provide not only an enabling environment for the optimal and ethical harvesting and use of data; they should also clarify who benefits and under what conditions, establish conditions for consent and the de-identification of data, and address issues of vulnerability and harm. A directed content analysis of the policy frameworks of two large distance education institutions shows that current policy frameworks do not provide an enabling environment for learning analytics to fulfil its promise.
Aggregating social and usage datasets for learning analytics: data-oriented challenges (pp. 245-249)
  Katja Niemann; Martin Wolpers; Giannis Stoitsis; Georgios Chinis; Nikos Manouselis
Recent work has studied real-life social and usage datasets from educational applications, highlighting the opportunity to combine or merge them. It is expected that being able to put together different datasets from various applications will make it possible to support learning analytics of a much larger scale and across different contexts. We examine how this can be achieved from a practical perspective by carrying out a study that focuses on three real datasets. More specifically, we combine social data that has been collected from the users of three learning portals and reflect on how they should be handled. We start by studying the data types and formats that these portals use to represent and store social and usage data. Then we develop crosswalks between the different schemas, so that merged versions of the source datasets may be created. The results of this bottom-up, hands-on investigation reveal several interesting issues that need to be overcome before aggregated sets of social and usage data can be actually used to support learning analytics research or services.
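A crosswalk in its simplest form is a field mapping applied before merging; the portal names and fields below are invented examples:

```python
# Map each portal's local field names onto one shared event schema.
CROSSWALKS = {
    "portalA": {"uid": "user", "res": "resource", "act": "verb"},
    "portalB": {"member": "user", "item": "resource", "action": "verb"},
}

def to_common(record, source):
    # Rewrite a source record's keys into the common schema.
    return {common: record[local]
            for local, common in CROSSWALKS[source].items()}

merged = [to_common({"uid": "u1", "res": "r9", "act": "rate"}, "portalA"),
          to_common({"member": "u2", "item": "r9", "action": "view"}, "portalB")]
print(merged)
```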

Analytic architectures

From micro to macro: analyzing activity in the ROLE Sandbox (pp. 250-254)
  Dominik Renzel; Ralf Klamma
Current learning services are increasingly based on standard Web technologies and concepts. As a by-product of service operation, Web logs capture and contextualize user interactions in a generic manner, in high detail, and on a massive scale. At the same time, data standards are being invented for capturing and encoding learner interactions tailored to learning analytics purposes. However, such standards are often focused on institutional and management perspectives, or biased by their intended use. In this paper, we argue for Web logs as valuable data sources for learning analytics on all levels of Bronfenbrenner's Ecological Systems Theory, and introduce a simple framework for Web log data enrichment, processing and further analysis. Based on an example data set from a management service for widget-based Personal Learning Environments, we illustrate our approach and discuss the applicability of different analysis techniques along with their particular benefits for learners.
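The enrichment step performed on raw Web logs can be sketched as parsing a log line into a structured event and deriving context fields; the regex and fields below are generic, not the ROLE Sandbox's actual pipeline:

```python
import re

# Combined-log-format-style pattern for a request line (simplified).
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ (?P<user>\S+) \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d+)')

line = ('10.0.0.5 - alice [08/Apr/2013:10:12:31 +0200] '
        '"POST /spaces/widget7/update HTTP/1.1" 200')

event = LOG_RE.match(line).groupdict()
# Enrichment: derive the learning context (here, the space) from the path.
event["space"] = event["path"].split("/")[2]
print(event)
```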
Analytics of collaborative planning in Metafora: architecture, data, and analytic methods (pp. 255-259)
  Andreas Harrer
This paper describes our approach to learning analytics in the Metafora system, a collaborative learning framework that supports self-regulated and constructionist activities in groups. Our specific analytic interest is the nature of collaborative planning behaviour and aspects of learning to learn together (L2L2). To that end, we describe the architecture supporting diverse analytic components across all the tools used in Metafora; the data formats, storage and access methods; and the analytic principles we designed and implemented. We also describe our first insights from using these methods on real Metafora data collected during practical experimentation in schools.

Design briefings

GradeCraft: what can we learn from a game-inspired learning management system? (pp. 260-264)
  Caitlin Holman; Stephen Aguilar; Barry Fishman
The "gamification" of courses (i.e., designing courses that leverage motivational mechanisms found in videogames) is a movement that is gaining traction in educational research communities and universities. Two game-inspired courses were developed at a high-enrollment public university in an effort to increase student engagement, and to provide students with more personalized learning experiences. We designed a learning management system, GradeCraft, to foreground the affordances of these grading systems, and to enhance the "game-like" experience for students. Along with serving as a translation layer for the grading systems of these courses, GradeCraft is also designed with an eye towards learning analytics, and captures information that can be described as student "process" data. Currently this data includes what types of assignments students choose to complete; how students assign percentage weights to their chosen assignments; how often and how accurately students check or model their course grades; and how successfully assignments are completed by students individually and the class as a whole across a structured grading rubric. We hope GradeCraft will give instructors new insight into student engagement, and provide data-driven ideas about how to tailor courses to student needs.
System for assessing classroom attention (pp. 265-269)
  Mirko Raca; Pierre Dillenbourg
In this paper we give a preview of our system for automatically evaluating attention in the classroom. We demonstrate our current behaviour metrics and preliminary observations on how they reflect people's reactions to a given lecture. We also introduce the foundations of our hypothesis on students' peripheral awareness during lectures.
Orchestrating of complex inquiry: three roles for learning analytics in a smart classroom infrastructure (pp. 270-274)
  James D. Slotta; Mike Tissenbaum; Michelle Lui
This paper presents our research on a pedagogical model known as Knowledge Community and Inquiry (KCI), focusing on our design of a technological infrastructure for the orchestration of the complex CSCL scripts that characterize KCI curricula. We first introduce the KCI model, including some basic design principles, and describe its dependency on real-time learning analytics. Next, we describe our technology, known as SAIL Smart Space (S3), which provides scaffolding and analytic support for sequenced interactions amongst people, materials, tools and environments. We outline the critical role of the teacher in our designs and describe how S3 supports that active role in orchestration. Finally, we outline two implementations of KCI/S3 and the role of learning analytics in supporting dynamic collective visualizations, real-time orchestrational logic, and ambient displays.

Panels

Crafting transformative strategies for personalized learning/analytics (pp. 275-277)
  Linda L. Baer; Ann Hill Duin; Donald Norris; Robert Brodnick
Personalized learning environments and learning analytics hold the promise to transform learning experiences, enhance and accelerate student success, and "open up" student learning to resources and experiences from outside individual institutions. To achieve their potential, personalized learning projects must move beyond individual, stand-alone projects or innovations to reshaping the institutional experience.
   Learning science must connect with learning pedagogy and design. Learners and institutions must have access to tools and resources that assist in customizing student progress and supplemental learning needs. Teachers and faculty must be empowered to provide teaching and learning environments that allow individual students to thrive. All this will require unique partnerships and collaborations within and across institutions, incorporating the best learning science findings and bridging with public and private entities developing the learning and analytic tools to support personalized learning.
   Crafting a strategy to embrace and sustain the transformative power of personalized learning systems will require strong leadership and clear planning models to align with institutional planning and future investments.
Educational data scientists: a scarce breed (pp. 278-281)
  Simon Buckingham Shum; Martin Hawksey; Ryan S. J. D. Baker; Naomi Jeffery; John T. Behrens; Roy Pea
The Educational Data Scientist is currently a poorly understood, rarely sighted breed. Reports vary: some are known to be largely nocturnal, solitary creatures, while others have been reported to display highly social behaviour in broad daylight. What are their primary habits? How do they see the world? What ecological niches do they occupy now, and will predicted seismic shifts transform the landscape in their favour? What survival skills do they need when running into other breeds? Will their numbers grow, and how might they evolve? In this panel, the conference will hear and debate not only broad perspectives on the terrain, but will have been exposed to some real life specimens, and caught glimpses of the future ecosystem.

Workshops

DCLA13: 1st International Workshop on Discourse-Centric Learning Analytics (p. 282)
  Simon Buckingham Shum; Maarten de Laat; Anna De Liddo; Rebecca Ferguson; Paul Kirschner; Andrew Ravenscroft; Ágnes Sándor; Denise Whitelock
This workshop anticipates that an important class of learning analytics will emerge at the intersection of research into learning dynamics, online discussion platforms, and computational linguistics. Written discourse is arguably the primary class of data that can give us insights into deeper learning and higher order qualities such as critical thinking, argumentation, mastery of complex ideas, empathy, collaboration and interpersonal skills. Moreover, the ability to write in a scholarly manner is a core competence, often taking the form of discourse with oneself and the literature. Computational linguistics research has developed a rich array of tools for machine interpretation of human discourse, but work to develop these tools in the context of learning is at a relatively early stage. Moreover, there is a significant difference between designing tools to assist researchers in discourse analysis, and their deployment on platforms to provide meaningful analytics for the learners and educators who are conducting that discourse. This workshop aims to catalyse ideas and build community connections among those who want to shape this field.
Analytics on video-based learning (pp. 283-284)
  Michail N. Giannakos; Konstantinos Chorianopoulos; Marco Ronchetti; Peter Szegedi; Stephanie D. Teasley
The International Workshop on Analytics on Video-based Learning (WAVe2013) aims to connect research efforts on video-based learning with learning analytics to create visionary ideas and foster synergies between the two fields. The main objective of WAVe is to build a research community around the topical area of analytics on video-based learning. In particular, WAVe aims to develop a critical discussion about the next generation of analytics employed in video learning tools, the form these analytics take, and the way they can be analyzed to help us better understand and improve the value of video-based learning. WAVe is based on the rationale that, by combining and analyzing learners' interactions with other available data obtained from learners, new avenues for research on video-based learning emerge.
Learning object analytics for collections, repositories & federations (pp. 285-286)
  Miguel-Angel Sicilia; Xavier Ochoa; Giannis Stoitsis; Joris Klerkx
A large number of curated digital collections containing learning resources of various kinds have emerged in recent years. These include referatories containing descriptions of resources on the Web (such as MERLOT), aggregated collections (such as Organic.Edunet), concrete initiatives such as Khan Academy, repositories hosting and versioning modular content (such as Connexions), and meta-aggregators (such as GLOBE and the Learning Registry). OpenCourseWare and other OER initiatives have also contributed to making this ecosystem of resources richer. Very interesting insights can be extracted by studying the usage and social data produced within learning collections, repositories and federations. At the same time, concerns about the quality and sustainability of these collections have been raised, which has led to research on quality measurement and metrics. The workshop attempts to bring together studies and demonstrations of any kind of analysis done on learning resource collections, from an interdisciplinary perspective. We consider digital collections not as merely IT deployments but as social systems with contributors, owners, evaluators and users forming patterns of interaction on top of portals or through search systems embedded in other learning technology components. This is consistent with considering these social systems under a Web Science approach (http://webscience.org/).
Second International Workshop on Teaching Analytics (pp. 287-289)
  Ravi Vatrapu; Peter Reimann; Wolfgang Halb; Susan Bull
Teaching Analytics is conceived as a subfield of learning analytics that focuses on the design, development, evaluation, and education of visual analytics methods and tools for teachers in primary, secondary, and tertiary educational settings. The Second International Workshop on Teaching Analytics (IWTA) 2013 seeks to bring together researchers and practitioners in the fields of education, learning sciences, learning analytics, and visual analytics to investigate the design, development, use, evaluation, and impact of visual analytical methods and tools for teachers' dynamic diagnostic decision-making in real-world settings.