
LAK'11: 2011 International Conference on Learning Analytics and Knowledge

Fullname: Proceedings of the 1st International Conference on Learning Analytics and Knowledge
Editors: Phillip Long; George Siemens; Gráinne Conole; Dragan Gašević
Location: Banff, Alberta, Canada
Dates: 2011-Feb-27 to 2011-Mar-01
Publisher: ACM
Standard No: ISBN 978-1-4503-0944-8; ACM DL: Table of Contents; hcibib: LAK11
Papers: 27
Pages: 185
Links: Conference Website
  1. Vision and conceptual papers
  2. Ideas and innovation
  3. Tool demonstration papers
Learnometrics: metrics for learning objects BIBAFull-Text 1-8
  Xavier Ochoa
The field of Technology Enhanced Learning (TEL) has the potential to solve one of the most important challenges of our time: enabling everyone to learn anything, anytime, anywhere. However, if we look back at more than 50 years of research in TEL, it is not clear where we are in terms of reaching that goal, or whether we are, indeed, moving forward. The pace at which technology and new ideas evolve has created a rapid, even exponential, rate of change. This rapid change, together with the inherent difficulty of measuring the impact of technology on something as complex as learning, has led to a field with an abundance of new, good ideas and a scarcity of evaluation studies. This lack of evaluation has resulted in the duplication of efforts and a sense that TEL has no "ground truth" or "basic theory". This article is an attempt to stop, look back and measure, if not the impact, at least the status of a small fraction of TEL, Learning Object Technologies, in the real world. The measured apparent non-existence of the reuse paradox, the two-phase linear growth of repositories and the ineffectiveness of human metadata quality assessment are clear reminders that even bright theoretical discussions do not compensate for a lack of experimentation and measurement. Theoretical and empirical studies should go hand in hand in order to advance the field. This article is an invitation to other researchers in the field to apply informetric techniques to measure, understand and apply in their tools the vast amount of information generated by the usage of Technology Enhanced Learning systems.
Attention please!: learning analytics for visualization and recommendation BIBAFull-Text 9-17
  Erik Duval
This paper presents the general goal of and inspiration for our work on learning analytics, which relies on attention metadata for visualization and recommendation. Through information visualization techniques, we can provide a dashboard for learners and teachers, so that they no longer need to "drive blind". Moreover, recommendation can help to deal with the "paradox of choice" and turn abundance from a problem into an asset for learning.
Learning networks, crowds and communities BIBAFull-Text 18-22
  Caroline Haythornthwaite
Who we learn from, and where and when we learn, are dramatically affected by the reach of the Internet. From learning for formal education to learning for pleasure, we look to the web early and often for our data and knowledge needs, but also for places and spaces where we can collaborate in, contribute to, and create learning and knowledge communities. Based on the keynote presentation given at the first Learning Analytics and Knowledge Conference, held in 2011 in Banff, Alberta, this paper explores a social network perspective on learning with reference to social network principles and studies by the author. The paper explores the ways a social network perspective can be used to examine learning, with attention to the structure and dynamics of online learning networks and emerging configurations such as online crowds and communities.
Discourse-centric learning analytics BIBAFull-Text 23-33
  Anna De Liddo; Simon Buckingham Shum; Ivana Quinto; Michelle Bachler; Lorella Cannavacciuolo
Drawing on sociocultural discourse analysis and argumentation theory, we motivate a focus on learners' discourse as a promising site for identifying patterns of activity which correspond to meaningful learning and knowledge construction. However, to enable such analytics, software platforms must gain access to qualitative information about the rhetorical dimensions of discourse contributions. This is difficult to extract from naturally occurring text, but the emergence of more structured annotation and deliberation platforms for learning makes such information available. Using the Cohere web application as a research vehicle, we present examples of analytics at the level of individual learners and groups, showing conceptual and social network patterns, which we propose as indicators of meaningful learning.
iSpot analysed: participatory learning and reputation BIBAFull-Text 34-43
  Doug Clow; Elpida Makriyannis
We present an analysis of activity on iSpot, a website supporting participatory learning about wildlife through social networking. A sophisticated and novel reputation system provides feedback on the scientific expertise of users, allowing users to track their own learning and that of others, in an informal learning context. We find steeply unequal long-tail distributions of activity, characteristic of social networks, and evidence of the reputation system functioning to amplify the contribution of accredited experts. We argue that there is considerable potential to apply such a reputation system in other participatory learning contexts.
Dataset-driven research for improving recommender systems for learning BIBAFull-Text 44-53
  Katrien Verbert; Hendrik Drachsler; Nikos Manouselis; Martin Wolpers; Riina Vuorikari; Erik Duval
In the world of recommender systems, it is common practice to use publicly available datasets from different application environments (e.g. MovieLens, Book-Crossing or EachMovie) in order to evaluate recommendation algorithms. These datasets are used as benchmarks to develop new recommendation algorithms and to compare them to other algorithms in given settings. In this paper, we explore datasets that capture learner interactions with tools and resources. We use the datasets to evaluate and compare the performance of different recommendation algorithms for learning. We present an experimental comparison of the accuracy of several collaborative filtering algorithms applied to these TEL datasets and elaborate on implicit relevance data, such as downloads and tags, that can be used to improve the performance of recommendation algorithms.
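For readers unfamiliar with the baseline technique, the following minimal Python sketch illustrates user-based collaborative filtering of the kind such comparisons typically include; the interaction matrix and scores below are hypothetical and are not data or algorithms from the paper.

    # Minimal user-based collaborative filtering sketch: predict how strongly a
    # learner would engage with a resource from the weighted votes of similar
    # learners. Toy interaction matrix (implicit relevance, e.g. download counts).
    import math

    interactions = {
        "ana":  {"r1": 3, "r2": 0, "r3": 5},
        "ben":  {"r1": 4, "r2": 1, "r3": 4},
        "cruz": {"r1": 0, "r2": 2, "r3": 1},
    }

    def cosine(u, v):
        dot = sum(u[k] * v[k] for k in u)
        nu = math.sqrt(sum(x * x for x in u.values()))
        nv = math.sqrt(sum(x * x for x in v.values()))
        return dot / (nu * nv) if nu and nv else 0.0

    def predict(target, resource):
        # similarity-weighted average of the other learners' interactions
        num = den = 0.0
        for other, profile in interactions.items():
            if other == target:
                continue
            sim = cosine(interactions[target], profile)
            num += sim * profile[resource]
            den += abs(sim)
        return num / den if den else 0.0

    print(predict("ana", "r2"))  # estimated relevance of resource r2 for ana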
Variable construction for predictive and causal modeling of online education data BIBAFull-Text 54-63
  Stephen E. Fancsali
We consider the problem of predictive and causal modeling of data collected by courseware in online education settings, focusing on graphical causal models as a formalism for such modeling. We review results from a prior study, present a new pilot study, and suggest that novel methods of constructing variables for analysis may improve our ability to infer predictors and causes of learning outcomes in online education. Finally, several general problems for causal discovery from such data are surveyed along with potential solutions.
A unified framework for multi-level analysis of distributed learning BIBAFull-Text 64-74
  Daniel Suthers; Devan Rosen
Learning and knowledge creation are often distributed across multiple media and sites in networked environments. Traces of such activity may be fragmented across multiple logs and may not match analytic needs. As a result, the coherence of distributed interaction and emergent phenomena are analytically cloaked. Understanding distributed learning and knowledge creation requires multi-level analysis of the situated accomplishments of individuals and small groups and of how this local activity gives rise to larger phenomena in a network. We have developed an abstract transcript representation that provides a unified analytic artifact of distributed activity, and an analytic hierarchy that supports multiple levels of analysis. Log files are abstracted to directed graphs that record observed relationships (contingencies) between events, which may be interpreted as evidence of interaction and other influences between actors. Contingency graphs are further abstracted to two-mode directed graphs that record how associations between actors are mediated by digital artifacts and summarize sequential patterns of interaction. Transitive closure of these associograms creates sociograms, to which existing network analytic techniques may be applied, yielding aggregate results that can then be interpreted by reference to the other levels of analysis. We discuss how the analytic hierarchy bridges between levels of analysis and theory.
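The final abstraction step, projecting a two-mode actor-artifact graph onto a one-mode sociogram, can be illustrated with a minimal Python sketch; the edges below are hypothetical and this is not the authors' implementation.

    # Project a two-mode actor-artifact graph onto a sociogram by linking
    # actors who acted on a common artifact; edge weight counts shared artifacts.
    from collections import defaultdict
    from itertools import combinations

    # two-mode edges: (actor, artifact) meaning "actor acted on artifact"
    associations = [
        ("alice", "wiki_page_1"), ("bob", "wiki_page_1"),
        ("bob", "chat_thread_7"), ("carol", "chat_thread_7"),
        ("carol", "wiki_page_2"),
    ]

    actors_by_artifact = defaultdict(set)
    for actor, artifact in associations:
        actors_by_artifact[artifact].add(actor)

    sociogram = defaultdict(int)
    for actors in actors_by_artifact.values():
        for a, b in combinations(sorted(actors), 2):
            sociogram[(a, b)] += 1

    for (a, b), weight in sorted(sociogram.items()):
        print(f"{a} -- {b}  (shared artifacts: {weight})")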
Redefining dropping out in online higher education: a case study from the UOC BIBAFull-Text 75-80
  Josep Grau-Valldosera; Julià Minguillón
In recent years, studies into the reasons for dropping out of online higher education have been undertaken with greater regularity, in parallel with the rise in the relative weight of this type of education compared with brick-and-mortar education. However, the work invested in characterising the students who drop out, compared with those who do not, appears not to have received the same attention as the analysis of the causes. The definition of dropping out is very sensitive to context. In this article, we reach a purely empirical definition of student drop-out, based on the probability of not continuing a specific academic programme following several consecutive semesters of "theoretical break". Dropping out should be properly defined before analysing its causes, and before comparing drop-out rates between different online programmes, or between online and on-campus ones. Our results show that there are significant differences among programmes, depending on their theoretical duration but not on their domain of knowledge.
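A minimal sketch of the kind of empirical rule the abstract describes, assuming a fixed number of consecutive break semesters as the threshold; the threshold and enrolment histories below are illustrative, not the paper's fitted values.

    # Flag a student as having dropped out after N consecutive semesters
    # without enrolment ("theoretical break").
    def has_dropped_out(enrolment_history, n_break_semesters=3):
        """enrolment_history: chronological list of booleans,
        True = enrolled that semester, False = on break."""
        consecutive_breaks = 0
        for enrolled in enrolment_history:
            consecutive_breaks = 0 if enrolled else consecutive_breaks + 1
            if consecutive_breaks >= n_break_semesters:
                return True
        return False

    print(has_dropped_out([True, False, False, False]))        # True
    print(has_dropped_out([True, False, True, False, False]))  # False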

Vision and conceptual papers

Usage contexts for object similarity: exploratory investigations BIBAFull-Text 81-85
  Katja Niemann; Hans-Christian Schmitz; Maren Scheffel; Martin Wolpers
We present new ways of detecting semantic relations between learning resources, e.g. for recommendations, by taking only their usage, and not their content, into account. We take concepts used in linguistic lexicology and transfer them from their original field of application, i.e. sequences of words, to the analysis of sequences of resources extracted from user activities. In this paper we describe three initial experiments, their evaluation and further work.
The who, what, when, and why of lecture capture BIBAFull-Text 86-92
  Christopher Brooks; Carrie Demmans Epp; Greg Logan; Jim Greer
Video lecture capture is rapidly being deployed in higher-education institutions as a means of increasing student learning, outreach, and experience. Understanding how learners use these systems, and relating this use back to pedagogical and institutional goals, is a hard issue that has been largely unexplored. This work describes a novel web-based lecture presentation system that contains fine-grained user tracking features. These features, along with student surveys, have been used to help analyse the behaviour of hundreds of students over an academic term, quantifying both the learning approaches of students and their perceptions of learning with lecture capture.
Towards visual analytics for teachers' dynamic diagnostic pedagogical decision-making BIBAFull-Text 93-98
  Ravi Vatrapu; Chris Teplovs; Nobuko Fujita; Susan Bull
The focus of this paper is to delineate and discuss design considerations for supporting teachers' dynamic diagnostic decision-making in classrooms of the 21st century. Based on the Next Generation Teaching Education and Learning for Life (NEXT-TELL) European Commission integrated project, we envision classrooms of the 21st century to (a) incorporate 1:1 computing, (b) provide computational as well as methodological support for teachers to design, deploy and assess learning activities and (c) immerse students in rich, personalized and varied learning activities in information ecologies, resulting in high-performance, high-density, high-bandwidth, and data-rich classrooms. In contrast to existing research in educational data mining and learning analytics, our vision is to employ visual analytics techniques and tools to support teachers' dynamic diagnostic pedagogical decision-making in real time and in actual classrooms. The primary benefit of our vision is that learning analytics becomes an integral part of the teaching profession, so that teachers can provide timely, meaningful, and actionable formative assessment of ongoing learning activities in situ. Integrating emerging developments in visual analytics and the established methodological approach of design-based research (DBR) in the learning sciences, we introduce a new method called "Teaching Analytics" and explore a triadic model of teaching analytics (TMTA). TMTA adapts and extends the Pair Analytics method in visual analytics, which in turn was inspired by the pair programming model of the extreme programming paradigm. Our preliminary vision of TMTA consists of a collocated collaborative triad of a Teaching Expert (TE), a Visual Analytics Expert (VAE), and a Design-Based Research Expert (DBRE) analyzing, interpreting and acting upon real-time data generated by students' learning activities, using a range of visual analytics tools. We propose an implementation of TMTA using open learner models (OLM) and conclude with an outline of future work.
Learning analytics to identify exploratory dialogue within synchronous text chat BIBAFull-Text 99-103
  Rebecca Ferguson; Simon Buckingham Shum
While generic web analytics tend to focus on easily harvested quantitative data, Learning Analytics will often seek qualitative understanding of the context and meaning of this information. This is critical in the case of dialogue, which may be employed to share knowledge and jointly construct understandings, but which also involves many superficial exchanges. Previous studies have validated a particular pattern of 'exploratory dialogue' in learning environments to signify sharing, challenge, evaluation and careful consideration by participants. This study investigates the use of sociocultural discourse analysis to analyse synchronous text chat during an online conference. Key words and phrases indicative of exploratory dialogue were identified in these exchanges, and peaks of exploratory dialogue were associated with periods set aside for discussion and keynote speakers. Fewer individuals posted at these times, but meaningful discussion outweighed trivial exchanges. If further analysis confirms the validity of these markers as learning analytics, they could be used by recommendation engines to support learners and teachers in locating dialogue exchanges where deeper learning appears to be taking place.
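A minimal sketch of marker-based detection of exploratory dialogue in chat, assuming a hand-picked list of indicator phrases; the phrases and messages below are illustrative, not the study's validated markers.

    # Count indicator phrases of exploratory dialogue in each chat message.
    MARKERS = ["because", "i think", "what if", "do you agree", "on the other hand"]

    def exploratory_score(message):
        text = message.lower()
        return sum(1 for marker in MARKERS if marker in text)

    chat_log = [
        ("10:01", "lol that slide again"),
        ("10:02", "I think this works because the groups self-select, do you agree?"),
    ]

    for timestamp, message in chat_log:
        print(timestamp, exploratory_score(message), message)

Aggregating such scores per time interval would surface the peaks of exploratory dialogue the abstract refers to.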
The value of learning analytics to networked learning on a personal learning environment BIBAFull-Text 104-109
  Hélène Fournier; Rita Kop; Hanan Sitlia
Some might argue that the analytics tools at our disposal are currently used mainly for boring purposes, such as improving processes and making money. In this paper we will try to define learning analytics and their purpose for learning and education. We will ponder the best possible fit between particular types of research methods and their analysis. Methodological concerns related to the analysis of Big Data collected on online networks, as well as ethical and privacy concerns, will also be highlighted, and a case study of the use of learning analytics in a Massive Open Online Course will be explored.
Using learning analytics to assess students' behavior in open-ended programming tasks BIBAFull-Text 110-116
  Paulo Blikstein
There is great interest in assessing student learning in unscripted, open-ended environments, but students' work can evolve in ways that are too subtle or too complex to be detected by the human eye. In this paper, I describe an automated technique to assess, analyze and visualize students learning computer programming. I logged hundreds of snapshots of students' code during a programming assignment and employed different quantitative techniques to extract students' behaviors and categorize them in terms of programming experience. I first review the literature on educational data mining, learning analytics, computer vision applied to assessment, and emotion detection, then discuss the relevance of the work and describe one case study with a group of undergraduate engineering students.
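One simple quantitative measure that can be extracted from logged code snapshots is the size of the change between consecutive versions; the Python sketch below is illustrative only, and the snapshots and metric are hypothetical rather than the paper's actual analysis.

    # Measure how much code changed between consecutive logged snapshots.
    import difflib

    snapshots = [
        "print('hi')\n",
        "def greet(name):\n    print('hi', name)\n",
        "def greet(name):\n    print('hello', name)\n\ngreet('world')\n",
    ]

    def change_size(old, new):
        # number of added or removed lines between two snapshots
        diff = difflib.ndiff(old.splitlines(), new.splitlines())
        return sum(1 for line in diff if line.startswith(("+ ", "- ")))

    for old, new in zip(snapshots, snapshots[1:]):
        print(change_size(old, new))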
Learning analytics as interpretive practice: applying Westerman to educational intervention BIBAFull-Text 117-121
  Michael Atkisson; David Wiley
In Westerman's [12] disruptive article, "Quantitative research as an interpretive enterprise: The mostly unacknowledged role of interpretation in research efforts and suggestions for explicitly interpretive quantitative investigations," he invited qualitative researchers in psychology to adopt quantitative methods into interpretive inquiry, given that they were as capable as qualitative measures in producing meaning-laden results. The objective of this article is to identify Westerman's [12] key arguments and apply them to the practice of Learning Analytics in educational interventions. The primary implication for Learning Analytics practitioners is the need to interpret quantitative analysis procedures at every phase from philosophy to conclusions. Furthermore, Learning Analytics practitioners and consumers must critically examine any assumption that suggests quantitative methodologies in Learning Analytics are inherently objective or that Learning Analytics algorithms may replace judgment rather than aid it. Lastly we propose a method for making observational data in virtual environments concrete through nested models.

Ideas and innovation

Academic analytics landscape at the University of Phoenix BIBAFull-Text 122-126
  Mike Sharkey
The University of Phoenix understands that in order to serve its large population of non-traditional students, it needs to rely on data. We have created a strong foundation with an integrated data repository that connects data from all parts of the organization. With this repository in place, we can now undertake a variety of analytics projects. One such project is an attempt to predict a student's persistence in their program using available data indicators such as schedule, grades, content usage, and demographics.
Cultural considerations in learning analytics BIBAFull-Text 127-133
  Ravi Vatrapu
This paper discusses empirical findings demonstrating cultural influences on social behavior, communication, cognition, and technology enhanced learning, and draws implications for learning analytics.
Social and semantic network analysis of chat logs BIBAFull-Text 134-139
  Devan Rosen; Victor Miagkikh; Daniel Suthers
Multi-user virtual environments (MUVEs) allow many users to explore the environment and interact with other users as they learn new content and share their knowledge with others. The semi-synchronous communicative interaction within these learning environments is typically text-based Internet relay chat (IRC). IRC data is stored in the form of chat logs and can generate a large volume of data, posing a difficulty for researchers looking to evaluate learning in the interaction by analyzing and interpreting the patterns of communication structure and related content. This paper describes procedures for the measurement and visualization of chat-based communicative interaction in MUVEs. Methods are offered for structural analysis via social networks and content analysis via semantic networks. Measuring and visualizing social and semantic networks opens a window into the structure of learning communities, and also provides a large cache of analytics with which to explore individual learning outcomes and group interaction in any virtual interaction. A case study of a learning-based MUVE, SRI's Tapped In community, is used to elaborate the analytic methods.
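The content-analysis half can be illustrated with a minimal sketch that builds a small semantic network from word co-occurrence within chat messages; the stop list and messages below are illustrative and are not the authors' procedure.

    # Build word co-occurrence edges within each chat message.
    from collections import Counter
    from itertools import combinations

    STOPWORDS = {"the", "a", "to", "is", "and", "of"}

    messages = [
        "the avatar walks to the science museum",
        "the museum exhibit explains the water cycle",
    ]

    edge_weights = Counter()
    for message in messages:
        words = sorted({w for w in message.lower().split() if w not in STOPWORDS})
        for pair in combinations(words, 2):
            edge_weights[pair] += 1

    for (w1, w2), weight in edge_weights.most_common(5):
        print(w1, "--", w2, weight)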
Applying analytics for a learning portal: the Organic.Edunet case study BIBAFull-Text 140-146
  Nikos Palavitsinis; Vassilios Protonotarios; Nikos Manouselis
Learning portals are education-oriented Web portals which provide access to a variety of educational material, usually coming from various sources. In order to explore how they can support their users during an educational activity (e.g. preparing to teach a course), it would be interesting to study the behavior of their visitors, focusing on the particular context in which specific actions take place. For example, user activities may be analyzed during specific learning events, when activities are more focused. This paper discusses the case study of the Organic.Edunet Web portal (www.organic-edunet.eu), a learning portal for organic agriculture educators that provides access to more than 10,500 learning resources from a federation of 11 institutional repositories. The portal mostly focuses on serving school teachers and university tutors and has so far attracted almost 42,200 unique visitors from more than 160 countries, of which about 2,600 have registered on the portal. An effort is made to study the users' behavior, focusing on tutors and educators in both schools and universities, in relation to specific training events in which we know they have been involved. We therefore analyze logs of user activities that took place on specific dates and in specific geographical locations, in order to identify possible changes in their normal visiting behavior.
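A minimal sketch of the kind of log slicing involved, comparing visit counts inside and outside a known training-event window; the records and dates below are hypothetical, not the portal's actual logs.

    # Compare portal visits during a training event with visits outside it.
    from datetime import date

    visits = [  # (visit date, country) extracted from portal logs
        (date(2011, 3, 1), "GR"), (date(2011, 3, 14), "GR"),
        (date(2011, 3, 15), "GR"), (date(2011, 3, 20), "DE"),
    ]

    event_start, event_end = date(2011, 3, 14), date(2011, 3, 16)

    during = [v for v in visits if event_start <= v[0] <= event_end]
    outside = [v for v in visits if not (event_start <= v[0] <= event_end)]
    print(len(during), "visits during the event,", len(outside), "outside it")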
Generating predictive models of learner community dynamics BIBAFull-Text 147-152
  Chris Teplovs; Nobuko Fujita; Ravi Vatrapu
In this paper we present a framework for learner modelling that combines latent semantic analysis and social network analysis of online discourse. The framework is supported by newly developed software, known as the Knowledge, Interaction, and Social Student Modelling Explorer (KISSME), that employs highly interactive visualizations of content-aware interactions among learners. Our goal is to develop, use and refine KISSME to generate and test predictive models of learner interactions to optimise learning.
Learning designs and learning analytics BIBAFull-Text 153-156
  Lori Lockyer; Shane Dawson
Government and institutionally-driven reforms focused on quality teaching and learning in universities emphasize the importance of developing replicable, scalable teaching approaches that can be evaluated. In this context, learning design and learning analytics are two fields of research that may help university teachers design quality learning experiences for their students, evaluate how students are learning within that intended learning context and support personalized learning experiences for students. Learning Designs are ways of describing an educational experience such that it can be applied across a range of disciplinary contexts. Learning analytics offers new approaches to investigating the data associated with a learner's experience. This paper explores the relationship between learning designs and learning analytics.
Revisiting formative evaluation: dynamic monitoring for the improvement of learning activity design and delivery BIBAFull-Text 157-162
  Griff Richards; Irwin DeVries
Distance education courses have a tradition of a formative evaluation cycle that takes place before a course is formally delivered. This paper discusses opportunities for improving online and blended learning by collecting formative data during course presentation. With a goal of overall improvement in instructional effectiveness and identification of promising practices for inclusion in a learning activities design library, we propose the immediate and on-going monitoring of the effectiveness of learning activities, tutor facilitation and learner satisfaction during the course presentation. This has implications for constructively involving the learners and facilitators in the course improvement process. While originally conceived to reduce the time for pilot evaluation of new courses and learning activities, the proposed system could also be extended to individualized and blended learning environments, and if implemented using semantic web technologies, for research into the effectiveness of learning activity patterns.
Stepping out of the box: towards analytics outside the learning management system BIBAFull-Text 163-167
  Abelardo Pardo; Carlos Delgado Kloos
Most current learning analytics techniques take as their starting point the data recorded by Learning Management Systems (LMS) about the interactions of students with the platform and among themselves. However, students increasingly rely less on the functionality offered by the LMS and more on applications that are freely available on the net. This situation is magnified in studies in which students need to interact with a set of tools that are easily installed on their personal computers. This paper shows an approach, based on virtual machines, by which a set of events occurring outside the LMS is recorded and sent to a central server in a scalable and unobtrusive manner.
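The general pattern, capturing an event on the student's machine and shipping it to a central collector, might look like the sketch below; the endpoint URL and event fields are placeholders and this is not the authors' actual tooling.

    # Capture a tool-usage event and send it to a central collector,
    # failing quietly so the student's work is never disturbed.
    import json
    import time
    import urllib.request

    COLLECTOR_URL = "https://collector.example.org/events"  # placeholder

    def report_event(tool, action, detail=""):
        event = {
            "tool": tool,        # e.g. "gcc", "valgrind", "firefox"
            "action": action,    # e.g. "invoked", "file_saved"
            "detail": detail,
            "timestamp": time.time(),
        }
        data = json.dumps(event).encode("utf-8")
        req = urllib.request.Request(
            COLLECTOR_URL, data=data,
            headers={"Content-Type": "application/json"})
        try:
            urllib.request.urlopen(req, timeout=2)  # fire and forget
        except OSError:
            pass  # collector unreachable: drop the event silently

    report_event("gcc", "invoked", "lab3.c")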

Tool demonstration papers

SNAPP: a bird's-eye view of temporal participant interaction BIBAFull-Text 168-173
  Aneesha Bakharia; Shane Dawson
The Social Networks Adapting Pedagogical Practice (SNAPP) tool was developed to provide instructors with the capacity to visualise the evolution of participant relationships within discussion forums. Providing forum facilitators with access to these forms of data visualisation and social network metrics in 'real time' allows emergent interaction patterns to be analysed and interventions to be undertaken as required. SNAPP essentially serves as an interaction diagnostic tool that helps bring the affordances of 'real-time' social network analysis to fruition. This paper details the functional features included in SNAPP 2.0 and how they relate to learning activity intent and participant monitoring. SNAPP 2.0 includes the ability to view the evolution of participant interaction over time and to annotate key events that occur along this timeline. This feature is useful for monitoring network evolution and evaluating the impact of intervention strategies on student engagement and connectivity. SNAPP currently supports the discussion forums found in popular commercial and open source Learning Management Systems (LMS) such as Blackboard, Desire2Learn and Moodle, and works in both Internet Explorer and Firefox.
AAT: a tool for accessing and analysing students' behaviour data in learning systems BIBAFull-Text 174-179
  Sabine Graf; Cindy Ives; Nazim Rahman; Arnold Ferri
In online learning environments, teachers and course designers often get little feedback about how students actually interact with and learn in online courses. Most of the learning systems used by educational institutions store comprehensive log data associated with students' behaviours and actions. However, these systems typically reveal or report on very general and limited information based on this data. In order to provide teachers and course designers with more detailed and meaningful information about students' behaviour and their use of learning resources within online courses, an analytics tool has been developed. The tool incorporates functionality to access and analyse data related to students' behaviours in learning systems. This tool can provide valuable information about students' learning processes allowing the identification of difficult or inappropriate learning material, and can therefore significantly contribute to the design of improved student support activities and resources.
Evolving a learning analytics platform BIBAFull-Text 180-185
  Ari Bader-Natal; Thomas Lotze
Web-based learning systems offer researchers the ability to collect and analyze fine-grained educational data on the performance and activity of students, as a basis for better understanding and supporting learning among those students. The availability of this data enables stakeholders to pose a variety of interesting questions, often specifically focused on some subset of students. As a system matures, the number of stakeholders, the number of interesting questions, and the number of relevant sub-populations of students all grow, adding complexity to the data analysis task. In this work, we describe an internal analytics system designed and developed to address this challenge with flexibility and scalability. We present several typical examples of analysis, discuss a few uncommon but powerful use cases, and share lessons learned from the first two years of iteratively developing the platform.