r3becca

The Second International Workshop on Discourse-Centric Learning Analytics

Workshop notes: Google Doc at http://bit.ly/1oRR1MO

Twitter hashtag: #dcla14

Workshop rationale

The First International Workshop on Discourse-Centric Learning Analytics succeeded in its aim of catalysing ideas and building community connections between those working in this field of social learning analytics. It also proposed a mission statement.

Mission statement for DCLA
Devise and validate analytics that look beyond surface measures in order to quantify linguistic proxies for deeper learning.

In 2014, the focus of the second international DCLA workshop, like that of LAK14, will be on the intersection of discourse learning analytics research, theory and practice. Once researchers have developed and validated discourse-centric analytics, how can these be successfully deployed at scale to support learning?

The success of DCLA13 demonstrated that an important class of learning analytics is emerging, concerned with the use of discourse to support learning and teaching. These analytics are being developed at the intersection of fields including automated assessment, learning dynamics, deliberation platforms, and computational linguistics. What moves these developments into the category of learning analytics, as opposed to research that sits in any of the above categories, is their use, or potential, to generate actionable intelligence. This may include the development of information displays that help learners and educators to understand significant discourse patterns and to reflect on learning dialogue, or tools that support interventions to improve discourse for learning.

Unlike other analytics, which take measures such as engagement, attention and test scores as proxies for learning, discourse-centric analytics offer researchers from different traditions the potential to focus on the quality of the learning process. Cognitive constructivists approach this work from the perspective that engaging in online dialogue encourages learners to make explicit what they have stored in their memories, while social constructivists may be focused on the ways in which dialogue promotes the collaborative process during which meaning is negotiated and knowledge co-constructed. Dialogue can reveal engagement with the ideas of others, the development of reasoning, shifts in understanding and the ways in which learners relate new ideas to personal understanding. Studies have shown that, in a variety of contexts, educational success and failure can be related to the quality of learners’ dialogue.

From an educator’s perspective, written discourse is one way of gaining insights into deeper learning and the higher order qualities associated with it, including critical thinking, argumentation and mastery of complex ideas. These skills are difficult to master and are a focus for assessment, particularly in the field of higher education. Discourse-centric learning analytics offer the possibility of using students’ written drafts to generate feedback that will help them to reflect upon and improve the ways in which they express their understanding.

The use of these analytics is not confined to the formal learning sector; they can also be applied to informal learning, where learners set their own goals and select their own methods of achieving them. For example, learners can use an Evidence Hub to make sense of the written ideas of a community. Algorithms can be used to identify key issues, ideas and evidence, and to support the development of a reflective community of practice by making it clear where people disagree and why.

The potential benefits and applications of discourse-centric analytics are clear, yet their development and deployment face challenges. DCLA14 addresses two of these: fragmentation of the field and barriers to adoption.

Fragmentation of the field

Fragmentation of the field emerges in part because, as identified above, these analytics are emerging at the intersection of many different areas of research. Even within these areas, the work and its topics are diverse. Researchers focused on networked learning, computer-mediated collaboration, computer-mediated discussion, computer conferencing, and asynchronous learning networks may all be working on similar data, but with only limited opportunities to share their conceptualisations and their findings.

This can result in a failure to build upon the work of others. De Wever and his colleagues carried out a review of instruments used for content analysis over a period of 13 years, from 1992 to 2005. They examined 15 instruments and found that ‘instruments are hardly compared or contrasted with one another. As a consequence, the empirical base of the validity of the instruments is limited’ [1].

These studies – and the analysis of them as a group – all have implications for DCLA, but they were carried out before the emergence of learning analytics as a field of study. A related cause for concern is therefore that researchers who are new to the field may neglect or be wholly unaware of relevant work in related areas. The DCLA community needs to be active in identifying and sharing key research and development that has taken place in related fields.

Barriers to adoption

The concern that researchers in the field of discourse-centric learning analytics may not build effectively on past work is mirrored by a concern that they may not be building effectively for the future. Learning analytics are an element of technology-enhanced learning (TEL); they make use of online data and computer algorithms to support both learning and teaching.

TEL is a complex system with many different elements, all of which must be taken into account as an innovation is designed, developed and embedded. Successful implementation of discourse-centric analytics requires attention to both technology and pedagogy. It also needs to take into account diverse elements of the TEL complex, including the activities and expectations of learners and teachers, policies on privacy and data protection, staff training, technical support, institutional goals and technical infrastructure.

  1. De Wever, B., Schellens, T., Valcke, M. and Van Keer, H. (2006). Content analysis schemes to analyze transcripts of online asynchronous discussion groups: a review. Computers & Education, 46(1), 6-28.
