Writing Analytics, Data Mining, & Writing Studies, a preconference workshop at EDM 2016, aimed to generate cross-disciplinary research among writing program directors and faculty, computational linguists, computer scientists, and educational measurement specialists. Its primary goal was to facilitate a research community around large-scale data analysis, with a particular focus on writing studies, data mining, and analytics.
Workshop Goals
The workshop
- aimed to generate cross-disciplinary research among writing program directors and faculty, computational linguists, computer scientists, and educational measurement specialists
- asked how writing analytics and data mining can improve existing methods for responding to and assessing student writing
- invited researchers in the domains of data mining, writing analytics, and writing studies to engage in a creative interdisciplinary exploration of how digitally based writing analytics might improve students’ cognitive, intrapersonal and interpersonal competencies as writers, and also provide new analytic tools for assessing this improvement
- invited participants to brainstorm ways analytics can improve document critique, peer review, and writing program assessment
About the Workshop
Workshop Leaders:
- Val Ross, University of Pennsylvania
- Alex Rudniy, Fairleigh Dickinson University
- Joe Moxley, University of South Florida
- David Eubanks, Furman University
Presenters addressed the following questions:
- How can data mining and analytics be leveraged to better meet the needs of students and educational institutions?
- What are best practices for adapting state-of-the-art data mining approaches to the educational domain, with specific attention to teaching and assessing writing?
- How can researchers detect and assess students’ affective and emotional states while engaging the writing construct?
- For assessing writing, automated grading, automated commenting, and natural language or textual data processing:
  - What are the applications of massively parallel computation?
  - What are current advances and future directions in artificial intelligence?
  - Which methods, tools, or big data platforms are most efficient?
  - What are effective pre-processing techniques, e.g., for the Extract/Transform/Load phase?
  - What are successful evaluation and validation methods? (One common metric is sketched below.)
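As one concrete instance of the evaluation question above, the sketch below computes quadratic weighted kappa, a metric commonly used to validate automated essay scoring against human raters. It is an illustration only: the 1–5 scoring scale and the sample scores are hypothetical, and the workshop did not prescribe any particular metric or implementation.

```python
# Minimal sketch: quadratic weighted kappa (QWK), a standard agreement
# metric for validating automated essay scores against human raters.
# The 1-5 scale and the sample scores below are hypothetical.
import numpy as np

def quadratic_weighted_kappa(human, machine, min_score, max_score):
    """Agreement on an ordinal scale, penalizing large disagreements
    more heavily than small ones (1.0 = perfect agreement)."""
    n = max_score - min_score + 1
    # Observed matrix of (human score, machine score) pairs.
    observed = np.zeros((n, n))
    for h, m in zip(human, machine):
        observed[h - min_score, m - min_score] += 1
    # Expected matrix under chance agreement (outer product of marginals).
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0)) / observed.sum()
    # Quadratic disagreement weights.
    idx = np.arange(n)
    weights = (idx[:, None] - idx[None, :]) ** 2 / (n - 1) ** 2
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

# Hypothetical rubric scores for ten essays on a 1-5 scale.
human_scores = [3, 4, 2, 5, 3, 4, 1, 3, 4, 2]
machine_scores = [3, 4, 3, 4, 3, 4, 2, 3, 5, 2]
print(round(quadratic_weighted_kappa(human_scores, machine_scores, 1, 5), 3))
```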
Digital tools such as My Reviewers, which enable instructors to grade and comment on student papers and peer reviews online, are transforming how instructors and students critique documents, and they have the potential to transform how writing and writing programs are assessed. Beyond profoundly altering how faculty and students respond to writing, these tools aggregate e-portfolios, facilitate distributive evaluation, and archive data that allow researchers to mine texts and map student outcomes in order to produce analytics that inform users, researchers, and administrators. Rather than limiting assessment to cognitive measures, these toolsets facilitate gathering authentic assessment information about students' intrapersonal and interpersonal competencies.
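To make the kind of analytics such archives enable more concrete, here is a minimal sketch that aggregates archived rubric scores by criterion, the sort of program-level summary an administrator might review. The record layout, field names, and scores are hypothetical and are not drawn from the actual My Reviewers schema.

```python
# Minimal sketch of program-level analytics over archived review data.
# The record layout and values are hypothetical, not the My Reviewers schema.
from collections import defaultdict

# Each record pairs one student submission with a rubric criterion and score.
archived_reviews = [
    {"student": "S1", "criterion": "focus",    "score": 3},
    {"student": "S1", "criterion": "evidence", "score": 4},
    {"student": "S2", "criterion": "focus",    "score": 2},
    {"student": "S2", "criterion": "evidence", "score": 5},
    {"student": "S3", "criterion": "focus",    "score": 4},
    {"student": "S3", "criterion": "evidence", "score": 3},
]

# Aggregate the mean score per rubric criterion across the program.
totals = defaultdict(float)
counts = defaultdict(int)
for record in archived_reviews:
    totals[record["criterion"]] += record["score"]
    counts[record["criterion"]] += 1

for criterion, total in sorted(totals.items()):
    print(f"{criterion}: mean score {total / counts[criterion]:.2f}")
```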