Tentative Schedule

Thursday: January 11, 2018

9:00-12:00, Workshops

1. Workshop for Writing Program Directors: Data Jam, Program Chair: TBD

Workshop Leader: Erica Snow

This workshop is ideal for writing program administrators who are looking for ways to assess their writing programs and secure funding for mentoring and evidence-based program development.

The goal of this track is to highlight exemplary methods for corpus-based research and writing analytics. Over the past decade, there have been exciting advances in the fields of learning analytics and data mining. Recently, this work has begun to shape how we study and view the writing process. This track aims to bring together data scientists and writing program administrators who are interested in using data mining and learning analytics methods to better understand the writing process and its products.

 

2. Workshop on New Tools for Writers: From tool development to writing analytics and back: Using data to explore pedagogical impacts and create the next generation of writing tools

Workshop Leaders: Elena Cotos (Iowa State University), Otto Kruse (Zurich University of Applied Sciences), and Christian Rapp (Zurich University of Applied Sciences)

Recent years have seen an increase in electronic tools that support academic writing (and its instruction, learning, and practice) at different stages of the writing process, with ever more, and increasingly sophisticated, functions. In a review currently being conducted by members of the European Literacy Network, we have found no fewer than 85 such tools, and more are surely on the way. Most of these tools run as Software as a Service (SaaS), i.e., they run in a web browser while the software executes on a server, which opens the potential to collect enormous amounts of user data. The research questions we want to address in this conference track are which data are (or should be) collected, how and for which purposes they could be analyzed, and how the results could be used to improve both the tools and our understanding of them. We assume that this research problem calls for interdisciplinary cooperation between, among others, academic writing researchers and instructors, writing tool developers, experts in data analysis, and computational and corpus linguists. While the track's initiators all develop such systems, and hence have a very practical interest, we also consider an exchange between applied and foundational researchers fruitful.

12:00-1:00, Lunch will not be provided.

1:00-1:45, Erica Snow, Imbellus

Presentation Title: What can Writing Studies learn from the Fields of Cognitive Psychology and Predictive Analytics?

The development of effective learning tasks involves consideration of a number of factors, including characteristics of the learner, the content of the domain, and features of the task environment. Within ill-defined domains, such as reading and writing, this process becomes increasingly complicated, as student performance, cognitive states, and affect are subjective and difficult to assess accurately. However, recent work in the field of cognitive psychology has provided new insights into these hard-to-measure constructs that can be used to augment in-class writing instruction. This presentation will describe work within the field of cognitive psychology that aims to use advanced analytics to identify evidence of users' cognitive states as well as writing quality. Future applications of these results, and how they can be used in the classroom and by writing instructors, will also be discussed.

1:45-2:00, Coffee Break

2:00-3:00

Session One:  TBD

Session Two:  TBD

3:00-3:15, Coffee Break

3:15-4:15

Session One: Writing Analytics and Peer Review in STEM (Roundtable)

Speakers: Val Ross, Director of Critical Writing at the University of Pennsylvania; Christiane Donahue, Director of the Institute for Writing and Rhetoric at Dartmouth; Chris Anson, Director of the Campus Writing and Speaking Program at North Carolina State University; Suzanne Lane, Director of Writing, Rhetoric, and Professional Communication at MIT

Session Two: TBD

4:30-5:30, Phil Durrant, Senior Lecturer in Language Education at the University of Exeter

Presentation Title: Corpus research on the development of children’s school writing

Since at least the 1940s, researchers have been interested in studying the development of children's writing through quantitative analysis of texts. The need for research of this kind has become pressing in England in recent years due to an increased curricular emphasis on explicit teaching of the linguistic features of writing. The current National Curriculum states that students should be taught to 'draft and write by: selecting appropriate grammar and vocabulary, understanding how such choices can change and enhance meaning' (DfE, 2013a) and specifies the ages at which children are expected to master specific features of written grammar and vocabulary (DfE, 2013b). A convincing linguistic research base against which such policies can be evaluated does not yet, however, exist.

The Growth in Grammar project was developed in response to this need. It uses corpus methods to understand the linguistic development of English children's writing throughout the course of their compulsory education. Our team at the University of Exeter has collected a corpus of educationally authentic texts written by children in schools across England from ages six to sixteen, with the aim of understanding what distinguishes texts written at different ages, at different levels of attainment, and in different genres.

This presentation will give an overview of the last six decades of research into how the language of children’s writing develops and discuss the Growth in Grammar project, focusing especially on methodological issues involved in creating and analysing a child learner corpus and on what our results are telling us about written language development.

8:00 p.m., Bonfire on the Beach

Friday: January 12, 2018

8:00-9:00, Chat with the Editorial Board of The Journal of Writing Assessment, Norbert Elliot

9:00-10:00, Jill Burstein, Director of Research of the Natural Language Processing Group in the Research Division at Educational Testing Service

Presentation Title: Automated Writing Evaluation & the Literacy Challenge: Tools for Supporting & Understanding Postsecondary Writers

Abstract: Writing is a challenge, especially for at-risk students who may lack the prerequisite writing skills required to persist in U.S. 4-year postsecondary (college) institutions. U.S. K-12 research examines writing achievement and the specific skills and knowledge in the writing domain. Automated writing evaluation (AWE) systems typically support the measurement of pertinent writing skills for automated scoring of large-volume, high-stakes assessments and online instruction. AWE has been used primarily for on-demand essay writing on standardized assessments. However, the real-time, dynamic nature of NLP-based AWE affords the ability to generate feedback across a range of writing genres in postsecondary education, such as on-demand essay writing tasks, argumentative essays in the social sciences, and lab reports in STEM courses. AWE analyses can be used to generate feedback that provides students with meaningful information to support their writing, as well as educational analytics that can be informative for various stakeholders, including students, instructors, parents, administrators, and policy-makers. This talk will focus on a demonstration and discussion of a new publicly accessible feedback app (to be announced) and an exploratory research study that uses AWE to examine relationships between features of postsecondary student writing and broader success predictors.

10:15-11:15

Session One: TBD

Session Two: TBD

11:15-12:30, Lunch on Your Own

12:30-1:30

Session One: TBD

Session Two: TBD

1:45-2:45

Session One: TBD

Session Two: TBD

3:00-4:00

Session One: TBD

Session Two: TBD

4:30-4:45, Closing Remarks