Thematic Paper Session 1

Corpus Studies of Peer Review: How Effective is Commentary, and Is There Evidence of Transfer Across Courses?
Chris Anson (North Carolina State University, US), Joseph Moxley (University of South Florida, US) and Djuddah Leijen (University of Tartu, Estonia)

Abstract:
There is increasing interest in the field of writing studies in how effectively the language of peer review promotes revision, and in the extent to which the knowledge students gain in a writing course “transfers” into the writing they do in other academic contexts. Both questions are important to curriculum development and delivery: the former because we need information about how to orient students to provide the most effective peer reviews possible, the latter because of the assumption that foundational courses prepare students for the writing they will do in their fields of study and eventually in their careers and civic lives (Anson & Moore, 2016; Yancey et al., 2014). On this panel, we will share the results of studies exploring these questions.

Speaker #1 will describe the collaborative efforts at his institution to develop a digital peer review system (MyReviewers) that has been used across a number of institutions internationally and that generates the data for several studies of student peer review. Students at these institutions have used the system to engage in reflective writing about their processes and to provide reviews and/or grades on their peers’ written work. The system also allows for the generation and storage of instructor comments on the same student papers. As a result, a corpus of hundreds of thousands of student essays and peer reviews has emerged, which researchers are actively analyzing using different methods. Speaker #1 will describe MyReviewers and sketch the results of some of the studies that have used the peer review system as their main data collection tool. His presentation will transition into the second presentation, which also used MyReviewers to collect and analyze a large corpus of data.
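To make the shape of such a corpus concrete, the following minimal sketch shows what one record linking an essay to its peer reviews, instructor comments, and reflective writing might look like; the field names and course code are illustrative assumptions, not MyReviewers’ actual schema.

```python
# Hypothetical sketch of one record in a peer-review corpus of the kind
# MyReviewers produces. All field names here are assumptions for
# illustration, not the system's actual data model.
from dataclasses import dataclass, field

@dataclass
class ReviewRecord:
    essay_id: str                                         # the student essay under review
    course: str                                           # hypothetical course code
    peer_reviews: list = field(default_factory=list)      # peer comment texts
    instructor_comments: list = field(default_factory=list)
    reflection: str = ""                                  # student's reflective writing

record = ReviewRecord(
    essay_id="essay-001",
    course="ENC1101",
    peer_reviews=["Clarify your thesis in the opening paragraph."],
    instructor_comments=["Strong evidence; tighten the conclusion."],
    reflection="I focused on making my argument clearer in this draft.",
)
print(record.course, len(record.peer_reviews), "peer review(s)")
```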

Speaker #2 will present the results of a corpus study comparing the conceptually related key terms (quasi-threshold concepts) used by students during peer review in a foundational writing course and in a large STEM course. This study applies the results of a non-probability survey of writing experts administered to two major listservs populated by scholars and teachers of academic writing, which yielded 475 responses (Anson & Anson, 2017). The survey asked respondents to provide ten key terms expected in principled, expert response to student writing. The resulting corpus of terms was then applied to student peer reviews generated in MyReviewers, described by Speaker #1. Preliminary statistical analysis shows that terms adopted by students in the foundational writing course do not appear with the same frequency in the STEM peer reviews. These results, which will be made more robust in the presentation after further analysis, suggest that 1) assumptions about the transfer of key rhetorical and composing terms across academic contexts are misguided; 2) the contexts of STEM courses do not require the use of these terms, for reasons to be considered; or 3) the transfer of concepts important to writing teachers, and to all writing, is simply not taking place once students leave that context.
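The term-frequency comparison at the heart of this study can be illustrated with a short sketch; the term list, sample reviews, and normalization per 1,000 words below are assumptions for illustration, not the study’s actual pipeline, which draws its terms from the Anson & Anson (2017) survey.

```python
# Minimal sketch of a key-term frequency comparison across two review
# corpora. Terms and texts are placeholders, not the study's data.
import re
from collections import Counter

KEY_TERMS = ["audience", "thesis", "evidence", "organization", "revision"]

def term_frequencies(texts, terms):
    """Count occurrences of each key term per 1,000 words of review text."""
    counts = Counter()
    total_words = 0
    for text in texts:
        words = re.findall(r"[a-z']+", text.lower())
        total_words += len(words)
        bag = Counter(words)
        for term in terms:
            counts[term] += bag[term]
    return {t: 1000 * counts[t] / max(total_words, 1) for t in terms}

fyc_reviews = ["Your thesis is clear, but the evidence in paragraph two is thin."]
stem_reviews = ["The methods section is missing units for the flow rate."]

print("First-year writing:", term_frequencies(fyc_reviews, KEY_TERMS))
print("STEM course:       ", term_frequencies(stem_reviews, KEY_TERMS))
```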

Speaker #3 will present the results of a study of a corpus of feedback provided in small groups by Master’s and PhD students (the Master’s students’ data collected in MyReviewers). The feedback comments were analysed to measure their reported effectiveness (self-reported and as reported by others), with the aim of determining what is most likely to be considered an effective comment. Comments are coded for a variety of features of effective feedback, as described in recent literature (Nelson & Schunn, 2009; Leijen, 2017), and for features measuring the social interaction between peers, such as revision and non-revision type comments (Liu & Sadler, 2003), communication type (Yallop, 2016), and the use of affective language (Yallop, 2016). The results of this study will contribute to developing an effectiveness measure for larger corpora, with the aim of testing whether a comment reported as effective also proves effective when measured against the improvement of students’ writing from one draft to the next and, in the long run, whether its effects transfer to subsequent years.
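A simplified sketch of how coded comments might be cross-tabulated against reported effectiveness follows; the feature flags, sample comments, and tallying are illustrative assumptions rather than the study’s actual coding scheme.

```python
# Minimal sketch of tallying coded peer-feedback comments against feature
# categories drawn from the literature (e.g., revision-oriented vs.
# non-revision comments, affective language). Data and categories are
# illustrative assumptions, not the study's coding scheme.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    revision_oriented: bool   # suggests a concrete change (cf. Liu & Sadler, 2003)
    affective: bool           # contains praise/affect markers (cf. Yallop, 2016)
    rated_effective: bool     # peer-reported effectiveness

comments = [
    Comment("Move the definition before the example.", True, False, True),
    Comment("I really enjoyed reading this!", False, True, False),
    Comment("Consider citing a source for this claim.", True, False, True),
]

# Cross-tabulate coded features against reported effectiveness.
tally = Counter()
for c in comments:
    tally[("revision", c.rated_effective)] += c.revision_oriented
    tally[("affective", c.rated_effective)] += c.affective

for (feature, effective), n in sorted(tally.items()):
    print(f"{feature:9s} | rated effective={effective}: {n}")
```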

References
Anson, I. G., & Anson, C. M. (2017). Assessing peer and instructor response to writing: A corpus analysis from an expert survey. Assessing Writing, 33, 12-24.
Anson, C. M., & Moore, J. (Eds.). (2016). Critical transitions: Writing and the question of transfer. Boulder: University Press of Colorado and the WAC Clearinghouse.
Leijen, D. A. J. (2017). A novel approach to examine the impact of web-based peer review on the revisions of L2 writers. Computers and Composition, 43, 35-54.
Liu, J., & Sadler, R. W. (2003). The effect and affect of peer review in electronic versus traditional modes on L2 writing. Journal of English for Academic Purposes, 2(3), 193-227.
Nelson, M. M., & Schunn, C. D. (2009). The nature of feedback: How different types of peer feedback affect writing performance. Instructional Science, 37(4), 375-401.
Yallop, R. M. A. (2016). Measuring affective language in known peer feedback on L2 academic writing courses: A novel approach. Eesti Rakenduslingvistika Ühingu aastaraamat, 12, 287-308.
Yancey, K. B., Robertson, L., & Taczak, K. (2014). Writing across contexts: Transfer, composition, and sites of writing. Boulder: University Press of Colorado.