Program

Thursday, January 12

1:00pm

Welcome from Joe Moxley, Conference Chair

Speaker: Joe Moxley, Director of Composition, University of South Florida. Joseph M. Moxley is a professor of English and director of the First-Year Composition Program at the University of South Florida. At USF, Moxley teaches graduate courses on research methods in writing studies, rhetoric and technology, scholarly publishing, and composition pedagogy...


Thursday January 12, 2017 1:00pm - 1:25pm
Harbor Hall Community Room

1:25pm

Introduction

Speaker: Christian Weisser


Thursday January 12, 2017 1:25pm - 1:30pm
Harbor Hall Community Room

1:30pm

“Whistle While You Work: Data Mining in the Multi-Dimensional Landscape of Writing Studies”

Speaker: Val Ross, Director of Critical Writing Program, University of Pennsylvania. Val Ross is the founding director (2003) of the University of Pennsylvania’s Critical Writing Program, an independent writing in the disciplines program organized around evidence-driven teaching and discipline-specific transferable writing instruction. Ross’s current research...


Thursday January 12, 2017 1:30pm - 1:50pm
Harbor Hall Community Room

Featured Speaker

1:50pm

“Using Peer Review and Reasoning Diagrams in Chemical Engineering Courses”

Speaker: Suzanne Lane, Director of the Writing, Rhetoric, and Professional Communication Program, Massachusetts Institute of Technology. Suzanne Lane is Senior Lecturer in Rhetoric and Communication, and Director of the Writing, Rhetoric, and Professional Communication (WRAP) program. She holds a bachelor’s degree in Chemical Engineering from MIT, a master’s in Creative Writing from the University of Colorado, and a doctorate in English from the University of Massachusetts, Amherst. Her research interests focus on contemporary rhetoric, genre theory...


Thursday January 12, 2017 1:50pm - 2:10pm
Harbor Hall Community Room

Featured Speaker

2:10pm

“Growth in Grammar: A multi-dimensional analysis of student writing between 5 and 16”

This talk will focus on pre-college development in student writing. It will describe an ongoing study into the linguistic developments seen in children’s writing as they progress through their school careers. Our research team is establishing a corpus of writing from English, History and Science classes produced by children from ages 5 to 16 at schools across England. We will use this corpus to try to understand how children’s language changes as they get older, what linguistic features distinguish higher- from lower-quality writing and how children at different ages shape their language use according to the disciplines and genres they are writing in. In this talk, I will describe the background to the study, what we already know about this area, our methods, and prospects for future work.


Speaker: Phil Durrant, Senior Lecturer in Language Education, University of Exeter. Since completing an undergraduate degree in Philosophy, I have spent most of my working life involved in language teaching and research. Before coming to Exeter, I taught English as a Foreign Language and English for Academic Purposes in Turkey (at Kent English and Bilkent University...


Thursday January 12, 2017 2:10pm - 2:30pm
Harbor Hall Community Room

Featured Speaker

2:30pm

Snack Break

Assorted cookies and brownies, mini pretzels and herbed goat cheese dip, whole fruit, hot tea and coffee, assorted sodas and water.


Thursday January 12, 2017 2:30pm - 2:40pm
Outside Harbor Hall Community Room

2:45pm

“A European Corpus of Student Writing: An Exploration of Design Parameters and Tools”

This paper reports on an ongoing exploratory project that aims to specify parameters, requirements and resources for a collaborative, large-scale European project that (i) builds a multilingual corpus of student writing in Dutch, English and other European languages and (ii) develops corresponding applications for the digital analysis and evaluation of written language at word, sentence and discourse levels. In order to reach that aim, the project follows a six-pronged approach: (1) establishing a European project group; (2) defining the parameters of the corpus; (3) reviewing the international research literature on written language development; (4) consulting panels of experts in writing education; (5) testing and adapting available digital tools for the analysis and evaluation of written language; and (6) setting an agenda for tool development. The project proposed is innovative in two respects: it addresses a largely neglected area in writing research and writing education, and it promotes the use, adaptation and development of digital tools for the analysis and evaluation of written language by beginning and intermediate student writers, students whose texts present serious challenges for software that has been developed to deal with mature, professional and error-free writing.

The presentation will focus on the preliminary outcomes of the review (3 above), and on the achievements of work in annotating errors in student writing. Texts of almost 400 Dutch students from grades 8, 9 and 10 in two genres (descriptive and argumentative) have been annotated manually for errors in spelling, punctuation, use of vocabulary, sentence grammar, referential and relational coherence, and topic shifts. From the point of view of writing analytics these annotations are necessary, since they help establish corpora of student writing that are suitable for automatic analysis. The manual annotations also establish a standard for machine learning that is directed toward detection and correction of errors.


Speaker: Kees de Glopper, Professor (Discourse Studies), University of Groningen. Kees de Glopper is full professor at the Center for Language and Cognition Groningen at the University of Groningen, the Netherlands. He teaches courses in literacy, language education and research methodology. His research interests and international (and national) publications deal...


Thursday January 12, 2017 2:45pm - 3:15pm
Davis Hall 130

2:45pm

“Evaluating MOOC Students’ Performance Based on Their Course Engagement Behaviors: An Item Response Theory Approach”

Due to the large enrollment and high dropout rates in MOOCs, it is challenging to provide reliable evaluations of students’ achievement. In this study, we use student engagement behaviors to estimate their performance with the item response theory (IRT) approach. IRT is traditionally used to scale examinee performance on standardized tests by estimating relationships between examinee ability, item properties, and the probability of correctly answering each item. Typically, students’ responses to each test item are used to estimate their performance (i.e., their ability score). Using data from a Geography MOOC offered by Penn State University in 2014, we estimated each student’s overall ability score based on their engagement behaviors. Specifically, students’ engagement activities were treated as dichotomously scored items for IRT calibration. The activities (i.e., items) included posts at discussion forums each week, comments on questions at discussion forums each week, lecture video views, weekly quiz attempts, and participation in peer assessment. Using the R package mirt, we found that a two-parameter (2PL) model performs better than a one-parameter (1PL) model. This indicates that student engagement activities contribute to their ability scores to different extents. The estimated ability scores have good reliability (reliability coefficient > .70) and are well correlated with students’ course achievement levels (i.e., fail, normal pass, pass with distinction) with a Pearson correlation of .65. This provides concurrent validity evidence to support the use of the IRT approach to produce students’ ability scores based on their engagement behaviors. Our proposed application of IRT in MOOC assessment is important as it provides a convenient and powerful approach to make the large amount of MOOC engagement data scalable. It also shows that traditional psychometric models used for standardized tests may be useful and promising in the MOOC context.
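The 2PL model at the heart of this analysis can be sketched in a few lines. This is an illustrative toy, not the study’s actual mirt calibration code; the function name and all parameter values are invented:

```python
import math

def irt_2pl(theta, a, b):
    """Two-parameter logistic (2PL) model: the probability that a student
    with ability theta 'endorses' an item with discrimination a and
    difficulty b. Here an item is an engagement activity, e.g. posting
    to a given week's discussion forum."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Invented parameters for two hypothetical activities:
p_easy = irt_2pl(theta=1.0, a=0.8, b=-0.5)  # easy, weakly discriminating
p_hard = irt_2pl(theta=1.0, a=2.0, b=1.5)   # hard, sharply discriminating
```

A 1PL model constrains all items to share a single discrimination value a; letting a vary per item is what allows different engagement activities to contribute to the ability estimate to different extents, as the abstract reports.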


Speaker: Hongli Li, Assistant professor, Georgia State University.


Thursday January 12, 2017 2:45pm - 3:15pm
Harbor Hall Community Room

3:15pm

“The My Reviewers Corpus”

Speaker: Alex Rudniy


Thursday January 12, 2017 3:15pm - 3:35pm
Davis Hall 130

3:15pm

“I Hear What You're Saying: The Power of Screencasts in Peer-to-Peer Review”

In “Talking about Text: The Use of Recorded Commentary in Response to Student Writing,” Chris Anson (2000) argues that using audio as a tool for mediating feedback on student writing transformed his practice. By recording feedback to students, Anson resolved a disconnect he felt between his teaching style and his evaluative style because the digital tool encouraged him to rely on a more social dimension that included more narrative and less editorial commentary. Most telling was Anson’s assertion that recording made transparent his reading processes—the way he was interpreting the student’s words and making meaning of the student’s text. Anson’s assertion is also supported by Killoran (2013), who found that “recording audio commentary, as opposed to writing text comments, enabled [instructors] to offer more feedback with more detail and more explanation” (p. 37). Although Killoran (2013), like Anson, found that students and teachers had positive experiences with recorded commentary, this method is still not being used widely by teachers in composition classrooms, a finding confirmed by Ferris (2014) in her mixed methods study on teachers’ philosophies and practices toward responding to student writing. While research suggests statistically significant positive student perceptions related to screencast instructor response, results thus far in relation to peer-to-peer screencast response are, at best, mixed. This study extends Anson’s research and insights by reporting findings from a multi-institutional study of 161 writing students designed to examine the impact of screencasts on peer review processes in a traditional introductory composition course, a face-to-face peer tutoring program, and an online tutoring program. This presentation will provide an in-depth analysis of students’ experiences, perceptions, and attitudes toward giving and receiving screencast feedback, focusing on the impact of this method on student revision initiative. 
The presentation will also integrate training materials for those new to screencast technology and analytics.

Speaker: Allison S. Walker, High Point University. A graduate of the University of Alaska Anchorage, Allison S. Walker received her M.F.A. in Creative Writing and Literary Arts in 2004. Her poems have appeared in word river, Two Review, Cold Mountain Review, Apogee, Snapdragon and Convergence Review. Her scholarly work has appeared...


Thursday January 12, 2017 3:15pm - 3:35pm
Harbor Hall Community Room

3:35pm

“The Technical Writing Project: Bringing Corpus Data into the Technical Writing Classroom”

Technical writing service courses have become a mainstay across institutions of higher education; however, the heterogeneous student population that these courses attract leads to generic, prescriptive instruction that often contradicts how students are expected to communicate within their respective fields. Empirical research reveals distinctive linguistic and rhetorical patterns across academic disciplines (Conrad 1996; Ding 2001; Stoller et al. 2005), yet the instructional materials available to teach technical writing often fail to prepare students to write in their discipline (Wolfe 2009).

Corpus linguistics research has established that one way to engage a heterogeneous group of students in learning that meets their personal needs is data-driven learning (Gilquin and Granger 2010; Römer 2012). In order to enable data-driven learning in technical writing classrooms, we compiled the Technical Writing Project (TWP), a corpus of student technical writing. TWP comprises over 5,000 writing assignments from 492 technical writing students from 40 different academic majors. More than 60 different text types, including lab reports, theses, and instruction manuals, are covered in TWP. Each text is annotated with 11 metadata variables, including the writer's gender, major, and academic classification. Pending the funding currently being pursued, TWP will be made publicly available for non-commercial purposes.

In this presentation, we will provide a brief demonstration of how we envisage TWP being used as a tool for teaching technical writing, and illustrate TWP’s application potential for technical writing research (Boettger and Wulff 2014).


Speaker: Ryan Boettger and Stephanie Wulff


Thursday January 12, 2017 3:35pm - 3:55pm
Davis Hall 130

3:35pm

“Predicting Success? Exploring Predictive and In-Course Analytics’ Role in Helping Students Navigate First Year Writing Courses”

This discussion will explore several analytical models developed from local data that attempt to predict student success in a first-year writing program sequence. After explaining the development of these models, which incorporate a variety of internal and external data points, both numeric and textual, from the last several years, I’ll consider whether any indicators readily available prior to the start of each semester can help program administrators assist instructors in more effectively working with students in these courses. Next, I’ll discuss indicators available during the semester and whether any of those data points appear to be reliable metrics of students’ ability to complete each course in the sequence. Finally, I will examine a representative sample of commentary on student writing from students considered “at risk” and students considered “not at risk” prior to the start of each course to see whether commentary to each group is significantly different on benchmark assignments. Those attending this session will leave with strategies for creating similar models at their home institutions.


Speaker: Susan Lang


Thursday January 12, 2017 3:35pm - 3:55pm
Harbor Hall Community Room

3:55pm

Break

Thursday January 12, 2017 3:55pm - 4:10pm
Outside Harbor Hall Community Room

Break

4:10pm

“Statistical and Qualitative Analyses of Students’ Answers to a Constructed Response Test”

We report on a comparative study of the language used by middle school students in their answers to a constructed response test of science inquiry knowledge. Two types of linguistic analysis were compared to investigate their utility in understanding students’ learning of scientific inquiry practices. A statistical method, latent Dirichlet allocation (LDA; Blei, Ng & Jordan, 2003), was used to extract latent profiles from the texts of students’ written responses. These profiles were examined for content and used to characterize students’ answers on the constructed response test. Next, a qualitative method, systemic functional linguistic (SFL) analysis, was used to analyze the text of students’ responses on the same test of science inquiry knowledge. Results of the LDA and SFL analyses were consistent with each other and clearly showed that students learned to use the discipline-specific and academic terminology of the language of scientific communication.


Speaker: Allan Cohen, Seohyun Kim, and Minho Kwak


Thursday January 12, 2017 4:10pm - 4:40pm
Davis Hall 130

Davis: Block

4:10pm

“Turning Core Writing Courses into Corpus Based Courses”

Speaker: Alexander Helberg, David Kaufer, Maria Poznahovska, and Danielle Zawodny Wetzel


Thursday January 12, 2017 4:10pm - 4:40pm
Harbor Hall Community Room

4:40pm

“Understanding Students’ Statistical Thinking through Error Analysis and its Implication for Learning and Teaching Statistics”

Due to the widely recognized importance of statistical knowledge, introductory statistics is a prerequisite for attaining a degree in most disciplines. Yet the difficulty that students, especially those majoring in education studies, experience in introductory statistics courses, and their low achievement in this subject, have been widely reported since the early 1970s. In order to remedy students’ errors, it is imperative to understand their types, patterns, and potential sources. Therefore, diagnosing students’ errors and understanding the reasoning and cognitive processes underlying them were the focus of the current study.

Data were collected from a sample of 81 undergraduate students in an introductory statistics course, the majority of them female (85%), enrolled in education, social science, and psychology programs. A diagnostic achievement test, which included 50 questions (39 multiple choice and 11 short answer), was administered at the end of the first semester. In addition, interviews were conducted with students who committed one or more of the frequent error types.

Error content analysis revealed several types of errors, of which the most prominent were conceptual, such as failing to make a correct distinction between a variable and its values/categories and confusion among types of variables, different measures of central tendency, sampling methods, and threats to internal and external validity. Procedural, arithmetic, partial-information, and careless error patterns were also detected, and possible sources of these types of errors were described. The findings provide important information concerning students’ statistical error patterns and their possible sources. Possessing this knowledge enables teachers to provide students with accurate and useful feedback and to improve their planning and teaching practices.


Speaker: Fadia Nasser Abu Alhija


Thursday January 12, 2017 4:40pm - 5:00pm
Davis Hall 130

Davis: Block

4:40pm

“Screencasts as Writing Feedback: Leveraging Mediasite Analytics to Identify Trends in How Students View Writing Feedback”

Providing writing feedback is laborious and time-intensive for instructors. Similarly, processing feedback and making revisions exert effort—oftentimes overwhelming students. Finding ways to enhance providing and applying feedback would benefit both instructors and students. Recent studies have used technologies to examine writing feedback. A mixed-methods study by Macklin, Jeng, & Triana (2016) looked at students’ revision processes for draft iterations to determine if one type of multimodal feedback was more significant in producing greater revisions, better quality final drafts, and higher grades. In other studies, video screencast feedback has been explored in distance education to add humanistic elements (i.e., recorded synchronous oral feedback that mirrors a face-to-face writing consultation or writing tutorial) (Anson, Dannels, Laboy, & Carneiro, 2016; Collins, 2016). In Edwards et al.’s study (2012), screencast feedback on students’ writing was “received more positively in the richer media of audio-visual screencasting” (p. 95). Though some research in the last decade has shed light on the use of screencasts to provide writing feedback, more research is needed to fully understand this instructional practice’s potential. This practitioner presentation will explore ways of using video screencasts as a means for providing writing feedback to students. Specifically, the presenters will showcase the use of Mediasite as an analytic tool for understanding students’ use of video screencasts to process feedback on their writing. Analytics from Mediasite show where students stop, pause, rewind, review, fast-forward, and disconnect from the video screencast—providing useful insight about what kind of feedback students skim over, gaze at the longest, rewind to view more than once, and come back to review in subsequent viewings of the recording.
Presenters will explore with the audience: how might screencast analytics be used in a wide-scale study to understand students’ use of writing feedback?

Anson, C. M., Dannels, D. P., Laboy, J. I., & Carneiro, L. (2016). Students’ perceptions of oral screencast responses to their writing: Exploring digitally mediated identities. Journal of Business and Technical Communication, 30(3). 378-411.

Collins, M. (2016, October). Connecting with writers in a digital age: Students' perceptions of screencasted feedback. Paper presented at the annual meeting of IWCA, Denver, CO.

Edwards et al. (2012). Screencast feedback for essays on a distance learning MA in professional communication. Journal of Academic Writing, 2(1), 95-126.


Speaker: Amber Lancaster, Dissertation Specialist and Writing Coach, Texas Tech University. Dr. Amber Lancaster has taught writing classes since 1998. At Texas Tech University, she served as the Assistant Director of Composition for two academic years before taking on the position of Assistant Director of Online Graduate Studies and now Dissertation Specialist and Writing...

Speaker: Lesley Shelton


Thursday January 12, 2017 4:40pm - 5:00pm
Harbor Hall Community Room

5:00pm

Reception at The Hangar

540 1st St SE

St. Petersburg, FL 33701
 

Appetizers include:

  • Garbanzo-tahini hummus with Parmesan crostini and fresh vegetables
  • Crispy mushroom ravioli with creamy truffle-Parmesan sauce
  • Grilled pesto chicken skewers with balsamic reduction
  • Smoked Gouda stuffed meatballs with house-made tomato sauce

Thursday January 12, 2017 5:00pm - 11:30pm
TBA

 

Friday, January 13

8:00am

Editorial Board Meeting, Journal of Writing Analytics

Everyone is welcome to attend to learn about the journal. As we intend to be interdisciplinary, we invite participation from across the disciplines.

Friday January 13, 2017 8:00am - 9:00am
Harbor Hall Community Room

8:30am

Continental breakfast and coffee

Continental breakfast and coffee includes scones, bagels, seasonal fresh fruit tray, fresh brewed coffee and hot tea, fruit infused iced water and orange juice.

Friday January 13, 2017 8:30am - 9:15am
TBA

9:15am

“Construct Specific Measurement and Massive Data Analysis: Foundational Challenges”

In educational settings involving writing in digital environments, effective tools are only now emerging for creative implementation and longitudinal maintenance of real-time systems providing actionable analytics. At each inference drawn about student performance, questions therefore arise regarding data quality. What, we may justifiably ask, can twentieth-century lessons learned from construct-specific measurement add to the twenty-first-century complexities of big data analysis?

Drawing from a range of experiences in automated and human writing assessment, my presentation will focus on evidential categories of fairness, validity, and reliability as they are related to evidence-centered design (Mislevy, Steinberg, Almond, & Lukas, 2006) and design for assessment (White, Elliot, & Peckham, 2015). These two frameworks will be used to illustrate the significance of construct modeling in its relationship to frameworks of evidence. Special attention will be paid to the significance of data disaggregation, general linear modeling, and their relationships to evidence of fairness.


Speaker: Norbert Elliot, Research Faculty, University of South Florida. A specialist in writing assessment, Norbert Elliot is Research Professor at the University of South Florida. He is presently on the editorial boards of Assessing Writing, IEEE Transactions in Professional Communication, Research in the Teaching of English, and WPA: Journal of...


Friday January 13, 2017 9:15am - 9:35am
Harbor Hall Community Room

Featured Speaker

9:35am

“Discussion Forum and Interaction Data for Predicting Success from MOOCs”

In this talk, I’ll discuss my group’s work to study the relationship between engagement and success in online learning in massive open online courses (MOOCs). We look not only at the now-standard metric of course completion but also at participation in the community of practice after completing the course. We will examine variables related to discussion forum participation and interaction with both other students and course materials as factors predictive of student achievement. I will discuss the relevance of forum behaviors such as negativity, abusiveness toward instructors and other students, and forum lurking in understanding student success and failure, as well as our work to analyze how the linguistic properties of text relate to student success and failure.

 


Speaker: Ryan Baker, Director of the Penn Center for Learning Analytics, University of Pennsylvania. Ryan Baker is Associate Professor at the University of Pennsylvania, and Director of the Penn Center for Learning Analytics. His lab conducts research on engagement and robust learning within online and blended learning, seeking to find actionable indicators...


Friday January 13, 2017 9:35am - 9:55am
Harbor Hall Community Room

Featured Speaker

9:55am

“More Useful Inter-rater Statistics”

Understanding the reliability of assessments is critical to improving them. In this session we demo a new way to look at inter-rater agreement that provides much more detail than existing single-parameter methods. This is a practical demonstration using real data. The theory and a free application to perform the calculations are available online. The method should be immediately useful to those who want to reanalyze data from (for example) rubric ratings.
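For context, the single-parameter statistics the session moves beyond, such as Cohen’s kappa, collapse the entire rater-by-rater agreement table into one chance-corrected number. Here is a minimal sketch of standard kappa with invented rubric ratings (this is not the new method being demoed):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Chance-corrected agreement between two raters over the same items.
    Collapses the full agreement table into a single number -- exactly the
    loss of detail that finer-grained inter-rater methods try to recover."""
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n ** 2
    return (observed - expected) / (1 - expected)

# Two raters scoring five essays on a 1-3 rubric (invented data):
kappa = cohens_kappa([1, 2, 3, 1, 2], [1, 2, 3, 2, 2])
```

Note that two rater pairs with the same kappa can disagree in very different ways (e.g., always on the same rubric category versus scattered across categories), which is the kind of detail a per-category reanalysis exposes.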

Speaker: Dave Eubanks, Assistant Vice President, Furman University. David Eubanks is the Assistant Vice President for Assessment and Institutional Effectiveness at Furman University, a private liberal arts university. His research focuses on learning outcomes, assessment and prediction, analysis of other kinds of educational indices like retention...


Friday January 13, 2017 9:55am - 10:15am
Harbor Hall Community Room

Featured Speaker

10:15am

Snack Break

Assorted granola bars, whole fruit, coffee and soft drinks.

Friday January 13, 2017 10:15am - 10:45am
TBA

10:45am

“Analytics in Large-scale Implementations of Formative Writing Systems: Practical Implications”

Student writing in digital educational environments provides a wealth of information about the processes involved in learning to write, the processes involved in acquiring domain knowledge, and evidence for the impact of the digital environment on those processes. Developing these skills depends heavily on students having opportunities to practice, particularly when they are supported with frequent feedback and taught strategies for planning, revising and editing their compositions. Formative systems incorporating automated writing scoring provide opportunities for students to write, receive feedback, and then revise essays in a timely, iterative cycle. The analytics can further provide a means for instructors and administrators to track changes in performance, at an individual student level and at a class, school, state, or national level.

This talk will describe the development, implementation, deployment and maintenance of some commercial systems used in K-12 and higher education and used by millions of students. The talk will focus on the crucial role data mining can play in providing empirical support for continual improvement in writing instruction through validating and challenging pedagogical writing theory and verifying that the features of the system positively impact learning. While some of these issues may be evident in small-scale implementations, when a system is deployed across large, diverse student populations the power of learning analytics applied to log data can be critical to understanding student performance and validating design decisions to improve learning outcomes. The talk will further describe some of the implications and lessons learned from moving from pedagogical principles to research studies to practical large-scale implementations.


Speaker: Peter W. Foltz, Vice President/Adjoint Professor, Pearson/Univ. of Colorado. Dr. Peter Foltz is a Vice President for Research in Pearson's Advanced Computing and Data Science Lab. He is also Professor, Adjoint at the University of Colorado’s Institute of Cognitive Science. Dr. Foltz’s research covers language comprehension, 21st Century skills learning...


Friday January 13, 2017 10:45am - 11:05am
Davis Hall 130

Davis: Block

10:45am

“Mining for Transfer: Using MyReviewers Data to Address Evidence of Writing Transfer in Upper-Level STEM Courses”

In this practitioner presentation, the speaker will demonstrate the use of MyReviewers as a data-rich platform that can be mined for evidence of transfer. Specifically, the speaker will detail a study in which MyReviewers data (peer review comments) is analyzed for evidence that upper-level students in STEM courses have retained the knowledge or best practices associated with peer review that they presumably learned in First-Year Writing courses. As described by Anson & Moore (2016), “writers consistently draw on prior knowledge in order to navigate within and among various contexts for writing and learning.” The MyReviewers platform is a rich environment in which to examine this transfer of knowledge to new contexts. In this presentation, the speaker will explore how data in MyReviewers can potentially serve as evidence that the knowledge gained in First-Year Writing courses has transferred to different writing contexts later in students’ academic careers.

In order to address evidence of peer review, the speaker developed a coding scheme based on the scheme reported in Straub’s “Responding, Really Responding to Other Student Writers” (1999) as it is a seminal piece on the instruction of peer review. Using the coding scheme, the comments were analyzed for tone and focus, as well as Straub’s principles such as “offer specific advice,” “temper your criticisms,” among others. The speaker will explore the results of the coded data, provide a discussion of the results, and offer future implications of the research.


Speaker: Kendra L. Andrews


Friday January 13, 2017 10:45am - 11:05am
Harbor Hall Community Room

11:05am

“Teaching Big Data Storytelling as Empowerment: A Case of How to Turn English Students into Data-Proficient Storytellers”

Teaching a data-driven journalism course to English majors can be a curse or a blessing, or a little bit of both. The presentation will reflect upon the presenter's experience in teaching "Big Data Storytelling" under a renovated journalism curriculum in a liberal arts college. The presentation will be three-fold. It will first discuss learning goals/outcomes and describe the pedagogical preparation necessary to embark on the journey. Second, the presenter will present case studies of students' writing throughout the course and trace patterns of learning and growth. Third, a discussion will demonstrate various pedagogical strategies that have contributed to a successful learning trajectory, from rejection to acceptance, from curiosity to excitement, and from hesitation to embracement.

The presenter will share a variety of tools, resources, assignments and assessment instruments used in a comprehensive training package including (but not limited to) the following topics:

  • data hunting with training in FOIA
  • data scraping
  • data cleaning
  • fundamental statistics with Excel
  • introductory inferential statistics
  • data pivoting and unpivoting
  • exploratory data analysis and the art of interview
  • relational database and SQL with SQLite
  • principles of information visualization with Tableau
  • ethics of data-driven storytelling

In the end, the presenter will share tips on what instructors of similar courses can do to stay ahead of the game. The course may also lend instructors the opportunity to grow their scholarly research and teaching in a symbiotic way.


Speaker: Lei (Tommy) Xie


Friday January 13, 2017 11:05am - 11:25am
Davis Hall 130

Davis: Block

11:05am

“Mining Students’ Constructed Response Answers”

Constructed response (CR) items are becoming increasingly prevalent in educational tests. Scoring CR responses as simply right or wrong, or even awarding partial credit, can overlook important information available in students' answers. LDA has been used to analyze texts in many areas, including abstracts of a science journal, medical texts, and Twitter messages (Griffiths & Steyvers, 2004; Phan et al., 2008; Paul & Dredze, 2011). In this study, LDA was used to analyze the text of students' CR answers on a middle grades test of science inquiry knowledge. The objective was to investigate the utility of LDA for explaining pre-test to post-test changes resulting from an instructional intervention.

LDA has been proposed for analyzing text corpora (Blei et al., 2003). A document is assumed to be generated by a mixture of profiles, with each word in a document generated from a single profile. Each profile is assumed to follow a multinomial distribution, such that words are drawn from the multinomial distributions corresponding to their profiles.

In the model selection procedure, the R package topicmodels (Grün & Hornik, 2011) was used to fit the data. The posterior estimate for each model was obtained using Gibbs sampling, with a burn-in of 5,000 iterations and a post-burn-in of 20,000 iterations based on convergence of the algorithm. The results suggested that a three-topic model fit best for both pre-test and post-test data. The three profiles consisted of discipline-specific words, everyday language, and general academic words. The profile of discipline-specific words was strongly positively correlated with scores on the CR items. Results using LDA will be shown to provide useful information about how students learn and construct their responses.
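
As a rough illustration of the collapsed Gibbs sampling that underlies this kind of estimation (this toy sampler is not the topicmodels implementation, and the corpus is invented):

```python
import random
from collections import defaultdict

def gibbs_lda(docs, n_topics, n_vocab, iters=200, alpha=0.1, beta=0.01, seed=1):
    """Collapsed Gibbs sampler for LDA over docs given as lists of word ids."""
    rng = random.Random(seed)
    z = [[rng.randrange(n_topics) for _ in doc] for doc in docs]  # topic of each token
    ndk = [[0] * n_topics for _ in docs]                          # doc-topic counts
    nkw = [defaultdict(int) for _ in range(n_topics)]             # topic-word counts
    nk = [0] * n_topics                                           # tokens per topic
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]
            ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    for _ in range(iters):                                        # burn-in + sampling
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]                                       # remove token's counts
                ndk[d][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                weights = [(ndk[d][t] + alpha) * (nkw[t][w] + beta) / (nk[t] + n_vocab * beta)
                           for t in range(n_topics)]
                k = rng.choices(range(n_topics), weights=weights)[0]
                z[d][i] = k                                       # reassign, restore counts
                ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    return ndk, nkw

# Toy corpus: word ids 0-2 cluster in the first docs, 3-5 in the others.
docs = [[0, 1, 2, 0, 1], [1, 2, 0, 2], [3, 4, 5, 3], [4, 5, 3, 5, 4]]
ndk, nkw = gibbs_lda(docs, n_topics=2, n_vocab=6)
```

The returned doc-topic counts (ndk) play the role of the per-document profile mixtures described above.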


Speakers: Allan Cohen, Seohyun Kim, and Minho Kwak


Friday January 13, 2017 11:05am - 11:35am
Harbor Hall Community Room

11:25am

“Framing Arguments: Corpus Driven Study of Students’ Rhetorical Moves”

Following Zak Lancaster’s recent College Composition and Communication article “Do Academics Really Write This Way?,” I examine the rhetorical moves of “entertaining objections” and “making concessions” as described in Gerald Graff and Cathy Birkenstein’s composition textbook They Say/I Say: The Moves That Matter in Academic Writing. My findings from a study of a 1,465,091-word corpus of research essays written by students at the City College of New York in the fall of 2015 were strikingly similar to Lancaster’s: students were much more likely to entertain objections and make concessions using words and phrases that are not found in They Say/I Say. In addition, I discovered that our students are more likely to entertain objections in their essays, and less likely to make concessions or offer counterarguments, than the writers represented in the three corpora in Lancaster’s study. The results of the study have provided direction for curriculum and faculty development for the spring 2017 semester.
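
A minimal sketch of the kind of marker counting such a corpus study involves; the phrase lists and sample text below are illustrative inventions, not Lancaster's or the author's actual categories.

```python
import re
from collections import Counter

# Illustrative markers of entertaining objections vs. making concessions.
OBJECTION_MARKERS = ["some may argue", "critics claim", "one might object"]
CONCESSION_MARKERS = ["admittedly", "granted", "of course", "it is true that"]

def count_moves(text, markers):
    """Count case-insensitive occurrences of each marker phrase in a text."""
    low = text.lower()
    return Counter({m: len(re.findall(re.escape(m), low)) for m in markers})

essay = ("Some may argue that tuition should rise. Admittedly, costs have grown. "
         "Granted, budgets are tight, but critics claim otherwise.")

objections = count_moves(essay, OBJECTION_MARKERS)
concessions = count_moves(essay, CONCESSION_MARKERS)
print(sum(objections.values()), sum(concessions.values()))
```

Over a multi-million-word corpus, such counts are typically normalized (e.g., per million words) before comparison across corpora.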


Speaker: Tom Peele


Friday January 13, 2017 11:25am - 11:45am
Davis Hall 130

12:00pm

Lunch Break

Lunch buffet includes a salad, entrée selection, dessert and a beverage.

Friday January 13, 2017 12:00pm - 1:30pm
TBA

 

1:30pm

“Summative and Formative Feedback Using Natural Language Processing Tools”

Research demonstrates that sustained writing practice that includes both summative feedback (i.e., an overall score) and formative feedback (i.e., suggestions on how to revise portions of a text to improve it) can improve writing skills (Duke & Pearson, 2002; Graham & Harris, 2013; Kellogg & Raulerson, 2007; NRP, 2000; Snow et al., 1998). Feedback alone is not sufficient because the feedback students receive must be individualized (Graham & Harris, 2013; Kellogg & Raulerson, 2007), support writing development through multiple rounds of revision (Graham & Harris, 2013), and motivate students through the revision process (Beach & Friedrich, 2006; Ferris, 2003). However, providing both summative and formative feedback is difficult in terms of time and cost (Higgins, Xi, Zechner, & Williamson, 2011). Thus, teachers, administrators, and researchers are interested in the use of automatic feedback systems to provide students with actionable feedback both in and outside of the classroom.

This presentation provides an overview of feedback provided within the Writing Pal (W-Pal), an intelligent tutoring system designed to provide writing strategy instruction to high school and entering college students (Crossley et al., 2016; McNamara et al., 2013). W-Pal allows students to compose essays, and, using an automatic writing evaluation (AWE) system, provides automated summative and formative feedback to users based upon their natural language input. This presentation provides details about the W-Pal feedback system and its effectiveness. It also introduces new data collected from high school students under two experimental conditions wherein students had control of feedback and/or the system controlled the amount of feedback provided. Results from this analysis along with previous findings will be discussed. Implications for developing effective feedback mechanisms in tutoring systems will also be discussed.


Speaker: Scott Crossley, Associate Professor, Georgia State University. Dr. Scott Crossley is an Associate Professor of Applied Linguistics at Georgia State University. Professor Crossley’s primary research focus is on natural language processing and the application of computational tools and machine learning algorithms in language learning, writing... Read More -›


Friday January 13, 2017 1:30pm - 2:00pm
Harbor Hall Community Room

Featured Speaker

2:00pm

“Patterned Metadiscourse in Academic Genres and Disciplines: Corpus Analysis and Implications for Assessment”

Existing corpus-based research on student writing reveals patterns across texts and contexts that we cannot otherwise detect, including how lexical and grammatical patterns interact and influence student success. This talk suggests that an important next step is research that explores the interaction between textual patterns, on the one hand, and assessment design, on the other, so that moving forward, writing analytics will help expose the relationship between the discursive profile of texts and their corresponding assignment parameters. As an example, the talk describes corpus-based analysis of connections between metadiscursive patterns and the genres and rhetorical cues of writing assessments.


Speaker: Laura Aull, Wake Forest University. Laura L. Aull is an Assistant Professor of English and Linguistics at Wake Forest University. Her research focuses on rhetorical and corpus linguistic analysis of academic and popular writing and can be found in Written Communication, Assessing Writing, Corpora, College Composition... Read More -›


Friday January 13, 2017 2:00pm - 2:30pm
Harbor Hall Community Room

Featured Speaker

2:30pm

“What Big Data Can Tell Us About Student Writing (and What It Can’t)”

Writing scholars have tended to favor descriptive and interpretive forms of inquiry, especially of individuals or limited contexts, over large-scale quantitative and statistical research. But now, the concept of “big data” is everywhere. Although not without controversy (see Gold, 2012), writing scholars are increasingly accepting of the possibility that big data can tell us important things about writing that can be applied instructionally. Using two studies of similar peer-review contexts, I will demonstrate the potential of big data to generate instructionally relevant insights about student writing while also questioning its utility for certain kinds of finer-grained analysis. The demonstration will raise some questions about the future of big data in writing studies, including its potential role in instructional intervention and in the broadening of scholarship in the field.

Gold, Matthew K. (ed.). Debates in the Digital Humanities. Minneapolis: U of Minnesota P, 2012.


Speaker: Chris Anson, North Carolina State University. Chris M. Anson is Distinguished University Professor and the Director of the Campus Writing and Speaking Program at North Carolina State University, where he teaches graduate and undergraduate courses in language, composition, and literacy and helps faculty across the university... Read More -›


Friday January 13, 2017 2:30pm - 2:50pm
Harbor Hall Community Room

Featured Speaker

3:10pm

“Revisiting Kristeva’s ‘The Ethics of [Corpus] Linguistics’”

In their introduction to a special issue on human rights and professional communication, Sapp, Savage, and Mattson (2013) establish that “the issue of human rights has not yet emerged as a consistent thread in professional communication scholarship: but over the past decade the literature has addressed themes related to the larger issues of human rights” (p. 1). As technical and professional communication (TPC) grows in its awareness of these larger issues, a questioning of methods has begun to emerge as well, asking “which humans are at the center of our work” (Walton, 2016, p. 401).

Several research methods exist that easily accommodate this rising awareness of human dignity and human rights in TPC. With writing analytics, corpus linguistics, and big data as emerging methods in TPC (Graham, Kim, DeVasto, & Keith, 2015), scholars must make a strong case for their use in social justice research where humans are at the center. Traditional humanities scholars will be quick to point out the loss of human dignity where the human seems lost in data sets, algorithms, and coding.

Kristeva (1980) argued “that formulating the problem of linguistic ethics means, above all, compelling linguistics to change its object of study” (p. 24). Other research methods can draw attention to the individual via narrative. Certain human narratives, however, tend to dominate the discourse, marginalizing other voices in the process. If we imagine a corpus linguistics whose object of study is the dominant narrative itself, we can disrupt that narrative. If we are to answer the call from Jones, Moore, and Walton (2016) to disrupt the past in order to disrupt the future through antenarratives, we must first thoroughly understand the dominant narrative. There is no better tool for disrupting it than corpus linguistics.

References

  • Jones, N. N., Moore, K. R., & Walton, R. (2016). Disrupting the past to disrupt the future: An antenarrative of technical communication. Technical Communication Quarterly, 25(4), 211-229.
  • Kristeva, J. (1980). Desire in language: A semiotic approach to literature and art (T. Gora, A. Jardine, & L. S. Roudiez, Trans.; L. S. Roudiez, Ed.). New York: Columbia University Press.
  • Graham, S. S., Kim, S. Y., DeVasto, D. M., & Keith, W. (2015). Statistical genre analysis: Toward big data methodologies in technical communication. Technical Communication Quarterly, 24(1), 70-104.
  • Sapp, D. A., Savage, G., & Mattson, K. (2013). After the International Bill of Human Rights (IBHR): Introduction to special issue on human rights and professional communication. Rhetoric, Professional Communication, and Globalization, 14(1).
  • Walton, R. (2016). Supporting human dignity and human rights: A call to adopt the first principle of human-centered design. Journal of Technical Writing and Communication.

Speaker: Eric James Stephens, Clemson University. Eric is a PhD student in the Rhetorics, Communication, and Information Design (RCID) program at Clemson University. In addition to pedagogy, his research interests include issues of social justice, schizoanalysis, popular culture, and big data to better understand power relationships... Read More -›


Friday January 13, 2017 3:10pm - 3:30pm
Davis Hall 130

3:10pm

“Analyzing Student Voices in Peer Review”

Speaker: Ashley Wilson


Friday January 13, 2017 3:10pm - 3:30pm
Harbor Hall Community Room

3:30pm

“‘Don’t Demonize Nuclear Energy!’: A Case Study of Rhetorical Strategies on Chinese Social Media and its Implications for Teaching of ESL Writing”

“The Chinese write very indirectly” is a commonly accepted notion among many Westerners, including educators and researchers (Kaplan, 1966; Hall, 1976; Hofstede & Hofstede, 2005). Seeing indirectness as the preferred rhetorical approach in East Asian students’ writing has had a significant influence on second-language writing pedagogy.

This presentation proposes a new perspective on directness/indirectness in Chinese rhetoric by examining social media postings through the lens of rhetorical strategies. The study focuses on a single Sina Weibo (the Chinese Twitter) post and the 1,246 comments beneath it. The post, written by a verified nuclear physics expert, was published amid massive protests stirred by news that a “nuclear waste processing plant” would be built in a coastal city in China. It presented an introductory article on nuclear fuel cycling, the correct term for “nuclear waste,” and received 1,246 comments from a well-educated online readership.

Data analysis consists of three phases:

  1. Synthesizes nine rhetorical strategies from current literature: reasoning pattern, paragraph organization, naming strategy, request strategy, informing strategy, statement of criticism, quotation strategy, “speaking out” strategy, and correcting strategy.
  2. Categorizes each comment according to the nine strategies.
  3. Calculates the volume and percentage of each strategy.
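
Phases 2 and 3 amount to tallying strategy labels and normalizing them; a minimal sketch in Python, with invented labels standing in for the nine strategies:

```python
from collections import Counter

# Hypothetical phase-2 output: one strategy label assigned per comment.
coded = ["reasoning", "naming", "reasoning", "request", "reasoning", "criticism"]

counts = Counter(coded)                                        # volume of each strategy
total = sum(counts.values())
percentages = {s: 100 * n / total for s, n in counts.items()}  # phase 3
print(counts["reasoning"], round(percentages["reasoning"], 1))
```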

Research findings include:

  1. Directness is more commonly used in comments.
  2. One reasoning pattern, the deductive approach, is the most frequently used strategy.
  3. Users tend to use direct strategies while requesting, informing, criticizing, and quoting.

As a large portion of international students in the US are social media users, the researcher believes the findings illuminate the writing and communication strategies adopted by second-language writers. The current perception that Chinese students prefer indirect writing strategies may not reflect the changes in Eastern rhetoric under the influence of technology, new media, and globalization.


Speaker: Yunye Yu


Friday January 13, 2017 3:30pm - 3:50pm
Davis Hall 130

3:30pm

“Examining Web-based Peer Review in General Chemistry Students: Improving Cognitive, Interpersonal, and Intrapersonal Competencies”

Speakers: Laura Anderson, Jhon Figueroa, and Ushiri Kulatunga


Friday January 13, 2017 3:30pm - 3:50pm
Harbor Hall Community Room

4:00pm

Afternoon snack break

Vegetable crudités with hummus dip and pita chips, spanakopita, homemade banana bread, hot tea and coffee, assorted sodas and water

Friday January 13, 2017 4:00pm - 4:15pm
TBA

 

4:15pm

“Why Has English Become the Language for all the Written Materials at the International Level?”

English has become the language for most written materials at the international level because it is a lingua franca: a universal means of communication for speakers of different first languages. Its use arises chiefly in intercultural communication rather than in formal native-speaker contexts. Although lingua francas have been employed for centuries, English as a lingua franca is a novel phenomenon in its functionality and geographical reach. It is strongly dependent on the particular stances of quotidian use and is involved in interactions that focus on function rather than form: communicative adeptness matters more than linguistic accuracy. As a result of this inclination, interactions in English are very often hybrid. Speakers acclimatize to each other’s cultural backgrounds and may also code-switch into other languages that they know.

Based on the Vienna-Oxford International Corpus of English (VOICE), certain lexicogrammatical features of English as a lingua franca have been singled out. However, these features are by no means invariant or ‘obligatory’; rather, when they do occur, they do not seem to hinder communication in English-language settings. While some researchers hold that English as a lingua franca is a neutral and culture-free tool, others believe that it carries the culture and language of its speakers. Recent linguistic discussions by English-as-a-second-language experts treat the interactants’ cultural and linguistic backgrounds as factors affecting language performance. For Hülmbauer, for instance, ‘it seems likely that the ELF users develop their own markers of identity (be they a common “European” or “international” nature or more individual ones which are created online, depending on the community of practice they are emerging).’ In this regard, English is a multicultural rather than culture-free tool for written materials universally.


Speaker: Tamer Osman


Friday January 13, 2017 4:15pm - 4:35pm
Davis Hall 130

Davis: Block

4:15pm

“Using Writing Analytics to Determine the Prevalence of ADHD with WLD in Students”

Attention Deficit Hyperactivity Disorder (ADHD) is a mental health disorder. People diagnosed with ADHD are often inattentive (they have difficulty focusing on a task for a considerable period), overly impulsive (they make rash decisions), and hyperactive (they move excessively, often at inappropriate times). ADHD is often diagnosed through psychiatric assessments with additional input from physical/neurological evaluations.

Written Language Disorder (WLD) is a learning disorder. People diagnosed with WLD often make multiple spelling, grammar, and punctuation mistakes, write sentences that lack cohesion and topic flow, and have trouble completing written assignments. WLD is often diagnosed through psychoeducational assessments, with the additional evaluations mentioned for ADHD above.

Previous research has shown a link between ADHD and WLD: students with ADHD have an increased risk of having a WLD, and WLD rarely occurs without ADHD or another mental health disorder. To measure the prevalence of ADHD and WLD in a student, this research paper focuses on creating an integrated computational model that combines the outcomes of common screening tools for ADHD (physical and behavioural questionnaires, adult self-reporting scales, and reaction-based continuous performance tasks (CPTs)) with written performance tasks as a measure of WLD.

These combined outcomes are fed to a neural network, which learns to estimate prevalence and to adjust the estimate based on the screening information. The model is then tested with data from students with ADHD to validate the prevalence of ADHD and measure the prevalence of WLD; a typically developing group is tested as a control. The results show that students with ADHD have a higher prevalence of WLD than the control students, demonstrating the link between ADHD and WLD.
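
One minimal way to picture "combining screening outcomes into a single prevalence estimate" is a logistic combination of normalized scores; the weights, inputs, and bias below are entirely hypothetical, not the study's trained network.

```python
import math

def combined_prevalence(scores, weights, bias=0.0):
    """Squash a weighted sum of normalized screening scores into (0, 1)."""
    z = bias + sum(w * s for w, s in zip(weights, scores))
    return 1 / (1 + math.exp(-z))

# Hypothetical normalized outcomes: questionnaire, self-report scale, CPT,
# and written performance task (all scaled to 0-1).
scores = [0.8, 0.7, 0.9, 0.6]
weights = [1.5, 1.0, 2.0, 2.5]   # invented weights; a real model learns these
p = combined_prevalence(scores, weights, bias=-3.0)
print(round(p, 3))
```

A trained neural network replaces this single fixed-weight unit with layers of such units whose weights are learned from the screening data.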


Speaker: Diane Mitchnick


Friday January 13, 2017 4:15pm - 4:35pm
Harbor Hall Community Room