Abstract

Best practices in online writing instruction (OWI) have been a concern for more than a decade, yet students’ voices have not played a major role in OWI research projects to date. This article reports on data from a U.S.-based national online student survey conducted in 2017. When viewed comprehensively, the survey data revealed that students valued instructor expertise and feedback; however, they did not know how the work in online writing courses helped them improve their writing.

For the fourteenth straight year, student enrollment in online classes in higher education has increased, and recent reports indicate that 32% of all students are taking at least one online class (Seaman, Allen, & Seaman, 2018). This national growth in online enrollments aligns with growth in online courses and programs in composition and technical and professional communication (TPC). For example, as of 2012, 15% of all TPC degree programs were offered fully online (Melonçon, 2012) and 21% of service courses were offered fully online or in a hybrid format (Melonçon, 2009).

These numbers strongly indicate that educators must deliver effective online instruction. Best—or, more accurately, effective—practices in online writing instruction (OWI) have been a concern for writing studies for more than a decade as evidenced by the work of the Conference on College Composition and Communication (CCCC) Committee on Computers in Composition and Communication and the CCCC Committee for Effective Practices in Online Writing Instruction (hereafter called the OWI Committee). The OWI Committee, constituted in March 2007 (Hewett & Depew, 2015), was charged with developing a position statement of effective practices for OWI. After several years of research into the issues, the OWI Committee (2013) composed A Position Statement of Principles and Examples of Effective Practices in Online Writing Instruction (hereafter called the OWI Principles document) that described 15 grounding principles for ethical and effective teaching of postsecondary-level writing in online learning.

These OWI principles have been key to new research into OWI because they provide a solid ground for examining both theoretical and practical challenges of OWI pedagogy and administration. They offer a position from which scholars could agree and differ, thus enriching the research and educational understanding of OWI’s possible effects on student writing. However, despite its strengths, the OWI Principles document, like much of the research surrounding instruction that occurs in online environments, failed to address student needs head-on. Indeed, along with attention to pedagogical practice and growth in courses and programs, OWI researchers have responded with vast and diverse research studies and articles that recently were collected to form the Bedford Bibliography of Research in Online Writing Instruction (2017). Nonetheless, despite the research advances that these publications represent, online literacy educators still do not know much about student needs from the student point of view. With most of the research engaging instructional perspectives, students’ stated needs regarding their experiences in online writing courses (OWCs) remain woefully underexplored; exceptions include research published by Patricia Webb Boyd (2008), Beth Brunk-Chavez and Shawn J. Miller (2006), Jennifer M. Cunningham (2015), Angela Eaton (2005, 2013), and Scott Warnock (2018). This report of a national survey of OWI students is the result of the OWI Committee’s attempt to address this gap in published literature.

Background of the OWI Student Survey’s Construction

In 2014, the OWI Committee conducted an informal OWI-focused literature review, which indicated that no published reports existed of any national surveys dedicated to learning about students’ experiences in OWI. To address this gap in OWI research, in 2015, the OWI Committee tasked a student-survey working group to develop a U.S.-based national online survey for students in OWCs.

The student-survey working group settled on a survey designed to gather primarily quantitative data, using only a few subjective, open-ended questions. The working group decided to keep the survey as short as possible to encourage students to complete it fully, given research suggesting that respondents may abandon long surveys (Chudoba, n.d.). Given the ultimate intention of obtaining a generalizable sample of U.S. OWI students, the student-survey working group sought as large and as representative a student sample as possible.

The pilot student survey focused on student experiences regarding preparation for, access to, and learning in OWCs. Reading and alphabetic writing are, at a minimum, core components of succeeding in online courses, and the working group was especially interested in learning directly from students which pedagogical components of OWCs they considered helpful in improving their writing. Development of the survey was guided by the following research questions:

How are students prepared for, or oriented to, their online writing courses specifically?

How do students typically access their online writing courses?

What components of online writing classes do students find most helpful in improving their writing?

What components of online writing classes do students find least helpful in improving their writing?

Testing the survey as a pilot was critical to refining it for the broadest participant base. To ensure a strong survey instrument, the student-survey working group conducted a pilot of the survey in the fall of 2015 to test question construction and survey length. The first iteration was piloted with students taking classes from online instructors associated either directly or indirectly with the OWI Committee. When the OWI Committee reported its general results at the CCCC conference in 2016, audience responses revealed ways to refine the instrument to engage student voices more fully about their OWI experiences. During that same conference, the OWI Committee received word that it had not been reconstituted; despite this unexpected news, the student-survey working group was committed to continuing the work begun with the pilot survey. They subsequently partnered with Macmillan Learning and the newly constituted Global Society of Online Literacy Educators (GSOLE), which stepped up as the sponsor of this work. A second iteration of the survey was created from lessons learned in the pilot, and the working group requested feedback from online technical writing students at Louisiana Tech University in June and July 2017. Further revisions were made to the survey based on the second pilot, and the national survey was distributed in September 2017.

Western Carolina University is the IRB of record for the survey. The survey was designed in SurveyGizmo by Macmillan Learning, and it was intended for students only in fully online writing courses—defined on the survey as “a class that is delivered solely online, meaning your class does not meet in a face-to-face setting. All work and class participation is online”—and hybrid writing courses—defined on the survey as “a class that is delivered online and sometimes meets in a face-to-face setting (maybe once a week, once a month, or once or twice semester).” Student responses were anonymous. The student-survey working group used both a convenience sample (the survey was advertised on listservs and relevant social media) and a purposeful sample (garnered from specific lists of online writing instructors that Macmillan Learning made available). The survey was distributed during the week of September 18, 2017, and remained open until December 31, 2017; three reminder emails were sent during that time. Because the student-survey working group did not have access to students, the recruitment process involved two layers of participant requests: the working group asked instructors who were teaching online to ask their students to complete the survey. There is no exact count of how many people received the email, and there is no way to know who distributed the survey link to students, so calculating a rate of return is impossible. Data were collected and stored in a SurveyGizmo account owned by Macmillan Learning and monitored by IRB-approved members of the survey group.

1. The authors of this article were members of this student-survey working group.

2. GSOLE was founded on the premise that all online courses, and particularly OWCs, engage literacy. Minimally, online literacy involves reading, alphabetic writing, and multimodal composition skills in digital settings. As Beth L. Hewett (2015) indicated, these three components are necessary in any online class. Minimally, all coursework requires online teachers to write clear instructional text—developed with the online setting in mind—and for students to read and compose from such text, making reading an especially important skill. From the most basic course information (course descriptions, syllabi, and assignments, as well as feedback) to lengthier and more complex required course readings, students must (re)learn and engage sophisticated reading skills to do their best work in online settings (Hewett, 2015). Additionally, in contemporary online courses, students are learning and using both traditional alphabetic writing and multimodal composition because these are core literacies for writing in postsecondary education and—more globally—for workplace communication and documents.

Limitations of the Survey

Survey creation is a rhetorical act that must consider and balance the research questions with the audience and the selected research method (Rife, 2013). This important aspect of survey development is both a strength and a limitation; no survey will provide comprehensive data on any subject, and this OWI student survey is no different. One limitation of surveys is that they rely on self-reported data, which can be incomplete and unreliable (Paulhus & Vazire, 2007). Those who complete surveys tend to self-select into a study for a variety of reasons that typically bias their responses. In the case of student-respondent data, even when students are assured that the instructor has no access to the responses, they may have believed they were required to complete the survey or to do so using positive responses, which is a version of context effect (Lavrakas, 2008). Some of the data about pedagogical approaches can be interpreted in several ways, particularly when read against other research in OWI (see, for example, Anson & Anson (2017) regarding results from student responses to discussion boards). Even with the potential self-reporting dilemma, surveys remain a valuable method for acquiring responses from wide, diverse populations (Murphy, 2002). The data in this survey were limited because they were garnered primarily from students at four-year institutions. There also was a recruitment bias toward TPC, which likely means there would be differences in responses from first-year writing students; we surmise that TPC students would have different expectations of their OWCs than students in a required first-year writing course that happened to be an OWC. Further, no statistical comparison was made between responses from students in hybrid versus fully online courses. The two terms were defined on the survey, and students were asked to select the course in which they were enrolled when asked to take the survey; however, hybrid courses vary greatly in their delivery, which was not delineated on the survey, so all the data were combined for a more holistic picture. With a larger sample size of hybrid course students and more information about the delivery of such courses, additional research might use this survey’s data as a comparison point. Finally, and most importantly, we know little about the instructors teaching the courses; instructor personality, ways of requesting survey participation, online teaching style, training and background, and numerous other factors all can affect the student experience, including participation in and responses to the survey. However, as with studying instructors’ perceptions devoid of student experiences, there is research value in gathering and analyzing only the students’ experiences. Such responses also illuminate OWI pedagogy.

The final survey had approximately 25 questions (fewer depending on how respondents answered several skip-logic questions); 18 were multiple choice, 2 were frequency rankings, and 1 was a Likert-scale question on pedagogical activities. There were four open-ended questions. As part of its sponsorship of this research project, GSOLE agreed to house the full dataset from this survey, which can be accessed by the public. Table 1 shows the breakdown of respondents.

Table 1. Breakdown of Total Survey Respondents

Total responses: 569
Total responses after eliminating disqualified respondents and those who chose not to consent to taking the survey: 487
Total who completed the survey: 346
Total partial responses: 141
Students selecting class option as “hybrid”: 94
Students selecting class option as “fully online”: 367

The student-survey working group chose to include all responses in the analysis, even partially completed surveys, because they had value for a grounded, initial understanding of student perspectives about OWI. In this written report, responses include n values as well as percentages to account for the mix of completed and partial surveys; however, using all the data prohibited us from making more advanced statistical comparisons, which typically can be done only by comparing variables from complete data sets.

The complete survey responses are weighted toward TPC at 67% (n=231). This result is likely because of the recruitment strategies used. The student-survey working group had access to a list of individual instructors who primarily taught TPC courses, and survey research indicates that response rates typically are higher when the request for participation comes directly to a personal email rather than from other sources (Lindemann, 2018).

To better frame the results and discussion of this survey in subsequent sections, we first provide some demographic information about students who participated in the survey. Demographic questions appeared in the final quarter of the survey itself. Questions that asked students about the course in which they were enrolled appeared in the first part of the survey, and they are included in this section.

A little over half of respondents, 59%, were traditional-age students, typically defined as 24 years old or younger (U. S. Department of Education, n.d.). The remainder, 41%, were (as defined in higher education) non-traditional students. The age distribution in this survey skewed much more traditional than the typical distribution of online-only students, for whom the mean undergraduate age is 29 (Clinefelter & Aslanian, 2016), possibly indicating that the traditional-age students who took the survey were taking a single online course as part of a traditional on-campus degree. The high number of students over age 24 supports the notion that online learning is a valuable resource for students with complex family lives who need the flexibility of online courses (e.g., Gos, 2015; Noel-Levitz, 2015; Ruffalo Noel-Levitz, 2016). Age data specific to OWI can help faculty and administrators plan and market courses accordingly.

Although Figure 1 shows responses of “male,” “female,” and “I do not wish to respond,” the survey did include the option of “I identify as” with an open comment to allow participants to signify how they would like to be identified. This comment box did not generate any usable answers. The gender breakdown of this survey matches national trends wherein women students are more prevalent in higher education (56% female) (U. S. Department of Education, 2018); additionally, it matches the increase in women students who take fully online and hybrid online courses (approximately 70%) potentially because of family or work obligations (Clinefelter & Aslanian, 2016).

Academic level and type of institution.

Question 22 asked, “I am a [fill in academic level] at a [fill in type of institution].” Students could select from a dropdown menu for the fill-in-the-blank options. Part one included freshman-to-senior options paired with the typical year students would be in school (e.g., freshman/first year), graduate, non-degree seeking, and “other” options. Part two included options for two-year, four-year, and “other.” Figure 2 shows results for academic level reported.

Only 2% (n=7) of the students reported attending a two-year college; the remaining 98% reported attending a four-year college or university. These results likely reflect the survey distribution more than typical online student-to-institution ratios since recent research has suggested that 36% of students attend two-year colleges (Ginder, Kelly-Reid, & Mann, 2017), and two-year colleges comprise 30% of online enrollments (U. S. Department of Education, 2018).

The distribution of student academic levels is not surprising based on the type of courses being taken by the participants, as shown next.

Type of course.

Question 1 asked, “Please identify the writing class in which you were asked to take this survey.” Students were provided a series of courses from which to choose, and there was an “other” option for a writing course not listed. Table 3 presents the breakdown of courses identified.

Table 3. Courses Students Enrolled in at Time of Survey, by Type of Course and Percentage of Students (n=413)

TPC Service Course: 62% (n=256)
First Semester Composition Course: 14% (n=56)
Developmental Writing/English: 11% (n=44)
Second Semester Composition Course: 6% (n=21)
Literature Course: 2% (n=8)
Other (Write In): 2% (n=7)
TPC (Grant Writing): 1% (n=6)
Online Writing Instruction: 1% (n=5)
Intro to Research: 1% (n=5)
TPC (Editing): 1% (n=4)

Aligning with the recruitment and distribution of the survey link, more students in TPC completed the survey with first-semester composition (taken broadly to be first-year writing courses) following. “Other” written-in responses included “expository writing,” “analytical and research writing,” “creative writing,” and “master’s level course.”

Length of course.

Question 2 asked, “The duration of this course is,” with options for students to choose from and an open comment box for participants to write in other options. Results are shown in Table 4.

Table 4. Length of Course Responses (n=411)

4 or fewer weeks: 1% (n=5)
5-9 weeks: 9% (n=37)
10-12 weeks: 12% (n=50)
13-16 weeks: 78% (n=319)

That over three-quarters of respondents were in traditional 13-16-week semester courses aligns with the high percentage who reported being in a four-year college or university.

Previous online course experience.

Question 25 asked, “How many additional online or hybrid courses—in any subject—have you taken since you began college?” Figure 3 displays the results.

Students’ general experience with online courses can be seen as a positive when thinking about OWCs. However, as discussed below, this experience may make it more difficult for students to adjust to an OWC, which requires a different type of engagement than other online courses such as math or basic science. Only 15% (n=51) had not taken an online course prior to their enrollment in the OWC, and this number suggests that care is necessary to ensure that students new to this learning environment are properly oriented to the setting as well as to the particular class.

In this section, results of the three content-based sections—Course Information, Orientation, and Online Course Activities—are reported and discussed, leading to implications of this research at the end of this report.

Course Information

Other than information reported on the courses in which students were enrolled (see Table 3) and the duration of the courses (see Table 4), this part of the survey asked questions about the physical location where students completed their course work and the types of devices they used for accessing and completing it.

Location.

Question 3 asked, “Please rank in order of frequency (with 1 being least frequent and 5 being most frequent) where you do the majority of your work for this online writing course.” Results are presented in Figure 4.

Figure 4. Where Students Access and Complete Course Work (Home n=377; Work n=274; Campus n=306; Off Campus Public Space n=276; Off Campus Private Space n=260)

As shown in Figure 4, the most frequent location in which students completed their schoolwork was at home, followed by on campus; however, 274 students reported completing course work at their workplaces, and, therefore, the conditions of such an environment and how it influences both the degree and frequency of engagement are important considerations. For example, when students access and complete their schoolwork at home or on campus, it is reasonable to think they have a time and place set aside for such activity. Yet when they access and complete their schoolwork in the workplace, the question arises of whether the schoolwork is sandwiched between workplace tasks (or even completed instead of workplace tasks). In such cases, it would be useful to learn at what kinds of workplaces students work and to what degree they can focus completely on their schoolwork. Certainly, an entire survey regarding location of access to and completion of online coursework would be illuminating.

Device used to access and complete course work.

Question 4 asked, “Please rank in order of frequency (with 1 being least frequent and 5 being most frequent) what kind of device you use to do the work for this online writing course.” See Figure 5 for the results.

A majority of students reported using a laptop to access their OWCs; however, use of a mobile phone exceeded use of desktops, tablets, and notebooks, suggesting that the size of the device mattered less than either convenience or the possibility that a mobile device was the student’s only connection to the internet. The Pew Internet Project (2013) reported that 73% of Advanced Placement and National Writing Project teachers surveyed said that their students used mobile phones inside and outside of the classroom to complete their schoolwork. Some of the high school students from that Pew survey would have been college students at the time of this survey, suggesting that they would have access to and experience using mobile devices in classrooms. In 2018, over 94% of traditional college-age students (18-29) owned a smartphone (Pew Research Center, 2018), and Esteban Vazquez-Cano (2014) reported that “[m]obile learning often takes place outside a formal learning environment, tending to become personalized via users’ personal mobile devices” (p. 1508). One implication of these data is that students may be using technology that is not conducive to the kind of work they are assigned in OWCs, so they may need technological guidance, which is addressed in the next section.

For students who responded “yes” to Question 13, Question 14 asked, “Please describe how you use the mobile application in this online writing course (check all that apply).” Results are shown in Figure 7. ​

Figure 7. Course Activities Completed in Mobile Application (n=667, students could select more than one response; Due dates n=256; Read updates n=155; Participate in discussion n=102; E-mail n=81; I do not use n=52; Other n=21)

The type of device students use to access and work in a course may be influenced by pathways into a course and how easy it is to use the course software on any given device; thus, some universities (e.g., Oregon State University, Colorado State University, and University of Southern California) now have mobile applications available for students to access online courses (Friedman, 2017). Most students who reported that their institution had a mobile app, 60% (n=238), also reported completing two to three activities using it (see Figure 7). It is worth noting that 13% (n=52) of students reported not using the app even when it was available, suggesting a need to learn why they made this choice.

Students who preferred to use mobile devices to check in—for example, with an online group or to ensure they were not missing an announcement—reported doing so for quick activities such as verifying due dates, reading messages, or accessing voice-threads. “Other” responses in the student survey of OWI included completing assignments and quizzes and checking grades. These activities are indeed quick undertakings that are both practical and supported through a mobile device. In a national survey of over 1,500 online students, Dian Schaffhauser (2018) reported that 67% completed their coursework on a mobile device. The most common activities from that survey, which provided no information about the types of courses, included accessing course readings (51%), communicating with professors (51%), and communicating with fellow students (44%) (Schaffhauser, 2018). The findings from that national survey align with the findings from the OWI student survey and provide additional support for understanding ways that students are integrating mobile devices into their online learning and how instructors can support mobile technology in OWCs. However, that survey does not indicate how many students may be attempting to complete their writing assignments via mobile devices.

Additionally, Rochelle Rodrigo (2015) reviewed studies associated with using mobile devices in higher education and discussed mobile apps specifically in terms of OWI and the OWI Principles. She emphasized that it is impossible for any online instructor to be familiar with all devices available for students to complete their work; yet, given that OWI Principles 2 and 10 address institutional responsibility for supporting students’ technology (CCCC OWI Committee, 2013), online writing instructors can provide “reasonable support.” Reasonable support can manifest through instructors being “reasonably aware of some of the major issues that might occur when their OWC interacts with popular mobile devices and operating systems” (Rodrigo, 2015, p. 501), such as incorporating “low-stakes learn-the-technology assignments where students safely can explore how they will interact in a specific course with their individual devices” (p. 504). This survey’s data about how students use mobile apps offer online writing instructors at least a glimpse into how to guide students regarding the extent to which their mobile devices might work for some assignments and not for others. Importantly, despite the useful voice-to-text functions of many mobile devices and their value for students who compose best orally, students likely should arrange to use other devices to complete their writing, such as when revising, editing, and formatting a final paper.

Orientation

This section addressed how students had been prepared by their teachers or the institution for their OWCs.

Question 5 asked, “Were you offered an orientation about taking an online writing class?” This question used skip logic so that students who responded “No” were advanced to the next section of the survey. Results for Question 5 are presented in Figure 8.

Only 28% of respondents indicated they had been offered some sort of orientation to the OWC. Yet, once students enter the online learning platform, orienting them to the space is key to student success (Melonçon & Harris, 2015). Whether orientations take place at the institutional level (Bozarth et al., 2004), in face-to-face settings (Gos, 2015), or at the start of online courses (Dockter, 2016), instructors ideally should receive institutional support to help students adjust to online learning (Minter, 2015). Online orientations have been shown to increase retention in online classes (Taylor, Dunn, & Winn, 2015), and they can vary from a short tutorial video about the affordances of a single classroom to a multi-day on-ground orientation for students new to an online program (Lieberman, 2017).

Distinguishing between an orientation that specifically addresses OWI versus an orientation to online learning more generally is important. The survey question asked about an orientation specifically for an OWC; however, we do not know whether students identified the words “taking an online writing class” as central to the question or whether they simply responded to being offered an orientation of any kind. This lack of clarity existed despite a follow-up question (Question 8, “Did this orientation adequately prepare you for the work in this course?”).

Setting aside the survey question’s lack of specificity, the 48% who responded “No” and the 24% who responded “I don’t know” indicate an alarming number of students without orientation to the OWC; the published literature strongly supports orienting online students to online classes in general (Lee & Choi, 2011) and to OWI in particular (Bozarth, Chapman, & LaMonica, 2004; CCCC OWI Committee, 2013; Gos, 2013; Melonçon & Harris, 2013; Minter, 2013). Additionally, because of the decision to keep the survey short, follow-up questions were not provided regarding actual participation in and components of offered orientations. Therefore, we do not know whether those students who reported having been offered an orientation actually participated in it or what they might have learned from it.

When developing the survey, the student-survey working group wondered whether previous experience in online courses affected whether students were offered an orientation in their current class. Although the question about being offered an orientation and the demographics question about the number of previous online courses students had taken (Figure 3) cannot be correlated statistically, we speculate that prior experience (or lack thereof) in taking online courses is one possible reason students may or may not participate in an orientation if one is offered.

Question 6 asked, “Was the orientation delivered online?” Students could choose to check “online” (meaning “fully online”), “hybrid,” or “face-to-face.” According to OWI Principle 13, “OWI students should be provided support components through online/digital media as a primary resource” (CCCC OWI Committee, 2013, p. 25), which indicates that orientation for an OWC should be online in some fashion depending on the type of course to be taken. Most respondents reported the orientation as having been fully online (n=102), with only 8 students reporting “No” and 5 students reporting it was a hybrid orientation.

Additionally, OWI Principle 10 states that “Students should be prepared by the institution and their teachers for the unique technological and pedagogical components of OWI” (CCCC OWI Committee, 2013, p. 21), which means students should be oriented not only to the technology they will encounter in the OWC but also given a course-specific orientation that includes interface familiarization, lessons, and examples of study habits and skills needed to succeed in an OWC (OWI Committee, 2013). Results from the survey were split between general orientations to online courses, 51% (n=39), and course-specific orientations, 49% (n=35), which we further address below with an open-ended question.

Question 8 asked, “Did this orientation adequately prepare you for the work in this course?” The majority of students, 78% (n=82), responded positively. For those students who found the orientation helpful in preparing for the online course, the following responses exemplify why.

The orientation helped give me a better understanding of what I should be prepared for during the semester.

The orientation allowed me to have a “heads-up” of things that could happen in an online course (good or bad) and where/how to seek help if problems arise.

The orientation clearly showed me how to navigate the class website and gain access to course material.

It was very clear that I had to have some basic computer skills, that I had to be self-motivated to check in, and showed me how the course was set up and how to get around in the course.

For those who responded “No” (n=23), the following responses indicate that the orientations were too general to be helpful.

The orientation was too generic to be of any real assistance. Customizable course orientations would be more beneficial, and would also enable more uniformity in the way that online courses operate.

It was just general information on whether I would be able to keep up with the demands of an online course.

The video discussed how to use the interface, but it did not discuss how the professor would use the interface or layout the course (it is a pretty flexible interface from the professor side). It did not create a successful mental model of the course.

The combination of responses indicates that students need—and want—an orientation that includes information about technology, general online learning pedagogy, and course-specific information. Orientation to the online environment and to a specific online course are the first steps to ensuring student success (Lee & Choi, 2011). In the case of OWI, “course-specific” should include a range of information depending on the course, including how tools and activities within a course are designed to help students improve their writing, as indicated in OWI Principle 10 (CCCC OWI Committee, 2013). We encourage writing program administrators to advocate for instructor funding and support in creating online orientations, perhaps through university centers for teaching and learning or through offices of distance education.

Online Platform and Technical Difficulties

As part of the Online Course Activities section, students were asked questions about their experiences with the learning management system (LMS) or other software programs used to deliver the OWC and about technical difficulties in accessing the course.

Online platform.

Question 9 asked, “In what program do you do most of your online work for this course?” Students were given four options for specific LMS programs, one “I don’t know” option, and one write-in option. In this survey, Blackboard and Canvas were by far the two most commonly reported platforms used to deliver OWCs: Blackboard at 38% (n=152) and Canvas at 31% (n=123). The other options (Moodle, Google, D2L, and “I don’t know”) each received less than 10%. As one would expect, the range of tools used to deliver online courses is vast, and because the survey did not define exactly what was meant by “program,” respondents included a number of responses in the open comment box. Examples of “other” responses included “Zoom,” “Adobe Connect,” “a website,” “Google,” “Word,” and “Eli Review.”

According to Phil Hill (2017), Blackboard continues to be used in just over one-quarter of academic institutions (28%) representing 37% of all student enrollments in the U.S. and Canada. Canvas continues to gain market share, used in 21% of academic institutions representing 27% of student higher education enrollment. The next closest competitor, Moodle, is used at 25% of higher education institutions but represents only 12% of enrollments in the higher education market, indicating that it is used primarily at smaller institutions. Canvas continues to gain users, pulling market share primarily from Blackboard and D2L/Brightspace (edutechnica, 2018). Thus, the survey results are unsurprising when considered in the broader context of national conversations about learning management systems and online course delivery.

Technical difficulties.

Question 10 asked, “Have you ever had technical difficulty accessing (getting into or working in) this course?” This question used skip logic, so those who did not experience technical difficulties were moved to the next section of the survey. Most students reported they did not have difficulty accessing their OWCs. However, for those students who reported difficulties, 28% (n=111), a follow-up question asked what those difficulties were. Most responses indicated “system” problems, in that students had trouble accessing the LMS. Common access problems included “server was down,” “system was down,” “freezes,” and “connectivity issues.” The students who reported problems accessing the course were using the devices identified in Figure 9.

The devices reported in Figure 9 align with Figure 5 in the Course Information section, which shows the devices students most commonly used to access and complete their online work. Few students reported access problems stemming from how the course was set up within the LMS or other tools used to deliver the course. They further conveyed that when problems existed within the course, instructors were responsive in resolving those issues. The following detailed response encapsulates the findings about technological access:

When I first enrolled in the course, I could view the course but I was unable to create a thread in the discussion board. So I was unable to do the assignment for the week. I simply emailed my professor and a few days later I was able to create a thread. In the past, I have had difficulties with accessing online courses due to poor internet connection at home. Also when using the app on a mobile phone it is very difficult to write in the discussion board. The page jumps around, deletes or hides things you’ve written or doesn’t allow you to fix mistakes.

Effectiveness of Course Activities

As part of the Online Course Activities section, the following discussion centers on three questions that asked students to rate course tools and activities commonly found in OWCs and then to explain their ratings. For an in-depth discussion of these pedagogical activities and a way to improve online course content, design, and delivery, see “A Call for Purposeful Pedagogy-Driven Course Design in OWI” in this issue.

Question 15 asked, “Please rate the effectiveness of the following activities in your current online writing course as they relate to you improving your writing ability or becoming a better writer. There are two rows to add activities that are not presented in the options.” In this question, using a 1-5 Likert scale, students were asked to rate the effectiveness of common tools and activities implemented in their OWCs as they related to improving their writing. Students could indicate whether the item was not incorporated in their current course, and a write-in option also was built in. The items in this question included:

Discussion boards

Quizzes and tests

Synchronous chats

Podcasts

Videos

PowerPoints

Assigned readings

Giving and receiving peer feedback

Instructor feedback

Question 16 followed, asking, “Considering your responses in Question 15, please identify what work in your current online writing course is the most valuable or helpful to you in improving your writing and explain why.” Question 17 asked, “Considering your responses in Question 15, please identify what work in your current online writing course is the least valuable or helpful to you in improving your writing and explain why.” Results in this section include both quantitative data from Question 15 (shown in Figures 10-17) and qualitative data from Questions 16 and 17.

Discussion boards.

The quantitative data regarding discussion boards suggest that most students found this tool helpful in OWCs (see Figure 10).

When coupled with the open-ended responses, however, contrary results about discussion boards emerged. Although we are uncertain why such a discrepancy between the quantitative and qualitative data sets existed, considering both data sets leads us to speculate that there is a disconnect between students’ general perceptions of this learning tool and their perceptions of—and familiarity with—asynchronous communication for specific uses in their OWCs.

Students who reported discussion boards as being useful generally commented on the benefit of receiving multiple perspectives on their writing.

I find the discussion boards most valuable and helpful. There, we students can collaborate and gain different perspectives of assigned discussions.

The discussion board is the most useful because it is nice to have a place where we can communicate. It provides feedback and is similar to a face-to-face class.

The discussion board as well as instructor feedback on my writing. It helps me to better gauge the work that is expected of me and work out any smaller issues that I may have overlooked during my writing process. It helps extensively with content development.

Students who remarked that discussion boards are among the least useful activities generally perceived participation in discussion as being “forced,” which they reported as resulting in thoughtless and meaningless responses and peer feedback. Michael Wilson et al. (2015) reported that students’ “trust in the ability of other students to mark their work was quite low” (p. 22), which appears to match some results of this survey based on open-ended comments. Specifically, students reported not understanding how discussion posts improved their writing, and they did not see how peer response—either through their responses to others or in others responding to them—improved their writing, as indicated in the comments below.

Discussion boards can help generate new ideas but don’t necessarily help my writing, in my opinion.

Discussion posts. They seem to be used as “filler” points for the class. I have yet to complete a discussion post that brings value to the course.

Discussions. I feel like I don’t get much out of the peer reviews and discussions because the other students in the course don’t give me feedback I find helpful.

The discussion board is the least valuable because I feel people just are writing to complete it rather than having meaningful discussion.

Only 43% of the respondents in a national survey of online students (Schaffhauser, 2018) found discussion boards helpful as a class activity. This result is similar to our findings and, in some ways, connects to the data below on peer review, wherein students may not have seen the value of individual student comments. Yet, per other studies, when students were guided toward self-reflection and self-assessment of their writing, they expressed that discussion board use could help them improve their writing and learning (Nielsen, 2012; Papadopoulos, Lagkas, & Demetriadis, 2017). Student comments in this study also suggest that the use of discussion boards, while widespread, needs to be carefully considered—and explained pedagogically from a student-centered perspective—when designing OWCs.

Figures 11 and 12 display quantitative results for students’ perceptions of the helpfulness of quizzes, tests, and assigned reading. Students wrote comments concerning quizzes and tests that often were coupled with comments about assigned reading; therefore, we present these results together.

Students reported that quizzes generally were assigned to test whether they had completed the reading, suggesting they understood that quizzes can help them gauge their understanding of the readings (e.g., “The quizzes help focus on what’s important in the readings”). Some students, however, commented that they did not know how reading about writing could improve their writing:

Reading a textbook about writing is not helpful I think it takes more experience and practice to develop writing skills.

I think the course readings are somewhat hard to follow at times, especially since we can’t see an in-person demonstration or explanation of the material. There are times it feels like it would be quicker and easier to learn some of the book material in class, so we can learn from our peers and professor about our specific concerns right then and there in class.

Some students reported readings as being most helpful. In the open-ended responses for Questions 16 and 17, some students simply listed the textbook as being effective with no other comments; others reported the textbook and readings to be beneficial for several reasons, such as those provided below.

The text and assigned readings because they were geared to a specialized for of writing.

The course readings were the most valuable work assignments because I learned to look at writing differently which opened a door to new dimensions of writing. Writing became something I admired, not just a literacy requirement.

One take-away from these data is that quizzes and readings can be seen as “added” assignments if they are not connected to the other assignments in an OWC. For example, Ingrid Spanjers et al. (2015) found that in hybrid environments, frequent quizzes help students “[space] their learning activities” (p. 72), which may positively affect their achievement of learning objectives, and that feedback on quizzes helps students determine what is important. Additionally, Mary Margaret Kerr and Kristen Frese (2017) noted four reasons that students do not read, one being underestimating the importance of the reading. Thus, online writing instructors should explain to students how readings and quizzes connect with learning outcomes and other assignments in the course; additionally, instructors should provide feedback to help students realize the connections and importance of the reading, which is addressed in “A Call for Purposeful Pedagogy-Driven Course Design in OWI.”

Synchronous chats, podcasts, videos, and PowerPoints.

Figures 13, 14, 15, and 16 display the results of student rankings of various multimedia teaching tools. Although these tools typically were considered at least somewhat helpful when instructors used them, students appeared not to have much experience with instructional synchronous chats, podcasts, videos, and PowerPoints, as evidenced by the significant number of responses in the “N/A” category. For OWI as a course that potentially teaches multimodal composition, these results are somewhat dismal. If instructors do not use multimedia (particularly in the TPC OWCs prevalent in the survey responses), it is highly unlikely that such skills are taught as part of composing processes overall.

When evaluating students’ open-ended responses about the helpfulness of multimedia in OWCs, we found that these responses yielded more information than the numbers and percentages alone. Some students simply listed chats, podcasts, videos, or PowerPoints as being most or least helpful with no further explanation, but those who gave explanations provided some reasons why they found these tools or activities helpful or not, such as how the activities or tools clarified the course or their assignments.

Real-time synchronous chats work best for me; if I have questions, I usually get an immediate answer, either from other students or the instructor.

the [video-recorded] lectures helped me most because while not face to face they were still more informative than me trying to read and figure it out myself.

i love the videos provided for the lecture. it’s as though you are actually in the classroom. i love the feedback. i like the incorporation of google for like EVERYTHING in the class.

This is an editing class, so there is not as much writing, but we are learning the skills and grammar rules that strengthen my writing and professionalism. However, The PowerPoint slides we have used in class have been the most helpful.

Posting presentations and handouts in the discussion forum. We can access these to study for the midterm.

PowerPoints and videos drew the most remarks in which students explained whether they thought creating such multimodal products as writing activities was helpful. As with reading, a common complaint about videos and PowerPoint slides was a lack of context and of knowing how to apply the information to their composition, as was the volume of these materials.

I found the video responses to be the least helpful to my writing abilities. They were more focused on responding to social issues but I find writing out thoughts to be a more productive source of learning.

Videos because I usually don’t have the time to sit down and watch them. If I do, I am distracted and don’t learn anything from them

Powerpoints. I don’t read / watch them. You can’t use a presentation medium to deliver information without a presenter.

The work that is the least valuable/helpful to me would have to be any PowerPoint/lecture slides. The course is primarily based off of videos and essays, so in terms of lectures, this is not specifically provided for the class.

Technology is an integral part of OWI; however, as one can see from these responses, students were using multimedia to retrieve information to study from or to get answers to their questions. Likewise, students expressed that they disliked multimedia in the classroom because they saw these activities as time-consuming ways of delivering important or not-so-important information to them. They did not appear to see multimedia as composition devices, which raises the question of whether instructors are using multimedia as delivery devices only or also as compositional tools. As Scott Warnock (2015) mentioned, “Using audio/video is one way technology can enhance communications” (p. 158); thus, technology in the OWC can be used both as a delivery system and as a compositional aid that students should be taught to use in support of conveying the message they seek to express. GSOLE advocates that students learn this literacy skill as well as ways to read and write alphabetic text; the use of multimodal technologies should be connected with specific OWI pedagogy (Cargile Cook & Grant-Davie, 2005, 2013; Hewett, 2004-2005, 2006, 2010, 2011, 2015a, 2015b; Hewett & DePew, 2015; Hewett & Ehmann, 2004; Paull & Snart, 2016; Warnock, 2009), which may not be the case given the student feedback in this survey.

Instructor feedback and peer reviews.

Figure 17 displays the quantitative responses regarding the helpfulness of instructor feedback and of peer review, whether giving or receiving feedback.

Respondents overwhelmingly reported the most helpful type of feedback came from their instructor (Figure 17).

In my opinion, instructor feedback is the most valuable or helpful. The instructor grades according to rubric and they explain why the paper may or may not have touched on all require information.

Instructor feedback- clarity in what I am doing wrong and if I am doing something right.

Direct comments on a written item from the professor. Its the most personal to me and can help me see my downfalls and where I need to improve. I think having students review work is positive, but value of reviews is often not that helpful for me personally.

Receiving instructor feedback is the most helpful in improving my writing, because sometimes students don’t have all the information that my professor has that could help me.

The responses concerning instructor feedback are noteworthy because they indicate that students value their instructors’ expertise in writing, value their instructors’ feedback on their writing, and want and need advice and direction from instructors regularly. Thus, student responses confirmed that instructor presence is a crucial aspect of successful OWI and can influence student participation, overall learning experiences (Richardson, Besser, Koehler, Lim, & Strait, 2016), and satisfaction in online courses (Ladyshewsky, 2013). In terms of peer review, 22% (n=68) of students in this survey reported peer review as helpful, and 15% (n=33) did not find it helpful. Students who found peer reviews helpful mentioned in the qualitative comments that they had an audience other than the instructor whom they wanted to impress. According to Arianne Rourke (2013), students reported that peer review of their work leads them to “feel less alone, more supported and more motivated to continue with the writing process” (p. 5). Additionally, they appreciated receiving different perspectives on their ideas and their writing, as reflected in the comments below.

I think the feedback is the most helpful because you get to see different opinions on it.

Definitely, feedback from my classmates. I was able to see what I was doing wrong or could improve on before submission to the teacher.

It was most helpful having feedback from peers and my teacher and being able to stay in constant communication because I was able to identify what needed work in my writing.

Peer reviews is hands down the most helpful asset in this course as it helps other students perfect their own work when critiquing others.

A common thread in these comments is that students may be using peer reviews to assess their own writing. Kristen Nielsen (2012) reported that formative activities, such as peer review, can improve student writing achievement and learning; however, both Nielsen (2012) and Pantelis Papadopoulos et al. (2017) mentioned that such improvement may come from writers analyzing their own work through their analysis of others’ writing and not necessarily from the comments provided by their peers in the review. This dynamic could be one reason for the negative comments about peer review received on the survey, such as:

If I had to pick one, I would say peer feedback, just because peers are usually not trained in written response. The way they frame their comments can sometimes come across as rude or they comment on areas that aren’t that important, instead of areas that would be helpful.

I find that peer reviews tend to be the least valuable, because most students and simply participating for the credit rather than giving meaningful information to improve the documents.

Peer feedback. Sometimes peers don’t give legitimate feedback or do not know what they are talking about.

Thus, instructors may need to reconsider how peer review is commonly conducted in OWCs and what guidance students are given for giving—and receiving—peer reviews.

Comments regarding which activities students found least useful were not trivial complaints. To characterize comments from this section of the survey generally, students did not recognize why they were doing the work they were assigned in their OWCs (which may, in fact, be a challenge for students in more traditional classrooms as well); they expressed uncertainty about the structure, content, and participation expectations of an OWC; and they did not see how the assigned work related to improving their writing. Therefore, we must ask: Are the OWCs these students experienced strategically and pedagogically designed to help them become better writers? If not, what features of the OWCs could be improved, and how? (A potential answer to this question is posited in “A Call for Purposeful Pedagogy-Driven Course Design in OWI.”) These questions are crucial points for administrators to address in conversations and professional development with faculty. Furthermore, it seems important to ask whether OWCs are designed by writing experts (the instructors) and whether instructors have access to the institutional support they need to provide quality OWI in this primarily digital setting. These issues are addressed in the implications section.

Other.

The pedagogical activities question was followed by two open-ended response questions.

Question 16 asked, “Considering your responses in Question 15 [regarding effectiveness of tools], please identify what work in your current OWC is the most valuable or helpful to you in improving your writing and explain why.”

Question 17 asked, “Considering your responses in Question 15 [regarding effectiveness of tools], please identify what work in your current OWC is the least valuable or helpful to you in improving your writing and explain why.”

Many of the responses to these two questions were similar to those for the preceding open-ended questions, where students were asked to clarify their comments in relation to the Likert-scaled questions, so the responses addressed some of the same issues raised previously.

Question 18 asked, “What is not included in your online writing class that would benefit the learning experience for you in relation to improving your writing?” Student comments in response to this question shed light on pedagogical activities that administrators and faculty should consider, since a few aspects of the course were repeated more than others. These included students asking for more multimedia resources (e.g., PowerPoints, podcasts, and synchronous chats). Specifically, students requested videos as a way to feel the “human” element of online courses (e.g., “more videos from the professor, the audio video is helpful to bring the human element into the digital classroom.”). There also were several comments expressing a desire for better directions and more context for assignments (e.g., “I want lectures on the topic. Stuff to put the reading or blog posts into context. We just have five major assignments and no real curricula in my course. The prof doesn’t seem real because there’s no teaching involved.”).

The final content question of the survey, Question 19, asked, “Do you have any other comments, concerns, or suggestions about online writing courses that you feel we should know?” Student responses did not contain repeated elements as in the previous question. Because students had not been queried about the content of their OWCs, it was an interesting finding that learners addressed context for assignments, grading, and appreciation for their instructors.

Context for the assignment: Concerned that my fellow classmates and I aren’t given a lot of context for our writing assignments, and therefore we do not put in a lot of time and energy into them...why we’re writing is not made clear, in other words. How is a literary analysis going to help me do my job?

Grading: Please don’t make a class that is graded based on labor that is bell curved against the other students’ labor. Please don’t make a class that is without lectures and only uses blog posts to convey tiny snippets of learning. Please don’t make a class where even if you do all the assignments perfectly, if someone commented more, it counts as more labor, so they get the higher grade when there’s no set standard for a participation grade. If you guarantee people a B for doing all the assignments moderately okay, but don’t actually let them know if their labor is higher than the rest of the class, or in the middle, they’re gonna complain about how they don’t actually know what their grade is. Thanks.

Grading: An online writing class should not be graded solely on participation, as that is a biased remark. I, for one, think about my papers for several hours over a span of days before I begin writing it, but I know some people that can crank out a solid paper in less than an hour. That being said, a writing course should be focused on content and correctness, rather than “effort”.

Appreciation for organized instructors: I love the structure of this course! I love that my instructor sends out emails before the new week begins which tells us what we have in store for that particular week and when we have due dates coming up. I also love that we are given a checklist with what we have to do for each week, as well as additional links that help with that week’s lesson. Organization really helps me in courses!

Appreciation for organized instructors: It is *so* helpful when the instructor is organized and has explicit expectations for every aspect of the class (when homework is due, what’s the plan for the course, what readings are due and where a student can find them, where and how should a student submit homework, etc). Having clear expectations and directions takes a lot of stress off the student because they don’t have to try to guess what the teacher wants.

Additionally, some students used this space for self-reflection about their learning styles, such as whether they preferred online or face-to-face classes, and about why they were taking online classes, such as schedule conflicts with campus-based courses. These comments included the following:

Would prefer in person classes but they don't fit my schedule.

I don't like taking courses online because it doesn't allow for flowing and interesting discussion among classmates.

I took regular composition and found it much more useful. The classroom interaction, exercises, and feedback can't be replicated online.

The reason I attend online class from home rather than work (where I would prefer) is because Zoom, Slack, and Skype Groups are all blocked from my work by Internet policy. I received exemption for a while, but it has to be renewed every month.

This article has presented the survey data with generalized findings and now turns to a brief discussion of implications. The companion article “A Call for Purposeful Pedagogy-Driven Course Design in OWI” offers analytical discussion of the findings and a model for purposeful, pedagogy-driven course design.

Some of the most illuminating responses from the survey related to the tools and activities in OWCs that students found least helpful for improving their writing and to the importance of the role and presence of the instructor. Students disclosed their perceptions about pedagogical approaches and components of OWC design as they related to improving their own writing without, perhaps, being cognizant they were providing such information. When viewed comprehensively, student comments in this section revealed a disconnect between the intended pedagogical application (as we speculate based on the scholarly literature) and how the tools and activities were perceived—and used—by students regarding the improvement of their writing. It seems important to note that students did not express that the tools and activities in OWCs were ineffective or unhelpful in and of themselves. Instead, they communicated that the tools and activities were not always implemented in ways that improve student writing, rendering them somewhat ineffective or unhelpful for these respondents.

These results strongly indicate that instructors should consider how their discussion board prompts are designed, how the prompts function to meet learning objectives, what kind of feedback (i.e., grammar, content, and/or organization, among others) the instructor has been providing (and how that feedback relates to improving each individual student’s writing), and what kind of feedback students are expected to provide to each other. Sheri Williams, Amy Jaramillo, and John Carl Pesko (2015) argued that “if instructors make their expectations explicit regarding depth of posts and exploration/problem resolution and collaboration/reflection, students will come to value and use these processes to extend their thinking” (p. 62). How could Williams et al.’s perspective be brought to fruition in an OWC? Warnock (2009, 2018) expressed that discussion boards can be more useful when the instructor takes on a respectful yet critical persona that questions student assumptions and asks them to rethink their statements in an asynchronous discussion. Is this a skill limited to one educator or can all online writing instructors learn and implement it?

Furthermore, peer feedback arose as a meaningful concern for students. If peer feedback is expected to help students improve their own writing, how should it be taught as a skill (or measured as a competency)? The data made apparent that students valued instructors and instructors’ expertise with writing; furthermore, the data visibly revealed that these student respondents preferred and desired instructor feedback to improve their writing. This point has important implications regarding the role of the instructor in an OWC because instructor feedback is crucial in supporting student confidence of their application of course material (Borup, West, & Thomas, 2015; Hewett, 2015b).

Finally, but not to be minimized, the data indicated that instructors should explain the rationale—and long-term learning impact—for their specific uses of readings, quiz assessments, and peer review in each OWC’s context, as well as for the assignments themselves. By taking the time to include a pedagogical foundation for course activities, instructors may motivate students to more effectively attempt or complete tasks in an OWC; as a result, students may more substantially associate successful completion of an assignment with framed and scaffolded learning activities. (See “A Call for Purposeful Pedagogy-Driven Course Design in OWI”).

This point also emerges in other data discussed below—students expressed that they did not know how the work in OWCs helped them improve their writing. Thus, one implication is that the potential to learn should not only be apparent to instructors and administrators in the form of course learning outcomes—it should be made apparent to students, and the orientation is one place to do so. In other words, the course design should ensure that course activities and course tools are actively and meaningfully contributing to successful completion of course assignments and that students can understand these connections throughout the course (Cargile Cook & Grant-Davie, 2005, 2013; Hewett, 2004-2005, 2006, 2010, 2011, 2015a, 2015b; Hewett & DePew, 2015; Hewett & Ehmann, 2004; Paull & Snart, 2016; Warnock, 2009).

Data from students about their learning experiences in OWCs are vital to the future of OWI. As the first attempt to gather student perceptions on a national scale, the survey data raised additional questions key to future research. Acknowledging—and including—student perception and student voice in studies associated with OWI will improve faculty understanding of these issues from the learners’ perspectives.

Future studies should expand attention to aspects of student learning in OWCs beyond what this survey attempted. Especially important is the issue that access is multifaceted, encompassing physical and cognitive abilities, language proficiency, and socioeconomic conditions. Beyond access, the modality in which the course is offered (i.e., whether it was asynchronous or synchronous and whether it was fully online or hybrid) needs to be raised to offer additional context for student responses; hybrid courses vary in their inclusion of required (or optional) synchronous face-to-face and/or fully online sessions, which affects student experiences. Moreover, studies should address the physical location from which students access the LMS, as it may shape their experiences both in using course tools and in completing course activities. For example, students who access the course primarily from a work location may complete course tasks under time constraints or in a hectic environment where course activities are consistently completed, but in a disjointed fashion.

At this point in the development of OWI, more research is needed to better understand the student learning experience and to comprehend what students want and need in an OWC. This survey and its results are only the first steps in the necessary evolution of research from the students' perspective. Researchers interested in further studying the approach taken by the student-survey working group have full access to the survey and survey data, in hopes that other researchers will advance and expand on the work presented here.

This survey was deliberately developed to understand student experiences and perspectives in online writing courses. Although the literature on online writing pedagogy is vast and growing as increasing numbers of students take OWCs, it is worth examining whether instructors' intentions and perceptions of what should be happening in OWCs match student perceptions of what is happening to and for them in these classes. Such determinations cannot be made unless students contribute to the body of knowledge on this subject. In other words, simply reviewing student evaluations at the end of each semester is insufficient; students must be participants in the research process, which was a major goal of this study.

In reading and assessing student needs from the student point of view, we encourage instructors to consider student preparation for online writing courses and to investigate methods that may more successfully orient students to online writing courses. Additionally, as instructors develop courses, they should design course elements while considering what components of online writing courses students find most helpful and least helpful in improving their writing. Exploring creative yet effective ways to integrate these helpful components will improve student experience in OWI. These efforts may necessitate collaboration with instructional designers, central offices of distance learning, department heads, and writing program coordinators. Interdisciplinary associations of this nature allow for coordination among areas of expertise and communication between faculty, staff, and administrators to meet the common objective of student success and satisfaction.

Even with the limitations of this survey, student responses provide instructors and administrators with new conversations and topics for further investigation regarding how best to teach writing online. One of the most important conversations is how to measure the pedagogical impact of OWCs on student writing. Online writing instruction is distinct from distance education courses in other disciplines, and it is important that students see and understand these distinctions. One point made repeatedly in the student data was the absence of a rationale for course design and course work. If students are made aware of and taught how to intentionally use feedback and online writing tools to improve their writing, they will understand the online writing activities they are asked to do and connect those activities with their own writing strengths and weaknesses, thus improving both their understanding of writing and their writing skills.

Conference on College Composition and Communication (CCCC) Committee for Effective Practices in Online Writing Instruction. (2013). A position statement of OWI principles and effective practices. Retrieved from http://www.ncte.org/cccc/committees/owi

Rourke, Arianne Jennifer. (2013). Assessment 'as' learning: The role that peer and self-review can play towards enhancing student learning. The International Journal of Technology, Knowledge, and Society, 8(3), 1-12.