

Mississippi State University’s (MSU) Quality Enhancement Plan (QEP), Maroon and Write, seeks to improve undergraduate student writing by implementing a writing-across-the-curriculum model, including writing-to-learn strategies and formal writing components. Maroon and Write uses direct and indirect measures, qualitative and quantitative measures, and internal and external instruments to evaluate the success of the initiative for both students and faculty. The assessment plan tracks student learning both in the designated classes taught by MIWE-trained faculty and in non-MIWE senior capstone classes.

Student Learning Outcomes

Maroon and Write has established three student learning outcomes, and each outcome has targets for measuring progress toward achieving it.

  1. Outcome 1: Students will write documents that are appropriately organized, well developed, and clearly worded.
  2. Outcome 2: Students will use Standard English correctly, avoiding errors in syntax, grammar, and usage.
  3. Outcome 3: Students will be more engaged in writing activities.

Evidence of Student Learning Outcomes

Maroon and Write evaluates all three student learning outcomes on an annual basis (refer to the Maroon and Write Scorecard to track the progress of the program). Targets were selected based on baseline data, and the program also identified intermediate targets that increase every year. Only the final target is listed on the scorecard, but the intermediate targets are provided in the individual Institutional Effectiveness (IE) Reports.

Results have been mixed. Student writing as measured by the ETS Proficiency Profile has improved over the past several years of the program; however, engagement in writing activities as measured by the National Survey of Student Engagement (NSSE) has remained the same. Writing scores based on the Maroon and Write rubric have varied from year to year, and the program has several hypotheses for the variation, which are explained in the Use of Results section. Feedback from students and faculty during focus groups has been constructive; this feedback, combined with results from the rubric, has inspired the majority of the improvements made to the program. Refer to the section on Assessment Instruments for more information about the instruments and how they were administered.

In 2016, Maroon and Write revised its rubric to clarify the connection between writing for thinking and the outcomes. The new rubric eliminated some confusing redundancy between word choice and correctness, as well as between problem and thesis. It also added components for evidence, structure, and conclusion, which has helped faculty incorporate best practices for developing writing assignments. The new rubric was piloted in summer 2017 alongside grading with the original rubric. After reviewing the pilot data, Maroon and Write adopted the new rubric and began grading with it in spring 2018 for the 2017-18 academic year.

Summary of Results and Use of Results to Improve the Program

Maroon and Write began as a pilot program with a small number of faculty in 2013-14 to gather baseline data and test implementation. The program continued through 2018-19, completing annual assessment cycles. The following paragraphs summarize the results and how they were used to improve the program.

Pilot Year: 2013-14

After the pilot year, the co-directors of Maroon and Write implemented several changes to improve the assessment procedures. The rubric scores seemed higher than the Maroon and Write team anticipated, for two potential reasons: (1) most of the student writing samples came from classes taught by MIWE-trained faculty, and (2) only one grader evaluated each writing sample against the rubric. The program therefore implemented a dual-grader system in which each writing sample is evaluated first independently and then as a team to determine a validated score. Additionally, Maroon and Write clarified its expectations of faculty to include a formal writing assignment in every Maroon and Write class, which could then be used for assessment purposes. The Maroon and Write team also sought writing samples from non-MIWE classes to serve as comparative data.


2014-15

The first year of the program produced significantly higher writing scores for students from MIWE classes than from non-MIWE classes (refer to Table 1). Further analysis revealed that the majority of the non-MIWE writing samples came from “service” classes rather than classes in the students’ majors. The co-directors hypothesized that students put different levels of effort into major classes than into service classes, so subsequent writing sample collections excluded service classes.

Feedback from the MIWE faculty during the focus groups indicated that they would have liked more emphasis on developing a writing assignment. Beginning in summer 2015, Maroon and Write assigned one of its three writing coordinators to each MIWE faculty member, and all faculty were required to meet with their writing coordinator to develop an appropriate writing assignment.

Table 1. Independent samples t-test analysis of 2014-15 MSU seniors' writing samples by MIWE students and non-MIWE students


Component     Group      N    Mean     Std. Deviation
Problem       MIWE       217  3.894*   .8293
              Non-MIWE   633  3.717*   .9698
Thesis        MIWE       217  3.742    .9168
              Non-MIWE   633  3.592    1.080
Support       MIWE       217  3.673*   .7928
              Non-MIWE   633  3.542*   .8707
Word Choice   MIWE       217  3.940    .7941
              Non-MIWE   633  3.833    .8626
Correctness   MIWE       217  3.512**  .8114
              Non-MIWE   633  3.292**  .9571
*p<.05      **p<.001

This table indicates that seniors in MIWE courses scored higher than their counterparts in non-MIWE courses during 2014-15. For three of the components, the seniors in MIWE courses scored significantly higher: defining the problem, providing support for the argument, and correctness/grammar.
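The comparisons in Tables 1-3 are independent samples t-tests on the two groups' rubric scores. As an illustrative sketch only (the report does not state whether a pooled-variance or Welch's test was run; Welch's variant is assumed here, and this is not the program's actual analysis code), the t statistic for a component can be approximated from the published summary statistics:

```python
import math

def welch_t(mean1: float, sd1: float, n1: int,
            mean2: float, sd2: float, n2: int) -> float:
    """Welch's independent samples t statistic from summary statistics."""
    # Standard error of the difference in means, without pooling variances.
    standard_error = math.sqrt(sd1**2 / n1 + sd2**2 / n2)
    return (mean1 - mean2) / standard_error

# Correctness component from Table 1: MIWE vs. non-MIWE seniors.
t = welch_t(3.512, 0.8114, 217, 3.292, 0.9571, 633)
print(round(t, 2))  # prints 3.29, consistent with the p < .001 flag on this component
```

Running the same calculation on each row reproduces the pattern of significance markers in the tables; the reported p-values additionally depend on the degrees of freedom, which the summary statistics also determine.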
Read more details about this year’s assessment in the 2014-15 IE Report


2015-16

MIWE classes in 2015-16 included students other than seniors; therefore, the scorecard includes results for seniors and for all MIWE students, while all comparative data come from senior-level students only. Maroon and Write attributes the significantly higher scores of MIWE students in the “problem” area, compared to non-MIWE students, to the emphasis placed on assignment design during the MIWE training program (refer to Table 2). However, three of the other rubric areas had lower scores than their non-MIWE counterparts.

Consistent with the rubric scores, the 2015-16 administration of the ETS Proficiency Profile included results for 155 students from MIWE classes. Maroon and Write felt it appropriate to report the score for all MSU seniors as it has in previous years, and also to identify the scores of students from MIWE classes. The General Education Subcommittee of the University Committee on Courses and Curricula (UCCC) requested that the transfer students be distinguished from the non-transfer students, which is also indicated in the scorecard. Students in MIWE classes had lower proficiency than non-MIWE students, and non-transfer students had higher proficiency than transfer students.

The writing coordinators identified the “thesis” and “support for thesis” components of the rubric as the areas most in need of improvement. They therefore implemented a series of workshops through the Center for Teaching and Learning to help faculty better understand these components and how best to improve them.

Table 2. Independent samples t-test analysis of 2015-16 MSU seniors' writing samples by MIWE students and non-MIWE students


Component     Group      N    Mean    Std. Deviation
Problem       MIWE       473  3.70*   .829
              Non-MIWE   390  3.55*   1.025
Thesis        MIWE       473  3.45    .951
              Non-MIWE   390  3.43    .996
Support       MIWE       473  3.40*   .807
              Non-MIWE   390  3.54*   .813
Word Choice   MIWE       473  3.34*   .769
              Non-MIWE   390  3.48*   .838
Correctness   MIWE       473  3.16**  .906
              Non-MIWE   390  3.41**  .922
*p<.05      **p<.001

Read more details about this year’s assessment in the 2015-16 IE Report


2016-17

Based on the 2015-16 results, Maroon & Write emphasized thesis statements in the 2016 MIWE class. This emphasis resulted in significantly higher scores in this writing area for MIWE students than for non-MIWE students. A good thesis relies on a good introduction to the problem or the context surrounding the paper itself, and the related problem-statement component was also significantly higher in papers generated from MIWE classes. The remaining writing components were higher for MIWE students than for non-MIWE students, but not significantly so.

Table 3. Independent samples t-test analysis of 2016-17 MSU seniors' writing samples by MIWE students and non-MIWE students

Component     Group      N    Mean     Std. Deviation
Problem       MIWE       146  3.24*    .841
              Non-MIWE   171  3.03*    .690
Thesis        MIWE       146  3.34***  .995
              Non-MIWE   171  2.94***  .791
Support       MIWE       146  3.17     .800
              Non-MIWE   171  3.04     .719
Word Choice   MIWE       146  3.11     .753
              Non-MIWE   171  2.99     .715
Correctness   MIWE       146  2.94     .896
              Non-MIWE   171  2.88     .818
*p<.05      ***p<.001


Maroon & Write has observed that writing scores are declining in all writing areas for all class types. Students from MIWE classes do have higher averages than non-MIWE students, but those averages are lower than those of previous years’ cohorts. Maroon & Write is now interested in evaluating whether the decline is related to students’ actual performance or to the quality of the writing prompts. The graders of the writing samples have noted that the rubric itself has several flaws that lead to inconsistency in scoring. For this reason, a new rubric was created in 2016 and tested among all of the Maroon & Write graders. Both the old and the new rubrics will be used to evaluate the writing samples to provide continuity as the program shifts from one rubric to the next. As of this update (April 2018), the new rubric scores were not available because most of the grading occurs during the summer. Those scores will be added to this website as soon as they are available.


2017-18

Maroon & Write underwent transition during the 2017-18 year. For the first time, the MIWE summer workshop was held in May instead of June, which meant the previous year’s grading had not been finished; MIWE was therefore informed not by the most recent data but by data from 2015-16. During MIWE, more focus was placed on fine-tuning the assignment, working on adding a thesis, and helping faculty connect learning outcomes to the formal writing assignment. The change to May was nonetheless positive, as the timing allowed more faculty to participate than in past years. In response to the scheduling change, Maroon & Write implemented a new grading model so that most of the grading of the fall 2017 semester occurred during the spring 2018 semester. Grading will continue through the summer 2018 semester; however, initial data analysis can be used to inform the 2018 MIWE class.

Maroon & Write also experienced several personnel changes during this year. Several faculty members from the MIWE class were promoted to administrative positions and were unable to complete the courses they had signed up to revise during summer 2017. One of the Writing Coordinators was reassigned to a different department within the university, and one of the co-directors of Maroon & Write was promoted to a different administrative level and is no longer able to retain her QEP-related roles.

Feedback from the past several years of MIWE faculty has indicated that they would like to see more work on rubrics. Maroon & Write has already developed the syllabus for the summer 2018 class, and a day has been dedicated to helping faculty fine-tune their learning outcomes and then develop associated rubrics. This discussion also explains how Maroon & Write will evaluate the writing samples using a common rubric focused on writing skills rather than course content.


2018-19

The final year of Maroon & Write saw the largest MIWE class since 2014-15, with 14 faculty members participating in the workshop. These faculty were supported by four Writing Coordinators, three of whom were new additions to Maroon & Write. The new Writing Coordinators all had experience as graders for Maroon & Write, and their familiarity with the program and the revised rubric enabled them to provide a high level of support for the MIWE faculty in the final year. In particular, MIWE students experienced growth in the Evidence and Conclusion categories of the rubric, which reflects the focus on the rubric in the 2018 MIWE class.

Rubric Component   2017-18   2018-19
Evidence           2.38      2.54
Conclusion         2.26      2.32


Similar to previous years, MIWE faculty found that introducing writing in their courses led to more engaged students and that incorporating strategies such as scaffolding and reflective writing improved the quality of students' formal writing assignments. While 2018-19 was the final year of the QEP, the work of Maroon & Write will continue through the efforts of the MSU Department of English.

Assessment Instruments for Student Learning Outcomes

Maroon and Write has developed or identified four instruments for measuring the impact on student learning: Maroon and Write Rubric, ETS Proficiency Profile exam, NSSE survey, and focus groups.

Maroon and Write Rubric

MSU faculty developed the Maroon and Write Rubric as a direct, internal measure of student learning. The rubric contains five components:

  • Background, context, or problem
  • Thesis
  • Support for the Thesis
  • Word Choice and Sentence structure
  • Correctness

Five levels of proficiency measure the students’ performance in each of the five components:

1 = Poor
2 = Acceptable
3 = Good
4 = Excellent
5 = Superior


Starting in 2014-15, Maroon and Write has employed two teams of two graders each summer to apply the rubric to all student writing samples. Each grader reviews a writing sample individually and then compares scores with his or her partner. The pair then arrives at a validated score for each writing sample, which is logged in an Excel spreadsheet. If the pair’s scores differ by more than one point on the rubric, a third grader is asked to score the assignment to determine the validated score. This process reduces variation among graders and ensures a more accurate evaluation of writing skills.
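The dual-grader procedure can be sketched as a small decision rule. The sketch below is purely illustrative: the function name is hypothetical, and the report does not specify how the team turns two (or three) scores into a validated score, so averaging is an assumption labeled as such in the comments.

```python
def validated_score(grader_a: int, grader_b: int, third_grader=None) -> float:
    """Return a validated rubric score (1-5 scale) for one writing sample.

    Hypothetical sketch of the dual-grader process: if the two independent
    scores are within one point, they are reconciled (averaged here, as an
    assumption); otherwise a third grader arbitrates.
    """
    if abs(grader_a - grader_b) <= 1:
        # Scores agree closely; reconcile the pair (averaging is assumed).
        return (grader_a + grader_b) / 2
    if third_grader is None:
        raise ValueError("Scores differ by more than one point; a third grader is needed")
    # Arbitration: pair the third score with whichever original is closer to it.
    closer = min((grader_a, grader_b), key=lambda s: abs(s - third_grader))
    return (closer + third_grader) / 2
```

For example, `validated_score(4, 3)` reconciles without a third grader, while `validated_score(5, 2)` raises an error until a third score is supplied, mirroring the escalation rule described above.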

ETS Proficiency Profile

The ETS Proficiency Profile is a standardized, external instrument that directly measures students’ performance in writing, mathematics, critical thinking, and reading. The abbreviated version of the test contains 36 questions, nine for each skill area, and students have 40 minutes to complete the exam. The writing section measures students’ knowledge of grammar, language organization, and figurative language. The results indicate the percentage of students scoring proficient, marginal, or not proficient at each of three levels of understanding, which demonstrate a student’s ability to do the following:

  • ETS Level 1: recognize grammar and word usage
  • ETS Level 2: build upon simple components of writing and incorporate those simple components into more complex sentence structures
  • ETS Level 3: recognize how complex sentences work together for parallelism, idiomatic language, correct constructions, and reduction in redundancy


Each year, the Office of Institutional Research and Effectiveness administers a paper-and-pencil exam to freshmen and seniors to track the institution’s progress in these skill sets. The institution then compares its results to a group of 10 peer institutions to establish a peer-level benchmark. MSU has performed below its peers in writing, but the gap is narrowing as the Maroon and Write program progresses.

National Survey of Student Engagement

The National Survey of Student Engagement (NSSE) represents an indirect, external measure that has been standardized with hundreds of four-year colleges and universities. The NSSE measures students’ engagement with coursework and studies and how the university motivates students to participate in activities that enhance student learning. This survey is used to identify practices that institutions can adopt or reform to improve the learning environment for students.

Each year, the Office of Institutional Research and Effectiveness deploys the online survey to freshmen and seniors. The institution then compares its results to a group of peers from the same Carnegie classification, from a peer group determined by the NSSE examiners, and from a group of peers that MSU has identified.

Focus Groups

Maroon and Write engages students and faculty in focus groups. Discussion pertains to their experiences with writing in their classes and the engagement with course content. The co-directors assemble these groups and coordinate the discussions each spring.