
MSU replaced the Student Instructional Rating System (SIRS) survey this summer with the new Student Perceptions of Learning Survey (SPLS). Conversations about a transition to a new system began several years ago due to limitations of SIRS Online, a homegrown solution in use since 2004. In addition, the provost’s office had received frequent requests to revise the SIRS instrument because of concerns about validity and reliability, reports of instrumentation bias, inappropriate use of data, and subjective use of SIRS results in personnel decisions such as salary adjustments, reappointment, promotion, and tenure. The transition to the SPLS is thus both a major technical upgrade and an opportunity to advance a culture that supports professional development and values high-quality teaching and learning.

The technical aspect of this transition is relatively simple, even for the technically “hesitant” among us. If we were just upgrading systems, we would import questions from the old SIRS into the new SPLS and carry on. But addressing issues like validity, reliability, bias, and use of data is complex and requires that we do some things fundamentally differently.

Reflection and Revision

The transition to the SPLS began with a review of the two-page SIRS policy, which had not been revised since 1979. With input from across the university in a broadly inclusive process, the nine-page Student Perceptions of Learning Environments Policy was adopted by university governance and the office of the provost. The name was changed from SIRS to SPLS to acknowledge that these end-of-course surveys represent student perceptions of, or experiences in, a course. Important to the culture shift currently underway is full recognition that such surveys are not objective measures of learning or evaluations of the quality of teaching. As such, the SPLS should not be the only evidence used to evaluate instructors. When properly contextualized and paired with other forms of feedback, however, SPLS results can be useful for improving courses and rewarding quality teaching.

What are other sources of information to pair with SPLS data? Given that quality teaching is a crucial factor in evaluating personnel, SPLS data could be considered alongside evidence such as syllabi, course materials and student artifacts, narrative or reflective statements about teaching, a teaching portfolio, or the summary of a peer observation. Expanding the use of this kind of evidence at MSU is essential to advancing a culture that values teaching.

Mitigating Bias

Still, more must be done to address concerns, at MSU and nationally, about biases in student surveys, which are well documented in the research literature. It is tempting to throw student surveys out altogether rather than implement research-informed strategies for mitigating bias. We chose the latter, focusing on two recommended, evidence-based strategies: (1) exclude open-ended questions, and (2) present students with Likert-scale questions that are narrower in scope and represent characteristics of quality teaching. Implementing these strategies at MSU means there is no longer a reservoir of student comments in the “Institution 8” questions or a single mean score that purportedly represents the overall quality of the instructor. It also requires that annual evaluations and reviews for reappointment, promotion, and tenure be done differently, drawing on more authentic data for analysis. Unlike the technical aspects of a system upgrade, these changes in how we do things are not simple, quick, or easy, but they should address the strong concerns about the prior approach to evaluating instruction.

Another major change in this transition is that the SPLE Policy now requires that all survey questions be reviewed at least every five years. Reviews of the institution-level questions will include validation studies that examine each question for validity, reliability, and equity. Because colleges may add questions specific to courses offered in their units, similar periodic question review is expected at that level as well.

Shifting Toward Support

Lastly, student survey data are often unwittingly used in inappropriate ways. For example, those conducting reviews may be tempted to compare instructors to department or college mean scores to identify those whose performance was “better” or “worse.” Such comparisons are rarely valid or reliable, and they ignore the fact that student perception feedback is highly context-specific. Research shows, for instance, that students perceive they learn better in small classes, and survey results reflect that perception; comparisons between instructors teaching high- and low-enrollment courses are therefore rarely valid or equitable. Thus, an important aspect of this transition is providing support to evaluators so that SPLS data are used appropriately, both for course improvement and in personnel decisions.

The national discourse and research about student surveys suggest that reform of this approach to evaluating teaching is long overdue. As with other reforms, sustainable progress is usually more evolutionary than revolutionary. Given that the MSU policy had not been updated since 1979, some changes currently underway probably feel like seismic shifts. But a concerted and sustained focus on gathering and appropriately using student feedback will further advance Michigan State University’s commitment to broad and diverse approaches to innovative and effective teaching, and more authentically support educators whose employment and promotion rest significantly on the quality of their teaching. Read more on Innovative and Effective Teaching.
