Even though the new online student course evaluations will open to students soon, the Faculty Senate remains nervous about their effects on faculty members' annual teaching evaluations.
The Faculty Senate held a scheduled meeting on Nov. 11 to discuss, among other things, how the results from the Nov. 18 online course evaluations will be used. These evaluations affect a professor’s chance for promotion or tenure at the university.
“We want to make sure that this is not used to penalize faculty in their annual evaluation exercise,” Secretary of the General Faculty Amy Jasperson said to the Faculty Senators.
The Faculty Senate spent a large portion of its meeting time working on a resolution addressing instructor evaluations and the use of results from online course evaluations. The resolution would protect faculty from punitive use of early results from online course evaluations until the survey’s reliability can be verified; however, Faculty Senators immediately asked the question: “When will that be?”
The student course evaluation has been a faculty reference tool for many years; however, the surveys were usually administered in a paper format during class time. With the expansion of the Internet, course surveys have been moving online. The new format provides several advantages for the university, including lower cost, quicker feedback and less class time spent on surveys.
The issues begin to surface when online response rates are compared with paper survey response rates. A study conducted at Brigham Young University (BYU) in 1997 concluded that response rates rise significantly with student awareness and incentives. The researchers reported that 87% of students completed the survey when instructors assigned extra points for completing it; 77% completed it when it was assigned without extra points; 32% completed it when informally encouraged by instructors; and 20% completed it without any mention of it by faculty or staff.
Instead of assigning the survey with the opportunity for extra points, UTSA is attempting an alternative positive incentive approach by conducting a lottery with several iPads as prizes. This may have some effect on student response, but according to the BYU study, the largest influence on response rates is simply student awareness of what the surveys are, the impact of their use and how to complete them.
However, other methods are available in case the online survey does not produce results faculty members are comfortable with. Faculty Senator Macneil Shonle suggested a paper survey to supplement the online course survey.
“[A] department can conduct its own evaluations on paper, and that can count toward teaching evaluations,” Shonle said. “That’s one way to get the data when the online data is not reliable.”
The English Department will continue using a paper survey — the short-answer format it used with the old paper surveys — and this paper portion will still be filled out during class time. Other departments may add paper surveys of their own until the online version demonstrates consistent reliability. So how many semesters’ worth of online course surveys will be needed before UTSA can judge the reliability of the results?
“There is no way to answer that, but in the meantime, until we get to that point, faculty members are protected from punitive use of such data,” Jasperson said.
The Faculty Senate was unable to fully ratify the resolution due to time constraints.