Online course evaluations made permanent
After a one-year trial of the online course evaluations system, faculty members of the College have decided to continue to administer evaluations online. The faculty voted to make the change to electronic forms permanent on April 9.
On April 5, the Faculty Course Evaluation Committee presented its final report on this year’s trial of online course evaluations. The report described the overall efficacy of the new evaluation delivery system, which the committee determined by analyzing the differences between the responses collected on paper in previous years and those collected online in the past year. “The responses [online] were very close to what we were getting on paper,” Associate Professor of Mathematics and Chair of the Course Evaluation Committee George Welch said.
Welch said that the faculty had discussed the switch to electronic evaluations many times in the years preceding the trial. He noted that online evaluations are easier to collect, compile and analyze than those done on paper.
The Committee’s main concern regarding the transition was the quality of the data collected. The faculty wondered whether responses would be as meaningful if they were collected online. Tenure-track professors, whose positions at the College are highly influenced by course evaluations, were particularly concerned that the online evaluation responses would be insubstantial.
The Evaluation Committee conducted the trial from spring 2010 through January 2011. As an incentive to complete the online evaluations, students who used the new system received their grade reports over a week earlier than those who did not. While it is unclear whether this incentive prompted a higher response rate, many students found the earlier grade reports convenient.
The overall student response rate in both the spring and fall semesters was 86 percent. The overall response rate in January 2011 was slightly higher at 89 percent.
Seniors were less likely than students from other class years to fill out the forms, especially in the spring. Male students were also less likely to respond. In the course of conducting the trial, the Evaluation Committee learned that they must specifically target these groups in their efforts to encourage students to complete the evaluations in the future.
The Committee received some positive student feedback regarding the increased anonymity of online forms. Seventy-four of the 407 respondents expressed that they believed their professors could recognize their handwriting when they completed the paper forms. Although the new online system shows the administration which students have and have not filled out evaluations, it eliminated identifiable differences such as handwriting in individual evaluations.
The narrative responses were also much easier for the administration to read online than those written on paper.
“While we would like to increase the response rates,” the Evaluation Committee’s report concluded, “we feel that the current rates are good, and that the differences in the quality of the data received is not diminished to the degree that some may think.”