In a two-year span, Temple’s Student Feedback Forms Committee has turned the near-mandatory, hard-copy instructor evaluation process into an incentive-laden online experiment.
The challenge faced when online student feedback forms were introduced in Summer 2012 is no different two years later: boosting student response to an optional online survey.
“We’re trying to get our response rate up,” said Gina Calzaferri, a manager of assessment and evaluation and a member of the SFF committee. “We’re working with Temple’s strategic marketing and communications group in order to review our communications to students. We had a meeting with them [Friday] and talked about ways to improve our emailing, ways to use different technologies with emails.
“If we use two different kinds of messages, is there one message students are responding to more?” Calzaferri said. “We can tailor our email communications to that kind of messaging and reach more students.”
Senior Vice Provost for Undergraduate Studies Peter Jones said Temple saw a response rate of higher than 70 percent for the hard-copy SFFs before the committee decided to make the jump to an online operation.
The response rate of an optional online teacher evaluation was a pressing concern in 2012, when administrators were considering switching the surveys to an online form. Jones said then-Acting President Richard Englert wanted a response rate of at least 50 percent in the program’s second year.
“We presented a proposal to [Englert] and he came back to us and said he would be happy for us to make the change if we could achieve at least a 50 percent response rate,” Jones said. “[Englert] said he would be OK if we took about two years to do that. He understood during the transition that we might not achieve that range immediately.”
The inaugural e-SFF forms drew a 51 percent response rate after the Fall 2012 semester before dipping to 50 percent after the Spring 2013 semester, meeting Englert’s benchmark.
Since the initiative’s first year, the committee has added an incentive: teacher evaluation data is released to students who fill out the online evaluations for each of their classes toward the end of the semester.
The 2013-14 academic year brought slightly improved numbers, with response rates of 55 percent in both semesters.
Evaluation scores are grouped into upper, middle and lower levels. “Strongly agree” answers on the multiple-choice section fall into the upper portion of the scale, “disagree” and “strongly disagree” answers are classified as lower-level and “neutral” answers make up the middle level.
Though the program’s senior advisor, Jim Degnan, said the average overall feedback score tends to hover around 4.0 on a 1.0-to-5.0 scale, he stressed the need for more student feedback, as the evaluations weigh heavily in an instructor’s standing with the university.
“It’s a useful tool for dialogue,” Degnan said. “It’s a formative type of evaluation. In most cases, we want this to be a self-corrective form. If a faculty member falls into the lower level once, that’s something where a conversation probably needs to happen.”
“It is part of the tenure review process, too,” Degnan added. “When people come up for promotion and tenure, they need to demonstrate teaching effectiveness in some way. So, the reports are sometimes bundled with the promotion and tenure package.”
Based on the numbers provided by Calzaferri and Degnan, answers to the four multiple-choice questions on the e-SFF averaged 54.6 percent upper-level, 39.1 percent middle-level and 6.3 percent lower-level.
“The numbers have gone up,” Degnan said. “That doesn’t necessarily mean the teaching has improved, but it is a symptom of that. So, that makes us feel good.”
Andrew Parent can be reached at firstname.lastname@example.org and on Twitter @daParent93.