"We accumulate our opinions at an age when our understanding is at its weakest." ... G.C. Lichtenberg
Commentary of the Day - July 8, 2002: Give Us This Day Our "Daily Jolt.com," or Student Evaluations Hit Rock Bottom. Guest commentary by Sanford Pinsker.
I've always been, let us say, ambivalent about efforts to quantify student evaluations. Born in the heat of the tumultuous late 1960s and early 1970s, questionnaires that sought to measure teaching effectiveness were, first of all, an expression of student power and, second, a way to give teaching a greater share of the larger evaluation pie. Students demanded a say in which of their professors were retained, given tenure, or promoted -- and administrators were quick to grant the request.
What followed, however, were often ill-conceived efforts to ask the right questions. At my college students were polled about which courses had "changed their lives" (something unlikely to happen four times a semester, much less thirty-two times during a college career), and woe to the professor who came up with a poor score. The same was true of what seemed an entirely reasonable question -- namely, "Did your professor hand back written work within a reasonable period?" The rub came with the word "reasonable," which some students defined as, say, within a week, while others insisted it meant the next class meeting. One student went so far as to expect that his written work be handed back at the end of the period in which it was turned in, although nobody was sure how that could be done and still have the professor in question actually hold class. Because of this last fellow, the question now defines "reasonable" as a period "normally within two weeks."
Teaching questionnaires can, of course, be tinkered with, and that has been the case at my college. But no matter how the items change, what seems to matter most is whether or not the students like their professor. If they do, their professor is likely to get high marks for "expertise" or "demeanor"; if they don't, a hammer gets dropped.
In the best-case scenario, deans and department chairpersons quickly learn how to read between the lines of student responses, and to distinguish between those professors who cling to high standards and those who give away the farm. But many professors, tenured and untenured alike, are not so confident. Because most evaluations are computer scored, what one ends up with is a number -- and higher numbers are better than lower ones. For a professor forced to decide between giving a student a B- or a C+ on a paper, the choice is all too clear.
Small wonder, then, that I have my doubts about institution-wide student evaluations -- not only because many students don't take them seriously, but also because I suspect a case can be made that they have done more harm than good. Like Willy Loman in Arthur Miller's play "Death of a Salesman," too many professors want, above all, to be "well liked" -- and given the way that evaluation numbers dictate who will stick around and who will get their walking papers, can anybody -- including me -- blame them?
We've lived with this unfortunate condition for a very long time, and there are no signs that it is likely to change in the near future. What colleges and universities will do is what they've always done -- namely, form committees (in some cases, whole task forces) that will add new questions, drop old ones, and the beat will beat on. But all this hectic activity misses the real point: to borrow an analogy from Mark Twain, the difference between real teaching and teaching evaluations is akin to the difference between lightning and a lightning bug.
What I have no doubts about, however, are the informal, student-generated rankings that spring up from time to time, sometimes in hard print, but increasingly on the Internet. At present, dozens of colleges and universities from Alfred to Wilkes share a web site called "dailyjolt.com". All manner of information about the job market, school activities, even daily dining hall menus is posted, but nothing has garnered more attention than a feature called "Rate My Professor" in which students are invited to comment on categories such as "expertise," "helpfulness," "ease of grading," and "sexiness." The last item can earn the professor in question a chili pepper, to indicate that he or she is hot. The results for other categories are tabulated and each professor receives the appropriate mark: a smiley face for the good, a just-plain face for the average, or a frowny face, reserved for those who collected too many knocks.
Let a few examples stand for many: at my college an openly gay professor received a chili pepper because one student (it only takes one) thought he had "the sweetest ass cheeks in town, baby." In the same way, it takes only one disgruntled student to sink somebody's chances for a smiley face. A colleague of mine was singled out as somebody who taught in such a way that "If you don't love English... you're dead." -- this for a course in Introduction to Literature. Professors teaching intro courses in other departments often met with similar criticisms. And then there was the art professor who got lambasted because "Her philosophy: a C equals average, so most people receive C's."
Granted, not all students are pissed off about the teaching they've received, but even those who think their professor's expertise is "awesome" are not really in a good position to make that judgment. The point, of course, is that getting one's daily jolt is a way of viewing the uncensored, unofficial student mind -- and it is often not a pretty sight.
What can, what should, be done about "dailyjolt.com"? Nothing, absolutely nothing. But students should know that this self-indulgent, wildly unscientific venture does little to help the cause of student power, and may, in fact, weaken it. Those who make decisions about who gets tenure or promotions have more important things to ponder than butt cheeks, sweet or otherwise.
©2002 Sanford Pinsker
Sanford Pinsker is the Shadek Professor of Humanities at Franklin and Marshall College.
The IP notes that the student evaluations that Professor Pinsker refers to in his commentary do not reside directly on the "dailyjolt.com" web site. The "Rate My Professor" link on Daily Jolt takes one to another site, "RateMyProfessor.com".
As Sanford notes, the ratings on the RateMyProfessor.com site can be grossly misleading. Most are based on a very small sample of students.
©2002 Dr. Mark H. Shapiro - All rights reserved.