by Dr. Mark H. Shapiro
"Examinations are formidable even to the best prepared, for the greatest fool may ask more than the wisest man can answer." - Charles Caleb Colton
Commentary of the Day - April 19, 2004: In the trenches with the California High School Exit Exam. Guest commentary by Elise Vogler.
This March, sophomores enrolled in California public high schools took the state's High School Exit Exam. According to current law, each sophomore must pass both the English/Language Arts and Mathematics sections of the test in order to receive a diploma in 2006. This obviously qualifies the exam as a high-stakes test for the students, but the exit exam is equally high stakes for the high school administering it. Under the recent reauthorization of the Elementary and Secondary Education Act, otherwise known as the No Child Left Behind (NCLB) law, California high schools are judged as making "adequate yearly progress" or not based on the rate at which students take and pass the exit exam.
Any test which purports to judge both school quality and student achievement had better be up to the challenge. Furthermore, state policies regarding the administration of the exam must be rational and coherent.
Unfortunately, neither of these statements holds true in practice.
Don't misunderstand; when the exit exam requirement for graduation was first announced by the Davis administration, I was an enthusiastic supporter. I firmly believe that a high school diploma should mean something substantive, and that the only way to guarantee this was for the state to take control of exit testing. For more than twenty years, individual public school districts had been allowed to design and use their own "proficiency tests" to ascertain that students were suitable candidates for graduation. The results of this fragmented system were abysmal. Some districts officially set "high school proficiency" in reading at the 4th grade level, on the grounds that most newspapers were written at that reading level. The mathematics typically required was simple consumer math: calculation up through the manipulation of fractions, a skill that should have been mastered by 6th grade. Districts in California were not requiring anything resembling secondary school competence in their "minimal proficiency tests." Given that, it came as no surprise that the state would eventually impose from above a uniform standard of achievement for graduates to meet.
I thought the exit exam a good idea when it was first proposed in law. I still think it a good idea in principle. I believe that universities and employers have every right to expect that high-school graduates have high-school level skills in reading, writing, and mathematics.
However, the content and implementation of the exit exam leave much to be desired. Those who are not "in the trenches" have no way to know this, but high school teachers get a chance to see the test up close and personal. And here are some of the things we see.
The mathematics portion of the test is billed as having no more than 10% of items requiring algebraic calculations. It was promoted on this basis to the Legislature that approved it. However, both the principal and the high school teachers administering the exam (math teachers among them) were shocked three years ago to see that about 90% of the questions required algebraic understanding. There is nothing wrong with expecting graduates to have mastered algebra, of course, but to have the math test focus almost exclusively on that strand of mathematics is unwise, especially when the public announcements about the test paint a different picture. In subsequent years, the math content of the test has become less exclusively algebraic, yet it still remains far in excess of the 10% algebra originally touted.
There are also issues with the Language Arts portion of the test, particularly with the writing section. I teach sophomore English, and in the months leading up to the exam, my students relentlessly practiced the five writing tasks that are "fair game" for the exam: biographical narrative, expository composition, persuasive essay, literary response essay, and business letter. In all the domains of writing, I stressed one thing: on any high-stakes exam, you do not submit a first draft to be graded. You revise and improve, and submit a polished draft.
This was actually a daunting concept for many of my students to master. Not liking writing, their typical response to a writing task is to rush through a first draft, and then declare themselves "done." It took effort and diligence on my part, and on theirs, to get them to the stage where they would even proofread their essays. But we did it. By the time March came around, my students were routinely turning in third drafts to be graded, instead of letting me see their initial responses. I had a firm expectation that on the exit exam, they would proceed in the same way.
Imagine my dismay when I discovered that the test format itself precluded students from revising or rewriting. Students were given only one two-sided lined page on which to write their essays. This is barely enough space for one well-developed essay, let alone multiple drafts. My students came to me afterwards, utterly demoralized. "We tried to do what you always said, what we practiced," more than one said to me. "But they didn't give us room." There wasn't even any provision for the students to outline their response or jot down notes. And please understand, the use of scratch paper is explicitly forbidden. Students may not write on anything except the test booklet.
Some students reported to me that they found ways around it by searching out "This page intentionally left blank. Do not write on this page" pages, and using them to outline and draft initial responses, reserving the lined pages for their final versions. Others, not as creative or proactive, wrote their rough drafts on the lined paper and then asked for more paper. A perfectly reasonable request, one would think, but the test conditions demanded it be denied. One young lady, in despair that despite her intentions, her rough draft was destined to be graded, crumpled up her entire test booklet, threw it in the trash, and burst into tears.
This entire approach to testing writing puzzles me. Is it the intent of the Legislature that student writing ability should be judged on the basis of first drafts? Is it reasonable to expect a 10th grader not only to write a high quality essay on the first try, but also to do so without having mapped out ideas beforehand?
Expecting students to write well is reasonable, but I believe the test must be designed in a more responsible manner than what I've seen so far.
A last and serious issue with the exit exam is how the state is using it to judge schools. Under the No Child Left Behind Act, each year a designated percentage of sophomores must pass the exit exam. These percentages are currently quite low (under 20%) but are slated to rise every year. If a school can't manage to have enough of its students pass the exit exam, it is labeled as not making Adequate Yearly Progress.
At my school last year, far more than 20% of the students passed each section. In fact over 90% of the sophomores passed the Language Arts portion, and over 80% passed the mathematics portion. Despite this, my school was labeled as not making Adequate Yearly Progress. The problem was that, on one of the days the exit exam was administered, two students were home with the flu. Yes, that's right; you didn't misread. Two students were home sick.
California requires that 95% or more of sophomores must take both parts of the exit exam (it lasts two days). If the participation rate on either day falls below 95%, the school is automatically labeled as not making Adequate Yearly Progress. Since we have only 30 sophomore students at our school (I teach at a Small Necessary California High School), just two students missing the test will sink us no matter how well the test takers do.
The true problem isn't the participation rate, but the fact that there are no make-up days allowed on the exit exam. If you don't take it in March when it is administered statewide, then you must wait until the following October. By that time, your school has already been labeled as failing.
Just two students can destroy our participation rate, but in some schools in this area, one absent student will render the whole school as officially "failing." Yet it is not in the school's power to go drag sick students out of bed and make them take the test.
And lest readers assume that the participation rate requirement affects only small schools, which are anomalous in this state, let me share that the average daily attendance at many inner-city high schools does not exceed 90% on any given day. Asking for 95% on the test day is tantamount to asking the schools for a magic wand.
My principal and superintendent went to the State School Board and explained this problem in detail, but they got nowhere. Essentially, they were told that if our school is so small, we'd better make sure that nobody gets sick in March.
This is what it's like in the trenches with the California High School Exit Exam. I believe in testing, and I even believe that exit testing is a sound idea. But if anything should be labeled "failing," it is the current CAHSEE program.
© 2004, Elise Vogler
Elise Vogler (a pseudonym) has taught in public schools in California for over fifteen years. She has a B.A. in literature from the University of California at San Diego and an M.A. in humanities (specialty: history) from the California State University, Dominguez Hills.
The IP comments: Ms. Vogler's account of the problems with the California High School Exit Exam should resonate with teachers across the country. We have heard similar critiques from many readers who are K-12 teachers. It has long been the IP's view that exams are a necessary part of the educational process. However, the process for determining the progress of an individual student should be based on the cumulative results of many tests rather than on a single "high stakes" exam. Likewise, the process for determining the progress made by a school or school district should be based on sound statistics and sound data garnered from a variety of sources rather than from a single high-stakes exam.
© 2004 Dr. Mark H. Shapiro - All rights reserved.