February 11—last Friday—was the day the music man died. A high-level hustler walked back an old tale. Nick Anderson reported the action in that day’s Washington Post:
ANDERSON (2/11/11): At issue is a line in Rhee's resume from that year that described her record at Harlem Park Elementary School: "Over a two-year period, moved students scoring on average at the 13th percentile on national standardized tests to 90 percent of students scoring at the 90th percentile or higher."

On Wednesday evening, Rhee said she would revise that wording if she could. "If I were to put my resume forward again, would I say 'significant' gains?" Rhee said. "Absolutely."

Sic semper con men! At some point in the past four years, Rhee had softened another bogus claim—the claim that she won “acclaim” in the Wall Street Journal and on Good Morning America for her “outstanding success” as a teacher. Now, the whole house of cards came down. Rhee acknowledged that she shouldn’t have made that detailed claim about her miracle—the detailed claim around which her ministry for “education reform” has long turned.
I should have said “significant gains,” Rhee now said, offering a mushy-mouthed, watered-down claim—the type of claim that wouldn’t have helped her advance her gong-show career.
(Would that claim have been reasonably accurate? At this point, it’s still unclear.)
Good lord! What had happened? What had led the music woman to revise her sacred tale? Earlier in his report, Anderson offered the outline:
ANDERSON: Former D.C. schools chancellor Michelle A. Rhee, known for her crusade to use standardized test scores to help evaluate teachers, is facing renewed scrutiny over her depiction of progress that her students made years ago when she was a schoolteacher.

A former D.C. math teacher, Guy Brandenburg, posted on his blog a study that includes test scores from the Baltimore school where Rhee taught from 1992 to 1995. The post, dated Jan. 31, generated intense discussion in education circles this week. In it, Brandenburg contended that the data show Rhee "lied repeatedly" in an effort to make gains in her class look more impressive than they were.

Rhee, who resigned last year as chancellor, denied fabricating anything about her record and said Brandenburg's conclusion was unfounded. But she acknowledged this week that she could have described her accomplishments differently in 2007, when then-Mayor Adrian M. Fenty (D) selected her to be chancellor.

Rhee “could have described her accomplishments differently!” Go ahead! Treat yourself to a good, long, rueful laugh!
At any rate, Anderson had the story right. Brandenburg had reviewed an important study from August 1995—a study commissioned by the Baltimore City School System, which was attempting to decide whether to continue a three-year experiment in public school privatization. (To read his post, click here.) Over the previous three years, a private entity, Education Alternatives Inc. (EAI), had been running nine of Baltimore’s public schools. An institute at University of Maryland Baltimore County (UMBC) had been commissioned to study the seven elementary schools involved in this ongoing project. This was a major, high-profile effort, closely watched by the Baltimore press.
Brandenburg was thus reviewing the data from a very high-profile study. (To review the full study, click this.) And uh-oh! One of EAI’s elementary schools was Harlem Park Elementary, the school where Rhee was working her miracles as a third-grade teacher! By a tumble of fate, Rhee’s miraculous, three-year career in the classroom coincided with the three years reviewed in the UMBC study. The study was released in August 1995, just two months after the end of Rhee’s final year in the classroom—a year which would become famous for what it helped us see about “educational reform.”
Land o Goshen! UMBC had studied the very school where Rhee was teaching—had studied it during the very year in which her miracle occurred! But what did this high-profile study seem to show? Anderson went there too:
ANDERSON: The study Brandenburg posted, published in 1995 by researchers with the University of Maryland Baltimore County and Towson University, is stored in an online federal archive. It drew a small amount of attention in 2007. Now it is getting a fresh look.

The study found modest, uneven gains in various grade levels at the school in a review of results from the Comprehensive Test of Basic Skills. There were no separate results for Rhee or any other Harlem Park teacher. The study also noted that many students at the struggling Baltimore school were not tested.

But the results were presented in enough detail to raise questions about whether any single class could have made strides of the magnitude Rhee depicted on her resume.

Rhee said she taught second grade for two years, then third grade in 1994-95. In that year, Rhee said, her class made a major leap in achievement.

The study found that third-graders overall at the school made gains that year in reading and math. But they finished nowhere near the 90th percentile.

Oof! According to the UMBC study, Harlem Park’s third-graders “finished nowhere near the 90th percentile” in the 1995 testing. Allegedly, this was the year in which Rhee had completed her miracle cure, with “90 percent of [her] students scoring at the 90th percentile or higher.”
“The 90th percentile or higher,” Rhee had always said. Presumably, that meant that her students had recorded scores ranging from the 90th percentile all the way up to the 99th. As Anderson suggested, there is no sign in the UMBC study that anything like this actually occurred. That said, there are some oddities in the UMBC study which should be observed at this time.
But first:
At this point, we should note another fact. This UMBC study was fully reported in the Washington Times when Rhee arrived on the DC scene in June 2007. In precise detail, the Times’ Gary Emerling described the study, reporting the mediocre scores it ascribed to Harlem Park’s third-graders. He even noted that the scores bore no relation to Rhee’s self-glorying claims (see THE DAILY HOWLER, 7/9/07). Back in 2007, everyone knew about the UMBC study—but the Washington Post chose not to discuss it. Despite Emerling’s report in the Washington Times, the Washington Post never mentioned the study; it never reported the awkward data found in this major research effort. Last Friday morning, the Post was four years late to this party—and Rhee was retracting the sacred tale upon which she has based her influential “reform” ideas.
Back to that high-profile study, and to some of its curious aspects:
The most remarkable part of that study didn’t involve students’ test scores. The most remarkable part of the study involved the astonishingly low rates at which students had actually been tested. The study “noted that many students at [Harlem Park Elementary] were not tested,” Anderson reported last Friday. That statement is perfectly accurate, but it barely scrapes the surface of this study’s most puzzling aspect. Good God! According to the study, only 62 percent of Harlem Park’s students were actually tested that year in math; only 64 percent were tested in reading. (These numbers were not broken down by grade level.)
Can these data really be accurate? Please understand: The testing program in question (the Comprehensive Test of Basic Skills/CTBS) was mandated by the state of Maryland. Beyond that, CTBS results had been hotly debated the previous year with respect to the seven elementary schools which made up the privatization experiment. This was a very high-profile annual program; every student should have been tested, except small numbers of students in a few categories, including kids in special education. (According to the study, only three percent of Harlem Park’s students were in special ed.) It’s bizarre to think that Harlem Park failed to test almost forty percent of its students; equally strange is the way the UMBC researchers took these remarkable data in stride. In fact, non-tested rates were remarkably high at all seven EAI schools, though none of the other schools even approached the rates at Harlem Park. (According to the study, Harlem Park tested 62 percent of its students in reading. The rate at the other six EAI schools was 85 percent.)
Beyond that, non-tested rates were remarkably high at the study’s “comparison schools”; we’re amazed by the researchers’ insouciance about these remarkable data. “The percentage of students for whom 1994-1995 CTBS test scores were recorded in this document is only 75 to 80 percent of the students enrolled in a school,” they write at one point, using the jumbled, imprecise English which typifies studies of this type, undermining faith in the researchers’ competence. “Thus the scores for 20 to 25 percent of the students were not reflected in the published scores for a grade or school.” Those numbers vastly understate the situation at Harlem Park, of course. But does that mean those other “20 to 25 percent of the students” actually had test scores which simply weren’t “reflected in the published scores”? We have no idea; uncertainty lurks in this report because of the way the researchers dealt with these remarkable data—data which seem to show vast numbers of students not getting tested. Why would anyone look at those data and not send up an alarm? Why would Baltimore City accept such low rates of testing? Why would the state of Maryland?
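For readers who want the scale of the problem laid out, here is a minimal sketch of the arithmetic, using only the percentages quoted above. The labels are ours, not headings from the study’s tables, and the underlying enrollment counts aren’t broken out in the passages we have quoted:

```python
# A minimal sketch of the testing-rate arithmetic discussed above.
# The tested-rate percentages are the figures quoted in this post;
# the labels are ours and are not taken from the study's tables.
tested_rates = {
    "Harlem Park, math": 62,
    "Harlem Park, reading": 64,
    "Other six EAI schools, reading": 85,
    "Comparison schools, low end": 75,
    "Comparison schools, high end": 80,
}

for label, tested in tested_rates.items():
    untested = 100 - tested  # share of enrolled students with no recorded score
    print(f"{label}: {tested}% tested, {untested}% not tested")
```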
If this study means what it seems to mean, lots of students weren’t getting tested in a lot of Baltimore schools that year. And uh-oh! Nowhere were fewer kids getting tested than in Grade 3 at Harlem Park, where Rhee was performing her miracle. At this point, let’s consider the version of her miracle tale Rhee told Newsweek’s Evan Thomas in 2008. In this recounting of her story, Rhee claims she was teaching 70 kids, a specific claim she had made in the past. In this passage, Rhee describes her last two years of teaching, the years when her miracle happened:
THOMAS (9/1/08): Over the next two years, working with another teacher, she took a group of 70 kids who had been scoring "at almost rock bottom on standardized tests" to "absolutely at the top," she says. (Baltimore does not keep records by classroom, so NEWSWEEK was unable to confirm this assertion.) The key to success was, in her word, "sweat," on the part of the teacher and the students. "I wouldn't say I was a great teacher. I've seen great. I worked hard," says Rhee.

She had an epiphany of sorts. In the demoralized world of inner-city schools, it is easy to become resigned to poor results—and to blame the environment, not the schools themselves. Broken families, crime, drugs, all conspire against academic achievement. But Rhee discovered that teachers could make the critical difference. "It drives me nuts when people say that two thirds of a kid's academic achievement is based on their environment. That is B.S.," says Rhee. She points to her second graders in Baltimore whose scores rose from worst to best. "Those kids, where they lived didn't change. Their parents didn't change. Their diets didn't change. The violence in the community didn't change. The only thing that changed for those 70 kids was the adults who were in front of them every single day teaching them."

In telling her inspiring tale, Rhee has often said that she co-taught 70 kids. Multiplying by 90 percent, that would mean that 63 of those kids ended up “scoring at the 90th percentile or higher”—in the top ten percent of the nation. According to Rhee, 63 third-graders at Harlem Park turned in those miracle scores that year. But uh-oh! In that high-profile UMBC study, there is no sign that anything dimly like that actually occurred at that school.
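The arithmetic behind that figure of 63 is simple enough; here is a minimal sketch, using only the numbers Rhee herself has cited (the variable names are ours):

```python
# The two figures Rhee has cited: 70 co-taught kids, and "90 percent of
# students scoring at the 90th percentile or higher."
claimed_class_size = 70
claimed_top_share = 0.90

# How many children the resume claim implies were scoring in the top
# ten percent of the nation by the end of third grade.
implied_top_scorers = round(claimed_class_size * claimed_top_share)
print(implied_top_scorers)  # 63
```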
Were Rhee and her fellow teacher working with 70 students? At this point, the UMBC study completely breaks down, seeming to offer a pair of contradictory numbers. At one point, the study seems to say that a total of 43 third-graders were tested in reading at Harlem Park that year; at another point, it seems to say that 56 “two-year students” were so tested at the third-grade level. (Those would be third-grade students who had been in the school for at least two years.) The numbers seem to be contradictory. Using our math and logic skills, we can see that if 56 “two-year” third-graders were tested, it can’t be true that only 43 third-graders were tested in all. For no other school can we find this type of contradiction in the UMBC data. This may mean that the researchers made some sort of transcription error at this point in their work.
(Or it could always mean something stranger.)
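The contradiction can be stated as a simple consistency check; here is a minimal sketch, again using only the two figures the study seems to give:

```python
# "Two-year" third-graders are a subset of all third-graders tested,
# so the subset count can never exceed the total count.
total_third_graders_tested = 43      # one figure the study seems to report
two_year_third_graders_tested = 56   # the other figure it seems to report

if two_year_third_graders_tested > total_third_graders_tested:
    print("Inconsistent: the subset is larger than the whole group.")
```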
At any rate, Rhee has boasted, bragged and bellowed about her success with 70 students whom she taught for two straight years, in their second- and third-grade years. If we can trust this study’s data, only 56 “two-year” third-graders were tested at Harlem Park that year—and their performance on the CTBS doesn’t begin to resemble the claim bruited by Rhee through the years.
How well did those 56 students perform? In reading, those 56 children achieved a “normal curve equivalent” of 45—the rough equivalent, the study says, of perhaps the 42nd percentile. In short: As a group, those 56 third-graders scored below the national norm. There is no sign that any significant number of kids scored at or above the 90th percentile. And although this UMBC study has some obvious shaky elements, might we make an observation—an observation which is blindingly obvious?
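Before we turn to that observation, a note on the “normal curve equivalent” figure. NCEs are conventionally defined on a normal-curve scale with a mean of 50 and a standard deviation of roughly 21.06, which lets an NCE be converted into an approximate national percentile. The sketch below applies that standard conversion; it is our back-of-the-envelope check, not a figure taken from the study.

```python
from statistics import NormalDist

# Standard NCE scale: mean 50, standard deviation 21.06, chosen so that
# NCEs of 1, 50 and 99 line up with the 1st, 50th and 99th percentiles.
NCE_MEAN, NCE_SD = 50.0, 21.06

def nce_to_percentile(nce: float) -> float:
    """Convert a normal curve equivalent to an approximate national percentile."""
    z = (nce - NCE_MEAN) / NCE_SD
    return 100.0 * NormalDist().cdf(z)

print(round(nce_to_percentile(45), 1))  # about 40.6, i.e. roughly the low 40s
print(round(nce_to_percentile(50), 1))  # 50.0, the national norm
```

That lands in the same general ballpark as the study’s rough translation to “perhaps the 42nd percentile,” and either way it sits well below anything resembling the 90th.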
In the 1994-1995 school year, the seven schools run by EAI were under enormous pressure. During and after the previous year, major disputes had broken out about the low test scores of the EAI schools; by the fall of 1994, everyone knew that the pressure was on, that the plug might be pulled on the program. (As a simple Nexis search will show, all these matters were being discussed in the Baltimore Sun.) Do we possess three brain cells among us? If any school in the EAI group had an educational miracle occurring, this glorious fact would have been shouted to the skies by EAI’s corporate leadership. Trust us: The teachers involved would have gained acclaim in the national media—the kind of “acclaim” Rhee used to say she had attained, before she realized she had to stop saying it. It’s absurd to think there was some large group of third-graders “scoring at the 90th percentile or higher,” but their test scores somehow never came to the attention of the UMBC researchers.
Was there a large group of third-grade students “scoring at the 90th percentile or higher?” Barring some truly bizarre occurrence, you’d have to be a fool to believe it—or you’d have to be a “journalist” at the Washington Post. Tomorrow, we’ll wander back through the years, recalling the way this paper’s education “journalists” have swallowed a wide range of stories.
This miracle tale predates Michelle Rhee. Our “journalists” have constantly bought it.
Tomorrow—part 4: The history of an old story