How Schools Cheat
From underreporting violence to inflating graduation rates to fudging test scores, educators are lying to the American public.
On March 17, 2005, 15-year-old Delusa Allen was shot in the head while leaving Locke High School in Los Angeles; the wound put her in intensive care and eventually killed her. Four months before that, several kids were injured in a riot at the same school, and last year the district had to settle a lawsuit by a student who required eye surgery after he was beaten there. In 2000, 17-year-old Deangelo Anderson was shot just across the street from Locke; he lay dead on the sidewalk for hours before the coroner came to collect his body.
Violent crime is common at Locke. According to the Los Angeles Police Department, in the 2003-04 school year its students suffered three sex offenses, 17 robberies, 25 batteries, and 11 assaults with a deadly weapon. And that's actually an improvement over some past years: In 2000-01 the school had 13 sex offenses, 43 robberies, 57 batteries, and 19 assaults with a deadly weapon.
Sounds unsafe, doesn't it? Not in the skewed world of official education statistics. Under the federal No Child Left Behind Act, states are supposed to designate hazardous schools as "persistently dangerous" and allow their students to transfer to safer institutions. But despite Locke's grim record, the state didn't think it qualified for the label.
Locke is not unique. In the 2003-04 school year only 26 of the nation's 91,000 public schools were labeled persistently dangerous. Forty-seven states and the District of Columbia proudly reported that they were home to not a single unsafe school. That would be news to the parents of James Richardson, a 17-year-old football player at Ballou Senior High in Southeast Washington, D.C., who was shot inside the school that very year. It would be news to quite a few people: The D.C. Office of the Inspector General reports that during that school year there were more than 1,700 "serious security incidents" in city schools, including 464 weapons offenses.
Most American schools are fairly safe, it's true, and the overall risk of being killed in one is less than one in 1.7 million. The data show a general decline in violence in American public schools: The National Center for Education Statistics' 2004 Indicators of School Crime and Safety shows that the crime victimization rate has been cut in half, declining from 48 violent victimizations per 1,000 students in 1992 to 24 in 2002, the last year for which there are complete statistics.
But that doesn't mean there has been a decline at every school. Most of the violence is concentrated in a few institutions. According to the National Center for Education Statistics, during the 1999-2000 school year 2 percent of U.S. schools (1,600) accounted for about 50 percent of serious violent incidents--and 7 percent of public schools (5,400) accounted for 75 percent of serious violent incidents. The "persistently dangerous" label exists to identify such institutions.
So why are only 26 schools in the country tagged with it?
The underreporting of dangerous schools is only a subset of a larger problem. The amount of information about schools presented to the general public is at an all-time high, but the information isn't always useful or accurate.
Thanks to the No Child Left Behind Act, now three years old, parents are seeing more and more data about school performance. Each school now has to give itself an annual report card, with assessment results broken down by poverty, race, ethnicity, disability, and English-language proficiency. Schools also are supposed to accurately and completely report dropout rates and teacher qualifications. The quest for more and better information about school performance has been used as a justification to increase education spending at the local, state, and national levels, with the federal Department of Education alone jacking up spending to nearly $60 billion for fiscal year 2005, up more than $7 billion since 2003.
But while federal and state legislators congratulate themselves for their newfound focus on school accountability, scant attention is being paid to the quality of the data they're using. Whether the topic is violence, test scores, or dropout rates, school officials have found myriad methods to paint a prettier picture of their performance. These distortions hide the extent of schools' failures, deceive taxpayers about what our ever-increasing education budgets are buying, and keep kids locked in failing institutions. Meanwhile, Washington--which has set national standards requiring 100 percent of school children to reach proficiency in math and reading by 2014--has been complicit in letting states avoid sanctions by fiddling with their definitions of proficiency.
The federal government is spending billions to improve student achievement while simultaneously granting states license to game the system. As a result, schools have learned to lie with statistics.
Prospering Cheaters
Under No Child Left Behind, if schools fail to make adequate yearly progress on state tests for three consecutive years, students can use federal funds to transfer to higher-performing public or private schools, or to obtain supplemental education services from providers of their choice. In addition, schools that fail for four to five consecutive years may face state takeovers, have their staffs replaced, or be bid out to private management.
Wesley Elementary in Houston isn't a school you'd expect to be worried about those threats. From 1994 to 2003, Wesley won national accolades for teaching a majority of its low-income students how to read. Oprah Winfrey once featured it in a special segment on schools that "defy the odds," and in 2002 the Broad Foundation awarded the Houston Independent School District a $1 million prize for being the best urban school district in America, largely based on the performance of schools like Wesley.
It turned out that Oprah was righter than she realized: Wesley was defying the odds. A December 31, 2004, exposé by The Dallas Morning News found that in 2003 Wesley's fifth-graders performed in the top 10 percent in the state on the Texas Assessment of Knowledge and Skills (TAKS) reading exams. The very next year, as sixth-graders at Houston's M.C. Williams Middle School, the same students fell to the bottom 10 percent.
The newspaper obtained raw testing data for 7,700 Texas public schools for 2003 and 2004. It found severe statistical anomalies in nearly 400 of them. The Houston, Dallas, and Fort Worth districts are now investigating dozens of their schools for possible cheating on the TAKS test. Fort Worth's most suspicious case was at A.M. Pate Elementary. In 2004, Pate fifth-graders finished in the top 5 percent of Texas students. In 2003, when those same students were fourth-graders, they had finished in the bottom 3 percent.
In the Winter 2004 issue of Education Next, University of Chicago economist Steven D. Levitt and Brian A. Jacob of Harvard's Kennedy School of Government explored the prevalence of cheating in public schools. Using data on test scores and student records from the Chicago public schools, Jacob and Levitt developed a statistical algorithm to identify classrooms where cheating was suspected. Their sample included all student test scores in grades 3-7 for the years 1993 to 2000. The final data set contained more than 40,000 "classroom years" of data and more than 700,000 "student year" observations. Jacob and Levitt's analysis looked for unexpected fluctuations in students' test scores and unusual patterns of answers for students within a classroom that might indicate skullduggery.
They found that on any given test the scores of students in 3 percent to 6 percent of classrooms are doctored by teachers or administrators. They also found some evidence of a correlation of cheating within schools, suggesting some centralized effort by a counselor, test coordinator, or principal. Jacob and Levitt argue that with the implementation of the No Child Left Behind Act, the incentives for teachers and administrators to manipulate the results from high-stakes tests will increase as schools begin to feel the consequences of low scores.
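To make that concrete, here is a minimal sketch of the two kinds of signals Jacob and Levitt describe: a classroom whose average score spikes one year and evaporates the next, and an unusually high share of students in the same room turning in identical answer strings. This is not their actual algorithm; the data layout, thresholds, and example numbers below are invented for illustration.

```python
# Illustrative only: Jacob and Levitt's real method uses a formal statistical
# model over hundreds of thousands of student-year records. Thresholds here
# are made up.
from collections import Counter

def suspicious_score_swing(prev_avg, this_avg, next_avg, jump=0.5):
    """Flag a classroom whose average score jumps this year
    and falls back by a similar amount the next year."""
    return (this_avg - prev_avg) > jump and (this_avg - next_avg) > jump

def identical_answer_rate(answer_strings):
    """Share of students whose exact answer string matches
    at least one other student in the same classroom."""
    counts = Counter(answer_strings)
    shared = sum(n for n in counts.values() if n > 1)
    return shared / len(answer_strings) if answer_strings else 0.0

def flag_classroom(prev_avg, this_avg, next_avg, answer_strings,
                   answer_threshold=0.3):
    """Combine both signals: a one-year spike that evaporates plus an
    unusually high rate of duplicated answer patterns."""
    return (suspicious_score_swing(prev_avg, this_avg, next_avg)
            and identical_answer_rate(answer_strings) > answer_threshold)

# A hypothetical class that gained 1.5 grade equivalents, lost most of the
# gain the next year, and had 4 of 10 students submit identical answers.
answers = ["ABDCA"] * 4 + ["ACDBA", "BBDCA", "ABDCC", "CBDCA", "ABACA", "DBDCA"]
print(flag_classroom(4.1, 5.6, 4.3, answers))  # True
```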
Texas' widespread cheating likely was a response both to high-stakes testing and to financial incentives for raising test scores. The Houston school district, for example, spends more than $7 million a year on performance bonuses that are largely tied to test scores. Those bonuses include up to $800 for teachers, $5,000 for principals, and $20,000 for higher-level administrators.
Texas is not the only state where schools have cheated on standardized tests. Teachers provided testing materials to students nearly a dozen times in 2003 in Nevada, for example. And Indiana has seen a raft of problems, including three Gary schools that were stripped of their accreditation in 2002 after hundreds of 10th-graders received answers for the Indiana Statewide Testing for Education Progress-Plus in advance. A teacher in Fort Wayne took a somewhat subtler approach in 2004, when school officials had to throw out her third-grade class's scores after she gave away answers by emphasizing certain words on oral test questions. In January 2005 another Fort Wayne third-grade teacher was suspended for tapping children on the shoulder to indicate a wrong answer.
Phantom Dropouts
If you want to make a school's performance look more impressive than it really is, you don't have to abet cheating on standardized tests. Instead you can misrepresent the dropout rate.
In 2003 The New York Times described an egregious example of this scam in Houston. Jerroll Tyler was severely truant from Houston's Sharpstown High School. When he showed up to take a math exam required for graduation, he was told he was no longer enrolled. He never returned.
So Tyler was surprised to learn, when the state audited his high school, that Sharpstown High had zero dropouts in 2002. According to the state audit of Houston's dropout data, Sharpstown reported that Tyler had enrolled in a charter school--an institution he had never visited, much less attended. The 2003 state audit of the Houston district examined records from 16 middle and high schools, and found that more than half of the 5,500 students who left in the 2002 school year should have been declared dropouts but were not.
The Manhattan Institute's Jay P. Greene argues, in his 2004 paper "Public School Graduation Rates in the United States," that "this problem is neither recent nor confined to the Houston school district….Official graduation rates going back many years have been highly misleading in New York City, Dallas, the state of California, the state of Washington, several Ohio school districts, and many other jurisdictions." Administrators, he explains, have strong incentives to count students who leave as anything other than dropouts. Next to test scores, graduation rates are an important measure of a school's performance: If parents and policy makers believe a school is producing a high number of graduates, they may not think reform is necessary. Greene writes that "when information on a student is ambiguous or missing, school and government officials are inclined to say that students moved away rather than say that they dropped out."
Greene and his associates have devised a more accurate method for calculating graduation rates. Simplifying a bit, it essentially counts the number of students enrolled in the ninth grade in a particular school or jurisdiction, makes adjustments for changes in the student population, and then counts the number of diplomas awarded when those same students leave high school. The percentage of original students who receive a diploma is the true graduation rate.
Using Greene's methodology, the national high school graduation rate for 2002 was 71 percent. Yet according to the National Center for Education Statistics, in 2002 the national high school "completion rate," defined as the percentage of adults 25 and older who had completed high school, was 85 percent. As Greene notes, "There were a total of 3,852,077 public school ninth-graders during the 1998-99 school year. In 2001-02, when that class was graduating, only 2,632,182 regular high school diplomas were distributed. Simply dividing these numbers produces a (very rough) graduation rate estimate of 68%." The states show similar discrepancies between their reported graduation rates and the number of students who actually receive diplomas.
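As a rough illustration of that arithmetic, the cohort calculation looks something like the sketch below. Greene's actual method adjusts the ninth-grade count for changes in the student population; that adjustment is collapsed here into a single multiplier for simplicity.

```python
# A simplified version of the cohort calculation Greene describes; his actual
# method adjusts the ninth-grade count for population change, collapsed here
# into one illustrative multiplier.
def cohort_graduation_rate(ninth_graders, diplomas_four_years_later,
                           population_adjustment=1.0):
    """Diplomas awarded divided by the (adjusted) size of the entering
    ninth-grade class four years earlier."""
    adjusted_cohort = ninth_graders * population_adjustment
    return diplomas_four_years_later / adjusted_cohort

# The national figures Greene cites: 3,852,077 ninth-graders in 1998-99 and
# 2,632,182 regular diplomas in 2001-02.
rate = cohort_graduation_rate(3_852_077, 2_632_182)
print(f"{rate:.0%}")  # roughly 68%
```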
As Sharpstown High School's former assistant principal, Robert Kimball, told The New York Times, "We go from 1,000 Freshman [sic] to less than 300 Seniors with no dropouts. Amazing!"
The problem isn't limited to Texas. In March researchers at Harvard's Civil Rights Project released an analysis of state graduation rates for 2002, in which they derived their figures by counting the number of students who move from one grade to the next and then on to graduation. The report found serious discrepancies between the rates calculated by the Civil Rights Project and those offered by education departments in all 50 states. In California, for example, the state reported an 83 percent graduation rate, but the Harvard report found that only 71 percent of students made it through high school.
The Civil Rights Project's paper also found a high dropout rate among minorities, which California officials hide behind state averages. Almost half of the Latino and African-American students who should have graduated from California high schools in 2002 failed to complete their education. In the Los Angeles Unified School District, just 39 percent of Latinos and 47 percent of African Americans graduated, compared with 67 percent of whites and 77 percent of Asians.
Moving the Goalposts on Proficiency
A subtler way to distort data is to report test scores as increasing when in fact more students have been excluded from taking the test. One egregious example of this practice took place in Florida, which grades schools from F to A based on their standardized test scores. Oak Ridge High School in Orlando boosted its test scores from an F to a D in 2004 after purging its attendance rolls of 126 low-performing students.
The students were cut from school enrollment records without their parents' permission, a violation of state law. According to the Orlando Sentinel, about three-quarters of the students had at least one F in their classes, and 80 percent were ninth- or 10th-graders--a key group, because Florida counts only the scores of freshmen and sophomores for school grades. More than half of the students returned to Oak Ridge a few weeks after state testing.
The Sentinel also reported that in 2004 some 160 Florida schools assigned students to new schools just before standardized testing in a shell game to raise school grades. In Polk County, for example, 70 percent of the students who were reassigned to new schools scored poorly on Florida's Comprehensive Assessment Test, suggesting they were moved to avoid giving their old schools a bad grade.
Florida is not alone. In a third of Houston's 30 high schools, scores on standardized exams have risen as enrollment has shrunk. At Austin High, for example, 2,757 students were enrolled in the 1997-98 school year, when only 65 percent passed the 10th-grade math test. Three years later, 99 percent of students passed the math exam, but enrollment had shrunk to 2,215 students. The school also reported that dropout figures had plummeted from 4.1 percent to 0.3 percent. The explanation was not a sudden exodus of a fifth of the student body: the school had been holding back low-scoring ninth-graders and then promoting them directly to 11th grade so they never had to take the 10th-grade exam.
States are also excluding a higher percentage of disabled students and students for whom English is a second language from testing. (Needless to say, these exclusion rates are not reported with the test score data.) And states often report that their test scores are going up when they've merely dumbed down their standards by changing the percentage of correct responses necessary to be labeled "proficient" or by changing the content of the tests to make them easier. Of the 41 states that have reported their 2004 No Child Left Behind test results so far, 35--including all of the states showing improvement--had schools meet the targets not by improving the schools but by amending the rules that determine which schools pass and which fail.
For example, the Philadelphia Inquirer reported last October that Pennsylvania's "improvements" were a result of lower standards, not improved performance. These changes, approved by the federal government, allowed schools with lower graduation rates, lower standardized test scores, or lower attendance than in previous years to win passing marks. In 2004, 81 percent of the state's schools met No Child Left Behind's adequate yearly progress benchmarks using the new standards. But the Inquirer analysis found that if the same rules used in 2003 had been used in 2004, the number of schools falling short of the yearly benchmark would have grown from 566 to 1,164. Instead of 81 percent meeting the benchmark, just 61 percent would have succeeded. When the Pennsylvania Education Department announced in August that only 566 of 3,009 public schools failed to meet federal standards, it neglected to mention the role the rule changes played in the "significant gains" made.
This sort of thing has been going on for a while. Back in 2002 Education Week reported that "a number of states appear to be easing their standards for what it means to be 'proficient' in reading and math because of pressures to comply with a new federal law requiring states to make sure all students are proficient on state tests in those subjects within 12 years. In Louisiana, for instance, students will be considered proficient for purposes of the federal law when they score at the 'basic' achievement level on their state's assessment. Connecticut schoolchildren will be deemed proficient even if they fall shy of the state's performance goals in reading and mathematics. And Colorado students who score in the 'partially proficient' level on their state test will be judged proficient."
The federal government actually gives a seal of approval to states that are lowering the standards they had before Bush's era of "accountability." For example, the U.S. Department of Education allowed Washington state to lower its high school graduation rate from 73 percent to 66 percent and still meet No Child Left Behind requirements--with the promise of an 85 percent graduation rate by 2014. Apparently, the feds are spending billions to compel states to reduce their academic standards.
Lying by Omission
But the most common way school data deceive people is through omission. State and local education officials simply do not define their terms for the media or the general public. As we've already seen, "persistently dangerous" doesn't mean the same thing to officials that it means to you and me.
Another example: My local newspaper lists area schools that have met No Child Left Behind goals and are compliant with federal law. The article will tell you that every subgroup, from low-income children and Hispanics to special education children, is proficient in reading and in math. It will not say that in California, in order for yearly progress for each subgroup to be considered adequate, only 13 percent of the children in each group must be proficient. Imagine the difference--and how much more helpful it would be to a concerned parent trying to decide what is best for her child--if the newspaper article said, "Here is a list of schools where at least 13 percent of children in each group are proficient."
The newspaper should also explain what it really means to be "proficient" in reading. To be considered proficient for the third grade in California, you must score at the 51st percentile in reading and the 63rd percentile in math on California's standardized STAR test. In other words, all it really means when my school is listed as meeting "adequate yearly progress" under No Child Left Behind is that at least 13 percent of third-graders in every subgroup scored at the 51st percentile on the reading test.
Most parents assume that "proficiency" means grade-level performance. But proficiency standards are so different from state to state that students with the same skills will have very different proficiency rates. In third-grade reading, for example, Texas sets its cut score--the correct number of responses or percentile ranking a student needs to be considered proficient--at the 13th percentile. Nevada sets its cut score at the 58th percentile.
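To see how much the "proficient" label depends on where a state sets the bar, consider the sketch below. The cut scores are the ones cited above for third-grade reading (plus California's STAR reading cut); the student's percentile is hypothetical, and treating the cut score as an at-or-above threshold is an assumption about how the rules are applied.

```python
# Third-grade reading cut scores cited in the text: Texas calls the 13th
# percentile "proficient," Nevada requires the 58th, and California's STAR
# reading cut is the 51st. The at-or-above comparison is an assumption.
CUT_SCORES = {"Texas": 13, "Nevada": 58, "California": 51}

def is_proficient(percentile, state):
    """A student counts as proficient if she scores at or above the
    state's own cut score on the state's own test."""
    return percentile >= CUT_SCORES[state]

# The same hypothetical student, scoring at the 40th percentile:
for state, cut in CUT_SCORES.items():
    print(f"{state} (cut score {cut}): {is_proficient(40, state)}")
# Texas (cut score 13): True
# Nevada (cut score 58): False
# California (cut score 51): False
```

The same raw performance thus yields opposite verdicts depending only on which state happens to administer the test.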
All this only scratches the surface of the ways schools use statistics to mislead parents and the public. From reporting teachers' salaries without including benefits as part of their compensation to reporting per-pupil spending while excluding billions in spending on school buildings and infrastructure, the list of deceptions goes on and on.
The No Child Left Behind Act was supposed to let parents and policy makers identify and fix failing schools. More important, it was supposed to give kids a right of exit out of failing or dangerous institutions. But that's meaningless if "failing" and "dangerous" can be defined away. Despite the violence at Locke High School, the teaching failures at Wesley Elementary School, and the high dropout rates at Sharpstown High School, the average kid in those institutions is no closer to escaping now than before the law was passed. And despite the glut of information being offered to parents--and the glut of dollars being spent on education--most families rarely see the facts about their schools' performance.
No Child Left Behind was sold as a way to make the schools more accountable. Instead, it has encouraged and abetted them as they distort the data and game the system. That may be the worst deception of all.