Alleged Dropout Factories Cry Foul
One of the key issues in determining the real number of dropouts is developing a reliable way to count the students who leave school. One would think that would be easy, but given how often families relocate today, tracking individual students is genuinely difficult.
The recent report from Johns Hopkins challenged the typical method of counting dropouts, stating that the nation’s statistics are simply not accurate. The study further asserts that dropouts are grossly underreported.
The Hopkins researchers used a measure called "promoting power": a simple comparison of the number of freshmen enrolled at a school to the number of seniors enrolled at the same school three years later.
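As a rough illustration, the promoting-power calculation amounts to a single ratio. The figures below are hypothetical, not taken from the report:

```python
def promoting_power(freshmen_enrolled, seniors_enrolled):
    """Ratio of a cohort's senior-year enrollment to freshman-year
    enrollment three years earlier. Note that this is neither a
    graduation rate nor a dropout rate: transfers in and out of the
    school are not tracked at all."""
    return seniors_enrolled / freshmen_enrolled

# Hypothetical school: 400 freshmen in one fall, 240 seniors three falls later.
print(promoting_power(400, 240))  # 0.6, i.e. 60 percent promoting power
```

The simplicity of the ratio is exactly what drew criticism: a school in a high-mobility district can post a low number without a single student dropping out.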
As most educators were quick to note, promoting power is neither a graduation rate nor a dropout rate; indeed, the researchers make that same point in their report. A better description of the measure is that it functions more like a "check engine" light.
However, for reasons known only to one of the researchers, Bob Balfanz, schools with low promoting-power numbers were quickly dubbed "dropout factories." And then, as is often the case, the press ran with a label certain to sell papers.
School Officials Cry Foul
That descriptor was soon assailed by virtually every school thus named. As one example, consider a Washington State superintendent, Rick Schulte, who saw his school on the list. Schulte dismissed the research as "crackpot" and called the publicity surrounding the data "irresponsible journalism," noting how severely the report damaged his high school's image.
As another example, Pinckney Community School officials vehemently denied the report's claim that their school had a 46 percent dropout rate. High school Principal Jim Darga insisted that the school's actual dropout rate is about 1.7 percent; if the suggested rate were accurate, Darga said, students would "be walking out the doors daily."
While it is easy to see why so many education officials took exception to the simplistic approach, the study does raise a legitimate question: where do all these freshmen go over their four years of high school? Students do transfer and families do move, but that same movement should then be adding students at other schools.
A simple example of the difficulty of assigning responsibility to schools is the number of teenagers who are incarcerated. In most people's minds they are truly dropouts, but should they be counted against the school each one last attended?
That aside, it is time for elected officials to recognize there is indeed an issue here, even if it is difficult to accurately portray the extent of it. For many teenagers today, school is not seen as a solution. In fact, for many of that age group, school is seen as one of their problems.
One Size Does Not Fit All
While NCLB purports to close the achievement gap for minority students, it in fact creates more problems than it solves for the students already struggling in school. For those experiencing academic difficulty, raising the standards bar only exacerbates their problems.
Some students simply do not learn well by reading and writing. For many more, the complexities and abstractions of algebra are beyond their abilities. Yet as a nation we have foisted a one-size-fits-all approach upon every student, and we now seek to penalize schools that cannot make all students fit that one size.
With such an approach we will continue to see dropout rates that are truly troubling, rates that have not improved in more than a decade no matter what measure is used.