Monday, April 13, 2009
Nobody Wants To Know The Truth
I'm an advocate of premising social policy on sound research and analysis. You try to find out what's working in other places, and build on those ideas. You monitor your own implementations of those ideas to be sure that your policies are, in fact, working. Ideally, you can build on working ideas, revise or eliminate ideas that aren't working, and make things better.
But what happens when those who implement policies, and those who are affected by those policies or by potential changes in those policies, either don't care about implementing the best public policy solutions or are afraid of what they will uncover if they shine a spotlight on existing policies or the alternatives? One of the most obvious contexts where you can see this phenomenon at work is in K-12 education. When "reforms" come, they're often ad hoc, seat-of-the-pants, "this sounds good"-type reforms that may or may not last. But most of the time, there appears to be little to no interest in achieving actual reform or improvement. Sometimes you get the worst of both worlds - a reform agenda that doesn't appear to be concerned with quality, and which may in fact be rooted in a political agenda that is something quite apart from ensuring quality public education.
For example, advocates of charter schools are often strong advocates of standardized testing of children in public schools, but they often seek to exempt charter schools from those testing requirements. That prevents a direct, side-by-side evaluation of charter schools and public schools in the same district. Ideally you would be able to track individual student performance, year by year, through their entire progress through the grades. That allows you to compare schools and classes not only by the performance of students at the end of a year, but also by the quality of students going in. If schools accept public funding, why shouldn't they have at least that much accountability? For that matter, shouldn't we be taking a similarly hard look at this type of data that already should be available in public school settings?
It isn't just the supporters of charter schools who don't get behind mandating across-the-board testing. It's also many proponents of the status quo. When you don't have the data that supports a side-by-side comparison, it's easier to argue that the status quo is better than (or at least as good as) the alternative. Teaching schools don't appear interested in collecting and analyzing data that might suggest that teaching certificates aren't superior to accelerated or alternative certification programs, or even to no certification at all. Teachers' unions don't necessarily want to see data suggesting that contract standards for hiring, promotion, and assignment to schools and classrooms aren't optimal, or may even impede good classroom learning.
A related example of this phenomenon in action comes from Teach for America. TFA has been around for about two decades, so there's been ample opportunity to develop good data. To some degree that could be done internally - TFA could develop performance tests to be administered at the start and end of the school year, to help gauge student needs, performance, and improvement in performance over the course of a school year. A school district interested in comparing general teacher performance to TFA corps member performance could administer those tests in its regular classes - or, conversely, it could develop or license tests and require their use across the district. Instead, even when students are being tested and it appears that performance could easily be tracked down to an individual level, that seems to be regarded as something to avoid.
Thus, TFA can announce on its website that a study "confirms... that corps members have a positive effect on student achievement relative to other teachers, including experienced teachers, traditionally prepared teachers, and those fully certified in their field", based on weak data from a single state, North Carolina. Skeptics can point to the amount of guesswork involved, trying to connect students to classroom teachers by who proctored an exam, trying to gauge what happened in a classroom based upon a year-end test with no start-of-year data, the exceptionally small sample size of TFA teachers, the omission of any schools below the high school level, the differences between North Carolina's demographics and those of the most vexing school districts, such as DC or New Orleans....
My point here is not to criticize TFA or corps members. With the résumé of a corps member rather than that of a certified teacher, I've substitute taught in public schools. I've seen some terrific teachers, and others who (despite "years of experience") sleepwalk through their workday. I've seen how different a classroom experience can be between schools, and the difference that SES, parental involvement, and administrative competence can make. I've also seen how much easier it can be to teach high school than to teach middle school. With no wish to make this an excessively broad generalization, as some schools have severe order and discipline problems that continue through high school, I found teaching grades 10-12 to be a relative cakewalk as compared to teaching middle school, with ninth grade being something of a transition.
I found it at times quite difficult to explain what, to me, was an elementary math concept to a seventh grader. I didn't have a similar problem communicating with kids in tenth or eleventh grade. That is to say, when I was out of the context where teaching skills were most crucial, where I had lesser concerns about maintaining classroom order, where the kids understood their lessons at a more adult level, I could be highly effective. I didn't necessarily feel ineffective at other grade levels, but there were times when it was obvious that I lacked the skill set necessary to be more effective. Ideally, had I been a professional teacher, I would have learned some of those skills during my coursework and teacher training. (I say "ideally" because I've had certified teachers tell me that, at least in the programs they attended, training in classroom management came almost as an afterthought.) Oh, sure, I would likely have also developed my own set of skills "on the job", had I continued to teach in a classroom for the next few years, but I have no illusions that, for me, it would have been a particularly easy or natural process. Given my druthers, I would have stuck with grades 10-12.
So again, here, I see TFA demonstrating a strong interest in claiming credit based upon weak data and analysis, but not trying to collect strong data or facilitate compelling analysis. I see school districts employing TFA corps members doing nothing to evaluate how the corps members compare to certified teachers. I see advocates of teacher certification and tenure suggesting that reliance upon TFA results in a high level of churn among entry level teachers, causing harm to students who would benefit from having experienced teachers. But I don't see anybody advocating the type of study that would sort out and answer some of the persistent questions about TFA, teacher certification, and teacher effectiveness.
It really does amaze me sometimes that, with all of the money this nation pours into education (not to mention the combined spending of the world's other nations), there's so little apparent interest in determining which curricula, classroom management techniques, and approaches to teaching particular subjects (e.g., reading, foreign language, math) are the most effective. I once crazily hoped that having teacher colleges sponsor charter schools might provide a laboratory for education innovation and reform. Oh well.