Saturday, February 5, 2011
Holy $#!%, We Are Bad Teachers
This thought recently passed through the minds of my colleagues and me as we reviewed our students' midyear SOL scores. We each did so in isolation at first, and when we came together and realized that everyone had seen a drop in student performance, I'm not sure we felt much better. We got our scores by e-mail and sat down the next day in our PLCs (professional learning communities) to gather ourselves. We realized there had been a rather precipitous drop not just in how we did as individuals, but as a department. And, present company excluded, we have some darn good teachers, so what gives? It's a question that plagued us and may not have any single answer. We wiped the confusion off our faces and tried doing one of the few things we could: teach.
What this insecurity revealed was how vulnerable we remain to a single indicator. What was clear was that our department had done terribly compared to previous years, which goes against every trend. We did not see this coming. Though only roughly half of our students have taken these tests so far (many honors and AP students are enrolled in yearlong courses and will take the test in May) and the percentages should rise, there were some very disturbing trends. Fact is, our scores stunk. The funny thing about facts is they often don't answer questions; they only make you ask more.
Recently the state contracted with a new testing company. There is plenty of the usual edujargon about how the tests would be scored (little of which anyone truly understands). This company was tasked with assessing new standards. In my subject there were in fact very few significant differences, and since the tests and questions are treated like national security secrets, I'm not sure we'll ever get a handle on whether the test difficulty changed. We wondered whether they specifically asked questions targeting all the new content. If they did, it might provide some explanation, but it would seem a flawed approach. The scary part is that I've begun to question the validity of the test itself. If the test was "harder," wouldn't that throw off all that statistical mumbo-jumbo used to make the results more valid? Maybe it was how the test was scored, or the size of the sample of students used to norm it. But the pattern is hard to ignore: a score drop for every teacher, a drop on all three SOL subject tests, similar issues at all our local high schools, and a shocking decline in the number of my kids earning a perfect score compared to last year... Something's fishy.
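For what it's worth, the "statistical mumbo-jumbo" is usually some form of test equating, which is supposed to adjust scores so a harder form doesn't penalize students. A minimal sketch, assuming the vendor uses something like simple linear equating (their actual method is a mystery to me):

$$y = \mu_Y + \frac{\sigma_Y}{\sigma_X}\,(x - \mu_X)$$

Here $x$ is a raw score on the new form, $\mu_X$ and $\sigma_X$ are the mean and standard deviation of the new form in the norming sample, and $\mu_Y$ and $\sigma_Y$ are the same for the old form. If the equating is done right, a harder test gets mapped back onto the old scale and scaled scores shouldn't drop across the board. Which is exactly why a uniform drop makes me question the process.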
A cynic might suggest this was some covert effort to discredit our schools and prove we are failing (I do not share this alarmist view). Our scores have always gone up, and by 2014, 100% of students in Virginia must pass every test. This is not actually going to happen. Just thought it needed to be said; that's a discussion for a later post. Maybe the testing company felt some pressure to "up the ante" and prove they could be trusted. Maybe they used all new questions, and content we were accustomed to teaching was no longer being assessed. It would be an understatement to say these scores are designed to confuse. In the past we might look next for correlations with AP scores to gain insight, but we can't compare that data yet. Would it even help?
Not being a total conspiracy theorist, I think other factors are at least worth mentioning. To begin, I had a student teacher, but colleagues who didn't saw the same thing, so that's probably not it. The school switched from an every-other-day A/B yearlong schedule to everyday 4x4 semester classes. Some dismiss this completely, but I actually think it could explain a small drop, though not one of the size we experienced. This schedule stinks! It was implemented at the last minute and is just not what I consider beneficial for learning. We have far less time for instruction, both in the length of classes and the total number of days, so much so that days were added to the calendar just two days before exams to meet minimum state requirements.
The timing of the tests was not ideal either, as we started them immediately after the holiday break. This likely affected at-risk kids more, and they took the test even later, two weeks after exams. We teachers are also teaching one more class than in previous years. That means about 25-30 additional kids for each of us (I'll teach 157 by year's end, but there is talk of "capping" it at 150... gee, thanks). With that many students, more of the responsibility inevitably shifts to the learner; teachers just can't provide as much attention and help. It is tough. Moreover, most kids are taking one additional class, adding to their burden.
Still, I sit here in some weird funk: disappointed for the kids who failed, working with others to help those who qualify retake the test immediately, and just plain ticked that I am no closer to finding out exactly what happened... and, most important to me, to AVOIDING THIS IN THE FUTURE. I don't have the direct means to improve my teaching for these kids on these tests, and I'm not just talking about data. Data has a funny way of satisfying those far removed from what is really going on, but most teachers will tell you it is overrated. While the decline among the higher-achieving kids might not be what gets attention in the AYP world, it is just as troubling. I am now experienced enough to comprehend the greater significance of the SOL test, how those scores are used, and what they really mean (you can read into that what you want).
One reality I have to confront is that the issue might in fact be me. If I didn't think that at some level, it would be a mistake; that insecurity keeps me motivated. Not sure it changes much, but scores could soon affect my compensation. I know scores can vary between schools and even overcome some of the usual predictors. Speaking of predictions, I think it would be safe to expect similar situations at schools around our state. Some might take comfort in that; we do not.
The chain of teaching, learning, ASSESSING, and improving has many links, and it appears to me that one of them is broken. Maybe someone who is paid more money and doesn't work much with kids can someday help me identify which link it is.
Maybe I misread your post, but it seems like there's an immediate explanation for why SOL scores are lower this year. You say most AP and Honors students will take the test in May, which means only advanced/standard classes have taken the SOL so far. Those groups of students will have, on average, lower scores. If you have only received standard/advanced kids' SOL scores at this point, it would seem only natural that scores appear lower than last year, when Honors/AP kids were included in the scores you were getting back.
If I've misinterpreted your post and the drop in SOL scores is occurring WITHIN the subset of standard/advanced kids, then that certainly is an interesting problem.
Best,
Your Boy
The absence of the AP and Honors students would lead to a drop in scores, but my understanding is that within the subset of standard/advanced students, not only pass rates but scores in general (and the number of pass-advanced scores) have declined this year.
From the outside (I don't teach an SOL course) it appears pretty clear that something unusual is going on (unless every teacher in our county suddenly became less effective than in the last decade). But even in the face of a common-sense explanation that the testing or scoring seems to be off, the teachers and school (and to a lesser degree the students) will be held accountable.
Accountability is a good thing, but it doesn't seem to matter what teachers are accountable for as long as they are accountable.
Things will improve as the pool of scores deepens (and those scores will include more higher-level classes). Total failures, though, are already higher than last year, and we still have some lower-end classes taking the test in May. Comparing my two semester-long honors classes that did take the test to last year's, they are down. Kids who did well on the WH I test in 9th grade did worse on the WH II test... with no exceptions that I have seen. Last year I had four honors sections and 22 perfect 600s; two sections so far this year, one 600. The percentage of pass-advanced scores is down as well. Again, without access to the actual test or students' results on specific questions, we're flying blind... as usual.
Keeping things in perspective and dodging jaded instincts... but I gotta be honest, these last two weeks, accountability sucks. Need another snow day... Good thing I hate quitting.