Monday, August 22, 2011
That's the Truth Truth
My colleague in one of his latest posts gave some perspective on the idiocy of NCLB. We are back at work this week diligently attending meetings out the wazoo. In many of these conversations, talk turns to moving beyond the SOLs. On Orientation night I ran into a parent of one of my kids from 1st semester and she mentioned her child's result on said SOL and I felt like the butler in this video. Like most teachers I wanted to be Rocky Balboa at the end of a fight and raise my hands in the air with my students and claim victory. No such luck. The truth is that moving beyond something as ginormous as the SOLs will be tough for many reasons.
In a post earlier this year I shared my state of mind when I got the SOL results. These determine whether or not our school makes AYP (we did not), how I am measured, and many other things. After 2nd semester's results came in I wasn't so much confused as I was frustrated and angry. SOLs have made me like Rocky in the later movies, my brain turning to mush from constant pounding, the punches coming from all this SOL/AYP talk. I just can't get this whole SOL conversation out of my head. It has become all-consuming. Not because I focus only on SOL content or whether the school or division is making AYP. It's because, like many teachers, I think about the impact on individual kids. Too often when I see a parent or kid the test comes up. And it should. People should be outraged...protesting...calling for firings (not mine, please)...or at the very least not buying Dixie Chicks albums.
What do these scores and test results mean? In other words..."what is the truth behind the SOL?"
I'll avoid the school- or division-wide discussion here. This Spring's results got my dander up (whatever that means), so some colleagues and I expended some effort back in June trying to find out the truth about how we really did. As we peeled back the layers of the testing onion it got pretty stinky at times. I thought sharing some of what we learned might elicit a degree of empathy from the non-teachers among you who went to school before we migrated to this other-worldish test-driven planet. After all, we teachers can't be malcontents all of the time and need some help.
At first glance the numbers appeared to show me 2 things about my kids. No surprises passing-wise...but there was a pronounced drop in Pass Advanced (scores over 500). So I started to ask what exactly the difference was in how scores were labeled (Pass Advanced/Pass Proficient) and then how exactly the test scores were calculated (not that I hadn't asked this before, mind you). What I found, or didn't find, was troubling. The labels applied to results seem to have little to no value as an educational tool. But I'm getting ahead of myself.
Back to my conversation with the parent, where I shared that my own feelings on SOLs are pretty complex. I can only imagine the feelings and confusion involved for parents and kids. Testing generates one powerful thing. Data data data...I spent more time this year looking at my numbers than I did in the previous 10 combined. To be fair, it could be said I was not looking at data this time but instead trying to find the answers I wanted. When those in the higher ranks of education policy engage in this practice I am highly critical. I tried to share with the parent a combination of a brief history of the SOLs and some analysis, and somewhere in there communicate something resembling the truth. This included the fact that these numbers have to be interpreted and, despite claims to the contrary, numbers can lie.
To begin a little background:
400 is Pass Proficient, 500 is Pass Advanced, 600 is a perfect score.
Tests and questions in Social Studies EOC tests are not released.
The state adopts a cut score, and this is a criterion-referenced test (I am not sure those statements are compatible).
No one other than the people who make and grade the tests seem to fully understand everything about these tests and that seems to be the way they like it. The amazing lack of transparency is troubling.
So as I explored my results I was bothered by the logic of a grading system where one kid answers 35 questions correctly and gets a 417, while another student in the same class answers 36 correctly but gets a 415 (different test versions). The company (Pearson) points this possibility out, but again my brain is mush so I don't get it. So, mush and all, let's look at a random student who got a 492 and received a Pass Proficient rating. What does that mean? Not that much, honestly, from what I could tell.
On the World History I test there are 6 categories, and a score of 50 means the student answered all of the questions in that category correctly. The number of questions in each category varies by the weight of the category.
Here are the results for that student:
40 RC1 = Human Origins and Early Civilizations
38 RC2 = Classical Civilizations
50 RC3 = Postclassical Civilizations
45 RC4 = Regional Interactions
42 RC5 = Geography
34 RC6 = Civics and Economics
Usually the reports we get from Pearson are about as clear as Rocky's vision when he uttered the phrase "cut me Mick."
We get an overall number and some scores in the six separate categories. My favorite part is this section of the report that reads: "Reporting category scores, which are on a scale of 0-50, can be used to identify students' strengths and weaknesses. A score of 30 or above indicates a strength. A score of less than 30 indicates that the student may benefit from additional instruction in this area." So this particular student was judged as "strong" in each area but only received a Pass Proficient.
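To make the rule concrete, here's a quick sketch of the reporting-category labeling as the Pearson report describes it. The category names and scores are the ones from this student's report above; the 30-point "strength" cutoff is quoted straight from the report (nothing here is official Pearson code, just the stated rule applied by hand):

```python
# Reporting categories are scored 0-50; per the report, 30+ = "strength",
# below 30 = "may benefit from additional instruction".
rc_scores = {
    "RC1 Human Origins and Early Civilizations": 40,
    "RC2 Classical Civilizations": 38,
    "RC3 Postclassical Civilizations": 50,
    "RC4 Regional Interactions": 45,
    "RC5 Geography": 42,
    "RC6 Civics and Economics": 34,
}

labels = {
    category: ("strength" if score >= 30 else "needs additional instruction")
    for category, score in rc_scores.items()
}

for category, label in labels.items():
    print(f"{category}: {label}")
```

Run it and every single category comes back a "strength" — yet this student's overall 492 still lands under the 500 needed for Pass Advanced. That's the disconnect: the category labels and the overall label don't tell the same story.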
So why not Pass Advanced?
We asked the same thing when we saw the drop in scores and which kids got which scores. Here is what I found, and honestly it came a little late to comfort me or any of my kids when they were judged as only "Proficient". As for what that means, here is a link to the VDOE summary of performance level descriptors for each grade and end-of-course test (note the absence of Social Studies descriptions). These labels are supposed to assist parents, students and teachers in understanding how they did. But few understand what the labels actually mean or how they are determined. AYP for our school only factors in pass rates and is unaffected by these terms. But kids are, and some took it pretty hard. Use of these without proper context would be as dangerous as a kid running with scissors and likely invalid when used to measure how we are doing. Is the same true for individuals? Some of my kids and parents were disappointed, especially those who got a Pass Proficient. My disappointment stemmed from the drop in Pass Advanced scores and from seeing the reaction of kids who have grown up with these tests and sadly measure themselves by how they do.
What a difference a year makes! Before you call for my firing or resignation, spend a second thinking about some of the stuff we've written about on this blog that affected our results. Adding an additional class to teacher and student workloads and switching to the 4x4...guess what? Looks like it made a difference. But so did the test.
Nothing I have found gives any meaning to the terms Pass Advanced/Pass Proficient in social studies, and I have looked everywhere. But that is the first thing parents see that has any meaning to them. The only difference is that one is a 400 (31 right) and one is a 500 (53 right). We spent some time figuring out the exact scale (neither the state nor Pearson gave us this...maybe because they don't want people to know that kids only need to get half of the questions right to pass). If they had used the scale from previous years, the drop in my kids' scores would not have been as dramatic.
The tough part is many kids define their performance by these terms but don't even know what they mean. They might know (most do not) that they can miss 29 questions and pass, but they incorrectly assume that the break from 400 to 500 is halfway to a perfect score. It is not. So the truth appears to be that these labels and numbers have little substantive meaning. They just fall out from wherever the cut score for 400 is set and exist in a vacuum until they are pulled out by a bean counter somewhere or by parents who are unable to put them in proper perspective. So I hear frustration and confusion when results come up. Well, what do you want for a couple million...?
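The "halfway" misconception is easy to show with the numbers above: 31 right scales to 400, 53 right scales to 500, and since a kid can miss 29 and still pass, the test appears to be 60 questions long. (The 60-question total is my inference from those figures, not something the state publishes.) A back-of-envelope sketch:

```python
# Numbers quoted above: 31 correct -> 400 (Pass Proficient),
# 53 correct -> 500 (Pass Advanced). "Miss 29 and pass" implies
# a 60-question test (assumption, inferred from 31 + 29).
TOTAL = 60
PASS_RAW, PASS_SCALED = 31, 400
ADV_RAW, ADV_SCALED = 53, 500
PERFECT_SCALED = 600

pass_pct = PASS_RAW / TOTAL   # fraction correct needed to pass
adv_pct = ADV_RAW / TOTAL     # fraction correct needed for Pass Advanced

# On the scaled axis, 500 sits exactly halfway between 400 and 600...
scaled_midpoint = (PASS_SCALED + PERFECT_SCALED) / 2   # 500.0

# ...but on the raw axis, halfway between passing and perfect would be:
raw_midpoint = (PASS_RAW + TOTAL) / 2                  # 45.5 correct

print(f"Pass:          {PASS_RAW}/{TOTAL} = {pass_pct:.0%} correct")
print(f"Pass Advanced: {ADV_RAW}/{TOTAL} = {adv_pct:.0%} correct")
print(f"Raw-score halfway point: {raw_midpoint}, but Advanced needs {ADV_RAW}")
```

So a kid needs about 52% of the questions right to pass but about 88% right to hit Pass Advanced. The 100-point jump on paper hides a 22-question jump on the test, which is exactly why the scaled labels mislead kids and parents.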
Since "they" have yet to release more than 1 test in social studies (I have yet to see a test or question still being used beyond things like the sparse examples found here) and also do not provide specific feedback, I can't really tell what kids knew and what they didn't. Consequently I have no way of doing a better job preparing my 2nd semester kids by using results from 1st semester, and no way to improve next year. This is the type of thing that says to me the SOLs, and a great deal that goes with them, are not really for the benefit of teachers or kids. This quote seemed fitting on so many levels of the testing approach in my state: "Is there anything that we are doggedly pursuing without regard to the actual impact it is having on our intended audience? If it only makes sense to us, it may not be making sense at all."
Staying with the theme of not making sense, the kid's detailed report reads something like this: "Question Description: Describe an essential belief of a major religion.---- Incorrect." So I can tell they missed a question about religion, but not which religion or what specifically they didn't know. Was it Islam or Christianity? The founder or how they worship? Imagine one about civilization...but was it Greece, Rome, Japan, the Inca...who knows? Throw in the silly graphics, poorly worded questions, and inferences required, and the misses start to add up. Pretty useless, honestly, other than just assigning a score and determining minimal competence for AYP's sake.
So here's the truth, truth... I think SOLs do more harm than good and I hate them. I hate people who promote their use and do so from a position far from their impact. I hate the fact people point to kids' scores but not kids' accomplishments. I hate that I am being asked to move beyond the SOLs but still have to deal with issues like this. I hate how they are adding a "college ready" designation in some subjects...think for a second about the impact that'll have on kids who don't meet that mark. I hate that the term "failing schools" has gained footing and is commonly used but, like an SOL score, carries little concrete meaning unless you fully understand it. As for measuring kids' performance, I'll stick to my more holistic measure...I call it a grade.
Hope some of this "truth" made sense and you didn't get too punchy towards the end.