Monday, August 22, 2011

That's the Truth Truth

My colleague in one of his latest posts gave some perspective on the idiocy of NCLB. We are back at work this week diligently attending meetings out the wazoo. In many of these conversations, talk turns to moving beyond the SOLs. On Orientation night I ran into the parent of one of my kids from 1st semester, and she mentioned her child's result on said SOL and I felt like the butler in this video. Like most teachers, I wanted to be Rocky Balboa at the end of a fight and raise my hands in the air with my students and claim victory. No such luck. The truth is that moving beyond something as ginormous as the SOLs will be tough for many reasons.

In a post earlier this year I shared my state of mind when I got the SOL results. These determine whether or not our school makes AYP (we did not), how I am measured, and many other things. After 2nd semester's results came in, I wasn't so much confused as I was frustrated and angry. SOLs have made me like Rocky in the later movies, my brain turning to mush from constant pounding, the punches coming from all this SOL/AYP talk. I just can't get this whole SOL conversation out of my head. It has become all-consuming. Not because I focus only on SOL content or whether the school or division is making AYP. It's because, like many teachers, I think about the impact on individual kids. Too often when I see a parent or kid, the test comes up. And it should. People should be outraged...protesting...calling for firings (not mine, please)...or at the very least not buying Dixie Chicks albums.

What do these scores and test results mean? In other words..."what is the truth behind the SOL?"

I'll avoid the school- or division-wide discussion here. This Spring's results got my dander up (whatever that means), so some colleagues and I expended some effort back in June trying to find out the truth about how we really did. As we peeled back the layers of the testing onion, it got pretty stinky at times. I thought sharing some of what we learned might elicit a degree of empathy from the non-teachers among you who went to school before we migrated to this otherworldly, test-driven planet. After all, we teachers can't be malcontents all of the time and need some help.

At first glance the numbers appeared to show me two things about my kids. No surprises passing-wise...but there was a pronounced drop in Pass Advanced (scores over 500). So I started to ask what exactly the difference was in how scores were labeled (Pass Advanced/Pass Proficient) and then how exactly the test scores were calculated (not that I hadn't asked this before, mind you). What I found, or didn't find, was troubling. The labels applied to results seem to have little to no value as an educational tool. But I'm getting ahead of myself.

Back to my conversation with the parent, where I shared that my own feelings on SOLs are pretty complex. I can only imagine the feelings and confusion involved for parents and kids. Testing generates one powerful thing: data, data, data...I spent more time this year looking at my numbers than I did in the previous 10 combined. To be fair, it could be said I was not looking at data this time but instead trying to find the answers I wanted. When those in the higher ranks of education policy engage in this practice, I am highly critical. I tried to share with the parent a brief history of the SOLs and some analysis at the same time, and somehow communicate something resembling the truth. This included the fact that these numbers have to be interpreted and, despite claims to the contrary, numbers can lie.

To begin, a little background:
400 is Pass Proficient, 500 is Pass Advanced, 600 is a perfect score.
Tests and questions in Social Studies EOC tests are not released.
The state adopts a cut score, and this is a criterion-referenced test (I am not sure those statements are compatible).
No one other than the people who make and grade the tests seems to fully understand everything about them, and that seems to be the way they like it. The amazing lack of transparency is troubling.

So as I explored my results, I was bothered by the logic of a grading system where one kid answers 35 questions correctly and gets a 417, while another student in the same class answers 36 correctly but gets a 415 (they took different versions of the test). The company (Pearson) points this possibility out, but again my brain is mush, so I don't get it. So, mush and all, let's look at a random student who got a 492 and received a Pass Proficient rating. What does that mean? Not that much honestly, from what I could tell.
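For the curious, here is a minimal sketch of how that 35-beats-36 oddity can happen under score equating. Everything in it is hypothetical: neither the state nor Pearson publishes the actual conversion tables, so the numbers and table names below are invented purely to illustrate the idea that each test form gets its own raw-to-scaled lookup based on that form's difficulty.

```python
# Hypothetical raw-to-scaled conversion tables for two test forms.
# The real tables are not released; these numbers are invented to show
# how equating can rank a higher raw score below a lower one.
FORM_A = {34: 409, 35: 417, 36: 425}  # Form A judged slightly harder,
FORM_B = {34: 401, 35: 408, 36: 415}  # so each correct answer "buys" more scale points.

def scaled_score(form, raw):
    """Look up the scaled score for a given number correct on a given form."""
    return form[raw]

# Student 1 took Form A and got 35 right; Student 2 took Form B and got 36.
print(scaled_score(FORM_A, 35))  # 417
print(scaled_score(FORM_B, 36))  # 415 -- one more question right, two points lower
```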

On the World History I test there are six categories, and a score of 50 means the student answered all of the questions in that category correctly. The number of questions in each category varies by the weight of the category.
Here are the results for that student:
40 RC1 = Human Origins and Early Civilizations
38 RC2 = Classical Civilizations
50 RC3 = Postclassical Civilizations
45 RC4 = Regional Interactions
42 RC5 = Geography
34 RC6 = Civics and Economics

Usually the reports we get from Pearson are about as clear as Rocky's vision when he uttered the phrase "Cut me, Mick."

We get an overall number and some scores in the six separate categories. My favorite part is this section of the report that reads... "Reporting category scores, which are on a scale of 0-50, can be used to identify students' strengths and weaknesses. A score of 30 or above indicates a strength. A score of less than 30 indicates that the student may benefit from additional instruction in this area". So this particular student was judged as "strong" in each area but only received a Pass Proficient.
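If my reading of the report is right, the RC arithmetic works out to something like the sketch below. The proportional formula is my assumption (the report only says a 50 means every question in the category was answered correctly); the 30-point strength rule is quoted straight from the report.

```python
# A sketch of how reporting-category (RC) scores might be derived.
# Assumption: the 0-50 RC score is the share of that category's
# questions answered correctly, scaled to 50. Pearson doesn't confirm
# this; the report only says a 50 means every question was right.
def rc_score(correct, total):
    return round(50 * correct / total)

def judge(score):
    # The report's own rule: 30 or above is a "strength"; below 30
    # means the student "may benefit from additional instruction".
    return "strength" if score >= 30 else "may need additional instruction"

# Applying the report's rule to the student above:
for label, score in [("RC1", 40), ("RC2", 38), ("RC3", 50),
                     ("RC4", 45), ("RC5", 42), ("RC6", 34)]:
    print(label, score, judge(score))  # every category comes out a "strength"
```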

So why not Pass Advanced?
We asked the same thing when we saw the drop in scores and which kids got which scores. Here is what I found, and it honestly came a little late to comfort me or any of my kids when they were judged as only "Proficient". As for what that means, here is a link to the VDOE summary of performance level descriptors for each grade and end-of-course test (note the absence of Social Studies descriptions). These labels are supposed to assist parents, students, and teachers in understanding how students did. But few understand what the labels actually mean or how they are determined. AYP for our school only factors in pass rates and is unaffected by these terms. But kids are affected, and some took it pretty hard. Using these labels without proper context would be as dangerous as a kid running with scissors, and likely invalid as a measure of how we are doing. Is the same true for individuals? Some of my kids and parents were disappointed, especially those who got a Pass Proficient. My disappointment stemmed from the drop in Pass Advanced scores and seeing the reaction of kids who have grown up with these tests and sadly measure themselves by how they do.

These pie charts graphically illustrate the data from my honors classes only.
What a difference a year makes! Before you call for my firing or resignation, spend a second thinking about some of the stuff we've written about on this blog that affected our results. Adding an additional class to teacher and student workloads and switching to the 4x4...guess what? Looks like it made a difference. But so did the test.

The graph above shows the difference in 4x4, A/B, and totals over the last 2 years. Ouch is all I said. To an outsider it would be evidence we weren't doing as good a job. But that is misleading, to say the least. On the 4x4 we had to go fast, and that meant cutting some things out. The obvious choice is material that is not going to be assessed, and that is a shame. I still felt I gave them a solid class that was rich and varied enough that they were prepared to do well. The test was made "more rigorous", which I welcome, but it is apparent the test makers' view of what to stress is different from that of the people who actually teach kids. (I'll have to thank the State Super for the heads up on these shifts next time I see her.) I had some really smart kids who knew their stuff, and I was as surprised as they were when they didn't get a Pass Advanced.

Nothing I have found gives any meaning to the terms Pass Advanced/Pass Proficient in social studies, and I have looked everywhere. But that is the first thing parents see that has any meaning to them. The difference is simply that one says 400 (31 right) and one says 500 (53 right). We spent some time figuring out the exact scale (neither the state nor Pearson gave us this...maybe because they don't want people to know that kids only need to get half of the questions right to pass). If they had used the scale from previous years, the drop in my kids' scores would not have been as dramatic.

The tough part is many kids define their performance by these terms but don't even know what they mean. They might know (most do not) that they can miss 29 questions and pass, but they incorrectly assume that the break from 400 to 500 is halfway to a perfect score. It is not. So the truth appears to be that these labels and numbers have little substantive meaning. They just fall out from wherever the cut score for 400 is set and exist in a vacuum until they are pulled out by a bean counter somewhere or by parents who are unable to put them in proper perspective. So I hear frustration and confusion when results come up. Well, what do you want for a couple million...?
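To put numbers on that, here's the arithmetic from the scale we pieced together. The 60-question total is my working assumption (it follows from "miss 29 and pass" with 31 right); the state has not published an official table.

```python
# Raw-score cuts from the scale we pieced together for the WH I EOC test.
# Assumes 60 questions total (31 right = pass, so a kid can miss 29).
TOTAL = 60
cuts = {"Pass Proficient (400)": 31,
        "Pass Advanced (500)": 53,
        "Perfect (600)": 60}

for label, raw in cuts.items():
    print(f"{label}: {raw}/{TOTAL} = {raw / TOTAL:.0%} correct")

# Pass Proficient (400): 31/60 = 52% correct
# Pass Advanced (500): 53/60 = 88% correct
# Perfect (600): 60/60 = 100% correct
# So 500 is nowhere near "halfway": it takes 22 more correct answers
# to climb from 400 to 500, but only 7 more to go from 500 to 600.
```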

Since "they" have yet to release more than 1 test in social studies(I have yet to see a test or question still being used beyond the things like the sparse examples found here)) and also do not provide specific feedback, I can't really tell what kids knew and what they didn't. Consequently I have no way of doing a better job preparing my 2nd semester kids by using results from 1st semester and no way to improve next year. This is the type of thing that says to me the SOLs and a great deal are not really for the benefit of the teachers or kids. This quote seemed fitting on so many levels of the testing approach in my state: "Is there anything that we are doggedly pursuing without regard to the actual impact it is having on our intended audience? If it only makes sense to us, it may not be making sense at all."

Staying with the theme of not making sense, a kid's detailed report reads something like this: "Question Description: Describe an essential belief of a major religion.---- Incorrect." So I can tell they missed a question about religion, but not which religion or what specifically they didn't know. Was it Islam or Christianity? The founder, or how they worship? Imagine one about civilization...but was it Greece, Rome, Japan, the Inca...who knows? Throw in the silly graphics, poorly worded questions, and inferences required, and the misses start to add up. Pretty useless, honestly, other than just assigning a score and determining minimal competence for AYP's sake.

So here's the truth, truth... I think SOLs do more harm than good and I hate them. I hate people who promote their use and do so from a position far from their impact. I hate the fact that people point to kids' scores but not kids' accomplishments. I hate that I am being asked to move beyond the SOLs but still have to deal with issues like this. I hate how they are adding a "college ready" designation in some subjects...think for a second about the impact that'll have on kids who don't meet that mark. I hate that the term "failing schools" has gained footing and is commonly used but, like an SOL score, carries little concrete meaning unless you fully understand it. As for measuring the kids' performance, I'll stick to my more holistic measure...I call it a grade.
Hope some of this "truth" made sense and you didn't get too punchy towards the end.

5 comments:

  1. I ran across your blog as I was searching in vain to understand our daughter's 3rd grade SOL scores. We have no idea how to interpret them - or even what they mean. How in the world can she receive 560 in one subject and 427 in another? I suspect, after reading your post, that we may never know.

  2. What is unfortunate is that the tests could be a great informative tool, but we're supposed to believe that they are THE vital piece of information that matters regarding student learning.

    If the tests aren't supposed to be norm-referenced like other standardized tests, then I don't understand why they can't release scores on new versions of tests until enough students have taken them. Or why they can't present scores in terms of percentage correct, if it's supposed to measure achievement on mastery of the standards.

  3. Inspirational Sketchbook,
    Steve and I know. We are just not at liberty to say. :) I probably won't be laughing when my kids get to 3rd grade. Hopefully by then the state or Pearson will actually put something online to help parents. This was posted just recently and I am not sure whether I was more confused or less confused after I read it.
    http://www.doe.virginia.gov/testing/scoring/explanation_for_cut%20scores.pdf



    So, back to real-world stuff: here is the scale we figured out for our WH I EOC test. Maybe teacher contacts you have could come up with something similar for 3rd grade.
    http://schoolcenter.k12albemarle.org/education/components/docmgr/download.php?sectiondetailid=18162&fileitem=64485&catfilter=6925

  4. I'm a WHI and WHII teacher and I completely agree with you. How am I supposed to prepare my students when we don't have adequate resources or feedback? It's especially frustrating that the rigor of the test was increased (as it should have been) but we were never informed. 7 released questions don't help much. I don't need to know the exact questions. I teach my kids and they know the information. But asking questions that are confusing just to give the appearance of raising the bar gives me a bad name and isn't fair to my students.

  5. I used to teach WHI and II as well (back in the days when we used to pick up the test booklets and number two pencils for students the day of the test.) I think part of the problem with the secrecy and security is because so much is riding on these tests now.

    Good AP scores are great for students, but doing poorly at worst costs the student $85. College admissions tests carry a lot of weight, but they really are just one among several factors weighing on college admission.

    We're assuming that SOL tests are really the only measure of whether a student has met the required standards. Students' ability to graduate is based on test performance. The quality of schools, and increasingly of individual teachers, is assessed by performance on the tests.

    We're not looking for the test to be released ahead of time so that we can get a jump on it; we'd simply like to know more about what to expect from it and how to prepare our students.
