
Tuesday, June 10, 2014

Testing, the Blob, and the Great Escape

It is a monster. I've seen it terrorize thousands of children. They can't run. They can't hide. It will catch them eventually. I've seen it kill kids...or at least kill their love of school and learning. Like the movie monster Frankenstein, it is a frightening patchwork, though not one of used body parts but of misdirected education policy of our own creation, one that sparks fear, misunderstanding, and even panic. But Frankenstein wasn't bad...just a guy who got a little nervous and freaked out. When I was little, Frankenstein didn't scare me; what scared me was the blob. Testing is worse than the blob. But the description from the 1958 movie poster, "Indescribable! Indestructible! Nothing Can Stop It!", sure does fit.

By now you certainly know that large-scale testing has had a dramatic effect on American education. It has literally changed the way we learn and teach. Depending on who you believe, or trust, that is either a really good thing or a really bad thing. The voices of educators who work directly with kids seem to express a consensus that it is not so good. Surprise, surprise.

"Testing Season," as it is un-affectionately known, begins in May and basically normal school grind to an abrupt halt.  It puts parts of the school and large portions of our student body on lockdown for weeks on end.  Testing brings any real learning to a halt.  We do testing in 3 or 4 main locations but during that span our gymnasiums(we have 2) are sealed up tight.  Student routines and teacher days are changed to feed the monster.  We all are forced to proctor.  And forced to do worse.  I am always thinking there is a certain indignity involved when you have to escort a student to the restroom for both us, and them. I won't even begin to enumerate the actual number of tests kids take in our state...but we're well into the thousands just at our school alone.  It makes everyone grumpy.

Testing leads to a curious phenomenon...testing fatigue. It overcomes a usually vibrant and energetic group of people. It is a real monster. Upperclassmen "check out" both mentally and physically. The courses I teach with underclassmen become ineffective, as on any given day half of the students or more may be missing. Testing has forever altered the end of school. I have written before about how unfortunate it is that the days of engaging and interesting activities serving to tie everything together have been undermined by all the crap we have railed against on this blog.

Radiation ...reform...what's the diff?
Lange's + Bridges' best work :)
Testing arguably destroys schools and the people within them. It's the worst of all the most famous monsters. Like Godzilla, it is a beast of our own creation. Like the blob, it grows more powerful and entrenched the longer it is around. Like Dracula, it sucks the life out of its victims. Like King Kong, it has the potential to yield great profit, and that is what causes the problems. Like Kong it is hard to control, but unlike Kong it is unsympathetic. Like the Mummy, it has the potential to be around for a very, very long time. It has tentacles that reach out and cling to just about every aspect of education, like the giant squid from 20,000 Leagues Under the Sea. It seems to function as an agent of some greater diabolical purpose, like the creatures from Alien. Have I made my point about it being a monster?

Like I said, it can be destructive. This is not an assemblage like that of Disney Pixar's Monsters University. Fortunately, standardized testing has not overtaken that fictional campus yet.
I put this image in for my kids
But in our public schools this destruction is incremental and hard to perceive. I have witnessed many times, just as I did this year, the reaction of students when they learn they didn't pass. They are certainly disappointed, even upset, but that deflation is quickly replaced with a callous "I don't care." On one hand, I think they really shouldn't care. The test does not define them. But I always want my students to care. So when they say they don't care, sometimes it is true, and often was long before any test. But when it is not, and students do care about results, it is sad to see their earnest efforts go unrewarded. I'd add, unnecessarily, that there's more to learning than just being able to pass a test. It can't measure so many immeasurable things. But according to our state's policy, it does matter. A whole lot.

For a teacher, the monster might mean literally no longer having a job or a school to work in. It is a very helpless feeling watching your students take a test. In my state, as I suspect in many others, there is no real way to improve on the students' efforts. The test results and accompanying feedback don't give much insight. Worse yet, once they have taken the test, there is no real way to target remediation so they could pass. I can't tell what they did well, and I can't tell what they didn't. It is supposedly meant to be merely one of many tools used by teachers. We pay for this tool, and I hope we kept the receipt. I would also offer that we should stop buying things off late-night infomercials. Tests, even standardized ones, have a place and a role. But I suspect this monster grew out of using tests in ways they were never intended.

Proof that it is a poor tool is evident in the Student Performance by Question (SPBQ) report. It is even more nondescript than the blob. It is arguably less useful. This despite the "helpful link" on the VDOE website intended to make the report useful. I can find out the number of questions a student got correct or incorrect, but I cannot say with certainty what specific content or concept that student did or did not know. So since they have to pass the test, I feel rather helpless; I cannot outrun the monster, and I am forced to stand by and watch it consume more victims. Allow me to share a few other gems from my efforts to make such a report useful:

This one illustrates the non-specific language that all our testing efforts produce.
So a couple million dollars buys you some "maybes"?
So once they take the test I should be wary of using it to figure out what they do and do not know.
So it's OK for everyone else to overstress SOL results, just not teachers.
Well, I might teach differently if I knew where to start.


Fighting the Monster
There are literally thousands of examples of teachers crying out against the testing machine. But it must be fed. Raw Scores, Scaled Scores, Failing Scores, Remediation, and Online Testing all took time to entrench themselves in our schools. But maybe there is light ahead, and we are entering a new era of education where parents and students, even districts like our own, join teachers in saying enough is enough. Is it likely that we can together defeat the monster? I don't know. There is a lot of money tied up in all of this.

The testing monster will be tough to rid ourselves of. It will take a collective effort, and it will not be an easy task. Even then it is likely to be a worthy foe. Reliance on political leadership from statehouses and capitol domes will likely mean we'll just confront sequels of the same terror, in scarier form. Testing has its place. But massive, poorly done standardized testing is nothing but a destructive and undesirable force that must be stopped. Maybe if we had a champion like Steve McQueen in the 1963 film, he could lead us in The Great Escape. Let's not forget that in that one he didn't actually escape. Maybe one day we will.



Monday, July 16, 2012

OK Hot Shot...

 Choose one:

You have to remediate students who have just failed the Standards of Learning (SOL) test in a non-writing subject area. You have their raw scores but do not know which questions they missed specifically, and you are having a hard time deciphering where they were "weak" other than what you heard from their regular teacher. Most of these students want to do well but they struggle with the basics. You do not know most of their names and have never worked with them. The state will not let you look at or use past tests or past questions, except the outdated ones they released, and none of those questions will be used. The students are not strong readers and are not from the same class. There are 29 of them and it is the last week of school before exams. You have 90 minutes, and then they will retake the test.

       OR 

You are on a bus and it cannot go below 55 miles per hour.  


Any questions?



One of these scenarios played out at our school, and in similar fashion across our state. I faced it back in May, and as I think back I am still bothered by the disservice done to our students by the current testing system. With the aid of many other teachers I think I was able to help in some small way, but I am left feeling that the bus deal may be more difficult but easier to control. NCLB waiver or no waiver, the time, energy, resources, money, and focus all poured into testing make schools a worse place, not better.

Someone please explain to me again how standardized testing, and the millions we steer away from students in public schools and towards Pearson and the like, is a good thing? This system makes about as much sense as Dennis Hopper did in Speed. I mean, who does that?
If I release this switch, the testing company will explode.



Friday, May 25, 2012

"Always Learning"

The solution to America's education problem:
1) Fire all of the bad test makers
2) Give principals the authority to get rid of bad assessments or questions
3) Get rid of the self-interested corporate lobbyists

This shouldn't upset the good test companies. I'm sure all of the good test-makers out there want the bad ones out just as much as the rest of us. But until we stop yielding to the union of corporate test-makers and start making policy that benefits children first, we are stuck in this status quo of subjecting children to sub-standard testing.

If anyone complains about this idea then it's probably because they're afraid of change. They've become complacent with the protection that lack of transparency has afforded. The quality test-makers will applaud this approach as healthy and necessary for the success of our children in the 21st century.

Some might argue that publishing the errors of these testers is unethical, but in a system of public education, parents have the right to know what kind of quality they're getting. We learn from mistakes, but when those mistakes interfere with the future of our children and the vitality of our nation's economy, we must put the children first.

Click on the pictures below for a better look at one of the latest failures of this status-quo entrenched testing business.

Thursday, May 24, 2012

Cracking the Code: How Testers' Language Means Nothing

As a teacher of Ancient World History, one thing I find interesting about the period of study is language. Thousands of years separate civilizations, and written language offers a window affording us a glimpse of the way things were for people who have long since disappeared. When a language is "lost" to time or cannot be translated, a great deal of misunderstanding exists. Often some catastrophic event or mysterious demise brings on such a void. Sometimes it is geographic distance which separates cultures and prevents mutual understanding. Only about 60 miles separate my school from the decision makers in our state capital of Richmond, but it might as well be a million. The gap between us is wide indeed. I think they might even be on another planet.

My students have taken this year's SOL test. I tried to prepare them as best I could for a test that I have never seen. I can't prepare them for receiving their scores and not knowing what they missed. Somewhere in the language of the test and the scoring there exists a disconnect, which results in a process devoid of much value. This test requires a Rosetta Stone in order to decipher what exactly is measured and how. Far worse, without having seen the test or any of the questions, it is impossible to judge its merits fairly, point out flaws, or seek clarification. The secrets of the test are even more mysterious than the language of the ancients.

Why do we place such a degree of legitimacy on the tests when it is clear they inherently lack legitimacy?  How can anyone be allowed to make a test like this and get away with not being more transparent to those that are judged by it?  Is the quagmire of documents, forms and numbers designed purposefully to deceive or misdirect?  One is left to speculate.

We have explored these issues in several previous posts on the TU. See Bottom, Truth, Fact, $#!%, and Flux, among others. There are so many things wrong with the tests themselves, and the way they are used, that it is difficult for those not directly involved in today's schools to comprehend. Painfully evident is the reality that testing is leading us to a place that a growing number of common-sense people and countless educators know is bad. Randy Truitt, a representative in the Indiana state legislature, voiced some of this in a recent letter to his colleagues.

Imagine the opportunity to sit with a leader of a society like the Maya or Easter Island and simply ask..."What happened?" If I had the same opportunity with the folks at Pearson and the state DOE, I'd do my best to dig deep. Among other things, my conversation would ask: what exactly are you trying to accomplish?

I'd begin with a printout of "raw" scores. What makes it raw is how you feel when you try to figure out what these scores mean once they are scaled (I usually say chapped, not raw). This year is no exception. From the VDOE website: "the raw score adopted by the Board to represent pass/proficient on the standard setting form is assigned a scaled score of 400, while the raw score adopted for pass/advanced is assigned a scaled score of 500." That makes perfect sense, except when you look elsewhere on the site.


So never mind the 53/60 cut score above, since my students who missed 7 questions (53/60 correct) received only a 499. I would bet that very few students, and even fewer parents, have any idea where the 400 and 500 delineations come from. Aliens, perhaps? Apparently that will remain a mystery.
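To make the guesswork concrete, here is a minimal sketch of how a raw-to-scaled conversion could work, assuming a simple piecewise-linear mapping between anchor points. The anchor values and the interpolation are my own invention for illustration only; nothing in the score reports or the VDOE language quoted above tells you the actual mapping, which is exactly the problem.

# Hypothetical illustration only: these anchors are made up, not the state's.
# Assume a 60-question test and (raw, scaled) anchor points for the cut scores.
def scale_score(raw, anchors=((0, 0), (30, 400), (53, 500), (60, 600))):
    """Piecewise-linear interpolation between (raw, scaled) anchor points."""
    for (r0, s0), (r1, s1) in zip(anchors, anchors[1:]):
        if r0 <= raw <= r1:
            return round(s0 + (s1 - s0) * (raw - r0) / (r1 - r0))
    raise ValueError("raw score out of range")
# With an advanced anchor at 53 raw, 53 correct maps to exactly 500:
print(scale_score(53))                                                      # 500
# Nudge that anchor one question higher and the same 53 falls just short:
print(scale_score(53, anchors=((0, 0), (30, 400), (54, 500), (60, 600))))   # 496

Shift one anchor by a single question and 53 correct lands just under 500 instead of on it, which is the kind of discrepancy described above, and there is no way for a teacher or parent to tell which mapping was actually used.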

The vagueness there is surpassed still by what the teacher must say when a kid asks, "What did I miss?" All I can offer is the kind of imprecision usually reserved for translating or interpreting an ancient text. "OK Johnny...it is obvious, you missed four in Human Origins and Early Civilizations and four in Classical Civilizations. The Classical Civs questions had something to do with the achievements of a person, architecture, the role of a key person in a religion, and a figure's accomplishments. Not sure what ruler, where they were from, or what you didn't know. What is important for you to remember is that although there were more questions in the HOEC category (thus, in theory, each had less value), you in fact got only a 31 scaled score there versus a 32. You got a 394, so you failed. Just do better. Make sense? No? Good."

After consultation with our legal department (each other) and careful inspection of the Test Security Agreement we all sign, we elected not to include an actual copy or portion of the grade report. The rationale being that we need paychecks and both have families to support. How sad is it that teachers are scared to question the validity of a test by referencing the actual test or results from it?


If we had included a copy of this student's actual score report you would have seen:

(1) Reporting categories contain vague language like "identify characteristics of civilizations" to describe questions that the student answered incorrectly.
(2) Category A had 11 questions, of which the student missed 4. Category B had 10 questions, of which the student missed 4. The student's scaled score for category A was 31, and for B 32, with no explanation of why questions in category A are given greater weight (see the quick check after this list).
(3) The scores, grade reports, and feedback are clearly not useful for improving student or teacher performance, because they offer no specifics as to where weaknesses exist.
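As a back-of-the-envelope check on item (2) above, using only the numbers printed on the report and the naive assumption that every question within a category counts equally (my assumption; the report gives no weighting at all), the ordering should come out the other way:

# Numbers from the score report described above; equal per-question weighting
# within a category is my assumption -- no weighting is published anywhere.
categories = {
    "A": {"questions": 11, "missed": 4, "subscore": 31},
    "B": {"questions": 10, "missed": 4, "subscore": 32},
}
for name, c in categories.items():
    pct = 100 * (c["questions"] - c["missed"]) / c["questions"]
    print(f"Category {name}: {pct:.1f}% correct, reported subscore {c['subscore']}")
# Category A: 63.6% correct, reported subscore 31
# Category B: 60.0% correct, reported subscore 32

The category with the higher percentage of questions answered correctly comes back with the lower reported subscore, and nothing in the report explains why.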

Imagine having that conversation with a student who fails, and then trying to help them. We are asked to "re-mediate," which I would imagine means we target areas where the student has weaknesses. That is a much tougher task without knowing where exactly they are weak. I can understand not wanting us to teach to the test. How about teaching to the kid?

My students and I are judged by a test which in no way serves as a tool to improve my teaching. How on Earth are we to try to do better next year? Those who devise such an approach remain as distant as any of the cultures my students are required to learn. What's more, they manage to encrypt any relevant information in such a way as to make it utterly meaningless.

The numbers and stats derived from massive student testing across the state serve little more purpose than to send the message that policy-makers and testing corporations like Pearson want to send. When scores are too high, standards are raised. When scores are too low, standards are lowered. Neither the Department of Education nor Pearson is able to state in clear language an objective explanation of how scores are calculated, or why certain cut score choices are anything other than arbitrary.

The twenty-first-century process for holding American students, teachers, and schools accountable should not prove more difficult to translate than ancient hieroglyphics.




No, Pearson..."Thank You"

Friday, May 11, 2012

How to Add Detail to Your Writing

The Virginia Department of Education has posted an excellent document on an easy and effective way to add detail to your writing. We found this gem while searching for something to help us use results from state testing to improve our instruction in the classroom. According to the Department of Education website:

 "the performance level descriptors (PLD) for the Standards of Learning (SOL) tests...convey the knowledge and skills associated with each performance (achievement) level. The PLD indicates the content-area knowledge and skills that students achieving at a certain level are expected to demonstrate on the SOL...may guide educators and parents in understanding the type of student performance required for each achievement level... there is a detailed description, a brief description, or both.  The brief description is a summary of the content-area knowledge and skills that students are expected to demonstrate on the test and appears on the score reports for some courses. The detailed description provides additional explanation of the knowledge and skills that students are expected to demonstrate".

So, here's what you get. This is the brief description for the World History to 1500 test:
But suppose that's not enough and you would like a little detail. Well, the folks at the VDOE aren't going to fail you. They've created a "detailed" performance level descriptor for the course. Here is the detailed descriptor:
I'll stop with the snark now. This really isn't funny. Someone at the DOE simply added bullets to a paragraph of text and called it "detailed" instead of "brief." There is no difference in the text from one document to the other. This is supposed to be information that helps parents, students, and teachers understand what a given test score means about a child's ability. And to think the Governor of Virginia wanted to pass legislation making it easier for administrators to fire bad teachers. Who is accountable for the creation of this document?

To most people this may seem like overreacting, but the people who work for the state and direct education policy for YOUR children either don't care enough to actually add detail, don't think you deserve the detail, or think this is good enough, and yet somehow the national narrative remains "if we could just get better teachers in the classroom."

In addition to the fact that the only difference between the "brief" descriptor and the "detailed" descriptor is the bullets, the language itself is troubling.

1) We can actually describe a student's level of performance if they fail? They should be able to locate, identify, and match. If they demonstrate proficiency in these skills, congratulations, they fail. What is the label if they fail to locate, identify, and match? "Fail Really Badly"?

2) How about a little creativity?  I'm a fan of Bloom's and all, but this document just walks up the taxonomy without much thought to how it's getting there. Identify, Locate, Match/ Describe, Explain, Explain/ Compare, Organize, Interpret, Analyze.  Was there any thought about "the type of skills a student is expected to demonstrate", or does it just sound good to use the accepted language of the educational establishment to legitimize and strengthen a vague explanation?

3) Can a multiple choice test really measure whether a student is able to describe, explain, compare, interpret and/or analyze?  Try this: 
What is your interpretation of the charts above: 
a) they are an excellent attempt to inform the public of what SOL test results mean.
b) they are the product of overworked and underpaid public workers at the DOE trying to do their best.
c) they are a disingenuous attempt to mislead the public about the reality of testing.
d) they aren't perfect, but we're making progress toward a worthy goal.

Did I measure your ability to interpret? You may never know, because I'm not going to tell you whether you missed the question or not. That's how SOL testing works, silly. If you don't agree with me, you certainly won't meet the requirement of effectively interpreting. If you do agree with me I'll give you the credit, but then it wasn't really your interpretation either, was it? I gave it to you and all you had to do was recognize it. I guess we just fell off of Bloom's ladder.

Look out for a more detailed post tomorrow, I didn't have time to add bullets to the text today.

The documents pictured above were taken from: http://www.doe.virginia.gov/testing/scoring/performance_level_descriptors/index.shtml on May 10, 2011.  Posted tables were found at the link for History and Social Science Performance Level Indicators, World History and Geography to 1500.

Wednesday, March 21, 2012

Bottom 10 Things about Virginia SOLs.

As we enter the "silly" season, AKA SOL season, we thought a list of this variety might be interesting. 
So below is TU's list.  Please feel free to add.

  1. How they intrude upon the schedule.  For at least 2-3 weeks a year, schools basically shut down.
  2. The money that is poured into testing.  They exist and the testing machine must be fed.
  3. The amount of time spent prepping specifically for tests, that could be used to teach/learn.
  4. They (the tests) now define us, as in schools.
  5. End of Course Tests start in March?  How does that make sense?
  6. Their secrecy.   How much time gets wasted because of “test security.”
  7. S.O.L.  The acronym is appropriate for obvious reasons.
  8. Kindergarten through 8th graders are only "expected to take the test"; that should read "can if they want to," and that approach should extend to high school.
  9. Only 2% of schools met the standard in 2001; that number was up to 92% in 2005-2006.
  10. The tests were developed in secret, and are still pretty much handled that way, to avoid "teaching to the test."
(10 Just wasn't enough) 
     11.  The impact on students is so profound.  They are changed by these tests, and not in a good way.
     12.  Not being able to use the gym or media center for 2+ weeks kinda sucks.  
     13.  Pearson controls the curriculum materials, creates the tests, and grades the tests. Any questions? At least they don't write the standards (maybe).

Thursday, March 15, 2012

A Fact that Speaks for Itself

LSAT (test for law school applicants): a three-hour and twenty-five-minute test.

MCAT (test for med school applicants): a five-and-one-half-hour test.

For the 2012 Virginia Math SOL tests (the tests 9th-12th grade high school students take to earn verified credit for math), schools are increasing their testing blocks to accommodate the 4-6 hours that many students need to finish.

Seriously: if you are a decision-maker in Virginia and you honestly think it's OK for fourteen-year-olds to take four-hour-long math tests, you should go ahead and turn in your decision-making credentials now.
This sample is one item, but it in fact requires students to work five equations. It is also possible to get the answer partially correct, but the student would not get partial credit. This link connects to the Virginia DOE website's .pdf guide to the Algebra II SOL practice items.

Monday, February 27, 2012

Can Math Learn from History?

That's the Truth Truth
Holy $#!%, We Are Bad Teachers

Having written twice previously about the trials and tribulations of being a teacher of an SOL course (in the posts above), I thought it appropriate to revisit the topic given some changes that take effect this year. Throughout this post, I share some old and new math SOL questions with commentary, for your testing pleasure.

With the revision of the state standards in social studies last year, our scores plummeted. To make a long story short, history teachers, including myself, came out smelling really bad in our state. More than in other subjects, we are hamstrung by either the inability or the unwillingness of the state to make the tests more transparent (they did release a whopping 7 questions to help us out and demonstrate how things are different). I guess that's not all bad, because we can't "teach to the test" as much, but it doesn't help our students' scores. Not only has the VDOE released fewer past tests and test questions, it has done little to clarify or define scoring and performance practices for history in more concrete terms. It makes my job harder, hurts kids, and confuses parents.

What did I miss?  Why does Isiah have erasers in his pocket?
Not much has changed within our subject since last February, and there have been no obvious efforts at the state level to improve the situation. These standards, and how they are measured with the corresponding Standards of Learning (SOL) tests, are as much a puzzle as ever. There have been no further developments in terms of Detailed Performance Level Descriptions to explain things to parents. Social Studies isn't even listed.

As a department within our school, we certainly looked at our individual results and are doing our best to adapt. This is true of every subject area, but when they change the test, it is tough. Increasingly it seems that no matter how hard we try or how well we do, it will be defined as not good enough. I suppose the higher-ups in Richmond just figured we'd deal with it and get it figured out. I do not mean to malign them, but at the least they are far distant from anything resembling the educational front lines. They sure don't seem to solicit or consider teacher feedback. If they did, they'd be motivated to end the cat-and-mouse game that such high-stakes testing has become. So social studies teachers are doing our best to manage.

Can't we just see if 5th graders can do 421 + 619 = _________ ?
This year it is the math standards and tests that have changed. So what would I say to Virginia's math teachers... "uh oh." Math tests will measure new standards that "include technology-enhanced items that require students to demonstrate content mastery in ways that were not possible with multiple-choice tests." These tests are harder, more rigorous, and take much longer to complete. They are wordier, and for kids with poor reading skills the climb becomes even steeper. If things play out in math like they did in social studies, it'll get rough. Math matters for AYP, after all. Maybe that's why they shared some SOL Practice Items. I don't follow this stuff too closely, but as score reports from first semester start to trickle out, the whispers in the woods are that things don't look good.

I like Fred, he fishes, but the creel limit on bass in VA is 2 per day
Which brings me to my not-so-bold prediction: math scores for the 2011-2012 cycle will take a precipitous drop. I am not sure how that will play out, but I suspect it will prove significant. Virginia's math teachers, and most importantly our students, will not look good. Throw in the new "college ready achievement level" on certain end-of-course tests and the table is set for kids not to feel good about how they are doing.

Across the nation, and in particular in VA, there is an effort to develop STEM (science, technology, engineering, and math) education. I wonder how these results will affect that effort. How will this play out in the media? Can we afford even more focus shifted to math at the expense of other subjects? I think back to my own high school experiences in math and can say with certainty that more math is not always good for everyone. When I see my former math teacher walking down the hall, I still just apologize. Honest.

Parents, students, the media, principals, and politicians will all react differently to drops in math scores. But be certain: teachers are the ones who will need to react the most in such a scenario. Such is the value-added world that came to Virginia in 2011. Will this lead to better math instruction? Or will it result in more focus on the test and prepping for it? Time will tell what the result of such an emphasis will be.

I like orange lollipops. Does that matter?
In November the VDOE shared a release indicating that math scores for VA students were improving on the NAEP. What's bad is that it will mean little to principals, teachers, and parents when the scores start rolling in. I am glad I am not a math teacher (I've always been glad of this, actually). Under NCLB, AYP, and Race to the Top, math is far more important than history. It counts. But as subjects they are equal. I feel for young math teachers. The slow removal of protections like due process and tenure, such as those proposed this year in our General Assembly, has been well documented, and I think the future landscape for young math teachers will be especially perilous. Growth percentiles will be skewed by the change.


The letter below is an effort by one knowledgeable administrator to prepare parents for the questions that will certainly arise from an almost certain decline.

So to my math colleagues I say, "I feel your pain."


Who cares?  The carnival is the same night as the Justin Bieber concert.


Not only do I not know the answer, but I've decided never to buy flowers again. Keanu Reeves was good in The Matrix.