A Curious Takeaway

by Robert Pondiscio
July 15th, 2010

An interesting experiment shows 12th graders’ scores on the National Assessment of Educational Progress (NAEP) go up when students are paid for correct answers on the exam.  The study by researchers from ETS and Boston College has reignited the debate (the hardiest of perennials) about intrinsic motivation and paying kids.

“Though the testing program is considered a national barometer of student achievement, there really isn’t much of an incentive, after all, for students to do well,” Edweek’s Debra Viadero writes.  “Scores from NAEP assessments don’t show up on a report card or count toward graduation requirements. Likewise, colleges never see NAEP scores when students apply for admission.”  The study purports to show “credible evidence” that NAEP underestimates the reading abilities of students enrolled in 12th grade.  “Responsible officials should take this into account as they plan changes to the NAEP reading framework and expand the scope of the 12th-grade assessment survey,” the authors conclude.

In other words, things aren’t as bad as they seem?  To my mind, the salient point about 12th grade NAEP is that it has read like a dead man’s EKG for 40 years.  Unless you’re ready to suggest (and show evidence) that high school seniors were more motivated to do well in years gone by, then NAEP has consistently underestimated reading abilities that entire time.  Under the same testing conditions (no pay) over several decades, there has been zero change in outcome.

Or am I missing something?

David Steiner Gets It

by Robert Pondiscio
July 14th, 2010

Keep an eye on New York State education commissioner David Steiner, who is gearing up to implement a long overdue reform: establishing a link between test scores and college readiness. 

Harvard’s Daniel Koretz, at Steiner’s urging, has been looking at the correlation between New York’s eighth-grade test scores and high school Regents exam scores. Notes the Buffalo News:  “The conclusion: Students in New York State are moving through elementary, middle and high school with test scores they believe to be adequate, but once they get to college, they find they are not prepared.”  That’s not a complete shock given the boxcar numbers of college freshmen who need remediation once they arrive on campus.  But the New York Post’s Yoav Gonen points out what will surely be the most repeated fact from Koretz’s forthcoming study: eighth-graders who score a 3 out of 4 on state math and reading tests have just a 52 percent chance of graduating high school, even though they’ve been told they’re on track.

Let that rattle around inside your head for a moment:  A child who is deemed proficient in 8th grade has a chance only slightly better than a coin toss of graduating high school just four years later.   “We’ve been calling that ‘proficient,’ ” state Board of Regents Chancellor Merryl Tisch told The Post’s editorial board. “We were giving out misleading information.”

Gee, ya think?

The study is to be released Monday, but anyone who has taught in New York in the last several years can’t be surprised.  For years, I saw 5th graders come into my Bronx classroom who were ostensibly on grade level yet demonstrated little command of basic arithmetic. That was persuasion enough that all that glitters isn’t gold.

Steiner’s insistence that test scores should actually mean something is clearly going to rattle some cages, and prompt a long hard look at where school districts in New York have made real gains and where they haven’t.  Buffalo’s school superintendent blasted Steiner and his deputy John King last week for focusing on more rigorous tests.  “I think they’re two people who don’t know what they’re doing,” James A. Williams told the Buffalo News. “A more rigorous test is not going to improve student achievement. It’s not going to improve the graduation rate. I think it’s ridiculous.”

I don’t follow Williams’ complaint.  By my read, Steiner isn’t talking about testing our way to proficiency; he’s talking about making test scores indicative of real-world proficiency.  As I’ve argued in this space before, if we’re going to insist on viewing everything in education through the prism of test scores, those scores have to mean something.  Steiner, King and Tisch deserve all the credit in the world for taking this on.

Does Competition Enhance Performance?

by Robert Pondiscio
June 25th, 2010

Does competition enhance performance?  Or does it simply create more incentive to cheat?  That was precisely the question a pair of Spanish researchers set out to explore in an interesting experiment.  Fifty-five men and women spent a half-hour working on mazes on a computer.  Half the students were paid based on the number of mazes they completed “whereas the half in the ‘highly competitive’ condition were only paid per maze if they were the top performer in their group of six students,” according to the British Psychological Society’s research blog:

“The students in the highly competitive condition narrowed their eyes, rolled up their sleeves, focused their minds and cheated. That’s right, the students playing under the more competitive prize rules didn’t complete any more mazes than students in the control group, they just cheated more.”

The test subjects were able to cheat by switching to easier levels of difficulty or clicking on a button that offered solutions for the mazes (software on the computers monitored what the test subjects were actually doing).  Perhaps the most interesting finding: poor performers cheated the most.

“It turns out that individuals who are less able to fulfill the assigned task do not only have a higher probability to cheat, they also cheat in more different ways,” the researchers said. “It appears that poor performers either feel entitled to cheat in a system that does not give them any legitimate opportunities to succeed, or they engage in ‘face saving’ activity to avoid embarrassment for their poor performance.”

Whistleblowers’ Delight

by Robert Pondiscio
January 5th, 2010

Did anyone else get that remarkable email from the organizers of the Broader, Bolder Approach to Education yesterday?  The subject line read “BBA Needs Your Help.”  If you just hit delete, you missed a fascinating email.  BBA, which argues that test-driven accountability narrows the curriculum and creates test obsession in schools, is asking teachers to submit examples of schools (presumably their own) that have suffered under strict accountability measures:

In a recent meeting, we advised Department of Education staff that their policy of identifying the lowest-performing 5% of schools in each state, in order to target these schools for massive intervention and “turnaround,” was bound to have adverse consequences if these schools were identified primarily by such test scores. We said that many schools that should be considered among the lowest performing schools would be missed if they artificially boosted their test scores at the expense of a balanced curriculum, by excessive test preparation activities and other gaming. And other schools that pursued a more balanced curriculum and attended to children’s long run achievement might falsely be identified as among the lowest-performing schools because they refused to engage in activities that artificially boosted test scores.

The letter, which doesn’t seem to appear on BBA’s website, notes DOE staff “were not persuaded,” and asked the group to provide “examples of low-performing schools whose test scores have been artificially inflated by excessive test preparation and gaming, and better schools with very low scores but that were delivering a higher quality of instruction.”  The email, which carries the signatures of BBA organizers Helen Ladd, Pedro Noguera, and Tom Payzant, then asks recipients to identify such schools by name.

Please include the name of the school, the name(s) of your source(s) of information, and other identifying information in your description. We will not initially provide all of this identifying information in the material we supply to the Department, but we have to be prepared to back up our claims by naming names if necessary.

It’s a bold move by BBA, although they might also consider sending along a copy of Linda Perlstein’s Tested.  I suspect they will find no shortage of schools that have muscled up on test prep and played games to boost test scores.  Whether teachers at those schools are willing to publicly say so is another matter. 

BBA is on shakier ground, I believe, in looking for good schools whose efforts don’t show up on test scores.  If a school is delivering a rigorous, well-rounded curriculum and “attending to children’s long run achievement,” that should show up on test scores, assuming the effort is sustained and well-implemented.

Mandatory Testing for Homeschoolers?

by Robert Pondiscio
December 29th, 2009

Should homeschooled children be required to sit for state exams to ensure minimum competency in reading and math?  And what should happen if they fail?    Indiana University School of Education professor Robert Kunzman, who studies homeschooling, proposes in the journal Theory and Research in Education that states require a basic skills test for homeschoolers.

Seconding Kunzman’s article, Miller-McCune magazine asks, with as many as 2 million students currently being homeschooled, whether “it might be time to consider some sensible oversight.”  In theory, the magazine notes, a required basic skills test “could be a useful tool to help homeschooling parents understand which areas their child is excelling and struggling in and, if constructed properly, could illuminate where to focus additional attention.”

Above all, it’s essential that the test be crafted by individual states (just as individual states create tests for public schools in compliance with federal testing mandates) and be viewed as “neutral” (evolutionary science off-limits?) by parents and students. Then perhaps local homeschool organizations could work with the state to create a skills assessment that contains no ideological or moral “litmus test.” The result, as Kunzman conceives it, “would involve computation skills (adding, subtracting, multiplication, division) and reading comprehension.” In other words: a simple, rudimentary, noncontroversial test that even a serviceably educated student could pass.

Even that won’t be simple, however.  At the website Homeschooling Research Notes, Milton Gaither, a professor at Messiah College sees several problems with Kunzman’s proposal:

First, he is not clear about exactly when these tests would need to be administered or what would happen if a student failed them.  By what age must a child be able to read, write, and cipher?  For some unschoolers such skills are not deliberately taught until a child wants to learn them, which could be as late as 10 or 12.  Such children would fail the Iowa Test of Basic Skills, perhaps repeatedly.  What then?  Kunzman says in a footnote that failure doesn’t mean kids should be forcibly placed in public schools, for they might do even worse there.  All he says is that repeated failure should prompt “a closer look by the state into that particular homeschool context, the quality of instruction, and the needs of the student before deciding how best to protect his or her educational interests.” (p. 328)  This I find unhelpful and vague.  Why bother administering the test at all if there’s no clear consequence for failing it?

Kunzman’s website has a lot of interesting information and resources about homeschooling (his state-by-state chart of homeschooling regulations is fascinating).  While I understand the impulse behind his proposal (he points out we really have no objective information about how homeschooled students truly perform academically), I think it’s unlikely that many, perhaps even most, homeschoolers will see mandated testing as anything other than an unwarranted intrusion.  “The underlying assumption of this proposal seems to be that the citizen is somehow subject to the standards set by the state,” writes one homeschooling blogger.  “Or perhaps that the state has a more compelling interest in the well being of the child than the parent. As any homeschooling parent can tell you, we don’t need a test to tell us how our child is doing. They are not a name on a roster, they are our focus of attention.”

In short, prepare for a fight.  Given the relatively low performance of most states, it will also be hard to make a credible case that they know best, or that they are even minimally competent to gauge, let alone assure, academic success.

Text, Yes, But Is It Reading?

by Robert Pondiscio
July 28th, 2008

Are the hours kids and teenagers spend prowling the Web a threat to literacy?  Or is it simply a new form of reading and writing?  A sprawling New York Times thumbsucker notes that “as teenagers’ scores on standardized reading tests have declined or stagnated, some argue that the hours spent prowling the Internet are the enemy of reading — diminishing literacy, wrecking attention spans and destroying a precious common culture that exists only through the reading of books.”

Clearly when kids go online instead of turning on the TV, they read and write instead of passively consuming video.  But critics of reading on the Internet say they see no evidence that increased Web activity improves reading achievement. “What we are losing in this country and presumably around the world is the sustained, focused, linear attention developed by reading,”  Dana Gioia, the chairman of the N.E.A., tells the Times.  “I would believe people who tell me that the Internet develops reading if I did not see such a universal decline in reading ability and reading comprehension on virtually all tests.”

“Reading a book, and taking the time to ruminate and make inferences and engage the imaginational processing, is more cognitively enriching, without doubt, than the short little bits that you might get if you’re into the 30-second digital mode,” adds Ken Pugh, a cognitive neuroscientist at Yale who has studied brain scans of children reading.

According to the paper, the Organization for Economic Cooperation and Development, which administers reading, math and science tests to a sample of 15-year-old students in more than 50 countries, will add an electronic reading component to next year’s tests. The United States, among other countries, will not participate. “A spokeswoman for the Institute of Education Sciences, the research arm of the Department of Education, said an additional test would overburden schools,” the Times notes.

Houston, We Have a Problem

by Robert Pondiscio
June 30th, 2008

I’m all about committed parenting, academic rigor and student achievement, so why does it feel excessive to me that children as young as four are being tutored to get ahead in school? The Houston Chronicle reports some parents are hiring tutors “because they’re feeling the pressure of looming high-stakes tests, which begins in Texas with the Texas Assessment of Knowledge and Skills for third-grade children. Others are thinking about college.”

Houston-area tutors work with hundreds of young children on phonics, numbers, colors, study skills and fine motor skills. Some take children as young as 3 1/2. But some caution that putting pressure on young children might give them a distaste for school. Rather than spending upward of $45 an hour on private tutors, they say parents should use outings to stores, libraries and museums as teaching moments.

“A child needs summer,” Kay Hall, director of the Early Learning Academy in the Spring school district, tells the paper.  “There’s a lot of learning that can take place over the summer, but it doesn’t need to be in a classroom in a structured environment.”