The Liberal Arts and the Fate of American Democracy

by Guest Blogger
November 12th, 2014

By Scott Samuelson

Scott Samuelson is an associate professor of Philosophy and Humanities at Kirkwood Community College and the author of The Deepest Human Life: An Introduction to Philosophy for Everyone. This post originally appeared in Rhodes Magazine.

In the democracy of ancient Athens and the republic of ancient Rome, freedom was only for the few. Slaves, servants, and women had to toil so that free men could cultivate their minds, participate in the government, and enjoy the highest goods of human life—in short, so they could learn and practice the liberal arts.

Our government takes inspiration from Athens and draws on the model of the Roman Republic, but we also inherit the Enlightenment ideal of freedom for all, even if our history has never quite lived up to it. My view—inspired by a long line of American thinkers going back to Thomas Jefferson—is that in a democratic republic the liberal arts should not be the exclusive privilege of the few. We should all have access to an education in thinking and judging for ourselves. The main goals of elementary and secondary education should center on cultivating the liberal arts, and citizens should have the opportunity to study the liberal arts in college without incurring onerous debt.

I’m not saying that we shouldn’t have opportunities for job training in our educational institutions. The reason that ancient Athenian and Roman citizens could devote themselves whole-heartedly to the liberal arts was precisely that the servile did the work necessary to sustain freedom. Part of the genius of the American educational system is that it mixes liberal and technical education. A just democracy requires that we all pitch in when it comes to the economy.

If anything, I’d like to see more real technical education in elementary and secondary schools. There’s no reason that a person with a high school diploma shouldn’t be expected to know something and to do something. Furthermore, I’m grateful that our colleges and universities help their students get employable skills. But the dominant note of an education in a liberal democracy should be the cultivation of freedom, not of employability. We rightly want people to have gainful employment, but American citizens should do their work with a spirit of independence, creativity, and self-reliance.

The powerful trends in education right now are all about standardization, rubrics, passing tests, and compliance, which read as forms of servility rather than freedom. Insofar as the private goal of education is about jumping through the hoops necessary to get hired and the rationale for public education is about growing the economy, I worry that we’re striking a blasé Hobbesian bargain of giving up our freedom to big corporations and government agencies in return for the promise of security.

At the end of the Cold War, Francis Fukuyama famously declared that we’d reached “the end of history,” by which he meant that all peoples would eventually settle into liberal democracy. It’s not simply the authoritarian capitalism of China and the violent theocratic movements of the Middle East that challenge his thesis. It’s that we ourselves run the danger of becoming illiberal.

A century ago, when America was tilting toward inequality and empire, the great American philosopher William James said, “Nothing future is quite secure; states enough have inwardly rotted—and democracy as a whole may undergo self-poisoning. But, on the other hand, democracy is a kind of religion, and we are bound not to admit its failure. Faiths and utopias are the noblest exercise of human reason, and no one with a spark of reason in him will sit down fatalistically before the croaker’s picture. The best of us are filled with the contrary vision of a democracy stumbling through every error till its institutions glow with justice and its customs shine with beauty.”

In the decades following James’ stirring remarks, our country stumbled toward institutions and customs that glowed with a little more justice for workers, women, and black Americans. Twentieth-century America gave birth to a world-class public educational system that, for all its flaws, gave an astonishing number of people a distinctive liberal education. Unfortunately, for a few decades now we’ve been walking with misplaced confidence toward inequality and empire once again.

But we should refuse to “sit down fatalistically before the croaker’s picture.” As a new world order is taking shape, we have the opportunity to shine like never before as the country where, with the help of the liberal arts, citizens widely participate in the government, workers have a voice in an innovative economy, and the widest number of people enjoy the best of the human inheritance.


Image courtesy of Shutterstock

Knowledge for What?

by Guest Blogger
August 28th, 2014

By Will Fitzhugh

Will Fitzhugh is the founder and editor of the Concord Review, a scholarly history journal with well-researched essays by high school students.

Education is an important issue these days, which is both good and bad. Good, because we need to pay more attention to the work of our schools, and not so good, because lots of people who know all about convertible debentures, initial public offerings, etc., think they must know a lot about teaching and learning as well.

There is prolonged debate about the role of education in promoting citizenship, character, lifelong learning (try living without learning sometime), career readiness, environmental awareness, respect for diversity, and on and on.

What I find missing most of the time is any suggestion that after an education (and during an education) it might be nice to have gained some knowledge. “How did so many countries and peoples get involved in World War I?” for example. “How did Jefferson feel when he had to change his mind about presidential prerogatives under the Constitution when the Louisiana territory came up for sale?” “What was the crucial insight that led Watson and Crick to the understanding of the double helix?”

When people raise the question of “Knowledge for What?” my response is usually: for its own sake. E. D. Hirsch and others have shown that having knowledge is what makes it possible to gain more knowledge. And being able to gain more knowledge is really necessary in life, I would agree. In addition (though perhaps this is just my bias as an editor and publisher of interesting history research papers), I also feel that gaining knowledge is really one of the essential pleasures in life. Jefferson said: “I could not live without books.” I don’t think that was only because some books could aid him in the many architectural and agricultural innovations he cared about.

James Madison wrote: “Knowledge will forever govern ignorance; and a people who mean to be their own governors must arm themselves with the power which knowledge gives…. What spectacle can be more edifying or more seasonable, than that of liberty and learning, each leaning on the other for their mutual and surest support?”

I have been told that Jefferson may even have been able to play some of Mozart’s new work on his violin, and it seems likely he valued that, whether or not he could prove it made him a more efficient farmer, or a more productive President.

Sometimes, I would suggest, in our vigorous (frantic) pursuit of the practical, we skip over some of the things that are of the greatest (practical) value. Once Sir Alexander Fleming was given a tour of the brand-new gleaming headquarters of the Salk Institute by an eager young Ph.D. After the tour, the guide could not help but say: “Just think what you could have discovered if you had only had this state-of-the-art equipment!?!” And Sir Alexander Fleming said, perhaps kindly, “Not penicillin.”

So, by all means, let us introduce more computer technology, more vocational training, more college- and career-ready standards for critical thinking, textual analysis, deeper reading and all of that. But let’s also remember that one of the goals of education must be the acquisition of knowledge, including knowledge of history. We can never be completely sure, at the time we acquire it, when or in what ways some knowledge may be useful in itself in our brief lives as human beings.


Essential pleasures courtesy of Shutterstock.

 

The Skills Myth Might Kill You

by Lisa Hansel
August 13th, 2014

All of us in the Core Knowledge community are well aware of the risks of the skills myth: inadequate reading comprehension, limited critical thinking ability, inability to responsibly fulfill basic citizenship duties like researching issues before voting, etc.

Here’s a new risk: death.

Shockingly, I’m only kind of kidding.

As Annie Murphy Paul explained last week, the vast majority of new doctors think they don’t have to memorize all those pesky medical facts—they can just look them up:

A young doctor-in-training examines a new patient. Should she draw information for the diagnosis from her “E-memory”—electronic memory, the kind that’s available on a computer? Or should she dip into her “O-memory”—organic memory, the old-fashioned sort that resides in the brain?

Research shows that apprentice doctors are increasingly relying on E-memory, often in the form of a digital resource called UpToDate. This is an electronic reference tool, accessible on physicians’ laptops or mobile phones; tap in the patient’s symptoms, and up comes a potential diagnosis and a recommended course of treatment. A recent study found that 89 percent of medical residents regard UpToDate as their first choice for answering clinical questions.

I’m all for reference tools. The more important the decision, the more care we should take in checking our thinking. But doctors who believe such tools make memorization unnecessary are putting us all at risk. They are not developing their own web of knowledge, and so their ability to make connections will be limited. They will be less able to perform the human, artful, problem solving that is the heart of good medicine. This has consequences, as Paul reveals through Jerome P. Kassirer, a professor of medicine at Tufts University:

In medicine, writes Kassirer in an essay in the British Medical Journal, “we don’t always know what we need to know, and searches that are constrained to information we need at a given moment may not generate information that may be critically useful later.”… Kassirer offers an example from his own experience: “From the beginning of my third year at medical school I subscribed to two general medical journals, and I scoured each issue. Then, during my first week of internship, I was asked to examine a patient with hypotension, flushing, diarrhea, and hepatomegaly. About a year earlier a report on the carcinoid syndrome had caught my eye in one of the journals because of its unique metabolic characteristics. I correctly made the diagnosis because the article I had found in browsing had evoked the diagnosis.”

The more important the job, the more care we should take in building our expertise. Depending on E-memory results in shallow knowledge. As Paul writes:

Medical residents are, or should be, in the process of becoming experts, and that process involves building a rich and interconnected database of knowledge in one’s own mind. Research in cognitive science and psychology demonstrates that the ability to make quick and accurate judgments depends on the possession of extensive factual knowledge stored in memory — in internal, organic memory, that is, and not in a device.


Terrifying photo courtesy of Shutterstock.

 

If Only We Had Listened…

by Lisa Hansel
July 15th, 2014

Thanks to my history-loving father-in-law, I’m holding a perfectly preserved editorial from the 1948 Washington Times-Herald—Tuesday, February 24, 1948, to be exact. It’s self-explanatory, so here goes:

More About Schools

A few days ago, we shot a short editorial under the title “Something Wrong With Education.” The piece told how the New York State Department of Education, after an exhaustive survey, had estimated that only about 65% of high school juniors can spell everyday words such as “develop,” “meant,” “athletic,” etc.

From this we inferred that something was moldy in present-day public education methods, and that the something probably wasn’t traceable to either the teachers or the children.

A couple of mornings after that editorial was printed, three mothers of primary public school children in the first and second grades visited your correspondent. There ensued what seemed to us a most interesting conversation—interesting enough to boil down to its essentials here. Let’s call the ladies Mrs. A, Mrs. B, and Mrs. C.

Mrs. A: “The editorial was all right, and I only wish you’d put it at the top of the column instead of the bottom. But the trouble doesn’t start in the high schools. It starts right down in the first grade.”

Mrs. B: “Which they’re turning into kindergarten, where the children don’t learn a thing. Likewise the second.”

Mrs. C: “They call it progressive education. Humph.”

Mrs. A: “Puppets.”

Mrs. C: “Yes, puppets. Puppets they want the children to make out of carrots and things. Even have a book called ‘Puppetry in the Classroom’ or something like that.”

Mrs. B: “It has diagrams—do this and do that, with letters A-B-C to show you what to do to make a puppet. But they don’t teach the children what letters are, or what they mean, or how to read, so how can they make head or tail of the diagrams?”

Mrs. A: “There’s a rule, too, against having any letters or figures on the blackboard. They claim a child of 6 can’t grasp those things and mustn’t be bothered with them, or his co-ordination will go bad—at least I think they call it co-ordination.”

Mrs. C: “Of course the fact is that a child at that age is as curious as can be, and loves to fool with pencils, and is usually just crazy to find out how to write like grownups, how to read the papers, how to count—”

Mrs. B: “Oh, yes, about counting. They don’t teach them nowadays to learn figures and add ’em or subtract ’em. Oh no—they’ve got to count beads on strings, or bounce rubber balls up and down. And they mustn’t learn to go above number 5 for a year or two, because that would strain their brains. Humph.”…

Mrs. C: “It’s not the teachers’ fault. I’m sure of that. Plenty of them will tell you on the quiet that they think these progressive—humph—methods are terrible, and just don’t educate and never will. But they can’t say so in public, because if they did they’d lose their jobs.”

In today’s context, the part of this that most jumps out at me is the mothers’ and editors’ confidence that these poor practices and results are not the teachers’ fault. Indeed, these methods are being imposed on teachers. It’s a sad tale that I continue to hear—teachers who have to close their doors and find spare moments to bring rigor and research-based practices to their classrooms.

Like E. D. Hirsch, I find today’s blame-the-teacher rhetoric shocking and disheartening. How did we get to this point? Hirsch offers a compelling explanation:

The favored structural reforms haven’t worked very well. The new emphasis on “teacher quality” implies that the reforms haven’t worked because the teachers (rather than the reform principles themselves) are ineffective. A more reasonable interpretation is that reforms haven’t worked because on average they have done little to develop “rich content knowledge within and across grades.”

If we are to improve the education we offer all children, reformers must stop blaming teachers and start working with them. As Hirsch explains, “The single most effective way to enhance teacher effectiveness is to create a more coherent multi-year curriculum, so that teachers at each level will know what students have already been taught.” A cumulative, rigorous curriculum is not a cure-all, but it is an essential platform for teachers to work together within and across grades. Schools can choose to write their own curriculum, adopt one, adapt a few—whatever works for them, so long as the result is a content-specific, coherent, cumulative body of knowledge and skills to be learned in each grade. Such a curriculum narrows the gaps in children’s abilities, makes differentiation more doable and effective, and enables the school community to deeply understand and support each child’s year-to-year progress.

In reform circles, however, curriculum is rarely discussed. Rather than wade into the hot water of precisely what students ought to learn, most reformers tinker around the edges of the educational enterprise (which boils down to what gets taught and what gets learned). To that, I say Humph! It’s the reformers’ ideas that are ineffective—not the hardworking teachers.


Stop blaming teachers for reformers’ faulty ideas.

(Photo courtesy of Shutterstock.) 

Stifling Innovation

by Lisa Hansel
March 28th, 2014

Here and there throughout March I’ve been reading the Partnership for 21st Century Skills’ “P21 BLOGAZINE,” a blog with a magazine-style approach in which the editor, Jim Bellanca, picks themes and invites authors to contribute relevant posts. In March, the theme has been creativity and innovation. While there were some points I agreed with—particularly a concern that an over-emphasis on testing and the resulting narrowing of the curriculum will hinder creativity—there was much to question—particularly whether the child-directed, inquiry-driven approach that the authors favored would increase creativity.

For example, after the obligatory homages to Vygotsky and Dewey, there was the usual:

If our goal as educators is to develop a creatively skilled child, then inquiry-guided instruction that fosters imagination, emotional intensity, and curiosity should be infused into the curriculum. Our world is becoming increasingly complex, and therefore the need to teach students how to think and how to use their creative juices to address change must be a priority for our society. Teaching creative imagination should be a key component of 21st century learning.

There was also a more (let’s be polite and call it) creative formulation:

To prepare global, creative, and entrepreneurial talents, that is, everyone in the future, education should at first not harm any child who aspires to do so or suppress their curiosity, imagination, and desire to be different by imposing upon him or her contents and skills judged to be good for him or her by an external agency and thus depriving of the opportunities to explore and express on their own. In other words, we should at least allow Lady Gaga and the likes to exist without punishing them or locking them up in a classroom in the name of helping them to become successful. The most desirable education, of course, is one that enhances human curiosity and creativity, encourages risk-taking, and cultivates the entrepreneurial spirit in the context of globalization.

There were also statements, like this one, that left me perplexed:

Neuroscience research has found creative thinking to be a whole-brain activity leading us to understand that neural responses to creative endeavors can originate anywhere in the brain. The strength of the neural impulses actively transforms thinking and focus; otherwise, a person is just dreaming. These stronger impulses can lead students to persevere and to take educated risks.

Overall, there was very little sense that creativity has anything to do with knowledge or studying works of lasting beauty or building expertise through perspiration (that might be rewarded with inspiration). Although I had been planning to write a bit about how knowledge contributes to creativity and innovation, I’m happy to say that Annie Murphy Paul has done it for me—and done it better than I would have.

Paul was recently a keynote speaker at the Sandbox Summit, where the theme was “Innovation By Design.” The title of her blog post based on her talk, “The Key To Innovation: Making Smart Analogies,” pretty much says it all. There are no analogies without knowledge—and the broader and deeper one’s knowledge, the smarter one’s analogies will be.


More knowledge, better analogies, brighter ideas (image courtesy of Shutterstock).

First, Paul takes care of widespread misconceptions:

There’s a popular notion that innovation arrives like a bolt out of the blue, as a radical departure from previous knowledge—when really, most new ideas are extensions, twists, variations on what’s come before. The skill of generating innovations is largely the skill of putting old things together in a new way, or looking at a familiar idea from a novel perspective, or using what we know already to understand something new.

Then she turns to the power of analogies:

In their book Mental Leaps: Analogy in Creative Thought, cognitive scientists Keith Holyoak and Paul Thagard point out how many intellectual advances through the ages have been built upon analogies:

The first-century Roman architect Vitruvius compared the sound of actors’ voices in an amphitheater to the movement of water in a pool, the first of many thinkers to compare sound waves to water waves.

The seventeenth-century scientist William Gilbert compared the earth to a magnet, advancing knowledge of the earth’s gravitational force.

The eighteenth-century chemist Antoine Lavoisier compared respiration to combustion, clarifying how breathing turns oxygen into carbon dioxide.

Even the great nineteenth-century biologist, Charles Darwin, built his theory of evolution on an analogy between artificial selection—the deliberate mating of animals by breeders—to the natural selection that goes on in the wild.

Finally, Paul explores the keys to using analogies well, including knowing when to set them aside. Are exploration and inquiry part of the process? Absolutely. But are classrooms and knowledge-building curricula stifling innovation? Absolutely not. Quite the contrary: knowledge prevents wasting resources on reinventing the wheel and enables productive, innovative connections to be drawn.

 

Educators: Don’t Assume A Can Opener

by Guest Blogger
March 11th, 2014

By Paul Bruno

Paul Bruno is a middle school science teacher in California. This post originally appeared on his blog: www.paul-bruno.com.

There is a famous joke about the way economists often undermine the usefulness of their conclusions by making too many simplifying assumptions. Here’s one of the older formulations:

There is a story that has been going around about a physicist, a chemist, and an economist who were stranded on a desert island with no implements and a can of food. The physicist and the chemist each devised an ingenious mechanism for getting the can open; the economist merely said, “Assume we have a can opener”!


(Imaginary can opener courtesy of Shutterstock.)

It’s probably not fair to pick on economists in this way when the abuse of simplifying assumptions is at least as widespread in education.

For instance, arguably the trendiest thing going in education today is ‘grit’: “the tendency to sustain interest in and effort toward very long-term goals”.

We all agree, I suspect, that a tendency to persevere is desirable, and that we should prefer that students have more of that tendency than less of it. So it is perhaps not surprising that since the term was popularized by researcher Angela Duckworth, many teachers and schools have begun reorganizing their work to better promote and instill ‘grit’ in their students.

And yet, here’s Duckworth being interviewed by Alexander Russo last month:

Can you talk about how to teach grit in the classroom?

I don’t know that anybody’s totally figured out how to teach it: What do you do exactly, even when we do have insights from research? How do you get your teachers to speak in ways that support growth mind-set? That’s why, through a nonprofit I helped cofound called the Character Lab, we’re organizing some lectures for teachers about self-control, grit, and related topics. It’s not totally prescriptive, because the science is still developing.

Not to put too fine a point on it, the world’s leading expert on grit is saying that educators who are substantially altering their work to better teach grit are doing so without much in the way of scientific backing or guidance.

In other words, in their excitement over grit many teachers and school leaders have simply assumed – without justification – that it is a trait that can be taught and that they know how to teach it.

This is by no means a problem limited to grit. Before grit it was “21st century skills,” “social-emotional learning,” “critical thinking,” or “scientific thinking.” What unites these fads is that they all, to varying degrees, suffer from a lack of rigorous scientific evidence indicating that they can be taught at all, let alone that we have reliable ways of teaching them in schools. (“Fluid intelligence” may be next.)

Meanwhile, we have good evidence indicating that schools today are reasonably – if imperfectly – effective at teaching kids the less-glamorous knowledge and skills – e.g., in math, science, and history – that we associate with “traditional” education.

So while it’s a good idea for researchers and educators to experiment with methods for teaching other, “higher-order” or “non-cognitive” abilities, it’s also important to remember that it is probably premature to ask schools to move away from their core competencies if we can’t also give them a clear alternate path forward.

 

The High Cost of Ignorance

by Lisa Hansel
January 15th, 2014

Earlier this week, I wondered what happened to Tony, a young man who got all the way through K-12 without learning even basic American history. I stumbled onto his story just after reading (yet another) great history book—Jill Lepore’s Book of Ages: The Life and Opinions of Jane Franklin. That’s Ben’s little sister.

 

 

In many ways it’s a tale of two cities. Jane’s life was as grinding as Ben’s was glorious.

Both lived long and displayed much ingenuity. While I doubt any of Ben’s family members were as bright as he—I’m assuming he was rare indeed—Jane had to be very smart merely to survive in her situation. The biggest difference between the two of them seems to have been the opportunity to learn.

Women’s lack of education, based on a widespread belief in their innate and inexorable lack of rational, scholarly, or political abilities, runs throughout the book. We are reminded, for example, of Abigail Adams’s plea to her husband as he helped devise new laws for our new nation: “If particular care and attention is not paid to the ladies, we are determined to foment a rebellion, and will not hold ourselves bound by any laws in which we have no voice or representation.” And of John’s response: “As to your extraordinary code of laws, I cannot but laugh…. Depend upon it, we know better than to repeal our masculine systems” (p. 181-82).

The contrast between this belief system and Jane’s determination to educate herself is sharp. But what really grabbed me—what came to mind as I read about Tony—was Lepore’s excerpt from a 1790 essay by Judith Sargent Murray. As Lepore writes, “Murray asked her reader to imagine the lives of a brother and sister, born very much alike.” I ask you to imagine siblings born alike in 2014, yet one raised in a low-income family and the other in a high-income family:

Will it be said that the judgment of a male of two years old, is more sage than that of a female’s of the same age? I believe the reverse is generally observed to be true. But from that period what partiality! how is the one exalted, and the other depressed, by the contrary modes of education which are adopted! the one is taught to aspire, and the other is early confined and limitted. As their years increase, the sister must be wholly domesticated, while the brother is led by the hand through all the flowery paths of science. Grant that their minds are by nature equal, yet who shall wonder at the apparent superiority, if indeed custom becomes second nature; nay if it taketh the place of nature, and that it doth the experience of each day will evince. (Lepore quoting Murray, page 230)

For century upon century, women were undereducated and assumed incapable. I wonder, how many of our least advantaged youth, like Tony, are suffering that same fate today? The only way to prevent such misunderstandings—to break this self-fulfilling prophecy—is to get the early years right.

 

 

Can Knowledge Level The Learning Field For Children?

by Guest Blogger
December 2nd, 2013

By Esther Quintero

Esther Quintero is a senior research fellow at the Albert Shanker Institute. This post first appeared on the Shanker Blog.

How much do preschoolers from disadvantaged and more affluent backgrounds know about the world, and why does that matter? One recent study by Tanya Kaefer (Lakehead University), Susan B. Neuman (New York University), and Ashley M. Pinkham (University of Michigan) provides some answers.

The researchers randomly selected children from preschool classrooms in two sites, one serving kids from disadvantaged backgrounds, the other serving middle-class kids. They then set about to answer three questions:

  1. Do poor and middle-class children possess different knowledge about the world?
  2. Do differences in knowledge influence the children’s ability to learn in the classroom?
  3. If differences in preexisting knowledge were neutralized, would the two groups of children learn similarly?

To answer the first question, the researchers determined how much children from both groups knew about birds and the extent to which they were able to make inferences about new words based on such knowledge.

Not surprisingly, lower-income children had significantly less knowledge about birds and bird behaviors than did their middle-class peers. To rule out the possibility that these differences were the result of disparities in language proficiency, Kaefer et al. measured the children’s receptive vocabularies. This way, they were able to establish that poor kids knew less about birds, not merely because they knew fewer words related to birds, but because they had less information about the domain in general.

To answer the second question — whether differences in knowledge influence the kids’ ability to learn in the classroom — a second study evaluated children’s ability to understand words out of context and to comprehend a story that was read to them. As predicted, children from middle-class backgrounds, who had greater knowledge about the domain category (i.e., birds), performed better in these two tasks than children with more limited knowledge about the domain.

It may not be obvious to adults, but learning words from books is not an automatic or straightforward task for young children. In fact, argue the authors of the paper, one of the factors influencing this process is children’s preexisting knowledge. Previous research (cited in the paper) has established that children with larger vocabularies acquire new words implicitly from storybooks more readily than children with smaller vocabularies. At least two mechanisms might explain the relationship between vocabulary and learning.

First, the authors note, one possible explanation is that metalinguistic factors (e.g., verbal IQ, working memory) explain the relationship between vocabulary knowledge and implicit word learning.

Alternatively, if children’s vocabulary is viewed as an indicator (or “reflection”) of their general background knowledge, it may be the breadth and depth of their preexisting knowledge that influences their implicit word learning.

The logic of the second mechanism is as follows: Children’s preexisting knowledge creates a framework that facilitates the acquisition of new information; knowing more words and concepts scaffolds children’s ability to slot new information in the “right places,” and to learn related words and concepts more efficiently.

To recap, the first study discussed above established that children from disadvantaged backgrounds know less about a topic (i.e., birds) than their middle-class peers. Next, in study two, the researchers showed that differences in domain knowledge influenced children’s ability to understand words out of context and to comprehend a story. Moreover, poor kids—who also had more limited knowledge—performed worse on these tasks than did their middle-class peers. But could additional knowledge be used to level the playing field for children from less affluent backgrounds?

In study three, the researchers held the children’s prior knowledge constant by introducing a fictitious topic—i.e., a topic that was sure to be unknown to both groups. When the two groups of children were assessed on word learning and comprehension related to this new domain, the researchers found no significant differences in how poor and middle-class children learned words, comprehended a story or made inferences.

These results:

  • Add to the body of research showing that preexisting knowledge shapes incidental vocabulary learning and comprehension for children, and that this is true for children as young as preschool age;
  • Highlight the need to build children’s background knowledge more systematically and strategically, and suggest that procedures to activate children’s prior knowledge—e.g., storybook reading—may prove fruitless when such knowledge does not exist.

While this research, like all research, has limitations—see the paper for a discussion of these—the results taken together suggest that one powerful way to level the “learning field” for all children is to facilitate poor kids’ access to “taken for granted” knowledge that middle class children, on average, are more likely to possess, primarily because they have been exposed to it in the first place.

When poor and middle-class children are given the same opportunities to assimilate new knowledge, their subsequent learning is comparable. Of course this is only one study, but the main finding and its implications are extremely powerful. It suggests that if preschool programs are not making a difference for children from disadvantaged backgrounds, it might be the case that the programs are not tackling an important but solvable problem: a deficit in knowledge.

 

Grant Wiggins Doesn’t Quite Understand E. D. Hirsch

by Guest Blogger
October 16th, 2013

By Harry Webb

This post originally appeared on Webs of Substance, a blog on educational research.

In his latest blog post, Grant Wiggins expresses his frustration at the recent writings of E. D. Hirsch, Jr. It is unsurprising that Wiggins would be irritated by Hirsch: Since the publication of Cultural Literacy in 1987, Hirsch has been an annoyance to the education establishment, particularly in the US. And now, with the adoption of Hirsch’s Core Knowledge curriculum by many new charter schools, American education can no longer marginalize Hirsch’s message. His star is in the ascendant.

Of all the logical fallacies, the Straw Man seems to be Wiggins’ favorite. He mentions it three times in the blog post and again in response to comments. The Straw Man that Wiggins thinks he has detected is the idea that there are people who will deny the role of factual knowledge in reading comprehension. Indeed, Wiggins asks, “Would Hirsch please quote someone who does deny it, instead of setting up his straw man?”

A Straw Man

I have said before that I don’t know of anyone who would outright condemn the acquisition of all forms of knowledge. The game is played much more subtly than that. Instead, the role of knowledge acquisition is diminished. It is made to seem inferior to other goals of education such as training in skills of various forms. It is this that I understand Hirsch to be unhappy about.

Wiggins targets Hirsch’s use of assertions. However, Hirsch does draw upon some evidence to support his claims. The amount of time given over to English Language Arts instruction, for instance, increased with the introduction of the NCLB act, without a transformative effect on reading proficiency. And this increase has come at the cost of subjects such as social studies, science, art, and music. Hirsch’s view is that instruction in specific reading skills and strategies has limited effect on improving reading. He views the acquisition of broad background knowledge as far more important, and this view certainly has some support from the realm of cognitive science.

This means that, according to Hirsch, the NCLB-led distortion of the curriculum is doubly dangerous. Not only is it likely to lead to redundant skill-based reading instruction in those additional English Language Arts lessons, it will also cut the exposure to content knowledge in social studies, science, art and music. So, you see, I think that Hirsch has a point.

I have read The Knowledge Deficit by Hirsch and I am attracted to his thesis on how we have arrived at this point. To paraphrase, Hirsch thinks that the educational establishment decided that the mere transmission of knowledge was not a suitable goal of education. However, once this goal is removed then another has to be found. Hirsch views this as the reason for a focus on transferable skills such as reading comprehension skills or higher-order thinking skills. If these skills can be identified and taught then we have a new role for education.

I am attracted to Hirsch’s thesis because it chimes with my own experience. In the very first lecture at my school of education, I was introduced to a misreading of Plutarch, the gist of which was to warn us all that we were not to see our role as filling students up with knowledge. Ever since, this has been reinforced in many and varied ways; I have even attended an education research conference where speaker after speaker derided “transmission” teaching as if we would all accept this perspective without question. No, you will not find anyone who will completely deny the role of factual knowledge in reading or any other endeavor; to do so would be absurd. However, you will find plenty who will downplay it.

In fact, this is exactly what Wiggins and his co-author Jay McTighe do in their book, Understanding by Design.

“To know which fact to use when requires more than another fact. It requires understanding – insight into essentials, purposes, audience, strategy, and tactics. Drill and direct instruction can develop discrete skills and facts into automaticity (knowing “by heart”), but they cannot make us truly able.”

There is much to unpack here. Knowledge is reduced to facts, and facts, by definition, have to be disconnected and known “by heart,” without understanding. Understanding comes not from acquiring more knowledge – the facts that link the facts – but from some spookier kind of thing: insight. Finally, drill and direct instruction cannot make us truly able.

The first thing to note is that this is a string of assertions. I have not removed the footnotes when quoting this passage; there simply aren’t any. At a minimum, the point about direct instruction requires support. I am aware of no evidence that direct instruction leads to an inferior form of learning compared with any other approach, despite the many researchers who would like to demonstrate it. I suspect that the evidence is not quoted because there is no evidence.

In the same chapter, Wiggins and McTighe go on to draw up a table to distinguish “knowledge” from “understanding,” just in case we were not clear. Knowledge is “the facts” whereas understanding is “the meaning of the facts.” This is a little odd; can we not know the meaning of the facts? Further, knowledge is “a body of coherent facts” whereas understanding is “the ‘theory’ that provides coherence and meaning to those facts.” The coup de grace is in the final pair of statements: knowledge is “I respond on cue with what I know” whereas understanding is “I judge when to and when not to use what I know.”

Clearly, understanding is a superior kind of thing to knowledge. Knowledge just consists of a discrete series of facts that children bark on cue, probably in the context of some dismal drill- or direct-instruction-based lesson. It is easy to see why teachers would not want to focus on the acquisition of knowledge if we are going to define it in these terms.

Of course, Wiggins is not alone in these views. Even back in 1916, Cubberley made a similar contrast in “Public School Administration.” According to Diane Ravitch:

“When it came to the curriculum, he authoritatively contrasted two approaches: One was ‘the knowledge curriculum,’ which he described in highly pejorative terms: ‘Facts, often of no particular importance themselves, are taught, memorized and tested for, to be forgotten as soon as the school-grade need for them has passed.’ The opposite of this dreary approach was ‘the development type of course,’ in which ‘knowledge is conceived of as a life experience and inner conviction and not as the memorization of the accumulated knowledge of the past.’ Using the latter approach, school would change from a place in which children prepare for life by learning traditional subjects to one in which children live life.”

So, you see, the tradition of devaluing knowledge, of denying that understanding is a form of knowledge, of linking knowledge to pure rote learning: this is an old tradition.

In this context, I am glad that there is someone like Hirsch out there, arguing for the value of content knowledge. His is a perfectly valid argument that needs to be more widely heard.

 

Predicting Failure

by Lisa Hansel
October 8th, 2013

Caution: Frustration Ahead! Yes indeed, this post needs a warning label.

The BBC, in conjunction with the British Council, is aiding and abetting the spread of edutainment and comprehension strategies through its website TeachingEnglish.

 

BBC thumbs down courtesy of Shutterstock.

 

A look at the most recently added lesson plans reveals far too much trivial content. The first two pages have “lessons” on snacking, playground words, food festivals, gossip, and texting. Out of the 10 lessons shown on those two pages, there is one substantive lesson on carnivores vs. herbivores, which draws on “Triumph of the Herbivores” from BBC Earth. With the BBC’s in-depth news coverage and documentaries, I assumed all its lessons would draw on its treasure-trove of content.

To be fair, I must say that I have merely perused the site. These recently added lessons could grossly misrepresent the bulk of the content—and I hope they do. I also hope the BBC starts vetting the lessons to remove the many that are substance-free. In particular, I hope it removes this one: “Pause & predict – YouTube technique.” It encourages teachers to use Mr. Bean clips to teach children to make predictions. The lesson is a couple of years old, so I wish I could just shudder and forget about it. But over the weekend, this time-wasting lesson spread to the US:

By fourth grade, students are often proficient at making predictions about what will happen at the end of a book…. What they aren’t as used to is making small predictions–close predictions–thinking about how a character might respond to the next big event or interaction based on how that character has responded in the past…. Mr. Bean is a great character to use for prediction work, because he has a very clear M.O. He tries to solve his problems in ways that fix the immediate issues, but miss the main point. For example, in the short clip, “Packing for a Holiday,” Mr. Bean manages to fit everything in a suitcase, but he does so by making the items useless, like packing only half a shoe.

Mr. Bean on the high dive is priceless, but my knowledge of Mr. Bean—including my ability (or lack thereof) to predict what he’ll do next—never helped me in college, in the voting booth, in keeping up with current affairs, etc. Simply put, the lovable Mr. Bean’s purpose is laughter and relaxation—not education. In the US, youth only spend 20% of their waking time (about 12% of their total time) in school. Given the great breadth of knowledge, vocabulary, and skills they need to acquire to become literate adults, we just don’t have time for Mr. Bean.

The sad fact is, the teachers who are excited about this Mr. Bean lesson don’t know any better. As NCTQ has clearly demonstrated, the majority of teachers are never taught that knowledge, vocabulary, and fluent decoding are essential to reading comprehension. Many are taught about comprehension strategies; but without strategies being placed in the larger context of how comprehension develops, teachers end up with a very skewed notion of best practices. Predicting how Mr. Bean will pack for a holiday is the result.

So long as teacher and administrator preparation passes on mistaken beliefs instead of cognitive science research, such silliness is inevitable. Poor preparation leads to weak curriculum selection and development, which is then reinforced with professional development based on the same mistaken beliefs. Our current teachers would be far more effective if they were given a better education and better instructional materials.

In that spirit, let’s see what cognitive science tells us about comprehension strategies. Daniel Willingham summed up the research as follows:

[Comprehension strategies] don’t really improve the comprehension process per se. Rather, they help kids who have become good decoders to realize that the point of reading is communication. And that if they can successfully say written words aloud but cannot understand what they’ve read, that’s a problem. Evidence for this point of view include data that kids don’t benefit much from reading comprehension instruction after 7th grade, likely because they’ve all drawn this conclusion, and that increased practice with reading comprehension strategies doesn’t bring any improved benefit. It’s a one-time increment.

Willingham goes on to explain that “the one-time boost to comprehension can be had for perhaps five or ten sessions of 20 or 30 minutes each” and that the rest of the time spent on comprehension strategies is both a waste of time and counterproductive: It makes reading boring. That’s probably why some teachers have turned to Mr. Bean. A better solution would be to spend far less time on comprehension strategies and far more on science, history, literature, art, geography, and music.