Dhamma

Thursday, April 10, 2025

White Slavery Denial


Jim Goad: Throughout his purported “debunking,” Hogan floats the idea that white indentured servitude and black slavery were so different as to be incomparable. Why is he wrong?

Michael A. Hoffman II: White enslavement has a long history in Britain and Scandinavia. Mr. Hogan, like so many writers on this subject who conform to the Establishment line, overlooks several points and has a naive faith in the white ruling class of Britain.

The conforming authors don’t take into account that the study of white enslavement has been impeded over the centuries by two facts missing from their own studies and critiques. First, white enslavement carried with it a hereditary taint. As I demonstrate in my book, this extends back to the Vikings, who looked on Scandinavian “thralls” and those born into “thralldom” as carrying a hereditary defect, in part because their enslavement was hereditary. Something akin to this was operating in Anglo-Saxon England under the category of villeinage. The daughter of a villein could not be married without buying her way out of villeinage. The test of a young English woman’s status in early medieval England—whether she was slave or free—was decided by whether or not the local lord had control over her body, as all masters have over their slaves. Under English law, a villein woman was one who could not be married until she first offered what was known as the custom of the country, the “ransom of blood for merchet” (i.e., until she first submitted to sexual intercourse with the lord, prior to her marriage).

Both in early Britain and Scandinavia and much later in the sugar plantations of the West Indies and the tobacco plantations of America (before cotton became king), British-American whites whose parents or relatives had been enslaved, or who themselves had escaped it, or bought their way out (as did some black people), were exceedingly reticent to identify as former white slaves or as their progeny. “Servant” was a far more attractive category, and we see identifications with this status and less so with outright enslavement due to the stigma.

Hogan has a pictorial display allegedly confuting my thesis in which he displays a photograph of what appear to be mixed-race people in the West Indies who are Barbados residents circa 1908, and he is incredulous that no Irish white “redlegs” are in the photo or named as such. We know from primary source materials that white slaves worked alongside blacks in the West Indies and that the word “redleg” was applied to them by their darker coworkers because of the tendency of the white slaves to sunburn. Whether or not their descendants stayed on the island to pose for a camera nearly 300 years later is a flimsy device for casting doubt on a historical epoch. Limiting oneself to the search for Irish surnames is an even more egregious error because it excludes the search for traces of the English Protestants who were transported as slaves to those islands in the 17th and early 18th century.

Hogan and company also support the notion that white bondage was mostly carried out within a framework of law. Here they are swallowing the propaganda of the white ruling class who were permitting the mass kidnapping of white children off the streets of port cities such as London and Glasgow for shipment to America under no indentures whatsoever, or under forged indentures, or as criminalized paupers and “rogues.” Upon arrival these youth were often put to work clearing forests and draining swamps. They were regularly assaulted, ill-fed, and worked to death. Britain had a surfeit of poor white youth who represented a potential for a French-Revolutionary type of insurrection in the cities. The aristocracy was only too glad to be rid of them, and I draw my evidence for this from state papers and contemporary letters and eyewitness accounts.

Hogan accepts the official tale of the slavers themselves, as they obscured their monstrous crime against their own people. So for him, there is no mass “kid-nabbing” (as it was first termed). He is blind to the fact that to facilitate white bondage under penal enslavement, the British ruling class contrived laws such as the Waltham Black Act, which made simple misdemeanors (stealing lace, breaking down an aristocrat’s fish pond, poaching deer) into felonies punishable by “transportation for life into the colonies.” In the 17th century the tens of thousands of prisoners these laws netted were not sent to slavery in British America and the West Indies on indentures. They were sold at the ports on arrival. As prisoners they had no rights. Diaries, letters, and eyewitness accounts give testimony to their horrible mistreatment and slave labor. Mr. Hogan won’t call these wretches “slaves,” even though they were clearly handled like chattel (cattle).

It seems that a powerful lobby has decided that black folk have a proprietary relationship with the word slave. “You blacks are the slave race,” they are told by their liberal white alleged “allies.” If this designation is false, however, then it constitutes an act of psychological crippling. The history of the white race is in many respects synonymous with the long history of enslavement, and if blacks have a copyright on the word “slave,” someone should tell that to the Slavic people, who for generations were targets of Viking slave raids and from whom the word “slave” is derived.

The weakness of Mr. Hogan’s assertions can be found in his gullible acceptance of the party line of the white ruling class with regard to their whitewash of their role in white enslavement. The white ruling class is excoriated with contempt by the left when they minimize crimes of the British aristocracy and capitalists against people of color, but Mr. Hogan will believe them implicitly when they aver that white bondage was operated within a legal framework, and it only involved “servants.”

The kidnapping of poor whites has precedent in Britain. It does not have a legal basis per se, but it has color of law, and we find it in the systematic mass kidnapping of British people for maritime slavery aboard ship for the Royal Navy. I can anticipate the objection: The bondage was for a determinate period of time. Officially, yes. An Englishman was kidnapped off the streets and country lanes of Britain with the connivance of the judiciary and conscripted for a period of years, but in actuality that determinate number of years was not worth the ink used to write it, when it so happened that the victim of abduction was returning home from a five- or seven-year compulsory voyage. He saw the blessed shore of England at long last, prayed that his wife had not run off and that his children and parents still lived, and then, a few miles from shore, another man-of-war sailed up to his vessel, boarded it at sea with a press gang, and kidnapped him again for another multi-year abduction. This could happen two or three times. The kidnapped sailor could be gone from Britain ten, fifteen, or twenty years and be killed or severely injured during that time. I doubt Mr. Hogan would confer upon this naval slave that title, because if the slave survived his ordeal he eventually went home. He had been a slave for a time, and to quibble over it is to do a grave disservice to the memory of tens of thousands of Royal Navy slaves. Impressment was one of the rotten roots, along with villeinage, that created a precedent for an institutional framework for white slavery concealed under cover of “indenture” or some other deceptive and cosmetic rubric.

Jim Goad: In this passage, Hogan essentially accuses you and others of lying: “The inclination to describe these different forms of servitude using the umbrella term ‘slavery’ is a wilful [sic] misuse of language.” Didn’t contemporary legal documents often refer to alleged “indentured servants” as “slaves”?

Michael A. Hoffman II: The truth was in the writings of the white slaves themselves, who referred to themselves as such, and of eyewitnesses to their plight who wrote accounts of what they saw. This is in my book They Were White and They Were Slaves.

As for willful misuse of language, I suppose I should apologize for defying the Establishment-imposed monopoly on how the word “slave” is to be employed, but I cannot, because like all monopolies this one is a form of restraint, enforced by thought cops indifferent to truths that violate the whole foundation of their monopoly on history and white self-image.

Jim Goad: Hogan claims your book deceptively uses “selective quotations taken from nearly 200 different secondary sources,” that your motive is “to deny reparations for slavery in America,” and that your “denial has a pro-slavery ideological lineage.”

Michael A. Hoffman II: One usually can discern that something is defective about an argument when it is ad hominem, and we discover this ill omen in Mr. Hogan’s resort to Pavlovian incentives for stigmatizing a writer with pejoratives calculated to cue an intended audience of partisans that here is someone to despise and dismiss. If he’s writing mainly for the benefit of Mark Potok and members of the new religion of Multiculturalism and Diversity, then he’s in luck. If he seeks a wider audience, however, then I don’t see how such tactics help to make his case.

The idea of paying reparations to any aggrieved racial or ethnic group is something with no appeal to me. It devolves into one-upmanship in the game of guilt imposition. Even in the matter of the Irish Catholic slaves, the focus is too narrow, and the subtext is one that excludes English Protestant slaves, since the Irish narrative is too often beholden to a vision of near-perpetual victimization by the English, which excludes the reality that a vicious white ruling class in London has seldom had any compunctions against betraying and enslaving their own white English yeomanry.

Do the English pay reparations to the Irish for what Oliver Cromwell did in transporting Irish slaves to the West Indies? Do the Catholics pay reparations because the pro-Catholic Stuart King Charles II shipped criminalized and enslaved English Protestants to the West Indies? A world full of victims demanding payment is a definition of a madhouse, not a civilization. The reparations process is very often politicized, with war victors or the plutocracy with the most bloat apportioning the guilt and assigning the victim status.

As a revisionist historian, I wrote They Were White and They Were Slaves for the astonishing reason (to some observers) that it was a chronicle that had not been adequately addressed and thus was pure gold from the standpoint of historical rediscovery. My “sin” is to have detailed the history without regard to the idol of political correctness. My book is good news for black people: You are not inferior; you were not the only race or the main race in chattel bondage.

Jim Goad: In an earlier email you wrote to me: “the work of my opponents (at least Hogan that is; the distaff side of the trio today admitted to me that she has not read my book), is so blundering (at least what I have read so far) as to actually add ammunition to my thesis.” How so?

Michael A. Hoffman II: At this URL, Hogan, Matthew Reilly, and Laura McAtackney make the case for one of the assertions in my book. They write:

“If a white servant assaulted another servant or a slave, it was treated as a misdemeanor and they were fined. If they assaulted their master they were whipped. Their indenture was legal property therefore a servant’s remaining time could be left in wills, traded for commodities and sold. Since one’s labor is inseparable from one’s person, this meant indentured servants in Barbados were treated as a sort of commodity. The distinction between voluntary and involuntary indentured servitude is also important, but all too often serves as justification for the existence of “white slavery”. It is true that some Europeans, particularly prisoners of war or political prisoners, were sent to places like Barbados against their will and without a predetermined period of servitude. However, upon arrival, those without contracts were, by law, required to serve the master who purchased their labor for a limited number of years, depending on their age. It is also true that many servants didn’t live to see the end of their period of servitude due to brutal treatment and unsparing work regimens, but while under the conditions of servitude, they were subject to the same laws that governed European servants, not enslaved Africans.” (End quote; emphasis supplied).

Apart from the risible fiction that all white bondage was “indentured” and for a “predetermined period” and therefore all of it was scrupulously governed by some kind of “European servant” legislation and jurisprudence, Hogan, Reilly, and McAtackney concede that “many” whites in bondage suffered “brutal treatment and unsparing work regimens” which proved lethal.

Even they admit that many whites were sent into bondage in Barbados and then worked to death. This is not slavery and it cannot be designated chattel slavery? By what sufficiently dainty term do we describe it? The death of many “servants” was by accident? Were their masters prosecuted and executed for this? What sort of human being who is beaten and worked to death is undeserving of the name “slave”?

Notice as well the knee-jerk assumption the three critics of white slave history exhibit: They expect us to believe that in every case where a white person in bondage is whipped, it was because “they assaulted their master.” How do they know this? Imagine the outcry if someone made such a characterization about the whipping of blacks in bondage—that the flogging was always their fault?

How is the experience of whites in servitude, who were at the mercy of masters of all types, reducible to the notion that no master ever unjustly lashed an un-free white person? This myth presupposes that whites were never whipped due to having tried to run away, or because they were too sick to work, or they refused the master’s sexual advances. By some miracle, the human predicament by which nonwhites in bondage unjustly found themselves on the receiving end of a lash is not shared by whites in bondage. Here we observe the subhuman status of whites in servitude as we demarcate the dimensions of the distortion: They died as slaves yet must not be called slaves; unlike black human beings who experienced servitude and were unconscionably whipped, whites who were whipped almost always deserved it.

Jim Goad: Isn’t it true that several historians who can in no way be smeared as “holocaust deniers” or “white supremacists” essentially agree that white slavery existed in the colonies? In my research for The Redneck Manifesto, I found historians across the political spectrum essentially agreeing with the historical facts you raise in your research.

Michael A. Hoffman II: The cumulative evidence of A. Roger Ekirch in Bound for America (1990) is conclusive, though he employs euphemisms for white enslavement. He certainly does not fit the hysteric’s categories of “denier” or “supremacist.” Oscar and Mary Handlin openly refer to whites in servitude in the seventeenth century as “slaves.” See “Origins of the Southern Labor System” in William and Mary Quarterly, April 1950. Personally, I rely mainly on the testimony of the white slaves themselves, their own writings and pleadings, and on those persons, high and low, who encountered them and wrote of them.

Jim Goad: With all these allegations of people trying to deny white guilt, is it possible that others are trying to deny black responsibility? Couldn’t one argue that some people are lazily trying to use slavery as a blanket and bulletproof excuse for black academic and economic failure?

Michael A. Hoffman II: Some black people are angered that there is an entire industry in the U.S. centered on the so-called “Holocaust” of Judaic persons in eastern Europe decades ago, with museums in America and innumerable college classes, books, and movies, and a de facto “Holocaust” tax on Americans in the form of billions of dollars in US aid sent to the Israelis, while the descendants of the black people who worked the plantations and helped to bring to market the cotton—the greatest material prize of antebellum America—never did receive their “40 acres and a mule.” One wonders whether the recent energy devoted to black slave reparations and African-American museums and university curricula would be as intensely promulgated and ubiquitous if there was no “Holocaust” industry to remind American black people that they are not getting their share of the white guilt pie.

Certainly if reparations are to be paid for slavery in America, they must also be paid to the descendants of the white slaves who also performed the requisite labor that enriched the planter class. Before there was much of a planter class on the eastern frontier, the white slaves of the 17th century performed the most back-breaking labor of colonial settlement: land-clearing.

Jim Goad: So much of Hogan’s alleged “debunking” simply employs the fallacies of argumentum ad hominem and appeal to motive. But since he called your motives into question, what might be someone’s motive in denying the well-documented history of white slavery?

Michael A. Hoffman II: In my book I give examples of white British aristocrats speeding in their carriages to abolitionist meetings where the plight of blacks in America was decried, heedless of the white children by the side of the road who had just finished toiling 16 hours, half-naked, in a mine, or who had had their arms and legs mutilated in the factory machinery of the early Industrial Revolution, whose owners, such as Josiah Wedgwood, and the famous poet and mine owner Lord Byron, considered poor British whites entirely expendable.

Charles Dickens, who had been a child laborer in a chemical factory, termed this callous hypocrisy “telescopic philanthropy.” The white elite of Britain had the apparition of black enslavement constantly before their eyes, even though it was thousands of miles away, while they were oblivious to the English boys who were sold to chimney sweeps and who sometimes burned and suffocated to death in the chimneys of the magnificent mansions of the abolitionists. Almost no one was paying attention to their agony. It’s supposed to be naughty of me to refer to child labor in the factories, mines, and chimneys of Britain as white slavery. I do not know what else to term girls and boys stripped virtually naked, toiling side by side for 16 hours in a mine, stunted and blackened, or in a factory where they were often seriously injured or killed, who undertook this labor because otherwise they would starve to death. They went to their labor before dawn and ceased their labors after dark.

Most of your readers are familiar with white self-hatred. This perverse loathing, coupled with the legacy of the betrayal of the masses of white paupers and laborers in Britain by their rulers, has created the current situation of ruinous psychological and spiritual alienation in which whites would seem to desire the demographic extinction of their own seed.

Jim Goad: The SPLC claims that you are engaging in “an ahistorical reimagining of real events weaponized by racists and conspiracy theorists.” Speaking of “ahistorical reimaginings,” has the Southern Poverty Law Center ever addressed the Jewish role in the transatlantic slave trade?

Michael A. Hoffman II: Mr. Dees’s poverty palace is a satrap in the Israeli orbit. His hate mill seldom grinds the racist libels and murderous deeds of the settler-rabbis. Goyim-hating Chabad rabbis travel throughout the US to disseminate their racist doctrines, and the SPLC purrs like a kitten in the face of it. Dees and Potok are not going to reveal to their duped supporters—and their numerous cronies at executive levels of the US media—the pivotal role Judaic-Americans played in the merchandising of black flesh prior to the 20th century. Black academics such as the late Prof. Tony Martin of Wellesley College are marginalized and their career opportunities severely curtailed if they dare to inform their students of the vast Judaic role in black enslavement in the Americas, as detailed in the magisterial history books, The Secret Relationship Between Blacks and Jews, Volumes I, II, and III.

Jim Goad: Hogan claims that the purpose of reviving this alleged “myth” is that it “aims to shut down all debate about the legacy of black slavery in the United States.” Yet for someone who is apparently in favor of free and open debate, you claim he’s blocked you on Twitter. Would you be willing to debate Hogan…?

Michael A. Hoffman II: I would very much hope that Mr. Hogan would be true to his fighting Celtic heritage and consent to go a round with this writer.

(Written in 2016, published in the 2018 book Whiteness: The Original Sin. The Hogan/Hoffman debate never happened, but Jim Goad interviewed Hoffman on his podcast in 2017.)

https://jimgoad.net/podcast/?episode=13


Wednesday, April 9, 2025

The Marshmallow Test and self-control

INTRODUCTION

AS BOTH MY STUDENTS and my children can testify, self-control does not come naturally to me. I have been known to call my students in the middle of the night to ask how the latest data analysis was going, though it began only that evening. At dinners with friends, to my embarrassment my plate is often the first to be clean, when others are far from done. My own impatience, and the discovery that self-control strategies can be learned, has kept me studying those strategies for a lifetime.

The basic idea that drove my work and motivated me to write this book was my belief, and the findings, that the ability to delay immediate gratification for the sake of future consequences is an acquirable cognitive skill. In studies initiated half a century ago, and still ongoing today, we’ve shown that this skill set is visible and measurable early in life and has profound long-term consequences for people’s welfare and mental and physical health over the life span. Most important, and exciting for its educational and child-rearing implications, it is a skill open to modification, and it can be enhanced through specific cognitive strategies that have now been identified.

The Marshmallow Test and the experiments that have followed over the last fifty years have helped stimulate a remarkable wave of research on self-control, with a fivefold increase in the number of scientific publications just within the first decade of this century. In this book I tell the story of this research, how it is illuminating the mechanisms that enable self-control, and how these mechanisms can be harnessed constructively in everyday life.

It began in the 1960s with preschoolers at Stanford University’s Bing Nursery School, in a simple study that challenged them with a tough dilemma. My students and I gave the children a choice between one reward (for example, a marshmallow) that they could have immediately, and a larger reward (two marshmallows) for which they would have to wait, alone, for up to 20 minutes. We let the children select the rewards they wanted most from an assortment that included marshmallows, cookies, little pretzels, mints, and so on. “Amy,” for example, chose marshmallows. She sat alone at a table facing the one marshmallow that she could have immediately, as well as the two marshmallows that she could have if she waited. Next to the treats was a desk bell she could ring at any time to call back the researcher and eat the one marshmallow. Or she could wait for the researcher to return, and if Amy hadn’t left her chair or started to eat the marshmallow, she could have both. The struggles we observed as these children tried to restrain themselves from ringing the bell could bring tears to your eyes, have you applauding their creativeness and cheering them on, and give you fresh hope for the potential of even young children to resist temptation and persevere for their delayed rewards.

What the preschoolers did as they tried to keep waiting, and how they did or didn’t manage to delay gratification, unexpectedly turned out to predict much about their future lives. The more seconds they waited at age four or five, the higher their SAT scores and the better their rated social and cognitive functioning in adolescence. At age 27–32, those who had waited longer during the Marshmallow Test in preschool had a lower body mass index and a better sense of self-worth, pursued their goals more effectively, and coped more adaptively with frustration and stress. At midlife, those who could consistently wait (“high delay”), versus those who couldn’t (“low delay”), were characterized by distinctively different brain scans in areas linked to addictions and obesity.

What does the Marshmallow Test really show? Is the ability to delay gratification prewired? How can it be taught? What is its downside? This book speaks to these questions, and the answers are often surprising. In The Marshmallow Test, I discuss what “willpower” is and what it is not, the conditions that undo it, the cognitive skills and motivations that enable it, and the consequences of having it and using it. I examine the implications of these findings for rethinking who we are; what we can be; how our minds work; how we can—and can’t—control our impulses, emotions, and dispositions; how we can change; and how we can raise and educate our children.

Everybody is eager to know how willpower works, and everybody would like to have more of it, and with less effort, for themselves, their children, and their relatives puffing on cigarettes. The ability to delay gratification and resist temptations has been a fundamental challenge since the dawn of civilization. It is central to the Genesis story of Adam and Eve’s temptation in the Garden of Eden, and a subject of the ancient Greek philosophers, who named the weakness of the will akrasia. Over the millennia, willpower was considered an immutable trait—you either had it or you didn’t—making those low in willpower victims of their biological and social histories and the forces of the momentary situation. Self-control is crucial for the successful pursuit of long-term goals. It is equally essential for developing the self-restraint and empathy needed to build caring and mutually supportive relationships. It can help people avoid becoming entrapped early in life, dropping out of school, becoming impervious to consequences, or getting stuck in jobs they hate. It is the “master aptitude” underlying emotional intelligence, essential for constructing a fulfilling life. And yet, despite its evident importance, it was excluded from serious scientific study until my students and I demystified the concept, created a method to study it, showed its critical role for adaptive functioning, and parsed the psychological processes that enable it.

Public attention to the Marshmallow Test increased early in this century and keeps escalating. In 2006, David Brooks devoted an editorial to it in the Sunday New York Times, and years later in an interview he conducted with President Obama, the president asked Brooks if he wanted to talk about marshmallows. The test was featured in The New Yorker in a 2009 Department of Science article, and the research is widely presented in television programs, magazines, and newspapers throughout the world. It is even guiding the efforts of Sesame Street’s Cookie Monster to master his impulse to voraciously devour cookies so that he may join the Cookie Connoisseurs Club. The marshmallow research is influencing the curriculum in many schools that teach a wide range of children, from those living in poverty to those attending elite private academies. International investment companies use it to encourage retirement planning. And a picture of a marshmallow has become an immediately understood opener to launch discussions of delay of gratification with almost any audience. In New York City, I see kids coming home from school wearing T-shirts that say Don’t Eat the Marshmallows and large metal buttons declaring I Passed the Marshmallow Test. Fortunately, as the public interest in the topic of willpower increases, so does the amount and depth of scientific information on how delay of gratification and self-control are enabled, both psychologically and biologically.

In order to understand self-control and the ability to delay gratification, we need to grasp not only what enables it but also what undoes it. As in the parable of Adam and Eve, we see headline after headline that reveals the latest celebrity—a president, a governor, another governor, a revered judge and moral pillar of society, an international financial and political wizard, a sports hero, a film star—who blew it with a young intern, a housekeeper, or an illegal drug. These people are smart, and not just in their IQ intelligence but emotional and social intelligence as well—otherwise they could not have achieved their eminence. Then why do they act so stupid? And why do they have so much company in the many men and women who never make it into the headlines?

I draw on findings at the vanguard of science to try to make sense of this. At the heart of the story are two closely interacting systems within the human brain, one “hot”—emotional, reflexive, unconscious—and the other “cool”—cognitive, reflective, slower, and effortful. The ways in which these two systems interact in the face of strong temptations underlie how preschoolers deal with marshmallows and how willpower works, or doesn’t. What I learned changed my long-held assumptions about who we are, the nature and expressions of character, and the possibilities for self-generated change.

Part I, Delay Ability: Enabling Self-Control, tells the story of the Marshmallow Test and the experiments that showed preschool children doing what Adam and Eve could not do in the Garden of Eden. The results identified the mental processes and strategies through which we can cool hot temptations, delay gratification, and achieve self-control. They also pointed to possible brain mechanisms that enable these achievements. Decades later, a flood of brain research is using cutting-edge imaging techniques to probe the mind-brain connections and help us understand what the preschooler managed to do.

The marshmallow findings inevitably lead to the question “Is self-control prewired?” Recent discoveries in the science of genetics are providing fresh answers to that question. They are revealing the surprising plasticity of our brains and transforming how we think about the role of nurture and DNA, environment and heredity, and the malleability of human nature. The implications go far beyond the science lab and contradict widely shared beliefs about who we are.

Part I leaves us with a mystery: why does the preschooler’s ability to wait for more treats, rather than ring the bell and settle for less, predict so much about future success and well-being? I answer that question in Part II, From Marshmallows in Pre-K to Money in 401(k), where I look at how self-control ability influences the journey from preschool to retirement planning, how it paves the way to creating successful experiences and positive expectations—an “I think I can!” mind-set and a sense of self-worth. While not guaranteeing success and a rosy future, self-control ability greatly improves the chances, helping us make the tough choices and sustain the effort needed to reach our goals. How well it works depends not just on skills but on internalizing goals and values that direct the journey, and on motivation that is strong enough to overcome the setbacks along the route. How self-control can be harnessed to build such a life by making willpower less effortful and increasingly automatic and rewarding is the story of Part II, and like life itself it unfolds in unexpected ways. I discuss not just resistance to temptation but diverse other self-control challenges, from cooling painful emotions, overcoming heartbreak, and avoiding depression to making important decisions that take future consequences into account. And while Part II shows the benefits of self-control, it makes its limits equally clear: a life with too much of it can be as unfulfilling as one with too little.

In Part III, From Lab to Life, I look at the implications of the research for public policy, focusing on how recent educational interventions beginning in preschool are incorporating lessons on self-control in order to give those children living under conditions of toxic stress a chance to build better lives. I then summarize the concepts and strategies examined throughout this book that can help with everyday self-control struggles. The final chapter considers how findings about self-control, genetics, and brain plasticity change the conception of human nature, and the understanding of who we are and what we can be.

In writing The Marshmallow Test, I imagined myself having a leisurely conversation with you, the reader, much like the many I have had with friends and new acquaintances, sparked by the question “What’s the latest in the marshmallow work?” Soon we veer off into how the findings relate to aspects of our own lives, from child rearing, hiring new staff, and avoiding unwise business and personal decisions to overcoming heartbreak, quitting smoking, controlling weight, reforming education, and understanding our own vulnerabilities and strengths. I have written the book for those of you who, like me, have struggled with self-control. I’ve also written it for those who simply would like to understand more deeply how our minds work. I hope The Marshmallow Test will start some new conversations for you.

***

When we designed the experiment in the 1960s we did not film the children. But twenty years later, to record the Marshmallow Test procedure and to illustrate the diverse strategies children use as they try to wait for their treats, my former postdoc Monica L. Rodriguez filmed five- to six-year-olds with a hidden camera in a public school in Chile. Monica followed the same procedure we had used in the original experiments. First up was “Inez,” an adorable little first grader with a serious expression but a twinkle in her eye. Monica seated Inez at a small table in the school’s barren research room. Inez had chosen Oreo cookies as her treats. On the table were a desk bell and a plastic tray the size of a dinner plate, with two cookies in one corner of the tray and one in the other corner. Both the immediate and the delayed rewards were left with the children, to increase their trust that the treats would materialize if they waited for them, as well as to intensify their conflict. Nothing else was on the table, and no toys or interesting objects were available in the room to distract the children while they waited.

Inez was eager to get two cookies rather than just one when given the choice. She understood that Monica had to go out of the room to do some work but that she could call her back at any time by ringing the bell. Monica let Inez try ringing it a couple of times, to demonstrate that each time she rang Monica would immediately come back in the room. Monica then explained the contingency. If Inez waited for her to come back by herself, she got the two cookies. If she did not want to wait, she could ring the bell at any time. But if she rang the bell, or began to eat the treat, or left the chair, she’d get only the single cookie. To be sure that Inez understood the instructions fully, she was asked to repeat them.

When Monica exited, Inez suffered for an agonizing few moments with an increasingly sad face and visible discomfort until she seemed about to burst into tears. She then peeked down at the treats and stared hard at them for more than ten seconds, deep in thought. Suddenly her arm shot out toward the bell but just as her hand got to it, she stopped herself abruptly. Gingerly, tentatively, her index finger hovered above the bell’s ringer, almost but not quite touching it, over and over, as if to tease herself. But then she jerked her head away from the tray and the bell, and burst out laughing, as if she had done something terribly funny, sticking her fist into her mouth to prevent herself from roaring aloud, her face beaming with a self-congratulatory smile. No audience has watched this video without oohing and laughing along with Inez in empathic delight. As soon as she stopped giggling, she repeated her teasing play with the bell, but now she alternately used her index finger to shush herself and stuck her hand in front of her carefully closed lips, whispering “No, no” as if to stop herself from doing what she had been about to do. After 20 minutes had passed, Monica returned “by herself,” but instead of eating the treats right away, Inez marched off triumphantly with her two cookies in a bag because she wanted to take them home to show her mother what she had managed to do.

“Enrico,” large for his age and dressed in a colorful T-shirt, with a handsome face topped by neatly cut blond bangs, waited patiently. He tipped his chair far back against the wall behind him, banging it nonstop, while staring up at the ceiling with a bored, resigned look, breathing hard, seemingly enjoying the loud crashing sounds he made. He kept banging until Monica returned, and he got his two cookies.

“Blanca” kept herself busy with a mimed silent conversation—like a Charlie Chaplin monologue—in which she seemed to be carefully instructing herself on what to do and what to avoid while waiting for her treats. She even mimed smelling the imagined goodies by pressing her empty hand against her nose.

“Javier,” who had intense, penetrating eyes and an intelligent face, spent the waiting time completely absorbed in what appeared to be a cautious science experiment. Maintaining an expression of total concentration, he seemed to be testing how slowly he could manage to raise and move the bell without ringing it. He elevated it high above his head and, squinting at it intently, transported the bell as far away from himself as possible on the desktop, stretching the journey to make it as long and slow as he could. It was an awesome feat of psychomotor control and imagination from what looked like a budding scientist.

Monica gave the same instructions to “Roberto,” a neatly dressed six-year-old with a beige school jacket, dark necktie on his white shirt, and perfectly combed hair. As soon as she left the room he cast a quick look at the door to be sure it was tightly shut. He then rapidly surveyed the cookie tray, licked his lips, and grabbed the closest treat. He cautiously opened the cookie to expose the white cream filling in its middle, and, with bent head and busy tongue, he began to lick the cream meticulously, pausing for only a second to smilingly approve his work. After licking the cookie clean, he skillfully put the two sides back together with even more obvious delight and carefully returned the filling-free cookie to the tray. He then hurried at top speed to give the remaining two cookies the identical treatment. After devouring their insides, Roberto arranged the remaining pieces on the tray to restore them to their exact original positions, and checked the scene around him, scanning the door to be sure that all was well. Like a skilled method actor, he then slowly sank his head to place his tilted chin and cheek on the open palm of his right hand, elbow resting on the desktop. He transformed his face into a look of utter innocence, his wide, trusting eyes staring expectantly at the door in childlike innocent wonder.

Roberto’s performance invariably gets the most cheers and the loudest laughter and applause from every audience, including, once, a congratulatory shout from the esteemed provost of one of America’s top private universities to “get him a scholarship when he’s ready to come here!” I don’t think he was joking.

***

Self-control skills are essential for pursuing our goals successfully, but it is the goals themselves that give us direction and motivation. They are important determinants of life satisfaction, and those we select early in life have striking effects both on the later goals that we reach and on the satisfaction we feel about our lives. No matter how they are formed, the goals that drive our life stories are as important as the executive function (EF) we need to try to reach them.

Self-control, especially when it is labeled “effortful control,” can sound as if it demands a grim commitment to very tough, trying labor—a voluntary entry into a work-driven life of self-denial, of living for the future and missing the pleasures of the moment. An acquaintance told me about a recent dinner he had with friends in Manhattan during which the topic turned to the Marshmallow Test. One of his friends, a novelist who lived in Greenwich Village, was contrasting his own life with that of his brother, a very wealthy and successful investment banker living the pinstriped-suit-with-Hermès-necktie life. The brother had long been married and had children who were all doing well. The writer had published five novels, but they had had little impact and few sales. He described himself nevertheless as having a great time, spending his days writing and living the bachelor life at night, going from one short-term relationship to the next. He speculated that his solemn, strait-laced brother probably would have waited forever for his marshmallows, whereas he would have been an early bell ringer.

In fact, the novelist could not have published those five books without a great deal of self-control, and he probably also needs it when trying to maintain his fun relationships while staying uncommitted. Nor did he manage to make it through an elite liberal arts college that emphasizes creative writing without having more than enough self-control to do so. You need EF as much for a creative life in the arts as for a successful life in anything else; it’s just the goals that differ. Without EF, the chance to find and pursue your goals is lost. That’s what the kids in the South Bronx faced if they lost in the KIPP lottery. But without compelling goals and drive, EF can leave us competent but aimless.

ALTERNATIVE VIEWS OF WHO WE ARE

Your reactions to the research findings on brain plasticity and the malleability of behavior in this book depend importantly on your own beliefs about how much people can really control and change what they become. There are two conflicting ways to interpret what these findings tell us in the larger context of who we are and what we can be. It is worth using your cool system to think about what the results mean to you before coming to firm conclusions that your hot system has probably already reached.

The answer to the question of whether human nature is, at its core, malleable or fixed has been an enduring concern of not just scientists but, more important, each of us in our everyday lives. Some people see self-control ability, willpower, intelligence, and other characteristics as fixed, unchangeable traits from the very start of life. They read the experimental evidence that executive function and self-control improve after educational interventions and interpret that as short-term effects unlikely to make a long-term difference, just little tricks that don’t change inborn traits. These people differ from those who see the evidence as supporting the view that we are open to change and able to alter how we think and behave, that we can craft our own lives rather than being either the winners or the losers in the DNA lottery.

If we allow the evidence to make a difference to our personal theories, the discovery of the plasticity of the brain tells us that human nature is more flexible and open to change than has long been assumed. We do not come into the world with a bundle of fixed, stable traits that determine who we become. We develop in continuous interactions with our social and biological environments. These interactions shape our expectations, the goals and values that drive us, the ways we interpret stimuli and experience, and the life stories we construct.

To reiterate from the nature-nurture discussion (Chapter 7), as Kaufer and Francis point out, “Environments can be as deterministic as we once believed only genes could be, and… the genome can be as malleable as we once believed only environments could be.” And the basic message of this book has been that there is substantial evidence that we can be active agents who in part control how those interactions play out. That leaves us with a view of human nature in which we potentially have more choice, and more responsibility, than in the purely deterministic scientific views of the past century. Those views attributed the causes of our behavior to the environment, DNA, the unconscious, bad parenting, or evolution, plus chance. The story this book tells acknowledges all these sources as influences. But ultimately, at the end of that causal chain, it is the individual who is the agent of the action and decides when to ring the bell.

Walter Mischel 

When I am asked to summarize the fundamental message from research on self-control, I recall Descartes’s famous dictum cogito, ergo sum—“I think, therefore I am.” What has been discovered about mind, brain, and self-control lets us move from his proposition to “I think, therefore I can change what I am.” Because by changing how we think, we can change what we feel, do, and become. If that leads to the question “But can I really change?,” I reply with what George Kelly said to his therapy clients when they kept asking him if they could get control of their lives. He looked straight into their eyes and said, “Would you like to?”

Ernst Robert Curtius European Literature And The Latin Middle Ages

 European Literature and the Latin Middle Ages still often figures on reading lists for students of medieval literature. It’s more often mined as a reference work than read through. This is a shame, since it aspires to be a totality. Professional medievalists tend to give rather guarded replies when asked what they think of it. The chief objections to it are that its focus on topoi diminishes the role of individuality in medieval Latin poetry;20 that it concentrates on elite and university culture at the expense of oral and popular culture; that it is insufficiently concerned about the mechanisms by which learning was disseminated and transformed; that its conception of a “topos” lacks theoretical rigor; and that its canon (and its account of the genesis of the literary canon, and of the idea of a literary canon, p. 259) is distorted by its focus on Latin materials. It is sometimes also criticized for being unduly Eurocentric, and for not extending its gaze eastward into the Slavonic world or beyond.

None of these criticisms is entirely fair, although it’s easy to see why most of them have arisen. Curtius’s word “topos” encompasses a much wider array of phenomena than the “common places” of the rhetorical tradition, and the boundaries of the concept are sometimes as a result unclear. Sometimes the topoi are presented as rhetorical building blocks of composition, but from time to time they are presented as atemporal truths, or even connected to Carl Jung’s archetypes. Curtius was interested in comparative history, particularly the work of A. J. Toynbee, whose survey of recurrent patterns of rise and decline in transnational civilizations provides much of the historiographical superstructure of his early chapters. He also read works of anthropology and comparative religion. Eliot indeed offered to send him a copy of Frazer’s Golden Bough in the 1920s (he scrupulously protested that he could only afford to send the abbreviated one-volume edition).21 Curtius’s idea of “European literature” is consequently held together by several conceptually distinct forces. The first is the idea that the mind of Europe through the Middle Ages was united by an educational elite, who preserved and disseminated a rhetorical and classical heritage through a range of different topoi and rhetorical conventions. The second is the very different notion that European literature might be held together by quasi-archetypal concerns, which recur because they are archetypal rather than because they are directly transmitted from one generation to the next. The presence of this second line of argument is partly why critics of European Literature and the Latin Middle Ages have sometimes objected to the narrowness of its geographical scope: if the goddess Natura is, as Curtius suggests (p. 122), a manifestation of the Jungian anima, then what is the justification for not exploring further examples of this apparently transhistorical topos in Polish or Indian or even in Chinese literature? This is, though, not a serious objection. The words “Latin” and “European” in the title of this book should give most reasonable readers grounds to expect that India and China will be marginal to its concerns. The neglect of the eastern perimeter of European Latin culture is a limitation, however. It can only be explained by Curtius’s desire in the aftermath of the war to look principally westward and southward.

European Literature and the Latin Middle Ages is energized by several internal contradictions, which are largely the product of the circumstances in which it was written. It is driven by a belief in the unity of European literary traditions, which culminate in and are most fully articulated by Dante, to whom Curtius devotes his final chapter. This unified vision is articulated, however, in an increasingly fragmentary form. As the excursuses on different topoi and particular literary relationships multiply at the end of the volume—and they make up almost a third of its overall length—Curtius seems to fall victim to his own ambition to understand everything. To see Europe as a whole means accumulating large numbers of fragments, and those fragments do not always cohere. This again has parallels with the careers of other modernists born in the 1880s. As Ezra Pound famously declared toward the end of The Cantos, in which he tried to reconfigure the epic tradition, to explore the relationship between East and West in new ways, to account for the rise of usury, and to tie all of this back in to Occitan poetry, “I am not a demi-god / I cannot make it cohere.”22 Curtius’s intellectual trajectory had more in common with Eliot’s than with Pound’s (although by the 1940s there were substantial differences between the two men, particularly in their attitudes to the church).23 Like Eliot, Curtius can substitute an idea of “tradition” for history, and he is also prone to assume that there is an inverse relationship between the value of literary culture and the number of people who possess it. He can even give the impression that culture is a static treasure to be protected and handed down through the generations like an imperial crown: “The bases of Western thought are classical antiquity and Christianity. The function of the Middle Ages was to receive that deposit, to transmit it, and to adapt it” (p. 593). It is not surprising that the preservation of these treasures sometimes seems to matter more to Curtius than their adaptation or their transformation. He had seen Cologne burning on the horizon, and he had lived through the hyperinflation of the Weimar period. This inclined him to see literary culture as a kind of gold standard (“that deposit”), which the Middle Ages preserved and later ages squandered.

This gives rise to the most tantalizing aspect of European Literature and the Latin Middle Ages, and to the most significant of the criticisms that can be leveled against it by those who live in more fortunate times. Its surveys of literary topoi do offer enormous riches. Anyone interested in the idea of literary immortality, in the notion of inexpressibility, the invocation of the muses, the rhetorical methods for arousing passion, or in any of a dozen more recurrent literary themes will find the best starting point for further research in these pages. But Curtius shows relatively little interest in the process by which these topoi were disseminated, or how they were absorbed and transformed by later readers. His pan-Europeanism also means that he is reluctant to dwell on the changes that can result from transmitting particular topoi from one environment to another, be that a different nation or a different institution. The content and character of what is known, like the social composition of those who know it, do not seem so far as he is concerned to alter a great deal between the fourth and fourteenth centuries, or from the Tiber to the Rhine. The topoi recur and live on. The “deposit” of learning is preserved rather than diversified.

Curtius had clearly reflected on these questions, but his overall desire to describe and praise acts of cultural preservation finally triumphed over his interest in transmission and transformation. The short excursus on “The Ape as Metaphor” (pp. 538-40) from John of Salisbury to Shakespeare is one of a number of oblique recognitions that cultural transmission without change might become simple repetition or mimicry, since this excursus is about writers who “ape” other writers, and simply reproduce either nature or their reading without transforming it. His concluding discussion of how ideas of literary imitation are transmuted into notions of inspiration in Longinus (pp. 398-401) also acknowledges that the Latin culture of the Middle Ages needed to be actively reinvigorated in order to remain alive, and that simply treasuring it in the bank vault of the mind was not enough. But readers are left without a clear formulation of how one writer changes or transforms what he or she reads. The topoi do sometimes seem to be a super-personal repository of universal wisdom.

So what then can be taken from this book? What makes it more than a historical curiosity? The first answer to these questions is that the historical position of its author actually makes European Literature and the Latin Middle Ages more rather than less interesting. It is not just a great book about the Middle Ages. It is also a book that reveals a huge amount about twentieth-century literature. It shows a former Weimar modernist attempting to construct a vision of European literature after the war. It is emphatically a book about European literature and the Latin Middle Ages, rather than just European literature in the Latin Middle Ages, since it indirectly addresses Curtius’s present as much as the past. As well as providing a mass of leads for thinking about how Dante grew from Virgil, or about the significance of Alan of Lille or Bernardus Silvestris, this book still shows a great critic rethinking literary history in response to a cultural catastrophe. Its emphasis on the continuity of classical and rhetorical learning through the Middle Ages also makes it permanently valuable. European Literature and the Latin Middle Ages stands as a monumental refutation of the Renaissance humanists’ mythology that Latin literary culture was heroically recovered in the fifteenth century after centuries of darkness. The Middle Ages described here are not at all dark. They are effectively a long series of renaissances and enlightenments that run on until the eighteenth century, after which the real dark ages begin.

But the main quality that makes European Literature and the Latin Middle Ages a great book is its breadth of vision. Critics are today prone to bury their noses in one corner of time and space. Few readers are willing to take a wide view across centuries or across national and linguistic boundaries. There is for Curtius no excuse for not knowing or not reading any work; there is no excuse for not trying to see how every literary text fits into a larger European picture. Even if finally he found that the larger picture he wanted to create fragmented into a series of brilliantly detailed excursuses—something that history may well show to be a recurrent tendency within all aspirations to pan-European unity—he would never have seen many of those details if he had not had the ambition to see European literature as a whole.

Chivalry

 

1 DUBBING TO KNIGHTHOOD

FROM the second half of the eleventh century, various texts, soon to become more numerous, begin to mention that here and there a ceremony has taken place for the purpose of “making a knight”. The ritual consisted of several acts. To the candidate, who as a rule was scarcely more than a boy, an older knight first of all handed over the arms symbolic of his future status; in particular, he girded on his sword.1 Then, almost invariably, this sponsor administered a heavy blow with the flat of his hand on the young man’s neck or cheek—the paumée or colée, as the French documents term it. Was it a test of strength? Or was it—as was held by some rather late medieval interpreters—a method of making an impression on the memory, so that, in the words of Ramon Lull, the young man would remember his “promise” for the rest of his life? The poems do indeed often show the hero trying not to give way under this rude buffet—the only one, as a chronicler remarks, which a knight must always receive and not return;2 on the other hand, as we have seen, a box on the ear was one of the commonest methods, sanctioned by the legal customs of the time, of ensuring the recollection of certain legal acts—though it is true that it was inflicted on the witnesses and not on the parties themselves. But a very different and much less purely rational meaning seems at first to have attached to the gesture of “dubbing” (the word was derived from an old Germanic verb meaning “to strike”), originally considered so essential to the making of a knight that the term came to be used habitually to describe the whole ceremony. The contact thus established between the hand of the one who struck the blow and the body of the one who received it transmitted a sort of impulse—in exactly the same way as the blow bestowed by the bishop on the clerk whom he is ordaining priest. The ceremony often ended with an athletic display. The new knight leapt on his horse and proceeded to transfix or demolish with a stroke of his lance a suit of armour attached to a post; this was known as the quintaine.

By its origins and by its nature dubbing to knighthood is clearly connected with those initiation ceremonies of which primitive societies, as well as those of the ancient world, furnish many examples—practices which, under different forms, all had the common object of admitting the young man to full membership of the group, from which his youth had hitherto excluded him. Among the Germans they reflected a warlike society. Without prejudice to certain other features—such as the cutting of the hair, which was sometimes found later, in England, in association with dubbing—they consisted essentially in a delivery of arms. This is described by Tacitus and its persistence at the period of the invasion is attested by many texts. There can be no doubt as to the continuity between the Germanic ritual and the ritual of chivalry. But in changing its setting the act had also changed its social significance.

Among the Germans all free men were warriors. All had therefore the right to initiation by arms wherever this practice was an essential part of the tradition of the folk (we do not know if it was universal). On the other hand, one of the characteristics of feudal society was, as we know, the formation of a group of professional fighting-men, consisting primarily of the military vassals and their chiefs. The performance of the ancient ceremony naturally became restricted to this military class; and in consequence it came near to losing any kind of permanent social foundation. It had been the rite which admitted a man to membership of the people. But the people in the ancient sense—the small civitas of free men—no longer existed; and the ceremony began to be used as the rite which admitted a man to membership of a class, although this class still lacked any clear outline. In some places the usage simply disappeared; such seems to have been the case among the Anglo-Saxons. But in the countries where Frankish custom prevailed it survived, though for a long time it was not in general use or in any way obligatory.

Later, as knightly society became more clearly aware of what separated it from the unwarlike multitude and raised it above them, there developed a more urgent sense of the need for a formal act to mark the individual’s entry into the society so defined, whether the new member was a young man of “noble” birth who was being admitted into adult society, or—as happened much more rarely—some fortunate upstart placed on a level with men of ancient lineage by recently acquired power, or merely by his own strength or skill. In Normandy, as early as the end of the eleventh century, to say of the son of a great vassal: “He is not a knight”, was tantamount to implying that he was still a child or an adolescent.3 Undoubtedly the concern thus to symbolize every change of legal status as well as every contract by a visible gesture conformed to characteristic tendencies in medieval society—as witness the frequently picturesque ritual of entry into the craft gilds. It was necessary, however, that the change of status thus symbolized should be clearly recognized as such, which is why the general adoption of the ceremony of dubbing really reflected a profound modification in the idea of knighthood.

During the first feudal age, what was implied by the term chevalier, knight, was primarily a status determined either by a de facto situation or by a legal tie, the criterion being purely personal. A man was called chevalier because he fought à cheval, on horseback, with full equipment; he was called the chevalier of someone when he held a fief of that person, on condition of serving him armed in this fashion. The time came, however, when neither the possession of a fief nor the criterion—necessarily a somewhat vague one—of mode of life was any longer sufficient to earn the title; a sort of consecration was necessary as well. The transformation was completed towards the middle of the twelfth century. A turn of phrase in use even before 1100 will help us to grasp its significance. A knight was not merely “made”; he was “ordained”. This was how it was put, for example, in 1098 by the count of Ponthieu as he was about to arm the future Louis VI.4 The whole body of dubbed knights constituted an “order”, ordo. These are learned words, ecclesiastical words, but we find them from the beginning on the lips of laymen. They were by no means intended—at least when they were first used—to suggest an assimilation with holy orders. In the vocabulary which Christian writers had borrowed from Roman antiquity, an ordo was a division of society, temporal as well as ecclesiastical. But it was a regular, clearly defined division, conformable to the divine plan—an institution, really, and not merely a plain fact.

Nevertheless, in a society accustomed to live under the sign of the supernatural, the rite of the delivery of arms, at first purely secular, could scarcely have failed to acquire a sacred character; and two usages, both of them very old, gave openings for the intervention of the Church.

In the first place there was the blessing of the sword. Originally it had had no specific connection with the dubbing. Everything in the service of man seemed in that age to call for this protection from the snares of the Devil. The peasant obtained a blessing for his crops, his herd, his well; the bridegroom, for the marriage bed; the pilgrim, for his staff. The warrior naturally did the same for the tools of his profession. The oath “on consecrated arms” was already known to the old Lombard law.5 But most of all the arms with which the young warrior was equipped for the first time seemed to call for such sanctification. The essential feature was a rite of contact. The future knight laid his sword for a moment on the altar. This gesture was accompanied or followed by prayers, and though these were inspired by the general form of benediction, they early took on a form specially appropriate to an investiture. In this form they appeared already, shortly after 950, in a pontifical compiled in the abbey of St. Alban of Mainz. This compilation, a substantial part of which was doubtless based on borrowings from older sources, was soon in use throughout Germany, as well as in northern France, England, and even Rome itself, where it was imposed by the influence of the Ottonian court. It diffused far and wide the model of benediction for the “newly-girt” sword. It should be understood, however, that this consecration at first constituted only a sort of preface to the ceremony. The dubbing proceeded afterwards according to its peculiar forms.

Here again, however, the Church was able to play its part. Originally the task of arming the adolescent could as a rule be performed only by a knight already confirmed in that status—his father, for example, or his lord; but it might also be entrusted to a prelate. As early as 846, Pope Sergius had girded the baldric on the Carolingian Louis II. Similarly, William the Conqueror later had one of his sons dubbed by the archbishop of Canterbury. No doubt this compliment was paid less to the priest as such than to the prince of the Church, chief of many vassals. Nevertheless, a pope or a bishop could scarcely dispense with religious ceremonial. In this way, the liturgy was enabled to permeate the whole ceremony of dubbing.

This process was completed by the eleventh century. It is true that a pontifical of Besançon composed at this time contains only two benedictions of the sword, both of them very simple. But from the second of these it emerges very clearly that the officiating priest was supposed to do the arming himself. Nevertheless, to find a genuine religious ritual of dubbing, we must look farther north, towards the region between Seine and Meuse which was the true cradle of most authentic feudal institutions. Our oldest piece of evidence here is a pontifical of the province of Rheims, compiled towards the beginning of the century by a cleric who, while taking the Mainz collection as his model, none the less drew abundantly on local usages. Together with a benediction of the sword, which reproduces that in the Rhenish original, the liturgy comprises similar prayers for the other arms or insignia—banner, lance, shield, with the single exception of the spurs, the delivery of which was to the end reserved for laymen. Next follows a benediction upon the future knight himself, and finally it is expressly mentioned that the sword will be girded on by the bishop. After an interval of nearly two centuries, the ceremonial appears in a fully developed form, once again in France, in the pontifical of the bishop of Mende, William Durand, which was compiled about 1295, though its essential elements apparently date from the reign of St. Louis. Here the consecratory rôle of the prelate is carried to the ultimate limit. He now not only girds on the sword; he also gives the paumée; in the words of the text, he “marks” the aspirant “with the character of knighthood”. Adopted in the fourteenth century by the Roman pontifical, this form was destined to become the official rite of Christendom. As to the accessory practices—the purifying bath, imitated from that taken by catechumens, and the vigil of arms—these do not appear to have been introduced before the twelfth century or ever to have been anything but exceptional. Moreover, the vigil was not always devoted entirely to pious meditations. If we are to believe a poem of Beaumanoir, it was not unknown for it to be whiled away to the sound of fiddles.6

Nevertheless, it would be a mistake to suppose that any of this religious symbolism was ever indispensable to the making of a knight, if only because circumstances would often have prevented it from being carried out. At all times knights might be made on the field, before or after the battle; witness the accolade (colée)—performed with the sword, according to late medieval practice—which Bayard bestowed on his king after Marignano. In 1213, Simon de Montfort had surrounded with a religious pomp worthy of a crusading hero the dubbing of his son, whom two bishops, to the strains of Veni Creator, armed as a knight for the service of Christ. From the monk Pierre des Vaux-de-Cernay, who took part in it, this ceremony drew a characteristic exclamation: “O new fashion of chivalry! Fashion unheard of till then!” The less ostentatious blessing of the sword itself, according to the evidence of John of Salisbury,7 was not general before the middle of the twelfth century. It seems, however, to have been very widely adopted at that time.
The Church, in short, had tried to transform the ancient delivery of arms into a “sacrament”—the word, which is found in the writings of clerics, gave no offence in an age when theology was still far from having assumed a scholastic rigidity and people continued freely to lump together under the name of sacrament every kind of act of consecration. It had not been wholly successful in this; but it had carved out a share for itself, larger in some places, more restricted in others. Its efforts, by emphasizing the importance of the rite of ordination, did much to kindle the feeling that the order of chivalry was a society of initiates. And since every Christian institution needed the sanction of a legendary calendar, hagiography came to its aid. “When at mass the epistles of St. Paul are read,” says one liturgist, “the knights remain standing, to do him honour, for he was a knight.”8

2 THE CODE OF CHIVALRY

Once the religious element had been introduced, its effects were not confined to strengthening the esprit de corps of the knightly world. It also exercised a potent influence on the moral law of the group. Before the future knight took back his sword from the altar he was generally required to take an oath defining his obligations.9 It was not taken by all dubbed knights, since not all of them had their arms blessed; but many ecclesiastical writers considered, with John of Salisbury, that by a sort of quasi-contract even those who had not pronounced it with their lips were “tacitly” bound by the oath through the mere fact of having accepted knighthood. Little by little the rules thus formulated found their way into other texts: first into the prayers, often very beautiful ones, which punctuated the course of the ceremony; later, with inevitable variations, into various writings in the vulgar tongue. One of these, composed shortly after 1180, was a celebrated passage from the Perceval of Chrétien de Troyes. In the following century these rules were set forth in some pages of the prose romance of Lancelot; in the German Minnesang, in a fragment of the “Meissner”; finally and above all, in the short French didactic poem entitled L’Ordene de Chevalerie. This little work had a great success. Paraphrased before long in a cycle of Italian sonnets, imitated in Catalonia by Ramon Lull, it opened the way to the abounding literature which, during the last centuries of the Middle Ages, drained to the lees the symbolic significance of the dubbing ceremony and, by its final extravagances, proclaimed the decadence of an institution which had become more a matter of etiquette than of law, and the impoverishment of the very ideal which men professed to rate so high.

Originally this ideal had not lacked genuine vitality. It was superimposed on rules of conduct evolved at an earlier date as the spontaneous expression of class consciousness; rules that pertained to the fealty of vassals (the transition appears clearly, towards the end of the eleventh century, in the Book of the Christian Life by Bishop Bonizo of Sutri, for whom the knight is, first and foremost, an enfeoffed vassal) and constituted above all a class code of noble and “courteous” people. From these secular moral precepts the new decalogue borrowed the principles most acceptable to the religious mind: generosity; the pursuit of glory or “praise” (los); contempt for fatigue, pain and death—“he has no desire to embrace the knight’s profession,” says the German poet Thomasin, “whose sole desire is to live in comfort”.10 But this reorientation was effected by imparting to these same rules a Christian colouring; and, still more, by cleansing the knightly tradition of those profane elements which had occupied, and in practice continued to occupy, so large a place in it—that dross which had brought to the lips of so many rigorists, from St. Anselm to St. Bernard, the old play on words so charged with the cleric’s contempt for the world: non militia, sed malitia:11 “Chivalry is tantamount to wickedness.” But after the Church had finally appropriated the feudal virtues, what writer would thenceforth have dared to repeat this equation? Lastly, to the old precepts which had undergone this process of refinement, others were added of an exclusively spiritual character.

The clergy and the laity were therefore united in demanding of the knight that piety without which Philip Augustus himself considered that he was no true prudhomme. He must go to mass “every day” or at least “frequently”; he must fast on Friday. Nevertheless this Christian hero remained by nature a warrior. What he looked for most from the benediction upon his arms was that it would make them effective, as is clearly shown in the wording of the prayers themselves. But the sword thus consecrated, though it might still as a matter of course be drawn at need against his personal enemies or those of his lord, had been given to the knight first of all that he might place it at the service of good causes. The old benedictions of the end of the tenth century already lay emphasis on this theme, which is greatly expanded in the later liturgies. Thus a modification of vital importance was introduced into the old ideal of war for war’s sake, or for the sake of gain. With this sword, the dubbed knight will defend Holy Church, particularly against the infidel. He will protect the widow, the orphan, and the poor. He will pursue the malefactor. To these general precepts the lay texts frequently add a few more special recommendations concerning behaviour in battle (not to slay a vanquished and defenceless adversary); the practice of the courts of law and of public life (not to take part in a false judgment or an act of treason—if they cannot be prevented, the Ordene de Chevalerie modestly adds, one must withdraw); and lastly, the incidents of everyday life (not to give evil counsel to a lady; to give help, “if possible”, to a fellow-being in distress).

It is hardly surprising that the realities of knightly life, with its frequent trickery and deeds of violence, should have been far from conforming always to these aspirations. One might also be inclined to observe that, from the point of view either of a “socially” inspired ethic or of a more purely Christian code, such a list of moral precepts seems a little short. But this would be to pass judgment, whereas the historian’s sole duty is to understand. It is more important to note that in passing from the ecclesiastical theorists or liturgists to the lay popularizers the list of knightly virtues appears very often to have undergone a rather disturbing attenuation. “The highest order that God has made and willed is the order of chivalry,” says Chrétien de Troyes, in his characteristically sweeping manner. But it must be confessed that after this high-sounding preamble the instructions which his prudhomme gives to the young man he is knighting seem disconcertingly meagre. It may well be that Chrétien represents the “courtesy” of the great princely courts of the twelfth century rather than the prudhommie, inspired by religious sentiments, which was in vogue in the following century in the circle of Louis IX. It is doubtless no accident that the same epoch and the same environment in which this knightly saint lived should have given birth to the noble prayer (included in the Pontifical of William Durand) which may be regarded as a kind of liturgical commentary on the knights carved in stone on the porch of Chartres and the inner wall of the façade of Rheims: “Most Holy Lord, Almighty Father … thou who hast permitted on earth the use of the sword to repress the malice of the wicked and defend justice; who for the protection of thy people hast thought fit to institute the order of chivalry … cause thy servant here before thee, by disposing his heart to goodness, never to use this sword or another to injure anyone unjustly; but let him use it always to defend the Just and the Right.”

Thus the Church, by assigning to it an ideal task, finally and formally approved the existence of this “order” of warriors which, conceived as one of the necessary divisions of a well-ordered society, was increasingly identified with the whole body of dubbed knights. “O God, who after the Fall, didst constitute in all nature three ranks among men,” we read in one of the prayers in the Besançon liturgy. At the same time it provided this class with a religious justification of a social supremacy which had long been a recognized fact. The very orthodox Ordene de Chevalerie says that knights should be honoured above all other men, save priests. More crudely, the romance of Lancelot, after having explained how they were instituted “to protect the weak and the peaceful”, proceeds—in conformity with the emphasis on symbols common to all this literature—to discover in the horses which they ride the peculiar symbol of the “people” whom they hold “in right subjection”. “For above the people must sit the knight. And just as one controls a horse and he who sits above leads it where he will, so the knight must lead the people at his will.” Later, Ramon Lull did not think he offended Christian sentiment by saying that it was conformable to good order that the knight should “draw his well-being” from the things that were provided for him “by the weariness and toil” of his men.12 This epitomizes the attitude of a dominant class: an attitude eminently favourable to the development of a nobility in the strictest sense of the term.

NOTES

  1 See Plate X.

  2 Raimon Lull, Libro de la orden de Caballeria, ed. J. R. de Luanco, Barcelona, R. Academia de Buenas Letras, 1901, IV, 11. English translation: The Book of the Ordre of Chivalry, trans. and printed by W. Caxton, ed. Byles, 1926 (Early English Text Society).

  3 Haskins, Norman Institutions, 1918, p. 282, c. 5.

  4 Rec. des Histor. de France, XV, p. 187.

  5 Ed. Rothari, c. 359. Insufficient research has hitherto been devoted to the liturgy of dubbing to knighthood. In the bibliography will be found an indication of the works and collections which I have consulted. This first attempt at classification, rudimentary though it is, was only made possible for me by the kind assistance of my Strasbourg colleague, Abbé Michel Andrieu.

  6 Jehan et Blonde, ed. H. Suchier (Oeuvres poétiques de Ph. de Rémi, II, v. 5916 et seq.).

  7 Polycraticus, VI, 10 (ed. Webb, II, p. 25).

  8 Guillaume Durand, Rationale, IV, 16.

  9 Peter of Blois, ep. XCIV.

10 Der Wälsche Gast, ed. Rückert, vv. 7791–2.

11 Anselm, Ep., I (P.L., CLVIII, col. 1147); St. Bernard, De laude novae militiae, 77, c. 2.

12 Libro de la orden de Caballeria, I, 9. The whole passage has a remarkable flavour.

XXIV


Marc Bloch

Feudal Society

Translated from the French 

by L.A. Manyon

Monday, April 7, 2025

Trotter - Instincts Of The Herd In Peace And War - extracts

Many attempts have been made to explain the behaviour of man as dictated by instinct. He is, in fact, moved by the promptings of such obvious instincts as self-preservation, nutrition, and sex enough to render the enterprise hopeful and its early spoils enticing. So much can so easily be generalized under these three impulses that the temptation to declare that all human behaviour could be resumed under them was irresistible. These early triumphs of materialism soon, however, began to be troubled by doubt. Man, in spite of his obvious duty to the contrary, would continue so often not to preserve himself, not to nourish himself and to prove resistant to the blandishments of sex, that the attempt to squeeze his behaviour into these three categories began to involve an increasingly obvious and finally intolerable amount of pushing and pulling, as well as so much pretence that he was altogether “in,” when, quite plainly, so large a part of him remained “out,” that the enterprise had to be given up, and it was once more discovered that man escaped and must always escape any complete generalization by science.

A more obvious inference would have been that there was some other instinct which had not been taken into account, some impulse, perhaps, which would have no very evident object as regarded the individual, but would chiefly appear as modifying the other instincts and leading to new combinations in which the primitive instinctive impulse was unrecognizable as such. A mechanism such as this very evidently would produce a series of actions in which uniformity might be very difficult to recognize by direct observation, but in which it would be very obvious if the characters of this unknown “x” were available.

Now, it is a striking fact that amongst animals there are some whose conduct can be generalized very readily in the categories of self-preservation, nutrition, and sex, while there are others whose conduct cannot be thus summarized. The behaviour of the tiger and the cat is simple, and easily comprehensible, presenting no unassimilable anomalies, whereas that of the dog, with his conscience, his humour, his terror of loneliness, his capacity for devotion to a brutal master, or that of the bee, with her selfless devotion to the hive, furnishes phenomena which no sophistry can assimilate without the aid of a fourth instinct. But little examination will show that the animals whose conduct it is difficult to generalize under the three primitive instinctive categories are gregarious. If then it can be shown that gregariousness is of a biological significance approaching in importance that of the other instincts, we may expect to find in it the source of these anomalies of conduct, and if we can also show that man is gregarious, we may look to it for the definition of the unknown “x” which might account for the complexity of human behaviour.

*

Slightly more complex manifestations of the same tendency to homogeneity are seen in the desire for identification with the herd in matters of opinion. Here we find the biological explanation of the ineradicable impulse mankind has always displayed towards segregation into classes. Each one of us in his opinions and his conduct, in matters of dress, amusement, religion, and politics, is compelled to obtain the support of a class, of a herd within the herd. The most eccentric in opinion or conduct is, we may be sure, supported by the agreement of a class, the smallness of which accounts for his apparent eccentricity, and the preciousness of which accounts for his fortitude in defying general opinion. Again, anything which tends to emphasize difference from the herd is unpleasant. In the individual mind there will be an unanalysable dislike of the novel in action or thought. It will be “wrong,” “wicked,” “foolish,” “undesirable,” or as we say “bad form,” according to varying circumstances which we can already to some extent define.

Manifestations relatively more simple are shown in the dislike of being conspicuous, in shyness and in stage fright. It is, however, sensitiveness to the behaviour of the herd which has the most important effects upon the structure of the mind of the gregarious animal. This sensitiveness is closely associated with the suggestibility of the gregarious animal, and therefore with that of man. The effect of it will clearly be to make acceptable those suggestions which come from the herd, and those only. It is of especial importance to note that this suggestibility is not general, and that it is only herd suggestions which are rendered acceptable by the action of instinct. Man is, for example, notoriously insensitive to the suggestions of experience. The history of what is rather grandiosely called human progress everywhere illustrates this. If we look back upon the development of some such thing as the steam-engine, we cannot fail to be struck by the extreme obviousness of each advance, and how obstinately it was refused assimilation until the machine almost invented itself.

*

Now the desire for certitude is one of profound depth in the human mind, and possibly a necessary property of any mind, and it is very plausible to suppose that it led in these early days to the whole field of life being covered by pronouncements backed by the instinctive sanction of the herd. The life of the individual would be completely surrounded by sanctions of the most tremendous kind. He would know what he might and might not do, and what would happen if he disobeyed. It would be immaterial if experience confirmed these beliefs or not, because it would have incomparably less weight than the voice of the herd. Such a period is the only trace perceptible by the biologist of the Golden Age fabled by the poet, when things happened as they ought, and hard facts had not begun to vex the soul of man. In some such condition we still find the Central Australian native. His whole life, to its minutest detail, is ordained for him by the voice of the herd, and he must not, under the most dreadful sanctions, step outside its elaborate order. It does not matter to him that an infringement of the code under his very eyes is not followed by judgment, for with tribal suggestion so compactly organized, such cases are in fact no difficulty, and do not trouble his belief, just as in more civilized countries apparent instances of malignity in the reigning deity are not found to be inconsistent with his benevolence.

Such must everywhere have been primitive human conditions, and upon them reason intrudes as an alien and hostile power, disturbing the perfection of life, and causing an unending series of conflicts.

Experience, as is shown by the whole history of man, is met by resistance because it invariably encounters decisions based upon instinctive belief, and nowhere is this fact more clearly to be seen than in the way in which the progress of science has been made.

*

Direct observation of man reveals at once the fact that a very considerable proportion of his beliefs are non-rational to a degree which is immediately obvious without any special examination, and with no special resources other than common knowledge. If we examine the mental furniture of the average man, we shall find it made up of a vast number of judgments of a very precise kind upon subjects of very great variety, complexity, and difficulty. He will have fairly settled views upon the origin and nature of the universe, and upon what he will probably call its meaning; he will have conclusions as to what is to happen to him at death and after, as to what is and what should be the basis of conduct. He will know how the country should be governed, and why it is going to the dogs, why this piece of legislation is good and that bad. He will have strong views upon military and naval strategy, the principles of taxation, the use of alcohol and vaccination, the treatment of influenza, the prevention of hydrophobia, upon municipal trading, the teaching of Greek, upon what is permissible in art, satisfactory in literature, and hopeful in science.

The bulk of such opinions must necessarily be without rational basis, since many of them are concerned with problems admitted by the expert to be still unsolved, while as to the rest it is clear that the training and experience of no average man can qualify him to have any opinion upon them at all. The rational method adequately used would have told him that on the great majority of these questions there could be for him but one attitude—that of suspended judgment.

In view of the considerations that have been discussed above, this wholesale acceptance of non-rational belief must be looked upon as normal. The mechanism by which it is effected demands some examination, since it cannot be denied that the facts conflict noticeably with popularly current views as to the part taken by reason in the formation of opinion.

It is clear at the outset that these beliefs are invariably regarded by the holder as rational, and defended as such, while the position of one who holds contrary views is held to be obviously unreasonable. The religious man accuses the atheist of being shallow and irrational, and is met by a similar reply; to the Conservative, the amazing thing about the Liberal is his incapacity to see reason and accept the only possible solution of public problems. Examination reveals the fact that the differences are not due to the commission of the mere mechanical fallacies of logic, since these are easily avoided, even by the politician, and since there is no reason to suppose that one party in such controversies is less logical than the other. The difference is due rather to the fundamental assumptions of the antagonists being hostile, and these assumptions are derived from herd suggestion; to the Liberal, certain basal conceptions have acquired the quality of instinctive truth, have become “a priori syntheses,” because of the accumulated suggestions to which he has been exposed, and a similar explanation applies to the atheist, the Christian, and the Conservative. Each, it is important to remember, finds in consequence the rationality of his position flawless, and is quite incapable of detecting in it the fallacies which are obvious to his opponent, to whom that particular series of assumptions has not been rendered acceptable by herd suggestion.

To continue further the analysis of non-rational opinion, it should be observed that the mind rarely leaves uncriticized the assumptions which are forced on it by herd suggestion, the tendency being for it to find more or less elaborately rationalized justifications of them. This is in accordance with the enormously exaggerated weight which is always ascribed to reason in the formation of opinion and conduct, as is very well seen, for example, in the explanation of the existence of altruism as being due to man seeing that it “pays.”

It is of cardinal importance to recognize that in this process of the rationalization of instinctive belief, it is the belief which is the primary thing, while the explanation, although masquerading as the cause of the belief, as the chain of rational evidence on which the belief is founded, is entirely secondary, and but for the belief would never have been thought of. Such rationalizations are often, in the case of intelligent people, of extreme ingenuity, and may be very misleading unless the true instinctive basis of the given opinion or action is thoroughly understood.

This mechanism enables the English lady, who, to escape the stigma of having normal feet, subjects them to a formidable degree of lateral compression, to be aware of no logical inconsequence when she subscribes to missions to teach the Chinese lady how absurd it is to compress her feet longitudinally; it enables the European lady who wears rings in her ears to smile at the barbarism of the coloured lady who wears her rings in her nose; it enables the Englishman who is amused by the African chieftain’s regard for the top hat as an essential piece of the furniture of state to ignore the identity of his own behaviour when he goes to church beneath the same tremendous ensign.

The objectivist finds himself compelled to regard these and similar correspondences between the behaviour of civilized and barbarous man as no mere interesting coincidences, but as phenomena actually and in the grossest way identical, but such an attitude is possible only when the mechanism is understood by which rationalization of these customs is effected.

The process of rationalization which has just been illustrated by some of its simpler varieties is best seen on the largest scale, and in the most elaborate form, in the pseudosciences of political economy and ethics. Both of these are occupied in deriving from eternal principles justifications for masses of non-rational belief which are assumed to be permanent merely because they exist. Hence the notorious acrobatic feats of both in the face of any considerable variation in herd belief.

*

Thus far sensitiveness to the herd has been discussed in relation to its effect upon intellectual processes. Equally important effects are traceable in feeling.

It is obvious that when free communication is possible by speech, the expressed approval or disapproval of the herd will acquire the qualities of identity or dissociation from the herd respectively. To know that he is doing what would arouse the disapproval of the herd will bring to the individual the same profound sense of discomfort which would accompany actual physical separation, while to know that he is doing what the herd would approve will give him the sense of rightness, of gusto, and of stimulus which would accompany physical presence in the herd and response to its mandates. In both cases it is clear that no actual expression by the herd is necessary to arouse the appropriate feelings, which would come from within and have, in fact, the qualities which are recognized in the dictates of conscience. Conscience, then, and the feelings of guilt and of duty are the peculiar possessions of the gregarious animal. A dog and a cat caught in the commission of an offence will both recognize that punishment is coming; but the dog, moreover, knows that he has done wrong, and he will come to be punished, unwillingly it is true, and as if dragged along by some power outside him, while the cat’s sole impulse is to escape. The rational recognition of the sequence of act and punishment is equally clear to the gregarious and to the solitary animal, but it is the former only who understands that he has committed a crime, who has, in fact, the sense of sin. That this is the origin of what we call conscience is confirmed by the characteristics of the latter which are accessible to observation. Any detailed examination of the phenomena of conscience would lead too far to be admissible here. Two facts, however, should be noticed. First, the judgments of conscience vary in different circles, and are dependent on local environments; secondly, they are not advantageous to the species to the slightest degree beyond the dicta of the morals current in the circle in which they originate. These facts—stated here in an extremely summary way—demonstrate that conscience is an indirect result of the gregarious instinct, and is in no sense derived from a special instinct forcing men to consider the good of the race rather than individual desires.

*

The next feature of practical interest is connected with the hypothesis, which we attempted in the former article to demonstrate, that irrational belief forms a large bulk of the furniture of the mind, and is indistinguishable by the subject from rational verifiable knowledge. It is obviously of cardinal importance to be able to effect this distinction, for it is the failure to do so which, while it is not the cause of the slowness of advance in knowledge, is the mechanism by which this delay is brought about. Is there, then, we may ask, any discoverable touchstone by which non-rational opinion may be distinguished from rational? Non-rational judgments, being the product of suggestion, will have the quality of instinctive opinion, or, as we may call it, of belief in the strict sense. The essence of this quality is obviousness; the truth held in this way is one of James’s “a priori syntheses of the most perfect sort”; to question it is to the believer to carry scepticism to an insane degree, and will be met by contempt, disapproval, or condemnation, according to the nature of the belief in question. When, therefore, we find ourselves entertaining an opinion about the basis of which there is a quality of feeling which tells us that to inquire into it would be absurd, obviously unnecessary, unprofitable, undesirable, bad form, or wicked, we may know that that opinion is a non-rational one, and probably, therefore, founded upon inadequate evidence.

Opinions, on the other hand, which are acquired as the result of experience alone do not possess this quality of primary certitude. They are true in the sense of being verifiable, but they are unaccompanied by that profound feeling of truth which belief possesses, and, therefore, we have no sense of reluctance in admitting inquiry into them. That heavy bodies tend to fall to the earth and that fire burns fingers are truths verifiable and verified every day, but we do not hold them with impassioned certitude, and we do not resent or resist inquiry into their basis; whereas in such a question as that of the survival of death by human personality we hold the favourable or the adverse view with a quality of feeling entirely different, and of such a kind that inquiry into the matter is looked upon as disreputable by orthodox science and as wicked by orthodox religion. In relation to this subject, it may be remarked, we often see it very interestingly shown that the holders of two diametrically opposed opinions, one of which is certainly right, may both show by their attitude that the belief is held instinctively and non-rationally, as, for example, when an atheist and a Christian unite in repudiating inquiry into the existence of the soul.

A third practical corollary of a recognition of the true gregariousness of man is the very obvious one that it is not by any means necessary that suggestion should always act on the side of unreason. The despair of the reformer has always been the irrationality of man, and latterly some have come to regard the future as hopeless until we can breed a rational species. Now, the trouble is not irrationality, not a definite preference for unreason, but suggestibility—that is, a capacity for accepting reason or unreason if it comes from the proper source.

This quality we have seen to be a direct consequence of the social habit, of a single definite instinct, that of gregariousness, the same instinct which makes social life at all possible and altruism a reality.

It does not seem to have been fully understood that if you attack suggestibility by selection—and that is what you do if you breed for rationality—you are attacking gregariousness, for there is at present no adequate evidence that the gregarious instinct is other than a simple character and one which cannot be split up by the breeder. If, then, such an effort in breeding were successful, we should exchange the manageable unreason of man for the inhuman rationality of the tiger.

The solution would seem rather to lie in seeing to it that suggestion always acts on the side of reason; if rationality were once to become really respectable, if we feared the entertaining of an unverifiable opinion with the warmth with which we fear using the wrong implement at the dinner table, if the thought of holding a prejudice disgusted us as does a foul disease, then the dangers of man’s suggestibility would be turned into advantages. We have seen that suggestion already has begun to act on the side of reason in some small part of the life of the student of science, and it is possible that a highly sanguine prophetic imagination might detect here a germ of future changes.

Again, a fourth corollary of gregariousness in man is the fact expounded many years ago by Pearson that human altruism is a natural instinctive product. The obvious dependence of the evolution of altruism upon increase in knowledge and inter-communication has led to its being regarded as a late and a conscious development—as something in the nature of a judgment by the individual that it pays him to be unselfish. This is an interesting rationalization of the facts because in the sense in which “pay” is meant it is so obviously false. Altruism does not at present, and cannot, pay the individual in anything but feeling, as theory declares it must. It is clear, of course, that as long as altruism is regarded as in the nature of a judgment, the fact is overlooked that necessarily its only reward can be in feeling. Man is altruistic because he must be, not because reason recommends it, for herd suggestion opposes any advance in altruism, and when it can the herd executes the altruist, not of course as such but as an innovator. This is a remarkable instance of the protean character of the gregarious instinct and the complexity it introduces into human affairs, for we see one instinct producing manifestations directly hostile to each other—prompting to ever advancing developments of altruism, while it necessarily leads to any new product of advance being attacked. It shows, moreover, as will be pointed out again later, that a gregarious species rapidly developing a complex society can be saved from inextricable confusion only by the appearance of reason and the application of it to life.

Again, of two suggestions, that which the more perfectly embodies the voice of the herd is the more acceptable. The chances an affirmation has of being accepted could therefore be most satisfactorily expressed in terms of the bulk of the herd by which it is backed.

It follows from the foregoing that anything which dissociates a suggestion from the herd will tend to ensure such a suggestion being rejected. For example, an imperious command from an individual known to be without authority is necessarily disregarded, whereas the same person making the same suggestion in an indirect way so as to link it up with the voice of the herd will meet with success.