Pre-Suasion: A Revolutionary Way to Influence and Persuade
Robert Cialdini

Ended: Jan. 27, 2017

They spent much of their time toiling in the fields of influence thinking about and engaging in cultivation—in ensuring that the situations they were facing had been pretreated and readied for growth. Of course, the best performers also considered and cared about what, specifically, they would be offering in those situations. But much more than their less effective colleagues, they didn’t rely on the legitimate merits of an offer to get it accepted; they recognized that the psychological frame in which an appeal is first placed can carry equal or even greater weight.
To accomplish that, they did something that gave them a singular kind of persuasive traction: before introducing their message, they arranged to make their audience sympathetic to it.
In part, the answer involves an essential but poorly appreciated tenet of all communication: what we present first changes the way people experience what we present to them next.
Instead, after his standard presentation and just before declaring his ($75,000) fee, he joked, “As you can tell, I’m not going to be able to charge you a million dollars for this.” The client looked up from the written proposal he’d been studying and said, “Well, I can agree to that!” The meeting proceeded without a single subsequent reference to compensation and ended with a signed contract. My colleague claims that this tactic of mentioning an admittedly unrealistic price tag for a job doesn’t always win the business—too many other factors are involved for that—but it almost always eliminates challenges to the charges.
One practice stood out as central to his success. Before beginning his sales effort, he established an aura of trust with the family. Trust is one of those qualities that leads to compliance with requests, provided that it has been planted before the request is made.
I will forward the argument that all mental activity arises as patterns of associations within a vast and intricate neural network, and that influence attempts will be successful only to the extent that the associations they trigger are favorable to change.
All told, there are any number of first steps, besides establishing trust, that persuaders can take to make audiences more receptive to the case they intend to present. The steps can take multiple forms, and, accordingly, they’ve been given multiple labels by behavioral scientists. They can be called frames or anchors or primes or mindsets or first impressions. We will encounter each of those types in the remainder of these pages, where, throughout, I’m going to refer to them as openers—because they open up things for influence in two ways. In the first, they simply initiate the process; they provide the starting points, the beginnings of persuasive appeals. But it is in their second function that they clear the way to persuasion, by removing existing barriers.
I identified only six psychological principles that appeared to be deployed routinely in long-prospering influence businesses. I’ve claimed that the six—reciprocation, liking, social proof, authority, scarcity, and consistency—represent certain psychological universals of persuasion; and I’ve treated each, one per chapter, in my earlier book, Influence.
Whether operating as a moment monitor or a moment maker, the individual who knows how to time a request, recommendation, or proposal properly will do exceedingly well.
It’s because of the only-temporary receptiveness that pre-suasive actions often produce in others that I’ve introduced the concept of privileged moments. The meaning of the word privileged is straightforward, referring to special, elevated status. The word moment, though, is more complex, as it evokes a pair of meanings. One connotes a time-limited period: in this case, the window of opportunity following a pre-suasive opener, when a proposal’s power is greatest. The other connotation comes from physics and refers to a unique leveraging force.
In deciding whether a possibility is correct, people typically look for hits rather than misses; for confirmations of the idea rather than for disconfirmations. It is easier to register the presence of something than its absence.
For much of the thirty-plus years that I have been studying the ways that people can be persuaded to choose and change, my thinking has been governed by the dominant scientific model of social influence. It advises as follows: if you wish to change another’s behavior, you must first change some existing feature of that person so that it fits with the behavior.
Have you ever had a phone conversation with someone you can tell is engaged in another task, maybe because you can hear newspaper pages turning or computer keys clicking? I hate that. It shows me that my conversation partner is willing to lose contact with the information I’m providing to make contact with some other information. It always feels like a form of demotion. It advises me that my input is considered relatively unimportant.
Anything that draws focused attention to itself can lead observers to overestimate its importance.
He was asked to specify the one scientific concept that, if appreciated properly, would most improve everyone’s understanding of the world. Although in response he provided a full five-hundred-word essay describing what he called “the focusing illusion,” his answer is neatly summarized in the essay’s title: “Nothing in life is as important as you think it is while you are thinking about it.”
The central tenet of agenda-setting theory is that the media rarely produce change directly, by presenting compelling evidence that sweeps an audience to new positions; they are much more likely to persuade indirectly, by giving selected issues and facts better coverage than other issues and facts. It’s this coverage that leads audience members—by virtue of the greater attention they devote to certain topics—to decide that these are the most important to be taken into consideration when adopting a position. As the political scientist Bernard Cohen wrote, “The press may not be successful most of the time in telling people what to think, but it is stunningly successful in telling them what to think about.” According to this view, in an election, whichever political party is seen by voters to have the superior stance on the issue highest on the media’s agenda at the moment will likely win.
Environmental noise such as that coming from heavy traffic or airplane flight paths is something we think we get used to and even block out after a while. But the evidence is clear that the disruptive noise still gets in, reducing the ability to learn and perform cognitive tasks. One study found that the reading scores of students in a New York City elementary school were significantly lower if their classrooms were situated close to elevated subway tracks on which trains rattled past every four to five minutes. When the researchers, armed with their findings, pressed NYC transit system officials and Board of Education members to install noise-dampening materials on the tracks and in the classrooms, students’ scores jumped back up. Similar results have been found for children near airplane flight paths. When the city of Munich, Germany, moved its airport, the memory and reading scores of children near the new location plummeted, while those near the old location rose significantly.
Thus, parents whose children’s schools or homes are subjected to intermittent automotive, train, or aircraft noise should insist on the implementation of sound-baffling remedies. Employers, for the sake of their workers—and their own bottom lines—should do the same. Teachers need to consider the potentially negative effects of another kind of distracting background stimulus (this one of their own making) on young students’ learning and performance. Classrooms with heavily decorated walls displaying lots of posters, maps, and artwork reduce the test scores of young children learning science material there. It is clear that background information can both guide and distract the focus of attention; anyone seeking to influence optimally must manage that information thoughtfully.
But even courses of action selected in this manner should not be allowed the unfair advantages of a different sort of unitary assessment—one focused only on upsides. In the excitement of a looming opportunity, decision makers are infamous for concentrating on what a strategy could do for them if it succeeded and not enough, or at all, on what it could do to them if it failed. To combat this potentially ruinous overoptimism, time needs to be devoted, systematically, to addressing a pair of questions that often don’t arise by themselves: “What future events could make this plan go wrong?” and “What would happen to us if it did go wrong?” Decision scientists who have studied this consider-the-opposite tactic have found it both easy to implement and remarkably effective at debiasing judgments. The benefits to the organization that strives to rid itself of this and other decision-making biases can be considerable. One study of over a thousand companies determined that those employing sound judgment-debiasing processes enjoyed a 5 percent to 7 percent advantage in return on investment over those failing to use such approaches.
The obligation comes from the helping norm, which behavioral scientists sometimes call the norm of social responsibility. It states that we should aid those who need assistance in proportion to their need. Several decades’ worth of research shows that, in general, the more someone needs our help, the more obligated we feel to provide it, the more guilty we feel if we don’t provide it, and the more likely we are to provide it.
In addition—owing to the rapid, customer-centered steps taken by Tylenol’s maker, Johnson & Johnson, which recalled thirty-one million of the capsules from all stores—it produced a textbook approach to proper corporate crisis management that is still considered the gold standard. (The recommended approach urges companies to act without hesitation to fully inform and protect the public, even at substantial expense to their own immediate economic interests.)
By deciding to persist through the interview on my own, I might subject myself to a set of techniques perfected by interrogators over centuries to get confessions from suspects. Some of the techniques are devious and have been shown by research to increase the likelihood of false confessions: lying about the existence of incriminating fingerprints or eyewitness testimony; pressing suspects to repeatedly imagine committing the crime; and putting them into a brain-clouded psychological state through sleep deprivation and relentless, exhaustive questioning. Defenders of such tactics insist that they are designed to extract the truth. An accompanying, complicating truth, however, is that sometimes they just extract confessions that are verifiably untrue.
The legal issue of whether a confession had been made freely by the suspect or extracted improperly by an interrogator involves a judgment of causality—of who was responsible for the incriminating statement. As we know from the experiments of Professor Taylor, a camera angle arranged to record the face of one discussant over the shoulder of another biases that critical judgment toward the more visually salient of the two.
Nothing could change the camera angle’s prejudicial impact—except changing the camera angle itself. The bias disappeared when the recording showed the interrogation and confession from the side, so that the suspect and questioner were equally focal. In fact, it was possible to reverse the bias by showing observers a recording of the identical interaction with the camera trained over the suspect’s shoulder onto the interrogator’s face; then, compared with the side-view judgments, the interrogator was perceived to have coerced the confession. Manifestly here, what’s focal seems causal.
Evidence that people automatically view what’s focal as causal helps me to understand other phenomena that are difficult to explain. Leaders, for example, are accorded a much larger causal role than they typically deserve in the success or failure of the teams, groups, and organizations they head. Business performance analysts have termed this tendency “the romance of leadership” and have demonstrated that other factors (such as workforce quality, existing internal business systems, and market conditions) have a greater impact on corporate profits than CEO actions do; yet the leader is assigned outsize responsibility for company results.
The attractiveness of the young woman requesting assistance with her phone was not enough, by itself, to accomplish it. Something crucial to the process had to be put into place first. The men had to be exposed to a sexually linked concept, Valentine’s Day, before she could prompt them to act. An opener was needed that rendered them receptive to her plea prior to ever encountering it. In short, an act of pre-suasion was required.
Complexities involving matters of the groin don’t stop there. Take a statistic that belies the notion that infusing sex into advertising is a surefire way to increase sales: in Advertising Age magazine’s list of the top hundred ad campaigns of the twentieth century, only eight employed sexuality in the copy or imagery. Why so few? Although responses to sexual content can be strong, they are not unconditional. Using sex to sell a product works only for items that people frequently buy for sexually related purposes. Cosmetics (lipstick, hair color), body scents (perfume, cologne), and form-fitting clothing (jeans, swimwear) fall into this category. Soft drinks, laundry detergents, and kitchen appliances do not, despite the occasionally misguided efforts of advertisers who don’t appreciate the point.
Remarkably, the best indicator of a breakup was not how much love they felt for their partner two months earlier or how satisfied they were with their relationship at that time or even how long they had wanted it to last. It was how much they were regularly aware of and attentive to the hotties around them back then.
This tendency to lend special attention to potentially threatening stimuli appears to be with us from infancy and often pushes us into silly (indeed, scared silly) actions. There are, for instance, dread risks, which involve risky steps that people take to avoid harm from something that is actually less risky but that they happen to be focused on at the time and have thereby come to dread. After the terrorizing events of September 11, 2001, when four commercial airliners were simultaneously flown to their destruction by Al Qaeda hijackers, media coverage of 9/11-related stories was heaviest. As a result, many thousands of Americans with long-distance travel plans abandoned the dreaded skies for the roads. But the fatality rate for highway travel is considerably higher than for air travel, making that choice the more deadly one. It’s estimated that about 1,600 Americans lost their lives in additional auto accidents as a direct result, six times more than the number of passengers killed in the only US commercial plane crash the following year.
What’s the persuasive alchemy that allows a communicator to trouble recipients deeply about the negative outcomes of their bad habits without pushing them to deny the problem in an attempt to control their now-heightened fears? The communicator has only to add to the chilling message clear information about legitimate, available steps the recipients can take to change their health-threatening habits. In this way, the fright can be dealt with not through self-delusional baloney that deters positive action but through genuine change opportunities that mobilize such action.
Overall, sexual and threatening stimuli, though often compelling, are not simple or unitary in their effects. With their complexities in mind, it becomes possible to understand how employing those stimuli can lead to great successes in some influence situations but to reversals in others. When several research teammates and I thought about the matter, we recognized that advertisers often ignore these complexities and, consequently, can produce expensive campaigns that actually undermine product sales. After one member of our research team, Vlad Griskevicius, urged us to take an evolutionary perspective, we realized that humans encountering threatening circumstances would have developed early on a strong tendency to be part of a group (where there is safety and strength in numbers) and to avoid being separate (where there is vulnerability to a predator or enemy). The opposite would be true, however, in a situation with sexual possibilities. There a person would want distance from the pack in order to be the prime recipient of romantic consideration. We also realized that these two contrary motivations, to fit in and to stand out, map perfectly onto a pair of longtime favorite commercial appeals. One, of the “Don’t be left out” variety, urges us to join the many. The other, of the “Be one of the few” sort, urges us to step away from the many. So, which would an advertiser be better advised to launch into the minds of prospects? Our analysis made us think that the popularity-based message would be the right one in any situation where audience members had been exposed to frightening stimuli—perhaps in the middle of watching a violent film on TV—because threat-focused people want to join the crowd. But sending that message in an ad to an audience watching a romantic film on TV would be a mistake, because amorously focused people want to step away from the crowd.
Although the data pattern seems complex, it becomes simplified when viewed through the prism of a core claim of this book: the effectiveness of persuasive messages—in this case, carrying two influence themes that have been commonly used for centuries—will be drastically affected by the type of opener experienced immediately in advance. Put people in a wary state of mind via that opener and, because they will be driven by a desire for safety, a popularity-based appeal will soar, whereas a distinctiveness-based appeal will sink. But use it to put people in an amorous state of mind and, because they will then be driven by a desire to stand out, the reverse will occur.
The potent effect of a rapid change in environmental circumstances on human concentration can be seen in a mundane occurrence that afflicts us all. You walk from one room to another to do something specific, but, once there, you forget why you made the trip. Before cursing your faulty powers of recollection, consider the possibility of a different (and scientifically documented) reason for the lapse: walking through doorways causes you to forget because the abrupt change in your physical surroundings redirects your attention to the new setting—and consequently from your purpose, which disrupts your memory of it. I like this finding because it offers a less personally worrisome account of my own forgetfulness. I get to say to myself, “Don’t worry, Cialdini, it wasn’t you; it was the damned doorway.”
More than a century after Pavlov’s characterization, our bodily reaction to change is no longer called a reflex. It’s termed the orienting response, and scores of studies have enlightened us about it. It isn’t limited to the senses, as Pavlov had thought, but extends to all manner of bodily adjustments, including respiration, blood flow, skin moisture, and heart rate. The indication that has attracted recent scientific scrutiny takes place in the brain, where a pattern of electrical activity known as the “O-wave” (for orienting wave) flows across sectors associated with evaluation. By charting the rise and fall of O-waves in people hooked up to brain-imaging devices, neuroscientists have identified the kinds of stimuli that most powerfully produce shifts in attention. One such category of cues—associated with change—deserves our consideration, as it possesses intriguing implications for the psychology of influence.
There is no question that information about the self is an exceedingly powerful magnet of attention. The ramifications for pre-suasive social influence are significant. In the province of personal health, when recipients get a message that is self-relevant because it has been tailored specifically for them (for example, by referencing the recipient’s age, sex, or health history), they are more likely to lend it attention, find it interesting, take it seriously, remember it, and save it for future reference—all of which leads to greater communication effectiveness, as reflected in arenas as diverse as weight loss, exercise initiation, smoking cessation, and cancer screening. The continuing emergence of large-scale electronic databases, digitized medical records, and personal contact devices such as mobile phones makes individualized message customization and delivery increasingly possible and economical. Purely from an effectiveness standpoint, any health communicator who has not fully investigated the potential use of such tools should be embarrassed.
Even though I was sitting in the front row as the dance unfolded, I never saw it. I missed it completely, and I know why: I was focused on myself and my upcoming speech, with all of its associated phrasings and transitions and pauses and points of emphasis. The missed experience is one of my enduring regrets—it was Balanchine, Stravinsky, etc., after all. I’d been the victim of what behavioral scientists call the next-in-line effect, and, as a consequence, I have since figured out how to avoid it and even use it on my behalf. You might be able to do the same.
How might you sail the waters of your meeting more expertly than your first inclination suggested? I’d propose charting a course that takes into account both the next-in-line effect and the what’s-focal-is-presumed-causal effect. Take a spot at the table across from Alex where (1) he’ll be sufficiently distant from his own presentation to hear yours fully, and (2), because of your visual prominence, he’ll see you as fully responsible for the insights within your fine recommendation for resolving the problem. Of course, if you haven’t come up with a creditably reasoned solution to the problem, you might want to grab a chair right next to his so that in his self-focus-induced bubble, he won’t likely register the fact.
To test this logic, Zeigarnik performed an initial set of experiments that she, Lewin, and numerous others have used as the starting point for investigating what has come to be known as the Zeigarnik effect. For me, two important conclusions emerge from the findings of now over six hundred studies on the topic. First (and altogether consistent with the beer garden series of events), on a task that we feel committed to performing, we will remember all sorts of elements of it better if we have not yet had the chance to finish, because our attention will remain drawn to it. Second, if we are engaged in such a task and are interrupted or pulled away, we’ll feel a discomforting, gnawing desire to get back to it. That desire—which also pushes us to return to incomplete narratives, unresolved problems, unanswered questions, and unachieved goals—reflects a craving for cognitive closure.
A problem that afflicts most writers is procrastination. Writing is hard; at least, writing well (texting doesn’t count) is hard. On this point, consider an exchange between the great British novelist Somerset Maugham and a young interviewer. “So, Mr. Maugham, do you enjoy writing?” “I enjoy having written.” And that’s the dilemma. Writers all want to get to the place of having written, but getting there is no straightforward, trouble-free task. That reality applies to nonprofessionals as well: authors of extended reports and documents designed for coworkers or superiors, for example.
She never lets herself finish a writing session at the end of a paragraph or even a thought. She assured me she knows precisely what she wants to say at the end of that last paragraph or thought; she just doesn’t allow herself to say it until the next time. Brilliant! By keeping the final feature of every writing session near-finished, she uses the motivating force of the drive for closure to get her back to her chair quickly, impatient to write again. So my colleague did have a writing secret after all. It was one that hadn’t occurred to me, although it should have because it was present—if I’d just thought about it—in the body of work on the Zeigarnik effect that I knew well. That was a type of lapse I’ve tried not to let recur, either in my writing or in another of my professional roles at the time: university teaching. I learned that I could increase my classroom effectiveness, pre-suasively, by beginning each lecture with a special kind of unfinished story: a mystery.
I saw evidence of the force of the craving for closure born within mystery stories after I began using them in my classroom lectures. I was still inexperienced enough that on one particular day I got the timing wrong, and the bell rang, ending the lecture before I’d revealed the solution to a puzzle I’d posed earlier. In every college course I’d ever taught, about five minutes before the scheduled end of a class period, some students would start preparing to leave. The signs are visible, audible, and, consequently, contagious: pencils and notebooks are put away, laptops closed, backpacks zipped. But in this instance, not only were there no such preparations; even after the bell rang, no one moved. In fact, when I tried to end the lecture there, students pelted me with protests. They would not let me stop until I had given them closure on the mystery. I remember thinking, “Cialdini, you’ve stumbled onto dynamite here!”
A little-recognized truth I often try to convey to various audiences is that, in contests of persuasion, counterarguments are typically more powerful than arguments. This superiority emerges especially when a counterclaim goes beyond showing a rival’s claim to be mistaken or misdirected in the particular instance and shows the rival communicator to be an untrustworthy source of information generally. Issuing a counterargument demonstrating that an opponent’s argument is not to be believed because its maker is misinformed on the topic will usually succeed on that singular issue. But a counterargument that undermines an opponent’s argument by showing him or her to be dishonest in the matter will normally win that battle plus future battles with the opponent.
Tobacco opponents found that they could use counterarguments to undercut tobacco ad effectiveness. But the tobacco executives learned (and profited from) a related lesson: one of the best ways to enhance audience acceptance of one’s message is to reduce the availability of strong counterarguments to it—because counterarguments are typically more powerful than arguments.
Oh, by the way, there’s a telling answer to the question of what Albert Einstein claimed was so remarkable it could be labeled as both “the most beautiful thing we can experience” and “the source of all true science and art.” His contention: the mysterious.
We convince others by using language that manages their mental associations to our message. Their thoughts, perceptions, and emotional reactions merely proceed from those associations.
The main purpose of speech is to direct listeners’ attention to a selected sector of reality. Once that is accomplished, the listeners’ existing associations to the now-spotlighted sector will take over to determine the reaction. For issues of persuasion, this assertion seems to me groundbreaking. No longer should we think of language as primarily a mechanism of conveyance, a means for delivering a communicator’s conception of reality. Instead, we should think of language as primarily a mechanism of influence, a means for inducing recipients to share that conception or, at least, to act in accord with it.
Instead, it replaced such words possessing menacing associations (target, beat) with comparable words that did not (goal, outdistance). Perhaps this practice reveals the belief of SSM’s leadership that, just as violence-laden language could lead to elevated harm doing and therefore should be eliminated, achievement-laden language could lead to elevated performance and therefore should be retained.
If SSM leaders do hold that belief, they’d be right. Multiple studies have shown that subtly exposing individuals to words that connote achievement (win, attain, succeed, master) increases their performance on an assigned task and more than doubles their willingness to keep working at it. Evidence like this has changed my mind about the worth of certain kinds of posters that I’ve occasionally seen adorning the walls of business offices.
An analysis of two years of magazine ads in the United States and South Korea found that (1) in South Korea, the ads attempted to link products and services mostly to the reader’s family or group, whereas in America it was mostly to the individual reader; and (2) in terms of measured impact, group-linked ads were more effective in South Korea, while ads linked to the individual were more effective in the United States.
Within the domain of general attraction, observers have a greater liking for those whose facial features are easy to recognize and whose names are easy to pronounce. Tellingly, when people can process something with cognitive ease, they experience increased neuronal activity in the muscles of their face that produce a smile. On the flip side, if it’s difficult to process something, observers tend to dislike that experience and, accordingly, that thing. The consequences can be striking. An analysis of the names of five hundred attorneys at ten US law firms found that the harder an attorney’s name was to pronounce, the lower down the firm’s hierarchy he or she stayed. This effect held, by the way, independent of the foreignness of the names: a person with a difficult-to-pronounce foreign name would likely be in an inferior position to one with an easy-to-pronounce foreign name.
Even more impressive is the clever way in which they’ve taken a piece of psychological information—that background cues in one’s physical environment can guide how one thinks there—and employed it to generate a desired effect.
Consider the well-known occurrence of “medical student syndrome.” Research shows that 70 percent to 80 percent of all medical students are afflicted by this disorder, in which they experience the symptoms of whatever disease they happen to be learning about at the time and become convinced that they have contracted it. Warnings by their professors to anticipate the phenomenon don’t seem to make a difference; students nonetheless perceive their symptoms as alarmingly real, even when experienced serially with each new “disease of the week.”
On the one hand, she specified a set of manageable activities that reliably increase personal happiness. Several of them—including the top three on her list—require nothing more than a pre-suasive refocusing of attention:
1. Count your blessings and gratitudes at the start of every day, and then give yourself concentrated time with them by writing them down.
2. Cultivate optimism by choosing beforehand to look on the bright side of situations, events, and future possibilities.
3. Negate the negative by deliberately limiting time spent dwelling on problems or on unhealthy comparisons with others.
There’s even an iPhone app called Live Happy that helps users engage in certain of these activities, and their greater happiness correlates with frequent use.
She also found that younger individuals have different primary life goals that include learning, developing, and striving for achievement. Accomplishing those objectives requires a special openness to discomforting elements: demanding tasks, contrary points of view, unfamiliar people, and owning mistakes or failures. Any other approach would be maladaptive.
Alan told me that just prior to taking any standardized exam, he’d spend systematic time “getting psyched up” for it. He described a set of activities that could have come from a modified version of Dr. Lyubomirsky’s list. He didn’t take up the minutes before the exam room doors opened as I always had: notes in hand, trying to cram every piece of information I was unsteady about into my brain. He knew, he said, that focusing on material that was still vexing him would only elevate his anxieties. Instead, he spent that crucial time consciously calming his fears and simultaneously building his confidence by reviewing his past academic successes and enumerating his genuine strengths. Much of his test-taking prowess, he was convinced, stemmed from the resultant combination of diminished fear and bolstered confidence: “You can’t think straight when you’re scared,” he reminded me, “plus, you’re much more persistent when you’re confident in your abilities.”