Transference Bias: A Tale of Bloody Wars, Baby Kings, and Bad Bosses

There was a time, and it was not so long ago, that conventional wisdom said if you were born into nobility, you possessed a set of superior traits that automatically qualified you for governance. Got royal parents? Congratulations, you’ve won the leadership lottery.

There was just one problem: the system often produced people uniquely unqualified to rule.

Consider Charles II of Spain. He came from a line of the Spanish Hapsburgs so intermarried that one ancestor appears on his family tree in 14 separate places. Charles took the throne in 1665 despite a host of physical and mental disabilities—he couldn’t chew, drooled frequently, was never really educated, and at one point it’s rumored he ordered his deceased family members dug up so he could look at them.

Consider King George IV of England, famous for his extravagant spending, love of leisure—and utterly selfish, irresponsible behavior.

Consider the many kings and queens who were handed the reins to their country before they were old enough to put their own pants on.

Now consider: many of these people held the fate of nations in their hands.

A history pockmarked with unnecessary wars, massive public debts, and plain incompetence bears it out: leadership is not an inherited trait. Wisdom and judgment are gained through experience, not via bloodline.

These days, most surviving monarchs are more figurehead than supreme ruler. After all, the industrial revolution has ushered in modern times and modern thinking. Or has it?

Anyone working in business might guess where this is headed. Transference bias at its core presupposes that knowledge is not a requirement for climbing the ranks of leadership. (“If Jones displayed a hardworking can-do attitude over in sales, by golly he can certainly run the finance department!”)

Today, “character, positivity, and fortitude” are the new blue blood in business.

Not that these traits aren’t good things for a leader to have. Most certainly they are. But when it comes time to make hard choices, the sunniest attitude in the world is no substitute for expertise. It’s the same way that Count Chocula’s noble birth doesn’t guarantee him wisdom in the deployment of his infantry.

Unfortunately, transference bias never died, merely dressed itself in new clothes. And like the old kingdoms at war, there is much collateral damage.

So the next time your new boss shows up wet behind the ears, fresh from some other unrelated department, remember it could be worse. Your cubicle could be a castle wall, facing a catapult attack of dead rotting cows. And if there’s one thing we can all agree on, it’s that nothing is worse than dead rotting cows. Except for maybe the new minty-fresh boss you’re about to train…

The Brain’s Allergy to the Big Picture

Do you suffer from Systems Blindness? You almost certainly do.

The problem is that your brain’s hardwiring is designed primarily to keep you alive. Which is fair. But as a result, we specialize in split-second judgments.

Our living strategy is largely built on using association to connect causes and effects, which in turn drives our decision-making. See a school bully in action and we go out of our way to avoid him. Watch a fellow office worker grown lean through jogging and we might be tempted to hit the pavement ourselves in the morning. In short, we observe, draw inferences and plot our course. This strategy has served humanity well; after all, there are over 7 billion of us on the planet.

Individually, we are amazing at making day-to-day decisions that afford us a certain amount of comfort. But what happens when our comfort is besieged by a huge, unnervingly complicated system like weather or traffic? Here is where Daniel Goleman, in his book Focus: The Hidden Driver of Excellence, weighs in.

Take traffic, for example. When we’re stuck in rush hour, we might be tempted to think the way my diminutive grandmother used to: “Why don’t all these damn people stay home?” Aside from the fact that, by her very presence, she is contributing to the problem, this is probably an issue of oversimplification. Too many people = traffic jam.

We might be tempted to answer my grandmother with, “What we really need is more roadways.” Engaging in this kind of reasoning is known as the “illusion of explanatory depth,” Goleman explains. “…we feel confidence in our understanding of a complex system, but in reality have just superficial knowledge.”

We don’t realize, for instance, that access to new highways can energize nearby industry, which can grow communities, which in turn supports restaurants, shopping, and recreation, thereby attracting even more families, which puts more people like my grandmother on the road, which of course means more traffic jams.
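That feedback loop is easy to caricature in code. Here is a minimal sketch, with entirely invented numbers, of how adding capacity can fail to cure congestion — a toy system-dynamics doodle, not a real traffic model:

```python
# Toy system-dynamics sketch of induced demand. All numbers are invented
# for illustration; this is a caricature, not a traffic study.
def simulate(years=10, capacity=100.0, drivers=95.0):
    history = []
    for year in range(years):
        congestion = drivers / capacity              # near 1.0 means jammed
        if congestion > 0.9:
            capacity *= 1.10                         # "we need more roadways"
        # New capacity energizes development, which attracts more drivers.
        drivers *= 1.0 + 0.12 * (capacity / drivers - 1.0)
        history.append((year, round(congestion, 2)))
    return history

for year, congestion in simulate():
    print(year, congestion)
```

Run it and congestion keeps drifting back toward the jam threshold: every expansion invites just enough new drivers to erase the gain.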

Our brains understand cause and effect at a local level, but as the causes and effects grow larger and more distant, our reasoning suffers. The effect of slamming your fingers in a car door is pretty immediate: the amygdala, the fear center of your brain, fires off a warning and your sensory system administers a shot of pain. Global warming, on the other hand, operates on an almost impossible level of remove.

We are designed to create short-term solutions, and as our societies have grown larger and more complex, systems blindness becomes increasingly dangerous. I can’t see the seas rise as the result of a carbon-loaded atmosphere, so not only do I dismiss the impending long-term threat, my amygdala remains as complacent as the Maytag repairman.

Luckily, as our ability to generate and analyze large quantities of data has improved, our awareness of systems is growing too. Google’s foray into plotting flu epidemics is but one example.

Hopefully the more data mining that takes place for things like global climate change, the more each of us will begin to consider that we are part of something far larger than the hunk of land on which we live and drive. In the meantime, when it comes to our carbon footprint, maybe my grandmother was onto something. “Why don’t all these damn people stay home?”

Einstein, Allie Brosh, and the Secret to Procrastinating With Style

When you contemplate your life, wondering what it means to be alive, it’s unlikely the first thing that comes to mind is ‘office work.’

And yet arguably the life you lead at your desk inhabits a great deal of mental real estate. The sheer number of hours typically spent at work guarantees that the office and all it entails is fundamental in understanding and explaining the big picture of your life.

Work may or may not bring out the best in us, depending on our tasks and whether we are able to get into flow as defined by Mihaly Csikszentmihalyi. But observation suggests there is one constant in human behavior you can expect to see wherever you find a shantytown of office cubicles.

The term was coined by Allie Brosh, of Hyperbole and a Half fame. In a recent interview with Terry Gross, Brosh explained how she started her now-famous internet comic when she was supposed to be studying for finals: “I’m laterally productive. I will do productive things, but never the thing that I’m supposed to be doing.”

The elegance of lateral productivity is it allows you to put off tasks indefinitely without guilt. After all, you aren’t loafing around accomplishing nothing. You’re working hard! You don’t have time to file those reports.

It’s physics writ small (with apologies to Einstein): two objects can’t occupy the same space at the same time. In this case, two tasks can’t occupy the same brain space, since the human brain is notoriously poor at multitasking.

Lateral productivity allows you to play the hero to your self-imposed task villain. You find yourself in a kind of self-perpetuating state of activity, trapped in some Escheresque landscape where you’re diligently drawing the next set of stairs right before you ascend or descend said stairway. Technically, you’re moving, but there’s no forward progress to be had.

This art form is practiced by workers and managers alike. Lateral productivity can go by many other aliases, including “special side project”. The naming lends a measure of credibility. Throw in some metrics, build some color graphs, wrap it up in a PowerPoint presentation and you’ve got the makings of an entirely new task, which at some point lateral productivity will force you to abandon for something else.

Perhaps the best spokesman for lateral productivity was the great New York Yankees catcher Yogi Berra. Driving around one day, unsure of his location or the route to his destination, he reported, “We may be lost, but we’re making great time.”

Are You Smarter Than a Mouse?

Are you smarter than a mouse? This was one of the intriguing topics presented at 2013’s Society for Neuroscience conference in San Diego, on research done by J.F. Gysner, M. Manglani, N. Escalona, R. Hamilton, M. Taylor, J. Paffman, E. Johnson, and L.A. Gabel, all based out of Lafayette College in Easton, Pennsylvania.

If you are a lab mouse, then you are undoubtedly familiar with mazes. Specifically, you’ve probably logged some time in a Hebb-Williams maze. For decades, it’s been the go-to research model: a spatial-visual maze that centers on twelve standard problems, which differ based on the learning/memory task researchers have assigned to you and your rodent buddies.

But the Hebb-Williams maze is not solely reserved for our tiny rodent friends. Its friendly confines have also been used to test the mettle of ‘rats, cats, rabbits, ferrets, mice, and monkeys.’

The Lafayette College team had a few questions on their minds. Would it alter test results to use a virtual model instead? And if not, could they run humans through the simulation and compare their performances against mice?

Clearly, a virtual maze is far more desirable in terms of space and construction costs. Also, it’s not nearly as problematic as shrinking humans down to fit into a mouse maze. (Which, for one thing, opens itself up to all manner of tired movie plots.)

Ninety-eight humans, both male and female, participated in the experiment. The study focused on two age groups: children aged 8-12, and young adults aged 18-21. The participants were screened and evaluated on their video game knowledge to eliminate any pre-trial skill biases.

In order to ensure that chocolate pellets would be enough of an incentive to run the maze, researchers skimped on the food until the mice dropped to 85% of their normal body weight. (Apparently the humans needed no coercion to run the virtual maze for chocolate pellets.)

Ultimately, when it came to the final showdown, humans from both age groups were faster and less prone to mistakes than their small furry counterparts. However, after controlling for species differences, the humans and mice performed “similarly”, suggesting their performance could be compared in future experiments.

Additionally, it turns out that using a computer-generated maze on humans did not alter their results. This was particularly good news for the Lafayette researchers, but perhaps not such a boon for the would-be producers of Honey I Shrunk the Kids 3.

So lucky for your self-esteem, it turns out you are smarter than a mouse, at least where maze-running is concerned. That is, until the playing field is leveled and then, well, say hello to your new competitors, the irrepressible Mickey and Minnie.

Aristotle’s Three Musketeers, or, A Swiftly Tipping Stool

What might the Greek philosopher and Jack-of-all-trades Aristotle think of the latest findings in neuroscience? How would his notion of what it means to be a good public speaker stack up against the bevy of brain biases Daniel Kahneman outlines in prospect theory?

In On Rhetoric, Aristotle outlines three key concepts in building a convincing speech. The speaker must demonstrate:

Ethos: character, trustworthiness, credibility
Logos: logic, facts, figures or some process
Pathos: emotion, true feelings, a sense of connection

Your ethos can be broadly defined as your reputation or honor. Unfortunately, if your listeners don’t already know you, they are less likely to give you the time of day. When famed violinist Joshua Bell played an incognito recital in the Washington subway system, virtually nobody stopped to listen. Without context, Bell’s playing was swallowed up in the chaos of the daily commute. Our sense of importance is often driven more by context than actual value.

For Aristotle, logos was the facts and details, the nuts and bolts of any logical argument. The concept that an argument should be grounded in reason is one of the many things that western science borrowed from Aristotle. And yet, while we pay lip service to rationality, split-brain studies show that what we describe as our reasons often have little connection to the actual decision. We employ logic not as a compass but as a justification.

Danish author Martin Lindstrom notes an interesting phenomenon with product satisfaction surveys. Namely, that they’re useless. Ask someone to review a product they’ve just bought, and there will be nearly no correlation between their stated stance and what they’ll do the next time they buy.

When Aristotle talks about pathos, he is referring to the emotional appeal or the connection to the group, the speaker’s ability to stir the hearts and minds of the listeners. Perhaps modern neuroscience has advanced no idea more strongly than the power of pathos. This is why Kahneman labels the emotional factor, and not our rationality, as the real star of the show.

Aristotle understood this, but in the context of a powerful trinity, with pathos as one leg of a three-legged stool. He wasn’t entirely wrong, just a bit iffy on the relative proportions.

Aristotle often gets billed as a philosopher, and while this is true, it’s also selling him short; his writings cover everything from poetry to physics, music to politics, ethics to zoology. Philosopher Bryan Magee is quoted as saying (maybe a little hyperbolically), “it is doubtful whether any human being has ever known as much as he did”.

So if Aristotle was around today, maybe he wouldn’t need to be embarrassed at having inflated the value of ethos and logos a little. He’d probably be too busy delving into the advances in all his many favorite areas of study. Maybe a few new ones as well. Just what would Aristotle think of neuroscience? There’s no way to know for sure, but he would likely find it interesting. As a wise man once said, “The energy of the mind is the essence of life.”*

(*Aristotle. It was Aristotle.)

Are You Brainwashed? Hopefully Yes.

When you think of brainwashing, the name Patty Hearst might come to mind. Daughter of the late newspaper tycoon Randolph Hearst, in 1974, the then nineteen-year-old was kidnapped by a fringe terrorist group known as the Symbionese Liberation Army.

Several months later, she resurfaced, calling herself Tania and wielding a gun for the SLA during an attempted bank robbery. In September 1975, the local police and the FBI apprehended Hearst in an apartment in San Francisco, along with another SLA member. That January, she was tried for her involvement in the robbery.

The Hearst family’s legal team claimed Patty had been operating under a “classic case” of Stockholm Syndrome. They argued that after weeks of rape, torment, and imprisonment in a closet, she could no longer withstand SLA indoctrination.

The prosecution argued that she had willfully decided to aid the SLA, given some circumstantial evidence and her refusal to name names or turn anyone else in. The jury agreed. She served nearly two years of a seven-year sentence before President Carter commuted it. Twenty-two years after that, President Clinton issued a full pardon — one of his last official actions in the Oval Office.

The strange case of Patty Hearst, and indeed, the very concept of brainwashing, is still hotly debated. That said, as a clinical diagnosis, it has yet to gain a foothold in the psychological community.

Yet recently the idea of brainwashing has resurfaced with a whole new twist — minus the terrorists, kidnapped heiresses, or attempted bank heists. Still, if you’re a neuroscience buff, you might find this case even more exciting. It concerns the dreaded brain disease known as Alzheimer’s.

According to a story by John Hamilton entitled Brains Sweep Themselves Clean of Toxins During Sleep, researchers discovered that brain cells shrink during sleep. This makes room for cerebrospinal fluid to circulate around the cell walls. It’s theorized that this circulation flushes out harmful proteins, the waste product of extracting energy from the blood’s glucose.

“It’s like a dishwasher,” says Dr. Maiken Nedergaard, a professor of neurosurgery at the University of Rochester and an author of the study in Science.

This buildup of toxic proteins, sometimes referred to as ‘brain plaque’, has long been associated with Alzheimer’s. According to Nedergaard, a variety of sleep disorders might interfere with the cleansing — leading to some substantial problems later.

Why does this mental rinse cycle only happen during sleep? Since the process uses a lot of resources, she theorizes it’s an energy-saving strategy.

Furthermore, says Nedergaard, it could “explain why we don’t think clearly after a sleepless night and why a prolonged lack of sleep can actually kill an animal or a person.”

Nedergaard’s team first observed the process in laboratory rats, and since then it’s been observed in baboons. Scientists haven’t detected it yet in humans, but Nedergaard believes it’s only a matter of time.

This kind of brainwashing might not have the headline appeal of a kidnapped newspaper heiress gone rogue, but its implications extend to millions of people who may be suffering from plaque-related brain disorders.

If Nedergaard is right, it might turn out that brainwashing is something each and every one of us will be glad we are a party to.

Chocolate Chip Cookies and the Secret to Will-Power

If you run up a long steep incline, it doesn’t take very long before you burn through the energy stored in your muscles and find your legs turning to rubber. We learn this at a relatively early age, and as a result, some of us make it a habit to avoid running up steep inclines.

What you might not realize is that this exhaustion, this depletion of fuel, happens in the exact same way when you exert yourself mentally.

Your brain, like your muscles, runs on glucose. Give your brain a mental workout and your ability to focus, or demonstrate what we call ‘will power’, is spent as well.

This was demonstrated in a well-known experiment done by psychology professor Roy Baumeister and his team at Florida State University. They conducted a test in which people were randomly assigned to eat either radishes or freshly baked chocolate chip cookies. The radish eaters were instructed to resist eating the cookies. In this case, the noble radish eaters were able to exert enough will power to avoid the cookies 100% of the time.

Both groups were then presented with a series of problems that required extreme eye and hand coordination. The radish eaters gave up noticeably sooner — their focus was about 10% shorter.

In this and in other experiments, Baumeister was able to show that the more often and more recently you resist a desire, the less likely you are to complete the next tough task that comes along. This might explain why you were able to resist the cookies your workmate brought in at lunchtime, only to find yourself with no energy to scrub the bathroom that evening.

Interestingly, will power seems to work just like your leg muscles when you tackle a hill. At some point you run out of gas. And you possess only one store of glucose, which your brain and body share.

Luckily, people can choose to conserve their glucose and hang onto some will power, which is why we are not necessarily reduced to mush after a series of extremely tempting situations.

Not only that, but you can build your will power by practicing restraint in four key areas: control of thoughts, impulses, and feelings, and task performance. The muscle analogy holds up: the more you exercise your will power, the better you become at resisting whatever temptation might befall you.

It’s therefore not surprising to discover that will power and intelligence are the only two reliable predictors of a person’s success across a variety of areas, including relationships, happiness, and income.

Editor’s note: don’t try running up a hill with a quart of Häagen-Dazs in your hand. None of us has enough reserve glucose to pull that one off, not even the most sanctimonious radish eaters…

Processors, Poison, and Poetry: the Science Behind Eyes

Your eyes are far more than your windows to the outside world. They are the movie cameras that feed information to the visual cortex, the brain’s hardworking image processor.

In absolute silence and utter darkness, the information is translated inside your skull at amazing speeds.
It’s a complex operation: in a split second, shadows, movement, and shape are first separated and then knit back together again by a workforce of millions of neurons.

In the final stage, the subconscious brain must decide just how much of the imagery it will make available to your conscious mind. The brain only has a limited processing capacity, so these edits are an essential part of the process. Since the revisions happen outside of your awareness, the conscious brain is forced to play the part of moviegoer rather than director.

How does your subconscious decide what to keep and what to leave on the metaphorical cutting room floor? Scientists are still in the dark (couldn’t resist).

But it is understood that your eyes, and by that I mean your pupils in particular, give us a window into your brain’s ability to focus. We know, for example, that there is a direct correlation between the dilation of pupils and mental effort. The larger the pupils, the more concentration on display.

Many, many tests have proved this relationship. When I’m working hard on a math problem, for instance, my pupils become quite large. When I give up on the problem, my pupils immediately shrink, as if to say, “Hey, that’s all folks.”

Pupil dilation also occurs when you look at someone you find attractive. But interestingly, the reverse is also true.

Not so long ago, scientists ran an experiment wherein they showed men photographs of swimsuit models. The first photo was genuine, but in the second picture, the models’ pupils had been photoshopped to look larger. Sure enough, 90% of the men preferred the doctored photographs.

This is hardly news. The scientists could’ve saved a lot of time had they been able to talk to seventeenth-century women. A common beauty trick at the time was to cut a sprig of the belladonna plant and sniff it from time to time which—you guessed it—dilates the pupils. Apparently this trick was quite effective. (Hopefully it was worth it; belladonna is also quite poisonous.)

So how do restaurateurs take advantage of this? They dim the lights at night, ostensibly to create “ambiance”. The lower light levels make the pupil dilate, which suddenly makes the person across from you far more desirable. This generally leads couples to linger in the restaurant longer, which naturally leads to another glass of wine, or a piece of cheesecake. And so with a flick of a switch, everybody is happy, especially the restaurant owner.

The eyes might not be the windows to the soul, as many a poet has suggested. But you could certainly make a strong case that the eyes, in their ability to reflect both inward and outward, help define both what we see and who we are. That alone is worth the price of admission.

The Final Word on Word-of-Mouth

Let’s start at the location where every single sale begins. I’m talking, of course, about a customer’s brain.

Inside each customer’s skull are enough neural pathways to go around the moon and then circle the earth six times. It is this web of myriad connections that will decide whether or not to make the purchase.

And that decision, the one that has enormous implications for you and me, our families, the economy, and virtually everyone else on the planet, begins its journey in the future.

Whenever someone decides to purchase a product, they begin the journey with a kind of thought experiment, imagining how their life will be with their new acquisition. This can take the form of a vague notion (‘Wouldn’t it be nice to have a new pair of running shoes?’) or it might be a little more concrete (‘I want Chuck Taylor All-Stars in bright orange with white laces, size 10 1/2’).

It is the job of a salesperson to usher these movies in our heads into reality, which often means helping to define that imagined experience.

Not so long ago, the first step to a sale might have begun with the yellow pages. Today, the Internet is where over half of new customers will plot the beginning and, in some cases, the end of the journey. This is why companies large and small devote major dollars to making their websites as vibrant as possible.

So does all of this mean traditional brick-and-mortar is going the way of the dinosaur? The demise of long-time retailers like Montgomery Ward, and more recent ones like Borders and Blockbuster, suggest that the Internet is definitely changing the purchasing landscape. In 2012, sales conducted online racked up over a trillion dollars worldwide.

In his book Contagious, Jonah Berger lays out the truth about the power of the internet as influencer. He points out the following:

1. Word of mouth is the primary driver of 20-50% of all sales in the USA

2. Including blogs, emails, and all social media, how much of that happens online? If you’re like most people, you’d guess around 50-60%. After all, it doesn’t take a mathematician to know that $1 trillion is a lot of money. However, according to the Keller Fay Research group, the real figure is (drum roll please): 7%.

3. Not 70%. 7%.

How is that possible?

Even if you, like the average American, spend two hours of each day online, most of your life is still conducted in “unplugged” mode. Even factoring in sleep, you spend eight times more time dealing with people face-to-face, sharing your thoughts with the 150 people closest to you. (According to anthropologist Robin Dunbar, 150 is the maximum number of real relationships a person can actually juggle.)
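The arithmetic behind that “eight times” is worth a glance. A back-of-the-envelope sketch using the round numbers above (nothing rigorous, just the ratio):

```python
# Rough arithmetic: how your waking hours compare to your time online.
hours_in_day = 24
sleep_hours = 8            # a typical night's sleep
online_hours = 2           # the average American's daily time online

waking_hours = hours_in_day - sleep_hours   # 16 hours awake
ratio = waking_hours / online_hours         # 16 / 2
print(ratio)   # 8.0 -- your waking life is eight times your online life
```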

Your social group is powerful. You might have read incredible things about Nike on Consumer Reports’ website, but if your friend told you she just bought a pair of Nikes and had a miserable experience, all of the carefully compiled statistics go out the window. Your friend’s word of mouth trumps the feedback of thousands of strangers.

And even though the average tweet or Facebook post has the potential to hit 100 people, less than 10% of them actually get read.

Yes, I can get up in the middle of the night half-asleep, turn on my computer, log onto Zappos, buy a pair of running shoes and stumble back to bed. And yes, that kind of single-item sale is susceptible to that sort of enterprise.

Still, when it comes to making purchasing decisions, your inner circle of family and confederates holds tremendous sway over the shopping center in your brain.

Umwelt: Beyond the Five Senses, or, the Mr. Potato Head Model

All of the information that comes to your brain arrives through your senses. Sight, smell, touch, taste and hearing are conduits to the perfectly dark and silent world inside your skull where the most powerful processing machine in the world resides.

And yet, for example, we know that we see only a billionth of what is in front of us. Even honey bees and snakes see a spectrum of light far beyond what we can detect. The animal kingdom is rich with creatures that have adapted to see, hear, feel, smell and taste far beyond our meager human abilities. Bats can hear insects flying from 15 to 20 feet away, and polar bears can sniff out a seal through three feet of ice.

The entire world of our perception is what scientists call our umwelt, and ours is quite limited. I might have conceptual knowledge of the X-ray that pulses through my body every time I go through security at an airport, but I can’t register it in any meaningful way, and the same is true of radio waves, magnetic fields, and so on. Put simply, I just don’t have the hardware for it.

So it seems we are prisoners stuck in a tiny sliver of objective reality, constrained by the limitations of our biology.

Not so fast, says neuroscientist David Eagleman. At San Francisco’s Being Human conference in October 2012, Eagleman discussed a series of experiments that are rewriting our notion of experience, and what it means to, well, be human.

Eagleman explains that the brain’s incredible power lies in its plasticity. Think of your brain as a computer running code. If you can deliver code to the brain, it will figure out a way to run it. Part of the neuroscience Holy Grail is understanding just how the brain does this.

Across the animal kingdom, creatures have all kinds of interesting physical receptors for detecting their umwelt. The blind tick understands the world through odor and the recognition of butyric acid. Vampire bats sense air compression and rattlesnakes have heat pits that send temperature readings to their brains.

Evolution has created a wide variety of biological receptors, but they all plug into a fairly standard processing system shared by many species. The plasticity of the human brain means we can co-opt our sensory delivery systems for unintended purposes.

In one experiment, a picture of a face was represented as a pictograph through needle pricks on a blind subject’s back. The nerve endings in the skin transferred the data to the brain, where it was decoded and the blind subject was able to report what he was “seeing.”
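That experiment is, at bottom, an exercise in downsampling: squeeze an image into a coarse grid of on/off “pins” the skin can resolve. A minimal sketch of the idea — the grid size and brightness threshold here are invented for illustration:

```python
# Encode an image as a coarse grid of on/off "pins," loosely mimicking the
# needle-prick experiment. Grid size and threshold are illustrative choices.
def to_pin_grid(image, rows=8, cols=8, threshold=0.5):
    """image: 2D list of brightness values in [0, 1]."""
    h, w = len(image), len(image[0])
    grid = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # Average the block of pixels that maps onto this pin.
            r0, r1 = r * h // rows, (r + 1) * h // rows
            c0, c1 = c * w // cols, (c + 1) * w // cols
            block = [image[i][j] for i in range(r0, r1) for j in range(c0, c1)]
            row.append(sum(block) / len(block) > threshold)
        grid.append(row)
    return grid

# A 16x16 test "image": a bright square on a dark background.
img = [[1.0 if 4 <= i < 12 and 4 <= j < 12 else 0.0 for j in range(16)]
       for i in range(16)]
for row in to_pin_grid(img):
    print("".join("#" if pin else "." for pin in row))
```

Running it on the toy image prints a blocky square of # characters: the kind of crude but recognizable shape a patch of skin could plausibly relay to the brain.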

Eagleman half jokingly refers to this substitution of biological receptors as the MPH model of Evolution: the Mr. Potato Head model. He says we are more like Mr. Potato Head than we realize.

No matter which sensory system delivers the data, the brain, like a dutiful computer programmer, will find a way to run that code. The more we take advantage of this, the more we expand the potential of our umwelt.

To that end, Eagleman and his team are running prototype experiments to help blind people “see.” They have created a vest that picks up sound waves bouncing off objects, then sends these vibrations to the brain for processing. The brain can translate this back into spatial information, allowing someone with no vision to perceive the objects around them.

It is mind-boggling to imagine the possibilities. It makes the cochlear implant look like child’s play.

Hold on to your vest; your umwelt will never be the same.