A Fruit Gone Sour: The Demise of RIM and BlackBerry

Hey, remember BlackBerry?

In this day and age, it’s basically the smartphone equivalent of asking about digital watches or portable CD players. So it may be hard to remember that less than a decade ago, BlackBerry phones were at the technological forefront, a staple of the busy, the important, and the with-it. People joked about their BlackBerry addictions to the point where “CrackBerry” was Webster’s New World Dictionary’s 2006 Word of the Year. In 2009, Fortune magazine named RIM, the makers of BlackBerry, the fastest-growing company in the world.

Today, you may still know a BlackBerry user, but it’s probably that eccentric friend who won’t throw away their video cassettes in case the VCR makes a comeback.

Have you ever wondered what happened?

Probably not. But hey, now that I brought it up, aren’t you curious?

RIM’s 1999 BlackBerry was revolutionary. In a time when cellphones weren’t good for much beyond making calls, here was a palm-sized PDA that could send and receive e-mails from anywhere. The network was secure, the battery lasted forever, and the little QWERTY keyboard meant you could tap out a message with nearly the efficiency of typing on a computer.

For a while, everything was going right for RIM. What happened? In a word, people.

Co-CEOs Mike Lazaridis and Jim Balsillie built a tech giant, but sadly they suffered from what is sometimes called “Founder’s syndrome.” Having scaled their way to the peak of the mountain, they failed to remember that landscapes change—especially in the fast-changing world of handheld electronics, where people on average replace their phones every two years.

On one hand, with hindsight on our side, it’s easy to condemn business leaders for failing to divine the future. On the other hand, RIM’s success led Lazaridis and Balsillie to double down, sticking their heads so far in the sand that their comments now make for surreal reading.

When PDAs in Asia began to offer color screens, Lazaridis insisted it was an impractical fad. “Do I need to read my e-mail in color?” he’s reported to have said.

“Cameraphones will be rejected by corporate users,” he stated in 2003.

In 2007, when Apple introduced a little gadget they were working on called an iPhone, Balsillie dismissed it as, “kind of one more entrant into an already very busy space with lots of choice for consumers … But in terms of a sort of a sea-change for BlackBerry, I would think that’s overstating it.”

Maybe in another company, someone might have stepped forward and delivered a wakeup call. But Lazaridis was notorious for only hiring people who thought like him. Lazaridis and Balsillie continued to insist their practical, workmanlike product had an impossible-to-beat foothold among businesspeople. How could a phone that wasted battery life on shiny new features elbow in on their territory? Who would tolerate the less user-friendly touchscreen keyboard of an iPhone?

“The most exciting mobile trend is full Qwerty keyboards,” Lazaridis said in 2008, of a feature they’d been offering for literally nine years. “I’m sorry, it really is. I’m not making this up.”

The public disagreed. The public disagreed pretty hard, as it turned out. That oh-so exciting keyboard feature became a shackle, cutting possible screen space in half and severely limiting what else the BlackBerry could offer. As more and more average consumers were enticed into the world of smartphones by the bells and whistles of the new generation, it altered the very definition of what a phone was supposed to be.

By the time even Lazaridis and Balsillie could no longer deny that change was needed, it was too late: they’d lost their edge, their voice of authority. When they finally started to offer their own touchscreens, it came with a feeling of sweaty desperation—and amazingly, their attempt at an iPad competitor didn’t even offer e-mail.

At its peak, BlackBerry stock was worth $230 a share. These days, it hovers around $10. You would probably be better off using that money to buy actual blackberries, which are delicious, full of antioxidants, and much less vulnerable to corporate hubris.

The Achilles’ Heel in Your Head

Have you ever wondered why some people—and maybe we’re talking about you—are so adamant about some things? It might be a political position or your thoughts on diet and exercise, music, drunk drivers, tuna fish—the list goes on.

And if you were asked what informed your particular stance, answers might include your spiritual faith, personal life experience, and/or what you’ve learned from others through a wide variety of sources. Your belief system is subjective, and like your fingerprints, unique to you. As far as storage and access go, you hold the keys.

But while the system itself is subjective, the architecture it’s built on, according to many neuroscientists, is not.

The thought is that the brain attaches emotional meaning to some events, in the form of memory markers. The outrage you feel upon hearing about a hit-and-run drunk driver on the 10 o’clock news helps both inform your opinion and store it in your memory for later recall.

Collected together, opinions create beliefs, which in turn form a rule guide you can apply to new situations. In the science community, these rules are known as heuristics. Your brain catalogs them and makes them available to you for those eyeblink-fast decisions. This saves you from having to consciously invent new guidelines for every situation you find yourself in.

In other words, your brain conserves energy by applying previously established rules to new events. This works pretty well most of the time. In essence, your brain is gambling that a variety of life situations are similar enough that a ‘one size fits all’ approach will get the job done.

Our brains rely on, and would be lost without, our network of prebuilt beliefs to help maneuver us through our day. Unfortunately, this kind of system biases us toward simple black-and-white answers, and we often choose not to examine the nuance of a decision or argument that might put a belief at risk.

This is why soundbites are so popular; they cater to the brain’s entrenched understanding of the world. We decide quickly but shallowly: this political party is the good guys and the other is the bad guys.

It takes more energy and a much more complicated reasoning process to seek out the grey area of a decision or argument. The only way to teach your brain how to do it is to actively question your own beliefs. That can be a messy business, and it can lead to uncertainty—one of the very things your system is designed to help clean up. So there is reassuring safety in locking down on a belief and adamantly refusing to open it up for assessment.

Of course, a little certainty isn’t necessarily bad. It might even make perfect sense, provided you’ve taken the time to work your way through your network of opinions and the nuances that drive an argument or decision.

But why are black-and-white answers so terribly seductive? The simple answer: it’s what our brains are hardwired to do. For many of us caught in the swirl of our day, relying on preconceived beliefs just saves time and energy. Who has time to spend digging into the reasoning, or lack thereof, behind our decisions?

Our heuristics have allowed us to flourish and populate a large portion of the planet, and yet our overreliance on unexamined beliefs is also our collective Achilles’ heel as a species.

Why Your Mindset Might Be Throwing You Curveballs

Are you wired for success? And by “success”, I’m not necessarily talking about monetary reward.

In her paper “The Mindset of a Champion”, psychology professor Carol Dweck discusses the almost-professional baseball career of Billy Beane.

You might remember Billy Beane from Michael Lewis’s book Moneyball. Beane was the famous Oakland A’s general manager who enjoyed numerous successful seasons by combining competent scouting with a wonkish handle on baseball statistics.

Beane as a player is another story. In high school, he was a natural athlete who enjoyed interest from professional teams in baseball, basketball, and football. He was considered the next big thing, and yet his chosen career in baseball came up short. After stints with several different teams, he eventually washed out of the major leagues.

Was Billy wired for success as a big league player? Carol Dweck doesn’t think so. Dweck’s research focuses on the concept of mindset. The idea is that people’s brains are basically wired up through environmental interaction towards one of two mindsets.

To fixed-mindset people, abilities—or the lack of them—are fixed traits, like your height or your deadly soy allergy. “In this view,” Dweck writes, “talents are gifts—you either have them or you don’t.” The fixed mindset sees setbacks as powerfully discouraging, since any bump along the way could be a hint that you were never blessed with true talent.

What’s the other option? The growth mindset, which holds that people can cultivate and improve their abilities through hard work and learning. To those with a growth mindset, drawbacks and extra practice opportunities are simply part of the game. While the fixed mindset camp sinks their time into proving themselves, the growth mindset followers focus on improving themselves.

According to Dweck, growth mindset people tend to be more successful in a wide variety of endeavors because they demonstrate more grit in the face of adversity, and because sustained incremental improvement tends to pay off over time.

You may be like Billy Beane, born with exceptional talent, but as the parable of the tortoise and the hare teaches, it’s the willingness to hang in there and keep plodding along the road of self-improvement that eventually brings the win.

Unfortunately, Billy Beane’s natural talent, combined with a history of being rewarded for skill over work ethic, probably led him towards a fixed mindset. The reasoning goes something like this: superstars don’t and shouldn’t need to practice all that much. That’s why they’re a cut above; that’s what defines them as superstars.

The problem with this approach is that when adversity shows up, the ‘superstar’ has developed absolutely no mechanism for overcoming it. The need for additional help or practice is seen as merely highlighting one’s personal flaws.

The good news? Since mindset is just that, a state of mind, and since the plasticity of the brain means neural rewiring is an ongoing opportunity, Beane was ultimately able to pick up more of a growth mindset. Dweck believes this contributed to Beane’s eventual success as a GM with the A’s.

Interested in success? What mindset are you?

Mind Wandering, or Getting Your Einstein On

Does your mind tend to wander? Most people believe that their minds wander about 10% of the time. Researchers at UC Santa Barbara put that figure at more like 30%. When engaged in well-rehearsed tasks like driving a car on a wide-open highway, mind wandering is estimated to run as high as 70%.

In her book How to Get People to Do Stuff, behavioral psychologist Susan Weinschenk makes the important distinction between mind wandering and daydreaming.

According to Weinschenk, daydreaming involves an aspect of fantasy, like imagining you’ve been asked to star in the next Hunger Games flick opposite Jennifer Lawrence, or you’ve just won the lottery.

Mind wandering occurs when your subconscious brain is engaged in a habituated activity, like driving, and at the same time you’re thinking about some other task or wrestling with some other problem.

Doing one thing while your brain focuses on something else might sound an awful lot like multi-tasking. However, the key to multitasking is performing multiple simultaneous conscious activities. In fact, scientists haven’t definitively proven true multitasking even exists. Humans seem to lack the cognitive firepower to pull off multiple independent-thinking operations at the same time.

What feels like multitasking is actually the brain flipping back and forth between separate mental processes, but because of the speed of the flipping, you have the illusion of synchronicity. Psychologists call this “task switching.” It helps to explain a common downside to this behavior, what’s sometimes called the 50/50 rule: when you try to do two things at once, both tasks tend to take 50% longer and involve 50% more mistakes.
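As rough arithmetic, that 50/50 rule is easy to sketch. The task durations below are invented for illustration, and the 1.5x penalty factor is just the rule of thumb, not a measured constant:

```python
# A back-of-envelope sketch of the "50/50 rule": task switching inflates
# both time and error rate by roughly 50% per task. The 1.5x penalty is
# an assumption taken from the rule of thumb, not a measured constant.

PENALTY = 1.5  # ~50% overhead per task when switching

def single_tasking(tasks):
    """Total minutes when tasks are done one at a time, start to finish."""
    return sum(tasks)

def task_switching(tasks):
    """Total minutes when the brain flips back and forth between tasks."""
    return sum(t * PENALTY for t in tasks)

emails, report = 20, 40  # minutes each when done separately
print(single_tasking([emails, report]))  # 60
print(task_switching([emails, report]))  # 90.0
```

An hour of focused work becomes an hour and a half of flipping back and forth, before you even count the extra mistakes.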

So mind wandering is not multitasking or daydreaming. Mind wandering, according to researchers at UC Santa Barbara, is tied to creativity.  Weinschenk notes that the ability to perform a rote task while mind wandering and, more specifically, to switch on this mental meandering at will is “the hallmark of the most creative people.”

There are numerous stories of great thinkers like Nikola Tesla and Albert Einstein, whose daydreaming or “thought experiments” helped fuel some of the greatest achievements of modern times. Weinschenk would probably be quick to point out that these geniuses were not mere daydreamers, but accomplished mind wanderers.

I suppose that’s a blow to all of us daydreamers who up until now could take solace in our former moony-eyed patron saints, Tesla and Einstein. I, for one, vow to pay more attention to my own mind wandering in the future.

Anyway, I’ll get right on that after I finish winning the lottery…

The Reptile Brain Fights Back: Extinction Bursts

Let’s suppose that you’ve got a habit you want to break. You’ve followed these five habit-breaking rules:

1. Tell a friend you’re going to break a habit to help put pressure on yourself to actually follow through

2. Be persistent; whether making or breaking a habit, it’s generally believed you need about 60 days of reinforced behavior to cement a change

3. Enlist a friend for moral support when you find your will weakening

4. Plan out a meaningful reward to give yourself once the habit is eradicated

5. Keep track of your daily progress towards breaking the habit to reinforce positive habit-breaking behavior
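Rules 2 and 5 lend themselves to a tiny sketch. This is a hypothetical tracker, with the 60-day target taken from rule 2 and everything else (names, the sample log) invented for illustration:

```python
# A minimal sketch of rule 5: log each successful day, then report the
# current streak against the ~60-day reinforcement target from rule 2.
from datetime import date, timedelta

TARGET_DAYS = 60  # rule-of-thumb reinforcement period from rule 2

def current_streak(days_logged):
    """Count consecutive successful days ending today."""
    logged = set(days_logged)
    streak, day = 0, date.today()
    while day in logged:
        streak += 1
        day -= timedelta(days=1)
    return streak

# Example: ten straight successful days ending today.
log = {date.today() - timedelta(days=i) for i in range(10)}
print(f"{current_streak(log)}/{TARGET_DAYS} days")  # 10/60 days
```

Seeing the streak number climb is itself the positive reinforcement rule 5 is after.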

Everything’s going smoothly, and then just when you think you’ve rewired your brain, you’re blindsided by a sneak attack from within.

Once you understand that your rational brain is up against an internal conspirator, you might not be surprised to discover the nemesis is your emotional brain, sometimes known as your reptilian brain, which has some habit maintenance shenanigans up its proverbial sleeve.

The shenanigan in question is known as an extinction burst. And your reptilian brain cleverly waits to spring the trap until you’ve essentially overcome your bad habit and reached the very final stage of habit change, with the habit all but eradicated.

An extinction burst is much like a Hail Mary play in football, where desperation drives an all-or-nothing strategy for success. Your reptilian brain makes a final push to reestablish your old habit.

Take healthy eating, for example. Suppose you’ve managed to avoid dessert for weeks, and your hard-earned reduction in sugar intake is showing in a positive way on the bathroom scale. Then that big piece of chocolate cake, the one you’ve been able to walk past until now, seems to reach out and grab at you, taunting you like the sirens in the Odyssey.

You can thank your reptilian brain for ramping up the chocolate cake craving to an almost unbearable level. This might explain why dieters succumb to bingeing behavior after they’ve been so diligent in their efforts to kick their sugar addiction.

One theory for why your emotional brain might initiate a final extinction burst is that the wiring for a longtime habit is so deeply ingrained that part of your brain misidentifies the habit as something vital to survival, in much the same way that your body’s immune system can misidentify a food source as an allergen.

Extinction bursts are extremely dangerous, largely because they are part of a process that originates from inside your brain. Unfortunately, there is no well established playbook for fending off an extinction burst.

Odysseus solved his fear of succumbing to the siren song of temptation by having his shipmates lash him to the mast of his ship. That might have worked for the famous Greek, but ship masts aren’t always easy to find, especially in the dessert aisle of your grocery store.

Straws, Steps, and the Importance of Thinking Small

There is a famous Arabic proverb in which a camel loaded beyond capacity collapses after a single straw too many, hence ‘the straw that broke the camel’s back.’ The idea is a basic one: a small, seemingly inconsequential event ends up having profound effects.

Exactly one hundred Fridays ago, I began posting on this blog. In my writing and research, I have been struck by one recurring theme: the simplicity and elegance of the single increment, the power of potentiality unleashed through a minute action.

“A journey of a thousand miles begins with a single step.” This quote is sometimes attributed to the Chinese philosopher Lao Tzu, but the concept also resonates in Daniel Coyle’s The Talent Code, Anders Ericsson’s 10,000-hour rule, and BJ Fogg’s tiny habits.

It’s the compounding effect of building on a single decision, and it’s that crucial first step that overcomes the inertia standing in the way of a new habit.

The process is understood: practice builds repetition, which in turn builds habit. Habit is really nothing more than myelinated neural code put into action. But, of course, knowing is not the same as doing.

Our lives are a complex dance of experience, interpreted through the lens of emotion, and it’s difficult in the moment to comprehend the swirl around us. Even when our goals and aspirations are clearly defined, actually getting there proves difficult.

It’s not the knowing; for the most part, we know what we should do or want to do. In some way, it is the very simplicity of that initial step that lures us away from it, as though somehow there has to be more to it than that.

And yet, maybe there isn’t. Water boils at 212 degrees Fahrenheit. At 211 degrees, all you have is really hot water. Raise the temperature one degree and steam is generated, and steam has the power to run a city’s electrical grid.

Someone had to lay the first stone at the Great Pyramid of Giza, Itzhak Perlman had to run his bow across a violin string for the first time, Michael Jordan had to shoot his first layup, and Mark Twain had to write the first word of Huckleberry Finn.

As the process plays out, tiny steps build into something much greater than merely the sum of incremental parts. Nothing illustrates this better than a bird’s nest. Bits of debris, twigs and straw, when woven together, create an amazingly resilient and viable structure that has served our feathered friends for millions of years.

The straw that broke the camel’s back is a parable of warning (I suspect that’s how the camel understands it), but it can also be reframed as the awesome power contained in a single straw.

The last hundred weeks have been an interesting and rewarding journey. Thank you for taking a step down that road with me.

My sincere appreciation to my editor extraordinaire, Jessica. (Editor’s note: aw, thanks!)

See you next week.

Robb Best

The Robert Frost Quandary, or How Irrational Thinking Might Save Your Life

You stumble out of the wilderness, having had no contact with humans for at least ten days. You’re weak from hunger and fatigue and you find yourself at a crossroads, power lines stretching out along each of the separate roadways. It’s decision time. You think about Robert Frost’s poem, and wonder if his advice to take the road less traveled might not just lead to your demise. What do you do, or more importantly, which brain system should you use to make this crucial decision?

Daniel Kahneman, famed psychologist, winner of the Nobel Prize in Economics for prospect theory, and author of Thinking, Fast and Slow, might be the one guy to call, assuming your existential crossroads gets cellphone reception.

Kahneman explains that we have two systems for making decisions. He refers to them simply as System 1 and System 2.

System 1 is reflexive, automatic, and impulsive. It takes a constant reading of your surroundings and generates short-term predictions, all operating on a level beneath your everyday notice. When Freud talked about subconscious associations, he was discussing a function of System 1.

System 2, by contrast, is what allows you to focus on boring tasks, search your memory to identify something unusual, monitor the appropriateness of your behavior, and so on. You can think of it as the rational mind if you’d like, although it’s often too lazy to intervene in System 1’s shenanigans.

Your gut might tell you to take the road on the right. This is System 1 at work, unaware that being right-handed has over the years biased you to feel more comfortable moving in that direction. Studies show that whether entering a building or looking at products in a lineup, we tend to gravitate toward the side of our dominant hand.

On the other hand (so to speak), if you force your System 2 into play, you survey the situation and launch into analytical mode. Rejecting hunches or easy answers, you look for wear in the roads. Perhaps even the litter along the grass might give up clues as to what lies ahead or behind you. This might be a matter of life and death, so extreme deliberation is called for.

Your analytical brain might even recognize your own System 1 bias towards your dominant-hand side, so you are especially determined not to be led down that rabbit hole without a fight. Despite your hunger and thirst, you will use whatever information you can glean from your surroundings to make the most informed decision possible.

But as Kahneman points out, when we’re hungry and tired, our rational thinking and personal willpower begin to suffer mightily. The erstwhile fighter Mike Tyson once said, “Everyone has a plan until they get punched in the mouth.” In Tyson’s case, the insight was probably quite literal. Taken in a broader context, it tells the story of the brain’s limited ability to stay on task when confronted with a deficit of food, sleep, or energy.

System 1 and System 2 (the latter believed to be the newer, shinier system) each have unique characteristics, and in the right situation each works amazingly well.

System 1 can get a bad rap. It’s irrational, and it gets us into trouble sometimes. It weighs some pieces of information over others and it loves shortcuts. (Flaws in your System 1 thinking are why you can be fooled by optical illusions.) It also has a huge bias towards noticing and avoiding danger. While this generates plenty of false alarms and irrational fears (System 1 reacts emotionally to even seeing the word ‘crime’), sometimes you want to jump to conclusions.

As you were pondering your two roads dilemma, if a semi truck happened to come roaring around the corner from out of nowhere, you’d hope it wouldn’t take much analysis to dive out of the way. You could thank System 1 for letting you make that leap without waiting to find out the make and model of the truck as it bore down on you.

Luckily, the scenario I describe is theoretical. Besides, you would never have hiked out into the wilderness without GPS, an adequate food supply, and a backup power supply for your smartphone. Planning and preparation are what the Boy Scouts and System 2 have in common.

But let’s face it, System 1 is probably the real hero of the story. Without your impulses, emotions, and warm memories of the smell of pine, what’s the chance you’d actually marshal the energy to go hiking out in the wilderness in the first place?

The beauty of System 1 is that it’s there to remind you just how lazy you truly are. And as it’s done for countless generations before you, it’s there primarily to keep you alive.

Who Are You? The Science (or Lack Thereof) of Myers-Briggs

If you’ve been hired for a job in the last thirty years, chances are you’ve heard of Myers-Briggs. It’s a personality diagnostic tool used by everyone from self-searching college kids to Fortune 500 companies.

It’s understandable why employers embrace the Myers-Briggs Inventory. If there is a way to figure out ahead of time whether or not you’re going to ‘fit in,’ it could save the company money in the long run, and you might avoid working for a company you don’t like.

If the test can prove you’ve got the right stuff, maybe you’ll even skip a couple of rungs on your way up the management ladder. Perhaps someday you’ll be the one ordering the personality testing of the young upstarts seeking to unseat you from your hard-fought throne.

Obviously, when it comes to business, this is all a pretty big deal.

If you’re like me, you probably believed the Myers-Briggs was supported by some serious clinical evidence. After all, this is a common, widespread, accepted tool. People embrace their Myers-Briggs designation, labeling themselves ENFJ or ISTP with the same certainty as their height or blood type.

Surely it’s all been proven out through a double-blind study, or perhaps several. We’re talking the kind of careful, thorough science necessary when people’s egos, and personal livelihoods, are balanced on a handful of test answers.

The problem is that science has the same relationship with Myers-Briggs that it had with alchemy back in the Dark Ages. It’s true that Myers-Briggs has turned into gold, but a different kind of gold, to be sure.

Isabel Myers, daughter of Katharine Briggs, conjured up the personality test at her kitchen table in the forties. She had no formal training in psychology or testing. She based her system on her reading of Carl Jung, who had in turn been a protégé of the famed Sigmund Freud. Jung suggested in one of his writings, Psychological Types, that human behavior seemed to break down into categories.

Myers, believing that Jung was on to something, went on to build her personality test, placing people on four different continuums: Introvert/Extrovert, Thinking/Feeling, Intuitive/Sensing, and Judging/Perceiving. These variables allowed for 16 different combinations.
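The arithmetic behind those 16 combinations is simple: four independent binary choices, 2 x 2 x 2 x 2. A few lines of Python make the counting concrete (the letter codes follow the standard Myers-Briggs convention):

```python
# Four binary personality dimensions yield 2 * 2 * 2 * 2 = 16 possible
# type codes: one letter chosen from each pair, in the standard order.
from itertools import product

dimensions = [("E", "I"),  # Extrovert / Introvert
              ("S", "N"),  # Sensing / Intuitive
              ("T", "F"),  # Thinking / Feeling
              ("J", "P")]  # Judging / Perceiving

types = ["".join(combo) for combo in product(*dimensions)]
print(len(types))        # 16
print("ENFJ" in types)   # True
```

Add a fifth binary dimension and you’d have 32 boxes; the tidiness of the grid says nothing about whether people actually fit in it.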

When she sent her personality inventory, or ‘Indicator’, to Jung, he was lukewarm, suggesting to Myers that an individual’s personality was far too complex to capture with a set of clever questions. Undeterred, Myers made the rounds of academia, hoping to drum up support for her newly minted system. She was repeatedly turned away due to an utter lack of scientific bona fides.

But in true entrepreneurial spirit, Myers soldiered on. She eventually found a buyer in Henry Chauncey, who had just started up a company called Educational Testing Service. You might know its flagship product better: the Scholastic Aptitude Test, or SAT.

Back in the late fifties, Chauncey decided that a personality test would be a nice addition to his fledgling college entry exam. According to Annie Murphy Paul in her book, The Cult of Personality, Myers was then able to leverage her Chauncey connection into some measure of respectability.

Unfortunately for Myers, one group never fully bought in: psychologists. They have their reasons, including studies where people’s answers on the Myers-Briggs can differ significantly depending on the time of day it’s administered. One study showed that up to 50% of the people who are given the test a second time end up with a different personality profile.

It’s also been suggested that because of the way the test was constructed (little emphasis on negative traits), most people tend to accept their results without examining them too closely.

So what’s the bottom line? Is Myers-Briggs a bunch of baloney? At least on an anecdotal level, there certainly appears to be some connection between people and distinct communication styles. Even Hippocrates observed that people seem to fall into four different types.

However, it seems like an overreach to suggest that Myers-Briggs, or any test for that matter, could ever capture one’s personality as neatly as trapping a firefly in a glass jar. We’ve witnessed this oversimplification before, with the idea that a single number can represent the breadth and depth of one’s IQ.

Myers-Briggs, like the IQ test, does tell us something about ourselves, and it probably makes sense to consider it as one interesting set of data points on an incredibly complicated spectrum of human behavior. But in the end, putting too much emphasis on a personality test appears to be less about science and more about alchemy.

And you always wondered what went on in HR behind closed doors…

The Morality Lag: Smartphones and Dumb Feelings

Your smartphone has more computing power than the computer that took Neil Armstrong and crew to the moon. And this is only one of the staggering technological advancements we’ve made in the last 50 years.

Have you ever wondered why technological advancements, a byproduct of the analytical brain, have far outrun our ability to create any kind of significant improvements in our emotional governance? Wars, murder, and mayhem have gone unabated for thousands of years, and yet this week Apple announced the introduction of a new iPhone with fingerprint recognition.

Back at the turn of the twentieth century, Mark Twain said something to the effect that any newspaper publisher, regardless of the era, could always bank on headlines like “Trouble in the Middle East” and “Revolution in South America”. Twain was uncannily right about humanity’s consistent inability to live and let live.

So why have our emotional brains hit a roadblock? Why haven’t we wiped out jealousy, avarice, and greed the way we knocked out polio? More importantly, when can we expect the next big upgrade to the emotional brain?

Sadly, not any day soon. The brain’s emotional decision-making powers have their roots firmly entwined in our most primitive survival instincts. There is a reason why neuroscientists refer to this part of the brain as our reptilian brain: we share this subcortical structure with the creatures found in the Florida Everglades. The drive to survive trumps just about every other kind of judgment, including the moral ones.

Individually, some may attain enlightenment, but it’s not exactly a product that can be monetized like Tang. And so collectively, we have a certain amount of cognitive dissonance about the way emotions wreak havoc on our everyday lives.

It feels much better for us to share in, and highlight, technological advancements. Take, for example, the second sentence in this post: “And this is only one of the staggering technological advancements we’ve made in the last 50 years.” I don’t know about you, but I didn’t have much to do with the creation of the smartphone.

Technological advancements are built on the backs of the very special few. Although we all enjoy the latest creations that come in the form of cameras that are phones or phones that are cameras, who among the unwashed masses is capable of developing the next great innovation?

Let’s face it, most of us are essentially still cave people. We may wear nicer clothes, but if a cataclysmic event rendered electricity, and therefore our microwave ovens, useless, we’d pay to have one of our ancient relatives from the Chauvet Cave in Southern France explain how to start a fire without propane or matches.

A handful of impressive achievements have altered the way we live our lives, but they haven’t fundamentally altered who we are. We are the same people who have engaged in wars over territory and treasure since recorded time began. Our weapons may have improved, but our intentions haven’t.

It’s easy to conflate technological leaps with some improvement in human nature. They have a connection, but by no means are they the same thing.

We often give ourselves credit for having evolved far more than we actually have. It’s theorized that our current brain systems have been around for roughly 40,000 years, apparently without fundamental change.

Like the saying goes, “Same story, different day.”

One can confirm this by simply gazing at the next heartbreaking newspaper headline.

After all, it’s just one click away on your new smartphone.

Cause and Correlation, or the Pirate Problem


As you can see from the graph above, global warming is pirate-based. It’s something I think we all suspected but were hesitant to advance until the facts could be summarized in a handy graphic.

There is something about information delivered via graph that instantly lends an air of unassailable authority. The person trapped in the cube next to you, or even the guy down at the gas station couldn’t possibly carry the credibility of a simple graph.

It is an axiom of business that any presenter worth their salt is going to fill their PowerPoint with charts and graphs. The more the better, and the more oblique and difficult to read, the better still. Data delivered with a graph says, “Here is the evidence, plain and simple. Let the ascending and descending lines tell you the story.”

The problem with the story, as with the graph above, is that we aren’t just suckered into believing correlation implies causality. We start thinking correlation is causality. Governments, businesses and individuals make this mistake on a daily basis. It’s impossible to calculate the frequency or the magnitude of the resulting financial loss, but it’s enormous.

We all know the crowing of the rooster doesn’t cause the sun to rise. But when rates of breast or prostate cancer are associated with soymilk, or some new drug, it can frequently drive us towards some definitive action, even though the connection of data points might in actuality be more rooster/sun than cause/effect.

The correlation/causality problem goes back a long way. The early human brain, confronted by the rustling of a bush, might naturally assume it was a tiger and not the wind. Erring on the side of safety could make the difference between life and death. Assuming correlation was causality was a small price to pay. This evolution-based brain bias is still part of our biology today.

Here are four questions worth considering the next time you’re faced with the seductive whisperings of an X-axis.

1. Where do the represented data points come from? Groups and individuals might be selectively mining the facts based on their own private agendas.

2. What do the data points represent? Tiny samples can lead to causal conclusions that would be dismissed in a more robust survey population.

3. Was this a controlled study? A control group gives you some yardstick by which to judge the rest of the information.

4. Could other factors be in play? This is probably the most abused aspect of the correlation/causality mix-up. Maybe there is some relation between the X axis and the Y axis, but they could just as easily be responding to some other, third influence.

In the case of the rooster/sun problem, you’d want to consider both planetary revolution and the circadian rhythms of diurnal animals. (Additionally, if you’ve ever been on a farm, you’ll know that while roosters do crow at daybreak, those feathery little jerks will also sound their alarm in the middle of the night.)
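That fourth question, the lurking third variable, is easy to demonstrate in a few lines of Python. The numbers below are invented for illustration; the point is that two series that never touch each other can correlate almost perfectly when both merely track the passage of time:

```python
# A toy illustration of question 4: pirate headcounts and global
# temperatures never influence each other, yet they correlate strongly
# because both track a hidden third variable (the year). All figures
# here are made up for the demonstration.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

years = list(range(1820, 2020, 20))
pirates = [50000 - 4900 * i for i in range(len(years))]  # declining over time
temps = [13.6 + 0.09 * i for i in range(len(years))]     # rising over time

print(round(pearson(pirates, temps), 2))  # -1.0
```

A correlation of -1.0 looks like airtight evidence on a slide, but the only thing either series is responding to is the calendar.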

So the next time some newscaster announces that eating peanut butter has “been linked to” autism, think back to our little graph. And remember: despite the insistence of Pastafarians everywhere (a group inspired by a modern-day Russell’s teapot analogy), most meteorologists agree that pirates have next to no effect on the climate.

Fans of buccaneers, privateers, and scallywags can let out a “Yarr!” of relief.