A Fruit Gone Sour: The Demise of RIM and BlackBerry

Hey, remember BlackBerry?

In this day and age, it’s basically the smartphone equivalent of asking about digital watches or portable CD players. So it may be hard to remember that less than a decade ago, BlackBerry phones were at the technological forefront, a staple of the busy, the important, and the with-it. People joked about their BlackBerry addictions to the point where “CrackBerry” was Webster’s New World College Dictionary’s 2006 Word of the Year. In 2009, Fortune magazine named RIM, the makers of BlackBerry, the fastest-growing company in the world.

Today, you may still know a BlackBerry user, but it’s probably that eccentric friend who won’t throw away their video cassettes in case the VCR makes a comeback.

Have you ever wondered what happened?

Probably not. But hey, now that I brought it up, aren’t you curious?

RIM’s 1999 BlackBerry was revolutionary. In a time when cellphones weren’t good for much beyond making calls, here was a palm-sized PDA that could send and receive e-mails from anywhere. The network was secure, the battery lasted forever, and the little QWERTY keyboard meant you could tap out a message with nearly the efficiency of typing on a computer.

For a while, everything was going right for RIM. What happened? In a word, people.

Co-CEOs Mike Lazaridis and Jim Balsillie built a tech giant, but sadly they suffered from what is sometimes called “founder’s syndrome.” Having scaled their way to the peak of the mountain, they failed to remember that landscapes change—especially in the fast-changing world of handheld electronics, where people on average replace their phones every two years.

On one hand, with hindsight on our side, it’s easy to condemn business leaders for failing to divine the future. On the other hand, RIM’s success caused Lazaridis and Balsillie to double down and stick their heads so far in the sand that their comments now make for surreal reading.

When PDAs in Asia began to offer color screens, Lazaridis insisted it was an impractical fad. “Do I need to read my e-mail in color?” he’s reported to have said.

“Cameraphones will be rejected by corporate users,” he stated in 2003.

In 2007, when Apple introduced a little gadget they were working on called an iPhone, Balsillie dismissed it as, “kind of one more entrant into an already very busy space with lots of choice for consumers … But in terms of a sort of a sea-change for BlackBerry, I would think that’s overstating it.”

Maybe in another company, someone might have stepped forward and delivered a wakeup call. But Lazaridis was notorious for only hiring people who thought like him. Lazaridis and Balsillie continued to insist their practical, workmanlike product had an impossible-to-beat foothold among businesspeople. How could a phone that wasted battery life on shiny new features elbow in on their territory? Who would tolerate the less user-friendly touchscreen keyboard of an iPhone?

“The most exciting mobile trend is full Qwerty keyboards,” Lazaridis said in 2008, of a feature they’d been offering for literally nine years. “I’m sorry, it really is. I’m not making this up.”

The public disagreed. The public disagreed pretty hard, as it turned out. That oh-so exciting keyboard feature became a shackle, cutting possible screen space in half and severely limiting what else the BlackBerry could offer. As more and more average consumers were enticed into the world of smartphones by the bells and whistles of the new generation, it altered the very definition of what a phone was supposed to be.

By the time even Lazaridis and Balsillie could no longer deny that change was needed, it was too late: they’d lost their edge, their voice of authority. When they finally started to offer their own touchscreens, the move came with a feeling of sweaty desperation—and amazingly, their attempt at an iPad competitor didn’t even offer e-mail.

At its peak, BlackBerry stock was worth $230 a share. These days, it hovers around $10. You would probably be better off using that money to buy actual blackberries, which are delicious, full of antioxidants, and much less vulnerable to corporate hubris.

The Achilles’ Heel in Your Head

Have you ever wondered why some people—and maybe we’re talking about you—are so adamant about some things? It might be a political position or your thoughts on diet and exercise, music, drunk drivers, tuna fish—the list goes on.

And if you were asked what informed your particular stance, answers might include your spiritual faith, personal life experience, and/or what you’ve learned from others through a wide variety of sources. Your belief system is subjective, and like your fingerprints, unique to you. As far as storage and access go, you hold the keys.

But although the system is subjective, what’s less subjective is the architecture it’s built on, according to many neuroscientists.

The thought is that the brain attaches emotional meaning to some events, in the form of memory markers. The outrage you feel when you hear about a hit-and-run drunk driver on the 10 o’clock news helps both to inform your opinion and to store it in your memory for later recall.

Collected together, opinions create belief, which leads to a rule guide that you can then apply to new situations. In the science community, these are known as heuristics. Your brain catalogs them and makes them available to you for those eyeblink-fast decisions. This saves you from having to consciously invent new guidelines for every situation you find yourself in.

In other words, your brain conserves energy by applying previously established rules to new events. This works pretty well most of the time. In essence, your brain is gambling that a variety of life situations are similar enough that a ‘one size fits all’ approach will get the job done.
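If you think of heuristics as cached rules, the trade-off is easy to see in code. Here’s a deliberately toy sketch in Python (the situations and rules are invented for illustration, not drawn from any study):

    # A toy model of heuristic thinking: cheap cached rules first,
    # expensive case-by-case deliberation only as a fallback.

    HEURISTICS = {
        "stranger offers deal too good to be true": "be suspicious",
        "large animal running at you": "run",
        "food smells rotten": "don't eat it",
    }

    def deliberate(situation):
        """Stand-in for slow, effortful, case-by-case reasoning."""
        return f"stop and think carefully about '{situation}'"

    def quick_judgment(situation):
        """Fast path: apply a cached rule if one matches; deliberate otherwise."""
        return HEURISTICS.get(situation) or deliberate(situation)

    print(quick_judgment("food smells rotten"))         # cached rule fires
    print(quick_judgment("ambiguous political claim"))  # no rule: costly thinking

The fast path is cheap and usually right; the slow path is accurate and expensive. The brain’s bet is that the cache covers most of what a day throws at it.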

Our brains rely on, and would be lost without, our network of prebuilt beliefs to help maneuver us through our day. Unfortunately, this kind of system means that we bias towards simple black-and-white answers, often choosing not to examine the nuance of a decision or argument that might put our belief at risk.

This is why soundbites are so popular; they cater to the brain’s entrenched understanding of the world. We decide quickly but shallowly: this political party is the good guys and the other is the bad guys.

It takes more energy and a much more complicated reasoning process to seek out the grey area of a decision or argument. The only way to teach your brain how to do it is to actively question your own beliefs. That can be a messy business, which can lead to uncertainty—one of the very things your system is designed to help clean up. So there is reassuring safety in locking down on a belief and adamantly refusing to open it up for assessment.

Of course, a little certainty isn’t necessarily bad. It might even make perfect sense, provided you’ve taken the time to work your way through your network of opinions and the nuances that drive an argument or decision.

But why are black-and-white answers so terribly seductive? The simple answer: it’s what our brains are hardwired to do. For many of us caught in the swirl of our day, relying on preconceived beliefs just saves time and energy. Who has time to spend digging into the reasoning, or lack thereof, behind our decisions?

Our heuristics have allowed us to flourish and populate a large portion of the planet. And yet, as a species, our overreliance on unexamined beliefs is also our collective Achilles’ heel.

The Twain Brain, or, Why Smart People Do Stupid Things

“A man who carries a cat by the tail learns something he can learn in no other way.” So said Mark Twain, printer, steamboat pilot, journalist, failed miner, entrepreneur, abolitionist, lecturer and supreme humorist. Twain is perhaps the greatest American storyteller and writer ever produced by the fifty states.

Whether attacking kings, hypocrites, or the literary offenses of Fenimore Cooper, Twain was famous for his razor-sharp wit and his allergy to BS. Yet that same man drove himself into bankruptcy, ignoring the counsel of his most trusted pals in favor of pouring his fortune into a series of disastrous inventions. He once invested the equivalent of 8 million dollars in the Paige typesetting machine, a marvel of engineering that dazzled crowds but also constantly broke down and was obsolete by about 1884.

So why did a man renowned for not suffering fools pour his fortune into one fool’s errand after another? Could it be that Twain, like the rest of us, was seduced by a “magic bullet,” that wholly American notion that there is a shortcut out there to unmitigated wealth and happiness?

Whether it’s a diet pill (all-natural, of course) or a potion to restore skin texture to that of a six-week-old baby (all-natural, of course) or a book that promises to create a state of nirvana (no artificial additives) or a new-fangled typesetter machine, many of us are suckers for the shortcut.

We love the easy road, the secret sauce, or that ultimate financial tip (see Martha Stewart). In 2012, Americans spent a total of $78 billion on lottery tickets.

Our brains love shortcuts. The most primitive, basic parts of our brains are wired for them. Although these shortcuts lack precision and can create real problems, their saving grace is efficiency.

Still, that efficiency comes at a price. Take optimism bias, the unfounded belief that things will turn out much better than the facts warrant.

It’s what allows smokers to believe they won’t get cancer, dieters to start their diet in the future, and all of us to procrastinate on our work because, as Twain noted, “Never put off till tomorrow what you can do the day after tomorrow.”

Even the great Twain fell victim to optimism bias as he traveled down what he thought was a shortcut to financial independence through a prototype-printing machine. The Paige typesetter was reported to possess more problems than actual moving parts, of which it had more than 18,000.

Ironically, many suspect that had Twain put more energy into writing and less into his pet get-rich-quick schemes, he would have gotten rich much faster, and with a whole lot less heartache.

But Twain was plagued with one incurable problem: a human brain. If reasoning is currency, then biases and shortcuts are what the primitive brain trades in. And that brain is where the action is.

Perhaps rather than seeing biases and shortcuts as system flaws, we should instead celebrate that which makes our brains so unique and ‘predictably irrational.’

No one summed it up better than Mark Twain.

“Don’t part with your illusions. When they are gone, you may still exist, but you have ceased to live.”

The Anatomy of Fear

By the time our ancestors were roaming the great savannas, alternating between chasing prey and being prey, their systems had already adapted to face the harsh environment, according to Rick Hanson, PhD, author of Buddha’s Brain: The Practical Neuroscience of Happiness, Love, and Wisdom.

Today, those ancient adaptations show up in the most peculiar ways as we constantly hijack a system built for times long ago. Getting cut off on the highway might not seem an awful lot like being chased by a lion, but to the brain’s subcortical structures, it’s pretty much the same deal.

Hanson explains how it all works.

As you feel yourself careening across a lane of traffic, your brain sounds the ancient ‘lion alarm’ and begins to prepare for battle. First, stress hormones kick in: epinephrine drives your heart rate up, and norepinephrine increases blood flow to bring your largest muscles online faster. Your pupils dilate to take in more light for enhanced visibility, and your bronchioles expand, boosting lung capacity for punching and speed.

Another stress hormone, cortisol, jumps in to suppress your immune system’s inflammation response, just in case you happen to be wounded. The hippocampal system, which normally does its part to keep the amygdala (the brain’s fear center) and its cortisol response under wraps, takes a proverbial step back and lets the amygdala ratchet up, driving still more cortisol into your system, in effect supercharging your blood, much like adding high-octane fuel to your vehicle’s gas tank.

The system that governs reproduction essentially turns off, because that’s not something that you’re likely to be thinking about at the moment, and your digestive system also goes into hibernation, allowing the body to redirect energy and blood flow as needed.

All of this signals your amygdala to go on higher alert; this system, which normally monitors threatening information, turns up the heat on your emotional thermometer and moves the needle from stress to fear and anger, raising your intensity to a level commensurate with a life-or-death struggle.

Your prefrontal cortex, the home of reason, speculation, planning, and assessment, gets hijacked by more primitive systems like the amygdala and takes a backseat, effectively turning the keys to your body over to your reptilian brain. Not a whole lot of contemplation goes down when you’re in “kill or be killed” mode.

There was a time when the primitive elegance of this system made sense. But that was a long time ago. Today a whole host of activities, from being cut off on the road to a bad email from your boss, can trip the ancient survival system.

Many people find it difficult to deactivate the stress switch. Hanson teaches that practicing mindfulness and meditation are ways to help keep the brain from prematurely sounding the alarm bell all day long.

Sadly, for all too many people, there is a lion lurking around every corner.

The Experiencing Self, or, Why Present-You Hates Past-You

Unlike Calvin here, most of us will probably never get the opportunity to have a face-to-face conversation between our current selves and our 6:30 selves, so to speak. That is a shame, because as Daniel Kahneman discusses in Thinking, Fast and Slow, the two of them don’t necessarily know that much about each other.

In trying to understand how the brain registers emotion, Kahneman outlines the divide between what he calls the “experiencing self” and the “remembering self.” The experiencing self, or “8:30 Calvin,” if you will, is you in the present. All it knows is whether or not you are having a good time in the moment. The remembering self, or “6:30 Calvin,” on the other hand, looks back and tries to sum up your overall impressions of past events.

Which Calvin holds more sway in your judgments? That prize goes to 6:30 Calvin. It makes a certain amount of sense. The trouble with living “in the now” is that every second is a different now. Your in-the-moment perceptions are in constant flux. Your remembering self, on the other hand, is a much more fixed point. Besides, the vast majority of the information in your head is not things you’re discovering in the moment but feelings and data you’ve gradually built up over your whole life.

Unfortunately, 6:30 Calvin doesn’t always know what he’s talking about.

In the past, we’ve discussed the peak-end rule, where our take-home memory of an event puts way too much weight on the most extreme moment, and also on whatever happened at the very end. (Every stage actor knows that you have to bring your A-game to the final scene.) Our most recent recollection can color the rest of our information to a hilarious degree. In one study, people were asked to judge their own life satisfaction. But first, they had to make a photocopy. Half the participants then “discovered” a carefully planted dime in the photocopy room. The simple minor victory of having just gained a free ten cents had a noticeable impact on how they assessed the overall happiness of their lives.

Then, there’s duration neglect. 6:30 Calvin has no way to accurately record time.

To see these effects in action, we need look no further than Kahneman’s “cold hand” study.

First, experimenters asked people to plunge a hand into very cold water for 60 seconds. As you might imagine, this is not the most pleasant activity. (From an experimental-design standpoint, it’s a good way to administer an easy-to-measure but harmless pain. From a “bored scientists hanging around the lab” standpoint, it’s probably also a decent dare.)

The subjects were allowed to briefly warm and dry their hands with a towel, and to presumably take a moment to ponder the sacrifices we all make for scientific knowledge, and whether or not it’s worth it to hurt yourself on purpose while some schmuck or schmuckette stands over you with a clipboard.

Then, it was back into the cold water. This time, the subjects got 60 seconds just as before, immediately followed by 30 seconds in water that was exactly one degree warmer.

Told they had to undergo one more dunking, the subjects then had to decide: did they want to relive Option A or Option B?

Keep in mind: Option B is just Option A with 30 extra seconds of slightly less-painful pain. (A total of 90 seconds in cold water.) So surely it will come as no surprise to know that people overwhelmingly chose…Option B. They were swayed by the recollection of the pain lessening during the last 30 seconds (hello, peak-end rule), and while each second in the cold water had probably felt like an eternity as it was happening, the remembering self couldn’t make the distinction.
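The arithmetic behind that choice is easy to sketch. Here’s a back-of-envelope version in Python; the pain scores are invented, and remembered pain is approximated, per the peak-end rule, as the average of the worst moment and the final one:

    # Toy model of the cold-hand choice. Pain scores are made up:
    # call the cold water an 8 out of 10 and the slightly warmer finish a 7.

    def peak_end_score(pain_per_second):
        """Remembered pain: average of the worst moment and the last moment."""
        return (max(pain_per_second) + pain_per_second[-1]) / 2

    def total_pain(pain_per_second):
        """What the experiencing self endured: the sum over every second."""
        return sum(pain_per_second)

    option_a = [8] * 60             # 60 seconds of cold water
    option_b = [8] * 60 + [7] * 30  # the same, plus 30 milder seconds

    print(total_pain(option_a), total_pain(option_b))          # 480 vs. 690
    print(peak_end_score(option_a), peak_end_score(option_b))  # 8.0 vs. 7.5

By total suffering, Option A wins easily. By the score the remembering self keeps, Option B looks better.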

The remembering self doesn’t care about 60 seconds vs 90 seconds. “What’s the difference to me?” says the remembering self. “I’m talking to you from the past, and in the vast scheme of your life, 30 seconds are nothing.” Sure, it means a little extra pain in the moment, but the remembering self doesn’t worry about the moment. “Not my department,” says the remembering self with a shrug, passing the buck in a scene familiar to anyone who’s ever worked in a company with multiple employees. “It’s someone else’s problem.”

Unfortunately for you, that “someone else” is…your experiencing self.

If you’ve read the classic Calvin and Hobbes strip above, you know that what follows is a whole lot of arguing. Just one more peril of time travel…

Six Strategies for Avoiding the Truth

Are you lying to yourself every day?

Depends: are you a “Bayesian Updater”? Hopefully you are. The term is named after the Reverend Thomas Bayes, whose probability theorem, published posthumously around 1763, describes how to revise the probability of a belief when new evidence arrives: confronted with facts contradicting your current beliefs, you change or update your beliefs.
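The rule itself fits in a few lines. Here’s a minimal sketch in Python, with the probabilities invented purely to show the mechanics of updating:

    # Bayes' rule: P(belief | evidence) is proportional to
    # P(evidence | belief) * P(belief). All numbers below are made up.

    def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
        """Return the updated probability of a belief after seeing evidence."""
        p_evidence = (p_evidence_if_true * prior
                      + p_evidence_if_false * (1 - prior))
        return p_evidence_if_true * prior / p_evidence

    belief = 0.90  # start out 90% sure you're right
    for i in range(3):
        # Each round, observe a fact four times more likely if you're wrong.
        belief = bayes_update(belief, p_evidence_if_true=0.2,
                              p_evidence_if_false=0.8)
        print(f"after contradicting fact {i + 1}: belief = {belief:.2f}")
    # Prints roughly 0.69, then 0.36, then 0.12.

Three contradicting facts in a row take a 90% belief down to about 12%. That, at least, is how it’s supposed to work.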

In his new book, Answers for Aristotle, City University of New York’s Massimo Pigliucci suggests that if humans are rational, then the Bayesian principle should be our default. Of course, modern science has done a great deal to de-emphasize the role of logic in decision-making. Even the great philosopher Aristotle, upon closer reading, suggests we are more rationalizers than rational.

So if people aren’t Bayesian Updaters, what are they? In their study of cognitive dissonance, Northwestern University professor Monica Prasad and her research team have identified six alternative strategies. Their work shines a light on just how intelligent and well-informed individuals can cling to a belief even in the face of all available proof to the contrary.

Her findings are based on a study of Republicans who failed to change their stance on the Iraq War, even after being confronted with hard evidence that Saddam Hussein was not connected to 9/11, as Bush had initially argued.

Here are the six most common responses Prasad identified in her study:

1. Attitude Bolstering (33%): When told Saddam Hussein had nothing to do with 9/11, this group simply shifted to other justifications for the Iraq War. For example, “There is no doubt in my mind that if we did not deal with Saddam Hussein when we did, it was just a matter of time when we would have to deal with him.”

2. Disputing Rationality (16%): Having trouble justifying your reasoning? Here’s one option: don’t even try. As one subject put it, “Well, I bet they say that the Commission didn’t have any proof of it but I guess we still can have our opinions and feel that way even though they say that.”

3. Inferred Justification (16%): Some respondents worked backwards, suggesting that even if they couldn’t find a reason, surely one had to exist, because why else would we be in Iraq? “…I believe that [the death of innocent people is wrong] also, but there must be a reason why we’re still over there or we wouldn’t be over there still.”

4. Denial of Belief in the Link (14%): These subjects used a “slippery slope” defense, subtly reinterpreting the original linkage between Hussein and 9/11 to be about Afghanistan and 9/11, as if the malleability of the facts was not a problem.

5. Counter-arguing (12%): Another common strategy was simply refuting the information. These people responded with their own arguments connecting Saddam and the 9/11 attacks. For example, “I believe he was definitely involved with in it because he was definitely pumping money into the terrorist organizations every way he could. And he would even send $25,000 to somebody who committed suicide to kill another person, to their family.”

6. Selective Exposure (6%): Instead of changing their mind, this group simply disengaged from the issue altogether, saying things like, “I don’t know. I don’t know anything about . . . where and what we’re going after.” and “I’m gonna pass on this one, for now.”

Interestingly, even after the subjects were shown a quote in which George Bush acknowledged that there was no link between 9/11 and Saddam Hussein, only 2% of those surveyed changed their minds.

It should be pointed out that this study is not a condemnation of Republicans. No personal background or political affiliation makes you immune to these fallacies. It’s not an issue of party lines; it’s an issue of being human.

So what drives our cognitive dissonance? One answer might be heuristics. These are the shortcuts, or rule-of-thumb processes, our emotional brains use to make quick decisions. This primitive thinking system (what Daniel Kahneman calls System 1) is alive and well today. We use it on a daily basis. Heuristics are a handy way to solve a problem when time and/or energy are in short supply. The problem starts when we take the shortcut without even knowing it.

Kudos to the Reverend Bayes who, back in the 18th century, gave us the benefit of the doubt when it came to rationality. Today, we have some rather more unflattering facts to face.

That is, if we want to.

The Lowdown on Luck

“Good luck!”

It’s a common expression in our lexicon. Obviously, on a gut level we have some sense of the importance of luck—that is, until things go our way. Suddenly, we relegate luck to the cheap seats as we bask in the spotlight, prepared to take full credit for our superior decision-making.

Take the Quaker Oats Corporation, for example.

In 1983, the CEO of Quaker Oats, William Smithburg, sampled some of an up-and-coming sports drink and decided to acquire the company. That beverage was Gatorade, and it was the beginning of a goldmine for Quaker Oats.

So when Smithburg decided to buy Snapple in 1994 for $1.8 billion, he was unchallenged from within his organization. This was the Gatorade guy, after all; surely he knew what he was talking about. Media pundits disagreed, lambasting the decision before the deal was even struck.

Did Smithburg’s superior decision-making prove them wrong? Well, no.

Fast-forward three years, and Quaker Oats was desperately unloading Snapple at a loss of $1.5 billion. To this day, it is widely seen as one of the worst decisions in business history—quite a legacy.

So how could Smithburg screw up so badly? How should we understand what happened at Quaker Oats: polar-opposite results from the same CEO?

Nobel Laureate Daniel Kahneman has the answer. And for that answer, we must turn to the Israeli Air Force.

For a long time, the Israeli Air Force trained its pilots with the assumption that negative feedback trumped positive. After all, when a trainee pilot was punished for a botched maneuver, the next attempt tended to go better. When that same trainee executed a maneuver perfectly and received praise, their next attempt was generally not as good.

Kahneman was the first to realize that this wasn’t a case of the stick working better than the carrot. It was simple statistics at play.

We all love those magic moments where we outshine our normal capabilities. But there’s a reason your average is your average. So chances are that a better-than-usual outcome is going to be followed by something closer to your norm. The opposite is true as well; if you find yourself performing much worse than usual, the odds favor an eventual upswing.

It’s an old concept in statistics. In the 19th century, Sir Francis Galton found that the children of unusually tall parents tended to be a little shorter than their parents, while unusually short parents tended to have children taller than themselves. He referred to this phenomenon as “regression to the mean.”
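You can watch the effect in a toy simulation (a sketch; the skill level and the amount of luck are arbitrary). Score each maneuver as fixed skill plus random noise, pick out the unusually good runs, and look at what happens next:

    # Toy simulation of regression to the mean; all numbers are arbitrary.
    import random

    random.seed(42)
    SKILL = 50.0  # the pilot's true average performance

    def fly():
        """One maneuver: skill plus random luck (noise with std dev 10)."""
        return SKILL + random.gauss(0, 10)

    pairs = [(fly(), fly()) for _ in range(100_000)]
    great = [(a, b) for a, b in pairs if a > 65]  # unusually good first runs

    avg_great = sum(a for a, _ in great) / len(great)
    avg_next = sum(b for _, b in great) / len(great)
    print(f"great run:     {avg_great:.1f}")  # ~70: skill plus good luck
    print(f"the run after: {avg_next:.1f}")   # ~50: back to the average

No praise, no punishment, no instructor required: the letdown after a great run is baked into the arithmetic.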

Kahneman takes this concept beyond height and into the messier real world.

No matter how well you prepare, most enterprises involve a degree of chance. A brilliant business idea may still fail in a lousy economy. An untalented singer might still net a record deal by happening to charm the right person at the right time.

In his book Thinking, Fast and Slow, Kahneman repeatedly demonstrates that much of our lives are shaped by random events beyond our understanding or control.

Was William Smithburg’s gut decision to buy Gatorade a stroke of leadership genius? Or did he happen to taste the right stuff at the right time?

Kahneman was once asked about his definition of success. He famously replied, “Success = talent + luck, and great success = a little more talent + a lot of luck.”

In other words, luck is not some bit player. It’s an integral part of the human experience, whether we choose to acknowledge it or not. As Kahneman showed, luck, good or bad, pretty much guarantees that regression to the mean is always waiting to take center stage.

The Brain’s Allergy to the Big Picture

Do you suffer from Systems Blindness? You almost certainly do.

The problem is that your brain’s hardwiring is designed primarily to keep you alive. Which is fair. But as a result, we specialize in split-second judgments.

Our living strategy is largely built on using association to connect causes and effects, which in turn drives our decision-making. See a school bully in action and we go out of our way to avoid him. Watch a fellow office worker grow lean through jogging and we might be tempted to hit the pavement ourselves in the morning. In short, we observe, draw inferences, and plot our course. This strategy has served humanity well; after all, there are over 7 billion of us on the planet.

Individually, we are amazing at making day-to-day decisions that afford us a certain amount of comfort. But what happens when our comfort is besieged by a huge, unnervingly complicated system like weather or traffic? Here is where Daniel Goleman, in his new book Focus: The Hidden Driver of Excellence, weighs in.

Take traffic, for example. When we’re stuck in rush hour, we might be tempted to think the way my diminutive grandmother used to: “Why don’t all these damn people stay home?” Aside from the fact that, by her very presence, she was contributing to the problem, this is a classic oversimplification: too many people = traffic jam.

We might be tempted to answer my grandmother with, “What we really need is more roadways.” Engaging in this kind of reasoning is known as the “illusion of explanatory depth,” Goleman explains. “…we feel confidence in our understanding of a complex system, but in reality have just superficial knowledge.”

We don’t realize, for instance, that access to new highways can energize nearby industry, which can grow communities, which in turn supports restaurants, shopping, and recreation, thereby attracting even more families, which puts more people like my grandmother on the road, which of course means more traffic jams.

Our brains understand cause and effect at a local level, but as the causes and effects grow larger and more distant, our reasoning suffers. The effect of slamming your fingers in a car door is pretty immediate: the amygdala, the fear center of your brain, fires off a warning and your sensory system administers a shot of pain. Global warming, on the other hand, operates on an almost impossible level of remove.

We are designed to create short-term solutions, and as our societies have grown larger and more complex, systems blindness becomes increasingly dangerous. I can’t see the seas rise as the result of a carbon-loaded atmosphere, so not only do I dismiss the impending long-term threat, but my amygdala stays as complacent as the Maytag repairman.

Luckily, as our ability to generate and analyze large quantities of data has improved, our awareness of systems is growing too. Google’s foray into plotting flu epidemics is but one example.

Hopefully the more data mining that takes place for things like global climate change, the more each of us will begin to consider that we are part of something far larger than the hunk of land on which we live and drive. In the meantime, in regards to our carbon footprint, maybe my grandmother was onto something. “Why don’t all these damn people stay home?”

The Robert Frost Quandary, or How Irrational Thinking Might Save Your Life

You stumble out of the wilderness, having had no contact with humans for at least ten days. You’re weak from hunger and fatigue, and you find yourself at a crossroads, power lines stretching out along each of the separate roadways. It’s decision time. You think about Robert Frost’s poem and wonder whether his advice to take the road less traveled might just lead to your demise. What do you do, or more importantly, which brain system should you use to make this crucial decision?

Daniel Kahneman, famed psychologist, winner of the Nobel prize for prospect theory, and author of Thinking, Fast and Slow, might be the one guy to call, assuming your existential crossroads gets cellphone reception.

Kahneman explains that we have two systems for making decisions. He refers to them simply as System 1 and System 2.

System 1 is reflexive, automatic, and impulsive. It takes a constant reading of your surroundings and generates short-term predictions, all operating on a level beneath your everyday notice. When Freud talked about subconscious associations, he was discussing a function of System 1.

System 2, by contrast, is what allows you to focus on boring tasks, search your memory to identify something unusual, monitor the appropriateness of your behavior, and so on. You can think of it as the rational mind if you’d like, although it can be too lazy to intervene on System 1’s shenanigans.

Your gut might tell you to take the road on the right. This is System 1 at work, unaware that being right-handed has over the years biased you to feel more comfortable moving in that direction. Studies show that whether entering a building or looking at products in a lineup, we tend to gravitate toward the side of our dominant hand.

On the other hand (so to speak), if you force your System 2 into play, you survey the situation and launch into analytical mode. Rejecting hunches or easy answers, you look for wear in the roads. Perhaps even the litter along the grass might give up clues as to what lies ahead or behind you. This might be a matter of life and death, so extreme deliberation is called for.

Your analytical brain might even recognize your own System 1 bias towards your dominant-hand side, so you are especially determined not to be led down that rabbit hole without a fight. Despite your hunger and thirst, you will use whatever information you can glean from your surroundings to make the most informed decision possible.

But as Kahneman points out, when we’re hungry and tired, our rational thinking and personal willpower begin to suffer mightily. The erstwhile fighter Mike Tyson once said, “Everyone has a plan until they get punched in the mouth.” In Tyson’s case, the insight was probably quite literal. Taken in a broader context, it tells the story of the brain’s limited ability to stay on task when confronted with a deficit of food, sleep, or energy.

System 1 and System 2 (the latter believed to be the newer, shinier system) both have unique characteristics, and given the right situation, each works amazingly well.

System 1 can get a bad rap. It’s irrational, and it gets us into trouble sometimes. It weighs some pieces of information over others and it loves shortcuts. (Flaws in your System 1 thinking are why you can be fooled by optical illusions.) It also has a huge bias towards noticing and avoiding danger. While this generates plenty of false alarms and irrational fears (System 1 reacts emotionally to even seeing the word ‘crime’), sometimes you want to jump to conclusions.

As you were pondering your two-roads dilemma, if a semi truck happened to come roaring around the corner out of nowhere, you’d hope it wouldn’t take much analysis to dive out of the way. You could thank System 1 for letting you make that leap without waiting to find out the make and model of the truck as it bore down on you.

Luckily, the scenario I describe is theoretical. Besides, you would never have hiked out into the wilderness without GPS, an adequate food supply, and a backup power supply for your smartphone. Planning and preparation are what the Boy Scouts and System 2 have in common.

But let’s face it, System 1 is probably the real hero of the story. Without your impulses, emotions, and warm memories of the smell of pine, what’s the chance you’d actually marshal the energy to go out hiking in the wilderness in the first place?

The beauty of System 1 is that it’s there to remind you just how lazy you truly are. And as it’s done for countless generations before you, it’s there primarily to keep you alive.

The Morality Lag: Smartphones and Dumb Feelings

Your smartphone has more computing power than the computer that took Neil Armstrong and crew to the moon. And this is only one of the staggering technological advancements we’ve made in the last 50 years.

Have you ever wondered why technological advancements, a byproduct of the analytical brain, have far outrun our ability to create any kind of significant improvements in our emotional governance? Wars, murder, and mayhem have gone unabated for thousands of years, and yet this week Apple announced the introduction of a new iPhone with fingerprint recognition.

Back at the turn of the twentieth century, Mark Twain said something to the effect that any newspaper publisher, regardless of the era, could always bank on headlines like “Trouble in the Middle East” and “Revolution in South America.” Twain was uncannily right about the consistency of humanity’s inability to live and let live.

So why have our emotional brains hit a roadblock? Why haven’t we wiped out jealousy, avarice, and greed the way we knocked out polio? More importantly, when can we expect the next big upgrade to the emotional brain?

Sadly, not any day soon. The brain’s emotional decision-making powers have their roots firmly entwined in our most primitive survival instincts. There is a reason why neuroscientists refer to this part of the brain as our reptilian brain: we share this subcortical structure with the creatures found in the Florida Everglades. The drive to survive trumps just about every other kind of judgment, including the moral ones.

Individually, some may attain enlightenment, but it’s not exactly a product that can be monetized like Tang. And so collectively, we have a certain amount of cognitive dissonance about the way emotions wreak havoc on our everyday lives.

It feels much better for us to share in and highlight technological advancements. Take, for example, the second sentence in this post: “And this is only one of the staggering technological advancements we’ve made in the last 50 years.” I don’t know about you, but I didn’t have much to do with the creation of the smartphone.

Technological advancements are built on the backs of the very special few. Although we all enjoy the latest creations that come in the form of cameras that are phones or phones that are cameras, who among the unwashed masses is capable of developing the next great innovation?

Let’s face it, most of us are essentially still cave people. We may wear nicer clothes, but if a cataclysmic event rendered electricity, and therefore our microwave ovens, useless, we’d pay to have one of our ancient relatives from the Chauvet Cave in Southern France explain how to start a fire without propane or matches.

A handful of impressive achievements have altered the way we live our lives, but they haven’t fundamentally altered who we are. We are the same people who have waged war over territory and treasure since the beginning of recorded time. Our weapons may have improved, but our intentions haven’t.

It’s easy to conflate technological leaps with some improvement in human nature. The two have a connection, but they are by no means the same thing.

We often give ourselves credit for having evolved far more than we actually have. It’s theorized that our current brain system has been around for roughly 40,000 years, apparently without fundamental change.

Like the saying goes, “Same story, different day.”

One can confirm this by simply gazing at the next heartbreaking newspaper headline.

After all, it’s just one click away on your new smartphone.