A Fruit Gone Sour: The Demise of RIM and BlackBerry

Hey, remember BlackBerry?

In this day and age, it’s basically the smartphone equivalent of asking about digital watches or portable CD players. So it may be hard to remember that less than a decade ago, BlackBerry phones were at the technological forefront, a staple of the busy, the important, and the with-it. People joked about their BlackBerry addictions to the point where “CrackBerry” was Webster’s New World College Dictionary’s 2006 Word of the Year. In 2009, Fortune magazine named RIM, the makers of BlackBerry, the fastest-growing company in the world.

Today, you may still know a BlackBerry user, but it’s probably that eccentric friend who won’t throw away their video cassettes in case the VCR makes a comeback.

Have you ever wondered what happened?

Probably not. But hey, now that I brought it up, aren’t you curious?

RIM’s 1999 BlackBerry was revolutionary. In a time when cellphones weren’t good for much beyond making calls, here was a palm-sized PDA that could send and receive e-mails from anywhere. The network was secure, the battery lasted forever, and the little QWERTY keyboard meant you could tap out a message with nearly the efficiency of typing on a computer.

For a while, everything was going right for RIM. What happened? In a word, people.

Co-CEOs Mike Lazaridis and Jim Balsillie built a tech giant, but sadly they suffered from what is sometimes called “Founder’s syndrome.” Having scaled their way to the peak of the mountain, they failed to remember that landscapes change—especially in the fast-changing world of handheld electronics, where people on average replace their phones every two years.

On one hand, with hindsight on our side, it’s easy to condemn business leaders for failing to divine the future. On the other hand, RIM’s success caused Lazaridis and Balsillie to double down and stick their heads so far in the sand that their comments now make for surreal reading.

When PDAs in Asia began to offer color screens, Lazaridis insisted it was an impractical fad. “Do I need to read my e-mail in color?” he’s reported to have said.

“Cameraphones will be rejected by corporate users,” he stated in 2003.

In 2007, when Apple introduced a little gadget they were working on called an iPhone, Balsillie dismissed it as, “kind of one more entrant into an already very busy space with lots of choice for consumers … But in terms of a sort of a sea-change for BlackBerry, I would think that’s overstating it.”

Maybe in another company, someone might have stepped forward and delivered a wakeup call. But Lazaridis was notorious for only hiring people who thought like him. Lazaridis and Balsillie continued to insist their practical, workmanlike product had an impossible-to-beat foothold among businesspeople. How could a phone that wasted battery life on shiny new features elbow in on their territory? Who would tolerate the less user-friendly touchscreen keyboard of an iPhone?

“The most exciting mobile trend is full Qwerty keyboards,” Lazaridis said in 2008, of a feature they’d been offering for literally nine years. “I’m sorry, it really is. I’m not making this up.”

The public disagreed. The public disagreed pretty hard, as it turned out. That oh-so exciting keyboard feature became a shackle, cutting possible screen space in half and severely limiting what else the BlackBerry could offer. As more and more average consumers were enticed into the world of smartphones by the bells and whistles of the new generation, it altered the very definition of what a phone was supposed to be.

By the time even Lazaridis and Balsillie could no longer deny that change was needed, it was too late: they’d lost their edge, their voice of authority. When they finally started to offer their own touchscreens, it came with a feeling of sweaty desperation—and amazingly, their attempt at an iPad competitor didn’t even offer e-mail.

At its peak, BlackBerry stock was worth $230 a share. These days, it hovers around $10. You would probably be better off using that money to buy actual blackberries, which are delicious, full of antioxidants, and much less vulnerable to corporate hubris.

Neuroscience, Decisions, and Strippers

Decision making: there are countless books about it because, let’s face it, decisions are at the epicenter of what we humans do. Make the wrong choice and it can kill you, or at least cause a lot of sweat and tears.

One major crossroads for many involves mate selection. Some knock it out of the park—we’ve all seen the heartwarming stories of couples still in love after 50 years—and then there are the marriages that crumble after a few months, or even days.

So what can we learn from the long-term lovebirds? What’s their secret? How did they find each other? When you first meet someone, what are the telltale signs to look for and, perhaps more importantly, to avoid?

It’s classic advice column fodder, and people make a tidy living doling out their strategies for selection. But at the crucial moment, how much strategy is really involved?

In his book Incognito: The Secret Lives of the Brain, neuroscientist David Eagleman shares an unlikely experiment done in New Mexico.

Scientists were curious about how someone’s attraction response to a woman might be influenced by her fertility. It’s a tricky thing to study: how do you quantify something as ephemeral as human sexual chemistry? For these particular researchers, the answer lay in strip clubs. If the two things were connected, they hypothesized, a lap dancer’s nightly tips might ebb and flow with her menstrual cycle.

The results were surprising. Lap dancers during their peak fertility period earned a cool $68 a night. On evenings they were menstruating, their tips fell to $35, for a monthly average of about $53.

Those who were on the pill saw no such fluctuation. Instead, they averaged about $37 an evening.

What accounts for the difference? Of course, there’s no way to be sure. But Eagleman speculates it has to do with subtle changes in things like body odor, complexion, and waist-to-hip ratio. It might also involve the output of pheromones, those chemical signals linked to attraction, picked up subconsciously through the nose.

In other words, without realizing it, strip club patrons were primed to open their wallets and give more freely. They took their cues from the most primitive parts of their brains, hardwired over the generations to notice potential mates with the greatest likelihood of producing offspring.

No rational decision-making was at work, no reference to a conscious list of preferred attributes. Consciousness wasn’t even invited to the party.

What does all this mean? Well, if you’re a lap dancer relying on those tips, it means doubling up on your shift during peak fertility and maybe looking at alternate forms of birth control.

If you’re a man trying to pick up women, it means you might want to second-guess that gut instinct. Ask yourself, ‘who’s driving?’ It might not be who you think.

The Empathy Switch: Binary Selection in Action

A number of years ago, I had the amazing good fortune to meet the legendary blues piano man Cornbread Harris. Though severely hampered by arthritis, this 86-year-old phenom still makes his living playing some of the most soulful music you’ll ever hear.

Although he didn’t—and still doesn’t—teach piano, I managed to talk him into giving me a few lessons. As you might expect, I ended up learning far more from Cornbread than just piano music.

Once, I showed up at his house on a particularly bone-chilling Minnesota winter afternoon, knocked on his door and waited. And waited. And waited.

After what seemed like an eternity, Cornbread finally opened the door. I rushed inside only to find Cornbread clad in nothing but a pair of boxer shorts. He summed it up thusly: “Hey, I can either open the door or put on my pants, I can’t do both.”

What Cornbread had articulated was a perfect example of binary selection.

One of the reasons it’s so hard sometimes to order off a menu or pick a wall color is that the human brain is only designed to evaluate two options at a time.

It’s a simple concept—not necessarily graceful, but it gets the job done. This is a well-known principle in human brain evolution, where limited storage and electrical voltage make ‘good the enemy of better.’ Natural selection doesn’t ensure the most elegant solution, just one that keeps you alive long enough to pass on your genes. The primitive notion of fight or flight is but one example.
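To make that two-at-a-time bottleneck concrete, here is a minimal sketch in Python (purely illustrative; the menu and the preference rule are invented) of how a chooser that only ever compares two options at once can still work its way through an entire menu:

```python
# Toy illustration of binary selection: pick one item from many options
# while only ever holding two of them "in mind" at a time.
# The preference rule below is invented purely for demonstration.

def prefer(a: str, b: str) -> str:
    """Stand-in for the brain's two-way judgment: here, the shorter name wins."""
    return a if len(a) <= len(b) else b

def choose(options: list[str]) -> str:
    """Reduce a whole menu to a single choice via repeated pairwise comparisons."""
    winner = options[0]
    for challenger in options[1:]:
        winner = prefer(winner, challenger)  # never more than two options compared
    return winner

menu = ["fish", "chicken", "eggplant parmesan", "minestrone soup"]
print(choose(menu))  # -> "fish"
```

The comparator never sees the whole menu at once; it just runs the same two-way contest over and over, which is roughly the kind of good-enough shortcut described above.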

Neuroscientists at the University of Valencia recently used fMRI technology to demonstrate that the responses for empathy and violence share the exact same neural circuitry. In the same way a piece of train track can only accommodate one locomotive at a time, your brain can trigger empathy or violence, but they are mutually exclusive.

This has wide-ranging implications. During the moment an individual chooses to make an attack, they literally can’t access empathetic feelings. Conversely, someone acting on empathy is briefly incapable of attacking.

Why these two conflicting impulses would share the same circuitry is hard to say. But what is clear is that the teaching and practice of empathy is of vital importance to a society torn apart by aggression.

Empathy, under this scenario, doesn’t just blunt bloodlust; it actually shuts down the possibility of violent behavior altogether. Such is the beauty of a binary system in action.

Fish or chicken. Pants or door. Understanding or violence.

Choose wisely.

The Twain Brain, or, Why Smart People Do Stupid Things

“A man who carries a cat by the tail learns something he can learn in no other way.” So said Mark Twain, printer, steamboat pilot, journalist, failed miner, entrepreneur, abolitionist, lecturer and supreme humorist. Twain is perhaps the greatest American storyteller and writer ever produced by the fifty states.

Whether attacking kings, hypocrites, or the literary offenses of Fenimore Cooper, Twain was famous for his razor-sharp wit and his allergy to BS. Yet that same man drove himself into bankruptcy, ignoring the counsel of his most trusted pals in favor of pouring his fortune into a series of disastrous inventions. He once invested the equivalent of 8 million dollars in the Paige typesetting machine, a marvel of engineering that dazzled crowds but constantly broke down and was rendered obsolete by the simpler Linotype of the mid-1880s.

So why did a man renowned for not suffering fools pour his fortune into one fool’s errand after another? Could it be that Twain, like the rest of us, was seduced by a “magic bullet”? That wholly American notion that there is a shortcut out there to unmitigated wealth and happiness.

Whether it’s a diet pill (all-natural, of course) or a potion to restore skin texture to that of a six-week-old baby (all-natural, of course) or a book that promises to create a state of nirvana (no artificial additives) or a new-fangled typesetter machine, many of us are suckers for the shortcut.

We love the easy road, the secret sauce, or that ultimate financial tip (see Martha Stewart). In 2012, Americans spent a total of $78 billion on lottery tickets.

Our brains love shortcuts. The most primitive, basic parts of our brains are wired for them. Although these shortcuts lack precision and can create real problems, their saving grace is efficiency.

Still, that efficiency comes at a price. Take optimism bias, the unfounded belief that things will turn out much better than the facts warrant.

It’s what allows smokers to believe they won’t get cancer, dieters to start their diet in the future, and all of us to procrastinate on our work because, as Twain noted, “Never put off till tomorrow what you can do the day after tomorrow.”

Even the great Twain fell victim to optimism bias as he traveled down what he thought was a shortcut to financial independence through a prototype typesetting machine. The Paige typesetter was reported to possess more problems than actual moving parts, of which it had more than 18,000.

Ironically, many suspect that had Twain put more energy in writing and less in his pet get-rich-quick scheme, he would have gotten rich much faster, and with a whole lot less heartache.

But Twain was plagued with one incurable problem: a human brain. If reasoning is currency, then biases and shortcuts are what the primitive brain trades in. And that brain is where the action is.

Perhaps rather than seeing biases and shortcuts as system flaws, we should instead celebrate that which makes our brains so unique and ‘predictably irrational.’

No one summed it up better than Mark Twain.

“Don’t part with your illusions. When they are gone, you may still exist, but you have ceased to live.”

What Your Grandma and Corporations Have in Common

Imagine your grandma just celebrated her 85th birthday. She’s beginning to forget things, but her doctor has reassured you that since she can eventually pull up the information if given enough time, it’s probably not Alzheimer’s.

You take some comfort in a recent study suggesting that memory retrieval problems might be less about deteriorating synaptic connections and more about space.

Think of Grandma’s hippocampus like a library that, over time, has run out of shelving for the books. As they begin to pile up on the floor, the librarian can still find that edition of Twain’s Huckleberry Finn you’re after, but it takes a little more time to scour all the nooks and crannies of the library to locate it. The same might be true for the hippocampus, the memory library.

In any event, modern medicine has been good to Grandma. The old family general practitioner has been replaced by a whole bevy of doctors who specialize in any number of medical fields. She’s got her heart specialist, her eye, ear, nose and throat doctor, her podiatrist, her diabetes doctor, her osteopath, and so on.

As a result, she finds herself traveling a regular circuit of doctors, each dedicated to improving the quality of her life and each taking advantage of the latest discoveries in pharmaceutical science.

Pharmaceutical science, like all science, operates on the principle of reductionism—in essence, that the key to solving problems is to break them down into their smallest components and observe cause and effect. Molecular biology, and thus virtually every modern drug, is the result of this process. This systematic approach has literally built the technological world of modern humans.

There is one key problem with this approach. When you begin to examine complex systems like the human body, the reductionist technique begins to falter. Humans are composed of a myriad of structures that interact with and depend on each other. The tangle of where one system begins and another ends is difficult to understand, let alone observe.

For this reason, it makes more sense to understand a human being not as a series of mini structures or systems, but as one giant complex system. We need to think holistically.

When the heart doctor prescribes a heart medication, unless he knows what all the other specialty doctors have prescribed Grandma, and further understands the dynamic effect that might be created through the intermingling of medications, he might be setting her up for catastrophe and a trip to the ER. All of this despite his best intentions.

This is the peril of not recognizing that in a complex system, cause and effect relationships with other parts of the system can be significantly delayed and mask the dangers of your actions. Furthermore, the fact that internal organs are connected means the medication Grandma’s taken doesn’t necessarily move through her system in an isolated or linear fashion. Her heart medication might affect her heart, other medications, and/or other organs in unpredictable ways.

The effects of a medication can travel through the body like a metastasizing cancer, moving out in all directions simultaneously. The net result shows up as a cascading series of outcomes, leaving the simple, reductionist-driven ER doc in its wake.
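If it helps to see the arithmetic, here is a hypothetical toy model in Python (the drug names, numbers, and interaction term are all invented) of how prescriptions that look safe one specialist at a time can combine into something the reductionist view never predicted:

```python
# Hypothetical toy model (drug names, numbers, and the interaction term
# are all invented) of a system-level effect no single specialist sees.

# Each drug's effect on blood pressure when considered in isolation (mmHg).
isolated_effect = {"heart_med": -10, "diabetes_med": -5, "pain_med": -5}

# Nonlinear interaction: this pair amplifies each other when taken together.
interaction = {("heart_med", "pain_med"): -20}

def combined_effect(drugs):
    """Reductionist sum of the parts, plus the interaction the parts hide."""
    total = sum(isolated_effect[d] for d in drugs)
    for (a, b), extra in interaction.items():
        if a in drugs and b in drugs:
            total += extra
    return total

prescriptions = ["heart_med", "diabetes_med", "pain_med"]
print(sum(isolated_effect[d] for d in prescriptions))  # what each chart predicts: -20
print(combined_effect(prescriptions))                  # what Grandma's body gets: -40
```

Each doctor’s local math checks out; it’s the interaction term, which no single specialist is watching, that sends Grandma to the ER.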

Like Grandma’s body, today’s corporations have their own dizzying structure of interdependencies. Sales, marketing, manufacturing, logistics, human resources, IT, and a slew of other departments abound, with more bound to come on the heels of new technological developments.

In many corporations, the depth of departmental interconnectivity and dependency is not completely recognized or understood, just like Grandma’s specialists don’t always understand the compounding effect of their actions in relationship to the body as a whole.

The nonlinear aspects of complex systems and delayed cause-and-effect loops can doom a company in the same way Grandma’s new heart medication may negatively impact her other medications. The end result can put Grandma in the ER, and a corporation on its back.


Stay tuned for Part 2. Next week we’ll learn the tricks to keeping your Grandma—and your favorite corporation—alive.

The Experiencing Self, or, Why Present-You Hates Past-You

Unlike Calvin here, most of us will probably never get the opportunity to have a face-to-face conversation between our current selves and our 6:30 selves, so to speak. That is a shame, because as Daniel Kahneman discusses in Thinking, Fast and Slow, the two of them don’t necessarily know that much about each other.

In trying to understand how the brain registers emotion, Kahneman outlines the divide between what he calls the “experiencing self” and the “remembering self.” The experiencing self, or “8:30 Calvin,” if you will, is you in the present. All it knows is whether or not you are having a good time in the moment. The remembering self, or “6:30 Calvin,” on the other hand, looks back and tries to sum up your overall impressions of past events.

Which Calvin holds more sway in your judgments? That prize goes to 6:30 Calvin. It makes a certain amount of sense. The trouble with living “in the now” is that every second is a different now. Your in-the-moment perceptions are in constant flux. Your remembering self, on the other hand, is a much more fixed point. Besides, the vast majority of the information in your head is not things you’re discovering in the moment, but feelings and data you’ve gradually built up over your whole life.

Unfortunately, 6:30 Calvin doesn’t always know what he’s talking about.

In the past, we’ve discussed the peak-end rule, where our take-home memory of an event puts way too much weight on the most extreme moment, and also on whatever happened at the very end. (Every stage actor knows that you have to bring your A-game to the final scene.) Our most recent recollection can color the rest of our information to a hilarious degree. In one study, people were asked to judge their own life satisfaction. But first, they had to make a photocopy. Half the participants then “discovered” a carefully planted dime in the photocopy room. The simple minor victory of having just gained a free ten cents had a noticeable impact on how they assessed the overall happiness of their lives.

Then, there’s duration neglect. 6:30 Calvin has no way to accurately record time.

To see these effects in action, we need look no further than Kahneman’s “cold hand” study.

First, experimenters asked people to plunge their hand into very cold water for 60 seconds. As you might imagine, this is not the most pleasant activity. (From an experimental-design standpoint, it’s a good way to administer an easy-to-measure but harmless pain. From a “bored scientists hanging around the lab” standpoint, it’s probably also a decent dare.)

The subjects were allowed to briefly warm and dry their hands with a towel, and to presumably take a moment to ponder the sacrifices we all make for scientific knowledge, and whether or not it’s worth it to hurt yourself on purpose while some schmuck or schmuckette stands over you with a clipboard.

Then, it was back into the cold water. This time, the subjects got 60 seconds just as before, but immediately followed by 30 seconds in water that was exactly one degree warmer.

Told they had to undergo one more dunking, the subjects then had to decide: did they want to relive Option A or Option B?

Keep in mind: Option B is just Option A with 30 extra seconds of slightly less-painful pain. (A total of 90 seconds in cold water.) So surely it will come as no surprise to know that people overwhelmingly chose…Option B. They were swayed by the recollection of the pain lessening over the last 30 seconds (hello, peak-end rule), and while each second in the cold water had probably felt like an eternity as it was happening, the remembering self couldn’t make the distinction.
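For the curious, the peak-end arithmetic is easy to sketch. In this Python toy (the pain scores are invented: call the cold minute an 8 out of 10 and the slightly warmer half-minute a 6), the experiencing self tallies every second, while the remembering self keeps only the worst moment and the last one:

```python
# Peak-end sketch of the cold-hand study. Pain scores are invented:
# the very cold minute is an 8 out of 10, the slightly warmer
# 30 seconds a 6 out of 10.

option_a = [8] * 60              # 60 seconds of cold water
option_b = [8] * 60 + [6] * 30   # same 60 seconds, plus 30 gentler ones

def experienced(pain):
    """What the experiencing self endures: every painful second counts."""
    return sum(pain)

def remembered(pain):
    """Peak-end rule: memory keeps roughly the worst moment and the last one."""
    return (max(pain) + pain[-1]) / 2

print(experienced(option_a), experienced(option_b))  # 480 660 -> B is objectively worse
print(remembered(option_a), remembered(option_b))    # 8.0 7.0 -> but B is remembered as better
```

By total suffering, Option A wins easily; by peak-and-end memory, Option B feels better, which is exactly what the subjects chose.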

The remembering self doesn’t care about 60 seconds vs 90 seconds. “What’s the difference to me?” says the remembering self. “I’m talking to you from the past, and in the vast scheme of your life, 30 seconds are nothing.” Sure, it means a little extra pain in the moment, but the remembering self doesn’t worry about the moment. “Not my department,” says the remembering self with a shrug, passing the buck in a scene familiar to anyone who’s ever worked in a company with multiple employees. “It’s someone else’s problem.”

Unfortunately for you, that “someone else” is…your experiencing self.

If you’ve read the classic Calvin and Hobbes strip above, you know that what follows is a whole lot of arguing. Just one more peril of time travel…


The Science of Epiphany

You know the sweet satisfaction when you suddenly have an epiphany? I’m talking about that “Aha!” moment when the circuits suddenly connect and, seemingly out of nowhere, you are struck with an insight.

Today, using fMRI technology, neuroscientists can watch a revelation unfold in the brain. Activity begins to cluster and speed up, eventually giving way to a burst of energy not unlike a mini fireworks show. All this can be witnessed by the fMRI technician about eight seconds before the subject is aware of their impending moment of truth.

So how does this all work?

First, it’s important to differentiate between an actual Eureka moment and a more mundane retrieval of information from your hippocampus, that general purpose library of memories.

Insights are not merely rediscovering misplaced data, like suddenly remembering where your car keys are. They are combinations or reinterpretations of information, creating something entirely different or new. They are the embodiment of what it means to “think outside the box.”

It starts with consciously trying to solve a problem. Then there is the required period of struggle, hitting the proverbial brick wall with no solution in sight. Take the classic father and son riddle:

A father and his son are in a car accident. The father dies at the scene and the son is rushed to the hospital. At the hospital, the surgeon takes one look at the boy and says, “I can’t operate on this child, he’s my son.” How can this be???

This brainteaser plays on the fact that some readers will automatically assume the surgeon is male.

Suppose you are one of those people whose gender bias prevented you from seeing the answer right away. Even though your prefrontal cortex might be stumped, unbeknownst to you, your subconscious brain is still working overtime trying to figure it out.

Interestingly, it seems that when your prefrontal cortex hits an impasse, it triggers other brain functions to kick into gear. This sets up the opportunity for free association by bypassing your analytical train of thought in favor of the hippocampus’s vast storage of information, feeling, and experience.

Your subconscious brain essentially goes into improvisational mode, and what we call daydreaming is actually this freewheeling engine hard at work. This is a critical aspect of the epiphany process for every one of us, from the average Joe or Jane in the street to Albert Einstein. (Einstein called his daydreaming “thought experiments.”)

Because all this business is going on below your awareness, when the solution floats up into your rational mind fully formulated, it feels as if it came out of nowhere.

Only the conscious brain has language. This is probably a good thing because if your subconscious brain could talk, it might very well demand a thank you, or at the very least an “I told you so”.

The Brain’s Allergy to the Big Picture

Do you suffer from Systems Blindness? You almost certainly do.

The problem is that your brain’s hardwiring is designed primarily to keep you alive. Which is fair. But as a result, we specialize in snap judgments.

Our living strategy is largely built on using association to connect causes and effects, which in turn drives our decision-making. See a school bully in action and we go out of our way to avoid him. Watch a fellow office worker grow lean through jogging and we might be tempted to hit the pavement ourselves in the morning. In short, we observe, draw inferences and plot our course. This strategy has served humanity well; after all, there are over 7 billion of us on the planet.

Individually, we are amazing at making day-to-day decisions that afford us a certain amount of comfort. But what happens when our comfort is besieged by a huge, unnervingly complicated system like weather or traffic? Here is where Daniel Goleman, in his new book Focus: The Hidden Driver of Excellence, weighs in.

Take traffic, for example. When we’re stuck in rush hour, we might be tempted to think the way my diminutive little grandmother used to: “Why don’t all these damn people stay home?” Aside from the fact that, by her very presence, she is contributing to the problem, this is probably an issue of oversimplification. Too many people = traffic jam.

We might be tempted to answer my grandmother with, “What we really need is more roadways.” Engaging in this kind of reasoning is known as the “illusion of explanatory depth,” Goleman explains. “…we feel confidence in our understanding of a complex system, but in reality have just superficial knowledge.”

We don’t realize, for instance, that access to new highways can energize nearby industry, which can grow communities, which in turn supports restaurants, shopping, and recreation, thereby attracting even more families, which puts more people like my grandmother on the road, which of course means more traffic jams.
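That reinforcing loop can be sketched in a few lines of Python (every number here is invented; the point is the shape of the feedback, not a real traffic model):

```python
# Toy feedback loop (all numbers invented): adding road capacity invites
# more driving, so the congestion relief keeps evaporating.

capacity = 100.0  # cars per hour the roads comfortably carry
demand = 95.0     # cars per hour that want to be on the road

for year in range(1, 6):
    if demand / capacity > 0.9:  # planners respond to jams by adding lanes
        capacity *= 1.10
    demand *= 1.08               # growth around the new roads adds drivers
    print(f"year {year}: congestion = {demand / capacity:.2f}")
```

Capacity keeps growing, yet congestion never really goes away, because each fix invites the growth that cancels it out. That is the part of the story the “just build more roads” intuition misses.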

Our brains understand cause and effect at a local level, but as the causes and effects grow larger and more distant, our reasoning suffers. The effect of slamming your fingers in a car door is pretty immediate: the amygdala, the fear center of your brain, fires off a warning and your sensory system administers a shot of pain. Global warming, on the other hand, operates on an almost impossible level of remove.

We are designed to create short-term solutions, and as our societies have grown larger and more complex, systems blindness becomes increasingly dangerous. I can’t see the seas rise as the result of a carbon-loaded atmosphere, so I don’t merely dismiss the impending long-term threat; my amygdala stays as complacent as the Maytag repairman.

Luckily, as our ability to generate and analyze large quantities of data has improved, our awareness of systems is growing too. Google’s foray into plotting flu epidemics is but one example.

Hopefully the more data mining that takes place for things like global climate change, the more each of us will begin to consider that we are part of something far larger than the hunk of land on which we live and drive. In the meantime, in regards to our carbon footprint, maybe my grandmother was onto something. “Why don’t all these damn people stay home?”

Einstein, Allie Brosh, and the Secret to Procrastinating With Style

When you contemplate your life, wondering what it means to be alive, it’s unlikely the first thing that comes to mind is ‘office work.’

And yet the life you lead at your desk arguably occupies a great deal of mental real estate. The sheer number of hours typically spent at work guarantees that the office and all it entails is fundamental to understanding and explaining the big picture of your life.

Work may or may not bring out the best in us, depending on our tasks and whether we are able to get into flow as defined by Csikszentmihalyi. But observation suggests there is one constant in human behavior you can expect to see wherever you find a shantytown of office cubicles.

The term was coined by Allie Brosh, of Hyperbole and a Half fame. In a recent interview with Terry Gross, Brosh explained how she started her now-famous internet comic when she was supposed to be studying for finals: “I’m laterally productive. I will do productive things, but never the thing that I’m supposed to be doing.”

The elegance of lateral productivity is it allows you to put off tasks indefinitely without guilt. After all, you aren’t loafing around accomplishing nothing. You’re working hard! You don’t have time to file those reports.

It’s Einstein’s theory writ small: two objects can’t occupy the same space at the same time. In this case, two tasks can’t occupy the same brain space, since the human brain is notoriously poor at multi-tasking.

Lateral productivity allows you to play the hero to your self-imposed task villain. You find yourself in a kind of self-perpetuating state of activity, trapped in some Escheresque landscape where you’re diligently drawing the next set of stairs right before you ascend or descend said stairway. Technically, you’re moving, but there’s no forward progress to be had.

This art form is practiced by workers and managers alike. Lateral productivity can go by many other aliases, including “special side project.” The naming lends a measure of credibility. Throw in some metrics, build some color graphs, wrap it up in a PowerPoint presentation and you’ve got the makings of an entirely new task, which at some point lateral productivity will force you to abandon for something else.

Perhaps the best spokesman for lateral productivity was the great New York Yankees catcher Yogi Berra. Driving around one day, unsure of his location or the route to his destination, he reported, “We may be lost, but we’re making great time.”

Are You Smarter Than a Mouse?

Are you smarter than a mouse? This was one of the intriguing topics presented at 2013’s Society for Neuroscience conference in San Diego, on research done by J.F. Gysner, M. Manglani, N. Escalona, R. Hamilton, M. Taylor, J. Paffman, E. Johnson, and L.A. Gabel, all based out of Lafayette College in Easton, Pennsylvania.

If you are a lab mouse, then you are undoubtedly familiar with mazes. Specifically, you’ve probably logged some time in a Hebb-Williams maze. For decades, it’s been the go-to research model: a spatial-visual maze that centers on twelve standard problems, which differ based on the learning/memory task researchers have assigned to you and your rodent buddies.

But the Hebb-Williams maze is not solely reserved for our tiny rodent friends. Its friendly confines have also been used to test the mettle of ‘rats, cats, rabbits, ferrets, mice, and monkeys.’

The Lafayette College team had a few questions on their minds. Would it alter test results to use a virtual model instead? And if not, could they run humans through the simulation and compare their performances against mice?

Clearly, a virtual maze is far more desirable in terms of space and construction costs. Also, it’s not nearly as problematic as shrinking humans down to fit into a mouse maze. (Which, for one thing, opens itself up to all manner of tired movie plots.)

Ninety-eight humans, both male and female, participated in the experiment. The study focused on two age groups: children aged 8-12, and young adults aged 18-21. The participants were screened and evaluated on their video game knowledge to eliminate any pre-trial skill biases.

In order to ensure that chocolate pellets would be enough of an incentive to run the maze, researchers restricted the mice’s food until they dropped to 85% of their normal body weight. (Apparently the humans needed no coercion to run the virtual maze for chocolate pellets.)

Ultimately, when it came to the final showdown, humans from both age groups were faster and less prone to mistakes than their small furry counterparts. However, after controlling for species differences, the humans and mice performed “similarly,” suggesting their performance could be compared in future experiments.

Additionally, it turns out that using a computer-generated maze on humans did not alter their results. This was particularly good news for the Lafayette researchers, but perhaps not such a boon for the would-be producers of Honey I Shrunk the Kids 3.

So lucky for your self-esteem, it turns out you are smarter than a mouse, at least where maze-running is concerned. That is, until the playing field is leveled and then, well, say hello to your new competitors, the irrepressible Mickey and Minnie.