Analogous Thinking: The Genius of Mix-and-Match

Want to be more creative? One promising path to creativity is what’s called analogous thinking. The idea is that you can often reach the best solution to one problem by looking at what already exists in a different but somewhat related field. By adapting elements of those works to your own purposes, you can create something new, exciting, and effective.

It’s estimated that “perhaps 80% of creative ideas” follow this pattern. The key is clever recombination.

Years ago, composer, actor, and rapper Lin-Manuel Miranda noticed striking parallels between Ron Chernow’s biography of Alexander Hamilton and the classic hip-hop narrative. The result was Hamilton, a hip-hop biographical musical narrated by Aaron Burr, Hamilton’s political rival and eventual killer. It’s a fresh take on the monumental genius and equally monumental egos of our founding fathers, capturing the 18th-century revolutionary spirit for new generations of Americans.

(Miranda is not the first to find grist for musicals in unlikely places; as creativity expert Gary A. Davis points out, the Broadway juggernaut Cats was inspired by T.S. Eliot’s Old Possum’s Book of Practical Cats. But Miranda may be the first to reimagine George Washington’s cabinet meetings as vitriolic rap battles between Alexander Hamilton and Thomas Jefferson.)

Analogous thinking can also draw on nature. In the 1990s, engineer and bird-watcher Eiji Nakatsu of Japan was working for the rail company JR West when he observed a kingfisher diving for fish while barely disturbing the water. Realizing that the bird’s head and beak shape made for incredible aerodynamics, Nakatsu modeled the front train car after the bird to create a quieter, faster bullet train.

The shape of Pringles potato chips was inspired by how tightly and compactly wet leaves can stack on each other. (If you prefer a heartier, more old-fashioned chip, you may be able to find other parallels between Pringles and dead leaves.)

The ubiquitous fastening material Velcro, beloved by toddlers everywhere, came out of a fateful 1941 trip in the Alps, when Swiss engineer George de Mestral and his dog got covered in burdock thistles. When he got home, de Mestral was surprised by the thistles’ sticking power. Intrigued, he grabbed an old microscope and magnified a sample. The burr was covered in tiny hooks that stuck to the natural loops created by fabric or fur.

De Mestral said he instantly recognized the analogy between thistles and clothing. He’s quoted as recounting his reaction at the time like so: “I will design a unique, two-sided fastener, one side with stiff hooks like burrs and the other side with soft loops like the fabric of my pants. I will call my invention ‘Velcro,’ a combination of the word ‘velour’ and ‘crochet.’ It will rival the zipper in its ability to fasten.”

Whether this was his exact thought process, or, more likely, how he remembered it later (keeping in mind that memory is not a perfect recorder of events), the key point is that de Mestral’s analogous brain made the link.

As in all good invention stories, his bold challenge to the tried-and-true zipper initially brought him shame and ridicule. De Mestral, however, was undaunted, and spent years employing top-down problem-solving, working through a variety of material applications like an Edison lab, until, through trial and error, he hit upon nylon sewn under infrared light as the perfect hooks for his artificial burrs. He patented the idea in 1951 and never looked back as Velcro went on to become a multimillion-dollar business.

So the next time you’re feeling stuck, it might be time to find a local park of your own. You never know what new burrs are waiting to be discovered.

Creativity Crunching: How Do We Measure Human Ingenuity?

Scientists trying to get to the core of creativity encounter a very basic problem at the outset. Unlike, say, size, or time, creativity is extremely difficult to measure. It’s even difficult to define.

Frequently, people searching for a creativity-judging metric focus on what’s called “divergent thinking”. This is the ability to come up with a large number of solutions to a given problem. It’s the “no wrong answers” school of brainstorming. Divergent thinking is all about casting the widest possible net, and then gauging success from overall net size.

There are arguably some benefits to this approach. For one thing, since divergent thinking concerns itself with the sheer number of ideas generated, measuring it is as simple as counting.

What do these experiments look like? Imagine someone hands you a paper cup and asks you to think of as many uses for that cup as you can. Someone with a knack for divergent thinking would be off and running: a drinking vessel, a fly catcher, a drain stop, a place to store crayons, a hat, and so on. One of the standard divergent testing questions is, “How many uses can you devise for a brick?”
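Scoring such a test really is little more than counting distinct responses. Here’s a minimal sketch of a fluency score, with a hypothetical response list (the function name and responses are invented for illustration):

```python
def fluency_score(responses):
    """Score a divergent-thinking test by counting distinct, non-empty ideas."""
    # Normalize case and whitespace so "A hat" and "a hat " count only once.
    cleaned = {r.strip().lower() for r in responses if r.strip()}
    return len(cleaned)

cup_uses = ["drinking vessel", "fly catcher", "drain stop",
            "crayon holder", "a hat", "A hat "]
print(fluency_score(cup_uses))  # the duplicate "a hat" counts only once
```

Real scoring schemes often also rate originality and elaboration, but fluency, the headline number, is just this tally.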

This approach lets scientists easily assign scores to large groups of test subjects, generating huge amounts of easy-to-interpret data. Assuming, of course, that divergent thinking is a useful lens for examining creativity in the first place.

If you’ve ever walked out of a “no wrong answers” brainstorming session feeling unsatisfied, you may already grasp the controversy at play here.

Some scientists dismiss the relevance of divergent thinking, arguing that, at the very least, it’s not a useful way to assess a person’s creativity. Sure, it’s easy to compare one person’s score to another, but it’s difficult to prove that high scorers here are more creative in real-life situations.

For one thing, divergent thinking tests don’t seem to have any correlation with a person’s future creativity. There isn’t much evidence that finding many uses for a brick one day translates into any creative advantage later in life.

And outside of a testing facility, most of the time, solutions only count as solutions if they’re actually useful. In other words, a paper cup would make a terrible hat.

In addition, most people would agree that when we judge a person’s creative output, quality trumps quantity. Originality or novelty is considered an essential part of the mix. Judging someone’s creativity only by their number of ideas is like saying that Nora Roberts, who has published a massive number of books—over 200—is a more creative writer than Maya Angelou.

To add another wrinkle, University of Iowa neuroscientist Nancy C. Andreasen suggests that the human race might owe far more of its creative achievements to convergent thinking, the direct opposite of the divergent approach. Convergent thinking doesn’t concern itself with finding a lot of answers, but with winnowing down to the single best solution. “A process,” she notes in an article in The Atlantic, “that led to Newton’s recognition of the physical formulae underlying gravity, and Einstein’s recognition that E = mc².” But nobody is clamoring to test for convergent thinking. It’s tough to know just how to tally it.

Creativity and IQ: What 1500 Kids Can Teach Us

How does your IQ affect your creativity? One might assume that a super high IQ would grant you more powerful creative flights of fancy, and more control over the process, whether top-down or bottom-up. But as is so often the case with preconceived ideas, things are not always what they seem.

We can trace the American fixation on IQ back to the beginning of our involvement in World War One. The U.S. War Department was searching for ways to rank their recruits by intelligence, and to identify who would be best suited for which jobs, from scouts to officers. For help in making these judgments, the military turned to psychologists like Lewis Terman of Stanford University.

Terman had tweaked an intelligence test devised by the famed French psychologist Alfred Binet to create a new version called the Stanford-Binet Intelligence Scales. He initially promoted this as a tool for classifying developmentally disabled children, but the U.S. military was so impressed with Terman’s work that it hired him and six others to create the “Army Alpha,” an assessment test administered to 1.7 million GIs.

Since there was no other widely circulated intelligence test at the time to use as a benchmark, it’s hard to measure the test’s net effect. However, the Allies went on to win the war, and Terman went on to screen children for signs of “genius level” IQ.

Several years later, Terman used these screening results to kick off a study aimed at understanding the wide-ranging effects of “genius”. (Eventually, the study would abandon the emotionally charged—and difficult to quantify—label “genius” in favor of “gifted.”) He began in 1921 at Stanford. Terman looked at 1500 children, male and female, tracking everything from their developmental progress and play interests to their medical condition, how much they read, and how many books were available to them at home. Then he continued to periodically check in with those same individuals throughout their lives.

An early example of a longitudinal study, it’s also the longest-running of its kind and still continuing today—to be concluded at the death of its final subject.

This work eventually begat Terman’s multivolume Genetic Studies of Genius, considered a seminal document in American psychology. (That’s not to say that Terman’s scholarship all holds up by today’s standards; in testing across cultural and racial groups he reached many conclusions that are unquestionably racist.)

However, he did debunk a number of then-common misconceptions about high-IQ children: his research showed them not to be physically frail or socially maladjusted. In an era when parents often held their children back a grade to prevent the kid from being the youngest in their class, Terman found that being the youngest in a class was, in fact, a predictor of a high IQ.

For our purposes, Terman’s most interesting result concerns creativity.

To the extent it could be measured, Terman found that the 1500 study subjects did make an above-average number of societal contributions in creative fields. This, on the face of it, would suggest that a high IQ delivers a key creative boost, but in a separate study, sociologist Pitirim Sorokin showed that a random group of children coming from equivalent socio-economic backgrounds would do just as well. This would seem to indicate that environment plays a larger role, which makes sense: if you’re not sure where your next meal is coming from, you’re less likely to give, say, oil painting, your full focus.

Faced with the data, even Terman had to admit: “We have seen that intellect and achievement are far from perfectly correlated.” This sentiment is echoed by many other studies: abnormally high IQ is no guarantee of academic achievement or high creative output. It’s been suggested that Terman’s research supports what’s called the Threshold Theory, which states that an IQ of 120 (above average but certainly nothing extraordinary) is enough to achieve “creative genius”. Anything above that point doesn’t seem enormously helpful to the individual.

This seems to suggest that creative thinking is well within the reach of a large number of people, given hard work, focus, and a certain dose of luck.

Consciousness: The Signal in the Noise

In February’s blog post, How Your Brain Is Like an Ant Colony, we discussed how neural networks follow the concept of emergence: when it comes to connections between neurons, much of the order arises from neurons organizing themselves, without top-down direction.

Arne Dietrich, the author of How Creativity Happens in the Brain, writes that some of those networks are hardwired and some are flexible and built in the moment. The factors that determine the strength and intensity of a neural network include “a person’s unique past experience, opinions, preferences, and expertise.” He explains that, in the same way “lightning follows the path of least resistance,” the strongest connections send the fastest signals, taking over brain regions in a phenomenon called “spreading activation.”

The lack of an overall leader makes ant colonies fascinating. But if our own thoughts (activated neural networks) are all just a matter of signal strength, what is the self? How does self-awareness arise?

Oliver Selfridge, a pioneer of artificial intelligence, also known as the “Father of Machine Perception,” posited the seminal idea back in 1959 that it’s basically pandemonium among neural networks until one dominates consciousness, albeit temporarily. This is known, not surprisingly, as the Selfridge Pandemonium model. 

Imagine a diverse group of neural networks. Each network is competing to be heard, to send the strongest signal and thus show up in your working memory and achieve conscious awareness. When you find yourself suddenly thinking about baseball, then your baseball neural circuit won out over, say, a thought about the long-term effects of climate change.

All the while, the brain’s executive control, or EC, exerts pressure from above. This may sound like top-down leadership, but keep in mind that in this case, executive control is not “control” in the classic sense: no part of your EC is consciously weighing the merits of these networks and making informed decisions.

The only metric this process runs on is whether or not a network has been strong enough to broadcast its signal in the past. If there’s a signal that’s succeeded many times before, the EC adds to the effect by suppressing and quieting other competing neural networks. So if you’ve been thinking a lot about the upcoming World Series, your brain is likely to stay with that theme and Executive Control is tamping down thoughts about climate change, politics and what’s on the menu for dinner tonight.

The more often a neural network shows up in your working memory, the more powerful the network becomes, and the more likely it will be called back for an encore. This kind of encore biasing is important because only one neural network is going to rule your conscious airwaves.

This is known as the all-important “frequency of occurrence,” a concept at the heart of Hebb’s rule, “neurons that fire together wire together.” When these neuron networks reach critical mass and show up in working memory, the signal is distributed to all areas of the brain.
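The competition-plus-reinforcement loop described above can be sketched in a few lines of purely illustrative Python: networks broadcast noisy signals, the loudest one wins working memory, and each win nudges that network’s strength upward, Hebb-style, making an encore more likely. The network names, starting strengths, and learning rate are all invented for the sketch:

```python
import random

# Hypothetical competing neural networks and their starting signal strengths.
strengths = {"baseball": 1.0, "climate change": 0.9, "dinner menu": 0.8}

LEARNING_RATE = 0.1  # how much each "win" reinforces a network (Hebb's rule)

for _ in range(20):
    # Each network broadcasts a noisy signal; executive control lets the
    # loudest one into working memory and suppresses the rest.
    signals = {name: s * random.uniform(0.8, 1.2) for name, s in strengths.items()}
    winner = max(signals, key=signals.get)
    # Frequency of occurrence: the winning network gets stronger,
    # biasing future rounds toward an encore appearance.
    strengths[winner] += LEARNING_RATE

print(max(strengths, key=strengths.get))  # the currently dominant theme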

Daniel C. Dennett says, “Consciousness is accomplished by a distributed society of specialists that is equipped with a working memory, called a global workspace, whose contents can be broadcast to the system as a whole.”

You’ve been intensely focused on baseball. The time has finally arrived: you flip on the TV and now your entire brain is tuned into the start of the World Series. Every aspect of your sensory and motor system knows it’s game time! The excitement in the crowd is palpable, and you can practically smell the beer and hotdogs up in the stands. But there is no guarantee the neural network dominating your working memory will be able to hold your awareness for long.

The band of neurons might break up under its own weight of individualism, meaning that for a fleeting second you start thinking about work on Monday.

It might be ousted by another, more aggressive network of neurons with more powerful connections. Now you’re not only thinking about work, but about those emails from your boss piling up in your inbox.

Executive control might also dump a network because something else grabs its attention, like when a speeding fire truck whizzes by your house and suddenly a new thought (‘I wonder what’s on fire?’) steals the show away from work or the baseball game.

Your brain’s insula can be understood as the remote control that toggles back and forth between what will be spotlighted in working memory: either focused thoughts sponsored by executive control (and suddenly the roar of the crowd pulls you back into the heat of the game), or less focused daydreaming content (as your thoughts slowly drift away to a childhood memory of a cat basking in the warm sun on a porch somewhere).

Dietrich says that in this type of scenario, the idea of ‘Self’ as an entity is not the “continuous integrated flow we experience but the outcome of a series of discrete representations, sequenced together on the fly from different, endlessly competing, parallel streams of computations, each consisting of continuously shifting coalitions of neurons.”

You don’t feel the shift as your thoughts battle it out, because the neural editing is so smooth that each thought seems to blend seamlessly into the next, belying the underlying pandemonium as a whole host of thoughts jockey for the dominant position of awareness.

‘Self’ is therefore a construct: an interface device providing the same effect that running Microsoft Office has on your computer. It gives you the sense of a nice tidy controlled mechanism, as if you’re watching a perfectly edited highlight reel. In actuality, there’s a cacophony of neural networks fighting to grab the center stage of consciousness at that moment. The grand prize? That neural network is large and in charge—of your thinking.

Epiphanies, or, the Bottom-Up Principle

Creativity as a whole is a hard thing to concretely study. However, a slightly easier question than “where does creativity come from?” is “how do epiphanies work?” Those sudden flashes of new ideas happen when your brain connects up associations that don’t normally go together, like bacon and popcorn.

The process starts when your more rational, top-down conscious thinking system struggles with a question it can’t seem to solve. With possibilities exhausted, you start to lose focus, and as a result, your attention shifts. At this point, your executive control system bows out and the problem gets dumped into the more reflexive, impulsive, emotional brain.

Your brain is designed in such a way that when you stop actively working the problem, your unconscious systems help out by taking over, continuing the associative matching exercise. It’s like an architect who, out of ideas, kicks a problem down to the site foreman, saying, “See what your guys can do with this.”

However, with the unconscious areas of your brain heading up the search, there is no longer any direction or intervention from the executive control center of your prefrontal cortex. This also means working without the benefit of the attention network (singular focus) or working memory (recent information).

This might sound like it would hamper the whole enterprise, but without the scrutiny and self-criticism of top-down control, the bottom-up process is allowed to operate unimpeded. Associations connect less powerfully and more randomly. That may seem like a negative, but the wider you cast the net, the more likely it is that a solution will surface.

The magic is in the novel recombinations that arise through loose association—the ones that would normally be suppressed by your prefrontal cortex, dismissed as arbitrary and unsystematic. In this situation, your new connections are not only free to form but free to take root. It only takes a few to get that novel guitar riff or chess move started.

While your prefrontal cortex is on hiatus, should you chance to encounter some new stimulus—like a tub of water you displace by plopping into it—you, like Archimedes, can spot new associations bubbling up unexpectedly.

I’m talking about that “Eureka!” moment when neural circuits suddenly connect and, seemingly out of nowhere, you are struck with a new insight. If you’re Archimedes, you discover a new principle, a central tenet of physics. Eventually, you get a law named after you. Much later, when the twentieth century rolls around, “Eureka” proves to be a marketable name for an upright vacuum cleaner.

Not bad for a slow day at the bath-house.

The Lure of the Irrational: Why Birds and Basketball Players Fall Prey to Superstition

Irrational behavior is commonplace in sports. Michael Jordan wore his “lucky” University of North Carolina practice shorts under his Chicago Bulls shorts every season, believing that extra layer of shorts made the difference between winning and losing. Tennis ace Serena Williams is rumored to have worn the same unwashed pair of “lucky” socks 162 matches in a row.

And the list goes on and on. Are professional sports stars somehow more superstitious than the rest of us? The answer is no.

And it’s not just humans: famed psychologist B.F. Skinner once reported that pigeons seem to behave superstitiously, too. Although we can, of course, never know for sure what birds are thinking, Skinner observed patterns of strange behavior, like a bird twirling in a circle prior to feeding. He posited that the bird had somehow associated the act of twirling with the act of getting fed.

We are all twirlers to some extent. We can trace our irrational behaviors, both collective and personal, to the associative feature of our brains. At the end of the day, the brain’s main goal is to keep you alive, and that means being on the lookout for any meaningful patterns. Dark clouds mean a storm is brewing. A stranger charging towards you with their teeth bared is probably not stopping by to say hello.

But, following the age-old rule of “better safe than sorry”, we tend to experience some false positives as well, meaning we can also see the face of Elvis in the clouds from time to time.

When someone draws connections nobody has seen before—like when Lin-Manuel Miranda writes a musical reframing the story of Alexander Hamilton as the ultimate hip-hop narrative—the results can be electrifying and transformative, and we hail his associative brain as genius. But when the Son of Sam connected demonic thoughts to his neighbor’s dog, we got serial killing, so clearly there is a tragic downside as well.

The real problem arises when the analytical part of the brain fails to react to unfounded associations. It’s when the emotional brain sneaks into the driver’s seat that we find ourselves donning dirty socks to keep our winning streak alive, or taking a cue from Brazilian Shotokan karate master and UFC Light Heavyweight Champion Lyoto Machida, who believes he improves his chances of victory by drinking his own urine.

So is there a way to thwart these ill-advised leaps?

The answer is yes—if you can recognize them ahead of time.

Running counter to the patterns in our behavior requires a preplanned defense, what psychologist Gabriele Oettingen calls an “implementation intention.”

An implementation intention involves taking a moment beforehand to form an if/then statement: “If I find myself tempted to put on dirty socks, then I’ll remind myself that dirty socks can lead to foot fungus.” Having a little pre-made course of action at the ready can help re-establish a rational thought process.
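An implementation intention is, at heart, a precompiled if/then rule: cue recognized, planned response retrieved, no in-the-moment deliberation required. As a playful sketch (the cues and responses here are invented for illustration):

```python
# Hypothetical precompiled if/then rules: cue -> planned rational response.
implementation_intentions = {
    "tempted to wear dirty socks": "remember: dirty socks can lead to foot fungus",
    "tempted to twirl before a meal": "remember: twirling has no effect on dinner",
}

def respond(cue):
    # If a pre-made plan exists for this cue, it fires automatically;
    # otherwise the slower, deliberate mind has to work the problem out.
    return implementation_intentions.get(cue, "no plan -- stop and think it through")

print(respond("tempted to wear dirty socks"))
```

The point of the lookup table is the point of the technique: the decision is made in advance, so the rational answer is already loaded when temptation strikes.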

If this sounds too good to be true, Jeremy Dean, author of Making Habits, Breaking Habits, says that over 800 studies support the efficacy and significance of this strategy.

So the next time you’re tempted to twirl around a few times before your next meal, or pull up your second pair of shorts, you might consider plugging in an appropriate if/then to reboot your analytical mind.

After all, how hard can it really be to stop drinking your own urine?

Transposons: The Improvisers Inside Your Brain

Even if you think from time to time about your neurons, those little chemical-electrical switches that dictate your mental and physical activity, you probably don’t give much thought to your transposons. And yet transposons don’t just play a crucial role in neural function; in a very real sense, they define who you are.

A transposon is a fragment of DNA that inserts itself into another cell. Research suggests that about half our DNA sequence is made up of these fragments, these interlopers.

In the cells of, say, your lungs, heart, or kidneys, transposons have no real effect. They don’t behave like viruses, which sneak into cells and multiply like crazy. They’re more like very mellow hitchhikers: once they’ve found their way in, they’re usually content to fall asleep and enjoy the ride.

The exception is the brain. Once transposons get inside neurons, they can alter the very nature of the cell. It’s like a troupe of improv actors that show up unexpectedly at your birthday party: suddenly you’re at a very different party. Transposons can influence a neuron’s firing sequence, or turn it off or on, or even reconfigure the operating code of the whole chemical-electrical switch.

This means they can change the entire identity and purpose of a neuron. And like any good improv actor, they can shift into a variety of roles and characters.

It’s a Darwinian parable playing out on a cellular level, what Kelly Clancy in her New Yorker piece “The Stranger in Your Brain” calls “a kind of evolution in miniature.”

The result? Even among twins, no two brains are exactly alike. Identical twins begin with identical DNA, but the arrival of those improvising transposons makes neural activity wholly unique. And since transposons aren’t passed down, your brain is truly a once-in-a-lifetime show.

Clancy explains that this is in part why it’s so difficult to find the underpinning of neurological diseases: with each transposon doing its own thing, there may be no single static genome with regions that can be identified, isolated, and acted upon.

If all this sounds a little, well, scary (rogue DNA wreaks havoc on unsuspecting brain!), Clancy says there’s no reason to panic just yet. After all, she writes, “it is our mosaic brains that may deepen our capacity for individual invention and imagination.”

Besides, who doesn’t love a little improv? More than half a century since its premiere in 1959, Second City continues to thrive. And you’ll be thriving for as long as your transposons continue to act.

Bottling Habit

Last year, Americans used and discarded about 50 billion plastic water bottles. The recycling rate on those bottles is about 23%, meaning that roughly 38 billion plastic water bottles were dumped into landfills or ended up as general litter.
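The arithmetic behind that estimate is straightforward, taking the round figures above at face value:

```python
bottles_used = 50_000_000_000   # plastic water bottles used in a year
recycling_rate = 0.23           # about 23% get recycled

# Everything not recycled ends up in landfills or as litter.
not_recycled = bottles_used * (1 - recycling_rate)
print(f"{not_recycled:,.0f}")   # ~38.5 billion, i.e. "roughly 38 billion"
```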

In the face of this growing problem, Elkay Corporation, a U.S. company best known for drinking fountains, developed the EZH2O fountain, which doubles as a water bottle filling station. This allows people to reuse their own refillable bottles, thus cutting down on waste.

As a savvy company, Elkay saw an opportunity to make money and be a little greener at the same time. The big question for its engineers was: would people actually change their habits and take advantage of the eco-friendly option?

When the engineers finished their filling station design, they decided to add a counter to show consumers how many bottles were being saved with each use. Once the full expense of research and development is factored in, it’s not uncommon for prototype models to scale back from the original concept. “At one point, we almost cut the counter from the specification due to cost,” Franco Savoni, VP of Product Marketing and Engineering at Elkay, told me.

But here’s what happened: in college dormitories across the U.S. where the EZH2O was installed, students got really excited about the bottle counters. They could instantly see a tangible result each time they refilled their own bottle—and they could watch those results stack up.

It took almost no effort, saved them a few bucks a bottle, and provided instant gratification that they were doing something good for society and the planet. Watching the bottle counter turn over each time meant they were, in effect, getting gold stars—small, constant, emotionally satisfying dopamine reinforcers.

To up that reward factor, students at both the University of Michigan and the University of Minnesota have even organized contests: which dorm floor can save the most bottles and get that counter the highest?

As of this writing, Savoni reports that Elkay has delivered “in excess of 100,000 EZH2O units.” You can find them at gyms, in airports, and of course, on college campuses.

[Editor’s note: it probably bears mentioning that Robb Best is currently employed by Elkay as Senior Advisor for Cognitive Strategy.]

Anatomy of Emotion

What is fear made of?

In his book Self Comes to Mind, Antonio Damasio describes emotions as complex, largely automated neural programs of action. He writes that emotions can be triggered by real-time events, events of the past, or images related to events. They tap into various brain regions, including the areas concerning language, movement, and reasoning. This in turn sets off a chain of chemical reactions.

Certain kinds of emotions tend to activate specific brain regions, producing a kind of lock and key effect. For instance, situations involving fear unlock the amygdala and trigger the release of chemicals associated with fear. Our perceptions of those internal changes are what we call feelings. (When two regions are affected at the same time, it can create a composite or mixed emotion, such as bittersweetness or nostalgia.)

Feelings are the body’s readout of what’s happening internally, combined with your moment-by-moment state of mind. As Damasio says, “Feelings are the consequence of the ultimate emotional process; the composite perception of all that has gone on during emotions—the actions, the ideas, the style with which ideas flow—fast or slow, stuck on an image, or rapidly trading one for another.”

During an emotional state, our rapid body readouts allow us to weigh the likelihood of reward and punishment, all in an attempt to predict what might happen next and what we’ll do about it.

Basic emotions like fear, anger, sadness, and disgust can be understood as a more nuanced approach to the evolutionary choices of fight, flight, or freeze.

Damasio adds that the brain’s emotional process follows the same strategy as the body’s immune system. When a swarm of outside invaders shows up, our white blood cells dispatch a matching force of antibodies, which lock onto the surface shapes of the trespassers in an attempt to neutralize them.

Similarly, when you find yourself in an alarming situation, the amygdala dispatches commands to the hypothalamus and the brain stem, increasing your heart rate, blood pressure, respiration pattern, gut contraction, blood vessel contraction, cortisol release, and a metabolic ramp down of digestion, culminating in a contraction of the facial muscles we would read as a frightened expression.

In primitive times, depending on the context of the situation, you might freeze in place, where you’d begin to breathe shallowly—important if you’re trying to remain motionless in order to elude a predator. On the other hand, you might make a run for it, resulting in an increased heart rate to drive blood into your legs. And your cognitive resources would be redistributed; interest in things like food or sex would temporarily fall by the wayside.

All of this takes a giant toll on your energy reserves. It’s costly—especially if it turns out to be a false alarm.

Like a dimmer switch, the basic emotions give us graduated options. Instead of entering full-on combat mode when an encounter goes poorly, I may choose to simply show my disgust towards that individual, thereby saving my precious glucose and decreasing the chance of getting knocked on the head.

Damasio suggests that because emotions are unlearned, automated, and predictably stable programs resulting from natural selection and genetic predisposition, you can choose to act bravely, but no amount of stoicism can undo the fear you’re experiencing on a basic physiological level.

This may explain why people who have performed heroic actions frequently shy away from describing themselves as courageous. Although their outward behavior was brave, they remember feeling profound terror.

It’s not a matter of being born with a fearless personality. Firefighters, Navy SEALs, and many others undergo intense training to learn how to function successfully through their natural fright. Such is the power of building habits.

Digesting New Discoveries Inside the Human Body

There was a moment during the Lewis and Clark expedition of 1804 when, after months of deprivation, hardship, and arduous climbing, the party finally scaled the eastern face of the Continental Divide. Standing exhausted on that peak, they hoped to gaze down on gentle meadows and the gateway to the sea, but instead they saw…more mountains. It was mountains as far as the eye could see.

In some ways, modern biology has followed the same trajectory: in the 1970s we believed we were within striking distance of the cure for cancer, steps away from enlightenment about our own internal processes. Now we know it’s a much bigger expedition than we ever anticipated.

When it comes to understanding the body, the complexity we face is mind-boggling, a thick and tangled web of feedback loops and inner dependencies. Take, for example, the human gut.

For years, it’s been relegated to the back bench of physiological study. Recently, we’ve begun to see that the digestive tract is, in fact, a star in its own right. Those less glamorous organs are rising to costar status with the brain. The two are sometimes described as a team: the brain/gut axis. According to Giulia Enders in her new book, Gut: The Inside Story of Our Body’s Most Underrated Organ, there is incredible crosstalk between the two.

The human brain contains about 86 billion neurons. The gut, meanwhile, contains only 100 million neurons—on par with the brain of a cat. So, in the neuron department, the brain generally calls the shots. But the gut calls more than a few plays of its own.

Your gut houses a teeming microbiome of bacteria that affects your daily life in myriad ways. What goes on in your gut shapes your immune system, nutrition absorption, vitamin production, muscle function, hormone levels, libido, whether you’re hungry or full, and how you break down your food intake into proteins, carbohydrates, and fats. Your gut’s microbiome weighs in at about 4.5 pounds and Enders says some scientists are beginning to refer to it as a separate body organ.

Gut-related chronic health problems include allergies, cancer, Type II diabetes, mood swings, and anxiety. The bottom line: your gut is far more auxiliary brain than mere food dumpster.

Your vagus nerve, the longest of the twelve cranial nerves, is the superhighway connecting the brain to the gut. The gut also produces about 95% of the body’s serotonin. Sometimes called the “feel-good” hormone, serotonin is strongly linked to your levels of happiness and depression. Your gut also makes GABA, an amino acid that calms the nervous system and smooths out brain waves, and a neurotransmitter called glutamate, involved in cognition, learning, and memory.

Your food choices have a direct effect on your gut’s microbiome. For that reason, Enders says it’s probably a good idea to eat a diet rich in probiotics (although more research is needed to confirm their health benefits). These are bacteria and yeasts that support digestive health, found in fermented foods like kimchi, sauerkraut, kefir, pickles, and plain unsweetened yogurt.

According to Enders, these foods may help maintain the integrity of the gut lining, and they may serve as natural antibiotics, antivirals, and antifungals. They also play a role in regulating your body’s immune system, which in turn helps control inflammation and improve nutrient absorption.

There are many more summits to climb before we have a clear map of what goes on inside the belly’s hidden kingdom, but, like Lewis and Clark, we continue to be amazed by what we discover. And yet for all our newfound knowledge, we’ve only begun to gain a tenuous foothold—and there are still mountains as far as the eye can see.