Caveperson Chemistry: Rewriting our Family Tree

Located in the mountains of southwestern Siberia, the Denisova Cave takes its name from a Russian hermit named Denis, rumored to have lived there in the 18th century. When some intriguing bones were discovered in this cavern, the moniker derived from this long-dead hermit would gain a new use: as a shorthand for a stunning discovery about the world of our early human ancestors.

In 2008, Russian researchers discovered the finger bone of a young hominid female. It was the wrong shape to have come from a Homo sapiens, but when scientists sequenced the DNA, it didn’t seem to belong to a Neanderthal, either. They had found what appeared to be an entirely separate group of early hominids whose time on this planet briefly overlapped with ours. A genetic cousin to us, and a genetic sister to the Neanderthal. They had found the Denisovans.

Whereas early Homo sapiens and Neanderthals split into separate species some 440,000 years ago, Denisovans didn’t branch off from the Neanderthals until considerably later, more like 300,000 years ago.

Of course, there is only so much you can learn about a species from a single finger bone, and even when archaeologists uncovered two teeth and a toe bone, this somehow failed to fill in the entire picture. Did the Denisovans have art and tools? Were they capable of symbolic thought and language? Thus far, that handful of bones is keeping mum.

Here is something we do know about the Denisovans: it looks like they interbred with us. For instance, tests show that between 4 and 6% of the modern Melanesian genome seems to have come from them.

If that much interbreeding could take place, it does seem to suggest the Denisovans were not just crude ape creatures. And if they were anything like the Neanderthal, we should be careful not to immediately dismiss them as mouth-breathing simpletons who spent their days dragging around primitive spears and grunting.

While it is important to remember that the study of anything dating this far back involves an awful lot of guessing, researchers are increasingly gathering evidence that Neanderthals might have been much more advanced than we thought.

First of all, the simple Neanderthal spear turns out to require a fair amount of engineering.

Neanderthals weren’t just finding sharp rocks on the ground; they were crafting what’s known as “Levallois flakes”: symmetric stone blades with a sharp edge all the way around and an even thickness, for easy resharpening. Even to an experienced modern flint-knapper (yes, there are still flint-knapping enthusiasts out there), making one involves considerable effort. First, you have to chip the flint into a precisely shaped symmetric starting piece called a “core”. After all that work, you then have exactly one chance—one perfectly aimed strike—to knock off a flake.

These flakes were tied to spear shafts with a thin strip of animal hide, which was secured in place by a black sticky substance. And here we run into another surprise: analysis shows it wasn’t just tree sap lying around waiting to be collected. It was birch pitch, which doesn’t occur naturally and would have required a fairly involved, multi-step extraction process: heating birch bark to a carefully controlled temperature while keeping out oxygen so the bark doesn’t simply burn.

But Neanderthals weren’t just doing early chemistry; it appears they might have been playing dress-up as well. Recent discoveries at Neanderthal dig sites include wing bones of birds of prey with cut marks on them. These wings would have had no real value as food, which suggests someone was stripping off the feathers for what seem like decorative purposes. Other finds include little seashells sporting neatly pierced holes and traces of hematite, an iron oxide that yields a red pigment.

Although it’s dangerous to draw too many conclusions here, it’s hard to imagine a practical utility for this, and easy to suggest a more symbolic use, like jewelry.

All of these complex behaviors add weight to the increasingly plausible, if still controversial, theory that Neanderthal culture included some form of language. After all, how would you teach pitch-brewing or jewelry-making to your offspring through a simple series of grunts? And considering that some of us got as much as 4% of our genes from Neanderthal ancestors, the Neanderthal-human hybrids didn’t just happen; they were born into a situation where the group was willing to raise them into adulthood, which suggests a degree of cooperation that would’ve required advanced communication.

If you suspect you’re carrying around some Neanderthal in your blood, don’t worry: although we don’t know the function, if any, of most of the surviving Neanderthal DNA, at least some of it appears linked to immune system responses, including a guard against the Epstein-Barr virus.

And while those Denisovan bones aren’t yielding many of their secrets, Denisovan DNA may be the reason Tibetans can adapt to low oxygen levels at such high altitudes. According to a recent article in Nature, the presence of an extremely unusual and helpful variant of the EPAS1 gene in the Tibetan people “can only be convincingly explained by introgression of DNA from Denisovan or Denisovan-related individuals into humans.”

So remember, there’s nothing to be ashamed of. And the next time you’re facing an illness or climbing a mountain—or trying to distill birch pitch without the use of any modern technology—it might be time to call on your inner caveperson.

The Twain Brain, or, Why Smart People do Stupid Things

“A man who carries a cat by the tail learns something he can learn in no other way.” So said Mark Twain, printer, steamboat pilot, journalist, failed miner, entrepreneur, abolitionist, lecturer and supreme humorist. Twain is perhaps the greatest storyteller and writer the fifty states have ever produced.

Whether attacking kings, hypocrites, or the literary offenses of Fenimore Cooper, Twain was famous for his razor-sharp wit and his allergy to BS. Yet that same man drove himself into bankruptcy, ignoring the counsel of his most trusted pals in favor of pouring his fortune into a series of disastrous inventions. He once invested the equivalent of 8 million dollars in the Paige typesetting machine, a marvel of engineering that dazzled crowds but constantly broke down and was rendered obsolete by the simpler Linotype.

So why did a man renowned for not suffering fools pour his fortune into one fool’s errand after another? Could it be that Twain, like the rest of us, was seduced by a “magic bullet”? That wholly American notion that there is a shortcut out there to unmitigated wealth and happiness.

Whether it’s a diet pill (all-natural, of course) or a potion to restore skin texture to that of a six-week-old baby (all-natural, of course) or a book that promises to create a state of nirvana (no artificial additives) or a new-fangled typesetter machine, many of us are suckers for the shortcut.

We love the easy road, the secret sauce, or that ultimate financial tip (see Martha Stewart). In 2012, Americans spent a total of $78 billion on lottery tickets.

Our brains love shortcuts. The most primitive, basic parts of our brains are wired for them. Although these shortcuts lack precision and can create real problems, their saving grace is efficiency.
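
To see that trade-off in miniature, here is a minimal sketch in Python with made-up points: an exhaustive search versus a greedy "shortcut" on a tiny route-planning problem. The problem and numbers are invented for illustration; the exact answer costs factorial time, while the heuristic is nearly instant but can settle for a worse route.

```python
# Exhaustive reasoning vs. a mental "shortcut", on a toy routing problem.
# All coordinates are made up for illustration.
import itertools
import math
import random

random.seed(1)
cities = [(random.random(), random.random()) for _ in range(8)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_length(order):
    # Total length of a closed loop visiting the cities in this order.
    return sum(dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

# Exhaustive search: guaranteed best answer, but the cost grows factorially.
best = min(itertools.permutations(range(len(cities))), key=tour_length)

# Greedy "shortcut": always hop to the nearest unvisited city. Fast but
# imprecise -- exactly the bargain our brains strike.
route, left = [0], set(range(1, len(cities)))
while left:
    nxt = min(left, key=lambda c: dist(cities[route[-1]], cities[c]))
    route.append(nxt)
    left.remove(nxt)

print(f"exact:  {tour_length(best):.3f}")
print(f"greedy: {tour_length(route):.3f}")  # often a bit longer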

Still, that efficiency has its failure modes. Take optimism bias, the unfounded belief that things will turn out much better than the facts warrant.

It’s what allows smokers to believe they won’t get cancer, dieters to start their diet in the future, and all of us to procrastinate on our work because, as Twain noted, “Never put off till tomorrow what you can do the day after tomorrow.”

Even the great Twain fell victim to optimism bias as he traveled down what he thought was a shortcut to financial independence through a prototype typesetting machine. The Paige typesetter was reported to possess more problems than actual moving parts, of which it had more than 18,000.

Ironically, many suspect that had Twain put more energy into writing and less into his pet get-rich-quick scheme, he would have gotten rich much faster, and with a whole lot less heartache.

But Twain was plagued with one incurable problem: a human brain. If reasoning is currency, then biases and shortcuts are what the primitive brain trades in. And that brain is where the action is.

Perhaps rather than seeing biases and shortcuts as system flaws, we should instead celebrate that which makes our brains so unique and ‘predictably irrational.’

No one summed it up better than Mark Twain.

“Don’t part with your illusions. When they are gone, you may still exist, but you have ceased to live.”

Robbing a bank with your wristwatch

As you stand in the check-out line, you may find yourself wondering why two minutes in a loud, crowded store can feel like an hour, or why an hour of relaxing with a good book can flit by in what feels like seconds. Why are we so bad as a species at tracking lengths of time?

In neuroscience, the prevailing strategy for understanding the “why” of any brain behavior is to think of it in terms of evolutionary advantage. What was life like back when humans were just starting to become humans? A lot of seemingly negative or unhelpful traits make sense in this context. There is a school of thought that the species might have benefited from some members of the tribe having ADHD, for instance. A dose of extra alertness or hyperactivity might hobble a desk worker, but it can be a godsend if you’re hunting antelope.

So why are our internal clocks so terrible? To a people consumed by hunting and gathering, the passing of 90 seconds or an hour was just not that important. Most of human history has not been counted in minutes. Luckily, the Swiss came along to help us out.

December 18, 2013 marks the 91st anniversary of the Denver Mint robbery. To be precise, it marks the robbing of a bank truck fresh from the Mint. From end to end, the crime took just 90 seconds, which, even in today’s fast-paced world of modern bank robbing, is still impressive.

Who was the mastermind? Herman “the Baron” Lamm, a German immigrant and former Prussian soldier who was kicked out of the Army for cheating at cards. He is also often credited as the father of modern bank robbery. In 1917, Lamm was rotting away in the Utah State Penitentiary for a failed bank robbery attempt. The Baron could have used this time to contemplate “going straight.” Instead, he did something remarkable: he went pro.

Lamm spent his incarceration engineering a whole new kind of bank robbery: multiple getaway routes, specialized roles (point man, lookout, vault man, driver, backup driver), and meticulous planning. He would case a prospective bank for hours, detailing the comings and goings of bank employees and deliveries. (Later, John Dillinger, public enemy #1, would cite Lamm as his inspiration.)

After he got out of the Utah Pen, he carefully recruited a crack team of experts, and they practiced various scenarios, sometimes using a mock-up bank they constructed in an abandoned warehouse. They practiced until their operation ran with the precision of a Swiss watch—or a Prussian military exercise.

Lamm’s rule was simple: 90 seconds was the absolute maximum length of time they could spend on any caper. No matter what stage the robbery was in, when a minute and a half ticked by, his team exited the bank. Their success or failure was determined by Lamm’s wristwatch.
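
As a bit of fun, Lamm's rule maps neatly onto what programmers now call a hard deadline: abort the job when time runs out, no matter what state it's in. Here is a minimal Python sketch; the stage names and timings are invented for illustration.

```python
import time

DEADLINE_SECONDS = 90  # Lamm's hard limit

def run_caper(stages):
    """Run stages in order, but bail out the moment the deadline passes."""
    start = time.monotonic()
    for name, action in stages:
        if time.monotonic() - start >= DEADLINE_SECONDS:
            print(f"Time's up during '{name}' -- everyone out, full stop.")
            return False
        action()
    return True

# Hypothetical stages; each just sleeps to stand in for real work.
stages = [
    ("enter the bank", lambda: time.sleep(1)),
    ("empty the vault", lambda: time.sleep(2)),
    ("reach the car", lambda: time.sleep(1)),
]
if run_caper(stages):
    print("Under 90 seconds -- clean getaway.")
```

The key design point is that success is defined by the clock, not by the state of the job: a half-finished vault counts for nothing once the watch says go.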

This served him well in the Denver Mint robbery and a string of other robberies across the US. Lamm finally met his demise following a bank heist in Indiana, where no amount of rehearsal could prepare his team for their run of slapstick-level bad luck.

First, the high-powered getaway car blew a tire when their driver cut a U-turn dodging an armed vigilante. So Lamm’s crew seized another automobile, put the pedal to the metal—and discovered that this particular car had been rigged to go no faster than 35 miles an hour. (The owner had been worried about his elderly father’s reckless driving.)

The solution was simple: steal another vehicle. So they nabbed a truck. When the truck turned out to have a hole in the radiator, they were forced to abandon it in favor of stealing a third car. Unfortunately, this car was filled with killer bees.

Just kidding: it was simply more or less out of gas.

Lamm and his team sputtered to a standstill near Sidell, Illinois, surrounded by 200 Indiana state police officers and vigilantes. What happened next is a matter of controversy. The police swore that Lamm, rather than face more prison time, took his own life. Curiously, though, the autopsy reportedly showed the Baron’s body riddled with bullets, making his “suicide” all the more spectacular.

A dead bank robber might not seem like the most natural role model. But after an hour of holiday shopping drags into an afternoon, many of us can appreciate the bare-bones efficiency of Lamm’s 90-second rule.

Still, how much good did it do Lamm in the end? Maybe our ancestors were onto something, feeling no need to parse time into such fine increments. Has our desire to control and measure every second of our modern-day lives really improved our situation?

The Swiss gave us watches and chocolate. At least one of those was a great invention.

Transference Bias: A Tale of Bloody Wars, Baby Kings, and Bad Bosses

There was a time, and it was not so long ago, that conventional wisdom said if you were born into nobility, you possessed a set of superior traits that automatically qualified you for governance. Got royal parents? Congratulations, you’ve won the leadership lottery.

There was just one problem: the system often produced people uniquely unqualified to rule.

Consider Charles II of Spain. He came from a line of the Spanish Hapsburgs so intermarried that one ancestor appears on his family tree in 14 separate places. Charles took the throne in 1665 despite a host of physical and mental disabilities—he couldn’t chew, drooled frequently, was never really educated, and at one point it’s rumored he ordered his deceased family members dug up so he could look at them.

Consider King George IV of England, famous for his extravagant spending, love of leisure—and utterly selfish, irresponsible behavior.

Consider the many kings and queens who were handed the reins to their country before they were old enough to put their own pants on.

Now consider: many of these people held the fate of nations in their hands.

A history pockmarked with unnecessary wars, massive public debts, and plain incompetence bears it out: leadership is not an inherited trait. Wisdom and judgment are gained through experience, not via bloodline.

These days, most surviving monarchs are more figurehead than supreme ruler. After all, the industrial revolution has ushered in modern times and modern thinking. Or has it?

Anyone working in business might guess where this is headed. Transference bias, at its core, presupposes that knowledge is not a requirement for climbing the ranks of leadership. (“If Jones displayed a hardworking can-do attitude over in sales, by golly he can certainly run the finance department!”)

Today, “character, positivity, and fortitude” are the new blue blood in business.

Not that these traits aren’t good things for a leader to have. Most certainly they are. But when it comes time to make hard choices, the sunniest attitude in the world is no substitute for expertise. In the same way, Count Chocula’s noble birth doesn’t guarantee him wisdom in the deployment of his infantry.

Unfortunately, transference bias never died, merely dressed itself in new clothes. And like the old kingdoms at war, there is much collateral damage.

So the next time your new boss shows up wet behind the ears, fresh from some other unrelated department, remember it could be worse. Your cubicle could be a castle wall, facing a catapult attack of dead rotting cows. And if there’s one thing we can all agree on, it’s that nothing is worse than dead rotting cows. Except for maybe the new minty-fresh boss you’re about to train…

Aristotle’s Three Musketeers, or, A Swiftly Tipping Stool

What might the Greek philosopher and Jack-of-all-trades Aristotle think of the latest findings in neuroscience? How would his notion of what it means to be a good public speaker stack up against the bevy of brain biases Daniel Kahneman outlines in his work on prospect theory?

In On Rhetoric, Aristotle outlines three key concepts in building a convincing speech. The speaker must demonstrate:

Ethos: character, trustworthiness, credibility
Logos: logic, facts, figures or some process
Pathos: emotion, true feelings, a sense of connection

Your ethos can be broadly defined as your reputation or honor. Unfortunately, if your listeners don’t already know you, they are less likely to give you the time of day. When famed violinist Joshua Bell played an incognito recital in the Washington subway system, virtually nobody stopped to listen. Without context, Bell’s playing was swallowed up in the chaos of the daily commute. Our sense of importance is often driven more by context than actual value.

For Aristotle, logos was the facts and details, the nuts and bolts of any logical argument. The concept that an argument should be grounded in reason is one of the many things that western science borrowed from Aristotle. And yet, while we pay lip service to rationality, split-brain studies show that the reasons we give often have little connection to the actual decision. We employ logic not as a compass but as a justification.

Danish author Martin Lindstrom notes an interesting phenomenon with product satisfaction surveys. Namely, that they’re useless. Ask someone to review a product they’ve just bought, and there will be nearly no correlation between their stated stance and what they’ll do the next time.

When Aristotle talks about pathos, he is referring to the emotional appeal or the connection to the group, the speaker’s ability to stir the hearts and minds of the listeners. Perhaps modern neuroscience has advanced no idea more strongly than the power of pathos. This is why Kahneman labels the emotional factor, and not our rationality, as the real star of the show.

Aristotle understood this, but in the context of a powerful trinity, with pathos as one leg of a three-legged stool. He wasn’t entirely wrong, just a bit iffy on the relative proportions.

Aristotle often gets billed as a philosopher, and while this is true, it’s also selling him short; his writings cover everything from poetry to physics, music to politics, ethics to zoology. Philosopher Bryan Magee is quoted as saying (maybe a little hyperbolically), “it is doubtful whether any human being has ever known as much as he did”.

So if Aristotle was around today, maybe he wouldn’t need to be embarrassed at having inflated the value of ethos and logos a little. He’d probably be too busy delving into the advances in all his many favorite areas of study. Maybe a few new ones as well. Just what would Aristotle think of neuroscience? There’s no way to know for sure, but he would likely find it interesting. As a wise man once said, “The energy of the mind is the essence of life.”*

(*Aristotle. It was Aristotle.)

The Morality Lag: Smartphones and Dumb Feelings

Your smartphone has more computing power than the computer that took Neil Armstrong and crew to the moon. And this is only one of the staggering technological advancements we’ve made in the last 50 years.
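
For the curious, the back-of-the-envelope arithmetic behind that claim looks something like this. The figures are rough, commonly cited ballpark values, not measurements from any specific device.

```python
# Apollo Guidance Computer (c. 1969), approximate published figures
agc_ops_per_sec = 85_000   # roughly 85,000 instructions per second
agc_ram_bytes = 4_000      # about 2,048 15-bit words of erasable memory

# A modern smartphone, order-of-magnitude figures
phone_ops_per_sec = 1e11   # on the order of 1e11 operations per second
phone_ram_bytes = 6e9      # ~6 GB of RAM

print(f"speed:  ~{phone_ops_per_sec / agc_ops_per_sec:,.0f}x the AGC")
print(f"memory: ~{phone_ram_bytes / agc_ram_bytes:,.0f}x the AGC")
```

Even with generous rounding in the AGC's favor, the phone in your pocket comes out roughly a million times ahead.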

Have you ever wondered why technological advancements, a byproduct of the analytical brain, have far outrun our ability to create any kind of significant improvement in our emotional governance? Wars, murder, and mayhem have continued unabated for thousands of years, and yet this week Apple announced the introduction of a new iPhone with fingerprint recognition.

Back at the turn of the twentieth century, Mark Twain said something to the effect that any newspaper publisher, regardless of the era, could always bank on headlines like “Trouble in the Middle East” and “Revolution in South America”. Twain had an uncanny sense of humanity’s consistent inability to live and let live.

So why have our emotional brains hit a roadblock? Why haven’t we wiped out jealousy, avarice and greed the way we knocked out polio? More importantly, when can we expect the next big upgrade to the emotional brain?

Sadly, not any day soon. The brain’s emotional decision-making powers have their roots firmly entwined in our most primitive survival instincts. There is a reason why neuroscientists refer to this part of the brain as our reptilian brain: we share this subcortical structure with the creatures found in the Florida Everglades. The drive to survive trumps just about every other kind of judgment, including the moral ones.

Individually, some may attain enlightenment, but it’s not exactly a product that can be monetized like Tang. And so collectively, we have a certain amount of cognitive dissonance about the way emotions wreak havoc on our everyday lives.

It feels much better for us to share in, and highlight, technological advancements. Take for example the second sentence in this post: “And this is only one of the staggering technological advancements we’ve made in the last 50 years.” I don’t know about you, but I didn’t have much to do with the creation of the smartphone.

Technological advancements are built on the backs of the very special few. Although we all enjoy the latest creations that come in the form of cameras that are phones or phones that are cameras, who among the unwashed masses is capable of developing the next great innovation?

Let’s face it, most of us are essentially still cave people. We may wear nicer clothes, but if a cataclysmic event rendered electricity, and therefore our microwave ovens, useless, we’d pay to have one of our ancient relatives from the Chauvet Cave in Southern France explain how to start a fire without propane or matches.

A handful of impressive achievements have altered the way we live our lives, but they haven’t fundamentally altered who we are. We are the same people who have waged war over territory and treasure since the beginning of recorded time. Our weapons may have improved, but our intentions haven’t.

It’s easy to conflate technological leaps with some improvement in human pathos. They have a connection, but by no means are they the same thing.

We often give ourselves credit for having evolved far more than we actually have. It’s theorized that our current brain system has been around for roughly 40,000 years, apparently without fundamental change.

Like the saying goes, “Same story, different day.”

One can confirm this by simply gazing at the next heartbreaking newspaper headline.

After all, it’s just one click away on your new smartphone.

The Great and Terrible Crocodiles of Denial

Why do we continue to eat when we are full? Why do we smoke cigarettes when we know they cause cancer? Why do we not exercise more when so many studies link exercise to a myriad of long-term benefits?

Above is Chinatown’s Doyers Street. I took this picture a couple of weeks ago on a quiet Saturday morning. This humble corner––purported to be the only curved street in all of NYC––is sometimes known as “the Bloody Angle.” Back in 1909, it was the deadliest spot in the city, even more notorious than the Five Points of Scorsese’s Gangs of New York.

Sourdough, Websites, and the Self: The Myth of the Driver’s Seat

Our story this week begins around 1500 BC. It was roughly 3500 years ago that somehow––let’s be honest, probably through some kind of accident that doesn’t bear thinking about––Egyptians discovered that a mixture of flour and water, left in the right conditions, could bubble and ferment into a tangy ball of risen dough, and that this ball, when baked, was not only edible but delicious.

The Rise and Fall of Caleb Weatherbee, or Punditry, Prognosticators, and Poblano

So imagine it’s 1826 and you want to know what the weather will be doing tomorrow. You really have one choice: pull out your trusty Farmers’ Almanac and get down to business.

The Almanac is still around today. As Sandi Duncan, managing editor, says, “The formula we use dates back to 1818. It is a mathematical and astronomical formula that takes sunspot activity, tidal action of the moon and position of the planets into consideration. The complete formula is known only by our weather prognosticator: Caleb Weatherbee.”

Sounds pretty cool. There’s only one problem: analysis shows that its accuracy falls in the 50/50 range. That is to say, garden-variety coin-toss territory.
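
How would you check a claim like that? Here is a minimal sketch using only Python's standard library, with an invented sample size: it asks how many forecasts a prognosticator would need to get right before the record stops looking like luck.

```python
# Is a forecaster beating a coin toss? Exact binomial tail probability.
from math import comb

def p_value_at_least(k, n):
    """Chance of getting k or more right out of n by flipping a fair coin."""
    return sum(comb(n, i) for i in range(k, n + 1)) * 0.5 ** n

n = 100  # hypothetical number of forecasts checked
for k in (50, 55, 60, 65):
    print(f"{k}/{n} correct: p = {p_value_at_least(k, n):.3f}")
# 50/100 is pure coin-toss territory (p ~ 0.54); you'd need roughly 59-60
# correct before the result starts to look better than chance (p < 0.05).
```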

Cavemen and the Internet

The problem with you––and by you, I mean me––is that our brains are built on a 40,000-year-old platform. Let’s not forget, evolution is a slow process. We haven’t had any significant brain upgrade since our ancient ancestors started sprucing up their cave walls with paintings of local fauna.

We may marvel at our technological advancements, but despite our ability to hurl metal objects and ideas through space, we are basically cave dwellers dressed in modern garb. In fact, the technology we all enjoy (most of the time) was the work of only a few individuals. Who among us, if teleported back 40,000 years, could reproduce an iPhone, a cylinder lock, or even a porcelain toilet? You get the idea.
