Epiphanies

It’s January 6th, Epiphany. This is the day the three kings apparently saw baby Jesus for the first time. Epiphany comes from the Greek word epiphaneia, “appearance” or “manifestation.” The Greeks used it for things like the appearance of the sun on the horizon at dawn, of an approaching enemy, or of a god.

When the word “epiphany” is used in English today, it usually refers to an idea or an unusually profound insight. I’ve had an epiphany! Marc exclaims. Maybe my stomach hurt because I had two apples, five kiwis and four cups of coffee before heading out for my run today! (Actually, in all fairness, he probably wouldn’t have said epiphany. He’d probably just say I just thought of something. And I’d reply, Did it hurt? And then he’d tell me about his gastric situation and I’d say, well, doh, what did you expect?)

That an epiphany can be both the physical manifestation of something or someone as well as a purely cerebral manifestation of insight – a light bulb going off in your head – is an oddly apt illustration for our times, in which the boundaries between the virtual and the physical worlds seem to be increasingly hard to pin down.

Oh, I see. What an epiphany.

One current buzz phrase is “augmented reality” – if you’re interested, my friend the Spime Wrangler writes about the topic beautifully on her blog. It means overlaying extra layers of digital information onto objects in the physical world. But what, really, is reality? We already experience the physical world through the filter of our oh-so-fallible senses. Add some Bailey’s, and my reality is definitely augmented, and in a good way. Come to think of it, does it make any sense at all to even talk about “reality” as such?

Over the holidays, on that long trans-Atlantic flight, I finally got a chance to read Incognito by David Eagleman, the one I told you was at the top of my list back in June when I wrote a post about inspiration. Here’s the thing. Reality, augmented or just garden-variety, is a construct. Seeing has very little to do with our eyes and everything to do with our brains, which are masters of taking what they can get and creating out of it a world in which we can survive. Eagleman uses the example of a man, Mike, who has been blind most of his life. He has an operation that restores his vision. But his brain doesn’t know how to interpret the incoming data. None of it makes any sense to him. He can see, but he can’t “see.”

He gazed with utter puzzlement at the objects in front of him. His brain didn’t know what to make of the barrage of inputs. He wasn’t experiencing his sons’ faces; he was experiencing only uninterpretable sensations of edges and colors and lights. Although his eyes were functioning, he didn’t have vision. […] The strange electrical storms inside the pitch-black skull get turned into conscious summaries after a long haul of figuring out how objects in the world match up across the senses. Consider the experience of walking down a hallway. Mike knew from a lifetime of moving down corridors that walls remain parallel, at arm’s length, the whole way down. So when his vision was restored, the concept of converging perspective lines was beyond his capacity to understand. It made no sense to his brain.

Talk about feeling seasick! The amazing thing is that it took Mike’s brain only a few weeks to adapt. Now he sees just like you and I do.

Eagleman uses another example, this time a blind rock climber named Erik Weihenmayer, who in 2001 became the first blind person to climb Mount Everest. He “sees” using a grid of electrodes in his mouth, a device called a BrainPort.

Although the tongue is normally a taste organ, its moisture and chemical environment make it an excellent brain-machine interface when a tingling electrode grid is laid on its surface. The grid translates a video input into patterns of electrical pulses, allowing the tongue to discern qualities usually ascribed to vision, such as distance, shape, direction of movement and size. The apparatus reminds us that we see not with our eyes but rather with our brains.

This kind of rocked my boat, particularly when Eagleman went on to reveal that the BrainPort is also being used to feed infrared or sonar input to the tongue, so that divers can see in murky water or soldiers can have 360-degree vision. Eyes in the back of your head, indeed. Just one page later, I had to put the book down so my brain wouldn’t overheat. Here’s why:

In the future we may be able to plug new sorts of data streams directly into the brain, such as infrared or ultraviolet vision, or even weather data or stock market data. The brain will struggle to absorb the data at first, but eventually it will learn to speak the language. We’ll be able to add new functionality and roll out Brain 2.0. […] this is not a theoretical notion; it already exists in various guises.

Now why anyone would want stock market data plugged directly into their brains is beyond me, but I know Marc would love a direct feed from ESPN.

That’s probably enough to chew on for one day. This wide, wonderful world is already amazing, even with the limited sensory apparatus we’re born with. Just thinking about what more might be out there – the rich smell and sound experience of the dog, the razor-sharp vision of the eagle, the reverberating echoes of the bat – well, it’s tempting, isn’t it? But remember that just because a data stream can be piped in, that doesn’t mean our brains will know what to do with it. That three-pound lump between our ears is notoriously fallible when it comes to decision-making and rationalizing.

So here’s the epiphany of the day: Be like the Greeks. Watch a sunset. Watch your enemy approach over a far-off hill. See the face of the gods in the clouds outside an airplane window. And when the light bulb of epiphany goes off in your brain, remember that your brain is a fallible organ, eager to create meaning from nothingness, and bask in the wonder of it all.


3 thoughts on “Epiphanies”

  1. Mary, my daughter just finished a project at school in which she had to explore a futuristic concept. She chose to explore the concept that you describe … electronic interfaces between our brains and “the web”, allowing information to flow directly without the need for any type of interface that requires typing or clicking or even reading. Pretty cool stuff. But she surmises that eventually this capability will create an even bigger gulf between rich and poor … giving the wealthy access to information faster and more nimbly than the poor will be able to afford.

    • She might be right. But the scariest thing for her might be that her mom could really have eyes in the back of her head…

  2. Hi Mary,

    After reading the above blog, I returned to my Google feeds and found this Engadget post on my screen:
    http://www.engadget.com/2012/01/07/vuzix-smart-glasses-ces-2012/

    The Vuzix glasses may be somewhat less sophisticated than direct electronic input to the brain, but the concept of augmented reality is there.

    I’m in danger of having to think with greater perception after reading some of your recent blogs, and coming to terms with this reality presents some difficulties for this aging, rusting, in-the-act-of-retiring brain. Nevertheless, thanks for the references to rust-busting material.

    Regards
