Humming


This morning after Brendan left – very early – to take the first written exam in his series of maturité exams, I decided to be really decadent and go back to bed for a few minutes. Marc was in the bathroom getting ready for his day – humming and humming and humming.

No recognizable tune, just a series of little contented-sounding hums.

It reminded me of a passage I read yesterday in “What I Loved” by Siri Hustvedt.

Colorful language

I’m sure at some point in your life – even if only as a teenager under the influence – you’ve asked yourself this deep, philosophical question:

How do I know that what I’m experiencing is real?

The answer? You can’t. What you see as reality is unique to you, because it’s a complex interaction between the physical world, your senses and your brain.

As proof, here’s a little snippet from my reality:

It’s Tuesday. We’re on the top part of this week’s circle, heading counterclockwise in the direction of Wednesday. We’ve come out of Monday’s black zone successfully, and because Tuesday is red, I’m pretty energized. It’s February, which is my favorite color (green), so all is well. We’re heading clockwise towards March, which is mauve and located at roughly 8 o’clock on the circle of 2012. I think a bit more about the word I described in my last post, plebiscite, and realize that because of the p and b, it’s a very blue word. Could that have been why I didn’t associate it with approval, which is much more yellow-orange, despite the double p?

What?

Epiphanies

It’s January 6th, Epiphany. This is the day the three kings apparently saw baby Jesus for the first time. Epiphany comes from the Greek word epiphaneia, “appearance” or “manifestation.” The Greeks were referring to things like the appearance of the sun on the horizon at dawn, of an enemy, or of a god.

When the word “epiphany” is used in English today, it usually refers to an idea or an unusually profound insight. I’ve had an epiphany! Marc exclaims. Maybe my stomach hurt because I had two apples, five kiwis and four cups of coffee before heading out for my run today! (Actually, in all fairness, he probably wouldn’t have said epiphany. He’d probably just say I just thought of something. And I’d reply, Did it hurt? And then he’d tell me about his gastric situation and I’d say, Well, doh, what did you expect?)

That an epiphany can be both the physical manifestation of something or someone and a purely cerebral manifestation of insight – a light bulb going off in your head – is an oddly apt illustration for our times, in which the boundaries between the virtual and the physical worlds seem increasingly hard to pin down.

Oh, I see. What an epiphany.

One current buzz phrase is “augmented reality” – if you’re interested, my friend the Spime Wrangler writes about the topic beautifully on her blog. It means that we overlay extra layers of meaning onto objects in the physical world. But what, really, is reality? We already experience the physical world through the filter of our oh-so-fallible senses. Add some Bailey’s, and my reality is definitely augmented, and in a good way. Come to think of it, does it make any sense at all to even talk about “reality” as such?

Over the holidays, on that long trans-Atlantic flight, I finally got a chance to read Incognito by David Eagleman, the one I told you was at the top of my list back in June when I wrote a post about inspiration. Here’s the thing. Reality, augmented or just garden-variety, is a construct. Seeing has very little to do with our eyes and everything to do with our brains, which are masters of taking what they can get and creating out of it a world in which we can survive. Eagleman uses the example of Mike, a man who had been blind for most of his life. He has an operation that restores his vision, but his brain doesn’t know how to interpret the incoming data. None of it makes any sense to him. He can see, but he can’t “see.”

He gazed with utter puzzlement at the objects in front of him. His brain didn’t know what to make of the barrage of inputs. He wasn’t experiencing his sons’ faces; he was experiencing only un-interpretable sensations of edges and colors and lights. Although his eyes were functioning, he didn’t have vision. […] The strange electrical storms inside the pitch-black skull get turned into conscious summaries after a long haul of figuring out how objects in the world match up across the senses. Consider the experience of walking down a hallway. Mike knew from a lifetime of moving down corridors that walls remain parallel, at arm’s length, the whole way down. So when his vision was restored, the concept of converging perspective lines was beyond his capacity to understand. It made no sense to his brain.

Talk about feeling seasick! The amazing thing about this is that it took Mike’s brain only a few weeks to adapt. Now he sees just like you and I do.

Eagleman uses another example, this time of a blind rock climber named Erik Weihenmayer, who in 2001 became the first blind person to climb Mount Everest. He “sees” using a grid of electrodes in his mouth, a device called a BrainPort.

Although the tongue is normally a taste organ, its moisture and chemical environment make it an excellent brain-machine interface when a tingling electrode grid is laid on its surface. The grid translates a video input into patterns of electrical pulses, allowing the tongue to discern qualities usually ascribed to vision, such as distance, shape, direction of movement and size. The apparatus reminds us that we see not with our eyes but rather with our brains.

This kind of rocked my boat, particularly when Eagleman went on to reveal that the BrainPort is also being used to feed infrared or sonar input to the tongue so that divers can see in murky water or soldiers can have 360-degree vision. Eyes in the back of your head, indeed. Just one page later, I had to put the book down so my brain wouldn’t overheat. Here’s why:

In the future we may be able to plug new sorts of data streams directly into the brain, such as infrared or ultraviolet vision, or even weather data or stock market data. The brain will struggle to absorb the data at first, but eventually it will learn to speak the language. We’ll be able to add new functionality and roll out Brain 2.0. […] this is not a theoretical notion; it already exists in various guises.

Now why anyone would want stock market data plugged directly into their brains is beyond me, but I know Marc would love a direct feed from ESPN.

That’s probably enough to chew on for one day. This wide, wonderful world is already amazing, even with the limited sensory apparatus we’re born with. Just thinking about what more might be out there – the rich smell and sound experience of the dog, the razor-sharp vision of the eagle, the reverberating echoes of the bat – well, it’s tempting, isn’t it? But remember that just because a data stream can be piped in, that doesn’t mean our brains will know what to do with it. That three-pound lump between our ears is notoriously fallible when it comes to decision-making and rationalizing.

So here’s the epiphany of the day: Be like the Greeks. Watch a sunset. Watch your enemy approach over a far-off hill. See the face of the gods in the clouds outside an airplane window. And when the light bulb of epiphany goes off in your brain, remember that your brain is a fallible organ, eager to create meaning from nothingness, and bask in the wonder of it all.

Lightbulb image: Shuttermonkey