In Our Image

If you’ve followed this little rant for any period of time, you’ve probably noticed that I pretty regularly position Digital as something “other” from our natural world of continuous experience. One of the main indicators of that “otherness” is Digital’s DNA, which breaks continuous experience into discrete segments, bits and bytes in the case of the boxes. Like any good modern, that makes me paranoid that something is getting lost on the editing room floor in between the bits and bytes as engineers decide what matters and what doesn’t.
OOOOPS!!!! Turns out there is more similarity between that Digital editing and my conception of continuous Analog experience than meets the eye. It would seem in creating our Digital spawn, we recreated, or maybe just amplified, our own tendencies to edit, and edit heavily.

The King is Dead? Long live…..?

It would seem that selecting the royal purple robes that will set us humans apart from the beasts and various automatons of modern life is an increasingly limited game. Remember the good old days when what separated Humans from everything else was that we made tools? Oops, all kinds of animals make tools. Well, then, language, yeah that’s the ticket. Language is a uniquely human attribute that raises us up from the organic rabble. Hmmmm. No. Not only do many species have nuanced communications, but with some patience, some can even master a limited range of our vocabulary and syntax. More than we can say for our mastery of theirs.

MyEverything, Digital Razorwire, and the End of Civilization

Quick now, dear Reader, on a scale from 1 to 10, low to high, how civilized are you?
No need to agonize, just a first reaction is fine. Our privacy policy here at the Analog Underground will keep all responses confidential by not even collecting them. So, from 1 to 10, how civilized are you? Got it?

OK. Next question. Same scale, 1 to 10, low to high, how civilized are we? Scope doesn’t really matter. If you want to answer for your workplace, your city, your country, or the whole world, it’s your choice. Got it?

Are the two numbers the same? They weren’t for me, but how can they be different? Civilization is a joint venture to which we all contribute and in which we all participate.

Proximity Lost

Reach out and touch. No, not the AT&T thing. Really reach out and physically touch something! Without moving from the chair, or the couch (or God forbid, the driver’s seat) reach out and touch something. Physically touch anything that’s within reach. Done it? Good. Now stretch a bit. Find something or someone (if that’s appropriate) at the furthest extent of your wingspan. Go ahead. Reach out and touch. There. You’ve defined the full circle of your physical proximity.

Two hundred years ago, that physical circle of proximity was, most of the time, a reasonable map of our proximity of attention as well. Oh yeah, you might have a treasured memento from the old country that pulled your thoughts there as you cradled the object in your hand. Or some guy on a horse might come rushing in with the latest news, but even then there was a pretty clear crossing of a boundary between here and there.
These days?  Not so much. It’s been a slow change. First telegraph, then telephone, then radio and T.V., computers, cell phones, and the Internet all came marching in across the decades and here we are. But where is here?


The Internet Epidemic: Mental Obesity

Admit it, dear reader, you and I have both put on a few unnecessary pounds in the mental department since we made the acquaintance of the Internet. Those empty Internet calories went straight past our eyes and right onto our brains.

“Oh, I’ll just have one more. [click]” “Ooooh, this looks tasty. [click]” “Here, have more, there’s plenty! [click][click][click]” “Does this Internet make my brain look fat?”

It all started so innocently, so seriously. Back in the 80’s, remember when e-mail was a work thing? At Digital Equipment we had an application called Notes (yes, they were called applications then, Virginia). Reminds me now of Facebook, but with a little dignity and a lot longer attention span. How’d we get from there to Google+, which is essentially Lean Cuisine for social networking, an answer to the mental and emotional binge and purge that is the open Internet today?

How long has it been since you’ve heard that 90% of the traffic on the Internet is porn? It’s been a while, I’d guess. I’m thinking that’s because our approach to information sharing has made virtually 100% of the traffic some kind of porn. Food porn. Weather chasing porn. Legal porn. Politics porn. Whatever pops into somebody’s mind somewhere porn. Hey! Hey! Psssst, buddy. Yeah, you, c’mere. Ya wanna see some really cute baby animals? The real thing! No Photoshop, I promise.

How’d we get here? How did things escalate so quickly from conversation to cocktail party to a full-blown, dignity-be-damned, mid-life crisis, you-just-don’t-understand-us, torrid affair with information half our age?

It is seductive. That little factoid that brings a wry smile to our friends’ faces (or rather an emoticon response). The political link that evokes a flood of comments in return. I feel so ALIVE!!! Wait. Wait. I can stop any time I want to. I can. Breathe. Breathe. Deep breath. Cleansing in. Cleansing out.

I’ve got this image stuck in my head of the guy that gets lifted out of bed by a crane and ends up being buried in a piano box. I find myself wanting intellectual and emotional calorie counts on Internet sites and apps. Do you think the FCC could come up with the equivalent of the FDA’s nutrition labels? Would we trust them if they did? Would we pay any attention?

In my own private Tea Party moment, I’m beginning to believe the answer isn’t found outside ourselves, in better government, more regulation or smarter apps, but rather, somewhere inside each one of us, among all of us.

Hindus have a story they tell about Lord Brahma (and yes, oh crap, I found this on the Internet). It seems that a long time ago, all humans were Gods, but they were behaving so badly with their God-like powers that Lord Brahma decided to take away their divinity. He was unsure where to hide it so that the humans wouldn’t find it and begin to misbehave again. He thought to bury it deep in the earth, but felt that humans would just dig it up. Perhaps, the bottom of the ocean? No, crazed for power and adventure, they’d just swim down and retrieve it. The highest mountain? Same result. Then he had a revelation. He would hide the humans’ divinity deep within each of them, the very last place they would think to look.

And so began our journey. Perhaps that is the energy behind the frenzy that is the Internet. Atheist, agnostic, or religious, there are few of us who do not think somehow of the future and our mark on it. Call it soul, call it spirit, call it meaning, we do not want to be thought trivial. And so we throw our pebbles at the ocean, hoping to build an island out there to replace the lost world within.

And, miracle of miracles, occasionally, seemingly randomly, some bit of land appears, some ecosystem that acquires… life and some persistence, even on the Internet. It doesn’t, after all, have to be all serious brow-knitting reflection. Laughter feeds the eternal in ways meditation cannot. The eternal, the enduring is everywhere available, even on the Internet if we’re paying attention.

But overindulgence at any point along the spectrum from giggles to grief ain’t good for you, Roy. It is a tad ironic that the self-governed infrastructure of the Internet leads to a completely ungoverned world of content that urges us to indulge every whim and impulse.

As Lord Brahma could tell us, the answer isn’t out there to be somehow acquired by the perfect experience. It’s not in the cloud, or the next killer app, or better parental controls (who controls the parents?). It’s in you and in me and the choices we make every day on, and off, the Internet.

Zombies!!! – The Undead Technologies Among Us

Is a zombie technology eating the brains out of your organization or, worse yet, your personal life?  You know what I’m talking about.  The applications and devices that stumble around on rotting, obsolete limbs.  They’ve long overstayed their usefulness, unaware they’re dead but not gone. They feed with an insatiable lust on our time and attention and in the process turn us into … well… them.

In the world of work, there are thousands of COBOL programs muttering through our halls waiting to lunge out at us, diverting us from productivity to the sustenance of their undead existence.  More recently that new app or new mobile device policy shows up at the door all spiffy in new clothes and fresh faced, only to decay into a dirty, threadbare, unintelligible monster pulling us into a dark corner never to emerge again.

Our Own Personal Zombies

In the good old days, most of our personal encounters with zombie technology came through the intermediary of some bureaucrat.  We might get battered about a bit as they struggled to fit the reality of us into some truculent program created in search of some long gone efficiency.  The stink of rotting purpose might linger as we grumbled about the wait at the DMV but pretty quickly we were back in the light and sunshine.

That was before the internet and mobile technology. All of a sudden we’re forced to deal directly with the grisly automatons of the brainless websites, the morons of texting and driving, the ever so friendly but clueless voice response systems. If we’re lucky, all they want is our currency and trade. Worst case, some malevolent shaman is sending them shambling off to do us real harm.

The crux of the matter is that even the best technology can’t compensate for bad process, but really good process might improve bad technology.  We’ve all seen the poorly architected, fragile technology succeed wildly through wily marketing, whole cottage industries of support, and the occasional intervention of blind luck.  “Would you like Windows with that?”  And we’ve seen really sweet, well thought out technology wither on the vine from the lack of sustaining integration and deployment with the mass market or just our departmental business process.

And don’t let the “P” word lull you into thinking that zombie technology only roams the halls at work.  All I ask, gentle reader, is that you substitute the word “habit” for “process” and then we can talk.

Zombies – Shiny and New – Get Them While They’re Hot!

The problem is that almost no technology shows up looking undead. COBOL was as huge an innovation in its day as Facebook and the iPad seem to us now. The zombie emerges slowly over time as our needs, wants, and understanding mature and evolve, but the technology doesn’t. The zombie technology is frozen in time at the moment some Digital conjurer baked the passing Analog stream into a set of fleet, but rigid bits and bytes. No matter how dazzling and innovative when new, it can’t easily regenerate a new incarnation to address some unforeseen situation. Any incarnation of technology moving from the ethereal Digital world to the rough and tumble Analog stream slowly acquires that ragged, falling apart look from repeated impacts of new, unaccounted-for reality. “Would you like Facebook or iPad with that?”

The funny thing (if you like that kind of humor) is that there is a completely human predecessor for this Digital behavior. It’s called ideology. On the upside, you get some visionary true believer like Frank Lloyd Wright who creates uplifting architecture that just isn’t so great for actually living in long term. “Would you like Steve Jobs with that?” On the downside you get nut jobs like Osama Bin Laden or that guy who drags his family around to the funerals of our soldiers with some bizarre message about divine retribution in the face of honor and sorrow. Somewhere in between, you get folks like our current round of politicians who have lots of ideas, but can’t seem to do basic math.

In every case, Digital or Analog, someone with a kind of tunnel vision cooks up a set of rules away from the messy facts and complexities of life as it is lived.  Armed with that convenient, second order picture of reality, they come charging out into the real world, hacking away at anything that doesn’t fit their particular version of a rosy picture.  God forbid you have brains, ‘cuz they’re hungry.

Being the Anti-Zombie
What’s a poor human to do as the moaning zombie mob closes in? Despite I, Robot, The Matrix I, II and III, and all the Terminator movies, Digital and the other various stripes of fundamentalism aren’t all that flexible, and that’s our out. We don’t win by being better zombies than the zombies themselves. We win, at work and at home, by constantly evolving ourselves to account for new realities. We win by leading with empathy and a healthy suspicion that we don’t know as much as we think we do. That’s the baseball bat to the zombie cranium.

Analog Art and Digital Avatars

There is a kind of art in the ever more ubiquitous Digital representations of our world. The software engineers, the hardware jockeys, the web designers wield their various electronic paint brushes to draw us a picture of the world. Like Analog art, those pictures are sometimes a representation of the world as we think it is, sometimes the world as we wish it were, or even sometimes a world we want others to believe in. In a similar vein there are better or worse painters, and viewers with more or less sophisticated ways of absorbing and interpreting those Digital creations.

Picasso’s Les Demoiselles D’Avignon
Interestingly enough, however, we don’t seem to have had our Digital Picasso or Pollock moment, at least not intentionally. Where’s the software engineer who’s taught us a different way of seeing, the hardware designer showing us a new way of interpreting the world?

Pollock’s Number 1, 1950

Grace Murray Hopper certainly has left her mark on the industry. We may not be writing a ton of new COBOL code, but God forbid it suddenly disappears from the planet. Perhaps, in its time, it was as revolutionary as Picasso’s Les Demoiselles, that first step down the Cubist path in Analog art.

I’m sure there are those who will raise the Steve Jobs flag here, but truth to tell Jobs has been more a packager of others’ ideas, more Barnum than Pollock, though the end result has a certain abstract expressionist vibe to it. Tim Berners-Lee comes closer with his hypertext seed that grew into the web, but even there his role is more the guy who gave Picasso his paints rather than Picasso himself. The same could be said of Jack Kilby and Robert Noyce (the integrated circuit).

What is Art? –The Digital Redux
As anyone who’s taken an Art Appreciation class knows, on the Analog side we’ve never arrived at a conclusive answer to the question “What is Art?” Perhaps we’re destined to that same wandering-in-the-fog on the Digital side, but I can’t help but think there’s a difference between the endeavor of Analog Art and Digital representations. I’m suspicious that we don’t have the Digital Picasso or Pollock because the work of Digital is driven by a different instinct than that of the painter or the sculptor. What’s more, I think that difference can be traced back to two distinct roots. The first is a matter of time lines, the second, a matter of pragmatism.

Denis Dutton in his book, The Art Instinct: Beauty, Pleasure, and Human Evolution, suggests that at least some of our artistic sensibilities are hard-wired at the level of instinct. He cites studies that show a cross-cultural preference in landscapes for a lot of blue sky, some kind of savannah, with a border of trees and available water. Makes sense if our sense of beauty got wired in when we were coming out of the trees and beginning to walk upright. The folks who survived that stage of our evolution would have had a preference for exactly that kind of landscape and passed that preference on to their progeny.

Perhaps we don’t have the Digital Picasso yet because we are, at least Digitally, just coming down out of the trees, exploring what it means to stand upright, so to speak. There is, after all, some 30,000+ years between the cave drawings in Chauvet, France and Pollock’s “Number 1, 1950.”

Another way of thinking about this is the perhaps apocryphal story of the king who declared he wanted a fine painting of a rooster. His advisors began a search for the finest painter of animals in the realm and eventually landed on one artist. The king personally appeared at his studio and commissioned the work. When asked when he could expect to take possession of the picture, the artist told the king to come back in one year. A year passed and the king became ever more obsessed with having the finest painting of a rooster in existence. Finally the day arrived and he swooped back into the artist’s studio, barely able to contain his anticipation of the great work, a year in the painting.

The artist looks up from his work and says “Ah, The king arrives.” He gets up, walks over to a blank canvas, and in a few minutes a beautiful picture of a proud rooster emerges. All assembled admire the subtle coloring, the cocky strut, the vibrancy that almost has them hearing the rooster’s call. The king, however, is not amused. He sputters out an angry declamation that he did not wait a year to have this artist slap dash something together no matter how good it might appear. A deathly silence falls across the studio, but the artist remains cool.

He summons the king to the door of a back room. He opens the door and reveals thousands and thousands of drawings, paintings and sculptures of roosters, and even one or two of the live birds picking through the chaos. “My most revered highness, I could not have done that one painting without the year it took to work through all of these.”

Perhaps Digital is just not that far along in its own exploratory “year.” We are still early on in the process of understanding both the medium and its subjects.

Why is Art? – Digital Redux II
If the first art history question is “What is Art?” then surely the second has to be “Why is Art?” That leads directly to the second foundation of difference in Digital and Analog art. The “Why?” question on the Analog side spurs more debate than answer.  On the other hand, the why of the art of the Digital is usually pretty obvious. There is some pragmatic problem to be solved, some unanswered (and sometimes unimagined) need to be filled. There is a defining pragmatism in the art of the Digital that is not present in the Analog Arts.

The art of the Digital is rarely concerned with purely philosophical or aesthetic ends (which might explain all the really ugly software and websites out there). At least part of the fervor for Apple products is their ability to wander into those realms while at least making a wave at meeting the uber-measure of good Digital, which is practical application. We flock that direction because it satisfies a part of us that technology does not usually touch and in fact does its best to ignore. Trimming that quirky, individual human perception out of the binary, either/or representations is necessary to sustainable Digital experience. Those very aspects of ourselves that make us most human are least susceptible to representation in the art of the Digital as currently practiced.

So the Analog Underground calls out to all the artists of the Digital. Practice on! Labor through our “year” of exploration! And as we create the digital avatars of our various selves, perhaps it is worth a look aside to the world of Analog art. We might find some guidance there, from that span of 30,000+ years, on how to represent those things that make us most human.

Are You Data Literate?

Imagine a world where books were ubiquitous, but nobody knew how to read. Books as fine objets d’art. Connoisseurs prizing intricate bindings. Hand-tooled leather covers, rich papers with hand-torn edges, crisp fonts in dark inks marching across every page. But no comprehension, just a marketplace of… objects. Manufacturers pumping out these beauties so even the lowliest Walmartian could afford a whole room dedicated to blind beauty.

The weaving together and then unraveling of A’s and B’s and C’s into meaning and insight, no matter how mundane, is a critical skill in our modern world. Literacy, the ability to use text to encode, transmit, and decode meaning, context, and texture, enriches our lives. By and large we recognize that, teaching reading at a very young age, filling in with adult literacy programs, and getting into righteous debates about the best way for kids to learn to read.

Reading Data
Is data literacy any less important? As Moore’s Law doubles and redoubles our Digital capabilities, isn’t the data-represented world becoming as ubiquitous as the print-represented world? And yet the evidence is all around us, in big ways and small, that our collective ability to read data, to understand and navigate in data-represented realities, can’t even match the muddled appreciation of our alleged Walmartian mentioned above.

Have you noticed that every single car insurance ad from every single company seems to mention that on average, drivers that switch to them save a gazillion dollars? How is that possible? It would seem to imply a spiral that eventually has the insurance companies paying us to cover our cars (hmmm… could we make that work for health care?). That trope, of course, depends on a basic misunderstanding of what constitutes good data. Of course drivers that switch save money. Why else would they switch? It’s a population that self-selects and says absolutely nothing about the relative costs of insurance between companies in general. Duh.
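For the data-minded among you, the self-selection trap is easy to demonstrate with a toy simulation. Everything here is invented for illustration: both “companies” quote from the exact same distribution, so neither is actually cheaper, and yet the switchers still report healthy average savings.

```python
import random

random.seed(42)

# Toy model: every driver has a current premium and gets a quote from
# "the other company." Both are drawn from the same distribution, so
# neither insurer is cheaper on average.
drivers = [(random.gauss(1000, 200), random.gauss(1000, 200))
           for _ in range(100_000)]

# Only drivers whose quote beats their current premium bother to switch.
switchers = [(current, quote) for current, quote in drivers if quote < current]

avg_savings = sum(current - quote for current, quote in switchers) / len(switchers)
print(f"Average savings among drivers who switched: ${avg_savings:.0f}")
# Every company can honestly run this ad, because the population self-selects.
```

Note that the claim in the print-out is true as stated; the data illiteracy is in reading it as a statement about relative prices.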

That’s perhaps no big deal, simply a tax on the data illiterate. But if we turn to, say, politics, then things can get serious, such as the recent election and the ideology-over-reality driven agenda of our fledgling 2011 House of Representatives. Take, for example, the health care repeal they recently passed. Whichever side you’re on about universal healthcare, healthcare reform, death panels, etc., the charge up that repeal hill should be a tad puzzling. One of the standards held high in that action was fiscal responsibility. And yet, neutral observers agree that actually succeeding in repeal would cost an incremental $230 billion over the next 10 years or so. Pretty soon, we’re talking real money.

Or let’s turn to Ms. Palin’s disingenuous cry of “blood libel” after the events in Tucson. Absolutely Jared Loughner is a loon for whom she cannot be held directly responsible. Liberal attempts to make that connection as if in a court of law are, to be generous, reaching. But to shrilly insist that the representations we make (cross hairs on districts) and the Digital cannons that we fire them with (her own website) have no impact on the collective consciousness… Well, I guess I’d trust my Walmartian’s digital data savvy more than hers (or the liberal head hunters for that matter).

Data is Truth, Truth Data – That is All?
We seem to have this childish belief that once we have the data, truth will follow, that the ambiguity and confusion will fall away and no further intelligence or judgment is required. When we amplify that belief through Digital means, bizarre stuff begins to happen.

If you’ve followed this rant for any period of time, you know I get jumpy about the editing that occurs when we necessarily trim messy analog reality to fit in the little data boxes that drive Digital capabilities. I still worry about that, but as we get further into this digital revolution, I’m beginning to pay more nervous attention to how those little data boxes get unpacked by us consumers of the Digital (aka everyone).

Like a band of hyper-terriers following a fleet of rats, we and our doctors leap and dart after every new study of pharmaceutical approaches to health. Are you taking statin drugs to reduce cholesterol? I am. It’s a clinically proven fact that statins reduce my chance of having a heart attack. Or maybe not.

It would seem that the ultimate data jockeys, scientists, are beginning to question the, ah, ultimate stability of any data set. In a New Yorker article titled “The Truth Wears Off,” Jonah Lehrer recounts growing concern in the scientific community about something labeled the “Decline Effect.” The shorthand on this is (here I go trimming): many of our treasured, clinically replicated, data-driven findings seem to become less data driven over time. That is, the data (and the interpreted results) of early studies become less and less replicable over time. True for popular anti-depressant drugs, studies in memory and perception, and a wide range of other scientifically derived “reality.” Over time, the data-driven medical “truths” on which your doc is basing his prescriptions for you may not be quite as true as we first thought.
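One mundane mechanism that can produce a decline effect is plain old regression to the mean: if only the most striking early results get published and chased with replications, the follow-ups almost have to look weaker. Here’s a toy sketch of that dynamic; the effect sizes and the “exciting enough to publish” threshold are entirely made up.

```python
import random

random.seed(0)

TRUE_EFFECT = 0.2   # the real (modest) effect size, invented for the example
NOISE = 0.5         # measurement noise in any single small study

def run_study():
    # One small study = true effect plus a big helping of noise.
    return random.gauss(TRUE_EFFECT, NOISE)

# Thousands of labs run small studies; only the "exciting" results
# (big observed effects) make it into print and attract replications.
initial = [run_study() for _ in range(10_000)]
published = [e for e in initial if e > 0.8]

# Each published finding gets one independent replication attempt.
replications = [run_study() for _ in published]

print(f"Average published effect:   {sum(published) / len(published):.2f}")
print(f"Average replication effect: {sum(replications) / len(replications):.2f}")
# The replications drift back toward the true 0.2 -- the "decline."
```

Nothing about the underlying reality changed between the first run and the second; only the selection filter did.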

If the data is getting squishy on the scientists, God help us mere mortals in our data wrestling endeavors.

Towards a More Literate Reality – Represented or Otherwise
So what to do? Run squealing into some imagined Analog Only sanctum? Well, probably not. The Digital drawer of Pandora’s box has been opened and we need to find a way to cope. Digital lives and breathes on data-derived reality. All the digital wonders rely on our ability to package realities into bits and bytes, transport those tidy little packages, and then unpack them at some remove.

Certainly we want to hold our packagers of data, the computer scientists, the engineers, the programmers to a high standard for their initial transformation of Analog reality into Digital representation. But the burden of fidelity is not just on those that do the packaging. It also resides with all of us, whenever we open those tidy little packages and let them bloom back into the Analog light of day.

We cannot be confused, as were early consumers of moving pictures, between the picture and the represented reality. Likewise, we cannot ignore the preconceptions, the values, and the biases that the packagers bring to their tasks. Over the years, the decades, the millennia, we have become quite adept at appreciating not only the object of art, but also the impact of the artists’ beliefs, skill, and context-of-the-moment on the work in front of us.

Whether Caveman, Dutch Master, or skuzzy corner performance artist, there is a part of us that is made human by our representations of reality and our appreciation of them. Digital doesn’t change that, but it does amplify and accelerate the impacts of our representations and the underlying beliefs and savvy with which we create them. With great capability comes great accountability. Be literate out there.

Pigs and Professionals in a Digital Age

No, I’m not talking the crumb-infested, corner-cubicle-coveting, pontificating bane on everyone’s existence from the IT department. I’m talking real pigs.

Let me explain. When I first moved to the Mid-West, I was a bit mystified to hear one of my colleagues extolling the local pigs. Seems she had just been to the state fair and reported back, “Them pigs! They were real professionals! They knew just where the Oreos were and right how to get there!” On further investigation, turns out that pig racing is a big thing at the fair. It involves a little track and Oreos at the end to entice the pigs to hustle around the corners, which they apparently do with some skill and enthusiasm.

An Extremely Brief History of Professionalism
Inquiry into the nature and practice of professionalism has longer legs than your average Mid-Western racing pig, reaching at least back to the Middle Ages and its system of professional guilds and apprenticeships. Even in our own, more proximate, middle ages, the 20th century, some folks were at great pains to draw distinctions between “the professions” such as medicine, clergy, and law and the occupations of the rest of us wage slaves. Trying to untangle the nuances of those arguments is an exercise for a more bored and idle time.

However, as we cede more and more decision making to the Digital boxes, it’s probably worth taking a moment to consider what such a shift means for us as individuals and professionals.

Professionals and Digital Surrogates
Don’t get me wrong. I’m more than happy to have a chip monitoring the temperature in my house and making the call of too hot, too cold, or just right in best Goldilocks fashion. I’m glad, on those long interstate sprints, to turn the moment-by-moment decision making of faster, slower or maintain over to an engine control module. Frees me up so I can cast my focus elsewhere such as being amazed at the guy one car over who’s eating with one hand and texting with the other. Guess that’s why God invented knees.
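That thermostat really is making a fully “canned” decision, the kind we should be delighted to hand over. The whole Goldilocks judgment fits in a few lines; the thresholds below are invented setpoints, not anybody’s actual firmware.

```python
# A Goldilocks thermostat: the entire "decision" is a comparison against
# two invented setpoints -- every choice and every answer predictable
# in advance, which is exactly what makes it safe to can.
TOO_COLD = 66.0  # degrees F, assumed setpoint
TOO_HOT = 74.0   # degrees F, assumed setpoint

def goldilocks(temp_f: float) -> str:
    if temp_f < TOO_COLD:
        return "too cold: heat on"
    if temp_f > TOO_HOT:
        return "too hot: cool on"
    return "just right: do nothing"

for reading in (58.0, 70.0, 81.0):
    print(reading, "->", goldilocks(reading))
```

The point isn’t the code, it’s the contrast: every input maps to a known answer, which is precisely what the interesting professional decisions later in this piece lack.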

There are whole volumes of decisions, once fraught with ambiguity and danger, which have become so well understood that no real on-the-fly judgment is required to balance out risks and desired outcomes. Hurrah for the science, technology, and industry, computer or otherwise!

Unfortunately, there is no commonly upheld, exhaustive list delineating which decisions fall in that bucket and which decisions still require someone with a body of training and experience to make a call. That’s where the conversation about being professional and our growing Digital reality collide.

Anatomy of a Digital Decision
From one vantage point, Digital is all about canned decision making. A bunch of “Is this a ‘0’ or a ‘1’?” decisions get munged together into ever expanding chains. Those nano-decisions collude to create all kinds of miracles from robotic surgery to the latest version of Angry Birds on my cell phone.

It is important to remember, though, that these miracles are completely dependent on the “canned” nature of both the nano-decisions and their answers. If you can’t predict the choices that need to be made or the appropriate answers, Digital need not apply. Digital isn’t the only thug out there beating the snot out of common sense, but it’s certainly at the party. Digital has a penchant for substituting data for direct experience and accelerating any decision process. Couple those two with suspect quality control and the result is not likely to be pretty.

If you listen to the Lean Six Sigma priesthood, you can’t make good decisions without good data, which is true for a certain kind of decision. However, it seems to me that most of the really interesting decisions call for more than just data and established process, something not quite so susceptible to the lies, damn lies, and statistics of modern business and political discourse (with apologies to Twain, but not Fox News).

The trick, of course, is to know when a decision is ready to be canned, when we know enough and the boxes are savvy enough to permit the prediction and its reliable encoding into this system or that. Get it right and some new miracle of efficiency and precision is born. Get it wrong and you get the decisional equivalent of botulism.

Decisional Wheat, Digital Chaff
Real professionals, true leaders in their chosen fields, don’t earn their street cred by parroting back the same decisions described in the textbooks or reference manuals of their education and apprenticeships. True professionals make their mark, build their legacy, when a gap opens up between the present reality and the available data, information, and solutions. The better the professional, the quicker they see that gap and the more creative they are in filling it, whether they’re doctors, managers, programmers, or plumbers.

Designer Reality

It would seem we’ve made our Digital world in our own image. “Well, duh!” you might say, wondering how it took me so long to reach that pretty obvious conclusion. And it is pretty obvious on the face of it. However, if we’ve made this Digital world in our own image, what can we learn about it and ourselves if we dig beneath the surface a bit?

Regular readers of the Underground know a steady theme is attention to what’s edited out by our Digital representations of Analog. That editing has always been cast as a conscious choice, made in a hubristic attempt to “improve” things. Turns out that editing may begin long before we summon up any conscious picture of what better looks like.

Making a Monkey Out of Perception
In their book, The Invisible Gorilla, Christopher Chabris and Daniel Simons recount a number of psychological experiments on attention, knowledge, and logic. Turns out we don’t know or notice as much as we think we do. The opening chapter recounts a simple experiment in which subjects are asked to watch a videotape and count the number of passes between basketball players. To make it a little more challenging, some players wear white uniforms, some wear black, and the subjects are only to count the passes between players in white. Pretty straightforward, and most subjects deliver an accurate count of the number of passes.

There is one other little detail that makes this experiment more interesting and relevant to our conversation. About halfway through the video, a person in a full gorilla suit wanders through the middle of the basketball exercise. They stop in the middle of the frame, wave their hands, dance around, etc. They’re not trying to sneak in and back out without being noticed. But almost half the subjects, when asked about what they saw in the video, will talk only about the number of passes. If asked outright about seeing a gorilla, they will say they saw no gorilla. Many in that group, if shown the same video again, will deny that the gorilla was there the first time through.

It only gets worse from there as they recount other experiments that call into question our abilities to accurately remember significant events, draw rational conclusions about cause and effect, and in general reliably estimate our own capabilities and potential. As a remedy, Chabris and Simons have an almost child-like faith that experimentation can clearly reveal an accurate knowledge of the world as it is. Their book’s subtitle is “And Other Ways Our Intuitions Deceive Us.”

I’m not so sure. Or at least not sure that remedy really solves the more complex social, environmental, or personal challenges, which are not reducible to simple experimentation. Last I checked, we only get to live a given life one way. No winding the clock back and trying some other path.

The phenomenon of editing is not a revelation in the Digital world. It’s all about editing. But as we play God in creating that Digital reality, it’s probably worth asking if our Analog selves are more like Yahweh, the God of Ages, in full control and delivering certain prognostications in a rumbling voice of thunder. Or are we more like Epimetheus, the Greek god known for “running forward while looking behind”?

It’s an Architected Life
If popular movies are any indication of our baser intuitions, it seems we’re casting our instinctual vote for Epimetheus. There’s a trend in movies over the last decade of Digital emergence to feature the role of a flawed God, more commonly called “The Architect.” These “Architects” are not to be confused with the humble practitioner down the street designing McMansions for the boobocracy, nor even the soaring spirits who have designed our finest public monuments. These “Architects” are designing whole worlds, but not worlds that will actually be built, touched, or inhabited; rather, worlds experienced only as base neural stimulations (The Matrix series) or in dreams (Inception).

These imagined worlds, no matter the skill of the architect, are always flawed, eventually falling into some kind of chaos. It is as if the designed reality lacks some stabilizing or self-correcting element to prevent eventual disaster; as if the designer, the architect, by definition cannot see a big enough frame to create a self-sustaining reality.

Of course, just because the movies say it don’t make it so, but this one rings true both in intuition born of experience and in experimentation with invisible gorillas.

Go find someone who has the title “Enterprise Architect.” Just about any self-respecting medium-to-large IT shop will have at least one these days. We’re a sorry lot (I did my time there several years ago), charged with the impossible task of articulating a comprehensive vision for robust technology in the modern organization. Actually, that’s not so bad. Where the trouble comes is in actually implementing and maintaining that vision. Like the sci-fi architects writ large across the big screen, we seem doomed to failure, to watch time and the newest buzzwords relentlessly crumble our creations to chaos and dust.

Better than Perfect
But there is a kind of glory and honor to be had from this Sisyphean task, the restless plucking of dripping bits of order from the relentless chaos. We achieve that glory and honor not in realizing the perfection of our designs, but rather in understanding that both our Digital and Analog worlds are improved by a certain humility, an acceptance that even our best designs are flawed in ways we cannot, at first, imagine. Honor comes not from our perfection, but from living through, beyond, and above our realized flaws.