I doubt a more profound analysis of the times to come has ever been attempted in so fleeting a manner.
Consider this: more and more of our time online will be spent in the mindless pursuit of #trends. Mindless not in the sense that we won’t be using our minds in the pursuit — we’re all capable of kicking into play our analytical machinery, our wit, our sarcasm, our keen, largely unique (thanks to our unique experiences) insights — all of which are, by themselves, very mindful activities. No, I’m not talking about that. I’m not even worried about that. No, we’re not going to lose our minds. We’re — how do I put it — about to lose our control over them, or whatever illusion of control we had: of choice, of agency, of free will, of volition.
Twitter, the latest and greatest social toy we’re all (or were all) addicted to — and many others like it — does not leave us with much choice. We peek in to see what’s happening, of our own volition: a willful, mindful act. But alas, we don’t know that the famous Eagles song was making a futuristic reference to exactly this …
Relax, said the night man, we’re programmed to receive,
You can check out anytime you like, but you can never leave
Back to the point then. The #trends, by definition, are mainstream, however anti-mainstream a stance the alternate/social media enthusiasts may take. Trends are about numbers, about majorities, about collective interest, or herd interest. Or about interested herds. Same difference. To borrow the much abused, and much questioned (or so I hear), Gladwellian concept: trends are about the tipping point. They are the organically grown news epidemic. They come and hit you, whether you’re interested in them or not. And more often than not, our disinterests are not as passionately evangelical as our interests. They don’t have a strong veto, the disinterests. They feign apathy, even a fleeting antipathy, and then they break down, vanish into the ether. Soon, just as a passing viral epidemic would make you sick, trends catch hold of you. They raise your body temperature. You vent some steam. You take a chill pill. And it’s over.
Here is the thing in a nutshell: on the social web, you can run, but you can’t hide. Not very well. Not really.
Time for a tangent.
In the 15th century (was it? let me google — yes), Columbus set out on his voyage to India. Just imagine if he had had Google Maps, GPS, and web message boards/Twitter to ask for directions midway. Why, would the Americas ever have been discovered? Okay, so I’m being a little anachronistic here, and twisting all cause and effect. Still, will the near omniscience of the web oracle — Wikipedia, Google (and \<begin flame\>me-toos like Bing\<end flame\>), Twitter, Facebook, Askme, and what not — take away the element of serendipity, of pure chance, of lucky escapades? Or nearly make it irrelevant? For what chance hoped to discover at its own pace has already been tagged and filed by the web oracle in its infinite knowledge base.
Waves: dynamic, branched out, collective thinking processes?
Okay, enough of the neo-Luddite covert (and overt) cynicism. Let’s embrace, shall we? The cutting, bleeding edge of social networking — Google Wave — bled itself into anonymity (forgive the bastardized metaphor). Let’s not go into the reasons. Let’s just be optimistic and believe that the Wave is the future — not of play, but of work.
Mail — what used to be called email, but has now pushed its etymological predecessor into changing its identity to snail mail, a diminutive by all means — is simple, really. It just extends the linear, to-and-fro exchanges of the snail mail era into turbo-powered, but essentially still linear, to-and-fro communication.
Enter internet chats — one-to-one, or many-to-many. But nothing new, again. Just take a communication pattern that’s all too common in the offline world into the online world, add an angle of partial-to-complete anonymity, add global accessibility (previously available only to the limited few who could spend on ISD calls), subtract the verbal and visual cues, making impersonation progressively easier. Did chats fundamentally alter the way we think? The short answer is no, for the simple reason that they weren’t a fundamentally different communication paradigm (now I get to use that word). What they did was attack our attention spans. The first real attack, and we’re still bleeding from it, while new attacks come from every direction.
Enter wikis/waves: Wikipedia, IMO, is one of the towering achievements of Internet technology. Wikis are fundamentally different from the things we used to do offline (okay, almost fundamentally). Of course, knowledge (in a loose sense of the term) has always grown incrementally, through edits, through generations. But those who were born later had a tremendous advantage over those who came before them. The changes went one way — recent over recent over recent. But recent is not necessarily right. In the non-objective disciplines, like history, sociology, etc., this one-way editing gave the revisionists a free license. But there is no recency bias in wikis. Err. Okay, there is, insofar as we’re all mortals. But remember this: before wikis, knowledge (in non-objective disciplines) was just something someone pushed at you, because that someone was deemed an expert. With Wikipedia, anyone is an expert. And more often than not, you see competing theories, criticisms, and so on (something that makes us doubt history more and more: if we can’t even know what’s happening around us with certainty, how can we know the past?)
But that’s just the tip of the iceberg. Wikis are the whiteboard of the hive mind. They have the power to change the way we think. A generation raised on collaborative writing might think very differently from us. Most human endeavors are incremental. But they’re incremental in a linear way, because we never had tools that offered what wikis/waves do – forking and merging, endlessly, across the world, across intellectual silos. No wonder we don’t know how to use them. Will we? Time will tell.
Wikis are, of course, just one collaborative tool. What wikis are to knowledge, the Wave could be to problem solving. What the Wave promises (and, to some extent, Google Docs and tools like it) is a platform that’s fundamentally different from mails and clumsy document change-tracking. That we could collaborate at all with those is a testimony to human tenacity! But imagine children growing up with these tools as we grew up with pen and paper …
End of rosy picture.
Distractions: The (Un)Holy Grave of Web 2.0?
Much is said about our reduced attention spans. The popularity of micro-content is testimony to that. What that means is that those who create content have to make it consumable in small chunks. That’s bad news for Umberto Eco, for one (imagine reading Foucault’s Pendulum with a small attention span). But there is a bigger problem: the reduced attention spans of the creators themselves. Lucky were those buggers who sat in their rooms, looking out the window, staring at blank pages for hours, wondering what they should write/think about. They didn’t have to worry about something popping up on the paper (except bird shit, maybe, but let’s not be frivolous). They could keep staring at that blank paper, on and on, till the idea struck. Then they could write and write …
Will the writers of tomorrow have that freedom from distraction? More importantly, will they fight for that freedom — fight with themselves, that is? Will they switch off that chat client, close that mail tab, even pull the cord (or its wireless equivalent)? When one is always connected, one is always distracted. When you can contact your friends any time, your friends can contact you any time (well, practically). When your Twitter/Facebook/Buzz streams are growing by the minute, that blank page becomes that much harder to stare at.
End of Islands: Privacy in the time of User Generated Content
It is said that Immanuel Kant almost never left his hometown. Not that he was a hermit or a sociopath. He did not find travel necessary for knowing the world. And as for human contact, I guess he unplugged himself so that he could concentrate on his work. He wasn’t a misanthrope, but he chose solitude.
J. D. Salinger, author of the teenage bible The Catcher in the Rye, chose to live away from the gaze of the media and his innumerable fans till the day he died. He lived in Cornish, a small town in New Hampshire, where, contrary to his public image of a recluse, he was a typical guy next door. After his death, his wife thanked the town for protecting their privacy.
Salinger was lucky. He achieved success before all the charades — book signings, conferences, Facebook fan clubs, promotional blogs, and what not — took over. In a world in which everyone is a potential paparazzo, and where you need to sell yourself along with your work, could Salinger have survived? Of course, contemplating such hypothetical scenarios is a lost cause. The values that made Salinger what he was belonged to a world different from the vanity-driven world of the me-me-web (or did they?). Children growing up in this world — where reality TV and YouTube are thrusting three-year-olds into the spotlight, where losing has become the original sin, and anonymity death — would think of Salinger as an alien, if they ever heard about his way of life.
The flip side of celebrity is that it changes you. Islands like Salinger produce what they produce because they’re islands, untouched by the passing streams. From an early age, we will be raised not on islands but on the stage, like Truman Burbank — only openly. What will that do to our thought patterns, and incidentally, to our creations? Will the web spell an end to such islands, and their curious ecosystems?
The Power of Suggestion
A related point is that of suggestion. Social media is moving towards suggestions — music, movies, books, articles … If you liked this, you should check out that. In the past, this function was performed by our friends, mentors, gurus. The thing is this: soon, social media will perfect the art of suggestion. I’m not sure that’s such a good thing. Of course, you will like what it suggests. That’s not the point. It is this: you’ll stop growing, because you’ll only consume what you’ve already liked, in a different bottle. When friends suggested, they had the audacity to tell you, “Trust me, you’ll like this,” even when it was something very different from anything you had consumed before. It was not perfect. I’m not sure that was so bad. I fear that these social media oracles will slot us, just as actors are slotted because they were good at one specific role. Like an indulgent mom who feeds a child only sweets, because he likes them, social media will indulge us by feeding us more and more of what we are sure to like. We’ll settle into comfort zones. I guess that’s nothing new. People have always slotted themselves into ideologies, religions, cultures, and so on. But Web 2.0 was supposed to be our answer to that (or was it?) — freeing us from the tyranny of the mainstream, letting us define and redefine ourselves, ad infinitum. Instead, there is a risk of hardening our identities. And it will happen so much earlier, so much … well, serendipitously!
Of course, it’s all self-contradictory. That’s the Web 2.0 reality. Ayn Rand would have eaten her words (“contradictions do not exist”) had she lived in the time of the web. Fortunately, our Indian philosophers/mystics (and politicians) knew better.
Of course, it’s not written well. It has no structure. No central point. No answers. No real questions, even. Blame it on distraction. But I was hoping you wouldn’t notice: distraction is supposed to take care of that too. Damn it.
Of course, there is nothing new/original here. Originality died the day the web took over our lives!
Of course, there are no conclusions. It just ends here.