
Hark back a few years to prevailing notions about the approaching millennium. Cyberia was still a distant empire but the inexorable progress of technology evoked widespread fears about The End of the World as We Know It (incidentally, a theme which is now the preserve of Y2K doomsayers and their nightmares about a return to jungle raj).
We were all supposed to be on our way to becoming number-crunching zombies, glassy-eyed from a life in front of the computer screen, scrolling down reams and reams of data, conversing with one another in mind-boggling acronyms in robotic diction.
Happily, like all visions of doom, this one may never come to pass. Sure, for millions the computer screen is now the undisputed window on the world, but they are not quite inundated with dehumanising data; sure, information overload is the turn-of-the-millennium favourite among psychologists, but this relentless shower of trivia and critical facts alike is increasingly being packaged in an old, comfortable form: the narrative.
In fact, this trend is not limited to the Web and has become the mood of the moment among academics. The longstanding lament about an unbridgeable schism between the two cultures (the humanities and the sciences) and the public’s resultant alienation from momentous scientific achievements has finally been addressed in the last few years by practitioners of what has been deemed “the third culture”: scientists themselves explaining advances in disciplines as diverse as biotechnology and cosmology to the “educated public”. As often as not, these explanations are packaged as narratives and experiment with a variety of storytelling devices and personal anecdotes to enthrall the non-specialist.
Another case in point is history. The so-called “new narrative history” is fast blurring the thin line between fact and fiction as more and more historians abjure objective, copiously footnoted prose and adopt the storytelling format to attract a wider audience. So much so that many universities are now reportedly offering courses in writing skills as part of their history curriculum.
But it is on the Web, with its limitless interactive potential, that the narrative is being given new twists. Blame it on Oprah and her clones, but these days everybody has a story and the will to tell it. And with technology and advertising making personal Web sites ever cheaper and ever easier to construct, almost every Netizen has become a broadcaster with his or her own tales about “my Christmas in Antarctica” or “a short history of the East Asian financial crisis”.
While most of these sites are of interest to a small circle, there has been an attendant explosion of online journal writers who post daily entries about their personal ups and downs for surfers around the world.
Conventional wisdom would consign them to the lunatic fringe, but many diarists like thirtysomething Cathy, who maintained a lively, heartwarming “Cook’s Diary” for ages before admitting to distressing schizophrenia, shatter the myth with their evident intelligence and rootedness. The upshot: a contagious enthusiasm to tell a few more tales.
Of course, this return of the narrative is not as innocuous as it seems. Besides the surrealism of living out one’s life in cyberspace, unfettered freedom of expression on the Web also translates into an infinite capacity for misinformation.
Moreover, big business has been quick to grasp the advertising potential of the Web. Sponsoring personal sites and e-mail services may all be very well, but what happens when advertisers seek to weave branded products and their transformational power into the storyline? That, as they say, is an altogether different story.

