
World wide who?

It’s turned 25, but the Web is yet to discover itself

The Web has evolved in ways that its maker could not possibly anticipate. (Reuters)

The world wide web, the most disruptive technology since the development of steam power, the internal combustion engine and nuclear fission, has turned 25. It was born in March 1989 at CERN (now well-known for its dramatic pursuit of the Higgs boson) when Tim Berners-Lee, then a young researcher, proposed an architecture for the incremental storage and sharing of research data. It feels kind of weird to recall that just two and a half decades ago, we thought of information as stuff that sat still on paper, shopping as an activity limited to shops and spam as pink food out of a can. Meanwhile, a whole generation of the human race has grown up thinking that computers and smartphones make everything happen. Maybe that’s even weirder, in a world where the majority is still offline.

At age 25 or thereabouts, humans generally discover an answer to the eternal question: “Who am I?” The Web, however, exhibits an identity crisis at precisely that age. It has created millions of new jobs supported by hitherto impossible business models, like the online bookstore, but it has also wiped out traditional ways of making a living, like bookselling. From Tunisia to India, it has rekindled protest politics directed against authoritarian or unresponsive government. But after Edward Snowden’s damaging revelations, there is increasing disquiet about Big Brother snooping on private data. Ironically, there is as much disquiet in government, which is enraged to discover that it has been snooped on in return.

Consider the troll, who snuggles at the breast of social media. Or look at the paradox of Tor, the anonymising network developed with the honourable aims of bypassing censorship and protecting sensitive communications, such as those of journalists and activists working in hostile environments. Yet Tor has sparked off a mini crime wave on the Dark Web, the part of the medium inaccessible to search engines and casual surfers, where drugs, guns, assassination contracts and child pornography enjoyed a flourishing market until a recent crackdown.

So, at 25, the Web does not really know who it is, god or demon, force of good or platform for evil. But it does know what it is not, though we may have forgotten: it is not the internet. That is a hardware network made up of servers, fibre optic cables, microwave links and comsats talking to each other in TCP/IP, a protocol designed by Robert E. Kahn and Vinton Cerf. The Web, which consists of interlinked data, rides on the internet. It is made up of electrons and talks to our browsers in HTML and HTTP, the mark-up language and transfer protocol written by Berners-Lee, who conceived the system as a permanent, incremental archive. As the Web grew, and as services like search engines and wikis were launched to index and organise data, it became the sum of non-proprietary human knowledge.
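For readers curious about that layering, here is a minimal sketch, not from the article and with an illustrative hostname, of how a Web request in HTTP rides on a TCP/IP connection across the internet:

# A minimal sketch of the layering described above: TCP/IP carries the bytes,
# and the Web's own protocol, HTTP, rides on top of that connection.
# The hostname and path are illustrative only.
import socket

host = "example.com"          # any public web server would do
request = (
    "GET / HTTP/1.1\r\n"      # HTTP: the Web's transfer protocol (Berners-Lee)
    f"Host: {host}\r\n"
    "Connection: close\r\n"
    "\r\n"
)

# TCP/IP: the internet layer designed by Kahn and Cerf carries the raw bytes.
with socket.create_connection((host, 80)) as conn:
    conn.sendall(request.encode("ascii"))
    response = b""
    while chunk := conn.recv(4096):
        response += chunk

# The reply begins with HTTP headers, followed by the HTML the browser renders.
print(response.decode("utf-8", errors="replace")[:300])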

But the Web is about disruptive paradox, so the next wave of enterprise will be fuelled by unstructured data, which Big Data methodologies trawl in search of patterns that are imperceptible to human intelligence but can be discerned by powerful computers processing data in bulk. And size matters to Big Data. It wants more, and then some. Today, almost all accessible data is human-generated or human-mediated, which implies that the rate of data generation is limited by the size of the online population. The most valuable source of unstructured data is social media, and Indian political parties are trying to gain an edge in the coming election by applying data analytics to Facebook and Twitter. In the near future, however, the volume of data being generated will go into overdrive as the Internet of Things takes off. This is the network of connected objects and devices which report their position and status to each other in real time. Their conversations, unimpeded by human agency, will generate data on a scale that will produce opportunities and solutions as yet unanticipated.
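As a toy illustration of what such trawling means in its very simplest form, and emphatically not any party's actual analytics, here is a sketch with invented posts that counts which hashtags recur across a handful of social-media messages:

# A toy example (invented data) of finding a simple pattern in unstructured text:
# which hashtags recur most often across a set of posts.
import re
from collections import Counter

posts = [                      # hypothetical social-media posts
    "Great rally today #Election2014 #Vote",
    "Long queues at the booth #Vote",
    "Manifesto released #Election2014",
]

hashtags = Counter(
    tag.lower()
    for post in posts
    for tag in re.findall(r"#\w+", post)
)

print(hashtags.most_common(2))   # [('#election2014', 2), ('#vote', 2)]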

Berners-Lee wrote his proposal for the Web in 1989 and produced a Web server and a client. The focus was on hyperlinking, and the importance of images was downplayed. But when CERN opened the Web to free public access in 1993, the people’s preference was clear: Mosaic, the first graphical browser, could show pretty pictures inline. Two years later, Web design was a lucrative profession. Then the popularity of the iPod forced music and video to move to the internet, and soon all media are likely to be distributed digitally.

The Web has evolved in ways that its maker could not possibly anticipate. Berners-Lee marked the 25th anniversary of his brainchild with a call for a crowdsourced Magna Carta for the Web, whose credibility and neutrality, he fears, are being compromised by governments, business interests, spin doctors and opponents of free speech. On a lighter note, in a 2009 interview, when asked if there was anything he would do differently if he could make the Web anew, he admitted that the double slash in the address bar (http://) was wholly unnecessary. Browsers would work just fine without the slashes, and in a quarter of a century, millions of keyboards have spat out billions of useless slashes into the ether. But given the sweeping change that Berners-Lee has wrought in how we live, work, think and communicate, perhaps we can overlook this tiny, really tiny enormity.

pratik.kanjilal@expressindia.com
