By Katherine Maher
The year 1989 is often remembered for events that challenged the Cold War world order, from the protests in Tiananmen Square to the fall of the Berlin Wall. It is less well remembered for what is considered the birth of the World Wide Web. In March of 1989, the British researcher Tim Berners-Lee shared the protocols, including HTML, URL and HTTP, that enabled the internet to become a place of communication and collaboration across the globe.
As the World Wide Web marks its 30th anniversary on Tuesday, March 12, public discourse is dominated by alarm about Big Tech, data privacy and viral disinformation. Tech executives have been called to testify before Congress, a popular campaign dissuaded Amazon from opening a second headquarters in New York, and the United Kingdom is going after social media companies it accuses of misconduct. Implicit in this tech-lash is nostalgia for a more innocent online era.
But longing for a return to the internet’s yesteryears isn’t constructive. In the early days, access to the web was expensive and exclusive, and it was not reflective or inclusive of society as a whole. What is worth revisiting is less how the early web felt or operated than what it stood for. Those first principles of creativity, connection and collaboration are worth reconsidering today as we reflect on the past and the future promise of our digitized society.
The early days of the internet were febrile with dreams about how it might transform our world, connecting the planet and democratizing access to knowledge and power. It has certainly effected great change, if not always what its founders anticipated. If a new democratic global commons didn’t quite emerge, a new demos certainly did: an internet of people who created it, shared it and reciprocated in its use.
People have always been the best part of the internet, and to that end, we have good news. New data from the Pew Research Center show that more than 5 billion people now have a mobile device, and more than half of those devices can connect to the internet. We have passed a milestone: more people are now connected to the internet than not. In low- and middle-income countries, however, a new report shows women are 23 percent less likely than men to use the mobile internet. Closing that gender gap would represent a $700 billion economic opportunity.
The web’s 30th anniversary gives us a much-needed chance to examine what is working well on the internet — and what isn’t. It is clear that people are the common denominator. Indeed, many of the internet’s current problems stem from misguided efforts to take the internet away from people, or vice versa.
Sometimes this happens for geopolitical reasons. Nearly two years ago, Turkey fully blocked Wikipedia, making it only the second country after China to do so. Reports suggest that a Russian proposal to unplug briefly from the internet to test its cyberdefenses could actually be an effort to set up a mass censorship program. And now there is news that Prime Minister Narendra Modi of India is trying to implement government controls that some warn will lead to Chinese-style censorship.
But people get taken out of the equation in more opaque ways as well. When you browse social media, the content you see is curated, not by a human editor but by an algorithm that puts you in a box. Increasingly, algorithms help decide what we read, whom we date, what we buy and, more worryingly, the services, credit or even liberties for which we’re eligible.
Too often, artificial intelligence is presented as an all-powerful solution to our problems, a scalable replacement for people. Companies are automating nearly every aspect of their social interfaces, from creating to moderating to personalizing content. At its worst, A.I. can put society on an autopilot that may not consider our dearest values.
Without humans, A.I. can wreak havoc. A glaring example was Amazon’s A.I.-driven human resources software, which was supposed to surface the best job candidates but ended up being biased against women. Trained on past résumés submitted to Amazon, most of which came from men, the program concluded that men were preferable to women.
Rather than replacing humans, A.I. is best used to support our capacity for creativity and discernment. Wikipedia uses A.I. to flag potentially problematic edits, like a prankster vandalizing a celebrity’s page, to a human who can then step in. The system can also help our volunteer editors evaluate a newly created page or suggest exceptional pages to feature. In short, A.I. that is deployed by and for humans can improve the experience of both people consuming information and those producing it.
Our collective wisdom is perhaps one of humanity’s greatest accomplishments, one built collaboratively across ages, geographies and cultures. Over the long run, knowledge, like water, proves more powerful than every vessel that seeks to contain it. This is because people, no matter where they are and no matter where they come from, possess intrinsic curiosity, creative souls and inquiring minds.
It is clear that the right to inquiry is universal, recognized by the United Nations as applying online as well as off. But today, censorship is only part of the challenge. Without people in the loop, we risk losing the web’s fundamental humanity.
If the best part of the web is indeed people, then we must keep them at the center of every policy decision and platform design. We must defend a web that is free and unfettered, and improve connections that allow creativity and collaboration. We should leave the artificial to the machines and restore humanity to the users.
Ms. Maher is the chief executive and executive director of the Wikimedia Foundation.