Today, most of us depend heavily on the Internet for everything from work and managing our finances to correspondence and our social lives. The applications of the web are both endless and vital to modern life.
While many of us have been online since the 1990s, the history of computing stretches back much further – and after all, without computing, there can be no web. In fact, the first primitive computing devices were conceived as long ago as the 17th century, with the earliest concepts for programmable computers emerging in the mid-19th century.
Here’s a brief history of computing and of the web, and some of the scientific minds who contributed to the digital culture we know today.
The 17th century: logarithms and the slide rule appear
The history of computing begins in earnest in the early 1600s with the introduction of what would prove to be the foundations of modern computer programming. In 1614, John Napier proposed a new mathematical method, called the logarithm, which reduced laborious multiplication and division to the simpler operations of addition and subtraction.
Mathematicians and computer programmers still use logarithms to simplify complex calculations, and they appear throughout software – from the analysis of algorithms to logarithmic-scale graphs that compare statistical data.
Napier’s work on the logarithm first appeared in Mirifici Logarithmorum Canonis Descriptio, which became an influential text in the fields of mathematics and engineering, as well as physics and navigation.
Based on Napier’s studies, the slide rule was first developed by Edmund Gunter. Gunter’s Rule could be thought of as an early analog computer that used the principles of logarithms to multiply and divide. Reverend William Oughtred further expanded on Gunter’s design, combining two of Gunter’s Rules to create what is now commonly regarded as the first recognizable slide rule.
Oughtred’s slide rule designs were published by his student, William Forster, in 1632. From there, many other mathematicians and engineers developed and expanded upon Oughtred’s designs, creating slide rules capable of calculating trigonometric functions, roots, and exponents. The slide rule made computation much faster.
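The principle a slide rule exploits is the logarithm identity log(ab) = log a + log b: adding lengths on logarithmic scales multiplies numbers, and subtracting them divides. A minimal Python sketch of that idea (the function names are ours, for illustration):

```python
import math

def slide_rule_multiply(a, b):
    """Multiply the way a slide rule does: add the logarithms,
    then take the antilog."""
    return 10 ** (math.log10(a) + math.log10(b))

def slide_rule_divide(a, b):
    """Divide by subtracting logarithms."""
    return 10 ** (math.log10(a) - math.log10(b))

# log10(3) + log10(4) = log10(12): adding scale lengths multiplies.
print(round(slide_rule_multiply(3, 4), 6))   # 12.0
print(round(slide_rule_divide(12, 4), 6))    # 3.0
```

On a physical rule, the addition happens mechanically by sliding one logarithmic scale along another – no arithmetic required of the operator.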
The 19th century: Lovelace, Babbage, and the analytical engine pioneer computer programming
The history of computing can’t be told without also considering the analytical engine. While the 17th Century gave us the tools to perform rapid calculations, it wasn’t until the 19th Century that the idea of a programmable computer began to emerge.
In 1822, Charles Babbage completed a small prototype “Difference Engine” – a hand-cranked machine that could rapidly compute mathematical tables. Though there was some interest in his invention, contemporary metalworking techniques could not adequately produce the necessary parts for the engine, and the project was ultimately abandoned.
Undeterred, Babbage joined forces with mathematician Ada Lovelace for his next endeavor – the analytical engine. The proposed structure of the analytical engine became the precursor for computers in the electronic age: an integrated memory, control flow, and an arithmetic unit.
Lovelace is often credited with writing the first computer program, thanks to her algorithm for the computation of Bernoulli numbers using the analytical engine. Sadly, much like the difference engine, the device was never completed, and Lovelace’s program was not tested within her lifetime.
Despite this, both Babbage and Lovelace are remembered today as pioneers of computing.
1943 – 1969: Electronic-age computers and the first networks
The mid-20th century brought a host of milestones in the history of computing and the web. In 1943, Tommy Flowers unveiled “Colossus” – the world’s first programmable electronic digital computer. Colossus used a series of vacuum tubes to perform counting operations, and it was used by the Allies in the Second World War to help decrypt intercepted messages from the German High Command.
In 1949, EDSAC (Electronic Delay Storage Automatic Calculator) performed its first calculation. Today, this device is considered one of the first stored-program computers. In 1952, it ran the world’s first graphical computer game, “OXO”, developed by Sandy Douglas as part of his PhD research at the University of Cambridge.
1949 also saw the birth of the first modems. These early modems were developed to transmit radar data for air defense, modulating digital data into audible tones and demodulating received tones back into data. By 1958, modems adapted for use with computers were connected to commercial telephone lines.
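The core trick of these early modems – turning each bit into a tone, a scheme known as frequency-shift keying – can be sketched in a few lines of Python. The frequencies below are borrowed from the later Bell 103 standard (1,270 Hz for a 1, 1,070 Hz for a 0), and the whole thing is simplified for illustration:

```python
import math

MARK_HZ, SPACE_HZ = 1270, 1070   # tone for bit 1, tone for bit 0
BAUD = 300                       # bits per second
SAMPLE_RATE = 8000               # audio samples per second

def modulate(bits):
    """Turn a bit string into audio samples: each bit becomes a
    short burst of the mark or space tone."""
    samples = []
    samples_per_bit = SAMPLE_RATE // BAUD
    for i, bit in enumerate(bits):
        freq = MARK_HZ if bit == "1" else SPACE_HZ
        for n in range(samples_per_bit):
            t = (i * samples_per_bit + n) / SAMPLE_RATE
            samples.append(math.sin(2 * math.pi * freq * t))
    return samples

wave = modulate("1011")
print(len(wave))  # 4 bits * (8000 // 300) samples each = 104
```

Demodulation runs the process in reverse: the receiver measures which frequency dominates each bit-length slice of audio and emits a 1 or a 0 accordingly.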
The first descriptions of a computer network resembling the World Wide Web of today appeared in J.C.R. Licklider’s “Galactic Network” concept, which he advanced in 1962. Licklider was a professor at MIT and the first director of the Information Processing Techniques Office (IPTO) at the Pentagon’s Advanced Research Projects Agency (ARPA).
Licklider used the term to refer to a networking system he “imagined as an electronic commons open to all” in a 1963 memo he sent to his colleagues, addressed to the “Members and Affiliates of the Intergalactic Computer Network”.
This concept is considered by many to be the first articulation of the internet as we know it.
The 1970s: The first email and the birth of Ethernet change digital communication
One standout moment in web history is the first successful electronic mail. In March 1972, Ray Tomlinson developed the first email software – a simple send-and-read program to aid communication between ARPANET developers.
By July, Tomlinson had expanded the scope of the program’s abilities, adding options to file, forward, and respond to messages. These were the humble beginnings of what would come to be one of the most popular communication methods of the late 20th and early 21st centuries.
In 1973, researchers at Xerox’s Palo Alto Research Center (PARC) developed Ethernet, a technology that would accelerate the development of the web. Ethernet was a system for connecting a number of computers together to form a local area network (LAN).
Early examples of Ethernet worked at a mere 2.94 Mbps. By 1979 a standard Ethernet rate of 10 Mbps was agreed upon by Xerox, Intel, and the Digital Equipment Corporation. Though by today’s standards this might not seem like much, this standardization represented a landmark in digital communication.
The decision to standardize the speed of transmission represented the move towards creating an accessible, commercial Internet that would eventually be made available to the public.
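To see what the jump from 2.94 Mbps to 10 Mbps meant in practice, a quick back-of-the-envelope calculation helps (the 5 MB file size is chosen arbitrarily for illustration):

```python
# Time to transfer a 5 MB file at the experimental vs. standardized
# Ethernet rates (using decimal megabytes: 1 MB = 10**6 bytes).
FILE_BITS = 5 * 8 * 10**6   # 5 megabytes expressed in bits

for label, mbps in [("experimental (2.94 Mbps)", 2.94),
                    ("standard (10 Mbps)", 10.0)]:
    seconds = FILE_BITS / (mbps * 10**6)
    print(f"{label}: {seconds:.1f} s")
```

Roughly 13.6 seconds versus 4 seconds for the same file – more than a threefold speedup, before even considering the benefits of every vendor agreeing on one rate.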
The 1980s: The first personal computers and the World Wide Web lead the way
The 1980s mark a point in the web’s history when the Internet began to move out of research facilities and into people’s homes. Although personal computers were available as kits as early as the late 1970s, it wasn’t until the 1980s that personal computers resembling the ones we use today hit the market.
It’s estimated that by 1980 there were around one million personal computers in the US – machines such as the Commodore VIC-20.
1982 sees the emergence of Transmission Control Protocol (TCP) and Internet Protocol (IP), commonly known as TCP/IP, as the standard protocol suite for ARPANET; the network formally switched over to it on January 1, 1983. These protocols are what allow data to be sent from one computer to another by creating and maintaining connections between hosts.
When data is sent over a TCP connection, the protocol divides it into packets or segments. Each packet includes a header that defines the source and destination and a data section. The protocol also ensures that the packets arrive in the correct sequence on the receiving end.
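The segmentation-and-reordering idea can be illustrated with a toy Python sketch – not real TCP, just the principle of tagging each chunk of data with a sequence number so the receiver can restore the original order:

```python
import random

def segment(data: bytes, mss: int = 4):
    """Split a byte stream into segments, each tagged with a
    sequence number (a stand-in for a real TCP header)."""
    return [(seq, data[i:i + mss])
            for seq, i in enumerate(range(0, len(data), mss))]

def reassemble(segments):
    """Receiver side: sort by sequence number, then join payloads."""
    return b"".join(payload for _, payload in sorted(segments))

packets = segment(b"HELLO, ARPANET!")
random.shuffle(packets)           # packets may arrive out of order
print(reassemble(packets))        # b'HELLO, ARPANET!'
```

Real TCP headers also carry ports, checksums, and acknowledgement numbers, which is how the protocol detects loss and triggers retransmission – but sequence-numbered reassembly is the heart of it.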
TCP/IP, to this day, remains the standard protocol suite for the Internet. In 1983, the Domain Name System (DNS) established the .edu, .gov, .com, .mil, .org, .net, and .int system for naming hosts on the network.
This is a massive improvement on the previous designations for hosts, which consisted purely of numeric addresses (e.g. 192.0.2.10). The term “cyberspace” first comes to prominence in the public eye in 1984, thanks to William Gibson’s novel “Neuromancer”.
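At its heart, DNS is a distributed lookup from memorable names to numeric addresses. A toy, non-distributed version of that mapping captures the idea (the names and addresses below are illustrative, drawn from documentation-reserved ranges):

```python
# A toy hosts table standing in for the distributed DNS hierarchy.
HOSTS = {
    "example.edu": "192.0.2.10",
    "example.com": "192.0.2.20",
}

def resolve(name: str) -> str:
    """Look up the numeric address for a human-readable name."""
    try:
        return HOSTS[name]
    except KeyError:
        raise LookupError(f"no address record for {name!r}")

print(resolve("example.edu"))   # 192.0.2.10
```

Before DNS, something very like this – a single, manually maintained hosts file – really was how names were resolved; DNS replaced it with a delegated, hierarchical system that could scale with the network.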
In 1985, the first version of Microsoft Windows becomes available to the public – a release that would change the way people interacted with computers. The same year, the first registered domain, Symbolics.com, is created for Symbolics Computer Corp.
In 1989, while working at CERN, Sir Tim Berners-Lee proposed what would become the World Wide Web. Over the following two years, he developed its fundamental technologies – HTML, URIs, and HTTP – along with the first web browser (itself named WorldWideWeb) and the first web server.
Thanks to Berners-Lee’s decision to make his code available royalty-free in perpetuity, his work provided the basis for what would come to be known as Web 1.0 – the first wave of the Internet as we know it.
Suddenly, an entirely new industry emerged: internet service providers (ISPs) began offering access to the Internet.
ISPs provide, in effect, a gateway to everything available on the internet. The first of them, widely thought to be “The World”, emerged in the late 1980s and generally provided access to the net for a set monthly fee. These appeared first in Australia and the United States.
However, one of the first commercial “internet service providers” is widely considered to have been Telenet, which actually appeared in the mid-1970s. It sold commercial access to ARPANET, rather than the web as we know it today.
Internet service providers initially provided connection to the Internet via dial-up modems, often using public telephone networks. As the industry was relatively new, barriers to entry were low, and many new internet service providers would appear over the years — but few would survive.
The rise of cable television also helped increase connection speeds. Since cable providers already had wired connections to their customers’ properties, they could offer much higher speeds than dial-up alternatives.
The once highly-competitive ISP market would soon be crowded out by a few dominant internet service providers. Some of these would become de facto monopolies or duopolies in the ISP market.
1990 – 2004: Web 1.0 and the rise of social media usher in a new age
At the beginning of the 1990s, the first web page was posted on the open Internet – a seminal event in web history, and one that marks the beginning of Web 1.0. Early websites during this time were static affairs that users could not interact with, and which were connected by a series of hyperlinks.
Soon after this, Internet-Protocol-based video conferencing also became possible. Advances in video compression allowed desktop and personal computers to handle this early ancestor of today’s video chat.
During the early 1990s, as the internet became more and more popular, fiber internet (fiber-optic cables) started to be laid out en masse around the world.
Today, fiber-optic cables can be found in virtually every nation on the planet and form a major part of the modern telecommunications infrastructure.
Shortly after this, in 1992, the first audio and video were distributed over the internet. The phrase “surfing the internet” was also born this year.
One of the first video chat services was called CU-SeeMe, which was developed by Tim Dorcey of Cornell University. First deployed on the Apple Macintosh in 1992, it was later ported for use on Windows-based PCs. This early video chat service was eventually introduced to the public in 1993 as part of a National Science Foundation (NSF) funded education project called “Global Schoolhouse”.
1994 sees the birth of Yahoo!, started by Jerry Yang and David Filo, two electrical engineering graduate students at Stanford University. The site was originally called “Jerry and David’s Guide to the World Wide Web.”
Yahoo! would be incorporated in March 1995 and go public the following year. CompuServe, America Online, and Prodigy also begin to provide Internet access in 1995. Amazon.com, Craigslist, and eBay are also born.
Match.com, the first-ever online dating website, also launches in 1995. This would add a completely new way for people to meet new partners – for good or for ill.
In 1996, the Internet Archive first came into existence. This US-based, non-profit, digital library has, since its inception, aimed to provide free public access to digitized materials, including websites, software applications/games, music, movies/videos, moving images, and millions of books.
“We began in 1996 by archiving the Internet itself, a medium that was just beginning to grow in use. Like newspapers, the content published on the web was ephemeral – but unlike newspapers, no one was saving it. Today we have 20+ years of web history accessible through the Wayback Machine and we work with 625+ library and other partners through our Archive-It program to identify important web pages.” – Internet Archive
The Internet Archive currently offers access to tens of millions of books, millions of videos and movies and audio files, hundreds of thousands of software programs, and hundreds of billions of web pages via its “Wayback Machine”. Members of the public can upload and download material at leisure, but the bulk of the Internet Archive’s material is collected automatically by its web crawlers.
One of the biggest moments in the history of the World Wide Web occurs in 1998, with the birth of Google. This event would profoundly change the shape of the web.
Netflix is founded in 1997 by Reed Hastings and Marc Randolph, initially offering DVDs by post. The company would later grow to become one of the most powerful entertainment companies on the internet.
1998 is the year that IPv6 (Internet Protocol version 6) is introduced, allowing for the future growth of internet addresses.
In 1999, AOL buys out Netscape, and peer-to-peer file sharing arrives with the birth of Napster. This event would rock the music industry, as it allowed fans to share music freely, bypassing royalty payments to artists and labels. Napster was eventually forced to close its doors in 2001, under a ruling from a U.S. Federal judge, due to copyright infringement issues.
Though the term “Web 2.0” was first used by Darcy DiNucci in 1999, it is commonly held that this second wave of the Internet did not truly begin until 2004. In her article “Fragmented Future”, DiNucci prophesied the increased interactivity of the Internet of today, as well as the development of online-enabled handheld devices.
1999 also sees the first use of the term “Internet of Things (IoT)” by Kevin Ashton, during his work at Procter & Gamble. It would take around another decade for the term to catch on.
A few years later, in 2000, came the so-called dot-com bubble burst, wiping out billions of dollars of speculative investment in internet-based companies, many of which had racked up huge stock valuations without producing any products or profits.
It would take until 2015 for the Nasdaq to return to its dot-com-era peak.
The SQL Slammer worm spreads around the world in less than 10 minutes in 2003, shocking millions of people. This year also sees the birth of MySpace, the video conferencing service Skype, and the Safari Web browser.
In February of 2004, Mark Zuckerberg launched Facebook from his Harvard dorm room. Within a month, almost half of Harvard’s undergraduate population was registered on the site.
Facebook, like many other social networks of its time, was indicative of the interactive trend in Web 2.0, which allowed users to comment, like, and tag each other in posts. While these additions might seem superficial, they would become hugely important sources of data for advertisers around the world.
Mozilla’s Firefox browser also launched this year.
This period also saw the rise of satellite internet for the first time. While significant advancements had been made in satellite-based telecommunications since the 1950s, it wasn’t until the early 2000s that the first true consumer internet-ready satellites appeared.
One of the first was launched by Eutelsat in 2003, followed by Anik F2 (the first high-throughput internet satellite) in 2004. The latter provided broadband and multimedia services to Canada and the northern United States only.
Beginning in 2011, more high-throughput internet satellites, like ViaSat’s ViaSat-1 and HughesNet’s Jupiter, helped achieve further improvements, raising downstream data rates from 1–3 Mbit/s to 12–15 Mbit/s and beyond.
Since then, more and more companies have attempted to jump on the bandwagon by delivering their own internet satellite services. Most notable among them is SpaceX, whose “Starlink” constellation of internet satellites aims to bring the power of the internet to some of the most remote places on Earth.
2005 – 2016: The rise of social media and the maturity of the IoT
YouTube launches in 2005 and Twitter follows suit the following year in 2006. The Internet marked its 40th anniversary in 2009.
That year also sees the release of the first cryptocurrency, Bitcoin. Since then, many thousands of alternatives have been launched, and blockchain technology is widely seen as the future for many different industries and applications.
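The core idea behind blockchain – each block storing a hash of its predecessor, so that any tampering with history is detectable – can be sketched in a few lines of Python (a toy illustration of the linking principle, not Bitcoin’s actual data structures):

```python
import hashlib

def make_block(index, data, prev_hash):
    """A block records its data plus the hash of the previous block,
    so altering any earlier block breaks every later link."""
    payload = f"{index}|{data}|{prev_hash}".encode()
    return {"index": index, "data": data, "prev_hash": prev_hash,
            "hash": hashlib.sha256(payload).hexdigest()}

# Build a three-block chain, each block linked to the one before it.
chain = [make_block(0, "genesis", "0" * 64)]
for i, data in enumerate(["tx: alice->bob", "tx: bob->carol"], start=1):
    chain.append(make_block(i, data, chain[-1]["hash"]))

# Tampering with block 1's data invalidates the link stored in block 2.
chain[1]["data"] = "tx: alice->mallory"
recomputed = hashlib.sha256(
    f"{chain[1]['index']}|{chain[1]['data']}|{chain[1]['prev_hash']}".encode()
).hexdigest()
print(recomputed == chain[2]["prev_hash"])  # False
```

Real blockchains add proof-of-work and distributed consensus on top, but this hash-linking is what makes the ledger tamper-evident.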
By 2010, Facebook accumulates over 400 million active users, and social media sites like Pinterest and Instagram are also launched.
By all accounts, social media sites like Twitter and Facebook played a role in the 2011 Middle East revolts commonly known as the “Arab Spring,” as well as a major role in the growth of online bullying. The following year sees a major victory for “fair use” of material on the Internet, when two proposed bills, the “Stop Online Piracy Act” (SOPA) and the “Protect Intellectual Property Act” (PIPA), are defeated in the U.S. Congress.
The video chat service Zoom is also founded in 2011.
In 2013, Edward Snowden, a former CIA employee and National Security Agency (NSA) contractor, blows the whistle on the NSA’s monitoring of the telecommunications of millions of people around the world, including U.S. citizens. The video chat service Google Hangouts also launches in 2013.
In 2016, Google unveils its Google Assistant, marking its entry into the “smart” computerized assistant marketplace – a market whose growth reflected the increasing maturity of the Internet of Things (IoT). Pokémon Go also launches this year (we’ll let you decide if this was a good thing).
The Internet of today and how it’s changing
Our online culture today marks the culmination of more than 400 years of computing history. Thanks to web technologies, we’re on the cusp of what could become a new industrial revolution. Remote work, facilitated by faster internet speeds, is quickly changing the face of the labor market.
Surgeons like Mehran Anvari can even perform operations remotely, working hundreds of miles away from their patients. However, while the Internet has become a vital tool in today’s world, it is one we take for granted at our peril.
Recent proposals, such as the repeal of net neutrality rules, threaten the openness of online access. Given the long history of the web, and the legacy of the many great minds who contributed to the technology, it would be remiss to allow the Internet’s progress to be hindered – though there is a great deal of debate as to where that progress should lead.