
The digital decade

By Duncan McLeod

Shortly after 2000, when the dot-com bubble burst, a pall was cast over the technology industry. Internet companies ran out of funding and hit the wall; the Nasdaq crashed, and it remains valued at a fraction of its peak at the height of the dot-com investment hysteria; and a wave of consolidation swept the IT sector, both locally and abroad.

It’s worth casting an eye back over the past decade and taking a look at just how far we’ve come in technology. Many of the grand ideas dreamt up a decade ago by those dot-com mavericks are now coming to fruition.

Who would have imagined that, 10 years after the bubble burst, a social media network — Facebook — would sign up 800m users and have an implied valuation of almost US$100bn? Or that Apple, just starting to turn the corner after near bankruptcy in the late 1990s, would a decade later blow past Microsoft and Exxon Mobil to become America’s most valuable company, with $70bn of cash in the bank and with most of its profits coming from two product categories — the smartphone and the tablet computer — that didn’t exist at the time?

Ten years ago, most of us were still browsing the Web using Mozilla or Netscape Navigator. Windows XP hadn’t shipped yet, so most computer users were using either Windows 95/98 or Windows 2000 and, at least in homes, connecting to the Internet through dial-up. Mobile data cost R50/MB or more and the thought of using a cellphone network as a primary Internet connection was not a consideration.

Today, hundreds of millions of people — it will soon be in the billions — carry smart devices in their pockets that are more powerful than most computers were at the turn of the century. With the swipe of a finger and a few keystrokes, virtually all the information ever created by mankind is available from almost anywhere on the planet where there is wireless coverage. And that includes the top of Mount Everest, from where — in May this year — renowned mountaineer Kenton Cool became the first person to tweet from the summit of the world’s highest peak.

In many ways, technology in the first decade of the new millennium was driven by three megatrends: Moore’s Law, which has meant computers have continued to get exponentially more powerful while prices fall; the consumerisation of IT, in no small part due to companies like Apple; and the proliferation of cheap broadband Internet access almost everywhere, ushering in new models of computing for both consumers and companies.
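Moore’s Law is an observation, not a physical law: transistor counts (and, loosely, computing power) double roughly every two years. A minimal sketch of that arithmetic, using the commonly cited two-year doubling period as an assumption, shows why a single decade produces such a dramatic gap:

```python
def moores_law_factor(years, doubling_period=2.0):
    """Growth factor after `years`, assuming capacity doubles
    every `doubling_period` years (the commonly cited cadence)."""
    return 2 ** (years / doubling_period)

# Over the decade this column looks back on (roughly 2001 to 2011):
print(moores_law_factor(10))  # 32.0 -> about a 32-fold increase
```

Five doublings in ten years gives a 32-fold increase, which is why a 2011 smartphone could plausibly outpace a turn-of-the-century desktop.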

New industries have emerged on the back of this: Google, founded just 13 years ago, is worth almost $200bn today, its success built on selling keywords to advertisers. And Facebook, Twitter and other online social networks have transformed the Internet from a one-way medium into a bidirectional flow of information, changing the way we socialise and communicate as a species.

Broadband networks are altering the way corporate IT functions as well. Cloud computing, where computing resources are made available online — in public clouds using the Internet, in private clouds inside companies, or a hybrid of the two models — is changing the world of corporate information systems.

Cloud computing is regarded by many as the third big wave in the computer industry. The first was the mainframe era, where powerful and expensive server computers were used in conjunction with “dumb” terminals on the desktop. It was an era thoroughly dominated by IBM.

Then, in the 1980s and 1990s, Microsoft and Intel turned this model on its head, ushering in the era of client-server computing, where desktop computers and laptops became much more powerful and mainframes became the preserve of the banks and other big companies.

Now, ubiquitous and high-speed networks are leading to the third wave, where computing is delivered as a resource in a utility-type model, as a service over a network, much like electricity. End users don’t care where in the “cloud” they’re getting their services — applications, bandwidth, processing time, storage and memory. All they care about is that they’re available over the network when they need them.

The idea is that this can save companies money and streamline corporate IT systems — computing resources are centralised in data centres, they can be optimised, fewer skills are needed, and so on. The same thing is happening in the consumer space, perhaps even more so. The line between client-side devices and applications and online services is becoming much less clear. As broadband gets cheaper and more ubiquitous, people won’t think twice about accessing online resources and sharing media-rich information with the world, even while on the go.

What will the next decade hold? If one looks at the radical changes of the past 10 years, it’s almost impossible to predict what 2021 will look like. What does seem certain, though, is that we’ll look back on 2011 as that quaint time before everyone on the planet was connected at high speed to everyone else — and to all human knowledge in a global “digital nervous system”.

  • Duncan McLeod is editor of TechCentral, SA’s technology news leader
  • This column was first published in MTN Business’s customer magazine, Di@logue


© 2009 – 2020 NewsCentral Media