Saturday, February 26, 2011

In Depth: The future of the internet revealed

Technology changes so quickly, it's hard to remember how bad we used to have it. UK internet access didn't really take off until Freeserve launched in 1998, few of us had broadband before 2001 and the UK didn't even have a 3G mobile phone network until 2003.

Google was founded in 1998, Facebook and Flickr in 2004, YouTube in 2005 and Twitter in 2006. It's impossible to imagine life without them now, and the pace shows no sign of slowing down.

The internet we'll have in 2020 will look almost nothing like the one we have in 2011, from the information we access to the devices we use to connect. Can we predict exactly what it's going to look like?

Almost certainly not, but we can see the seeds of it even now, and work out a few of the directions the industry will have to travel down to make it happen.

At the back end

If you think the internet is busy now, think again. The current internet population of 1.7 billion is expected to exceed five billion by 2020 - and we're not talking about people. Everything from televisions to old favourite the internet fridge will be hooked up.

At the moment that's impossible, simply because we need many, many more IP (internet protocol) addresses than the current IPv4 system allows. IPv4 has room for about 4.3 billion IP addresses, and according to internet co-creator and Google evangelist Vint Cerf we'll use up the last ones in the spring of 2011.

The new IPv6 standard has capacity for "340 trillion, trillion, trillion" unique IP addresses, Cerf says, "so the theory is we won't run out, at least until after I'm dead".
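The gap between those two figures falls straight out of the address lengths: IPv4 addresses are 32 bits long, IPv6 addresses are 128 bits. A few lines of Python show how Cerf's "340 trillion, trillion, trillion" arises:

```python
# Address-space arithmetic behind the IPv4/IPv6 figures.
# IPv4 addresses are 32 bits wide; IPv6 addresses are 128 bits.
ipv4_space = 2 ** 32
ipv6_space = 2 ** 128

print(f"IPv4: {ipv4_space:,} addresses")    # roughly 4.3 billion
print(f"IPv6: {ipv6_space:.2e} addresses")  # roughly 3.4 x 10^38

# Cerf's "340 trillion, trillion, trillion" is 340 * 10^36,
# which the IPv6 space comfortably exceeds:
assert ipv6_space > 340 * 10 ** 36
```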

The move to IPv6 is crucial for several reasons. In addition to freeing up lots of internet addresses, it also improves network security and makes routers' lives easier. Unfortunately, it isn't backwards compatible with IPv4, so networks running IPv6 won't be able to talk to networks running the older protocol. Desktops, smartphones, laptops and routers generally support IPv6, but many ISPs and business networks haven't switched to it yet.

To address the issue, 6UK is raising awareness of the looming crisis and urging businesses to act. "The biggest set of changes in the history of the internet [is] happening today," Cerf explains. "The change in the address space, the change in domain name languages, the introduction of digital signatures in the domain name system, the introduction of end-to-end cryptography in accessing internet-based services. This is a huge change in the net."

The arrival of cloud computing has enabled us to outsource storage and applications to distant servers, and the trend won't just continue, but accelerate: Gartner Research predicts that by 2012, cloud computing will have become so pervasive that one in five businesses won't own a single IT asset.

Moving to the cloud

Our email, images and work documents are often in the cloud already, and entertainment will follow in their footsteps.

Video on demand services are ten-a-penny online, but streaming, not downloading, seems to be the technology of the future: it's the solution used by Netflix in the US and iPlayer here, and Apple is widely expected to unveil a streaming version of iTunes soon (which would explain why it's building a billion-dollar data centre in North Carolina). Buying something online will increasingly mean buying access to it, with no direct ownership at all.

Gaming may move to the cloud too. A service called OnLive promises console quality games with minimal hardware by doing the processing in its data centres and streaming the results to a tiny 'micro-console'.

OnLive

STREAMING GAMES: OnLive promises to deliver console-quality gaming with the processing performed remotely

OnLive is a serious company - it boasts 200 employees, and its investors include Warner Bros and BT. It's available now in the US and looks set to grow quickly.

Cloud computing will be particularly important as smartphones and other mobile devices become the platforms of choice for most of our online activities. Phones don't yet have the power or storage necessary for desktop-calibre applications, so the emerging model is what Microsoft calls 'three screens'.

Three screens

As Steve Ballmer explains it, this is "an experience that spans the PC, the phone, the TV and the cloud". Rather than store your entire computing world on a desktop PC, you store it in the cloud and then access it on whatever device happens to be handy.

There are some things that a big desktop will almost certainly always do better than a smartphone, including data input, but there's no reason why the app has to be installed locally or the data confined to a single hard drive. With smartphones expected to outsell PCs by 2013 and Google's cloud-based Chrome OS on the horizon, cloud computing is going to be very important in the coming decade.

According to the Pew Internet and American Life Project, by 2020 most people can expect to "access software applications online and share and access information through the use of remote server networks, rather than depending primarily on tools and information housed on their individual, personal computers." It's all very exciting, unless you're an ISP.

Our appetite for online video is enormous and it's growing: the BBC's iPlayer delivers seven petabytes (7,000 terabytes) of video a month, while YouTube's bandwidth is estimated at 126 petabytes per month. Networking firm Cisco predicts that video will account for 90 per cent of consumer internet traffic and 64 per cent of mobile internet traffic by 2013.

Microsoft thinks online video isn't smart enough, and its solution is adaptive streaming, which it calls Smooth Streaming. Unlike traditional streaming, where your connection speed is checked once (if at all), adaptive streaming monitors your internet connection constantly.

If it becomes congested, the bitrate drops to something your connection can handle. When the congestion clears, the bitrate goes up. It works well, even on large-scale events, and you can see it in action at www.smoothhd.com.

The problem with adaptive streaming is that it still uses the old client/server model, where the server transmits data to you directly across the entire internet. BitTorrent creator Bram Cohen has an alternative idea, dubbed Project Pheon, which uses peer-to-peer networking to deliver streaming video.

Speaking at the 2010 NewTeeVee conference, Cohen promised "around five-second latency from when the content goes out to when it's actually displayed on people's machines".

Join the swarm

Pheon - like BitTorrent - uses swarming rather than traditional downloading. As you download a file, the bits you've downloaded are shared with other downloaders, so in theory you should get faster downloads by connecting to somebody near you rather than a distant server.

It's a technology that works best for popular files, and if you're a regular torrent user you'll know that new, popular torrents download like lightning while obscure ones crawl. This means swarming is best suited to big events, such as newly released films, live sports and concerts.
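That popularity effect comes from how swarms choose what to fetch next. BitTorrent-style clients typically use rarest-first piece selection: each downloader requests the piece held by the fewest peers, so copies spread evenly and the swarm stays healthy. A sketch with invented peer data:

```python
# Sketch of rarest-first piece selection, the strategy that keeps
# popular swarms healthy. Peer data here is invented for illustration.
from collections import Counter

def rarest_first(my_pieces, peer_piece_sets):
    """Return the next piece to request: one we lack that the fewest
    peers hold (ties broken by lowest piece index)."""
    counts = Counter(p for peer in peer_piece_sets for p in peer)
    wanted = [p for p in counts if p not in my_pieces]
    if not wanted:
        return None
    return min(wanted, key=lambda p: (counts[p], p))

# Three peers; piece 0 is everywhere, pieces 2 and 3 are rare:
peers = [{0, 1, 2}, {0, 1}, {0, 3}]
print(rarest_first({0}, peers))  # picks a rarest piece: 2
```

With many peers holding many pieces, every downloader is also an uploader, which is why new, popular torrents fly while obscure ones crawl.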

Of course, to actually access such high bandwidth services, we'll need fast broadband. Will we have the super-fast service the government is promising by 2017?

Trefor Davies is CTO and co-founder of business ISP Timico. "The problem facing the government is that the task is a huge one, and it would be very easy for them to decide that the only way they can realistically get to the end game is by roping in BT to help," he says, pointing out that while BT has offered to match the government's £830m funding to deliver 90 per cent super-fast broadband coverage by 2017, "coming from a company that claims to have 99 per cent broadband coverage, this makes us wonder what is meant by '90 per cent high speed broadband'."

Davies believes the only way to get fast broadband throughout the UK is to involve communities. "There are things that communities can do to make it easier and cheaper to roll out fibre networks," he explains.

"For example, companies like BT are charged anything up to £10 per metre for wayleaves to run cables across private land. [That's] a nice little earner for landowners: the average length of fibre in the Eden Valley is around 20km per community. That's a lot of wayleave charges that BT has to build into its costs." Landowners might waive those costs for community organisations, making fibre roll-out cost-effective.

Going mobile

Google goggles

SMART CAMERAS: Google Goggles' visual search is a taste of things to come

Our 3G phone networks weren't built with data in mind, and if you've ever struggled to download an email on a five-bar 3G signal, you'll know that networks are already struggling to cope. "I suspect that network operators have been caught by surprise by the increase in demand," Davies says, pointing out that operators "work on a two-year planning horizon, so if they do come across unexpected capacity problems it isn't always possible to do a quick fix."

Two developments should help: freeing up the 800MHz frequency band when analogue TV signals are switched off in 2012, and network operators upgrading to LTE (Long Term Evolution), often dubbed 4G. LTE won't be widespread for several years though, so we're stuck with 3G until at least 2015.

There is another option: HSPA+ (High Speed Packet Access Plus) - an upgrade to existing 3G networks that can, in theory, deliver 42Mbps download speeds. It isn't as solid or as fast as LTE, but several UK operators are considering it.

With demand for mobile data soaring, a crunch is coming. Network planners already report congestion issues, but these aren't spread equally: some congestion is tied to specific locations, some to particular times of day. In a survey commissioned by telecoms billing firm Amdocs, 20 per cent of operators reported 'severe overload' at times, and just 37 per cent said their networks are running fine. 40 per cent of operators said 'bandwidth hogs' were contributing to problems. Expect to pay more for mobile access, or to face more limits on what you can and can't do.

"O2 is forecasting a hundredfold increase in bandwidth requirements over the next few years… the figures don't stack up," Davies says. "Mobile operators have to raise the cash to pay for the new infrastructure and so they are looking at innovative pricing mechanisms. This hasn't arrived in the public domain yet, but we are being warmed up for it."

One such mechanism could be Dynamic Spectrum Access, a kind of electronic auction where your phone bids for bandwidth it needs. The good news is that the crunch isn't imminent, because the real bandwidth hogs are 3G dongles rather than smartphones.

"I don't think that mobile apps will drive the need for the same bandwidth as fixed line in this timeframe, although we should never say never," Davies says. "My HTC Desire HD already supports 720p HD video, but the industry needs to sort out battery life before we will see serious high speed usage from handsets."

Once that happens, mobile bandwidth is likely to become a premium product. If you want a fast, lag-free connection at peak times or in congested areas, you'll have to pay more for it.

Bye bye browser?

It seems backwards, but just as software moves online, the browser itself is becoming less important. Increasingly, data is delivered to and received from a range of apps on a myriad of devices, which all have a bit of browser built in.

You can see that happening in social networking. Much of Twitter, Facebook and Flickr's traffic comes from applications: mobile phone apps, desktop software with social media export options, stand-alone photo uploaders, desktop widgets, browser-based aggregators that combine multiple networks in a single browser window and so on.

It's a similar story with YouTube, which delivers video to the web, mobile phone browsers, televisions, and mobile phone and iPad apps, and which accepts uploads not just from the YouTube website, but from cameras, camcorders and games.

The key to these apps is the API, short for application programming interface. APIs are hooks that developers can use to get data from or put data into online services. For example, Twitter's API enables third-party applications to post to and get data from Twitter.

But the API is just part of the picture: the data also needs to be delivered in a format that means it's easy to use and easy to combine with other sources. Increasingly that means open standards such as HTML5.

HTML5 and Flash

Designing for the web used to be simple. If it was static, you built it in HTML and CSS. If it moved, you built it in Flash. Not any more. The emerging HTML5 standard does animation, video and effects too, and Apple for one believes that it's going to make Flash obsolete. Apple may be right.

HTML5 does many things that once required plugins or stand-alone software, including video, local storage and vector drawing - and unlike Flash, it's an open standard that produces search engine-friendly content.

HTML5 is far from finished and it'll be some time before it's as attractive to developers as Flash. It lacks the write-once run-anywhere appeal of Flash, because different browsers have implemented bits of HTML5 in different ways, and it lacks the excellent authoring tools that Adobe has spent years refining.

In the long term, however, we'd expect more and more content to be created in HTML5 rather than Flash, much of it using Adobe tools.

HTML5 has one particularly interesting trick up its sleeve: microdata. This enables designers to label content, and it's something Google already supports. Its Rich Snippets feature uses microdata to pull the relevant information about a website and display it in the search results. That information could be details of a restaurant's cooking style, the verdict of a review, contact details or anything else that can be expressed as text or a link.

This metadata is machine readable, and machine-readable metadata gets people like Tim Berners-Lee very excited. In 1999, he said: "I have a dream for the web [in which computers] become capable of analysing all the data on the web - the content, links, and transactions between people and computers… the day-to-day mechanisms of trade, bureaucracy and our daily lives will be handled by machines talking to machines."

Visions of the future

At Intel's Visual Computing Institute (VCI) in Saarbrücken, Germany, researchers are exploring what could be the next generation of interfaces by combining motion capture, photorealistic computer graphics and 3D navigation. Can we expect 3D interfaces in the near future?

new interface

FUTURE INTERFACES: Intel's Visual Computing Institute is experimenting with interfaces that mimic the real world

"The answer is a qualified yes," says Hans-Christian Hoppe, Co-director of Operations at the VCI, who admits that "in the late 1990s, VR arguably fell into the category of 'a solution looking for a problem', and unfortunately not a very elegant solution. One could draw a parallel with tablet devices," Hoppe continues.

"They have been around for many years, in many guises, but it was only when the user experience met expectations that the devices became successful." Hoppe says we're reaching that point with immersive interfaces.

"Hardware has indeed advanced, in particular for mobile devices. That isn't the key issue, however. Social networking has evolved from a fad to addressing people's needs… immersive virtual worlds are now able to look like a natural extension of the likes of Facebook - suddenly, all this technology might be useful."

"Pervasive network connectivity and performance are important, processing power and energy efficiency likewise, particularly on mobile devices," Hoppe says. But the technology needs to be matched with innovative thinking.

"All we have today are extensions of tried and trusted 2D interfaces," Hoppe says. "What is needed is a uniform way of interacting with a mixed 2D/3D environment that's easy to understand, convenient to use and that doesn't strain the human perception and motion senses."

We've been promised 3D internet before, but Intel's vision isn't blocky avatars but realistic, ray-traced images delivered in real time, and we're approaching the point where our hardware and networks are fast enough to deliver it.

Whether we'll want it remains to be seen, but when it comes to technology, having extra options to explore is rarely a bad thing.



Source: http://feedproxy.google.com/~r/techradar/allnews/~3/pV8v2o2EkhI/story01.htm
