[dropcap]T[/dropcap]echnology companies are keen to bring artificial intelligence features to phones and augmented reality goggles — the ability to show mechanics how to fix an engine, say, or tell tourists what they are seeing and hearing in their own language.
But there’s one big challenge: how to manage the vast quantities of data that make such feats possible without making the devices too slow or draining the battery in minutes and wrecking the user experience.
Microsoft says it has the answer with a chip design for its HoloLens goggles — an extra AI processor that analyses what the user sees and hears right there on the device rather than wasting precious microseconds sending the data back to the cloud.
The new processor, a version of the company’s existing Holographic Processing Unit, is being unveiled at an event in Honolulu, Hawaii, on Monday. The chip is under development and will be included in the next version of HoloLens; the company didn’t provide a date.
This is one of the few times Microsoft is playing all roles (except manufacturing) in developing a new processor. The company says this is the first chip of its kind designed for a mobile device.
Bringing chip making in-house is increasingly in vogue as companies conclude that off-the-shelf processors aren’t capable of fully unleashing the potential of AI.
Apple is testing iPhone prototypes that include a chip designed to process AI, a person familiar with the work said in May. Google is on the second version of its own AI chips. To persuade people to buy the next generation of gadgets — phones, VR headsets, even cars — the experience will have to be lightning fast and seamless.
Every device will have AI
“The consumer is going to expect to have almost no lag and to do real-time processing,” says Jim McGregor, an analyst at Tirias Research. “For an autonomous car, you can’t afford the time to send it back to the cloud to make the decisions to avoid the crash, to avoid hitting a person. The amount of data coming out of autonomous vehicles is so tremendous you can’t send all of that to the cloud.” By 2025, he says, every device people interact with will have AI built in.
For years, the central processing units built by Intel and others have provided enough oomph and smarts to power the world’s gadgets and servers. But the rapid development of artificial intelligence has left some traditional chip makers facing real competition for the first time in over a decade.
The accelerating abilities of AI owe much to neural networks that mimic the human brain by analysing patterns and learning from them. The general-purpose chips used in PCs and servers aren’t designed to rapidly process multiple things at once, a requirement for AI software.
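The pattern-matching at the heart of a neural network is, at bottom, massively parallel arithmetic. As a minimal illustrative sketch (in Python with NumPy; the layer sizes are invented for illustration and not tied to any product mentioned here), a single network layer boils down to one large matrix multiplication whose thousands of multiply-adds are all independent of one another, which is precisely the workload that parallel AI chips accelerate and that general-purpose CPUs handle less efficiently:

```python
import numpy as np

# Illustrative sizes only -- not from any real HoloLens model.
rng = np.random.default_rng(0)

inputs = rng.standard_normal(512)          # e.g. features extracted from a camera frame
weights = rng.standard_normal((512, 256))  # learned connection strengths
bias = rng.standard_normal(256)

# Each of the 512 x 256 multiply-adds in this matrix product is
# independent, so specialised hardware can run them simultaneously.
activations = np.maximum(0, inputs @ weights + bias)  # one layer with a ReLU

print(activations.shape)  # (256,)
```

A CPU core works through such a product largely sequentially; a chip built for AI spreads the independent multiply-adds across many simple units at once, which is why the industry is turning to GPUs, FPGAs and custom silicon for this work.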
Microsoft has been working on its own chips for a few years now.
It built a motion-tracking processor for its Xbox Kinect videogame system. More recently, in an effort to take on Google and Amazon.com in cloud services, the company used customisable chips known as field programmable gate arrays to unleash its AI prowess on real-world challenges. Microsoft buys the chips from Altera, a subsidiary of Intel, and adapts them for its own purposes using software, an ability that’s unique to that type of chip.
In a show of strength last year, Microsoft used thousands of these chips at once to translate all of English Wikipedia into Spanish — three billion words across five million articles — in less than a tenth of a second.
Next Microsoft will let its cloud customers use these chips to speed up their own AI tasks — a service the company will make available sometime next year. Customers could use it to do things like recognise images from huge sets of data or use machine learning algorithms to predict customer purchasing patterns.
“We’re taking this very seriously,” says Doug Burger, a distinguished engineer in Microsoft Research, who works on the company’s chip development strategy for the cloud. “Our aspiration is to be the number one AI cloud.”
Microsoft has plenty of competition. Amazon also uses field programmable gate arrays and plans to adopt Volta, a new state-of-the-art AI chip design from Nvidia, now the leading maker of graphics processors used to train AI systems.
Meanwhile, Google has built its own AI semiconductors, called Tensor Processing Units, and is already letting customers use them. Creating chips in-house is expensive, but Microsoft says it has no choice because the technology is changing so fast it’s easy to get left behind.
Moving this expertise from the cloud down to the device in a person’s hand or on their face is a key priority for Microsoft’s AI-focused CEO Satya Nadella. In a May speech he touted the idea of using AI to track industrial equipment, telling the user things like where to find a jackhammer and how to use it, and warning of unauthorised use or a chemical spill.
The new HoloLens chip will make that and much more possible. Says Microsoft chief technology officer Kevin Scott: “We really do need custom silicon to help power some of the scenarios and applications that we are building.” — Reported by Dina Bass and Ian King, (c) 2017 Bloomberg LP