Machine Learning Is a Pocket Monster

Google is about to trigger a revolution in consumer A.I.

by Graham Templeton

This week, Google launched Android Wear 2.0, the latest version of its smartwatch operating system, and as part of the roll-out it quietly announced something far more important in the long term: Google has finally begun to run practical machine learning applications on consumer mobile devices. It’s a breakthrough that could revolutionize app development and fundamentally change the role of the smartphone in modern life.

Right now, machine learning can do truly incredible things, if you’re Google, or Microsoft, or Amazon. These enormous tech corporations can support machine learning applications like voice commands because they can afford the cloud-based supercomputers that do the heavy computational lifting. Smartphone users don’t actually control their devices with their voice; rather, a remote server listens to that voice and runs computationally intensive programs to turn the audio into text. It’s that text transcription, transmitted back to the phone, that actually triggers the command.
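To make that division of labor concrete, here is a minimal sketch of the round trip described above; the endpoint URL and response field are hypothetical stand-ins, not a real Google API:

```python
import requests

def cloud_voice_command(audio_bytes: bytes) -> str:
    """Send raw audio to a (hypothetical) speech service; the phone
    never interprets the audio itself, only the text that comes back."""
    resp = requests.post(
        "https://speech.example.com/v1/recognize",  # hypothetical endpoint
        data=audio_bytes,
        headers={"Content-Type": "audio/wav"},
    )
    resp.raise_for_status()
    # Hypothetical response shape: {"transcript": "set a timer for five minutes"}
    transcript = resp.json()["transcript"]
    return transcript  # the phone dispatches the command from this text, not the audio
```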

Localizing that convoluted process would open up all-new uses. The best example comes installed in Android Wear 2.0, which features an on-device machine learning framework that can efficiently run any number of programs. The flagship application is Smart Reply, which integrates with Google’s Allo chat app, studies a user’s chat behavior, and tries to predict responses, letting people one-tap their way through conversations.
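As a toy illustration of the idea (not Google’s actual model), a predictor could simply remember which replies a user sends to similar incoming messages and surface the most frequent ones. The crude normalization step here is a stand-in for the learned grouping discussed further below:

```python
from collections import Counter, defaultdict

# Maps a normalized incoming-message key to a tally of replies the user sent.
reply_counts: dict[str, Counter] = defaultdict(Counter)

def group_key(message: str) -> str:
    # Crude normalization so "Hey!", "hey" and "Hey " fall into one group.
    return "".join(ch for ch in message.lower() if ch.isalnum() or ch == " ").strip()

def record_reply(incoming: str, reply: str) -> None:
    reply_counts[group_key(incoming)][reply] += 1

def suggest_replies(incoming: str, n: int = 3) -> list[str]:
    return [r for r, _ in reply_counts[group_key(incoming)].most_common(n)]

record_reply("Hey man", "Hey! What's up?")
record_reply("Hey buddy", "Hey! What's up?")
record_reply("Hey", "Hi!")
print(suggest_replies("Hey!"))  # ['Hi!']
```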

A prototype artificial neuron for use in an eventual consumer brain-inspired chip made specifically to execute machine learning algorithms. Credit: Flickr / IBM Research

Previously, Allo handled this feature in the cloud, but now Google is starting to move those processes onto users’ hardware. If the process is as efficient as Google says, the reduced energy requirements should allow much wider integration of machine learning.

Why is this important? Because basic machine learning features will be available to third-party app developers, who previously had to pay Google for access to that sort of computing. An app boom could be in the offing.

There are, of course, some drawbacks to this localized approach. The programs that sift Allo messages on Google’s servers are much more robust, doing a more complex version of the same analysis; in Android Wear 2.0, Allo’s Smart Replies are predicted by a newly efficient message-grouping algorithm that tries to see through the endless, largely meaningless variation in human interaction. “Hey,” “Hey man,” and “Hey buddy” are not functionally all that different, after all, and mostly call for the same responses. It’s that “mostly” where a local approach could fall short of cloud-based solutions.
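One established way to get that kind of grouping is locality-sensitive hashing, which maps similar inputs to similar bit strings. The sketch below is a generic illustration of the technique, assuming a crude bag-of-words embedding; it is not Google’s actual Smart Reply algorithm:

```python
import hashlib
import random

random.seed(0)
DIM = 64   # size of the hashed bag-of-words vector
BITS = 16  # length of the output bit-string signature

# Random hyperplanes: similar vectors land on the same side of most planes.
planes = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(BITS)]

def embed(message: str) -> list[float]:
    """Very crude embedding: hash each word into a fixed-size count vector."""
    vec = [0.0] * DIM
    for word in message.lower().split():
        idx = int(hashlib.md5(word.encode()).hexdigest(), 16) % DIM
        vec[idx] += 1.0
    return vec

def signature(message: str) -> str:
    """One bit per hyperplane: which side of the plane the message falls on."""
    vec = embed(message)
    return "".join(
        "1" if sum(p * v for p, v in zip(plane, vec)) > 0 else "0"
        for plane in planes
    )

# "Hey", "Hey man" and "Hey buddy" share most words, so their signatures
# tend to differ in only a few bits; an unrelated message lands far away.
for msg in ["Hey", "Hey man", "Hey buddy", "Running late, sorry!"]:
    print(msg, "->", signature(msg))
```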

Still, like most smart features these days, these predictions trend toward customization. If a user likes to respond with single emojis, that’s what the program will suggest. If a user prefers proper grammar and punctuation, the Smart Replies will reflect that, too.

An example of how Smart Reply groups similar messages with similar identifying strings of code. Credit: Google

For now, this means the previously Allo-specific functionality is also available in Google Hangouts and Google Messenger, but the new API Google has released should let that list of apps balloon very quickly.

Beyond making machine learning available to a huge additional population of third-party app developers, this could signal the beginning of a locality revolution in advanced programming. Right now, companies like Qualcomm are developing modified architectures that can run machine learning code more efficiently, and at places like IBM, researchers are even making fully brain-inspired chips designed from the ground up to execute these programs far more efficiently than is currently possible.

The sort of pure software solution Google is releasing here will allow more efficient machine learning on classical hardware, which will be good enough for a while. A wider array of companies may only begin to pour real resources into developing and releasing robust machine learning chips once software developers start to demand them.
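Quantization is one common example of such a software trick: storing a model’s 32-bit float weights as 8-bit integers cuts memory and energy costs on ordinary hardware. A minimal sketch, assuming simple symmetric linear quantization, and not describing Google’s specific implementation:

```python
import numpy as np

def quantize(weights: np.ndarray) -> tuple[np.ndarray, float]:
    """Map float32 weights to int8, returning the scale needed to decode."""
    scale = np.abs(weights).max() / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

w = np.random.randn(4, 4).astype(np.float32)
q, scale = quantize(w)
print("max error:", np.abs(w - dequantize(q, scale)).max())  # small rounding error
print("bytes:", w.nbytes, "->", q.nbytes)  # 4x smaller storage
```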

Though it notably cannot provide an offline version of Google’s flagship Assistant service, Android Wear 2.0 could nonetheless generate a storm of exactly that demand, very soon.
