
On-device machine learning offers security, reduced latency

Where should IT process machine learning data? Mobile devices can now run machine learning workloads, but it's important to evaluate all the options -- including the cloud.

As on-device machine learning becomes an option for organizations, more use cases are emerging for mobile devices, but IT should know where and how to run the technology.

Machine learning and AI have a variety of use cases in the enterprise. For example, image recognition that can identify specific components within an image can be useful in security systems, as well as in process automation and subject identification.

Similarly, IT pros and employees can train voice recognition systems to recognize specific words or phrases and then take action. Employees can use gestures to control various systems and provide a hands-free approach to interactions, which is especially important when touchscreens are inaccessible due to protective gear.


Where to run machine learning software

Many AI instances are compute-intensive programs running on massive computers in the cloud, but on-device machine learning capabilities can now process significant amounts of data directly on mobile devices. Recently, Qualcomm and Arm added AI-specific circuits to make on-device machine learning more feasible.

[Chart: Understanding machine learning components]

Chips embedded in mobile devices are now powerful enough to handle machine learning, particularly on higher-end devices. Workloads that process very large amounts of data must still run on major cloud platforms, such as Google Cloud, AWS and Microsoft Azure, but there are significant advantages to on-device machine learning.

There are some issues with processing machine learning data for mobile devices in the cloud. Running everything in the cloud requires the mobile device to transmit and receive vast amounts of data, which can be burdensome and costly even over relatively fast networks. It also introduces latency as the data makes its way to the cloud, gets processed and returns.

Finally, privacy and security are much easier to protect when data never leaves the device and is processed locally. In an age of increasing privacy and security regulations, this is a major advantage for on-device machine learning.

How to choose machine learning tech for mobile

When developing a mobile strategy for machine learning, it's important to select a tool that IT can use transparently across all platforms -- mobile, local servers and cloud. For example, IT can configure TensorFlow -- a popular open source machine learning software library -- to run on large hardware in the cloud or in enterprise data centers, as well as on smaller systems, such as Android devices, via TensorFlow Lite, a lighter version of the framework.
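As a rough illustration, the following sketch shows how a model built with full TensorFlow might be converted into the TensorFlow Lite format for on-device use. The model architecture and the model.tflite file name are hypothetical placeholders, not part of any specific product or deployment.

```python
# Minimal sketch: convert a Keras model trained with full TensorFlow
# into the lighter TensorFlow Lite format for on-device inference.
import tensorflow as tf

# A small stand-in model; in practice this would be trained on
# server-class hardware in the cloud or an enterprise data center.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Convert the model for mobile use.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # optional quantization
tflite_model = converter.convert()

# Ship this file with the mobile app or download it at runtime.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```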

The ability to write code and process it in the most appropriate location enables IT to take advantage of both cloud-based services and onboard processing power. Similar capabilities in other deep learning frameworks, such as Caffe, and vendor-specific capabilities, such as Azure Machine Learning, Salesforce Einstein, Amazon SageMaker and IBM Watson, may be attractive if an organization already has a relationship with those providers.
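To show the on-device side of that split, here is a minimal sketch of loading the converted file and running inference with the TensorFlow Lite interpreter in Python. On an actual Android or iOS device, the same .tflite file would be loaded through the platform's TensorFlow Lite runtime; the file name and the dummy input are assumptions for illustration only.

```python
# Minimal sketch: run on-device-style inference with the TensorFlow
# Lite interpreter, using the model.tflite file produced above.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# A dummy image standing in for a real camera frame.
frame = np.random.random_sample(
    tuple(input_details[0]["shape"])).astype(np.float32)

interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()
predictions = interpreter.get_tensor(output_details[0]["index"])
print("Predicted class:", int(np.argmax(predictions)))
```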

It's still an emerging area, but many companies are now starting to evaluate and deploy on-device machine learning. Because it's often used in conjunction with the cloud, it's also important to choose a cloud provider that can host the machine learning data and connect directly to the mobile device. Most major cloud providers offer AI services that span device types, from cloud back ends to mobile endpoints.
