
Software is devouring the edge -- are you ready?

Software is bringing complex business logic and artificial intelligence to the edge, enabling enhanced applications and raising user expectations. But what changes should IoT executives make to their hardware-oriented business to ensure they’re not left behind?

In my previous article, I discussed how the reduced latency of 5G networks will cause a lot of software to move from end user devices to the backend world of data centers and the cloud. But while the bulk of data processing for connected edge devices currently happens on the backend, the cloud won’t cut it for cutting-edge apps, because this class of applications requires near-instantaneous analysis and action. That includes emerging IoT uses such as robotics, artificial intelligence, autonomous vehicles and manufacturing, where sending data back and forth to a central server is seconds too slow.

We’re still in the early days of this adoption, but with data volume and velocity soaring, you can expect to see a rush of real-time and business-critical software on edge devices. By 2020, the average end user will generate 1.5 gigabytes of data per day, according to CB Insights. That deluge puts ever-greater demands on bandwidth and slows response times — strain that will only push edge computing further into the mainstream.

Living on the edge

Edge computing is a prudent alternative to the centralized focus of the cloud. If the weather is bad, the power is out, the data center is on fire or someone hacks into the cloud, will your entire business shut down? Will your customers’ lives be in danger? The decentralized nature of the edge minimizes these risks.

For instance, a self-driving vehicle must make instantaneous decisions that ensure the safety of its occupants and that of nearby drivers, passengers, cyclists and pedestrians. Similarly, an autonomous train capable of making real-time decisions on routing, braking, track selection and energy usage can greatly reduce or even eliminate accidents caused by human error.

From the cloud to the edge

By bringing compute and storage systems as close as possible to the application, component, or device producing the data, edge computing significantly reduces processing latency. In 2014, Cisco coined the term fog computing, which essentially provides a standard for extending cloud networking capabilities to the edge. As David Linthicum, chief cloud strategist at Deloitte Consulting, puts it, “fog is the standard, and edge is the concept.” By facilitating the interaction of compute, storage and networking between end devices and backend data center and cloud systems, fog computing enables enterprises to push compute to the edge for better and more scalable performance.

Fog computing is often used in retail stores with point-of-sale registers. One good reason: when a customer uses a credit or debit card or a smartphone to make a purchase, connectivity to the backend isn’t always available to complete the transaction. So the retailer keeps a local server tucked away in the store, with a fog network allowing all the devices, software and components to communicate with one another. Similarly, factory floors use fog networks that let automated machines share data. When one machine in an assembly line completes its step in the manufacturing process, for example, it notifies the next machine down the line to start its job, and so on.
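The pattern behind both examples is store-and-forward at the local fog layer: accept the work locally, settle with the backend when connectivity allows. Here is a minimal sketch in Python — the class and field names are hypothetical, invented purely for illustration:

```python
import queue
import time


class PosTerminal:
    """Sketch of fog-style buffering: the in-store server approves
    transactions locally and settles with the backend when it can."""

    def __init__(self):
        self.pending = queue.Queue()  # transactions awaiting settlement

    def charge(self, card_token, amount):
        # Approve against the local in-store server immediately,
        # regardless of whether the backend is reachable right now.
        self.pending.put({"card": card_token, "amount": amount, "ts": time.time()})
        return "approved"

    def settle(self, backend_reachable):
        # Drain queued transactions to the backend only when it is reachable.
        settled = []
        while backend_reachable and not self.pending.empty():
            settled.append(self.pending.get())
        return settled
```

A real deployment would persist the queue to disk and handle declines during settlement, but the shape is the same: the edge keeps the store running, and the cloud catches up later.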

Edge technologies to watch

Some people may argue that edge computing is nothing new and that consumer edge devices have been around since the dawn of the pocket calculator. Perhaps, but edge computing has come a long way since then, and today it uses sophisticated protocols and technologies tailored to its unique characteristics.

Many edge companies use Message Queuing Telemetry Transport (MQTT), a popular, lightweight communications protocol for remote locations requiring a small code footprint or when network bandwidth is in high demand. The new standard, MQTT v5.0, has significant upgrades, including better error reporting and shared subscriptions.
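MQTT’s small footprint comes from its compact binary framing. As a rough illustration — hand-rolled here for the older MQTT 3.1.1 wire format, not the newer v5.0 packet layout — a complete QoS 0 PUBLISH packet is just a one-byte fixed header, a variable-length remaining-length field, a length-prefixed topic and the raw payload:

```python
import struct


def mqtt_publish_packet(topic: str, payload: bytes) -> bytes:
    """Build a minimal MQTT 3.1.1 PUBLISH packet (QoS 0, no flags)."""
    t = topic.encode("utf-8")
    variable = struct.pack(">H", len(t)) + t + payload
    # Remaining Length uses MQTT's variable-length encoding:
    # 7 bits per byte, high bit set when more bytes follow.
    rem, n = b"", len(variable)
    while True:
        byte, n = n % 128, n // 128
        rem += bytes([byte | (0x80 if n else 0)])
        if not n:
            break
    return bytes([0x30]) + rem + variable


pkt = mqtt_publish_packet("sensors/temp", b"21.5")
# The whole sensor reading travels in a 20-byte packet.
```

An equivalent HTTP POST would typically spend more bytes on headers alone than this entire packet, which is why constrained edge links favor MQTT.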

Advanced Message Queuing Protocol (AMQP) is also popular on the edge. AMQP is an open-standard messaging protocol that allows you to create message-based applications with components built with different languages, frameworks and operating systems.

Edge-based businesses are using HTTP far less than MQTT, AMQP and other protocols. By contrast, in the cloud world it’s almost a given that you’ll be using HTTP more than 90% of the time.

Containers are big on the edge, too. When people talk about software containers, they’re usually referring to cloud-based and other backend systems. But containers also make it easier to securely distribute software to edge environments and to run containerized apps on a lightweight framework designed for easy patching and upgrading.

Bringing AI to the edge

A lot of business logic has already migrated to the edge, but AI on the edge is still in its infancy. Complex algorithms needed to run AI applications often require the powerful processing of data centers and cloud systems, making these apps less useful on edge devices. This is changing fast, however. By 2022, 80% of smartphones shipped will have built-in AI capabilities, up from 10% in 2017, Gartner forecasts.

However, other factors may limit AI’s effectiveness in the edge world, at least in the near future. Current AI-driven applications often lack sufficient data — the full context — to make life-impacting or business-critical decisions. And when an AI app makes decisions based on partial data, the results can range from unhelpful to harmful.

Here’s a simple example. Say you’re using an AI-powered assistant on your smartphone. This app can read your work calendar, and whenever a colleague sends an email requesting a meeting, the AI assistant looks at your calendar and schedules a good time to meet.

For this process to work flawlessly, however, the AI app needs full context — in this case, access to your entire daily schedule — to make intelligent decisions. So what if you’ve made plans to meet a friend at 4 p.m. on Friday — but you never put personal appointments on your work calendar? Your AI helper, working with insufficient data, may schedule a work meeting for that time because it lacks the full context of your professional and personal schedule.
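The failure mode is easy to reproduce. In this hypothetical sketch (all names and times are invented), a naive scheduler sees only the work calendar, so the first "free" slot it finds is precisely the one the unseen personal appointment occupies:

```python
from datetime import datetime, timedelta


def schedule_meeting(busy_slots, duration, day_start, day_end):
    """Return the first free slot of `duration`; `busy_slots` is the
    scheduler's entire context -- it knows nothing outside this list."""
    cursor = day_start
    for start, end in sorted(busy_slots):
        if start - cursor >= duration:
            return cursor  # gap before this busy slot is big enough
        cursor = max(cursor, end)
    return cursor if day_end - cursor >= duration else None


day_start = datetime(2019, 6, 7, 9)    # Friday, 9 a.m.
day_end = datetime(2019, 6, 7, 17)     # Friday, 5 p.m.
work_calendar = [(datetime(2019, 6, 7, 9), datetime(2019, 6, 7, 16))]

slot = schedule_meeting(work_calendar, timedelta(hours=1), day_start, day_end)
# slot is 4 p.m. Friday -- exactly when the personal appointment,
# absent from the work calendar, is supposed to happen.
```

The algorithm is correct; the context is incomplete. That distinction is what makes partial data so treacherous for edge AI.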

Monitoring the edge

Performance monitoring in an edge environment presents a new set of challenges. To gain full context of your operation, you must monitor the entire ecosystem — network, cloud, applications, security, data center, IoT devices and the customer experience. An effective monitoring solution must examine everything, not just a subset of the overall network, to produce real-time insights, trigger automated remediation and streamline operations.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.
