
IoT requirements in a 5G world

With the emergence of 5G, big data is going to experience a seismic shift, promising data rates 100 times that of 4G, network latency of under one millisecond, support for one million devices per square kilometer and 99.999% availability for the next-generation network. The exponential growth in data velocity and volume under 5G will increase the complexity and demands of operational analytics, disrupting the way organizations ingest, store and analyze data.

5G is also poised to advance IoT by providing faster device and sensor connectivity with higher data capacities. By 2021, IoT endpoints will reach an installed base of 20.4 billion units, according to Gartner. When paired with 5G networks, these endpoints will produce unprecedented amounts of data.

The true value in 5G-generated data lies in making it actionable, which is only possible when the data is analyzed in real time to make more intelligent decisions. To take advantage of this influx of data, both private and public organizations will have to redesign their data stacks to process information closer to the edge to cut down on latency.

So, what are the top considerations for organizations looking to capitalize on IoT data in a 5G world? Let’s look at the requirements:

Event stream processing

To make sense of the vast amount of data resulting from more than 20.4 billion IoT endpoints, real-time complex event stream processing (ESP) needs to go beyond simply moving and aggregating data to track key performance indicators (KPIs). The data needs to drive cognitive decisions, combining insights from predictive and prescriptive analytics with contextual correlation. These decisions need to happen rapidly, with ultra-low latency and closer to the edge of the IT network, to facilitate machine-to-machine (M2M) communications.

Having a contextual state is crucial to making meaningful business decisions based on data generated by connected systems. However, legacy ESP frameworks and some contemporary streaming technologies, such as Apache Kafka, KSQL and Kafka Streams, either offer a static state, used primarily for enrichment, or a state that is isolated to an individual stream, limiting processing to very basic data models.
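To make the idea of contextual state concrete, here is a minimal sketch of a stateful stream processor in plain Python. The device names, window size and threshold are illustrative assumptions, not taken from any particular ESP framework; the point is that each incoming event is evaluated against accumulated per-key context rather than in isolation.

```python
from collections import defaultdict, deque

WINDOW = 5          # readings retained per device (illustrative)
THRESHOLD = 80.0    # KPI level that triggers an action (illustrative)

class StatefulProcessor:
    def __init__(self):
        # Mutable per-key state -- the "contextual state" that
        # static, enrichment-only frameworks lack.
        self.state = defaultdict(lambda: deque(maxlen=WINDOW))

    def process(self, event):
        device, value = event["device"], event["value"]
        window = self.state[device]
        window.append(value)
        kpi = sum(window) / len(window)   # rolling KPI over the window
        # The decision combines the new event with accumulated context.
        return {"device": device, "kpi": kpi, "alert": kpi > THRESHOLD}

processor = StatefulProcessor()
events = [
    {"device": "sensor-1", "value": 70.0},
    {"device": "sensor-1", "value": 95.0},
    {"device": "sensor-2", "value": 60.0},
]
results = [processor.process(e) for e in events]
```

A production system would keep this state in a fault-tolerant store and shard it by key, but the shape of the logic, state keyed by entity and updated on every event, is the same.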

With the proliferation of 5G, most modern businesses are going to require cognitive decisioning driving robotic process automation that relies on complex data models and complex orchestration to truly differentiate themselves from the competition. These modern applications depend on low-latency decisions, achieved by reducing the layers of technology used to perform high-impact, real-time business functions. This requires a swift, unified in-memory data processing platform that provides accurate answers and decisions.

Modern ESP frameworks will also need to offer the responsiveness that IoT and other mission-critical applications demand. Often in an M2M communication scenario, someone or something is waiting for a decision and a cue to act on it. Without the ability to tap into the intelligence provided by data as close to the real-time event as possible, this data is destined to enter the caverns of dark data.

Lastly, to maintain the veracity of the decisions in relation to the data and information that drove it, the data platform will need to provide the traditional guarantees that a database provides. This includes atomicity, consistency, isolation and durability transactions that are required for most applications in the IoT, financial services and telecom industries. The needs go beyond simply storage level guarantees to include ingestion and application of rules and insights, ultimately driving decisions.
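The traceability point above can be sketched with a transaction that records an ingested event and the decision derived from it atomically. This is a hypothetical illustration using Python's built-in sqlite3 module; the table names, columns and decision rule are assumptions for the example, not a prescribed schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE events (id INTEGER PRIMARY KEY, device TEXT, value REAL)"
)
conn.execute(
    "CREATE TABLE decisions (event_id INTEGER REFERENCES events(id), action TEXT)"
)

def ingest_and_decide(conn, device, value):
    # The context manager wraps both writes in one ACID transaction:
    # either the event and its decision are both durable, or neither is,
    # so every decision can be traced back to the exact data that drove it.
    with conn:
        cur = conn.execute(
            "INSERT INTO events (device, value) VALUES (?, ?)", (device, value)
        )
        action = "throttle" if value > 80.0 else "ok"   # illustrative rule
        conn.execute(
            "INSERT INTO decisions (event_id, action) VALUES (?, ?)",
            (cur.lastrowid, action),
        )
    return action

action = ingest_and_decide(conn, "sensor-1", 95.0)
```

The same guarantee is what the article is asking of an IoT data platform: ingestion, rule application and the resulting decision committed together, not merely the storage layer.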

A new layer of infrastructure

The low latency requirements of IoT devices and applications can only be met with a new layer of infrastructure, such as edge data centers or micro data centers that are close to the end-user or devices they serve.

To capitalize on 5G’s influx of data, all industries will require scalable IoT data processing at the edge to process and analyze the data at a speed that retains its value and makes it actionable. In specific use cases, non-vital data can be offloaded to cloud data centers. But when actionable decisions are needed, near-edge computing will provide organizations with the best chance to respond to events in real time.

Ultimately, IoT data processing at the edge requires the ability to conduct stateful, high-performance stream processing at scale on data in motion to deliver accurate insights. This approach combines data storage and stream processing to streamline the data stack to keep pace with the barrage of 5G IoT data.

Machine learning

Legacy database technology has traditionally been focused on analyzing historical data to gain a rear-view understanding of business performance. While this is important to the success of a business, in order to gain a competitive advantage, it’s critical to use machine learning to drive intelligent decisions on streaming data as the event is being processed.

During what is being referred to as the fourth industrial revolution, historical data about what has worked in the past won’t help organizations that refuse to embrace IoT. The key element of machine learning here is a predictive analytics model that trains on historical data while also ingesting a high volume of constantly streaming data to operationalize it, all in real time.
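The train-on-historical, score-on-streaming loop described above can be sketched in a few lines. For simplicity this hypothetical example uses a running mean and variance (Welford's algorithm) as the "model" and flags events more than three standard deviations from what the data has shown; a real deployment would substitute a genuine predictive model, but the pattern of fitting offline and continuing to learn from the live stream is the same.

```python
import math

class StreamingAnomalyDetector:
    def __init__(self, historical):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
        for x in historical:          # offline training on historical data
            self._update(x)

    def _update(self, x):
        # Welford's online update of running mean and variance.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def score(self, x):
        """Score the live event, then fold it into the model."""
        std = math.sqrt(self.m2 / self.n) if self.n > 1 else 0.0
        is_anomaly = std > 0 and abs(x - self.mean) > 3 * std
        self._update(x)               # keep learning from the stream
        return is_anomaly

# Fit on historical readings, then operationalize on the live stream.
detector = StreamingAnomalyDetector([10.0, 11.0, 9.0, 10.5, 9.5])
stream_flags = [detector.score(x) for x in [10.2, 50.0, 10.1]]
```

Here only the out-of-range reading (50.0) is flagged; the model updates on every event, so its notion of "normal" tracks the stream in real time rather than remaining frozen at training time.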

For example, as cities begin to install more IoT endpoints and become more connected to the way citizens flow throughout numerous areas, IoT-driven insights may lead to more pedestrian-friendly designs or ones that improve the flow of traffic for vehicles. From an emergency services perspective, IoT endpoints could provide fire trucks, police cars and ambulances with optimal routing that produces quicker reaction times and saves lives.

It could also allow emergency services to stream a real-time view of an emergency to assess situations and enable more informed decision-making. The use cases within cities are endless and will continue to grow as more endpoints provide a better look at the flow of people, cars and businesses.

In a 5G-connected IoT world, removing latency is key to making better informed business decisions that drive the success or failure of an operation. Those organizations and cities that don’t embrace the IoT and its billions of endpoints will fail to gain actionable insights and lag behind, but those that do will develop applications and improve people’s everyday lives in ways that were unfathomable with 4G.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.
