
Getting to advanced-class IoT

This is the fourth of a five-part blog series. Read part three here.

An inherently fragmented landscape

Adding to the general challenges of scale that I talked about in my last blog, the IoT landscape is inherently fragmented. For starters, a lot of domain expertise is required to pull together a system — deploying IoT takes a village! It takes meaningful partnerships instead of logos-on-websites exercises — in many cases between experts across the OT and IT domains.

There are literally thousands of connectivity standards in the OT world (when you count proprietary ones) and tens that matter in the IT world. And then there are data models on top of that. There's an old joke about standards: "I'll fix that problem with one more standard!" Let's face it, we all benefit from standards, and there's some very important related work happening out there, but there will never be a single standard to rule supreme in the IoT world. So, we need to help these efforts work better together.

Finally, the closer you get to the extreme edge (i.e., the physical world where people and devices live), the more preferences developers have for programming languages, operating systems and hardware (for example, specific requirements around form factor, I/O and ruggedness). Software also gets a lot more complex when you start getting into highly constrained microcontrollers or control systems running embedded applications on a real-time operating system.

Net-net — the more we can dynamically bring together different domain expertise and technology choices across all the edges while meeting the needs of key stakeholders, the better off we all are.

Platform soup is drowning customers in confusion

The promise of IoT plus all the inherent fragmentation has led to a crazy proliferation of platforms over the past few years. Back when the IoT hype really started heating up in 2014, the running joke was that there were 150 platforms. By 2016, that number had jumped to about 300, and now we're facing estimates upwards of 450. That amounts to a 50% increase from 2016 to 2018, not counting the roughly 100 platforms that were acquired or shut down along the way.

Now, to be fair, this count isn't wholly accurate because not all platforms are created equal in terms of scope, but there are simply too many nonetheless. The sprawl can be paralyzing to customers who are afraid of taking a leap of faith off a cliff, so it's a contributing factor in the relatively slow ramp of IoT adoption (behind, of course, the business case and people challenges already discussed).

But there’s hope — read my IoT predictions for 2018 to understand why I think we’re hitting the peak of the platform rollercoaster!

IoT goes vertical before horizontal

To date, the vertically focused platforms with domain expertise are getting moderate traction because they address a use case that resonates with a customer by providing clear value. These are providers that know more than you’d ever want to know about things like cold-chain retail or oil well head monitoring (aka, a “well whisperer”).

Meanwhile, the horizontal platforms are stalling due to lack of a clear use case and value proposition. In short, when you’re trying to be everything to everybody, you end up doing nothing for anyone. Try saying that three times fast.

We’re definitely seeing an emergence of edge computing focus among the platform players, but it’s still fairly early. Many providers are simply jumping on the edge computing bandwagon to ride the jargon wave without really addressing the core problems, but there are some solid efforts out there for sure.

To graduate from the AOL stage to the advanced class, we have work to do in three key areas.

How we handle perishable data

Firstly, we have to enable intelligence in the handling of streaming data. The majority of IoT data is “perishable” within seconds to minutes, meaning if you don’t create an alert or make a change based on the data in the moment, it’s not going to do you much good later. Today, many spend money to move data they collect to a data lake without giving it much thought, stockpiling it only to realize later that they never touch it. This is like a bad episode of the TV show Hoarders.

This kind of streaming analytics at the edge happens in memory for rapid response, and on the spot you apply some combination of acting on the data, storing it, forwarding it and scrapping it.
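To make that concrete, here's a minimal Python sketch of that on-the-spot decision. The temperature feed, thresholds and sampling policy are all made up for illustration:

```python
# Minimal sketch of in-memory stream handling at the edge.
# The readings, thresholds and sampling policy are hypothetical;
# the point is the on-the-spot act/store/forward/scrap decision.
from collections import deque
from statistics import mean

recent = deque(maxlen=600)  # small in-memory window, not a data lake

def handle(seq: int, temp_c: float) -> str:
    if temp_c > 85.0:                          # act now; the value is perishable
        print(f"ALERT: reading {seq} at {temp_c} C")
        return "acted"
    if recent and abs(temp_c - mean(recent)) > 5.0:
        recent.append(temp_c)                  # unusual enough to keep and send on
        return "forwarded"
    if seq % 60 == 0:                          # keep a sparse sample for trends
        recent.append(temp_c)
        return "stored"
    return "scrapped"                          # most readings are discarded immediately

# Example: a few readings from a hypothetical once-per-second sensor
for s, t in enumerate([72.1, 72.3, 91.0, 72.2]):
    print(s, handle(s, t))
```

Note that nothing leaves the device unless it's an alert, an anomaly or a sparse trend sample; everything else is scrapped while it still costs nothing to store or ship.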

How we address massive scale

Analyzing streams of data to deliver outcomes breeds the hunger for more data, which in turn breeds the need for even greater scalability. Algorithms perform better with more data, so the cycle continues.

So now we must deal with a scale problem, including how we manage and secure the billions of devices out there that have no human attending to them on a daily basis, no one around to say "now wait a minute, that's not quite right!" in the case of an attempted (or worse, successful) hack.

With user-centric systems, you typically know when you've been hacked because your friends and colleagues start to get random emails from you about how you're a prince who came across some money you need to get rid of, or random charges begin to show up on your credit card statement. In the world of IoT, we have to program that intelligence in. AI will increasingly help, but that's fun talk for another time!
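As a tiny illustration of what programming in that intelligence can look like, here's a sketch of a rolling statistical baseline on a single device metric. The metric (outbound messages per minute), window size and threshold are illustrative assumptions, not tuned values:

```python
# Sketch of automated suspicion for an unattended device: flag behavior
# that deviates sharply from the device's own recent baseline.
from collections import deque
from statistics import mean, stdev
import random

window = deque(maxlen=120)  # last ~2 hours of per-minute message counts

def suspicious(msgs_per_min: float, z_limit: float = 4.0) -> bool:
    if len(window) >= 30:                      # need a baseline before judging
        mu, sigma = mean(window), stdev(window)
        if sigma > 0 and (msgs_per_min - mu) / sigma > z_limit:
            return True                        # e.g., quarantine and phone home
    window.append(msgs_per_min)                # normal readings extend the baseline
    return False

# Simulate normal chatter, then an exfiltration-sized burst
for minute in range(200):
    suspicious(random.gauss(60, 5))
print(suspicious(600))  # True: no human needed to say "that's not quite right!"
```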

How we make it an actual internet of things

As I’ve already said in a previous blog in this series, IoT may start in OT, but it scales in IT — in terms of needing robust security and manageability tools, application orchestration and so forth. And we’re not just talking edge and cloud, but a high degree of elasticity and scalability across the continuum of edges and clouds. That brings me to the notion of a “system of systems.”

Creating a simple, focused system that addresses a business need is a great starting point (actually recommended), but the true potential of IoT — and edge computing, more broadly speaking — is the concept of a system of systems. It’s about building increasingly larger intranets that then interconnect to produce even more valuable outcomes. In the business world it’s ultimately about monetization — ideally from people you don’t even know. The network effect and sharing economy — this is the real scale factor!

Consider the end-to-end ecosystem of food production to consumption: from the energy and raw ingredients produced, to the food processing facility, to shipping, warehousing and the grocery store, and ultimately to your kitchen table. It's one thing to create point systems in each of these domains, but how do we interconnect those subsystems into a broader system and drive new outcomes?

Trust is everything

In the consumer world, it makes sense for key ecosystems to revolve around certain providers that sell products and content. These services establish a valued relationship with the end user. We've seen significant pickup in the past few years with our Amazon Echo and Google Home type devices, with voice input playing a big part in streamlining this experience. Before we could bark commands from across the room, it was easier to just get up and dim the lights than to open your phone and scroll through pages of apps to find the lighting control.

Bottom line: consumers generally place trust in a specific set of brands they get value from, and privacy often goes out the window when sufficient value is realized. Sometimes that trust is violated when personal information is leaked, but nevertheless, single large entities are often a trust nucleus. In short, the typical consumer is okay with a walled garden if there's good stuff planted in it. (P.S. Read more in my recent blog on the Dell Technologies IoT and edge computing strategy, including key opportunities and considerations for AI.)

However, in the business world it simply doesn’t work for a small set of companies to be the keepers of the trust across a broader system of systems. Same goes for B2B2C use cases such as home health, usage-based insurance and smart grid hitting any given home. We need ways to drive intrinsic and pervasive interoperability and trust at scale.

So, today we have a bunch of monolithic IoT platforms, and users have to establish trust relationships one by one, building custom integrations against each platform's APIs. But to realize the true potential of IoT and overall digital transformation, we need more flexible and open ways for applications to be integrated and data to be shared across private and public domains.

And beyond that, data is only worth something if I trust it. How do I trust that the data you want to share with or sell to me is real? How do I deal with privacy rights and regulatory requirements like GDPR?
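One building block for the "is this data real?" question is having devices sign readings at the source so any recipient can verify integrity. Here's a minimal sketch using an HMAC with a pre-shared key; real deployments more often use per-device certificates and public-key signatures, and the key and payload below are purely illustrative:

```python
# Sketch of source-signed telemetry: the buyer of the data can check
# that it hasn't been altered since the device produced it.
import hmac, hashlib, json

DEVICE_KEY = b"example-per-device-secret"  # hypothetical pre-shared key

def sign(payload: dict) -> dict:
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(DEVICE_KEY, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": tag}

def verify(msg: dict) -> bool:
    body = json.dumps(msg["payload"], sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, msg["sig"])

msg = sign({"device": "pump-07", "temp_c": 71.4, "ts": 1718000000})
print(verify(msg))                 # True
msg["payload"]["temp_c"] = 99.9    # tampering in transit...
print(verify(msg))                 # ...breaks verification: False
```

Integrity alone doesn't settle provenance, privacy or consent, but it's the floor that sharing and selling data across domains has to stand on.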

Just put some blockchain on it, right?

As mentioned in my 2018 IoT predictions blog, if you've ever seen the movie My Big Fat Greek Wedding, you'll know what I mean when I say that blockchain is the current "Windex" of technology. Don't get me wrong, blockchain (and, more broadly speaking, distributed ledger technology) is set to transform society as we know it. But it's also deep in the hype cycle and not a panacea. (Just like 5G, by the way, but I'll save that for another time.)

Distributed ledgers are highly relevant for building trust and ensuring transparency and traceability across multi-party ecosystems such as a food supply chain, medical records in a healthcare system and so forth. The power of the decentralized ledger is that it makes it impossible for any single entity in a system of systems to cook the books.
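The tamper evidence comes from hash chaining: each entry commits to the hash of the entry before it, so quietly rewriting history breaks verification for everyone holding a copy. Here's a toy Python sketch of just that chaining property; there's no consensus or distribution here, and the supply chain records are made up:

```python
# Toy hash chain: each entry's hash covers the previous entry's hash,
# so editing an old record is immediately detectable.
import hashlib, json

def entry_hash(record: dict, prev: str) -> str:
    body = json.dumps({"record": record, "prev": prev}, sort_keys=True)
    return hashlib.sha256(body.encode()).hexdigest()

def append(chain: list, record: dict) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"record": record, "prev": prev, "hash": entry_hash(record, prev)})

def valid(chain: list) -> bool:
    prev = "0" * 64
    for e in chain:
        if e["prev"] != prev or e["hash"] != entry_hash(e["record"], e["prev"]):
            return False
        prev = e["hash"]
    return True

chain = []
append(chain, {"lot": "A13", "stage": "processing", "temp_ok": True})
append(chain, {"lot": "A13", "stage": "shipping", "temp_ok": True})
print(valid(chain))                    # True
chain[0]["record"]["temp_ok"] = False  # try to cook the books...
print(valid(chain))                    # False: the tampered entry fails its hash
```

In a real distributed ledger, many independent parties hold and verify the chain, which is what keeps any single actor from rewriting it.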

However, there are also challenges to address. First, there are issues with the compute power required for public chains (just think of how much compute people are using to mine bitcoin). It is true that there are emerging variants of public blockchain-like schemes that are showing promising efficiency. Meanwhile, a permissioned chain is doable among a constrained set of participants — for example, a set of suppliers in a logistics scenario.

Transparency vs. liability

A second issue is that blockchain doesn't automatically address potential problems with business motivation and liability. Transparency within a cold supply chain is great until you're the actor in the chain who gets exposed when someone gets sick or, worse, dies from contaminated food. If you're the one at fault, you will instantly dislike blockchain very much.

For this reason, I think we'll see initial distributed ledger adoption in highly regulated areas and in altruistic use cases in smart cities and communities with a clearly aligned purpose, such as sharing resources and knowledge. Said simply, fear of exposure will steer ledger technologies toward regulated and altruistic applications in the short term.

Still, distributed ledger technology will ultimately lead to scaling the monetization of data along with innovations that enable the sharing and monetization of resources (e.g., compute, storage, networking, energy) and even domain knowledge among complete strangers. For the latter, think Angie’s List on steroids.

A long chain of choices

Another challenge is that end users face a dizzying array of choices. Do I go with the Enterprise Ethereum stack? Hyperledger? R3 Corda? IOTA?

This is where vendor-neutral industry collaboration is important. A brief shout-out to VMware in the Dell Technologies family for recently launching Project Concord to help the market take a step toward an open source, consensus-based blockchain stack that can integrate with standards, de facto standards and other open source efforts.

In closing, as powerful as ledger technology is, it doesn’t replace the need for rethinking how we foundationally architect our solution stacks in terms of infrastructure and how applications are deployed, secured and managed from the edge to the cloud.

So, what else do we need to do? For starters, we need to decouple domain expertise and applications that generate insights and outcomes from the underlying infrastructure. And in order to keep up with the pace of overall innovation and change globally, we need to extend cloud-native principles to the edge so we can rapidly software-define outcomes anywhere and anytime.

If we can figure out the right open architecture, then we can start looking at a massively scalable monetization flow across it!

More on this in my next blog. In the meantime, I’d love to hear your comments and questions.

Keep in touch. Follow me on Twitter @defshepherd and @DellEMCOEM, and join our LinkedIn OEM & IoT Solutions Showcase page here.

