
Realizing the Holy Grail of digital

Thank you for reading the final post in this five-part series, where I bring it all together — outlining how we need an open, cloud-native edge to ultimately realize the Holy Grail of digital.

If the current trend of vertically focused platforms continues, we’ll have wildly different methodologies for connectivity, security and manageability across myriad use cases. This is not only a nightmare to deal with whether you’re in operational technology (OT) or IT, but also a major inhibitor of the true potential of IoT.

For the market to scale, we need to decouple hardware and software infrastructure from applications and domain knowledge. After all, when was the last time your ERP system secured and managed the devices that access it?

Trusted, flexible interoperability is paramount
The clouds all want you to standardize on their IoT platforms, but in most organizations, different internal and third-party providers service different use cases (e.g., production, quality, safety/compliance, logistics, building automation and energy management). As such, it simply isn’t realistic for all use cases in any given environment to use one cloud — or even the cloud at all.

Then there’s the concept of multi-tenancy. In theory, multiple discrete (and even competing) service providers could often share the same sensing infrastructure in a given physical environment, but today they typically won’t, because they don’t trust that proper safeguards are in place to prevent undesired cross-pollination of data.

Beyond realizing a simple multi-tenant use case at any given site, now try to interconnect a bunch of silos into a broader system of systems spanning a mix of private and public domains. Bottom line, we need more consistent, trusted and flexible infrastructure supporting use case-specific devices and applications.

You need an open, multi-cloud, multi-edge strategy to scale. Period.

A cloud-native edge to the rescue
As outlined in part two, key traits of cloud-native architecture include platform independence and breaking functional components down into microservices, which enables each discrete function to be deployed and updated individually without taking down an entire system. This is especially important in IoT because taking down an entire OT process to push a software update is a big no-no. Imagine the pop-up: “Please save your work. Your production line will automatically restart in 15 minutes.”
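
To make “independently deployable” a bit more concrete, here’s a minimal sketch (in Go, purely illustrative) of a single edge microservice exposing health and version endpoints so an orchestrator can drain, replace and verify just that one function during a rolling update. The service name, port and payload are assumptions for the example, not any particular product’s API.

```go
// Minimal sketch of an independently deployable edge microservice.
// The name, port and payload shape are illustrative assumptions.
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

// Version would be stamped at build time so an orchestrator can verify
// which revision of this one function is running, without touching any
// other service in the system.
var Version = "1.2.3"

func main() {
	mux := http.NewServeMux()

	// Liveness/readiness probe: lets a scheduler drain and replace
	// just this container during a rolling update.
	mux.HandleFunc("/health", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	})

	// Version endpoint: confirms the update actually landed.
	mux.HandleFunc("/version", func(w http.ResponseWriter, r *http.Request) {
		json.NewEncoder(w).Encode(map[string]string{"version": Version})
	})

	log.Println("device-ingest microservice listening on :8080")
	log.Fatal(http.ListenAndServe(":8080", mux))
}
```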

We’ve only scratched the surface on what’s possible in IoT, so it’s critical to be architecting now for flexibility in the future. On top of decoupling core infrastructure from applications, it’s necessary to extend cloud-native architectural principles to all the various edges to provide this flexibility.

Start small, scale big
Using loosely coupled microservices distributed across edges and clouds provides the elasticity to right-size deployments by use case and to enable important functions, such as load balancing, failover and redundancy, everywhere.
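
As a small illustration of failover in this model, here’s a hedged Go sketch of a client that prefers an on-site edge endpoint and quietly falls back to a cloud endpoint when the edge is unreachable. The URLs, paths and two-tier layout are placeholder assumptions, not a prescription.

```go
// Sketch of simple edge-first failover: try the local edge service,
// fall back to a cloud endpoint if it is unavailable.
// Both URLs are placeholders for illustration only.
package main

import (
	"fmt"
	"net/http"
	"time"
)

var endpoints = []string{
	"http://edge-gateway.local:8080/analytics", // preferred: on-site
	"https://cloud.example.com/analytics",      // fallback: remote
}

func query(path string) (*http.Response, error) {
	client := &http.Client{Timeout: 2 * time.Second}
	var lastErr error
	for _, base := range endpoints {
		resp, err := client.Get(base + path)
		if err == nil && resp.StatusCode == http.StatusOK {
			return resp, nil // first healthy endpoint wins
		}
		if err == nil {
			resp.Body.Close()
			err = fmt.Errorf("status %d from %s", resp.StatusCode, base)
		}
		lastErr = err
	}
	return nil, fmt.Errorf("all endpoints failed: %w", lastErr)
}

func main() {
	if resp, err := query("/latest"); err != nil {
		fmt.Println("no endpoint available:", err)
	} else {
		defer resp.Body.Close()
		fmt.Println("served by:", resp.Request.URL.Host)
	}
}
```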

Adopting this architectural approach now doesn’t preclude direct edge-to-cloud connections or force you to embrace continuous software delivery before you’re ready. Rather, it simply provides the most options for the future without having to rearchitect, which is extremely important if you want to stay competitive as the world innovates around you.

I also realize this isn’t a panacea: we’ll still need embedded software for constrained devices and for control systems that operate in hard (i.e., deterministic) real time. These just plug into the cloud-native parts.

The fact remains that many people are developing with monolithic models because they’re not thinking for the long (or even medium) term, or they’re in “Pi in the sky” mode, just trying to get started.

The power of open source
Let’s talk open source since it often goes hand in hand with cloud-native.

Many people believe that an open source model reduces their ability to protect their IP or introduces security risks. However, companies large and small are increasingly using open source code to lower overall development costs, accelerate time to market and increase security based on a global network of experts evaluating and patching potential threats.

In short, open source collaboration minimizes undifferentiated heavy lifting so you can focus on accelerating value. In this competitive world, money is made by differentiating through what I call the “ities.” Security, manageability, usability, scalability, connectivity, augmented reality — you get the drill. Not by reinventing the wheel.

The importance of (and practical reality with) standards
In part three, I talked about the inherently heterogeneous nature of the edge. To realize the true potential of IoT, we need to collaborate on standards and best practices for interoperability.

Connectivity standards efforts like OPC-UA (industrial/manufacturing) and OCF (consumer) are making great strides. In fact, the pace at which industrial players are adopting OPC-UA over time-sensitive networking (TSN) as an alternative to traditionally proprietary fieldbuses is a testament that the “drivers for dollars” lock-in era is coming to a rapid end.

Still, we need ways to help different standards interoperate because there will never be one standard to rule the world. Plus, you can’t just rip out the majority of the capital equipment out there that talks legacy protocols, so you need flexible ways to bridge it to IP networks.
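
To show what such a bridge can look like in miniature, here’s a hedged Go sketch that polls a legacy device and re-exposes its readings as JSON over IP. The register map, the readHoldingRegisters stub and the endpoint are all hypothetical; a real bridge would swap in an actual fieldbus driver (e.g., a Modbus or PROFINET library) in place of the stub.

```go
// Sketch of a protocol bridge: poll a legacy fieldbus device and
// re-expose its readings as JSON over IP. Names and values are
// invented for illustration.
package main

import (
	"encoding/json"
	"log"
	"net/http"
	"time"
)

type Reading struct {
	Device    string    `json:"device"`
	Name      string    `json:"name"`
	Value     uint16    `json:"value"`
	Timestamp time.Time `json:"timestamp"`
}

// readHoldingRegisters stands in for a real fieldbus driver call.
func readHoldingRegisters(addr uint16, count int) []uint16 {
	return []uint16{42} // placeholder data
}

func main() {
	http.HandleFunc("/readings", func(w http.ResponseWriter, r *http.Request) {
		regs := readHoldingRegisters(0x0001, 1) // poll the legacy device
		reading := Reading{
			Device:    "press-line-3",
			Name:      "spindle-temp",
			Value:     regs[0],
			Timestamp: time.Now().UTC(),
		}
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(reading)
	})
	log.Fatal(http.ListenAndServe(":8081", nil))
}
```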

Moreover, in addition to protocols, we also need to bring together an inherently heterogeneous mix of hardware, OS and programming language choices, and most importantly domain expertise. All of these things get increasingly complex the closer you get to the device edge.

You need a fast boat
Due to the maker movement, there simply isn’t enough money in the world for incumbent industry players to buy up and kill off all the innovative startups threatening their stale lock-in model. So, they must either pivot or die. The classic Innovator’s Dilemma.

Going forward, technology providers in any market will win by merit, not lock-in. Do you think the PC market would have scaled if it cost $1,000 to connect your keyboard? What if a custom protocol driver was required for every phone, credit card and website?

The new world is about floating all boats for scale through open collaboration and then making sure your boat is really good and really fast at producing meaningful differentiation.

EdgeX Foundry: Building an open IoT edge computing ecosystem
The network effect resulting from a community collaborating on tangible open source code is one of the most effective ways to accelerate interoperability between heterogeneous elements.

There are a lot of great open source efforts out there, but I want to highlight the EdgeX Foundry project in particular because it was architected from scratch to facilitate a hardware-, OS-, programming language- and protocol-agnostic ecosystem of interoperable commercial value-add at the IoT edge.

In short, despite being more about creating de-facto standard interoperability APIs than anything else, EdgeX is slated to do for IoT what Android did for mobile.
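
To give a feel for what a de-facto standard interoperability API means in practice, here’s an illustrative Go sketch that pushes a sensor event to a simple REST-style edge endpoint. The host, port and payload shape are meant to convey the idea rather than quote the exact EdgeX contract, which has evolved across releases; check the project docs for the real API.

```go
// Sketch of pushing a reading to a REST-style edge API. The URL and
// payload shape are illustrative, not the authoritative EdgeX spec.
package main

import (
	"bytes"
	"encoding/json"
	"log"
	"net/http"
)

type Event struct {
	Device   string    `json:"device"`
	Readings []Reading `json:"readings"`
}

type Reading struct {
	Name  string `json:"name"`
	Value string `json:"value"`
}

func main() {
	event := Event{
		Device:   "camera-07",
		Readings: []Reading{{Name: "people-count", Value: "12"}},
	}
	body, _ := json.Marshal(event)

	// Placeholder host/port; the point is that any device service,
	// written in any language, can speak this same HTTP+JSON contract.
	resp, err := http.Post("http://edge-host:48080/api/v1/event",
		"application/json", bytes.NewReader(body))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()
	log.Println("event accepted:", resp.Status)
}
```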

You can learn more in the project overview deck, and a blog expanding on key project tenets can be found here. It helps clarify that EdgeX isn’t just about giving your IP to open source.

Linux.com recently did a great write-up on the July “California” code release and the “Delhi” release dropping in early November. A number of announcements were made last week at IoT Solutions World Congress, including emerging EdgeX-based dev kits and new backing members such as Intel.

Open collaboration for a smarter edge
The EdgeX community is also gearing up on integration with other open source projects such as Akraino, Hyperledger, Zephyr, Kubernetes and FIWARE.

The resulting potential is huge — imagine infrastructure that’s able to programmatically prioritize bandwidth for a healthcare application over a connected cat toy because each proprietary microservice has a de-facto standard API to advertise its current state and quality of service needs. Here, ledger technology can keep everyone honest (as much as I’d like to prioritize the cats).
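
That QoS-advertisement idea isn’t something any of these projects define today as far as I know, but a purely hypothetical sketch helps show how small the contract could be. Every name, path and field below is invented for illustration.

```go
// Hypothetical sketch of a microservice advertising its state and
// quality-of-service needs over a common API, so shared infrastructure
// could prioritize, say, a patient monitor over a cat toy.
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

type QoSAdvert struct {
	Service      string `json:"service"`
	Criticality  string `json:"criticality"`    // e.g., "life-safety", "best-effort"
	MinBandwidth int    `json:"min_kbps"`       // minimum sustained bandwidth
	MaxLatencyMs int    `json:"max_latency_ms"` // worst acceptable latency
}

func main() {
	advert := QoSAdvert{
		Service:      "patient-monitor",
		Criticality:  "life-safety",
		MinBandwidth: 512,
		MaxLatencyMs: 50,
	}

	// Infrastructure (or a ledger keeping everyone honest) could poll
	// this endpoint to decide how to allocate bandwidth.
	http.HandleFunc("/qos", func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(advert)
	})
	log.Fatal(http.ListenAndServe(":8082", nil))
}
```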

These open source projects are also bridging to other key IoT/edge efforts, including emerging EdgeX-enabled testbed activity at the Industrial Internet Consortium.

Just say no to IoT gateway drugs!
There’s a reason it’s not a good idea to use the email alias from your internet provider — you’ll hesitate to change ISPs after that initial promotional rate expires. Using an agnostic alias like @gmail keeps your options open for later. [Side observation: If you still use @aol, part two must have been especially nostalgic].

The clouds are making great investments in edge, but they’re also purposely making it all too easy to get hooked with their dev kits because they want to lock you in early and then rake in the dough through API access charges as your data needs grow over time.

I call these lock-in dev kits “IoT gateway drugs.” Don’t get me wrong, the clouds are offering great services and I recommend them, but only when used with truly open edge SDKs that minimize your lock-in potential.

EdgeX dev kits: A better path to get started
In comparison, emerging EdgeX-based dev kits and associated plug-in value-add will give developers confidence that they can prototype with their choice of ingredients, taking advantage of components from the growing EdgeX ecosystem to supplement their own innovations.

And, most importantly, developers can readily swap out elements as they optimize their system and ramp into production and day-to-day operation.

Realizing the Holy Grail of digital
But wait, there’s more! I talked in part four about how trust is everything when it comes to realizing the true potential of IoT. And beyond IoT, it’s ultimately about what I deem to be the “Holy Grail of digital” — selling data, resources (e.g., compute, networking, energy) and services (e.g., domain-specific consulting) to people you don’t even know.

Over time, by combining silicon-based root of trust, universally trusted device provisioning (check out last week’s announcement of Intel and Arm collaborating toward this), appropriate connectivity and regulatory standards (e.g., privacy, ethical AI), open, de-facto standard APIs established by projects like EdgeX and Akraino, and ledger technologies, we’ll build the intrinsic, pervasive trust needed for the Holy Grail.

With differentiated commercial value-add backed by this open, trusted plumbing, anyone will be able to create data in the physical world, send it out into the ether and then, based on their terms, sit back and collect checks from complete strangers. Or in the more altruistic sense, simply share trusted data.

In hundreds of conversations with very smart people, nobody has really questioned that this is the Grail or even attempted to claim that we can possibly realize it through a bunch of siloed platforms trying to lock customers in, thinking they can then sell their data if allowed. It simply isn’t possible to build the necessary trust at scale.

It’s midnight, do you know where your data has been?
All too often I hear from data science experts that it’s someone else’s problem to get them clean data. Really? Data is only worth something if you can trust it. So even if you don’t buy all this Grail talk now, you should still care about transparent, open collaboration if your data’s really going to be the new oil.

[Side note: People like me call themselves CTO, so there’s plausible deniability if something doesn’t actually happen … but in this case the Grail will happen in due time].

A lesson from Mr. Mom
When Michael Keaton’s character tries to drop his kids off at school for the first time in the classic ’80s movie Mr. Mom, a lady comes up to him and says, “Hi Jack, I’m Annette. You’re doing it wrong.” This classic scene plays in my head when I think about how most IoT technology providers are doing things today.

A common natural inclination is to swim directly into a rip current to try to save yourself, but you soon get tired and drown. Anyone who has seen Baywatch knows that you’re supposed to swim sideways.

Similarly, many IoT technology providers are swimming into the current today because of the herd mentality to lock customers in, instead of really breaking down the problem.

Meanwhile, for the past three years the team at Dell Technologies has been collaborating with a bunch of other great companies to swim sideways and build an open approach so we can realize the true potential of this market.

We welcome anyone to join the collaboration in the open community. In the immortal words of Jack, “220, 221, whatever it takes!”

Three rules for IoT and edge
The principles outlined in this series are summarized in my “three rules for IoT and edge”.

First, it’s important to decouple infrastructure from applications. EdgeX, combined with other open frameworks and platform-independent commercial infrastructure value-add (like Pulse IoT Center from VMware for managing IoT edge devices in droves), is key here.

Second, it’s critical to decouple the edge from the cloud via open, cloud-native principles, as close as possible to the point of data creation in the physical world. This enables you to control your data destiny through any permutation of on-premises or cloud data integration, compared to pumping your data into a cloud and then having no choice but to pay API egress charges to subsequently fan it out anywhere else.
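
As a hedged sketch of what “controlling your data destiny” can look like at the edge, here’s a small Go example that fans the same reading out to whichever destinations you configure, on-premises and across clouds, so no single provider sits between you and your data. The URLs and payload are placeholders.

```go
// Sketch of edge-side fan-out: the same reading goes to every
// configured destination instead of being trapped in one cloud.
package main

import (
	"bytes"
	"log"
	"net/http"
)

var destinations = []string{
	"http://historian.factory.local/ingest", // on-premises
	"https://cloud-a.example.com/ingest",    // public cloud A
	"https://cloud-b.example.com/ingest",    // public cloud B
}

func fanOut(payload []byte) {
	for _, url := range destinations {
		resp, err := http.Post(url, "application/json", bytes.NewReader(payload))
		if err != nil {
			log.Printf("send to %s failed: %v", url, err)
			continue
		}
		resp.Body.Close()
		log.Printf("sent to %s: %s", url, resp.Status)
	}
}

func main() {
	fanOut([]byte(`{"device":"pump-12","vibration_mm_s":4.1}`))
}
```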

Finally, it’s important to decouple industry-specific domain knowledge from underlying technology. Many IoT platform providers tout the ability to do predictive maintenance, but their developers don’t have the necessary years of hands-on experience and historical data on the failure patterns of any particular type of machine.

Brass tacks, we need to be able to dynamically marry the right OT experts together with consistent, scalable IT infrastructure and the right technology bells and whistles in a secure, trusted fashion.

Closing thoughts
Think about how you create and capture value in the long term. I guarantee there’s a bigger opportunity you can capitalize on. There’s only so far you can cut costs, but the sky’s the limit when it comes to making new money.

Act today to be able to deliver customer-valued outcomes through rapid, software-defined innovation, everywhere. This includes embracing open source to minimize undifferentiated heavy lifting and riding the wave of the network effect as part of a broader open ecosystem. Interoperability builds a bigger stage for a better show.

Remember and respect the importance of people, including fostering collaboration across OT, IT and the line of business.

Plan now for an increasing amount of edge computing. The deepest of deep learning will always happen in the cloud, but we absolutely need edge compute to scale.

Follow my three rules for IoT and edge as much as possible.

Just say no to IoT gateway drugs.

Above all, think about how your decisions today will impact your ability to realize the “Holy Grail of digital.” Scale and Grail, this is what it’s all about!

If all of this seems a bit overwhelming, no worries — it’s actually advisable to start small. Just remember that starting small with an open approach is the only path towards the Holy Grail.

I hope that you’ve found this blog series helpful and would love to hear your views and comments. Thanks for reading! Share and stay tuned for more on the OT/IT dynamic, AI, blockchain and beyond! And be sure to follow me on Twitter @defshepherd.

