complex event processing (CEP)
What is complex event processing (CEP)?
Complex event processing (CEP) is the use of technology to predict high-level events likely to result from specific sets of low-level factors. By identifying and analyzing cause-and-effect relationships among events in real time, CEP allows personnel to take effective actions in response to specific opportunities or threats. For example, a team might define a complex event that aggregates several low-level risk indicators -- such as a change of address, new spending habits and new types of purchases -- to signal an elevated fraud risk.
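As a rough illustration of that fraud example, here is a minimal Python sketch. The event kinds, the 30-day window and the all-three-indicators rule are assumptions made for illustration, not taken from any particular CEP product.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Event:
    kind: str        # e.g., "address_change", "spending_shift", "new_purchase_type"
    account: str
    timestamp: datetime

# Hypothetical rule: all three low-level indicators for the same account
# within 30 days are abstracted into one higher-level "fraud_risk" event.
RISK_KINDS = {"address_change", "spending_shift", "new_purchase_type"}
WINDOW = timedelta(days=30)

def detect_fraud_risk(events: list) -> list:
    """Fold low-level risk indicators into complex 'fraud_risk' events."""
    complex_events = []
    recent = {}  # account -> low-level events still inside the window
    for ev in sorted(events, key=lambda e: e.timestamp):
        if ev.kind not in RISK_KINDS:
            continue
        window = [e for e in recent.get(ev.account, [])
                  if ev.timestamp - e.timestamp <= WINDOW]
        window.append(ev)
        recent[ev.account] = window
        if {e.kind for e in window} == RISK_KINDS:
            complex_events.append(Event("fraud_risk", ev.account, ev.timestamp))
            recent.pop(ev.account)  # fire once per full set of indicators
    return complex_events
```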
The concept and techniques of CEP were developed in the 1990s. The approach gained popularity in the heyday of service-oriented architectures and enterprise service buses, architectural styles that have since been eclipsed by cloud-based architectures. Likewise, CEP has been somewhat overshadowed by event stream processing and streaming analytics, two approaches for processing and analyzing streams of data that implement some, but not all, of the core ideas of CEP.
Although different terminology is often used today for processing and analyzing real-time events, experts believe that CEP remains relevant for improving enterprise architectures -- in particular, for its application in the discipline of business process management, which aims to improve business processes end to end.
Donncha Carroll, a partner in the revenue growth practice of Axiom Consulting Partners, said that business process management software (BPMS) tools have started to add functionality that lets users handle CEP, making it easier to connect to multiple sources of enterprise data. This CEP functionality allows business process managers and business owners to identify events or developments that trigger critical notifications or actions -- events that generally cannot be identified with enterprise software platforms serving a broader set of needs, such as CRM and ERP tools.
A good example of a CEP use case of this type is an enterprise software vendor with a large, important customer coming up for renewal. The customer is experiencing a service disruption that no one but the customer service department is aware of -- putting the renewal at risk. CEP can put both pieces of information together -- up for renewal and service disruption -- prompting the system to trigger a notification and launch a set of activities that prioritize account servicing for this customer.
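A minimal sketch of that correlation, assuming hypothetical event kinds and a print statement in place of a real notification workflow:

```python
# In-memory state keyed by customer ID; a real system would consume
# these events from CRM and service-desk streams.
renewals_due = set()
open_disruptions = set()

def on_event(kind: str, customer: str) -> None:
    """Correlate two independent event sources into one actionable alert."""
    if kind == "renewal_due":
        renewals_due.add(customer)
    elif kind == "service_disruption":
        open_disruptions.add(customer)
    elif kind == "disruption_resolved":
        open_disruptions.discard(customer)
    # The complex event exists only while both lower-level facts hold.
    if customer in renewals_due and customer in open_disruptions:
        print(f"Renewal at risk for {customer}: prioritize account servicing")

# e.g.:
# on_event("service_disruption", "acme")
# on_event("renewal_due", "acme")   # -> alert fires
```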

History of CEP
The history of CEP begins with work done by Professor David Luckham at Stanford University.
In the 1990s, Luckham was working on distributed programming languages that could run in parallel and communicate by means of events.
"It became clear that you needed to abstract low-level events into higher-level events and do that two or three times to create accurate simulations," said Luckham, now a professor emeritus at Stanford, in an interview with TechTarget.
For example, Intel came to Luckham's team when its engineers were trying to figure out why the adder on a new chip did not work correctly. Initially, they thought the simulation library was deficient, but it turned out that the analysis of the results lacked the ability to make sense of the raw data streams.
"In those days, simulation outputs were eyeballed by humans as streams of ones and zeros, and it was easy to miss something at that level," Luckham said. He invented the term complex events to characterize higher-level events correlated from a series of lower-level events.
Luckham's team outlined three core principles of CEP: synchronization of the timing of events, event hierarchies and causation.
- Synchronization relates to the observation that the timings of events often need to be calibrated owing to differences in processing time, network routes and latency.
- Event hierarchies relate to the development of models for expressing lower-level events, such as how clicks can translate into higher-level events like user journeys (see the sketch after this list). Researchers today are developing tools for creating synthetic sensors that detect behavioral events by fusing data from various sensors.
- Causation provides a framework for connecting the dots between cause-and-effect relationships buried within a stream of events.
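To make the second and third principles concrete, here is a hedged Python sketch that abstracts low-level click events into higher-level journey events while keeping causal pointers back to the clicks that produced each journey. The 30-minute session gap is an assumption for illustration.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class Click:                 # low-level event
    user: str
    page: str
    timestamp: datetime

@dataclass
class Journey:               # higher-level (complex) event
    user: str
    pages: list
    start: datetime
    end: datetime
    caused_by: list = field(default_factory=list)  # causation: links to low-level events

SESSION_GAP = timedelta(minutes=30)  # assumed gap that splits journeys

def clicks_to_journeys(clicks: list) -> list:
    """Event hierarchy: abstract clicks into journeys, recording causation."""
    journeys = []
    # Synchronization: process events in calibrated timestamp order.
    for click in sorted(clicks, key=lambda c: (c.user, c.timestamp)):
        last = journeys[-1] if journeys else None
        if (last and last.user == click.user
                and click.timestamp - last.end <= SESSION_GAP):
            last.pages.append(click.page)
            last.end = click.timestamp
            last.caused_by.append(click)
        else:
            journeys.append(Journey(click.user, [click.page],
                                    click.timestamp, click.timestamp, [click]))
    return journeys
```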
Luckham observed that most tools have focused on synchronization. However, more work is required to weave the science behind event hierarchies and causation into modern tools and fulfill the original promise of CEP.
Meanwhile, CEP's techniques for identifying, analyzing and processing data in real time have become fundamental to many business projects.
"Today, many businesses plan strategic initiatives under titles such as 'Business Analytics' and 'Optimization.' Although they may not know it, complex event processing is usually a cornerstone of such initiatives," Luckham said.
Benefits of complex event processing
CEP's ability to detect complex patterns in multiple sources of data provides many benefits, including the following:
- makes it easier to understand the relationship between events;
- helps connect individual events into more complex chains;
- simplifies the development and tuning of business logic;
- can be embedded into fraud detection, logistics and IoT applications; and
- helps build more accurate simulations, models and predictive analytics.
Use cases for complex event processing
The ways in which organizations use CEP today include the following:
- improve stock market trading algorithms;
- develop more responsive real-time marketing;
- create more adaptable and accurate predictive maintenance;
- detect fraud more effectively;
- develop more nuanced IoT applications; and
- enable more resilient and flexible supply chains and logistics.
Difference between complex event processing and event stream processing
With the growth of cloud architectures, new terms like event stream processing (ESP), real-time stream processing and streaming architectures are starting to replace the use of the term CEP in many technical discussions and product marketing.
Roy Schulte, distinguished vice president analyst at Gartner, has been covering CEP almost from its inception. Complex event processing as a mathematical concept is still valid, he said, but the term itself is not used that often. He hears it used in reference to temporal pattern detection -- finding new occurrences of a pattern in streaming data -- and to stream data integration, that is, the loading of streaming data into databases (streaming extract, transform and load).
The term's declining use is due in part to how complex events are defined. For instance, events based on arithmetic aggregation operators (e.g., counting the number of tweets about COVID in the last 10 minutes, or other count, sum and average calculations) are often not called complex events. They're relatively simple to compute, so the software need not be as sophisticated as that used for pattern detection in CEP. Simple forms of stream analytics are now available in common BI tools like Tibco Spotfire, Tableau and Microsoft Power BI.
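As a point of contrast, here is a sketch of the kind of simple aggregate described above -- a running count over a 10-minute window -- which needs only a queue, not CEP-grade pattern matching. The class name and interface are hypothetical.

```python
from collections import deque
from datetime import datetime, timedelta

class SlidingCount:
    """Running count of events in the last `window` of time -- a simple
    aggregate, not a complex event in the CEP sense."""

    def __init__(self, window: timedelta):
        self.window = window
        self.times = deque()

    def observe(self, ts: datetime) -> int:
        self.times.append(ts)
        # Drop timestamps that have fallen out of the window.
        while self.times and ts - self.times[0] > self.window:
            self.times.popleft()
        return len(self.times)

# e.g., tweets = SlidingCount(timedelta(minutes=10))
# tweets.observe(datetime.now())  # -> count of tweets seen in last 10 minutes
```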
"The technology of CEP is generally called stream analytics now, or streaming analytics, by some," he said. Streaming analytics is widely practiced because organizations need to extract near-real-time insights from the vast amount of streaming data that flows through their networks (and internet connections). This enables situational awareness.
Sophisticated forms of stream analytics, such as pattern detection, are implemented in "event stream processing platforms" like Apache Flink, IBM Streams, Kafka Streams, Microsoft Azure Stream Analytics, Spark Streaming and Tibco StreamBase. "Such ESP platforms are often embedded, unseen, within SaaS, packaged applications or other tools so the buyer may not realize that they are there," Schulte said.
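ESP platforms typically express such temporal patterns declaratively. As a rough, platform-agnostic illustration of what pattern detection computes, here is a plain-Python sketch that matches a "first A, then B within a window" sequence per key; the event tuple format and kind names are assumptions.

```python
from datetime import timedelta

def detect_sequence(events, first, then, within: timedelta):
    """Match an event of kind `first` followed by an event of kind `then`
    for the same key inside the time window. Events are (kind, key, ts) tuples."""
    pending = {}   # key -> timestamp of the unmatched `first` event
    matches = []
    for kind, key, ts in sorted(events, key=lambda e: e[2]):
        if kind == first:
            pending[key] = ts
        elif kind == then and key in pending:
            if ts - pending.pop(key) <= within:
                matches.append((key, ts))
    return matches

# e.g., detect_sequence(evts, "order_placed", "payment_failed",
#                       within=timedelta(minutes=5))
```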
The stream processing characteristic of CEP is also sometimes hand-coded into SaaS products, packaged applications or other tools, so the product implements the same math as CEP without using a CEP/ESP platform.
Even so, most of the early CEP tools are still offered -- generally under the banner of stream analytics -- and remain viable. However, newer event stream processing platforms, such as the aforementioned Microsoft Azure Stream Analytics, and open source ESP platforms, like Flink, Spark Streaming and Kafka Streams, have taken over the bulk of new applications.
Big organizations already use multiple stream analytics technologies, whether they are aware of this functionality or not. That's because these technologies are now widely embedded in SaaS applications or in packages for important activities such as transportation operations monitoring, network management or fraud detection. In other cases, they are embedded in BI tools or closed source or open source ESP platforms. "For the same reason that big organizations have a dozen or more DBMSes, they have multiple stream analytics products," Schulte said.
Most applications that employ event-driven architecture, however, don't use stream analytics or CEP. Instead, they use simple one-event-at-a-time event processing. In this sense, stream analytics and stream data integration are subsets of event-driven architecture applications.