A post, “Technology content of current CEP products?”, reminds me of why I rarely, if ever, agree with anything that comes out of Aleri’s marketing team. To be fair to Jeff, it is not only Aleri but others who continually misdefine business process management (BPM) as CEP.

Jeff uses “Smart Order Routing” as an example of taking an event and routing the resulting market order match based on some simple rules. Routing an order kicked off by a simple order match against a deep liquidity pool (or other market factor) does not constitute complex event processing, nor detecting a complex event – the core idea behind CEP. Order routing based on simple rules is BPM, plain and simple.

Let’s take another example: fraud. In this example, there is some complex neural network monitoring for credit card fraud and a potential fraud is detected – this is CEP: detecting a complex event based on some sophisticated analytics.

After a possible fraud has been detected, a process looks into a database and then routes the incident to someone in the company who is (1) a specialist in credit card fraud, (2) working at the time of the discovered threat, and (3) immediately available to act on this type of task. Routing the incident is not CEP; it is BPM.

Jeff makes the argument that it is OK to call an event-driven BPM task CEP because “it fits the EPTS definition” in the CEP glossary.   He also avoids the discussion of detection accuracy, and instead insists that latency is a “very important” factor in a CEP application.

If you read the various posts by vendors in the blog-o-sphere, it is obvious that they are continually defining CEP as BAM, BPM, BRE, BRMS, SOA, and just about every other related processing activity that is complementary to the event correlation and analysis required to detect an opportunity or threat to your business.

I’m not picking on Aleri.  TIBCO has been doing the same thing recently in their CEP blog, continually attempting to redefine CEP as BRMS.    Detecting business opportunities and threats with high confidence requires sophisticated analytics, and their tools have not yet evolved to “real CEP” capabilities.  Instead, vendors are attempting to redefine BPM, BRMS, BRE, and even SOA to some degree, as CEP.



  1. Correction to para 7: TIBCO is not redefining CEP as BRMS, but pointing out the relationship between CEP and BRMS – e.g. business users maintain event correlations via a BRMS – and thence the value of BRMS to CEP. No more, no less.

  2. I was discussing this same issue with a friend recently. It appears some CEP vendors are attempting to use CEP for BPM whether it’s appropriate or not.

    On the fraud detection note, I thought it would be funny to share a personal experience. Recently Capital One blocked my credit card because it thought some charges were fraudulent. I called and spoke with a representative, who was kind enough to go over the charges. One of the charges was by my wife, but I didn’t know that and said it was fraud. Later that day my wife told me it was valid, so I called Capital One back. Over the last 5 years, Capital One has blocked my card 3 times. All of them were valid charges.

    This points out the limitations of simple rule-based fraud detection. People’s purchasing habits change over time, so a static model isn’t practical. Good fraud detection has to use a mix of business rules and machine learning; otherwise the system could miss real fraud and produce too many false positives.
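    As a toy illustration of that mix, here is a minimal sketch (Python; the class name, thresholds, and window size are all invented for illustration, not taken from any real fraud product) of hard business rules layered over a per-card model that keeps adapting as purchasing habits drift:

```python
from collections import deque
from statistics import mean, stdev

class FraudScorer:
    """Hybrid detector: fixed business rules plus a per-card adaptive model.

    The 'model' is just a rolling window of recent charge amounts, so what
    counts as 'normal' drifts with the cardholder's habits instead of being
    frozen in a static rule set.
    """

    def __init__(self, window=50, z_threshold=3.0):
        self.history = deque(maxlen=window)  # recent charge amounts
        self.z_threshold = z_threshold

    def is_suspicious(self, amount):
        flagged = False
        if amount > 10_000:
            # Rule layer: a hard constraint that never changes.
            flagged = True
        elif len(self.history) >= 10:
            # Adaptive layer: flag amounts far outside this card's
            # own recent behavior (simple z-score test).
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and (amount - mu) / sigma > self.z_threshold:
                flagged = True
        self.history.append(amount)  # the model keeps learning either way
        return flagged
```

    Because the window slides, a cardholder whose spending gradually grows will not keep tripping the adaptive layer the way a static threshold would.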

  3. Hi Peter,

    Yes, it is almost comical – what we are seeing with self-described CEP vendors calling basic BPM concepts “CEP”.

    Dear Paul,

    Reading the TIBCO CEP blog, it does give the impression that TIBCO is positioning CEP as some real-time newfangled BRE/BRMS.

    Yours faithfully,


  4. Tim,

    While you’ve made your position abundantly clear over the past few months, we obviously disagree on what does and does not constitute CEP. I’d like to thank you at least for acknowledging that Aleri is not alone in our position.

    You are entirely correct that Aleri CEP does not do what you want in a CEP implementation – and I would hazard to assert that probably none of the other commercial CEP (or as you often say “self proclaimed CEP”) implementations do either. We have never positioned Aleri as a tool for doing situation detection using inference techniques and the like. We have been successfully addressing a different group of problems for our customers, some of which involve situation detection by watching for event patterns, and others that involve continuous analysis of event data to support decision making (either by applications or humans). In all cases they involve the real-time analysis of events in the context of other events. Is this CEP or not? Clearly it depends on the definition of CEP.

    You recently posted your own glossary, which is different from the “official” glossary published by the Event Processing Technical Society. If your definition had been adopted, then I don’t think we would be labeling our product as CEP. But it wasn’t. The definition adopted by EPTS is broader than yours and clearly encompasses what our technology does – and was designed to do. As a founding member of EPTS, we support the EPTS glossary and are using the terms defined by it appropriately. I’m sorry if you disagree with the definition adopted by the community (EPTS representing the community). Given that you were very active in the early days of EPTS as a steering group member, I have to admit that I’m surprised that you have taken such a strong opposing stance as of late.

    Finally, some clarification on a couple of the other points:

    – I suggested that it’s important to recognize latency as an aspect of power/performance. Some CEP applications are more latency sensitive than others – some measure it in milliseconds, others in seconds (or even minutes?). But I would argue that all are latency sensitive to some degree; otherwise overnight processing would be sufficient and real-time processing would not be required.

    – I did not “avoid” the discussion of accuracy. I was suggesting additions to Dr. Luckham’s list, which already included “correctness” under item 1)ii.

    All the best,


  5. Thanks Tim – I will take the advice and do a clarification post (probably after Gartner/EPTS …)

    Hey Peter – what makes you think that your credit card provider isn’t using a neural net based product? 🙂


  6. Hi James,

    Excellent post on eBizq. Thanks for stopping by The CEP Blog. I agree with your post, and especially like your conclusion, “event correlation and analysis is what makes something CEP and CEP is intensely complementary to decision management. Business rules are good for both (and indeed for much more) and should be part of how you address both.”

    Hi Paul,

    Cheers. It would be good to see TIBCO talk more about their Spotfire and Insightful analytics in the context of complex event processing. When you consistently focus on “why CEP is a better BRMS/BRE” it appears as if you are positioning BE as just a faster, better, real-time rules engine. There is nothing wrong with this positioning if that is where TIBCO is going, but it will be challenging to get TIBCO’s BE into the leaders quadrant of the corresponding Gartner MQ. (Then again, go for it, if that is the TIBCO market strategy!)

    Dear Jeff,

    My position on CEP/EP has been consistent. For example, in our TIBCO keynote on event-decision processing (at the first event processing symposium in 1Q 2006), I discussed how the JDL model for MSDF had previously addressed the conceptual framework that folks in the “new CEP community” seem to continually struggle with.

    In addition, I presented the same concepts in my keynote at DEBS 2007, so there should be no surprises if you have been following my presentations and blog posts on CEP over the past two and a half years.

    What might come as a surprise to some is my disappointment with EPTS, which has proven to be a misleading technical title for a joint marketing organization. This also should come as no surprise, because I have consistently commented on this during the process of formalizing the EPTS.

    Regarding the EPTS CEP glossary, it also might be a surprise that I was also disappointed to see David Luckham and Roy Schulte sign this document as if they were the authors when they were really editors. I have a deep regard for Roy and David, so I have been reluctant to express my disappointment in their claim as glossary “authors” when they were “editors”.

    In addition, they concluded their editorial work with (paraphrasing) “so many people contributed we just can’t mention them”, which is a disappointingly weak acknowledgement of the hard work of everyone else.

    For these reasons, and others, the EPTS has proven not to be transparent and inclusive. Many members simply ignore most of the prior art in EP and push their commercial marketing agenda. In addition, much of the current EPTS glossary is misleading or technically inaccurate. The glossary does not reference any prior art, so it has little credibility.

    Now, it is being quoted (for example, by you) as if it is really some authority, but it is not authoritative; it is just an unreferenced edited work from a nontransparent process that does not chronicle the work and contributions of others. There is no documented “comment resolution” or any process that shows how they got from point A to point D.

    A final point: focusing on latency is orthogonal to CEP. Latency is a component of just about all processing – even my personal computer, my digital TV, and my cell phone! However, what makes CEP “CEP” is the ability to detect, in real time, situations which are threats and opportunities to a business, and these threats/opportunities may or may not require low-latency processing.

    For example, as Peter points out, processing a few credit card transactions looking for fraud over a few days does not require low latency per se; it requires accurate analytics. I had a similar experience with credit card fraud in Bangkok recently. The analytics did not work. Who cares about millisecond latency when they can’t get it right over three days and 11 transactions!

    Most companies that call themselves “CEP” are well intentioned and I support them all, conceptually. However, what most are selling is a very small subset of what is required to process complex events – finding the needles in the haystack of the “event cloud” or the “event fog” or “event storm” or whatever people want to call it.

    Best order routing is great stuff, but it is not really CEP; it is much more closely aligned with BPM or EAI. After all, just about every form of processing uses IF-THEN-ELSE logic, even typing on my keyboard – the event handler uses rules, but we don’t call keyboard event processing CEP either.
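    To make the contrast concrete, here is a hedged sketch (Python; the venue names and notional threshold are invented for illustration, not from any product) of exactly this kind of single-event, IF-THEN-ELSE routing. Each decision inspects one event in isolation – no correlation across events, no temporal window, no inferred situation – which is why it is BPM-style routing rather than CEP:

```python
def route_order(order):
    """Route a single filled order using plain IF-THEN-ELSE rules.

    Every branch looks only at the one event in hand; nothing here
    correlates this order with other events or with history.
    """
    if order["quantity"] * order["price"] > 1_000_000:
        return "block-desk"      # large notional: hand off for manual handling
    elif order["symbol"].endswith(".OTC"):
        return "otc-gateway"     # off-exchange instruments go to the OTC path
    else:
        return "smart-router"    # default: best-execution router
```

    A CEP engine, by contrast, would be watching for a *pattern* across many such orders over time, not dispatching each one independently.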

    So, it seems the EPTS simply defined CEP as “CEP is what our products do”….. which is not credible in my opinion.

    Yours sincerely, Tim

  7. My worldly-wise, but entirely non-technical sister gave me something to think about the other day. I was relating a similar story about how my bank stopped my credit card after incorrectly analysing a pattern of behaviour while I was visiting foreign lands. She looked puzzled and then asked me why I didn’t just ring the bank before I got on the plane and tell them I would be abroad. She apparently does this every time she travels, and has never had a card stopped.

    Progress has been made in this discussion in forging a general agreement that the basis for CEP is not rules vs. analytics, but rules + analytics. Excellent. Listening to my sister, I think maybe the real-world solution is rules + human interaction + analytics + common sense, all enabled through event processing of course 😉 We are still a long way from extinguishing human responsibility in decision making, thank goodness.

  8. Hi Charles,

    Great to see you here on the blog. Thanks for sharing,

    Of course I agree with you strongly.

    We have had “events” and “event processing” as long as I can remember.
    What is lacking is the capability to “make sense” of the events, and that means accurate and self-learning situation models, strong analytics, and a scalable agent-based event processing architecture that focuses on knowledge processing, not just low-latency data processing.

    Yours faithfully,


  9. hey charles,

    glad to see someone else gets false alarms from their bank. funny thing is, I never thought about calling the credit card company before a trip, though if I charged a flight and shuttle to the airport, their system should be able to “infer” I am on a trip. good thing building smart systems is hard work, keeps us employed.

    I like the typology of “event processing styles” defined in Seybold’s document, here:

    Although it’s 2 years old, it’s perfectly valid. I copy the definition here for discussion:

    SIMPLE EVENT PROCESSING. In simple event processing, a notable event happens, initiating
    downstream action(s). Simple event processing is commonly used to drive the real-time flow of
    work—taking lag time and cost out of a business.

    STREAM EVENT PROCESSING. In stream event processing, both ordinary and notable events happen. Ordinary events (orders, RFID transmissions) are both screened for notability and streamed to information subscribers. Stream event processing is commonly used to drive the real-time flow of information in and around the enterprise – enabling in-time decision making.

    COMPLEX EVENT PROCESSING. Complex event processing (CEP) deals with evaluating a confluence of events and then taking action. The events (notable or ordinary) may cross event types and occur over a long period of time. The event correlation may be causal, temporal, or spatial. CEP requires the employment of sophisticated event interpreters, event pattern definition and matching, and correlation techniques.

    The order routing application above could be classified as “simple event processing”. Indeed, there is no correlation between the events; the single incoming event is filtered and used to trigger downstream actions. The typology also makes the distinction between ESP and CEP, which is very good. Also very importantly, the CEP definition clearly shows that the challenges reside in the underlying algorithms (versus the infrastructure). Note that of course the 3 styles can be combined, but that is another topic.
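    The CEP style in the typology can be sketched in code. The fragment below (Python; the event types, window length, and thresholds are invented purely for illustration) flags a situation only when several ordinary events of *different types* are correlated over *time* for the same account – the confluence, not any single event, is the complex event:

```python
def detect_complex_events(events, window_seconds=300):
    """Toy CEP-style correlation across event types and time.

    Raises an alert when, within the window, an account shows a burst of
    'login_failure' events plus a 'password_reset' followed by a
    'large_transfer'. Input: (timestamp, event_type, account) tuples,
    ordered by timestamp. Returns (timestamp, account) alerts.
    """
    recent = []   # events still inside the temporal window
    alerts = []
    for ts, etype, account in events:
        recent.append((ts, etype, account))
        # Temporal correlation: discard events older than the window.
        recent = [(t, e, a) for t, e, a in recent if ts - t <= window_seconds]
        if etype == "large_transfer":
            # Cross-type correlation for the same account.
            fails = sum(1 for t, e, a in recent
                        if e == "login_failure" and a == account)
            reset = any(e == "password_reset" and a == account
                        for t, e, a in recent)
            if fails >= 3 and reset:
                alerts.append((ts, account))
    return alerts
```

    In the simple-event-processing style, each of these events would merely trigger its own downstream action; here it is the pattern across them that is notable.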

    About CEP and BRE. CEP is defined as an umbrella which encompasses all the technologies used to correlate the events in real-time or business-time environments. Among them, rule-based technologies can also be used (of course!), as well as analytics, other models, etc. But rule-based technologies can be used in many other areas outside of the CEP such as diagnosis, validation, computation. So CEP and BRE cannot be compared. They have an overlap, but no inclusion, in one direction or the other.

  11. Hi Changhai,

    Welcome back to this small corner of the blog-o-sphere. Your comments are both welcome and insightful. Thanks for posting.

    Yours sincerely,


  12. This is certainly an interesting subject. As I have seen business process management (BPM) become more ubiquitous, I have also noticed a trend where the processes being automated have greater needs. As opposed to simple “Sales Order Management” or “Employee Roll On” processes being automated today, the average customer is looking to provide more value to their automated process. As such, I see this as a great opportunity for BPM to take advantage of CEP. I definitely agree that CEP is not BPM as well as BPM not being CEP. Rather, I see that automated business processes that leverage CEP can make the BPM users more effective and efficient in providing real-time value to the end customer. More and more, as I talk to customers who are looking to get a greater return out of their business processes, I am talking to them about CEP and how it can increase ROI for them exponentially if it is implemented correctly. I am excited about the advantages that CEP has introduced to BPM to this point, and will continue to track it going forward.

    Chris Adams
    Vice President of Product Marketing and Management

I love Changhai’s comment that “although it’s 2 years old, it’s perfectly valid”.

    There is surely real consensus here. My perspective, as a techie, is also that the two worlds of CEP and BRE overlap in terms of the territory they inhabit, and specifically they have a core technical correspondence because each world relies heavily on the common idea of set-based pattern matching (I’m obviously emphasising the ‘Rete’ part of the BRE world here). The differences between the two worlds are well illustrated by the obvious fact that, although event processing has long been recognised as something BREs can be used for, very little has been done, certainly in the commercial sector, over the years to directly address some of the central concerns of EP and CEP (e.g., temporal processing). There is some academic work I am aware of, but that is all, until very recently. CEP is a shot in the arm for the BRE crowd, injecting all kinds of new and exciting possibilities and challenges in terms of how BRE technologies can be evolved and more widely applied in the future, alongside analytics, BPM and a host of other approaches. Some of us need to do a better job in the future of inhabiting the joint territory to the benefit of both communities, and our customers.

    It will be very interesting to see what emphasis is given to CEP at the October RulesFest in Dallas.

“Regarding the EPTS CEP glossary, it also might be a surprise that I was also disappointed to see David Luckham and Roy Schulte sign this document as if they were the authors when they were really editors.”

    Someone sent me email bringing this remark to my attention thru the back copies. The glossary has been thru so many professional editors for style and formatting that I don’t really keep up.
    However, if you look carefully it does not say “by” Luckham and Schulte. It should say “edited by”.

    Perhaps you would like to suggest some amendments to the acknowledgements too?

    “We are indebted to many correspondents who have made contributions, suggestions and comments over the past year. They are too numerous to mention individually, but we owe them our thanks.”

  15. Hi David,

    Thanks for stopping by and responding to Opher’s concerns, voiced privately as well.

    On your web site, the page about the glossary does not read like an edited work, it reads like work by two authors who give a passing blanket (weak in my opinion) acknowledgement to the community:




    Here is a quote, directly from your web site:

    “Event Processing Glossary
    David Luckham, Roy Schulte
    May 2008
    This glossary covers a small set of basic terms related to event processing. It will be frequently updated with additional terms in response to suggestions from the event processing community for improvements and additions.
    Our approach is to define each term independently of any particular implementation, product, or domain of application. So, […]”.

    From here, your web page on the glossary reads fairly directly as if you and Roy are the authors of the glossary, and you give a somewhat misleading acknowledgement of the contributors at the end, not mentioning anyone who made major contributions.

    As an example, our major contribution (we spent significant effort on this at TIBCO), now available publicly is not mentioned (see link below):


    So, please forgive me for voicing disappointment, but it is certainly disappointing to see a collaborative edited work published on your web site, appearing to anyone not familiar with the process to be an authored work with “so many contributions”.

    In addition, there is no revision history, comment resolution, or any of the normal processes that ensure all contributors are acknowledged.

    I received a PDF of a different version from Opher Etzion, and that version says “Editors”, but the public version on your web site does not.

    Finally, to respond to your question about your acknowledgement:

    “We are indebted to many correspondents who have made contributions, suggestions and comments over the past year. They are too numerous to mention individually, but we owe them our thanks.”

    It is my opinion that this acknowledgement is not sufficient and is misleading; you should list the contributors to the glossary, all of them – those who submitted major contributions in one section, and folks who only commented on terms in another section.

    You should also have a comment resolution document that shows why, for example, the definitions submitted by people were not accepted. This is how we did it in the USAF, because when folks take the time to contribute and comment, we owe them the rationale for what “happened” to their contributions.


    Yours sincerely, Tim

  16. We all want nice crisp delineation of markets so that we can be experts and leaders in our niche. Those same boundaries confuse adopters and retard progress. Arguing about whether CEP overlaps BPM is less important than understanding their union. CEDetection seems too narrow a perspective versus CEProcessing which adopters naturally view as overlapping BProcessM. The fact is that events are poorly integrated in the current conception of business processes and that unity between BPM and CEP is needed – not to be avoided. These markets should converge and the leaders you have attracted here are the right people to lead the charge. Of course, CED may, like BRE, have applications outside of BPM, so the niche may persist, but the niche is less significant than the need for convergence.

  17. Hello Paul,

    Great to see you here again. Thanks for stopping by!

    As a reminder, it is the accuracy and confidence in detection that is the core “value added” of CEP.

    The rest of the technology stack is quite mature in many other appropriate areas like EAI and BPM.

    The asymmetric situation in network sense-and-respond today comes from the lack of capability to detect complex events and situations in the “event storm.”

    Just like the early days of rockets and fireworks, it was easy to launch myriad rockets and fireworks; however, tracking and analyzing the “rocket cloud” was the next challenge in the evolution of “complex rocket processing”.

    Networks are in the same “flying blind” state today. We know how to “launch” zillions of low-level events, but we are in the “stone age” as to understanding what is happening in the “event storm”.

    What is lacking the most is the “sense” and “make sense” part of “sense and respond”. Without high-confidence and accurate complex event detection, response is useless – just like in network and security management today.

    Yours faithfully, Tim

Comments are closed.