Well, after a long period of working on a number of operational projects, I’m going to take a break from writing code and actually do some blogging again. For some this is good news (and I quote from a private note):

I’m thankful you’ve started blogging again. For a while there I was afraid you’d stopped blogging because of the attacks from Opher and those who side with him.

And for others this is bad news (and I quote from a passionate public comment):

Oh please Tim! After several months of quiet on this front, do we really need to start going around these loops again?

I fully expect to be either (1) relentlessly attacked or (2) completely ignored. Either is perfectly OK with me, because after a long absence I have come to realize that it is our professional and ethical responsibility to debunk the claims of all these “instant coffee” experts in complex event processing.

The “event processing vocal minority” is full of well-intentioned but inexperienced folks who have never (or rarely) worked in an operational data center where they had to deal with anything complex, and who have never written a line of code to detect an outage or a security threat before it caused damage. It is full of people who want to sell us their “can do all, save the planet” software and silence those who disagree.

Sadly, the event processing and CEP fields are dominated by academics, marketing people and paid analysts who are constantly releasing self-serving press releases claiming to the world that they are doing something “new” or “special”. Truth be told (no conflict of interest here), most of this software is semi-useless for detecting anything remotely “complex” and is simply a GUI-based IDE around some relatively simple query or rule processing code. In practice, we can do more to “detect” things with free and open Perl or PHP code. Really. I am not kidding.
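To make that point concrete, here is a toy sketch of the kind of detection a few dozen lines of free scripting can do. I talk about Perl and PHP above; this sketch happens to be in Python, and the log format, window size and threshold are illustrative assumptions only, not from any real system:

```python
import re
from collections import defaultdict, deque

# Illustrative log format and tuning -- assumptions, not a real deployment.
FAILED_LOGIN = re.compile(r"ts=(?P<ts>\d+) event=failed_login ip=(?P<ip>\S+)")
WINDOW_SECONDS = 60
THRESHOLD = 3

def detect_bruteforce(lines):
    """Flag any IP with more than THRESHOLD failed logins inside a sliding window."""
    recent = defaultdict(deque)  # ip -> timestamps of recent failures
    alerts = []
    for line in lines:
        m = FAILED_LOGIN.search(line)
        if not m:
            continue
        ts, ip = int(m.group("ts")), m.group("ip")
        window = recent[ip]
        window.append(ts)
        # Drop failures that fell out of the sliding window.
        while window and ts - window[0] > WINDOW_SECONDS:
            window.popleft()
        if len(window) > THRESHOLD:
            alerts.append((ts, ip))
            window.clear()  # reset so we alert once per burst
    return alerts

sample = [
    "ts=100 event=failed_login ip=10.0.0.9",
    "ts=110 event=failed_login ip=10.0.0.9",
    "ts=120 event=failed_login ip=10.0.0.9",
    "ts=130 event=failed_login ip=10.0.0.9",  # fourth failure within 60s
    "ts=500 event=failed_login ip=10.0.0.9",  # isolated failure, no alert
]
print(detect_bruteforce(sample))  # [(130, '10.0.0.9')]
```

No engine, no IDE, no license fee; the point is that simple windowed counting already catches this class of problem.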

It has been many years since the software marketing folks got a hold of the term “CEP”. They hijacked it from its innocent genesis in network security and complex fault management (as funded by DARPA to help make the US a safer place), and turned it into a metaphor for processing transaction streams (mostly related to financial services).

This time, I am going to “take off the gloves” because I am really disappointed in the boring attacks by “professionals” who only talk-the-talk or sell to FSI. The attacks, censorship and “false hero worship” in the event processing space are self-serving, based on greed and ego-gratification. Folks do not even reference the prior art in the field! They are “pioneers”, as if their first visit to Hawaii made them founding fathers there as well, LOL.

In the past 6 months I have fought cyberattacks from attackers in “those countries”; written, tested and implemented security event countermeasures; put together CDNs and virtual fault management services; worked in the cloud on performance management projects; modified, installed and tested PHP detection code; worked on machine translation projects; and more.

I don’t have anything to sell you. I don’t want you to buy my “CEP engine” and I don’t want you to hire me or sponsor this blog. I don’t want a penny from you. I am “dangerous” because I only want you to know the truth, from a long-time technology person who has worked in IT operations for nearly 20 years and actually needs to detect complex events and situations.


  1. Hi Colin,

    I did not “lose sponsorship” …

    My experience was that when you have sponsors, you must change the way you blog.

    In fact, Marc Adler noticed how I deleted a post that was a bit “political” last year and he criticized me for permitting corporate sponsors to dictate what I write about.

    Therefore, I have decided not to seek nor accept any sponsorship. I don’t need the money and wish to remain completely free of influence and potential conflicts of interest.

    (Edit: I had to wait until the last annual sponsorship agreement expired to announce this, and that was Feb 15th 2010, as it is not fair to talk about my decision not to want to continue having sponsors on this blog with a paid sponsor in the logo rotation.)

    In fact, I consider it a mistake that I ever permitted sponsorship here.

    Live and learn... as they say.

    Yours, Tim

  2. That’s funny – Marc just recently deleted a post where he was quoting an alleged Aleri ex-employee’s rant. Of course, no one could ever accuse Marc of using his blog to advance personal interests or goals as his unselfish efforts and valuable insight helped many of the CEP vendors get to where they are today.

    Having said that, I do think that if Citi had exercised a little more diligence regarding Marc’s blogging activities, they would have noticed a much different response from CEP vendors.

    So, I applaud your decision – without sponsorship on your blog, it should certainly be easier for you to write now that you’ve “taken the gloves off.”

    I look forward to your next series of thought provoking prose.

  3. I have always liked Marc’s blog, but I think he still needs to understand that building trading applications based on deterministic straight-through processing is not “complex event processing”. He should follow the lead of STAC and call these stream query IDEs “event processing platforms” or “event processing IDEs” or anything other than “CEP”, because to continue to parrot the marketing speak of folks who simply want to exploit David Luckham is not helping move the state of the art of detecting complex events forward.

  4. As a vendor who had to deal with Marc as a customer, I cannot say that I’ve always appreciated his blog. And I know that his blog did a disservice to Citi; but that’s all in the past now.

    I can’t go into what Citi was doing in any depth in regards to Marc’s project – but I do agree with you; it was far from anything remotely resembling complex event processing. At least in my opinion.

    I’d really like to get an example of what you do consider to be complex event processing – maybe we could put together a small tutorial and host the solution in our publicly available event processing cloud (currently under construction). Our current project, TwitYourl, is really just streaming search and aggregation. And although somewhat simple, it will serve as a benchmark for some discussions in and around Cloud Event Processing and is my current focus.

  5. Hi Colin,

    I thought I did a fairly good job of describing what “complex event processing” is, from a functional event processing perspective, in the eight-part series:

    There is not very much I would revise because the model is based on real operational systems that process complex events. I based this functional CEP reference framework on the same proven model used by the US military, perhaps the biggest event processing organization in the world (!!), to process complex events.

    The model is based on the JDL (Joint Directors of Laboratories) multi-sensor data fusion model, a mature model used to process huge volumes of sensor data every minute of the day. This model basically keeps the world safe, and is a time-proven abstraction for processing event data used by the world’s leading superpower. Why reinvent the wheel?

    In this model, stream processing takes on the important role of event pre-processing (more or less). Event pre-processing is a critical component of the JDL functional architecture, and therefore event stream processors, query-based IDEs and the like have an important role to play in many large-scale CEP applications.

    What needs to happen, in my opinion (and this is what I briefed in 2006, though it has not been done yet), is that we need to map the various CEP and EP classes of problems to this long-established DoD reference architecture and define the analytics or methods required at each stage, depending on the complexity of the problem.

    Naturally, some problems will not require all the components of the total functional reference architecture, but at least they will map to a common framework which is not vendor or marketing driven.
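    A toy sketch of the kind of mapping I mean follows. The level names follow the published JDL fusion model; the problem classes and which level each one tops out at are only my illustrative guesses here, not an official or complete mapping:

    ```python
    from enum import IntEnum

    class JDLLevel(IntEnum):
        """Levels of the JDL multi-sensor data fusion model."""
        L0_PREPROCESSING = 0         # signal/event pre-processing (stream filtering)
        L1_OBJECT_REFINEMENT = 1     # correlate events into tracked objects/entities
        L2_SITUATION_REFINEMENT = 2  # infer relationships among objects
        L3_IMPACT_ASSESSMENT = 3     # assess threats and predicted consequences
        L4_PROCESS_REFINEMENT = 4    # adapt sensing and processing resources

    # Illustrative (not official) mapping: the deepest JDL level
    # each event processing problem class requires.
    PROBLEM_CLASS_CEILING = {
        "stream filtering / aggregation": JDLLevel.L0_PREPROCESSING,
        "event correlation / dedup": JDLLevel.L1_OBJECT_REFINEMENT,
        "fault root-cause analysis": JDLLevel.L2_SITUATION_REFINEMENT,
        "security threat detection": JDLLevel.L3_IMPACT_ASSESSMENT,
    }

    def levels_required(problem_class):
        """Return every JDL level a problem class exercises, bottom up."""
        ceiling = PROBLEM_CLASS_CEILING[problem_class]
        return [lvl for lvl in JDLLevel if lvl <= ceiling]

    print([l.name for l in levels_required("event correlation / dedup")])
    # ['L0_PREPROCESSING', 'L1_OBJECT_REFINEMENT']
    ```

    Simple stream problems stop at level 0, which is exactly why a stream query IDE alone is pre-processing, not the whole framework.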

    There is nothing of form or substance to “map to” in David Luckham’s prior work because he was seemingly not aware of all the prior art in multi-sensor data fusion and the JDL functional model when he was at Stanford. A strictly “hierarchical” model for processing complex events is only a (niche) subset of an overall CEP/EP framework.

    David Luckham’s work, well intended, was written in a bit of a vacuum because he and his team did not reference any of the prior work in multi-sensor data fusion, where complex events have been processed for decades.

    I hope my explanation makes sense to you.

    Yours, Tim
