The blog post On Event Processing Agents reminds me of a presentation back in March 2006, when TIBCO’s ex-CEP evangelist Tim Bass (now busy working for a conservative business advisory company in Asia and, as we all know, off the blogosphere) presented his keynote, Processing Patterns for Predictive Business, at the first event processing symposium.
In that presentation, Tim introduced a functional event processing reference architecture based on the long-established art-and-science of multisensor data fusion (MSDF). He also highlighted the importance of mapping business requirements for event processing to established processing analytics and engineering patterns.
In addition, Tim introduced a new slide (shown below), “A Vocabulary of Confusion,” adapted from a figure in the Handbook of Multisensor Data Fusion, overlaying the notional overlap (and confusion) between the engineering components of MSDF and the terms CEP and ESP:
One idea behind the slide above, dubbed the “snowman” by Tim, was that there is a wealth of mature and applicable knowledge about highly functional, pre-existing event processing applications, spanning many years and multiple disciplines in the art-and-science of MSDF. Yet some emerging event processing communities, vendors, and analysts do not seem to be leveraging these core engineering disciplines, including their well-established vocabularies and event processing architectures.
On Event Processing Agents implies a “new” event processing reference architecture with terms such as: (1) simple event processing agents, for filtering and routing; (2) mediated event processing agents, for event enrichment, transformation, and validation; (3) complex event processing agents, for pattern detection; and (4) intelligent event processing agents, for prediction and decisions.
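To make the four categories concrete, here is a minimal sketch of them as stages of a toy pipeline. All class names, field names, and the sample events are hypothetical illustrations of mine, not drawn from the post:

```python
# Illustrative sketch (not from the post): the four event processing agent
# categories named above, modeled as successive stages of a simple pipeline.
from dataclasses import dataclass, field

@dataclass
class Event:
    kind: str
    payload: dict
    enriched: dict = field(default_factory=dict)

def simple_agent(events):
    """(1) Simple EPA: filtering and routing."""
    return [e for e in events if e.kind == "order"]

def mediated_agent(events):
    """(2) Mediated EPA: enrichment, transformation, validation."""
    for e in events:
        e.enriched["valid"] = "amount" in e.payload
    return [e for e in events if e.enriched["valid"]]

def complex_agent(events):
    """(3) Complex EPA: pattern detection across multiple events."""
    total = sum(e.payload["amount"] for e in events)
    return {"pattern": "high_volume" if total > 100 else "normal", "total": total}

def intelligent_agent(detection):
    """(4) Intelligent EPA: prediction/decision from a detected pattern."""
    return "throttle" if detection["pattern"] == "high_volume" else "allow"

events = [Event("order", {"amount": 80}), Event("ping", {}), Event("order", {"amount": 45})]
decision = intelligent_agent(complex_agent(mediated_agent(simple_agent(events))))
print(decision)  # "throttle": the two order amounts total 125 > 100
```

The point of the sketch is only that each stage consumes the previous stage’s output, exactly the layered structure the terms describe.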
Frankly, while I generally agree with the concepts, I think the terms in On Event Processing Agents tend to add to the confusion, because they follow, almost exactly, the same reference architecture (and terms) as MSDF, illustrated again below to aid the reader.
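One way to see the parallel is to line up the well-known JDL data fusion levels against the agent terms. The correspondence below is my own hedged reading of the overlap, not the author’s figure:

```python
# A hedged, illustrative mapping (my reading, not the original figure) from
# the JDL multisensor data fusion levels to the agent terms in
# "On Event Processing Agents".
JDL_TO_EPA = {
    "Level 0 - signal/sub-object refinement":  "simple agents (filtering, routing)",
    "Level 1 - object refinement (alignment, correlation)":
        "mediated agents (enrichment, transformation, validation)",
    "Level 2 - situation refinement":          "complex agents (pattern detection)",
    "Level 3 - threat/impact assessment":      "intelligent agents (prediction, decisions)",
}

for jdl, epa in JDL_TO_EPA.items():
    print(f"{jdl:55} -> {epa}")
```

If a mapping this direct can be written down in a dozen lines, the argument goes, the vocabulary was not really new.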
Unfortunately, On Event Processing Agents does not reference the prior art:
My question is this: instead of creating and advocating a seemingly “new vocabulary” and “new event processing theory,” why not leverage the excellent prior art of the past 30 years?
Why not leverage the deep (very complex) event processing knowledge, well documented by some of the top minds in the world, that solves some of the most challenging CEP/EP problems we face today?
Why not build upon the knowledge of a mature pre-existing CEP community (a community that does not call itself CEP) that has been building successful operational event processing applications for decades?
Why not move from a seemingly “not really invented here” approach to a “let’s embrace the wealth of knowledge and experience already out there” worldview?
Since March 2006, this question has remained unanswered and, in my opinion, the Vocabulary of Confusion, introduced at the first unofficial EPTS party, is even more relevant today. Competition is good; new ideas are good; new perspectives are good; however, ignoring 30 years of prior art and failing to leverage it is not very good, is it?
Frankly speaking, there is more than enough CEP theory in the art-and-science of MSDF. If we map the prior art of operational MSDF systems against existing “CEP platforms,” we will gain critical insight into just how far behind the emerging CEP/EP software vendors are in understanding where event processing has been and where the art-and-science is headed.
Well, enough blogging for now. Time to get back to mundane SOA “herding cats” tasks at Techrotech, so I’ll be back Off The Grid for a while.