Why CEPs are (so) important
Complex Event Processing (CEP) is about predicting and inferring events or patterns quickly, using multiple continuously produced data sources.
There are many use cases for CEP technologies, typically those that require detecting that certain circumstances are occurring and reacting immediately. To mention a few:
– Finance: fraud detection, risk, etc.
– Smart cities, IoT, sensors, etc.
– Social media analytics
– Clickstream analytics
– Business Process Management (BPM) and Business Activity Monitoring (BAM)
– Network and application monitoring
In general, wherever there are operations, transactions, or monitoring, CEP is a key advantage. CEP solutions need to deal with high throughput, deliver very low latency, and perform complex techniques [1] (a small code sketch follows the list):
– Event-pattern detection
– Event abstraction
– Event filtering
– Event aggregation and transformation
– Modeling event hierarchies
– Detecting relationships between events
– Abstracting event-driven processes
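To make a few of these techniques concrete, here is a minimal, framework-free Scala sketch of event filtering, aggregation, and a simple pattern detection over an in-memory sequence of events. The PaymentEvent type, its fields, and the thresholds are illustrative assumptions, not part of the original article.

```scala
// Sketch of three CEP techniques (filtering, aggregation, pattern detection)
// over a hypothetical sequence of payment events.
case class PaymentEvent(user: String, amountEur: Double, tsMillis: Long)

object CepTechniquesSketch {
  def main(args: Array[String]): Unit = {
    val events = Seq(
      PaymentEvent("alice", 20.0, 1000L),
      PaymentEvent("bob", 950.0, 2000L),
      PaymentEvent("bob", 980.0, 2500L),
      PaymentEvent("alice", 15.0, 3000L)
    )

    // Event filtering: keep only "large" payments (threshold is illustrative).
    val large = events.filter(_.amountEur > 500.0)

    // Event aggregation: total amount per user.
    val totalsPerUser = events.groupBy(_.user)
      .map { case (user, evs) => user -> evs.map(_.amountEur).sum }

    // Event-pattern detection: two large payments by the same user
    // within one second of each other (a toy "possible fraud" pattern).
    val suspicious = large.sliding(2).collect {
      case Seq(a, b) if a.user == b.user && b.tsMillis - a.tsMillis < 1000L => (a, b)
    }.toList

    println(s"Large payments: $large")
    println(s"Totals per user: $totalsPerUser")
    println(s"Suspicious pairs: $suspicious")
  }
}
```

In a real CEP system the same operations run continuously over unbounded streams rather than a finite collection, which is where the engines discussed below come in.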
CEPs after Big Data
With the arrival of Big Data, CEP systems have evolved and let real-time insights emerge. It is now possible to cope with additional (and massive) data sources that in many cases are also ingested in real time, that is, to cope with continuous data streams.
In recent times Spark, with Spark Streaming, has been used for complex event processing (see [2]). However, Spark has some limitations due to its micro-batch processing model [3].
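To illustrate the micro-batch model, here is a minimal Spark Streaming sketch. The socket source, host, port, and the 1-second batch interval are illustrative choices, not values from the article; the point is that results can never arrive faster than the chosen batch interval.

```scala
// Sketch of Spark Streaming's micro-batch model: events are grouped into
// small batches, so end-to-end latency is bounded below by the interval.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object MicroBatchSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("micro-batch-sketch").setMaster("local[2]")
    // The 1-second batch interval is the key latency limitation.
    val ssc = new StreamingContext(conf, Seconds(1))

    // A text stream of events, one per line (source is illustrative).
    val lines = ssc.socketTextStream("localhost", 9999)

    // Simple per-batch aggregation: count occurrences of each event type.
    val counts = lines.map(eventType => (eventType, 1L)).reduceByKey(_ + _)
    counts.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```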
AnalyticMate is going one step further and integrating Apache Flink (with the FlinkCEP library) to deal with these limitations (option A in the figure). Apache Flink offers a natural approach to streaming dataflows: high throughput and low latency. In some cases a latency below 50 ms is needed, so Spark Streaming is not an option [4] when real-time is needed (option B in the figure).
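As a rough idea of how FlinkCEP expresses patterns over a true event-at-a-time stream, here is a minimal sketch reusing the hypothetical PaymentEvent type and toy fraud pattern from the earlier sketch. This is not the AnalyticMate integration itself; the source, threshold, and time window are assumptions for illustration only.

```scala
// Minimal FlinkCEP sketch: two "large" payments by the same user within 10 s.
import org.apache.flink.cep.scala.CEP
import org.apache.flink.cep.scala.pattern.Pattern
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.time.Time

case class PaymentEvent(user: String, amountEur: Double, tsMillis: Long)

object FlinkCepSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // Illustrative source; in practice this would be Kafka or another connector.
    val payments: DataStream[PaymentEvent] = env.fromElements(
      PaymentEvent("bob", 950.0, 2000L),
      PaymentEvent("bob", 980.0, 2500L)
    )

    // Pattern: two consecutive "large" payments within 10 seconds.
    val fraudPattern = Pattern
      .begin[PaymentEvent]("first").where(_.amountEur > 500.0)
      .next("second").where(_.amountEur > 500.0)
      .within(Time.seconds(10))

    // Apply the pattern per user and emit a simple alert string per match.
    val alerts: DataStream[String] =
      CEP.pattern(payments.keyBy(_.user), fraudPattern)
        .select(m => s"Possible fraud for user ${m("first").head.user}")

    alerts.print()
    env.execute("flink-cep-sketch")
  }
}
```

Because Flink processes each event as it arrives instead of waiting for a batch boundary, a matching pattern can trigger an alert with millisecond-level latency.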
Full article at: https://www.analyticmate.com/cep-and-big-data-the-perfect-match/