
Building an event-based alerting pipeline with Flink

We have a Kafka source of events (~5000/sec) and another stream of rules (~5000k/day) created by users for alerting purposes. A rule expires after a day in our system. We need to match each event against all 5000k rules and send an alert if a rule matches.

Example:

    Event1: {A.temperature=110, A.weight=10}
    Event2: {B.temperature=90, A.weight=60}

    user-rule1: Alert me if A.temperature > 100
    user-rule2: Alert me if B.weight < 50
    user-rule3: Alert me if A.temperature < 120

Is it possible to store all these rules in Flink state, and will it be fast enough?
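
To make the Flink-state option concrete, here is a rough sketch of what I have in mind using Flink's broadcast state pattern. The POJOs (Event, Rule, Alert) and field names (key, op, threshold, expiresAt) are illustrative placeholders, not our real schema, and the two input DataStreams are assumed to already be consuming from Kafka:

    import org.apache.flink.api.common.state.MapStateDescriptor;
    import org.apache.flink.api.common.typeinfo.TypeInformation;
    import org.apache.flink.api.common.typeinfo.Types;
    import org.apache.flink.streaming.api.datastream.BroadcastStream;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.functions.co.BroadcastProcessFunction;
    import org.apache.flink.util.Collector;

    import java.util.Map;

    public class RuleMatchingSketch {

        // Illustrative POJOs; the field names are assumptions, not our actual schema.
        public static class Event { public String key; public double value; }
        public static class Rule  { public String ruleId; public String key; public String op;
                                    public double threshold; public long expiresAt; }
        public static class Alert { public String ruleId; public Event event;
                                    public Alert() {}
                                    public Alert(String ruleId, Event event) { this.ruleId = ruleId; this.event = event; } }

        public static DataStream<Alert> buildAlerts(DataStream<Event> events, DataStream<Rule> rules) {
            // Broadcast state holding all active rules, keyed by ruleId.
            final MapStateDescriptor<String, Rule> ruleStateDescriptor =
                    new MapStateDescriptor<>("rules", Types.STRING, TypeInformation.of(Rule.class));

            BroadcastStream<Rule> ruleBroadcast = rules.broadcast(ruleStateDescriptor);

            return events
                    .connect(ruleBroadcast)
                    .process(new BroadcastProcessFunction<Event, Rule, Alert>() {

                        @Override
                        public void processElement(Event event, ReadOnlyContext ctx, Collector<Alert> out) throws Exception {
                            long now = System.currentTimeMillis();
                            // Linear scan over all rules for every event -- fine as a starting point,
                            // but with millions of rules they should be indexed by field key.
                            for (Map.Entry<String, Rule> entry :
                                    ctx.getBroadcastState(ruleStateDescriptor).immutableEntries()) {
                                Rule rule = entry.getValue();
                                if (rule.expiresAt < now || !rule.key.equals(event.key)) {
                                    continue; // skip expired rules and rules on other fields
                                }
                                boolean matches = ">".equals(rule.op)
                                        ? event.value > rule.threshold
                                        : event.value < rule.threshold;
                                if (matches) {
                                    out.collect(new Alert(rule.ruleId, event));
                                }
                            }
                        }

                        @Override
                        public void processBroadcastElement(Rule rule, Context ctx, Collector<Alert> out) throws Exception {
                            // Every parallel subtask receives every rule and stores it in broadcast state.
                            ctx.getBroadcastState(ruleStateDescriptor).put(rule.ruleId, rule);
                        }
                    });
        }
    }

One concern with this approach: broadcast state is duplicated on every parallel subtask, so memory is roughly (active rules) x parallelism, and the loop in processElement scans every rule per event. Grouping rules by the field they reference (e.g. a map from "A.temperature" to the rules on that field) would cut the per-event work considerably.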

For optimisation purposes, we might also consider a 1-minute event window (for example, compute the min and max temperature/weight over the past minute and then check those aggregates against every rule). Or should we keep the user rules in a separate microservice and fetch them with a REST call?
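
For the pre-aggregation idea, this is roughly the windowing step I am picturing (a sketch only: it reuses the Event POJO from the snippet above, uses processing-time tumbling windows, and the MinMax accumulator name is an assumption):

    import org.apache.flink.api.common.functions.AggregateFunction;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
    import org.apache.flink.streaming.api.windowing.time.Time;

    public class MinMaxWindowSketch {

        // Per-key min/max over one window; field names are illustrative.
        public static class MinMax {
            public String key;
            public double min = Double.POSITIVE_INFINITY;
            public double max = Double.NEGATIVE_INFINITY;
        }

        public static DataStream<MinMax> minMaxPerMinute(DataStream<RuleMatchingSketch.Event> events) {
            return events
                    .keyBy(e -> e.key) // e.g. "A.temperature"
                    .window(TumblingProcessingTimeWindows.of(Time.minutes(1)))
                    .aggregate(new AggregateFunction<RuleMatchingSketch.Event, MinMax, MinMax>() {
                        @Override
                        public MinMax createAccumulator() { return new MinMax(); }

                        @Override
                        public MinMax add(RuleMatchingSketch.Event e, MinMax acc) {
                            acc.key = e.key;
                            acc.min = Math.min(acc.min, e.value);
                            acc.max = Math.max(acc.max, e.value);
                            return acc;
                        }

                        @Override
                        public MinMax getResult(MinMax acc) { return acc; }

                        @Override
                        public MinMax merge(MinMax a, MinMax b) {
                            a.min = Math.min(a.min, b.min);
                            a.max = Math.max(a.max, b.max);
                            return a;
                        }
                    });
        }
    }

The resulting MinMax stream would then be matched against the rules in the same way as raw events, which would reduce the matching rate from ~5000 events/sec to one record per key per minute, at the cost of up to a minute of added alerting latency.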

apache-flink

flink-streaming

flink-sql

flink-cep

0 Answers
