We need to introduce a new Kafka consumer component that uses Spark for aggregate statistics calculations. These can be min/max, means, variances, and counts of the Cenote data stored in Cockroach.

This requirement stems from the eeRIS application. Right now, these averages are calculated in the real-time pipeline by Lua scripts running on Redis, i.e. a form of caching. A more robust design would separate these calculations from the real-time event streaming and place them in the batch pipeline.

Essentially, we need a design for various Spark consumers tailored to the job at hand. Later on, these consumers might also run ML models on the data. For now, we need to determine the right way for this Spark consumer/cluster infrastructure to integrate with the existing Cenote architecture.
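To make the scope concrete, the aggregates listed above (min/max, mean, variance, count) can all be computed in a single pass over a batch of events, which is essentially what each Spark consumer would do per Kafka micro-batch before writing results to Cockroach. A minimal pure-Python sketch, using Welford's online algorithm for numerically stable mean/variance (the class and field names here are illustrative, not part of Cenote):

```python
# Illustrative sketch only: single-pass aggregate statistics over a batch
# of numeric event values, mirroring what a Spark consumer would compute.
# Welford's online algorithm keeps mean/variance numerically stable.

class RunningStats:
    def __init__(self):
        self.count = 0
        self.mean = 0.0
        self._m2 = 0.0            # sum of squared deviations from the mean
        self.min = float("inf")
        self.max = float("-inf")

    def update(self, x):
        self.count += 1
        delta = x - self.mean
        self.mean += delta / self.count
        self._m2 += delta * (x - self.mean)
        self.min = min(self.min, x)
        self.max = max(self.max, x)

    @property
    def variance(self):
        # Population variance; 0.0 when fewer than two samples seen.
        return self._m2 / self.count if self.count > 1 else 0.0

# Example batch of event values (hypothetical data):
stats = RunningStats()
for value in [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]:
    stats.update(value)
```

In a real deployment this logic would be expressed with Spark's built-in aggregation functions over a Kafka source rather than hand-rolled, but the per-batch quantities are the same.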