
How event-stream processing will transform Business Intelligence

Updated: Jul 5, 2019

What are the benefits of event-stream processing for Business Intelligence (BI)? Opinions may diverge, but the main tangible advantage of event streaming is the real-time capability that comes with it. Here is how it can impact the evolution of BI.


From its origins, BI has looked into the past to support business decisions. BI acts as a central brain, collecting and making sense of data across departments. Traditional BI is typically batch-oriented: every hour, day or week, data is gathered, transformed into insights and delivered in reports. This self-reflection is essential for businesses to refine their strategy and operations.


With the introduction of Big Data, Data Science and AI, BI has developed a complementary capability for supporting business decisions and operations. By digging into large volumes of historical data, data scientists can find patterns that lead to a probability score. In other words, it becomes possible to predict the future at a strategic or operational level.


Let’s take the concrete example of an online gambling game company. On the one hand, the company reviews its monthly performance by measuring KPIs and tracking and visualizing progress in dashboards and graphs; customer churn rate is one of the key measurements of its success. On the other hand, by applying AI and data science, the company can predict when an individual customer is likely to leave the game and never come back (customer churn).
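As an illustration, here is a minimal sketch of how such a churn score could be computed with scikit-learn. The file player_history.csv and the feature names are invented for the example; a real model would be trained on the company's own player data.

```python
# Minimal churn-scoring sketch with scikit-learn.
# "player_history.csv" and the feature names are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

df = pd.read_csv("player_history.csv")  # one row per customer, assumed schema
features = ["days_since_last_bet", "bets_last_30d",
            "net_loss_last_30d", "account_age_days"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["churned"], test_size=0.2, random_state=42
)

model = GradientBoostingClassifier().fit(X_train, y_train)

# Probability that each customer churns; thresholding it yields an alert list.
churn_probability = model.predict_proba(X_test)[:, 1]
print(churn_probability[:5])
```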


Now the question is: how long can the company wait before a reaction is triggered? The answer depends on the use case and on the company's IT capability. Some use cases can tolerate a reaction a day later, while others need an instantaneous one. In a digitalized world where speed is a competitive advantage, instant reactions are gaining more and more traction. This is where event-stream processing, the technology at the heart of Spotify, Netflix and Uber, makes the difference: it shortens time-to-reaction and thereby enhances the customer experience.


The essence of event-stream processing is the following: any event (e.g. a customer's online interaction, a time-out on an IoT device, a change of status in a database) triggers other events as soon as it happens. The best-known technology for implementing an event-driven architecture is Apache Kafka, a stream-processing framework that acts as a central nervous system: it captures thousands or millions of events, analyzes them and triggers other processes across departments in real time.
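To give a flavour of what this looks like in practice, here is a minimal sketch of publishing an event with the Python confluent-kafka client; the topic name and the event payload are assumptions for the example.

```python
# Minimal sketch: publish a customer event to a Kafka topic.
# Topic name and payload fields are hypothetical.
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})

event = {"customer_id": "c-123", "type": "bet_lost", "amount": 50.0}

# Downstream consumers (marketing, fraud, BI) can all subscribe
# to the same topic and react to this event independently.
producer.produce("customer-events",
                 key=event["customer_id"],
                 value=json.dumps(event))
producer.flush()  # ensure delivery before exiting
```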


Let’s go back to our example and imagine that a customer signs up, deposits some money, bets and keeps losing over the course of a couple of hours. The company needs to react quickly to retain the customer; in that case, it is now or never. As soon as the system raises an alert, a pop-up window could appear letting the customer know he will receive extra money as a compensatory gift for his losses, or his account manager could contact him to ask what went wrong.
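On the consuming side, the detection logic could be sketched as follows (again with confluent-kafka; the loss threshold, topic names and in-memory state are simplifying assumptions, where a production system would use a proper stream-processing layer such as Kafka Streams or ksqlDB):

```python
# Minimal detection sketch: watch the event stream and trigger a
# retention offer once a customer's losses cross an assumed threshold.
import json
from collections import defaultdict
from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "retention-alerts",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["customer-events"])
producer = Producer({"bootstrap.servers": "localhost:9092"})

losses = defaultdict(float)  # running loss per customer (in-memory, sketch only)

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    event = json.loads(msg.value())
    if event.get("type") != "bet_lost":
        continue
    losses[event["customer_id"]] += event["amount"]
    if losses[event["customer_id"]] > 200:  # assumed business rule
        offer = {"customer_id": event["customer_id"],
                 "type": "compensation_offer"}
        producer.produce("retention-actions", value=json.dumps(offer))
        producer.flush()
        losses[event["customer_id"]] = 0.0  # reset after acting
```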


The role of BI thereby gets broader: it no longer only reports on performance on a regular basis, it also predicts the future and, if necessary, reacts by triggering processes across departments, like a central brain monitoring all data and constantly improving itself with little human intervention.


This evolution of BI comes with two challenges: a serious mind-shift at company level and an adapted data infrastructure that breaks data and department silos. Integrating systems within an event-driven architecture is not something that can be done overnight. However, the flexibility of an event-driven data platform (such as Confluent and other Kafka-based software stacks) makes it possible to build this kind of architecture gradually, one use case at a time, in parallel with the existing infrastructure.


Traditional BI (reporting) will not necessarily benefit from the real-time capture of data in motion, but it can definitely be part of an event-driven architecture, which allows for batch processing of data as well. In fact, most traditional BI tools are simply not able to deal with real-time ingestion and huge volumes of data. It is interesting to note, however, that tools such as Google BigQuery on Google Cloud show impressive capabilities in terms of data ingestion and auto-scalability: it is perfectly possible to ingest streams of data in real time that are appended to tables, and then simply run a SQL query on a regular basis to get the needed insights out of the data. If you want to dig deeper into that subject (BigQuery and event-stream processing), here is a thorough article and video from Ofir Sharony presented at Kafka Summit 2019 in London.
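As a sketch of that pattern with the google-cloud-bigquery client (the project, dataset and table names are made up for the example):

```python
# Minimal sketch: stream rows into BigQuery, then query them with SQL.
# "my-project.gambling.customer_events" is a hypothetical table.
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my-project.gambling.customer_events"

# Streaming ingestion: inserted rows become queryable within seconds.
rows = [{"customer_id": "c-123", "type": "bet_lost", "amount": 50.0}]
errors = client.insert_rows_json(table_id, rows)
if errors:
    raise RuntimeError(errors)

# A query run on a schedule turns the raw stream into insights.
query = """
    SELECT customer_id, SUM(amount) AS total_lost
    FROM `my-project.gambling.customer_events`
    WHERE type = 'bet_lost'
    GROUP BY customer_id
"""
for row in client.query(query).result():
    print(row.customer_id, row.total_lost)
```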


Generally speaking, the cloud is a great environment for event-stream processing as it redefines the boundaries of data storage, accessibility and performance. To optimize costs, of course, it is recommended to keep only valuable data readily available and to move less useful or less frequently used data into a cheaper storage layer.
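One way to implement such tiering on Google Cloud is a bucket lifecycle rule. Here is a minimal sketch with the google-cloud-storage client; the bucket name and age thresholds are assumptions.

```python
# Minimal sketch: demote cold objects to cheaper storage automatically.
# Bucket name and age thresholds are hypothetical.
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("analytics-events-archive")

# Objects older than 90 days move to Coldline; after two years they are deleted.
bucket.add_lifecycle_set_storage_class_rule("COLDLINE", age=90)
bucket.add_lifecycle_delete_rule(age=730)
bucket.patch()  # persist the lifecycle configuration
```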


To sum up, traditional BI is not being replaced by real-time analytics (event-stream processing); rather, the two are complementary. There is a balance to find between short-term reactivity and long-term thinking. Real-time analytics allows BI to react to events, while traditional BI allows the business to stop and reflect upon its decisions by comparing performance across similar time windows (weekly, monthly, yearly ...). Even if tomorrow all processes are automated and triggered by events across the company, it will still be important to assess the performance of such an engine. The world is constantly changing, and the need to adapt never goes away.


Thanks for reading, and please don’t hesitate to share your thoughts in the comments below!
