Enterprise Data Hub: Development and Integration

Big Data


International bank


Financial sector


1000+ employees


Creation of a multifunctional centre for data aggregation, processing, and presentation, along with secure real-time exchange of enterprise data.


The structure of functional components:
Data Broker (Apache Kafka) – ensures real-time data exchange.
Big Data Platform (Hortonworks / Cloudera Hadoop Data Platform, Hadoop Data Flow) – stores all data that arrives through the data broker.
Logical Data Warehouse (TIBCO Data Virtualization) – a logical data warehouse and a tool for business users to access and manage data. It also enables fast data access for online monitoring of the end-to-end technical process (from data collection to data usage) and supplies data for technical monitoring.
Data Governance – data management covering data quality and data security.
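To illustrate the data broker's role, here is a minimal sketch of how an event could be keyed and serialized before publishing to Kafka. The topic name, event fields, and helper functions are hypothetical, invented for illustration; the case study does not describe the bank's actual schemas.

```python
import json
from datetime import datetime, timezone

# Hypothetical topic name -- not from the case study.
TOPIC = "payments.transactions"

def serialize_event(event: dict) -> bytes:
    """Encode an event as UTF-8 JSON, adding an ingestion timestamp."""
    enriched = {**event, "ingested_at": datetime.now(timezone.utc).isoformat()}
    return json.dumps(enriched, sort_keys=True).encode("utf-8")

def partition_key(event: dict) -> bytes:
    """Key by account ID so all events for one account land on the same partition,
    preserving per-account ordering."""
    return str(event["account_id"]).encode("utf-8")

# With a running cluster, the payload would be published via a client library,
# e.g. confluent-kafka:
#   producer.produce(TOPIC, key=partition_key(evt), value=serialize_event(evt))
evt = {"account_id": 42, "amount": "99.50", "currency": "EUR"}
payload = serialize_event(evt)
print(payload.decode("utf-8"))
```

Keying by a stable identifier is a common Kafka design choice: it keeps related events ordered within a partition while still letting the topic scale out across partitions.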


We built a scalable architecture on Apache Kafka for real-time data exchange, running 24/7.
The platform processes about 2-3 million events per day, with peaks of up to 200 events per second.
We also optimized data governance, including data quality and security.
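A quick sanity check on these figures shows why the cluster must be sized for peak rather than average load; the numbers below come straight from the case study.

```python
# Back-of-the-envelope check of the reported load.
events_per_day = 3_000_000       # upper bound of the 2-3 million range
seconds_per_day = 24 * 60 * 60   # 86,400

avg_rate = events_per_day / seconds_per_day   # ~35 events/s on average
peak_rate = 200                               # reported peak, events/s

print(f"average: {avg_rate:.0f} events/s, peak: {peak_rate} events/s")
print(f"peak-to-average ratio: {peak_rate / avg_rate:.2f}x")
```

Even at the top of the daily range, the average is roughly 35 events per second, so the reported 200 events/s peak is close to six times the average, a burst profile Kafka's partitioned, append-only log handles well.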

