Knitting Together Enterprise Data Management and Analytics with a Logical Data Fabric

Written by Ravi Shankar, Senior Vice President and Chief Marketing Officer, Denodo Technologies

FinTech firms have been at the forefront of business process automation, as today it is fairly common for both front- and back-office operations of a financial institution to be completely digitized. Operating within one of the most heavily regulated industries, FinTech firms have also built systems to facilitate compliance with "Know Your Customer" (KYC) regulations, Basel III, the Comprehensive Capital Analysis and Review (CCAR), and other guidelines and regulations.

Over the years, however, such digitization has increased the number of systems within financial firms, and the data within these systems has become siloed. IT teams have used traditional data integration techniques to copy the siloed data into yet another repository, so as to combine these disparate data sources into an integrated view for business users. But such techniques rely on scripts that cannot accommodate even a slight change in a business process without being rewritten and retested, which can be time-consuming and resource-intensive.

FinTech firms need a strategic approach to enterprise data management. Fortunately, a modern data architecture called a “logical data fabric” forms the foundation for such an approach. A logical data fabric enables data to remain in source systems, yet it knits the data together using a virtual approach.

Before we cover the benefits of logical data fabric, let us review the limitations of current approaches.

The Inherent Problems of Replicating Data
To create holistic, integrated views of data across disparate systems, FinTech firms have typically used traditional point-to-point integrations, such as extract, transform, and load (ETL) processes, to replicate the data into analytical data warehouses and marts, operational data stores, or data lakes.

But replication takes time, which naturally introduces latency into the equation. Replication also creates inconsistency, as it relies on duplicating data across the different systems. In addition, replication requires extended provisions for storing and maintaining the redundant copies.

Finally, point-to-point integrations such as ETL processes are designed specifically to extract a certain number of rows from one database, transform them so that they match the structure of the target database, and load the data into that target database. They perform this function well, and rapidly. However, to introduce any changes, such as a slightly different transformation protocol or a different target, it is necessary to change the script. Because such integrations typically involve numerous rows of sensitive data, these processes must be very carefully tested, with adequate fallback provisions, which requires both time and a team of skilled administrators. This means that if FinTech firms rely on point-to-point protocols for integrating data, they will have to pay the price in terms of agility.
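To make the agility cost concrete, here is a minimal sketch of a hard-coded point-to-point ETL step (table and column names are hypothetical, and SQLite stands in for the source and target databases). Note how the source query, the transformation, and the target schema are all baked into the script, so any change to one of them forces a rewrite and a retest.

```python
import sqlite3

def run_etl(source: sqlite3.Connection, target: sqlite3.Connection) -> int:
    """One hard-coded ETL step; returns the number of rows loaded."""
    # Extract: a fixed query against a fixed source table.
    rows = source.execute(
        "SELECT account_id, balance_cents FROM accounts"
    ).fetchall()

    # Transform: a hard-coded conversion (cents -> dollars) shaped to the
    # target schema; changing this "protocol" means editing the script.
    transformed = [(acc, cents / 100.0) for acc, cents in rows]

    # Load: a fixed target table with a fixed column layout.
    target.executemany(
        "INSERT INTO account_balances (account_id, balance_usd) VALUES (?, ?)",
        transformed,
    )
    target.commit()
    return len(transformed)
```

Even this toy version shows the coupling: pointing it at a new target, or altering the transformation, touches code that must then be re-verified against sensitive production data.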

Weaving a Logical Data Fabric
A logical data fabric, as described above, relies on a modern data integration technology called “data virtualization.” Data virtualization provides real-time data integration, bringing data together from multiple silos without actually moving any data. It does this not by extracting and loading copies of data but by providing virtual views of the data in its original location.

Data virtualization enables FinTech firms to implement a logical data fabric that seamlessly draws data across the silos of FinTech systems, knitting them into an integrated view of the data, no matter the kind. Data virtualization can perform all of the needed transformations on the fly, and unlike ETL processes, can be managed with little or no code. Logical data fabric provides a way to connect to all data without having to first collect it.
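The core idea of a virtual view can be sketched in a few lines. This is an illustration only, assuming two hypothetical source systems (a CRM and a core banking system): no rows are copied into a new repository, and the integrated view is assembled on demand from data that stays in place.

```python
from typing import Dict

# Stand-ins for two live source systems; in practice these would be
# queried over their own connectors, not held in local dictionaries.
crm = {"C100": {"name": "Acme Corp", "segment": "enterprise"}}
core_banking = {"C100": {"balance": 250_000}}

def customer_view(customer_id: str) -> Dict[str, object]:
    """A 'virtual view': the integrated record is built at query time
    from the sources, rather than from a replicated copy."""
    profile = crm[customer_id]           # fetched live from source 1
    account = core_banking[customer_id]  # fetched live from source 2
    return {"customer_id": customer_id, **profile, **account}
```

Because the join happens at query time, a change in a source is visible on the next query, with no replication pipeline to rerun.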

In addition to streamlining and accelerating data access across silos, logical data fabric also facilitates automation, as it has the ability to utilize artificial intelligence and machine learning to automate repetitive functions.

Logical Data Fabric: A Real-World Example
In 2018, a large investment bank partnered with a global FinTech company, and needed to establish bi-directional data exchange between the two companies and create a single source of truth for the combined data.

The bank built an enterprise integration service layer that enables data to flow back and forth between the two companies, enables scalability and efficiency, and acts as a data extraction and translation layer. Since the data is now mastered by both companies (80% by the FinTech company and 20% by the investment bank), the bank implemented a logical data fabric to form a bridge between the two companies and provide a single source of the truth.

With the logical data fabric, the bank can now push data quality processes directly to the source itself, rather than replicating data and then applying data quality checks. As a result, processes are streamlined and the company saves on expenses normally devoted to data movement and storage. Also, though much of the data is streamed and cached in key-value-store format, the logical data fabric exposes data as a table consumable in SQL by the data quality tool, which can perform hundreds of quality checks in a short time.
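The last point can be illustrated with a small sketch (using SQLite as a stand-in SQL engine, with hypothetical record and field names): cached key-value records are exposed as a relational table, so generic SQL quality checks can run against them without a separate replication step.

```python
import sqlite3

# Cached key-value records, as the article describes; "txn-2" has two
# deliberate quality problems (negative amount, missing currency).
kv_store = {
    "txn-1": {"amount": 100.0, "currency": "USD"},
    "txn-2": {"amount": -5.0, "currency": ""},
}

def expose_as_table(store: dict) -> sqlite3.Connection:
    """Present the key-value records as a SQL-consumable table."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE txns (txn_id TEXT, amount REAL, currency TEXT)")
    db.executemany(
        "INSERT INTO txns VALUES (?, ?, ?)",
        [(k, v["amount"], v["currency"]) for k, v in store.items()],
    )
    return db

def quality_checks(db: sqlite3.Connection) -> list:
    """Run named SQL checks; return (check, txn_id) for each failure."""
    checks = [
        ("non_negative_amount", "SELECT txn_id FROM txns WHERE amount < 0"),
        ("currency_present", "SELECT txn_id FROM txns WHERE currency = ''"),
    ]
    return [(name, txn_id)
            for name, sql in checks
            for (txn_id,) in db.execute(sql)]
```

Each check is just a SQL statement, which is why a quality tool that speaks SQL can run hundreds of them quickly once the fabric presents the data in tabular form.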

The logical data fabric accelerated decision-making and client onboarding; saved time, resources, and cost; and improved efficiency and agility across both organizations.

Bring the Data Together
Traditional approaches to data integration have not been serving the FinTech industry well. It is time for FinTech firms to take a step back and adopt a more strategic approach, centered on the simple, effective power of logical data fabric.

Ravi Shankar
Senior Vice President and Chief Marketing Officer, Denodo Technologies
rshankar@denodo.com

Denodo is the leader in data virtualization, providing agile, high-performance data integration, data abstraction, and real-time data services across the broadest range of enterprise, cloud, big data, and unstructured data sources at half the cost of traditional approaches. Denodo's customers across every major industry have gained significant business agility and ROI by enabling faster and easier access to unified business information for agile BI, big data analytics, Web, cloud integration, single-view applications, and enterprise data services. Denodo is well-funded, profitable, and privately held. For more information, visit http://www.denodo.com or call +1 877 556 2531 / +44 (0) 20 7869 8053.