Using Data Virtualization to Help You Go Fast and Go Far

We are always focused on making things “Go Fast,” but how do we make sure we future-proof our data architecture and ensure that we can “Go Far”?

Technologies change constantly within organizations, so a flexible architecture is key. Data architectures have changed in structure over the years, and they will continue to change into the future.

In the 70s we started with essentially flat-file data structures; in the 80s databases began to emerge; in the 90s data warehouses, star schemas, and multidimensional data structures were introduced; in the 2000s web and social data became more prevalent; and more recently we have seen blob storage, columnar storage, data lakes, and machine or IoT data. The list goes on, and so does the complexity.

How long should a technology last? Should we rush to replace existing technology if the current solution meets the requirements and still serves its purpose? As a consultant out in the field, I often heard from CIOs, “if it ain’t broke, don’t fix it.” The message was loud and clear: there was no budget to look at alternative ways of doing things. A change would only be considered with a solid business case and a defined ROI.

There has been a lot of focus on acquiring solutions for faster data delivery and performance, from OLAP cubes to MPP capabilities, but that solves only part of the problem businesses face in becoming more data-centric and accelerating data delivery for better business outcomes.

So how do we modernize our data landscape while leveraging existing technology investments? How do we invest in the right technology to last into the future, without regular changes that cause major disruption across the business?

Going “far” means the data infrastructure must see us into the future without too much disruption along the way. We need an agile architecture that can adapt to change quickly, one that lets us swap different technologies in or out as required, with limited effect on the users of the data they hold.

How do we address all of these questions? That will depend on many factors; however, one sure way to start building the foundations of an architecture that will “go far” is to take a logical approach.

The core of a logical data fabric is the ability to consolidate many diverse, disparate data sources efficiently, delivering trusted data from all relevant sources to all relevant consumers through a single logical layer. This can be achieved with a data integration technique called data virtualization.

Data Virtualization offers a single logical point of access, avoiding point-to-point connections from consuming applications to the information sources. As a single point of data access for applications, it is the ideal place to enforce access security restrictions and manage overall data governance.
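To make that concrete, here is a minimal Python sketch of the single-point-of-access idea. This is purely illustrative, not Denodo’s actual API: the gateway class, view names, and masking rule are all hypothetical. Because every consumer queries one gateway, a governance rule written once applies to every application:

```python
# A minimal sketch of a single logical access point. Instead of each
# application connecting to sources directly, all requests flow through
# one gateway where security rules live. All names are hypothetical.

def mask_email(row: dict) -> dict:
    # One governance rule, defined once, applied to every consumer.
    return {**row, "email": "***@***"} if "email" in row else row

class AccessLayer:
    """Single logical point of access: all queries pass through here."""

    def __init__(self, views: dict):
        self.views = views  # view name -> zero-arg function returning rows

    def query(self, view_name: str, user_role: str) -> list[dict]:
        rows = self.views[view_name]()
        # Enforce the policy in one place: non-admins see masked PII.
        return rows if user_role == "admin" else [mask_email(r) for r in rows]

def customers() -> list[dict]:
    # Stand-in for a real source connection.
    return [{"name": "Acme", "email": "buyer@acme.example"}]

layer = AccessLayer(views={"customers": customers})
print(layer.query("customers", user_role="analyst"))
# -> [{'name': 'Acme', 'email': '***@***'}]
```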

Data virtualization is a data integration strategy that takes a completely different approach from traditional, movement-based integration: rather than physically moving the data to a new, consolidated location, it provides a real-time view of the consolidated data, leaving the source data exactly where it is. It is a modern approach that is already meeting today’s data integration challenges while providing a foundation for data integration in the future.
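The “leave the data where it is” idea can be sketched the same way. In this hypothetical example, the virtual view stores no rows, only a recipe that reads the live sources and joins them at query time; the warehouse and CRM functions are stand-ins for real connections:

```python
# A sketch of query-time federation: the virtual view holds no data,
# only a recipe. Nothing is copied or persisted, so every query
# reflects the sources as they are right now. Schemas are hypothetical.

def fetch_orders_from_warehouse() -> list[dict]:
    # Stand-in for a live warehouse query.
    return [{"customer_id": 1, "total": 250.0},
            {"customer_id": 2, "total": 90.0}]

def fetch_customers_from_crm() -> list[dict]:
    # Stand-in for a live CRM API call.
    return [{"customer_id": 1, "name": "Acme"},
            {"customer_id": 2, "name": "Globex"}]

def customer_orders() -> list[dict]:
    # The join happens here, at query time, against live source data.
    customers = {c["customer_id"]: c for c in fetch_customers_from_crm()}
    return [{**customers[o["customer_id"]], **o}
            for o in fetch_orders_from_warehouse()]

print(customer_orders())
# -> [{'customer_id': 1, 'name': 'Acme', 'total': 250.0}, ...]
```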

Using Denodo, the semantic layer is defined in one single place (the data virtualization layer) and shared across all tools and applications. Everyone receives the same data, avoiding data inconsistencies and mistrust. Views defined in this single layer can be reused across multiple applications, reducing development effort and shortening time to delivery.
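Continuing the sketch above, the same customer_orders view can serve several consumers; if a source system is later swapped out, only the view’s recipe changes, and the applications are untouched:

```python
# Two "applications" reusing the single customer_orders view defined
# in the previous sketch; neither knows, or cares, where the
# underlying data actually lives.

def bi_dashboard() -> float:
    # A BI tool summing revenue through the shared view.
    return sum(row["total"] for row in customer_orders())

def monthly_report() -> list[str]:
    # A reporting job listing customer names through the same view.
    return sorted(row["name"] for row in customer_orders())

print(bi_dashboard())    # -> 340.0
print(monthly_report())  # -> ['Acme', 'Globex']
```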

Data virtualization goes a long way toward not only providing visibility into your corporate data assets but also making data central to your business, easily discoverable and accessible. It completes the self-service model by providing a place to find data: a data catalog. The Denodo Data Catalog is your data marketplace for all of your key corporate data assets.

I recently heard a Forrester analyst note that the COVID pandemic has changed our technology strategies and that we need to be adaptive, creative, and resilient. Data virtualization was cited as a way to modernize the data landscape, helping organizations adapt to change and become, overall, more insights-driven.

 

Katrina Briedis