
Louis DiModugno, Chief Data Officer with HSB
This potential for real-time integration of data at a level that looks across all customers, products, objects, or locations enables an understanding of cross-functional, interdisciplinary information. A repository that integrates these views provides a single version of the truth, access to the most current information, accuracy across several lenses, secure information, and role-based access, to name just a few benefits.
With their requirements in hand, you now understand what data needs to be integrated as well as how frequently. (This would be the “How?”) The proliferation of new ETL and mastering tools leveraging Machine Learning (ML) and Artificial Intelligence (AI) has added insightful capabilities and created integration paths for data within a methodology supported by new workflow techniques. Some vendors are using ML to identify unknown data fields and map them into known data models. This accelerates access to data and improves consistency in classification.
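As a rough illustration of that idea, here is a minimal sketch, not any vendor's actual product, of training a classifier on field names you have already mapped and then using it to suggest a home for unknown fields. The field names, target categories, and library choice (scikit-learn) are all assumptions made for the example.

```python
# Minimal sketch (hypothetical fields and labels): learn from fields already
# mastered into the known data model, then suggest a mapping for new fields.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Fields you have already mapped (field name -> entity in the known model).
known_fields = ["cust_nm", "customer_name", "policy_no", "policy_number",
                "loss_dt", "date_of_loss", "site_addr", "location_address"]
known_labels = ["customer", "customer", "policy", "policy",
                "claim", "claim", "location", "location"]

# Character n-grams tolerate abbreviations and inconsistent naming styles.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
model.fit(known_fields, known_labels)

# A new feed arrives with unknown field names: suggest where each belongs.
unknown_fields = ["custmr_name", "pol_num", "dt_of_loss"]
for field, guess in zip(unknown_fields, model.predict(unknown_fields)):
    print(f"{field} -> {guess}")
```

In practice a tool would also look at sample values, data types, and profiling statistics, but the pattern of training on known mappings to classify the unknown is the same.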
To understand what data we can bring together, we have to go back upstream to the Acquisition layer. At this layer, you need to take an inventory of what you have and what you have access to. (This would be the “What?”)
What is accessible (free vs. purchased), what is the value and accuracy of the data, and where it is stored (cloud vs. on-prem) are just some of the questions that will need to be answered about what data you have and whether it is worth bringing together.
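One lightweight way to make those questions concrete is to capture them as fields in an inventory record. The sketch below is a hypothetical structure, not a prescribed schema, and the example source is invented.

```python
# A minimal sketch of an acquisition-layer inventory record; the fields and
# example values are hypothetical.
from dataclasses import dataclass

@dataclass
class DataSourceInventoryItem:
    name: str             # what the source is
    accessible: str       # "free" or "purchased"
    storage: str          # "cloud" or "on-prem"
    estimated_value: str  # business value of the data
    accuracy: str         # known quality/accuracy of the data
    worth_integrating: bool

example = DataSourceInventoryItem(
    name="third_party_property_feed",
    accessible="purchased",
    storage="cloud",
    estimated_value="high",
    accuracy="medium",
    worth_integrating=True,
)
print(example)
```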
Another area of technology that is helping here is virtualization: the capability to connect to data sources, combine different data types, and consume the data through multiple platform/delivery systems. We have found that virtualization is a key component of continuing to deliver value to the organization while you are executing on the enterprise data strategy.
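To make the virtualization idea concrete, here is a minimal sketch of a virtual view that connects to two different source types at query time and joins them on demand, rather than copying everything into a warehouse first. The sources, tables, and columns are hypothetical, and pandas and SQLite simply stand in for whatever platforms you actually connect to.

```python
# Minimal sketch of a virtual view assembled at query time from live sources.
import sqlite3
from io import StringIO

import pandas as pd

def from_flat_file() -> pd.DataFrame:
    # Stand-in for an on-prem flat-file extract.
    return pd.read_csv(StringIO("customer_id,segment\n1,commercial\n2,personal\n"))

def from_database() -> pd.DataFrame:
    # Stand-in for a relational source (cloud or on-prem).
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE policies (customer_id INTEGER, premium REAL)")
    conn.executemany("INSERT INTO policies VALUES (?, ?)", [(1, 1200.0), (2, 450.0)])
    return pd.read_sql_query("SELECT * FROM policies", conn)

def virtual_customer_view() -> pd.DataFrame:
    # The "virtual" view: joined on demand from the live sources, not persisted.
    return from_flat_file().merge(from_database(), on="customer_id")

print(virtual_customer_view())
```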
As we continue down this journey of evolving technology capabilities, we know that there will always be new tools and capabilities available. My suggestion is to find a suite of tools that already have some level of integration across your data strategy components and stick with that toolset (at least until you get your ROI out of the program). This will save you time in trying to integrate them yourselves and keep you from veering off the path.
Additionally, I suggest establishing a cloud strategy and creating policies you can reuse as you expand. If your organization's cloud efforts are immature, you will end up learning the hard way unless you put some guidelines together to keep everyone aligned. Finally, find yourself a team of data warehouse experts who are interested in change and in investing in new personal capabilities. They are out there, and they are the lifeblood of the evolving data organization. They know what data can be trusted and where it can be found.
If you don’t start to move away from data warehousing and take advantage of the improved technology, your organization will be stuck in the past. It will be surpassed by your competition’s capability to access and use data, and it could end up “Same as it ever was…” or worse.
