MDM using Data Virtualization
If there’s anything that can benefit from agility, it is
most certainly Master Data Management.
Untold numbers of MDM projects have flat-out failed over the last 15 years. Why? Largely because any dynamic corporation changes constantly as the business grows, and with change
comes the demand for Master Data to reflect the current reality
immediately. That's not possible with legacy methodologies and tools.
With the advent of Data Virtualization, "Enterprise Master
Services" or "KPIs" are always fresh and accurate, reflecting the most recent
information. This approach significantly reduces the number of copies of data,
and thereby the chance of discrepancies across instances of data. Data
remains in the original sources of record and is accessed on demand
for Portals, BI, Reporting, and Integration.
Furthermore, it is not really necessary to define an "everything to everybody" Master definition. Think of it instead as an organic approach: growing and changing the models, creating new versions for specific use cases or constituents. The key is that Enterprise Enabler® (EE) tags every object with notes and keywords as well as its exact lineage, so that a search will find the best fit for the use.
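To make the idea concrete, here is a toy sketch of that kind of tagged, searchable catalog. The catalog entries, keywords, and the `best_fit` ranking are purely illustrative assumptions, not Enterprise Enabler's actual structures:

```python
# Illustrative only: models tagged with keywords, notes, and lineage,
# ranked by keyword overlap with a search query.
CATALOG = [
    {"name": "Customer_v1", "keywords": {"customer", "crm"},
     "lineage": ["CRM.accounts"], "notes": "original master"},
    {"name": "Customer_v2", "keywords": {"customer", "crm", "revenue"},
     "lineage": ["CRM.accounts", "ERP.invoices"], "notes": "adds ERP revenue"},
]

def best_fit(query_keywords):
    """Rank tagged models by keyword overlap; return the best match, if any."""
    scored = [(len(m["keywords"] & query_keywords), m) for m in CATALOG]
    score, model = max(scored, key=lambda pair: pair[0])
    return model if score else None

match = best_fit({"customer", "revenue"})
```

A richer search would also weigh notes and lineage, but keyword overlap is enough to show why tagging every version pays off.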
Doesn’t Data Virtualization mean you’re getting a Streaming Data Point?
No, it does not; that is a myth. I often hear the following concern: "If I want a KPI, I
don't want just the current live value; I want last month's value or even
a specific range of days." The answer
is that a Data Master is actually a virtual
data model, defined as a set of metadata that
indicates all of the sources and all of the security, validation, federation, and transformation logic. When the virtual model is queried, Enterprise Enabler® reaches
out to the endpoints, federates them, applies all other logic, resolves the
query, and returns the latest results. So the data set returned depends on the
query. In other words, a Master Data Service/Model resolves the query,
retrieves data live from the sources of record, and delivers the latest data
available along with any historical data requested.
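Conceptually, that resolution step can be sketched as follows. This is not Enterprise Enabler's implementation; the two source functions, the join key, and the transform are hypothetical stand-ins for a metadata-driven master definition:

```python
# Illustrative sketch: a "Data Master" as metadata (sources + federation +
# transformation logic). Nothing is materialized; each query hits the
# sources of record live and the returned data set depends on the query.

def fetch_crm():
    # Hypothetical live source of record #1
    return [{"cust_id": 1, "name": "Acme"}, {"cust_id": 2, "name": "Globex"}]

def fetch_erp():
    # Hypothetical live source of record #2
    return [{"cust_id": 1, "revenue": 1200}, {"cust_id": 2, "revenue": 800}]

MASTER_DEFINITION = {
    "sources": [fetch_crm, fetch_erp],
    "federate_on": "cust_id",
    "transform": lambda row: {**row, "revenue_k": row["revenue"] / 1000},
}

def resolve(master, predicate=lambda row: True):
    """Fetch live, federate on the join key, transform, then apply the query."""
    crm, erp = (src() for src in master["sources"])
    key = master["federate_on"]
    erp_by_key = {r[key]: r for r in erp}
    federated = ({**c, **erp_by_key[c[key]]} for c in crm if c[key] in erp_by_key)
    transformed = (master["transform"](r) for r in federated)
    return [r for r in transformed if predicate(r)]

result = resolve(MASTER_DEFINITION, predicate=lambda r: r["revenue_k"] >= 1.0)
```

The predicate plays the role of the incoming query: change it, and a different slice of the live, federated data comes back, with no persisted master copy anywhere.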
In the case where the model consists of real-time streaming
data, of course, you are interested in the live values as they are generated. These
models still apply business logic, federation, and so on, and you have some way
to consume the streaming data, perhaps as continuous updates to a dynamic
dashboard. However, that's not what
makes MDM Agile.
The Challenge of Change
The more dynamic your business, the more important agility becomes in Master Data Management. Applications change, new data sources come
along, and processes and applications move to cloud versions. Companies are
acquired, and critical business decisions are made that impact operations and
the shape of business processes. All of these changes can mean updates
to your Master Data Definitions. The truth is, with legacy
MDM methodologies, the definition, programming, and approvals are measured
in months, all while impeding the progress and alignment of new business
processes.
What’s the “Agile” part of Enterprise Enabler's MDM?
Agile MDM is a combination of rapidly configuring
metadata-based Data Masters, efficiently documenting them, "sanctioning" them,
and making them available to authorized users. From there, it is a
matter of being able to modify Data Masters in minutes, with versioning, and
moving to the corrected or updated service/model. It's also about storing
physical Master data sets only when there is a true need for them.
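The "modify in minutes, with versioning" idea can be illustrated with a minimal versioned registry. This is an assumption-laden toy, not Enterprise Enabler's API: publishing a change creates a new version while consumers can keep a pinned older one:

```python
# Hypothetical versioned registry for Data Master definitions.
class MasterRegistry:
    def __init__(self):
        self._versions = {}  # name -> list of definitions; index = version - 1

    def publish(self, name, definition):
        """Store a new version; earlier versions remain available."""
        self._versions.setdefault(name, []).append(definition)
        return len(self._versions[name])  # the new version number

    def get(self, name, version=None):
        """Fetch a pinned version, or the latest when none is given."""
        versions = self._versions[name]
        return versions[-1] if version is None else versions[version - 1]

registry = MasterRegistry()
registry.publish("Customer", {"sources": ["CRM"], "fields": ["id", "name"]})
registry.publish("Customer", {"sources": ["CRM", "ERP"],
                              "fields": ["id", "name", "revenue"]})
```

Because each definition is just metadata, "rolling forward" to the updated service/model is a pointer change rather than a rebuild of physical master tables.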
Ready for the second truth? When you use an Agile Data
Virtualization technology such as Stone Bond's Enterprise Enabler®, along with proper use of its data
validation and MDM processes for identifying, configuring, testing, and
sanctioning Data Masters, you are applying agile technology and managed agile
best practices to ensure a stable but flexible MDM operation. Enterprise
Enabler offers the full range of MDM activities in a single platform.
Step 1. A programmer
or DBA configures a Data Master as defined by the designated business person.
Step 2. The Data Steward reviews lineage and authorization, tests,
augments notes, and sanctions the model as an official Master Data Definition.
Step 3. The approved Data Master is published to the company MDM portal for general usage.
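The three steps above amount to a small approval state machine. As a sketch only, with state names and method signatures that are my own assumptions rather than Enterprise Enabler's:

```python
# Hypothetical workflow: configured -> sanctioned -> published.
class DataMaster:
    def __init__(self, name, definition):
        self.name, self.definition = name, definition
        self.state = "configured"  # Step 1: built by a programmer or DBA
        self.notes = []

    def sanction(self, steward, note):
        # Step 2: the Data Steward reviews, tests, adds notes, and approves
        if self.state != "configured":
            raise ValueError(f"cannot sanction from state {self.state!r}")
        self.notes.append((steward, note))
        self.state = "sanctioned"

    def publish(self, portal):
        # Step 3: the approved master goes to the company MDM portal
        if self.state != "sanctioned":
            raise ValueError(f"cannot publish from state {self.state!r}")
        portal.append(self)
        self.state = "published"

portal = []
dm = DataMaster("Customer", {"sources": ["CRM", "ERP"]})
dm.sanction("steward@example.com", "lineage verified; tests pass")
dm.publish(portal)
```

Enforcing the transitions in order is what makes "sanctioning" meaningful: nothing reaches the portal without a steward's explicit approval on record.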