Wednesday, May 30, 2012

SAP Best Practices Streamlined by Agile Integration


With BusinessObjects Data Integrator bringing ETL and the Rapid Marts into the fold, the orderliness of SAP reaches further out toward legacy sources. SAP's ECC master data structures and SAP AIO BP ("SAP All-in-One Best Practices"), among other things, define and document IDoc structures for master data elements.
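To make that concrete, here is a minimal sketch of mapping a flat legacy customer record into an IDoc-style customer master segment. The segment and field names below (E1KNA1M, KUNNR, NAME1, and so on) follow the familiar customer master (DEBMAS) conventions, but the exact structures depend on your IDoc type and release, so treat this as illustrative rather than a reference.

```python
# Illustrative only: maps a flat legacy customer record into an IDoc-style
# segment dictionary resembling the customer master (DEBMAS). Verify segment
# and field names against your own IDoc documentation before relying on them.

def legacy_to_debmas_segment(legacy: dict) -> dict:
    """Build a simplified E1KNA1M-like segment from a flat legacy record."""
    return {
        "segment": "E1KNA1M",  # customer master basic data segment
        "fields": {
            "KUNNR": legacy["customer_id"].zfill(10),  # customer number, zero-padded
            "NAME1": legacy["name"][:35],              # name, truncated to typical SAP length
            "STRAS": legacy["street"][:35],            # street address
            "ORT01": legacy["city"][:35],              # city
            "PSTLZ": legacy["postal_code"],            # postal code
            "LAND1": legacy["country_iso2"],           # country key
        },
    }


if __name__ == "__main__":
    record = {
        "customer_id": "48213",
        "name": "Acme Industrial Supply",
        "street": "1200 Commerce Blvd",
        "city": "Houston",
        "postal_code": "77002",
        "country_iso2": "US",
    }
    print(legacy_to_debmas_segment(record))
```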

These tools work reasonably well as long as you are dealing with a single source mapped to the master definition. Unfortunately, what lies beyond SAP in your environment is anybody's guess. If more than half of these mappings work for you "out-of-the-box," you must live a charmed life! More than likely, your legacy sources impose a huge need for custom programming to align multiple data sources and transform them to meet the Best Practices requirements. The variability and unknowns of your environment will probably never be easy to address with SAP's tools, and yet that necessary custom programming consumes a good chunk of your budget and timeline.
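As a small, hypothetical illustration of what that custom programming looks like in practice, the sketch below reconciles one customer record from two invented legacy systems before it can satisfy a target master definition. Every source name, field name, and rule here is an assumption for illustration only.

```python
# Hypothetical alignment code of the kind a legacy migration accumulates:
# two sources disagree on keys, formats, and codes, and both must be
# reconciled before the record meets the target master definition.

def align_customer(erp_row: dict, crm_row: dict) -> dict:
    """Merge one customer from two hypothetical legacy systems into a target record."""
    # Rule 1: the ERP system is treated as the source of record for the legal name.
    name = erp_row["CUST_NAME"].strip().title()

    # Rule 2: the CRM holds the freshest contact data, but its phone formats vary.
    phone = "".join(ch for ch in crm_row["phone"] if ch.isdigit())

    # Rule 3: country codes must be normalized to ISO alpha-2 for the target schema.
    country_map = {"USA": "US", "GER": "DE", "U.K.": "GB"}
    country = country_map.get(erp_row["COUNTRY"], erp_row["COUNTRY"])

    return {
        "customer_id": erp_row["CUST_NO"].lstrip("0"),
        "name": name,
        "phone": phone,
        "country": country,
    }
```

Multiply rules like these across dozens of objects, sources, and exceptions, and the budget and timeline impact described above becomes easy to understand.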

Typical integration certification covers one source to SAP, and assumes that both the source and the destination have pre-defined schemas, which is virtually never true in real life. Here again, this type of adapter seldom works without custom programming adjustments.


SAP has greatly improved its tools and best practices over the years, and highly trained, experienced SAP consultants, architects, and programmers have contributed to streamlining the task of data migration. Even so, whatever approach you are planning to use for your migration of legacy data to SAP, you are almost certainly facing a formidable, time-consuming, and costly exercise.

This is where Agile Integration Software (AIS), with its light-of-foot cross-application federation, can make a huge difference to your overall success in meeting your objectives and in ensuring that changes, as they occur, can be accommodated quickly and easily. The point where data migration becomes a nightmare is exactly where AIS products like Enterprise Enabler are at their best. The earlier AIS is introduced into the plan, the better, since some project steps may simply no longer be necessary.

Given the magnitude of a migration from legacy systems to SAP, and of the ongoing integration required, it is well worth establishing what the value could be to your project. With a high probability of at least a 50% reduction in manpower and elapsed time to implement the integrations, and a rapid, easy path to demonstrating this, why take the risk of NOT investigating?

You owe it to your shareholders, and to yourself, to seriously consider AIS!

Thursday, May 3, 2012

Agile Master Data Services

The existence of Agile Integration Software provides solid footing for a new model that is emerging for Master Data Management.

In the context of Agile Master Data Management, a Data Master is not just a schema with all the standard ancillary information about field definitions, units of measure, formats, and so on. It is actually a series of automatically generated services that make the master data accessible live from the multiple root sources, federated and transformed as required to fulfill the Master definition. So, once a Data Master is defined, you can use it in many ways without having to know anything about where the data comes from, or whether the official sources behind it changed yesterday or will change tomorrow.
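As a conceptual sketch only (not Enterprise Enabler's actual implementation), the following shows the idea of a Data Master served live from federated root sources; the class, function, and field names are assumptions made for illustration.

```python
# Conceptual sketch of a "Data Master" as a service: callers ask for the
# master view, and the sources are federated and transformed on demand,
# so consumers never need to know where the data actually lives.

from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class MasterDefinition:
    """A Master definition: target fields plus the root sources that feed them."""
    name: str
    fields: List[str]
    # each root source contributes a fetch function returning already-transformed fields
    sources: Dict[str, Callable[[str], dict]] = field(default_factory=dict)


def serve_master(defn: MasterDefinition, key: str) -> dict:
    """Assemble one master record live, federating all registered root sources."""
    record = {}
    for source_name, fetch in defn.sources.items():
        contribution = fetch(key)                # live read from the root source
        for fname, value in contribution.items():
            if fname in defn.fields:
                record.setdefault(fname, value)  # first source of record wins
    return record


# usage: register two hypothetical sources and serve one customer master record
customer_master = MasterDefinition(
    name="Customer",
    fields=["name", "credit_limit", "segment"],
    sources={
        "erp": lambda key: {"name": "Acme Industrial Supply", "credit_limit": 50000},
        "crm": lambda key: {"segment": "Industrial", "name": "ACME (stale)"},
    },
)
print(serve_master(customer_master, "48213"))  # consumers never touch the sources directly
```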

Since the bottom line is about accessing or moving the correct data, we turn the MDM exercise upside down. The classic approach of defining data models for the whole organization is a daunting task, smiled upon, I am sure, by those Systems Integrators who have a history of making lots of money attacking tasks that verge on the impossible. Can you really know what the data needs to be from the perspective of every potential use? As always, the Planning group sees the world differently from the Operations group, which sees it differently from Finance.

Instead of focusing on an unwieldy data model, consider the approach of having a growing body of for-purpose schemas, most of which have been defined because the data is required for a project. What you need, then, rather than a big data model, is a mechanism to manage this metadata in reasonable chunks, identifying, tagging, and "sanctioning" some of it as Master Data, and then making it available for re-use in multiple ways with appropriate security.
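A minimal sketch of such a mechanism, under the assumption of a simple in-memory registry (the names and structure are invented for illustration), might look like this:

```python
# Illustrative registry of for-purpose schemas that can be tagged,
# "sanctioned" as Master Data, and searched for re-use.

class SchemaRegistry:
    """Manage for-purpose schemas in small chunks: register, tag, sanction, find."""

    def __init__(self):
        # name -> {"fields": [...], "tags": set(...), "sanctioned": bool}
        self._schemas = {}

    def register(self, name, fields, tags=()):
        """Add a for-purpose schema, typically created because a project needed it."""
        self._schemas[name] = {"fields": list(fields), "tags": set(tags), "sanctioned": False}

    def sanction(self, name):
        """Promote a schema to sanctioned Master Data."""
        self._schemas[name]["sanctioned"] = True

    def find(self, tag=None, sanctioned_only=False):
        """Look for re-usable schemas before building yet another one."""
        return [
            name
            for name, meta in self._schemas.items()
            if (tag is None or tag in meta["tags"])
            and (not sanctioned_only or meta["sanctioned"])
        ]
```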

With Agile Integration Software such as Stone Bond Technologies' Enterprise Enabler, you already have defined reusable virtual schemas, packaged with all the sources of record, federation, transformation, validation, and sometimes the data workflow relevant to general use of the data. With a single click, you can deploy one of these nuggets in any number of ways, such as a web service (SOAP, JSON, REST), an ADO.NET object, or a SharePoint external list. These modes of delivery are virtual and on-demand, and they are not just for reading data: they contain all the information for full CRUD (Create, Read, Update, Delete). Instructions for transaction rollback are included, as is end-user awareness so that data access can be secured down to individual permissions. ETL with federation, data workflow, and notifications are processes that work with or on specific data sets and can also be packaged for reuse.
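The sketch below is purely conceptual and is not the service code Enterprise Enabler generates; it only illustrates the idea of a virtual schema exposed with full CRUD, resolved on demand against the live sources, and checked against individual permissions. All names are illustrative assumptions.

```python
# Conceptual illustration of a virtual, on-demand delivery mode with full CRUD
# and per-user permission checks; nothing is staged or copied ahead of time.

class VirtualSchemaService:
    """One virtual schema exposed with full CRUD, resolved live against its sources."""

    def __init__(self, read_sources, write_target, permissions):
        self.read_sources = read_sources   # callables that federate the live root sources
        self.write_target = write_target   # callable that applies changes back to the sources
        self.permissions = permissions     # user -> set of allowed operations

    def _check(self, user, operation):
        if operation not in self.permissions.get(user, set()):
            raise PermissionError(f"{user} is not permitted to {operation} this schema")

    def read(self, user, key):
        self._check(user, "read")
        record = {}
        for fetch in self.read_sources:    # federate on demand at request time
            record.update(fetch(key))
        return record

    def create(self, user, record):
        self._check(user, "create")
        return self.write_target("create", record)

    def update(self, user, key, changes):
        self._check(user, "update")
        return self.write_target("update", {"key": key, **changes})

    def delete(self, user, key):
        self._check(user, "delete")
        return self.write_target("delete", {"key": key})
```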

Agile MDM leverages those nuggets by "sanctioning" the ones that have general usability as Master Data Services. Then, where there are gaps, new virtual schemas can be built.

Since the generation of these nuggets, rich with all the layers of information, is a matter of minutes, their proliferation could be a challenge at some point. The System Integrators need to reinvent their Best Practices for a more agile set of processes, to ensure success for these Agile MDM projects and avoid the proverbial herding of cats. In many ways, Agile MDM parallels the growth of social media. It's extremely easy to generate complex integrations, so there could easily be many for each project. The key for MDM is to start to manage the situation at the right point, late enough to leverage the body of for-purpose metadata, but early enough to guide re-usability of Master Data before there are too many to review.