Organizations wishing to transform their operations to achieve a Zero-Latency Enterprise (ZLE) face several challenges, including:
Too many disparate applications that operate as independent silos, hindering seamless flow of information
Not treating data as a first-class enterprise asset, leading to poor-quality master data used in business processes
These cause undue delays and repeated rework, lengthening the response times for business decision-making.
Organizations wishing to improve their business operations to become more agile and responsive to rapidly changing external environments must seriously adopt the principles of a Zero Latency Enterprise.
By implementing Application Data Management (ADM) solutions, organizations gain the full benefits of master data management (MDM) concepts to better govern their data, while avoiding the complexity and overheads of generic MDM tools. By producing high-quality data, organizations are set on a course to becoming a Zero Latency Enterprise. This, in turn, helps organizations reap better ROI from their execution systems such as ERP.
The Zero Latency Enterprise (ZLE)
"The big never eat the small - the fast eat the slow"
-Jason Jennings and Laurence Haughton
This quote aptly illustrates the paradigm shift in competitive dynamics in today’s business environment. Conventional wisdom holds that large companies will dominate their markets, but in many industries that is no longer true. In an era of rapid technological innovation and customer expectations of highly personalized, on-demand service, the importance of time-based competition is impossible to ignore. Business leaders across industries acknowledge that speed is key to gaining competitive advantage. These leaders turn to Information Technology to automate and streamline their business operations and transform them from functional silos into finely tuned, reusable business capabilities.
In tune with these changing business needs, the IT community has offered a variety of solutions and infrastructure offerings to support such initiatives. However, MDM vendors, employing confusing buzzwords and acronyms to highlight their competitive strengths, often obscure the path to the desired outcome. Real-time Enterprise, Adaptive Enterprise, Agile Enterprise, Event-driven Apps, Zero-Latency Enterprise – over the last couple of decades, there has been no dearth of new terms and acronyms in the business press propagating the message that it is critically important to sense relevant events and respond appropriately at the right time.
The notion of latency in business terms is an important one to understand and act upon. In business terms, latency can be described as the time required for a business event to complete and for the results of this event to be propagated to all interested consumers of this information. In a Zero Latency Enterprise (ZLE), such time lag is reduced to almost zero. In other words, current information is immediately available to all parts of the company where it is needed.
From an IT standpoint, a Zero Latency Enterprise (ZLE) is a business whose applications are wired together so that information is synchronized across them, processed and analyzed, and the resulting insights are made available to decision-makers in near real-time, with minimal latency.
For organizations that successfully manage this transformation into a ZLE, the promised benefits are many.
Easier integration of acquisitions through a common set of core capabilities
Elimination of errors and shortening of response cycle through automated processes
Increased regulatory compliance: ability to rapidly analyze and respond appropriately to changing Government policies and rules
Enhanced business model innovation capability helping to quickly react to new, emerging business needs
A positive impact on both top and bottom lines resulting in increased shareholder value and market capitalization.
Business Executives tasked with the responsibility of laying the blueprint for a transformation towards such agile enterprises face several important questions such as:
What constitutes the complete set of components of the IT solution, and what are their inter-relationships?
What parts of the current IT environment can be leveraged and how should they be integrated into the overall solution?
How should competing vendor claims be evaluated before making a buying decision?
What milestones should be planned as part of the project?
Getting the right answers to these questions takes up a considerable part of the planning exercise.
While these are very important aspects to consider, smart business executives combine both a big picture view with attention to low-level details, similar to experienced building architects. They understand that while technology is a key enabler for change, what powers that change is the availability of high fidelity representations of business objects, along with unambiguous semantics that explain the meanings of these data classes, their attributes, and their relationships.
Don’t Forget the Data!
Without over-emphasizing the obvious, it is quite clear that numerous business initiatives rely on the presence of high-quality master data to supply reliable, trustworthy facts to support making key decisions. As an example, consider the following business programs.
Rationalize the vendor base through Enterprise Spend Management or Strategic sourcing
Minimize inventory, improve profitability and visibility by focusing on parts reuse
Integrate mergers and acquisitions efficiently and effectively, realizing synergies by consolidating operations, supply chains, and product lines
When master data is inaccurate, incomplete, out-of-date, or duplicated, business processes magnify and propagate these data deficiencies further into other parts of the Enterprise.
The concept of latency applies as much to the abstract domain of information supply chains as it does to the physical supply chains that handle the flow of goods and materials for manufacturing operations. The lifecycle management of data within the organization is just as relevant to eliminating latency and achieving a ZLE.
For companies that want to wring non-value-adding activities (in Lean Enterprise terminology, such activities are termed “waste”) out of their physical and information management processes, the Zero Latency Enterprise (ZLE) is an ideal state to aspire to. As the name indicates, in a ZLE such waste is eliminated and latency is reduced to almost zero.
For a ZLE, closed-loop operations imply eliminating the latency between transactional and analytical activities
In a closed loop business operation, transactional data captured in backend execution systems such as ERP/CRM etc. is shared in near real-time with downstream analytical applications such as Business Intelligence (BI) tools so that decisions can be made faster using accurate and timely operational data. Since the effectiveness of executive decision-making depends on the quality of data available at hand, a key source of competitive advantage lies in how such data assets are managed.
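To make the closed loop concrete, here is a minimal sketch (all names hypothetical, not any vendor's API) of event-driven propagation: a transaction recorded in an execution system is published immediately to an analytical consumer, rather than waiting for a nightly batch load.

```python
class EventBus:
    """Tiny in-process publish/subscribe hub."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, handler):
        self.subscribers.append(handler)

    def publish(self, event):
        # Deliver the event to every consumer as soon as it occurs.
        for handler in self.subscribers:
            handler(event)


class SalesAnalytics:
    """Stand-in for a BI consumer keeping a running revenue total."""
    def __init__(self):
        self.total_revenue = 0.0

    def on_order(self, order):
        self.total_revenue += order["amount"]


bus = EventBus()
analytics = SalesAnalytics()
bus.subscribe(analytics.on_order)

# The "ERP" records each order and publishes it in the same step,
# so the analytical view reflects it with effectively zero latency.
bus.publish({"order_id": "SO-1001", "amount": 250.0})
bus.publish({"order_id": "SO-1002", "amount": 100.0})

print(analytics.total_revenue)  # → 350.0
```

In a batch-oriented architecture the analytical total would lag the transactional systems by hours or days; here the gap between "event happens" and "event is visible to decision-makers" collapses to a single function call.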
Poor Master Data Impacts the Bottom Line
As noted earlier, business initiatives depend on high-quality master data to supply reliable, trustworthy facts for key decisions. Yet master data handling in many organizations is in a state of disarray. Due to the siloed nature of many IT systems and applications, master data is stored redundantly in multiple places. This results in disparate data nomenclatures for the same entity, differing data structures and definitions, inconsistent enforcement of business constraints, and so on. Poor master data manifests along several dimensions:
Duplicate records pertaining to the same real-world entity
Records that are out-of-date, no longer current, or irrelevant
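The duplicate-record problem above can be illustrated with a small sketch: two systems often carry the same vendor under slightly different names, so a common detection technique is to compare a normalized key. Field names and suffix list here are hypothetical.

```python
def normalize(record):
    """Build a comparison key from a cleaned-up name plus postal code."""
    name = "".join(ch for ch in record["name"].lower() if ch.isalnum())
    # Corporate suffixes often differ between systems; strip a few common ones.
    for suffix in ("inc", "corp", "llc", "ltd"):
        if name.endswith(suffix):
            name = name[: -len(suffix)]
    return (name, record["zip"])


def find_duplicates(records):
    """Return pairs of record ids that collapse to the same key."""
    seen = {}
    duplicates = []
    for rec in records:
        key = normalize(rec)
        if key in seen:
            duplicates.append((seen[key]["id"], rec["id"]))
        else:
            seen[key] = rec
    return duplicates


vendors = [
    {"id": 1, "name": "Acme Corp", "zip": "07001"},
    {"id": 2, "name": "ACME, Inc.", "zip": "07001"},
    {"id": 3, "name": "Globex Ltd", "zip": "10001"},
]
print(find_duplicates(vendors))  # → [(1, 2)]
```

Real MDM/ADM tools use far richer matching (fuzzy string distance, address standardization, survivorship rules), but the principle is the same: without such matching, records 1 and 2 would be treated as two different vendors throughout the enterprise.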
When business processes use such error-ridden data to run their transactions, it results in magnifying and propagating these data deficiencies further into other parts of the Enterprise. This has a profound impact on the company’s operational and financial results. Some or all of the following adverse outcomes can be observed:
Executives stop using operational BI reports, voiding the ROI assumptions of BI initiatives
Top and bottom lines continue to suffer on account of poor decision-making and/or the increased cost of bad data
Executives are forced to manage by instinct and gut feel rather than facts, risking poor decisions
Vendors may capitalize by charging more for goods and services.
Increased costs for fulfilling customer demand on account of wrong shipments.
Increased inventory on account of poor supply chain planning.
Increased ghost demand from customers, on account of lower fulfillment expectancy.
In extreme cases, shareholders lose confidence and market capitalization plummets
Master Data Management (MDM)
Due to growing awareness of the adverse outcomes of poor master data and its impact on the business bottom line, many companies have adopted a more disciplined approach to managing their information assets by employing solutions under the label of Master Data Management (MDM). There are two distinct perspectives to understand when considering an evaluation of potential MDM solutions.
First and foremost, master data management is a business concern that should be driven by business leadership with a strategic, corporate-wide focus. Hence, MDM is primarily a discipline with a clear set of concepts that educate users on how to master their data throughout the course of its usage in the enterprise and among its partners.
MDM is also a tool that offers features to automate the processes involved in lifecycle management of master data.
What is the Ideal MDM solution?
An ideal MDM solution would be one where:
The MDM system owns 100% authorship of all the relevant data attributes that are needed by all the member applications in the organization
Master data is available to all consumers instantaneously, on demand, without any additional efforts of data integration and other associated overheads
An MDM tool with the above-mentioned characteristics would indeed help alleviate all master data concerns in a cost-effective and efficient manner. However, the reality is that many such tools fall far short of this ideal, and many business environments do not allow such idealized applications to take root.
So how do organizations leverage the powerful concepts of MDM without being hit by the downsides of MDM tools? In the absence of an ideal MDM solution, organizations have to rely on a smarter variation of deploying the MDM tool, wherein a mission-critical application such as an ERP system drives the interest in and deployment of the tool. Such solutions deliver the required master data with zero latency, thus eliminating waste in the information supply chain. This is where the discipline of Application Data Management (ADM) helps deliver on the potential of MDM.
Application Data Management (ADM): A Refined Form of MDM
Application Data Management (ADM) is a discipline akin to MDM but with a difference. In ADM, the scope under consideration is the specific application (or set of applications) for whom master data is being governed.
ADM serves as a quality staging area for the master data needs of the specific application. In doing so, it functions as a decoupling point between upstream data creation activities and the various downstream data consumers, maintaining a clear segregation of duties between data provisioning and consumption. This decoupling allows all quality checks and balances to be defined centrally as business metadata and then applied consistently to every data record. On the consumption side, centrally validated data relieves consumers of the burden of verifying master data and eliminates the need to build redundant validations before using the data in the member application’s functions.
Information de-coupling point
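The idea of centrally defined checks can be sketched as follows: validation rules are declared once, as metadata, and applied uniformly to every record before it reaches any consumer. Rule names and fields here are illustrative assumptions, not any product's schema.

```python
# Central "business metadata": one rule per attribute, defined once.
RULES = {
    "part_number": lambda v: isinstance(v, str) and v.strip() != "",
    "unit_cost":   lambda v: isinstance(v, (int, float)) and v >= 0,
    "uom":         lambda v: v in {"EA", "KG", "M"},
}


def validate(record):
    """Return the list of fields that fail the central rules."""
    return [field for field, rule in RULES.items()
            if not rule(record.get(field))]


good = {"part_number": "P-100", "unit_cost": 12.5, "uom": "EA"}
bad  = {"part_number": "",      "unit_cost": -1,   "uom": "BOX"}

print(validate(good))  # → []
print(validate(bad))   # → ['part_number', 'unit_cost', 'uom']
```

Because every record passes through the same rule set at the staging point, downstream applications can consume the data without re-implementing these checks, which is the segregation of duties the text describes.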
Apart from providing data governance features of a generic MDM tool, the chief advantage of an ADM tool is that it is pre-integrated to downstream consuming applications such as ERP, seamlessly delivering the right information to them, thereby eliminating latency in provisioning high-quality data to desired consumers.
An added dimension of ADM is that it is applicable for all data entities used in the business – both master data as well as transactional entities such as Invoices, Orders, Requisitions etc. Thus, within the scope of a single application, ADM is the umbrella under which all data is managed across its lifecycle of usage in the Organization.
Thus, ADM stays true to the spirit of MDM when it comes to managing master data for a given application while, at the same time, enhancing MDM with application-specific features in its tools.
Comparative assessment of MDM and ADM
While both MDM and ADM support similar foundational concepts to manage the lifecycle of master data, the differences between them become visible when one observes the capabilities of the tools currently available in the market. The following table provides an assessment across different dimensions of the discipline of MDM and its refined version ADM.
|Dimension||MDM||ADM|
|Scope of usage||Typically aims to become the supplier of high-quality master data for several applications of the organization.||Tailor-made to deliver high-quality master data to specific mission-critical applications.|
|Scope of data||Master data across multiple domains such as Product, Customer, Location, Asset, etc., across their entire lifecycle of usage.||Covers both master data and other transactional entities used in the applications.|
|Data authority & control||In many implementations, MDM and existing applications both continue to maintain parts of the master data. As more applications are involved, it becomes a maintenance nightmare to ensure that business and logical rules for data management are applied consistently across all the applications and the MDM system.||100% authoring of all master data attributes and relationships used by the specific applications. Member applications become consumers of master data, without any concerns about the quality of the data supplied. Data governance is effectively enforced because of single-source authoring.|
|Survivorship||The golden record is typically maintained only in the MDM system and serves as a reference in case of conflict in member applications.||Goes beyond the golden record to reflect other business documents such as invoices, sales orders, and contracts with the right account information.|
|Data provisioning||Different applications impose different latencies on data consumption as per their operating characteristics. The MDM system has to support multiple modes of consumption such as batch, near real-time, and data feeds, using costly data integration tools, and must be architected to handle these possibilities.||Pre-built integration with the intended consuming applications means that all master data is delivered for instantaneous use without the overhead of data transformation and integration.|
|When best to use||When shared master data has to be harmonized across multiple applications that cannot easily be changed, at least in the short term, to work under a single ownership model of master data. True enterprise MDM is a very long journey!||When an organization runs more than 80% of its operations on a mission-critical application system such as ERP and needs to get master data for this system absolutely right, at very low cost and without much time to waste.|
Organizations have to realize that the right MDM tools help eliminate master data errors, thereby playing an important role in reducing the time to respond to business challenges and opportunities. The more such waste is reduced, the sooner the organization starts to behave as a Zero Latency Enterprise.
Hence, selection of the right MDM solution becomes a very strategic business decision. Where master data for a mission-critical backend application is concerned, current “generic” MDM tools are not engineered to deliver master data to the downstream application at zero latency. Because of this design limitation, organizations incur additional cost and time overheads over the lifecycle of application master data usage. An application-specific variation of MDM called Application Data Management (ADM) helps overcome such delays and lays the foundation for a lean and efficient information supply chain.
Organizations that run a significant majority (80% or more) of their operations on one or two mission-critical business applications such as ERP would be well-advised to avoid “generic” MDM tools that promise a lot but under-deliver. Such organizations should instead seek out more refined versions of MDM tools called ADM solutions that deliver a higher value proposition, much faster than with traditional MDM tools, and that will set them on the path to becoming Zero Latency Enterprises.
Power of Triniti MDM
Triniti’s MDM, with its ADM extensions, is an ideal solution for all your master data needs. And it fulfills those needs economically, in both software and implementation.