
Initiative and Enterprise According to Experts
David Loshin, 2009

2.3.5 Data Governance and Data Quality

An enterprise initiative introduces new constraints on the ways that individuals create, access, use, modify, and retire data. To ensure that these constraints are not violated, the data governance and data quality staff must introduce stewardship, ownership, and management policies, as well as the means to monitor observance of these policies.

A success factor for MDM is its ubiquity; the value becomes apparent to the organization as more lines of business participate, both as data suppliers and as master data consumers. This suggests that MDM needs governance to encourage collaboration and participation across the enterprise, but MDM also drives governance by providing a single point of truth.
Ultimately, the use of the master data asset as an acknowledged high-quality resource is driven by transparent adherence to defined information policies specifying the acceptable levels of data quality for shared information. MDM programs require some layer of governance, whether that means incorporating metadata analysis and registration, developing "rules of engagement" for collaboration, defining data quality expectations and rules, monitoring and managing the quality of data and changes to master data, providing stewardship to oversee automation of linkage and hierarchies, or offering processes for researching root causes and subsequently eliminating sources of flawed data.

David Loshin, 2009

3.3 Manifesting Information Oversight with Governance

Because MDM is an enterprise initiative, there must be some assurance of stakeholder adherence to the rules that govern participation and information sharing.
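The monitoring side of this — checking master records against defined data quality rules and counting violations for conformance reporting — can be sketched in a few lines. The rule definitions and field names below are illustrative inventions, not rules from the text:

```python
# Minimal sketch of data-quality rule monitoring over master records.
# The rules and field names are illustrative, not a real MDM rule set.

from typing import Callable

# A rule pairs a descriptive name with a predicate over a record (dict).
Rule = tuple[str, Callable[[dict], bool]]

RULES: list[Rule] = [
    ("customer_id is present", lambda r: bool(r.get("customer_id"))),
    ("country uses a 2-letter code", lambda r: len(r.get("country", "")) == 2),
    ("email contains '@'", lambda r: "@" in r.get("email", "")),
]

def audit(records: list[dict]) -> dict[str, int]:
    """Count violations per rule, the raw input for conformance metrics."""
    violations = {name: 0 for name, _ in RULES}
    for record in records:
        for name, predicate in RULES:
            if not predicate(record):
                violations[name] += 1
    return violations

records = [
    {"customer_id": "C001", "country": "US", "email": "a@example.com"},
    {"customer_id": "", "country": "USA", "email": "bad-address"},
]
report = audit(records)
```

Feeding counts like these into periodic reports is one simple way to make "transparent adherence" to an information policy measurable rather than aspirational.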
As we will discuss in great detail in the next chapter, a data governance program applied across different business-level domains will address issues of data stewardship, ownership, privacy, security, data risks, compliance, data sensitivity, and metadata management. Each of these issues focuses on integrating technical data management with oversight, ensuring organizational observance of defined information policies. The four areas of concentration for data governance are the standardization of common use at the data element level, the consolidation of metadata into enterprise management systems, the management of data quality, and operational data stewardship.

3.3.1 Standardized Definitions

Whereas humans are typically adept at resolving ambiguity in words and phrases, application systems are considerably less so.
People are able to overcome the barriers of missing information or potentially conflicting definitions, although at some point each individual's translation of a business term may differ slightly from other translations. This becomes an issue during integration and consolidation when data element instances that may share a name do not share a meaning, or differently named data elements are not recognized as representing the same concept.
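The integration hazard described here — the same name carrying different meanings, or different names hiding the same concept — can be made concrete with a small glossary check. The system names and definitions below are invented for illustration:

```python
# Hypothetical business glossary keyed by (system, element name).
# Systems and definitions are invented examples of the two hazards:
# shared names with conflicting meanings, and synonyms with shared meaning.

glossary = {
    ("billing", "customer"): "party that receives and pays invoices",
    ("crm", "customer"): "any party that has ever made an inquiry",
    ("billing", "acct_holder"): "party that receives and pays invoices",
}

def name_collisions(gloss: dict) -> set:
    """Element names used in multiple systems with conflicting definitions."""
    by_name: dict = {}
    for (system, name), definition in gloss.items():
        by_name.setdefault(name, set()).add(definition)
    return {name for name, defs in by_name.items() if len(defs) > 1}

def synonym_pairs(gloss: dict) -> set:
    """Differently named elements that share one definition."""
    by_def: dict = {}
    for (system, name), definition in gloss.items():
        by_def.setdefault(definition, set()).add(name)
    return {frozenset(names) for names in by_def.values() if len(names) > 1}
```

Surfacing both lists early is precisely the kind of work the standardized-definitions activity performs before any records are physically consolidated.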
Processes for data analytics, for assessing organizational data element information, and for coalescing that information into business metadata provide standardized definitions that ultimately drive and control the determination of the catalog of master data objects and how they are resolved into the unique view.

3.3.2 Consolidated Metadata Management

A by-product of the process for identifying and clarifying data element names, definitions, and other relevant attribution is the discovery and documentation of enterprise-wide business metadata.

David Loshin, 2009

Publisher Summary

Master data management (MDM) is an enterprise initiative, and that means an enterprise data governance program must be in place to oversee it. Governance is a critical issue for deploying MDM. The objective of data governance is predicated on the desire to assess and manage the many kinds of risk that lurk within the enterprise information portfolio and to reduce the impacts incurred by the absence of oversight. Although many data governance activities may be triggered by concerns about regulatory compliance, the controls introduced by data governance processes and protocols provide quantitative metrics for assessing risk reduction as well as for measuring business performance improvements.
By introducing a program for defining information policies that relate to the constraints of the business and adding in management and technical oversight, the organization can realign itself around performance measures that include adherence to business policies and the information policies that support the business. This chapter discusses how business policies are composed of information directives and how data rules contribute to conformance to those information directives.
It examines what data governance is, introduces data stewardship roles and responsibilities, and proposes a collaborative enterprise data governance framework for data sharing. The three important aspects of data governance for MDM are managing key data entities and critical data elements, ensuring the observance of information policies, and documenting and ensuring accountability for maintaining high-quality master data.

David Loshin, 2011

20.1.4 Collaboration with Enterprise Initiatives

Enterprise data quality management cannot be divorced from any other enterprise initiatives that are under way. One must consider the context and landscape in which data quality management will be deployed when designing the data quality program, and one must investigate the impact that the data quality initiative will have on other organizational initiatives and vice versa.

As organizations gradually recognize that the connectivity between the operational and analytic aspects of the business is driven by high-quality data, there will be a recurring need to ensure that any major initiative is aligned with the data governance and data quality management processes described in this book.
Chapter 4 suggested looking at planning initiatives, framework initiatives, and operational and application initiatives, their relationship with data quality management, and how those activities affect the scope of integrating data quality management as an enterprise program.

Although maintaining a competitive edge requires forethought by senior management, many organizations are plagued by the absence of the strategic objectives intended to drive innovation and excellence in the marketplace. Organizations that engage in defining a vision and planning a strategy are more likely to focus on achieving well-defined objectives.
This suggests looking at examples of planning initiatives and their interdependency with data quality management, such as performance management, key performance indicators (KPIs), process improvements, organizational change, and strategic planning.

Often, the organically grown application infrastructure is perceived to have been assembled through the development of business applications supporting vertical lines of business. However, the perception that the organization should operate holistically, in a way that optimizes general corporate benefit, suggests aligning business processes along horizontal lines and not just in silos. That in turn suggests a review of framework initiatives such as enterprise architecture, enterprise resource planning, supply chain management, and the retirement of legacy applications.

Aside from organizational and framework initiatives, there are operational and application initiatives that affect the entire organization, especially in relation to data quality management. In this section we consider a few examples: compliance, business intelligence, and the purchase and deployment of proprietary systems.

Although the intentions of the data quality program are driven by the maturity of functional capability, the scope of the data quality program is not immune to impact from other activities that are diffused across the organization. In other words, the program should be reviewed in the context of how the data quality practitioners integrate a program supporting organizational change and upheaval, new initiatives, or other broad-based organizational activities.

David Loshin, 2009

6.1 Introduction

At a purely technical level, there is a significant need for coordination to oversee and guide the information management aspects of an enterprise initiative such as MDM. The political and organizational aspects of this coordination are addressed as part of the governance program that must accompany an MDM program.
However, all aspects of determining need, planning, migration strategy, and future state require a clarified view of the information about the data used within the organization: its metadata.

It is easy to fall into the trap of referring to metadata by its industry-accepted definition: data about the data. This relatively benign description does not provide the depth of understanding that adds value to an MDM deployment. The metadata associated with an enterprise master data set does more than just describe the size and type of each data element. It was the historically distributed application and data silos, with their variance in meaning and structure, that necessitated MDM in the first place. Therefore, to develop a model, framework, and architecture that provide a unified view across these applications, there must be a control mechanism, or perhaps even a "clearing house," for unifying the view when possible and for determining when that unification is not possible.

In fact, the scale of metadata management needed for transitioning enterprise data sets into a master data environment differs from the relatively simple data-dictionary-style repositories that support individual applications. Sizes and types are just the tip of the iceberg. Integration of records from different data sets can only be done when it is clear that the data elements have the same meaning, that their valid data domains are consistent, and that the records represent similar or the same real-world entities.
Not only that, but there are more complex dependencies as well: Do client applications use the same entity types? Do different applications use different logical names for similar objects? How is access for reading and writing data objects controlled? These and many other important variable aspects must be addressed. There is value in looking at a conceptual view of master metadata that starts with basic building blocks and grows to maintain comprehensive views of the information that is used to help an organization achieve its business objectives.
We can look at seven levels of metadata that are critical to master data management, starting from the bottom up:

▪Business definitions. Look at the business terms used across the organization and their associated meanings.

▪Reference metadata. Detail data domains (both conceptual domains and corresponding value domains) as well as reference data and the mappings between codes and values.

▪Data element metadata. Focus on data element definitions, structures, nomenclature, and determination of existence along a critical path of a processing stream.

▪Information architecture. Coalesces the representations of data elements into cohesive entity structures, shows how those structures reflect real-world objects, and explores how those objects interact within business processes.

▪Data governance management. Concentrates on the data rules governing data quality, data use, and access control, and on the protocols for rule observance (and the processes for remediating rule violations).

▪Service metadata. Looks at the abstract functionality embedded in and used by the applications and the degree to which those functions can be described as stand-alone services, along with the mapping from services to client applications.

▪Business metadata. At the top of the stack, captures the business policies that drive application design and implementation, the corresponding information policies that drive the implementation decisions inherent in the lower levels of the stack, and the management and execution schemes for the business rules that embody both business and information policies.

Given this high-level description of a metadata stack, the challenge is to look at how these levels interact as part of an overall metadata management strategy. This view, shown as a whole in Figure 6.1, enables us to consider metadata as a "control panel," because the cumulative knowledge embedded within the metadata management framework will ultimately help determine the most appropriate methods for delivering a master data asset that is optimally suited to the organization. In this chapter, we will look at each layer of metadata from the bottom up and review its relevance to the master data management framework.

Figure 6.1: The MDM metadata stack.

Valuable work has been invested in developing standards for managing metadata repositories and registries as part of an International Organization for Standardization activity.
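One way to picture the stack is as an ordered structure, bottom up. The level names come from the text; the data-structure shape and the per-level concern lists are an illustrative sketch, not the ISO/IEC 11179 model:

```python
# The seven metadata levels named in the text, ordered bottom-up.
# The dataclass shape and concern summaries are an illustrative sketch.

from dataclasses import dataclass, field

@dataclass
class MetadataLevel:
    name: str
    concerns: list = field(default_factory=list)

MDM_METADATA_STACK = [
    MetadataLevel("Business definitions", ["business terms", "meanings"]),
    MetadataLevel("Reference metadata", ["conceptual/value domains", "code mappings"]),
    MetadataLevel("Data element metadata", ["definitions", "structures", "nomenclature"]),
    MetadataLevel("Information architecture", ["entity structures", "process interaction"]),
    MetadataLevel("Data governance management", ["data rules", "access control", "remediation"]),
    MetadataLevel("Service metadata", ["stand-alone services", "service-to-client mapping"]),
    MetadataLevel("Business metadata", ["business policies", "information policies", "business rules"]),
]

def level_of(name: str) -> int:
    """Return the 1-based, bottom-up position of a level in the stack."""
    for i, level in enumerate(MDM_METADATA_STACK, start=1):
        if level.name == name:
            return i
    raise KeyError(name)
```

The ordering matters: each level builds on knowledge captured in the ones below it, which is why the chapter works through them from the bottom up.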
The resulting standard for Metadata Registries, ISO/IEC 11179 (see www.metadata-stds.org), is an excellent resource for learning more about metadata management, and some of the material in this chapter refers to the 11179 standard.

One word of caution, though: the rampant interconnectedness of the information that is to be captured within the metadata model implies that analysts must take an iterative approach to collecting the enterprise knowledge. Business process models will reveal new conceptual data elements; relationships between master data object types may not be completely aligned until business process flows are documented.
The effective use of metadata relies on its existence as a living artifact, not just a repository for documentation.

Within this stack, there are many components that require management. Although numerous metadata tools may supplement the collection of a number of these components, the thing to keep in mind is not the underlying tool but the relevance of each component with respect to MDM, and the processes associated with the collection and use of master metadata. Standard desktop tools can be used as an initial pass for capturing master metadata.
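An initial pass with ordinary desktop tooling can be as simple as a shared spreadsheet exported to CSV and parsed into candidate metadata records. The column names and row contents below are illustrative, not a prescribed schema:

```python
# Sketch of a first-pass master-metadata capture from a plain CSV file,
# the kind a desktop spreadsheet tool would export. Columns are invented.

import csv
import io

CSV_TEXT = """element_name,source_system,definition,steward
customer_id,billing,Unique identifier for an invoiced party,J. Smith
customer_id,crm,Identifier assigned at first inquiry,A. Jones
"""

def load_candidate_metadata(text: str) -> list:
    """Parse the spreadsheet export into candidate metadata records."""
    return list(csv.DictReader(io.StringIO(text)))

records = load_candidate_metadata(CSV_TEXT)
systems = {r["source_system"] for r in records}
```

Even this crude capture makes the stakeholder-consensus conversation concrete: two systems defining `customer_id` differently is exactly the finding that later justifies acquiring a real metadata management tool.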
Once processes are in place for reaching consensus across the stakeholder community as to what will ultimately constitute the metadata asset, requirements can be identified for acquiring a metadata management tool.

We will follow the alignment items through vision and on into where they appear on the Road Map. The example later shows a portion of the strategic plan for Farfel. In reality, this type of artifact may depict a dozen or so enterprise initiatives with associated goals and objectives.
Farfel did not conduct the “value uses” exercise. Resistance was too high from day one.
So we managed available ties with executives very carefully and performed a semi-underground analysis that went straight to levers, Business Information Requirements (BIRs), and actions. In addition, the lack of any solid workflows at all pushed this EIM effort into deeper areas of content and process design. While the examples show a decomposition and derivation of metrics and requirements, let's face it: most industries are mature enough that 70–80% of the information and content needs can be derived from a standard model. In fact, this is what we do, especially after the fourth or fifth retailer or insurance company. A transparent strategy and decomposition into a solid baseline set of EIM/IAM requirements does not take very long.
The EIM team should concentrate on that 20–30% of data and content needs where you are going to differentiate yourself (Table 13.8).

Table 13.8 Farfel Emporiums Summarized Goals

Driver | Goal | Documented Objectives | Measurable Attributes
Improve market share | Recover lead market share in category | Regain market share of 25% | Market share
Improve market share | Increase top-line sales across all categories and stores | Increase same-store sales 15% over three years | Same-store sales, forecasted vs. actual
Increase customer interactions | Improve customer experience | Increase visits per store from three to four per year | Store visits, market-basket return
Increase customer interactions | Improve service environment, highlight differences | | Surveyed opinions
Increase customer interactions | Improve effectiveness of web site | Improve web site sales 15% without cannibalizing store sales | Percentage of sales from web site
Increase customer interactions | Integrate store and web site offerings | | Frequency of assortment refresh

Alan Simon, 2015

Greater Appreciation for the Value of Cross-Functional Business Intelligence

Earlier in this chapter, we looked at how cross-functional, cross-organizational, cross-geography business intelligence surprisingly turned out to be "devalued" versus smaller-scope BI.
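Measurable attributes like these reduce to simple computations over operational data. A hypothetical check of the same-store sales objective (15% growth), with invented store identifiers and figures, might look like:

```python
# Hypothetical check of a "same-store sales" growth objective against
# sample figures. Store IDs and all numbers are invented for illustration.

def same_store_growth(base: dict, current: dict) -> float:
    """Growth rate over stores open in both periods (the 'same' stores)."""
    common = base.keys() & current.keys()
    base_total = sum(base[s] for s in common)
    current_total = sum(current[s] for s in common)
    return (current_total - base_total) / base_total

base_year = {"store_1": 100.0, "store_2": 200.0, "store_3": 50.0}
this_year = {"store_1": 120.0, "store_2": 225.0, "store_4": 80.0}  # store_3 closed, store_4 opened

growth = same_store_growth(base_year, this_year)
meets_objective = growth >= 0.15
```

Note the restriction to stores present in both periods: including newly opened or closed stores would conflate organic growth with footprint changes, which is precisely what the "same-store" attribute exists to exclude.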
And, as a result, the motivation for struggling through an enterprise initiative diminished even further in favor of smaller-scale data marts.

Today, though, this silo-type thinking is rapidly diminishing. While individual organizations and business process owners may still be rather myopic about the data-driven insights that they care about, corporate and government leaders are increasingly demanding the long-promised enterprise-level insights that have been tabled for so long.

Even those leaders who have access to broad, enterprise-scale reports and dashboards often receive those capabilities through a great deal of tedious, error-prone "under the covers" manual integration of content from many different data marts and spreadmarts. As their appetite for more and more enterprise data increases, those manual processes are becoming strained to the brink of failure.

Thus, we find new emphasis placed on consolidating and synthesizing data into a single location (physical or virtual), under whatever label might be in vogue (enterprise data warehouse, data lake, or data refinery), for mission-critical purposes.
The objective of this book is to raise awareness among those tasked with developing MDM solutions of both the organizational and technical challenges and to help them develop a road map for success. To that end, the book concentrates on identifying the issues of critical importance, raises some of the questions that need to be asked and answered, and provides guidance on how to jumpstart that process within your organization.
The book has 13 chapters.

▪Chapter 1, Master Data and Master Data Management, introduces the historical issues that have created the need for master data management, describes both "master data" and "master data management," begins to explore what goes into an MDM program, and reviews the business value of instituting an MDM program, along with this overview of the rest of the book.

▪Chapter 2, Coordination: Stakeholders, Requirements, and Planning, describes who the MDM stakeholders are, why they are relevant to the success of an MDM program, and what their expected participation should be over the course of the program's development.
▪Every organization exhibits a different level of maturity when it comes to sharing consolidated information, and Chapter 3, MDM Components and the Maturity Model, provides a capability model against which an organization's maturity can be measured. By assessing the organization's current state, considering the level of maturity necessary to achieve the organization's objectives, and determining where the organization needs to be, one can assemble an implementation road map that enables action.

▪Master data management is an enterprise initiative, and that means an enterprise data governance program must be in place to oversee it. Governance is a critical issue for deploying MDM. In Chapter 4, Data Governance for Master Data Management, we discuss how business policies are composed of information directives, and how data rules contribute to conformance to those information directives. We'll look at what data governance is, introduce data stewardship roles and responsibilities, and propose a collaborative enterprise data governance framework for data sharing.
▪No book on MDM would be complete without a discussion of the value of data quality, and Chapter 5, Data Quality and MDM, examines the historical evolution of MDM from data quality to its reliance on high-quality information. This chapter provides a high-level view of the data quality components and methods that are used for the purposes of master data integration.
▪The key to information sharing through an MDM repository is a solid set of data standards for defining and managing enterprise data and a comprehensive business metadata management scheme for controlling the use of enterprise data. Chapter 6, Metadata Management for MDM, discusses data standards and metadata management and explores how master metadata is managed.

▪As part of the process, it is necessary to identify the master data object types and determine the data assets that make up those object types across the enterprise. In Chapter 7, Identifying Master Metadata and Master Data, we look at the process of identifying and finding the data sets that are candidates as sources for master data and how to qualify them in terms of usability.

▪A core issue for MDM is creating the consolidation models to collect and aggregate master data. Chapter 8, Data Modeling for MDM, is where we look at some of the issues associated with different source models and how to address data modeling issues for MDM.
▪There are different architectural paradigms for master data management, and in Chapter 9, MDM Paradigms and Architectures, we look at existing application and information architectures and different architectural styles for MDM, how they reflect a spectrum of implementations, and the pros and cons of each style.

▪Given a model and an understanding of the sources of master data, the next step is the actual process of data consolidation and integration, and Chapter 10, Data Consolidation and Integration, looks at collecting information from across the organization and formulating it into the integrated master data asset.

▪The power of MDM increases greatly when the master data can be integrated back into the existing application environment. Chapter 11, Master Data Synchronization, discusses the needs and approaches for synchronizing data back to the existing applications.

▪The value of MDM does not lie solely with the integration of data. The ability to consolidate application functionality (e.g., new customer creation) using a services layer that supplements multiple application approaches will provide additional value across existing and future applications. The topic of a functional application services layer is covered in Chapter 12, Master Data Management and the Functional Services Layer.
▪The book concludes with Chapter 13, Management Guidelines for MDM, a summary of the guidance provided throughout the preceding chapters to inform management decisions. To address the ongoing management issues, we offer some management guidelines for transitioning the project from developing the business case to maintaining a successful program.