This blog post is co-written by my colleague Joseph Yacura, who came up with the brilliant idea when we were chatting yesterday!
Data quality and governance have always been “important” to the supply chain function, but today they are “critical”. Our most recent study, completed by the Supply Chain Resource Cooperative in 2018, suggests that most senior managers may not be fully aware of the impact of poor data quality on decision-support systems, even though a majority perceive that their competitors may be surpassing them in data-based decision-making.
While supply chains are attempting to become digitized, it is becoming very apparent that most are ill-prepared to digitize end-to-end. The survey showed that only 15% of respondents believe their existing systems are capable of producing clean data that can be trusted. In addition, 20% more respondents in 2018 than in 2017 reported spending more than one-quarter of their day searching for data, a significant increase. Speed in data retrieval depends on good data governance systems that bring critical data to users in a format that is easily accessible.
The “hidden cost” here is that the time spent in finding information and redoing data analysis is a significant drag on productivity in the knowledge economy and digital transformation. Slow data retrieval is an impediment to the need for real-time data and real-time decision-making.
Current digitization efforts are falling short of their desired end state due to critical data deficiencies (see the report Digital Supply Chain: It’s All About The Data).
This deficiency results from most companies’ lack of focus on data quality and governance. At the root of this problem is the elephant in the room: how did we get to this state of affairs?
Most companies traditionally address their data needs during the early stages of a procurement project involving third-party suppliers, who are charged with collecting data, incorporating it into a system or product, and ensuring that the system is capable of ingesting new data that is processed into analytics. It is at this phase that data requirements must be closely defined, during the early stages of the contract administration process. Addressing data requirements only once the contract is signed and the project begins is no longer sufficient. Procurement executives need to take action: they must begin incorporating data governance requirements into all future projects during the early stages of system requirements definition, so that those requirements go into the Requests for Proposals/Quotes sent to suppliers!
Specifically, as companies and their supply chain functions move to digitize, every Request for Quote and Request for Proposal should integrate the specific requirement that data governance, data cleansing, and data maintenance are necessary for all systems. This requirement must clearly establish that the project includes all data cleansing and governance, and that suppliers are fully responsible for carrying this out.
That is, suppliers must be led to believe nothing other than that the data they will begin working with will surely be of poor quality, will have multiple errors coming from multiple systems, and that these data need to be cleansed before being “dumped” into any type of procurement or supply chain control tower, analytics system, or data lake. Second, the data input function for all new data coming into the system must be rendered “dummy proof”, to prevent further “pollution” of the cleansed data with more incomplete or poor-quality data. Finally, for all data output going from the system into other applications, suppliers need to know your data requirements, and you must understand their capabilities to satisfy those requirements. As a result, it is imperative that all companies start to include in all their RFx documents a section titled “Data Requirements”.
This “Data Requirements” section would include, but not be limited to:
- The types of data requested
- Definitions of each data type
- Frequency of reporting for each data type
- Units of measure for each data type
- Clarity on data ownership
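To make the idea concrete, the fields above could be captured in a machine-readable form that both buyer and supplier validate incoming data against. The following is only an illustrative sketch; the field names, sample values, and validation rules are hypothetical, not part of any standard RFx template:

```python
# Hypothetical sketch: encoding an RFx "Data Requirements" section as a
# machine-readable spec. All field names and sample values are illustrative.
DATA_REQUIREMENTS = {
    "unit_price": {
        "definition": "Negotiated price per unit, excluding freight",
        "type": float,
        "unit_of_measure": "USD",
        "reporting_frequency": "monthly",
        "owner": "buyer",
    },
    "lead_time": {
        "definition": "Order-to-delivery time",
        "type": int,
        "unit_of_measure": "days",
        "reporting_frequency": "weekly",
        "owner": "supplier",
    },
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems found in one incoming data record."""
    problems = []
    for field, spec in DATA_REQUIREMENTS.items():
        if field not in record or record[field] is None:
            problems.append(f"{field}: missing")
        elif not isinstance(record[field], spec["type"]):
            problems.append(f"{field}: expected {spec['type'].__name__}")
    return problems

# A record with lead_time supplied as text instead of a number of days
print(validate_record({"unit_price": 12.5, "lead_time": "ten"}))
```

A spec like this doubles as the “dummy proof” input gate described above: any record that fails validation is rejected before it can pollute the cleansed data.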
The economic benefits of quality data have been quantified. A recent McKinsey study found that statistical models and advanced analytics built on high-quality data can achieve cost savings of 3–8 percent. This can be factored into the evaluation of supplier proposals: suppliers with a better track record in data cleansing may be more expensive, but offer a better payback in the long run.
By including these requirements in all future RFx’s, system developers will need to perform the due diligence to fully understand the scope of the data requirements. Further, milestones need to be built into the project scope at various stages, where data quality will be tested using standard metrics. (Data quality measures can be found in our data governance survey.)
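One common metric of this kind is completeness: the share of required field values actually present in the data at a given milestone. The sketch below is illustrative only; the field names, sample records, and any acceptance threshold would come from the contract, not from this example:

```python
# Hypothetical sketch: a simple completeness metric of the kind that could be
# tested at project milestones. Field names and records are illustrative.
def completeness(records: list[dict], required_fields: list[str]) -> float:
    """Share of required field values that are present and non-empty."""
    total = len(records) * len(required_fields)
    if total == 0:
        return 1.0  # nothing required, so nothing is missing
    filled = sum(
        1
        for rec in records
        for field in required_fields
        if rec.get(field) not in (None, "")
    )
    return filled / total

records = [
    {"part_no": "A-100", "unit_price": 12.5},
    {"part_no": "A-101", "unit_price": None},
]
score = completeness(records, ["part_no", "unit_price"])
print(f"completeness: {score:.0%}")  # 3 of 4 required values present -> 75%
```

At each milestone, a score like this would be compared against the threshold written into the project scope, giving both parties an objective pass/fail test rather than a subjective judgment of data quality.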
As supply chain management functions accelerate their move to data-driven decision-making (DDDM), it is critical that procurement have a written agreement with suppliers regarding your data requirements and alignment on their ability to meet them. As the primary interface for contracting with system developers and third parties, procurement must partner with IT to ensure this is done effectively. A well-thought-out “Data Requirements” section in your RFx’s can help achieve this objective and ensure that data quality won’t be a headache going forward!