SUPPLY CHAIN RESOURCE COOPERATIVE

One of the truths often overlooked in the splurge of press and hype around “analytics” is the need for data integrity.  You can’t have analytics without data that is reliable, timely, validated, and accurate.  In a 2014 Deloitte survey of 239 chief procurement officers and directors from 25 countries, 67% of respondents stated that poor data quality was a key barrier to implementing systems. In another recent study by Procurement Leaders, 95% of procurement professionals identified data quality as extremely important or crucial to achieving procurement objectives.  A 2017 Deloitte study similarly found that 49% of CPOs believe the quality of data is a major barrier, and that lack of data integration was the number two barrier (42%).

Why is data integrity such a barrier?  The primary reason: the lack of a standard business process for data governance. One of the biggest challenges faced by major companies seeking to build real-time integrated supply chains is keeping data synchronized across execution systems. This is especially important for transactional data, to ensure that the entire system is aligned around the same “source of truth.”  That alignment only occurs when there is a strong governance structure across the end-to-end supply chain.  There is no point in having lead-time data if it is inconsistent and leads people to the wrong conclusions.  Poor data integrity will undermine the viability of the entire system, and the top supply chain executives I meet with note that this is the biggest challenge they face today.

Data governance is challenging because it is a “people problem,” not a data or technology problem.  Bad data has a source and a root cause, and it is almost always human error in entering information into a system.  Once a data synchronization standard is put into place, it is easy for people to fall back into their old behaviors.  The fact is that data integrity problems are a fact of life: you can never get to 100% correct data.  Rather, the challenge for analytics is to find these problems and mend them so that systems stay synchronized and people can trust their execution, rather than having to question the integrity of the data underlying them.

It is important to have not only lead-time data, but the RIGHT lead-time data, along with correct execution-related data. Once collected, data also has to be filtered in analytical approaches to eliminate noise and prioritize what is looked at, rendering it truly exception-based. With current trends in data capture, the volume of data we are exposed to is growing exponentially. Most of that data says the supply chain is fine, so there is no need to look at it.  Instead, people need to focus on the things that are delayed, and extract from the data what needs to be done to fix the problem.  Filtering and prioritizing allow people to focus on the decisions that can be made faster and that have the greatest impact on performance.
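The exception-based filtering described above can be sketched in a few lines. This is a minimal illustration, not any particular vendor's implementation; the record fields, part numbers, and the three-day tolerance are all hypothetical.

```python
# Hypothetical exception-based filter: surface only the shipments whose actual
# lead time deviates from plan by more than a tolerance, so planners review a
# short exception list instead of every record. All data here is illustrative.
TOLERANCE_DAYS = 3

shipments = [
    {"part": "A100", "planned_days": 14, "actual_days": 14},
    {"part": "B200", "planned_days": 10, "actual_days": 21},  # delayed
    {"part": "C300", "planned_days": 7,  "actual_days": 8},
    {"part": "D400", "planned_days": 30, "actual_days": 45},  # delayed
]

def exceptions(records, tolerance):
    """Return only records whose delay exceeds the tolerance, worst first."""
    late = [r for r in records
            if r["actual_days"] - r["planned_days"] > tolerance]
    return sorted(late,
                  key=lambda r: r["actual_days"] - r["planned_days"],
                  reverse=True)

for r in exceptions(shipments, TOLERANCE_DAYS):
    print(r["part"], "delayed by", r["actual_days"] - r["planned_days"], "days")
```

Here the filter reduces four records to two, and sorting by severity puts the worst delay first, which is exactly the prioritization the paragraph above argues for.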

On the downside, incorrect data can have massive repercussions.  Think of the price of a single component that is off by only 10 cents.  That error escalates across the millions of parts in a company’s product lines and bills of material, leading to massive discrepancies and incorrect pricing and cost reporting across the customer base.  The same goes for lead-time data, which can impact working capital and free cash flow.  Bad data is an easy excuse when people complain about their systems; it’s about time managers started doing something about it.
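The 10-cent example above is easy to make concrete. The volumes below are assumptions chosen purely for illustration, not figures from any real company.

```python
# Illustrative arithmetic: a 10-cent price error on one component, multiplied
# through the bill of material and annual volume, becomes a material
# cost-reporting discrepancy. All quantities are hypothetical.
unit_price_error = 0.10    # dollars off on a single component price record
parts_per_unit = 4         # component appears 4x in the bill of material
annual_units = 2_000_000   # assumed annual production volume

total_discrepancy = unit_price_error * parts_per_unit * annual_units
print(f"${total_discrepancy:,.0f}")  # prints $800,000
```

A single dime, invisible at the line-item level, misstates annual cost by six figures once it flows through the bill of material.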

 

4 Responses

  1. Keith Peterson

    February 19, 2017 @ 7:39 pm

    This is a substantive issue – glad to see it. A particularly pernicious cause of a lack of data governance is the push by many modern BI vendors to forego the investment in an enterprise data warehouse in favor of simple visualization tools and in-memory analytics. Good data governance requires people, process and technology investments. There are good models from across industries like media, insurance, and credit that supply chain leaders can draw upon to help guide effective data governance.

  2. glenn tamir

    March 5, 2017 @ 12:24 pm

    Great article and especially applicable to challenges in the Healthcare Supply Chain. I work for a company called Supplymind (www.supplymind.com) that focuses directly on all the points highlighted in this article. Anyone interested can contact me directly: glenn.tamir@supplymind.com

  3. handfield

    March 5, 2017 @ 3:01 pm

    Yes, healthcare is especially prone to a lack of data standards. Have a look at the white paper, linked below, that benchmarked current data integrity against a maturity model. There are several major structural issues in the industry that prevent hospitals from owning and controlling their data. Most of it still occurs in an ad hoc fashion.
    https://scm.ncsu.edu/blog/2010/09/04/how-important-is-data-cleansing-in-spend-management/

  4. Donna DiPietro

    April 19, 2017 @ 1:27 pm

    Great article!
    You have probably heard of or even worked with GS1 Standards, but if you are not already familiar, check out GS1 Global (a world trade standards body in over 100 countries): http://www.gs1.org/data-quality

    Standards Work Group calls (with many countries joining) happen every day to agree on and create World Trade Standards that promote Data Quality and more.


