
Procurement Analytics Hot Topic at Procurement Leaders West Coast Forum and ISM Carolinas

I had the opportunity to attend the Procurement Leaders forum in San Francisco two weeks ago, followed by the ISM Carolinas-Virginia meeting in Winston-Salem last week. Procurement analytics was discussed at length at both sessions. Several key takeaways emerged from discussions with executives representing a wide array of companies, including AARP, Roche-Genentech, Silex, RJR, Flex, Google, Northrop Grumman, GE, and others. Here are some of the comments I heard in speaking with different individuals.

  • The large procurement providers, such as Ariba, Zycus, Brave, Coupa, and others, are making good strides into the market. Ariba has the advantage of linking directly into SAP systems, while Coupa has not been able to achieve this integration, which has been problematic.
  • Most companies discussed the fact that they do not have a lot of “good” data. Most are still focused on spend analytics. Procurement analytics is a fairly new area, and I did not see many major inroads using cognitive analytics or combinations of different data sets to produce insights.
  • Maturity in analytics is a progression from Descriptive (Historic) to Predictive to Prescriptive to Cognitive. Most companies are just getting past the descriptive stage, which is about being able to describe and get in touch with reality. Predictive is about understanding the future. A key observation was that “analytics are useless if the customer can’t make sense of and use the data.”
  • Analytics is still seen as an emerging discipline. A few companies are in advanced stages of being able to drive spend analytics, risk analytics, and others, but this has come at great expense. Even so, many are not doing sophisticated analytics, but are still primarily focused on spend analytics. For example, one company is still working through basic questions about shopping carts and purchase orders – was the PO coded correctly?
  • Spend analytics is indeed the primary focus of most companies. Executives note that spend is the foundation for many other forms of procurement analytics, but is not an end unto itself. The primary outcome of spend analysis is the ability to identify multiple instances of the same product under different SKUs. For instance, one company gave the example of a valve carried under four different SKU numbers, multiple prices, and multiple names across their facilities, and pointed out that spend analysis allowed them to build a business case around driving down the complexity of their spend.
  • Not all decisions require real-time data to be useful. There seems to be a real need to digest and create visual aids, and to find ways to speed up analysis of information. However, data that is used to highlight exceptions in a real-time environment must be highly consistent.
  • Procurement is increasingly taking on the role of “source of truth” for data – and enabling other functions with data. This is occurring because procurement has the most to gain, but it can also serve other stakeholders in this role and win their buy-in, particularly when there is an enterprise focus on analytics. Procurement should step up and lead these roles when possible. It should also be at the table when system design is occurring with the IT function, rather than waiting for IT to tell it what it needs. Procurement needs to be an active participant in designing systems.
  • A common problem faced by companies is how to structure the front-end data entry process to improve data quality. Many users are prone to entering information incorrectly into the PO and the system, which degrades data quality significantly. Examples include the use of “Other”, “Miscellaneous”, or random product codes in category fields when entering requisitions. To address this problem, companies shared best practices that included limiting system access to only those users who have been trained and have the discipline to use it in a compliant manner. A second approach is to limit the number of PO categories to ensure they are used correctly. Finally, a best practice was to use drop-down menus to ensure that the purchase is mapped into the proper General Ledger account. Even Google noted: “We automated wherever we could. It was very complicated – we had to write about 10,000 lines of code to get it to where people could enter the right element.”
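The valve story above – one product hiding behind several SKUs, prices, and names – is exactly the kind of duplication a spend analysis surfaces. A minimal sketch of the idea, using entirely invented SKUs, descriptions, and prices (none of this comes from any real company's data), is to normalize item descriptions and group spend records by the normalized key:

```python
from collections import defaultdict

# Hypothetical spend records: (SKU, description, unit price, facility).
records = [
    ("VLV-001", "1/2 in Brass Ball Valve",   12.50, "Plant A"),
    ("BV-500",  "Ball Valve, Brass, 1/2 in", 14.75, "Plant B"),
    ("98-7712", "VALVE BALL BRASS 1/2 IN",   11.90, "Plant C"),
    ("ZZ-VALV", "brass ball valve 1/2 in",   16.00, "Plant D"),
    ("GSK-100", "Rubber Gasket 3 in",         2.10, "Plant A"),
]

def normalize(description):
    """Crude matching key: lowercase, strip punctuation, sort the tokens
    so that word order and formatting differences collapse together."""
    tokens = "".join(c if c.isalnum() else " " for c in description.lower()).split()
    return " ".join(sorted(tokens))

# Group records that appear to describe the same physical item.
groups = defaultdict(list)
for sku, desc, price, facility in records:
    groups[normalize(desc)].append((sku, price, facility))

# Any group with more than one record is a duplication candidate.
duplicates = {key: items for key, items in groups.items() if len(items) > 1}
for key, items in duplicates.items():
    skus = [sku for sku, _, _ in items]
    prices = [price for _, price, _ in items]
    print(f"{len(items)} SKUs for one item: {skus}, "
          f"price spread ${min(prices):.2f}-${max(prices):.2f}")
```

In practice, real spend tools use far more robust matching (fuzzy string similarity, taxonomy mapping), but even this token-sort trick shows how a price spread across facilities becomes the business case for reducing complexity.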
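The front-end data-quality practices in the last bullet – rejecting catch-all buckets like “Other” and constraining entry to an approved category list mapped to General Ledger accounts – can be sketched in a few lines. The category names and GL account numbers below are hypothetical, purely to illustrate the pattern:

```python
# Hypothetical approved categories mapped to GL accounts
# (the "drop-down" list a requisition form would present).
CATEGORY_TO_GL = {
    "IT Hardware":           "6100",
    "Office Supplies":       "6200",
    "MRO":                   "6300",
    "Professional Services": "6400",
}

# Catch-all buckets that degrade data quality and are rejected outright.
BANNED_CATEGORIES = {"other", "miscellaneous", "misc"}

def validate_requisition(category: str) -> str:
    """Return the GL account for an approved category, or raise ValueError.

    Mirrors the best practices above: catch-all entries are refused,
    and only categories from the approved list are accepted.
    """
    if category.strip().lower() in BANNED_CATEGORIES:
        raise ValueError(f"Catch-all category {category!r} is not allowed")
    try:
        return CATEGORY_TO_GL[category]
    except KeyError:
        raise ValueError(f"Unknown category {category!r}; choose from the approved list")
```

The point of the design is that bad data is stopped at entry time rather than cleaned up later – the same reasoning behind limiting the number of categories and restricting system access to trained users.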


I found the discussions to be very compelling, as we continue to chart the path ahead in the CAPS study our team is engaged in!