Contract management analytics are critical for tracking transactional compliance with contract pricing and terms and conditions, but they also serve as a vehicle for workflow management around the negotiation of contract renewals and for aggregating supplier performance data on service level agreements, delivery, quality, and service. Transactional data must be rolled up to ensure suppliers are providing the requisite price discounts, clauses, rebates, and chargebacks negotiated in the original contract. Contract analytics can provide a summary view of performance and pricing deviations, risks, contract statistics (expired, renewed, and pending), and procurement and sales metrics.
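The transactional rollup described above can be sketched in a few lines of code. The sketch below is illustrative only: the supplier codes, items, prices, and the 2 percent tolerance are hypothetical assumptions, and a production system would pull these records from the contract database and transaction systems.

```python
# Hypothetical sketch: roll up transactions and flag lines that deviate
# from the contracted unit price beyond a small tolerance.

contract_prices = {("SUP-01", "WIDGET"): 4.50, ("SUP-02", "BRACKET"): 2.10}

transactions = [
    {"supplier": "SUP-01", "item": "WIDGET", "qty": 100, "unit_price": 4.50},
    {"supplier": "SUP-01", "item": "WIDGET", "qty": 40, "unit_price": 4.95},
    {"supplier": "SUP-02", "item": "BRACKET", "qty": 500, "unit_price": 2.10},
]

TOLERANCE = 0.02  # 2% allowed deviation (assumed policy threshold)

def price_deviations(transactions, contract_prices, tolerance=TOLERANCE):
    """Return transactions whose unit price exceeds the contracted price."""
    flagged = []
    for t in transactions:
        contracted = contract_prices.get((t["supplier"], t["item"]))
        if contracted is None:
            continue  # no contract on file for this supplier/item pair
        if t["unit_price"] > contracted * (1 + tolerance):
            overpaid = (t["unit_price"] - contracted) * t["qty"]
            flagged.append({**t, "contract_price": contracted,
                            "overpaid": overpaid})
    return flagged

for f in price_deviations(transactions, contract_prices):
    print(f["supplier"], f["item"], f"overpaid ${f['overpaid']:.2f}")
```

A summary view of such flagged lines, aggregated by supplier or category, is exactly the kind of pricing-deviation report the analytics described here would surface.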
Current best practice in contract management systems centers on searchable contract databases, which allow rapid access to contract information as well as comparison of contractual terms and exposure across common suppliers, categories of spending, and internal price benchmarks. When spend analysis is combined with insights about outstanding contracts, another layer of insight emerges. Contract management systems can summarize when contracts are expiring, which can spur buyer-level activities, assignment of resources, and timing of preparation for contract renegotiation sessions. They can also flag which outstanding terms and conditions are out of line with corporate policies. In a presentation at the EMPOWER conference held in Orlando, a large utility shared how procurement had used the decision-making algorithm embedded in the Emptoris suite of tools to guide users to the appropriate contractual mechanism. By asking a series of questions about the type of requirement the buyer was sourcing, an artificial intelligence tool led the user through a decision tree to the right contract and the right tool. The tool also helped support a well-documented statement of work and ensured that the right regulatory requirements and questions were being addressed in the contract.
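The question-driven decision tree described in the utility's presentation can be sketched as a simple data structure. The questions, answers, and recommended contract types below are invented for illustration and do not reflect the Emptoris tool's actual logic.

```python
# Hypothetical sketch of a question-driven decision tree: each internal
# node asks a question and routes to another node; a leaf (tuple) holds
# the recommended contractual mechanism.

DECISION_TREE = {
    "start": {
        "question": "Is this a purchase of services or goods?",
        "answers": {"services": "services_sow", "goods": "goods_value"},
    },
    "services_sow": {
        "question": "Is a detailed statement of work required?",
        "answers": {"yes": ("Master services agreement with SOW",),
                    "no": ("Standard services agreement",)},
    },
    "goods_value": {
        "question": "Is annual spend above the strategic threshold?",
        "answers": {"yes": ("Negotiated supply contract",),
                    "no": ("Purchase order under standard terms",)},
    },
}

def recommend(answers):
    """Walk the tree using {node: answer} responses; return the leaf."""
    node = "start"
    while True:
        choice = DECISION_TREE[node]["answers"][answers[node]]
        if isinstance(choice, tuple):  # leaf: a recommendation
            return choice[0]
        node = choice                  # internal node: keep asking

print(recommend({"start": "goods", "goods_value": "yes"}))
```

In practice such a tree would be maintained by the contracting function, so that every buyer is routed to a policy-compliant contract and tool without needing legal expertise up front.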
The real value of contract management systems (CMSs), however, will be the actual data contained in the contracts. Contract data is the real ROI behind the adoption of these systems, as it becomes the foundation for artificial intelligence applications that will lead to even more interesting and useful uses in supply chain management. Data obtained from contracts should be sent to a data warehouse for cleansing and processing, and eventually used in business intelligence and possibly statistical (regression) analysis that can explore major trends in supply chain behaviors and help drive strategic planning in the face of new and emerging trends. Data from CMSs can be used for upstream/downstream analysis, comparing how customer needs translate into supply market capabilities, prices, and capacity requirements. This ability to link the supply and demand characteristics of the marketplace is the true “nirvana” of supply chain analytics. Language used in contracts, such as labels, specific terminologies, suppliers, and products, needs to be classified to make the data useful, as analytics can provide insights into these components. Use of AI in contract management will help maximize the accuracy of automated results.
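A minimal sketch of classifying contract language into analytic categories might look like the following. The categories and keyword lists are assumptions for illustration; real systems would use trained language models rather than keyword counts, but the principle of mapping free-form clauses to structured labels is the same.

```python
# Illustrative clause classifier: map free-form contract language to a
# category label so extracted data can feed downstream analytics.
# Categories and keywords are made-up examples, not a real taxonomy.

CATEGORY_KEYWORDS = {
    "pricing": ["price", "discount", "rebate", "chargeback"],
    "delivery": ["delivery", "lead time", "shipment"],
    "quality": ["defect", "inspection", "warranty"],
    "termination": ["terminate", "expiry", "renewal"],
}

def classify_clause(text):
    """Return the category whose keywords appear most often in the clause."""
    text = text.lower()
    scores = {cat: sum(text.count(kw) for kw in kws)
              for cat, kws in CATEGORY_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unclassified"

print(classify_clause("Supplier shall grant a volume discount and rebate."))
```

Once every clause carries a label like this, the warehouse can aggregate across thousands of contracts, which is what makes the regression-style trend analysis described above feasible.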
The application of contract analytics helps drive efficiency and effectiveness within the contracting process, and also aids understanding of the source-to-pay workflow cycle. Contract management systems help companies focus on proactive management of spend and revenue rather than management based solely on historical data. Contract analytics can give an organization’s contracting function a competitive edge over its counterparts and can typically lead to an ROI of 15 to 20 percent. Data obtained from the contracting and transaction systems needs to be extracted, enriched, and dissected to measure performance.
Three years ago, a large CPG company decided to apply analytics to the upstream procurement supply chain; but instead of the 10 biggest customers that make up 80 percent of sales, analytics was applied to the 10 biggest suppliers that make up 80 percent of the spend. Rather than selling, the focus was on prioritizing, creating visibility of spending against a baseline by region, and developing should-cost models to prioritize the scale opportunities that create leverage. Contracted volumes and spend data were used as the base data set. However, the executive we interviewed noted that “spend data only gets you so far, and we realized that to move to our vision we needed to enable connections of spend data with other datasets. Using our partners in that space as classification engines, we were able to pull data out of our different operations and logistics systems, and began integrating them with our spend data in the cloud. This included budgeting tools, commodity management data, SAP inventory data, contract information, and other data from multiple parts of the organization. As we began to consolidate and link the data, we found that we were better able to drive to greater insights.” The limitation is that the data is not real-time, and the vision is to use Hadoop to better connect sensor data, real-time data, and sourcing data in the future. The sourcing analytics team has made such progress that other functions, such as R&D, quality, and production, have seen the dataset and recognize that they too would like to be connected into this workstream to have better information for supply chain analytics. These functions want to “pour in their own data over time based on a use-case prioritization, and we will be there to continually build the dataset to meet the differing needs of the organization.”
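The Pareto-style prioritization this team started from, finding the small set of suppliers that accounts for 80 percent of spend, can be sketched as follows. The supplier names and spend figures are hypothetical.

```python
# Hypothetical sketch: rank suppliers by spend and return the smallest
# set, largest first, whose cumulative spend reaches the coverage target.

spend_by_supplier = {
    "SUP-A": 4_000_000, "SUP-B": 2_500_000, "SUP-C": 1_200_000,
    "SUP-D": 800_000, "SUP-E": 300_000, "SUP-F": 200_000,
}

def top_suppliers(spend, coverage=0.80):
    """Suppliers, in descending spend order, covering `coverage` of total."""
    total = sum(spend.values())
    ranked = sorted(spend.items(), key=lambda kv: kv[1], reverse=True)
    selected, running = [], 0.0
    for supplier, amount in ranked:
        selected.append(supplier)
        running += amount
        if running / total >= coverage:
            break
    return selected

print(top_suppliers(spend_by_supplier))
```

With this short list in hand, the visibility baselining and should-cost modeling described above can be concentrated where the leverage actually is.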
Today, the analytics team has begun embedding different functional subject matter experts into the center, and projects are prioritized based on business need. Some of the areas being explored include end-to-end supply chain optimization, risk assessment and categorization, and source-to-deliver interactions that can improve economic outcomes. The team has begun to hire data scientists from universities who arrive with a strong skill set, and it also funds subject matter experts to work alongside the scientists, sharing their tribal knowledge and adding context to the project data, insights, and interpretation. This is a key component of the sourcing initiative.
“Our biggest successes are when we can take a ‘belief’ and put a number to it. Someone in the organization comes to us and believes there is a benefit to a certain strategic decision, but has never been able to quantify it. We approach these types of problems through cost modeling and prioritizing against the potential benefit, and seek to answer questions that people can’t answer. For example, what is the best way to source sugar? Should you do it in bulk, in pallets, or through other channels, based on transportation, spoilage, and consumption factors? This often results not just in cost models and price-tier outcomes, but in a modified business process that has to be implemented to achieve that outcome. Our analysis provides insights into how we have to operate to achieve the outcome, as well as scorecards and metrics that require additional analytical insight. Unfortunately, this is an ad hoc process. It would be great to highlight all opportunities against value and be able to prioritize them, yielding a decision on the key glaring opportunity and an intelligent conversation around where to invest analytical resources. We are not there yet; we are more likely to work on activities that pop up to the surface when someone has an idea they’d like to explore.”
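The kind of channel comparison the executive describes for sugar can be framed as a small total-landed-cost model. All prices, transport rates, and spoilage fractions below are made-up assumptions purely to show the shape of the calculation, not real figures from the interview.

```python
# Illustrative landed-cost comparison across sourcing channels, folding
# transportation and spoilage into a cost per usable ton. All numbers
# are hypothetical assumptions.

channels = {
    # price per ton, transport per ton, spoilage (fraction of volume lost)
    "bulk":    {"price": 380.0, "transport": 25.0, "spoilage": 0.03},
    "pallets": {"price": 410.0, "transport": 40.0, "spoilage": 0.01},
}

def landed_cost_per_usable_ton(ch):
    """Cost per ton of product actually usable after spoilage losses."""
    gross = ch["price"] + ch["transport"]
    return gross / (1 - ch["spoilage"])

for name, ch in channels.items():
    print(f"{name}: ${landed_cost_per_usable_ton(ch):.2f} per usable ton")
```

The point of such a model is exactly what the quote suggests: it turns a belief (“bulk is cheaper”) into a number, and often reveals that capturing the cheaper channel requires changing the receiving and storage process, not just the purchase order.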