The news of Walgreens’ potential acquisition of AmerisourceBergen is, in my opinion, one of a string of recent announcements that signal efforts by channel players in the massive life sciences market to set up barriers to competition against behemoths like Amazon.  This is similar to the recent acquisition of Aetna by CVS, which merged another big pharmacy with a health care insurance payor.

What are these companies seeing that everyone else isn’t?  Many acquisitions occur when a lucrative target can bolster profits, but no one is claiming that Amerisource is especially profitable over and above what other distributors like Cardinal Health and McKesson are making.  The days of hefty margins based on the “Next Best Alternative” are gone; pharmaceutical distribution is a cut-throat business with very tight margins.  In the old model, forward-buying distribution services helped pharmaceutical companies manage the complexities of unpredictable demand and lumpy capacity, and served as a key transition element for distributing pharmaceutical products to a wide array of fragmented hospitals and pharmacies.  This model was effective for many years and provided a valuable service to pharmaceutical companies, which did not possess a core competency in distribution strategy and technology.  However, in 2005-2007 wholesalers’ position in the channel was challenged by increasing pressure for pricing controls, lower inflation, ASP regulation, and other external variables.  As a result, pharmaceutical distribution companies had to adopt a different model: the Fee-for-Service model, which has by and large been adopted across the industry and now represents the standard approach.

Rather, I think there are three major forces at work here driving this activity.  This view is based on a book I wrote about five years ago, which developed a set of predictions (based on hundreds of interviews with executives) about what will unfold in the biologics and pharmaceutical supply chain.

The three primary forces are:

  1. Compensation Forces: Increased use of inventory management agreements, fee-for-service models, pay-for-performance, and pricing pressure applied by government agencies and third parties.  These forces will shape the way products are sold, delivered, and administered in the future.  In the case of Walgreens and CVS, both pharmacies face increasing pressure to reduce costs, and payors such as insurance companies are a big part of those forces (in addition to government agencies).  By controlling the payment and reimbursement function, CVS/Caremark can streamline efficiencies, but also start to set policies for very expensive drugs, such as orphan drugs.
  2. Channel Forces: The increasing threat of direct sales models by pharmaceutical manufacturers and the threatened entrance of third-party logistics providers (3PLs) and other competing entities into the channel.  This may be the underlying reason for the Walgreens/Amerisource relationship.  For years, big pharma was in a state of delusion, thinking they could distribute directly to pharmacies and bypass the Big Three (Cardinal, Amerisource, and McKesson).  Wrong!  Under the fee-for-service model, compensation is earned by wholesalers through performance of wholesaling services for manufacturers, rather than by relying on speculation about future price increases.  Examples of services wholesalers provide to manufacturers include:
  • Sophisticated ordering technology
  • Daily consolidated deliveries to health care providers
  • Emergency shipments to providers 24/7/365
  • Consolidated accounts receivable management
  • Contract and Chargeback administration
  • Returns processing
  • Customer service support
  • Inventory management
  • Licensed, environmentally controlled, PDMA compliant[1], secure facilities

To receive these services, manufacturers (who now play the role of “supplier” to the wholesalers, in that they move product downstream through the supply chain) are required to pay a quarterly distribution service fee equal to a fee percentage times quarterly product sales valued at WAC (wholesale acquisition cost).  By moving toward a merger with Amerisource, Walgreens may be able to obtain these same services at a lower fee percentage than it would pay otherwise, and even a small percentage reduction here would amount to huge savings.
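To make the fee mechanics concrete, here is a minimal sketch in Python. The fee percentage and sales figures are entirely hypothetical, invented for illustration; they are not Walgreens’ or Amerisource’s actual numbers.

```python
# Hypothetical illustration of the quarterly distribution service fee:
# fee = fee percentage x quarterly sales valued at WAC.
# All figures below are invented for the example.

def quarterly_fee(wac_sales: float, fee_pct: float) -> float:
    """Quarterly distribution service fee on sales valued at WAC."""
    return wac_sales * fee_pct

wac_sales = 2_000_000_000.0                   # $2B quarterly WAC sales (hypothetical)
current = quarterly_fee(wac_sales, 0.035)     # 3.5% fee percentage (hypothetical)
reduced = quarterly_fee(wac_sales, 0.030)     # 3.0% after a merger-driven reduction
print(f"Quarterly savings: ${current - reduced:,.0f}")
```

Even a half-point reduction on $2B of quarterly WAC sales is $10 million per quarter, which illustrates why a small fee concession can justify a much larger integration effort.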

  3. Product Forces: Increasing levels of product diversity requiring specialized handling and delivery, coupled with increasing concerns about the pedigree (the legitimacy) of goods being distributed in the pharmaceutical supply chain.  The rise of personalized medicine will also change how drugs are shipped to individuals.  The key to all of this will be having the right DATA; the ability to drive analytics and handle these challenges may be the biggest opportunity that all of these providers are seeking.

Examples of the types of specialized services required in this new environment are shown in the following table.  Improving performance in these markets will require greater use of integrated analytics, which is difficult when there is an array of disconnected information exchanges going on.  Integration creates the opportunity to make this a reality.  At least that’s the vision… let’s see if Walgreens can convert the vision into reality.

Regulatory compliance with the PDMA is discussed at:


Product development within the U.S. defense industry tends to follow a waterfall-phased lifecycle development approach beginning at the concept level and concluding at the subsystem/component level. The concept level design phase is typically summarized in a system concept document. This phase is followed by system level development, resulting in system performance requirements and description documents. The subsystem and component level phase yields product performance requirements and design documents. The resulting product baseline is then implemented, validated, and verified through a series of integration and test activities, beginning at the component and subsystem level and ending at system-level verification. At that point, production and system operations commence (DSMC, 2001). This approach has been used for many years, and tends to limit the opportunity for innovation, cost savings suggestions, and introduction of new technologies into the process, as noted by Anne Rung in her memo during her time at the Office of Management and Budget.

U.S. defense firms tend to follow a traditional “white box” sourcing approach, as shown in this figure. This process typically begins during the component- or detailed-level design phase, after product specifications and engineering drawings have been released. At that point, the system and component designs are typically well established. Research suggests that up to 80 percent of the total cost of a product is ‘locked in’ during these early engineering concept and design phases, before the procurement organization engages suppliers.  An important outcome of this approach is that the ability of suppliers to significantly influence the product design (and improve manufacturability and cost) is usually informal and generally quite limited. This was observed on the factory floor of a supplier producing an aerospace part designed twenty years earlier. The product was extremely complicated to produce using a manual process, yet the worker noted that the specification could not be changed because it was mandated by the FAA, even though a simpler automated production technology existed.

Much of the sourcing behavior involved in standard defense industry practices is tied to establishing “price reasonableness” in accordance with FAR 15.403 (FAR, 2005). To demonstrate price reasonableness, adequate price competition must typically be provided. Although not required by the FAR, many industry participants as a matter of course require three bids from separate suppliers to establish adequate price competition. This has become a habitual practice, despite the fact that bids are not formally required if one follows the FAR to the letter. But because government contractors are often risk averse and do not want to be called out for failing to follow regulations, the behavior persists! As shown in Figure 1, bids are typically requested after product specifications and design documents have been completed, and are included as part of the solicitation package. For many suppliers, the first time they see a project’s specifications is during the bidding process.

The practice of sending every requirement out for bid and awarding it based on the lowest price (often referred to as “three bids and a cloud of dust”) has been criticized as an indiscriminate approach that ignores industry best practices around strategic sourcing and fails to create collaborative, productive relationships between the government and defense contractors (Best Practices, 2005 and 2010). On a broader scale, the Government Accountability Office (GAO) has identified strategic sourcing across multiple tiers of the supply base as a key defense acquisition deficiency (GAO, 2006). Enhancing supplier relationships is also a central theme of the most recent effort to improve federal procurement practices (Rung, 2014). Because of increasing pressure to reduce costs and shrinking defense budgets, both government legislators and industry experts are calling for the defense industry to reexamine its sourcing practices.  One of the most glaring of these practices is the failure to involve suppliers early in the product development process.

I am collaborating on a study of how the FAR may or may not affect the adoption of early supplier involvement (ESI) practices, working with Jeff Barrows and Lane Cohee, who are leading this fascinating study.  More to follow!


I listened to all one and a half hours of Trump’s State of the Union address last night, and while it was certainly very positive in terms of the economy, it also left me with some unanswered questions about the relationship between the different policy and funding initiatives he proposed for the next few years.  Here are a few of the random thoughts that went through my head (albeit from a supply chain guy’s point of view).

Trump emphasized the growth of manufacturing in the economy.  But it also made me wonder: what kind of jobs are these going to create?  One of the truths of manufacturing is that many operations are increasingly automated, and robots are increasingly performing many of these jobs, particularly those related to repetitive manufacturing.  For instance, he mentioned how Chrysler, Toyota, and others will be building cars in the US, but anyone who has been in an automotive plant knows that most of the jobs involve skilled technicians, of which there is a shortage.  So will there really be 200,000 manufacturing jobs?  What kind of jobs are these exactly?

Which leads to my next thought.  Trump was trumpeting the fact that we will be seeing new jobs due to the influx of companies “back to America” because of his corporate tax cut, and that we will be “seeing rising wages”.  But in the same sentence, he mentioned that “unemployment claims are at a record low”, and that “African American unemployment is at the lowest levels in history”… but this is likely due to the fact that the unemployment rate is ALREADY at 4.1%.  And then, he emphasizes how he is going to limit immigration.

So let’s see, what does this add up to?  Increasing wages, more jobs, and low unemployment – sounds like a labor shortage to me!  And this is already in an environment where there is a massive shortage of logistics workers and truck drivers!

One piece of good news is that he plans to invest $1.3 trillion in “infrastructure”.  We have had a massive problem with poor infrastructure in our airports, roads, bridges, and other transportation channels for many, many years.  But this isn’t something you fix in 3 years; it will take a decade or more of sustained investment to truly address these issues.  And then you couple this with the increase in capital investment likely to occur with the corporate tax cut, and again you have more jobs.  A good thing, right?  Not if you don’t have people to fill those jobs, and not if you don’t have any immigrant workers who can fill in on construction crews, craft labor, and the other trades for which there is already a shortage!

He didn’t mention that America had withdrawn from the Trans-Pacific Partnership, or that the NAFTA negotiations with our trade partners Mexico and Canada were going badly.  Apparently, we don’t need anybody else in the global economy.  If these trading partners would just cooperate to drive “fair trade deals”, then we could all get along!   But that was exactly what the TPP was: a fair trade deal to level the playing field.  Trump didn’t provide a whole lot of details on what he meant by a “good deal”… you can read about it in his book, I guess…

The narrative on the pharmaceutical industry was also odd.  He targeted “reducing the price of prescription drugs”.  But most of what people buy has already gone generic and is already at the lowest price.  And if you want to address the fact that Americans pay more for their drugs, that points toward a “single payor” system, which is how countries in Europe are able to negotiate lower drug prices.  But how does that occur if you are “dismantling Obamacare”?  And if you do lower the profit margin for pharmaceutical companies, then the amount of money they have to reinvest in R&D to fight Alzheimer’s, Parkinson’s, cancer, and other diseases also shrinks.  And how exactly does the FDA approving more drugs help that?  It means that pharmaceutical companies need funds to increase their supply chain capacity, which also means more jobs, for which there is already a worker shortage.

So if we step back and think about this for a moment: increased wages, increased disposable income due to tax cuts, increased numbers of jobs in construction, pharma, and capital projects, plus reduced immigration, all add up to higher inflation.  That is the one thing we can definitely forecast will happen if all of this comes to fruition.  What we really face here is a shortage of talented workers to fill all of these jobs!  But I didn’t hear Trump speak about training or education in the speech.  That’s the one part I kept listening for, but didn’t hear mentioned even once.  Lots about “wanting people to be safe”, lots on defense spending to “make America great”, lots on “depraved characters” in North Korea, and odd comments about “steel in our spine”… but not so much on making people smarter so they can adapt to this new economic vision.



Most big banks (MUFG, Citi, Deutsche Bank, HSBC, JP Morgan, and others) have been forced to re-examine their external supplier relationships in a highly regulated environment.  This is old news.  But emerging technology is now beginning to create new ways to think about sourcing risk, which in the past has been a very manual, encumbered process.  For global banks, the regulations differ in each of the countries they operate in, forcing executives to re-think how to standardize processes across the organization and whether they are in compliance with the regulatory environment in each country.  This has caused banks to look at the flow of activities in the “source chain”, which refers to the way that third parties use bank data for different operations.  Today, this end-to-end sourcing process has literally hundreds of steps.  But the real benefit of technology is the promise of a more efficient process, as well as the ability to better FORECAST, PLAN, and BUDGET for future supply chain activities.

Let’s examine the source chain in more detail.  At the entry point, a business has a need to be fulfilled.  This is generally expressed through a business strategy, which leads to a budgeting activity and a budgeting application managed by the financial planning group.  The next step introduces a third-party risk management assessment, which looks at the engagement implied by the project, considers whether the work will be done internally or through a third party, and evaluates the engagement through a risk lens.  The tools here are very immature, and must be compliant with what are very general guidelines provided by OCC and FRB policies, which are themselves written in a very general manner.  Once the project has been assigned a level of risk, it goes through a source-to-pay process, which involves systems such as Coupa, Ariba, or others.  The sourcing request can generate a proposal, an e-auction, a supplier selection process, contract negotiation, and a master agreement.  Eventually this leads to a purchasing transaction and to fulfillment: a catalog, an order release, then an invoice and accounts payable.  In the last stage, the elements of these transactions are entered into the ERP system where the General Ledger resides, and the transaction is fit into one of the GL accounts determined by the company comptroller.

One of the big challenges with this process, which involves literally hundreds of individual steps, is the lack of visibility as a project proceeds through these stages.  There is almost no visibility into where the “work in process” is, and today most of these stages are connected through manual transactions (emails, phone calls, etc.).  The process is not only complicated, but can take up to 22 months from end to end.

A further complication arises from the inability to correctly classify the project, supplier, and procurement transaction.  Comptrollers determine and set the GL codes, which do not always follow common sense; at one bank, for instance, all travel expenses fall under the GL code “Business Development”.  Because of these challenges, one executive worked with his team to establish a new set of sourcing category codes that could be mapped back to the GL, which also allowed greater category-based analysis aligned with a category structure, NOT the GL structure.  Within each sourcing category they developed subcategories, and the risk associated with an engagement is now assessed at the subcategory level (NOT separately for each and every statement of work that comes across).  The subcategory risk is assessed for that class of supply, which then allows the team to define six primary buying channels that can be used for each subcategory.  This provides some level of continuity between the different stages across the four elements of the process.  The subcategory code effectively becomes a bridge between the systems, tying together the project budget, the risk assessment, the sourcing process, and finally the mapping to the GL codes.  And for the first time, this will also help build the ability to forecast and measure the flow of work across these different elements.
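As a sketch of the bridging idea, a subcategory record can carry the GL mapping, the risk tier, and the approved buying channel in one place. All codes, names, risk tiers, and channels below are hypothetical, not the bank’s actual taxonomy.

```python
# Sketch: a subcategory code as the bridge between budget, risk, sourcing, and GL.
# All codes, risk tiers, and buying channels are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class Subcategory:
    code: str            # sourcing subcategory code
    gl_code: str         # mapped General Ledger account
    risk_tier: str       # assessed once per subcategory, not per statement of work
    buying_channel: str  # one of the pre-approved buying channels

CATALOG = {
    "IT-SW-SEC": Subcategory("IT-SW-SEC", "GL-6210", "high", "strategic-sourcing"),
    "FAC-CLEAN": Subcategory("FAC-CLEAN", "GL-7410", "low", "catalog"),
    "TRV-AIR": Subcategory("TRV-AIR", "GL-7105", "medium", "p-card"),
}

def route_request(subcat_code: str) -> str:
    """A new request inherits its risk tier and buying channel from the subcategory."""
    return CATALOG[subcat_code].buying_channel

print(route_request("FAC-CLEAN"))  # catalog
```

Because every system references the same subcategory code, the budget line, the risk assessment, the sourcing event, and the GL posting can all be joined for portfolio-level reporting.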

Another important outcome of tying back to a subcategory code is the ability to prioritize work coming across from different businesses at the portfolio level.  Today, there is no way to prioritize work for the retail bank, for information security, for the transaction bank, and so on.  In the past, prioritizing one group’s work inevitably upset someone else whose work was being ignored, until eventually everyone was upset!  And businesses don’t understand why procurement is taking so long, because they lack a full understanding of the hundreds of steps that have to occur across a system that was designed by the bank, NOT by procurement.  But with visibility into the pipeline of work, and the ability to map workflow, it becomes possible to provide reports that bring transparency to the process.

Understanding the Regulatory Risk Standards

Further discussions revealed other insights into the risk standards set up by the OCC (Office of the Comptroller of the Currency).  The risk standards set up following the financial crisis are officially known as “guidelines”, but are in fact very prescriptive.  When there is a financial audit at one bank, the regulatory bodies that carry it out include auditors from multiple other banks.  This tells you that standardization of regulations is possible, not that it is a moving target.  However, the regulatory standards are still not explicit, and will always be guidelines subject to interpretation.  But if programs at different banks are all compliant, one can assume there is some level of commonality among the approaches being used.  And there is a high probability that each bank’s interpretation of the policies, and of what regulators will be asking for, will cover the same things: data security, business continuity, data integrity, etc.  There will certainly be many elements around the contracts.  For this reason, the first stage in the evaluation of a program in a portfolio is always the interpretation of the overall risk associated with the engagement.  Once the risk is understood, a taxonomy can be overlaid to provide a standard nomenclature for that level of risk and that market perspective.

It quickly became clear in this discussion of the sourcing process in financial services that there is no single tool that can manage this end-to-end process.  In the first stage, finance uses a budgeting tool; next there is a risk assessment tool; third, a set of source-to-pay tools (e.g. Ariba); and finally the GL codes are embedded in PeopleSoft, SAP, or multiple other ERP systems.  What is needed today is a subcategory coding structure that “sits on top” of the process, pulling data elements from each stage and linking them through a common subcategory code.  A tool set that sits on top of the process in this way provides a portfolio view of risk, as well as a way to prioritize the many different engagements that are underway.  It also allows the team to see which business units have a higher priority, which isn’t possible without visibility.

Landscape for Emerging Tools

Emerging tools being developed to help speed up the process include process automation and blockchain, which can automate the contract life cycle.  It is also important to move risk assessment from the “end” of the life cycle (during evaluation of the SOW) to the front end of the design cycle, where a new process or product can be evaluated for risks such as data privacy, cross-jurisdictional transactions, etc.  Spend is taken OUT of the risk equation, and it is notable that in many cases risk levels have nothing to do with the level of spend.  A $500M cleaning contract may be inherently less risky than a programmer writing security code for $100K in Russia!

It is also important to link internal and external contracts to evaluate risk.  A single contract with an external supplier (say, an HP global contract) may span more than 300 internal contracts across different business units, and each of these may be treated as a different relationship by regulators in each of the countries it spans.  The internal transfer costs between different legal entities may also be reviewed by regulators and viewed as separate supply transactions.  Thus it becomes important to be able to look at the end-to-end process, and to trace each engagement back to the original contract to determine how these relationships span the organization and link back to a single agreement.

To help create a balanced approach to risk assessment, a number of banks have created an entity called “TruSight” to provide risk assessments.  The entity’s assessment draws on some 1,200 questions, with almost 90% overlap among the founding banks’ individual questionnaires, and will hopefully become a clearinghouse for risk.  The idea is that suppliers struggle to go through each bank’s individual risk assessment process, and a single process could prove better for everyone involved.  The challenge is that risk assessments are done once a year, and are time-consuming and costly.  What is really needed is a behavioral analytics program that monitors internal and external activity and triggers a risk assessment based on noted irregularities.  The trigger could come at any time, and target a review based on certain guidelines.  This would be similar to the Amazon or eBay “Vendor Score”, which is updated in real time.  Banks would support such a tool if it covered all of the risk assessment requirements and provided insights into changes in behavior.  And just like the credit card monitoring process, such a system could leverage what one company is doing and have 4-12 others leverage the same data.  Coupa is set up for something like this as well, for items such as data security, data protection, general business information, personal information security, etc.  This could also be aligned with the sourcing category structures being developed at several banks, as the language and contract structures are very similar in assigning the risk criteria, which in turn drives similar transactional events in how the SOW or PO is structured.  The problem with doing this at the individual transaction level today is that the risk assessments triggered are often more expensive than the value of the contract for low-spend items!


A meeting of financial supply management executives, faculty, and IACCM thought leaders took place on January 26, 2018 in New York.  A number of different topics were discussed during this session, and the agenda was kept fairly open in discussing how emerging technologies would impact the supply management and contracting space.

Clearly, there is a myriad of automation technologies emerging, including blockchain, process automation, AI, and digitization, with an entire new wave of technologies coming at us.  Tim Cummins noted that “we see the next wave as having the potential to transform trading relationships and networks.  We have been through the era of ERP, have moved towards a high level of standardization, and have driven tremendous improvements in data, to the point where standardization led to the disaggregation of activities from the enterprise to a diverse supply base.  In financial services, an extreme amount of due diligence has been applied to internal controls, but as these organizations have outsourced more and more, gaining the same level of insight into the quality and performance of diverse suppliers and customers is beyond the capability of existing technology.  So organizations are now having to throw a high level of human resources at the consequences of outsourcing to manage these outsourced resources.”

There are several consequences of this trend.

  1. We have complexity: every customer and supplier behaves differently, and different norms exist for the way we want to work.  Each party has its own unique and different way of invoicing, for example.  This creates massive complexity and risk in the system, because we refuse to standardize aspects of our relationships that have no economic value in being different.
  2. A second dimension is the technology itself, and the way that emerging technologies promise a better way of operating across organizational boundaries.  We have suggested that the potential efficiencies of these technologies are equivalent to the improvements and savings generated through ERP systems, but ERP does not address the external piece.  We need to enter the era of Relationship Resource Planning to manage our outsourced contractual relationships.  How will automation emerge in this environment, and how could we use these technologies to ensure that we are benefiting from outsourcing rather than driving additional costs into our external relationships?

In the ensuing discussions, the role of these emerging technologies was discussed at length, by several individuals present.  Here is a snippet of some of the conversations that took place.

Dynamic Discounting

New technologies will have a big impact on the future landscape of financial flows in the supply chain.  At a major global bank, for instance, procurement is employing a dynamic discounting tool that provides opportunities for cost savings as well as early payment discounts for suppliers that need access to working capital.  For example, typical payment terms might be a 2% discount for payment within 10 days, or no discount for payment in 30 days.  But what if the supplier needs cash immediately?  What would they be willing to pay to receive payment within 2 days?  Some organizations are even borrowing against their payables to get letters of credit, and pay higher interest rates on commercial paper.  The bank is piloting a tool that lets a manager set an internal hurdle rate and put current payables out for auction, allowing suppliers to bid for early payment on their invoices.  This can yield significant cost savings for the bank, and help suppliers with their cash flow positions.
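The economics of taking a discount early can be made explicit with the standard trade-credit annualization formula; this is a generic illustration using the 2%/10-day, net-30 terms from the example above, not the bank’s actual tool or hurdle rate.

```python
# Implied annualized return of taking an early-payment discount.
# Generic illustration using the 2% / 10-day / net-30 terms from the text.

def implied_annual_rate(discount: float, pay_day: int, net_day: int) -> float:
    """Annualized rate earned by paying on pay_day instead of net_day
    in exchange for the given discount fraction (e.g. 0.02 for 2%)."""
    return (discount / (1 - discount)) * (365 / (net_day - pay_day))

# 2% discount for paying on day 10 instead of day 30:
print(f"{implied_annual_rate(0.02, 10, 30):.1%}")  # ~37.2% annualized
# If the supplier wants cash on day 2, the same 2% is spread over more days:
print(f"{implied_annual_rate(0.02, 2, 30):.1%}")   # ~26.6% annualized
```

Against a typical internal hurdle rate, almost any reasonable early-payment bid clears the bar, which is why a payables auction can create savings for the buyer while still helping suppliers’ cash positions.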

Block Chain

Pilots are also underway in the financial sector using smart contracts to enforce SLA management of server contracts within an outsourced data center.  In one case, IBM manages 400 pre-positioned servers for a bank in the bank’s own data center.  Each server has an IoT sensor and is tied to a condition in the contract for those storage services.  When the server is provisioned, the IoT sensor automatically triggers a purchase order and invoice for that server, and once provisioned, a Service Level Agreement (SLA) goes into effect stating that the server must operate at an uptime of, for example, 98.6%.  But if there is a faulty chipset and the server fails to hit the SLA, two things happen.  First, the server sends a signal into the blockchain, which generates a service credit.  Second, the blockchain immediately creates a settlement for the credit, using an internally generated crypto-currency with a defined value.
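The credit logic in that pilot can be sketched in ordinary code. The actual implementation would be a smart contract on the blockchain; the 98.6% uptime target comes from the example above, while the credit amounts here are invented for illustration.

```python
# Toy sketch of the SLA service-credit logic described above.
# The 98.6% uptime target is from the example; credit values are hypothetical.

def sla_credit(measured_uptime: float, sla_target: float = 0.986,
               credit_per_point: float = 1000.0) -> float:
    """Service credit owed when measured uptime misses the SLA target.

    credit_per_point is a hypothetical credit per percentage point of shortfall.
    """
    if measured_uptime >= sla_target:
        return 0.0  # SLA met: no credit, no settlement
    shortfall_points = (sla_target - measured_uptime) * 100
    return round(shortfall_points * credit_per_point, 2)

print(sla_credit(0.990))  # 0.0    -> SLA met
print(sla_credit(0.970))  # 1600.0 -> credit settled immediately on-chain
```

In the pilot, the interesting part is not the arithmetic but the automation: the IoT signal, the credit calculation, and the settlement in an internal crypto-currency all happen without an invoice dispute cycle.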

The potential future scenarios for applications of blockchain in the financial services sector are enormous.  One of the biggest expenses for banks is professional services, and one of the biggest challenges associated with professional services is how to monitor the services delivered under a Master Services Agreement.  From a contract management perspective, the challenge is ensuring that verification of services performed occurs against the contracted requirements and statement of work.  For instance, under an “Agile” software development approach, you could build a two-week deliverable checkpoint into the process that would allow the customer to send a signal back into the blockchain showing that the code had been completed, tested, and verified.  This would in turn drive the payment cycle.  This is one of many potential future applications that would transform procurement, and potentially eliminate the need for software such as Ariba that requires processing of POs, invoices, requisitions, and so on.  The entire sector would be disrupted.  This is indeed an exciting potential development.


I had a great opportunity yesterday to meet with Pete Suerken, CEO of Resin Technologies Inc. (RTi), and share thoughts on how should-cost models are being actively used to drive improved visibility into the true costs of resin, paper packaging, and related products.  RTi, based in Fort Worth, TX, is working with NC State’s Supply Chain Resource Cooperative, attracted by our strong program in procurement and supply chain, the focus on analytical thinking and modeling, and the mix of engineering and business students who work on projects with our industry partners.

Resin is the most critical cost component in a processor’s business, ranging from 45 to 85% of the total cost of doing business.  Resin is found in just about anything you buy today: plastic trinkets, food packaging, meat packaging, toys, industrial products, you name it.  And keeping track of prices in this market is more complicated than you might think.  RTi brings together an analytical network that spans the globe and 20 billion pounds of transactional benchmarks, which gives them the ability to bring knowledge and transparency to clients and a clear vision of the resin markets.  Most of their staff are familiar not only with the prices of resin, but with all of the different types of equipment, run rates, labor costs, transportation costs, overhead rates, and typical profit margins for pretty much everyone in the industry.  They are resin gurus, but they have also expanded into paper packaging and have developed strong capabilities in that area as well.

But RTi is growing.  And that means they need people.  Pete shared with me that “we need to bring in young people who have an interest in procurement, because most of the information we develop is used in procurement negotiations.  The objective here is not to cut the supplier’s margins, but to identify the true cost of the packaging and raw materials, and find ways to allow the supplier to make a fair margin at the right price.  Our engineers can generate some incredible cost models that pull in data from multiple sources, but we also need people who can work with our clients, to help them think through how should-cost models can be integrated into the supplier relationship model that results.”

RTi’s should-cost models are incredibly accurate.  This is due not only to the experience of its engineers, but also to the massive amount of data available to its analysts.  Most purchasers are satisfied if, during contract negotiations, their suppliers show them that they are paying resin prices that are stable at current commodity exchange rates.  However, the reality is that prices are continually in flux.  While oil prices drive much of the change, bottlenecks in the feedstock supply process, as well as imbalances in supply and demand, are constantly changing the resin landscape.  Processors do not have the time or networks to gather the essential information needed to make the best decision possible every time a purchase is executed.  RTi resin experts dedicate 100% of their time to analyzing critical market drivers from the well head through the manufactured finished product, using their vast network.  Market drivers such as feedstocks, exports, pricing benchmarks, supplier actions, producer inventory levels, and operating rates all have an impact on the price of resin.
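As an illustration of how a should-cost model rolls up cost drivers into a target price, here is a simple sketch; the cost categories are the generic ones named above, and every figure in it is hypothetical — this is not RTi’s actual model or data:

```python
# Illustrative should-cost roll-up for a molded plastic part.
# All rates and quantities below are invented for the example.

def should_cost(resin_lbs, resin_price_per_lb, machine_hours, machine_rate,
                labor_hours, labor_rate, freight, overhead_pct, margin_pct):
    """Build a should-cost estimate from material, conversion, freight,
    overhead, and a fair supplier margin."""
    material = resin_lbs * resin_price_per_lb          # raw resin cost
    conversion = machine_hours * machine_rate + labor_hours * labor_rate
    base = material + conversion + freight             # direct cost
    with_overhead = base * (1 + overhead_pct)          # absorb overhead
    return with_overhead * (1 + margin_pct)            # allow a fair margin

# Hypothetical part: 2 lb of resin at $0.80/lb
price = should_cost(resin_lbs=2.0, resin_price_per_lb=0.80,
                    machine_hours=0.05, machine_rate=60.0,
                    labor_hours=0.02, labor_rate=25.0,
                    freight=0.30, overhead_pct=0.15, margin_pct=0.10)
print(round(price, 2))
```

The negotiation value comes from comparing this bottom-up number against the quoted price: a large gap is a conversation about specific drivers (resin index, run rate, overhead), not a demand to cut the supplier’s margin.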

Although resin prices in North America are reacting to global pressure and have fallen rapidly, each processor should ask:

  • What should our current buying strategy be in today’s market?
  • What should I know about the resins I consume over the next quarter?
  • What drivers are in place that can move resin prices in one direction or the other in the short term?
  • What are the key drivers that impact resin domestically as well as globally?
  • Where and when will resin pricing hit bottom?

These are questions that every good supply management professional should be asking.  But the answers are not always clear.  Faculty and students at NC State look forward to working with RTi in exploring how to address these questions and integrate them into the sourcing process in the near future.


In a prior post on 2018 supply chain predictions, I shared how NC State’s Poole College of Management engages directly with industry through student-led projects.  I also predicted that this type of engagement would continue to grow in the future, as organizations begin to recognize that an objective, third-party view of their problems can create innovative solutions.  In the NC State Supply Chain Resource Cooperative, supply chain projects occur at both the undergraduate and graduate level, and are organized around focused deliverables related to our partner companies’ supply chain issues and challenges.  And everyone I know has issues and challenges with their supply chains!

Here is a sample of the type of projects our MBA and Engineering students are working on this semester, proving again that NC State is all about “THINK AND DO” (in our case, executing on practical applications in the real world using robust supply chain analytical approaches).  These projects all involve cutting-edge topics that are often discussed in blogs and websites.  Our students are executing and making them happen in these projects.

  1.  Cheniere:  Modernizing Master Service Agreement Contract Language.  In this project, students are using search algorithms to identify all MSAs that have an evergreen term and replace the older-vintage MSAs with newer-vintage MSAs.  Such MSAs were entered into by any of a number of different entities, and must be addressed to minimize risk.
  2. Cisco:  Category Spend Segmentation.  This project continues the work the team did in the fall semester on spend classification and taxonomy, applying it to category maturity and optimization.  The current phase will involve using the analyzed spend and other inputs to refine the category strategy optimization approach.
  3. Duke Energy Sourcing Automation through AI:  Sourcing optimization through technology and artificial intelligence is becoming more readily available within the supply chain industry.  This team will work to identify specific opportunities within the sourcing process at Duke Energy to advance sourcing strategies through AI technology and reduce low-value sourcing efforts.
  4. Lenovo Blockchain Analysis Part 2.  Following a project we did last fall, Lenovo’s Data Centre Group (DCG) in Morrisville, North Carolina is seeking to identify and explore opportunities for process improvement and supply chain efficiency by leveraging blockchain.  Blockchain uses cryptographic protocols to transfer titles, enforce permissions, and record activity to track the flow of goods and services among various parties through a commonly shared ledger.
  5. Lenovo Excess and Obsolete Inventory.  This project is the extension of an executive workshop we ran last fall.  The students will review and document the current geography allocation process for Surplus/excess, Overage and Unconsumed Purchase parts inventory (SOUP).  The objective will be to recommend process improvements including process flow, timeline, meetings, and roles and responsibilities for a revised E&O allocation process.
  6. MUFG Union Bank Strategic Performance Metrics.  The team will seek to develop a comprehensive set of forward-thinking, long-lasting strategic performance metrics suitable for use by the Chief Procurement Officer within a major US bank’s procurement and vendor payment program.  The strategic metrics should serve the needs of a Chief Procurement Officer who operates as a critical part of the enterprise, encompassing all material program elements.  Metrics should focus on managing a bank’s procurement and payment program, not on managing individual vendors.
  7. Premier Category Calendar.  The team will develop a system to positively impact the workload of Premier’s internal sourcing team and external membership by reviewing the current category calendar; developing an understanding of new-category and split-category expectations; and taking into account distributed and non-distributed product, how it affects workload, and the number of suppliers participating in the sourcing process.
  8. Yum Brands Beef Analytics.  The Taco Bell beef quality assurance predictive analytics project provides the unique opportunity for a team of strategically and analytically focused future executives to apply business analytic tools to determine if there is a relationship between the beef raw materials used to produce Taco Bell seasoned beef and the product scorecard ratings for Taco Bell’s seasoned beef. The project will help to identify superior raw materials and superior sources of those raw materials in order to ensure that Taco Bell continues to deliver the highest quality beef product to its guests.
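The contract-language search in the Cheniere project (item 1) can be sketched quite simply: scan each MSA’s text for auto-renewal ("evergreen") language and flag the matches for replacement.  The regular expression and the sample clauses below are invented for the example, not the project’s actual search logic:

```python
import re

# Hypothetical pattern for common evergreen (auto-renewal) phrasings
EVERGREEN = re.compile(
    r"(automatic(ally)?\s+renew|renew\s+automatically|successive\s+one[- ]year\s+terms)",
    re.IGNORECASE,
)

# Invented sample contract text keyed by MSA identifier
msas = {
    "MSA-2009-014": "This Agreement shall automatically renew for successive one-year terms.",
    "MSA-2016-102": "This Agreement expires on the third anniversary of the Effective Date.",
}

# Flag the older-vintage MSAs containing evergreen language
flagged = [name for name, text in msas.items() if EVERGREEN.search(text)]
print(flagged)
```

A production version would pull clause text from a contract repository and pair the keyword search with review by counsel, since renewal language varies widely across vintages and drafting entities.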

What makes NC State unique is that we team up MBA students with master’s students from engineering colleges around the university.  We have had engineers from Industrial, Mechanical, Textiles, Manufacturing Systems, and Agricultural Sciences come over to Poole and partner with our MBAs.  This brings a unique skillset to these projects that isn’t found at most universities.  These are exciting projects, and I have no doubt this brilliant team of MBA and engineering students will come up with some highly innovative insights and solutions!


Unemployment levels are at an all-time low.  Economists are trumpeting the economy, and how great this is for everyone.  Employers added 228,000 jobs in November, while the unemployment rate remained at a 17-year low of 4.1%, the government reported in December.  The report also showed that average weekly paychecks increased by 3.1% over the last 12 months, the first time that reading has topped 3% in nearly seven years.  But much of that gain came from Americans working longer hours.  Average hourly pay increased 2.5%.

The Bureau of Labor Statistics noted that in May 2017, unemployment rates in seven states were at their lowest levels since the state unemployment data series began in January 1976.  Colorado had the lowest unemployment rate in May, 2.3 percent, followed by North Dakota, 2.5 percent, both of which were the lowest rates ever recorded in those states.  The rates in Arkansas (3.4 percent), California (4.7 percent), Mississippi (4.9 percent), Oregon (3.6 percent), and Washington (4.5 percent) were also at series lows.

Isn’t this great!!  Well actually…no…just ask anyone who works in the logistics sector, especially if they rely on contingent workers.  As we’ve noted in previous posts, contingent workers are the folks who come in to work in warehouses, on trucks, and in distribution centers, and who keep the packages coming from all of those Amazon orders you place online.  And they are becoming harder and harder to find.  And when you find them, you hope they will pass the drug test (required for working around heavy machinery), and you hope they will stick around (most leave after a few weeks for other jobs).  And people are scratching their heads.  How do we find workers for these jobs?  Should we be paying them more?

Higher pay is most certainly not the answer.  I was exposed to this thinking when I recently had the opportunity to meet with Tara Greene of The Greene Group in Charlotte, NC.  The Greene Group is a staffing agency focused on the logistics sector, specifically identifying trained professionals for the warehouse, transportation, and truck driver segments.  Their operations include StrataForce contingent work management, as well as Road Dog Team, which serves the market for truck drivers.

Tara and her colleagues at Greene Group (GG) have identified and developed a new business for the staffing and contingent worker segment that is unique and unlike any others today in the field. With unemployment at an all-time low (4.1%) in 2018, and shortages of qualified truck drivers and warehouse workers, GG has identified that the key to creating a reliable workforce is not through increasing wages, but to offer a new benefit to workers: FLEXIBILITY.  There is an estimated shortage of 50,000 truck drivers at any given time in the US supply chain network.  Additional survey data suggests that 72% of contingent workers choose to remain contingent, and that only 42% are contingent because they feel they have to.  Anecdotal experience suggests that 50% of Uber drivers are full-time.  There is clearly a trend going on which is being enabled by the “Uber” economy and the enabling technology allowing people the opportunity to work how and when they want to.  This may be part of a movement towards increasing desire for “entrepreneurial mindsets” that involve controlling one’s destiny, enabled by digital marketplaces.

GG supports many large companies, including Stanley Black & Decker and several large trucking companies, through their online scheduling platform “MyWorkChoice”.  This app has created a community of 100,000 contingent workers and drivers who interact with the work schedules created by employers, providing a reliable source of labor for companies with shifting demand requirements.  This business model is part of an economic and demographic change toward what has become known as the “Gig Economy”, referring to the fact that skilled workers want to work at times that are convenient to their schedules, and are not necessarily committed to working for a single company for an extended period of time.  For example, IKEA acquired TaskRabbit, which allows it to hire individuals on a contingent basis to assemble its furniture.  Amazon Flex is emerging as a way for people to sign up to deliver packages in their time off.  Due to the large numbers of people willing to work part-time or temporary positions, the result of a gig economy is cheaper, more efficient services (such as Uber or Airbnb) for those willing to use them.  Those who don’t engage with technological services such as the Internet tend to be left out of the benefits of the gig economy.
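At its core, a scheduling platform of this kind matches workers’ chosen availability against employers’ open shifts.  Here is a toy sketch of that matching step; the data model and greedy assignment logic are entirely hypothetical, not MyWorkChoice’s actual implementation:

```python
# Hypothetical open shifts posted by employers
open_shifts = [
    {"id": "DC-Tue-PM", "site": "distribution center", "slots": 1},
    {"id": "WH-Mon-AM", "site": "warehouse", "slots": 2},
]

# Hypothetical workers who have opted into specific shifts via the app
workers = [
    {"name": "Avery", "available": {"WH-Mon-AM", "DC-Tue-PM"}},
    {"name": "Blake", "available": {"WH-Mon-AM"}},
    {"name": "Casey", "available": {"DC-Tue-PM"}},
]

def fill_shifts(shifts, workers):
    """Greedily assign each worker to at most one shift they opted into."""
    roster = {s["id"]: [] for s in shifts}
    capacity = {s["id"]: s["slots"] for s in shifts}
    for w in workers:
        for shift_id in sorted(w["available"]):  # deterministic order
            if capacity[shift_id] > 0:
                roster[shift_id].append(w["name"])
                capacity[shift_id] -= 1
                break  # one shift per worker in this simple sketch
    return roster

print(fill_shifts(open_shifts, workers))
```

The point of the flexibility model is visible even in this toy: workers only appear in shifts they chose, so reliability comes from self-selection rather than mandated hours.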

In light of all these shifts, GG has approached the Supply Chain Resource Cooperative as a partner to help them drive improved visibility and education, and to document what is happening in the Gig Economy.  Stay tuned!


I had an opportunity to work on a number of exciting projects this past year, and in the process, spent a lot of time talking about digital supply chains with a lot of really informed people.  Here are a few of the big stories that I wrote about this past year that I believe define some of the big shifts in the supply chain ecosystem we witnessed in 2017, and the predictions for 2018.


  1.  Prediction:  Real-Time LIVING Supply Chain Investments Will Separate the Winners from the Laggards.  The most exciting development for me personally this year was the publication of my book The LIVING Supply Chain, co-authored with Tom Linton, Chief Supply Chain Officer of FLEX.  The importance of velocity, speed of decision-making, and the return on working capital generated was recently emphasized in a financial analysis by UBS, which explained the financial impact of having faster, more responsive supply chains and the impact on share price.

“…Flex’s philosophy emphasizes speed whereas EMS competitors emphasize control or resiliency. [Linton] co-authored The Living Supply Chain, which argues that “Speeding up the supply chain is at the root of everything that is good: improved revenue, reduced working capital, higher profitability, and less obsolete inventory. Conversely, slowing down the supply chain is at the root cause of everything that is bad: working capital write-offs, reduced profitability, and slowing revenues.” A control approach creates friction. With 780,000 parts being handled each day, Flex can’t afford to slowdown decision making. Amazon also optimizes for speed.  Rather than have one mega-ERP system, Flex is willing to stitch together 26 ERP systems that feed into Pulse. It argues that it would rather the software works for Flex than Flex works for the software.  Flex is helping its own engineers design for profit by recommending the best components and suppliers to use in their product designs, taking out cost and increasing speed.  The goal is to move upstream toward demand creation with customers rather than downstream to product fulfillment. Upstream increases leverage and profitability.”

Real-time data sharing will become more important than ever in driving quality improvements, but will hinge largely on the ability of organizations to build effective data governance strategies to control how data is managed and collected.  This will become especially important in healthcare supply chains, where data governance is abysmal for the most part.  Real-time data will also be key in managing risk and disruption, given what was a particularly horrific hurricane season this year.

2. Prediction:  Small is Beautiful:  Innovation will Increasingly Be Found in Small Companies.   Just as the economist Schumacher predicted years ago in his incredible book, the biggest shifts will come from the smallest companies.  Supplier-driven innovation was also a major theme at the Procurement Leaders meeting in Miami this spring.  Small companies are continuing to grow in multiple areas, and creating innovative solutions using the massive amounts of data floating around in the ecosystem.  These small companies are focusing on specific business issues, attacking them using small teams of analytical experts, and piloting and re-piloting new projects with clients.  I also wrote about this in the context of “heedful interrelating” between people working in the same supply chain.  This is how innovation ultimately occurs.  In a podcast discussion with Kathryn Kelly, who leads the Ohio State University Manufacturing Network, we talked about some of the big impacts that the LIVING Supply Chain will have on small business.  On this same trip, we discussed how small companies need to find the right supply chain partners, and how companies must be courageous in exploring new technologies.  (It is no coincidence that Flex has a Sketch-to-Scale center in their Innovation Center that works with small companies to develop their new products with them!)

3.  Prediction:  People Will Begin to Experiment and Learn to Interact with Cognitive Technologies.  Other blogs on the LIVING (Live, Interactive, Velocity, Intelligent, Networked, Good) Supply Chain also note some of the major shifts that will be required.  I also wrote a white paper and webinar that documented the journey that IBM took in developing the Transparent Supply Chain, and the catalysts that drove them to move in this direction.  The webinar was with Jeannette Barlow from IBM and Simon Ellis from IDC; the focus was on the impact of AI on supply chain decision-making, and I spoke about a recent case study on IBM’s application of Watson to their own supply chain.  One of the key lessons I took away from this experience was the importance of training machines, and understanding not only the potential of cognitive learning, but also the limitations of computers in completing tasks that are performed by humans (which I wrote about in another blog earlier this year on Deep Blue beating Garry Kasparov in chess).  People, particularly those with expertise and specialized skills, will need to learn how to train machines and interact with them to be truly effective in the future.  Asking the right question will increasingly be an important component of successful human-machine interaction.

4.  Prediction:   Blockchain, Smart Contracts, Counterfeiting and Legal Involvement Will Continue to Escalate.  We have certainly seen a lot of excitement and activity going on around blockchain this year.   As my friend at Spend Matters, Pierre Mitchell, joked to me this week, it’s probably not a case of “my business will fail unless I have a distributed ledger”, or that those that fall behind on blockchain won’t be able to do business.  However, the link between blockchain and smart contracting is certainly an important emerging area.  Smart contracts are probably the best area for AI applications, as they require translation and an understanding of patterns of dialogue, dialects, and legalese – unpacking the meaning of meaning.  There were a number of predictions I laid out for blockchain in the future, including its potential impact on the procure-to-pay process and financial transactions, the potential for use in preventing counterfeiting, and other applications.  I had an opportunity to sit through some wonderful presentations at CSCMP in Atlanta this year.  One of the standout discussions was DHL’s approach to customizing the supply chain using digital technology, as well as a discussion by a group of 3PLs and lawyers on the legal concerns that exist around data transparency in an increasingly litigious world.

5.  Prediction:   People Will Increasingly Stand Up and Do the Right Thing.  What a difference a year makes!  This time last year I was predicting doom and gloom around the emergence of the new Trump presidency.   In fact, we had an entire Spring meeting dedicated to the topic of “Mapping the Road Ahead in Uncertain Times.”  Trump’s social media activities have certainly impacted people’s decision-making and that of markets, as I noted in a previous post.  The two standout items from this administration are the pending tax plan, and the pullout from one of the most important trade agreements (the TPP), which could really hurt the future growth of trade.  One of the highlights of this fall was going to the Supply Chain 50 meeting in New York, and sitting in on some great speeches by Larry Bossidy, Tom Linton, and Charles Duhigg.  One of the big takeaways for me from these sessions was the critical nature of human interaction, discussion, and debate that drives the right decision-making.  The importance of diversity as a cornerstone for thinking differently about the supply chain was addressed very eloquently by Merck CEO Ken Frazier, and in his speech he talked about the importance of acting courageously when we see something fundamentally wrong happening.  (Frazier took the lead in resigning from Trump’s Manufacturing Council, which was followed by a wave of resignations by other CEOs.)  The idea of collaboration was discussed very well in a session by New York Times author Charles Duhigg, who talked about how structuring teams to get the best out of people will become more important than ever.

6.  Prediction:  Collaboration Between Industry and Academia in Supply Chain Thinking Will Become More Important Than Ever.  Within the SCRC, we held a number of fantastic events this past year, including hosting two meetings that focused on the themes of “Navigating the Digital Supply Chain“ and “Mapping the Road Ahead in Uncertain Times“.  At both of these events, there were great presentations by industry leaders, including DHL’s presentation on the analytics environment, as well as a great presentation by Advance Auto Parts on their digital supply chain strategy.  We also held a series of excellent Executive Roundtables, including the Role of Supply Chains in Combating Counterfeiting last May, as well as a roundtable on the need for sales and procurement to work more collaboratively and achieve mutually beneficial outcomes.  This will require re-designing the processes used to manage these sales-procurement relationships.  We also held an Executive Workshop on Excess and Obsolete Inventory, and developed a white paper that provides insights and sets the stage for the next workshop in this area.  This type of industry-academic engagement is getting NC State a lot of attention, as we celebrated the 25th anniversary of the Poole College of Management, and are being recognized for the output of our research that is making a big impact on the business world.  After visiting with a number of my supply chain colleagues on different visits this year, and benchmarking their programs at top SCM schools such as Michigan State, Penn State, Ohio State, MIT, and Arizona State, I have to admit that NC State is truly leading the Pack when it comes to a “Think and Do” mentality in working with our industry partners.
Our students are working on “real” projects with our partner companies, tackling such challenges as deployment of blockchains, managing risk in supplier contracts, creating predictive analytics in the food supply chain, developing cost models for heavy equipment, and benchmarking supply chain salary roles, among many other relevant projects.  Students are walking away with skills that allow them to hit the ground running when they graduate.  The impact of these projects is seen in their deployment into the companies we work with; these are not just case studies, but sources of innovative analytical and managerial insights.  It’s why we are ranked as a top-25 supply chain program by SCM World and Gartner.  And we roll into 2018 with a new batch of 28 projects this coming semester!




Research by NC State Poole College Professor Robert Handfield, director of the Supply Chain Resource Cooperative based in Poole College, and Yung-Yun Huang, a recent graduate of North Carolina State University’s interdisciplinary doctoral program in operations research, supplied two of the 37 data inputs used in the Wall Street Journal’s inaugural “Management Top 250” ranking of companies, published recently.

Produced in partnership with the Drucker Institute, a unit of Claremont Graduate University, the “landmark ranking marks the first time the ideals and teachings of the late business guru Peter Drucker have been used to analyze and compare the performance of major U.S. companies,” the Wall Street Journal reported.

The ranking “is based on an analysis of 37 data inputs provided by 12 third-party sources.”  Data for two of the categories used in the ranking – Innovation and Social Responsibility – are based on processes developed by Handfield and Huang.  In the innovation category, the metrics relate to spending management, category management, strategic sourcing, and supplier relationship management.  In the social responsibility category, the metrics relate to supply chain policies, practices, and results, including audits and lawsuits, with respect to human relations and the environment.

The Handfield-Huang index builds on empirical studies showing that “sustainable firms have superior financial outcomes” (e.g., Jacobs et al., 2010).  Stated differently, is transparency of supply chains simply a function of better management practices, improved systems, and more mature governance mechanisms, which in turn enable organizations to track, measure, and manage their global suppliers in a more transparent manner?

This question was pursued by Yung-Yun Huang in her dissertation completed in May 2017 at NC State University, entitled “Machine Learning in Automating Supply Management Maturity Ratings”.  Huang applied a rigorous machine-learning automation process to assess the supply management maturity ratings of over 600 global companies.  This required a comparison of unigram and bigram feature settings, three text summarization techniques (full text, paragraph extraction, and sentence extraction), and two different support vector machine approaches (one-against-one and one-against-all) on balanced and imbalanced datasets.  The automated ratings exhibited 89.9% accuracy when compared to manually acquired maturity ratings completed by prior generations of graduate students.  The automation process was adapted as an external evaluation approach (through public online resources) to assess supply chain sustainability maturity in areas such as labor & human rights and environmental management.
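As a small illustration of the unigram/bigram feature settings mentioned above, here is a sketch of that feature-extraction step; the sample text is invented, and this is a generic n-gram counter, not the dissertation’s actual pipeline (which fed such features into support vector machine classifiers):

```python
from collections import Counter

def ngram_features(text, n_values=(1, 2)):
    """Count unigram and bigram features from whitespace-tokenized text."""
    tokens = text.lower().split()
    feats = Counter()
    for n in n_values:
        for i in range(len(tokens) - n + 1):
            # Join n consecutive tokens into one feature key
            feats[" ".join(tokens[i:i + n])] += 1
    return feats

# Invented snippet of the kind of public-disclosure text being rated
doc = "supplier audits and supplier scorecards"
feats = ngram_features(doc)
print(feats["supplier"])         # unigram count
print(feats["supplier audits"])  # bigram count
```

Bigrams matter here because phrases like “supplier audits” carry maturity signals that the individual words “supplier” and “audits” do not capture on their own.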

Read the full report at The Wall Street Journal.  Additional links: the Methodology for the Management Top 250 Company Rankings, and the Drucker Institute’s website.

[No comments]