One major technological development is the ability to map genomes. A genome is all of a living thing’s genetic material. It is the entire set of hereditary instructions for building, running, and maintaining an organism, and passing life on to the next generation.[1] In most living things, the genome is made of a chemical called DNA. The genome contains genes, which are packaged in chromosomes and affect specific characteristics of the organism. A genome map helps scientists navigate around the genome. Like road maps and other familiar maps, a genome map is a set of landmarks that tells people where they are, and helps them get where they want to go. The landmarks on a genome map might include short DNA sequences, regulatory sites that turn genes on and off, and genes themselves. Often, genome maps are used to help scientists find new genes.

Genome mapping is a capability that grew out of the biotech industry, and it is proliferating largely because of the dropping cost of genome sequencing. A genome sequence cost $100M in 2001; by 2007 it was $10M; it has now dropped to $1,000, and is expected to drop to a penny by 2020! Less than the cost of flushing a toilet! A picture of a genome map pulled from a website is shown.
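Those figures imply a roughly exponential decline. A quick back-of-the-envelope sketch (the dollar milestones are the ones quoted above; attaching the $1,000 figure to 2015 is my own assumption):

```python
# Implied annual fold-reduction in genome sequencing cost, using the
# milestones quoted in the text. The 2015 date for the ~$1,000 genome
# is an assumption for illustration.
milestones = [(2001, 100_000_000), (2007, 10_000_000), (2015, 1_000)]

for (y0, c0), (y1, c1) in zip(milestones, milestones[1:]):
    factor = (c0 / c1) ** (1.0 / (y1 - y0))  # average fold-reduction per year
    print(f"{y0}-{y1}: cost fell about {factor:.1f}x per year")
```

The cost fell faster than Moore’s law over the second interval, which is why observers expect the trend to keep compounding.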
In a similar fashion, the digitization of things will allow us to map product genomes. The ability to track products, pinpointing not only where they are today but the entire history and ancestry of a product through the chain, is emerging as a key enabler of transparency and visibility in the supply chain.

Consider this: Are you able to connect the essential leverage points in your network through cloud, mobile, and other mediums that provide a platform for analytics? Can you track the DNA of your supply chain at a part-number level, globally? Today, the answer to both questions is no, but very soon we will see technology that permits anyone, whether a consumer, a manager, or a supplier, to do this. This is one of the big questions to consider when we think about how supply chains evolve. We need a structure to map the genome of our supply chains, and this means having the ability to establish part-number tracking and coding in the end-to-end supply chain. One of the biggest opportunities here is to think about a vehicle for encoding the genome, to enable understanding of where products come from and where they go, which is one of the most important elements in combating counterfeiting and fraud. This element of waste is rarely discussed by supply chain scholars, but it remains one of the biggest and most overlooked sources of lost profits and revenues in the world. The importance of tracking and measuring all goods, including possible counterfeit goods, must be estimated using data tracking. But unlike the calls for “Big Data”, we must “de-mystify” the view that Big Data is the answer for supply chain improvements. Big Data on its own is static and useless; it is the questions you ask of the data that change supply chain outcomes.

The digitization of products and things is a key technology development that will drive the ability of individuals in supply chains to track what is happening in their living supply chain. In effect, these digital signals are like the nerves in our body that transmit signals to our brain, driving us to act when our hand is close to a stove burner, when we taste something pleasurable, or for any other sensation driven by nerve endings.

But we can’t process everything at once in a supply chain. There is just too much data, and this can lead to sensory overload, where data is flying at us and we can’t process it all. We have to sleep sometime, too! And maybe play golf without having to look at our mobile phones all the time. So how is the digital living supply chain going to work?

The key here is to think about what data you NEED at any given time, and what data is considered CRITICAL at any given time. As humans, we can only focus on a limited number of inputs, so we need to define ahead of time what we view as critical. If we need information on something that doesn’t fall in that category, we also need to know where to look for it, and if we have to do a “deep dive”, we need a system that lets us drill down to the right level of granularity. But more than that, you need to think about getting data that is useful, and for it to be useful, it needs to be current – ideally, in real time! And to be processed quickly, data should be in a “visual” form. That is, people understand pictures far better than tables and tables of figures. So taking data, visualizing it, and putting it together in a way that you can easily process is key. That is what Steve Jobs understood immediately when he designed the Apple as an interactive, visual device with a human interface. How we interact with the data will determine how useful it is. Automation can help us focus on the right information, but ultimately, humans must make the decisions based on that information.



Unlocking growth in the biobased products sector is tied to two elements. The first is access to inexpensive sugar, derived from corn farming. The second is production credits. These have begun to emerge at the state level, with Minnesota leading the way. Early credits offset the capital to be raised through a ten-year clause, which encouraged and redirected investment by reducing the cost of capital. And in Minnesota, the tax credit went not just to companies producing the biobased product, but to anyone who used the product.

Iowa has also identified biobased products as a targeted growth sector for the state, and has advocated for a biobased production credit.[1] Interviews suggest that a biorenewable chemical production tax credit would be fundamental to the ultimate decision to locate new projects in or outside Iowa. Using income tax data alone, the payback period on the $61M the state invested in the ethanol industry was just two years. If we count the industry’s 8% share of all economic activity across all taxes (income, property, sales, etc.), then the payback period was one year. Iowa has more deployed biomanufacturing capital assets than any other state, and the abundance of carbohydrate feedstocks has made it a highly appealing target for growth.
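The payback arithmetic quoted above can be sketched in a few lines. Note that the annual tax-receipt figures below are hypothetical values implied by the two-year and one-year paybacks in the text, not data from the report:

```python
# Back-of-the-envelope payback calculation for a state production incentive.
# The $61M investment is from the text; the annual receipt figures are
# hypothetical values consistent with the quoted payback periods.
investment = 61_000_000

def payback_years(investment, annual_tax_receipts):
    """Years until cumulative tax receipts cover the state's investment."""
    return investment / annual_tax_receipts

income_tax_only = 30_500_000   # hypothetical: implies the ~2-year payback
all_taxes       = 61_000_000   # hypothetical: implies the ~1-year payback

print(payback_years(investment, income_tax_only))  # 2.0
print(payback_years(investment, all_taxes))        # 1.0
```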

To be truly successful, however, production tax credits should be paired with labeling requirements that certify a product as biobased, along with education for consumers on what this means. This can help stem claims that are not supported by production that is truly biobased in nature.

But the ability to account for whether a product is biobased is not an easy task today. Think about the implications and complexity of forming a specific chemical, segregating it, and managing the supply chain to be able to make that claim. This is a significant accounting exercise. Now think about how this product is going to compete in a highly competitive commodity world. That is the reality the biobased product market is facing today. It is clear that the only way the biobased market will function is if it can be scaled to a large enough size that economies of scale work for the industry, not against it (as they do today). This does not require that it be the size of the petrochemical industry, but it cannot simply be a series of small plants scattered around the landscape. And many of the chemicals produced through fermentation that go into biobased products also happen to compete in commodity markets. The implication is that producers need to think clearly about focusing on specialty markets, with products that have unique performance characteristics, based on conversion of biobased commodity chemicals into specialized chemicals through chemical synthesis. In this manner, specialty biobased chemicals can be produced at a total cost of production that is lower than the variable cost of production of petrochemically derived chemicals, which allows them to grow and compete – without any credits.

Our interviews regarding production tax credits (PTCs) also lead us to believe that short-term incentives are sorely needed. Several companies are moving overseas with US-developed biobased technologies to build new facilities, largely because of foreign incentives that are absent in the United States.

Given the intense international competition at this historic point, the companies we interviewed are exploring all global options for expansion projects, taking into account political leadership in support of market push and pull policies that goes beyond a specific one-year operating dollar value.

As one executive we interviewed noted: “It is a commodity world. You can either come in as a commodity or cherry-pick the niches that others aren’t playing in. The market won’t change to accommodate biobased products – we have to accommodate the market. We have to understand the performance of molecules, and blend it with other ingredients to create unique blends, and target the right applications through innovation, and understand the commercial impacts and how to price it. We can maneuver better than the big commodity players, so we have to be able to do that using frigates, whereas they have aircraft carriers that are harder to maneuver….”

Investment in the sector remains strong.  However, continued growth can be bolstered by development of a strong end user market as well as growth in production credits to help launch the sector.

[1] Brent Willett & Joe Hrdlicka, “The Case for a Renewable Biochemical Production Tax Credit”, Iowa Biotechnology Association, 2016.


I had a chance to meet with my good friend Tom Choi from Arizona State University a couple of weeks ago. We met at the United Club in Chicago, as he was on his way back from a visit with Honda, and met to discuss the upcoming study I will be working on with him on procurement analytics for the Center for Advanced Purchasing Studies later this summer. We had a chance to catch up on a great number of things, but one that sticks in my mind is the discussion on predictive analytics and forecasting.

Tom recalled a couple of simple rules around prediction, based on some of the time-honored rules of forecasting methods that we have both taught for years.

“In the forecasting classes you take, all you are really learning is a sophisticated way to create mathematical expressions that capture the past and try to extrapolate it into the future. And we always return to the two cardinal rules that deal with the accuracy of forecasting.”

“The first rule is that aggregated forecasts are always better than product-specific forecasts. (This has been made popular recently by the trend of ‘crowdsourcing’.) The basic rule is that multiple points of view, formed from individual experts and respondents, act as a synthesizing mechanism to help us see what is going on at a macro level. They offer opinions and paint the future for us. If we can combine multiple insights, we can begin to get a reasonable view of the future. For example, predicting whether the NFC or the AFC will win the Super Bowl is easier than trying to predict which team will win the Super Bowl – and forecasting the trend for product families rather than individual products is likewise much easier.”
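The first rule can be illustrated with a small simulation (my own sketch, not from the interview, with hypothetical demand parameters): the pooled demand of many independent products carries a much smaller relative forecast error than any single product’s demand.

```python
import random

random.seed(42)

N_PRODUCTS, N_PERIODS, MEAN, SD = 25, 1000, 100.0, 30.0

# Simulate independent demand streams; the "forecast" is simply the mean.
demand = [[random.gauss(MEAN, SD) for _ in range(N_PERIODS)]
          for _ in range(N_PRODUCTS)]

def rel_error(series, forecast):
    """Mean absolute error as a fraction of the forecast level."""
    return sum(abs(x - forecast) for x in series) / (len(series) * forecast)

# Average product-level error vs. error of the aggregated (pooled) demand.
product_err = sum(rel_error(d, MEAN) for d in demand) / N_PRODUCTS
aggregate = [sum(d[t] for d in demand) for t in range(N_PERIODS)]
aggregate_err = rel_error(aggregate, MEAN * N_PRODUCTS)

print(f"average product-level error: {product_err:.1%}")
print(f"aggregate-level error:       {aggregate_err:.1%}")
```

With 25 independent products, the aggregate’s relative error shrinks by roughly the square root of 25, the classic risk-pooling effect behind the rule.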

“The second rule is that the longer you wait to develop a forecast, the more accurate it will be. That is because forecasts for shorter periods into the future are more accurate than those that go further out. Forecasts for tomorrow are better than forecasts for two months from now. We learn about accurate-response methods: as you increase standardization and reduce lead time, you can get ‘first market data’, which is the best predictor. Waiting to get the very latest (and earliest) market response to your product offering works best for product releases. So if you wait until the last minute – and if you have a very flexible supply chain, you can afford to wait until that last moment – your forecast accuracy increases.”
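The second rule also falls out of simple statistics. A small simulated sketch of my own (hypothetical parameters, not from the interview): if demand drifts like a random walk and the forecast is simply today’s level, error grows steadily with the horizon.

```python
import random

random.seed(7)

# Demand follows a random walk: each period adds independent noise.
# The forecast for h periods ahead is today's level (100), so the
# expected absolute error grows roughly with the square root of h.
def forecast_error(horizon, trials=5000, step_sd=5.0):
    total = 0.0
    for _ in range(trials):
        level = 100.0
        for _ in range(horizon):
            level += random.gauss(0.0, step_sd)
        total += abs(level - 100.0)
    return total / trials

for h in (1, 10, 60):
    print(f"{h:3d} periods out: mean abs error {forecast_error(h):.1f}")
```

Shrinking the horizon (by postponing the forecast) attacks the error at its root, which is exactly what flexible, short-lead-time supply chains allow.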

“If you apply those ideas to the cognitive and analytics arena we find ourselves in, then the implications are obvious. If we can combine last-minute decision-making with computing power, we have a very powerful predictive analytics capability. You can afford to wait until the last minute to make a decision, and you can also aggregate the data (using Big Data). If you are trying to decide whether to buy precious metal from one location versus another, you can bring together the data from both locations, combine it with each location’s record of production stability and past market performance, and make a solid decision. You can let the information from that very day affect your decision-making. Regardless of how smart the algorithms are, you are still doing forecasting. Nobody has a crystal ball – we are just getting a little more sophisticated in how we apply these two rules. We used to have to make forecasts that were very linear, trying to extrapolate the future from the past. But now, with real-time data, we are learning how to do this using smaller increments of time, and we have machines that are also learning faster and faster. We are using a high ‘alpha’ value in our exponential smoothing models, weighting the most recent data the highest. This is driving greater velocity of decision-making – which is ultimately the capability necessary for survival in today’s global economy!”
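The high-alpha smoothing idea in the quote can be sketched directly. The demand series below is hypothetical; the point is how quickly a high alpha adapts to a level shift compared with a low one:

```python
# Simple exponential smoothing:
#   forecast_{t+1} = alpha * demand_t + (1 - alpha) * forecast_t
# A high alpha weights the most recent observation heavily.
def exp_smooth(demand, alpha, initial):
    forecast, forecasts = initial, []
    for d in demand:
        forecasts.append(forecast)          # forecast made before seeing d
        forecast = alpha * d + (1 - alpha) * forecast
    return forecasts

demand = [100, 102, 130, 128, 131, 129]     # hypothetical series with a step up

low  = exp_smooth(demand, alpha=0.1, initial=100)
high = exp_smooth(demand, alpha=0.9, initial=100)

# After the jump to ~130, the high-alpha forecast catches up almost
# immediately, while the low-alpha forecast lags well behind.
print([round(f, 1) for f in low])
print([round(f, 1) for f in high])
```

The trade-off: a high alpha tracks genuine shifts quickly but also chases noise, which is why it pairs naturally with real-time data streams rather than sparse monthly figures.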


I was recently interviewed by Phillip Ideson of the “Art of Procurement” webcast.  I met Phillip at the ISM Global Procurement Tech Summit about a month ago, and he invited me to join the show.

In the webcast, we talked about a range of different topics, including the following:

  • The importance of pizza in attracting students to the Supply Chain area of study!
  • How the NC State Supply Chain curriculum has evolved to reflect the changing nature of the skills necessary to be successful in procurement.
  • What is predictive analytics?
  • The concept of Innovative Data Leveraging
  • 5 predictions on the impact that predictive analytics will have on the future of procurement
  • What procurement leaders should be thinking about as they consider building greater data and analytic capabilities.

I hope you enjoy the webcast… Phillip’s questions really got me going! I had an opportunity to provide a lot of detail about the types of projects our students are working on, and I believe it can be a good introduction for anyone interested in procurement at NC State University who is considering a career in procurement.


I spent the day at Biogen’s headquarters in Cambridge, MA today, with a group of executives sharing ideas on how to drive innovation. Biogen is already a highly innovative company: it has continued to drive new treatments in MS, and in 2014 delivered the first major treatment advances in nearly two decades for people living with hemophilia. This propelled its stock to crazy highs, with a return to more normal P/E levels recently. Today, prior to the session opening, an announcement was made that Biogen would be spinning off its hemophilia business. The announcement was made via email to the attendees present, along with a public announcement to the markets.

The executives in the meeting today were all managers from different parts of the organization, who had been challenged to drive innovative approaches in Biogen’s business models.  This was part of NC State’s Poole College of Management program on Innovation Leadership, with the managers involved tasked with coming up with a business solution for a new challenge facing Biogen in its markets.  Over six weeks, these managers from all over the world had come together to create insights into what they felt were the opportunities in the market for this highly innovative company.

What emerged was a fascinating set of “constrained imagination” viewpoints, covering everything from linking PET scan services for Alzheimer’s, to approaches for driving innovation processes in the company, to ways of connecting with consumers all over the world via connectivity apps. The approaches were diverse, but all demonstrated incredibly novel ideas that will impact the end-to-end supply chain, from the customer’s customer to the supplier’s supplier. As a supply chain professor, I was enthralled to learn more about how companies in the biotechnology space are working to create new, highly novel approaches in this new era of therapeutic medicine. What I saw was the merger of digitization, medical therapies, customer service, and cloud-based applications that will drive the new era of medicine, with the goal of eliminating diseases such as Alzheimer’s, multiple sclerosis, and hemophilia. As a separate entity, the Biogen spinoff is endowed with the same innovative spirit as its predecessor, and I have no doubt it will thrive.


As organizations move towards visible, high-velocity, transparent supply chains, a number of questions arise that are fundamental to the business, but which executives often struggle to answer with any clarity.

  • How do we extend the concept of visibility and control beyond our four walls to drive better execution in our supply chain?
  • How do we get materials from our suppliers to our partners to our customers, given that we rely primarily on spreadsheets and email to interact?
  • How do we understand and limit the data we focus on for decision-making? The problem in most supply chains is not “not enough data”: data is being created by machines, by people, by systems, and by external sources. No, the problem is too much data, not too little!
  • What do I care about, and therefore focus on, as the primary factor that will hinder my ability to get products and services to customers? This means finding the right data and the right tools, finding the exceptions, and rendering decisions based on the data excerpts available. No easy task.
  • How do we identify problems that may be surfacing, that are hidden in the mass of unfiltered data we have today? Often the sources of information are hidden in piles of data that we don’t think about. For example, Elementum identified the Tianjin explosion when someone took a photo of the explosion, tagged it and posted it on Sina Weibo, the Chinese version of Twitter. Social media feeds are just one more form of intelligence that can be leveraged to deal with the flood of information. But the challenge is that although companies have all sorts of news and media monitoring, very little of it is actionable information.
  • The problem thus becomes contextualized into the following sets of questions: How do we take information and data, and put it in the context of our company and our situation, and leverage this information into actionable insights? How will real-time intelligence help me any more than what I’m doing today? What impact does digitization (the internet of things, or IoT) have on my organization and my employees? Will digitization change the way we measure things and monitor metrics?

In addressing these questions, Dana Martin from Elementum emphasized the need to think about the next generation of the supply chain. Vertical integration went away because we have moved in the direction of running virtual vertical integration. Brand owners are a key part of this. Companies like Flex are looking for ways to integrate vertically, not just in product manufacturing, but also in how to collaborate better to drive more efficiency. This also implies the need to restructure contractual terms to ensure that as problems arise (whether due to fluctuations in demand or other factors), managers can quickly adapt to these changes in the supply chain to drive the right outcome.

This ability to contextualize data into decision-making does not occur overnight. It is an evolution that occurs in stages. Today, we are mostly reactive, because we are so close to the problem. Although you may not realize it, executives really aren’t making many decisions, because there aren’t many to make! We are forced down a path because we found out too late. But if we can begin to learn about problems earlier and earlier, we have more options available to us, and very often these options have much lower costs.

Data Crosses Functional Silos

Dana emphasized: “This is a journey people are taking as they drive visibility into the supply chain. The responsive piece is all about how to align teams, not just internally, but across the extended organization. We are used to operating in functional silos that involve managing people and keeping them in buckets, and the data these people are exposed to reinforces these silos. Very often, the processes those functions own sit within silos as well. A problem in procurement can impact manufacturing, logistics, and planning, but often these dots are never connected, so a cross-functional approach to working on them never emerges. But when we connect the dots linking a problem to other functions, we are able to create a coordinated, multi-disciplinary team that can solve the problem faster.”

OK, great! We now know that cross-enterprise data can tie people together and be automated. Big deal – people have been saying that for years! But the real challenge is not only automating this process, but ensuring that it is only the exceptions that pull the functional silos together to solve problems.

Let me emphasize that: it is the exceptions that govern and bring the right team of people from silos together to solve a problem! So as you look across the enterprise, there is a need for a mechanism that pulls exceptions and assembles, in real time, a cross-functional team that can act across the end-to-end supply chain (including manufacturing sites, 3PLs, 4PLs, transportation, distribution sites, and suppliers). This mechanism must be driven not simply by external impacts, such as floods in Houston or explosions in Tianjin. It must operate at a far more granular level, extracting data on events that impact the overall efficiency and throughput of our supply chains.
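One way to picture such an exception-pulling mechanism is a simple rules screen. The event types, thresholds, and team rosters below are hypothetical illustrations, not any vendor’s actual logic:

```python
# A minimal exception-screening sketch: every incoming event is scored
# against a rule, and only the exceptions pull a cross-functional team
# together. All rule values here are hypothetical.
RULES = {
    # event type: (threshold, functions to notify)
    "demand_change_pct":   (25.0, ["planning", "procurement", "manufacturing"]),
    "shipment_delay_days": (1.0,  ["logistics", "order_fulfillment", "sales"]),
    "quality_fail_rate":   (0.02, ["manufacturing", "procurement", "design"]),
}

def screen(events):
    """Return only the events that breach a rule, with the team to notify."""
    exceptions = []
    for event in events:
        threshold, team = RULES[event["type"]]
        if event["value"] > threshold:
            exceptions.append({**event, "notify": team})
    return exceptions

events = [
    {"type": "demand_change_pct",   "value": 8.0, "site": "plant-a"},
    {"type": "shipment_delay_days", "value": 2.0, "site": "plant-b"},
]
print(screen(events))  # only the 2-day shipment delay is escalated
```

The point of the sketch is the shape of the mechanism: routine signals pass through silently, while breaches carry their own cross-functional routing with them.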

Monitoring Small Events, not Black Swans

The mechanism for screening data is therefore not just about tracking Black Swans. Black Swan events don’t happen very often, and you don’t optimize your supply chain in expectation of a Black Swan. The challenge is to surface the small issues and events that happen day to day. A customer changes an order quantity inside the lead times set by the supply base, and the quantities double. There is a quality problem in production, and the schedule falls behind while the problem is resolved. The server goes down for an hour and shuts down communication. Or there is a quantity shortage on a critical raw material at a sub-tier supplier that delays shipment by a day.

These types of small but important events require the attention of a cross-functional team composed of individuals from multiple functions, including design, marketing, sales, order fulfillment, logistics, procurement, manufacturing, and suppliers. A demand fluctuation or a planning issue, if left unresolved, can quickly escalate into a bigger problem unless it is solved by the right team of individuals. A late delivery may be escalated if there is a contractual obligation with the customer. Small issues represent friction in the flow of the supply chain that drives up cost and impacts customer satisfaction. Addressing these issues means using a small team for the initial assessment, and building a larger team if the problem is bigger than anticipated. The speed at which teams are drafted and combined to solve problems is in direct proportion to the ability to solve the problem quickly, at low cost, and minimize this friction.

Going back to this example – a problem in on-time customer delivery really only becomes a major problem if it is not visible across a multi-carrier network route. The ability to quickly become aware of the problem and solve it means having the right data pulled and put in front of decision-makers at the right place and the right time. Improved decision-making occurs when data is presented in a fashion that escalates the nature of the issue to decision-makers. This is especially challenging when decision-makers are dispersed across Brazil, the UK, the US, and other locations. The worst-case scenario is that everyone believes everything is fine until the customer recognizes he hasn’t received his goods, and contacts the company to inquire about it. This is effectively the first recognition that the delivery is late, but it is too late in the process to do anything about it. The late delivery has already occurred. So buyers are now in firefighting mode, trying to find the right data to explain where the shipment is, why it’s late, and so on, which isn’t about solving the problem before it impacts the customer. No matter what happens, the customer is upset, because the shipment is already late.

Assumptions for Creating Transparency

The problem, of course, is that information in the supply chain is never complete, and will always contain bad data. Even as companies worldwide invest in software such as SAP and Oracle, the standards, availability, and consistency of the data produced by these systems will never be 100% stable. And when you expand your supply chain to places like China, Vietnam, and Latin America, where emerging-country customers are located, the variability in data standards and integrity will only increase, as many individuals in these regions still rely on fax and phone calls.

So the de facto position should be that data is relatively easy to get, often contains errors and incomplete datasets, and is produced by a multitude of technologies. Any visibility system must be constructed with these basic tenets in mind. Elementum uses non-relational databases that have no set data schemas, allowing them to take in any kind of data, store it, and analyze it. Graphical interconnections between people, parts, and functions are constructed that enable a problem in one area to be immediately linked and related to another area, where it can be quickly scanned and potentially solved. This approach of linking data through non-traditional forms of relationships is the true “secret sauce” behind effective visibility in real-time supply chain systems. It is the characteristic that makes the approach powerful and actionable.
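A toy sketch of that idea: schemaless records plus explicit links between them, so a problem on one node can be traced to everything it touches. The entities and relationships here are invented for illustration and are not Elementum’s actual data model:

```python
from collections import defaultdict, deque

# Schemaless records: each node is just a dict; no fixed schema is imposed,
# so parts, sites, people, and orders can all live in the same store.
nodes = {
    "part:display-7": {"kind": "part", "status": "shortage"},
    "site:shenzhen":  {"kind": "site"},
    "person:buyer-a": {"kind": "person", "function": "procurement"},
    "order:po-1182":  {"kind": "order", "customer": "retailer-x"},
}

# Graphical interconnections between the records.
links = defaultdict(set)
def link(a, b):
    links[a].add(b)
    links[b].add(a)

link("part:display-7", "site:shenzhen")
link("part:display-7", "person:buyer-a")
link("part:display-7", "order:po-1182")

def impacted(start):
    """Walk the graph to find everything connected to a problem node."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in links[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen - {start}

# A shortage on the display part immediately surfaces the affected site,
# the responsible buyer, and the customer order at risk.
print(sorted(impacted("part:display-7")))
```

Because no schema is fixed up front, a new data source (a social media feed, a carrier update) can be added as just another node and linked into the same graph.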

How are these connections identified, established, and hard-coded into the visibility system? Dana notes: “We want to understand your end-to-end supply chain, and begin by literally mapping the entire supply chain from suppliers through distribution to customers. We want to know where your subassemblies come from, at as granular a level as we possibly can, because that is the level at which there is specific risk. We want to know how you are organized, where you have external dependencies (whether a location, a supplier, or something else), and how these elements are interconnected to your supply chain.” This mapping activity is something that should be happening anyhow, but companies often overlook this simple process-mapping tool as a vehicle for driving continuous improvement, as well as visibility.


We kicked off the 33rd Semi-Annual SCRC Partner Meeting today, with a room full of people from various segments of government, industry, and academia attending. We opened the session with a round of introductions and an overview of the SCRC’s accomplishments this past year. A summary of these is shown below. What is remarkable is the growth in our rankings given how young the Poole College of Management is (it was founded in 1994); it has grown in stature at an exponential rate. We are particularly proud of the #6 Bloomberg ranking of undergraduate supply chain programs, as well as the #15 ranking by Gartner and the #18 ranking of SCM programs by SCM World. This is a quick rise, given that the SCM program and the SCRC were kicked off only 15 years ago!

[Image: SCRC program rankings summary]

This was followed by a great set of insights from Dana Magliola, Lindsay Schilleman, and John Elliott, three MBAs who shared their insights on the impact of the supply chain industry in North Carolina. This project produced a great set of insights on the huge economic impact that the supply chain industries have, and on the need for infrastructure investments. The report was presented to the North Carolina Legislature, as well as the NC Chamber of Commerce. Several participants noted that the government of North Carolina has fallen behind relative to other states like Georgia, which held a state-wide logistics summit this week, attended by the Governor as well as 1,700 representatives from industry, government, and academia. We believe this is a call for the NC Legislature and government to sit up and take notice of the impact of the supply chain on the state of North Carolina.




I sat in on a great session at SIG today, where SIG’s CEO Dawn Tiura had a “fireside chat” with John Sculley. Sculley was the legendary CEO of Pepsi-Cola who was recruited to Apple by Steve Jobs. John served as Apple’s CEO for 10 years, increasing sales over 1000%. Under his leadership, the Macintosh became the largest-selling personal computer in the world in December 1992.

John recounted an interesting story that he recalls from those days.

“It’s 1978 and I’m CEO of Pepsi, and we have been wildly successful in our marketing campaign. The Pepsi Challenge has allowed us to pass Coke in sales and market share. I was speaking at Harvard University, and at the end of the class a student comes up to me and says, ‘I created something, knowing you were coming here, that I want you to look at, that I developed specifically for you.’”

“So we go over to the other building and I see, for the first time in my life, what looked like an Apple II personal computer. This was something this kid had put together, before Steve Jobs and Wozniak – and it had rows and columns on the screen.”

“‘What do you call it?’ I asked him. ‘I call it an interactive spreadsheet.’ The kid’s name was Dan Bricklin, and he joined with Bob Frankston to start the company behind VisiCalc… and what he had just shown me was the first spreadsheet in the world, which eventually became the foundation for Excel, one of the most widely applied tools in the world.”

“I was also there at the beginning of PowerPoint and HyperCard. I’ve watched small teams create tools that change the way we work – companies like People Ticker. We create tools that improve productivity and improve the workforce. Slack went from 0 to $4B in three years – because it is a great tool that improves productivity.”

“I’m a huge fan of tools for people. The most important development is that we need to equip our workforce with better and better tools. Humans still have judgment, and they can handle repetitive things and process things quickly. But give our talent out there the right tools, especially in contingent skilled labor, and they will double their productivity. Give the people who recruit and benchmark that talent better tools to use, and the organization will be that much more productive, now that it has radar.”

A question came from the audience:

“Fundamentally, procurement is measured on cost savings as the primary metric, but this can be destructive to the business. You set the requirements and select the solution that meets those requirements. Other metrics include compliance with the company’s purchasing agreements and customer satisfaction – whether internal stakeholders are satisfied. If we are going to be truly strategic and not just drive towards cost mitigation, what should we look for in terms of tools and sensory capabilities to help us evaluate more strategically what we are buying, to drive customer retention and top-line metrics? How do you see strategic sourcing leaders doing that?”

Sculley replied with a very insightful comment:

“Here is how I think about it. I believe all technology commoditizes. What is unique and valuable today will become affordable at different price points tomorrow. The way I think about the various points I brought up is that you have to judge how you are recruiting talent in the context of domain expertise – you can’t focus on costs in isolation from the domain expertise issue.  Strategically, when you are looking to staff a project, almost all work will be done by project teams, inside the organization and outside with contingent teams.”

“Let’s imagine we are back in 2007, and Kodak was focused on how to compete with Walmart’s single-use cameras, which were taking market share away from their camera business. They made a decision based on their expertise (their business was film cameras) to double down and spend billions on additional vertical integration in film processing, to compete better with Walmart on a cost basis. (And remember – Kodak were the ones who invented the digital camera!)”

“At the same time, Steve Jobs introduced the iPod and began to make the connections around what was happening in the market in digital components for consumer products. He also understood that another domain, wireless operators, was moving from 2G (text messaging) to 3G (email and photos). Apple understood that these other domains would impact consumer electronics, in terms of how to move a photo from software to another mobile device. In 2007, Apple launched the iPhone. A few years later, Kodak filed for bankruptcy.”

“Strategically, when you look at recruiting talent, you need to look at domain expertise beyond the domains you already have in your company, and at a wider scope of things. We are all vulnerable – we have seen that over the last 15 years. And at the same time, one can get into a new domain by procuring talent that may not already be in your organization.”

Sculley made an important point: innovation takes place on the fringes. We can draw circles around domains, and those circles are in motion – as they touch, they start to collide and change things. This means procurement can drive innovation and create new technologies by working on the edges where different technologies touch. Innovation in procurement has to be done in the context of different domains, which means developing talent in domains you don’t currently have and recruiting talent that is not on your full-time payroll.  In many cases, that means working with suppliers we haven’t worked with before – and with more contingent labor as well, as full-time payrolls shrink and more people work on their own.

I even got my picture taken with Sculley later that day….Wow!  I’ll be sure to be back for the next SIG Summit next year!


I had an opportunity to sit in on the second day of the Sourcing Industry Group meeting in Orlando today.  Ed Hansen from the law firm Morgan Lewis presented on the essentials of writing a good contract.  “Good” is a difficult word here, but it is important to think through how contracts operate in a “complex” world.  This was truly one of the best presentations I’ve seen on how to write effective contracts and make good deals that drive mutual benefit.

Ed began by pointing out that a successful contract has nothing to do with “never having to work it again.” A contract is not a weapon to battle the other side, and not just something legal should haggle over every time. Contract management is about getting the ROI you negotiated, and ensuring the process is driven by a common-sense approach.

An important caveat is to differentiate contracting for commodity vs. complex deals.

A commodity is something you can completely describe in a contract.  The specifications are precise and supplier-agnostic, and price and value generally have a linear relationship.

Complexity occurs when the parties are highly interdependent, and neither can be truly successful without the input, support, and cooperation of the other. Here the price/value relationship may not be linear.

It is much more difficult to write a contract for a complex deal.  Economic rents are often involved, and a deal can have different facets.  For example, a software license usually is not considered complex sourcing – but the systems integration work most certainly is.  People skills are critical in systems integration, as driving change is a non-linear activity that may not be easily quantified into a price.  Think about using a fixed-price deal on an ERP implementation, where there are so many unknowns and shifting requirements.  Talk about complexity!

Ed went through his list of “pet peeves” when it comes to ineffective contract terms and phrases. He emphasized how important it is to avoid nominalization, the passive voice, and a pompous tone.

For example, a simple statement of work might state that “The supplier will drink a glass of water,” or better yet, state a specific outcome (“I will do what is required to remain hydrated, including drinking water, Gatorade, or another beverage.”).  Examples of poor contract language include the following:

  • “After they agree on the timing, the supplier will drink a glass of water.”
  • “He shall partake in the drinking of water from a glass.”
  • “I will fill a glass with water and raise it to my lips. For the avoidance of doubt, I will not pour the water on my head, or use it to wash my hands.”
  • “I will grab a glass, fill the glass, raise it to my face, open my mouth, and pour the contents of the glass into my mouth.”

A supplier will prefer this last statement – but it could mean they are drinking bleach!  In fact, the supplier may not even know they are expected to stay hydrated!  The important point here is that clear contract language provides transparency. When you write contracts in clear language, you reduce stress and are more likely to get good results.

Simple language also makes the contract easier to read and understand.  If the lawyer is the only one who can understand a contract, it is poorly drafted; a lawyer doesn’t have better reading comprehension than a business professional. If reasonable people can differ, they will. Anything that requires “mutual agreement” means the other party has a veto – so try not to agree to agree. If substitutions are allowed, what are they? The operational element you are hiding will come back to bite you later.

It is also important to have a good limitation-of-liability clause, informed by market conditions. You need an emphasis on execution, not just terms, and you need to look at the overall investment – up-front costs, soft costs, and so on – to maximize return on investment. Limitation-of-liability clauses are invoked in less than 0.5% of all contractual deals… but that doesn’t mean you shouldn’t have one!   And if you don’t pay attention to your fee schedule, you are making a big mistake.

A good contracting process should avoid the RFP prisoner’s dilemma. The idea is that the other party shouldn’t be able to rip you off on cost, but if you make price the most important factor up front, you will never see the solutioning capabilities of the supplier. You may get a good cost, but if the project runs 200% over budget, where are the savings?   The worst thing you can do in a complex deal is eliminate the right supplier for the deal; trying to force a bad supplier into a good deal, or vice versa, is not a good idea, as you may not be able to enforce it.  Contracts are based on relationships, but relationships based on contracts tend to fail. Contracting for what you want and trying to build the relationship after you have signed is a recipe for failure.

Looking forward to more insights tomorrow!


When Coca-Cola set about driving sustainability into its supply chain, it recognized from the outset that the entire end-to-end life cycle of a bottle of Coke needed to be considered. Most people don’t think about this relationship, but the fact is that recycling and renewable feedstocks go hand in hand when thinking about an item as simple as a plastic bottle. Renewables are the feedstock that goes into the “PlantBottle,” and on the back end, when the consumer is through with it, the bottle can be recycled and ground up to create carpet, keeping the CO2 sequestered. The plant is thus used not only to create the original product, but also the secondary product that follows in the form of carpet.

Coke’s website[1] notes that PlantBottle™ is made with a combination of traditional materials and up to 30% plant-based material. Because the end product is still PET plastic, the PlantBottle™ package delivers the same performance (e.g., shelf life, recyclability, weight, chemical composition, appearance), but it reduces potential carbon dioxide emissions compared to PET bottles made entirely from fossil fuels. PET is made up of two components: MEG (mono-ethylene glycol), which makes up 30% of the PET by weight and is made from plants, and PTA (purified terephthalic acid), which makes up the other 70%. What is exciting now are the ongoing innovation projects by Coke and other companies to take the 70% of the bottle that is non-renewably sourced and move it towards a 100% biobased resin technology. This will involve using second-generation feedstocks, such as cellulosic sugars, and newly available technologies on the journey to a 100% plant bottle.
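To see how the quoted percentages combine, here is a back-of-the-envelope sketch. The 30/70 MEG/PTA weight split comes from the description above; the function name and structure are ours, purely for illustration:

```python
# Weight fractions of the two PET components, per the description above.
MEG_FRACTION = 0.30  # mono-ethylene glycol: currently made from plants
PTA_FRACTION = 0.70  # purified terephthalic acid: currently fossil-derived

def biobased_share(meg_biobased=1.0, pta_biobased=0.0):
    """Overall plant-based weight fraction of a PET bottle,
    given the biobased fraction of each component."""
    return MEG_FRACTION * meg_biobased + PTA_FRACTION * pta_biobased

# Today's PlantBottle: plant-derived MEG, fossil-derived PTA.
print(biobased_share())  # → 0.3, i.e. "up to 30% made from plants"

# The goal described below: a plant-derived PTA substitute as well,
# which would take the bottle to 100% plant-based.
print(round(biobased_share(pta_biobased=1.0), 2))  # → 1.0
```

The arithmetic is trivial, but it makes clear why the innovation effort targets PTA: that single component caps the bottle at 30% plant-based no matter how the MEG is sourced.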

This is moving slowly, but definitely in the right direction. One avenue being explored is the development of Furanics building blocks from plant-based sugars, under the name YXY. These Furanics building blocks are the basis of next-generation plant-based plastics and chemicals, and the company producing them (Avantium) has focused its efforts on using the YXY catalytic process to convert sugars into FDCA, a biobased alternative to terephthalic acid (TA). FDCA can be used to produce the polyester polyethylene furanoate (PEF), a 100% biobased material that could replace PET in large markets such as bottles, fiber, and film. Coca-Cola is working with Avantium, Danone, Gevo, and Virent to support the scale-up of Avantium’s plant-based PEF.   Virent’s chemistry allows the remaining 70% of the bottle to be plant-based.

Coke has launched the plant bottle in more than 40 countries and has shipped over 40 billion bottles. This is a large and critical mass of PET, used across a number of other leading brands such as Simply, Minute Maid, Gold Peak tea, Dasani, and Smart Water. The program is continuing to grow despite the drop in crude oil prices. Bio-based PET was predominantly used for the packaging of carbonated soft drinks (CSD), accounting for more than 75% of market share in 2013. Growing beverage consumption in the emerging BRICS markets is expected to drive bio-based PET market growth, and CSD marketers such as Coca-Cola are committed to promoting bio-based PET in packaging, which is expected to have a major impact on market growth in the near future.[2] The feedstock for the PlantBottle is sugar cane from Brazil, which is moving towards cost parity with crude oil-derived PET. The small premium for biobased material is absorbed by the system, but Coke sees a pathway for the renewable plant bottle to emerge as the dominant package in the long run, especially as oil prices are expected to rise over time.

Three driving forces were behind the championing of the decision to move towards a 100% plant bottle.

1.  Sustainability platform and carbon capture. Consumer response to the plant bottle was overwhelmingly positive; what misunderstandings there were concerned the technology itself, and the overall reaction has been strongly positive.

2.  Cost and line of sight around competitive elements. This came about as the cane-sugar feedstocks in Brazil proved to be cost-competitive. Coke also needed to prove that the cane was being farmed on arable land without creating competition with other crops for land or water, and that by-products derived from extracting the sugar were used. GMO-grown feedstock was also not a factor in this case.

3.  Top-line growth and brand differentiation. The PlantBottle has become a core differentiating element of the Coke brand, especially in light of growing consumer awareness of sustainability. Coke works with the WWF and other consortia – including competing brands such as Nestle, Danone, Unilever, Ford, P&G, and others – to set the guidelines and industry standards that prevent others from jumping in with “green washing” claims and driving confusion over the issue.

A Coca-Cola representative notes: “We are the largest biobased PET buyer, and we see ourselves as a catalyst for the industry to move towards renewable material, and we are working with our partners to make it happen. We are working hard to enable other companies to come into the space and benefit from the PET supply chain that we are creating, and allowing access to technology.”

It should be noted that the Renewable Fuel Standard provides significant benefits to biofuels, but the polymer market enjoys no such benefits. It is easier to get ethanol into biofuels than to take ethanol and make plastic out of it: ethanol goes into fuels, but the benefit does not translate into plastic. Fuel companies also tend to have limited partnerships that provide tax benefits that buyers of renewable plastics do not have. Hopefully, this will change!

[1] [2]
