I had the opportunity to attend a number of exciting events on procurement analytics in the last month. First, I attended a CAPS Research event on procurement analytics in RTP, held right down the road at IBM headquarters. Next, I attended a second CAPS Research roundtable held in Tempe, AZ. Finally, I attended an IBM event called Empower2016 in Orlando, which featured a number of speakers and analysts discussing the emerging technologies being developed around Watson cognitive analytics. This last event was rudely cut short by the arrival of Hurricane Matthew, but I was nevertheless able to stick around long enough to gain some great insights into the capabilities of IBM Procurement, which combines the power of Emptoris with the emerging capabilities of Watson. I also sat in on numerous user group meetings where companies shared how they are using Emptoris capabilities to improve their contract management outcomes. For instance, one presenter discussed how she was using Emptoris decision tree features to guide users to the right contractual template. All the user had to do was answer a series of questions, and the system would guide them to the template that had been developed and recommended by legal. At this meeting, I also had the chance to speak with a number of analysts from Gartner and other software analyst firms to better understand their views of what is emerging.

What is emerging is certainly impressive. I sat in on several demos and saw capabilities emerging in a number of IBM systems for procurement, briefly described here.


Spend analysis.  Real-time spend analysis combined with visualization techniques and structured queries on contracts can provide another level of value, and is currently being developed by software providers such as Coupa and Emptoris. In a second example, Emptoris is able to provide visualization of data analytics by category, geography, etc., to permit deeper drill-down understanding of spending across a category and across the business. In a final demo, I saw how Watson is able to query the current spend analysis and quickly return a number of parameters of interest. For example, in response to the query “What is our spend with Dell?” a dashboard is produced that allows “drill-down” capabilities to better understand the opportunities for combining contractual requirements.
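The core of a spend query like “What is our spend with Dell?” is a filter-and-aggregate over transaction data. Here is a minimal sketch of that drill-down in pandas; the transaction table, supplier names, and amounts are invented for illustration and are not IBM's or Coupa's actual data model.

```python
# A minimal sketch of spend "drill-down" over a hypothetical
# transaction table (supplier, category, region, amount).
import pandas as pd

spend = pd.DataFrame({
    "supplier": ["Dell", "Dell", "HP", "Dell"],
    "category": ["Laptops", "Servers", "Laptops", "Servers"],
    "region":   ["NA", "EU", "NA", "NA"],
    "amount":   [120_000, 340_000, 95_000, 210_000],
})

# "What is our spend with Dell?" -- total first, then broken down
# by category for the drill-down view.
dell = spend[spend["supplier"] == "Dell"]
total = dell["amount"].sum()
by_category = dell.groupby("category")["amount"].sum()

print(f"Total Dell spend: ${total:,}")
print(by_category)
```

A real dashboard adds further grouping levels (geography, business unit, contract), but each drill-down click is essentially another `groupby` on the filtered frame.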

Sourcing and market analysis (Watson Buying Advisor). This approach combines IBM catalog information with natural language classifiers, speech-to-text technology, and mobile-enabled smart digital technology to create a buying assistant. The assistant allows users to input their requirements (through camera images, text descriptions, or other approaches), and the system provides a list of preferred products and suppliers. The disruption is that the technology interacts with the user through natural language, mobile picture devices, unstructured text, and digital imagery, combined with clarifying questions, to understand the need and channel it to the right sources. The system can suggest products and services based on the description or picture input by the user, and narrow the search using a decision-tree-like set of questions. In many cases, the user is guided to the right product from an approved supplier at a pre-negotiated price.
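The decision-tree narrowing described above can be sketched very simply: each answer routes the requester down a branch until a preferred product is reached. This is a toy illustration, not IBM's implementation; the questions, answers, and products are invented.

```python
# A toy decision tree for guided buying: dict nodes hold a question
# and branches; string leaves are the recommended approved product.
tree = {
    "question": "Is this for office or field use?",
    "office": {
        "question": "Laptop or desktop?",
        "laptop": "ThinkPad T-series (approved supplier A, contract price)",
        "desktop": "OptiPlex tower (approved supplier B, contract price)",
    },
    "field": "Rugged tablet (approved supplier C, contract price)",
}

def guide(node, answers):
    """Walk the tree using a list of answers; return the recommendation."""
    for answer in answers:
        if isinstance(node, str):   # already reached a recommendation
            break
        node = node[answer]
    return node

print(guide(tree, ["office", "laptop"]))
```

The cognitive layer's job is to map free-form input (a photo, a sentence) onto these branch answers; once that mapping is made, the routing itself is this simple.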

Contract management.  IBM Watson has also invested in programs to cognitively manage contracts across the supply base (Blue Hound). Cognitive contract applications seek to accelerate contract analysis by identifying clauses that link to changing market conditions and specific supplier conditions, and how these relate to the contractual agreements in place that govern such changes. There is also a need to better compare and contrast terms and conditions across multiple contracts within a spending category, and to determine alignment across both buy-side and sell-side agreements. Contracts for an entire organization often span millions of pages, and these are rarely read and reviewed. Cognitive computing provides the capability to rapidly scan contracts using specific queries and keywords to help understand exposure, limitations, best practices, and other insights. This is made even more complicated during a merger or acquisition, when another entity’s contracts are absorbed and must be rapidly integrated into the current supply base. Currently, extracting insights from statements of work is a very painful process, given the high volume of unstructured data. Cognitive technology holds the promise of being able to parse key terms in a contract and be trained to become smarter, building a corpus of knowledge around what represents best-in-class contractual terms and conditions. Down the road, this could lead to an engine that constructs a contract for specific supply situations, or an expert system that drives the right activity.
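As a baseline for what "scanning contracts with queries and keywords" means, here is a deliberately simple sketch that flags which contracts contain clauses of interest. The contract texts and query patterns are invented; real cognitive tools go far beyond substring matching (clause classification, entity extraction, training on a corpus).

```python
# Flag which contracts match which clause queries, using plain
# regular expressions as a stand-in for cognitive clause detection.
import re

contracts = {
    "supplier_a.txt": "Liability is capped at fees paid. Price adjusts with the PPI index.",
    "supplier_b.txt": "Either party may terminate for convenience with 30 days notice.",
}

queries = {
    "liability_cap": r"\bliability\b.*\bcapped\b",
    "price_indexation": r"\bprice\b.*\bindex\b",
    "termination": r"\bterminate\b",
}

hits = {
    name: [q for q, pattern in queries.items()
           if re.search(pattern, text, re.IGNORECASE)]
    for name, text in contracts.items()
}
print(hits)
```

Even this crude pass answers questions like "which agreements in this category have a liability cap?" across a whole portfolio; the cognitive value-add is recognizing clauses that don't use the expected keywords.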

Market intelligence.  An emerging technology at IBM aims at increased understanding of price movements in the market (Pricing IQ). These technologies will provide market intelligence advice, and are envisioned to be deployed across technical services and other categories of spending. The opportunity is to correlate pricing with events in the market, including election results, interest rates, natural disasters, and other issues that impact supply and demand. Technologies that correlate pricing to such macro events can help drive predictive pricing for 3, 6, or 12 months into the future. Understanding price predictions can in turn help drive contracting and hedging strategies. This technology is also envisioned to collect and refine real-time market research events and update contracts with pricing clauses in real time. The technology may also eventually establish how employment statistics from government websites will impact labor availability, and in turn labor pay rates. For instance, the trends may dictate the need to pay above-market rates in some locations to avoid high labor turnover, and flag other areas where pay rates are well in excess of reasonable market rates.
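The first step in correlating prices with market events is simply measuring how closely a category price series tracks a macro indicator. This hedged sketch does that with a Pearson correlation; both series are invented, and a real predictive-pricing model would use many indicators and proper time-series methods rather than a single correlation coefficient.

```python
# Measure how closely a category price series tracks a macro indicator,
# as a first step toward predictive pricing.
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

price = [100, 102, 105, 103, 108, 112]   # monthly category price (invented)
indicator = [50, 51, 53, 52, 55, 57]     # e.g., a commodity or labor index (invented)

r = pearson(price, indicator)
print(f"correlation: {r:.2f}")
```

A strong correlation with a leading indicator is what makes 3-, 6-, or 12-month price forecasts (and the hedging strategies built on them) plausible in the first place.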

These opportunities for building greater insight into spending patterns using cognitive technologies are emerging rapidly, and should be available in the next 2 to 4 years.


This week’s blog is a guest blog from a PhD student, MD Rejaul Hasan. Rejaul is from Bangladesh, is working on his PhD in the College of Textiles at NC State University, and is passionate about sustainable apparel from his home country. This week he provided a great set of summaries from the Harvard conference on “Sustainable Models for the Apparel Industry,” held on September 24 at Harvard Law School. The issues provide a great backdrop for the analysis of a “should cost” model to truly understand what a fair price for apparel is in a sustainable supply chain.

What is the fair price of an apparel product responsibly sourced from Bangladesh, and what price do apparel suppliers need to ensure a safe and sustainable workplace and fair wages for the workers in Bangladesh? Are big-name brands really paying a fair price to suppliers in Bangladesh? What are the major factors inhibiting the Bangladesh apparel industry from being safe and sustainable? Are the major factory accidents in Bangladesh linked to the low prices paid by the brands to Bangladeshi suppliers? Who is making the bulk of the profit in the apparel supply chain? These are some of the many discussions on building a “Sustainable Model for the Apparel Industry” that occurred at the Bangladesh Development Conference 2016 on September 24 at Harvard Law School.

All the major apparel industry stakeholders across the world participated in the conference, including representatives from U.S. retailers, the U.S. State Department, the EU, the Bangladesh government, the Bangladesh garments business association, Bangladesh suppliers, the US Embassy in Bangladesh, the World Bank, the Netherlands Ministry of Foreign Affairs, the International Labor Organization, the Solidarity Center, the Worker Rights Consortium, Cotton Incorporated, Better Buying, South Asia Water Advisory, the Ethical Trading Initiative, Harvard Business School, and Penn State University.

Bangladesh is the second largest exporter of apparel and textiles after China, with $28 billion in exports last year. The industry directly employs 4 million workers, 80% of them women, and has played a vital role in the country’s economic development and women’s empowerment since the early 80s. Though the “Rana Plaza” and “Tazreen Fashions” incidents of 2012–13 attracted the world’s negative attention and showed how vulnerable Bangladesh still is in terms of workplace safety and labor rights, it remains the most desired sourcing destination for major apparel brands and retailers in the US and EU.

During the conference, Prof. Ross from Clark University mentioned the recent “Tampaco” factory fire in Bangladesh, which caused the death of more than 30 workers. He supports the US government decision not to assign duty-free access under the GSP facility for apparel exported to the US. He argued that until Bangladesh ensures major improvements in factory safety, this GSP facility should not be assigned, and that the EU should also withhold its duty-free access. Regarding the GSP facility, Evan Fox, the Political Officer at the US embassy in Bangladesh, explained how they are working with Bangladesh industry to build and execute a 16-point action plan to support the industry in getting the GSP facility assigned as soon as possible.

Regarding factory safety, suppliers and business association leaders from Bangladesh claimed that every year the price of the product goes down, even as brands and retailers demand more advanced product design, higher quality, and broader sustainability requirements. This makes it a major challenge for suppliers to improve factory safety as fast as expected. In support of this, Prof. Mark Anner from Penn State University showed how US apparel import prices have fallen over the last 20 years, and that the correlation between import price and the labor rights situation in exporting countries is very high. Suppliers from Bangladesh also mentioned how the Accord and the Alliance are inspecting a huge number of factories and mandating corrective action plans, which, combined with rising labor wages and more recent environmental initiatives by the government and suppliers in Bangladesh, adds further cost.

Liana Foxvog from the ILO shared a study showing that harassment of female workers by co-workers or supervisors is prevalent, and that the lack of effective monitoring systems by factories and labor unions in Bangladesh is problematic. From 2010 to 2016, the Bangladesh Joint Directorate of Labor (JDL) approved only 362 of 783 labor union registration applications. Tim Ryan, Asia Regional Program Director at the Solidarity Center, reinforced the necessity of labor unions in preserving labor rights.

Representatives from retailers mentioned the increasing competition in retail and how the industry is really struggling to survive. Consumer expectations of having the most fashionable apparel at the lowest possible price, consumers’ lack of interest in paying higher prices for sustainable products, challenges from competitors, and increases in production costs in Asia are putting the whole apparel retail industry in survival mode. Many brands and retailers have already modified their products to adjust to price competition but are struggling to stay competitive. However, an August 8, 2016 report in the Sourcing Journal, “Cheap Clothing Created Some of the World’s Richest People,” described how owners of major apparel brands and retailers became billionaires through the apparel business. This remains a matter of heated discussion, and arguments arise around who is making most of the money in the apparel supply chain: the consumers, the brands and retailers, or the suppliers.

The ultimate question that arises is: what is the standard or minimum price of any product sustainably sourced from Bangladesh? Suppliers from Bangladesh also emphasize that having such standards would provide an industrywide benchmark. This could help evaluate the claim that brands and retailers are really paying suppliers less than is necessary to ensure a safe work environment, fair wages, and overall sustainability of the supply chain. At the same time, it is important to examine whether factory fires or compliance violations in Bangladesh are really linked to price issues. Is it really that expensive to ensure a safe workplace? For instance, the Rana Plaza, Tazreen Fashions, and recent Tampaco factory disasters seem less linked to low prices than to compliance with regulations, monitoring, a sustainability culture, and of course the ethical sourcing practices of top brands and retailers.

A big question on everyone’s mind at the conference was whether, if the price goes up, Bangladesh will survive and be able to compete against strong competitors such as India, Pakistan, Sri Lanka, Vietnam, and Cambodia. Yevgeniya Savchenko, an economist from the World Bank, mentioned how apparel manufacturing is moving from China to South Asia because of rising prices in China, and the question is who is best able to seize market share. Clearly Bangladesh is ahead of its competitors in terms of apparel export growth (8.11% last year, the highest in the world). But it will be key for Bangladeshi manufacturers to improve every stage of manufacturing and the supply chain, not only to reduce cost but also to build a sustainable industry that will grow over the long term.


Stuart William was in one of my former MBA classes at NC State in 2008, and graduated into one of the worst economies ever in May of 2009. Upon graduation, there simply weren’t any jobs available at all! During that period, he networked with as many people as possible, including a fund-raising arm at Wake Tech. He met a colleague who worked at Carquest, and after several interviews, started there. He began in supply planning, overseeing over $100M of spend in batteries and other categories. He then moved into global imports for the central purchasing group in Raleigh. He became a director at that point, working with sales planning, inventory planning, and financial planning, pulling together the Sales and Operations Planning team, as well as introducing new products and eliminating obsolescence. This was a lot of planning, a lot of analytics, and a lot of work. At that time Carquest went up for sale, and this led to a position at Marsh Furniture.

Marsh did not have a centralized approach to supply chain management. One of their HR people reached out to me (as they already had a relationship with our Forest Products College), and I put them in touch with Stuart. This was a new era for Marsh, as it involved moving towards a much more centralized approach to SCM.

Marsh is a 110-year-old company, now in its third generation of leadership. They are a regional make-to-order kitchen cabinet manufacturer with two primary sales channels for their modular kitchen and bathroom cabinets: five MFC retail outlets and a dealer network. Their primary customers are single-family and multi-family home builders.

During the boom times Marsh had sales in the triple-digit millions. But because of their less mature supply chain organization, net income as a percentage of sales was in the low single digits. A lot of companies in High Point no longer exist because of the housing crash. In 2009 Marsh lost two-thirds of its sales – and lost money for four years in a row. The company had invested and saved well – they owned their factories and equipment – and had a rainy day fund to make it through. They went from 1,000 employees down to fewer than 300 – a lot of people lost their jobs, but the company was able to survive.

In 2013 housing began to come back, and Marsh survived. By 2016 revenue had climbed back up to almost its previous high point. Survival also drove new product offerings, including more styles, more options, and greater value. The company grew at an enormous rate of 30–35%, through overtime and labor increases. There was also a transformation in operational flow and how the factory operated. In 2013 Marsh had $8M of raw material inventory – due to poor purchasing, poor safety stock decisions, and a lot of WIP in the system. Today they are at $5.5M in inventory, on well over double the sales volume.

Kitchen cabinets are simple – but there are 5 species, 20 colors, 6 glazes, 30 door sizes, and 3 overlay options – which leads to 2,500+ cabinet configurations, as well as 7,000+ accessories (panels, moldings, etc.). When you throw in multiple hardware options and multiple case construction options, you have the opportunity for a lot of excess inventory to build up in finished goods. In the last year, Stuart has disposed of and destroyed all that inventory and taken it off the books. “We would never move that material!”

Stuart discussed the need for a supply chain Swiss Army knife skill set. On any given day, he has to pull out one of many tools and skills to deal with the problem du jour. Relationship management, data modeling, problem solving, sourcing analysis, negotiation, data-driven decision-making, metrics development, forecasting, process flow, BOM understanding, MRP/production scheduling, Pareto analysis, and information presentations are all key skills that he applies in his CPO role. In short, he has to be able to “figure stuff out” using whatever tools he has in his Swiss Army knife. In supply chain, you can focus on procurement, sourcing, inventory control, logistics, and the other offshoots that occur on a daily basis. The basic approach is to first try to understand what went wrong, and how to fix it. This means having a broad understanding of a number of different supply chain tools and concepts and being able to pull them together to apply to the problem at hand.

Stuart described a great example: “We had a supplier in Myrtle Beach that got hit by the hurricane last week – and we receive material from them every day! But we didn’t have more than 2 days of inventory on hand, as we receive daily deliveries by truck. They had both their phone line and internet knocked out. So how am I going to place an order? We have a truck that runs every day – can we send a thumb drive so they can scan what they are sending us? So we figured out how to go to a server – and coordinate with a local lumber mill nearby that did have internet access – to figure out what we need in an Excel format, so it could be printed out in a format they could use. And we had to coordinate with the general manager at the mill that has connectivity, who could get it to them! How is the information going to flow – and how much do I need – and how much inventory do I have? We have a secondary source, but they can’t turn an order in less than 5 days – and we will run out before then! So that is what I did, and I recrafted the plan for tomorrow morning. The GM will run over at lunch time and get them the information so they will know what we need, load their information on the thumb drive, and get it back to us with the shipment!”

This is just one of many great examples that illustrate the multiple skill sets supply chain managers need. You have to be able to justify your assumptions, and defend them appropriately!

Stuart also discussed the importance of using scorecards with their suppliers to establish performance expectations. This dovetailed nicely with the MBA class assignment that involves a supplier scorecard assessment – and his description of scorecards as an essential element of long-term relationships rang true with the approach students are currently working on in their take-home midterm. Thanks Stuart!!


As a young assistant professor at Michigan State University in 1992, I was part of a group called the Global Procurement Benchmarking Initiative. This initiative benchmarked over 300 global companies, and set forth many of the principles of what we called “World Class Supply Chains”. Many of these principles became the foundation for consulting practices at Accenture, Deloitte, Booz Allen, and others. The principles were certainly appropriate, and were considered ground-breaking at the time.

Many of these principles were based on the theme that supply chain management could move beyond being a “buyer and shipper of stuff” to become a centralized function that combined spending across both direct and indirect categories, leveraged this volume through purchasing power, and sought to achieve significant cost improvements and efficiencies. An automotive executive recalls how difficult this was given the technology limitations that existed in the early 1970s.

To drive centralized buying I had to dive down into the Bill of Materials and do a cross-tab. Fortunately, we had a decent commodity coding system for each part number, with a prefix that described the car the part was attached to, the function of the item, and a suffix that described engineering levels, color, and material to a certain degree. This was the first intelligent database system at the time that allowed me to look at a “deck” of the current buy for a commodity group for our production buy. I started by looking at no more than 10 commodity groups, and looked at the production buy across these commodity groups. With the data organized this way, I could see how to consolidate the number of suppliers by commodity and the value of the buy for the group, using data that no one had ever seen before! This was really exciting! Then I created a matrix of around 35 existing buyers, pulled all the heavy truck stuff out of it, and ran summaries of the data. I could then start to see how many commodities I was dealing with, and was able to reload the commodities to a smaller group of buyers with broader responsibilities that might cover more than one commodity.
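The cross-tab the executive describes by hand is a one-line roll-up in modern tools: group the production buy by commodity code and count suppliers and spend. This sketch uses pandas with invented part data, purely to illustrate the shape of the analysis he was doing on decks of cards.

```python
# Roll up the production buy by commodity group to see supplier
# counts and buy value -- the modern equivalent of the cross-tab.
import pandas as pd

buy = pd.DataFrame({
    "commodity":  ["fasteners", "fasteners", "stampings", "stampings", "stampings"],
    "supplier":   ["S1", "S2", "S3", "S3", "S4"],
    "annual_buy": [1.2, 0.8, 5.0, 3.5, 2.0],   # $M, invented figures
})

summary = buy.groupby("commodity").agg(
    suppliers=("supplier", "nunique"),   # how many suppliers per commodity
    buy_value=("annual_buy", "sum"),     # value of the buy for the group
)
print(summary)
```

The output immediately shows where supplier consolidation is possible: any commodity with several suppliers and a large buy value is a leverage candidate.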

Over time, executives also began to understand and realize the critical role that suppliers and distributors played in driving revenue and controlling costs. They began to establish systems for measuring supply chain participants’ performance, improving that performance through development activities when suppliers could not do so on their own, and acknowledging that not all relationships with these third parties were the same – some needed more attention because they were more strategic than others. Over time, the terms “strategic sourcing” and “logistics integration” were coined, which largely involved combining volumes of requirements from across the business, grouping them into large bids, and driving down costs through larger quantity discounts. This also led to the use of “reverse auctions”, where suppliers would bid on these quantities online. In logistics, the focus turned to centralization of distribution centers and warehouses to optimize transportation routing and reduce inventory carrying and handling costs across the system.

Many of the traditional concepts that evolved from this “driving down costs” perspective focused on driving efficient operations in the supply chain from supplier through to end customer. Many of these principles also coincided with the introduction of “lean manufacturing”, based on the “just-in-time” thinking pioneered by the likes of Toyota. For example, the “Theory of Constraints”[1] emphasized that to optimize the end-to-end system, the “bottleneck” operation had to be addressed by adding capacity at that operation. The concepts of “just-in-time” and “lean manufacturing” focused on standardization of products, improving coordination between different enterprises to reduce inventory, and delivering only the exact amount needed, in small quantities that could be immediately consumed by the follow-on operation.

Another group at MSU under the leadership of Dr. Donald Bowersox led the concept of the “Logistics Renaissance”, proclaiming that logistics was a value-added function that could drive market penetration through technology adoption. All of the work going on during this period highlighted many important issues, that were encapsulated in a “maturity model” that identified how organizations could develop these capabilities over time towards a truly “world class supply chain” organization.

However, “world class” still emphasized distinctions between the functional areas of purchasing, operations, and logistics, which were still viewed as disparate functions. Arguments broke out internally and in academic debates over which area had dominance over the others. The three groups involved in these activities (purchasing, operations, and logistics) were lumped together as “supply chain” functions, but never really stopped working independently from one another. Professional disputes emerged among the logistics, operations, and purchasing trade associations over who was really in control of the supply chain; purchasing felt it was calling the shots, while logistics professionals claimed oversight over all movement of material in the chain! All the while, each claimed to be driving “world class procurement” or “world class logistics” practices, implying that these practices are the best of the best. “Technology integration” was supposed to bring all these groups together, but in fact there still exist lingering tensions, discontinuities, and waste in the end-to-end supply chains of many organizations. Sure, they could buy things more efficiently, and ship things more efficiently, but were they really linked? Hardly.

One executive I interviewed emphasized this very succinctly:

We get hung up on World Class too much. World Class is simply a set of tools on a tool belt – but the real wave of change is on understanding the business well enough to apply the tools that will drive a total cost model, that spans the end to end value stream. Leveraging and strategic sourcing has gotten in the way of that. And a centralized world-class solution is not always appropriate in every operation globally, because a single model may not work for every small, medium, and large operation. 

World Class is focused on ticking the box around completion of the tools. We are too focused on getting an answer, rather than focusing on an outcome. We want to create nice two-by-twos with a label for a supplier, rather than generating and delivering a coherent strategy. I see supply chain practitioners using the tools incorrectly, when they should instead be spending more time understanding what a performance specification looks like before going to the supply chain. There needs to be much more focus on the pre-award phase of the business, and on the competence of the people doing this. And the total cost concept goes well beyond that.

In the end, there are some real problems with the “World Class” view of the supply chain. Although transactional excellence and efficiency certainly form the basis for excellence, there is a shift away from the idea that “World Class” is something that applies to every situation. And so we need to approach the problem with a different toolbelt, and be ready to use a number of different tools depending on the business drivers and geographic components in play.

[1] E. Goldratt, The Goal, 2nd ed. (Great Barrington, MA: North River Press, 1992).


The number of workers who are “contingent” is growing rapidly. Brian Hoffmeyer from IQ Navigator, a procure-to-pay solution provider, spoke today at IBM’s Empower event on this trend and its implications for procurement. I had the opportunity to sit in on the presentation and capture some takeaways…

Several factors are driving the move towards contingent labor, which spans everything from temporary warehouse workers to consultants and project workers. Baby boomers are retiring, but for a variety of reasons they want to stay connected and continue working; many retirees want a “hobby” – and to still stay connected with their work! Millennials are a huge part of the workforce – 30% – and they want to expand their skill sets and maintain a strong work-life balance; research shows that they move around a lot, working on average only 16 months or so at a location. The third factor is a shortage of college-educated workers: a projected shortfall of 30–40 million workers, which two-thirds of companies believe will impact their bottom line.

Companies themselves are also looking for flexibility and agility that the flexible workforce gives them.  This allows them to deal with surges in demand, especially in retail and other industries where a temporary workforce is needed.

There is pressure on both sides of the equation – supply and demand. But 60% of the contingent workforce is not well managed, and it is unaccounted for in financial planning, budgeting, and forecasting. Companies that Brian talks to can’t answer basic questions such as how many contingent workers they have and who is accessing their systems, and can’t answer strategic questions such as whether they are paying the right amount for a project manager, or how their workers are performing. There have been data breaches, contract workers who set fires at airports, and misclassified workers resulting in fines. The worst example was the NSA hiring Snowden despite discrepancies and some clear red flags on his resume. He was a contractor, and the NSA didn’t follow contract procedures.

IQN Compass is a vendor management system – a SaaS solution that integrates with corporate systems and is easy to implement and operate. It helps manage the procure-to-pay process and workflows for contingent labor. It creates a consolidated invoice based on each worker’s assignment, handles off-boarding so workers can no longer badge into the building, and provides reporting and analytics to monitor and manage workers.

Brian mentioned several case studies. A leading multinational bank wanted greater control over its non-employee workforce. It had no visibility into billing rates, no hiring manager access, no third-party invoicing, and no visibility into managed services resources. With the solution, the bank was able to reduce back-office headcount, save over 1,000 person-hours through process automation, and improve billing accuracy.

Contingent labor management will continue to be an issue that will impact the procurement landscape going forward.



I’m writing to share an exciting new report released today entitled “An Economic Impact Analysis of the U.S. Biobased Products Industry: 2016 Update.”

This new report shows how the U.S. biobased industry continues to grow and generate substantial economic activity and American jobs.

A full copy of the report can be downloaded here.

Key takeaways from the 2016 report include:

  • The biobased products industry contributed $393 billion to the U.S. economy, up $24 billion from the most recent year measured.
  • The biobased products industry supports a total of 4.22 million American jobs (direct and spillover jobs combined).
  • For every biobased product industry employee in the U.S., nearly two other jobs are supported in other sectors of the economy.
  • The production and use of biobased products in place of petroleum-based products has the potential to reduce GHG emissions by up to 10 million metric tons of CO2 equivalents.

Agriculture Secretary Tom Vilsack, the only original member of President Obama’s cabinet still serving in the administration, spoke at a National Press Club luncheon today at 12:30, as the report was being released. Vilsack, a former governor of Iowa, briefly ran in the 2008 presidential race and was considered by Hillary Clinton as a potential running mate in 2016. Vilsack also leads the President’s Rural Task Force, coordinating White House efforts to fight heroin use in rural communities. As head of the U.S. Department of Agriculture, he is working toward opening Cuba to American agricultural exports and aiding economic development in small towns.

Agriculture Secretary Tom Vilsack announced the release of a report completed by the SCRC and Duke’s Sustainability Center that shows the U.S. biobased industry is generating substantial economic activity and American jobs, and supporting the Obama Administration’s efforts to improve the rural economy, promote creation of sustainable jobs, and combat environmental threats like climate change.

“This report exemplifies the effect of the growing U.S. biobased products industry from an economics and jobs perspective. In 2014, America’s biobased industry contributed a total of 4.22 million jobs and $393 billion to our economy,” Vilsack said. “Better economic opportunities, like those offered by the biobased product manufacturers, are creating wealth in rural America. The rural unemployment rate has dropped below 6 percent for the first time since 2007, and from 2012-2014, we saw rural child poverty fall by 3 percentage points.”

According to An Economic Impact Analysis of the U.S. Biobased Products Industry, the contribution of the industry is growing. It directly supported 1.53 million jobs in 2014, with each job in the industry responsible for generating 1.76 jobs in other sectors. Between 2013 and 2014, the industry’s total employment contributions increased by 220,000 jobs to support a total of 4.22 million biobased-related jobs directly and indirectly throughout the U.S. economy.
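The jobs arithmetic behind these figures can be sanity-checked in a few lines (the numbers are taken from the passage above; the report’s own rounding may differ slightly):

```python
# Rough check of the report's jobs multiplier arithmetic
# (figures from the report summary; rounding is approximate).
direct_jobs = 1.53e6   # jobs directly in the biobased industry (2014)
multiplier = 1.76      # spillover jobs generated per direct job
spillover_jobs = direct_jobs * multiplier
total_jobs = direct_jobs + spillover_jobs
print(f"Spillover jobs: {spillover_jobs / 1e6:.2f} million")
print(f"Total jobs:     {total_jobs / 1e6:.2f} million")  # ~4.22 million
```

The 1.53 million direct jobs plus roughly 2.69 million spillover jobs recover the 4.22 million total cited in the report.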

When compared to the 2013 results (presented in the previous study), the direct value added contribution of the biobased products industry grew by 0.2 percent. Year-to-year percent changes in direct value added were measured using the Producer Price Index for all commodities to account for inflation. The direct jobs contribution to the U.S. economy from the biobased products industry grew 0.5 percent from 2013 to 2014. Figure 6 shows the growth in total jobs and total value added of the biobased products industry from 2013 to 2014.

The biobased products industry experienced steady growth from 2013 to 2014. The growth in the direct value added was smaller than the growth in the total value added. This contributions-based total value added growth is predicated on the strengthening of inter-industry linkages between the biobased products industry and other parts of the U.S. economy. The steady growth of the biobased products industry is particularly impressive given that the price of oil dropped to roughly half its January 2014 price by December 2014.

One would expect that as the price of oil decreased and petroleum-based products became relatively cheaper, the biobased products industry would see a decrease in demand for the products that compete with petroleum-based products. The growth in the biobased products industry suggests that the industry is robust and diverse enough to grow even in the face of a sharp decrease in oil prices. It is likely that the biobased products industry will experience even greater growth when the cycle of low oil prices turns around.

Interviews conducted for this report indicate that pricing pressure from petroleum-based products resulted in challenges to profitability, but, in spite of that, revenue and jobs increased and the biobased products industry expanded.

It is apparent that growth is occurring increasingly in specialty sectors (as exemplified by several of the case studies in the report). The methodology used to create this report involved scaling the outputs from IMPLAN using estimates of the biobased portion of each sector from the 2015 study.
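That scaling step can be illustrated with a toy sketch (the sector names, outputs, and biobased shares below are entirely hypothetical; the study’s actual shares come from the 2015 analysis):

```python
# Toy illustration of scaling input-output model results by an estimated
# biobased share per sector. All names and numbers are hypothetical.
implan_output = {      # total sector output from the I/O model, in $B
    "plastics": 120.0,
    "textiles": 80.0,
}
biobased_share = {     # estimated biobased fraction of each sector
    "plastics": 0.25,
    "textiles": 0.50,
}
biobased_output = {
    sector: implan_output[sector] * biobased_share[sector]
    for sector in implan_output
}
print(biobased_output)  # {'plastics': 30.0, 'textiles': 40.0}
```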


NC State recently hosted Mark Costa, CEO of Eastman Chemical, who spoke in the Wells Fargo series, and I had an opportunity to ask Mr. Costa about the biobased sector.  Specifically, I asked him about the fact that Eastman had to cease production of biobutanol this past year. He was open and candid in stating that specialty chemicals has been a real grind – a tough year. The company has had to cut some people, as demand hasn’t materialized and pricing pressure is constant. However, he sees a rosy future ahead – just as the report does as well…


I had the opportunity to hear George Moakley, formerly the Enterprise Architect and Strategic Planner at Intel, who is now one of the founders of the new Intelligence at the Edge for Supply Chain Lab at Arizona State University.  George spoke to a group of executives at the CAPS Roundtable at IBM in RTP a week ago, and shared some of his ideas about what lies in store for the future of the IoT.

George emphasized that “A different conceptual framework is required. Technology is both evolutionary and revolutionary.  Evolutionary technology is about doing things better; revolutionary technology is about doing better things. Doing things better is where most of the technology changes will occur, and this will come in the form of novel services which enable novel platforms.”  George also noted that “People think the IoT is about collecting data at the edge and throwing it into a data center and crunching it.  But developers need to think differently about how the IoT will be used, because the true innovators will be computing at the edge of their supply chains, not at the center.”

He used the example of the cell phone.  “When smart phones came out, people first discovered they could use a browser.  But recall the first time you used a browser on a phone:  it was awful!  So we became smarter about designing websites so browsers could be used more easily – and that in turn led to Apps, which are a better way to use smart phones.  So the evolution of the service platform led to revolutionary platforms, which is how technologies will enable platforms.”

The digitization of the supply chain is coming, but no one is certain how it will unfold.  Along with this evolution in how technology impacts our daily lives, one of the biggest changes will be the “internet of things”, which will drive “computing at the edge”. This is also known as “distributed computing” or “distributed analytics”: processing data at the source of collection, which will increasingly be in machines and equipment. Smart sensors are emerging in the sub-$5 range that can capture data on shipments’ temperature, location, and velocity.

A great example to consider is the collection of tire pressure on a moving truck. Sensors in the tires can capture tire pressure every millisecond – but there is no need to dump all of this data into a centralized data center! Smart sensors combined with distributed computing on the truck can collect first-pass data and generate summary statistics, such as an estimate that a tire will last another 362 miles before going flat. Sensors combined with a local computer can provide key analytics, and when multiple sensors interact, they can provide clues as to what is happening in the supply chain. For instance, vibration sensors combined with the tire sensors may have a strong correlation with theft, and analysts can detect that if they know what they are looking for.

In another case, a truck pulling up to a loading dock in a distributed computing model will interact through the cloud with systems at the loading dock. A notification will be sent regarding the number of loads ahead of it at the dock, and the driver may be advised to slow down, burn less diesel, or take a break for a meal, as the slot at the dock has been pushed back. These types of interactions will require that edge devices be able to discover each other and communicate.  For this to occur, standards will need to be established, just as Ethernet and internet protocol standards were agreed on for telecommunications.
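The tire example can be sketched as a simple edge-summarization routine (the readings and the 28 psi alert threshold here are invented for illustration):

```python
import statistics

def summarize_tire_readings(psi_readings, low_psi=28.0):
    """Reduce a stream of high-frequency tire-pressure samples to a small
    summary that travels upstream, instead of shipping every raw reading
    to a centralized data center."""
    return {
        "samples": len(psi_readings),
        "mean_psi": statistics.mean(psi_readings),
        "min_psi": min(psi_readings),
        "alert": min(psi_readings) < low_psi,  # flag a likely flat
    }

# Millisecond-level samples stay on the truck; only the summary is sent.
readings = [32.1, 31.9, 31.7, 31.2, 30.8, 30.1]
print(summarize_tire_readings(readings))
```

The design point is that the raw data volume never leaves the vehicle; the truck transmits a handful of statistics and alerts instead.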

Such technological evolutions will create service provider niches that will form the basis for commercial platform creation (“evolution”), but also the potential for extinction of existing service providers through disintermediation and reintermediation. New technology has the potential to drive innovation and new platforms, but waves of digitization can also cause creative destruction of existing players that are too slow to keep up. In other cases, lawyers are quick to object with security concerns – just as they were with personal computers, personalized apps, and voicemail. Waves of digitization will continue to re-shape the environment we live in.

How will digitization change your life?


I had the opportunity to spend some time at IBM’s Research Triangle Park facility (the actual hosting site for the Watson computer) this past week.  IBM hosted a roundtable for the Center for Advanced Purchasing Studies, with executives from P&G, US Steel, CB&I, Mastercard, and others attending.  In that session, Dan Carrell from IBM shared a number of key perspectives that are driving IBM to diversify into the world of analytics.

Three major shifts are driving the new economy:

  1. Data is foundational to everything we do. Data is a natural resource – those who capture data and learn how to exploit it will be those who succeed in the new economy.
  2. The Cloud is transforming information technology and moving business processes into digital services.
  3. The shift to Cognitive computing is unlocking new insights and enabling optimized outcomes. Cloud Platforms and Cognitive Solutions will take companies like IBM into the future.

An amazing fact is that 90% of the world’s data has been produced in the last two years – and 80% of this data is unstructured! The massive amount of data found in the form of speech, text, articles, videos, digitized images, and graphics is exploding. But organizations and individuals have never had tools to put their arms around this massive flood of data, which is constantly changing, morphing, and increasing in scale. This is changing with the emergence of cognitive computing, which can impose over this data a structured format that is dynamic and able to adapt and learn. Research shows that 77% of organizations are implementing technologies to begin to exploit this capability.

Cognitive systems have the ability to:

  • Understand unstructured data and natural language.
  • Reason and provide confidence levels around the reliability of predictions.
  • Learn and adapt through feedback, a process sometimes called “supervised learning.”
  • Interact in a natural way with users, sometimes through voice recognition and computer-generated responses (such as Siri on the iPhone).
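The “learn and adapt through feedback” idea can be sketched with a toy classifier that adjusts keyword weights whenever a user confirms or corrects a label (entirely illustrative; real cognitive systems rely on trained statistical models, not simple keyword counts):

```python
from collections import defaultdict

class FeedbackClassifier:
    """Toy supervised-learning loop: per-category keyword counts are
    strengthened each time a user supplies the correct label."""

    def __init__(self):
        # word -> {label -> count}
        self.counts = defaultdict(lambda: defaultdict(int))

    def predict(self, text):
        """Return (label, confidence); (None, 0.0) if nothing matches."""
        scores = defaultdict(int)
        for word in text.lower().split():
            for label, n in self.counts.get(word, {}).items():
                scores[label] += n
        if not scores:
            return None, 0.0
        label = max(scores, key=scores.get)
        return label, scores[label] / sum(scores.values())

    def feedback(self, text, correct_label):
        """User confirmation or correction updates the associations."""
        for word in text.lower().split():
            self.counts[word][correct_label] += 1

clf = FeedbackClassifier()
clf.feedback("steel plate pricing", "commodity pricing")
clf.feedback("supplier audit report", "supplier risk")
print(clf.predict("steel pricing update"))  # ('commodity pricing', 1.0)
```

Each correction shifts future predictions, which is the essence of the feedback loop described above, if not its scale.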

Cognitive systems have the potential to help procurement executives make better decisions, bring to bear the expertise of the most experienced individuals to other parts of the organization, discover new insights, and improve the quality, consistency, and compliance around sourcing processes.

IBM Watson is building capability around five primary sourcing processes:

  • Sourcing and market analysis (Buying Assistant)
  • Understanding of price movements in the market (Pricing IQ)
  • Deep insights into specific suppliers (Supplier IQ)
  • Managing contracts across the supply base (Blue Hound)
  • Managing supplier risk (Risk Insights)

These developments will be coming online in the next few years, and are likely to have a major impact on our working environment and the roles we take on in the supply chain.  However, a key theme that comes up in our discussions is the notion that the capability to exploit “big data” and derive key predictive analytics from the broad world of social media, digital images, and internet chatter involves more than just “buying technology”.  Developing this capability, like many other organizational capabilities, involves REAL WORK!  Organizations that are successful must begin by establishing an approach to data governance, and then begin to launch pilot studies that target specific business problems and issues. In the process, they are likely to hit dead ends, roadblocks, and unsuccessful outcomes.  But they will also learn “what works”, and understand how to employ many of the analytical tools in the context of their own organizational culture, mindset, and desired outcomes.

We are really in the very early stages of this incredible transformation.  To be successful in the years ahead requires leadership support, a willingness to accept failure, and most of all, the mindset and attitude of a Research and Development-focused organization.  This is where procurement and supply chain leaders need to start to change their culture, which is a tough thing to do in an environment of cost-cutting priorities.  But overlooking this now will put you way behind your competitors three to five years from now.


Jeff Townley, former Chief Procurement Officer at Nortel Networks, spoke to my MBA class this evening, sharing some great insights about “Outsourcing:  The Good, The Bad, and the Ugly.”

The GOOD – “ODMs were a wonderful way to outsource, and we had a great experience with them, as they are always good at common hardware. The ODMs in Taiwan have a single tech park where they all started. Almost every one of the heads of these ODMs went to school together and started this tech park. Nortel decided to put their phones as well as other data access products with the ODMs – and they would help you by adding some design value. You didn’t have to add designers; they leveraged the fact that they designed similar products for other people, could do it quicker, and would get their money back through manufacturing in a low-cost environment. At the time it was all in Taiwan, and over time most of them opened manufacturing locations in China in order to compete on cost.”

Clean sheeting was an activity Jeff led on all of Nortel’s projects, which involved taking all of the transformation costs and where a product was being built, and putting it all down on paper. “We would go negotiate with other suppliers to try and move it. It wasn’t easy to move, but through a lot of experience we learned to move things pretty well. Nortel was one of the first to create an outsourced system house in Boston, which was later moved to Penang. PCBs were outsourced in the 1990s, and final assemblies not long after, but the system house – where it was all consolidated, put through systems test, and shipped – was all outsourced to Penang.”

The BAD – “Back in the 90s a lot was going to Mexico. Nortel had a plant in Calgary which produced phones that were moved to Mexico. This was Solectron, which was later bought by Flex. All of a sudden I recognized there was a problem – we were not keeping up with demand. I couldn’t get a good story out of them. Eventually we figured out that when the POs for the long-leadtime items were transferred from Nortel to the outsourcer, the component suppliers put them at the back of the queue, and it resulted in a 20-week leadtime! This was a $10B company – and a $50M hole in our fourth quarter. I got a team of people and we killed ourselves to get the orders out, and only missed our revenue target by $2M, which ended up being noise on the bottom line.”


The UGLY – “Nortel went to Chapter 11 in 2008. One reason was they blew the whistle on themselves for financials – that maybe wasn’t needed. They weren’t doing anything that others weren’t also doing.  It is my feeling, and the feeling of many others, that Nortel should have sold off some divisions and streamlined their business in the 2005-2007 timeframe rather than trying to maintain such a broad portfolio for the Enterprise and Carrier markets.” Nortel had an opportunity to sell the optical division to Corning for $100B – but decided not to do it! Years later, when they declared Chapter 11, they left Flex with $400M of inventory. Needless to say, the relationship got really bad.  Flex put everything on stop-ship – they would ship nothing. “What good would that do us? We have to continue to ship to get paid by customers. But we wrote them a check for tens of millions.” Nortel ended up paying 97 cents on the dollar for many of the claims when they sold their IP through the Google consortium, and paid off not all, but a number of, their bills.

Today Flex has software that helps them understand when components aren’t turning fast enough. That was surely developed on the back of getting stuck with Nortel’s inventory! Nortel was growing too quickly, and WorldCom and others realized they had a lot of “dark fiber” in the ground that the internet was not yet using.  Today, Netflix is 37% of internet traffic on any given night! I guarantee you that if that had happened in 1999, the fiber would not have been dark. And there is finally a push for more bandwidth – but it’s about 15 years too late!



The online magazine 1IT Enterprise recently published a special issue on emerging predictions about Big Data in its latest edition.  Many of the enterprises working in this space are still thinking about how applications can access data in the cloud, how cross-cutting applications can move from data centers to cloud-based computing, and how mobile messaging will become part of your daily life, exploiting GPS capabilities while accessing your calendar.  Some apps are already doing this, such as Waze, which reminds you to leave early for an appointment on your calendar if traffic is particularly heavy.  In addition, museums and other locations will no longer need audio guides, as they can provide them through updated cloud-based audio files.  I provided some of my earlier predictions about procurement and supply chain analytics (pp. 42-43), but what really struck me about this special issue is the number of different industries and experiences that will be impacted by the growing digital revolution.  There are stories about how utilities will be alerted to intra-grid disruptions, how airports will change customers’ experience, how brick-and-mortar stores can convert more walk-by traffic into in-store customers, and a host of other examples.

The confluence of digitization of objects, cloud-based computing, mobile computing, GPS and sensor data, and the growing connectedness of individuals to the digital economy is resulting in some incredible changes in the way we live our lives.  Are we ready for this change?   And have we thought about how to design our supply chains to deal with these new market forms?
