SUPPLY CHAIN RESOURCE COOPERATIVE

Our prior post discussed the benefits and challenges of visibility in logistics.  But WAIT!  There’s more to it than just benefits. Carriers and shippers should also proceed carefully as they move into the era of visibility, and start asking the following questions:

  • What are the legal implications of tracking and visibility, and how will it wind up taking dollars out of your pocket?
  • What are the biggest concerns in times of claims, injury, and documentation of claims?

The biggest issues shippers have to worry about today are freight loss claims, personal injury, and, of course, liability!

Tamara Goorevitz noted that in the past year there have been a number of “NUCLEAR” verdicts in trucking litigation cases.  “Examples of some of the awards that juries are making across the country in trucking verdicts include amounts like $51M, $35M, and even $178M.  Maryland has a cap on non-economic damages, but it is one of very few states with a cap.  Most of these cases are liability/personal injury claims.”

A question: how much liability coverage are most truckers required to have?  The minimum is $750K, and most carriers have $1M of coverage, because even though in many cases they did not cause the damage, they were present.  Tamara notes, “I am a defense attorney!  So I know what the other side is asking: where will they get that money, and how many motor carriers are there?  We will go up the chain, and we will go big.  If you want to keep yourself in business, know that plaintiff’s attorneys are finding out what transportation is doing, and trying to find the big, deep pockets to pay damages like the $281M awarded in Texas – because the smaller carrier won’t be able to pay.”

An example of a typical case, based on a truck accident on a highway, shows the categories of awards made:

  • Past medical expenses: $20M
  • Future care: $35M
  • Past lost wages: $23,000
  • Pain and suffering (past): $15M
  • Pain and suffering (future): $15M

Other categories include scarring and disfigurement ($12M), past loss of enjoyment of life ($8M), future loss of enjoyment of life ($20M), and on and on.  Juries are going “nuclear”: when they hear a case and hear about the injury, they are willing to go with whatever categories the plaintiff’s attorney comes up with, and whatever numbers are attached to them!
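
To see how these categories compound, here is a minimal Python sketch that simply totals the example amounts quoted above.  The figures are the illustrative ones from this discussion, not the record of any single actual verdict.

    # Illustrative award categories quoted above (not any single actual verdict).
    award_categories = {
        "past medical expenses": 20_000_000,
        "future care": 35_000_000,
        "past lost wages": 23_000,
        "pain and suffering (past)": 15_000_000,
        "pain and suffering (future)": 15_000_000,
        "scarring and disfigurement": 12_000_000,
        "past loss of enjoyment of life": 8_000_000,
        "future loss of enjoyment of life": 20_000_000,
    }
    total = sum(award_categories.values())
    print(f"Total award: ${total:,}")  # Total award: $125,023,000

Even before a jury invents further categories, these example amounts already exceed $125M, dwarfing the $750K-$1M coverage minimums discussed above.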

Goorevitz notes that “We are in a world of direct liability through channels.  These cases involve direct corporate liability, with the presumption that carriers engage in negligent hiring of drivers, negligent training, supervision, retention, etc.  The perception is that carriers are all cutting corners on safety for the almighty bottom line – that this is a matter of dollars over lives – and that the very idea of a profitable transportation provider is inherently ‘bad.’”  These plaintiff’s lawyers will follow up with, “we all know that these transportation corporations are all bad!  How many of you have been on the road when a trucker drove into your lane and almost hit you!?”  However, none of these plaintiff’s attorneys ever brings up the fact that everything you wear, eat, use, and buy at a store came on a truck and arrived there somehow, or that the Amazon package at your doorstep traveled a circuitous route to get there on a truck.  They conveniently forget about this minor detail.

Part of the problem, of course, involves “Society’s Perception of Money.”  The extreme media coverage of the large salaries of high-profile sports athletes, movie stars, and CEOs has blown the concept of money out of the water.  Five years ago, if someone wanted $1M as a settlement, that seemed enormous.  Now if someone wants $1M, the attitude is almost “where do we sign the check?”  There is a different perception around lawsuits, almost as if settlements should be awarded like lottery winnings.

Which brings us to the issue of information overload.  Tamara notes that “We are battling what we need to know versus the availability of information.  Good customer service brought about by transparency, tracking, and monitoring essentially equals bad legal facts.  Despite your good intentions of wanting to know what is going on, the more you try to control it, the worse it becomes to defend yourself if a lawsuit arises.”  You want to give good customer service, but the availability of tracking data can often serve to bolster the case against you.  Sad but true.

This of course raises privacy concerns.  On the legal end, the more involved you are, and the more you know about a particular load and where it is going, the worse it can be for the lawyer who is defending you.  The argument goes that the more you know, the more information you have, and the more you share with partners, the better.  But what is left out of this argument is that more information = higher duty of care!

Tamara notes that “We now know where the driver is minute to minute, and that concerns me legally!  We don’t have time to deal with more sharing of data, because shippers are stating that they want this information, and they expect it.  But because you share it with your shipper, and because you have this information, does that mean you should give it out to everyone?  What is my duty of care?  What is the standard I am legally held to in terms of negligence?  What would a reasonable company do in this situation?”

“These are all important questions to consider.  For example, I know where the trucker is, when he will arrive, his breaks, his routes, and how fast he is going.  There is no situation where I have to delve into that as a shipper.  I don’t control him, and am never reasonably expected to call him and tell him he is over his legal hours.  Am I creating a higher duty of care?  It is changing!  We are incorporating this all the time.”

Tamara also notes that “One thing I do know is plaintiff’s attorneys will be all over it, and will be excited about the information you have and what you know about it.  Don’t just adopt it because it is available; look at how you are getting that information.  Truckers on Trucker Tools can turn off the information, so it may be there but not shared with the broker.  How you get that information and share it requires that we start thinking about questions like: do I need everything that is available, and what am I doing with it?  Who am I sharing it with?  Am I assuming that people have training – can a broker that negligently hires a carrier end up in front of a jury?  Are you exercising too much control over the load and the driver?  If you have all this information, is it for the purpose of controlling?  And if it is only information, then just because it is available, do we really need it?  There needs to be a policy about what we are going to do with it.  Customer service that is good can also create bad facts that increase your liability.  A load confirmation sheet can provide documentation that we are operating outside of the contract!”

You shouldn’t need to know where a load is at every moment, or whether the receiver accepts it.  It is great to be able to go back, review that information, and understand what went wrong, but this also allows the shipper to blame the carrier.  Having real-time data is no problem when I’m expecting the load to arrive.  GPS shipment trackers can track freight across the globe, helping to avoid cargo theft, to understand where theft takes place, and to advise on areas to avoid.  Pharma has specific products that need monitoring, and having controls on high-value freight is required just to get insurance coverage!  To back all of those high-value loads, there is most definitely a need to understand where a shipment is and to avoid areas that are not so safe.  It also becomes important across different modes: there are some docks in Los Angeles that are known for bribery among customs agents, and having truckers avoid those specific docks provides another level of control.

What makes the case for visibility even more compelling is that deliveries to big box retailers often carry contracts with heavy financial fines for late delivery.  Having a tracker can be helpful for a number of reasons: knowing ahead of time if you will be delayed, tracking how long you have to wait, and being able to document from a legal viewpoint, as sketched below, that you arrived on time but were made to wait if the delivery is counted as late.  Lawyers are largely concerned with risk assessments and with discussions about what risks are acceptable.  It is therefore important to consider the legal issues around visibility in transportation and make an educated decision on the acceptable level of risk.  At what point does a customer request for information turn into control over our drivers and our operations, and how can we become the intermediary for information without relinquishing control?  He who has the gold makes the rules!  And shippers need to own the gold.
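
As a rough sketch of that documentation point, tracker timestamps might be used to show an on-time arrival followed by detention at the dock.  The event names and times below are hypothetical, not the output of any particular tracking product.

    from datetime import datetime

    # Hypothetical tracker events for one delivery appointment.
    appointment = datetime(2018, 5, 1, 14, 0)       # contracted delivery time
    arrived_at_gate = datetime(2018, 5, 1, 13, 48)  # geofence entry ping
    unloaded_at = datetime(2018, 5, 1, 17, 5)       # departure from the dock

    on_time = arrived_at_gate <= appointment
    detention = unloaded_at - max(arrived_at_gate, appointment)

    # Two facts that support the legal argument described above: the carrier
    # arrived on time, and the "late" unload reflects waiting at the dock.
    print(f"Arrived on time: {on_time}")     # True
    print(f"Time at the dock: {detention}")  # 3:05:00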

At a recent CSCMP meeting, I attended a session of transportation providers called “Supply Chain Visibility and the Possible Legal Implications” – and the participants shared some insights into their serious concerns about the implications of visibility in the transport world.

In transportation, people want to be able to see the loads on trucks, but the speakers at this session pointed out that people sometimes forget what might happen when everyone can see the information.  A three-person panel shared insights into this issue: Prasad Collapalli from Trucker Tools; Jason Beardall from England Logistics; and Tamara Goorevitz, a lawyer who deals with lawsuits and litigation around transportation.

Prasad Collapalli began by discussing how the tracking and monitoring of freight and assets to optimize asset utilization and operations is typically viewed as a great tool to have.  In addition, the goal of using visibility tools is to identify exceptions and delays.  Asset utilization includes trucks and resources, and operational optimization of these assets is key.  Note that the goal of visibility is NOT to monitor where the trucker stops; it should just be about tracking freight.  Real-time, accurate tracking and monitoring is key to establishing the right level of visibility, to see what is happening and when, and this can only come with real-time (every second) freight monitoring.  Tracking to a 5-mile radius doesn’t help you monitor freight: if you go through Atlanta, it could take you five hours to go five miles.  Accuracy of information will be key to improving operations.
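
To make the granularity point concrete, here is a minimal sketch of my own (with invented coordinates, and not Trucker Tools code): two frequent pings reveal crawling traffic that a 5-mile-radius check would never show.

    import math

    def haversine_miles(lat1, lon1, lat2, lon2):
        # Great-circle distance between two points, in miles.
        r = 3958.8  # mean Earth radius in miles
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    # Two pings one minute apart in downtown Atlanta traffic (invented coordinates).
    d = haversine_miles(33.7490, -84.3880, 33.7493, -84.3885)
    mph = d * 60  # distance covered in one minute, converted to miles per hour
    # Within a 5-mile radius both pings look identical; per-minute pings show
    # the truck is crawling, so the ETA can be re-predicted immediately.
    print(f"{d:.3f} miles in one minute, roughly {mph:.1f} mph")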

Amazon has taught everyone who has an Amazon Prime account to expect two-day shipping!  Amazon NOW is a new offering that promises 3-hour delivery from point of purchase, and it is being tested in large markets.  The NOW approach redefines the way logistics is set up, with more DCs located closer together, knowledge of customers’ spending habits and how quickly they are buying, and product stocked in the market.  Prasad notes that “we had four offices in China and we met up with the Ministry of Transportation, and went into their war room.  We were amazed to see that the MoT monitors EVERY truck in the country in real-time, and each is being tracked.  They could push a button and put a carrier out of business for a hazmat load that is out of compliance!”

“This is not the case in Western markets; since we are a market based on capitalism, the visibility technology has to be adopted, and one could argue that it is evolving.”  A big problem is that average profitability for OTR truckers used to be in the range of 6-9 points, but margins have since fallen by about 3 points.  Shippers are looking for extended payment terms!  The 3PL world is trying to relieve the mounting tension between shippers and carriers by creating extended pricing commitments that can lock down terms.  Shippers want two-year pricing commitments, but their attitude is, “if the market shifts in our favor, we will revisit the agreement, but if it goes in the other direction we will hold you to the terms!”  A big problem has to do with the unpredictability of this market: in the past, shippers could predict rates, production levels were predictable, and business was good.  In today’s market we don’t know what will happen next week!  So committing to 12-24 month rates has created tension in carrier/3PL relationships.  Shippers are coming to see the value in long-term contracts, but there is a requirement to bid to the market, extend payables, and increase fines for loads not tracked.  Everyone is feeling the pressure.

Jason Beardall pointed out that tracking of shipments is being used against carriers.  The technology is moving in a direction where a shipper will require the carrier or broker to contractually agree that product rejection is not subject to confirmation by a desired individual.  Technology will be important here, as there will be less need for check calls, improved optimization of routes, and reduced waste in miles and freight.  Market visibility will be key, as real-time visibility can result in fewer empty miles and greater productivity.  The benefits will accrue as follows:

  • 3PLs – fewer check calls, greater productivity, streamlined bill of lading and back office operations, reduced rates, market visibility.
  • Shippers – reduced cost of transportation, better load planning and dock management, greater market visibility and predictability.

In our next post, we discuss how the benefits have to be tempered against the risks of having complete visibility.

Our upcoming meeting on November 29th, the 36th semi-annual Supply Chain Resource Cooperative meeting, will explore some of the more interesting aspects of how organizations are coping with the availability of greater visibility in the supply chain.  The benefits of creating a supply chain that is “LIVE, INTERACTIVE, VELOCITY, INTELLIGENT, NETWORKED, and GOOD” have been touted in my own book, “The LIVING Supply Chain”.

In this meeting, we will be exploring some of the many challenges that exist as we think about what this means for the natural ecosystem of our economy and enterprises.

In my keynote, I will explore some of the topics that impact this issue, including the impact of visibility on the transportation system and the new technologies being deployed around blockchain, smart contracts, distributed ledgers, and visibility through the Internet of Things.  I will also discuss some of the downsides that lie ahead, including the possibility of exploiting visibility in transportation and logistics, the impact on exposure to diversion and counterfeiting, and the challenges of deployment at a global scale.

This will be followed up with practical insights from a group of executives.  Todd Greener, SVP of Supply Chain at Advance Auto Parts, will share how his organization is using data to drive improved decision-making, as well as the challenges of working with large amounts of data across a large global network.  This will be followed by a presentation by Omer Rashid, Director of Solutions Design, and Vince Peters, Vice President of Business Development, entitled “Data Driven Solutions for the Supply Chain”.  In this session, DHL will share its experience working with large volumes of data for multiple clients in multiple geographies, and how solutions need to consider all of the security and IP issues as part of their design.

Finally, the day will conclude with insights from Vel Dhinagaravel of Beroe, Todd Carrico of Cougaar Software, and Rob Allan of IBM Watson Supply Chain, discussing some of the challenges each of them has seen in managing an ever-expanding pool of data from clients.  The panel will be opened up to the audience, leading to what will hopefully be a great interactive discussion.  You need to be there!

I had the privilege of being on a webinar today with Jeannette Barlow from IBM and Simon Ellis from IDC.  The focus was on the impact of AI on supply chain decision-making, and I spoke about a recent case study on IBM’s application of Watson to their own supply chain.

IBM, like many large enterprises, is always under pressure to drive shareholder returns.  And like many other companies, its leaders recognized that they needed to respond to the business drivers that create revenue growth, new market penetration, and ongoing cost savings.  The supply chain is certainly a source of improvement for driving these elements, and like other companies’ supply chains, IBM’s contained too many suppliers, black holes where material disappears under the radar, inbound/outbound disconnects, and other discontinuities in its end-to-end business processes.  The leadership team also recognized that the only way to be more agile was to change the end-to-end supply chain in a way that was entirely novel.  There was a real desire to have information more quickly, in real time: to access information about what was happening, to respond to problems, and also to drive quicker business insights and exploit opportunities to add value for clients.  Data governance was a challenge, as there often existed multiple versions of data and no single source of truth!  And all the while there were escalating requirements for compliance, both from a legal standpoint and in terms of sustainable operations.  The team wanted to exploit the power of Watson to make all of this happen.  In particular, the leadership team recognized four areas that required improvement right away.

  • Risk Identification – One of their executives noted that “we didn’t identify risks and issues early enough. We were like a raft in the river, with no ability to see around the corner, not knowing what rapids lay ahead, and floating along wherever the current took us!”  Supply chain problems were always recognized after the fact, and were not being predicted ahead of time.  In retrospect, the signs of a disruption were evident, but nobody knew how or where to look for them ahead of time.  Somebody, somewhere in the supply chain, may have had some inkling about the problem, but not the person responsible for calling the shots or making the decision.  And so nobody was able to adjust the pace, and the entire organization went along like a raft in the river, responding to every current and eddy, moving at whatever pace the river determined!  Nobody was in control of the raft; everything was a reaction to events.
  • Lacking the Right Information – Some risk management systems provide alerts. But alerts are just that: like a light on your automobile dash panel, they tell you something is wrong.  What a dash panel alert doesn’t tell you is the nature of the problem, the technical details of how it can be solved, or the exact location of the issue.  To create a truly transparent approach to risk management, it is critical to identify the metadata associated with the problem, and the touchpoints in the network that need to come together to address the issue (see the sketch after this list).  The user experience should be characterized by immediate notification of an event, together with the right information to make a decision.
  • Teams Make Decisions, Not Individuals – The third critical need was IBM’s recognition that decisions are made by cross-functional, cross-enterprise teams spanning multiple tiers, not by individuals in a void. In responding to an issue or deciding on resource allocation in the face of uncertainty, multiple points of view almost always yield a better decision.  But the speed of decision-making is also important.  So the challenge became: how can IBM build a solution room that enables a virtual come-together meeting, as opposed to the usual chain of emails and texts that often causes more confusion than anything else?  IBM recognized that ‘we are good at working on big problems in a war-room taskforce environment, but often struggle with the small day-to-day issues that arise’.  The ability to create a smaller “resolution room” that could be rapidly deployed, ad hoc, to address supply chain issues in an agile manner was another major business driver that led to the need for real-time transparent supply chains.
  • Learning from the Past – The final business driver that led IBM to pursue this strategy was the fact that the same issues often recurred, yet were treated as new every time. The team recognized that while they were creating “gold nuggets” each time they resolved an issue, there was no way to capture the key “lessons learned” and create organizational learning from these instances.  As the team put it, ‘if we could digitize the way we solve problems – in terms of who was invited, how long it took, the metadata and dynamics of the team, and the content that ended up solving the problem – we could respond more quickly to similar situations that come up!’  This became part of the cognitive approach, which required creating a system that learns from these instances, capturing the essential elements and closing the loop on how to better manage the end-to-end supply chain.
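
As a rough illustration of the “right information” point in the list above (my own sketch, not IBM’s actual Watson data model), an actionable alert would carry the event metadata and the network touchpoints needed to convene a resolution room, rather than acting as a bare warning light:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class SupplyChainAlert:
        # An alert that carries enough metadata to act on, not just a
        # dashboard light (all field names are hypothetical).
        event_type: str             # what happened, e.g. "inbound delay"
        location: str               # where in the network it occurred
        affected_orders: List[str]  # downstream commitments at risk
        source_system: str          # system of record holding the details
        touchpoints: List[str] = field(default_factory=list)  # roles to pull in

    alert = SupplyChainAlert(
        event_type="port congestion",
        location="inbound hub",
        affected_orders=["SO-1041", "SO-1077"],
        source_system="TMS feed",
        touchpoints=["inbound logistics", "account manager", "supplier"],
    )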

Individuals dealing with an issue must be able to understand the processes of the end-to-end supply chain, and be able to come together with others in a fast action plan.  This requires individuals who are willing to be key informants, and who are willing to transcend typical functional barriers between sales, operations, and procurement, to drive the right solutions.

In this webinar, Simon also discussed his views on how intelligence, analytics, machine learning, and cognitive computing are moving together along a continuum. Jeannette spoke about the “supply chain playbooks” that IBM is designing around its Watson capability.  I learned a lot from these individuals, and urge anyone who is interested to take the time to listen to the webinar and read the case study on the topic.

The Supply Chain Resource Cooperative held its first ever “Executive Roundtable on Excess and Obsolete Inventory” on the NC State campus on October 25, 2017.  The event was attended by 25 executives from a variety of industries and backgrounds.  Inventory Management Partners sponsored the event, and helped to bring together the format and content for the discussion.  The objective of this session was to openly discuss some of the challenges that exist in managing this overlooked asset, and to begin to shine a light on approaches that can be more effective in dealing with the issue.

After much discussion, the executive roundtable identified a number of approaches deemed necessary to deal with E&O.  In the end, excess and obsolete inventory occurs because of mistakes, misaligned decision-making, lack of awareness, and lack of consideration of the cost of inventory in countless decisions, including product design, sales forecasting, and sales and operations planning.  The following are some of the approaches executives identified.  The SCRC will work further on these challenges in the coming months and explore them in greater detail.

Assign accountability.  Executives need to deal with inventory issues as they arise!  Organizations need to be proactive about avoiding the decisions that create E&O, and when it does occur, immediately seek to address the issue: can the inventory be used somewhere else, or should we assume we won’t use it, absorb the cost into the business, and recognize it?

Design products with the end of the life cycle in mind.  Ensure that engineers are aware of how parts left over at the end of the product life cycle will consume working capital, and train them on these costs.  For example, Huawei had a component engineering team reporting into procurement that was responsible for dictating the components that went into every line of business, to ensure maximum flexibility in the usage of parts.  They forced component engineers to pull designs from existing baskets of parts, which addresses many of the problems with complexity and avoids unique parts.

Management awareness of E&O impacts.  Is there a senior management team committed to driving down excess and obsolete inventory levels?  E&O should be viewed as pure cash. For example, more and more companies are establishing incentives for salespeople, who now earn part of their bonus based on how accurately they forecast at the SKU level, not the planning level (which aggregates many parts and is relatively stable and easy to forecast).
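
A minimal sketch of why the SKU level is the more honest yardstick: offsetting errors cancel at the planning level and flatter the metric.  The forecast and actual figures below are invented for illustration.

    # Invented forecast/actual pairs for three SKUs in one planning family.
    skus = {
        "SKU-A": {"forecast": 100, "actual": 140},
        "SKU-B": {"forecast": 100, "actual": 60},
        "SKU-C": {"forecast": 100, "actual": 100},
    }

    # SKU-level mean absolute percentage error (MAPE).
    sku_mape = sum(
        abs(s["forecast"] - s["actual"]) / s["actual"] for s in skus.values()
    ) / len(skus)

    # Planning-level error: the SKU misses net out when aggregated.
    total_forecast = sum(s["forecast"] for s in skus.values())
    total_actual = sum(s["actual"] for s in skus.values())
    aggregate_error = abs(total_forecast - total_actual) / total_actual

    print(f"SKU-level MAPE: {sku_mape:.0%}")               # 32%
    print(f"Planning-level error: {aggregate_error:.0%}")  # 0%

The planner sees a perfect aggregate forecast while SKU-A runs short and SKU-B becomes excess stock, which is exactly how E&O hides.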

Planning and sales communication.  There need to be strong communication channels between planning and sales managers, including dialogue such as “how real is your forecast?  (I won’t expose you).” Salespeople tend to load their forecasts by as much as 10%, which drives the MRP orders. There needs to be a one-to-one relationship between sales and demand planning to ensure complete transparency and real-time communication.

Change sales incentives.  It also helps if sales team bonuses are tied to inventory and to the S&OP budget.  Measuring sales forecasts not only on final shipping, but also on configuration and BOM accuracy, is an important element. Customer-named accounts and configurations can help to improve sales accuracy, and tracing how inventory was generated back to a specific customer order and salesperson can drive accountability six months down the road.  Salespeople will change their behaviors under these conditions.

Develop an E&O narrative.  There needs to be a story constructed around how inventory is generated, and accountability for that inventory, to drive out the buffer-planning behavior that occurs. There need to be reviews of min-max cycles, minimum-liability planning on configured products, and intelligent narrowing of the product portfolio as a result.  Product design standards and ownership are key.

Focus on forecasting performance for mix, not final product.  Forecasting performance analysis should be used to understand which products and components will be consistently inaccurate.  At one company, leaders challenged managers to understand why people were ordering parts, performing a deep analysis of which parts were driven into the supply chain through poor planning activities, which can help prevent such problems from recurring. A pilot project looked at service parts through tier 2 components, what was being purchased, and the MOQs, and had suppliers share what they were seeing versus what was being ordered.  Opening up discussions with partners on lead times, inventory levels, and forecast accuracy can get the conversation started.

Measure life cycle inventory cost.  A planning process in the design stage can also help to build in the cost of inventory early on.  A best practice at one company is to establish, during the design phase, the life cycle cost for components, and to define the total life cycle cost of having ANYTHING in inventory over the life of the product.  At a minimum, setting a planned number makes sense and enables a category strategy to be established around that target.
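
One hedged way to set that planned number at design time is a simple carrying-cost estimate.  The sketch below is illustrative only; the 25% annual carrying rate and all other figures are planning assumptions, not standards.

    def life_cycle_inventory_cost(avg_units_on_hand, unit_cost, years,
                                  carrying_rate=0.25, eol_writeoff=0.0):
        # Planned cost of holding a component over a product's life:
        # carrying cost (capital, warehousing, shrinkage, obsolescence)
        # plus an expected end-of-life write-off. All inputs are assumptions.
        holding = avg_units_on_hand * unit_cost * carrying_rate * years
        return holding + eol_writeoff

    # Illustrative component: 5,000 units on hand at $12 over a 4-year life,
    # with a planned $30,000 end-of-life write-off.
    cost = life_cycle_inventory_cost(5_000, 12.0, 4, eol_writeoff=30_000)
    print(f"Planned life cycle inventory cost: ${cost:,.0f}")  # $90,000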

Evaluate decision impacts related to E&O.  There also needs to be some work around the cost of decisions and their impact on inventory.  What is the cost of an engineering change and the resulting E&O?  What is the cost of a new product and its end-of-life inventory write-offs?  The development cost of a product should include tooling, supplier qualification, warehousing, and write-offs at end of life.  Focusing on these costs can get the conversation going on the cost of complexity.

While there is no “silver bullet” for the E&O problem, awareness and focus across business functions are essential: the real impact on working capital and profitability needs to be clarified and measured against the decisions and strategies that are unknowingly the cause of many of the problems.

An important differentiation between data governance, business intelligence, business analytics, cognitive analytics, and predictive analytics is needed as a basis for building a digital supply chain strategy.  Every organization needs to define these terms for itself, and not just bend to how external consultants position their views on these concepts.

“Data Governance” is the exercise of authority and control (planning, monitoring, and enforcement) over the management of data assets.[1] The basic components of data governance ensure the split of accountability and responsibility related to data, thus empowering better decision-making while using data from disparate sources and methods. In effect, data governance provides a system of decision rights and accountabilities for information-related processes, executed according to agreed-upon models that describe who can take what actions with what information, when, under what circumstances, and using what methods.[2]
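
To make “who can take what actions with what information, and when” concrete, here is a minimal sketch of such a decision-rights model; the roles, data domains, and actions are invented for illustration.

    # Illustrative decision-rights matrix: (role, data domain) -> allowed actions.
    DECISION_RIGHTS = {
        ("demand_planner", "forecasts"):       {"read", "update"},
        ("data_steward",   "forecasts"):       {"read", "approve_change"},
        ("3pl_partner",    "shipment_status"): {"read"},
        ("data_steward",   "shipment_status"): {"read", "update", "approve_change"},
    }

    def can(role: str, action: str, domain: str) -> bool:
        # Check a proposed action against the agreed-upon governance model.
        return action in DECISION_RIGHTS.get((role, domain), set())

    assert can("demand_planner", "update", "forecasts")
    assert not can("3pl_partner", "update", "shipment_status")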

A data governance program can provide many benefits, including:

  • Increasing the value of your existing data by identifying ways to utilize it
  • Enhancing existing processes and building additional processes that work better
  • Decreasing the cost of managing data through synergies with other organizations
  • Standardizing policies, standards, procedures, and systems related to data
  • Providing ways to resolve existing problems related to data (such as quality, availability, and security)
  • Improving transparency through socialization, dissemination, and creation of awareness
  • Ensuring better compliance, security, and privacy
  • Increasing revenue through improved customer-facing responsiveness
  • Enabling better decision-making in the end-to-end supply chain
  • Reducing organizational strains related to data issues

Business intelligence (BI) is a technology-driven process for analyzing data and presenting actionable information to help executives, managers and other corporate end users make informed business decisions. Sporadic use of the term business intelligence dates back to at least the 1860s, but consultant Howard Dresner is credited with first proposing it in 1989 as an umbrella phrase for applying data analysis techniques to support business decision-making processes. What came to be known as BI tools evolved from earlier, often mainframe-based analytical systems, such as decision support systems and executive information systems.[3]  Typically, business intelligence can be used for ad hoc analysis using visualization tools.

Analytics is the outcome of a series of advanced operations performed on data extracted from business intelligence systems. Business analytics may include dashboards, visual graphics, charts, etc., developed using tools such as data mining, predictive analytics, text mining, statistical analysis, and big data analytics.  In many cases, advanced analytics projects are conducted and managed by separate teams of data scientists, statisticians, predictive modelers, and other skilled analytics professionals, while BI teams oversee more straightforward querying and analysis of business data.
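
A toy contrast between the two, using invented shipment data: the BI step is a backward-looking summary, while the analytics step fits a trend and looks forward.  (NumPy’s polyfit stands in here for what would normally be a much richer predictive toolkit.)

    import numpy as np

    # Twelve months of shipment volumes (invented data).
    months = np.arange(1, 13)
    volumes = np.array([110, 115, 121, 118, 130, 135,
                        141, 138, 150, 156, 160, 167])

    # BI: descriptive, backward-looking summary of what happened.
    print(f"Average monthly volume: {volumes.mean():.0f}")
    print(f"Strongest month: {volumes.argmax() + 1}")

    # Analytics: forward-looking, fit a linear trend and predict next month.
    slope, intercept = np.polyfit(months, volumes, 1)
    print(f"Month 13 forecast: {slope * 13 + intercept:.0f}")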

Gartner notes that the BI and analytics platform market is undergoing a fundamental shift. During the past ten years, BI platform investments have largely been in IT-led consolidation and standardization projects for large-scale system-of-record reporting. These have tended to be highly governed and centralized, with IT-authored production reports pushed out to inform a broad array of information consumers and analysts. Now, a wider range of business users are demanding access to interactive styles of analysis and insights from advanced analytics, without requiring IT or data science skills. As demand from business users for pervasive access to data discovery capabilities grows, IT wants to deliver on this requirement without sacrificing governance.[4]

Gartner also notes that as “…companies implement a more decentralized and bimodal governed data discovery approach to BI, business users and analysts are also demanding access to self-service capabilities beyond data discovery and interactive visualization of IT-curated data sources. This includes access to sophisticated, yet business-user-accessible, data preparation tools. Business users are also looking for easier and faster ways to discover relevant patterns and insights in data.”

According to a recent study by the International Institute of Analytics and the SAS Institute[5], BI adoption is more prevalent across organizations than advanced analytics. They note that “While the path from basic reporting to more advanced analytics work is often considered as a shift from BI to AA (Advanced Analytics), the reality is that advanced capabilities should augment, not replace, less advanced functionality.”  The reasons stated for this include criticality to the business, recognition of benefits, and utilization in strategy. Organizational weaknesses are perceived to be one of the strongest deterrents to the adoption of BI and advanced analytics practices. Data governance programs are fundamental to both BI and BA outcomes, as they are critical to ensuring acceptable data quality levels.

In many companies, pockets of analytics practice have developed in a random and disjointed way. Organizations need to develop a strategy for development of BI platforms to create advanced analytics, but in a structured and planned fashion that allows the greatest flexibility for multiple business units to conduct their own functional analytics, using a common and trusted source of data.

A variety of opinions, debates, and points of view have emerged regarding the differentiation between BI and BA. For example, experts have claimed that BI is a noun and BA is a verb, that BI is backward-looking while BA is forward-looking, and that BI is needed to run the business while BA is needed to change the business.[7]  Discussions can also wander into data structure and quality, internal versus external analytics, or customer- versus supplier-focused analytics.  Given this confusion, it is imperative that a common framework be established within any organization, providing a common language for creating a strategic vision of the future.

[1] DAMA UK Working Group. (2013, October). The Six Primary Dimensions for Data Quality Assessment. Retrieved from http://www.damauk.org/rw/CatViewLeafPublic.php?&cat=403

[2] Thomas, G. (n.d.). How to use the DGI Data Governance Framework to configure your program. Retrieved from http://www.datagovernance.com/wp-content/uploads/2014/11/wp_how_to_use_the_dgi_data_governance_framework.pdf

[3] “Business intelligence (BI).” TechTarget SearchBusinessAnalytics. Retrieved from http://searchbusinessanalytics.techtarget.com/definition/business-intelligence-BI

[4] Gartner, “Magic Quadrant for Business Intelligence and Analytics Platforms” 23 February 2015 ID:G00270380.

[5] International Institute for Analytics. (2016). IIA Business Intelligence and Analytics Capability report. Retrieved from http://iianalytics.com/analytics-resources/2016-business-intelligence-and-analytics-capabilities-report

“If someone asked me to come up with the answer to the world’s problems in an hour, I’d spend the first 55 minutes coming up with the right question.” – Albert Einstein

I heard this quote from someone about a week ago, and it got me thinking more carefully about the importance of beginning with the right research question, and being careful to delineate what it is we are trying to do.  The need to identify a critical business issue is at the core of creating the right digital analytics.  The problem, of course, is not the data, but the question we are seeking to answer using the data.

As noted by many executives, there is too much data; the issue is understanding what data is important, and what data holds the clue that lends insight into the right problem.  For example, the Internet of Things is exploding with massive amounts of sensor data being collected, but people don’t know what to do with it!  Genomics as a field is also exploding, effectively giving us a “parts list,” or bill of material, for the human body.  However, how this information can be used to make us healthier is still a mystery!

In a sense, supply chain managers must begin to think more like research and development scientists, and learn the art of discovery.  What happens in an R&D environment?  It begins with scientists, who are very good at being curious, exploring new ideas, and developing hypotheses, testing them against lab or other empirical data, and extending their knowledge as they learn what works and what doesn’t.

Similarly, supply chain organizations need to be able to define testable hypotheses and be continually querying their supply chain systems to understand what is happening according to their mental models.  The importance of cognitive awareness is key here.
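
As a toy illustration of that mindset (with invented data, not any particular system), a supply chain “hypothesis” can be phrased as a query whose answer either confirms or challenges the mental model:

    # Hypothesis: "Lane X is no worse than our network average for on-time delivery."
    # Invented delivery records: (lane, delivered_on_time).
    deliveries = [
        ("lane_x", False), ("lane_x", True), ("lane_x", False), ("lane_x", True),
        ("lane_y", True), ("lane_y", True), ("lane_y", False), ("lane_y", True),
    ]

    def on_time_rate(records):
        return sum(ok for _, ok in records) / len(records)

    overall = on_time_rate(deliveries)
    lane_x = on_time_rate([r for r in deliveries if r[0] == "lane_x"])

    # If the gap is large, the mental model ("lane X is fine") needs revisiting.
    print(f"Network on-time rate: {overall:.0%}")  # 62%
    print(f"Lane X on-time rate: {lane_x:.0%}")    # 50%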

A common excuse is that “our data is terrible.”  This is no excuse.  There are ways to collect the data and cleanse it, to extract it for a singular purpose.  But you need to know what data you need.  And to do that, you need to know what you want it for.  Hence the question.

But this also requires that teams be able to challenge one another to create new mental models, and be allowed to do so.  As noted in my earlier blog, challenging one another in a team environment with new ideas is at the core of this capability…

So ask yourself:  Do I have the question right?

I had an opportunity to speak to a group of executives at Ohio State University’s Center for Operational Excellence, at the Fisher College of Business, on the subject of transparency and the cultural changes implicit in moving to a digital ecosystem.

In addressing some of the questions, one that came up is the fact that many companies are hesitant to act because of their inability to construct an effective business case for transparency.  “What is the ROI?” is a common complaint.  The key to establishing the business case lies not in market share, but in the velocity of material and decision-making: taking action in the face of uncertainty, using the most up-to-date information.

For legal teams, privacy and security are generally of paramount concern.  There is no arguing against this point of view, as it is an extension of the risk management that every legal counsel is going to push for.  To address this issue, care must be taken to establish the right governance structure.  This structure should be guided by the Law of Transparency: “People need access to information that ensures they can react and improve supply chain outcomes.”  It should also be guided by the right “firewalls” between individuals, ensuring that legal connectivity policies are not violated.

Another concern has to do with releasing “validated” data.  For example, in a meeting I had with pharmaceutical executives earlier in the week, there was a discussion of whether GMP data (data from manufacturing and material handling that is validated using ‘Good Manufacturing Practices’) can be shared with partners if it has not been fully vetted.  There are several responses to this argument.  First, the FDA and other agencies only require GMP data for specific reporting applications.  Second, demand forecasts and production/inventory data are not GMP-regulated, and thus are not subject to this validation argument; this is typically among the most important types of data we require.

CIOs are also concerned about data security.  But as we start down the road of transparency, much of the information involved is going to be new data, data that is not currently within the four walls of our organizations.  That can make the argument a lot easier, since we do not have to “violate the CIO’s security stack.”  Pulling internal data and hosting it on a cloud server where it can be shared is not going to expose the organization to cyber-security risks, since hackers will not have structural access to the data used inside the organization.  In this sense, creating a LIVING supply chain involves hosting only the data that is required to operate the supply chain and that is required to be shared with key supply chain partners.  It is also combined with external data, which is not considered secure to begin with.  These considerations make the LIVING supply chain easier to execute and can overcome concerns about data sharing leading to cyber-security risk.  As one executive I spoke with noted, “You can’t let IT and the CIO control your destiny.”  Supply chain executives have to take control of this factor.  CIOs will throw Oracle, SAP, and other systems jargon at you.  Acting to control your destiny means finding a way around these arguments to drive real-time visibility of the information required to operate your supply chain more efficiently.

A core element in the LIVING supply chain is the concept of the “Anti-Control Tower,” or what has been called a “data democracy.”  An important shift in human behavior is required to adapt to a world where individuals at the user level are given the authority to make decisions independently, while acting collaboratively in a virtual team environment.

I had a chance to hear New York Times writer and author Charles Duhigg speak in New York last week on how the best teams in the world are organized and governed in leading organizations worldwide.  He also spoke about how the best teams he studied operated in the face of massive uncertainty and new data.  His examples ranged from individual teams of comedians working on Saturday Night Live to massive global organizations like Google.  Duhigg started out by noting that individuals are exposed to cues, which result in behaviors that are either rewarded or punished.  Continuous rewards are what turn behaviors into habits: part of our brain recognizes that rewards are produced by behaviors, which is a good thing, and eventually those behaviors are formed into habits.  But because the world is changing so quickly, our brains sometimes cannot keep up with all the changes, and sometimes so many diverse elements hit us that our brain is unable to form habits.  So why is it that some people are able to adapt to changes more quickly, while others stumble in the face of constant new information?  Given the explosion of data we are exposed to, it is important that people are given accurate information, so that they can turn the data into knowledge and create good habits.

The question of course is how individuals can take information and make it more usable.  Why are some people good at using data, and others aren’t?  For instance, a dimension of IQ measures people’s ability to see patterns and shapes; in a similar vein, people differ in their ability to see and understand data in the context of a situation, interpret it, and act.  An important consideration here is that many people make decisions purely on intuition.  Intuition is a “gut feeling,” and while it cannot be completely discounted, data can challenge intuition.  A common theme is that a distinguishing capability in data interpretation under uncertainty is narrative storytelling.

When people are exposed to data, some individuals are very good at processing it in a manner that turns it into information.  This requires that they not just read the information, but interpret it, extract what is noteworthy or unusual, or spot a trend.  But the really important part involves taking the important elements of the data, noticing the patterns, and weaving a story around them that relates the information back to elements in the real world.  This connection is essential for data interpretation.

As we move into a connected, LIVING supply chain, the people who will be the most productive in this environment will be uniquely adept at building mental models around new and emerging data.  The essence of this involves an ability to tell ourselves a “story” around the data.  Very few managers teach their direct reports how to do this, and it is certainly not taught in most university analytics and supply chain classes.  But it is such an essential part of what is needed to manage in the digital economy.

In his book, “Smarter Faster Better,” Duhigg provides multiple examples of this capability.  For instance, he describes how experienced firefighters have an uncanny ESP that they use when fighting a fire: they effectively tell themselves a story about where the fire will be whenever they enter a room, and are thus prepared for unexpected eruptions.  This is a skill learned from entering many rooms with fires and paying attention to the cues.  Another example compares taking notes on a laptop with writing them by hand in the classroom.  Students who wrote manually had less information to study with after the class was over, but the effort of selecting and summarizing forced them to process the material, making it easier to convert into knowledge.

In a similar manner, teams operate better when they are asked to tell a story, and this dramatically increases productivity.  Duhigg also spoke about how intelligence is the ability to absorb information, but being smart is the ability to interpret it and apply it through storytelling.  And our experience often drives us to look for the familiar, within the context of what we already know.  For example, the tendency is for managers to hire people who are just like themselves.  This is often an error in judgment, because we want to hear from people who fit our own definition of what we think is smart.  I have biases, and may be wrong, yet I may not appreciate other points of view that in fact may be right.

Google spent over $15M on a study of over 15,000 teams to try to understand what made them successful.  In the end, they found no singular predictive elements of what made teams successful!  Finally, they decided to examine the culture of the teams, and their patterns and habits, to see if anything made a difference.  The research team noted that successful teams had one common element: psychological safety.  That is, the teams had an environment where anyone felt comfortable saying anything without consequences, ridicule, or belittlement.  In effect, these teams had an element of trust that allowed more creativity to occur through the engagement of all team members.

Further, two elements were found to be instrumental in creating productive teams.  First, it was critical to ensure that everyone spoke up.  Some team members will always sit back and not say anything, but successful teams force everyone to speak.  Google found that teams that actually monitored how many times each member spoke, using checklists, and ensured that everyone on the team spoke, had more successful outcomes.  A good team leader will pay attention to non-verbal cues and urge those who aren’t talking to engage.  The second component involves ensuring that other team members are listening and responding!  It is important to encourage people to disagree, and in fact to reward people for saying what they think, and then reward them for disagreeing!  Too often, people don’t speak up because they are afraid of appearing naïve, stupid, or uninformed, or of making a mistake.  Pulling people out of their comfort zone drives the creative process.  Not every creative idea will be a good one, but you will never get to a great idea if you don’t jumpstart the creative process.  And sometimes the feedback creates a new idea that is a winner.

Duhigg described the importance of this, as he observed how Lorne Michaels initiated the process of creating a show at “Saturday Night Live”.  Before every show, Michaels would bring together the group of comedians for a team meeting to “come up with jokes”, as well as the skits that would accompany them.  Comedians are a moody and often disruptive bunch, often at odds with one another and not always cooperative, so this was no easy task.  Michaels would force every single comedian to speak up over the course of the meeting.  In fact, he would use strategies to ensure this happened.  For example, if a comedian came up with a joke or idea that was “stupid”, he would perk up and say, “that’s a great idea, and it makes me think we could do such and such…”  On the other hand, if a comedian had a fantastic idea for a skit, he would react in a non-committal and silent manner.  In the end, the good idea would end up on the show, but during the meeting Michaels would act like it was no big deal. In effect, he was rewarding comedians for bad ideas, not just tolerating them!

This is an important concept that relates to how transparent supply chains need to create a culture of openness across the network, and virtual teams that can quickly work on emerging issues.  We want to be able to reward suppliers and distributors for transparency and openness, as well as for the “bad ideas” that are part of the creative innovation process.  By explicitly addressing the elements of team interaction that are the most stressful, and directly having the conversations people are most scared of, sharing the concern openly makes it less stressful, reduces the emotionality, and increases the ability of all parties to learn and address potential problems that may arise.  Duhigg mentions that the best hospitals hold post-mortem discussions that specifically cover what went wrong during an operation, as one of the most important elements of learning and improvement.  He also discussed the example of Alcoa, whose CEO (Paul O’Neill), instead of focusing on revenue and costs, focused on worker safety.  By measuring and focusing exclusively on safety, he drove habits on the part of workers that supported other components of productivity, which were essential to eventually improving revenue and reducing costs!  Alcoa became the darling of the Dow Jones as a result.

Duhigg emphasized the importance of proving to yourself who you are, and creating a sub-conscious self-image.  Telling yourself a story about the person you are, and believing that story through self-actualization, becomes important.  Tell yourself a story of how you want things to work around you, and then perform the actions to make it happen.  Nobody wants to disappoint others; nobody wants to be shamed.  By knowing yourself and understanding what works for you, you can address the issues that are preventing the story from being enacted.  And when this spirit is pervasive across an entire supply chain, a true culture of transparency can emerge.

Establishing basic rules for strong virtual teams that operate in an environment of extreme complexity and little time will be an important element for operating in the LIVING Supply Chain.

I had the opportunity to hear Ken Frazier, CEO of Merck, speak at the SC50 meeting.  Ken was an only child raised by a single father in Philadelphia, eventually making his way into Harvard Law.  Prior to becoming CEO, Ken was the general counsel for Merck, and was the lead representative in defending Merck against the 60,000 individual lawsuits associated with the Vioxx scare.  Ken spoke at length about many of his experiences during this time, not least of which was helping the company survive the barrage of lawsuits and settle them in an appropriate manner.  He also oversaw the downsizing of the company, involving layoffs of over a third of its workforce, a period he described as “one of the most difficult experiences I’ve ever been through”.

Ken also spoke about the problem of opioid sales, an $86B category that has led to so much addiction.  “This is more a reflection of a lack of hope in parts of the country where people have been left behind.  Fortunately, the CDC is beginning to limit prescription lengths and the number of people receiving them, and research is ongoing to develop other forms of pain medication that are not addictive.”

Merck has an incredibly complex supply chain, supplying 140 countries with over 200 products and 60,000 SKUs.  These include tablets, vials, injectables, implants, and inhalers, to name a few.  The company also operates cold chains under difficult circumstances, including the distribution of Ebola vaccines to small villages with no refrigeration in sub-Saharan Africa.  Ken noted that “our SCM team is incredible!  Two days after we had a cyber-attack and our computers went down, we didn’t miss a single shipment, due to the hard work of our SCM team.”  During the recent weather disasters, Merck has been on the front lines helping people: “We were bringing jets in to distribute pallets of water to people in Puerto Rico, for example.”

When asked about being the first person to resign from Trump’s Manufacturing Council earlier this year, Ken was very circumspect.  “In general, I don’t believe CEOs should be involved in politics.  But this situation was not about politics.  Charlottesville was an event that went against our values as a country, a country which reinforces tolerance.  Prior to resigning, I called all the Board members.  The US Government, after all, is our biggest customer, and this could cause some waves.  The Board unanimously supported me.  Although the President wanted to learn about our industry, I didn’t feel I could remain on the Council after he made those comments.  I felt like the first wildebeest running into a river full of alligators.  But once I did so, all of the others followed me, and the alligators were scared off by a herd of stampeding wildebeests!  A big part of driving change is to learn from our failures, as well as our successes.  We have to get people to stop fearing failure, and push them to learn.  Only then can we grow as a nation and as a community, and adapt to the changes going on around us.”
