In my new book “The Living Supply Chain,” co-written with Tom Linton, we reflect on the new capabilities required to work with the evolving imperative of real-time data. Two key concepts reflect the core elements of real-time supply chains. Velocity is the ability of an organization to flow working capital rapidly from suppliers through to end customers. Working capital is generally held in the form of inventory, an asset that doesn’t produce any revenue or cash. Thus, the object of the real-time supply chain is to achieve velocity in every aspect of how companies run their business. Real-time data in the supply chain supports managers who make a number of important decisions, including the following:

  • Tracking and monitoring inventory;
  • Determining the volume and mix of product to schedule for delivery;
  • Scheduling incoming material and communicating with suppliers;
  • Establishing the timing and modes of efficient, responsive transportation providers;
  • Planning and scheduling personnel in distribution and warehouse operations;
  • Establishing how to move product through global logistics systems;
  • Communicating instantaneously with personnel in all roles across the supply chain (suppliers, distributors, customers), who must make decisions related to unexpected events and disruptions that impact material flow.
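The first item above — tracking and monitoring inventory in real time — can be sketched in a few lines of code. The example below is purely illustrative: the `StockEvent` and `InventoryMonitor` names and the reorder-point logic are hypothetical, standing in for whatever event stream and replenishment rule a real system would use.

```python
from dataclasses import dataclass

@dataclass
class StockEvent:
    """A single inventory movement reported in real time."""
    sku: str
    quantity: int  # positive = receipt, negative = shipment

class InventoryMonitor:
    """Tracks on-hand inventory per SKU and flags items below a reorder point."""
    def __init__(self, reorder_point: int):
        self.reorder_point = reorder_point
        self.on_hand = {}

    def apply(self, event: StockEvent) -> bool:
        """Apply an event and return True if the SKU now needs replenishment."""
        self.on_hand[event.sku] = self.on_hand.get(event.sku, 0) + event.quantity
        return self.on_hand[event.sku] < self.reorder_point

monitor = InventoryMonitor(reorder_point=10)
monitor.apply(StockEvent("A-100", 25))           # receipt: 25 on hand, no alert
alert = monitor.apply(StockEvent("A-100", -20))  # shipment: 5 on hand -> alert
print(alert, monitor.on_hand["A-100"])  # True 5
```

The point of the sketch is that each event updates the decision-maker’s picture the moment it occurs, rather than waiting for a periodic reconciliation.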

The ability to respond to real-time data and derive supply chain performance improvements is closely linked to prior work in organizational information processing theory (OIPT). According to OIPT, an organization’s information processing capabilities must be aligned with its information needs. That is, an organization must be able to gather, interpret, synthesize, and coordinate information across the organization (Burns and Wholey, 1993). Processing information in such a structured and logical way reduces uncertainty and helps various decision makers develop a shared interpretation of the information (Daft and Lengel, 1986).

The opposite of visibility is opaqueness: the absence of insight into what is happening in an organization’s upstream and downstream networks. When individuals have visibility into events that enables decision-making velocity, minor problems and disruptions are resolved more quickly and easily, before they escalate into bigger problems (Craighead et al., 2007). Examples of visibility include demand visibility, market visibility, and supply visibility. Speed of decision-making increases not just the flow of information, but also the flow of materials, shipments, production, and all activities in the chain. A useful metaphor is reducing friction, which increases flow; friction here includes all of the typical delays and problems that slow material flows and increase inventory. Examples include multiple layers of approval for purchase orders, delays in decisions when a forecast deviation occurs, or the lack of response when a major disruption shuts down shipments to customers. Friction can produce bottlenecks in production systems and shipments, delaying material and causing inventory to build up or shortages to occur. Examples of such major disruptions include the Tianjin explosion, the tsunami in Japan, and the port closure in Los Angeles.
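The friction-and-flow relationship can be made concrete with Little’s Law, a standard queueing result: average inventory in a stable pipeline equals throughput multiplied by flow time. The numbers below are made up for illustration, but the arithmetic shows why every extra day of approval delay directly traps more working capital as inventory.

```python
def pipeline_inventory(throughput_per_day: float, flow_time_days: float) -> float:
    """Little's Law: average inventory in a stable pipeline equals
    throughput multiplied by the time material spends in the pipeline."""
    return throughput_per_day * flow_time_days

# Hypothetical figures: 1,000 units/day moving through a 5-day pipeline
base = pipeline_inventory(1000, 5)               # 5,000 units in the pipe
# Two extra days of approval delays ("friction") lengthen the flow time
with_friction = pipeline_inventory(1000, 5 + 2)  # 7,000 units in the pipe
print(with_friction - base)  # friction traps 2,000 more units as inventory
```

Reducing friction — removing those two days of delay — releases that inventory, which is exactly the velocity effect described above.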

These principles are not new. Many of the concepts behind “lean production systems” emphasize flow and visibility. For instance, some professionals maintain that demand-information sharing and visibility enable improved supply chain responsiveness, alerting executives to opportunities and challenges in the extended supply chain. But the emergence of real-time information that enables instantaneous visibility of assets across multiple tiers of supply chains has only been realized in the last two years. Real-time data is enabled by the rise of cloud computing and mobile devices, which create “big data” technology platforms that process higher volumes of internal and external data from multiple sources.

Some organizations have invested in very expensive systems called “control towers” to manage their “big” data. In a control tower, information from all of an organization’s logistics systems, production facilities, inbound shipments, outbound shipments, and inventory levels is dumped into a massive data warehouse (Brooks, 2014). The information is then centralized in the control tower, where individuals scan what is going on and senior executives render decisions, sometimes using complicated algorithms and automated ordering systems. The fundamental assumption behind control towers is that senior executives removed from the day-to-day have the best knowledge of how to optimize the entire supply chain, because they are the only ones with access to all of the data. Much of this data is “integrated” (i.e., lumped together) from ERP systems, transportation management systems (TMSs), warehouse management systems (WMSs), distribution requirements planning systems (DRPs), and material requirements planning systems (MRPs). Because many of these systems run in “batch mode,” meaning they are updated on a weekly or perhaps daily basis, the information viewed in the control tower is always lagging. As a result, decision-makers in the control tower make decisions based on what happened a few days ago, and determine what to do next based on what they think will happen next. This scenario embodies the “old” themes of “supply chain integration”: batch processing, periodic information updates, “control-tower” thinking where only some people see the information, and decisions made by the “top brass.”
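The batch-mode lag described above is easy to demonstrate. In this illustrative sketch (the events, timestamps, and nightly-load schedule are all hypothetical), a real-time view reflects every inventory movement up to the present moment, while a batch-fed control tower only sees events recorded before the last warehouse load:

```python
from datetime import datetime

# Hypothetical inventory movements: (timestamp, quantity change)
events = [
    (datetime(2017, 3, 1, 9, 0), +500),   # receipt Wednesday morning
    (datetime(2017, 3, 1, 15, 0), -300),  # large shipment that afternoon
    (datetime(2017, 3, 2, 8, 0), -150),   # another shipment Thursday morning
]

def visible_inventory(events, now, last_batch_load=None):
    """Inventory a decision-maker can see at `now`.
    A real-time view (last_batch_load=None) reflects every event up to `now`;
    a batch-fed view reflects only events before the last warehouse load."""
    cutoff = now if last_batch_load is None else last_batch_load
    return sum(qty for ts, qty in events if ts <= cutoff)

now = datetime(2017, 3, 2, 12, 0)
real_time = visible_inventory(events, now)  # sees all three events: 50 units
control_tower = visible_inventory(
    events, now,
    last_batch_load=datetime(2017, 3, 2, 0, 0),  # nightly load at midnight
)  # misses Thursday's shipment: reports 200 units
```

At noon Thursday the control tower believes 200 units are on hand when only 50 remain — precisely the stale picture that makes its decisions reactive rather than real-time.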

This model, in my opinion, is backwards. Only when people on the ground can see the data, collaborate with one another, and resolve issues through virtual, mobile data reviews can problems be resolved in a timely manner, before they spin out of control. Our review of the research literature reveals a dearth of information describing how organizations are combining mobile computing and system-wide supply chain analytics to derive new capabilities. This phenomenon is so new (e.g., Flex installed its real-time Pulse Center in 2016) that current managerial ways of thinking about real-time data have clearly not caught up with the technological capabilities descending upon us!
