The Winding Road toward the “Autonomous” Supply Chain (Part 2)

[Figure: 3d-matrix]

Last week, I began this train of thought with “The Winding Road toward the ‘Autonomous’ Supply Chain (Part 1)”.  Now, as this weekend approaches, I conclude the piece and, I hope, spur some ideas of your own.

Detect, Diagnose, Decide with Speed, Precision & Advanced Analytics

Detection of incidental challenges (e.g., a shipment that is about to arrive late, a production shortfall, etc.) in your value network can be largely automated and can take place in near real-time.  Detection of systemic challenges is necessarily more gradual, since it rests on the metrics that matter to your business (customer service, days of supply, and so on), but it is the speed, and therefore the scope, now possible that drives most of the new value from detection.
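
As an illustration only, here is a minimal Python sketch of such a detection rule, assuming a hypothetical feed of shipment records with promised and projected arrival times (the field names and alert threshold are illustrative, not drawn from any particular system):

```python
# Hypothetical incidental-challenge detection: flag shipments projected to
# arrive later than promised by more than a tolerance threshold.
from datetime import datetime, timedelta

def flag_late_shipments(shipments, threshold=timedelta(hours=4)):
    """Return the shipments whose projected arrival exceeds the promise
    by more than the threshold."""
    return [
        s for s in shipments
        if s["projected_arrival"] - s["promised_arrival"] > threshold
    ]

shipments = [
    {"id": "SH-1001",
     "promised_arrival": datetime(2016, 5, 20, 8, 0),
     "projected_arrival": datetime(2016, 5, 20, 15, 30)},
    {"id": "SH-1002",
     "promised_arrival": datetime(2016, 5, 20, 9, 0),
     "projected_arrival": datetime(2016, 5, 20, 9, 45)},
]

for s in flag_late_shipments(shipments):
    print(f"ALERT: {s['id']} projected "
          f"{s['projected_arrival'] - s['promised_arrival']} late")
```

Wired to a live transactional feed instead of a static list, the same rule runs in near real-time.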

Diagnosing the causes of incidental problems is limited only by the organization and detail of your transactional data.  Diagnosing systemic challenges requires a hierarchy of metrics organized by cause and effect (such as the SCOR® model, or something similar).  Certainly, diagnosis can now happen with new speed, but it is the combination of speed and precision that makes a new level of knowledge and value possible through diagnosis.
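
To make the idea concrete, here is a minimal sketch of drilling down a cause-and-effect metric hierarchy; the hierarchy, metric names, values, and targets are all hypothetical and only loosely inspired by SCOR®:

```python
# Hypothetical cause-and-effect metric hierarchy: each off-target metric is
# explained by drilling into its off-target children.
hierarchy = {
    "perfect_order_rate": ["on_time_delivery", "fill_rate", "order_accuracy"],
    "on_time_delivery": ["carrier_reliability", "warehouse_cycle_time"],
}
actual = {"perfect_order_rate": 0.88, "on_time_delivery": 0.90,
          "fill_rate": 0.99, "order_accuracy": 0.99,
          "carrier_reliability": 0.82, "warehouse_cycle_time": 0.97}
target = {metric: 0.95 for metric in actual}

def diagnose(metric, depth=0):
    """Recursively report off-target metrics beneath an off-target parent."""
    if actual[metric] < target[metric]:
        print("  " * depth +
              f"{metric}: {actual[metric]:.2f} (target {target[metric]:.2f})")
        for child in hierarchy.get(metric, []):
            diagnose(child, depth + 1)

diagnose("perfect_order_rate")
```

Run against real metrics, this kind of traversal moves the diagnosis from the symptom (perfect order rate) toward a probable cause (carrier reliability) in seconds.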

With a clean, complete, synchronized data set and a proactive view of what is happening and why, you need to decide on the next best action while it is still relevant.  You must optimize your tradeoffs and perform scenario (“what-if”) and sensitivity analysis.
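
As a hedged, illustrative sketch (the cost model and every number are invented, not taken from any planning tool), scenario and sensitivity analysis can be as simple as re-evaluating one decision model under varying inputs:

```python
# Illustrative scenario ("what-if") analysis: evaluate a simple cost model
# under several demand scenarios, then test its sensitivity to capacity.
def total_cost(demand, capacity, unit_cost=4.0, expedite_cost=9.0):
    """Cost of meeting demand: regular production up to capacity,
    expedited supply for any shortfall."""
    regular = min(demand, capacity)
    shortfall = max(demand - capacity, 0)
    return regular * unit_cost + shortfall * expedite_cost

scenarios = {"low": 800, "base": 1000, "high": 1300}
capacity = 1100

for name, demand in scenarios.items():
    print(f"{name:>5} demand: cost = {total_cost(demand, capacity):,.0f}")

# Simple sensitivity: how does cost respond to capacity in the high scenario?
for cap in (900, 1000, 1100, 1200, 1300):
    print(f"capacity {cap}: cost = {total_cost(1300, cap):,.0f}")
```

Real decision models are far richer, but the pattern is the same: one model, many runs, and a clear view of which assumptions the answer is most sensitive to.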

Ideally, your advanced analytics will be on the same platform as your wrangled supra data set.  The Opalytics Cloud Platform (OCP) not only gives you state-of-the-art data wrangling, but also provides pre-built applications for forecasting, value network design and flow, inventory optimization, transportation routing and scheduling, clustering and more.  OCP also delivers a virtually unlimited ability to create your own apps for decision modeling, leveraging the latest and best algorithms and solver engines.

Speed in detection, speed and precision in diagnosis, and the culmination of speed, precision and advanced analytics in decision-making give you the power to transpose the performance of your value network to levels not previously possible (see Figure above).  Much of the entire Detect, Diagnose, Decide cycle and the prerequisite data synchronization can be, and will be, automated by industry leaders.  Just how “autonomous” those decisions become remains to be seen.

As yet another week slips into our past, I leave you with a thought from Ralph Waldo Emerson, “There is properly no history, only biography.”

Have a wonderful weekend and thank you, again, for stopping by.

The Winding Road toward the “Autonomous” Supply Chain (Part 1)

There is a lot of buzz about the “autonomous” supply chain these days.  The subject came up at a conference I recently attended, where one topic of discussion was the supply chain of 2030.  But, before we turn out the lights and lock the door to a fully automated, self-aware, supply chain decision machine, let’s take a moment and put this idea into some perspective.  I’ve heard the driverless vehicle used as an analogy for the autonomous supply chain.  However, orchestrating the value network, where goods, information and currency pulse between facilities and organizations along the path of least resistance, may prove to be considerably more complex than driving a vehicle.  Most sixteen-year-olds can successfully drive a car, but you may not want to entrust your global value network to them.

Before you can have an autonomous supply chain, you need to accelerate what I call the Detect, Diagnose, Decide cycle.  In fact, as you accelerate the cycle, you may learn just how much autonomy is possible and/or wise.

Detect, Diagnose, Decide

The work of managing the value network has always been to detect challenges and opportunities, diagnose the causes, and decide what to do next –

  1. Detect (and/or anticipate) market requirements and the challenges in meeting them
  2. Diagnose the causes of the challenges, both incidental and systemic
  3. Decide on the next best action, within the constraints of time and capital, in relevant time

The Detect, Diagnose, Decide cycle used to take a month.  Computing power, better software, and availability of data shortened it to a week.  Routine, narrowly defined, short-term changes are now addressed even more quickly under a steady state – and a lot of controlled automation is not only possible in such cases, but obligatory.  However, no business remains in a steady state, and changes from that state require critical decisions that can add or destroy significant value.

Data Is the Double-edged Sword

[Figure 1: digital-value-network-matrix]

The universe of data is exploding exponentially from networks of organizations, people and things.  Yet, many companies are choking on their own ERP data as they struggle to make decisions on incomplete, incorrect and disparate data.  So, while the need for the Detect, Diagnose, Decide cycle to keep pace grows ever more imperative, some organizations struggle to do anything but watch.  The winners will be those who can capitalize on the opportunities that the data explosion affords by making better decisions through advanced analytics (see Figure 1).  The time required just to collect, clean, and synchronize data for analysis remains the fundamental barrier to better detection, diagnosis and decisions in the value network.

A consolidated data store that can connect to source systems, and on which data can be programmatically “wrangled” into a supra data set, would be helpful in the extreme.  While this may seem like an almost insurmountable challenge, the capability exists today.  For example, the Opalytics Cloud Platform enables you to use Python to automatically validate, reconcile and synchronize data from various sources, forming the foundation of a better Detect, Diagnose, Decide cycle.
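
For flavor, here is a generic pandas sketch of that kind of validation step; it is not the Opalytics API, and the tables and rules are hypothetical:

```python
# Generic data-validation sketch (not the Opalytics API): check that every
# SKU referenced in an orders extract exists in the item master, and that
# quantities fall in a sensible range.
import pandas as pd

orders = pd.DataFrame({
    "order_id": ["O-1", "O-2", "O-3"],
    "sku": ["A100", "B200", "C300"],
    "qty": [10, 5, -2],
})
item_master = pd.DataFrame({
    "sku": ["A100", "B200"],
    "desc": ["Widget", "Gadget"],
})

# Referential check: orders whose SKU is missing from the item master.
orphans = orders[~orders["sku"].isin(item_master["sku"])]

# Range check: quantities must be positive.
bad_qty = orders[orders["qty"] <= 0]

for name, issues in [("unknown SKUs", orphans), ("non-positive qty", bad_qty)]:
    if not issues.empty:
        print(f"Data issue ({name}):\n{issues}\n")
```

Scripted checks like these, run automatically as data lands, are what turn a pile of extracts into a supra data set you can trust.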

Thanks for taking a moment to stop by.  As we enter this weekend, remember that life is short, so we should live it well.

I’ll be back next week with Part 2.

Make Analytics Useful, Meaningful and Actionable

Last week, I identified reasons for the organizational malady of failing to fully leverage analytics to make higher quality decisions in less time.  As promised, this week, I want to share a remedy.

For the analyst, I recommend the following:

  1. Put yourself in the shoes of the decision-maker.  Try to step back from the details of your analysis for a moment and ask yourself the questions he or she will ask.
  2. Engage your decision-maker in the process.  Gather their perspective as an input.  Don’t make any assumptions.  Ask lots of questions.  They probably know things that you don’t know about the question you are trying to answer.  Draw them out.  Schedule updates with the decision-maker, but keep them brief and focused on essentials.  Ask for their insight and guidance.  It may prove more valuable than you think.
  3. Take time to know, explore and communicate the “Why?” of your analysis – Why is the analysis important?  Why are the results the way they are?  To what factors are the results most sensitive and why?  Why are the results not 100% conclusive?  What are the risks and why do they exist?  What are the options? 
  4. Make sure you schedule time to explain your approach and the “Why?”  Your decision-maker needs to know beforehand that this is what you are planning to do.  You will need to put the “Why?” in the context of the goals and concerns of your decision-maker.
  5. Consider the possible incentives for your decision-maker to ignore your recommendations, and give him or her reasons to act on your recommendations that are also consistent with their own interests.
  6. “A picture is worth a thousand words.”  Make the analysis visual, even interactive, if possible.
  7. Consider delivering the results in Excel (leveraging Visual Basic, for example), not just in a PowerPoint presentation or a Word document.  In the hands of a skilled programmer and analyst, amazing analysis and pictures can be developed and displayed through Visual Basic and Excel.  Every executive already has a license for Excel, and this puts him or her face-to-face with the data (hopefully in graphical form as well as tabular).  You may be required to create a PowerPoint presentation, but keep it minimal and try to complement it with Excel or another tool that actually contains the data and the results of your analysis (see the sketch after this list).
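
If your toolchain is Python rather than Visual Basic, a similar deliverable can be produced programmatically.  Here is a hedged sketch, assuming pandas and the XlsxWriter engine are installed (the results table is illustrative):

```python
# Write analysis results to an Excel workbook with an embedded chart, so the
# decision-maker gets the data itself, not just a slide. Assumes pandas and
# the xlsxwriter package are installed; the numbers are illustrative.
import pandas as pd

results = pd.DataFrame({
    "scenario": ["low", "base", "high"],
    "total_cost": [3200, 4000, 6200],
})

with pd.ExcelWriter("analysis.xlsx", engine="xlsxwriter") as writer:
    results.to_excel(writer, sheet_name="Results", index=False)
    book, sheet = writer.book, writer.sheets["Results"]
    chart = book.add_chart({"type": "column"})
    chart.add_series({
        "categories": ["Results", 1, 0, len(results), 0],  # scenario labels
        "values":     ["Results", 1, 1, len(results), 1],  # cost column
        "name":       "Total cost by scenario",
    })
    sheet.insert_chart("D2", chart)
```

The decision-maker opens analysis.xlsx and sees both the chart and the underlying numbers, with no extra licenses required.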

Frustration with your decision-making audience will not help them, you, or the organization.  Meeting them where they are, by intelligently and carefully managing the “soft” side of analytics, will often determine whether you make a difference or contribute to a pile of wasted analytical effort.

Thanks again for stopping by.  I hope that these suggestions will improve the usefulness of your analysis.  As a final thought for the weekend, consider these words from Booker T. Washington, “There is no power on earth that can neutralize the influence of a high, pure, useful and simple life.” 

Have a wonderful weekend!

Why the Soft Side of Analytics Is So Hard to Manage

I’m borrowing both inspiration and content from two good friends and long-time supply chain professionals, Scott Sykes and Mike Okey.  They deserve the credit for the seminal thoughts.  Any blame for muddling the ideas or poorly articulating them is all mine.

If you are an analyst, operations researcher or quantitative consultant, you probably enjoy the “hard” side of analytics.  What we often struggle with as analysts is what you might call the “soft” side of analytics, which is always more challenging than the “hard” stuff.  Here are a few of the reasons why.

Many times, the problem is not insufficient data, defective data, inadequate data models, or even incompetent analysis.  Often, the reason that better decisions are not made in less time is that many companies of all sizes have some, if not many, managers and leaders who struggle to make decisions with facts and evidence . . . even when the evidence is spoon-fed to them.  One reason is that, regardless of functional or organizational orientation, some executives tend not to be analytically competent or even interested in analysis.  As a result, they tend to mistrust any and all data and analyses, regardless of source.

In other situations, organizations still discount robust analysis because the resulting implications require decisions that conflict or contrast with “tribal knowledge”, institutional customs, their previous decisions, or ideas that they or their management have stated for the record.  Keep in mind that, where it is analytically supportable, at least some of your analysis may need to affirm the audience’s current thinking and direction if you want them to listen to the parts that challenge it.

Understanding the context or the “Why?” of analysis is fundamental to benefiting from it.  However, there are times when the results of an analysis can be conflicting or ambiguous.  When the results of analysis don’t lead to a clear, unarguable conclusion, managers or executives without the patience to ask and understand “Why?” may assume that the data is bad or, more commonly, that the analyst is incompetent.

Perhaps the most difficult challenge an organization must overcome in order to raise the level of its analytical capability is the natural hubris of senior managers who believe that their organizational rank defines their level of unaided analytical insight.  Hopefully, as we grow older, we also grow wiser.  The wiser we are, the slower we are to conclude and the quicker we are to learn.  The same ought to be true as we progress up the ranks of our organization, but sometimes it isn’t.

So, if these are the reasons for the organizational malady of failing to fully leverage analytics to make higher quality decisions in less time, what is the remedy?

The remedy for this is the subject of next week’s post, so please “stay tuned”!

Thanks for having a read.  Whether you are an executive decision-maker, a manager, or an analyst, I hope these ideas have made you stop and think about how you can help your organization make higher quality decisions in less time.

A final thought comes from T.S. Eliot, “The only wisdom we can hope to acquire is the wisdom of humility—humility is endless.”

Have a wonderful weekend!

Guest Post: Big Data is Getting Bigger! Are Retail Companies Ready?

This month’s McKinsey Quarterly carries an article, “Are You Ready for the Era of Big Data?”.  In contrasting two competitors, it states the following:

“The [One] competitor had made massive investments in its ability to collect, integrate, and analyze data from each store and every sales unit and had used this ability to run myriad real-world experiments.  At the same time, it had linked this information to suppliers’ databases, making it possible to adjust prices in real time, to reorder hot-selling items automatically, and to shift items from store to store easily.  By constantly testing, bundling, synthesizing, and making information instantly available across the organization—from the store floor to the CFO’s office—the rival company had become a different, far nimbler type of business.”

This week’s guest post explores what retailers need to do in order to take advantage of “big data”.  The author is Özgür Yazlalı, one of my former colleagues and a gentleman who has years of experience helping retailers significantly improve decisions that directly impact financial results through data-driven analysis.  I greatly appreciate Özgür’s contribution and look forward to more in the future.

_______________

As an internal consultant at a Fortune 500 retail company, one of the interesting transitions I have witnessed has been the significant change in all dimensions of available data.  Day by day, not only more granular data but also more categories of data have become increasingly accessible to planning functions.

The retail store remains the supply chain’s closest link to the end customer – the consumer.  Leveraging this connection requires collecting tremendous amounts of very detailed transaction data.  Consider the following exchange:

– Customer:  “For this very random reason, I’ll rightfully return this product, though I don’t remember which store I bought it from.”
– Retailer:  “Oops, we cannot see how much we charged you for this product, so we’ll credit the ticket price back to you.”

The emergence of social electronic media has multiplied the complexity and volume of interaction between consumers and retail stores.  Smart phones, social networks, image processing, and cloud computing have become reality in our everyday lives, further illustrating the relentless and ubiquitous nature of Moore’s Law.

The “big data” phenomenon is a great opportunity for the manager who knows how to utilize it.  The possibilities of localized assortment planning with reliable store-product category forecasts, localized pricing, and demand planning with social networks and web-based trends are now very real.  The limitation lies neither in analytical capability nor in the availability of reliable data, but rather in legacy planning systems.

This phenomenon can also be a curse if all you know about data is limited to spreadsheet programs.  My humble observation is that there is now an ever-expanding gap between the rate of growth of the data and the spreadsheet capabilities of individual users.  Getting the big picture has become a difficult task.  Without a better way to address this gap, you risk being limited to analyzing specific events and/or drawing conclusions from a limited sample.

Given this increasing gap, the traditional way of building IT solutions based on business-requirements documents and restricted interaction is no longer viable.  Big data requires cross-functional, data-capable analytical teams that operate as intermediaries between business and IT organizations.  This is not a team simply assembled from ex-business and ex-IT folks, but one of data scientists, optimization experts, and experienced data and business analytics consultants who can unleash the capabilities of SQL and spreadsheets together.  Such teams not only facilitate the discussions between the two organizations but also enable a more interactive tool-design process with rapid innovation and prototyping.  This is critical because no one really knows a priori what assumptions and models will consistently work.

Most companies are organized as silos of functions, as are their data sets (though IT might store these data sets on the same server).  Thus, in addition to enhanced IT-business interaction, these analytical teams could work across business functions that do not routinely communicate, leveraging data as common ground.  For example, the inventory management teams that use sales and inventory data may not know about, or may choose to ignore, the trends in store traffic that consumer-insights teams usually track.  Yet, an integrated analysis of these two data sets could reveal that a decreasing sales trend in a well-inventoried store is being driven by poor assortment, even when there is sufficient traffic.
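
As an illustrative sketch (the columns and numbers are hypothetical), such an integrated analysis can be as simple as a join across the two silos:

```python
# Join the inventory team's sales data with the consumer-insights team's
# traffic data to separate assortment problems from traffic or availability
# problems. All columns and numbers are hypothetical.
import pandas as pd

sales = pd.DataFrame({
    "store": ["S1", "S2", "S3"],
    "weekly_sales": [52000, 31000, 48000],
    "in_stock_rate": [0.97, 0.96, 0.95],
})
traffic = pd.DataFrame({
    "store": ["S1", "S2", "S3"],
    "weekly_visits": [9000, 9500, 8800],
})

merged = sales.merge(traffic, on="store")
merged["sales_per_visit"] = merged["weekly_sales"] / merged["weekly_visits"]

# A well-stocked store with healthy traffic but weak sales-per-visit points
# to assortment, not availability or footfall.
suspect = merged[(merged["in_stock_rate"] > 0.9) &
                 (merged["sales_per_visit"] < merged["sales_per_visit"].median())]
print(suspect)
```

Neither data set alone tells this story; the insight only appears when the two silos are analyzed together.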

Big data presents significant challenges and opportunities to businesses.  Cloud-based, functionally local solutions may help individual business functions maintain their own small data warehouses, but a holistic approach demands greater IT involvement.  Unfortunately, IT organizations are not often seen as centers of innovation.  A team of skilled, experienced analysts with access to both the computational power owned by IT and the big data in which other business functions find themselves awash provides an effective means for overcoming the challenges and delivering on the opportunities of big data.  These analytical teams can be a part of the IT organization, but whatever organizational structure is employed, they must have powerful computing resources, access to all relevant data, and a voice with decision-makers.  [Editorial Note:  A hyper-performance, secure, cloud-based platform would be the natural vehicle for data of all types from all sources, where challenges and opportunities can be identified, diagnosed, and the next best action determined and directed.]

In summary, big data is now more accessible than ever, and companies must continuously explore new ways of using it to increase profitability and market share.  For retailers in particular, these opportunities are far greater than for any other industry because of both their proximity to consumers and the availability of structured data (social networks, CRM, POS, inventory, traffic, e-commerce).  Integrating all this valuable information into a predictive [Editor’s note:  and prescriptive] planning process is a difficult task requiring tighter linkage between the IT and business functions.  Analytical consulting teams with enhanced data capabilities who can facilitate and guide this interaction are now more important than ever in achieving this goal of increased profitability and market share.

____________________________

Thanks again for dropping by.  If you liked this week’s guest post, please rate the piece and feel free to leave a comment.  Until next week, remember the words of the American poet Harry Kemp, who said, “The poor man is not he who is without a cent, but he who is without a dream.”

Have a great weekend!
