The old aphorism about genius being 99 percent perspiration applies to many things in life and business—writing, research, parenting, invention and more.
One small example from the business logistics corner of the world is the design of a distribution network. Though distribution execs now have software to help them blast through the complexities and provide neatly packaged comparisons of multiple options, in the end, success depends on how well prepared you are going into the process.
For consumer packaged goods (CPG) companies in particular, getting the network right—achieving the right balance of inventory, material handling and transportation costs, while offering service levels that satisfy their retailer customers' unforgiving requirements—is crucial.
A number of experts who provide both software tools and consulting advice for customers involved in network design and implementation acknowledge that the bulk of the work comes up front, primarily in gathering and validating historical data and business forecasts that accurately reflect the business. Bruce Baring, director of strategic services for Peach State Integrated Technologies, says that anyone entering into a network analysis should expect to spend 60 to 80 percent of the time collecting, cleaning and validating data.
That can, of course, vary widely, depending on the project and on how much mastery the company has over its own data. While a typical network analysis project can take three to four months, others unfold much more rapidly. Bob Belshaw, chief operating officer of Insight, cites one such project with Motorola as an example. "The whole study took only six weeks," he says, which is about half the average time required. "It's really a case of whether the team has access to the data."
Only when the data have been crunched does the fun begin—when the modeling engines take over to develop a variety of options.
Be wise, optimize!
Given the difficulty of evaluating and optimizing an entire network—not to mention the time required for the project—it's not something most companies jump into without a compelling reason.
Baring lists several drivers. "A merger or an acquisition is always a good time to review the network," he says. "If you are acquiring manufacturing or distribution, your demand patterns may be changing, and that's a good time to look at your facilities."
A second reason, he says, is growth, particularly in specific geographic areas where growth may be exceeding historical norms.
"The other big indicator," he says, "is a significant change in sourcing."
Dan Sobbott, director of business development for Slim Technologies, a developer of optimization software with a number of retail and consumer goods customers, sees it much the same way. "There are maybe three big drivers that we see," he says. "Growth is one of them. It may be more stores or product launches or the product mix is changing. Planning those things through the distribution network is a motivating factor.
"Second, we see a fair amount of companies interested after a merger or acquisition. Strategic reasons drive companies together, but after that, they face the fundamental questions of how to bring two supply chains together.
"The third largest reason: cost cutting initiatives. They may have done some benchmarking and [have realized] they're sort of out of line with where they should be."
Belshaw, whose company's Sails network optimization engine is used by a number of major CPG players, argues that manufacturers should not necessarily wait for a precipitating event to look at their networks. "It's really something in all industries that needs to be looked at on a very frequent basis," he contends. He says that with the pace of business in most industries constantly accelerating, it makes sense to evaluate the network regularly to see if it needs fine tuning.
CPG companies add new products almost as quickly as their high-tech brethren, he says. "They are constantly introducing new products, taking older products off the market, adding whole new brands or divesting. Those have significant impacts on the network … on warehousing or transportation or inventory positioning. When it comes to the supply chain, there is so much ripple effect. When you do something in manufacturing, it has huge implications for distribution and transportation."
Numbers, then more numbers
Whether the review is prompted by a major change in business or represents a regularly occurring event, the first step is the hardest step—gathering and validating data.
That's somewhat easier than it used to be, thanks to the enormous gains in data gathering capabilities in most industries over the last decade or so.
Sobbott points to the retail industry as an example. "Retailers historically have focused less on distribution and more on merchandising," he says. "Now, with substantial information available from point-of-sale (POS) data, they have the data for fact-based planning. So supply chain initiatives are of greater and greater interest."
Sobbott, Belshaw and Baring all agree on the crucial need for good data. In fact, Sobbott calls data readiness "the greatest challenge in a network optimization study. You have to have the right data."
But what data?
"The first thing to do is to develop data profiles," says Baring, "with demand analysis, inbound and outbound transportation costs, and fixed and variable warehousing costs.You have to pull all this historical data. Then you have to figure out where the business is going—you want to plan for the future, not the past.How is business going to evolve? How will the supply chain look three or five years from now? That's a pretty extensive step."
Next comes the validation step—testing the data before moving forward. Essentially, the data fed into the optimization engine are compared to the actual historical events—and they should match.
"We build a bubble map based on demand in different parts of the country," Baring says. "That provides a good visual validation. You typically should see bubbles where Wal-Mart or Target DCs are located." But he cautions users to make sure they're collecting and using "ship-to" and not "bill-to" addresses. "You shouldn't see a big bubble around Bentonville, Ark.," he says.
Sobbott explains, "The first thing you try to do is produce a baseline model that replicates a historical model. You want to include all the costs, volumes and activities. That allows you to understand if you have data that is well defined and that is replicating supply chain activities accurately. The validation model is constrained: We're trying to make it act like the historical time period."
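Checking the baseline can be as simple as lining modeled cost buckets up against the actuals and flagging gaps. The buckets, dollar figures and 5 percent tolerance below are purely illustrative:

```python
# Constrained baseline run vs. historical actuals, by cost bucket (illustrative numbers).
historical = {"inbound_freight": 4.20e6, "outbound_freight": 9.70e6,
              "warehouse_fixed": 3.10e6, "warehouse_variable": 2.40e6}
baseline   = {"inbound_freight": 4.31e6, "outbound_freight": 9.55e6,
              "warehouse_fixed": 3.10e6, "warehouse_variable": 2.52e6}

TOLERANCE = 0.05  # flag buckets that deviate more than 5% from actuals

for bucket, actual in historical.items():
    gap = (baseline[bucket] - actual) / actual
    status = "OK" if abs(gap) <= TOLERANCE else "INVESTIGATE"
    print(f"{bucket:20s} {gap:+6.1%}  {status}")
```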
For all the power of the optimization engines, the enormous volume of data in even a modest supply chain must be aggregated in appropriate ways to keep the problem manageable.
"When you build the network optimization, there are only so many variables the software can handle efficiently," Baring says. "You may have demand to the three-digit ZIP code level, or aggregate by product types where you group together like products based on source, handling or similar cube-to-weight ratio. That simplifies the math in the optimization engine."
Once the validation is complete, it's time to take the constraints off the software and let it run.
"We unconstrain the model to do the optimization," Sobbott says. "We run a lot of scenarios." The options, he says, are almost unlimited, going from fine-tuning distribution using existing DCs, to closing some and opening others, to a complete green field analysis.
"We teach people to use a green field analysis," Belshaw says. "If you could put your distribution anywhere, where would that be?" The point, he says, is to see an optimal solution. "We never go there 100 percent," he says. But what it does is set some outlying goals for the potential of an efficient supply chain.
Scenarios can compare national versus regional distribution, making use of third parties, segregating some inventory such as slow movers, and more. "It's not uncommon to run dozens of scenarios," Sobbott says. "There may be a dozen runs on regional DC scenarios.
"There may be other scenarios," Sobbott says, "closing or opening DCs and actually changing assets.We may look at service where there's limited time to ship to the customer. Some may look at a regional DC strategy, so the average length of haul gets shorter.We look at overall costs.
"One thing we've started to see more with CPGs is looking at a centralized versus regional distribution strategy." The tradeoffs are obvious to any DC pro: Centralizing DCs means longer shipping distances but lower inventory costs.
Decentralizing and using more regional DCs, on the other hand, means faster customer service but higher inventory costs. What the analysis does is quantify those tradeoffs in dollars.
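A back-of-envelope version of that quantification uses the common square-root approximation, under which total safety stock scales roughly with the square root of the number of stocking points. Every figure below is illustrative:

```python
import math

# Compare total annual cost for 1, 3, and 6 DCs (all numbers illustrative).
annual_units = 10_000_000
single_dc_safety_stock_value = 8_000_000        # $ of inventory with one national DC
carrying_rate = 0.22                            # carrying cost, $/$ of inventory per year
freight_per_unit = {1: 1.90, 3: 1.55, 6: 1.35}  # avg outbound $/unit falls with more DCs
fixed_cost_per_dc = 1_200_000

for n, freight in freight_per_unit.items():
    # Square-root law: safety stock across n DCs ~ single-DC stock * sqrt(n).
    inventory_cost = single_dc_safety_stock_value * math.sqrt(n) * carrying_rate
    total = inventory_cost + freight * annual_units + fixed_cost_per_dc * n
    print(f"{n} DCs: ${total / 1e6:,.1f}M/yr")
```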
"You know what it will cost you," Sobbott says. "You will know optimally where to locate—where to put your DCs and how large they should be."
There are limitations, of course. The scenarios may not tell you much about real estate costs, or labor availability and rates, for instance. But they do provide solid information on where to concentrate your investigations after the optimization study is completed.
Belshaw says that in a typical study, the optimization engine will run 45 to 75 analyses. "We do lots of sensitivity runs," he says. The idea is to see how the proposed solutions would work if some of the forward-looking assumptions prove incorrect—for instance, if demand in a year or two exceeds projections used in the analysis. "You don't want to build a network that's so fragile that if business grows 12 percent, not 10 percent, you're in trouble."
Baring agrees. "Once we come up with the best solution, then we start to test sensitivities. If you said you'd be growing the market at a 20-percent pace but it only grows 5 percent—is it still the right strategy? Or what if the labor rate is 10 percent higher [than modeled]—is it still the best solution? That gives an indication of whether the solution is fairly robust."
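Mechanically, a sensitivity pass just re-costs the leading designs under perturbed assumptions and checks whether the winner changes. A self-contained sketch, with stand-in cost functions in place of full model runs:

```python
# Re-cost two candidate designs (2 vs. 3 DCs) under perturbed assumptions.
# The cost function below is an illustrative stand-in for a full optimization run.

def annual_cost(n_dcs, demand_growth, labor_rate):
    demand = 10_000_000 * (1 + demand_growth)          # units shipped per year
    freight = {2: 1.70, 3: 1.50}[n_dcs] * demand       # avg outbound $/unit
    labor = labor_rate * 2080 * 40 * n_dcs             # hourly rate x 2080 hrs x 40 FTEs per DC
    fixed = 1_200_000 * n_dcs
    return freight + labor + fixed

scenarios = [("as planned", 0.20, 18.00),
             ("growth only 5%", 0.05, 18.00),
             ("labor +10%", 0.20, 19.80)]

for label, growth, rate in scenarios:
    best = min((2, 3), key=lambda n: annual_cost(n, growth, rate))
    print(f"{label:16s} -> {best}-DC design wins")
```

If the recommended design flips under plausible perturbations, the solution is fragile in exactly the sense Belshaw describes.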
Analyzing the various scenarios and deciding how to proceed is the final step in this network analysis. "It is not always a case of the lowest cost," Sobbott says. "There are a number of key business factors and strategies [to consider]."
This is where business experience and an understanding of strategic objectives are crucial. The software shows options, but human intelligence drives implementation decisions.
"The key thing in any network optimization is trying to balance the cost and service relationship," Baring says. "Network optimization is not the be all and end all, but a strategic tool to support business decisions. The decisions have to make sense. You really have to bring an operational bias to the exercise." For instance, scenarios may point to abandoning some geography or reducing service levels to a key customer—options that may be efficient, but are just plain bad business. "The answer may come back not to serve an area, but no CPG is going to do that," Belshaw says.
The network analysis is just the start. The implementation phase is where the heavy spending and effort take place, where timing and investment decisions are enacted, and the network may be most at risk as facilities are opened, revamped or closed. But all that requires a map—and that's where the heavy lifting up front pays off.