Simulation software: A powerful way to improve organizational decision making
By Joe G. Herlihy
This article introduces a tool that can make an immediate, positive impact on the quality of quantitative analysis in an organization. Consider the following: Susan is a product manager at a Northern New England technology company who has been directed to forecast sales, revenues, and profitability for her product over the next three years. She uses past and current data to build a spreadsheet model and then schedules time with her manager to discuss the model and its underlying assumptions. Both managers realize that the range of possible outcomes cannot be shown in one simple model and decide to create three scenarios to present to senior management – "best case," "mid-case," and "worst case."

The task of developing these scenarios is frustrating. Susan's first model contains over 50 different variables, covering everything from the initial sales price to the projected inflation rate in year three. Adding three scenarios increases this to over 150 variables – assuming there is only ONE version of each scenario. Susan probably has at least two versions of each scenario and is now managing almost 500 variables.

Susan's final presentation starts off well, with senior management patiently listening as she presents the scenarios. However, the gloves come off as soon as she finishes: "Why did you pick that growth rate?" "How does that growth rate impact our manufacturing capacity and resultant gross margins?" "Why did you use that growth rate for your 'mid-case' scenario? It seems more like a 'worst-case' scenario." "How did you reach that final conclusion on sales volume?" "What are the key risks driving these scenarios?" And on, and on...

Susan deals with the questions as best she can, but is frustrated that her presentation has left the audience more confused than before about the key business drivers. No one can agree on which variables are most important – one senior manager becomes fixated on the wage increase in year three, another focuses on the sales volume growth rate, and the head of operations goes wild over gross-margin expectations. The meeting ends with great confusion and indecision about which scenario is most likely and – more importantly – what actions must be taken to manage the outcomes.

What happened? Susan put a great deal of effort into building these scenarios to represent her best estimate of the future and made a compelling presentation to accurately capture her insights. What could she have done differently to make the discussion more constructive and insightful for senior management?

Unfortunately, this situation is probably much closer to the norm than anyone would like. Every organization I have encountered struggles to bring clarity and focus to its forecasting discussions, and in most cases is unaware of the inefficiencies and problems in its current process. Senior managers might feel pleased that they called attention to glaring errors in Susan's model, or delude themselves into believing they made the best decision for the company. The risk, however, is that they focused on the wrong issues and unknowingly made bad decisions that could damage the company.

There are three key issues in this scenario that could hurt the organization.

The first issue is "variable overload." Susan's responsibility as product manager is to fundamentally understand the key model variables and how their interaction impacts her business.
However, the rapid escalation in the number of variables created by the different scenarios has reduced her capacity in this area and forced her to spend time inefficiently trying to remember and track changes across the different scenario files on her computer. She has little time to fully investigate and understand the underlying forces that drive the variables themselves. Susan has moved from being a product manager to being a database administrator – an unintended outcome of this method of analysis.

The second issue is senior management's "red herring focus" on the final numbers – and its failure to focus appropriately on the underlying variables. If management is not happy with the numbers, it will send Susan away to redo the scenarios and business plan until she produces satisfactory numbers. The result is a lost opportunity for everyone to share a deep understanding of the key model drivers and their impact on the business.

The third and final issue is the confusing and potentially misleading discussions – and resulting actions – created by the first two issues. Given the "variable overload" and "red herring focus" of the traditional approach, it is doubtful that Susan's presentation will convey an accurate story of the forces driving the business. The risk is that management may come to erroneous conclusions about the business and make poor tactical or strategic decisions, leading to an inefficient and wasteful application of valuable resources.

Does this scenario sound familiar? Most companies I work with struggle to effectively capture and communicate the full range of risks and opportunities present in any analysis. Whether the question is the projected financials of a start-up company, the expected market penetration of a new product, or the cost of a new building, every organization must manage uncertainty and its attendant risk. The more effectively organizations understand and communicate the uncertainty in their businesses, the more likely they are to focus their time and resources on the variables that have the greatest impact and are capable of being managed.

But how should an organization manage these variables? The answer – in a broad sense – depends on the organization. Every organization differs in how it manages internal decisions and analysis, and no one method fits all circumstances. However, there are some consistencies across organizations that give us a place to start. One is that most organizations depend on spreadsheet programs such as Microsoft Excel or Lotus 1-2-3 for the vast majority of their internal quantitative analysis. Even large organizations that depend on ERP systems, such as SAP, download data from these systems into spreadsheets for discrete analysis. Additionally, the power of PCs and spreadsheets is vastly under-utilized, due largely to limitations imposed by users' skills or their lack of awareness of the power available to them.

However, let me introduce a set of tools I have used with great success to remedy these limitations and immediately improve the way organizations analyze and communicate uncertainty and risk. I refer to these tools as "simulation software." The simulation software packages discussed here are add-on modules to popular spreadsheet programs such as Excel and Lotus 1-2-3. They require minimal training and little to no change to the models your organization has already built. The impact, however, is substantial, and I believe it will immediately change the nature of how analysis is conducted in organizations.
Simulation software allows the spreadsheet to automatically run a large number of scenarios based on a set of parameters the user enters for each variable. For example, Susan may not know the product's raw material cost with any certainty, given fluctuating market conditions. However, she may be very comfortable describing the range of possibilities with something like a normal distribution fit between a minimum and a maximum point. The simulation software allows her to define this range and any other characteristics of the variable she feels must be considered. She then identifies an output cell that the software evaluates while running the scenarios. When activated, the software automatically runs a specified number of scenarios, randomly picking variable values from within the parameters defined by the user. It then returns a clean graph and statistics showing the range of possible values for the output cell and the likelihood of each. It also ranks the impact of each variable on the output cell – giving the user valuable information about which variables have the most significant influence on the model. (A rough sketch of these mechanics appears at the end of this article.)

Referring back to Susan and her situation, we see that simulation software can have a powerful impact on our three issues.

First, by using one simulation model, Susan can focus on managing the fifty variables in ONE model and use her time to gain a deeper understanding of each one for later discussions and presentations. Susan is free to be a product manager instead of the database administrator of an unmanageable number of variables spread across numerous scenario files.

Second, by using this tool, Susan is able to focus management's attention on the most important variables within the model. The software's sensitivity analysis function will show the impact of every variable – allowing her to keep the conversation on the most important areas – which leads to the final issue.

Third, as a result of Susan's ability to highlight the key variables and present the full range of potential outcomes, the quality of discussion and debate will be far more constructive. Using this tool, Susan can focus management's attention on the key variables and let the simulation deliver the final results. As long as there is general consensus on the assumptions used for each variable, there is really no reason to argue about the final number. Susan's managers can now have a constructive conversation about how best to manage the key variables that drive the model – rather than wasting valuable time on variables that have minimal impact on the final result.

This has been a brief introduction to a set of tools that can make an immediate and powerful impact on how organizations conduct their internal analysis. Managers at all levels can use these tools to increase their efficiency and effectiveness when conducting analysis, focus discussion and debate on the key drivers, and generally improve their organization's ability to make critical tactical and strategic decisions under uncertainty.

There are several simulation packages available, and most offer a demo for download and evaluation. Two such programs are Crystal Ball (www.decisioneering.com) and @Risk (www.palisade.com). While there is a cost for the software (from $700 to $1,500, depending on the additional modules purchased), I believe it is very reasonable given the time it can save managers and the powerful impact it will have on the quality of decisions made within organizations.
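For readers curious about the mechanics, the sketch below shows – in very rough, simplified terms – what a Monte Carlo spreadsheet add-in is doing behind the scenes. It is not the actual implementation of Crystal Ball or @Risk, and it is written in Python purely for illustration: the profit formula, the input distributions, and all of the numbers are hypothetical, and the sensitivity ranking uses a simple rank correlation as a stand-in for the add-ins' built-in sensitivity charts.

import numpy as np

rng = np.random.default_rng(seed=0)
N = 10_000  # number of simulated scenarios ("trials")

# Hypothetical uncertain inputs. Each is described by a distribution the
# analyst is comfortable defending; names and ranges are illustrative only.
unit_price   = rng.triangular(90, 100, 115, N)          # selling price per unit ($)
raw_material = np.clip(rng.normal(40, 5, N), 30, 55)    # normal, truncated to a min/max
volume       = rng.uniform(8_000, 12_000, N)            # units sold
fixed_cost   = rng.triangular(250_000, 300_000, 400_000, N)

# The "output cell": a toy profit formula standing in for the spreadsheet logic.
profit = volume * (unit_price - raw_material) - fixed_cost

# The range of possible outcomes and how likely they are.
print(f"mean profit:     {profit.mean():>12,.0f}")
print(f"5th percentile:  {np.percentile(profit, 5):>12,.0f}")
print(f"95th percentile: {np.percentile(profit, 95):>12,.0f}")
print(f"P(profit < 0):   {(profit < 0).mean():>12.1%}")

# A crude sensitivity ranking: rank-correlate each input with the output,
# similar in spirit to the sensitivity/tornado charts the add-ins produce.
def ranks(x):
    return np.argsort(np.argsort(x))

inputs = {"unit_price": unit_price, "raw_material": raw_material,
          "volume": volume, "fixed_cost": fixed_cost}
scores = {name: np.corrcoef(ranks(vals), ranks(profit))[0, 1]
          for name, vals in inputs.items()}
for name, r in sorted(scores.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name:>13}: rank correlation with profit = {r:+.2f}")

Running a few thousand trials like this is what lets an analyst present a full distribution of outcomes – and a ranking of which inputs matter most – instead of three hand-built scenarios.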
Joe G. Herlihy is president of Clarity Consulting Group, a strategy and management-consulting firm headquartered in Portland, Maine. He may be reached at joe@claritycg.com. Joe Herlihy works with clients to implement their business strategy by identifying and applying the appropriate quantitative and qualitative tools that bring clarity and focus to key business decisions. A graduate of the United States Naval Academy and Duke University's Fuqua School of Business (MBA), Joe currently resides in Portland, Maine, where he is an avid kayaker and a participant in numerous local organizations, including Big Brothers/Big Sisters.

Joe Herlihy provided permission to re-publish and store this article at DSSResources.COM on Tuesday, April 23, 2002. A version of this paper appeared at www.interfaceNOW.com. This article was posted at DSSResources.COM on June 2, 2002. All information copyright ©2002 by Joe Herlihy, all rights reserved.