You are currently browsing the monthly archive for February 2013.
I’ve been using electronic spreadsheets for more than 30 years. I consider this technology among the 20th century’s top five most important advances in business management. Spreadsheets have revolutionized every aspect of running any organization. A spreadsheet (specifically, VisiCalc) was the original “killer app” that made business people feel the necessity to buy a personal computer.
Yet, as enthusiastic as I am about spreadsheets, I know they have their limits. If you fail to respect those limits, you wind up paying for it. The price can be obvious, as it was for JP Morgan when faulty spreadsheets used by its trading desk cost it billions. Or it can be less obvious: multiple business mistakes that go unnoticed and uncounted, hours lost trying to make spreadsheets do things they were never designed for, and valuable opportunities missed because organizations run out of time dealing with the basic issues spreadsheets impose.
The essential problem is the disconnect between what spreadsheets were originally designed to do and how they are actually being used in corporations. Spreadsheets were designed to be a personal productivity tool, good at prototyping models and analytics used in processes, performing one-off analyses using simple models and storing small amounts of data. But many people run into issues using spreadsheets for collaborative, repetitive, enterprise-wide tasks. More than half (58%) of participants in our research said that the most important spreadsheet they work with is used in just these sorts of efforts.
One of the most serious, longstanding, but mostly overlooked issues is that desktop spreadsheets are error-prone. One-third of our research participants say that errors in data occur in the spreadsheets they use for their most important process, and a quarter say there are errors in formulas. Errors in spreadsheets are common, and may remain even when they are checked and, indeed, audited. The consequences of these errors can be catastrophic. Yet few of our participants (even the most experienced and proficient users) said they do rigorous checking to ensure accuracy of the data; half do checking only when something doesn’t look right, and when they do check they look only at selected cells. This likely is another reason why most people see no impact on their productivity from problems in spreadsheets: They simply don’t spend a lot of time checking for errors. It also may explain why mistakes have a limited impact on businesses – even the selective eyeballing method of checking for accuracy is likely to spot many of the blatant mistakes that would have a significant impact on business.
Productivity is one of the important reasons why spreadsheets became popular – but when used improperly, spreadsheets sap productivity. Spreadsheets are seductive because people are familiar with them. They are easy to set up to do even moderately complex models, perform analyses and create reports. However, after more than a few people become involved and a spreadsheet file is used and reused, cracks begin to appear. Very quickly, a large percentage of the time spent with the file is devoted to finding the source of errors and discrepancies and fixing the mistakes. Our research confirms this. When it comes to important spreadsheets that people use over and over again to collaborate with colleagues, on average people spend about 12 hours per month consolidating, modifying and correcting the spreadsheets. That’s about a day and a half per month – or about 5-10 percent of their time – just maintaining these spreadsheets.
It’s fairly common for multiple spreadsheets to circulate around an organization that are supposed to be the same but are different versions with different numbers in them. Long ago, this phenomenon got its own term: dueling spreadsheets. It happens because the data in a desktop spreadsheet is not bound to a single source that everyone is working from. Even when the data comes from a single source, it may be collected at different times, and formulas may be changed or rows added or deleted, so not everyone is on the same version. This is what happens when spreadsheets are used in repetitive, collaborative enterprise processes. The problem is more common in larger companies than small ones: 44 percent of companies with 10,000 or more employees – very large companies – say it happens frequently or all the time, about twice the rate reported by small or midsize businesses; for companies with between 1,000 and 10,000 employees the figure is about one-third. It’s also more commonly experienced in finance departments, which require absolute accuracy. Sometimes the differences between spreadsheets are not that important, or you can work around them. But time is wasted, as our research found: more than half of organizations (56%, counting those that say it is always or usually time-consuming) spend a significant amount of time combining spreadsheets.
But in many other cases it’s important to not waste time trying to resolve whose spreadsheet is right – because they all could be wrong.
People routinely communicate the results of their data gathering and analysis using spreadsheet charts and graphs as visual aids. Most (64%) research participants – and 80 percent of those who spend more than three-fourths of their time working with spreadsheets – find it easy or very easy to create graphs and charts in spreadsheets. Yet while it’s easy enough to create them, organizations find that these graphs and charts do not satisfy their needs for visualizing information. Only one-fourth said that these graphical elements are all they need to communicate information effectively in visual form; another one-fourth said that these graphs and charts are suitable for only some of their work. While one-third of those in small or midsize organizations said spreadsheet visuals are enough, just 18 percent from large and very large organizations said spreadsheets are a suitable visual communications tool. People working in larger companies likely have higher expectations for the quality of presentations and other such material and may need to convey more complex information. We conclude that the charting and graphing capabilities included in desktop spreadsheets (notably Microsoft Excel) are not adequate for communicating data and insights.
Today businesses have many more options than ever before to embrace and extend spreadsheets – to have the best of both worlds. Many new, affordable software solutions complement spreadsheets and address their shortcomings. Many applications use Excel as an interface, giving people a familiar look and feel while addressing key technological shortcomings of desktop spreadsheets that make them error-prone and difficult to consolidate or roll up. Spreadsheets are a leading reporting tool – critical for communicating results, forecasts, situations and risks. Yet even after all this time, when it comes to reporting, spreadsheets fall short: they can be time-consuming to update, they may contain unseen errors, and – even though it’s easy to create basic charts – they aren’t flexible or capable enough for even moderately sophisticated visual communications. Even with these drawbacks, people frequently choose to create reports themselves because the IT department takes too long to create and modify them: nearly half (46%) find this to be the case.
It’s important for senior executives to understand the limitations of spreadsheets and acknowledge their shortcomings. Spreadsheets are indispensable to any organization, but they must be used only for what they were designed to do. Don’t abuse them, and don’t allow them to saddle your organization with hidden costs, risks, and poor or untimely decisions.
Robert Kugel – SVP Research
Profit Velocity Solutions’ PV Accelerator is an analytic application designed to enable capital-intensive companies to consistently achieve substantially wider margins and higher return on assets (ROA). Companies in industries such as specialty chemicals, building materials, integrated steel mills and silicon chip fabrication (to name just four) routinely fail to make the right decisions about pricing, production and sales management because they use analytic methods that, from an economic perspective, present a distorted measure of profitability. Profit Velocity’s approach is to use profit contribution per unit of time as the core principle for driving decisions about production, pricing and CRM-related issues, including compensation, customer and account management.
Profitability is one of the key objectives of running a business. Profitability is a gauge of the competence of a company’s management and the soundness of its strategic direction. It’s important, therefore, to be able to accurately measure profitability and use this information to support routine business decisions. Because there are many ways that a company rolling in dough can be on a certain path to insolvency, modern accounting has developed ways to address the shortcomings of using a simple – but simplistic – cash-based approach to determining the profitability and health of a business. Over the years accounting science has seen a slow but steady progression of improvements in measuring true profitability, mostly through refinements in quantifying costs.
Accounting attempts to use better measures of the underlying economic reality to more faithfully represent the financial health of a corporation. As the Industrial Revolution altered the structure of business operations, the first formal cost accounting methods (now referred to as “traditional”) emerged more than a century ago to address the need to measure and analyze costs to reflect those changes. Management accounting, a newer approach to cost accounting designed to be forward-looking rather than historical, is geared to the needs of a business’s internal executives and managers rather than its outside shareholders and creditors.
An important refinement, marginal cost accounting, emerged in the late 1940s in Germany, where it is known as Grenzplankostenrechnung (GPK). GPK is designed to accurately measure the marginal cost of a good or service rather than the average or some statutory accounting-based measurement. Understanding marginal cost is critical in many pricing decisions. For instance, one hour before a flight’s departure, the marginal cost of an airline seat essentially is the cost of the fuel needed to carry the incremental passenger’s weight. Any revenue generated above that goes to the bottom line. By calculating a more accurate economic measure of profitability, GPK can enable companies to generate higher economic returns than traditional cost accounting and provides a more useful management approach to controlling costs. However, GPK gained few adherents in North America, where corporations stuck with traditional cost accounting. Some U.S. and Canadian companies began adopting activity-based costing (ABC) in the late 1980s as part of a response to their diminishing competitiveness, particularly in manufacturing. ABC attempts to measure all activities that drive costs rather than using direct labor cost as a proxy, and therefore, like GPK, provides a measure of profitability more closely aligned with real economic returns.
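The airline-seat illustration above comes down to a simple calculation: the contribution of an incremental sale is its price minus its marginal cost, not its average cost. A minimal sketch in Python, using entirely hypothetical figures (not real airline economics):

```python
# Marginal-cost view of the last-minute airline seat described above.
# All figures are hypothetical illustrations.
fuel_cost_for_passenger = 18.0   # marginal cost of carrying one more passenger
average_cost_per_seat = 210.0    # fully allocated ("traditional") cost per seat

def contribution(price, marginal_cost):
    """Profit contribution of one incremental sale at a given price."""
    return price - marginal_cost

# A $90 standby fare looks like a loss against average cost,
# but adds $72 of contribution measured against marginal cost.
price = 90.0
print(contribution(price, fuel_cost_for_passenger))  # 72.0
print(contribution(price, average_cost_per_seat))    # -120.0
```

The same sale is rejected under traditional average costing and accepted under a marginal-cost view such as GPK, which is the pricing distinction the paragraph describes.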
A more accurate economic measurement of cost is a key element to achieving better profitability management – but for many types of businesses it is insufficient.
Profit Velocity’s innovation is to take profitability measurement to a new dimension by calculating it on a per-unit-of-time basis (minute, hour or day, depending on what’s most relevant). For any type of costing methodology this is a superior approach to managing profitability in situations where productive capacity is a key, and relatively expensive, resource; that is, for asset-intensive businesses with a high opportunity cost, such as integrated steel, specialty chemicals, integrated circuit fabrication facilities and hospitals. To illustrate why it’s important to incorporate the time dimension, consider a company with two products that have the same per-unit profitability – that is, they use the identical direct labor and materials inputs – but product B requires twice as much processing time in the company’s facilities as product A. If the company only makes product A, it can generate twice the profit per year compared to producing just product B. For many reasons (such as limited market demand or long-term strategic considerations) selling only A is likely to be an infeasible solution. Yet this analysis illustrates that the company is better off emphasizing A in its selling efforts and giving it priority in its production plans.
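The two-product arithmetic above can be sketched in a few lines of Python. The figures are hypothetical, chosen only to mirror the scenario of identical per-unit margins but a 2x difference in processing time; this is an illustration of the principle, not Profit Velocity's actual calculation:

```python
# Two products with identical per-unit margin (materials + direct labor),
# but product B ties up the production line twice as long as product A.
margin_per_unit = 50.0                   # hypothetical profit per unit, both products
hours_per_unit = {"A": 1.0, "B": 2.0}    # processing time per unit

# Profit per hour of capacity -- the per-unit-of-time measure of profitability.
profit_per_hour = {p: margin_per_unit / h for p, h in hours_per_unit.items()}

# With 4,000 production hours in a year, an all-A plan earns twice as much.
annual_hours = 4000
annual_profit = {p: profit_per_hour[p] * annual_hours for p in profit_per_hour}
print(profit_per_hour)  # {'A': 50.0, 'B': 25.0}
print(annual_profit)    # {'A': 200000.0, 'B': 100000.0}
```

Per-unit costing rates A and B as equally profitable; only the time dimension reveals that capacity spent on A earns twice as much per year.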
In companies with even a moderately complex product lineup, a time-based approach to analyzing profitability can be a lens that leads to better insight into profit optimization opportunities. “Quick nickels are better than slow dollars” is an old discount retailer’s catchphrase that applies equally well to many industrial and consumer goods businesses. If, say, product A earns lower margins based on materials and labor cost than B but requires significantly less machine time, a manufacturer that emphasized product B on the grounds that it is a higher margin product would have lower returns than one that emphasized A.
To enable executives and managers to better understand each product’s real contribution to the bottom line, PV Accelerator presents a company’s offerings graphically along two axes: margin per unit (the vertical axis) and units per hour (the horizontal one). By looking at where each product sits in this array, it’s easy to identify the products in the upper right quadrant that have the best combination of unit profitability and throughput – the ones with maximum “profit velocity.” It’s also possible to see which ones fit into the other quadrants and use this information to frame sales and product strategies. Those products in the upper right quadrant with the greatest margin per hour are ones that should receive emphasis in sales and where the company must defend its market position. Conversely, companies should consider dropping slow-moving products with the lowest profit margins – the ones in the lower-left quadrant. Higher-than-average margin but lower throughput products need to be de-emphasized in sales and manufacturing decisions. Alternatively, a product’s design or its production process could be changed to increase its throughput to enhance its margin-per-unit-of-time value. Finally, high throughput but low margin products might be candidates for price increases or redesign to enhance their value to the bottom line.
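The quadrant logic described above can be sketched as a simple classification. Product names, figures and the use of portfolio medians as the dividing lines are all hypothetical assumptions for illustration; PV Accelerator itself presents this graphically:

```python
# Classify products into the four quadrants described above, splitting
# "high" from "low" at the portfolio median on each axis.
from statistics import median

products = {
    # name: (margin per unit, units per hour) -- hypothetical figures
    "P1": (40.0, 12.0),
    "P2": (45.0, 3.0),
    "P3": (10.0, 15.0),
    "P4": (8.0, 2.0),
}

m_cut = median(m for m, _ in products.values())
u_cut = median(u for _, u in products.values())

def quadrant(margin, units):
    hi_m, hi_u = margin >= m_cut, units >= u_cut
    if hi_m and hi_u:
        return "emphasize in sales, defend position"  # upper right
    if hi_m:
        return "raise throughput or de-emphasize"     # upper left: high margin, slow
    if hi_u:
        return "raise price or redesign"              # lower right: fast, low margin
    return "candidate to drop"                        # lower left

for name, (m, u) in products.items():
    # margin per hour (margin per unit * units per hour) and suggested action
    print(name, round(m * u, 1), quadrant(m, u))
```

Printing margin per hour alongside the quadrant shows why a "high-margin" product like P2 can still rank poorly once throughput is taken into account.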
A time-based profitability metric serves as a common denominator to align the objectives of product organizations, sales and marketing, and finance. It is especially useful in focusing attention on the often difficult issue of intelligently managing customer profitability, and can serve as a starting point for more effective pricing strategies and tactics.
For me, the most attractive aspect of PV Accelerator is its practicality as a business tool. For its target market, it’s relatively quick and easy to deploy, so it has a short time to value. The software is a cloud-based service, so up-front investment is limited. The company offers a free preview of its software that uses a simple data extract to demonstrate the opportunity to enhance returns. A full deployment can be completed in weeks.
PV Accelerator supports a straightforward approach to continuous improvement. As a planning tool it facilitates analysis of historical data to have a clearer picture of what’s driving profitability. It calculates the revenue and profit impact of any number of scenarios an organization might consider as it puts a plan in place. Subsequently that plan serves as a baseline that is used to measure actual-to-plan variances and pinpoint their underlying causes, enabling a deeper understanding of what opportunities or issues need to be addressed. Companies may start with a core set of users of the software and over time extend its use to additional areas of the enterprise. Used as a change management tool, it can enable a more intelligent approach to product, production and sales strategies as well as to making better tactical decisions in production planning and sales promotions.
Profit Velocity uses an indirect sales approach exclusively, which is the best fit for the software and how it’s used to support better management decision-making. Decades of experience show that it’s difficult to sell software whose value can be realized only through a change management effort. In this case, the decision to make fundamental changes to production, pricing and selling must start at the top of an organization, and the focus must first be on the people and process elements required to achieve results. Profit Velocity therefore has consultant partners that concentrate on implementing and sustaining advanced profitability management initiatives and offer their expertise and guidance on defining and executing a better management strategy. They resell Profit Velocity as the means to enable and support their clients’ new strategic direction. This division of labor is superior to one in which the consulting organization itself creates and maintains the software, because experience shows that clients get the best results when the software is built and maintained by an organization whose sole focus is the code.
Profit Velocity’s software gives a company a clearer picture of how it is making money so it can make better decisions more consistently. Business is never static, and Profit Velocity adapts continuously to changing conditions. Moreover, it does so without requiring a major investment in information technology or a laborious implementation process. Asset-intensive industries – and consultants that specialize in supporting these types of businesses – should get to know what PV Accelerator has to offer them.
Robert Kugel – SVP Research