One of the most important IT trends over the past decade has been the proliferation of ever wider and deeper sets of information sources that businesses use to collect, track and analyze data. While structured numerical data remains the most common category, organizations are also learning to exploit semistructured data (text, for example) as well as more complex data types such as voice and image files. They use these analytics increasingly in every aspect of their business – to assess financial performance, process quality, operational status, risk and even governance and compliance. Properly applied, business analytics can deliver significant value by deepening insight, supporting better decision-making and providing alerts when situations require attention from managers or executives.
Packaged analytic applications and specialized tools continue to expand in number and improve in functionality in response to specific needs. Businesses now have access to tools for creating tailored applications for vertical industries, which means analytics use is no longer limited to trained statisticians. Techniques for processing large data sets (big data) have proliferated to the point where employing insights from them for practical business purposes is now within reach, even for midsize companies. This includes the use of predictive analytics to enable earlier and more intelligent responses to changing business conditions. Web-based platforms for handling huge sets of data make it possible for companies to access and utilize these advanced analytics economically. Advances in mobile technology provide simpler access to analytics from smartphones and tablets and facilitate collaboration.
Analytics has long been a tool used by finance. Because accounting records are numerical and readily available, people in the finance function have been able to use forms of analytics for centuries. As a result, analytical techniques for assessing balance sheets, income statements and cash flow statements are well-developed and widely accepted. Unfortunately, because these techniques are so well-established, finance professionals have been slow to broaden their palette of analytics even as the opportunities available to them have proliferated. Our research shows that many organizations lag in their use of advanced finance analytics. Many of them view the role of finance analytics narrowly; as a whole, finance has largely failed to take advantage of advanced analytics to address the broader needs of today’s enterprises and thus increase its own value. Indeed, too few professionals even realize that these tools can help finance take more of a leadership role in their corporation.
Forward-looking finance departments can start to use cutting-edge analytic initiatives in areas that include customer profitability, price and profitability optimization, lean manufacturing, risk mitigation and economic costing methods, but their use requires an in-depth understanding of the options available and the information technology requirements for each. Those insights alone, though, will not make a difference; few finance organizations have done the evaluations necessary for selecting the right analytic methods and tools and using them properly. For example, our research in big data shows that fewer than half of organizations are using big data for activities that should be staples of finance organizations, such as contingency (“what if”) planning and predictive analytics.
I find that prospective buyers’ poor understanding of analytics-based best practices and functional requirements is a significant issue in most companies, as are deficiencies in their software and data environments. All of these hinder their ability to improve their control over business processes and make it more difficult to choose new technology that can deliver value.
Our research shows that finance organizations typically trail other lines of business in adopting technology. Today, advances such as in-memory processing, big data, predictive analytics and visual discovery offer the potential to revolutionize finance analytics. Finance organizations need to understand how to use advanced analytics to achieve better performance. Analytics also must be accessible anywhere and at any time to foster collaboration and promote agility – two management ingredients that can drive superior performance. Many analytics-related processes (such as planning and reviewing) are collaborative as well as iterative. Mobile analytic capabilities (provided by smartphones and tablets) that enable quick access and instant communication across an organization are not shiny new toys. They are important components of an IT infrastructure that supports better processes.
Robert Kugel – SVP Research
Banking giant JP Morgan raised eyebrows in 2012 when it revealed that it had lost a substantial amount of money because of poorly conceived trades it had made for its own account. The losses raised questions about the adequacy of its internal controls, and broader questions about the need for regulations to reduce systemic risk to the banking system. At the heart of the matter were the transactions made by “the London Whale,” the name given to JP Morgan’s trading operation in the City by its counterparties because of the outsized bets it was making. Until that point, JP Morgan’s Central Investment Office had been profitable and apparently well controlled. In the wake of the discovery of the large losses racked up by “the Whale,” JP Morgan launched an internal investigation into how it happened, and released the findings of the task force established to review the losses and their causes [PDF document].
One of the key points that came out of the internal investigation was the role of desktop spreadsheets in creating the mess. “The Model Review Group noted that the VaR [Value at Risk] computation was being done on spreadsheets using a manual process and it was therefore ‘error prone’ and ‘not easily scalable.’” The report also cited as an inherent operational issue the process of copying and pasting data into analytic spreadsheets “without sufficient quality control.” This form of data entry in any critical enterprise function is a hazard because the data sources themselves may not be controlled. After the fact it is impossible to positively identify the source of the data, and (unless specifically noted) its properties (such as time stamps of the source data) will also be indeterminate. These last two are sensitive issues when marking portfolios to market (that is, determining their value for periodic disclosure purposes), especially when there are thinly traded securities in that portfolio.
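The report’s concern about manual VaR spreadsheets is easier to appreciate with a concrete sketch. Below is a minimal, hypothetical historical-simulation VaR calculation; the return series and confidence level are invented for illustration, and real models use validated market-data feeds, far larger data sets and more careful percentile conventions than the simple rounding used here.

```python
# Minimal historical-simulation Value at Risk (VaR) sketch.
# The return series and confidence level are hypothetical.

def historical_var(returns, confidence=0.95):
    """Return the loss threshold exceeded in roughly (1 - confidence) of cases."""
    if not returns:
        raise ValueError("need at least one return observation")
    ordered = sorted(returns)  # worst losses first
    cutoff = int(round((1 - confidence) * len(ordered)))
    return -ordered[cutoff]    # report the loss as a positive number

# Ten hypothetical daily portfolio returns (fractions of portfolio value).
daily_returns = [0.012, -0.034, 0.008, -0.021, 0.015,
                 -0.045, 0.003, -0.010, 0.022, -0.007]

print(historical_var(daily_returns, confidence=0.90))  # 0.034
```

Even this toy version makes the report’s point: if the return series is pasted in by hand from an uncontrolled source, nothing in the calculation itself will reveal stale or mislabeled data.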
The report notes, “Spreadsheet-based calculations were conducted with insufficient controls and frequent formula and code changes were made.” In particular, in response to a recommendation by the Internal Audit group for providing greater clarity and documentation of how securities prices were arrived at, the individual responsible for implementing these changes altered a spreadsheet in a way that inadvertently introduced material calculation errors, which slipped through because the changes were not subject to a vetting process. Absent a thorough audit of a spreadsheet, these sorts of errors are difficult to spot.
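One widely reported error of this kind divided by the sum of a new and an old rate where their average was intended, which understates the result by roughly half. The toy reproduction below uses invented rate values to show how silently such a one-character formula change propagates:

```python
# Toy reproduction of a sum-versus-average formula error.
# The rate values are invented for illustration.

old_rate, new_rate = 0.020, 0.024

intended = (new_rate - old_rate) / ((old_rate + new_rate) / 2)  # divide by the average
erroneous = (new_rate - old_rate) / (old_rate + new_rate)       # divide by the sum

print(intended)   # ~0.1818
print(erroneous)  # ~0.0909, exactly half the intended value
```

In a spreadsheet, both versions produce plausible-looking numbers in the same cell, which is precisely why a vetting process, rather than visual inspection, is needed to catch the change.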
This was not the first time a financial institution had incurred serious losses because of faulty spreadsheets. One of the first big debacles took place more than 20 years ago when a faulty spreadsheet caused First Boston to take a multimillion-dollar hit trading collateralized debt obligations, which had only recently been invented.
Spreadsheet mistakes are quite common. Our recent spreadsheet benchmark research confirmed that errors in data and formulas are common in users’ most important spreadsheets. There’s even an association, The European Spreadsheet Risks Interest Group, that tracks the fallout from spreadsheet errors. Yet, even when millions of dollars or euros are at stake, the misuse of spreadsheets persists.
It’s a good thing that spreadsheet hazards are intangible, or they might have been banned or heavily regulated long ago in the United States by the Occupational Safety and Health Administration (OSHA) or similar bodies in other countries. All kidding aside, the London Whale incident raises the question, “Why do people persist in using desktop spreadsheets when they pose this magnitude of risk?”
Our research finds that the answer is ease of use. This is particularly true in situations like the one described by the JP Morgan task force. In the capital markets portion of the financial services industry it’s common for traders, analysts, strategists and risk managers to use desktop spreadsheets for analysis and modeling. Spreadsheets are handy for these purposes because of the fluid nature of the work these individuals do and their need to quickly translate ideas into quantifiable financial models. Often, these spreadsheets are used and maintained mainly by a single individual and undergo frequent modification to reflect changes in markets or strategies. For these reasons, more formal business intelligence tools have not been an attractive option for these users. It’s unlikely that these individuals could be persuaded to take the time to learn a new set of programming skills, and the alternative – having to communicate concepts and strategies to someone who can translate them into code – is a non-starter. Moreover, these tools can be more cumbersome to use for these purposes, especially for those who have worked for years translating their concepts into a two-dimensional grid.
Desktop spreadsheets have become a bad habit when they are used in situations where the risk of errors and their consequences are high. Increasingly, however, they are a habit that can be broken without too much discomfort. The task force recommended more controls over the spreadsheets used for portfolio valuation. One way of doing this is simply to add vetting and sign-off before a spreadsheet is used, controls to prevent unauthorized changes and periodic audits after that to confirm the soundness of the file. This classic approach, however, is less secure and more time-consuming than it needs to be. Organizations can and should use at least one of three approaches to achieve better control of the spreadsheets they use for important processes. First, tools available today can automate the process of inspecting even complex spreadsheets for suspicious formulas, broken links, cells that have a fixed value rather than a formula and other structural sources of errors. Second, server-based spreadsheets retain the familiar characteristics of desktop spreadsheets yet enable greater control over their data and formulas, especially when integrating external and internal data sources (say, using third-party feeds for securities pricing or parameters used in risk assessments). Third, multidimensional spreadsheets enable organizations to create libraries of formulas that can be easily vetted and controlled. When a formula needs updating, changing the source formula changes every instance in the file. Some applications can be linked to enterprise data sources to eliminate the risks of copy-and-paste data entry. Since they are multidimensional, it’s easy to save multiple risk scenarios to the same file for analysis.
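The first of those approaches, automated inspection, can be sketched without reference to any particular product. The example below is a simplified, hypothetical checker that scans cell contents (modeled here as a plain dictionary of addresses and values) for error literals and for hardcoded constants sitting in a column that otherwise contains formulas; commercial spreadsheet-audit tools parse actual workbook files and apply the same idea far more thoroughly.

```python
# Simplified sketch of automated spreadsheet inspection: flag error
# literals and constants hiding among formulas. Cells are modeled as a
# dict of {address: content}; real audit tools parse actual workbooks.

ERROR_LITERALS = {"#REF!", "#DIV/0!", "#VALUE!", "#NAME?", "#N/A"}

def audit_cells(cells):
    findings = []
    # Group cell contents by column letter(s).
    by_column = {}
    for address, content in cells.items():
        column = "".join(ch for ch in address if ch.isalpha())
        by_column.setdefault(column, []).append((address, content))
    for column, entries in by_column.items():
        formulas = [a for a, c in entries if str(c).startswith("=")]
        for address, content in entries:
            text = str(content)
            if text in ERROR_LITERALS:
                findings.append((address, "error literal"))
            elif formulas and not text.startswith("="):
                # A constant in a mostly-formula column is a classic
                # sign of a pasted-over or hardcoded value.
                findings.append((address, "constant among formulas"))
    return findings

sheet = {
    "B2": "=A2*1.05",
    "B3": "=A3*1.05",
    "B4": 1234.56,   # hardcoded value breaking the formula pattern
    "C2": "#REF!",   # broken reference
}
print(sorted(audit_cells(sheet)))
```

Even heuristics this crude would have surfaced the kinds of structural problems the task force described; the point is that such checks can run automatically on every revision rather than depending on someone noticing.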
Spreadsheets are a remarkable productivity tool, but they have limits that users must respect. Desktop spreadsheets are seductive because they are easy to set up. They are especially seductive in capital markets operations because they also are easy to modify and manipulate. However, these same qualities make it just as easy to build in errors with grave consequences that can be nearly impossible to spot.
A decade ago, there were few practical alternatives to desktop spreadsheets. Today, there are many, and therefore fewer good reasons not to find and use them. The issues uncovered by the “London Whale” episode are far from unique. Only when a disaster occurs and the fallout is made public do people see the consequences, but by then it’s too late. Executives, especially in risk management functions, must become more knowledgeable about spreadsheet alternatives so they can eliminate systemic risks in their internal operations.