Senior finance executives and finance organizations that want to improve their performance must recognize that technology is a key tool for doing high-quality work. To test this premise, imagine how smoothly your company would operate if all of its finance and administrative software and hardware were 25 years old. In almost all cases the company wouldn’t be able to compete at all or would be at a substantial disadvantage. Having the latest technology isn’t always necessary, but even though software doesn’t wear out in a physical sense, it has a useful life span, at the end of which it needs replacement. As an example, late in 2013 a major U.K. bank experienced two system-wide failures in rapid succession caused by its decades-old mainframe systems; these breakdowns followed a similarly costly failure in 2012. For years the cost and risk of replacing these legacy systems kept management from taking the plunge. What they didn’t consider were the cost and risk associated with keeping the existing systems going. Our new research agenda for the Office of Finance attempts to find a balance between the leading edge and the mainstream that will help businesses find practical solutions.
The example above suggests why it’s important for executives to understand how technology affects a finance organization’s ability to improve its overall effectiveness and strategic contribution to the rest of the company. Finance must use technology to support improvements to processes and enable its people to focus on work that provides the most value to the operating results of the company. For example, analytics can enhance understanding and visibility, giving executives the ability to make better decisions more consistently. The role of technology will become even more important for finance organizations over the next several years as the pace of change in business computing accelerates. This change will be driven by the cumulative impact of a decade’s worth of technology evolution and the increasing demographic shift from executives and managers of the baby boom generation to those who grew up with computer technology. These demographic shifts will drive demand for a new generation of software, one that emphasizes mobility and agility.
In this context it’s important that CFOs and financial executives monitor the evolution of technology and understand what’s possible. We recommend a rigorous annual technology audit for this purpose.
With this in mind, our research focus for the Office of Finance in 2014 will cover three main areas.
First, to address its basic requirements, we’ll look at the application of financial performance management (FPM) to help achieve consistently better results. Ventana Research defines FPM as the process of addressing the often overlapping issues that affect how well finance organizations support the activities and strategic objectives of their companies and manage their own operations. FPM deals with the full cycle of the finance department’s functions, including corporate and strategic finance, planning, forecasting, analysis, closing and reporting. It involves a combination of people, processes, information and technology. We see information technology as a particular focus of FPM because we find that most finance organizations are not using IT assets as fully as they could. In particular, they often focus only on efficiency and neglect opportunities to use IT to enhance their effectiveness, which can make a difference in their overall results.
Similarly, ERP systems are a core technology that supports finance operations. Finance executives must take into consideration more than just the difficulty and cost of implementing and modifying these systems. As the British bank's multiple major failures show, reluctance to make timely changes to ERP systems poses cost and risk issues of its own. Organizations that are reluctant to use cloud-based ERP also should re-examine their assumptions for not doing so. In particular, midsize companies that are transitioning from entry-level accounting software to something more sophisticated often will discover that a cloud-based deployment is a more sensible alternative to persisting with their existing software or buying an on-premises system. As well, larger companies and those that have geographically dispersed operations may find that cloud alternatives for their second-tier ERP systems offer better value than staying with on-premises systems.
To strengthen the core capabilities of the Office of Finance, there are several technologies that will be particularly important. One is in-memory computing. Because of its ability to rapidly process computation of even complex models with large data sets, in-memory computing can change the nature of planning, budgeting, forecasting and reviews. It enables organizations to run more simulations to understand trade-offs and the consequences of specific events, as well as change the focus of reviews from what-just-happened to what-do-we-do-next. For these reasons, in-memory computing also may encourage more companies to replace desktop spreadsheets (which have practical limits to the size, complexity and adaptability of the models that are created in them) with dedicated planning applications that can harness the power of in-memory computing. Our recent planning benchmark research reveals that a majority of midsize and larger companies continue to use spreadsheets for planning, forecasting and budgeting – and these companies continue to suffer the consequences, such as an inability to do effective contingency planning or drill down into underlying data to have true visibility into root causes of opportunities or issues.
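To make the simulation point concrete, here is a minimal sketch of the kind of workload involved; the one-line planning model, growth rate and cost ratio below are invented for illustration, not drawn from any vendor's product:

```python
import random
import statistics

def simulate_plan(base_revenue, growth_mean, growth_sd, cost_ratio,
                  trials=10_000, seed=42):
    """Monte Carlo simulation of a deliberately simplified planning model.

    Each trial draws an uncertain revenue growth rate and computes
    operating profit; the result summarizes the profit distribution.
    All parameter values are hypothetical.
    """
    rng = random.Random(seed)
    profits = []
    for _ in range(trials):
        growth = rng.gauss(growth_mean, growth_sd)   # uncertain growth rate
        revenue = base_revenue * (1 + growth)
        profits.append(revenue * (1 - cost_ratio))   # simple cost assumption
    profits.sort()
    return {
        "mean_profit": statistics.mean(profits),
        "p5": profits[int(trials * 0.05)],           # downside (5th percentile)
        "p95": profits[int(trials * 0.95)],          # upside (95th percentile)
    }

result = simulate_plan(base_revenue=100.0, growth_mean=0.05,
                       growth_sd=0.10, cost_ratio=0.80)
print(result)
```

An in-memory planning application runs a far larger version of this loop, across thousands of accounts and drivers, fast enough for interactive reviews; the point is that many trials turn a single-number budget into a distribution, which is what makes contingency planning possible.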
Predictive analytics is another technique that companies can utilize to achieve better results. It can be used to create more accurate or nuanced projections of future outcomes and is especially useful in quickly finding divergences from expectations to create more timely alerts. For instance, rather than having to wait until the end of a month to look at actual results and then initiate a course of action, early in the month a corporation could use predictive analytics to spot and address a probable revenue shortfall in a specific product line, or to change production rates or shipments to avoid a likely regional stock-out caused by demand that is stronger than expected.
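As a simplified illustration of the early-warning idea described above (the figures and tolerance threshold are hypothetical, and a real predictive model would use seasonality, pipeline data and statistical techniques rather than a straight run rate):

```python
def project_month_end(actuals_to_date, days_elapsed, days_in_month):
    """Project month-end revenue from a simple daily run rate.

    A stand-in for a real predictive model: linear extrapolation
    of actuals booked so far.
    """
    run_rate = actuals_to_date / days_elapsed
    return run_rate * days_in_month

def shortfall_alert(actuals_to_date, days_elapsed, days_in_month,
                    plan, tolerance=0.05):
    """Return True if the projection misses plan by more than the tolerance."""
    projected = project_month_end(actuals_to_date, days_elapsed, days_in_month)
    return projected < plan * (1 - tolerance)

# Ten days into the month, a product line has booked 250 against a plan of 900:
# projected month-end is 750, more than 5% below plan, so the alert fires early.
print(shortfall_alert(actuals_to_date=250, days_elapsed=10,
                      days_in_month=30, plan=900))  # → True
```

The value is in the timing: the alert fires on day 10, when there is still time to change pricing, production or shipments, rather than after the books close.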
The second area where technology will play an expanding role in business computing is enabling the finance organization to play a more active role in improving performance in the company’s operations. Finance has the necessary analytical talent as well as the ability to be a neutral party in cases where issues cross business unit or geographic boundaries. For example, software that helps manage pricing and profitability is spreading from hospitality, transportation, retailing and consumer financial services to other areas, especially business-to-business industries. Used properly, this type of software enables a company to tailor its control of individual decisions regarding pricing, discounts and other terms to achieve results that are best suited to its strategy. It can continuously make adjustments consistent with longer-term objectives in response to market conditions. Similarly, companies increasingly use expense management systems to gain greater control over aggregate spending and vendor selection. These sorts of systems can provide controllers and treasurers with greater forward visibility into future outlays, ensure volume discounts are utilized and honored, and help streamline the accounts payable process to earn early-pay discounts.
Another operational area, taxation, is one of a company’s biggest expenses, yet direct (income) tax management is still in its infancy. Here also, most organizations use desktop spreadsheets to manage their direct tax analytics and provisioning, a time-consuming process that fails to deliver transparency or the ability to manage tax risk exposure effectively. They would do better with technology more conducive to a strategic approach to managing income taxes. There is a growing list of software available to make the tax provisioning process faster and more visible, enabling companies to make better decisions about the timing and aggressiveness of their tax positions. In addition, all larger and even some midsize corporations can benefit from a dedicated tax data warehouse to support the automation of tax planning and provisioning.
The third area where technology can help senior executives achieve better results is in implementing fundamental changes in business management. For example, our 2013 benchmark research on long-range planning demonstrates that better management of technology and information can improve alignment between strategy and execution. As well, far from simply being a technology concern, cloud computing enables corporations to cut costs and gain access to more sophisticated technology than they could feasibly support in an on-premises deployment. Using the right technology can boost performance. The improper use of spreadsheets, as seen in our research, continues to be an unseen killer of corporate productivity because they have inherent defects that significantly reduce users’ efficiency in modeling, analysis and reporting. Increasingly companies have inexpensive options that are easier to use and enable them to do more advanced, reliable modeling, analysis and reporting. Finance organizations are usually involved in developing scorecards to assess performance. In developing and applying balanced scorecards, companies must incorporate operational risks into the mix of measures. Managing operational risk outside of financial services is still hit and miss, as our recent governance, risk and compliance benchmark research shows. Nonfinancial businesses rarely manage risk well, usually because they do not measure risk explicitly and therefore do not formally consider it as a trade-off in making decisions.
In the new year we will explore these issues through several approaches. Our benchmark research rigorously investigates how business and technology issues intersect. In 2014 our Office of Finance benchmark research will examine how well finance organizations utilize technology to address a growing need for greater efficiency and to play a more strategic role in the company’s management and operations. Also, as noted, developments in information technology are making it feasible to transform business planning to become more interconnected, collaborative, mobile and interactive. Our next-generation business planning benchmark research will investigate these factors as we assess the current state of business planning, the issues that prevent organizations from planning, forecasting and budgeting effectively, and the ways in which information technology affects their ability to plan.
From another perspective, Ventana Research Value Indexes are detailed evaluations of vendor software offerings in specific categories. We carefully assess an exhaustive list of product functionality and its suitability to task, product architecture and the effectiveness of vendor support for the buying process and customer assurance. Each Value Index represents the value a vendor offers and relevant aspects of its products and services for users. In 2014 we will once again assess financial performance management suites and introduce a new Value Index for Business Planning software to complement our benchmark research.
Overall, finance departments and lines of business still focus on improving the efficiency of the mechanics of day-to-day operations, and they often fail to use available technology to support more effective approaches. Executives in finance and business must look at their existing IT systems with an eye to making better use of them to automate repetitive tasks and speed execution of cross-departmental functions. Doing that can free up time and money better spent on activities that return value, such as more insightful and actionable analyses or more accurate forecasting and planning.
Information technology is an essential element of business management. Yet many senior executives and managers have too narrow and too limited an understanding of IT’s full potential, much as those managing corporate information technology usually don’t appreciate business issues and how IT can address them. The business/IT divide is a barrier that prevents many companies from achieving their performance potential. The divide is not necessary. Business executives need not be able to write Java code or master the intricacies of an ERP or sales compensation application. However, CEOs and executives should master the basics of IT just as they must understand the fundamentals of corporate finance, the production process and – at least at a high level – the technologies that support that process. My research agenda for 2014 continues to focus on the major issues that confront businesses where technology plays a key role in addressing those issues.
Robert Kugel – SVP Research
Integrated risk management (IRM) was a major theme at IBM’s recent Smarter Risk Management analyst summit in London. In the market context, IBM sees this topic as a means to differentiate its product and messaging from those of its competitors. IRM includes cloud-based offerings in operational risk analytics, IT risk analytics and financial crimes management designed for financial institutions and draws on component elements of software that IBM acquired over the past five years, notably from Algorithmics for risk-aware business decisions, OpenPages for compliance management, SPSS for sophisticated analytics, Cognos for reports, dashboards and scorecards, and Tivoli for managing all of this in a Web environment. Putting its software in the cloud enables IBM to streamline integration and maintenance, offer more flexible deployment and consumption options and potentially lower the total cost of ownership.
From a competitive standpoint, IRM is an attempt to change today’s highly fragmented financial services software market by emphasizing an integrated approach to managing risk and the often intertwined regulatory compliance. Although in concept this could apply to any risky, highly regulated business, the greatest payoff today is in financial services. IRM focuses the value proposition at a high level in the organization and shifts the objective from a narrow functional or business silo-based approach to a more strategic one. Beyond fully exploiting its applications portfolio, IBM is trying to capitalize on an important trend in global finance: the need to optimize the use of capital to achieve a higher risk-adjusted return on equity than one’s competitors. This has several implications:
- Deploying bank capital in areas that offer the best risk/return characteristics in a way that matches the organization’s strategy. Since business and financial market conditions are in constant flux, optimization must be an ongoing process and the systems that support it must be fast and efficient.
- Striving for fewer “unforced errors” in trading and lending and mitigating the impact when loss situations develop. According to IBM’s tracking of incidents, 45 percent of the biggest losses incurred by financial institutions in 2013 occurred at the boundary between credit and operational risks – an area that unintegrated risk management systems may not be able to track.
- Integrating regulatory compliance into the risk management environment. Our research shows that financial services companies are far more regulated than other businesses. Nearly eight in 10 participants from this industry sector described themselves as heavily regulated compared to 58 percent of government, education and nonprofits, 40 percent of services companies and just 19 percent of those in manufacturing according to our benchmark research on governance, risk and compliance. Today, because the economics of managing financial services business are shaped by a micromanaged regulatory structure, it’s increasingly valuable to incorporate compliance into risk management systems.
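The capital-deployment point in the first bullet can be made concrete with a toy allocation rule; the business lines, returns and risk-capital figures below are invented, and a real risk system would run a genuine optimization continuously as market conditions change:

```python
def allocate_capital(opportunities, capital):
    """Greedy allocation of risk capital by risk-adjusted return.

    Each opportunity is (name, expected_return, risk_capital_required).
    Rank by return per unit of risk capital and fund in order until
    capital runs out -- a deliberately simplified stand-in for the
    ongoing optimization described in the text.
    """
    ranked = sorted(opportunities, key=lambda o: o[1] / o[2], reverse=True)
    plan, remaining = [], capital
    for name, ret, risk_cap in ranked:
        if risk_cap <= remaining:
            plan.append(name)
            remaining -= risk_cap
    return plan, remaining

# Hypothetical business lines: (name, expected return, risk capital required)
lines = [
    ("trade finance", 12.0, 40.0),
    ("mortgages",      9.0, 60.0),
    ("prop trading",  20.0, 90.0),
]
plan, left = allocate_capital(lines, capital=100.0)
print(plan, left)  # → ['trade finance', 'mortgages'] 0.0
```

Even this toy version shows why the process must be fast and repeatable: the ranking changes whenever expected returns or risk weights move, so the allocation has to be recomputed as conditions change rather than set once a year.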
More effective risk management will be among the three top strategic objectives of all financial institutions in open financial market systems for the next decade. The ability of these organizations to balance risk and return across their entire asset portfolio in a way that matches their institutional strengths, minimizes avoidable losses and responds quickly to changing market conditions will be a critical determinant for long-term success. The importance of optimizing trade-offs between risk and return in structuring financial institution assets – in daily trading-desk decisions as well as longer-term strategic portfolio ones – reflects a fundamental change in the financial services environment. For the three decades leading up to the 2008 financial crisis, capital was relatively more abundant (in the sense that regulators permitted higher leverage), and in a relatively benign, liquidity-driven environment returns were high enough to compensate for mistakes. That has not been the case since the crisis. Rather, returns on capital have been constrained by a systematic deleveraging of financial institutions, increasing regulation and constraints on how these companies operate, especially in deposit-taking institutions. Given the severity of the crisis and its aftermath, it’s unlikely that this stringency will lessen soon.
To be sure, other strategic elements – such as having sufficient critical mass in one or more segments of the capital markets or retail brand equity – will continue to play roles in differentiating individual companies’ strategies. In many instances those may be more important than integrated risk management, but the latter will be a capability essential to ensuring the competitiveness of all banking, capital markets and insurance organizations for at least the next decade. As well, the full impact of this sea change has not yet taken effect. In the United States, for example, the fiendishly complex Dodd-Frank Act is still a work in progress. Some of the provisions of new regulations have altered the economics of business and rendered some seemingly plain-vanilla offerings unattractive or even unprofitable. New rules governing risk capital and liquidity (such as the Net Stable Funding Ratio) have yet to go into effect. It now appears that under the Volcker Rule bank executives may be responsible for attesting to their compliance environment. This means at the least that U.S.-regulated institutions must have sufficiently effective enterprise-wide compliance monitoring and reporting that goes a step beyond a plausible deniability standard. As well, over the past five years governments in many developed nations have been coddling the balance sheets of local financial institutions (directly or indirectly) to preserve and/or rebuild them. This period is coming to an end, and the pressure on senior executives to eke out even basis-point measures of performance will intensify.
Today, most financial services organizations achieve a unified view of risk and make determinations of how to deploy bank capital by cobbling together information from multiple systems. The process consumes a great deal of employee time, is slow and uses data that is not always trustworthy. While the process integrates data and analyses, it is far from integrated. In today’s environment for financial services organizations, IBM’s challenge is to create a market for integrated risk management largely from scratch. It’s a concept likely to get enthusiastic endorsement at the executive level but then founder on the practical problems of rolling it out – especially in the sort of complex organizations that could utilize it best. Two sets of issues – one related to data and the other to people – are key obstacles.
For the former, IBM is advocating the adoption of an integrated risk platform (IRP) to better address risk management. The platform integrates three broad pieces: a data repository with data management capabilities; a unified risk modeling approach; and risk information governance to ensure commonality in planning and analysis, so that the organization can frame risk policy and highlight issues on an enterprise-wide basis. These must be supported by reporting and other communications capabilities.
Integrating risk data is a significant challenge for financial institutions. Historically, risk data has been collected and managed close to its source. Consequently, financial services firms have multiple silos of risk processes, risk systems and risk data. Our research on information management shows that data fragmentation is a bigger issue for financial services than other businesses: On average, they source data from about twice as many systems as manufacturing and services companies (an average of 39 vs. 22 and 19, respectively) according to our information management benchmark research. On top of that, each part of the business may use different terminology, apply its own rules for quantifying and qualifying risk and have different governance procedures. Those operating in multiple jurisdictions must conform local operations to local regulations while also complying at the parent level, which may be in a different regulatory regime. Thus companies have multiple risk management systems and data stores, each structured for the specific needs of individual business silos. It’s therefore difficult for them to aggregate risk data into an enterprise view in a meaningful way and report on risk in a comprehensive and timely fashion. Similarly, like many businesses, few financial institutions have a unified view of their customer data. Parts of the organization may be dealing with different legal entities of the same organization, and this can have an impact on risk and compliance issues. From both risk management and compliance standpoints, it’s vital that the organization maintain accurate master client data that contains data hierarchies that reflect the structure of the clients’ business.
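The master-data point about client hierarchies can be sketched simply; the entities and exposure amounts below are made up, but they illustrate why exposures booked against different legal entities of the same client must roll up before anyone can see total counterparty risk:

```python
from collections import defaultdict

def rollup_exposure(root, parent_of, exposure):
    """Sum exposure for a client and every legal entity beneath it.

    parent_of maps each legal entity to its parent in the client
    hierarchy; exposure holds amounts booked against individual
    entities. Without accurate hierarchy data, each silo sees only
    its own slice of the total.
    """
    children = defaultdict(list)
    for child, parent in parent_of.items():
        children[parent].append(child)
    total, stack = 0.0, [root]
    while stack:                       # walk the hierarchy top-down
        node = stack.pop()
        total += exposure.get(node, 0.0)
        stack.extend(children[node])
    return total

# Hypothetical client with two subsidiaries booked in different systems.
parent_of = {"Acme UK Ltd": "Acme Group", "Acme US Inc": "Acme Group"}
exposure = {"Acme UK Ltd": 40.0, "Acme US Inc": 25.0, "Acme Group": 10.0}
print(rollup_exposure("Acme Group", parent_of, exposure))  # → 75.0
```

The hard part in practice is not the traversal but keeping `parent_of` accurate across systems, which is exactly the master-data governance problem the paragraph describes.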
As for people issues, IBM insists on the need to have an integrated risk management platform and broad, cross-functional compliance management capabilities to support an effective chief risk officer (CRO). Our research finds that two in three (66%) financial services companies have a CRO. Yet this person often lacks a strategic mandate to manage and quantify the full spectrum of risk and returns, from front-office risk intelligence to operational governance processes and strategic capital planning. Instead, the CRO acts as a point person who has responsibility for overseeing a wide array of atomized, silo-based sets of risk management operations. It’s an aggregation of administrative responsibilities rather than a reimagined, integrated approach that transforms what today is a cost-minimization effort into something that promotes long-term competitiveness. To be truly strategic, a CRO must have an accurate, unified view of risk and compliance. It’s essential that financial services companies be able to automate the assembly of this information to facilitate rapid risk management cycles, enable full drill-down and drill-around analysis and increase the reliability of the data and analyses while reducing the amount of staff time required to do all of this.
The data and people issues are mutually reinforcing. Support for IRM is essential to developing a truly strategic CRO position for financial services companies. Such a CRO will be able to drive improvements in managing risk and compliance in an integrated fashion that produces data and analyses that are reliable and timely. This approach is necessary to provide more trustworthy risk and compliance information to senior executives to enable them to confidently make consistently good decisions faster. Thus, establishing a more effective CRO role and the systems to support that function will be essential in the industry’s new environment. Having this connection recognized at the most senior levels of an organization is important because absent a top-level mandate for a CRO, the process of achieving a unified view of risk and compliance probably will be painfully slow. And unless they address their fragmented systems and data, financial services companies will find it increasingly difficult to manage risk and compliance well in the challenging business and regulatory climate.
Making major changes to enterprise data structures and making the role of a CRO more strategic are not going to happen overnight. IBM executives are well aware of that, describing the process of getting to integrated risk management as a journey. Fortunately, this is not the sort of initiative that requires a “big bang” to produce results. Data management and data integration efforts can produce measurable results if they are handled in a piecemeal yet steady fashion. Assembling a unified risk management platform can be performed on a step-by-step basis, allowing financial services companies to minimize deployment and disruption risk while developing skills for managing the implementation process. Risk and regulatory management are more important than ever to the success of financial services companies. They should begin their journey to an integrated view of both as soon as possible.
Robert Kugel – SVP Research