Robert Kugel's Analyst Perspectives

IBM Displays Software for GRC at Vision 2012

Posted by Robert Kugel on May 22, 2012 12:22:27 PM

I recently attended Vision 2012, IBM’s conference for users of its financial governance, risk management and performance optimization software. I reviewed the finance portion of the program in a previous blog. I’ve been commenting on governance, risk and compliance (GRC) for several years, often with the caveat that GRC is a catch-all term invented by industry analysts initially to cover a broad set of individual software applications. Each of these was designed to address specific requirements across a spectrum of users in operations, IT and Finance within a company, often to meet the needs for a specific industry such as financial services or pharmaceuticals. Vision 2012 covered a lot of ground under the GRC heading, confirming the breadth of both this software category and IBM’s offerings in it. I want to focus on two areas: automation of IT governance activities and effective management of GRC-related data.

The event coincided with the final stages of our own GRC benchmark research, which finds that most organizations are immature in how they manage governance, risk and compliance. Our Maturity Index analysis places nearly two-thirds (63%) of participating organizations in the bottom two of the four levels that make up our maturity framework. Just 8 percent reached the highest level, Innovative. Our assessments examine the people, process, information (mainly data) and technology (mainly software) aspects of a given business area. Software is important because it can automate processes and thereby improve the efficiency with which they are performed. Moreover, data accessibility, consistency and accuracy can be significant issues within most governance, risk and compliance processes.

Using software effectively and safely for governance and risk management requires IT system controls. Today, most of the core activities in midsize and larger organizations depend on IT systems. Strong IT controls therefore limit the potential for financial fraud, operational disruption or theft of intellectual property, among the most significant areas of vulnerability.

Effective IT controls in turn require the ability to execute periodic mapping of risk. This involves identifying and assessing risks, tracking their incidence, reporting on these events as they occur, developing methods to mitigate the negative consequences when risk events occur and managing the mitigation process. Risk mapping is a collaborative, iterative process that, done properly, improves the quality and consistency of risk identification and assessment. Using dedicated software to manage the risk mapping process promotes consistency, completeness and the use of a common language and metrics. It also increases the efficiency of IT governance and reduces the incidence of risk management failures.
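To make the identify-assess-prioritize cycle concrete, the core of a risk register can be sketched in a few lines. This is purely an illustrative sketch, not IBM's or any vendor's implementation; the class names, the 1-to-5 scales and the probability-times-severity scoring scheme are all assumptions chosen for clarity.

```python
from dataclasses import dataclass

# Hypothetical risk register illustrating the identify/assess/prioritize
# steps of an IT risk-mapping cycle. The scoring scheme (probability x
# severity on 1-5 scales) is an assumption, not a vendor method.
@dataclass
class Risk:
    name: str
    probability: int   # 1 (rare) .. 5 (almost certain)
    severity: int      # 1 (negligible) .. 5 (critical)
    mitigation: str = "unassigned"

    @property
    def score(self) -> int:
        return self.probability * self.severity

def prioritize(register: list[Risk]) -> list[Risk]:
    """Order risks so the highest-scoring ones are addressed first."""
    return sorted(register, key=lambda r: r.score, reverse=True)

# Illustrative entries only.
register = [
    Risk("Unauthorized system access", probability=3, severity=5),
    Risk("Unpatched server software", probability=4, severity=3),
    Risk("Spreadsheet data-entry error", probability=5, severity=2),
]

for risk in prioritize(register):
    print(f"{risk.score:>2}  {risk.name}")
```

Even a sketch this small shows what a shared tool contributes: a common scale and vocabulary for assessment, and a single prioritized list rather than competing spreadsheets.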

To illustrate the value of this sort of automation, IBM demonstrated its OpenPages application performing a periodic IT risk mapping. OpenPages, which IBM acquired in 2010, is designed to automate and support the IT risk assessment process. IBM made the case that clearly and consistently assessing the probability and potential severity of individual IT risks, and identifying the necessary controls, makes it possible to establish a prioritized plan for managing those risks. An application such as OpenPages is well suited to the task because it facilitates process management (through automated workflows), collaboration (by facilitating communication) and documentation (information is collected in a structured and consistent fashion). This capability is apparently ahead of the curve, however: our GRC benchmark research shows that only 14 percent of participants have adopted an application similar to OpenPages. Most manage this sort of IT risk management effort in a partially automated fashion, using email, desktop word processing software and spreadsheets. Especially for larger companies, we believe this approach is far more labor-intensive, less adaptable to changing requirements and less effective in managing risk.

From my perspective, a second important point that came out of the conference is the need for a dedicated store of risk data that serves the needs of multiple sets of users. Managing GRC-related data is a pressing need for a majority of companies. Fewer than one-third (29%) of the participants in our benchmark research scored in the upper half of our maturity assessment for information. Data itself can be a serious impediment to effective GRC. Data availability and accessibility issues often contribute to excessive use of spreadsheets, which in turn create accuracy and consistency problems and promote inefficiencies. This applies to many regulated industries but most certainly to banks and other financial services companies.

Issues with the availability and consistency of data arise in financial services companies that create the information needed for risk management and compliance in multiple systems. The problem is acute even in banks that have evolved organically, and it is made worse in those that have grown through acquisitions. Typically the data originates in systems provisioned to address the needs of one constituency but now must serve a variety of risk management and compliance groups, each with different requirements. Standard extract, transform and load (ETL) methods don't work well enough for these users: the transformations have to be made at a more granular level to support the variety of reporting, risk analysis and measurement requirements. A successful risk data store therefore must not only address the information technology challenge but also accommodate each of the constituencies that need to be part of the process, including all regulatory and operational risk groups.
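The granularity point can be illustrated with a toy example: when multiple constituencies derive their views from the same transaction-level records, those views stay mutually consistent, which is exactly what pre-aggregated extracts fail to guarantee. This is a hypothetical sketch; the field names and the two example views are assumptions, not any vendor's schema.

```python
# Hypothetical transaction-level records feeding two different
# risk/compliance views. Field names are illustrative assumptions.
transactions = [
    {"id": 1, "desk": "fx",    "counterparty": "A", "notional": 5_000_000},
    {"id": 2, "desk": "fx",    "counterparty": "B", "notional": 2_000_000},
    {"id": 3, "desk": "rates", "counterparty": "A", "notional": 8_000_000},
]

def exposure_by_counterparty(rows):
    """Credit-risk view: total exposure per counterparty."""
    out = {}
    for row in rows:
        out[row["counterparty"]] = out.get(row["counterparty"], 0) + row["notional"]
    return out

def volume_by_desk(rows):
    """Operational/regulatory reporting view: activity per desk."""
    out = {}
    for row in rows:
        out[row["desk"]] = out.get(row["desk"], 0) + row["notional"]
    return out

# Both views are derived from the same granular records, so they cannot
# drift apart -- the consistency problem the text attributes to coarse,
# pre-aggregated ETL extracts and spreadsheet copies.
print(exposure_by_counterparty(transactions))
print(volume_by_desk(transactions))
```

In a real risk data store the same principle applies at scale: transformations are defined per constituency over shared granular data, rather than each group maintaining its own aggregated extract.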

This data management issue was a tangential point in a broader presentation on coping with the Dodd–Frank Wall Street Reform and Consumer Protection Act. This connection indicates why I think CIOs in financial services should make having a risk data store a priority. Financial services companies almost certainly will have to continue adapting to rapidly changing and growing regulatory requirements in the coming years. The return on the investment is likely to be higher than many assume because the pervasive use of spreadsheets creates multiple hidden pockets of inefficiency. A properly configured risk data store makes adapting far easier.

Proper use of information technology to support GRC processes and requirements is important because it increases the efficiency of what are usually administrative functions with little or no direct economic value to the corporation. Improving IT control and governance is crucial because effective IT systems can be the most efficient and effective means of controlling operational processes. (For example, IT access controls combined with IT process controls offer the best protection against many types of financial fraud.) Investments in IT controls therefore can generate attractive returns. Similarly, improving GRC-related data accessibility and data quality can be a worthwhile investment. I recommend that companies establish an ongoing process for assessing technology and data gaps in how they execute their governance, risk management and compliance processes, to identify opportunities to enhance efficiency.


Robert Kugel – SVP Research

Topics: Governance, GRC, Office of Finance, Operational Performance Management (OPM), OpenPages, Analytics, Business Collaboration, IBM, Business Performance Management (BPM), compliance, Financial Performance Management (FPM), controls, IT controls

Robert Kugel

Written by Robert Kugel

Rob heads up the CFO and business research focusing on the intersection of information technology with the finance organization and business. The financial performance management (FPM) research agenda includes the application of IT to financial process optimization and collaborative systems; control systems and analytics; and advanced budgeting and planning. Prior to joining Ventana Research he was an equity research analyst at several firms including First Albany Corporation, Morgan Stanley, and Drexel Burnham, and a consultant with McKinsey and Company. Rob was an Institutional Investor All-American Team member and on the Wall Street Journal All-Star list. Rob has experience in aerospace and defense, banking, manufacturing and retail and consumer services. Rob earned his BA in Economics/Finance at Hampshire College, an MBA in Finance/Accounting at Columbia University, and is a CFA charter holder.