Risk has always been an integral part of business, but our recent Governance, Risk and Compliance (GRC) benchmark research shows that companies deal with risk with varying degrees of effectiveness – especially operational risk. A majority of companies lag in their overall GRC maturity, as I covered in a recent blog post. Operational risk management should be of greater interest to executives today because they can have greater control of it than before. The expansion of IT systems to automate and support most business processes has made it easier than ever to measure, monitor and report on what’s going on in a company. It’s now practical to expand the scope of operational risk management and improve companies’ effectiveness in handling risk events when they occur.
Our research shows that managing risk more effectively is the main reason why people want a better approach to GRC. Nearly eight out of 10 (77%) want to be able to identify and manage risks faster. Another 59 percent want to achieve a better risk control environment – for example, they want to ensure that rules and procedures are being followed. In many instances, it is possible to use information technology to prevent people from circumventing rules and policies. For example, there’s a long-standing approach to reducing financial fraud by having a policy for separation of duties that keeps people who approve invoices separate from those who sign checks or issue payment instructions. Because invoice approval and payment are done via computer systems today, the process can be designed to enforce separation of duties and to continuously monitor systems and process execution to ensure this policy is followed.
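A separation-of-duties control of the kind described above can be enforced directly in a payment workflow. This is a minimal, hypothetical sketch – the `Invoice` class and user names are invented for illustration, not drawn from any particular GRC product:

```python
# Hypothetical sketch: enforcing separation of duties in an
# invoice-payment workflow. Names and amounts are illustrative.

class SeparationOfDutiesError(Exception):
    """Raised when one person tries to both approve and pay an invoice."""

class Invoice:
    def __init__(self, invoice_id, amount):
        self.invoice_id = invoice_id
        self.amount = amount
        self.approved_by = None
        self.paid_by = None

    def approve(self, user):
        self.approved_by = user
        return self

    def pay(self, user):
        if self.approved_by is None:
            raise ValueError("invoice must be approved before payment")
        # The control: the approver may never also issue the payment.
        if user == self.approved_by:
            raise SeparationOfDutiesError(
                f"{user} approved invoice {self.invoice_id} and cannot pay it")
        self.paid_by = user
        return self

inv = Invoice("INV-1001", 2500.00).approve("alice")
inv.pay("bob")  # allowed: a different person issues payment
try:
    Invoice("INV-1002", 900.00).approve("carol").pay("carol")
except SeparationOfDutiesError as e:
    print("blocked:", e)
```

Because the rule lives in the system rather than in a policy manual, every violation is blocked at the moment it is attempted and can be logged for continuous monitoring.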
Computing systems also can be used to stay on top of compliance, limiting the chance that someone fails to do what they are supposed to do. Was a critical piece of maintenance performed on schedule? Has everyone who needed to sign off on a regulatory filing done so?
Managing risk is an ongoing process that must be defined, refined and re-examined regularly. Managing risk effectively means having ongoing discussions about risk, usually face to face. But IT is a critical piece of effective risk management. Technology can automate many aspects of risk management – separation of duties and identity management are two examples. Reporting systems can enable managers and executives to monitor operations more efficiently by reliably providing alerts, but only when a situation requires their attention.
One analytic technique that’s applicable to managing operational risks is predictive analytics, a subject I’ve covered in the past. “Predictive” does not necessarily mean that you can foretell the future; rather, this approach sifts through a lot of data, tells you if some key aspect of the business is behaving the way it should and alerts someone if it isn’t. Do order patterns signal a problem? If you can spot the negative trend on the fifth business day of the month rather than in the monthly review, you may be able to address the causal factors before you have a big problem. Predictive analytics can inform managers that they will need to add shifts or workers to address some supply chain snag that has developed.
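The fifth-business-day example above amounts to a month-to-date pace check. The sketch below assumes daily order counts and a historical average for the same number of business days; the numbers and the 10 percent tolerance are invented for illustration:

```python
# Minimal sketch (hypothetical numbers): compare month-to-date orders
# against the typical total for the same number of business days and
# alert early when the pace falls outside a tolerance band.

def pace_alert(mtd_orders, historical_mtd_avg, tolerance=0.10):
    """Return an alert message if month-to-date orders trail the
    historical average for this point in the month by more than
    `tolerance`, else None."""
    shortfall = 1 - mtd_orders / historical_mtd_avg
    if shortfall > tolerance:
        return f"orders {shortfall:.0%} below typical pace"
    return None

# Through business day 5: 410 orders vs. a typical 500 by this point.
print(pace_alert(410, 500))   # alert fires weeks before the monthly review
print(pace_alert(495, 500))   # normal variation: no alert
```

The value of the technique is not the arithmetic, which is trivial, but the timing: the same comparison run daily surfaces the problem while there is still time to act on its causes.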
Predictive analytics is a powerful tool that’s becoming increasingly accessible to many businesses. However, many companies face a fundamental issue: They don’t have the data. Our research shows a mixed picture. Participants were pretty much split on how easy it is to access and use the data necessary to measure and assess risk. About half (53%) took the middle ground, saying that it is neither easy nor difficult. Of the remainder, 24 percent said it’s easy or very easy and 19 percent said it’s difficult.
Companies are not completely ineffective in managing operational risks – only about one in five said they have ineffective operational risk controls for handling natural disaster, supply chain disruption, competitive threats, reputation loss, internal fraud and demand disruption. (I think this is because companies that have ineffective controls usually go out of business.) However, the data also shows that even fewer rate their risk controls as very effective. For example, only 15 percent assessed their controls for natural disasters as very effective, and just 12 percent rated their supply chain controls as very effective. The research shows that companies are least effective at controlling the impact of demand disruption: More than one-fourth said their controls are ineffective while just 5 percent said they are very effective. Just 8 percent are very effective at controlling separation of duties and sources of internal fraud at an operational level. While most companies rate themselves somewhere in the middle, I think “very effective” ought to be the standard companies apply to their operational risk management. And the fact that a majority of organizations think they’re doing reasonably well in controlling operational risk is itself a risk. This sort of assessment typically leads to complacency and a lack of effort to improve operational risk management.
Managing risk intelligently is one of the key capabilities of successful organizations because it can deliver a competitive edge. Companies that are good at managing risk can make aggressive moves more prudently, spot negative trends faster and respond more quickly and effectively when disaster strikes. IT continues to be one of the main sources of innovation in operational risk management. Executives and managers must become familiar with the technology if they want to manage risks as intelligently as they should.
Robert Kugel – SVP Research
Our benchmark research on business analytics finds that just 13 percent of companies overall and 11 percent of finance departments use predictive analytics. I think advanced analytics – especially predictive analytics – should play a larger role in managing organizations. Making it easier to create and consume advanced analytics would help organizations broaden their integration in business planning and execution. This was one of the points that SPSS, an IBM subsidiary that provides analytics, addressed at IBM’s recent analyst summit.
Predictive analytics are especially useful for anticipating trend divergences or spotting them earlier than one might otherwise. For example, sales may be up compared to a prior period, but is it simply month-to-month variability or the start of an upward trend? Better analytical techniques can help distinguish between normal variation and the beginning of a new trend. By using analytics, one might even discern that while the revenue numbers have been positive recently, the underlying data contains warning signs that point to diminishing volumes, lower prices or both in the future.
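One simple way to separate normal variation from the start of a trend is a control-chart style test: a point inside the sigma band is treated as noise, while an out-of-band point or a sustained one-sided run suggests a real shift. The sketch below uses only the Python standard library; the data and thresholds are invented for illustration:

```python
# Illustrative control-chart style test: classify recent observations
# against historical variation. Thresholds are assumptions, not rules.

from statistics import mean, stdev

def trend_signal(history, recent, run_length=4, sigmas=2.0):
    """Return 'trend' for an out-of-band point or a sustained
    one-sided run against the historical mean, else 'noise'."""
    mu, sd = mean(history), stdev(history)
    # Out-of-band point: beyond the sigma band around the mean.
    if any(abs(x - mu) > sigmas * sd for x in recent):
        return "trend"
    # Sustained run: last `run_length` points all on one side of the mean.
    tail = recent[-run_length:]
    if len(tail) == run_length and (all(x > mu for x in tail)
                                    or all(x < mu for x in tail)):
        return "trend"
    return "noise"

history = [100, 103, 98, 101, 99, 102, 100, 97, 101, 99]
print(trend_signal(history, [98, 102, 99]))     # month-to-month noise
print(trend_signal(history, [96, 95, 94, 93]))  # a run below the mean
```

Real predictive models are far more sophisticated, but even this crude test illustrates the point in the paragraph above: the same revenue numbers can be read very differently once variation is measured rather than eyeballed.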
Predictive analytic models are created using a top-down or a bottom-up approach, or some combination of the two. SPSS offers tools to handle both. The top-down approach involves creating a statistical hypothesis based on business observations or theories and then testing that hypothesis using statistical methods. IBM SPSS Statistics enables users to build a relevant picture from a sample, as well as test assumptions and hypotheses about that picture. A bottom-up approach unleashes automated data mining techniques on data sets (typically large ones) to distill statistically significant relationships from them. SPSS Modeler is designed for use by experienced data miners but also business analysts to speed the creation and refinement of predictive models. Often, companies employ both approaches iteratively to refine and improve models.
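The two approaches can be contrasted in a few lines. This is a hedged sketch using only the standard library – the financial-close sample, the 8.0-day benchmark, the candidate drivers and the 0.7 correlation cutoff are all invented for illustration, and real tools such as SPSS Statistics and Modeler do far more:

```python
# Top-down vs. bottom-up modeling, in miniature. All data is hypothetical.

from statistics import NormalDist, mean, stdev

# Top-down: state a hypothesis, then test it. Here: "average days to
# close the books fell below last year's mean of 8.0" (one-sided z-test
# against an assumed benchmark).
sample = [7.1, 7.8, 6.9, 7.4, 7.6, 7.0, 7.3, 7.5]
z = (mean(sample) - 8.0) / (stdev(sample) / len(sample) ** 0.5)
p_value = NormalDist().cdf(z)          # probability of a result this low
hypothesis_supported = p_value < 0.05

# Bottom-up: let the data speak. Scan candidate drivers for strong
# correlations with an outcome, without any prior hypothesis.
def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    denom = (sum((x - mx) ** 2 for x in xs)
             * sum((y - my) ** 2 for y in ys)) ** 0.5
    return cov / denom

outcome = [12, 15, 11, 18, 14, 20, 13, 17]
drivers = {
    "headcount": [5, 6, 5, 8, 6, 9, 5, 7],
    "ad_spend":  [3, 2, 4, 2, 3, 4, 2, 3],
}
strong = {name: round(pearson(values, outcome), 2)
          for name, values in drivers.items()
          if abs(pearson(values, outcome)) > 0.7}
print(hypothesis_supported, strong)
```

The iterative refinement the paragraph describes typically moves between the two: a data-mined relationship becomes the next cycle’s hypothesis, which is then tested formally before anyone acts on it.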
I used to joke that the main value proposition of SPSS was that while its chief rival, SAS, required its users to have a Ph.D. in statistics, SPSS could be used by anyone with a master’s degree. Applying predictive analytics techniques is simple in concept but far from simple to integrate into day-to-day business beyond its traditional roles such as market research. This partly explains why so few companies have woven predictive analytics into their planning and review cycles. It’s possible to create relatively simple predictive models, but for many business issues, such models may be too simplistic to be useful. And they may not be reliable enough because they generate too many false positives (people spend too much time chasing non-issues) or false negatives (missing important developments or breaks in trends).
Beyond the data and technology challenges posed by advanced analytics, there are significant people issues that companies must address to make their use practical. These can be more difficult to tackle than most business/IT issues because of the experience and skills required; our predictive analytics benchmark research still finds a lack of adequate resources. Automating general business processes, for instance, requires bringing together business subject-matter experts with people who understand IT. Advanced analytics, however, requires three sets of skills – business subject-matter expertise, IT and statistics – that are rarely found in any single individual. Communication among sets of individuals who have these skills often is difficult because they have a limited appreciation of the others’ domains and often have difficulty expressing the nuances of their own area of expertise.
Today there’s greater focus than ever on analytics, partly because an explosion of available data has made it possible and even necessary to make sense of it. As part of IBM, SPSS has been benefitting from the parent’s “smarter planet” marketing theme. SPSS also has taken steps to expand demand for its tools by reducing the people barriers to adopting advanced analytics. One step has been to automate data preparation for use in Statistics and Modeler. Another is an automated modeler that takes several different approaches to analyze a set of data in a single run and then compares the results. Yet despite these steps, I expect advanced analytics to require specialized skills for many years.
Therefore, I also expect adoption of advanced analytics to happen slowly. Most executives at the senior and even middle levels of corporations have limited familiarity with advanced analytics. For many, their last formal exposure to statistics was a required business school course. To spur broader adoption of predictive and other advanced analytics, IBM and others must foster a “pull” approach to marketing analytics. Business executives need to know that advanced analytics are available and of practical value, especially outside of traditional statistics-heavy realms such as consumer research and fraud detection. Sales planning, financial planning, enterprise risk management, maintenance and customer service are all areas ripe for use of predictive and other advanced analytics; our benchmark research found all of these cited as future uses of predictive analytics. It’s easy to convince analysts like me of the value of analytics; it’s much harder to get business executives to incorporate them into day-to-day practices. It would help its own cause if IBM SPSS were to identify promising uses of advanced analytics by function and industry and provide a canned blueprint that can serve as a starting point. Such a blueprint would incorporate a business case illustrating the problem, the suggested steps for addressing it and the scope of benefits that can be realized.
The continuing explosion of data will give rise to an increasing number of ways that business and finance executives can use information to their advantage. But first they have to know that they can.
Robert Kugel – SVP Research