IBM hosted the Big Data and Analytics Analyst Insights conference in Toronto recently to emphasize the strategic importance of this topic to the company and to highlight recent and forthcoming advancements in its big data and analytics software. Our firm followed the presentations with interest. My colleagues Mark Smith and Tony Cosentino have commented on IBM’s execution of its big data strategy and its approach to analytics. As well, Ventana Research has conducted benchmark research on challenges in big data.
The perennial challenge for the IT industry observer is to be skeptical enough to avoid being taken in by overblown claims (often, the next big thing isn’t) without missing turning points in the evolution of technology. “Big data” has always been with us – it’s the amount that constitutes “big” that has changed. A megabyte was considered big data in the 1970s but is a trivial amount today. There’s no question that the hype around big data is excessive, but beneath the noise is considerable substance. Big data is of particular interest today because the scope and depth of the data that can be analyzed rapidly and turned into useful information are so large as to enable transformations in business management. The effect will be to raise computing – especially business computing – to a higher level.
The IBM event demonstrated that the technology for handling big data analytics to support more effective business computing is falling into place. It's not all there yet, but it should mature considerably over the next several years. Yet while technological barriers are falling, there are other issues organizations will need to resolve. For example, the conversation in the conference sessions frequently turned to the people element of successfully deploying big data and analytics. These discussions confirmed an important finding in our big data research, shown in the chart, that more than three-fourths of organizations find staffing and training people for big data roles to be a challenge. It's useful to remind ourselves that there will be the usual lag between what technology makes possible and the diffusion of new information-driven management techniques.
The conference focused mostly on the technology supporting big data and analytics but included examples of conceivable use cases for that technology. For example, there was a session on better management of customer profitability in the banking industry. Attempts to use software to optimize customer profitability go back to the 1980s, but results have been mixed at best, and a great many users continue to perform analysis in spreadsheets using data stored in business silos. IBM speakers described a generic offering incorporating Cognos TM1 to automate the fusion of multiple bank data sources that are incorporated in a range of profitability-related analytic applications. The aim is to enable more precise decisions on pricing, rates and fees, among other factors. This application enables consistent costing methodologies, including activity-based ones for indirect and shared expenses, to promote a more accurate assessment of the economic profitability of offers to customers. A good deal of the value in this offering is that it puts the necessary data in one place, giving executives and managers a consistent and more complete data set than they typically have. As well, the product's use of in-memory analytic processing enables much faster calculations. Faster processing of a more complete data set enables more iterative what-if analyses, which can be used to explore the impact of different strategies in setting objectives for a new product or service, or to examine alternatives for exploiting market opportunities or addressing threats.
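The activity-based costing idea mentioned above can be sketched in a few lines. This is a hypothetical illustration of allocating shared expense pools to customers by usage drivers, not IBM's implementation; all customer names, drivers and figures are invented.

```python
# Hypothetical illustration of activity-based costing; all names and
# figures are invented, not taken from any vendor's banking offering.

def economic_profit(customers, pools):
    """Allocate shared expense pools to customers via activity drivers,
    then return each customer's economic profit."""
    # Total driver units consumed across all customers, per pool
    totals = {d: sum(c[d] for c in customers.values()) for d in pools}
    # Cost rate per driver unit
    rates = {d: pools[d] / totals[d] for d in pools}
    return {
        name: c["revenue"] - c["direct_cost"]
              - sum(rates[d] * c[d] for d in pools)
        for name, c in customers.items()
    }

customers = {
    "A": {"revenue": 1200.0, "direct_cost": 400.0,
          "transactions": 320, "support_calls": 4},
    "B": {"revenue": 900.0, "direct_cost": 250.0,
          "transactions": 80, "support_calls": 20},
}
# Shared expense pools, each allocated by a usage driver
pools = {"transactions": 800.0, "support_calls": 240.0}

print(economic_profit(customers, pools))
# {'A': 120.0, 'B': 290.0} -- the lower-revenue customer turns out to be
# more profitable once shared costs follow actual activity
```

Even in this toy example, allocating indirect costs by activity rather than by revenue reverses the apparent ranking of the two customers, which is the point of applying consistent costing methodologies to a complete data set.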
As the Internet did, big data will change business models and drive the creation of new products and services. The dramatic drop in the cost of instrumenting machines of all types and connecting them to a data network (part of the concept of the Internet of Things) is already changing how companies manage their productive assets, for example, by optimizing maintenance using sensors and telematics to enhance uptime while minimizing repair expense. Decades ago, companies monitored production parameters to ensure quality. More recent technologies can extend the speed and scope of what's monitored and provide greater visibility into production conditions and trends. Representatives from BMW were on hand at the conference to talk about their experience in improving operations with predictive maintenance. Centrally monitoring the in-service performance of equipment and capital assets is old hat for airlines and jet engine manufacturers. For them, the economic benefits of optimizing maintenance to maximize availability and uptime were significant enough to warrant the large investment they started making decades ago. The same basic techniques can be used for early detection of warranty issues, such as identifying specific vehicles subject to "lemon law" provisions. From IBM's perspective, these new technologies will enhance the value of Maximo, its asset management software, by extending its functionality to incorporate monitoring and analytics that help users explore options and optimize responses to specific conditions.
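At its simplest, the condition monitoring described above amounts to watching a stream of sensor readings for a sustained drift past a limit. The following minimal sketch illustrates that idea; real predictive-maintenance systems use far richer models, and the window and limit values here are arbitrary assumptions.

```python
# Minimal stand-in for predictive-maintenance alerting; the window and
# limit values are arbitrary assumptions chosen for illustration.
from statistics import mean

def maintenance_alerts(readings, window=3, limit=0.8):
    """Return indices where the rolling average of a sensor reading
    exceeds a limit, suggesting the asset needs attention."""
    alerts = []
    for i in range(window - 1, len(readings)):
        # Average the most recent `window` readings ending at index i
        if mean(readings[i - window + 1 : i + 1]) > limit:
            alerts.append(i)
    return alerts

vibration = [0.5, 0.5, 0.5, 0.5, 0.5, 0.9, 0.9, 0.9, 0.9, 0.9]
print(maintenance_alerts(vibration))  # [7, 8, 9]
```

Using a rolling average rather than a single reading suppresses one-off spikes, so an alert indicates a persistent change in the asset's condition rather than sensor noise.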
IBM Watson is the company’s poster child for the transformative capabilities of big data and analytics on how organizations operate. The company’s objective is to enable customers to achieve better performance and outcomes by having systems that learn through interactions, providing evidence-based responses to queries. My colleagues Mark Smith and Richard Snow have written about Watson in the contexts of cognitive computing and its application to customer service. And we awarded IBM Watson Engagement Advisor our 2012 Technology Innovation Award. Conference presenters gave an extensive review of progress to date with Watson, featuring innovative ways to use it for diagnosis and treatment in medicine as well as to provide customer support.
Although this was not directly related to big data, IBM also used the conference to announce the availability of Cognos Disclosure Management (CDM) 10.2.1, which will be available both on-premises and in a cloud-based SaaS version. CDM facilitates the creation, editing and publishing of highly structured enterprise documents that combine text and numbers and are created repeatedly and collaboratively, including ones that incorporate eXtensible Business Reporting Language (XBRL) tags. The new version offers improvements in scalability and tagging over the earlier FSR offering. The SaaS version, available on a per-seat subscription basis, will make this technology feasible for midsize companies, enabling them to save the time of highly skilled individuals as well as enhance the accuracy and consistency of, for example, regulatory filings and board books. A SaaS option also will help IBM address the requirements of larger companies that prefer a cloud option.
Most of the use cases presented at the conference were extensions and enhancements of well-worn uses for information technology. However, when it comes to business, the bottom line is what matters, not novelty. Adoption of technology occurs fastest when “new” elements of its use are kept to a minimum. The rapid embrace of the Internet in North America and other developed regions of the world was a function of the substantial investment that had been made over the previous decades in personal computers and local- and wide-area communications networks as well as the training and familiarity of people with these technologies. Big data and the analytics that enable us to apply it have a similar base of preparation. Over the next five years we can expect advances in practical use that benefit businesses and their customers.
Robert Kugel – SVP Research
In some parts of the world, bribing government officials is still considered a normal cost of doing business. Elsewhere there has been a growing trend over the past 40 years to make it illegal for a corporation to pay bribes. In the United States, Congress passed the Foreign Corrupt Practices Act (FCPA) in 1977 in the wake of a succession of revelations of companies paying off government officials to secure arms deals or favorable tax treatment. More recently other governments have implemented anticorruption statutes. The U.K., for instance, enacted the strict Bribery Act in 2010 to replace increasingly ineffective statutes dating back to 1879. The purpose of these actions is to enable ethical and law-abiding companies to compete on a level playing field with those that are neither. A cynic might wonder about the real, functional difference between, say, Wal-Mart’s recent payments to officials in Mexico to accelerate approval of building permits and the practice in New York City of having to engage expediters to ensure timely sign-offs on construction approval documents. No matter – the latter is legal (it’s a domestic issue, after all) while the former is not.
Moreover, the U.S. Department of Justice (DOJ) and the Securities and Exchange Commission (SEC) have increased their oversight of bribery. At the beginning of 2013 they jointly issued the Resource Guide to the U.S. Foreign Corrupt Practices Act. For its part, the SEC has stepped up enforcement using its own resources. Recently, it charged a group of bond traders with enabling a Venezuelan finance official to embezzle millions of dollars by disguising the money as fees paid to the broker/dealer to handle apparently legitimate transactions. Tellingly, though, in another relatively recent bribery case involving Morgan Stanley, the SEC declined to include the company in an enforcement action because it had demonstrated diligence in trying to prevent the violation.
Before anticorruption laws, it was expedient for corporations to pay government officials to close business, get preferred status or prevent punishment. Once the laws were established, that stopped being the case. However, from a management standpoint, compliance with the law became complicated because of the dual nature of the corporation, which is both an entity and a group of individuals. When an individual within it breaks the law, is that person at fault, the corporation, or both? Regardless of how a case is decided, there can be severe reputational damage to a company found violating the law, and that will have repercussions for corporate boards and executives.
This question leads to the agency dilemma, an important consideration in enterprise risk management. Economists long ago recognized the agency dilemma when the modern corporation separated the roles of its principals (that is, the shareholders) from its management. The agency issue exists where the best interests of the principals are either not aligned or in conflict with the interests of the agents (the professional managers running the corporation). But agency issues also extend to the company’s executives and may be rife in any large-scale business. Within the management group, authority to act independently is delegated down through the hierarchy, and the interests of the lower-level managers may be in conflict with those of senior executives, the board of directors and shareholders. For example, suppose that a local manager believes his performance evaluation, compensation and prospects for promotion hinge on the timely opening of a new facility. Confronted with a culture of payoffs for permits, that manager may try to find a way to pay officials for expedited consideration, especially if he is local to the area. From that individual’s perspective, corrupt activity may be the norm, and he may believe himself to be clever enough to violate company policy without detection.
It was once acceptable for a company to claim that it had a stated policy prohibiting bribery and that executives were ignorant of an employee's actions. Absent proof to the contrary, that often was enough. However, the FCPA changed this norm, imposing the need for diligence and affirmative actions on the part of companies to prevent employees from breaking the law as well as to detect and report any such violations that do occur (which is how the Wal-Mart situation came to light). Public standards, too, have changed since the 1970s. Despite its self-disclosure after the fact and the steps it took to address the corrupt behavior, Wal-Mart suffered severe reputational damage. Yet even with the likelihood of such consequences, our benchmark research reveals that just 6 percent of companies have effective controls for managing reputational risk.
We assert that the most effective control is to prevent illegal activity from taking place at all. Short of that, companies that can demonstrate that they have taken all reasonable steps to prevent a violation of the law are in a better position to claim that the individual, not the company, is at fault.
An organization should have clearly articulated and documented antibribery and anticorruption policies and procedures, require executives and managers to complete mandatory training and sign acknowledgements that they have done so, and put in place incentives and disciplinary measures. However, these required measures are increasingly insufficient to demonstrate diligence in preventing corrupt activities. Companies also must have a software-supported internal control system that flags suspicious activity immediately and triggers a rigorous remediation process that analyzes, investigates and documents the disposition of each incident. Incidents that are detected long after their commission are more difficult to cope with and pose much higher legal, financial and reputational risk.
Software is available that helps detect activities that violate anticorruption laws and regulations as they occur or shortly thereafter; this is far more effective than waiting for internal audits or (worse still) whistleblowers to uncover malfeasance. To prevent violations of the FCPA and other antibribery statutes, corporations must be able to monitor their financial and other systems for warning signs. These applications take advantage of operational intelligence, a class of analytical capabilities built on event-focused information-gathering that can uncover suspicious actions as they occur. Our research on innovating with operational intelligence shows that companies use an array of systems (led by IT systems management and major enterprise applications such as ERP and CRM) to track events, analyze them, report results and create alerts when conditions warrant them, as detailed in the related chart. The research also shows that about half (53%) use 11 or more information sources in implementing their operational intelligence efforts. In the future, effective FCPA software increasingly will need to look at a wider range of internal data as well as information from external sources and social media to determine, for example, whether a consulting company that just received a finder's fee is run by or employs a relative of a government official. Today, companies can utilize software from large vendors such as IBM, Oracle and SAP, as well as vendors with FCPA-specific software such as Compliancy and Oversight Systems.
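To make the screening idea concrete, here is a minimal, hypothetical sketch of the kind of rule-based payment review that operational intelligence tools automate. The red-flag rules, field names and thresholds are invented for illustration and do not reflect any vendor's product.

```python
# Hypothetical rule-based payment screening; the red flags, field names
# and thresholds are invented and do not reflect any vendor's product.

def flag_payments(payments, high_risk_countries, threshold=10000.0):
    """Flag payments matching two or more simple anticorruption red flags."""
    flagged = []
    for p in payments:
        reasons = []
        if p["amount"] >= threshold and p["amount"] % 1000 == 0:
            reasons.append("large round-number amount")
        if p["country"] in high_risk_countries:
            reasons.append("high-risk jurisdiction")
        if p.get("category") == "consulting" and not p.get("contract_ref"):
            reasons.append("consulting fee without contract reference")
        if len(reasons) >= 2:  # require multiple signals to cut false alarms
            flagged.append((p["id"], reasons))
    return flagged

payments = [
    {"id": 1, "amount": 50000.0, "country": "XX", "category": "consulting"},
    {"id": 2, "amount": 1234.56, "country": "US", "category": "supplies",
     "contract_ref": "C-17"},
]
print(flag_payments(payments, high_risk_countries={"XX"}))
# flags payment 1 on all three red flags; payment 2 passes
```

Requiring multiple concurrent signals before raising an alert is one simple way to keep the remediation workload manageable; production systems would tune such rules against historical incident data.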
Bribery and corruption are unlikely to disappear entirely. Regardless of anyone’s best intentions, corporate boards and executives can find themselves enmeshed in a scandal not of their own devising. The best defense in such cases is plain evidence that the organization has done everything reasonable to prevent its occurrence and has discovered and dealt with it promptly if it does. Policies and training are vital components, but software can be the extra component necessary to improve the effectiveness of monitoring and auditing to support anticorruption efforts.
Robert Kugel – SVP Research