All the hubbub around big data and analytics has many senior finance executives wondering what the big deal is and what they should do about it. It can be especially confusing because much of what’s covered and discussed on this topic is geared toward technologists and others working outside of Finance, in areas such as sales, marketing and risk management. But finance executives need to position their organization to harness this technology to support the strategic goals of their company. To do so, they must have clarity as to what big data can do, what they want it to do, and what skills and tools they need to meet their objectives.
Big data has always been with us, just on smaller scales: The term refers to data sets so large and complex that organizations have difficulty processing them using on-hand database management systems and applications. It has become a popular buzzword because technology for handling big data has crossed a threshold, making it at the same time more capable and cost-effective. Companies now can tap into huge amounts of structured and unstructured data using advanced data processing technologies, analytics and visualization tools to achieve insights not previously available using more conventional techniques. In a recent research analysis, I covered some of the potential benefits (and potential pitfalls) of big data as it relates to company management. Increasingly, the ability to analyze large quantities of business-related data rapidly holds the promise of fundamental changes in how executives and managers run their businesses. Properly deployed, big data analytics enables a more forward-looking and agile management style, even in very large enterprises. Because it allows more flexible forms of business organization, it can give finance organizations greater scope to play a more strategic role in corporate management.
Big data and analytics are a natural combination. By itself, a mass of data is not especially useful, and there are significant challenges to teasing out insight from such large data sets. However, information technology has evolved to make assembling and working with extremely large amounts of data far more practical. As well, routines involving advanced analytics that were once the domain of people with Ph.D.s in statistics are increasingly usable by business analysts, as new analytical software packages are designed to hide the complexity of the underlying statistical work. My colleague Tony Cosentino recently summarized the progress to date in adoption of big data analytics, covering some of the existing uses (already numerous) and emerging trends.
Keep in mind that it’s not just a matter of learning how to master new software and munge data. Finance departments must sharpen their skills in determining how to best utilize big data analytics. And it’s even more important that finance executives understand how to make practical use of big data analytics: In some cases users may want to consume only the output of the analytics created by other parts of the organization (such as demand forecasts by product families), while in others the organization may want to purchase applications that use or embed big data analytics (such as continuous monitoring for ERP systems governance) or enable price and profit optimization.
Our benchmark research on operational intelligence, a technology-driven discipline that for years has applied analytics to big data flowing across networks and systems, shows that the most common reasons for using such applications (cited by almost three in five companies) are to manage performance, detect fraud, comply with regulations and manage risk. These areas are broadly applicable to finance organizations, but I assert that, beyond governance and control, the three main initial applications of big data analytics in finance are planning, reviews and alerts. Here’s how.
Companies do a lot of planning, so it’s useful to segment the activity. One way is by time. There are three main planning time frames in which big data analytics plays a role.
- Short-term tactical planning is used, for example, to project demand for specific products or create offers that might spur incremental demand. Especially in consumer products and business-to-consumer marketing, these models are statistically and computationally challenging, as they must be continually updated and adjusted. However, this is not an area of business where Finance has taken a role.
- Long-term and strategic planning can help determine the impact of a confluence of factors on markets and costs. Decades ago, the largest companies maintained strategic planning staffs to generate long-term forecasts to inform senior executives of important market trends. Except at companies that have very long cycles with specific demand and supply requirements, those staffs have disappeared or have been substantially reduced as corporations switched to third-party sources.
- In the time horizon between short- and long-term planning there are techniques for improving the accuracy of forecasts of revenue and costs using large sets of historical data, which enable organizations to better understand the various factors that influence demand. This sort of advanced modeling using predictive analytics can be useful in improving the accuracy of corporate business planning and budgeting, which is at the core of financial planning and analysis. Predictive analytics uses techniques from statistics, modeling and data mining that weigh multiple current and historical facts and their interactions to predict outcomes. Good predictive models can identify the most important factors driving outcomes, and because of this, they often can be more accurate than simple extrapolation. For example, by examining large sets of historical data, a fast-food chain can predict with reasonable accuracy demand for certain menu items at specific locations at a given hour of a given day by taking into account factors such as the day of the week, time of the year, sales patterns over the past three weeks, advertising spend and special offers.
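The fast-food example above can be illustrated with a deliberately simple seasonal baseline: predict demand for a given weekday-and-hour slot from the average of matching slots in the historical data. This is only a sketch of the idea, not an actual forecasting product; the data schema and function name are hypothetical, and a real predictive model would also weigh factors such as recent trend, advertising spend and special offers.

```python
from collections import defaultdict

def forecast_demand(history, weekday, hour):
    """Seasonal-naive baseline: predict units sold for a (weekday, hour)
    slot as the average of matching slots in the historical data.
    history: list of (weekday, hour, units) tuples -- a hypothetical schema."""
    totals = defaultdict(lambda: [0, 0])  # (weekday, hour) -> [sum, count]
    for wd, hr, units in history:
        totals[(wd, hr)][0] += units
        totals[(wd, hr)][1] += 1
    s, n = totals.get((weekday, hour), (0, 0))
    return s / n if n else None  # None when there is no history for the slot
```

A production model would replace the simple average with a regression or machine-learning model over the same kind of slotted history, but the principle is the same: large sets of historical observations sharpen the forecast for each specific time and place.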
As useful as predictive analytics are for forecasting, they may be even more valuable when applied to reviews and alerts. Predictive analytics can provide a baseline against which to compare actuals. This, in turn, enables an organization to get an earlier warning when results diverge meaningfully from what was expected, so executives and managers can react immediately rather than in days or weeks. For example, in business-to-business relationships that involve many routine purchases (any sort of supplies, for example) a divergence from established trends could generate an alert to the sales organization. Embedded analytics in an order-entry system could highlight late or smaller-than-usual orders. These might indicate a competitive threat or some other issue that would benefit from a timely interaction with the customer. This is just one of the ways that data captured by the financial systems can be used to improve the effectiveness of other business units, enabling the department to play a more strategic role in supporting the company.
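The divergence check described above amounts to comparing an actual value against a predicted baseline plus a tolerance. A minimal sketch, assuming a hypothetical order-history feed (the function name and threshold are illustrative, not any vendor's API):

```python
import statistics

def divergence_alert(order_history, actual, k=2.0):
    """Flag an order that diverges from a customer's established pattern.
    order_history: past order sizes for this customer (hypothetical feed).
    Returns True when `actual` falls more than k standard deviations
    below the historical mean -- a smaller-than-usual order worth a call."""
    mean = statistics.mean(order_history)
    sd = statistics.stdev(order_history)
    return actual < mean - k * sd
```

Embedded in an order-entry system, a rule like this would fire as the order is booked, giving the sales team a same-day prompt rather than a surprise at month-end.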
Another use is in accounts receivable, where predictive analytics can promote customer satisfaction. To illustrate, a company that does a routine analysis of payment patterns can have a good idea of when specific customers will pay. If one that routinely pays its invoice between the 16th and 19th day of the month has not paid by the 23rd day, the analytics system generates an alert. A call to the customer or an automated email notes the delayed payment, asking if there was an error in the billing or some other point in dispute. There are a couple of advantages to this approach. If nothing else, if there is an issue, it is likely to be resolved more quickly, which improves cash flow. Moreover, from a customer satisfaction perspective, this is a far superior form of interaction than waiting several weeks and then sending out a dunning notice demanding payment; if the company itself made a billing error, such a demand would only annoy the customer. Another use of big data in receivables is to automatically identify customers that are routinely tardy in paying. This can kick off an internal company discussion about what ought to be done about the situation, such as limiting credit or finding ways to accelerate payments.
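The receivables example above reduces to a simple rule over each customer's historical payment days. This sketch is illustrative only; the function name, data shape and four-day buffer are assumptions, not part of any actual product.

```python
def payment_overdue_alert(payment_days, current_day, buffer=4):
    """Flag a customer whose invoice is unpaid past its usual window.
    payment_days: days of the month on which this customer's past
    invoices were paid (hypothetical history).
    Alerts once current_day reaches the latest historical payment day
    plus a buffer -- e.g. normally pays by the 19th, alert on the 23rd."""
    return current_day >= max(payment_days) + buffer
```

In practice the threshold would come from the payment-day distribution rather than a fixed buffer, but the mechanism is the same: the historical pattern supplies a customer-specific baseline against which lateness is measured.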
Governance is another area where big data analytics is already at work, with companies using it for fraud detection and alerting. For instance, software packages can monitor a company’s financial systems for evidence of suspicious activities such as payments to bogus vendors or top-level alterations to financial statements. Such systems are designed to be high-level controls that reduce the need for manual internal and external audit work. And even more is possible. As I noted earlier, in the not-too-distant future it may be possible to have an “auditor in a box” – a forensic system that continuously identifies and lists all suspicious activities, transactions and conditions and weighs their materiality. Such a system would permit more timely responses to the risk of material errors or fraud and facilitate examinations by external auditors. In addition to being far more efficient than periodic manual effort, the auditor-in-a-box concept is potentially more reliable because it examines everything rather than relying on sampling.
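One of the simplest continuous controls of the kind described above is a test for payments to vendors missing from the vendor master file. This is a toy sketch of one such rule, with hypothetical names and record shapes; a real monitoring system would run dozens of tests of this sort across every transaction rather than a sample.

```python
def flag_suspicious_payments(payments, vendor_master):
    """Continuous-control sketch: return payments whose vendor is not in
    the vendor master file -- a common indicator of a bogus-vendor scheme.
    payments: list of (payment_id, vendor_id, amount) tuples (hypothetical).
    vendor_master: set of approved vendor ids."""
    return [p for p in payments if p[1] not in vendor_master]
```

Because the rule touches every payment, it illustrates why continuous monitoring can be more reliable than audit sampling: nothing is excluded from the test population.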
However, there are challenges. Staffing and training are significant issues for Finance in dealing with big data analytics: Our research into the challenges of utilizing big data shows that nearly four in five companies find them to be an obstacle. Although analytics is an inherent element of the finance function, in practice it almost always involves the broad application of basic approaches employing simple math (ratio and margin analysis, for example). Few departments have applied advanced analytics: Our finance analytics research finds that only 13 percent of finance departments employ predictive analytics.
To be able to handle these staffing and training needs, finance executives must understand their department’s big data analytics competence requirements. A useful place to start is to become familiar with the five personas Tony Cosentino developed to describe the people working with business intelligence and analytics. These personas illustrate the various objectives, skills and interests that individuals bring to the discipline. Adapting his approach to big data analytics to this discussion, at the top of the list are highly skilled statisticians who do exploratory work and create purpose-built analytics and analytical models to address specific tasks. These people usually have advanced degrees in statistics and understand how to use sophisticated analytical software and data sets to their fullest. Few finance organizations need this level of capability. A second type of user includes business analysts who have in-depth knowledge of the business and finance issues, know how to access and apply available data relevant to the issue, and have the ability and commitment to master software that requires training but not an advanced degree in statistics. Depending on a company’s size, finance organizations will need a person or a group of people with this level of competence. A third type is the knowledge worker. This description includes executives, managers and directors who need to interact with – not just consume – advanced analytics. These types of users should not be expected to learn how to create or structure analytics, but they need to know how to employ analytics embedded in dashboards or applications as well as visual discovery tools, which are increasingly user-friendly. This level is where the need is broadest, so finance executives must focus most of their efforts in terms of developing these skills.
Big data analytics is an important development that will challenge finance organizations to use new capabilities to improve their effectiveness and enhance their company’s competitiveness. There are many ways organizations can begin to address the challenge. At a minimum, CFOs and senior finance executives should create a steering committee to identify opportunities to apply big data analytics; identify gaps in skills, processes, data availability and software; and establish timelines and goals. Moreover, if CFOs are serious about exploiting the potential of big data analytics, they must communicate its importance to their department and demonstrate a commitment to a plan of action.
Robert Kugel – SVP Research
IBM hosted the Big Data and Analytics Analyst Insights conference in Toronto recently to emphasize the strategic importance of this topic to the company and to highlight recent and forthcoming advancements in its big data and analytics software. Our firm followed the presentations with interest. My colleagues Mark Smith and Tony Cosentino have commented on IBM’s execution of its big data strategy and its approach to analytics. As well, Ventana Research has conducted benchmark research on challenges in big data.
The perennial challenge for the IT industry observer is to be skeptical enough to avoid being taken in by overblown claims (often, the next big thing isn’t) without missing turning points in the evolution of technology. “Big data” has always been with us – it’s the amount that constitutes “big” that has changed. A megabyte was considered big data in the 1970s but is a trivial amount today. There’s no question that the hype around big data is excessive, but beneath the noise is considerable substance. Big data is of particular interest today because the scope and depth of the data that can be analyzed rapidly and turned into useful information are so large as to enable transformations in business management. The effect will be to raise computing – especially business computing – to a higher level.
The IBM event demonstrated that the technology for handling big data analytics to support more effective business computing is falling into place. It’s not all there yet but should improve strongly over the next several years. Yet while technological barriers are falling, there are other issues organizations will need to resolve. For example, the conversation in the conference sessions frequently turned to the people element of successfully deploying big data and analytics. These discussions confirmed an important finding in our big data research, shown in the chart, that more than three-fourths of organizations find staffing and training people for big data roles to be a challenge. It’s useful to remind ourselves that there will be the usual lag between what technology makes possible and the diffusion of new information-driven management techniques.
The conference focused mostly on the technology supporting big data and analytics but included examples of conceivable use cases for that technology. For example, there was a session on better management of customer profitability in the banking industry. Attempts to use software to optimize customer profitability go back to the 1980s, but results have been mixed at best, and a great many users continue to perform analysis in spreadsheets using data stored in business silos. IBM speakers described a generic offering incorporating Cognos TM1 to automate the fusion of multiple bank data sources that are incorporated in a range of profitability-related analytic applications. The aim is to enable more precise pricing and price-related decisions related to rates and fees, among other factors. This application enables consistent costing methodologies, including activity-based ones for indirect and shared expenses, to promote a more accurate assessment of the economic profitability of offers to customers. A good deal of the value in this offering is that it puts the necessary data in one place, giving executives and managers a consistent and more complete data set than they typically have. As well, the product’s use of in-memory analytic processing enables much faster calculations. Faster processing of a more complete data set enables more iterative, what-if analyses that can be used to explore the impact of different strategies in setting objectives for a new product or service or examining alternatives to exploit market opportunities or address threats.
As the Internet did, big data will change business models and drive the creation of new products and services. The dramatic drop in the cost of instrumenting machinery of all types and connecting it to a data network (part of the concept of the Internet of Things) is already changing how companies manage their productive assets, for example, by optimizing maintenance using sensors and telematics to enhance uptime while minimizing repair expense. Decades ago, companies monitored production parameters to ensure quality. More recent technologies can extend the speed and scope of what’s monitored and provide greater visibility into production conditions and trends. Representatives from BMW were on hand at the conference to talk about their experience in improving operations with predictive maintenance. Centrally monitoring the in-service performance of equipment and capital assets is old hat for airlines and jet engine manufacturers. For them, the economic benefits of optimizing maintenance to maximize availability and uptime were significant enough to warrant the large investment they started making decades ago. The same basic techniques can be used for early detection of warranty issues, such as identifying specific vehicles subject to “lemon law” provisions. From IBM’s perspective, these new technologies will enhance the value of Maximo, its asset management software, by extending its functionality to incorporate monitoring and analytics that help users explore options and optimize responses to specific conditions.
IBM Watson is the company’s poster child for the transformative capabilities of big data and analytics on how organizations operate. The company’s objective is to enable customers to achieve better performance and outcomes by having systems that learn through interactions, providing evidence-based responses to queries. My colleagues Mark Smith and Richard Snow have written about Watson in the contexts of cognitive computing and its application to customer service. And we awarded IBM Watson Engagement Advisor our 2012 Technology Innovation Award. Conference presenters gave an extensive review of progress to date with Watson, featuring innovative ways to use it for diagnosis and treatment in medicine as well as to provide customer support.
Although this was not directly related to big data, IBM also used the conference to announce the availability of Cognos Disclosure Management (CDM) 10.2.1, which will be available both on-premises and in a cloud-based SaaS version. CDM facilitates the creation, editing and publishing of highly structured enterprise documents that combine text and numbers and are created repeatedly and collaboratively, including ones that incorporate eXtensible Business Reporting Language (XBRL) tags. The new version offers improvements in scalability and tagging over the earlier FSR offering. The SaaS version, available on a per-seat subscription basis, will make this technology feasible for midsize companies, enabling them to save the time of highly skilled individuals as well as enhance the accuracy and consistency of, for example, regulatory filings and board books. A SaaS option also will help IBM address the requirements of larger companies that prefer a cloud option.
Most of the use cases presented at the conference were extensions and enhancements of well-worn uses for information technology. However, when it comes to business, the bottom line is what matters, not novelty. Adoption of technology occurs fastest when “new” elements of its use are kept to a minimum. The rapid embrace of the Internet in North America and other developed regions of the world was a function of the substantial investment that had been made over the previous decades in personal computers and local- and wide-area communications networks as well as the training and familiarity of people with these technologies. Big data and the analytics that enable us to apply it have a similar base of preparation. Over the next five years we can expect advances in practical use that benefit businesses and their customers.
Robert Kugel – SVP Research