Technology for the Office of Finance can have transformative power. Although progress has been slow at times, today’s finance organizations are fundamentally different from those of 50 years ago. For one thing, they require far fewer resources (chiefly people) to perform basic accounting, treasury and corporate finance tasks. In addition, public corporations report results sooner – sometimes weeks sooner – than they could in the mid-20th century. And finance departments are able to harness substantially more data and a wider array of analytics to promote insight and support more agile decision-making.
Even in this context, in many corporations the tax function remains a backwater in its use of technology. Most of their tax professionals are awash in desktop spreadsheets, tools that they initially thought would promote efficiency and accuracy. But the inherent problems with desktop spreadsheets make tax processes not only needlessly time-consuming (even using macros and other spreadsheet automation techniques) but also prone to errors and inconsistencies. This is especially true if people must enter the same information multiple times, which increases the chances of mistakes. In addition, assumptions made and rationales behind formulas used in data transformations may not be documented or readily accessible to others in the organization. This legacy can be problematic years later in an audit, especially if the individual who prepared the spreadsheet is no longer employed at the company. And with spreadsheet-driven processes, auditing taxes and the underlying data and calculations is difficult and time-consuming. On top of all this, the effects of the desktop spreadsheet’s inherent shortcomings multiply with the size of a corporation. By their nature, these spreadsheets hinder a corporation’s ability to understand tax issues and optimize related decisions.
Organizations do not have to put up with these outdated, counterproductive practices. New tools can help streamline tax processes. One is Vertex Enterprise (which I reviewed earlier this year). Vertex recently was awarded Ventana Research’s 2013 Innovation Award for the Office of Finance for this suite of applications and its integrated use of tax data and analytics. Vertex offers a single-platform approach to managing all types of taxes (direct and indirect) across the entire tax life cycle, from analysis through provisioning to audit defense, using a single data source. Direct (income) tax management is still a largely manual process involving a plethora of desktop spreadsheets. Calculating and accounting for direct taxes is complicated, largely because income tax laws can be quite convoluted, especially in certain industries. As well, the more tax jurisdictions a company operates in and the more numerous its subsidiary legal entities, the more complex tax management becomes.
Vertex Enterprise includes tax provisioning software for both indirect and direct taxes. Indirect taxes are managed by the company’s O Series software, a single platform that handles sales and use, value-added (VAT) and goods and services taxes on a global scale. Direct taxes are handled by Vertex Tax Accounting, global tax provisioning and reporting software that works with multiple general ledger systems in multiple currencies and across multiple years, within the context of multiple accounting regimes such as U.S. and other national generally accepted accounting principles (GAAP) and International Financial Reporting Standards (IFRS). The web-based software supports a distributed, collaborative, enterprise-wide tax decision cycle that can lead to better tax-related decision-making. Direct taxes exist in a parallel accounting universe, and one advantage of having a single system is that it facilitates reconciling the business accounting performed in ERP systems with the tax accounting governed by law. Vertex automates tax account reconciliation and reporting and manages adjustments that span multiple time periods.
One notable advance for corporate tax departments offered by Vertex Enterprise is its tax data warehouse (TDW). Because a TDW can automate much of the painstaking work that occupies much of the tax practitioner’s time, it can free these trained individuals to focus on making the best tax decisions possible. Also, because a TDW can increase visibility into tax analyses and calculations, corporations can be more confident in their tax-related decisions. And with this greater visibility and confidence, tax departments can become a mainstream participant in the finance function. This is especially important in an era of increasing cooperation between tax authorities worldwide. More than ever, corporations that operate globally must be able to optimize their tax positions over time and across multiple jurisdictions. A TDW enhances that capability.
Conceptually, a TDW is simple: It’s a data store that makes all tax data readily available and can be used to plan and provision a company’s taxes. However, in larger companies (those with more than 1,000 employees) that operate in multiple tax jurisdictions and have even moderately complex legal entity structures, tax-related data structures and calculations become fiendishly complex, as I noted in an earlier perspective. For that reason, the first attempts to create TDWs proved unworkable because the sheer complexity of the direct (that is, income) tax domain overwhelmed the available information technology. These limitations forced companies to take shortcuts, which meant that each TDW had to be a largely custom effort and therefore expensive to build. And because these shortcuts rendered the systems brittle and difficult to change, they were expensive to maintain. Today, however, technology is available to make TDWs practical.
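To make the single-data-store idea concrete, here is a toy sketch of the principle, not Vertex’s actual design: one granular fact table keyed by legal entity, jurisdiction and period that every downstream tax process (provisioning, planning, audit defense) queries instead of its own spreadsheet. The schema, entity names and figures are invented for illustration.

```python
import sqlite3

# One shared, granular store for tax data (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE tax_fact (
        entity TEXT, jurisdiction TEXT, period TEXT,
        account TEXT, amount REAL
    )
""")
rows = [
    ("SubCo GmbH", "DE", "2013", "pretax_income",   500000.0),
    ("SubCo GmbH", "DE", "2013", "perm_difference",  20000.0),
    ("ParentCo",   "US", "2013", "pretax_income",  1200000.0),
]
conn.executemany("INSERT INTO tax_fact VALUES (?,?,?,?,?)", rows)

# Every downstream process reads the same source, so a figure such as
# consolidated pretax income is computed once, consistently.
total = conn.execute(
    "SELECT SUM(amount) FROM tax_fact WHERE account = 'pretax_income'"
).fetchone()[0]
print(total)  # 1700000.0
```

The point of the sketch is only that provisioning, planning and audit defense all query one store; a real TDW adds the legal-entity hierarchies, multi-year adjustments and audit trails that made early custom attempts so complex.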
A TDW has several purposes: to ensure accuracy and consistency in tax analysis and calculations, improve visibility into tax provisioning, and cut the time and effort required to execute tax processes. The technology is especially useful because data management is one of the biggest operational challenges facing tax departments today. That is, the information necessary for tax provisioning, planning, compliance and audit may not be readily available to the tax department because accounting and other information is kept in multiple systems from multiple vendors. Our benchmark research shows that 90 percent of companies with 1,000 or more employees use financial systems from multiple vendors, and 43 percent use four or more. In addition, not all of the data necessary for tax department purposes is captured by the ERP system. As well, data collected in a general finance department warehouse or pulled together in a financial consolidation system may not be sufficiently granular for tax department purposes.
Moreover, most companies’ ERP systems (the core technology for gathering transaction data) are not inherently “tax aware,” so tax departments repeatedly need to perform transformational steps to ensure that data is formatted and organized properly. Sometimes the data must undergo multiple transformations because, for example, the transaction information collected in an overseas subsidiary must be reported locally using the local currency and accounting standard but translated to the parent company’s tax books in a different currency using a different accounting standard. In some industries (such as financial services) there may be multiple local reporting standards, one for general statutory purposes and another reflecting specific rules for that industry demanded by some regulatory authority. In short, there are numerous data-driven headaches tax professionals have to address before they even get down to work. A TDW addresses this problem.
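The multi-step transformation described above can be sketched in a few lines. The function, the adjustment and the exchange rate here are hypothetical stand-ins; real book-to-book and currency translation rules are far more involved.

```python
# Hypothetical two-step transformation: a subsidiary's local-GAAP,
# local-currency figure is first adjusted to the parent's accounting
# standard, then translated into the parent's currency.

def to_parent_books(local_amount, standards_adjustment, fx_rate):
    """Apply a book-to-book adjustment, then translate currency."""
    adjusted = local_amount + standards_adjustment  # e.g., local GAAP -> parent standard
    return adjusted * fx_rate                       # e.g., EUR -> USD

# EUR 400,000 of local statutory income, a +EUR 10,000 standards
# adjustment, translated at an assumed 1.35 USD/EUR rate:
usd_income = to_parent_books(400_000, 10_000, 1.35)
print(round(usd_income, 2))  # 553500.0
```

In a spreadsheet-driven process, each such step is a hand-built formula whose rationale may go undocumented; in a TDW, the transformation is defined once and applied consistently.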
Ventana Research is dedicated to helping organizations enhance the effectiveness and strategic value of the Office of Finance. Tax departments, CFOs and controllers in larger, more complex corporations should examine how the tax organization spends its time and determine to what degree more enterprise automation and fewer desktop spreadsheets would enable the company to understand and manage its tax needs more intelligently. Vertex Enterprise can be a useful foundation for transforming the tax function, which is why it received our Technology Innovation Award for Office of Finance for 2013.
Robert Kugel – SVP Research
IBM hosted the Big Data and Analytics Analyst Insights conference in Toronto recently to emphasize the strategic importance of this topic to the company and to highlight recent and forthcoming advancements in its big data and analytics software. Our firm followed the presentations with interest. My colleagues Mark Smith and Tony Cosentino have commented on IBM’s execution of its big data strategy and its approach to analytics. As well, Ventana Research has conducted benchmark research on challenges in big data.
The perennial challenge for the IT industry observer is to be skeptical enough to avoid being taken in by overblown claims (often, the next big thing isn’t) without missing turning points in the evolution of technology. “Big data” has always been with us – it’s the amount that constitutes “big” that has changed. A megabyte was considered big data in the 1970s but is a trivial amount today. There’s no question that the hype around big data is excessive, but beneath the noise is considerable substance. Big data is of particular interest today because the scope and depth of the data that can be analyzed rapidly and turned into useful information are so large as to enable transformations in business management. The effect will be to raise computing – especially business computing – to a higher level.
The IBM event demonstrated that the technology for handling big data analytics to support more effective business computing is falling into place. It’s not all there yet but should improve strongly over the next several years. Yet while technological barriers are falling, there are other issues organizations will need to resolve. For example, the conversation in the conference sessions frequently turned to the people element of successfully deploying big data and analytics. These discussions confirmed an important finding in our big data research, shown in the chart, that more than three-fourths of organizations find staffing and training people for big data roles to be a challenge. It’s useful to remind ourselves that there will be the usual lag between what technology makes possible and the diffusion of new information-driven management techniques.
The conference focused mostly on the technology supporting big data and analytics but included examples of conceivable use cases for that technology. For example, there was a session on better management of customer profitability in the banking industry. Attempts to use software to optimize customer profitability go back to the 1980s, but results have been mixed at best, and a great many users continue to perform analysis in spreadsheets using data stored in business silos. IBM speakers described a generic offering incorporating Cognos TM1 to automate the fusion of multiple bank data sources that feed a range of profitability-related analytic applications. The aim is to enable more precise pricing decisions related to rates and fees, among other factors. The application enables consistent costing methodologies, including activity-based ones for indirect and shared expenses, to promote a more accurate assessment of the economic profitability of offers to customers. A good deal of the value in this offering is that it puts the necessary data in one place, giving executives and managers a consistent and more complete data set than they typically have. As well, the product’s use of in-memory analytic processing enables much faster calculations. Faster processing of a more complete data set supports more iterative what-if analyses, which can be used to explore the impact of different strategies when setting objectives for a new product or service or when weighing alternatives to exploit market opportunities or address threats.
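As a stripped-down illustration of the consistent-costing idea (my own sketch, not IBM’s TM1 model), an activity-based rate spreads a shared expense pool across customers by a usage driver before netting against revenue. All names and figures are invented.

```python
# Activity-based allocation of a shared expense pool, then netting
# against revenue to compare customer-level economic profit.

def customer_profit(revenue, direct_cost, activity_units,
                    indirect_pool, total_units):
    rate = indirect_pool / total_units          # cost per unit of activity
    return revenue - direct_cost - rate * activity_units

# Two customers with identical revenue and direct cost but different
# usage of a shared resource (say, branch visits):
customers = {
    "A": (1000.0, 400.0, 30),   # revenue, direct cost, branch visits
    "B": (1000.0, 400.0, 10),
}
pool, total_visits = 800.0, 40  # shared branch expense, total visits
for name, (rev, cost, visits) in customers.items():
    print(name, customer_profit(rev, cost, visits, pool, total_visits))
```

Customer A consumes three times the shared activity of customer B, so a consistent activity-based view shows very different economic profit for the two, which is exactly the kind of distinction that peanut-butter allocations in siloed spreadsheets obscure.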
As the Internet did, big data will change business models and drive the creation of new products and services. The dramatic drop in the cost of instrumenting machines of all types and connecting them to a data network (part of the concept of the Internet of Things) is already changing how companies manage their productive assets, for example, by optimizing maintenance using sensors and telematics to enhance uptime while minimizing repair expense. Decades ago, companies monitored production parameters to ensure quality. More recent technologies extend the speed and scope of what can be monitored and provide greater visibility into production conditions and trends. Representatives from BMW were on hand at the conference to talk about their experience in improving operations with predictive maintenance. Centrally monitoring the in-service performance of equipment and capital assets is old hat for airlines and jet engine manufacturers. For them, the economic benefits of optimizing maintenance to maximize availability and uptime were significant enough to warrant the large investments they started making decades ago. The same basic techniques can be used for early detection of warranty issues, such as identifying specific vehicles subject to “lemon law” provisions. From IBM’s perspective, these new technologies will enhance the value of Maximo, its asset management software, by extending its functionality to incorporate monitoring and analytics that help users explore options and optimize responses to specific conditions.
IBM Watson is the company’s poster child for how big data and analytics can transform the way organizations operate. The company’s objective is to enable customers to achieve better performance and outcomes by having systems that learn through interactions, providing evidence-based responses to queries. My colleagues Mark Smith and Richard Snow have written about Watson in the contexts of cognitive computing and its application to customer service. And we awarded IBM Watson Engagement Advisor our 2012 Technology Innovation Award. Conference presenters gave an extensive review of progress to date with Watson, featuring innovative ways to use it for diagnosis and treatment in medicine as well as to provide customer support.
Although this was not directly related to big data, IBM also used the conference to announce the availability of Cognos Disclosure Management (CDM) 10.2.1, which will be available both on-premises and in a cloud-based SaaS version. CDM facilitates the creation, editing and publishing of highly structured enterprise documents that combine text and numbers and are created repeatedly and collaboratively, including ones that incorporate eXtensible Business Reporting Language (XBRL) tags. The new version offers improvements in scalability and tagging over the earlier FSR offering. The SaaS version, available on a per-seat subscription basis, will make this technology feasible for midsize companies, enabling them to save the time of highly skilled individuals as well as enhance the accuracy and consistency of, for example, regulatory filings and board books. A SaaS option also will help IBM address the requirements of larger companies that prefer a cloud option.
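For readers unfamiliar with XBRL tagging, a tagged fact is essentially a reported number wrapped in a taxonomy element carrying context and unit references. This minimal fragment, built with Python’s standard library, is illustrative only; the namespace and element are drawn from the U.S. GAAP taxonomy, and a real filing also requires full context, unit and taxonomy declarations.

```python
import xml.etree.ElementTree as ET

# Illustrative XBRL-style numeric fact: the concept (element name)
# comes from a taxonomy; contextRef/unitRef point to declarations
# that a complete instance document would also contain.
NS = "http://fasb.org/us-gaap/2013"
fact = ET.Element(f"{{{NS}}}NetIncomeLoss",
                  contextRef="FY2013", unitRef="USD", decimals="0")
fact.text = "1700000"
print(ET.tostring(fact).decode())
```

Tagging every figure in a filing this way by hand is tedious and error-prone, which is why tooling such as CDM automates it as part of the document production process.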
Most of the use cases presented at the conference were extensions and enhancements of well-worn uses for information technology. However, when it comes to business, the bottom line is what matters, not novelty. Adoption of technology occurs fastest when “new” elements of its use are kept to a minimum. The rapid embrace of the Internet in North America and other developed regions of the world was a function of the substantial investment that had been made over the previous decades in personal computers and local- and wide-area communications networks as well as the training and familiarity of people with these technologies. Big data and the analytics that enable us to apply it have a similar base of preparation. Over the next five years we can expect advances in practical use that benefit businesses and their customers.
Robert Kugel – SVP Research