
I’ve written before about the increasing importance of having a solid technology base for a company’s tax function, and it’s important enough for me to revisit the topic. Tax departments are entrusted with a highly sensitive and essential task in their companies. Taxes usually are the second largest corporate expense, after salaries and wages. Failure to understand this liability is expensive – either because taxes are overpaid or because of fines and interest levied for underpayment. Moreover, taxes remain a political issue, and corporations – especially larger ones – must be mindful of the reputational implications of their tax liabilities.

In this context of seriousness, there are five interrelated requirements for the work that tax departments do:

  • The work must be absolutely accurate.
  • Corporate and tax executives must be certain that the numbers are right – instilling confidence is key.
  • Certainty depends on transparency: Source data and calculations must be demonstrably accurate, and any questions about the numbers must be answerable without delay.
  • Speed is critical. All department tasks related to tax planning, analysis and provisioning can become sources of delay in core finance department processes. Being able to quickly execute data collection and calculations allows more time to explore the results and consider alternatives.
  • Control of the process is essential. Only designated, trustworthy individuals can be permitted to access systems, perform tasks and check results. Control promotes accuracy, certainty and transparency.

These requirements form the basis of a business case for a tax data warehouse. Properly executed, it promotes all of these qualities. However, our forthcoming benchmark research on the Office of Finance shows that not many corporations have adopted one. Rather, most companies rely mainly or entirely on spreadsheets for provisioning income tax: managing data, calculations and modeling. More than half (52%) of companies use spreadsheets alone to handle income taxes, while just 10 percent use a dedicated application designed for that purpose. Desktop spreadsheets are a poor choice for managing taxes: they are error-prone, lack transparency and controls, are difficult to use for data aggregation and can handle only a few dimensions at a time. To deal with these deficiencies, companies have to spend more time than they should assembling data, making calculations, checking for errors and creating reports.

There are strong reasons to change this reliance on inappropriate tools. One is that more companies must deal with an increasingly complex tax environment. Despite decades of talk about simplifying the tax code in the United States, it has grown ever more intricate. For those with a long memory, there was some simplification in the 1980s, but since then complexity has returned with a vengeance. Moreover, as corporations grow and expand internationally, their legal entity structure becomes more multifaceted, and their source systems for collecting and managing tax data can become fragmented. Unless the tax function is completely centralized, companies that operate in more than a handful of tax jurisdictions can find it hard to coordinate their tax data, calculations and processes. Centralization is not a cure-all, either, as the lack of local presence poses its own issues in coordinating with local operations and finance organizations.

Another reason is that national taxing authorities are beginning to improve their coordination with one another, which means that tax departments will have to deal with increasing complexity in reporting and a more stringent compliance environment. In 2013, the Organization for Economic Cooperation and Development (OECD) published a report titled “Action Plan on Base Erosion and Profit Shifting”, which describes the challenges national governments face in enforcing taxation in an increasingly global environment with a growing share of digital commerce. The OECD also is providing a forum for member governments to take action (including collective action) to strengthen their tax collection capabilities. Although the process of increased government coordination is likely to take years to unfold, the outcome almost certainly will be to put additional pressure on companies that have legal entities domiciled in multiple countries. The impact is likely to mean longer and more frequent audits (including concurrent audits by multiple tax authorities) with more detailed requests for information. Increased data sharing among tax authorities will make it even more critical that the tax data – and all the minutiae of adjustments, reconciliations and year-to-year permanent changes – be absolutely accurate, consistent and readily available.

In addition, governments worldwide are increasing their electronic collection of tax data. This enables them to improve scrutiny of tax returns by applying analytic techniques that highlight errors and discrepancies as well as to identify suspicious activities or potentially aggressive tax treatments. Eliminating paper forms allows tax authorities to require even more data from companies. In this environment, having accurate, consistent data becomes essential. Having time to consider the best tax-related options thus becomes even more valuable.

In this increasingly complex and demanding environment it is good news that technology, such as a tax data warehouse, has advanced to become feasible and affordable for the kinds of organizations that can benefit most from it. A tax data warehouse addresses all of the needs of a tax department listed above. A single source of data minimizes errors and ensures consistency. It also promotes transparency, especially when used in conjunction with a dedicated direct tax management application. Because it is possible to exactly recreate the assumptions, data and methods used and because the data and the calculations are consistent, the answer to “where did that number come from?” can be found quickly and with complete assurance. The process is better controlled, access to the records and application is more secure, and the entire process is much more easily audited than when working with spreadsheets. Since all of the numbers and assumptions are kept intact and readily available, an audit defense can be performed with less effort and greater confidence. In addition, the ability to create multiple scenarios with different assumptions and treatments enables tax and legal departments to determine the best approach for the company’s risk tolerance.

The financial impact of these benefits can be considerable because most companies that operate in multiple direct (income) tax jurisdictions spend considerable amounts of time gathering and assembling data manually. As noted, they often use spreadsheets – sometimes dozens or even hundreds of them – for tax calculations and data storage. These spreadsheet-based systems are built on a weak foundation because of the data issues inherent when there are multiple systems of record and inconsistent data-related processes.

The evolving tax environment means that tax departments must be in the mainstream of finance organizations. Our research on the financial close finds that a majority of finance executives do not know how long it takes for the tax department to complete quarterly tax calculations. Executives who are not tax professionals usually do not appreciate the important difference between finance and tax data requirements. Corporations are constantly changing their organizational structure as well as acquiring and divesting business units. As these events occur, accounting and management reporting systems adapt to the changes both in the current as well as past periods. Tax data, on the other hand, must be stable. Legal obligations to pay taxes are based on facts as they exist in specific legal entities operating in a specific tax jurisdiction in a specific period. From a tax authority’s standpoint, these facts never change even as operating structures and ownership evolve. Audit defense requires a corporation to assemble the facts and related calculations, sometimes years after the fact. A general finance data warehouse does not deliver this capability because it is not – and for all practical purposes cannot be – structured to satisfy the needs of a tax department, particularly those that operate in multiple jurisdictions.

To ensure accuracy and inspire confidence in the products of the tax department’s work, it is important for tax departments to tightly control the end-to-end process of taking numbers from source systems, constructing tax financial statements, calculating taxes owed and keeping track of cumulative amounts and other balance sheet items related to taxes. Transparency is the natural result of a controlled process that uses a single set of all relevant tax data. A readily accessible authoritative data set makes tax department operations more efficient. Reducing the time and effort to execute the tax department’s core functions frees up the time of tax professionals for more useful analysis. In a more challenging tax-levying environment, having tax data and tax calculations that are immediately traceable, reproducible and permanently accessible provides company executives with greater certainty and reduces the risk of noncompliance and the attendant costs and reputation issues. Having an accurate and consistent tax data warehouse of record provides corporations and their tax departments with the ability to better execute tax planning, provisioning and compliance.

Regards,

Robert Kugel – SVP Research

Our benchmark research on business analytics finds that just 13 percent of companies overall and 11 percent of finance departments use predictive analytics. I think advanced analytics – especially predictive analytics – should play a larger role in managing organizations. Making it easier to create and consume advanced analytics would help organizations integrate them more broadly into business planning and execution. This was one of the points that SPSS, an IBM subsidiary that provides analytics, addressed at IBM’s recent analyst summit.

Predictive analytics are especially useful for anticipating trend divergences or spotting them earlier than one might otherwise. For example, sales may be up compared to a prior period, but is it simply month-to-month variability or the start of an upward trend? Better analytical techniques can help distinguish between normal variation and the beginning of a new trend. By using analytics, one might even discern that while the revenue numbers have been positive recently, the underlying data contains warning signs that point to diminishing volumes, lower prices or both in the future.
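The distinction between month-to-month variability and a genuine trend can be sketched with a simple slope-significance test. This is an illustrative approach using common Python libraries, not a description of SPSS’s methods, and the sales figures are simulated.

```python
import numpy as np
from scipy import stats

def detect_trend(monthly_sales, alpha=0.05):
    """Test whether a series shows a statistically significant linear
    trend, or only variability around a stable level."""
    x = np.arange(len(monthly_sales))
    result = stats.linregress(x, monthly_sales)
    # A small p-value on the slope suggests a real trend, not just noise.
    return result.slope, result.pvalue < alpha

np.random.seed(0)
# Flat series with noise: variability, no underlying trend
flat = 100 + np.random.normal(0, 5, 24)
# Series with a genuine upward drift of 2 units per month
rising = 100 + 2 * np.arange(24) + np.random.normal(0, 5, 24)

print(detect_trend(flat))    # slope near zero; trend flag usually False
print(detect_trend(rising))  # clearly positive slope; trend flag True
```

More sophisticated techniques (control charts, time-series decomposition) refine the same idea, but even this simple test formalizes the question “is this variation or a trend?” that the paragraph above poses.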

Predictive analytic models are created using a top-down or a bottom-up approach, or some combination of the two; SPSS offers tools for both. The top-down approach involves forming a statistical hypothesis based on business observations or theories and then testing that hypothesis using statistical methods. IBM SPSS Statistics enables users to build a relevant picture from a sample, as well as test assumptions and hypotheses about that picture. A bottom-up approach unleashes automated data mining techniques on data sets (typically large ones) to distill statistically significant relationships from them. SPSS Modeler is designed for experienced data miners as well as business analysts, speeding the creation and refinement of predictive models. Often, companies employ both approaches iteratively to refine and improve models.
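A minimal illustration of the two approaches, using common Python libraries rather than the SPSS products themselves; the hypothesis, the data and the candidate drivers below are all invented for the example.

```python
import numpy as np
from scipy import stats
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)

# --- Top-down: state a hypothesis, then test it statistically.
# Hypothesis (illustrative): customers who received a promotion spend more.
promo = rng.normal(105, 10, 200)     # simulated spend, promotion group
no_promo = rng.normal(100, 10, 200)  # simulated spend, control group
t_stat, p_value = stats.ttest_ind(promo, no_promo)
print(f"top-down: t={t_stat:.2f}, p={p_value:.4f}")

# --- Bottom-up: let an automated technique surface relationships
# in the data without a prior hypothesis.
X = rng.normal(size=(400, 3))  # three candidate drivers
y = (X[:, 0] + 0.1 * rng.normal(size=400) > 0).astype(int)  # only the first matters
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print("bottom-up feature importances:", tree.feature_importances_)
```

The top-down test confirms or rejects a specific, pre-stated question; the bottom-up mining run discovers that only the first driver carries signal. Iterating between the two – mining to generate candidate hypotheses, then testing them formally – is the combined workflow the paragraph above describes.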

I used to joke that the main value proposition of SPSS was that while its chief rival, SAS, required its users to have a Ph.D. in statistics, SPSS could be used by anyone with a master’s degree. Applying predictive analytics techniques is simple in concept but far from simple to integrate into day-to-day business beyond their traditional roles such as market research. This partly explains why so few companies have woven predictive analytics into their planning and review cycles. It’s possible to create relatively simple predictive models, but for many business issues, such models may be too simplistic to be useful. And they may not be reliable enough because they generate too many false positives (people spend too much time chasing non-issues) or false negatives (missing important developments or breaks in trends).

Beyond the data and technology challenges posed by advanced analytics, there are significant people issues that companies must address to make their use practical. These can be more difficult to tackle than most business/IT issues because of the experience and skills required; indeed, our benchmark research in predictive analytics finds a persistent lack of adequate resources. Automating general business processes, for instance, requires bringing together business subject-matter experts with people who understand IT. Advanced analytics, however, requires three sets of skills – business subject-matter expertise, IT and statistics – that are rarely found in any single individual. Communication among people who have these skills often is difficult because each has a limited appreciation of the others’ domains and often has difficulty expressing the nuances of his or her own area of expertise.

Today there’s greater focus than ever on analytics, partly because an explosion of available data has made it possible and even necessary to make sense of it. As part of IBM, SPSS has been benefitting from the parent’s “smarter planet” marketing theme. SPSS also has taken steps to expand demand for its tools by reducing the people barriers to adopting advanced analytics. One step has been to automate data preparation for use in Statistics and Modeler. Another is an automated modeler that takes several different approaches to analyze a set of data in a single run and then compares the results. Yet despite these steps, I expect advanced analytics to require specialized skills for many years.
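The idea behind an automated modeler – trying several modeling approaches on the same data in a single run and comparing the results – can be sketched as follows, using scikit-learn as a stand-in; this is not a description of how SPSS implements it.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier

# Synthetic data stands in for a business data set.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

candidates = {
    "logistic": LogisticRegression(max_iter=1000),
    "tree": DecisionTreeClassifier(random_state=0),
    "knn": KNeighborsClassifier(),
}

# Score each candidate with cross-validation, then keep the best.
scores = {name: cross_val_score(model, X, y, cv=5).mean()
          for name, model in candidates.items()}
best = max(scores, key=scores.get)
for name, score in scores.items():
    print(f"{name}: {score:.3f}")
print("best model:", best)
```

Automating this compare-and-select loop is precisely what lowers the statistical-skills barrier: the user supplies data and a target, and the tool handles the methodological comparison.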

Therefore, I also expect adoption of advanced analytics to happen slowly. Most executives at the senior and even middle levels of corporations have limited familiarity with advanced analytics; many last encountered statistics as a required business school course. To spur broader adoption of predictive and other advanced analytics, IBM and others must foster a “pull” approach to marketing analytics. Business executives need to know that advanced analytics are available and of practical value, especially outside of traditional statistics-heavy realms such as consumer research and fraud detection. Sales planning, financial planning, enterprise risk management, maintenance and customer service are all areas ripe for predictive and other advanced analytics; our benchmark research found all of these cited as planned future uses. It’s easy to convince analysts like me of the value of analytics; it’s much harder to get business executives to incorporate them into day-to-day practices. It would help IBM SPSS’s own cause to identify promising uses of advanced analytics by function and industry and provide a canned blueprint that can serve as a starting point. Such a blueprint would incorporate a business case illustrating the problem, the suggested steps for addressing it and the scope of benefits that can be realized.

The continuing explosion of data will give rise to an increasing number of ways that business and finance executives can use information to their advantage. But first they have to know that they can.

Regards,

Robert Kugel – SVP Research
