Tagetik provides financial performance management software. One particularly useful aspect of its suite is Collaborative Disclosure Management (CDM). CDM addresses an important need in finance departments, which routinely generate highly formatted documents that combine words and numbers. Often these documents are assembled by contributors outside of the finance department; human resources, facilities, legal and corporate groups are the most common. The data used in these reports almost always come from multiple sources – not just enterprise systems such as ERP and financial consolidation software but also individual spreadsheets and databases that collect and store nonfinancial data (such as information about leased facilities, executive compensation, fixed assets, acquisitions and corporate actions). Until recently, these reports were almost always cobbled together manually – a painstaking process made even more time-consuming by the need to double-check the documents for accuracy and consistency. The adoption of a more automated approach was driven by the requirement, imposed several years ago by the United States Securities and Exchange Commission (SEC), that companies tag their required periodic disclosure filings using eXtensible Business Reporting Language (XBRL), which I have written about. This mandate created a tipping point in the workload, making the manual approach infeasible for a large number of companies and motivating them to adopt tools to automate the process. Although disclosure filings were the initial impetus to acquire collaborative disclosure management software, companies have found it useful for generating a range of formatted periodic reports that combine text and data, including board books (internal documents for senior executives and members of the board of directors), highly formatted periodic internal reports and filings with nonfinancial regulators or lienholders.
Tagetik’s Collaborative Disclosure Management automates the document creation process, eliminating many repetitive, mechanical functions and reducing the time needed to administer the process and ensure accuracy. Automation can shorten finance processes significantly. For example, our benchmark research on trends in developing the fast, clean close finds that companies that use little or no automation in their accounting close take considerably longer to complete the process than those that fully automate it (9.1 days vs. 5.7 days). Manually assembling the narrative text from perhaps dozens of contributors and combining it with the data used in tables and elsewhere in the document is a time-consuming chore. Regulatory filings are legal documents that must be completely accurate and conform to mandated presentation styles, so they require careful review to ensure accuracy and completeness. Complicating this effort recently are increasingly stringent deadlines, especially in the U.S. Anyone who has been a party to these efforts knows that there can be frequent changes to the narratives and the presentation of the numbers as they are reviewed by different parties. Those responsible need to ensure that any change to a number is automatically reflected everywhere that amount is cited in the document; to use the depreciation and amortization figure as an example, that would include the statement of cash flows, the income statement, the text of the management discussion and analysis and the text or tables of one or more footnotes. Moreover, automated systems afford greater control over the data used: They make it possible to answer the common question of where a number came from quickly and with complete assurance. While inaccuracies in other types of financial documents may not have legal consequences, mistakes can still have reputational or financial costs.
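The single-source principle described above can be sketched in a few lines of Python. This is purely illustrative – the section names, figure and rendering approach are invented for the example and do not reflect Tagetik’s actual data model:

```python
# Minimal sketch: every citation of a figure pulls from one controlled source,
# so a late change propagates to every section automatically.
# All names and figures here are illustrative, not Tagetik's data model.

SOURCE = {"depreciation_amortization": 412.7}  # $ millions, single point of control

SECTIONS = {
    "cash_flow_statement": "Add back depreciation and amortization of ${depreciation_amortization:.1f}M.",
    "income_statement":    "Depreciation and amortization expense: ${depreciation_amortization:.1f}M.",
    "md_and_a":            "D&A of ${depreciation_amortization:.1f}M was consistent with the prior year.",
}

def render(sections, source):
    """Render every section of the document from the controlled source values."""
    return {name: text.format(**source) for name, text in sections.items()}

before = render(SECTIONS, SOURCE)

# A reviewer corrects the figure once, at the source...
SOURCE["depreciation_amortization"] = 415.2

# ...and every citation in the document reflects the change on the next render.
after = render(SECTIONS, SOURCE)
```

The point of the sketch is the design choice: narrative sections reference a controlled value rather than embedding a copy of it, so there is nothing to hunt down when a number changes late in review.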
Those managing the process also spend a great deal of energy simply checking the document to ensure that the various sections include the latest wording, that the numbers are consistent in the tables and text, that amounts have been rounded properly (which can be surprisingly complicated) and that the right people have signed off on every part of the filing. Automation obviates the need for many of these tasks. Tagetik’s CDM builds workflow into the process, so handoffs are automated, participants get alerts if they haven’t completed their steps in a timely fashion, and administrators can keep track of where everyone is in the process. Workflow also promotes consistent execution of the process, and the workflows can be easily modified as needed.
In designing Collaborative Disclosure Management, Tagetik took advantage of users’ widespread familiarity with Microsoft Excel and Word to reduce the amount of training required to use its product. CDM’s workflow design makes it relatively easy for business users to define and modify business process automation. Typically, individuals or small groups work on different sections of the document. CDM enables multiple contributors from finance, accounting, legal, corporate and other functions to work on their part of the document without being concerned about other contributors’ versions. Work can proceed smoothly, and those administering the process can see at any time which components have been completed, are in progress or have not yet been started. Tagetik’s software can cut the time required to prepare any periodic document: Once a company has configured its system to create what is in effect a template, it’s relatively easy to generate these documents on a monthly, quarterly or annual basis. The numbers relevant to the current period are updated from the specified controlled sources, and references to tabular data within the text are automatically adjusted to tie back to these new figures. Often a large percentage of the narrative text is boilerplate that either needs no updating or requires only limited editing to reflect new information. Starting with the previous edition of the report, contributors can quickly mark up a revised version, and reviewers can focus only on what has changed. Other important automation features are data validation, which reduces errors and revisions, and the system’s ability to round numbers using the appropriate statutory methodology.
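Statutory rounding is trickier than it sounds, which is one reason automating it matters. A minimal Python sketch (with made-up figures) shows how the mandated rounding rule changes the result, and why line items rounded independently may not tie to the rounded total:

```python
# Why rounding in filings can be complicated: the rounding rule is mandated by
# the jurisdiction, and independently rounded line items may not sum to the
# rounded total. The figures below are invented for illustration.
from decimal import Decimal, ROUND_HALF_UP, ROUND_HALF_EVEN

items = [Decimal("10.5"), Decimal("20.5"), Decimal("30.25")]  # $ thousands

# Two common statutory conventions give different rounded line items:
half_up   = [x.quantize(Decimal("1"), rounding=ROUND_HALF_UP)   for x in items]
half_even = [x.quantize(Decimal("1"), rounding=ROUND_HALF_EVEN) for x in items]

print(half_up)    # [11, 21, 30] -- "commercial" round-half-up
print(half_even)  # [10, 20, 30] -- banker's rounding

# Rounding the total is not the same as totaling the rounded items:
total = sum(items).quantize(Decimal("1"), rounding=ROUND_HALF_UP)
print(total, sum(half_up))  # 61 vs 62 -- a discrepancy a reviewer must reconcile
```

A disclosure management system applies one configured convention everywhere and can flag or plug the kind of rounding discrepancy shown on the last line, rather than leaving it for a late-night manual check.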
CDM also handles XBRL tagging, which is essential for all SEC documents and necessary for an increasing number of regulatory filings around the world. The software specifically handles tagging for the two main European prudential regulatory filings for banks and other credit-extending institutions: COREP (common reporting, which covers capital requirements) and FINREP (financial reporting performed consistently across multiple countries).
Companies can gain several key benefits by automating the production of their periodic regulatory filings and of internal or external financial reports that combine text and data. One of the most important is time. Automation can substantially reduce the time that highly trained and well-compensated people spend on mechanical tasks (freeing them to do more productive things), and the process can be completed sooner. Having the basic work completed sooner gives senior executives and outside directors more time to review the document before it must be filed or made public – time that can be devoted to polishing the narratives or, if necessary, to lengthening upstream deadlines to handle last-minute developments and to considering options for how best to treat accounting events. Automation can also reduce the chance of errors, since the numbers tie directly back to the source systems and (if properly configured) the system ensures that references in the narratives and footnotes to items in tables agree completely with the numbers in those tables. Restatements of financial reports caused by errors are relatively rare, but when they occur they are exceptionally costly to public companies’ reputations.
Disclosure management systems are an essential component of any financial performance management (FPM) system. All midsize and larger corporations should be using this software to automate the production of their periodic mandated filings and other documents that combine text and data. They will find that it cuts the time and effort required to produce these documents, gives senior executives and directors more time to review and craft the final versions, and reduces the chance of errors in the process. Companies that are using older FPM software should investigate replacing it with an FPM suite to gain the additional capabilities – including disclosure management – that newer suites offer. Tagetik’s should be among the financial systems evaluated for the office of finance.
Robert Kugel – SVP Research
Our research consistently finds that data issues are a root cause of many problems encountered by modern corporations. One of the main causes of bad data is a lack of data stewardship – too often, nobody is responsible for taking care of data. Fixing inaccurate data is tedious, and creating IT environments that build quality into data is far from glamorous, so these sorts of projects are rarely demanded and funded. The magnitude of the problem grows with the company: Big companies have more data and bigger issues with it than midsize ones. But companies of all sizes ignore this at their peril: Data quality, which includes accuracy, timeliness, relevance and consistency, has a profound impact on the quality of work done, especially in analytics, where the value of even brilliantly conceived models is degraded when the data that drives them is inaccurate, inconsistent or not timely. That’s a key finding of our finance analytics benchmark research.
A main requirement for the data used in analytics is that it be accurate because accuracy affects how well finance analytic processes work. One piece of seemingly good news from the research is that a majority of companies have accurate data with which to work in their finance analytics processes. However, only 11 percent said theirs is very accurate, and there’s a big difference between accurate enough and very accurate. The degree of accuracy is important because it correlates with, among other things, the quality of finance analytics processes and the agility with which organizations can respond to and plan for change.
Although almost all (92%) of the companies that have very accurate data also have a process that works well or very well, that assessment drops to 43 percent among companies that said their data is just accurate. Even in small doses, bad data has an outsized impact on finance analytic processes. Inaccurate, inconsistent and noncomparable data can seriously gum up the works as analysts search for the source of an issue and then try to resolve it. As issues grow, dissatisfaction with the process increases. Just 22 percent of those with somewhat accurate data, and none of the companies with data that is not accurate, said their company has a process that works well or very well.
To be truly useful for business, analytics provided to executives, managers and other decision-makers must be fresh. The faster a company can deliver assessments of and insight into what just happened, the sooner it can respond to those changes. Almost all (85%) companies with very accurate data said they are able to respond immediately or soon enough to changes in business or market conditions, but only 35 percent of those with accurate data and just 24 percent of those with somewhat accurate data are able to do so.
Moreover, having timely data enables companies to react in a coordinated fashion as well as quickly. Companies that operate in a coordinated fashion are usually more successful in business than those that are only somewhat coordinated, in the same way that a juggler who is only somewhat coordinated drops a lot of balls. Almost all (86%) companies whose data is all up to date said they are able to react to change in a coordinated or very well coordinated fashion, compared to just 38 percent of those whose data is mostly up to date and 19 percent of those that have a significant percentage of stale data. Three-fourths (77%) of companies that have very accurate data are able to respond to changes in a coordinated or very well coordinated fashion, but just one-third (35%) of those with accurate data and 14 percent of those with somewhat accurate data are able to accomplish this.
Speed is essential in delivering metrics and performance indicators if they are to be useful for strategic decision-making, competitive positioning and assessing performance. Companies that can respond sooner to opportunities and threats are more able to adjust to changing business conditions. The research finds that fewer than half (43%) of companies are able to deliver important metrics and performance indicators within a week of a period’s end – that is, soon enough to respond to an emerging opportunity or threat.
One way to speed up the delivery of analytics is to have analysts focus their time on the analytics. But the research shows that not many do: A majority of analysts spend the biggest chunk of their time dealing with data-related issues rather than on the analysis itself. Two-thirds (68%) of participants reported that they spend the most time dealing with the data used in their analytics – waiting for it, reviewing it for quality and consistency or preparing it for analysis. Only one-fourth (28%) said their efforts focus most on analysis and trying to determine root causes, which are the main reasons for doing the analysis in the first place. In other words, in a majority of companies, analysts don’t spend enough time doing what they are valued and paid for.
The results also show that there are negative knock-on effects of spending time on data-related tasks rather than on analysis. More than half (56%) of the companies that spend the biggest part of their time working on analytics can deliver metrics and indicators within a business week, compared to just one-third (36%) of those that spend the biggest part of the time grappling with data issues. Having high-quality, timely and accessible data therefore is essential to reaping the benefits of finance analytics.
Data issues diminish productivity in every part of a business as people struggle to correct errors or find workarounds. Issues with data are a man-made phenomenon, yet companies seem to treat bad data as a force of nature, like a tornado or an earthquake, that is beyond their control to fix. Our benchmark research on information management suggests that inertia in tackling data issues is more organizational than technical: Companies simply do not devote sufficient resources (staff and budget) to addressing this ongoing issue. One reason may be that the people who must confront data issues in their day-to-day work fail to see the connection between those issues and getting the results from analytics that they should.
Excellent data quality is the result of building quality controls into data management processes. Our research finds a strong correlation between the degree of data quality efforts in finance analytics and the quality of the finance department’s analytic processes and output, and ultimately its timeliness and its value to the company. Corporations generally – and finance organizations in particular – must pay closer attention to the reliability of the data they use in their analytics. The investment in having better data will pay off in better analytics.
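As a rough illustration of what building quality controls into data management processes can mean in practice, here is a minimal Python sketch. The record fields and the three rules – one each for consistency, timeliness and accuracy – are invented for the example and stand in for whatever checks a finance organization would actually define:

```python
# A minimal sketch of building quality checks into a data pipeline, so issues
# are caught on the way in rather than hunted down during analysis.
# The fields and rules below are illustrative only.
from datetime import date, timedelta

def validate(record, as_of):
    """Return a list of data-quality issues found in one finance record."""
    issues = []
    # Consistency: components should tie to the reported total.
    if record["revenue"] - record["cost"] != record["margin"]:
        issues.append("margin does not tie to revenue minus cost")
    # Timeliness: stale data degrades any analysis built on it.
    if as_of - record["updated"] > timedelta(days=7):
        issues.append("data is more than a week old")
    # Accuracy (sanity check): negative revenue is almost certainly an entry error.
    if record["revenue"] < 0:
        issues.append("revenue is negative")
    return issues

record = {"revenue": 1000, "cost": 600, "margin": 400, "updated": date(2024, 3, 1)}
print(validate(record, as_of=date(2024, 3, 5)))  # [] -- passes all three checks
```

Running every inbound record through checks like these is the pipeline-level equivalent of the data stewardship discussed above: the cost of a rule is paid once, while the cost of an uncaught error is paid by every analysis that touches the bad number.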
Robert Kugel – SVP Research