
Optimization is the application of algorithms to sets of data to guide executives and managers in making the best decisions. It's a trending topic because it is becoming easier to use optimization technologies and techniques to better manage a variety of day-to-day business issues. I expect that optimization, once the preserve of data scientists and operations research specialists, will become mainstream in general-purpose business analytics over the next five years.

Optimization was first adopted by businesses in the middle of the 20th century, aided by the introduction of digital computers. The first technique to gain broad adoption, albeit for a few specific purposes, was linear programming, one of the most basic optimization methods. Linear programming enables analysts to quickly determine how to achieve the best outcome (such as maximum unit volume or lowest cost) in a given situation. They do so using a mathematical model that captures the key variables that go into the decision and any constraints that may affect it. A food processor, for instance, may use three types of cooking oil to make a product. To maximize its profit, the company needs to determine the exact proportions of the three oils that result in the lowest production cost. However, it can't simply choose the cheapest of the three in every case, because flavor and shelf-life requirements limit the maximum percentage of each oil it can use. Linear programming using the simplex algorithm quickly solves this problem.
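To make the oil-blending problem concrete, here is a minimal sketch in Python using SciPy's linear programming solver. All of the costs and blending limits are assumed for illustration; they are not from any actual food processor.

```python
# Hypothetical oil-blending problem (all costs and limits are assumed):
# choose fractions of three cooking oils that minimize the cost of the blend,
# subject to flavor/shelf-life caps on each oil's maximum share.
from scipy.optimize import linprog

costs = [2.0, 3.0, 5.0]        # cost per kg of oils A, B and C (assumed)
max_share = [0.5, 0.6, 0.8]    # maximum allowed fraction of each oil (assumed)

# Equality constraint: the three fractions must sum to 1 (a complete blend)
A_eq = [[1, 1, 1]]
b_eq = [1]

res = linprog(c=costs, A_eq=A_eq, b_eq=b_eq,
              bounds=list(zip([0, 0, 0], max_share)),
              method="highs")  # HiGHS provides simplex and interior-point solvers

print(res.x)    # optimal fractions of each oil
print(res.fun)  # minimum cost per kg of the blend
```

With these numbers the solver fills the cap on the cheapest oil first (50% of A), then uses the next-cheapest (50% of B) and none of C, yielding a blend cost of 2.50 per kg.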

As computing capabilities became increasingly affordable, companies could use more complex algorithms to handle ever more difficult optimization problems. For instance, the airline industry used optimization to determine how best to route aircraft between cities and how to staff flight crews. Not only can software find the best solution for scheduling these assets in advance, it also can rapidly re-optimize the solution when weather or mechanical issues force a change in how aircraft and crews are deployed. Airlines were also in the vanguard in the 1980s when they started using revenue management techniques. In this case, the optimization process was designed to enable established airlines to compete against low-cost startups. Revenue management enabled the large carriers to offer low fares to price-sensitive but flexible vacationers without sacrificing the higher fares that less flexible business travelers were willing to pay. The same approach was adopted by hotels in pricing their rooms. Starting in the 1990s, markdown management software, which I have written about, gained ground. It enables retailers to make more intelligent pricing decisions by monitoring the velocity of purchases of specific items and adjusting prices to maximize revenue. To be feasible, each of these optimization problems requires large data sets and sufficient raw computing power.

We’re now on the cusp of “democratizing” what I call optimization analytics. Big data technologies are making it feasible and affordable even for midsize companies to work with much larger data sets than they could in the past. Our benchmark research on big data analytics finds that about half of participating companies already use analytics with big data. This is partly the result of more powerful and affordable data processing resources and partly because companies have invested in systems to automate many functions. The rich data sets created by these business applications provide corporations with the raw material for analysis. This data has the potential to enable businesses to make more intelligent decisions. From a practical standpoint, though, the value of these large data sets can be realized only by moving optimization analytics out of the exclusive realm of data scientists and into the hands of business analysts. These analysts are the ones who have a sufficient understanding of the business and the subtleties of the data to find useful and repeatable optimization opportunities. Three-fourths of companies in our research said that they need these business skills (“domain expertise”) to use big data analytics successfully.

Optimization analytics is a breakthrough technology with the potential to improve business performance and create a competitive advantage. You can’t do optimization in your head, and it’s not feasible in desktop spreadsheets for anything but the most basic use cases, such as linear programming optimizations on relatively small data sets. This is a good reason for almost any company to consider adopting optimization software.

Another reason companies will find it attractive to apply optimization analytics broadly is that the results of optimization routines are often superior to common rules of thumb or reliance on instinct and experience. One of the most important lessons for executives about optimization analytics is that optimal solutions are sometimes (but, crucially, not always) counterintuitive, running against established norms. For instance, in markdown management, retailers often have found that smaller, more frequent price reductions maximize profits and produce a considerable improvement in sales over the end-of-the-season price slashing that was once considered best practice. In financial services, charging the best customers more for loans and other services turns out to be the optimal choice for an institution's bottom line. Another important insight from our collective experience with optimization is that while the value of these analytics realized in a single event or transaction may be small, the impact on profitability and competitiveness can be measurable when they are applied broadly across a business.
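The markdown insight can be illustrated with a toy model. Every figure below is assumed for the sake of the sketch (demand curve, stock level, season length); real markdown systems estimate demand from sales velocity rather than positing it. The point is only to show how, with a fixed season and finite stock, gradual markdowns can out-earn end-of-season slashing under one plausible demand model.

```python
# Toy markdown-management comparison (all figures assumed, not from any research):
# a 10-week selling season, 2,000 units of stock, and weekly demand that rises
# as the price falls below the full price of 50.

def season_revenue(weekly_prices, inventory=2000, base_demand=100,
                   sensitivity=3.0, full_price=50.0):
    """Total season revenue under a simple linear price-sensitive demand model."""
    revenue = 0.0
    for price in weekly_prices:
        # Demand grows with the discount but can never exceed remaining stock
        demand = min(inventory, int(base_demand + sensitivity * (full_price - price)))
        revenue += demand * price
        inventory -= demand
    return revenue

# Strategy 1: hold full price for eight weeks, then slash 50% to clear stock
slash = [50.0] * 8 + [25.0, 25.0]
# Strategy 2: smaller, more frequent reductions spread across the season
gradual = [50.0 - 2.0 * week for week in range(10)]

print(season_revenue(slash), season_revenue(gradual))
```

Under these assumptions the gradual strategy earns more over the season, because mid-priced weeks capture customers who would never have paid full price but pay well above the clearance price.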

At this point optimization analytics is in a dual mode. On the one hand, there are proven examples of the narrow application of optimization such as those mentioned above. On the other, bringing optimization analytics to the masses is only beginning. Some vendors have made progress in simplifying their analytics, but mainstream products are only on the horizon. It’s also important to recognize that, as with past breakthroughs in information technology, there are bound to be more duds than success stories in initial attempts at using optimization analytics. Experience suggests that a small number of companies that have strong analytical skills and a rigorous approach to managing company data will prove to be the leaders in finding profitable opportunities for applying optimization technologies and techniques. Others will do well to find these examples and consider how to apply them to their own organizations.


Robert Kugel – SVP Research

One of the issues in handling the tax function in business, especially where it involves direct (income) taxes, is the technical expertise required. At the more senior levels, practitioners must be knowledgeable about both accounting and tax law. In multinational corporations, understanding the differences between accounting and legal structures in various localities, and their effects on tax liabilities, requires even deeper knowledge. Yet when I began to study the structures of corporate tax departments, I was struck by the scarcity of senior-level titles in them. This may reflect the low profile of the department in most companies and the tactical nature of the work it has performed. Advances in information technology have the potential to automate most of the manual tasks tax professionals perform. This increase in efficiency will enable tax departments to fill a more strategic, important role in the companies they serve.

In the past (and still in many organizations today) there was a sharp pyramid of value-added work in the tax function, with tax attorneys at the top, corporate counsel in the middle and tax practitioners at the bottom. The tax attorney, versed in the intricacies of laws and their applications, especially in cross-border situations, typically has had the greatest ability to minimize tax expenditures. The role requires a combination of inspiration and art to see the underlying logic of tax laws and legal structures and to apply creative interpretations to black-letter statutes, and it has been rewarded accordingly. Senior corporate counsel has weighty responsibilities and therefore has merited elevated titles. But the role of tax preparation has been oriented toward functional execution. It typically is done by hard-working experts who have limited impact on policy and decision-making. Because the workings of this group are particularly sensitive, it also has a culture that attracts people who tend to be tight-lipped and not given to self-promotion. This hierarchy has helped to keep tax matters outside the mainstream activities of the corporation, as I have discussed.

Today, information technology can flatten the value-added pyramid by automating routine work. Doing this gives skilled practitioners in the tax department more time to spend on value-adding analysis and contingency planning because they spend less time on data gathering, data transformation and calculations. Increased productivity creates more time to find tax or cash flow savings, as well as to provide better-informed guidance on alternative strategies. To accomplish this, corporations must automate their tax provisioning process; most will benefit from having a tax data warehouse, which I have written about. However, our recent Office of Finance research finds that this is not widely done. Instead, almost all midsize and larger companies (90%) use spreadsheets exclusively or mainly to manage their tax provisioning process, including calculations and analysis, and this demands considerable manual effort.

Desktop spreadsheets are not well suited to any repetitive collaborative enterprise task or as a corporate data store. They are a poor choice for managing taxes because they are error-prone, lack transparency, are difficult to use for data aggregation, lack controls and have a limited ability to handle more than a few dimensions at a time. Data from corporate sources, such as ERP systems, may have to be adjusted and transformed to put this information into its proper tax context, such as performing allocations or transforming the data so that it reflects the tax-relevant legal entity structure rather than corporate management structure. Moreover, in desktop spreadsheets it is difficult to parse even moderately complex nested formulas or spot errors and inconsistencies. Pivot tables have only a limited ability to manage key dimensions (such as time, location, business unit and legal entity) in performing analyses and reporting. As a data store, spreadsheets may be inaccessible to others in the organization if they are kept on an individual’s hard drive. Spreadsheets are rarely documented well, so it is difficult for anyone other than the creator to understand their structure and formulas or their underlying assumptions. The provenance of the data in the spreadsheets may be unclear, making it difficult to understand the source of discrepancies between individual spreadsheets as well as making audits difficult. Companies are able to deal with spreadsheets’ inherent shortcomings only by spending more time than they should assembling data, making calculations, checking for errors and creating reports.

On the other hand, a tax data warehouse addresses spreadsheet issues. It is a central, dedicated repository of all of the data used in the tax provisioning process, including the minutiae of adjustments, reconciliations and true-ups. As the authoritative source, it ensures that data and formulas used for provisioning are consistent and easily audited. Since it preserves all of the data, formulas and legal entity structures exactly as they were in the tax period, it’s far easier to handle a subsequent tax audit, even several years later. In this respect a dedicated tax data warehouse has an advantage over corporate or finance department data warehouses, which are designed for general use and often are modified from one year to the next as a result of divestitures or reorganizations.

Another benefit of automating provisioning and having a tax data warehouse is that this approach provides greater visibility and transparency (at least internally) into tax-related decisions. This gives senior executives greater certainty about and control over tax matters and allows them to engage more in tax-related decisions. In companies where executives are more engaged in tax, the tax department gains visibility. Also, because automation enables better process and data control, external auditors spend less time examining the process and tax-related calculations in financial filings, and it cuts the time the tax department might need to spend in audit defense with tax authorities. Process automation enables tax departments to increase their efficiency and give members more time to apply their tax expertise to increase the business value of their work, thereby flattening the value-added pyramid.

The scope of the value that tax practitioners can add is broadening because greater visibility into the methods used in direct tax provisioning will be increasingly important. I have noted that companies with significant operations in multiple tax jurisdictions are likely to face a more challenging future. While I’m skeptical that there will be a massive change soon in how tax authorities share cross-border tax information (which is more a reflection of the competence of the taxing authorities than of their motivation), it’s worth assuming that such sharing will grow at least gradually, and therefore companies must be prepared to deal with increasingly better-informed tax officials. Transparency also fosters consistency in tax treatments and supports managing the degree of risk that CEOs, CFOs and their boards are willing to take on in weighing how conservative or aggressive the corporation should be in handling taxes. This is another way for tax practitioners to increase their value and visibility in their corporation.

Forward-looking companies have been making the transition to automating their direct tax provisioning process, redefining their approach to managing taxes and giving their tax departments greater visibility. It’s unlikely that these companies are using desktop spreadsheets to any meaningful degree. It’s not clear when the mainstreaming of the tax department will become common, but it’s probably at least several years away. Evidence of a fundamental shift in how corporations manage their income tax exposure will exist when a majority of midsize and larger companies use dedicated software rather than spreadsheets for this function. That’s also likely to be when Senior Vice President – Tax becomes a common title. This promotion won’t be the result of title inflation. It will happen because the tax department’s role will be more important, making a bigger, more visible contribution to the company as practitioners focus more on analyses that optimize tax decisions and far less on the calculations and other repetitive mechanical processes that consume time but produce little value.

I recommend that every CFO of a company operating in multiple tax jurisdictions with even a slightly complex legal structure consider automating tax provisioning and deploying a third-party (rather than a custom-built) tax data warehouse.


Robert Kugel – SVP Research
