Robert Kugel's Analyst Perspectives

Encountering New Bottlenecks with Oracle’s Breakthrough Technology

Posted by Robert Kugel on Oct 3, 2012 11:43:24 AM

Two key themes that emerged from Larry Ellison’s Sunday night keynote at this year’s Oracle OpenWorld were faster processing speed and cheaper storage. An underlying purpose of these themes was to assert the importance of Oracle’s strategic vertical integration of hardware and software with the acquisition of Sun. I try to view technology keynotes like this from the perspective of a practical business user. Advancements such as these are important because enhancing the performance and cost-effectiveness of IT infrastructure can drive substantially improved business capabilities. As I’ve noted in the past, the ability to rapidly process large amounts of data provides business users with significant new capabilities in areas such as complex event processing, social media analytics and the analysis of unstructured or semi-structured data. In planning, it has the potential to change how companies perform a wide range of analytics-driven processes, especially budgeting and forecasting. It makes it feasible to more fully explore the impact of different courses of action: rather than waiting hours or days for answers to questions that start with “What happens if we…”, the answers come back in seconds. Review and planning sessions can focus more on what’s next rather than rehashing history.

All well and good, but a business-focused skeptic would also note that while fast processing capability may be important to people in the IT department, it’s not a panacea. It doesn’t necessarily address the other major bottlenecks in business process execution. Many companies that we benchmark face issues that stem from poor process design (including too many manual steps that cannot make use of faster data processing) or an inability to reliably acquire necessary data in a timely fashion (which may itself be the root cause of those manual steps). Both factors have a demonstrable, significant impact on how well companies execute. For example, our recent fast, clean close benchmark research shows that just 30 percent of the companies with the tactical (lowest) level of maturity in the information dimension can close their monthly or quarterly books in five to six business days, compared to 62 percent of the more mature companies. Similarly, 30 percent of companies with only a tactical level of process maturity close their books within five to six business days, while 64 percent of the more mature companies can do this. Moreover, our business analytics benchmark research shows that dealing with data occupies the biggest slice of time for more than two-thirds (69%) of companies.

Poorly designed or executed processes and inadequate information infrastructures are significant barriers to improving business performance. Fast processors and in-memory computing do not address these basic shortcomings.

Oracle’s technology advancements will undoubtedly have a positive impact on business computing users. Balanced against this is the risk that focusing on faster data processing distracts IT departments’ attention from the critical information management issues that plague companies. Addressing those issues can be far more cost-effective, even with Oracle’s OpenWorld announcements that point to substantially lower data center costs. By this I mean that if all the CPU-related time accounts for 20 percent of the total time required to execute a repetitive corporate process, cutting the computing time by a whopping 90 percent will result in a total time savings of just 18 percent. If today it takes a week to get something done, this might cut less than one day from the process. Meanwhile, fixing the real bottlenecks in a business process, such as poor process design or data issues, may enable a company to slice off far more time and save money to boot.
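The arithmetic above is essentially Amdahl’s-law reasoning applied to a business process; a minimal sketch follows, using the article’s illustrative figures (a 20 percent CPU-bound share and a 90 percent speedup) as assumptions rather than measurements.

```python
def total_time_savings(cpu_share: float, cpu_speedup: float) -> float:
    """Fraction of total process time saved when only the CPU-bound
    portion (cpu_share of the total) is sped up by cpu_speedup."""
    return cpu_share * cpu_speedup

# Illustrative numbers from the text: CPU work is 20% of the process,
# and new hardware cuts that CPU time by 90%.
savings = total_time_savings(cpu_share=0.20, cpu_speedup=0.90)
print(f"Total time saved: {savings:.0%}")  # prints "Total time saved: 18%"

# On a one-week (five-business-day) process, that is under one day:
days_saved = 5 * savings
print(f"Days saved: {days_saved:.1f}")  # prints "Days saved: 0.9"
```

The point of the sketch is that a dramatic speedup of a small slice of a process yields only a modest overall gain, which is why the non-computing bottlenecks dominate.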

Shiny new equipment in the data center or the cloud won’t solve the basic, pervasive problems of poor data, poor process design and poor process execution that plague business. Unfortunately, senior executives usually are not aware that better data stewardship would address a broad range of day-to-day business management issues. Dissatisfied with problems that always seem to be the fault of IT, they may be willing to fund only a “keep the lights on” IT budget. CIOs who want to ensure they get budgets to fund cutting-edge information technology must make sure that they pay attention to basics such as information management.


Robert Kugel – SVP Research

Topics: Big Data, Customer Experience, executive, Business Analytics, Data Management, In-Memory Computing, Information Management, Business Performance Management (BPM), Business Process Management, Data, Financial Performance Management (FPM), IT Performance Management (ITPM), FPM

Written by Robert Kugel

Rob heads up the CFO and business research focusing on the intersection of information technology with the finance organization and business. The financial performance management (FPM) research agenda includes the application of IT to financial process optimization and collaborative systems; control systems and analytics; and advanced budgeting and planning. Prior to joining Ventana Research he was an equity research analyst at several firms including First Albany Corporation, Morgan Stanley, and Drexel Burnham, and a consultant with McKinsey and Company. Rob was an Institutional Investor All-American Team member and on the Wall Street Journal All-Star list. Rob has experience in aerospace and defense, banking, manufacturing and retail and consumer services. Rob earned his BA in Economics/Finance at Hampshire College, an MBA in Finance/Accounting at Columbia University, and is a CFA charter holder.