IBM this week announced its pending acquisition of the Star Analytics product portfolio. Star Analytics is a privately held company that offers products designed to provide easy access to and integration with Oracle Hyperion data sources. While Star Analytics has a good product and solid references, it has lacked the critical mass to support more effective sales and marketing efforts. Star Analytics' strategic value to IBM lies in its ability to unlock data held in Oracle Essbase multidimensional databases, which serve as the repository for applications such as Hyperion Enterprise, Financial Management and Planning. It supports IBM's aim to offer comprehensive business analytics capabilities, which means it must be able to facilitate access to all data sources. Longer term, it enables IBM to compete with Oracle for finance department customers with IBM's own financial performance management applications. Star Analytics gives IBM a means of fostering relationships with existing users of Hyperion applications and a more graceful migration path to IBM's financial, analytics and business intelligence software.
Finance organizations have been heavy users of Hyperion financial applications. While these applications are workhorses for important finance tasks, the information in their data stores has been difficult to access from outside of Hyperion environments. Larger companies often must try to knit together their complex applications and databases with hard-to-maintain custom integrations, or use spreadsheets to combine data from multiple sources. Since companies maintain a great deal of useful information, they wind up spending a considerable amount of time and effort working around accessibility issues. Star Analytics’ software helps companies make data much easier to integrate into financial information processes, enabling consolidated reporting and analytics that can accelerate the time it takes to assess overall financial results.
Larger companies that have significant, longstanding investments in Hyperion software often store a great deal of information in multidimensional databases (“cubes”) that serve as the foundation for their consolidation, reporting and planning software. These databases are extremely useful for analysis and reporting and are a considerable improvement over desktop spreadsheets for collaborative and repetitive tasks. Yet their utility is confined mainly to the needs of the finance department’s specific application because, in practice, it’s very difficult to extract the information contained in these databases into data warehouses accessible to the rest of the organization by using standard business intelligence (BI) tools. This is a serious ongoing problem. Today, to produce the kind of advanced analyses and reporting that generate deeper insight and more useful, effective business models, it is necessary to integrate financial and operational information. The operational data comes not only from a company’s ERP system but also supply chain and logistics systems, customer relationship management and other software that manages processes and tracks business performance. The finance department itself likely has multiple cubes in use, and the process of extracting data from them and synthesizing it into a report or analysis can be time-consuming.
To address these issues, Star Analytics developed the Star Integration Server so that companies that have Hyperion applications (Planning, Financial Management or custom applications built on Essbase) can more easily extract information for wider use (say, in a company-wide data warehouse or financial data mart). All departments within a corporation can then use the data in their applications or analyses. They can create reports (using whatever reporting and analysis tools they currently use) that combine this information with operational data from other sources. Moreover, the software allows companies to bring together data from multiple cubes so that they can analyze and report on all of the information kept in these separate data stores. Finance departments in particular are able to consolidate and manage complex hierarchy structures in their cubes so that they can speed up processes such as business analyses, periodic reporting, forecasting and planning. The Integration Server also enables companies to pull out proprietary data structures, embedded calculations (such as allocations or ratio analyses), business rules (consolidation parameters), security protocols and supporting detail from Hyperion applications. The software can support the data volumes larger companies use, as well as ensure that data security protocols are observed. The Integration Server has native support for exporting to major relational databases such as IBM's DB2, Oracle, Microsoft SQL Server, MySQL and Sybase.
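To illustrate the kind of transformation involved in moving cube data to a relational store, here is a minimal sketch. It is hypothetical and not Star Analytics' actual mechanism: multidimensional cells are addressed by tuples of dimension members, and a relational export flattens each cell into a row that standard BI tools can query.

```python
# Hypothetical sketch: flatten multidimensional "cube" cells, keyed by
# tuples of dimension members, into relational-style rows of the sort
# an Essbase-to-warehouse export would load into a fact table.

def flatten_cube(cube, dimensions):
    """Turn {(member, member, ...): value} cells into flat row dicts."""
    rows = []
    for members, value in cube.items():
        row = dict(zip(dimensions, members))  # one column per dimension
        row["value"] = value                  # the cell's measure
        rows.append(row)
    return rows

# A toy two-dimensional financial cube: Entity x Account.
cube = {
    ("EU", "Revenue"): 1200.0,
    ("EU", "COGS"): 700.0,
    ("US", "Revenue"): 2500.0,
}
rows = flatten_cube(cube, ["entity", "account"])
# Each resulting dict maps directly to a row in a relational fact table,
# e.g. {"entity": "EU", "account": "Revenue", "value": 1200.0}
```

Real exports must also carry hierarchy metadata, security and embedded calculations, which is where a dedicated product earns its keep; the sketch shows only the basic cell-to-row flattening.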
The Star Command Center automates data movements, database consolidations and schedule-related processes, eliminating the need to devote IT resources to repetitive mechanical tasks. It is a far more comprehensive, flexible and easy-to-maintain alternative to the custom coding approach organizations often adopt to lash together their disparate applications and databases. It is designed for point-and-click simplicity so that finance-side power users can operate it, in contrast to other tools aimed at IT and data center professionals. The Command Center even enables integration between on-premises and cloud-based data sources, and can be accessed and operated via a mobile application.
Currently, most companies either extract information from their Oracle/Hyperion systems in a laborious and time-consuming fashion, or have consultants build custom solutions. For companies that use Oracle's Financial Management, Planning, Enterprise or custom applications running on Essbase, the software should make it easier to integrate data from these systems for broader business analytics purposes. Users of Star Analytics' software can save considerable amounts of time in performing analyses and generating reports – in some cases going from days down to hours. This technology will help IBM enable its vision for finance that I outlined last year and help companies using Oracle to find value from existing investments. Since financial information is available quickly, the data can be used to run more rapid planning and analysis cycles or to have management or financial reports available sooner. Moreover, unlocking financial and other information enables corporations to do more integrated business analysis, planning and reporting. Information that was once inaccessible can be readily available when it's needed. Knowing more enables companies to do more.
Robert Kugel – SVP Research
My colleague Mark Smith and I recently chatted with executives of Tidemark, a company in the early stages of providing business analytics for decision-makers. It has a roster of experienced executive talent and solid financial backing. There's a strategic link with Workday that reflects a common background at the operational and investor levels. As it gets rolling, Tidemark is targeting large and very large companies as customers for its cloud-based system for analyzing data. It can automate alerts and enhance operating visibility, collaboratively assess the potential impacts of decisions and support the process of implementing those decisions.
Tidemark’s product fits into the performance management/decision support category but has several points of differentiation. One is that it enables larger enterprises to work interactively with a broad set of operating and financial metrics rather than working with multiple legacy business intelligence and reporting systems. It can integrate enterprise data from ERP, CRM, supply chain, logistics, maintenance, real estate and other systems into a single cloud-based analytic and data store in a way that ensures there is master data control from source systems to the new warehouse, and that uses common semantic-layer definitions consistently across all processes and systems.
The business purpose of marshalling all the data in one place is to take advantage of recent advances in managing large-scale data using new database techniques that we have assessed, which can lower the cost of doing so. Combined with in-memory data processing, this makes it feasible for large organizations to work with operational and financial information much more interactively. Because this can reduce the time required to assemble information (in reports and dashboards), organizations gain time to focus on making the right decision and implementing it. Rather than struggling to lash together sets of fragmented and siloed data every time a new set of analytics is needed, the right data in the right format and context is available at any level for various purposes, including integrated business planning, predictive analytics, and analysis of trends, cost-to-serve metrics, customer profitability and product profitability. The same data set can be used for dashboards, performance scorecards and operational risk measurement and management. Executives, managers and analysts can go beyond drilling down into the details of what just happened. For example, they can do real-time collaborative contingency planning to explore in detail the impacts of several courses of action during a one-hour meeting.
Tidemark’s second point of differentiation is to make the user experience as close to that of a consumer application (and as far away from traditional business software) as possible. Design has become an increasingly important element in industrial products once sold mainly on functional considerations instead of usability. I expect the same is finally coming true in software as past limitations to design melt away with advances in the underlying technology. I also put tablet and mobility support in this category, because these have become table stakes for enterprise applications. An individual vendor’s advantage in mobility may be built initially on technical capabilities, but competitors can quickly replicate it. User interface design, however, can be a source of lasting advantage.
Tidemark has elected to use the HTML5 standard so it can support as many Web browsers, smartphones and tablets as possible. In this respect, I think executives decided that the underlying technology is a commodity and therefore it's best to embrace a broad standard. This may prove to be a smart bet, or it could be a sticking point since the software will not use the native capabilities of mobile platforms. Notably, it will not be part of the Apple ecosystem, which I think may hurt Tidemark today but may be of no consequence in the future.
Another new, consumer-like feature is the visual metaphor of using sentences to describe what you are doing and seeing with business analytics, which adds to usability for business people. In this form, analysis, planning and review activities fit the rhythms of how people naturally do and think about business. This approach makes it simple to understand what a business user is asking from the system and what it has presented.
A third point of differentiation from standard performance management offerings is having advanced analytics integrated into the application. Because comprehensive real-time and near-real-time finance and operations data are available for in-memory processing, it's feasible to employ predictive analytics to improve forecast accuracy and to provide a comprehensive range of alerts when results diverge materially from what's expected. Such a structure facilitates creating driver-based models for planning and budgeting. Moreover, Tidemark includes risks in its scorecard presentation. All business decisions involve risk, yet risks are rarely part of a balanced scorecard. Scorecards are balanced precisely because they cover trade-offs (for example, in a call center, balancing time-on-call with first-call resolution). Although some risks are implicit in every item on a scorecard (that is, not achieving a key objective), others are indirect or less than obvious, such as deferring scheduled maintenance or operating above capacity limits to accommodate unexpected orders. That noted, it's impossible to say, based on this introduction, just how broad and deep Tidemark's capabilities (especially industry-specific metrics, advanced analytics and risk measures) will be in the initial release. Operational risk analytics are likely to be elementary because, outside of financial services, that is the (sad) state of the art at this moment.
It’s still early days for Tidemark, so file the comments above under “excellent if true.” There are a number of big potential snags that the company and its users may confront when they put their system in place.
First on the list are the performance and scalability proof points that must be demonstrated in full-scale deployments. I'm not sure this will be a big issue since the basic technology foundations (cloud data and applications, in-memory processing and core analytics) that Tidemark rests on are solid. More likely, the challenge to users and Tidemark will be the snags that are inevitable when dealing with large organizations' IT infrastructure (nonstandard legacy systems, for instance), as well as ongoing maintenance issues that might cause performance and other system metrics to fall short of what customers are expecting. Or performance may be adequate for most complex analytical tasks but not for, say, organizations that do complex sales and operations planning involving large numbers of rapidly changing stock-keeping units (SKUs). At this point, I doubt any of these issues will prove to be overall showstoppers, but they may limit the software's appeal.
A more fundamental issue that I see is whether and to what degree companies will change how they operate to take advantage of the technology available. Just because something is technically feasible doesn't mean that companies will change old ways of doing things. Our research shows few companies (just 6%) use driver-based plans, which, from a practical standpoint, are the only way to do agile planning. Outside of financial services, few companies understand how to measure and manage operational risks beyond the major ones that are already incorporated in business reviews (such as failure to meet sales and profit targets). Attenuating the impact of unfavorable events collaboratively with established risk mitigation scripts (something that is feasible in Tidemark) is not well practiced outside of the military or the parts of companies facing heavy safety and health liabilities.
The next fundamental issue is data governance. As I look at it, the initial system setup can, with varying degrees of difficulty, overcome the accumulated impact of sloppy data management practices. Thereafter, there can be considerable cost involved in maintaining the integrity of the comprehensive data unless the company is rigorous about data governance.
The last fundamental issue for some companies may be the total cost of ownership. Tidemark's initial pricing projections are competitive with on-premises systems. Companies with sprawling, disjointed enterprise system infrastructures and poor data stewardship probably will have to spend quite a bit more, both up front and for ongoing maintenance, for a system that includes data from a broad set of enterprise systems. Since few companies have mature data governance practices, dealing with data issues may prove to be a big cost factor, both in the time required and in reaching agreement on definitions and representation.
On balance, my initial impression of Tidemark is positive. There's a great deal of promise and more than a few things that its established competition might want to consider copying. Stay tuned.
Robert Kugel CFA – SVP of Research