IBM’s Big Data and Analytics Analyst Insights conference started me thinking about the longer-term potential impact of big data and related technologies on business management. I covered some of the near-term uses of big data and analytics in an earlier perspective. There are numerous uses of big data that can provide incremental improvements to existing processes and practices. Some of these will have a significant impact on changing business models, enabling new classes of products and services and improving performance. As well, the technology will have more profound, longer lasting effects. The ability to analyze large quantities of business-related data rapidly has the potential to set in motion fundamental changes in how executives and managers run their business. Properly deployed, it will enable a more forward-looking and agile management style even in very large enterprises. It will allow more flexible forms of business organization. None of these changes will be universal, and the old school will be with us for some time. Technology, however, will give executives and their boards of directors a powerful tool for strategic differentiation to achieve a sustainable competitive advantage.
A skeptic might assert that an era of big data will expand the scope of “paralysis by analysis” simply by adding so much more grist to the mill. In an environment where it takes days or at least hours to get an answer back from some bit of analysis, doing another iteration provides a handy excuse for delay. While managers who hesitate to act could exploit big data to delay decisions, they would be doing so in spite of what’s feasible. In-memory processing of analytics on large data sets can support much faster decision-making – even in large organizations – because business feedback loops will shrink. When answers come in seconds and systems can almost immediately provide a range of potential solutions to a business issue with supporting considerations (as IBM Watson has demonstrated), active managers will be able to make more objective, numbers-based decisions faster; there’s less excuse for delay and less need for gut reactions. Predictive analytics, which are built on large data sets, can spot meaningful divergences from expected results, enabling companies to anticipate changes in markets and their environment. More decisive management styles will evolve because using big data with in-memory analytics better supports a fail-early-and-adapt approach to running a business. While this has always been the case for small entrepreneurs, it’s increasingly possible for large organizations. (Watch for the strategic consultants to emphasize this point.)
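To make the idea of “spotting meaningful divergences from expected results” concrete, here is a minimal illustrative sketch – not any vendor’s implementation – that flags periods where actuals drift from forecast by more than a chosen number of standard deviations of the historical forecast error:

```python
import statistics

def flag_divergences(actuals, forecasts, z_threshold=2.0):
    """Flag periods where actual results diverge meaningfully from forecast.

    A divergence is treated as 'meaningful' when the forecast error exceeds
    z_threshold standard deviations of the errors across all periods.
    """
    errors = [a - f for a, f in zip(actuals, forecasts)]
    mean_err = statistics.mean(errors)
    sd_err = statistics.stdev(errors)
    flagged = []
    for period, err in enumerate(errors):
        if sd_err > 0 and abs(err - mean_err) / sd_err > z_threshold:
            flagged.append(period)
    return flagged
```

Real predictive analytics use far richer models, but even this toy version shows why fast processing matters: the sooner the divergence is flagged, the sooner a company can adapt.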
A second area where change will occur is in the forward-looking aspects of management. Today, budgeting is the main form of collaborative business planning that’s done across the entire enterprise. Budgeting is a financial process where the main purpose is to control spending. Planning, however, is a process of determining the best way to achieve business objectives and, in the process, optimize spending. The two are intertwined, and both are necessary. That said, today most companies spend too much time budgeting and not enough time doing collaborative, integrated business planning. This situation is partly due to habit but also the result of practical limitations in technology and data. As can be seen in the chart, our research shows that having a more integrated planning environment promotes better business coordination. More than one-fourth of companies that have no integration of individual plans experience a lack of coordination compared to just 7 percent of those that integrate the details of individual plans into an enterprise view. The same research reveals that companies with limited ability to calculate the impact of sudden major changes in their business environment have to base their decisions on what to do next on simplistic calculations or just “wing it.” By making it feasible to build scenarios rapidly using integrated business models, big data and in-memory technologies offer the potential to make more informed decisions about how to adapt to changing business conditions. This approach can shift the focus of reviews to a forward-looking, what-to-do-next mindset. That, in turn, can change the emphasis from creating an annual budget to integrating business planning and accelerating reaction cycles.
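The kind of rapid scenario building described above can be sketched in miniature. This illustrative example (the drivers and names are invented for the illustration, not drawn from any planning product) evaluates a simple driver-based model under several what-if overrides of the baseline assumptions:

```python
def run_scenarios(base, scenarios):
    """Evaluate a simple driver-based profit model under what-if scenarios.

    Each scenario is a dict of overrides applied on top of the baseline
    assumptions; the model itself is deliberately trivial.
    """
    results = {}
    for name, overrides in scenarios.items():
        d = {**base, **overrides}          # baseline with scenario overrides
        revenue = d["units"] * d["price"]
        profit = revenue - d["units"] * d["unit_cost"] - d["fixed_cost"]
        results[name] = profit
    return results
```

In-memory analytics matter here because a real enterprise model has thousands of interdependent drivers, and the value of scenario planning depends on getting answers while the discussion is still going on.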
The ability to harness and share large data sets, along with an expanded ability to interoperate remotely, will support a third change to business organizations: increased dis-integration of enterprises as more companies embrace a federated “Hollywood model” of business organization. In the 1950s, television and the court-ordered divestiture of theaters in the United States forced the movie studios to abandon their factory model. They eliminated their contract workforces and a large portion of their employees. Motion picture production evolved into a system that today brings together loose confederations of professionals, skilled tradespeople and highly specialized small or midsize businesses. They operate as independent contractors collaborating on a project-by-project basis. This model is increasingly the norm for startups and small businesses, especially in the technology sector, because it is economical and offers greater flexibility. Even though they are less rigid, federated business relationships tend to persist because quality and personal dynamics are important to making them successful. Big data and analytics can support more flexible organizational structures, even for larger corporations. While this model is unlikely to supplant the standard corporate model, it will define an increasing share of the North American economy (and perhaps others) as it allows companies to grow in a more organic fashion. Many successful small and midsize businesses today could not have existed 20 years ago because their business structure would have been unworkable without today’s information technology. It would have been too difficult for them to assemble their talent (employees or contractors) or implement their business model with the cost and technical limitations of the IT and network technologies available in that era.
So far, much of the focus on the impact of big data and related technologies has been on refining or extending existing use cases or tasks. This is understandable since this is where the immediate impact will fall. Over the coming decade, though, technologies for acquiring, organizing and analyzing ever-larger data sets will alter how executives manage business, continuing a long trend toward a more data-driven management style. Compared to even ten years ago, leaders pay considerably more attention to objective measures and metrics and rely less on gut feelings in making decisions. As companies acquire the ability to harness large quantities of business-related data and analyze it rapidly, good management teams will be able to get better results more consistently. Smaller companies will find they can have more of the capabilities of large organizations while large corporations will be able to become more agile in responding to markets and better coordinated in their responses. Technology can enable even elephants to dance.
Robert Kugel – SVP Research
IBM hosted the Big Data and Analytics Analyst Insights conference in Toronto recently to emphasize the strategic importance of this topic to the company and to highlight recent and forthcoming advancements in its big data and analytics software. Our firm followed the presentations with interest. My colleagues Mark Smith and Tony Cosentino have commented on IBM’s execution of its big data strategy and its approach to analytics. As well, Ventana Research has conducted benchmark research on challenges in big data.
The perennial challenge for the IT industry observer is to be skeptical enough to avoid being taken in by overblown claims (often, the next big thing isn’t) without missing turning points in the evolution of technology. “Big data” has always been with us – it’s the amount that constitutes “big” that has changed. A megabyte was considered big data in the 1970s but is a trivial amount today. There’s no question that the hype around big data is excessive, but beneath the noise is considerable substance. Big data is of particular interest today because the scope and depth of the data that can be analyzed rapidly and turned into useful information are so large as to enable transformations in business management. The effect will be to raise computing – especially business computing – to a higher level.
The IBM event demonstrated that the technology for handling big data analytics to support more effective business computing is falling into place. It’s not all there yet but should improve strongly over the next several years. Yet while technological barriers are falling, there are other issues organizations will need to resolve. For example, the conversation in the conference sessions frequently turned to the people element of successfully deploying big data and analytics. These discussions confirmed an important finding in our big data research, shown in the chart, that more than three-fourths of organizations find staffing and training people for big data roles to be a challenge. It’s useful to remind ourselves that there will be the usual lag between what technology makes possible and the diffusion of new information-driven management techniques.
The conference focused mostly on the technology supporting big data and analytics but included examples of conceivable use cases for that technology. For example, there was a session on better management of customer profitability in the banking industry. Attempts to use software to optimize customer profitability go back to the 1980s, but results have been mixed at best, and a great many users continue to perform analysis in spreadsheets using data stored in business silos. IBM speakers described a generic offering incorporating Cognos TM1 to automate the fusion of multiple bank data sources that are incorporated in a range of profitability-related analytic applications. The aim is to enable more precise decisions on pricing, rates and fees, among other factors. This application enables consistent costing methodologies, including activity-based ones for indirect and shared expenses, to promote a more accurate assessment of the economic profitability of offers to customers. A good deal of the value in this offering is that it puts the necessary data in one place, giving executives and managers a consistent and more complete data set than they typically have. As well, the product’s use of in-memory analytic processing enables much faster calculations. Faster processing of a more complete data set enables more iterative, what-if analyses that can be used to explore the impact of different strategies in setting objectives for a new product or service or examining alternatives to exploit market opportunities or address threats.
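The activity-based treatment of indirect and shared expenses mentioned above can be reduced to a very small sketch. This is an illustration of the general technique, not IBM’s offering: a pool of shared expense is allocated to customers in proportion to an activity driver (here, transaction count, a made-up driver for the example), and profitability is what remains after direct and allocated costs:

```python
def customer_profitability(customers, indirect_pool):
    """Allocate a shared-expense pool to customers by an activity driver
    (transaction count) and compute each customer's economic profit."""
    total_driver = sum(c["transactions"] for c in customers)
    results = {}
    for c in customers:
        allocated = indirect_pool * c["transactions"] / total_driver
        results[c["name"]] = c["revenue"] - c["direct_cost"] - allocated
    return results
```

The point of putting all of this data in one system is that every analysis uses the same driver volumes and the same allocation rules, rather than whatever happened to be in someone’s spreadsheet.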
As the Internet did, big data will change business models and drive the creation of new products and services. The dramatic drop in the cost of instrumenting machines of all types and connecting them to a data network (part of the concept of the Internet of Things) is already changing how companies manage their productive assets, for example, by optimizing maintenance using sensors and telematics to enhance uptime while minimizing repair expense. Decades ago, companies monitored production parameters to ensure quality. More recent technologies can extend the speed and scope of what’s monitored and provide greater visibility into production conditions and trends. Representatives from BMW were on hand at the conference to talk about their experience in improving operations with predictive maintenance. Centrally monitoring the in-service performance of equipment and capital assets is old hat for airlines and jet engine manufacturers. For them, the economic benefits of optimizing maintenance to maximize availability and uptime were significant enough to warrant the large investment they started making decades ago. The same basic techniques can be used for early detection of warranty issues, such as identifying specific vehicles subject to “lemon law” provisions. From IBM’s perspective, these new technologies will enhance the value of Maximo, its asset management software, by extending its functionality to incorporate monitoring and analytics that help users explore options and optimize responses to specific conditions.
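At its simplest, the predictive-maintenance pattern is a condition monitor on sensor data. The following is a bare-bones illustration, assuming a single hypothetical sensor stream (the window size and limit are invented for the example; production systems use statistical and machine-learning models over many signals): flag an asset for inspection when the rolling mean of a reading drifts above a limit.

```python
def maintenance_alert(readings, window=5, limit=0.8):
    """Flag an asset for inspection when the rolling mean of a sensor
    reading (e.g., vibration amplitude) rises above a limit."""
    alerts = []
    for i in range(window, len(readings) + 1):
        avg = sum(readings[i - window:i]) / window
        if avg > limit:
            alerts.append(i - 1)  # index of the last reading in the window
    return alerts
```

The economics described above follow directly: catching the drift at the first flagged reading means scheduling maintenance on the company’s terms rather than after a failure.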
IBM Watson is the company’s poster child for the transformative capabilities of big data and analytics on how organizations operate. The company’s objective is to enable customers to achieve better performance and outcomes by having systems that learn through interactions, providing evidence-based responses to queries. My colleagues Mark Smith and Richard Snow have written about Watson in the contexts of cognitive computing and its application to customer service. And we awarded IBM Watson Engagement Advisor our 2012 Technology Innovation Award. Conference presenters gave an extensive review of progress to date with Watson, featuring innovative ways to use it for diagnosis and treatment in medicine as well as to provide customer support.
Although this was not directly related to big data, IBM also used the conference to announce the availability of Cognos Disclosure Management (CDM) 10.2.1, which will be available both on-premises and in a cloud-based SaaS version. CDM facilitates the creation, editing and publishing of highly structured enterprise documents that combine text and numbers and are created repeatedly and collaboratively, including ones that incorporate eXtensible Business Reporting Language (XBRL) tags. The new version offers improvements in scalability and tagging over the earlier FSR offering. The SaaS version, available on a per-seat subscription basis, will make this technology feasible for midsize companies, enabling them to save the time of highly skilled individuals as well as enhance the accuracy and consistency of, for example, regulatory filings and board books. A SaaS option also will help IBM address the requirements of larger companies that prefer a cloud option.
Most of the use cases presented at the conference were extensions and enhancements of well-worn uses for information technology. However, when it comes to business, the bottom line is what matters, not novelty. Adoption of technology occurs fastest when “new” elements of its use are kept to a minimum. The rapid embrace of the Internet in North America and other developed regions of the world was a function of the substantial investment that had been made over the previous decades in personal computers and local- and wide-area communications networks as well as the training and familiarity of people with these technologies. Big data and the analytics that enable us to apply it have a similar base of preparation. Over the next five years we can expect advances in practical use that benefit businesses and their customers.
Robert Kugel – SVP Research