The developed world has an embarrassment of riches when it comes to information technology. Individuals walk around with far more computing power and data storage in their pockets than was required to send men to the moon. People routinely hold on their laps what would have been considered a supercomputer a generation ago. There is a wealth of information available on the Web. And the costs of these information assets are a tiny fraction of what they were decades ago. Consumer products have been at the forefront in utilizing information technology capabilities. The list of innovations is staggering. The “smart” phone is positively brilliant. Games are now a far bigger business than motion pictures.

Yet few business users are tapping the full potential of today's systems. Most organizations have been slow to integrate IT innovation into their core processes. Companies have made considerable investments in information technology, but their business methods have been slow to adapt to the resources available. For instance, software for incentive compensation and for planning and budgeting has made it possible to improve these processes, but most companies manage compensation, budget and plan in much the same way they did decades ago. To be sure, it's much easier for individuals to adopt new tools for themselves than it is to align groups and executives in corporations to change proven approaches – even mediocre ones. But it's also the case that business software must make it easier for individuals to realize more of the potential of information technology. And this part of the evolution of business software is only beginning. This is the context in which I took note of two emerging capabilities of IBM's business software. One is its Concert user experience software and the other its emerging application (not yet officially named) designed to make advanced analytics more consumable. These are two important capabilities IBM highlighted at its recent Insight user group meeting and Big Data and Analytics analyst summit.

Integrating processes and data across a business has long been a challenge for IT departments. Some decades ago the issue of "islands of automation" emerged as companies implemented stand-alone business applications one by one, each to perform a single function, and then realized it would be handy if these could share data and manage processes from start to finish. Initial progress toward this goal came in the form of applications such as ERP that offer integrated functionality and enterprise data stores, although these often were difficult to implement and complex to maintain. Lately, software vendors have been refocusing to provide users with ways of facilitating end-to-end process management and making data more accessible.

IBM Concert is such an attempt. Announced in November 2013, it is a user interface IBM designed to be the central touch point across multiple applications and data stores. (It’s possible to link Concert to other vendors’ software, but it’s unlikely that a company would buy it on its own to link other applications.) It’s meant to replace menu-driven interactions between the user and the system with “I want to do this process” and a “day in the life” approach to organizing how individuals access applications and data. IBM Concert is an example of how we are only now beginning to achieve the longstanding objective of having IT systems conform to the user’s needs rather than the opposite. On a single screen Concert organizes personal task lists and presents metrics, conditions and dashboard elements configured to an individual’s preferences so the user can easily monitor conditions and enable more management by exception. Users can organize the data they need to support a given process right in front of them rather than having to go to some other application to fetch that data.
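To make the idea concrete, here is a minimal sketch in Python of what management by exception amounts to: the user configures acceptable bands for the metrics he or she cares about, and only breaches are surfaced. The metric names and thresholds are hypothetical illustrations of the pattern, not anything drawn from Concert itself.

```python
# A minimal sketch of management by exception: the user configures acceptable
# bands for the metrics he or she cares about, and only breaches are surfaced.
# Metric names and thresholds below are hypothetical, not drawn from Concert.

from dataclasses import dataclass

@dataclass
class MetricRule:
    name: str      # the metric being watched
    lower: float   # alert if the value falls below this bound
    upper: float   # alert if the value rises above this bound

def exceptions(rules: list[MetricRule], observed: dict[str, float]) -> list[str]:
    """Return alerts only for metrics outside their configured bands."""
    alerts = []
    for rule in rules:
        value = observed.get(rule.name)
        if value is not None and not (rule.lower <= value <= rule.upper):
            alerts.append(f"{rule.name}: {value} outside [{rule.lower}, {rule.upper}]")
    return alerts

rules = [MetricRule("days_sales_outstanding", 0, 45),
         MetricRule("budget_variance_pct", -5, 5)]
observed = {"days_sales_outstanding": 52, "budget_variance_pct": 1.2}
print(exceptions(rules, observed))  # only the DSO breach surfaces
```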

IBM Concert also has a social component that provides the ability for users to collaborate in context. Social applications generally have improved organizations’ connectedness. They offer greater immediacy than “copy all” email, greater inclusiveness than chat software and better communication in a mobile and geographically dispersed workforce. Initially, social applications took a broadcast approach similar to an unfiltered Twitter feed but as I pointed out at the time, that wasn’t a useful approach. As anyone who has used Twitter during an event can attest, the volume of messages quickly exceeds one’s ability to pick out the important ones. Moreover, not everyone wants to share information broadly, especially, for example, finance departments. Concert by contrast “understands” the area in which the individual is working and connects him or her to the conversations of others who are part of the group that needs to collaborate on that specific task. Users also apply hashtags to add a specific context to the message.
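The filtering idea behind collaboration in context is straightforward to sketch. In the Python fragment below, with a made-up Message structure and contexts of my own invention, a feed is narrowed to the user's current working context and, optionally, a hashtag – in contrast to an unfiltered broadcast stream.

```python
# A sketch of collaboration in context: instead of an unfiltered broadcast
# feed, only messages belonging to the user's current task context (and,
# optionally, a hashtag) are shown. The Message structure is my own invention.

from dataclasses import dataclass, field

@dataclass
class Message:
    author: str
    text: str
    context: str                       # the process or task the message belongs to
    hashtags: set = field(default_factory=set)

def relevant(feed: list[Message], context: str, tag: str = "") -> list[Message]:
    """Keep messages from the current working context, narrowed by hashtag if given."""
    return [m for m in feed
            if m.context == context and (not tag or tag in m.hashtags)]

feed = [Message("ana", "Q3 accruals posted", "close-process", {"accruals"}),
        Message("raj", "Lunch?", "watercooler"),
        Message("li", "Revised FX rates loaded", "close-process", {"fx"})]
print([m.text for m in relevant(feed, "close-process", tag="fx")])
# ['Revised FX rates loaded']
```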

I think that Concert has the potential to become the nexus of business people’s computing environments and a sidekick that helps them stay organized and informed, get alerts, collaborate, find answers and explore their workday world.

Both at Vision and again at the Big Data and Analytics analyst summit, IBM previewed Project Catalyst Insight, a not-yet-officially-named application that is a significant advance on its SPSS Analytic Catalyst software. Making big data and analytics more useful and consumable by the white-collar workforce (and even some of the blue collars) would provide a major boost to organizational performance. By itself, a mass of data is not especially useful, and there are significant challenges to teasing out insights from large data sets, especially when that requires sophisticated analytical techniques. Today, the insights locked in large volumes of data are accessible mainly to people with Ph.D.s in statistics or similar advanced training. The main objective of this project is to package advanced analytics routines for use, after limited training, by ordinary business analysts working in any department in any industry.

Big data has always been with us; it is just a question of how big "big" is. Today the term refers to data sets so large and complex that organizations have difficulty processing them using standard database management systems and applications. Technology for handling big data has crossed a threshold, becoming more capable and cost-effective. Companies can now tap into much larger amounts of structured and unstructured data. Big data has potential – and potential pitfalls – for improving a company's performance, as I have noted. Big data is of little use unless organizations have the ability to use analytics to achieve insights not available through more conventional techniques. The ability to sift through large quantities of business-related data rapidly could set in motion fundamental changes in how executives and managers run their business. Properly deployed, big data can support a more forward-looking and agile management style even in very large enterprises. It will allow more flexible forms of business organization. It can give finance organizations greater scope to play a more strategic role in corporate management by changing the focus of business reviews from backward-looking assessments of what just happened to an emphasis on what to do next.

The challenge for many companies is that big data and advanced analytics are not readily consumable. Our research on finance analytics finds that fewer than one-third (29%) of companies use big data to support their finance analytics, even though this technology can handle the flood of data into today's businesses and can help produce more useful analytics and advanced techniques. Although analytics is essential to finance departments, their focus remains on the basics. Fewer than half (44% each) use the proven newer techniques of predictive analytics and leading indicators. Nearly three out of four (73%) do not assess relevant economic or market data and trends, and fewer than half assess customer and product profitability; any of these could make analyses more relevant to the overall success of the company. The ability of finance organizations to master analytical techniques – especially advanced ones – ought to be a priority for senior executives because our research shows a correlation between competence in utilizing big data and analytics and the ability to adapt quickly to changing business and economic conditions.

IBM SPSS Analytic Catalyst is designed to make it easier for business users who are not trained statisticians to create predictive analytical models just by answering a few preliminary questions about what they want to accomplish using the data. The new incarnation, Project Catalyst Insight, aims to simplify the process even further to bring it within the reach of a wider set of business users. It does so by packaging a range of standard routines to be applied to data sets and providing even more guidance to analysts – and even some business managers – who know what they want to know but have a limited grasp of the analytical techniques necessary to find meaning in a mass of data. If IBM can create an application that enables more business users to utilize predictive analytics and other advanced analytical techniques, it would represent a big step forward in making big data a useful tool for many more functional areas than it is today.
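To give a feel for the general approach of packaged predictive routines, here is a rough sketch using Python and scikit-learn, assuming a simple classification task: the analyst names the data and the outcome to predict, and a helper tries a few standard models and picks the best. The function name, the candidate models and the usage example are my own illustration and say nothing about how IBM's application actually works.

```python
# A rough sketch of the general approach: packaged routines that let an analyst
# name the data and the outcome to predict, then pick the best-performing model
# automatically. Uses scikit-learn for illustration; this is not IBM's API.

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def build_model(df: pd.DataFrame, target: str):
    """Try a few standard classifiers and return the best by cross-validation."""
    X = pd.get_dummies(df.drop(columns=[target]))  # encode categoricals automatically
    y = df[target]
    candidates = {
        "logistic regression": LogisticRegression(max_iter=1000),
        "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    }
    scores = {name: cross_val_score(model, X, y, cv=5).mean()
              for name, model in candidates.items()}
    best = max(scores, key=scores.get)
    print(f"Best routine: {best} (mean accuracy {scores[best]:.2f})")
    return candidates[best].fit(X, y)

# Usage (with a hypothetical customer table): model = build_model(customers, "churned")
```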

Both IBM Concert and the new business analytics tool provisionally called Project Catalyst Insight reflect IBM's strategy of achieving product differentiation in a rapidly evolving software market. The first decades of packaged business applications were characterized by a race to create new categories and load them with distinguishing features and functions. In the next decade competitive advantage will fall to software vendors that – in addition to features and functions – can provide business people with a user experience that is easily molded to how they naturally work. IBM Concert is a useful first step that is likely to be further refined. The new analytical environment derived from IBM SPSS Modeler and SPSS Analytic Catalyst looks and sounds like a good idea, and it will be interesting to see how it develops when it is generally available.

Regards,

Robert Kugel – SVP Research

Our research consistently finds that data issues are a root cause of many problems encountered by modern corporations. One of the main causes of bad data is a lack of data stewardship – too often, nobody is responsible for taking care of data. Fixing inaccurate data is tedious, and creating IT environments that build quality into data is far from glamorous, so these sorts of projects are rarely demanded and funded. The magnitude of the problem grows with the company: Big companies have more data and bigger issues with it than midsize ones. But companies of all sizes ignore this at their peril: Data quality, which includes accuracy, timeliness, relevance and consistency, has a profound impact on the quality of work done, especially in analytics, where the value of even a brilliantly conceived model is degraded when the data that drives it is inaccurate, inconsistent or not timely. That's a key finding of our finance analytics benchmark research.

A main requirement for the data used in analytics is that it be accurate because accuracy affects how well finance analytic processes work. One piece of seemingly good news from the research is that a majority of companies have accurate data with which to work in their finance analytics processes. However, only 11 percent said theirs is very accurate, and there's a big difference between accurate enough and very accurate. The degree of accuracy is important because it correlates with, among other things, the quality of finance analytics processes and the agility with which organizations can respond to and plan for change.

Although almost all (92%) of the companies that have very accurate data also have a process that works well or very well, that assessment drops to 43 percent among companies that said their data is just accurate. Even in small doses, bad data has an outsized impact on finance analytic processes. Inaccurate, inconsistent or incomparable data can seriously gum up the works as analysts search for the source of the issue and then try to resolve it. As issues grow, dissatisfaction with the process increases. Just 22 percent of those with somewhat accurate data and none of the companies with data that is not accurate said their company has a process that works well or very well.

To be truly useful for business, analytics provided to executives, managers and other decision-makers must be fresh. The faster a company can deliver assessments of and insights into what just happened, the sooner the company can respond to those changes. Almost all (85%) companies with very accurate data said they are able to respond immediately or soon enough to changes in business or market conditions, but only 35 percent of those with accurate data and just 24 percent of those with somewhat accurate data are able to do so.

Moreover, having timely data enables companies to react in a coordinated fashion as well as quickly. Companies that are able to operate in a coordinated fashion are usually more successful in business than those that are only somewhat coordinated, in the same way that a juggler who is only somewhat coordinated drops a lot of balls. Almost all (86%) companies whose data is all up-to-date said they are able to react to change in a coordinated or very well coordinated fashion, compared to just 38 percent of those whose data is mostly up-to-date and 19 percent of those with a significant percentage of stale data. Three-fourths (77%) of companies that have very accurate data are able to respond to changes in a coordinated or very well coordinated fashion, but just one-third (35%) of those with accurate data and 14 percent of those with somewhat accurate data are able to accomplish this.

Speed is essential in delivering metrics and performance indicators if they are to be useful for strategic decision-making, competitive positioning and assessing performance. Companies that can respond sooner to opportunities and threats are more able to adjust to changing business conditions. The research finds that fewer than half (43%) of companies are able to deliver important metrics and performance indicators within a week of a period’s end – that is, soon enough to respond to an emerging opportunity or threat.

One way to speed up the delivery of analytics is to have analysts focus their time on the analytics. But the research shows that not many do: A majority of analysts spend the biggest chunk of their time dealing with data-related issues rather than on the analysis itself. Two-thirds (68%) of participants reported that they spend the most time dealing with the data used in their analytics – waiting for it, reviewing it for quality and consistency or preparing it for analysis. Only one-fourth (28%) said their efforts focus most on analysis and trying to determine root causes, which are the main reasons for doing the analysis in the first place. In other words, in a majority of companies, analysts don't spend enough time doing what they are valued and paid for.
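One practical response is to automate the routine preparation steps so they are not redone by hand every period. Here is a minimal sketch in Python using pandas, assuming a simple tabular extract with hypothetical column names and cleaning rules of my own invention.

```python
# A minimal sketch of automating routine data preparation so the same cleaning
# steps run every period instead of being redone by hand. Column names and
# rules are illustrative assumptions, not taken from the research.

import pandas as pd

def prepare(raw: pd.DataFrame) -> pd.DataFrame:
    """Standardize columns, drop duplicates and flag unusable records."""
    df = raw.copy()
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.drop_duplicates()
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")    # bad values become NaN
    df["period"] = pd.to_datetime(df["period"], errors="coerce")
    return df.dropna(subset=["amount", "period"])                  # exclude failed records

# Usage: clean = prepare(pd.read_csv("gl_extract.csv"))
```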

The results also show that there are negative knock-on effects of spending time on data-related tasks rather than on analysis. More than half (56%) of the companies that spend the biggest part of their time working on analytics can deliver metrics and indicators within a business week, compared to just one-third (36%) of those that spend the biggest part of the time grappling with data issues. Having high-quality, timely and accessible data therefore is essential to reaping the benefits of finance analytics.

Data issues diminish productivity in every part of a business as people struggle to correct errors or find workarounds. Issues with data are a man-made phenomenon, yet companies seem to treat bad data as a force of nature like a tornado or an earthquake that's beyond their control to fix. Our benchmark research on information management suggests that inertia in tackling data issues is more organizational than technical. Companies simply do not devote sufficient resources (staff and budget) to address this ongoing issue. One reason may be that the people who must confront data issues in their day-to-day work fail to understand the connection between those issues and getting the results from analytics that they should.

Excellent data quality is the result of building quality controls into data management processes. Our research finds a strong correlation between the degree of data quality efforts in finance analytics and the quality of the finance department’s analytic processes and output, and ultimately its timeliness and its value to the company. Corporations generally – and finance organizations in particular – must pay closer attention to the reliability of the data they use in their analytics. The investment in having better data will pay off in better analytics.
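What building quality controls into a data management process can look like is easy to sketch. In the Python fragment below, records are checked for completeness, consistency and timeliness before they are allowed to feed any analysis; the field names and rules are illustrative assumptions, not prescriptions from the research.

```python
# A sketch of building quality controls into a data management process:
# records are checked for completeness, consistency and timeliness before
# they feed any analysis. Field names and rules are illustrative assumptions.

from datetime import date, timedelta

MAX_AGE = timedelta(days=7)  # anything older than a week is treated as stale

def validate(record: dict, today: date) -> list[str]:
    """Return quality problems; an empty list means the record passes."""
    problems = []
    if record.get("amount") is None:
        problems.append("missing amount")                     # completeness
    if record.get("debits") != record.get("credits"):
        problems.append("debits and credits do not balance")  # consistency
    if today - record["as_of"] > MAX_AGE:
        problems.append("stale as-of date")                   # timeliness
    return problems

rec = {"amount": 1250.0, "debits": 1250.0, "credits": 1250.0,
       "as_of": date(2014, 11, 1)}
print(validate(rec, today=date(2014, 11, 5)))  # [] means the record passes
```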

Regards,

Robert Kugel – SVP Research
