
Actuate this week announced BIRT Analytics, putting itself firmly into supporting a range of business analytics needs, from data discovery and visualization to data mining and predictive capabilities, and opening new avenues of growth. Actuate has long been a staple of large business intelligence deployments; in fact the company says that ActuateOne delivers more insights to more people than all other BI applications combined. This is likely true, given that Actuate is embedded in major consumer applications across industries worldwide. This announcement builds on the company's advancements in big data, which I assessed last year, and can help it further expand its technology value to business and IT.

Tools such as BIRT Analytics can change the organizational culture around data and analytics. They put the power of data discovery and data visualization into the hands of tool-savvy managers as well as business analysts. While Actuate has offered highly functional and interactive dashboards in the past, BIRT Analytics takes the usability dimension to a different level. Usability is of the highest importance to 63 percent of organizations evaluating business intelligence software, according to our next-generation business intelligence benchmark research, and it is an area where BIRT Analytics and other tools in its class really show their value. The technology allows not just visual data exploration, but also new sources of data to be connected and analyzed without a predefined schema. This fits well with the current world of distributed computing, where everything can no longer be neatly modeled in one place. The software can gather data from different sources, including big data sources, flat files and traditional relational databases, and mash these up through visually appealing toolsets, allowing end-user analysts to bypass IT and avoid much of the data preparation that has been a hallmark of business intelligence in the past. In fact, our recent business technology innovation benchmark research shows that only a little more than half of companies are satisfied with their analytic processes, and 44 percent of organizations indicate that the most time-consuming part of the analytics process is data-related tasks, an area Actuate addresses with its ability to handle data efficiently.

Some of the advantages of the BIRT Analytics product are its fast in-memory engine, its ability to handle large amounts of data, and its advanced analytic capabilities. The company's web site says it offers the fastest data loading tool in the industry with the FastDB main-memory database system and the ability to explore 6 billion records in less than a second. These are impressive numbers, especially as we look at big data analytics, which often runs against terabytes of data. The usability of this tool's analytics features is particularly impressive. For instance, set analysis, clustering and predictive capabilities are all part of the software, allowing analysts who aren't necessarily data scientists to conduct advanced data analysis. These capabilities give tools like BIRT Analytics an advantage in the market, since they offer simple, end-user-driven ways to produce market segmentation and forecasting reports. These advancements help Actuate deliver new benefits with BIRT Analytics; according to our benchmark research on predictive analytics, 68 percent of organizations see predictive analytics as a source of competitive advantage.

Actuate already ranked as a Hot vendor in the 2012 Ventana Research Business Intelligence Value Index, thanks to its enterprise-level reliability and the validation of its deployments, and this release should further strengthen its ratings. In the short term, BIRT Analytics will certainly boost Actuate's market momentum, allow it to compete in areas where it would not have been seen before and expand its value to existing customers.

Regards,

Tony Cosentino

VP and Research Director

SAP just released solid preliminary quarterly and annual revenue numbers, which in many ways can be attributed to a strong strategic vision around the HANA in-memory platform and strong execution throughout the organization. Akin to flying an airplane while simultaneously fixing it, SAP’s bold move to HANA may at some point see the company continuing to fly when other companies are forced to ground parts of their fleets.

Entering 2012, the HANA strategy was still nascent, and SAP provided incentives to both customers and the channel to bring them along. At that point SAP also promoted John Schweitzer to senior vice president and general manager for business analytics. Schweitzer, an industry veteran who spent his career with Hyperion and Oracle before moving to SAP in 2010, gave a compelling analytics talk at SAPPHIRE NOW in early 2012 and was at the helm for the significant product launches during the course of the year.

One of these products is SAP Visual Intelligence, which was released in the spring of 2012. The product takes Business Objects Business Explorer and moves it to the desktop so that analysts can work without involving IT. Because it runs on HANA, it allows real-time visual exploration of data akin to what we have been seeing from companies such as Tableau, which recently passed the $100 million mark in revenue. As with many of SAP's new offerings, running on top of HANA allows visual exploration of very large data sets, data sets that may quickly overwhelm certain "in-memory" competitor systems. To build buzz and help develop "Visi," as Visual Intelligence is also called, the company ran a Data Geek Challenge in which it encouraged users to develop big data visualizations on top of HANA.

Tools such as SAP Visual Intelligence begin to address the analytics usability challenge that I recently wrote about in The Brave New World of Business Intelligence, which focused on our recent research into next-generation business intelligence. One key takeaway from that research is that usability expectations for business intelligence are being set at the consumer level, but our information environments and processes in the enterprise are too fragmented to achieve the same level of usability, resulting in relatively high dissatisfaction with next-generation business tools. Despite that dissatisfaction, the high switching costs for enterprise BI mean that customers are essentially captive, and this gives incumbents time to adapt their approaches. There is no doubt that SAP looks to capitalize on this opportunity with agile development approaches and frequent iterations of the Visual Intelligence software.

Another key release in 2012 was around predictive analytics, which became generally available late in the year after SAP TechEd in Madrid. With this release, SAP moves away from its partnership with IBM SPSS. The move makes sense, given that sticking with SPSS would have resulted in a critical dependency for SAP. According to our benchmark research on predictive analytics, 68 percent of organizations see predictive analytics as a source of competitive advantage. Once again, HANA may prove to be a strong differentiator for SAP, in that the company will be able to leverage the in-memory system to visualize data and run predictive analytics on very large data sets and across different workloads. Furthermore, the SAP Predictive Analytics offering inherits data acquisition and data manipulation functionality from SAP Visual Intelligence. However, it cannot be installed on the same machine as Visi, according to a blog on the SAP Community Network.

SAP will need to work hard to build out predictive analytics capabilities to compete with the likes of SAS, IBM SPSS and other providers that have years of custom-developed vertical and line-of-business solution sets. Beyond these customized analytical assets, IBM and SAS will likely promote the idea of commodity models as such non-optimized modeling approaches become more important. Commodity models are a breed of "good enough" models that allow for prediction and data reduction that is a step-function improvement over a purely random or uninformed decision. Where deep analytical skill sets are not available, sophisticated software can run through data and match it to the appropriate model.

In 2012, SAP also continued to develop targeted analytical solutions, which bundle SAP BI with the Sybase IQ server, a columnar database. Column-oriented databases such as Sybase IQ, Vertica, ParAccel and Infobright have gained momentum over the last few years by providing a platform that organizes data in a way that allows for much easier analytical access and data compression. Instead of writing to disk in a row-oriented fashion, columnar approaches write to disk in a column-oriented fashion, allowing for faster analytical cycles and reduced time-to-value.

The competitive advantage of columnar databases, however, may be mitigated as in-memory approaches gain favor in the marketplace. For this reason, continued development on the Sybase IQ platform may be an intermediate step until the HANA analytics portfolio is built out, after which HANA will likely cannibalize parts of SAP's own stack. SAP's dual approach to big data analytics, with both in-memory and columnar technology, provides a good illustration of the classic Innovator's Dilemma facing many technology suppliers today and of how SAP is dealing with it. It should be noted, however, that SAP is also working on an integrated portfolio approach, and that Sybase IQ may actually be a better fit as data sets move to petabyte scale.

Another aspect of that same Innovator's Dilemma is a fragmented choice environment as new technologies develop. Our research shows that the market is undecided on how it will roll out critical next-generation business intelligence capabilities such as collaboration. Just under two-fifths of our next-generation business intelligence study participants (38%) prefer business intelligence applications as the primary access method for collaborative BI, but 36 percent prefer access through office productivity tools, and 34 percent prefer access through business applications themselves. (Not surprisingly, IT leans more toward providing tools within the existing landscape, while business users are more likely to want this capability within the context of the application.) This fragmented choice scenario carries over to analytics as well, where spreadsheets are still the dominant analytical tool in most organizations. Here at Ventana Research, we are fielding more inquiries about application-embedded analytics and how these will play out in the organizational landscape. I anticipate this debate will continue through 2013, with different parts of the market providing solid arguments for each of the three camps. Since HANA uniquely provides both transactional processing and analytic processing in one engine, it will be interesting to look more closely at the HANA and Business Objects roadmap in 2013 to see how SAP positions itself with respect to this debate. Furthermore, as I discuss in my 2013 Research Agenda blog post, disseminating insights within the organization is a big part of moving from insights to action, and business intelligence is still the primary vehicle for moving insight into the organization. For this reason, the natural path for many organizations may indeed be through their business intelligence systems.

SAP, clearly differentiating its strategic position, looks to continue to innovate its entire portfolio, including both applications and analytics, on the HANA database. In the most recent quarter, SAP took a big step forward in this regard by porting its entire business suite to run on HANA, as my colleague Robert Kugel discussed in a recent blog post.

While there are still some critical battles to be played out, one thing remains clear: SAP is one of the dominant players in business intelligence today. Our Business Intelligence Value Index assessed SAP as a top-ranked Hot vendor. SAP aims to stay that way by continuing to innovate around HANA and giving its customers and prospects a seamless transition to next-generation analytics and technologies.

Regards,

Tony Cosentino
VP and Research Director

Did you catch all the big data analogies people used in 2012? There were many, like the refinement of oil analogy, or the spinning straw into gold analogy, and less useful but more entertaining ones, like big data is like a box of chocolates, or big data is like The Matrix (because “there’s no way Keanu Reeves learns Kung Fu in five seconds without using big data”).  I tend to like the water analogy, which I’ll use here to have a little fun and to briefly describe how I see the business analytics market in 2013.

2013 is about standing lakes of information that will turn into numerous tributaries. These various tributaries of profile, behavioral and attitudinal data will flow into the digital river of institutional knowledge. Analytics, built out by people, process, information and technology, will be the only banks high enough to control this vast river and funnel it through the duct of organizational culture and into an ocean of market advantage.

With this river of information as a backdrop, I’m excited to introduce the Ventana Research Business Analytics Research Agenda for 2013, focused on three themes:

Answering the W’s (the what, the so what, the now what and the then what)

The first and perhaps most important theme of the 2013 research agenda builds on answering the W's – the what, the so what, the now what and the then what – which was also the topic of one of my most widely read blog posts last year. In that piece I suggested that the shift in discussion from the three V's to the four W's corresponds to a shift from a technology-oriented discussion to a business-oriented one. Volume, variety and velocity are well-known parameters of big data that help frame the technology discussion, but when we look at analytics and how it can drive success for an organization, we need to move to the so what, now what and then what of analytical insights, organizational decision-making and closed-loop processes. Our big data research found significant gaps between the analytics capabilities organizations need and what is available to them today, which is fueling a new generation of technology for consuming big data.

Outcome-driven approaches are a good way of framing issues, given that business analytics, and in particular big data analytics, are such broad topics, yet the use cases are so specific. Our big data analytics benchmark research for 2013, which we will start shortly, will therefore look at specific benefits and supported business cases across industries and lines of business in order to assess best practices for big data analytics. The research will investigate the opportunities and barriers that exist today and explore what needs to happen for us to move from an early adopter market to an early majority market. The benchmark research will feed weighting algorithms into our Big Data Analytics Value Index, which will look at the analytics vendors that are tackling the formidable challenges of providing software to analyze large and multistructured data sets.

Disseminating insights within the organization is a big part of moving from insights to action, and business intelligence is still a primary vehicle for driving insight into the organization. While there is a lot to be said about mobile BI, collaborative BI, visual discovery and predictive analytics, core business intelligence systems remain at the heart of many organizations. Therefore we will continue our in-depth coverage of core business intelligence systems with our Business Intelligence Value Index, in the context of our next-generation business analytics benchmark research that will start in 2013.

Embracing next-generation technology for business analytics

We're beginning to see businesses embracing next-generation technology for business analytics. Collaborative business intelligence is a critical part of this conversation, both in terms of getting insights and in terms of making decisions. Last year's next-generation business intelligence benchmark research showed us that the market is still undecided on how next-generation BI will be rolled out, with business intelligence applications only slightly preferred over office productivity tools and business applications themselves. In addition to collaboration, we will focus on mobile and location trends in our next-generation business analytics benchmark research and in our new location analytics benchmark research, which has already found that business analysts have specific needs in this area. We see mobile business intelligence as a particularly hot area in 2013, and we are therefore breaking out mobile business intelligence vendors in the Mobile Business Intelligence Value Index we will conduct this year.

Another hot area of next-generation technology revolves around right-time and real-time data and how they can be built into organizational workflows. As our operational intelligence benchmark research found, perceptions of OI and real-time data differ significantly between IT and business users. We will extend this discussion in the context of the big data analytics benchmark research, and specifically in the context of the Operational Intelligence Value Index we will conduct in 2013.

Using analytical best practices across business and IT

Our final theme relates to the use of analytical best practices across business and IT. We'll be looking at best practices for companies as they evolve to become analytics-driven organizations. In this context, we'll look at exploratory analytics, visual discovery and even English-language representation of the analytics itself as approaches in our next-generation business analytics benchmark research, and at how these affect how we assess BI in our Business Intelligence Value Index. We'll look at how organizations exploit predictive analytics on big data for competitive advantage, a benefit identified in our predictive analytics benchmark research, within the context of our big data analytics benchmark research. We'll look at the hot areas of sales and customer analytics, including best practices and their intersection with cloud computing models. And we'll look at previously untapped areas of analytics that are just now heating up, such as human capital analytics. In our human capital analytics benchmark research, which will begin shortly, we'll look across the landscape to assess not just analytics associated with core HR, but analytics around talent management and workforce optimization as well.

I see a high level of innovation and change going on in the business analytics market in 2013. Whenever an industry undergoes such change, high-quality primary research acts as a lighthouse for both customers and suppliers.  Companies can capitalize on all of the exciting developments in analytics, business intelligence and related areas of innovation to drive competitive advantage, but only if they understand the changes and potential value.

I am looking forward to providing a practical perspective on using all forms of business analytics to create value in organizations and to helping our Ventana Research community and clients.

Come read and download the full research agenda.

Regards,

Tony Cosentino
VP and Research Director

Revolution Analytics is a commercial provider of software and services related to enterprise implementations of the open source language R. At its base level, R is a programming language built by statisticians for statistical analysis, data mining and predictive analytics. In a broader sense, it is data analysis software used by data scientists to access data, develop and perform statistical modeling, and visualize data. The R community has a growing user base of more than two million users worldwide, and more than 4,000 available packages cover specific problem domains across industries. Both the R Project and Revolution Analytics have significant momentum in the enterprise and in academia.

Revolution Analytics provides value by taking the most recent release from the R community and adding scalability and other functionality so that R can be implemented and work seamlessly in a commercial environment. Revolution R provides a development environment so that data scientists can write and debug R code more effectively, as well as web service APIs so that R can integrate with business intelligence tools, visual discovery tools and dashboards. In addition, Revolution Analytics makes money through professional and support services.

Companies are collecting enormous amounts of data, but few have active big data analytics strategies. Our big data benchmark research shows that more than 50 percent of companies in our sample maintain more than 10TB of data, but often they cannot analyze that data because of scale issues. Furthermore, our research into predictive analytics shows that integration into the current architecture is the biggest obstacle to implementing predictive analytics.

Revolution Analytics helps address these challenges in a few ways. It can perform file-based analytics, where a single node orchestrates commands across a cluster of commodity servers and delivers the results back to the end user; this is an on-premises solution that runs on Linux clusters or Microsoft HPC clusters. A perhaps more exciting use case is alignment with the Hadoop MapReduce paradigm, where Revolution Analytics allows direct manipulation of the HDFS file system, can submit a job directly to the Hadoop jobtracker, and can directly define and manipulate analytical data frames through HBase database tables. When front-ended with a visualization tool such as Tableau, this ability to work with data directly in Hadoop becomes a powerful tool for big data analytics. A third use case has to do with the parallelization of computations within the database itself. This in-database approach is gaining a lot of traction for big data analysis, primarily because it is the most efficient way to do analytics on very large structured data sets without moving a lot of data. For example, IBM's PureData System for Analytics (IBM's new name for its MPP Netezza appliance) uses the in-database approach with an R instance running on each processing unit in the database, each of which is connected to an R server via ODBC. The analytics are invoked as the data is served up to the processor, so that the algorithms run in parallel across all of the data.
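To make the Hadoop use case more concrete, here is a minimal sketch of how an analyst might push a simple aggregation into a Hadoop cluster from an R session using the open source RHadoop packages (rhdfs and rmr2) that Revolution Analytics sponsors. The HDFS path, column positions and variable names are hypothetical, and the exact package behavior should be verified against the installed versions.

```r
# Sketch only: aggregate revenue by customer segment inside Hadoop via RHadoop.
# Assumes rhdfs and rmr2 are installed and HADOOP_CMD/HADOOP_STREAMING are set.
library(rhdfs)
library(rmr2)

hdfs.init()  # attach the R session to the HDFS file system

# Map: columns 1 and 2 of each CSV chunk are assumed to hold segment and revenue.
map.fn <- function(k, v) keyval(v[[1]], as.numeric(v[[2]]))

# Reduce: sum revenue for each segment; this runs on the cluster, not the client.
reduce.fn <- function(k, vv) keyval(k, sum(vv))

out <- mapreduce(
  input        = "/data/transactions",                # hypothetical HDFS directory
  input.format = make.input.format("csv", sep = ","),
  map          = map.fn,
  reduce       = reduce.fn
)

revenue.by.segment <- from.dfs(out)  # pull the small result set back into R
```

The point of the pattern is that the data never leaves the cluster; only the aggregated result comes back to the analyst's session, where it can be handed to a visualization tool.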

In the big data analytics context, speed and scale are critical drivers of success, and Revolution R delivers on both. It is built with the Intel Math Kernel Library, so processing is streamlined for multithreading at the processor level and can leverage multiple cores simultaneously. In test cases on a single node, open source R was able to scale only to about 400,000 observations in a linear regression model, while Revolution R Enterprise was able to go into the millions. With respect to speed, Revolution R 6.1 was able to conduct a principal component analysis in about 11 seconds, versus 142 seconds with open source R 2.14.2. As similar tests are performed across multiple processors in a parallelized fashion, the observed performance difference increases.
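As an illustration of the scale point, the sketch below contrasts open source R's in-memory lm() with the rxLinMod() function from the RevoScaleR package that ships with Revolution R Enterprise. The file names and variables are hypothetical, and the RevoScaleR calls should be checked against the installed release.

```r
# Sketch only: the same regression in open source R and in Revolution R Enterprise.
library(RevoScaleR)  # available only in Revolution R Enterprise

# Open source R: the entire data frame must fit in the session's memory,
# which is what capped the single-node test at roughly 400,000 observations.
# 'in_memory_sample' is a hypothetical data frame already loaded in the session.
fit.base <- lm(revenue ~ ad_spend + region, data = in_memory_sample)

# Revolution R: import the raw file once into a chunked .xdf file on disk,
# then stream it block by block through a multithreaded regression.
rxImport(inData = "transactions.csv",   # hypothetical source file
         outFile = "transactions.xdf",
         overwrite = TRUE)
fit.rx <- rxLinMod(revenue ~ ad_spend + region, data = "transactions.xdf")
summary(fit.rx)
```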

So why does all of this matter? Our benchmark research into predictive analytics shows that companies that are able to score and update their models more efficiently show higher maturity and gain greater competitive advantage. From an analytical perspective, we can now squeeze more value out of our large data sets. We can analyze all of the data and take a census approach instead of a sampling approach, which in turn allows us to better understand the errors that exist in our models and to identify outliers and patterns that are not linear in nature. Along with prediction, the ability to identify outliers is probably the most important capability, since data anomalies often lead to the biggest insights and competitive advantage. Most importantly, from a business perspective, we can apply models to understand things such as individual customer behavior or a company's risk profile, or to take tactical advantage of market uncertainty.
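A simple way to see the census-versus-sample point is to score every record against a model and flag the anomalies, something base R expresses directly once the engine can hold the data. The data set and variable names below are hypothetical.

```r
# Sketch only: fit a model over all records, then flag the outliers the model
# explains poorly, which is often where the most interesting insights sit.
fit <- lm(spend ~ tenure + visits, data = customers)  # 'customers' is hypothetical
std.resid <- rstandard(fit)                           # standardized residuals
anomalies <- customers[abs(std.resid) > 3, ]          # records more than 3 SDs off
```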

I've heard that Revolution R isn't always the easiest software to use and that the experience isn't exactly seamless, but it can be argued that in the cutting-edge field of big data analytics a challenging environment is to be expected. If Revolution Analytics can address some of these usability challenges, it may find its pie growing even faster than it is now. Regardless, I anticipate that Revolution Analytics will continue its fast growth (already its headcount is doubling year over year). Furthermore, I anticipate that in-database analytics (an area where R really shines) will become the de facto approach to big data analytics, and that companies that take full advantage of that trend will reap the benefits.

Regards,

Tony Cosentino

VP and Research Director

Ventana Research has been researching and advocating operational intelligence for the past 10 years, but not always under that name. The use of events and analytics in business process management and the need for hourly and daily operational business intelligence originally drove the discussion, but the alignment with traditional BI architecture didn't allow for a seamless system, so a few years later the discussion began to focus on business process management and companies' ability to monitor and analyze processes on top of their enterprise applications. Business activity monitoring became the vogue term, but it did not denote the action orientation necessary to accurately describe this emerging area. Ventana Research had by that point already defined a category of technology and approaches that allow both monitoring and management of operational activities and systems, along with taking action on critical events. Today, Ventana Research defines operational intelligence as a set of event-centered information and analytics processes operating across the organization that enable people to take effective actions and make better decisions.

The challenge in defining a category in today's enterprise software market is that prolific innovation is driving a fundamental reassessment of category taxonomies. It's nearly impossible to define a mutually exclusive and collectively exhaustive set of categories, and without that, there will necessarily be overlapping categories and definitions. Take the category of big data: when we ask our community to define it, we get many perspectives on what big data represents.

Operational intelligence overlaps in many ways with big data. In technological terms, both deal with a diversity of data sources and data structures, both need to provide data in a timely manner, and both must deal with the exponential growth of data.

Also, business users and technologists often see both from different perspectives. Much like the blind men touching the elephant, each group sees OI as having a specific purpose based on its own vantage point. The technologist looks at operational intelligence from a systems and network management perspective, while business users look at things from a business performance perspective. This is apparent when we look at the data sources used for operational intelligence: IT places more importance on IT systems management data (79% vs. 40% for business), while business places more importance on financial data (54% vs. 39% for IT) and customer data (40% vs. 27% for IT). Business is also more likely to use business intelligence tools for operational intelligence (50% vs. 43%), while IT is more likely to use specialized operational intelligence tools (17% vs. 9% for business).

The last and perhaps biggest parallel is that in both cases, the terms are general, but their implementations and business benefits are specific. The top use cases in our study for operational intelligence were managing performance (59%), fraud and security (59%), compliance (58%) and risk management (58%). Overall we see relative parity in the top four, but when we drill down by industry, in areas such as financial services, government, healthcare and manufacturing, we see many differences. We conclude that each industry has unique requirements for operational intelligence, and this is very similar to what we see with big data.

It is not surprising that our definition of operational intelligence is still evolving. As we move from the century of designed data to the century of organic data (terminology coined by Census Director Robert Groves), many of our traditional labels are evolving. Business intelligence is beginning to overlap with categories such as big data, advanced analytics and operational intelligence. As I discussed in a recent blog post, The Brave New World of Business Intelligence, the business intelligence category was mature and was showing incremental growth only a few years ago, but it is difficult to call the BI category mature any longer.

Based on the results of our latest operational intelligence benchmark research, we feel confident that our current definition encompasses the evolving state of the market. As operational intelligence advances, we will continue to help put a frame around it. For now, it acts very much like what might be called “right-time big data.”

Regards,

Tony Cosentino

VP & Research Director
