
PentahoWorld 2015, Pentaho’s second annual user conference, held in mid-October, centered on the general availability of release 6.0 of its data integration and analytics platform and its acquisition by Hitachi Data Systems (HDS) earlier this year. Company spokespeople detailed the development of the product in relation to the roadmap laid out in 2014 and outlined plans for its integration with those of HDS and its parent Hitachi. They also discussed Pentaho’s and HDS’s shared intentions regarding the Internet of Things (IoT), particularly in telecommunications, healthcare, public infrastructure and IT analytics.

Pentaho competes on the basis of what it calls a “streamlined data refinery” that enables a flexible way to access, transform and integrate data and embed and present analytic data sets in usable formats without writing new code. In addition, it integrates a visual analytic workflow interface with a business intelligence front end including customization extensions; this is a differentiator for the company since much of the self-serve analytics market in which it competes is still dominated by separate point products.

Pentaho 6 aims to provide manageable and scalable self-service analytics. A key advance in the new version is what Pentaho calls "virtualized data sets," which logically aggregate multiple data sets according to transformations and integration specified in the Pentaho Data Integration (PDI) analytic workflow interface. This virtual approach allows the physical processing to be executed close to the data in various systems such as Hadoop or an RDBMS, which relieves users of the burden of continually moving data back and forth between the query and the response systems. In this way, logical data sets can be served up for consumption in Pentaho Analytics as well as other front-end interfaces in a timely and flexible manner.
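The push-down idea behind such virtualized data sets can be sketched in a few lines. This is a hypothetical illustration, not Pentaho's API: transformations are recorded as a logical plan and only compiled into a query that runs in the engine holding the data, rather than pulling rows into the analytics tool.

```python
# Hypothetical sketch of a "virtualized data set": operations are recorded
# lazily and compiled to SQL that executes close to the data. Class and
# method names are illustrative only.

class VirtualDataSet:
    def __init__(self, source_table):
        self.source_table = source_table
        self.filters = []
        self.columns = None

    def where(self, condition):
        # record the filter; nothing executes yet
        self.filters.append(condition)
        return self

    def select(self, *columns):
        self.columns = columns
        return self

    def to_sql(self):
        """Compile the logical plan into SQL pushed down to the source engine."""
        cols = ", ".join(self.columns) if self.columns else "*"
        sql = f"SELECT {cols} FROM {self.source_table}"
        if self.filters:
            sql += " WHERE " + " AND ".join(self.filters)
        return sql

vds = VirtualDataSet("events").where("ts > '2015-01-01'").select("user_id", "ts")
print(vds.to_sql())
# SELECT user_id, ts FROM events WHERE ts > '2015-01-01'
```

The point of the pattern is that the consuming front end sees one logical data set while each source system does its own heavy lifting.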

One challenge that emerges when accessing multiple integrated and transformed data sets is data lineage. Tracking lineage is important to establish trust in the data among users by enabling them to ascertain the origin of data prior to transformation and integration. This is particularly useful in regulated industries that may need access to and tracking of source data to prove compliance. Lineage becomes even more complicated with event data, which must be sourced completely and arrives in large volumes, a challenge found in over a third of organizations in our operational intelligence benchmark research, which examined operations-centric analytics and business intelligence.

Similarly, Pentaho 6 uses Simple Network Management Protocol (SNMP) to deliver application programming interface (API) extensions so that third-party tools can help provide governance lower in the system stack, further improving the reliability of data. Our benchmark research consistently shows that manageability of systems is important for user organizations, particularly in big data environments.

The flexibility introduced with virtual tables and the improvements in Pentaho 6.0 around in-line modeling (a concept I discussed after last year's event) are two critical means to building self-service analytic environments. Marrying various data systems with different data models, sometimes referred to as big data integration, has proven to be a difficult challenge in such environments. Pentaho's continued focus on big data integration and on providing an integration backbone to many business intelligence tools (in addition to its own) are potential competitive differentiators for the company. While analysts and users prefer integrated tool sets, today's fragmented analytics market is increasingly dominated by separate tools that prepare data and surface data for consumption. Front-end tools alone cannot automate the big data integration process, which Pentaho PDI can do. Our research into big data integration shows the importance of eliminating manual tasks in this process: 78 percent of companies said it is important or very important to automate their big data integration processes. Pentaho's ability to integrate with multiple visual analytics tools is important for the company, especially in light of the HDS accounts, which likely have a variety of front-end tools. In addition, the ability to provide an integrated front end can be attractive to independent software vendors, analytics services providers and certain end-user organizations that would like to embed both integration and visualization without having to license multiple products.

Going forward, Pentaho is focused on joint opportunities with HDS such as the emerging Internet of Things. Pentaho cites established industrial customers such as Halliburton, Intelligent Mechatronic Systems and Kirchhoff Datensysteme Software as reference accounts for IoT. In addition, a conference participant from Caterpillar Marine Asset Intelligence shared how it embeds Pentaho to help analyze and predict equipment failure in maritime equipment. Pentaho's ability to integrate and analyze multiple data sources is key to delivering business value in each of these environments, but the company also possesses a little-known asset in the Weka machine learning library, which is an integrated part of the product suite. Our research on next-generation predictive analytics finds that Weka is used by 5 percent of organizations, and many of the companies that use it are large or very large, which is Pentaho's target market. Given the importance of machine learning in the IoT category, it will be interesting to see how Pentaho leverages this asset.

Also at the conference, an HDS spokesperson discussed its target markets for IoT, or what the company calls "social innovation." These markets include telecommunications, healthcare, public infrastructure and IT analytics and reflect HDS's customer base and the core businesses of its parent company Hitachi. Pentaho Data Integration is currently embedded within major customer environments such as Caterpillar, CERN, FINRA, Halliburton, NASDAQ, Sears and Staples, but not all of these companies fit directly into the IoT segments HDS outlined. While Hitachi's core businesses provide fertile ground in which to grow its business, Pentaho will need to develop integration with the large industrial control systems already in place in those organizations.

The integration of Pentaho into HDS is a key priority. The 2,000-strong global sales force of HDS is now incented to sell Pentaho, and it will be important for the reps to include it as they discuss their accounts' needs. While Pentaho's portfolio can potentially broaden sales opportunities for HDS, big data software is a more consultative sale than the price-driven hardware and systems that the sales force may be used to. Furthermore, the buying centers, which are shifting from IT to lines of business, can differ significantly depending on the type of organization and its objectives. Addressing this will require significant training within the HDS sales force and with partner consulting channels. The joint sales efforts will be well served by emphasizing the "big data blueprints" developed by Pentaho over the last couple of years and by developing new ones that speak to IoT and the combined capabilities of the two companies.

HDS says it will begin to embed Pentaho into its product portfolio but has promised to leave Pentaho's roadmap intact. This is important because Pentaho has done a good job of listening to its customers and addressing the complexities that exist in big data and open source environments. As the next chapter unfolds, I will be looking at how the company integrates its platform with the HDS portfolio and expands it to deal with the complexities of IoT, which we will be investigating in an upcoming benchmark research study.

For organizations that need to use large-scale integrated data sets, Pentaho provides one of the most flexible yet mature tools in the market, and they should consider it. The analytics tool provides an integrated and embeddable front end that should be of particular interest to analytics services providers and independent software vendors seeking to make information management and data analytics core capabilities. For existing HDS customers, the Pentaho portfolio will open conversations in new areas of those organizations and potentially add considerable value within accounts.


Ventana Research

The emerging Internet of Things (IoT) extends digital connectivity to devices and sensors in homes, businesses, vehicles and potentially almost anywhere. This innovation enables devices designed for it to generate and transmit data about their operations; analytics using this data can facilitate monitoring and a range of automatic functions.

To perform these functions IoT requires what Ventana Research calls Operational Intelligence (OI), a discipline that has evolved from the capture and analysis of instrumentation, networking and machine-to-machine interactions of many types. We define operational intelligence as a set of event-centered information and analytic processes operating across an organization that enable people to use that event information to take effective actions and make optimal decisions. Our benchmark research into Operational Intelligence shows that organizations most often want to use such event-centric architectures for defining metrics (37%) and assigning thresholds for alerts (35%) and for more action-oriented processes of sending notifications to users (33%) and linking events to activities (27%).
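The event-centric pattern the research describes, defining a metric, assigning a threshold for alerts and sending a notification, can be sketched generically. The class and field names below are illustrative only, not any vendor's API.

```python
from dataclasses import dataclass
from typing import Callable

# Illustrative sketch of an event-centric rule: a metric, a threshold,
# and an action (here, a notification) triggered when the threshold is crossed.

@dataclass
class ThresholdRule:
    metric: str
    threshold: float
    action: Callable[[str, float], None]

    def evaluate(self, event: dict) -> bool:
        """Check one incoming event; fire the action if the threshold is exceeded."""
        value = event.get(self.metric)
        if value is not None and value > self.threshold:
            self.action(self.metric, value)
            return True
        return False

alerts = []
rule = ThresholdRule("cpu_temp_c", 85.0,
                     lambda m, v: alerts.append(f"ALERT {m}={v}"))

for event in [{"cpu_temp_c": 72.0}, {"cpu_temp_c": 91.5}]:
    rule.evaluate(event)

print(alerts)  # only the 91.5 reading triggers an alert
```

Linking events to activities, the fourth use in the research, would replace the notification lambda with a call into a workflow or process system.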

In many industries, organizations can gain competitive advantage if they can reduce the elapsed time between an event occurring and actions taken or decisions made in response to it. Existing business intelligence (BI) tools provide useful analysis of and reporting on data drawn from previously recorded transactions, but to improve competitiveness and maximize efficiencies organizations are concluding that employees and processes – in IT, business operations and front-line customer sales, service and support – also need to be able to detect and respond to events as they happen. Our research into big data integration shows that nearly one in four companies currently integrate data into big data stores in real time. The challenge is to go further and act upon both the data that is stored and the data that is streaming in a timely manner.

The evolution of operational intelligence, especially in conjunction with IoT, is encouraging companies to revisit their priorities and spending for information technology and application management. However, sorting out the range of options poses a challenge for both business and IT leaders. Some see potential value in expanding their network infrastructure to support OI. Others are implementing event processing (EP) systems that employ new technology to detect meaningful patterns, anomalies and relationships among events. Increasingly, organizations are using dashboards, visualization and modeling to notify nontechnical people of events and enable them to understand their significance and take appropriate and immediate action.

As with any innovation, using OI for IoT may require substantial changes. These are among the challenges organizations face as they consider adopting operational intelligence:

  • They find it difficult to evaluate the business value of enabling real-time sensing of data and event streams using identification tags, agents and other systems embedded not only in physical locations like warehouses but also in business processes, networks, mobile devices, data appliances and other technologies.
  • They lack an IT architecture that can support and integrate these systems as the volume and frequency of information increase.
  • They are uncertain how to set reasonable business and IT expectations, priorities and implementation plans for important technologies that may conflict or overlap. These can include business intelligence, event processing, business process management, rules management, network upgrades and new or modified applications and databases.
  • They don’t understand how to create a personalized user experience that enables nontechnical employees in different roles to monitor data or event streams, identify significant changes, quickly understand the correlation between events and develop a context in which to determine the right decisions or actions to take.

Ventana Research has announced new benchmark research on The Internet of Things and Operational Intelligence that will identify trends and best practices associated with this technology and these processes. It will explore organizations’ experiences with initiatives related to events and data and with attempts to align IT projects, resources and spending with new business objectives that demand real-time intelligence and event-driven architectures. The research will investigate how organizations are increasing their responsiveness to events by rebalancing the roles of networks, applications and databases to reduce latency; it also will explore ways in which they are using sensor data and alerts to anticipate problematic events. We will benchmark the performance of organizations’ implementations, including IoT, event stream processing, event and activity monitoring, alerting, event modeling and workflow, and process and rules management.

As operational intelligence evolves as the core of IoT platforms, it is an important time to take a closer look at this emerging opportunity and challenge. For those interested in learning more or becoming involved in this upcoming research, please let me know.



Splunk’s annual gathering, this year called .conf 2015, in late September hosted almost 4,000 Splunk customers, partners and employees. It is one of the fastest-growing user conferences in the technology industry. The area dedicated to Splunk partners has grown from a handful of booths a few years ago to a vast showroom floor many times larger. While the conference’s main announcement was the release of Splunk Enterprise 6.3, its flagship platform, the progress the company is making in the related areas of machine learning and the Internet of Things (IoT) most caught my attention.

Splunk’s strength is its ability to index, normalize, correlate and query data throughout the technology stack, including applications, servers, networks and sensors. It uses distributed search that enables correlation and analysis of events across local- and wide-area networks without moving vast amounts of data. Its architectural approach unifies cloud and on-premises implementations and provides extensibility for developers building applications. Originally, Splunk provided an innovative way to troubleshoot complex technology issues, but over time new uses for Splunk-based data have emerged, including digital marketing analytics, cyber security, fraud prevention and connecting digital devices in the emerging Internet of Things. Ventana Research has covered Splunk since its establishment in the market, most recently in this analysis of mine.

Splunk’s experience in dealing directly with distributed, time-series data and processes on a large scale puts it in position to address the Internet of Things from an industrial perspective. This sort of data is at the heart of large-scale industrial control systems, but it often comes in different formats and its implementations are based on different protocols. For instance, sensor technology and control systems that were invented 10 to 20 years ago use very different technology than modern systems. Furthermore, as with computer technology, there are multiple layers in stack models that have to communicate. Splunk’s tools help engineers and systems analysts cross-reference these disparate systems in the same way that it queries computer system and network data; however, the systems can be vastly different. To address this challenge, Splunk turns to its partners and its extensible platform. For example, Kepware has developed plug-ins that use its more than 150 communication drivers so users can stream real-time industrial sensor and machine data directly into the Splunk platform. Currently, the primary value drivers for organizations in the industrial IoT are operational efficiency, predictive maintenance and asset management. At the conference, Splunk showcased projects in these areas, including one with Target that uses Splunk to improve operations in robotics and manufacturing.

For its part, Splunk is taking a multipronged approach by acquiring companies, investing in internal development and enabling its partner ecosystem to build new products. One key enabler of its approach to IoT is machine learning algorithms built on the Splunk platform. In machine learning a model can use new data to continuously learn and adapt its answers to queries. This differs from conventional predictive analytics, in which users build models and validate them based on a particular sample; the model does not adapt over time. With machine learning, for instance, if a piece of equipment or an automobile shows a certain optimal pattern of operation over time, an algorithm can identify that pattern and build a model for how that system should behave. When the equipment begins to act in a less optimal or anomalous way, the system can alert a human operator that there may be a problem, or in a machine-to-machine situation, it can invoke a process to solve the problem or recalibrate the machine.
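The adaptive behavior described above can be sketched with a simple running model. This is a minimal illustration of the idea, not Splunk's actual algorithm: the detector continually updates its notion of "normal" (here, a running mean and standard deviation maintained with Welford's method) and flags readings that deviate sharply from it.

```python
import math

# Minimal sketch of adaptive anomaly detection: the model keeps learning
# the normal operating pattern and flags large deviations from it.

class RunningAnomalyDetector:
    def __init__(self, z_threshold=3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations (Welford's method)
        self.z_threshold = z_threshold

    def observe(self, x):
        """Return True if x is anomalous relative to the model, then update it."""
        anomalous = False
        if self.n >= 2:
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(x - self.mean) / std > self.z_threshold:
                anomalous = True
        # update the running statistics so the model keeps adapting
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous

detector = RunningAnomalyDetector()
readings = [50.1, 49.8, 50.3, 50.0, 49.9, 50.2, 75.0]  # last value is a spike
flags = [detector.observe(r) for r in readings]
print(flags)  # only the spike is flagged
```

In a machine-to-machine setting, a flagged reading would trigger a remediation or recalibration process instead of (or in addition to) alerting a human operator.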

Machine learning algorithms allow event processes to be audited, analyzed and acted upon in real time. They enable predictive capabilities for maintenance, transportation and logistics, and asset management and can also be applied in more people-oriented domains such as fraud prevention, security, business process improvement and digital products. IoT potentially can have a major impact on business processes, but only if organizations can realign their systems to a discover-and-adapt approach rather than a model-and-apply one. For instance, processes are often carried out in an uneven fashion different from the way the model was conceived and communicated through complex process documentation and systems. As more process flows are directly instrumented and more processes are carried out by machines, the ability to model directly from the discovery of those event flows and to adapt to them (through human learning or machine learning) becomes key to improving organizational processes. Such realignment of business processes, however, often involves broad organizational transformation. Our benchmark research on operational intelligence shows that challenges associated with people and processes, rather than information and technology, most often hold back organizational improvement.

Two product announcements made at the conference illuminate the direction Splunk is taking with IoT and machine learning. The first is User Behavior Analytics (UBA), based on its acquisition of Caspida, which produces advanced algorithms that can detect anomalous behavior within a network. Such algorithms can model internal user behavior, and when behavior deviates from the norm, generate an alert that can be addressed through investigative processes using Splunk Enterprise Security 4.0. Together, Splunk Enterprise Security 4.0 and UBA won the 2015 Ventana Research CIO Innovation Award. The acquisition of Caspida shows that Splunk is not afraid to acquire companies in niche areas where it can exploit its platform to deliver organizational value. I expect that we will see more such acquisitions of companies with high-value machine learning algorithms as Splunk carves out specific positions in emerging markets.

The other product announced is IT Service Intelligence (ITSI), which highlights machine learning algorithms alongside Splunk’s core capabilities. The IT Service Intelligence App is an application in which end users deploy machine learning to see patterns in various IT service scenarios. ITSI can inform and enable multiple business uses such as predictive maintenance, churn analysis, service-level agreements and chargebacks. Similar to UBA, it uses anomaly detection to point out issues and enables managers to view highly distributed processes such as claims process data in insurance companies. At this point, however, use of ITSI (like other areas of IoT) may encounter cultural and political issues as organizations deal with changes in the roles of IT and operations management. Splunk’s direction with ITSI shows that the company is staying close to its IT operations knitting as it builds out application software, but such development also puts Splunk into new competitive scenarios where legacy technology and processes may still be considered good enough.

We note that ITSI is built using Splunk’s Machine Learning Toolkit and showcase, which currently is in preview mode. The platform is an important development for the company and fills one of the gaps that I pointed out in its portfolio last year. Addressing this gap enables Splunk and its partners to create services that apply advanced analytics to big data, which almost half (45%) of organizations find important. I consider the use of predictive and advanced analytics on big data a killer application, and our benchmark research on big data analytics backs this claim: Predictive analytics is the type of analytics most (64%) organizations wish to pursue on big data.

Organizations currently looking at IoT use cases should consider Splunk’s strategy and tools in the context of the specific problems they need to address. Machine learning algorithms built for particular industries are key, so it is important to understand whether the problem can be addressed using prebuilt applications provided by Splunk or one of its partners, or whether the organization will need to build its own algorithms using the Splunk machine learning platform or alternatives. Evaluate both the platform capabilities and the instrumentation: the types of protocols and formats involved and how that data will be consumed into the system and related in a uniform manner. Most of all, be sure the skills and processes in the organization align with the technology from an end-user and business perspective.



At its annual industry analyst summit last month and in a more recent announcement of enterprise support for parallelizing the R language on its Aster Discovery Platform, Teradata showed that it is adapting to changes in database and analytics technologies. The presentations at the conference revealed a unified approach to data architectures and value propositions in a variety of uses including the Internet of Things, digital marketing and ETL offloading. In particular, the company provided updates on the state of its business as well as how the latest version of its database platform, Teradata 15.0, is addressing customers’ needs for big data. My colleague Mark Smith covered these announcements in depth. The introduction of scalable R support was discussed at the conference but not announced publicly until late last month.

Teradata now has a beta release of parallelized support for R, an open source programming language used widely in universities and growing rapidly in enterprise use. One challenge is that R relies on a single-threaded, in-memory approach to analytics. Parallelization of R allows an algorithm to run on much larger data sets, since it is not limited to data stored in memory. For a broader discussion of the pros and cons of R and its evolution, see my analysis. Our benchmark research shows that organizations are counting on companies such as Teradata to provide a layer of abstraction that can simplify analytics on big data architectures. More than half (54%) of advanced analytics implementations are custom built, but in the future this percentage will go down to about one in three (36%).

Teradata’s R project has three parts. The first includes a Teradata Aster R library, which supplies more than 100 prebuilt R functions that hide complexity of the in-database implementation. The algorithms cover the most common big data analytic approaches in use today, which according to our big data analytics benchmark research are classification (used by 39% of organizations), clustering (37%), regression (35%), time series (32%) and affinity analysis (29%). Some use innovative approaches available in Aster such as Teradata’s patented nPath algorithm, which is useful in areas such as digital marketing. All of these functions will receive enterprise support from Teradata, likely through its professional services team.

The second part of the project involves the R parallel constructor. This component gives analysts and data scientists tools to build their own parallel algorithms based on the entire library of open source R algorithms. The framework follows the “split, apply and combine” paradigm, which is popular among the R community. While Teradata won’t support the algorithms themselves, this tool set is a key innovation that I have not yet seen from others in the market.
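The "split, apply and combine" paradigm the constructor follows can be sketched in a few lines (shown here in Python for brevity; Teradata's constructor targets R, and in its case the "apply" step is what gets distributed across database workers).

```python
from collections import defaultdict

# Generic sketch of split-apply-combine. Illustrative only; a parallel
# engine would farm the "apply" step out across partitions.

def split_apply_combine(rows, key, apply_fn):
    # split: partition the rows by the value of `key`
    groups = defaultdict(list)
    for row in rows:
        groups[row[key]].append(row)
    # apply: run the function independently on each partition,
    # then combine: gather the per-group results into one answer
    return {k: apply_fn(g) for k, g in groups.items()}

sales = [
    {"region": "east", "amount": 100},
    {"region": "west", "amount": 250},
    {"region": "east", "amount": 300},
]
totals = split_apply_combine(sales, "region",
                             lambda g: sum(r["amount"] for r in g))
print(totals)  # {'east': 400, 'west': 250}
```

Because each partition is processed independently, any function written in this shape can be parallelized without the analyst managing the distribution.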

Finally, the R engine has been integrated with Teradata’s SNAP integration framework. The framework provides unified access to multiple workload-specific engines such as relational (SQL), graph (SQL-GR), MapReduce (SQL-MR) and statistics. This is critical since the ultimate value of analytics rests in the information itself. By tying together multiple systems, Teradata enables a variety of analytic approaches. More importantly, the data sources that can be merged into the analysis can deliver competitive advantages. For example, the recently announced JSON integration delivers information from a plethora of connected devices and detailed Web data.

Teradata is participating in industry discussions about both data management and analytics. As Mark Smith discussed, its unified approach to data architecture addresses challenges brought on by competing big data platforms such as Hadoop and other NoSQL approaches, like the integration announced with MongoDB to support JSON. These platforms access new information sources and help companies use analytics to indirectly increase revenues, reduce costs and improve operational efficiency. Analytics applied to big data serve a variety of uses, most often cross-selling and up-selling (for 38% of organizations), better understanding of individual customers (32%) and optimizing price (30%) and IT operations (24%). Teradata is active in these areas and is working in multiple industries such as financial services, retail, healthcare, communications, government, energy and utilities.

Current Teradata customers should evaluate the company’s broader analytic and platform portfolio, not just the database appliances. In the fragmented and diverse big data market, Teradata is sorting through the chaos to provide a roadmap for the largest organizations down to midsized ones. The Aster Discovery Platform can put power into the hands of analysts and statisticians who need not be data scientists. Business users from various departments, but especially high-level marketing groups that need to integrate multiple data sources for operational use, should take a close look at the Teradata Aster approach.


Tony Cosentino

VP & Research Director

Information Builders announced two major new products at its recent annual user summit. The first was InfoDiscovery, a tool for ad hoc data analysis and visual discovery. The second was iWay Sentinel, which allows administrators to manage applications in a proactive and dynamic manner. Being a privately held company, Information Builders is not a household name, but it is a major provider of highly scalable business intelligence (BI) and information management software to companies around the world.

This year’s announcements come one year after the release of WebFOCUS 8.0, which I wrote about at the time. Version 8.0 of this flagship BI product includes a significant overhaul of the underlying code base, and its biggest change is how it renders graphics, putting the parameters of the HTML5 graph code directly inside the browser. This approach allows consistent representation of business intelligence graphics in multiple device environments, including mobile ones. Our research into information optimization shows that mobile technology improves business performance significantly in one out of three organizations. The graphics capability helped Information Builders earn the rating of Hot Vendor in our latest Value Index on Mobile Business Intelligence. Combining analytics with transactional systems in a mobile environment is an increasingly important trend. Our research shows that mobile business intelligence is advancing quickly: Nearly three-quarters (71%) of participants said they expect their mobile workforce to have BI capabilities in the next 12 months.

WebFOCUS InfoDiscovery represents the company’s new offering in the self-service analytics market. For visual discovery it enables users to extract, blend and prepare data from various data sources such as spreadsheets, company databases and third-party sources. Once the analytic data set is created, users can drill down into the information in an underlying columnar database. They can define queries as they go and examine trends, correlations and anomalies in the data set. Users given permission can publish visualizations from their desktop to the server for others to view or build upon. Visualization is another area of increasing importance for organizations. Our research on big data analytics finds that data visualization has a number of benefits; the most often cited are faster analytics (by 49%), understanding content (48%), root-cause analysis (40%) and displaying multiple result sets at the same time (40%).

InfoDiscovery is Information Builders’ contender in the new breed of visual discovery products. The first generation of visual discovery products drew attention for their visual capabilities, ease of use and agility. More recently, established business intelligence vendors, of which Information Builders is one, have focused on developing visual discovery tools on the platform of their well-known BI products, with the aim of taking advantage of their maturity. Currently this second wave of tools is still behind the first in terms of ease of use and visual analysis, but it is advancing rapidly and can provide better data governance, version control, auditing and user security. For instance, InfoDiscovery uses the same metadata as the enterprise platform WebFOCUS 8, so objects from both InfoDiscovery and other WebFOCUS applications can be configured in the same user portal. When a business user selects a filter, the data updates across all the components in the dashboard. The HTML5 rendering engine, new in WebFOCUS 8.0, makes the dashboard available to various devices including tablets and smartphones.

The other major announcement at the conference, iWay Sentinel, is a real-time application monitoring tool that helps administrators manage resources across distributed systems. It works with iWay Service Manager, which is used to manage application workflows. iWay Sentinel allows multiple instances of Service Manager to be viewed and managed from a single Web interface, and administrators can address bottlenecks in system resources both manually and automatically. The tool belongs in the category we call operational intelligence, and as our research finds, activity and event monitoring is the most important use (for 62% of research participants), followed by alerting and notification.

Sentinel is an important product in the Information Builders portfolio for a couple of reasons. Tactically speaking, it enables large organizations that are running multiple implementations of iWay Service Manager to manage infrastructure resources in a flexible and streamlined manner. From a strategic perspective, it ties the company to the emerging Internet of Things (IoT), which connects devices and real-time application workflows across a distributed environment. In such an environment, rules and process flows must be monitored and coordinated in real time. Information is passed along an enterprise service bus that enables synchronous interaction of various application components. IoT is used in multiple areas such as remote management of devices, telematics and fleet management, predictive maintenance, supply chain optimization and utilities monitoring. The challenge is that application software is often complex and its processes are interdependent. For this reason, most approaches to the IoT have been proprietary in nature. Even so, Information Builders has a large number of clients in various industries, especially retail, that may be interested in its approach.

Information Builders continues to innovate in the changing IT industry and business demand for analytics and data, building on its integration capabilities and its core business intelligence assets. The breadth and depth of its software portfolio enable the company to capitalize on these assets as demand shifts. For instance, temporal analysis is becoming more important; Information Builders has built that capability into its products for years. In addition, the company’s core software is hardened by years of meeting high-concurrency needs. Companies that have thousands of users need this type of scalable, battle-tested system.

Both iWay Sentinel and InfoDiscovery are in limited release currently and will be generally available later this year. Users of other Information Builders software should examine InfoDiscovery and assess its fit in their organizations. For business users it offers a self-service approach on the same platform as the WebFOCUS enterprise product. IT staff can uphold their governance and system management responsibilities through visibility and flexible control of the platform. For its part iWay Sentinel should interest companies that have to manage multiple instances of information applications and use iWay Service Manager. In particular, retailers, transportation companies and healthcare companies exploring IoT uses should consider how it can help.

Information Builders is exploiting the value of data through what it calls information optimization, finding continued growth in providing information applications that meet specific business and process needs. The company is also beginning to exploit big data sources and mobile technology but will need to invest further to address the full spectrum of new business needs. I continue to recommend that any company that must serve a large workforce and needs to blend data and analytics for business intelligence or information needs consider Information Builders.


Tony Cosentino

VP and Research Director

Did you catch all the big data analogies people used in 2012? There were many, like the refinement of oil analogy, or the spinning straw into gold analogy, and less useful but more entertaining ones, like big data is like a box of chocolates, or big data is like The Matrix (because “there’s no way Keanu Reeves learns Kung Fu in five seconds without using big data”).  I tend to like the water analogy, which I’ll use here to have a little fun and to briefly describe how I see the business analytics market in 2013.

2013 is about standing lakes of information that will turn into numerous tributaries. These various tributaries of profile, behavioral and attitudinal data will flow into the digital river of institutional knowledge. Analytics, built out by people, process, information and technology, will be the only banks high enough to control this vast river and funnel it through the duct of organizational culture and into an ocean of market advantage.

With this river of information as a backdrop, I’m excited to introduce the Ventana Research Business Analytics Research Agenda for 2013, focused on three themes:

Answering the W’s (the what, the so what, the now what and the then what)

The first and perhaps most important theme of the 2013 research agenda builds on answering the W’s – the what, the so what, the now what and the then what – which was also the topic of one of my most widely read blog posts last year. In that piece I suggested that the shift from discussing the three V’s to discussing the four W’s corresponds to the shift from a technologically oriented discussion to a business-oriented one. Volume, variety and velocity are well-known parameters of big data that help facilitate the technology discussion, but when we look at analytics and how it can drive success for an organization, we need to move to the so what, now what and then what of analytical insights, organizational decision-making and closed-loop processes. Our big data research found significant gaps between the business analytics capabilities available in organizations today and what they need, which is fueling a new generation of technology for consuming big data.

Outcome-driven approaches are a good way of framing issues, given that business analytics, and in particular big data analytics, are such broad topics, yet the use cases are so specific. Our big data analytics benchmark research for 2013, which we will start shortly, will therefore look at specific benefits and supported business cases across industries and lines of business in order to assess best practices for big data analytics. The research will investigate the opportunities and barriers that exist today and explore what needs to happen for us to move from an early adopter market to an early majority market. The benchmark research will feed weighting algorithms into our Big Data Analytics Value Index, which will look at the analytics vendors that are tackling the formidable challenges of providing software to analyze large and multistructured datasets.

Disseminating insights within the organization is a big part of moving from insights to action, and business intelligence is still a primary vehicle for driving insight into the organization. While there is a lot to be said about mobile BI, collaborative BI, visual discovery and predictive analytics, core business intelligence systems remain at the heart of many organizations. Therefore we will continue our in-depth coverage of core business intelligence systems with our Business Intelligence Value Index, in the context of our next-generation business analytics benchmark research that will start in 2013.

Embracing next generation technology for business analytics

We’re beginning to see businesses embracing next-generation technology for business analytics. Collaborative business intelligence is a critical part of this conversation, both in terms of getting insights and in terms of making decisions. Last year’s next-generation business intelligence benchmark research showed us that the market is still undecided on how next-generation BI will be rolled out, with business applications being the preferred method, but only slightly more so than business intelligence or office productivity tools. In addition to collaboration, we will focus on mobile and location trends in our next-generation business analytics benchmark research and our new location analytics benchmark research; we have already found that business analysts have specific needs in this area. We see mobile business intelligence as a particularly hot area in 2013, and we are therefore breaking out mobile business intelligence vendors in this year’s Mobile Business Intelligence Value Index.

Another hot area of next-generation technology revolves around right-time data and real-time data and how they can be built into organizational workflows. As our operational intelligence benchmark research found, perceptions around OI and real-time data differ significantly between IT and business users. We will extend this discussion in the context of the big data analytics benchmark research, and specifically in the context of our Operational Intelligence Value Index that we will do in 2013.

Using analytical best practices across business and IT

Our final theme relates to the use of analytical best practices across business and IT. We’ll be looking at best practices for companies as they evolve to become analytics-driven organizations. In this context, we’ll look at exploratory analytics, visual discovery and even English-language representation of the analytics itself as approaches in our next-generation business analytics benchmark research, and we’ll examine how these approaches affect our assessment of BI in our Business Intelligence Value Index. We’ll look at how organizations exploit predictive analytics on big data as a competitive advantage, as found in our predictive analytics benchmark, within the context of our big data analytics benchmark research. We’ll look at the hot areas of sales and customer analytics, including best practices and their intersection with cloud computing models. And we’ll look at previously untapped areas of analytics that are just now heating up, such as human capital analytics. In our human capital analytics benchmark research, which will begin shortly, we’ll look across the landscape to assess not just analytics associated with core HR but analytics around talent management and workforce optimization as well.

I see a high level of innovation and change going on in the business analytics market in 2013. Whenever an industry undergoes such change, high-quality primary research acts as a lighthouse for both customers and suppliers.  Companies can capitalize on all of the exciting developments in analytics, business intelligence and related areas of innovation to drive competitive advantage, but only if they understand the changes and potential value.

I am looking forward to providing a practical perspective on using all forms of business analytics to create value in organizations and to helping our Ventana Research community and clients.

Come read and download the full research agenda.


Tony Cosentino
VP and Research Director

Ventana Research has been researching and advocating operational intelligence for the past 10 years, but not always under that name. The use of events and analytics in business process management and the need for hourly and daily operational business intelligence originally drove the discussion, but the alignment with traditional BI architecture didn’t allow for a seamless system; so a few years later the discussion started to focus on business process management and the ability of companies to monitor and analyze BPM on top of their enterprise applications. Business activity monitoring became the vogue term, but that term did not denote the action orientation necessary to accurately describe this emerging area. Ventana Research had at that point already defined a category of technology and approaches that allow both monitoring and management of operational activities and systems along with taking action on critical events. Today, Ventana Research defines Operational Intelligence as a set of event-centered information and analytics processes operating across the organization that enable people to take effective actions and make better decisions.

The challenge in defining a category in today’s enterprise software market is that prolific innovation is driving a fundamental reassessment of category taxonomies. It’s nearly impossible to define a mutually exclusive and collectively exhaustive set of categories, and without that, there will necessarily be overlapping categories and definitions. Take the category of big data: when we ask our community for a definition, we get many perspectives on what big data represents.

Operational intelligence overlaps in many ways with big data. In technological terms, both deal with a diversity of data sources and data structures, both need to provide data in a timely manner, and both must deal with the exponential growth of data.

Also, business users and technologists often see both from different perspectives. Much like the blind men touching the elephant, each group believes OI has a specific purpose based on its own vantage point. The technologist looks at operational intelligence from a systems and network management perspective, while business users look at it from a business performance perspective. This is apparent when we look into the data sources used for operational intelligence: IT places more importance on IT systems management (79% vs. 40% for business), while business places more importance on financial data (54% vs. 39% for IT) and customer data (40% vs. 27% for IT). Business is also more likely to use business intelligence tools for operational intelligence (50% vs. 43%), while IT is more likely to use specialized operational intelligence tools (17% vs. 9% for business).

The last and perhaps biggest parallel is that in both cases, the terms are general, but their implementations and business benefits are specific. The top use cases in our study for operational intelligence were managing performance (59%), fraud and security (59%), compliance (58%) and risk management (58%). Overall we see relative parity in the top four, but when we drill down by industry, in areas such as financial services, government, healthcare and manufacturing, we see many differences. We conclude that each industry has unique requirements for operational intelligence, and this is very similar to what we see with big data.

It is not surprising that our definition of operational intelligence is still evolving. As we move from the century of designed data to the century of organic data (terminology coined by Census Director Robert Groves), many of our traditional labels are evolving. Business intelligence is beginning to overlap with categories such as big data, advanced analytics and operational intelligence. As I discussed in a recent blog post, The Brave New World of Business Intelligence, the business intelligence category was mature and was showing incremental growth only a few years ago, but it is difficult to call the BI category mature any longer.

Based on the results of our latest operational intelligence benchmark research, we feel confident that our current definition encompasses the evolving state of the market. As operational intelligence advances, we will continue to help put a frame around it. For now, it acts very much like what might be called “right-time big data.”


Tony Cosentino

VP & Research Director

In this second in a blog series on business analytics I focus on the increasingly important area of predictive analytics. Our benchmark research into predictive analytics shows that while the vast majority of companies see this technology as important or very important for the future of their organizations, most are not taking full advantage of it. This finding suggests that there is an opportunity for companies to gain competitive advantage by implementing predictive analytics in the near term.

Earlier this year I spoke at the Predictive Analytics Summit in San Diego as part of a panel entitled “Winning with Data Science: Transforming Complexity into Simplicity”. Listening to my fellow panelists and presenters as well as speaking with vendors at their booths confirmed that the category of predictive analytics is still being defined. In fact, the environment reminded me a bit of the dot-com era in its energy as well as its disorder. But the two are different: The dot-com era was built on a “field of dreams” where companies built massive web properties but the consumers they expected never arrived; the value for the consumer was not obvious. The value of predictive analytics is much clearer and much more rooted in the realities of business. This is confirmed by our benchmark research, in which more than two-thirds of companies view the use of predictive analytics as conferring a competitive advantage.

Because the term predictive analytics is sometimes confused with others, let’s take a moment to define it. In its simplest sense, predictive analytics is about using existing data to predict future outcomes. For example, in database marketing it is often associated with scoring a customer record with the probability (or likelihood) of a desired behavior such as purchasing a particular product. Predictive analytics differs from descriptive analytics in that the latter is about describing existing data and examining how an existing dataset behaves. Descriptive analytics is exploratory in nature, and it is the basic approach used with legacy BI systems as well as with the new class of visual discovery tools. In descriptive analytics, the data is what it is; in predictive analytics, we use the existing data and the laws of probability to predict the future.
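To make the distinction concrete, here is a minimal sketch of what scoring a customer record might look like. The coefficients, field names and input values are purely illustrative assumptions, not taken from any real model or dataset; in practice the coefficients would come from a model fitted to historical data.

```python
import math

# Illustrative coefficients for a previously fitted logistic model.
# These values and field names are hypothetical, for demonstration only.
INTERCEPT = -2.0
WEIGHTS = {"recency_days": -0.01, "past_purchases": 0.6}

def purchase_probability(record):
    """Score a customer record with the probability of a desired behavior."""
    z = INTERCEPT + sum(w * record[field] for field, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))  # logistic (sigmoid) link function

customer = {"recency_days": 30, "past_purchases": 5}
score = purchase_probability(customer)  # a value between 0 and 1
```

Descriptive analytics would stop at summarizing the existing records; the predictive step is applying the fitted coefficients to estimate the probability of a future purchase.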

The market for predictive analytics is best understood by viewing it as divided into three subcategories: business operations and financial predictive analytics, industry-specific predictive analytics and customer behavior and marketing predictive analytics.

Operations and financial predictive analytics includes the use of predictive analytics in areas such as financial planning, workforce management, IT and supply chain operations. Financial forecasting (the domain of my colleague Robert Kugel) has been a part of the predictive analytics world for a long time. Financial predictive models utilize many different factors, including past company performance and leading economic indicators, to predict revenues and to budget more effectively.

More recently, in areas such as supply chain management, predictive analytics is allowing companies to match their stock with customer demand, thereby reducing inventory costs. In such a system, a manufacturer may collaborate with the retailer to look at run rates and predict stock-keeping unit (SKU) levels. Traditionally this was done with a store manager’s guess or by applying uniform assumptions across all inventories. By applying this type of predictive analysis, companies are able to reduce the inventory levels needed by their partners and segment the market to more efficiently align to an increasingly niched retail market environment.
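As a simple illustration of the run-rate idea, the sketch below forecasts next week’s demand for a single SKU with exponential smoothing. The sales figures, smoothing factor and safety-stock policy are fabricated for demonstration; real demand-planning systems use far richer models.

```python
def forecast_demand(weekly_sales, alpha=0.3):
    """Single exponential smoothing: one-step-ahead forecast for a SKU.
    alpha weights recent weeks more heavily; 0.3 is an illustrative choice."""
    level = float(weekly_sales[0])
    for actual in weekly_sales[1:]:
        level = alpha * actual + (1 - alpha) * level
    return level

# Hypothetical run rates for one SKU over six weeks
sales = [120, 130, 125, 140, 150, 160]
next_week = forecast_demand(sales)  # roughly 142 units
reorder_qty = next_week * 1.2       # 20% safety stock, a made-up policy
```

Replacing a store manager’s guess with even this crude smoothed estimate applies a consistent rule to every SKU, which is the first step toward the segmented, demand-matched inventory described above.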

Predictive analytics has applicability across an array of other areas as well. Workforce management systems, for example, use predictive analytics to understand the staying power of an employee based on his or her job history, or to plan capacity and rationalize new hires. IT is using predictive analytics to analyze log data and automate systems, thereby reducing the time it takes to manage the company IT infrastructure.

Industry-specific predictive analytics encompasses niche undertakings like fraud prevention, risk analysis and disease prediction. Police departments, for instance, do a better job of matching resources to threats when they use predictive models to determine when and where a violent crime might occur. Niche applications in sports analytics (think “Moneyball”) are changing how teams recruit players and even play their games. Healthcare companies and practitioners increasingly are predicting the occurrence of diseases and using these predictions to shape clinician behavior, drug production priorities and treatment protocols.

Big data approaches make possible interesting predictive analytics opportunities in specific areas such as Internet security. Predicting and preventing security threats, for example, is complex since it involves multiple variables that are constantly changing and new variables that are constantly being introduced. The ability to analyze the large volumes of network flow, log and new malware data to understand the different patterns and threat vectors now makes it possible to build predictive algorithms that can be used to recognize and score potential harm to the system.
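One elementary building block of such scoring is comparing live measurements against a learned baseline. The sketch below flags a traffic spike with a standard score; the numbers and the three-sigma threshold are illustrative assumptions, and real systems combine many such signals with trained models.

```python
import statistics

# Hypothetical baseline of bytes-per-minute for one host (illustrative data)
baseline = [1200, 1100, 1300, 1250, 1150, 1400, 1000]
observed = 5200  # a new measurement to score

mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)
z = (observed - mean) / stdev  # standard deviations above normal traffic
is_suspicious = z > 3.0        # a crude three-sigma threshold
```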

Customer behavior and marketing predictive analytics is the area that likely hits closest to home for the many business managers who have been hearing that big data and predictive analytics are changing the world. In fact, according to our benchmark research, revenue-producing functions are the business areas where predictive analytics are being used most, with 65 percent of organizations using the technology in marketing and 59 percent in sales.

Loyalty and customer analytics are hot topics right now, and analytical CRM frameworks married with the right toolsets are providing sophisticated ways of not only predicting attrition but preventing it from happening. Companies are looking at individual-level behavior and wallet share across both online and offline environments. This individual-level view currently predominates, and it is proving to be a powerful tool when supported by the right data.

One area in particular where this sort of modeling is effectively being used is sales attribution, which is a major component of return on marketing investment (ROMI). The adage often attributed to John Wanamaker that “Half the money I spend on advertising is wasted; the trouble is I don’t know which half” has been applied to marketing spend as well, but it may not necessarily be true any longer in the era of big data and predictive analytics.

Implications and Recommendations

The adoption of predictive analytics, particularly in the important areas of marketing and sales, is forcing an uneasy partnership between CIOs and CMOs. This is because data quality and information management issues, traditionally the domain of the CIO, need to be resolved in order to realize the true value of predictive analytics. From the CMO’s perspective, predictive analytics has enormous power to predict things such as the next best customer offer, but if the product or customer data is incorrect, the value of the prediction is severely diminished.

On the flip side, some marketing services categories and approaches face disruption due to the emergence of predictive analytics. Media buying is an obvious one, but also impacted is the lesser-known cottage industry around market-mix modeling. This modeling technique uses multivariate regression to predict the impact of various promotional channels (that is, of the marketing mix) on future sales. As companies become able to do predictive behavioral modeling on an individual basis, they can fine-tune how they tie together promotions and sales. This diminishes the need for less precise aggregate approaches such as market-mix models.
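For readers unfamiliar with the mechanics, here is a stripped-down sketch of a market-mix regression: an ordinary least squares fit of weekly sales on channel spend, solved through the normal equations. The spend and sales figures are fabricated (generated so the fit is exact); real models add many more channels, time lags and diminishing-returns terms.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting: solve A x = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def market_mix_fit(spend_rows, sales):
    """OLS via normal equations: sales ~ intercept + tv_spend + web_spend."""
    X = [[1.0] + row for row in spend_rows]  # prepend intercept column
    k = len(X[0])
    XtX = [[sum(x[i] * x[j] for x in X) for j in range(k)] for i in range(k)]
    Xty = [sum(x[i] * y for x, y in zip(X, sales)) for i in range(k)]
    return solve(XtX, Xty)

# Fabricated weekly data: columns are [tv_spend, web_spend]
spend = [[10, 5], [20, 5], [10, 15], [20, 15], [15, 10]]
sales = [85, 125, 115, 155, 120]  # generated as 30 + 4*tv + 3*web
intercept, beta_tv, beta_web = market_mix_fit(spend, sales)
```

The fitted coefficients estimate incremental sales per unit of spend in each channel, which is exactly the aggregate-level attribution that individual-level behavioral models are beginning to displace.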

Increased reliance on predictive analytics may also result in realignment of business processes and roles. As business decision makers are able to do their own exploratory analysis and predictive “what-if” modeling and take immediate action based on that, sophisticated BI tools used by those executives may begin to replace the traditional analyst. However, with executive level baby boomers extending their stay in corporate America and the first generation of “digital natives” just graduating from school, such a scenario isn’t likely for mainstream businesses anytime soon.

As organizations move forward with their predictive analytics initiatives, I recommend they think broadly about how the models will be integrated into their existing systems, what type of modeling is needed, and how complex the models need to be. And organizations should by all means explore making use of the support offered by the vendors of the applications and tools that are deployed. At the moment IBM’s SPSS and SAS are leaders in the predictive analytics space with a broad range of tools and models addressing a wide range of use cases. MicroStrategy takes a different approach with its 9.3 release, allowing R to be programmed inside the software so that the functions run in an embedded manner. Many providers, including those mentioned above, support Predictive Model Markup Language (PMML), an XML-based standard that allows predictive models to be shared across applications. For big data initiatives, there are some interesting offerings: Datameer has partnered with Zementis, for example, to develop a universal PMML plug-in that allows SPSS, SAS and R models to be integrated with its Hadoop-based engine.

Approaches such as these help overcome the challenge of architectural integration, which was one of the key obstacles to the deployment and use of predictive analytics identified in our benchmark research. The integration of models is difficult because the statistician who builds the model rarely has the skill set to code the model, and so there is often much misalignment between the intent of the model designer and what the model actually does once it is implemented. With more and more vendors embedding analytic support in their portfolios and further adoption of the PMML standard, this should become less of an issue.

Companies should also pay close attention to the human factor when rolling out a predictive analytics initiative. Our Predictive Analytics Maturity Index shows that of the four dimensions (People, Process, Information, Technology) in terms of which we evaluate maturity, the People dimension is the least mature when it comes to predictive analytics. This issue, which largely is about available skill sets, potentially can be addressed by hiring recent graduates, as many schools are teaching the R language and graduates are coming out with an appreciation of its power. Open source R is an increasingly popular language that is in many ways the lingua franca of predictive analytics. Going one step further, IBM is working with schools such as Northwestern University to put SPSS and its other advanced analytics tools in the hands of educators and students. SAS, meanwhile, has a very strong and loyal user base already resident in many of today’s corporations.

Predictive analytics initiatives should involve only organizational data sources in which managers have complete confidence. In the longer term, the right way for companies to do this is first to address the adequacy of their information management before embarking on wide-ranging predictive analytics initiatives. Our recent benchmark research into Information Management shows that organizations continue to face information management and data quality challenges. These result from the heterogeneous environment of disparate systems in organizations, the lack of a common metadata layer, and most of all a lack of attention and budget resources available to tackle the issue. My colleague Mark Smith offers a more in-depth look at the data quality and information management issues in his recent blog post. The net-net is that predictive models are no different from any other model: if garbage goes in the front end, garbage comes out the other side.

As organizations address these issues and respond to competitive market pressures, operational intelligence and predictive analytics inevitably will take center stage. Though businesses are still early in the maturity cycle with respect to predictive analytics, we at Ventana Research see companies capitalizing on the market advantage that predictive analytics provides. In some industries – financial services, insurance, telecommunications and retail, for example – things are moving quickly. Companies in these industries that are not currently taking advantage of predictive analytics or are not actively evaluating their options may be putting their businesses at risk.

What’s your thinking on the deployment and use of predictive analytics in your organization? Let me know!


Tony Cosentino

VP and Research Director

Our benchmark research on business analytics suggests that it is counterproductive to take a general approach. A better approach is to focus on particular use cases and lines of business (LOB). For this reason, in a series of upcoming articles, I will look at our business analytics research in the context of different industries and different functional areas of an organization, and illustrate how analytics are being applied to solve real business problems.

Our benchmark research on business analytics reveals that 89 percent of organizations find that it is important or very important to make it simpler to provide analytics and metrics. To me, this says that today’s analytic environments are a Tower of Babel. We need more user-friendly tools, collaboration and most of all a common vernacular.

With this last point in mind, let’s start by defining business analytics. Here at Ventana Research, business analytics refers to the application of mathematical computation and models to generate relevant historical and predictive insights that can be used to optimize business- and IT-related processes and decisions. This definition helps us to focus on the technological underpinning of analytics, but more importantly, it focuses us on the outcomes of business and IT processes and decisions.

To provide more context, we might think of the what, the so what and the now what when it comes to information, analytics and decision-making. The what is data or information in its base form. In order to derive meaning, we apply different types of analytics and go through analytical processes. This addresses the so what, or the why should I care about the data. The now what involves decision-making and actions taken on the data; this is where ideas such as operational intelligence and predictive analytics play a big role. I will look to our benchmark research in these areas to help guide the discussion.

It’s important not to think about business analytics in a technological silo removed from the people, process, information and tools that make up the Ventana Maturity Index. In this broader sense, business analytics helps internal teams derive meaning from data and guides their decisions. Our next-generation business intelligence research focuses on collaboration and mobile application of analytics, two key components for making analytics actionable within the organization.

In addition, our research shows a lot of confusion about the terms surrounding analytics. Many users don’t understand scorecards and dashboards, and find discovery, iterative analysis, key performance metrics, root-cause analysis and predictive analytics to be ambiguous terms. We’ll be discussing all of these ideas in the context of business technology innovation, our 2012 business intelligence research agenda and of course our large body of research on technology and business analytics.

Organizations must care about analytics because analytics provides companies with a competitive advantage by showing what their customers want, when they want it and how they want it delivered. It can help reduce inventory carrying costs in manufacturing and retail, fraud in insurance and finance, churn in telecommunications and even violent crime on our streets. The better an organization can integrate data and utilize both internal and external information in a coherent fashion, the greater the value of their analytics.

I hope you enjoy this series and find it useful as you define your own analytics agenda within your organization.


Tony Cosentino

VP & Research Director

Hello! I’m excited to be the newest member of the Ventana Research leadership team to bring research insights and education to the business analytics and technology industry. I’d like to start by telling you a bit about who I am, why I’ve chosen to join this company and what I hope to contribute.

For more than 15 years I’ve been studying businesses and their buying behaviors in technology markets. I have a long-time passion for technology, which led me early in my career to systems design and integration at General Electric. I’ve led technology initiatives across marketing, sales and customer service and brought to market one of the first global deployments of a Web-based architecture for Voice of the Customer (VOC). Over the years, I’ve worked with some of the largest technology vendors, including Cisco, Hewlett-Packard, IBM, Microsoft and Oracle, on strategic initiatives in the areas of market segmentation, offer optimization and stakeholder management. Through my predictive analytics work I’ve come to understand how companies can use the new generation of tools to look into the future rather than just analyze the past. My book, Into the River: How Big Data, The Long Tail, and Situated Cognition are Changing the World of Market Insights Forever, discusses the revolutionary changes in the way innovative companies use data to effect change and gain competitive advantage. I appreciate that Ventana Research published the most in-depth new benchmark research in big data and predictive analytics in 2012, building on its 2011 research on business analytics and Hadoop.

At Ventana Research I’ll focus on the expanding world of business analytics. Businesses increasingly are looking at past and present behaviors in order to be able to predict future ones. While we’ve done this for a long time in select areas such as financial forecasting, it’s only been in the past few years that the amount of data available, married with massive computing power, has made it possible for the newest generation of business intelligence systems to provide decision support that goes beyond the “what” to begin to provide the “so what” and the “now what.” By including social media for contextual inquiry and attitude analysis, it’s now possible to build a solid, powerful decision support system.

A large part of why I came to this company is my interest in being part of a team that works hard at accumulating and analyzing reliable data in order to help organizations move ever closer to “the truth.” Its research and advisory services model has kept Ventana Research going strong through two recessions and has made it the go-to choice to advise both technologists and business professionals. Its prolific work, all grounded in research data, puts Ventana Research in a unique position to help companies navigate their way.

A second reason I joined the company is that I share its conviction that technology categories cannot be analyzed in a vacuum. Facing the dynamic interactions today of cloud computing, mobile technology, social media, analytics, business collaboration and big data, to look at markets as silos is to proceed with blinders on. Our ongoing benchmark research, maturity analysis and Value Index work allow us to look across the spectrum of technologies and understand both their interactions and their roles in the business.

As I’ve suggested, I’m a firm believer that knowledge evolves – that we approach the truth at an uneven pace, though hopefully moving ever closer. I learn from everyone around me – including, I hope, you. If you have a thought about something I write, please don’t hesitate to let me know.

I will be posting regularly to report on the exciting research we have going on now, trends in the industry and my views on market developments and directions. I look forward to hearing from you and working with you to help create effective, forward-looking business strategies.


Tony Cosentino – VP & Research Director
