
Ventana Research recently completed the most comprehensive evaluation of analytics and business intelligence products and vendors available anywhere. As I discussed recently, such research is necessary and timely as analytics and business intelligence is now a fast-changing market. Our Value Index for Analytics and Business Intelligence in 2015 scrutinizes 15 top vendors and their product offerings in seven key
categories: Usability, Manageability, Reliability, Capability, Adaptability, Vendor Validation and TCO/ROI. The analysis shows that the top supplier is Information Builders, which qualifies as a Hot vendor and is followed by 10 other Hot vendors: SAP, IBM, MicroStrategy, Oracle, SAS, Qlik, Actuate (now part of OpenText) and Pentaho.

The evaluations drew on our research and analysis of vendors and their products, along with their responses to our detailed RFI or questionnaire, our own hands-on experience and the buyer-related findings from our benchmark research on next-generation business intelligence, information optimization and big data analytics. The benchmark research examines analytics and business intelligence from various perspectives to determine organizations’ current and planned use of these technologies and the capabilities they require for successful deployments.

We find that the processes that comprise business intelligence today have expanded beyond standard query, reporting, analysis and publishing capabilities. They now include sourcing and integration of data and at later stages the use of analytics for planning and forecasting and of capabilities utilizing analytics and metrics for collaborative interaction and performance management. Our research on big data analytics finds that new technologies collectively known as big data are influencing the evolution of business intelligence as well; here in-memory systems (used by 50% of participating organizations), Hadoop (42%) and data warehouse appliances (33%) are the most important innovations. In-memory computing in particular has changed BI because it enables rapid processing of even complex models with very large data sets. In-memory computing also can change how users access data through data visualization and incorporate data mining, simulation and predictive analytics into business intelligence systems. Thus the ability of products to work with big data tools figured in our assessments.

In addition, the 2015 Value Index includes assessments of vendors’ self-service tools and cloud deployment options. New self-service approaches can enable business users to reduce their reliance on IT to access and use data and analysis. However, our information optimization research shows that this change is slow to proliferate. In four out of five organizations, IT currently is involved in making information available to end users and remains entrenched in the operations of business intelligence systems.

Similarly, our research, as well as the lack of maturity of the cloud-based products evaluated, shows that organizations are still in the early stages of cloud adoption for analytics and business intelligence; deployments are mostly departmental in scope. We are exploring these issues further in our benchmark research into data and analytics in the cloud, which will be released in the second quarter of 2015.

The products offered by the five top-rated com­pa­nies in the Value Index provide exceptional functionality and a superior user experi­ence. However, Information Builders stands out, providing an excep­tional user experience and a completely integrated portfolio of data management, predictive analytics, visual discovery and operational intelligence capabilities in a single platform. SAP, in second place, is not far behind, having made significant prog­ress by integrating its Lumira platform into its BusinessObjects Suite; it added pre­dictive analytics capabilities, which led to higher Usability and Capability scores. IBM, MicroStrategy and Oracle, the next three, each provide a ro­bust integrated platform of capabilities. The key differentiator between them and the top two is that they do not have superior scores in all seven categories.

In evaluating products for this Value Index we found some noteworthy innovations in business intelligence. One is Qlik Sense, which has a modern architecture that is cloud-ready and supports responsive design on mobile devices. Another is SAS Visual Analytics, which combines predictive analytics with visual discovery in ways that are a step ahead of others currently in the market. Pentaho’s Automated Data Refinery concept adds its unique Pentaho Data Integration platform to business intelligence for a flexible, well-managed user experience. IBM Watson Analytics uses advanced analytics and natural language processing for an interactive experience beyond the traditional paradigm of business intelligence. Tableau, which led the field in the category of Usability, continues to innovate in the area of user experience and aligning technology with people and process. MicroStrategy’s innovative Usher technology addresses the need for identity management and security, especially in an evolving era in which individuals utilize multiple devices to access information.

The Value Index analysis uncovered notable differences in how well products satisfy the business intelligence needs of employees working in a range of IT and business roles. Our analysis also found substantial variation in how products provide development, security and collaboration capabilities and role-based support for users. Thus, we caution that similar vendor scores should not be taken to imply that the packages evaluated are functionally identical or equally well suited for use by every organization or for a specific process.

To learn more about this research and to download a free executive summary, please visit the Ventana Research website.

Regards,

Ventana Research


Just a few years ago, the prevailing view in the software industry was that the category of business intelligence (BI) was mature and without room for innovation. Vendors competed in terms of feature parity and incremental advancements of their platforms. But since then business intelligence has grown to include analytics, data discovery tools and big data capabilities to process huge volumes and new types of data much faster. As is often the case with change, though, this one has created uncertainty. For example, only one in 11 participants in our benchmark research on big data analytics said that their organization fully agrees on the meaning of the term “big data analytics.”

There is little question that clear definitions of analytics and business intelligence as they are used in business today would be of value. But some IT analyst firms have tried to oversimplify the process of updating these definitions by merely combining a market basket of discovery capabilities under the label of analytics. In our estimation, this attempt is neither accurate nor useful. Discovery tools are only components of business intelligence, and their capabilities cannot accomplish all the tasks comprehensive BI systems can do. Some firms seem to want to reduce the field further by overemphasizing the visualization aspect of discovery. While visual discovery can help users solve basic business problems, other BI and analytic tools are available that can attack more sophisticated and technically challenging problems. In our view, visual discovery is one of four types of analytic discovery that can help organizations identify and understand the masses of data they accumulate today. But for many organizations visualization alone cannot provide them with the insights necessary to help make critical decisions, as interpreting the analysis requires expertise that mainstream business professionals lack.

In Ventana Research’s view, business intelligence is a technology managed by IT that is designed to produce information and reports from business data to inform business about the performance of activities, people and processes. It has provided and will continue to provide great value to business, but in itself basic BI will not meet the new generation of requirements that businesses face; they need not just information but guidance on how to take advantage of opportunities, address issues and mitigate the risks of subpar performance. Analytics is a component of BI that is applied to data to generate information, including metrics. It is a technology-based set of methodologies used by analysts as well as the information gained through the use of tools designed to help those professionals. These thoughtfully crafted definitions inform the evaluation criteria we apply in our new and comprehensive 2015 Analytics and Business Intelligence Value Index, which we will publish soon. As with all business tools, applications and systems we assess in this series of indexes, we evaluate the value of analytic and business intelligence tools in terms of five functional categories – usability, manageability, reliability, capability and adaptability – and two customer assurance categories – validation of the vendor and total cost of ownership and return on investment (TCO/ROI). We feature our findings in these seven areas of assessment in our Value Index research and reports. In the Analytics and Business Intelligence Value Index for 2015 we assess in depth the products of 15 of the leading vendors in today’s BI market.

The Capabilities category examines the breadth of functionality that products offer and assesses their ability to deliver the insights today’s enterprises need. For our analysis we divide this category into three subcategories for business intelligence: data, analytics and optimization. We explain each of them below.

The data subcategory of Capabilities examines data access and preparation along with supporting integration and modeling. New data sources are coming into being continually; for example, data now is generated in sensors in watches, smartphones, cars, airplanes, homes, utilities and an assortment of business, network, medical and military equipment. In addition, organizations increasingly are interested in behavioral and attitudinal data collected through various communication platforms. Examples include Web browser behavior, data mined from the Internet, social media and various survey and community polling data. The data access and integration process identifies each type of data, integrates it with all other relevant types, checks it all for quality issues, maps it back to the organization’s systems of record and master data, and manages its lineage. Master data management in particular, including newer approaches such as probabilistic matching, is a key component for creating a system that can combine data types across the organization and in the cloud to create a common organizational vernacular for the use of data.
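The probabilistic matching mentioned above can be sketched briefly. This is a minimal illustration of the idea, not any vendor's implementation; the sample records, field weights and the 0.85 similarity threshold are assumptions chosen for the example.

```python
# Minimal sketch of probabilistic record matching for master data management:
# two records from different systems are judged to describe the same entity
# when their average field similarity clears a threshold. All values here
# are illustrative assumptions.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity ratio between two field values."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def probable_match(rec1: dict, rec2: dict, threshold: float = 0.85) -> bool:
    """Treat two records as the same entity when the average similarity
    across their shared fields meets or exceeds the threshold."""
    fields = set(rec1) & set(rec2)
    score = sum(similarity(str(rec1[f]), str(rec2[f])) for f in fields) / len(fields)
    return score >= threshold

crm_record = {"name": "Jon Smith", "city": "Chicago"}
erp_record = {"name": "John Smith", "city": "Chicago"}
print(probable_match(crm_record, erp_record))  # True for these near-identical records
```

Production MDM systems weight fields, handle missing values and use far more sophisticated similarity measures, but the thresholded-score structure is the same.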

Ascertaining which systems must be accessed and how is a primary challenge for today’s business intelligence platforms. A key part of data access is the user interface. Whether it appears in an Internet browser, a laptop, a smartphone, a tablet or a wearable device, data must be presented in a manner optimized for the interface. Examining the user interface for business intelligence systems was a primary interest of our 2014 Mobile Business Intelligence Value Index. In that research, we learned that vendors are following divergent paths and that it may be hard for some to change course as they continue. Therefore how a vendor manages mobile access and other new access methods affects its products’ value for particular organizations.

Once data is accessed, it must be modeled in a useful way. Data models in the form of OLAP cubes and predefined relationships of data sometimes grow overly complex, but there is value in premodeling data in ways that make sense to business people, most of whom are not up to modeling it for themselves. Defining data relationships and transforming data through complex manipulations is often needed, for instance, to define performance indicators that align with an organization’s business initiatives. These manipulations can include business rules or what-if analysis within the context of a model or external to it. Finally, models must be flexible so they do not hinder the work of organizational users. The value of premodeling data is that it provides a common view for business users so they need not redefine data relationships that have already been thoroughly considered.
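The value of premodeling can be made concrete with a small sketch: a performance indicator is defined once against a shared model, and a what-if scenario reuses that same definition rather than each user redefining the relationship. The order data and the 10% discount scenario are illustrative assumptions.

```python
# Minimal sketch of a premodeled KPI plus a what-if manipulation, as
# described above. Figures are illustrative assumptions.
orders = [
    {"region": "East", "revenue": 120_000, "cost": 90_000},
    {"region": "West", "revenue": 80_000,  "cost": 70_000},
]

def operating_margin(rows) -> float:
    """KPI defined once in the shared model: (revenue - cost) / revenue."""
    revenue = sum(r["revenue"] for r in rows)
    cost = sum(r["cost"] for r in rows)
    return (revenue - cost) / revenue

def what_if_discount(rows, pct: float):
    """Scenario copy of the model with revenue reduced by a discount."""
    return [{**r, "revenue": r["revenue"] * (1 - pct)} for r in rows]

baseline = operating_margin(orders)
scenario = operating_margin(what_if_discount(orders, 0.10))
print(f"baseline margin {baseline:.1%}, with 10% discount {scenario:.1%}")
```

Because every user calls the same `operating_margin` definition, the what-if analysis and the baseline stay comparable, which is exactly what a premodeled common view provides.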

The analytics subcategory includes analytic discovery, prediction and integration. Discovery and prediction roughly map to the ideas of exploratory and confirmatory analytics, which I have discussed. Analytic discovery includes calculation and visualization processes that enable users to move quickly and easily through data to create the types of information they need for business purposes. Complementing it is prediction, which typically follows discovery. Discovery facilitates root-cause and historical analysis, but to look ahead and make decisions that produce desired business outcomes, organizations need to track various metrics and make informed predictions. Analytic integration encompasses customization of both discovery and predictive analytics and embedding them in other systems such as applications and portals.
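The discovery-then-prediction sequence described above can be sketched in a few lines: exploratory statistics on a tracked metric, followed by a simple forward-looking estimate. The monthly figures and the plain least-squares trend line are illustrative assumptions, not any vendor's method.

```python
# Minimal sketch of exploratory (discovery) then confirmatory (prediction)
# analytics on one metric. Data is an illustrative assumption.
from statistics import mean

monthly_sales = [100, 104, 110, 115, 121, 128]  # six months of a tracked metric

# Discovery: summarize history to understand what happened.
growth = [(b - a) / a for a, b in zip(monthly_sales, monthly_sales[1:])]
print(f"average month-over-month growth: {mean(growth):.1%}")

# Prediction: fit a least-squares trend line and look one period ahead.
n = len(monthly_sales)
xs = range(n)
x_bar, y_bar = mean(xs), mean(monthly_sales)
slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, monthly_sales)) / \
        sum((x - x_bar) ** 2 for x in xs)
forecast = y_bar + slope * (n - x_bar)
print(f"next-month forecast: {forecast:.1f}")
```

Real predictive workloads use richer models, but the division of labor is the same: discovery explains the history, prediction extends it to inform a decision.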

The optimization subcategory includes collaboration, organizational management, information optimization, action and automation. Collaboration is a key consideration for today’s analytic platforms. It includes the ability to publish, share and coordinate various analytic and business intelligence functions. Notably, some recently developed collaboration platforms incorporate many of the characteristics of social platforms such as Facebook or LinkedIn. Organizational management attempts to manage to particular outcomes and sometimes provides performance indicators and scorecard frameworks. Action assesses how technology directly assists decision-making in an operational context. This includes gathering inputs and outputs for collaboration before and after a decision, predictive scoring that prescribes action and delivery of the information in the correct form to the decision-maker. Finally, automation triggers alerts based on statistical thresholds or rules and should be managed as part of a workflow. Agent technology takes automation to a level that is more proactive and autonomous.
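The automation idea above, an alert fired by either a statistical trigger or a plain business rule, can be sketched simply. The sample readings, the two-sigma trigger and the floor value are illustrative assumptions.

```python
# Minimal sketch of rule- and statistics-driven alerting as part of an
# automated workflow. All thresholds and data are illustrative assumptions.
from statistics import mean, stdev

def check_alerts(history, latest, floor):
    """Return alert messages for a new metric reading."""
    alerts = []
    mu, sigma = mean(history), stdev(history)
    if abs(latest - mu) > 2 * sigma:          # statistical trigger
        alerts.append(f"{latest} deviates more than 2 sigma from mean {mu:.1f}")
    if latest < floor:                        # rule-based trigger
        alerts.append(f"{latest} fell below the floor of {floor}")
    return alerts

daily_orders = [205, 198, 210, 202, 207, 199]
print(check_alerts(daily_orders, latest=140, floor=180))  # both triggers fire
```

In a production system these messages would feed a workflow engine or an agent rather than a print statement, but the trigger logic is the core of the automation capability the text describes.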

This broad framework of data, analytics and optimization fits with a process orientation to business analytics that I have discussed. Our benchmark research on information optimization indicates that the people and process dimensions of performance are less well developed than the information and technology aspects, and thus a focus on these aspects of business intelligence and analytics will be beneficial.

In our view, it’s important to consider business intelligence software in a broad business context rather than in artificially separate categories that are designed for IT only. We advise organizations seeking to gain a competitive edge to adopt a multifaceted strategy that is business-driven, incorporates a complete view of BI and analytics, and uses the comprehensive evaluation criteria we apply.

Regards,

Ventana Research

Our benchmark research into business technology innovation shows that analytics ranks first or second as a business technology innovation priority in 59 percent of organizations. Businesses are moving budgets and responsibilities for analytics closer to the sales operations, often in the form of so-called shadow IT organizations that report into decentralized and autonomous business units rather than a central IT organization. New technologies such as in-memory systems (50%), Hadoop (42%) and data warehouse appliances (33%) are top back-end technologies being used to acquire a new generation of analytic capabilities. They are enabling new possibilities including self-service analytics, mobile access, more collaborative interaction and real-time analytics. In 2014, Ventana Research helped lead the discussion around topics such as information optimization, data preparation, big data analytics and mobile business intelligence. In 2015, we will continue to cover these topics while adding new areas of innovation as they emerge.

Three key topics lead our 2015 business analytics research agenda. The first focuses on cloud-based analytics. In our benchmark research on information optimization, nearly all (97%) organizations said it is important or very important to simplify information access for both their business and their customers. Part of the challenge in optimizing an organization’s use of information is to integrate and analyze data that originates in the cloud or has been moved there. This issue has important implications for information presentation, where analytics are executed and whether business intelligence will continue to move to the cloud in more than a piecemeal fashion. We are currently exploring these topics in our new benchmark research called analytics and data in the cloud. Coupled with the issue of cloud use is the proliferation of embedded analytics and the imperative for organizations to provide scalable analytics within the workflow of applications. A key question we’ll try to answer this year is whether companies that have focused primarily on operational cloud applications at the expense of developing their analytics portfolio or those that have focused more on analytics will gain a competitive advantage.

The second research agenda item is advanced analytics. It may be useful to divide this category into machine learning and predictive analytics, which I have discussed and covered in our benchmark research on big data analytics. Predictive analytics has long been available in some sectors of the business world, and in our research two-thirds (68%) of organizations that use it said it provides a competitive advantage. Programming languages such as R, the use of Predictive Model Markup Language (PMML), inclusion of social media data in prediction, massive-scale simulation, and right-time integration of scoring at the point of decision-making are all important advances in this area. Machine learning has also been around for a long time, but it wasn’t until the instrumentation of big data sources and advances in technology that it made sense to use it in more than academic environments. At the same time as the technology landscape is evolving, it is getting more fragmented and complex; in order to simplify it, software designers will need innovative uses of machine learning to mask the underlying complexity through layers of abstraction. A technology such as Spark, out of the AMPLab at Berkeley, is still immature, but it promises to enable increasing uses of machine learning on big data. Areas such as sourcing data and preparing data for analysis must be simplified so analysts are not overwhelmed by big data.
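The "right-time integration of scoring at the point of decision-making" mentioned above can be sketched as follows: coefficients from an offline training step (for example, a model built in R and exchanged via a format such as PMML) are applied inline where the transaction happens. The coefficient values, feature names and the 0.5 cutoff are illustrative assumptions, not output of any named product.

```python
# Minimal sketch of right-time scoring at the point of decision-making.
# Coefficients are assumed to come from an offline training step; all
# numbers here are illustrative assumptions.
import math

COEFFS = {"intercept": -4.0, "order_value": 0.002, "past_returns": 1.1}

def churn_score(order_value: float, past_returns: int) -> float:
    """Logistic score in [0, 1] computed inline, at decision time."""
    z = (COEFFS["intercept"]
         + COEFFS["order_value"] * order_value
         + COEFFS["past_returns"] * past_returns)
    return 1 / (1 + math.exp(-z))

def route_order(order_value: float, past_returns: int) -> str:
    """Operational decision made where the transaction happens."""
    if churn_score(order_value, past_returns) > 0.5:
        return "manual review"
    return "auto-approve"

print(route_order(order_value=300, past_returns=0))  # low score: auto-approve
print(route_order(order_value=900, past_returns=3))  # high score: manual review
```

The point of the design is latency: the model is trained offline, but the scoring function lives inside the operational workflow, so no round trip to an analytics system is needed when the decision is made.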

Our third area of focus is the user experience in business intelligence tools. Simplification and optimization of information in a context-sensitive manner are paramount. An intuitive user experience can advance the people and process dimensions of business, which have lagged technology innovation according to our research in multiple areas. New approaches coming from business end-users, especially in the tech-savvy millennial generation, are pushing the envelope here. In particular, mobility and collaboration are enabling new user experiences in both business organizations and society at large. Adding to it is data collected in more forms, such as location analytics (which we have done research on), individual and societal relationships, information and popular brands. How business intelligence tools incorporate such information and make it easy to prepare, design and consume for different organizational personas is not just an agenda focus but also one focus of our 2015 Analytics and Business Intelligence Value Index to be published in the first quarter of the year.

This shapes up as an exciting year. I welcome any feedback you have on this research agenda and look forward to providing research, collaborating and educating with you in 2015.

Regards,

Ventana Research

Actuate, a company known for powering BIRT, the open source business intelligence technology, has been delivering large-scale consumer and industrial applications for more than 20 years. In December the company announced it would be acquired by OpenText of Ontario, Canada. OpenText is Canada’s largest software vendor, with more than 8,000 employees and a portfolio of enterprise information management products, and it serves primarily large companies. The attraction of Actuate for such a company lies in its legacy assets, its more recent acquisitions and developments, and its existing customer base. Actuate was also awarded a 2014 Ventana Research Business Leadership Award.

Actuate’s foundational asset is BIRT (Business Intelligence and Reporting Tools) and its developer community. With more than 3.5 million developers and 13.5 million downloads, the BIRT developer environment is used in a variety of companies on a global basis. The BIRT community includes Java developers as well as sophisticated business intelligence design professionals, which I discussed in my outline of analytics personas. BIRT is a key project for the Eclipse Foundation, an open source integrated development environment familiar to many developers. BIRT provides a graphical interface to build reports at a granular level, and being Java-based, it provides ways to grapple with data and build data connections in a virtually limitless fashion. While new programming models and scripting languages, such as Python and Ruby, are gaining favor, Java remains a primary coding language for large-scale applications. One of the critical capabilities for business intelligence tools is to provide information in a visually compelling and easily usable format. BIRT can provide pixel-perfect reporting and granular adjustments to visualization objects. This benefit is coupled with the advantage of the open source approach: availability of skilled technical human resources on a global basis at relatively low cost.

Last year Actuate introduced iHub 3.1, a deployment server that integrates data from multiple sources and distributes content to end users. iHub has connectors to most database systems, including modern approaches such as Hadoop. While Actuate provides the most common connectors out of the box, BIRT and the Java framework allow data from any system to be brought into the fold. This type of approach to big data becomes particularly compelling for the ability to integrate both large-scale data and diverse data sources. The challenge is that the work sometimes requires customization, but for large-scale enterprise applications, developers often do this to deliver capabilities that would not otherwise be accessible to end users. Our benchmark research into big data analytics shows that organizations need to access many data sources for analysis including transactional data (60%), external data (50%), content (49%) and event-centric data (48%).

In 2014, Actuate introduced iHub F-Type, which enables users to build reports, visualizations and applications and deploy them in the cloud. F-Type mitigates the need to build a separate deployment infrastructure and can act as both a “sandbox” for development and a broader production environment. Using REST-based interfaces, application developers can use F-Type to prototype and scale embedded reports for their custom applications. F-Type is delivered in the cloud, has full enterprise capabilities out of the box, and is free up to a metered output capacity of 50MB. The approach uses output metering rather than the input metering used by some technology vendors. This output metering approach encourages scaling of data and focuses organizations on which specific reports they should deploy to their employees and customers.
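The difference between output and input metering can be made concrete with a short sketch: only the size of rendered report output counts against the cap, not the volume of source data scanned. The 50MB cap comes from the text above; the individual report sizes are illustrative assumptions.

```python
# Minimal sketch of output metering: usage is charged against rendered
# report output, not input data volume, so large source data with modest
# reports stays within the free tier. Report sizes are illustrative
# assumptions; the 50MB cap is from the text.
FREE_TIER_CAP_MB = 50

def within_free_tier(rendered_report_sizes_mb) -> bool:
    """Meter the rendered output, not the data scanned to produce it."""
    return sum(rendered_report_sizes_mb) <= FREE_TIER_CAP_MB

# A multi-terabyte source dataset is irrelevant to this meter; only the
# rendered output counts.
print(within_free_tier([12.5, 8.0, 20.0]))  # True: 40.5MB of output is under the cap
```

This is why the model encourages scaling of input data while pushing organizations to be deliberate about which reports they actually deploy.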

Also in 2014, Actuate introduced BIRT Analytics 5.0, a self-service discovery platform that includes advanced analytic capabilities. In my review of BIRT Analytics, I noted its abilities to handle large data volumes and do intuitive predictive analytics. Organizations in our research said that predictive analytics provides advantages such as achieving competitive advantage (for 68%), new revenue opportunities (55%) and increased profitability (52%). Advances in BIRT Analytics 5.0 include integration with iHub 3.1, so developers can bring self-service discovery into their dashboards, and public APIs for use in custom applications.

The combination of iHub, the F-Type freemium model, BIRT Analytics and the granular controls that BIRT provides to developers and users presents a coherent strategy especially in the context of embedded applications. Actuate CEO Pete Cittadini asserts that the company has the most APIs of any business intelligence vendor. The position is a good one especially since embedded technology is becoming important in the context of custom applications and in the so-called Internet-of-Things. The ability to make a call into another application instead of custom-coding the function itself within the workflow of an end-user application cuts developer time significantly. Furthermore, the robustness of the Actuate platform enables applications to scale almost without limit.

OpenText and Actuate have similarities, such as the maturity of the organizations and the types of large clients they service. It will be interesting to see how Actuate's API strategy will impact the next generation of OpenText's analytic applications and to what degree Actuate remains an independent business unit in marketing to customers. As a company that has been built through acquisitions, OpenText has a mature onboarding process that usually keeps the new business unit operating separately. OpenText CEO Mark Barrenechea outlines his perspective on the acquisition, which he says will bolster its portfolio for information optimization and analytics, or what it calls enterprise information management. In fact, our benchmark research on information optimization finds that analytics is the top driver for deploying information in two-thirds of organizations. The difference this time may be that today's enterprises are asking for more integrated information that embeds analytics rather than different interfaces for each of the applications or tools. Now that the acquisition of Actuate by OpenText has closed, changes to Actuate should be watched closely to determine its path forward and its potential to deliver higher value for customers within OpenText.

Regards,

Ventana Research

In 2014, IBM announced Watson Analytics, which uses machine learning and natural language processing to unify and simplify the user experience in each step of the analytic process: data acquisition, data preparation, analysis, dashboarding and storytelling. After a relatively short beta testing period involving more than 22,000 users, IBM released Watson Analytics for general availability in December. There are two editions: the "freemium" trial version allows 500MB of data storage and access to files of fewer than 100,000 rows and 50 columns; the personal edition is a monthly subscription that enables larger files and more storage.

Its initial release includes functions to explore, predict and assemble data. Many of the features are based on IBM’s SPSS Analytic Catalyst, which I wrote about and which won the 2013 Ventana Research Technology Innovation Award for business analytics. Once data is uploaded, the explore function enables users to analyze data in an iterative fashion using natural language processing and simple point-and-click actions. Algorithms decide the best fit for graphics based on the data, but users may choose other graphics as needed. An “insight bar” shows other relevant data that may contain insights such as potential market opportunities.
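
The notion of algorithms deciding the best-fit graphic for the data can be illustrated with a deliberately simplified heuristic; this is not IBM's actual selection logic, only a hypothetical sketch of the pattern, with invented column descriptors.

```python
# A minimal, hypothetical "best fit" chart chooser. Watson Analytics'
# real heuristics are more sophisticated; this only illustrates the idea
# of mapping data characteristics to a default visualization.
def best_fit_chart(columns):
    """Pick a default chart from (name, kind) descriptors, where kind is
    'categorical', 'numeric' or 'date'."""
    kinds = [kind for _, kind in columns]
    if "date" in kinds and "numeric" in kinds:
        return "line"        # trend of a measure over time
    if kinds.count("numeric") >= 2:
        return "scatter"     # relationship between two measures
    if "categorical" in kinds and "numeric" in kinds:
        return "bar"         # measure broken down by category
    return "table"           # fall back when nothing obvious fits

trend = best_fit_chart([("month", "date"), ("revenue", "numeric")])
breakdown = best_fit_chart([("region", "categorical"), ("revenue", "numeric")])
```

The point of such defaults, as in Watson Analytics, is that users still may choose other graphics as needed; the algorithm only removes the first decision.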

The ability to explore data through visualizations with minimal knowledge is a primary aim of modern analytics tools. With the explore function incorporating natural language processing, which other tools in the market lack, IBM makes analytics accessible to users without the need to drag and drop dimensions and measures across the screen. This feature should not be underestimated; usability is the buying criterion for analytics tools most widely cited in our benchmark research on next-generation business intelligence (by 63% of organizations).

The predict capability of Watson Analytics focuses on driver analysis, which is useful in a variety of circumstances such as sales win and loss, market lift analysis, operations and churn analysis. In its simplest form, a driver analysis aims to understand causes and effects among multiple variables. This is a complex process that most organizations leave to their resident statistician or outsource to a professional analyst. By examining the underlying data characteristics, the predict function can address data sets, including what may be considered big data, with an appropriate algorithm. The benefit for nontechnical users is that Watson Analytics makes the decision on selecting the algorithm and presents results in a relatively nontechnical manner such as spiral diagrams or tree diagrams. Having absorbed the top-level information, users can drill down into the top key drivers. This ability enables users to see the relative influence of attributes and the interactions between them. Understanding interactions is an important part of driver analysis since causal variables often move together (a challenge known as multicollinearity), and it is sometimes hard to distinguish what is actually causing a particular outcome. For instance, analysis may blame the customer service department for a product defect and point to it as the primary driver of customer defection. Accepting this result, a company may mistakenly try to fix customer service when a product issue needs to be addressed. This approach also overcomes the challenge of Simpson's paradox, in which a trend that appears in different groups of data disappears or reverses when the groups are combined. This is a hindrance for some visualization tools in the market.
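
The Simpson's paradox hazard is easy to demonstrate. The numbers below are the classic textbook illustration (not drawn from the research above): a treatment outperforms the control within every group yet appears worse when the groups are pooled.

```python
# Simpson's paradox: a trend within each group reverses when groups are pooled.
def rate(successes, total):
    return successes / total

# (successes, total) for treated vs. control in two groups
group_a = {"treated": (81, 87), "control": (234, 270)}
group_b = {"treated": (192, 263), "control": (55, 80)}

# Within each group, the treatment has the higher success rate
a_wins = rate(*group_a["treated"]) > rate(*group_a["control"])  # 0.93 > 0.87
b_wins = rate(*group_b["treated"]) > rate(*group_b["control"])  # 0.73 > 0.69

# Pooled across groups, the treatment appears to lose
pooled_treated = rate(81 + 192, 87 + 263)   # 273/350 = 0.78
pooled_control = rate(234 + 55, 270 + 80)   # 289/350 = 0.83
pooled_reversal = pooled_treated < pooled_control
```

A tool that only plots the pooled totals, as some visualization products do, would report exactly the wrong conclusion, which is why group-aware driver analysis matters.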

Once users have analyzed the data sufficiently and want to create and share their analysis, the assemble function enables them to bring together various dashboard visualizations in a single screen. Currently, Watson Analytics does such sharing (as well as comments related to the visualizations) via email. In the future, it would be good to see capabilities such as annotation and cloud-based sharing in the product.

Full data preparation capabilities are not yet integrated into Watson Analytics. Currently, it includes a data quality report that gives confidence levels for the current data based on its cleanliness, and basic sort, transform and relabeling are incorporated as well. I assume that IBM has much more in the works here. For instance, its DataWorks cloud service offers APIs for some of the best data preparation and master data management available today. DataWorks can mask data at the source and do probabilistic matching against many sources, both cloud and on-premises. This is a major challenge organizations face when needing to conduct analytics across many data sets. For instance, in multichannel marketing, each individual customer may have many email addresses as well as different mailing addresses, phone numbers and identifiers for social media. A so-called "golden record" needs to be created so all such information can be linked together. Conceptually, the data becomes one long row of data related to that golden record, rather than multiple unassociated data in rows of shorter length. This data needs to be brought into a company's own internal systems, and personally identifiable information must be stripped out before anything moves into a public domain. In a probabilistic matching system, data is matched not on one field but through associations of data that give levels of certainty that records should be merged. This is different from past approaches and one of the reasons for significant innovation in the category. Multiple startups have been entering the data preparation space to address the need for a better user experience in data preparation. Such needs have been documented as one of the foundational issues facing the world of big data. Our benchmark research into information optimization shows that data preparation (47%) and quality and consistency (45%) are the most time-consuming tasks for organizations in analytics.
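
The probabilistic matching idea can be reduced to a toy sketch: instead of requiring one field to match exactly, evidence from several fields is weighted into a confidence score. The field weights and merge threshold below are illustrative assumptions, not DataWorks parameters.

```python
# Toy probabilistic record matching: weighted agreement across fields
# yields a confidence that two records describe the same customer.
def match_confidence(rec_a, rec_b, weights):
    score = total = 0.0
    for field, weight in weights.items():
        total += weight
        a = rec_a.get(field, "").strip().lower()
        b = rec_b.get(field, "").strip().lower()
        if a and a == b:
            score += weight
    return score / total if total else 0.0

weights = {"email": 0.5, "phone": 0.3, "name": 0.2}   # illustrative weights
a = {"name": "Pat Lee", "email": "pat@example.com", "phone": "555-0100"}
b = {"name": "Patricia Lee", "email": "pat@example.com", "phone": "555-0100"}

conf = match_confidence(a, b, weights)   # email and phone agree, name differs
should_merge = conf >= 0.7               # candidate for one "golden record"
```

Real systems score fuzzy similarity per field rather than exact equality, but the principle is the same: certainty accumulates from associations across fields, not from any single identifier.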

Watson Analytics is deployed on IBM's SoftLayer cloud technology and is part of a push to move its analytic portfolio into the cloud. Early in 2015 the company plans to move its SPSS and Cognos products into the cloud via a managed service, thus offloading tasks such as setup, maintenance and disaster recovery management. Watson Analytics will be offered as a set of APIs, much as the broader Watson cognitive computing platform has been. Last year, IBM said it would move almost all of its software portfolio to the cloud via its Bluemix service platform. These cloud efforts, coupled with the company's substantial investment in partner programs with developers and universities around the world, suggest that Watson may power many next-generation cognitive computing applications, a market estimated to grow into the tens of billions of dollars in the next several years.

Overall, I expect Watson Analytics to gain more attention and adoption in 2015 and beyond. Its design philosophy and user experience are innovative, but work must be done in some areas to make it a tool that professionals use in their daily work. Given the resources IBM is putting into the product and the massive amounts of product feedback it is receiving, I expect initial release issues to be worked out quickly through the continuous release cycle. Once they are, Watson Analytics will raise the bar on self-service analytics.

Regards,

Ventana Research

PentahoWorld, the first user conference for this 10-year-old supplier of data integration and business analytics software, attracted more than 400 customers in roles ranging from IT and database professionals to business analysts and end users. The diversity of the crowd reflects Pentaho's broad portfolio of products, which covers the integration aspects of big data analytics with Pentaho Data Integration and front-end visualization with Pentaho Business Analytics. In essence its portfolio provides end-to-end data to analytics through what the company introduced as Big Data Orchestration, which brings governed data delivery and a streamlined data refinery together on one platform.

Pentaho has made progress in business over the past year, picking up Fortune 1000 clients and moving from providing analytics to midsize companies to serving major companies such as Halliburton, Lufthansa and NASDAQ. One reason for this success is Pentaho's ability to integrate large-scale data from multiple sources including enterprise data warehouses, Hadoop and other NoSQL approaches. Our research into big data integration shows that Hadoop is a key technology that 44 percent of organizations are likely to use, but it is just one option in the enterprise data environment. A second key for Pentaho has been the embeddable nature of its approach, which enables companies, especially those selling cloud-based software as a service (SaaS), to use analytics to gain competitive advantage by placing its tools within their applications. For more detail on Pentaho's analytics and business intelligence tools please see my previous analytic perspective.

A key advance for the company over the past year has been the development and refinement of what the company calls big data blueprints. These are general use cases in such areas as ETL offloading and customer analytics. Each approach includes design patterns for ETL and analytics that work with high-performance analytic databases including NoSQL variants such as Mongo and Cassandra.

The blueprint concept is important for several reasons. First, it helps Pentaho focus on specific market needs. Second, it shows customers and partners processes that enable them to get immediate return on the technology investment. The same research referenced above shows that organizations manage their information and technology better than their people and processes; to realize full value from spending on new technology, they need to pay more attention to how the technology fits with these cultural aspects.

At the user conference, the company announced release 5.2 of its core business analytics products and featured its Governed Data Delivery concept and Streamlined Data Refinery. The Streamlined Data Refinery provides a process for business analysts to access the already integrated data provided through PDI and create data models on the fly. The advantage is that this is not a technical task, and the business analyst does not have to understand the underlying metadata or the data structures. The user chooses the dimensions of the analysis using menus that offer multiple combinations to be selected in an ad hoc manner. Then the Streamlined Data Refinery automatically generates a data cube that is available for fast querying in an analytic database. Currently, Pentaho supports only the HP Vertica database, but its roadmap promises to add high-performance databases from other suppliers. The entire process can take only a few minutes and is much more flexible and dynamic than asking IT to rebuild a data model every time a new question is asked.
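
Conceptually, generating a cube on the fly reduces to aggregating a fact table over whichever dimensions the analyst picks. This pure-Python sketch (the table and column names are hypothetical) shows the idea; the actual Streamlined Data Refinery materializes the result in an analytic database such as HP Vertica for fast querying.

```python
from collections import defaultdict

def build_cube(rows, dimensions, measure):
    """Aggregate `measure` over every combination of the chosen `dimensions`."""
    cube = defaultdict(float)
    for row in rows:
        key = tuple(row[d] for d in dimensions)
        cube[key] += row[measure]
    return dict(cube)

sales = [
    {"region": "EMEA", "product": "A", "revenue": 100.0},
    {"region": "EMEA", "product": "B", "revenue": 50.0},
    {"region": "APAC", "product": "A", "revenue": 75.0},
    {"region": "EMEA", "product": "A", "revenue": 25.0},
]

# The analyst picks dimensions ad hoc; no IT remodeling step is required.
cube = build_cube(sales, ["region", "product"], "revenue")
```

Asking a new question is just a new call with different dimensions, which is the flexibility the paragraph above contrasts with rebuilding a data model in IT.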

While Pentaho Data Integration enables users to bring together all available data and integrate it to find new insights, Streamlined Data Refinery gives business users direct access to the blended data. In this way they can explore data dynamically without involving IT. The other important aspect is that it easily provides the lineage of the data. Internal or external auditors often need to understand the nature of the data and the integration, which data lineage supports. Such a feature should benefit all types of businesses but especially those in regulated industries. This approach addresses the two top needs of business end users, which, according to our benchmark research into information optimization, are to drill into data (for 37%) and search for specific information (36%).

Another advance is Pentaho 5.2’s support for Kerberos security on Cloudera, Hortonworks and MapR. Cloudera, currently the largest Hadoop distribution, and Hortonworks, which is planning to raise capital via a public offering, hold the lion’s share of the commercial Hadoop market. Kerberos puts a layer of authentication security between the Pentaho Data Integration tool and the Hadoop data. This helps address security concerns which have dramatically increased over the past year after major breaches at retailers, banks and government institutions.

These announcements show results of Pentaho’s enterprise-centric customer strategy as well as the company’s investment in senior leadership. Christopher Dziekan, the new chief product officer, presented a three-year roadmap that focuses on data access, governance and data integration. It is good to see the company put its stake in the ground with a well-formed vision of the big data market. Given the speed at which the market is changing and the necessity for Pentaho to consider the needs of its open source community, it will be interesting to see how the company adjusts the roadmap going forward.

For enterprises grappling with big data integration and trying to give business users access to new information sources, Pentaho's Streamlined Data Refinery deserves a look. For both enterprises and ISVs that want to apply integration and analytics in the context of another application, Pentaho's REST-based APIs allow embedding of end-to-end analytic capabilities. Together with the big data blueprints discussed above, Pentaho is able to deliver a targeted yet flexible approach to big data.

Regards,

Ventana Research

Qlik was an early pioneer in developing a substantial market for a visual discovery tool that enables end users to easily access and manipulate analytics and data. Its QlikView application uses an associative experience that takes an in-memory, correlation-based approach to present a simpler design and user experience for analytics than previous tools. Driven by sales of QlikView, the company's revenue has grown to more than $500 million, and, having originated in Sweden, it has a global presence.

At its annual analyst event in New York the business intelligence and analytics vendor discussed recent product developments, in particular the release of Qlik Sense. It is a drag-and-drop visual analytics tool targeted at business users but scalable enough for enterprise use. Its aim is to give business users a simplified visual analytic experience that takes advantage of modern cloud technologies. Such a user experience is important; our benchmark research into next-generation business intelligence shows that usability is an important buying criterion for nearly two out of three (63%) companies. A couple of months ago, Qlik introduced Qlik Sense for desktop systems, and at the analyst event it announced general availability of the cloud and server editions.

According to our research into business technology innovation, analytics is the top initiative for new technology: 39 percent of organizations ranked it their number-one priority. Analytics includes exploratory and confirmatory approaches to analysis. Ventana Research refers to exploratory analytics as analytic discovery and segments it into four categories that my colleague Mark Smith has articulated. Qlik's products belong in the analytic discovery category. Users can use the tool to investigate data sets in an intuitive and visual manner, often conducting root-cause analysis and decision support functions. This software market is relatively young, and competing companies are evolving and redesigning their products to suit changing tastes. Tableau, one of Qlik's primary competitors, which I wrote about recently, is adapting its current platform to developments in hardware and in-memory processing, focusing on usability and opening up its APIs. Others have recently made their first moves into the market for visual discovery applications, including Information Builders and MicroStrategy. Companies such as Actuate, IBM, SAP, SAS and Tibco are focused on incorporating more advanced analytics in their discovery tools. For buyers, this competitive and fragmented market creates a challenge when comparing offerings in analytic discovery.

A key differentiator is Qlik Sense's new modern architecture, which is designed for cloud-based deployment and for embedding in other applications for specialized use. Its analytic engine plugs into a range of Web services. For instance, the Qlik Sense API enables the analytic engine to call to a data set on the fly and allow the application to manipulate data in the context of a business process. An entire table can be delivered to node.js, which extends the JavaScript API to offer server-side features and enables the Qlik Sense engine to take on an almost unlimited number of real-time connections by not blocking input and output. Previously developers could write PHP script and pipe SQL to get the data, but the resulting application was viable yet complex to build and maintain. Now all they need is JavaScript and HTML. The Qlik Sense architecture abstracts the complexity and allows JavaScript developers to make use of complex constructs without intricate knowledge of the database. The new architecture can decouple the Qlik engine from the visualizations themselves, so Web developers can define expressions and dimensions without going into the complexities of the server-side architecture. Furthermore, by decoupling the services, developers gain access to open source visualization technologies such as d3.js. Cloud-based business intelligence and extensible analytics are becoming a hot topic; I have written about this, including a glimpse of our newly announced benchmark research on the next generation of data and analytics in the cloud. From a business user perspective, these types of architectural changes may not mean much, but for developers, OEMs and UX design teams, they allow much faster time to value through a simpler component-based approach to utilizing the Qlik analytic engine and building visualizations.
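
The non-blocking pattern that lets a server hold many concurrent engine connections can be illustrated in any async runtime. This Python asyncio sketch simulates the engine round-trip (a real node.js or Python client would await a WebSocket or HTTP response); the point is that ten requests overlap instead of queuing behind one another.

```python
import asyncio
import time

async def query_engine(table):
    # Simulated analytic-engine round-trip of ~100ms; a real call would
    # await a WebSocket or HTTP response instead of sleeping.
    await asyncio.sleep(0.1)
    return f"{table}: ok"

async def main():
    start = time.perf_counter()
    # Ten concurrent "connections" finish in roughly the time of one,
    # because no request blocks while another is in flight.
    results = await asyncio.gather(*(query_engine(f"t{i}") for i in range(10)))
    elapsed = time.perf_counter() - start
    return results, elapsed

results, elapsed = asyncio.run(main())
```

With blocking I/O the same ten calls would take about a second end to end; the non-blocking version takes roughly a tenth of that, which is what makes an "almost unlimited" number of real-time connections practical.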

The modern architecture of Qlik Sense, together with the company's ecosystem of more than 1,000 partners and a professional services organization that has completed more than 2,700 consulting engagements, gives Qlik a competitive position. The service partner relationships, including those with major systems integrators, are key to the company's future since analytics is as much about change management as technology. Our research in analytics consistently shows that people and processes lag technology and information in performance with analytics. Furthermore, in our benchmark research into big data analytics, the benefits most often mentioned as achieved are better communication and knowledge sharing (24%), better management and alignment of business goals (18%), and gaining competitive advantage (17%).

As tested on my desktop, Qlik Sense shows an intuitive interface with drag-and-drop capabilities for building analysis. Formulas are easy to incorporate as new measures, and the palette offers a variety of visualization options that automatically fit to the screen. The integration with QlikView is straightforward in that a data model from QlikView can be saved seamlessly and opened intact in Qlik Sense. The storyboard function allows multiple visualizations to be built into narratives and annotations to be added, including linkages with data. For instance, annotations can be added to specific inflection points in a trend line or to outliers that may need explanation. Since the approach is all HTML5-based, the visualizations are ready for deployment to mobile devices and responsive to various screen sizes, including newer smartphones, tablets and the new class of so-called phablets. In the evaluation of vendors in our Mobile Business Intelligence Value Index, Qlik ranked fourth overall.

In the software business, of course, technology advances alone don't guarantee success. Qlik has struggled to clarify the positioning of its next-generation product: it is not a replacement for QlikView. QlikView users are passionate about keeping their existing tool because they have already designed dashboards and calculations with it. Vendors should not underestimate user loyalty and adoption. Therefore Qlik now promises to support both products for as long as the market continues to demand them. The majority of R&D investment will go into Qlik Sense as developers focus on surpassing the capabilities of QlikView. For now, the company will follow a bifurcated strategy in which the tools work together to meet the needs of various organizational personas. To me, this is the right strategy. There is no issue in being a two-product company, and the revised positioning of Qlik Sense complements QlikView on both the self-service side and the developer side. Qlik Sense is not yet as mature a product as QlikView, but from a business user's perspective it is a simple and effective analysis tool for exploring data and building different data views. It is simpler because users do not need to script the data in order to create the specific views they deem necessary. As the product matures, I expect it to become more than an end user's visual analysis tool since the capabilities of Qlik Sense lend themselves to web-scale approaches. Over time, it will be interesting to see how the company harmonizes the two products and how quickly customers will adopt Qlik Sense as a stand-alone tool.

For companies already using QlikView, Qlik Sense is an important addition to the portfolio. It will allow business users to become more engaged in exploring data and sharing ideas. Even for those not using QlikView, with its modern architecture and open approach to analytics, Qlik Sense can help future-proof an organization’s current business intelligence architecture. For those considering Qlik for the first time, the choice may be whether to bring in one or both products. Given the proven approach of QlikView, in the near term a combination approach may be a better solution in some organizations. Partners, content providers and ISVs should consider Qlik Branch, which provides resources for embedding Qlik Sense directly into applications. The site provides developer tools, community efforts such as d3.js integrations and synchronization with Github for sharing and branching of designs. For every class of user, Qlik Sense can be downloaded for free and tested directly on the desktop. Qlik has made significant strides with Qlik Sense, and it is worth a look for anybody interested in the cutting edge of analytics and business intelligence.

Regards,

Ventana Research

Our benchmark research consistently shows that business analytics is the most significant technology trend in business today and that acquiring effective predictive analytics is organizations' top priority for analytics. It enables them to look forward rather than backward and, participating organizations reported, leads to competitive advantage and operational efficiencies.

In our benchmark research on big data analytics, for example, 64 percent of organizations ranked predictive analytics as the most important analytics category for working with big data. Yet a majority indicated that they do not have enough experience in applying predictive analytics to business problems and lack training on the tools themselves.

Predictive analytics improves an organization’s ability to understand potential future outcomes of variables that matter. Its results enable an organization to decide correct courses of action in key areas of the business. Predictive analytics can enhance the people, process, information and technology components of an organization’s future performance.

In our most recent research on this topic, more than half (58%) of participants indicated that predictive analytics is very important to their organization, but only one in five said they are very satisfied with their use of those analytics. Furthermore, our research found that implementing predictive analytics would have a transformational impact in one-third of organizations and a significant positive impact in more than half of the rest.

In our new research project, The Next Generation of Predictive Analytics, we will revisit predictive analysis with an eye to determining how attitudes toward it have changed,  along with its current and planned use, and its importance in business. There are significant changes in this area, including where, how, why, and when predictive analytics are applied. We expect to find changes not only in forecasting and analyzing customer churn but also in operational use at the front lines of the organization and in improving the analytic process itself. The research will also look at the progress of emerging statistical languages such as R and Python, which I have written about.

As with big data analytics, predictive analytics involves sourcing data, creating models, deploying them and managing them to understand when an analytic model has become stale and ought to be revised or replaced. It should be obvious that only the most technically advanced users will be familiar with all this, so to achieve broad adoption, predictive analytics products must mask the complexity and be easy to use. Our research will determine the extent to which usability and manageability are being built into product offerings.
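
Detecting when a model has gone stale is often done by tracking a holdout accuracy metric and flagging sustained degradation. This simplified sketch assumes a fixed tolerance and window rather than a formal statistical drift test, and the numbers are invented for illustration.

```python
# Simplified staleness check: flag a model for retraining when its recent
# accuracy falls a fixed margin below the accuracy measured at deployment.
# Production systems might also run statistical drift tests on the inputs.
def is_stale(baseline_accuracy, recent_accuracies, tolerance=0.05, window=3):
    recent = recent_accuracies[-window:]
    if len(recent) < window:
        return False                      # not enough evidence yet
    avg = sum(recent) / len(recent)
    return avg < baseline_accuracy - tolerance

history = [0.91, 0.90, 0.86, 0.84, 0.83]  # weekly scoring accuracy
stale = is_stale(baseline_accuracy=0.92, recent_accuracies=history)
```

Automating even a crude check like this is part of masking complexity: a business user only needs to see a "model needs review" flag, not the monitoring arithmetic behind it.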

The promise of predictive analytics, including competitive advantage (68%), new revenue opportunities (55%) and increased profitability (52%), is significant. But to realize these advantages, companies must transform how they work. In terms of people and processes, a more collaborative strategy may be necessary. Analysts need tools and skills in order to use predictive analytics effectively. A new generation of technology is also becoming available that makes predictive analytics easier to apply, use and deploy into line-of-business processes. This will help organizations significantly, since there are not enough data scientists and specially trained predictive analytics professionals available, or affordable, for all organizations to hire.

This benchmark research will look closely at the evolving use of predictive analytics to establish how it equips business to make decisions based on likely futures, not just the past.

Regards,

Tony Cosentino

VP & Research Director

Mobility continues to be a hot adoption area in business intelligence, according to our research across analytics and line-of-business departments. Nearly three-quarters (71%) of organizations said their mobile workforce would be able to access BI capabilities in the next 12 months, according to our next-generation mobile business intelligence research. Roambi, a provider of mobile business intelligence applications, has made important strides this year after moving to deploy its products in the cloud, an event that I covered previously. Roambi is rated as one of the top providers of mobile business intelligence, or what we refer to as a 'Hot Vendor,' according to our Value Index.

Earlier this year, Roambi announced a partnership with Box, which offers cloud-based data storage and file sharing. More recently it began to catch up with the market by announcing support for the Android mobile operating system. Roambi has focused on the mobile BI market from its inception, first by building on the Apple iPhone’s small screen and then progressing to support the majority of mobile devices in corporations today.

The Box partnership, announced in March, enables joint Box and Roambi customers to visualize and interact with data stored on the Box file sharing system. Specifically, users are able to sync Roambi Analytics, the company’s visualization engine, and Roambi Flow, its storyboarding capability, with Box. The integration allows fast importation of Box source files and the ability to create and share interactive reports through Roambi Analytics and to create digital magazines and content through Roambi Flow. Data can be automatically refreshed in real time via Box Sync.

This partnership serves both companies since the coupled services provide users with expanded capabilities and little overlap. Box's rapid growth is being driven by its ease of use, open platform and cloud approach to file sharing. This is a natural platform for Roambi to build on to expand its customer base. For Box, Roambi's dynamic visualization and collaboration capabilities address its business customers' needs and increase its market opportunities. In our benchmark research on information optimization, 83 percent of organizations said it is important to have components that provide interactive capabilities to support presentation of information.

Roambi also works with other application vendors in the CRM and ERP market to integrate their applications. The same research shows that CRM (for 45%) and ERP (38%) are important application types to integrate, especially in areas such as sales and customer service. Apple's recent announcement of a new SDK should facilitate process integration between software systems so that, for example, users can access and navigate applications such as those from Box and Roambi and transactional applications such as CRM and ERP in a single workflow. This capability can provide further usability advantages for Roambi, which scored the highest rating in this area in our Mobile Business Intelligence Value Index.

Roambi announced mobile BI support for the Google Android operating system, which runs across a wide range of smartphone and tablet devices. It had delayed development of its software on the Android platform, which required significant development resources and investment, as part of its strategy to maximize its business potential and relationship with Apple. The release is available at the Google Play store. The initial version will include four of Roambi's most popular views: Card, Catalist, Layers and Superlist. As in its application for the Apple platform, security features for Android include Device Lockout, Remote Wipe and Application Passcode. Integration with Roambi's partner Okta provides identity management services and gives access to any applications supported by Okta; integration includes Active Directory (AD) and Lightweight Directory Access Protocol (LDAP). While Roambi Flow will not be available on Android out of the gate, the company says it will become available by the end of 2014.

Current approaches to mobile business intelligence applications on the market include native, Web-based and hybrid (a combination of both). We compare these approaches in detail in the executive summary of our Mobile Business Intelligence Value Index report. With the new Android support, Roambi takes a unique approach to the hybrid architecture that bypasses the browser completely. No data is cached in a browser; instead the data is loaded directly to the device and rendered natively by Roambi on the device. From the user's perspective, the benefit of this approach is performance, since interactivity does not rely on data traveling over a network. A second benefit is offline access to data, which is not available via non-native approaches. From the developer's or information assembler's perspective, testing across browsers is not needed, since there is no browser data caching and the experience is the same regardless of the browser in use.
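The offline-access benefit described above follows a general sync-then-read-locally pattern. The sketch below illustrates that pattern in simplified form; all names in it are hypothetical illustrations, not Roambi's actual (unpublished) implementation.

```python
# Minimal sketch of an offline-first data access pattern: the dataset is
# pulled down to the device once while a connection is available, and all
# subsequent reads are served locally. Names here are illustrative only.

class OfflineFirstStore:
    """Loads data onto the device once, then serves reads locally."""

    def __init__(self, fetch_remote):
        self._fetch_remote = fetch_remote  # callable that hits the network
        self._local = {}                   # on-device storage, not a browser cache

    def sync(self, dataset_id):
        # Pull the full dataset down while online.
        self._local[dataset_id] = self._fetch_remote(dataset_id)

    def read(self, dataset_id):
        # Reads never touch the network: fast, and they work offline.
        if dataset_id not in self._local:
            raise LookupError(f"{dataset_id} has not been synced to this device")
        return self._local[dataset_id]

# Usage: sync while connected, then interact with the data locally.
store = OfflineFirstStore(fetch_remote=lambda ds: {"rows": [1, 2, 3]})
store.sync("sales_q3")
result = store.read("sales_q3")  # served from the device, no network round trip
```

Because reads never depend on a live connection, interactivity stays responsive and the data remains available when the device is offline, which is the advantage the native approach has over browser-based alternatives.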

Our next-generation business intelligence research shows that executives strongly support mobile BI: Nearly half of them said that mobility is very important to their BI processes. Executives also favor Apple over Android devices, which is likely one of the key reasons for Apple's dominance in the current business intelligence landscape. However, our research shows latent demand for Android devices in the business intelligence market, and given the dominance of Android in the consumer market as well as in places like Asia and other emerging markets, any company that seeks to take significant global market share will need to support both platforms. Our information optimization research found Google Android ranked second among smartphone platforms (24%), behind Apple's iPhone (59%) in first place. Android support is therefore an important evolution for Roambi as it addresses increasing market demand for Android devices.

Usability has become the most important evaluation criterion; in the information optimization benchmark research it was selected as most important by over half (58%) of organizations. Roambi ranked highest in usability in the Ventana Research Mobile Business Intelligence Value Index, though its overall score was hampered somewhat by the lack of support for Android. With Android support, the company now addresses the need to support multiple devices and the so-called bring your own device (BYOD) practices that organizations increasingly allow. By addressing this, as well as taking advantage of broader market developments such as the new Apple SDKs, Roambi continues to address what organizations find important today. Usability of business intelligence systems is a top priority for 63% of companies, and even our big data analytics research finds growing importance for mobile access in almost half of organizations (46%). Any company that wants to get user-friendly business intelligence into the hands of its mobile workers quickly and effectively should have Roambi in its consideration set.

Regards,

Tony Cosentino

VP and Research Director
