You are currently browsing the category archive for the ‘Location Intelligence’ category.

It’s widely agreed that cloud computing is a major technology innovation. Many companies use cloud-based systems for specific business functions such as customer service, sales, marketing, finance and human resources. More generally, however, analytics and business intelligence (BI) have not migrated to the cloud as quickly. But now cloud-based data and analytics products are becoming more common. This trend is most popular among technology companies, small and midsize businesses, and departments in larger ones, but there are examples of large companies moving their entire BI environments to the cloud. Our research into big data analytics shows that more than one-fourth of analytics initiatives for companies of all sizes are cloud-based.

Like other cloud-based applications, cloud analytics offers enhanced scalability and flexibility, affordability and IT staff optimization. Our research shows that in general the top benefits are lowered costs (for 40%), improved efficiency (39%) and better communication and knowledge sharing (34%). Using the cloud, organizations can take advantage of a sophisticated IT infrastructure without having to dedicate staff to install and support it. There is no need for comprehensive development and testing because the provider is responsible for maintaining and upgrading the application and the infrastructure. The cloud can also provide flexible infrastructure resources to support “sandbox” testing environments for advanced analytics deployments. Multitenant cloud deployments are more affordable because costs are shared across many companies. When used departmentally, application costs need not be capitalized but instead can be treated as operational expenditures. Capabilities can be put to use quickly, as vendors develop them, and updates need not disrupt use. Finally, some cloud-based interfaces are more intuitive for end users since they have been designed with the user experience in mind. Regarding cloud technology, our business technology innovation research finds that usability is the most important technology evaluation criterion (for 64% of participants), followed by reliability (54%) and capability.

For analytics and BI specifically, there are still issues holding back adoption. Our research finds that the primary reasons companies do not deploy cloud-based applications of any sort are security and compliance issues. For analytics and business intelligence, we can add data-related activities as another reason, since cloud-based approaches often require data integration and transmission of sensitive data across an external network, along with a range of data preparation. Such issues are especially prevalent for companies that have legacy BI tools using data models that have been distributed across their divisions. Often these organizations have defined their business logic and metrics calculations within the context of these tools. Furthermore, these tools may be integrated with other core applications such as forecasting and planning. To re-architect such data models and metrics calculations is a challenge some companies are reluctant to undertake.

In addition, despite widespread use of some types of cloud-based systems, for nontechnical business people discussions of business intelligence in the cloud can be confusing, especially when they involve information integration, the types of analytics to be performed and where the analytic processes will run. The first generation of cloud applications focused on end-user processes related to the various lines of business and largely ignored the complexities inherent in information integration and analytics. Organizations can no longer ignore these complexities since doing so exacerbates the challenge of fragmented systems and distributed data. Buyers and architects should understand the benefits of analytics in the cloud and weigh these benefits against the challenges described above.

Our upcoming benchmark research into data and analytics in the cloud will examine the current maturity of this market as well as opportunities and barriers to organizational adoption across lines of business and IT. It will evaluate cloud-based analytics in the context of trends such as big data, mobile technology and social collaboration as well as location intelligence and predictive analytics. It will consider how cloud computing enables these and other applications and identify leading indicators for adoption of cloud-based analytics. It also will examine how cloud deployment enables large-scale and streaming applications. For example, it will examine real-time processing of vast amounts of data from sensors and other semistructured data (often referred to as the Internet of Things).

It is an exciting time to be studying this particular market as companies consider moving platforms to the cloud. I look forward to receiving any qualified feedback as we move forward to start this important benchmark research. Please get in touch if you have an interest in this area of our research.


Regards,

Tony Cosentino

VP and Research Director

Our benchmark research consistently shows that business analytics is the most significant technology trend in business today and acquiring effective predictive analytics is organizations’ top priority for analytics. It enables them to look forward rather than backward and, participating organizations reported, leads to competitive advantage and operational efficiencies.

In our benchmark research on big data analytics, for example, 64 percent of organizations ranked predictive analytics as the most important analytics category for working with big data. Yet a majority indicated that they do not have enough experience in applying predictive analytics to business problems and lack training on the tools themselves.

Predictive analytics improves an organization’s ability to understand potential future outcomes of variables that matter. Its results enable an organization to decide correct courses of action in key areas of the business. Predictive analytics can enhance the people, process, information and technology components of an organization’s future performance.

In our most recent research on this topic, more than half (58%) of participants indicated that predictive analytics is very important to their organization, but only one in five said they are very satisfied with their use of those analytics. Furthermore, our research found that implementing predictive analytics would have a transformational impact in one-third of organizations and a significant positive impact in more than half of the rest.

In our new research project, The Next Generation of Predictive Analytics, we will revisit predictive analytics with an eye to determining how attitudes toward it have changed, along with its current and planned use and its importance in business. There are significant changes in this area, including where, how, why and when predictive analytics are applied. We expect to find changes not only in forecasting and analyzing customer churn but also in operational use at the front lines of the organization and in improving the analytic process itself. The research will also look at the progress of emerging statistical languages such as R and Python, which I have written about.

As does big data analytics, predictive analytics involves sourcing data, creating models, deploying them and managing them to understand when an analytic model has become stale and ought to be revised or replaced. It should be obvious that only the most technically advanced users will be familiar with all this, so to achieve broad adoption, predictive analytics products must mask the complexity and be easy to use. Our research will determine the extent to which usability and manageability are being built into product offerings.
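
To make the model-management point concrete, here is a minimal sketch of one common way to detect that a model has gone stale: comparing the distribution of recent data against the data the model was built on, using a population stability index (PSI). The thresholds and data here are illustrative assumptions, not any vendor's actual mechanism.

```python
import math

def psi(expected, actual, bins=10):
    """Population stability index between two numeric samples."""
    lo = min(min(expected), min(actual))
    hi = max(max(expected), max(actual))
    width = (hi - lo) / bins or 1.0

    def frac(sample, b):
        # Share of the sample falling in bin b (half-open, last bin closed).
        count = sum(1 for x in sample
                    if lo + b * width <= x < lo + (b + 1) * width
                    or (b == bins - 1 and x == hi))
        return max(count / len(sample), 1e-6)  # avoid log(0)

    return sum((frac(actual, b) - frac(expected, b)) *
               math.log(frac(actual, b) / frac(expected, b))
               for b in range(bins))

# Hypothetical model scores at training time vs. in recent production use.
training_scores = [0.1, 0.2, 0.2, 0.3, 0.4, 0.5, 0.5, 0.6, 0.7, 0.8]
recent_scores   = [0.6, 0.7, 0.7, 0.8, 0.8, 0.9, 0.9, 0.9, 1.0, 1.0]

drift = psi(training_scores, recent_scores)
# A common rule of thumb treats PSI above roughly 0.25 as a signal to revisit the model.
print("PSI:", round(drift, 3), "-> revise model" if drift > 0.25 else "-> model OK")
```

A check like this can run on a schedule, flagging models for revision without requiring every user to understand the statistics behind it.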

The promise of predictive analytics, including competitive advantage (68%), new revenue opportunities (55%) and increased profitability (52%), is significant. But to realize these advantages, companies must transform how they work. In terms of people and processes, a more collaborative strategy may be necessary. Analysts need tools and skills in order to use predictive analytics effectively. A new generation of technology is also becoming available that makes predictive analytics easier to apply, use and deploy into line-of-business processes. This will help organizations significantly, as there are not enough data scientists and specially trained predictive analytics professionals available for organizations to hire or afford.

This benchmark research will look closely at the evolving use of predictive analytics to establish how it equips business to make decisions based on likely futures, not just the past.

Regards,

Tony Cosentino

VP & Research Director

Alteryx has released version 9.0 of Alteryx Analytics, which provides a range of capabilities from data blending to predictive analytics, in advance of its annual user conference, Inspire 2014. I have covered the company for several years as it has emerged as a key player in providing a range of business analytics from predictive to big data analytics. The importance of this category of analytics is revealed by our latest benchmark research on big data analytics, which finds that predictive analytics is the most important type of big data analytics, ranked first by nearly half (47%) of research participants. The new version 9.0 includes new capabilities and integration with a range of new information sources, including read and write capability for IBM SPSS and SAS files for a range of analytic needs.

After attending Inspire 2013 last year, I wrote about capabilities that are enabling an emerging business role, which Alteryx calls the data artisan. The label refers to analysts who combine both art and science in using analytics to help direct business outcomes. Alteryx uses an innovative and intuitive approach to analytic tasks, using workflow and linking various data sources through in-memory computation and processing. It takes a “no code” drag-and-drop approach to integrate data from files and databases, prepare data for analysis, and build and score predictive models to yield relevant results. Other vendors in the advanced analytics market are also applying this approach, but few mature tools are currently available. The output of the Alteryx analytic processes can be shared automatically in numerous data formats including direct export into visualization tools such as those from Qlik (new support) and Tableau. This can help users improve their predictive analytics capabilities and take action on the outcomes of analytics, which are the two capabilities most often cited in our research as needed to improve big data analytics.

Alteryx now works with Revolution Analytics to increase the scalability of its system to work with large data sets. The open source language R continues to gain popularity and is being embedded in many business intelligence tools, but it runs only on data that can be loaded into memory. Running only in memory does not address analytics on data sets that run into terabytes and hundreds of millions of values, and it potentially requires a sub-sampling approach to advanced analytics. With RevoScaleR, Revolution Analytics rewrites parts of R’s algorithms so that the processing tasks can be parallelized and run in big data architectures such as Hadoop. Such capability is important for analytic problems including recommendation engines, unsupervised anomaly detection, some classification and regression problems, and some clustering problems. These analytic techniques are appropriate for some of the top business uses of big data analytics, which according to our research are cross-selling and up-selling (important to 38%), better understanding of individual customers (32%), analyzing all data rather than a sample (30%) and price optimization (28%). Alteryx Analytics automatically detects whether to use RevoScaleR or open source R algorithms. This approach simplifies the technical complexities of scaling R by providing a layer of abstraction for the analytic professional.
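
The core idea behind rewriting an algorithm this way, computing partial results per chunk and then combining them, can be sketched in a few lines of Python. This is an illustration of the general external-memory pattern, not RevoScaleR's code; the chunked data source is simulated.

```python
# Instead of loading all data into memory, the algorithm consumes chunks
# and merges partial results; here a mean and variance are computed
# chunk-by-chunk, the same answer a single full pass would give.

from concurrent.futures import ThreadPoolExecutor

def chunk_stats(chunk):
    """Partial results for one chunk: count, sum, sum of squares."""
    return (len(chunk), sum(chunk), sum(x * x for x in chunk))

def combine(parts):
    """Merge per-chunk partials into an overall mean and variance."""
    n = sum(p[0] for p in parts)
    s = sum(p[1] for p in parts)
    ss = sum(p[2] for p in parts)
    mean = s / n
    var = ss / n - mean * mean
    return mean, var

# Simulate a data set too large for memory as four independent chunks;
# in a Hadoop cluster, each chunk could be processed on a different node.
chunks = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0], [7.0, 8.0]]
with ThreadPoolExecutor() as pool:
    parts = list(pool.map(chunk_stats, chunks))

mean, var = combine(parts)
print(mean, var)  # → 4.5 5.25
```

Not every statistic decomposes this cleanly, which is why only parts of R's algorithm library can be parallelized this way.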

Scoring – the ability to input a data record and receive the probability of a particular outcome – is an important if not well-understood aspect of predictive analytics. Our research shows that companies that score models on a timely basis according to their needs get better organizational results than those that score all models the same way. Working with Revolution Analytics, Alteryx has enhanced scoring scalability for R algorithms with new capabilities that chunk data in a parallelized fashion. This approach bypasses the memory-only approach to enable a theoretically unlimited number of scores to be processed. For large-scale implementations and consumer applications in industries such as retail, an important target market for Alteryx, these capabilities are becoming essential.
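
For readers unfamiliar with scoring, a fitted model effectively reduces to a function from one record to a probability. The sketch below uses a logistic model with made-up coefficients; a real deployment would supply coefficients from a trained model.

```python
import math

# Hypothetical coefficients; in practice these come from model training.
COEFFS = {"intercept": -2.0, "visits": 0.30, "tenure_years": 0.15}

def score(record):
    """Return the probability of the outcome for one record (logistic model)."""
    z = COEFFS["intercept"]
    z += COEFFS["visits"] * record["visits"]
    z += COEFFS["tenure_years"] * record["tenure_years"]
    return 1.0 / (1.0 + math.exp(-z))

# Score a single customer record, e.g. for the probability of a purchase.
p = score({"visits": 10, "tenure_years": 2.0})
print(round(p, 3))
```

Because each record is scored independently, a batch can be split into chunks and scored in parallel, which is exactly what makes the chunked approach described above scale.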

Alteryx 9.0 also improves on open source R’s default approach to scoring, which is “all or nothing.” That is, if data is missing (a null value) or a new level for a categorical variable is not included in the original model, R will not score the model until the issue is addressed. This process is a particular problem for analysts who want to score data in small batches or individually. In contrast, Alteryx’s new “best effort” approach scores the records that can be run without incident, and those that cannot be run are returned with an error message. This adjustment is particularly important as companies start to deploy predictive analytics into areas such as call centers or within Web applications such as automatic quotes for insurance.
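
The contrast between "all or nothing" and "best effort" scoring can be sketched as follows. The function and field names here are invented for illustration, not Alteryx's API: records that can be scored are, while records with missing values or unseen categorical levels come back with an error message instead of failing the whole batch.

```python
# Levels of the categorical variable seen when the model was built.
KNOWN_SEGMENTS = {"consumer", "business"}

def score_one(record):
    """Score one record; raise (like R's default behavior) on bad input."""
    if record.get("age") is None:
        raise ValueError("missing value: age")
    if record["segment"] not in KNOWN_SEGMENTS:
        raise ValueError("unseen level: " + record["segment"])
    return 0.01 * record["age"]  # stand-in for a real model's score

def score_best_effort(records):
    """Score what can be scored; attach an error message to the rest."""
    results = []
    for r in records:
        try:
            results.append({"id": r["id"], "score": score_one(r)})
        except ValueError as err:
            results.append({"id": r["id"], "error": str(err)})
    return results

batch = [
    {"id": 1, "age": 40, "segment": "consumer"},
    {"id": 2, "age": None, "segment": "business"},   # missing value
    {"id": 3, "age": 55, "segment": "government"},   # level not in the model
]
for row in score_best_effort(batch):
    print(row)
```

In a call center or an insurance quoting application, returning two scores and one explainable error is far more useful than returning nothing at all.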

Alteryx 9.0 also has new predictive modeling tools and functionality. A spline model helps address regression and classification problems such as data reduction and nonlinear relationships and their interactions. It uses a “clear box” approach to serve users with differing objectives and skill levels: it exposes the underpinnings of the model so that advanced users can modify it, while less sophisticated users can use the model without necessarily understanding all of its intricacies. Other capabilities include a Gamma regression tool that allows users to model the Gamma family of distributions using the generalized linear modeling (GLM) framework. Heat plot tools for visualizing joint probability distributions, such as between customer income level and customer advocacy, and more robust A/B testing tools, which are particularly important in digital marketing analytics, are also part of the release.
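
A/B testing tools of the kind mentioned above typically reduce to comparing conversion rates between two groups. This minimal sketch implements the standard two-proportion z-test in plain Python; it illustrates the statistics involved and is not Alteryx's implementation.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic and two-sided p-value for rates conv_a/n_a vs conv_b/n_b."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF, via the error function.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Variant B converts 12% of 2,000 visitors vs. 10% of 2,000 for variant A.
z, p = two_proportion_z(200, 2000, 240, 2000)
print("z = %.2f, p = %.3f" % (z, p))
```

With these illustrative numbers the p-value comes in under the conventional 0.05 threshold, so a marketer would treat variant B's lift as statistically significant.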

At the same time, Alteryx has expanded its base of information sources. According to our research, working with all sources of data, not just one, is the most common definition for big data analytics, as stated by three-quarters (76%) of organizations. While structured data from transaction systems and so-called systems of record is still the most important, new data sources including those coming from external sources are becoming important. Our research shows that the most widely used external data sources are cloud applications (54%) and social media data (46%); five additional data sources, including Internet, consumer, market and government sources, are virtually tied in third position (with 39% to 42% each). Alteryx will need to be mindful of best practices in big data analytics, as I have outlined, to ensure it can stay on top of a growing set of requirements to blend big data and apply a range of advanced analytics.

New connectors to the social media data provider Gnip give access to social media websites through a single API, and a DataSift (http://www.datasift.com) connector helps make social media more accessible and easier to analyze for any business need. Other new connectors in 9.0 include those for Foursquare, Google Analytics, Marketo, salesforce.com and Twitter. New data warehouse connectors include those for Amazon Redshift, HP Vertica, Microsoft SQL Server and Pivotal Greenplum. Access to SPSS and SAS data files also is introduced in this version; Alteryx hopes to break down the barriers to entry in accounts dominated by these advanced analytic stalwarts. With already existing connectors to major cloud and on-premises data sources, the company provides a robust integration platform for analytics.

Alteryx is on a solid growth curve, as evidenced by the increasing number of inquiries and my conversations with company executives. This is not surprising given the disruptive potential of the technology itself and its unique analytic workflow technology for data blending and advanced analytics. This data blending and workflow technology is not highlighted enough; it is one of the largest differentiators of Alteryx’s software and reduces the data-related tasks such as preparing (47%) and reviewing (43%) data that our customer analytics research finds get in the way of analysts performing analytics. Additionally, Alteryx’s ability to apply location analytics within its product is a key differentiator that our research found delivers greater value from analytics than viewing traditional visualizations and tables of data. Location analytics of the kind Alteryx provides also helps rapidly identify areas where customer experience and satisfaction can be improved, which is the top benefit found in our research. The flexible platform resonates particularly well with lines of business, especially in fast-moving, lightly regulated industries such as travel, retail and consumer goods, where speed of analytics is critical. The work the company is doing with Revolution Analytics and the resulting ability to scale are important for advanced analytics that operate on big data. The ability to seamlessly connect and blend information sources is a critical capability for Alteryx, and it is a wise move to invest further in this area, but Alteryx will need to examine where collaborative technology could help business people work together on analytics within the software. Alteryx will need to continue to adapt to market demand for analytics and keep focused on varying line-of-business areas so it can continue its growth.
Just about any company involved in analytics today should evaluate Alteryx and see how its unique approach can streamline analytics.

Regards,

Tony Cosentino

VP and Research Director

We recently released our benchmark research on big data analytics, and it sheds light on many of the most important discussions occurring in business technology today. The study’s structure was based on the big data analytics framework that I laid out last year as well as the framework that my colleague Mark Smith put forth on the four types of discovery technology available. These frameworks view big data and analytics as part of a major change that includes a movement from designed data to organic data, the bringing together of analytics and data in a single system, and a corresponding move away from the technology-oriented three Vs of big data to the business-oriented three Ws of data. Our big data analytics research confirms these trends but also reveals some important subtleties and new findings with respect to this important emerging market. I want to share three of the most interesting and even surprising results and their implications for the big data analytics market.

First, we note that communication and knowledge sharing is a primary benefit of big data analytics initiatives, but it is a latent one. Among organizations planning to deploy big data analytics, the benefits most often anticipated are faster response to opportunities and threats (57%), improving efficiency (57%), improving the customer experience (48%) and gaining competitive advantage (43%). However, once a big data analytics system has moved into production, the benefits most often mentioned as achieved are better communication and knowledge sharing (51%), gaining competitive advantage (51%), improved efficiency in business processes (49%) and improved customer experience and satisfaction (46%). (The chart shows rankings of first choices as most important.) Although the last three of these benefits are predictable, it’s noteworthy that the benefit of communication and knowledge sharing, while not a priority before deployment, becomes one of the two most often cited later.

As for the implications, in our view, one reason why communication and knowledge sharing are more often seen as a key benefit after deployment rather than before is that agreement on big data analytics terminology is often lacking within organizations. Participants from fewer than half (44%) of organizations said that the people making business technology decisions mostly agree or completely agree on the meaning of big data analytics, while the same number said there are many different opinions about its meaning. To address this particular challenge, companies should pay more attention to setting up internal communication structures prior to the launch of a big data analytics project, and we expect collaborative technologies to play a larger role in these initiatives going forward.

A second finding of our research is that integration of distributed data is the most important enabler of big data analytics. Asked the meaning of big data analytics in terms of capabilities, the largest percentage (76%) of participants said it involves analyzing data from all sources rather than just one, while for 55 percent it means analyzing all of the data rather than just a sample of it. (We allowed multiple responses.) More than half (56%) told us they view big data as finding patterns in large and diverse data sets in Hadoop, which indicates the continuing influence of this original big data technology. A second tier of percentages emphasizes timeliness as an aspect of big data: doing real-time processing on streams of data (44%), visualizing large structured data sets in seconds (40%) and doing real-time scoring against a database record (36%).

The implications here are that the primary characteristic of big data analytics technology is the ability to analyze data from many data sources. This shows that companies today are focused on bringing together multiple information sources and secondarily being able to process all data rather than just a sample, as well as being able to do machine learning on especially large data sets. Fast processing and the ability to analyze streams of data are relegated to third position in these priorities. That suggests that the so-called three Vs of big data are confusing the discussion by prioritizing volume, velocity and variety all at once. For companies engaged in big data analytics today, sourcing and integration of various data sources in an expedient manner is the top priority, followed by the ideas of size and then speed of arrival of data.

Third, we found that usage is not relegated to particular industries, certain types of companies or certain functional areas. Of the 25 uses for big data analytics that participants said they are personally involved with, three of the four most often mentioned involve customers and sales: enabling cross-selling and up-selling (38%), understanding the customer better (32%) and optimizing pricing (28%). Meanwhile, optimizing IT operations ranked fifth (24%), though it was most often chosen by those in IT roles (76%). What is particularly fascinating, however, is that 17 of the 25 use cases were named by more than 10 percent of participants, which indicates many uses for big data analytics.

The primary implication of this finding is that big data analytics is not following the famous technology adoption curves outlined in books such as Geoffrey Moore’s seminal work, “Crossing the Chasm.” That is, companies are not following a narrowly defined path that solves only one particular problem. Instead, they are creatively deploying technological innovations en route to a diverse set of outcomes. And this is occurring across organizational functions and industries, including conservative ones, which conflicts with conventional wisdom. For this reason, companies are more often looking across industries and functional disciplines as part of their due diligence on big data analytics to come up with unique applications that may yield competitive advantage or organizational efficiencies.

In summary, it has been difficult for companies to define what big data analytics actually means and how to prioritize their investments accordingly. Research such as ours can help organizations address this issue. While the above discussion outlines a few of the interesting findings of this research, it also yields many more insights, related to aspects as diverse as big data in the cloud, sandbox environments, embedded predictive analytics, the most important data sources in use, and the challenges of choosing an architecture and deploying big data analytic products. For a copy of the executive summary, download it directly from the Ventana Research community.

Regards,

Ventana Research

Our benchmark research shows that analytics is the top business technology innovation priority; 39% of organizations rank it first. This is no surprise as new information sources and new technologies in data processing, storage, networking, databases and analytic software are combining to offer capabilities for using information never before possible. For businesses, the analytic priority is heightened by intense competition on several fronts; they need to know as much as possible about pricing, strategies, customers and competitors. Within the organization, the IT department and the lines of business continue to debate issues around the analytic skills gap, information simplification, information governance and the rise of time-to-value metrics. Given this backdrop, I expect 2014 to be an exciting year for studying analytic technologies and how they apply to business.

Three key focus areas comprise my 2014 analytics research agenda. The first includes a specific focus on business analytics and methods of discovery and exploration. This area will be covered in depth in our new research on next-generation business analytics, commencing in the first half of 2014. At Ventana Research, we break discovery analytics into visual discovery, data discovery, event discovery and information discovery. The definitions and uses of each type appear in Mark Smith’s analysis of the four discovery technologies. As part of this research, we will examine these exploratory tools and techniques in the context of the analytic skills gap and the new analytic process flows in organizations. The people and process aspects of the research will include how governance and controls are being implemented alongside these innovations. The exploratory analytics space includes business intelligence, which our research shows is still the primary method of deploying information and analytics in organizations. Two upcoming Value Indexes, Mobile Business Intelligence, due out in the first quarter, and Business Intelligence, starting in the second, will provide up-to-date and in-depth evaluations and rankings of vendors in these categories.

My second agenda area is big data and predictive analytics. The first research on this topic will be released in the first quarter of the year as benchmark research on big data analytics. This fresh and comprehensive research maps to my analysis of the four pillars of big data analytics, a framework for thinking about big data and the associated analytic technologies. This research also has depth in the areas of predictive analytics and big data approaches in use today. In addition to that benchmark research, we will conduct a first-of-its-kind Big Data Analytics Value Index, which will assess the major players applying analytics to big data. Real-time and right-time big data is also called operational intelligence, an area Ventana Research has pioneered over the years. Our Operational Intelligence Value Index, which will be released in the first quarter, builds on our benchmark research on the topic and evaluates vendors of software that helps companies run real-time analytics against large streams of data.

The third focus area is information simplification and cloud-based business analytics, including business intelligence. In our recently released benchmark research on information optimization, nearly all (97%) organizations said it is important or very important to simplify information access for both their business and their customers. Paradoxically, at the same time the technology landscape is getting more fragmented and complex; in order to simplify, software design will need innovative uses of analytic technology to mask the underlying complexity through layers of abstraction. In particular, users need the areas of sourcing data and preparing data for analysis to be simplified and made more flexible so they can devote less time to these tasks and more to the actual analysis. Part of the challenge in information optimization and integration is to analyze data that originates in the cloud or has been moved there. This issue has important implications for debates around information presentation, the semantic web, where analytics are executed, and whether business intelligence will move to the cloud in any more than a piecemeal fashion. We'll explore these topics in benchmark research on business intelligence and analytics in the cloud, which is planned for the second half of 2014. In 2013 we released research on location analytics, the use of geography for the presentation and processing of data.

Analytics as a business discipline is getting hotter as we move forward in the 21st century, and I am thrilled to be part of the analytics community. I welcome any feedback you have on my research agenda and look forward to continuing to provide research, collaborate and educate with you in 2014.

Regards,

Tony Cosentino

VP and Research Director

Users of big data analytics are finally going public. At the Hadoop Summit last June, many vendors were still speaking of a large retailer or a big bank as users but could not publicly disclose their partnerships. Companies experimenting with big data analytics felt that their proof of concept was so innovative that once it moved into production, it would yield a competitive advantage to the early mover. Now many companies are speaking openly about what they have been up to in their business laboratories. I look forward to attending the 2013 Hadoop Summit in San Jose to see how much things have changed in just a single year for Hadoop-centered big data analytics.

Our benchmark research into operational intelligence, which I argue is another name for real-time big data analytics, shows diversity in big data analytics use cases by industry. The goals of operational intelligence are an interesting mix as the research shows relative parity among managing performance (59%), detecting fraud and security (59%), complying with regulations (58%) and managing risk (58%), but when we drill down into different industries there are some interesting nuances. For instance, healthcare and banking are driven much more by risk and regulatory compliance, services such as retail are driven more by performance, and manufacturing is driven more by cost reduction. All of these make sense given the nature of the businesses. Let’s look at them in more detail.

The retail industry, driven by market forces and facing discontinuous change, is adopting big data analytics out of competitive necessity. The discontinuity comes in the form of online shopping and the need for traditional retailers to supplement their brick-and-mortar locations. JCPenney and Macy's provide a sharp contrast in how two retailers approached this challenge. A few years ago, the two companies eyed a similar competitive space, but since that time, Macy's has implemented systems based on big data analytics; it now sources locally for online transactions and can optimize pricing of its more than 70 million SKUs in just one hour using SAS High Performance Analytics. The Macy's approach has, in Sun Tzu-like fashion, turned the "showroom floor" disadvantage into a customer experience advantage. JCPenney, on the other hand, relied on gut-feel management decisions based on classic brand merchandising strategies and ended up alienating its customers, generating lawsuits and a well-publicized apology to its customers. Other companies, including Sears, are doing similarly innovative work with suppliers such as Teradata and innovative startups like Datameer in data hub architectures built around Hadoop.

Healthcare is another interesting market for big data, but the dynamics that drive it are less about market forces and more about government intervention and compliance issues. Laws including HIPAA, the Affordable Care Act, OC-10 and the HITECH Act of 2009 all have implications for how these organizations implement technology and analytics. Our recent benchmark research on governance, risk and compliance indicates that many companies have significant concerns about compliance issues: 53 percent of participants said they are concerned about them, and 42 percent said they are very concerned. Electronic health records (EHRs) are moving organizations to more patient-centric systems, and one goal of the Affordable Care Act is to use technology to produce better outcomes through what it calls meaningful use standards. Facing this tidal wave of change, companies including IBM analyze historical patterns and link them with real-time monitoring, helping hospitals save the lives of at-risk babies. This use case was made into a now-famous commercial by advertising firm Ogilvy about the so-called data babies. IBM has also shown how cognitive question-and-answer systems such as Watson assist doctors in diagnosis and treatment of patients.

Data blending, the ability to mash together different data sources without having to manipulate the underlying data models, is another analytical technique gaining significant traction. Kaiser Permanente is able to use tools from Alteryx, which I have assessed, to consolidate diverse data sources, including unstructured data, to streamline operations and improve customer service. The two organizations made a joint presentation similar to the one here at Alteryx's user conference in March.
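Conceptually, blending joins sources on a shared key at analysis time rather than remodeling either source. A minimal sketch in Python illustrates the idea; the field names and values are hypothetical stand-ins, not Kaiser Permanente's or Alteryx's actual data:

```python
# Illustrative sketch of "data blending": joining two sources on a shared key
# at analysis time, without altering either source's underlying model.
# All field names and values below are invented for illustration.
visits = [
    {"member_id": 1, "clinic": "North"},
    {"member_id": 2, "clinic": "South"},
    {"member_id": 3, "clinic": "North"},
]
surveys = {1: "positive", 3: "negative"}  # e.g. sentiment scored from unstructured text

# A left join keeps every visit and attaches survey sentiment where available.
blended = [dict(v, sentiment=surveys.get(v["member_id"])) for v in visits]
print([row["sentiment"] for row in blended])  # ['positive', None, 'negative']
```

The point is that neither source's schema changes; the combined view exists only in the analysis layer.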

Financial services, which my colleague Robert Kugel covers, is being driven by a combination of regulatory forces and competitive market forces on the sales end. Regulations produce a lag in the adoption of certain big data technologies, such as cloud computing, but areas such as fraud and risk management are being revolutionized by the ability, provided through in-memory systems, to look at every transaction rather than only a sampling of transactions through traditional audit processes. Furthermore, the ability to pair advanced analytical algorithms with in-memory real-time rules engines helps detect fraud as it occurs, and thus criminal activity may be stopped at the point of transaction. On a broader scale, new risk management frameworks are becoming the strategic and operational backbone for decision-making in financial services.
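The pairing of hard business rules with a statistical check, so that every transaction is scored as it arrives, can be sketched in a few lines. The rules, thresholds and field names below are invented for illustration and not any vendor's actual engine:

```python
# Hedged sketch of pairing a simple rules engine with a statistical outlier
# check to score each transaction at the point of sale. All thresholds and
# field names are hypothetical.
from statistics import mean, stdev

history = [42.0, 55.0, 48.0, 60.0, 51.0, 47.0]  # prior amounts for one account
mu, sigma = mean(history), stdev(history)

RULES = [
    lambda tx: tx["amount"] > 5000,                        # hard business rule
    lambda tx: tx["country"] not in tx["home_countries"],  # geography rule
    lambda tx: abs(tx["amount"] - mu) > 3 * sigma,         # statistical outlier
]

def flag(tx):
    """Return True if any rule fires, so the transaction can be held immediately."""
    return any(rule(tx) for rule in RULES)

suspect = {"amount": 9000.0, "country": "XX", "home_countries": {"US"}}
normal = {"amount": 52.0, "country": "US", "home_countries": {"US"}}
print(flag(suspect), flag(normal))  # True False
```

A production system would run the statistical part in memory over streaming data, but the decision structure, rules plus models evaluated per transaction, is the same.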

On the retail banking side, copious amounts of historical customer data from multiple banking channels combined with government data and social media data are providing banks the opportunity to do microsegmentation and create unprecedented customer intimacy. Big data approaches to micro-targeting and pricing algorithms, which Rob recently discussed in his blog on Nomis, enable banks and retailers alike to target individuals and customize pricing based on an individual's propensity to act. While partnerships in the financial services arena are still held close to the vest, the universal financial services providers – Bank of America, Citigroup, JPMorgan Chase and Wells Fargo – are making considerable investments in all of the above-mentioned areas of big data analytics.

Industries other than retail, healthcare and banking are also seeing tangible value in big data analytics. Governments are using it to provide proactive monitoring and responses to catastrophic events. Product and design companies are leveraging big data analytics for everything from advertising attribution to crowdsourcing of new product innovation. Manufacturers are preventing downtime by studying interactions within systems and predicting machine failures before they occur. Airlines are recalibrating their flight routing systems in real time to avoid bad weather. From hospitality to telecommunications to entertainment and gaming, companies are publicizing their big data-related success stories.

Our research shows that until now, big data analytics has primarily been the domain of larger, digitally advanced enterprises. However, as use cases make their way through business and their tangible value is accepted, I anticipate that activity around big data analytics will increase among small and midsize businesses. At this point, just about any company that is not considering how big data analytics may impact its business faces an unknown and uneasy future. What a difference a year makes, indeed.

Regards,

Tony Cosentino

VP and Research Director

Information Builders (IBI) was the highest-ranked vendor in Ventana Research's Business Intelligence Value Index for 2012. The combination of data integration, business analytics, visual and data discovery and performance management software in a single framework allows the company to address a range of both IT and business user needs and gives it a measure of advantage in an intensely competitive market. At the same time, emerging trends are disrupting the BI category, which seemed mature not long ago. The 2013 IBI user conference in Orlando showed how the company is addressing these industry trends. (For analysis of last year's event, see my colleague Mark Smith's comments).

At the core of the IBI strategy are its WebFocus 8.0 platform and iWay, its information management suite of software. Our benchmark research into Business Technology Innovation shows that data preparation and quality are critical challenges and time-consuming activities impacting analysts in 42 percent of organizations, so information management must be part of any general discussion of business intelligence. The latest release, iWay 7, was announced at the conference. It can integrate more than 300 data sources using prebuilt adapters and handles data preparation, data quality and multidomain master data management. Management spun off iWay into a separate operating company but brought it back into the core business recently as executives recognized the trend toward big data and what we call information optimization. The combination of data integration with business intelligence is a critical factor for business intelligence companies in large part because big data integration is essential to big data analytics. The ability to denormalize data and combine diverse data into a wide single view of an analytical data set is an important aspect of big data analytics. Information Builders uses iWay and a columnar database called Hyperstage running on commodity servers to handle these big data challenges.

The picture of how WebFocus 8 addresses emerging BI trends is becoming clearer. The first of these trends is the necessity for self-service and ease of use in business intelligence tools. Our next-generation business intelligence benchmark research shows that usability is the most critical buying criterion for nearly two-thirds (63%) of organizations. IBI has prepared its applications for a broad user base through capabilities that enable the Web-based WebFocus to deliver features normally associated with desktop software. Additional functionality provided through InfoAssist, a component of WebFocus 8, helps power users explore data, define metrics and publish information without coding. Additionally, the suite now includes Visual Discovery, which has data mashup and discovery capabilities that enable analysts to look at data without a predefined schema and find relationships that may not have been apparent previously. Location analytics technology from ESRI, a long-time leader in location intelligence, can be incorporated into the analysis as well. Location analytics has not been given a lot of attention, but it is gaining more recognition, according to our recent location analytics benchmark research. Finally, Magnify offers a search capability for both structured and unstructured data, which helps users find business content across the enterprise. While Magnify presents a valuable search tool for analysts, the product appears to be suffering from a lack of awareness. In a session on self-service BI, few attendees had even heard of it.

Analytics applied to social media is another hot topic in business, and IBI has made significant advancements with its Social Media Integration application, also part of WebFocus 8. It enables users to examine posts, blogs and other social data to detect patterns in customer opinions. Sentiment algorithms that interpret and quantify the inherent complexities of language are provided as a third-party Web service or a REST adapter. Users can search via the Magnify tool and receive a robust contextual inquiry experience with tag clouds, quantitative information around mentions, and sentiment on a scale from very negative to very positive. Users can assign thresholds based on numeric value and assign appropriate stakeholders to follow up. Many marketing departments are using ad-hoc tools to drive these types of initiatives, but ultimately it makes more sense to place these queries within the context of their business intelligence initiatives; social information alone has limited value, but when married with internal metrics such as customer lifetime value, it has much more impact.

On another increasingly important front, mobile business intelligence ranks as a business priority among the six areas of technology innovation that Ventana Research studies. IBI takes a hybrid HTML5 approach to mobile intelligence and analytics. That is, a user downloads a native shell from the online store associated with a particular device, and then the content is rendered through the browser via HTML5. Seeing the mobile trend early, IBI completely rewrote its charting engine to support HTML5 and Mobile Favs on the native side. This method exploits native gestures, while designers benefit from a develop-once, deploy-anywhere strategy. While our research shows that mobile users still prefer native applications over HTML5, the pendulum may be swinging. In December 2012 the W3C, the body that oversees the HTML5 standard, agreed on candidate recommendations, which means that important companies such as Apple, Google and Microsoft have accepted standards to be implemented by the larger development community. This will help HTML5 vendors including IBI. IBI's mobile strategy provides robust dashboard and portal access, a high priority for 36% of users; however, IBI should work to improve prescriptive analytics and operational capabilities that drive proactive alerts and notifications, the top capabilities cited by 42% of mobile BI users.

IBI’s cloud initiative is in the form of platform as a service (PaaS). As opposed to infrastructure as a service or software as a service, PaaS provides both infrastructure and a development environment for BI applications. IBI’s product encompasses service level agreements for testing, validation and production environments with performance tuning, database provisioning and network management. The company has 10 international data centers, which helps to overcome regulatory challenges associated with international data movement. IBI does not have a “pay as you go” usage model but treats it more as a professional service based on assessment. This matches the company’s intended brand image as a service-oriented provider. In the bigger picture of cloud computing, BI is a laggard with only a few percent of participants in our research actually having adopted cloud-based BI. Security and data movement are the biggest perceived obstacles among those organizations.

In the area of predictive analytics, IBI has embedded RStat, which uses the open source R statistical language and can be accessed within Developer Studio or as part of its WebFocus BI product. While the customers I spoke with are still building their models outside the IBI system, they suggested that the models will be translated back into RStat and scored within the IBI system. Predictive analytics is a challenge for many business intelligence vendors, which until now have dealt in historical data and simple descriptive statistics. Traditional relational databases are able to provide basic descriptive functions such as min, max, sum and mean, but more advanced functions have been beyond their scope. Our benchmark research on predictive analytics shows that the difficulty of integrating predictive analytics into a current information architecture is the biggest obstacle to predictive analytics for more than half (55%) of organizations.

In a broader analytics discussion with product leaders Kevin Quinn and Rado Katorov, an interesting analytic concept that bears on data discovery came up. Simpson's Paradox occurs when a trend that appears within each of several groups disappears, or even reverses, when the groups are combined. For instance, in 1973, the University of California, Berkeley was sued for discrimination against women because 44 percent of male applicants were admitted but only 35 percent of female applicants. While the difference is indeed significant, when the data is examined on a departmental basis, a different causal variable emerges: men tended to apply to the easier programs and women to the more difficult ones. Thus it was concluded that the disparity was due not to discrimination but to the pattern of applications; if anything, the men who applied that year simply favored the less competitive programs. The point for analytics is that many discovery tools on the market today rely on people to make discoveries based on single groupings of variables, and such discoveries may be misleading or worse. IBI's approach to this issue is to use data reduction techniques such as cluster analysis that allow the data to group itself in an a priori manner, making it easier for the analyst to recognize important patterns among groups of variables rather than single variables. In the Berkeley admissions example, IBI's system presumably would have linked the difficulty of the program with gender, an insight that could perhaps have prevented the lawsuit from being filed.
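The paradox is easy to reproduce with a few lines of Python. The admission counts below are illustrative, not the actual 1973 Berkeley figures, but they produce the same reversal:

```python
# Minimal sketch of Simpson's Paradox using hypothetical admissions counts
# (illustrative numbers, not the actual 1973 Berkeley data).
from collections import namedtuple

Dept = namedtuple("Dept", "name men_applied men_admitted women_applied women_admitted")

depts = [
    Dept("A (easier)", 800, 500, 100, 80),  # women admitted at 80% vs men at 62.5%
    Dept("B (harder)", 200, 60, 600, 200),  # women admitted at ~33% vs men at 30%
]

def rate(admitted, applied):
    return admitted / applied

# Within every department, women have the higher admission rate ...
for d in depts:
    assert rate(d.women_admitted, d.women_applied) > rate(d.men_admitted, d.men_applied)

# ... yet in aggregate the trend reverses, because women applied mostly to the
# harder department, which has the lower overall admission rate.
men_overall = rate(sum(d.men_admitted for d in depts), sum(d.men_applied for d in depts))
women_overall = rate(sum(d.women_admitted for d in depts), sum(d.women_applied for d in depts))
print(f"men {men_overall:.0%} vs women {women_overall:.0%}")  # men 56% vs women 40%
```

A discovery tool that only surfaced the aggregate comparison would report the reversed trend; one that grouped by department first, as the clustering approach described above aims to do, would surface the real pattern.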

In sum, IBI has a strong base in large and midsize companies due to its posture as more than a BI company. Our recent recognition of Scott Franzel at OFS Brands with the 2013 Overall IT Leadership Award for his use of Information Builders is another example of its business intelligence software helping organizations and individuals succeed. IBI's success in extending BI to a broader base of stakeholders in both B2B and B2C markets keeps the company current with industry trends and is at the center of its big data and analytics initiatives. Companies that have already deployed WebFocus should look at the extended capabilities of version 8 and in particular the opportunity to brand information as a service throughout the organization. On a broader basis, any business group or IT department that is trying to take a customer-driven approach to business intelligence should consider IBI.

Regards,

Tony Cosentino

VP and Research Director

Microsoft has been steadily pouring money into big data and business intelligence. The company of course owns the most widely used analytical tool in the world, Microsoft Excel, which our benchmark research into Spreadsheets in the Enterprise shows is not going away soon. User resistance (cited by 56% of participants) and lack of a business case (50%) are the most common reasons that spreadsheets are not being replaced in the enterprise. The challenge is ensuring that spreadsheets are not just used personally but are connected to and secured within the enterprise to address consistency and a range of potential errors. These issues all add up to more work and maintenance, as my colleague has pointed out recently.

Along with Microsoft SQL Server and SharePoint, Excel is at the heart of the company's BI strategy. In particular, PowerPivot, originally introduced as an add-on for Excel 2010 and built into Excel 2013, is a discovery tool that enables exploratory analytics and data mashups. PowerPivot uses an in-memory, column-store approach similar to other tools in the market. Its ability to access multiple data sources, including third-party and government data through Microsoft's Azure Marketplace, enables a robust analytical experience.

Ultimately, information sources are more important than the tool sets used on them. With the Azure Marketplace and access to other new data sources such as Hadoop through its partnership with Hortonworks, as my colleague assessed, Microsoft is advancing in the big data space. Microsoft has partnered with Hortonworks to bring Hadoop data into the fold through HDInsight, which enables familiar Excel environments to access HDFS via HCatalog. This approach is similar to access methods used by other companies, including Teradata, which I wrote about last week. Microsoft stresses the 100 percent open source nature of the Hortonworks approach as a standard alternative to the multiple, more proprietary Hadoop distributions appearing throughout the industry. An important benefit for enterprises with Microsoft deployments is that Microsoft Active Directory adds security to HDInsight.

As my colleague Mark Smith recently pointed out about data discovery methods, the analytic discovery category is broad and includes visualization approaches. On the visualization side, Microsoft markets PowerView, also part of Excel 2013, which provides visual analytics and navigation on top of Microsoft's BI semantic model. Users also can annotate and highlight content and then embed it directly into PowerPoint presentations. This direct export feature is valuable because PowerPoint is still a critical communication vehicle in many organizations. Another visual tool, currently in preview, is the Excel add-in GeoFlow, which uses Bing Maps to render visually impressive temporal and geographic data in three dimensions. Such a 3-D visualization technique could be useful in many industries. Our research into next-generation business intelligence found that deploying geographic maps (47%) and visualizing metrics on them (41%) are becoming increasingly important, but Microsoft will need to further exploit location-based analytics and the need for interactivity.

Microsoft has a core advantage in being able to link its front-office tools such as Excel with its back-end systems such as SQL Server 2012 and SharePoint. In particular, by leveraging a common semantic model through Microsoft Analysis Services, in what Microsoft calls its Business Intelligence Semantic Model, users can set up a dynamic exploratory environment through Excel. Once users or analysts have developed a BI work product, such as a report, they can publish it directly or through SharePoint. This integration enables business users to share data models and solutions and manage them in common, which applies to security controls as well as giving visibility into usage statistics to see when particular applications are gaining traction with organizational users.

Usability, which our benchmark research into next-generation business intelligence identifies as the number-one evaluation criterion in nearly two-thirds (64%) of organizations, is still a challenge for Microsoft. Excel power users will appreciate the solid capabilities of PowerPivot, but more casual users of Excel – the majority of business people – do not understand how to build pivot tables or formulas. Our research shows that only 11 percent of Excel users are power users and that most rate their skills as simply adequate (49%) rather than above average or excellent. While PowerView does add some capability, a number of other vendors of visual discovery products like Tableau have focused on user experience from the ground up, so it is clear that Microsoft needs to address this shortcoming in its design environment.

When we consider more advanced analytic strategies and inclusion of advanced algorithms, Microsoft’s direction is not clear. Its Data Analysis eXpressions (DAX) can help create custom measures and calculated fields, but it is a scripting language akin to MDX. This is useful for IT professionals who are familiar with such tools, but here also business-oriented users will be challenged in using it effectively.

A wild card in Microsoft's BI and analytics strategy is mobile technology. Currently, Microsoft is pursuing a build-once, deploy-anywhere model based on HTML5, and it is a key member of the Worldwide Web Consortium (W3C) that is defining the standard. The HTML5 standard, which has just passed a big hurdle in reaching candidate recommendation, is beginning to show value in the design of new applications that can be accessed through web browsers on smartphones and tablets. The HTML5 approach could be challenging, as our technology innovation research into mobile technology finds that more organizations (39%) prefer native mobile applications from vendors' device-specific application stores than prefer a web-browser-based method (33%), with a fifth having no preference. However, the success or failure of its Windows 8-based Surface tablet will be the real barometer of Microsoft's mobile BI success, since integration with the Office franchise is a key differentiator. Early adoption of the tablet has not been strong, but Microsoft is said to be doubling down with a new version to be announced shortly. Success would put Office into the hands of the mobile workforce on a widespread basis via Microsoft devices, which could have far-reaching impacts for the mobile BI market.

As it stands now, however, Microsoft faces an uphill battle in establishing its mobile platform in a market dominated by Android and Apple iOS devices like the iPhone and iPad. If the Surface ultimately fails, Microsoft will likely have to open up Office to run on Android and iOS or risk losing its dominant position. My colleague is quite pessimistic about Microsoft's overall mobile technology efforts and its ability to overcome the reality of the existing market. Our technology innovation research into mobile technology finds that over half of organizations have a preference for their smartphone and tablet technology platform: the top-ranked smartphone platforms are Apple (50%), Android (27%) and RIM (17%), with Microsoft a distant fourth (5%); for tablets they are Apple (66%), Android (19%) and then Microsoft (8%). Based on these findings, Microsoft faces challenges both on the platform front and in adapting its technology to support the platforms that businesses prefer today.

Ultimately, Microsoft is trying to pull together different initiatives across multiple internal business units that are known for being very siloed and not well organized for customers. It has relied on its channel partners and customers to figure out not just how to make its products work together but also what is possible, since they are not always given clear guidance from Redmond. Recent efforts show that Microsoft is coming together to address the big data and business analytics challenge and the massive opportunity it represents. One area in which this is happening is Microsoft's cloud initiatives. Last year's announcements of Azure virtual machines enable an infrastructure-as-a-service (IaaS) play for Microsoft and position Windows Azure SQL Database as a service. This could make the back-end systems I've discussed available through a cloud-based offer, but currently this is only offered through the client version of the software.

For organizations that have already installed Microsoft as their primary BI platform and are looking for tight integration with an Excel-based discovery environment, the decision to move forward is relatively simple. The trade-off is that this package is still a bit IT-centric and may not attract as many of the larger body of business users as a more user-friendly discovery product might, nor address the failings of business intelligence as directly. Furthermore, since Microsoft is not as engaged in direct support and service as other players in this market, it will need to move its traditionally technology-focused channel to help customers become more business savvy. For marketing and other business departments, especially in high-velocity industries where usability and time-to-value are at a premium and back-end integration is secondary, other tools will be worth a look. Microsoft has great potential, and with analytics the top-ranked technology innovation priority among its customers, I hope that the many divisions inside the global software giant can finally come together to deliver a comprehensive approach.

Regards,

Tony Cosentino

VP and Research Director

Responding to the trend that businesses now ask less sophisticated users to perform analysis and rely on software to help them, Oracle recently announced a new release of its flagship Oracle BI Foundation Suite (OBIFS 11.1.1.7) as well as updates to Endeca, the discovery platform that Oracle bought in 2011. Endeca is part of a new class of tools that bring new capabilities in information discovery, self-service access and interactivity. Such approaches represent an important part of the evolution of business intelligence to business analytics, as I have noted in my agenda for 2013.

Oracle Business Intelligence Foundation Suite includes many components, among them Oracle Business Intelligence Enterprise Edition (OBIEE), Oracle Essbase and a scorecard and strategy application. OBIEE is the enabling foundation that federates queries across data sources and enables reporting across multiple platforms. Oracle Essbase is an in-memory OLAP tool that enables forecasting and planning, including what-if scenarios embedded in a range of Oracle BI Applications, which are sold separately. The suite, along with the Endeca software, is integrated with Exalytics, Oracle's appliance for BI and analytics. Oracle's appliance strategy, which I wrote about after Oracle World last year, invests heavily in the Sun Microsystems hardware acquired in 2010.

These updates are far-ranging and numerous (including more than 200 changes to the software). I’d like to point out some important pieces that advance Oracle’s position in the BI market. A visualization recommendations engine offers guidance on the type of visualization that may be appropriate for a user’s particular data. This feature, already sold by others in the market, may be considered a subset of the broader capability of guided analysis. Advanced visualization techniques have become more important for companies as they make it easier for users to understand data and is critical to compete with the likes of  Tableau, a player in this space which I wrote about last year.

Another user-focused update related to visualization is performance tiles, which enable important KPIs to be displayed prominently within the context of the screen surface area. Performance tiles are a great way to start improving the static dashboards that my colleague Mark Smith has critiqued. From what I have seen it is unclear to what degree the business user can define and change Oracle’s performance tile KPIs (for example, the red-flagged metrics assignedvr_bigdata_big_data_capabilities_not_available to the particular business user that appear within the scorecard function of the software) and how much the system can provide in a prescriptive analytic fashion. Other visualizations that have been added include waterfall charts, which enable dependency analysis; these are especially helpful for pricing analysis by showing users how changes in one dimension impact pricing on the whole. Another is MapViews for manipulation and design to support location analytics that our next generation BI research finds the capability to deploy geographic maps are most important to BI in 47 percent of organizations, and then visualize metrics associated with locations in 41 percent of organizations. Stack charts now provide auto-weighting for 100-percent sum analysis that can be helpful for analytics such as attribution models. Breadcrumbs empower users to understand and drill back through their navigation process, which helps them understand how a person came to a particular analytical conclusion. Finally Trellis View actions provides contextual functionality to help turn data into action in an operational environment. 
The advancements of these visualizations are critical for Oracle big data efforts as visualization is a top three big data capability not available in 37 percent of organizations according to our big data research and our latest technology innovation research on business analytics found presenting data visually as the second most important capability for organizations according to 48 percent of organizations.

vr_ngbi_br_collaboration_tool_access_preferencesThe update to Oracle Smart View for Office also puts more capability in the hands of users. It natively integrates Excel and other Microsoft Office applications with operational BI dashboards so users can perform analysis and prepare ad-hoc reports directly within these desktop environments. This is an important advance for Oracle since our benchmark research in the use of spreadsheets across the enterprise found that the combination of BI and spreadsheets happens all the time or frequently in 74 percent of organization. Additionally the importance of collaborating with business intelligence is essential and having tighter integration is a critical use case as found in our next generation business intelligence research that found using Microsoft Office for collaboration with business intelligence is important to 36 percent of organizations.

Oracle efforts to evolve its social collaboration efforts through what they call Oracle Social Network have advanced significantly but do not appear to be in the short term plan to integrate and make available through its business intelligence offering. Our research finds more than two-thirds (67%) rank this as important and then embedding it within BI is a top need in 38 percent of organizations. Much of what Oracle already provides could be easily integrated and meet business demand for a range of people-based interactions that most are still struggling to manage through e-mail.

Oracle has extended its existing capabilities in its OBIEE with Hadoop integration via a HIVE connector that allows Oracle to pull data into OBIEE from big data sources, while an MDX search function enabled by integration with the Endeca discovery tool allows OBIEE to do full text search and data discovery. Connections to new data sources are critically important in today’s environment; our research shows that retaining and analyzing more data is the number-one ranked use for big data in 29 percent of organizations according to our technology innovation research. Federated data discovery is particularly important as most companies are often unaware of their information assets and therefore unknowingly limit their analysis.

Beyond the core BI product, Oracle made significant advances with Endeca 3.0. Users can now analyze Excel files. This is an existing capability for other vendors, so it was important for Oracle to gain parity here. Beyond that, Endeca now comes with a native JavaScript Object Notation (JSON) reader and support for authorization standards. This furthers its ability to do contextual analysis and sentiment analysis on data in text and social media. Endeca also now can pull data from the Oracle BI server to marry with the analysis. Overall the new version of Endeca enables new business-driven information discovery that is essential to relieve the stress on analysts and IT to create and publish information and insights to business.

Oracle’s continued investments into BI applications that supply prebuilt analytics and these packaged analytics applications span from the front office (sales and marketing), to operations (procurement and supply chain) to the back office (finance and HR). Given the enterprise-wide support, Oracle’s BI can perform cross-functional analytics and deliver fast time to value since users do not have to spend time building the dashboards. Through interoperation with the company’s enterprise applications, customers can execute action directly into applications such as PeopleSoft, JD Edwards or Oracle Business Suite. Oracle has begun to leverage more of its score-carding function that enables KPI relationships to be mapped and information aggregated and trended. Scorecards are important for analytic cultures because they are a common communication platform for executive decision-makers and allow ownership assignment of metrics.

I was surprised to not find much advancement in Oracle business intelligence efforts that operate on smartphones and tablets. Our research finds mobile business intelligence is important to 69 percent of organizations and that 78 percent of organizations reveal that no or some BI capabilities are available in their current deployment of BI. For those that are using mobile business intelligence, only 28 percent are satisfied. For years, IT has not placed a priority on mobile support of BI while business has been clamoring for it and now more readily leading the efforts with 52 percent planning new or expanded deployments on tablets and 32 percent on smartphones. In this highly competitive market to capture more opportunity, Oracle will need to significantly advance its efforts and make its capabilities freely available without passwords as other BI providers have already done. It also will need to recognize that business is more interested in alerts and events through notifications to mobile technology than trying to make the entire suite of BI capabilities replicated on these technologies.

Oracle has foundational positions in enterprise applications and database technology and has used these positions to drive significant vr_ngbi_br_importance_of_bi_technology_considerationssuccess in BI. The company’s proprietary “walled garden” approach worked well for years, but now technology changes, including movements toward open source and cloud computing, threaten that entrenched position. Surprisingly, the company has moved slowly off of its traditional messaging stance targeted at the CIO, IT and the data center. That position seems to focus the company too much on the technology-driven 3 V’s of big data and analytics, and not enough on the business driven 3 W’s that I advocate. As the industry moves into the age of analytics, where information is looked upon as a critical commodity and usability is the key to adoption (our research finds usability to be the top evaluation consideration in 63 percent of organizations), CIOs will need to further move beyond its IT approach for BI as I have noted and get more engaged into the requirements of business. Oracle’s business intelligence strategy and how it addresses these business outcomes and the use across all business users is key to the company’s future and organizations should examine these critical advancements to its BI offering very closely to determine if you can improve the value of information and big data in an organization.

Regards,

Tony Cosentino

VP and Research Director

RSS Tony Cosentino’s Analyst Perspectives at Ventana Research

  • An error has occurred; the feed is probably down. Try again later.

Tony Cosentino – Twitter

Error: Twitter did not respond. Please wait a few minutes and refresh this page.

Stats

  • 73,277 hits
%d bloggers like this: