It’s widely agreed that cloud computing is a major technology innovation. Many companies use cloud-based systems for specific business functions such as customer service, sales, marketing, finance and human resources. Analytics and business intelligence (BI), however, have not migrated to the cloud as quickly. But now cloud-based data and analytics products are becoming more common. This trend is most popular among technology companies, small and midsize businesses, and departments in larger ones, but there are examples of large companies moving their entire BI environments to the cloud. Our research into big data analytics shows that more than one-fourth of analytics initiatives for companies of all sizes are cloud-based.

Like other cloud-based applications, cloud analytics offers enhanced scalability and flexibility, affordability and IT staff optimization. Our research shows that in general the top benefits are lowered costs (for 40%), improved efficiency (39%) and better communication and knowledge sharing (34%). Using the cloud, organizations can take advantage of a sophisticated IT infrastructure without having to dedicate staff to install and support it. There is no need for comprehensive development and testing because the provider is responsible for maintaining and upgrading the application and the infrastructure. The cloud can also provide flexible infrastructure resources to support “sandbox” testing environments for advanced analytics deployments. Multitenant cloud deployments are more affordable because costs are shared across many companies. When used departmentally, application costs need not be capitalized but instead can be treated as operational expenditures. Capabilities can be put to use quickly, as vendors develop them, and updates need not disrupt use. Finally, some cloud-based interfaces are more intuitive for end users since they have been designed with the user experience in mind. Regarding cloud technology, our business technology innovation research finds that usability is the most important technology evaluation criterion (for 64% of participants), followed by reliability (54%) and capability.

For analytics and BI specifically, there are still issues holding back adoption. Our research finds that the primary reasons companies do not deploy cloud-based applications of any sort are security and compliance issues. For analytics and business intelligence, we can add data-related activities as another reason, since cloud-based approaches often require data integration and transmission of sensitive data across an external network, along with a range of data preparation tasks. Such issues are especially prevalent for companies that have legacy BI tools using data models that have been distributed across their divisions. Often these organizations have defined their business logic and metrics calculations within the context of these tools. Furthermore, these tools may be integrated with other core applications such as forecasting and planning. To re-architect such data models and metrics calculations is a challenge some companies are reluctant to undertake.

In addition, despite widespread use of some types of cloud-based systems, discussions of business intelligence in the cloud can be confusing for nontechnical businesspeople, especially when they involve information integration, the types of analytics to be performed and where the analytic processes will run. The first generation of cloud applications focused on end-user processes related to the various lines of business and largely ignored the complexities inherent in information integration and analytics. Organizations can no longer ignore these complexities, since doing so exacerbates the challenge of fragmented systems and distributed data. Buyers and architects should understand the benefits of analytics in the cloud and weigh those benefits against the challenges described above.

Our upcoming benchmark research into data and analytics in the cloud will examine the current maturity of this market as well as opportunities and barriers to organizational adoption across lines of business and IT. It will evaluate cloud-based analytics in the context of trends such as big data, mobile technology and social collaboration as well as location intelligence and predictive analytics. It will consider how cloud computing enables these and other applications and identify leading indicators for adoption of cloud-based analytics. It also will examine how cloud deployment enables large-scale and streaming applications. For example, it will examine real-time processing of vast amounts of data from sensors and other semistructured sources (often referred to as the Internet of Things).

It is an exciting time to be studying this particular market as companies consider moving their platforms to the cloud. I look forward to receiving qualified feedback as we begin this important benchmark research. Please get in touch if you have an interest in this area of our research.

Regards,

Tony Cosentino

VP and Research Director

Our benchmark research consistently shows that business analytics is the most significant technology trend in business today and that acquiring effective predictive analytics is organizations’ top priority for analytics. It enables them to look forward rather than backward and, participating organizations reported, leads to competitive advantage and operational efficiencies.

In our benchmark research on big data analytics, for example, 64 percent of organizations ranked predictive analytics as the most important analytics category for working with big data. Yet a majority indicated that they do not have enough experience in applying predictive analytics to business problems and lack training on the tools themselves.

Predictive analytics improves an organization’s ability to understand potential future outcomes of variables that matter. Its results enable an organization to decide correct courses of action in key areas of the business. Predictive analytics can enhance the people, process, information and technology components of an organization’s future performance.

In our most recent research on this topic, more than half (58%) of participants indicated that predictive analytics is very important to their organization, but only one in five said they are very satisfied with their use of those analytics. Furthermore, our research found that implementing predictive analytics would have a transformational impact in one-third of organizations and a significant positive impact in more than half of the rest.

In our new research project, The Next Generation of Predictive Analytics, we will revisit predictive analytics with an eye to determining how attitudes toward it have changed, along with its current and planned use and its importance in business. There are significant changes in this area, including where, how, why and when predictive analytics are applied. We expect to find changes not only in forecasting and analyzing customer churn but also in operational use at the front lines of the organization and in improving the analytic process itself. The research will also look at the progress of emerging statistical languages such as R and Python, which I have written about.

As does big data analytics, predictive analytics involves sourcing data, creating models, deploying them and managing them to understand when an analytic model has become stale and ought to be revised or replaced. It should be obvious that only the most technically advanced users will be familiar with all this, so to achieve broad adoption, predictive analytics products must mask the complexity and be easy to use. Our research will determine the extent to which usability and manageability are being built into product offerings.
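
To make that lifecycle concrete, here is a minimal sketch in Python, assuming scikit-learn is available; the data, feature construction and staleness threshold are hypothetical stand-ins rather than a definitive implementation:

```python
# A minimal sketch of the model lifecycle described above: source data,
# create a model, score with it, and monitor it for staleness.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))          # stand-in for sourced feature data
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)   # "create the model"

baseline_auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

def is_stale(model, X_new, y_new, baseline, tolerance=0.05):
    """Flag the model for retraining when live accuracy drifts below baseline."""
    live_auc = roc_auc_score(y_new, model.predict_proba(X_new)[:, 1])
    return live_auc < baseline - tolerance
```

The essential point is the last step: a deployed model needs an explicit check that signals when it has gone stale and should be revised or replaced.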

The promise of predictive analytics, including competitive advantage (68%), new revenue opportunities (55%) and increased profitability (52%), is significant. But to realize these advantages, companies must transform how they work. In terms of people and processes, a more collaborative strategy may be necessary. Analysts need tools and skills in order to use predictive analytics effectively. A new generation of technology is also becoming available in which predictive analytics is easier to apply, use and deploy into line-of-business processes. This will help organizations significantly, as there are not enough data scientists and specially trained predictive analytics professionals available, or affordable, for every organization to hire.

This benchmark research will look closely at the evolving use of predictive analytics to establish how it equips business to make decisions based on likely futures, not just the past.

Regards,

Tony Cosentino

VP & Research Director

Organizations should consider multiple aspects of deploying big data analytics. These include the type of analytics to be deployed, how the analytics will be deployed technologically and who must be involved both internally and externally to enable success. Our recent big data analytics benchmark research assesses each of these areas. How an organization views these deployment considerations may depend on the expected benefits of the big data analytics program and the particular business case to be made, which I discussed recently.

According to the research, the most important capability of big data analytics is predictive analytics (64%), but among companies that have deployed big data analytics, descriptive analytic approaches of query and reporting (74%) and data discovery (64%) are more readily available than predictive capabilities (57%). Such statistics may be a function of big data technologies such as Hadoop and their associated distributions having prioritized the ability to run descriptive statistics through standard SQL, which is the most common method for implementing analysis on Hadoop. Cloudera’s Impala, Hortonworks’ Stinger (an extension of Apache Hive), MapR’s Drill, IBM’s Big SQL, Pivotal’s HAWQ and Facebook’s open-source contribution of Presto SQL all focus on accessing data through an SQL paradigm. It is not surprising, then, that the technology research participants use most for big data analytics is business intelligence (75%) and that the most-used analytic methods — pivot tables (46%), classification (39%) and clustering (37%) — are descriptive and exploratory in nature. Similarly, participants said that visualization of big data allows analysts to perform faster analysis (49%), understand context better (48%), perform root-cause analysis (40%) and display multiple result sets (40%), but visualization does not provide more advanced analytic capabilities. While various vendors now offer approaches to run advanced analytics on big data, the research shows that organizational capabilities still revolve around more basic analytic access.
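
As an illustration of this SQL-centered pattern, here is a minimal sketch assuming the open-source PyHive client; the connection details and the web_events table are hypothetical:

```python
# A minimal sketch of SQL on Hadoop: a descriptive, query-and-report style
# aggregation pushed to Hive through standard SQL -- the kind of analysis
# the research finds most readily available on big data platforms.
from pyhive import hive

conn = hive.Connection(host="hadoop-gateway", port=10000, username="analyst")
cursor = conn.cursor()

cursor.execute("""
    SELECT event_type,
           COUNT(*) AS events,
           COUNT(DISTINCT user_id) AS users
    FROM web_events
    WHERE event_date >= '2014-01-01'
    GROUP BY event_type
    ORDER BY events DESC
""")
for event_type, events, users in cursor.fetchall():
    print(event_type, events, users)
```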

For companies that are implementing advanced analytic capabilities on big data, there are further analytic process considerations, and many have not yet tackled them. Model building and model deployment should be manageable and timely, involve specialized personnel, and integrate into the broader enterprise architecture. Our research provides an in-depth look at adoption of the different types of in-database analytics, deployment of advanced analytic sandboxes, data mining, model management, integration with business processes and overall model deployment, but that discussion is beyond the scope of this post.

Beyond analytic considerations, a host of technological decisions must be made around big data analytics initiatives. One of these is the degree of customization necessary. As technology advances, customization is giving way to more packaged approaches to big data analytics. According to our research, the majority (54%) of companies that have already implemented big data analytics did custom builds using big data-specific languages and interfaces. Most of those that have not yet deployed are likely to purchase a dedicated or packaged application (44%), followed by a custom build (36%). We think this pre- and post-deployment comparison reflects a maturing market.

The move from custom approaches to standardized ones has important implications for the skill sets needed for a big data analytics initiative. In comparing the skills that organizations said they currently have to the skills they need to be successful with big data analytics, it is clear that companies should spend more time building employees’ statistical, mathematical and visualization skills. On the flip side, organizations should make sure their tools can support skill sets that they already have, such as use of spreadsheets and SQL. This is consistent with other findings about training needs, which include applying analytics to business problems (54%), training on big data analytics tools (53%), analytic concepts and techniques (46%) and visualizing big data (41%). The data shows that as approaches become more standardized and the market focus shifts toward them from customized implementations, skill needs are shifting as well. This is not to say that demand is moving away from the data scientist completely. According to our research, organizations that involve cross-functional teams or data scientists in the deployment process are realizing the most significant impact. It is clear that multiple approaches for personnel, departments and current vendors play a role in deployments and that some approaches will be more effective than others.

Cloud computing is another key consideration with respect to deploying analytics systems as well as sandbox modeling and testing environments. For deployment of big data analytics, 27 percent of companies currently use a cloud-based method, while 58 percent said they do not and 16 percent do not know what is used. Not surprisingly, far fewer IT professionals (19%) than business users (40%) said they use cloud-based deployments for big data analytics. The flexibility and capability that cloud resources provide are particularly attractive for sandbox environments and for organizations that lack big data analytic expertise. However, for big data model building, the largest percentage of organizations (42%) still utilize a dedicated internal sandbox environment, while fewer (19%) use a non-dedicated internal sandbox (that is, a container in a data warehouse used to build models) and others use a cloud-based sandbox either as a completely separate physical environment (9%) or as a hybrid approach (9%). From the gap between IT and business responses we infer that business users are sometimes using cloud-based systems to do big data analytics without the knowledge of IT staff. Among organizations that are not using cloud-based systems for big data analytics, security (45%) is the primary reason.

Perhaps the most important consideration for big data analytics is choosing vendors to partner with to achieve organizational objectives. Given the move from custom technological approaches to more packaged ones and the types of analytics currently being implemented for big data, it is not surprising that a majority of research participants (52%) are looking to their business intelligence systems providers to supply their big data analytics solution. However, a significant number of companies said they will turn to a specialist analytics provider (35%) or their database provider (34%). When evaluating big data analytics, usability is the most important vendor consideration but not by as wide a margin as in categories such as business intelligence. A look at criteria rated important and very important by research participants reveals usability is the highest ranked (94%), but functionality (92%) and reliability (90%) follow closely. Among innovative new technologies, collaboration is important (78%) while mobile access (46%) is much less so. Coupled with the finding that communication and knowledge sharing is an important benefit of big data analytics, it is clear that organizations are cognizant of the collaborative imperative when choosing a big data analytics product.

Deployment of big data analytics starts with forethought and a well-defined business case that includes the expected benefits I discussed in my previous analysis. Once the outcome-driven framework is established, organizations should consider the types of analytics needed, the enabling technologies and the people and processes necessary for implementation. To learn more about our big data analytics research, download a copy of the executive summary here.

Regards,

Tony Cosentino

VP & Research Director

We recently released our benchmark research on big data analytics, and it sheds light on many of the most important discussions occurring in business technology today. The study’s structure was based on the big data analytics framework that I laid out last year as well as the framework that my colleague Mark Smith put forth on the four types of discovery technology available. These frameworks view big data and analytics as part of a major change that includes a movement from designed data to organic data, the bringing together of analytics and data in a single system, and a corresponding move away from the technology-oriented three Vs of big data to the business-oriented three Ws of data. Our big data analytics research confirms these trends but also reveals some important subtleties and new findings with respect to this important emerging market. I want to share three of the most interesting and even surprising results and their implications for the big data analytics market.

First, we note that communication and knowledge sharing is a primary benefit of big data analytics initiatives, but it is a latent one. Among organizations planning to deploy big data analytics, the benefits most often anticipated are faster response to opportunities and threats (57%), improving efficiency (57%), improving the customer experience (48%) and gaining competitive advantage (43%). However, once a big data analytics system has moved into production, the benefits most often mentioned as achieved are better communication and knowledge sharing (51%), gaining competitive advantage (51%), improved efficiency in business processes (49%) and improved customer experience and satisfaction (46%). (The chart shows rankings of first choices as most important.) Although the last three of these benefits are predictable, it’s noteworthy that the benefit of communication and knowledge sharing, while not a priority before deployment, becomes one of the two most often cited later.

As for the implications, in our view, one reason why communication and knowledge sharing are more often seen as a key benefit after deployment rather than before is that agreement on big data analytics terminology is often lacking within organizations. Participants from fewer than half (44%) of organizations said that the people making business technology decisions mostly agree or completely agree on the meaning of big data analytics, while the same number said there are many different opinions about its meaning. To address this particular challenge, companies should pay more attention to setting up internal communication structures prior to the launch of a big data analytics project, and we expect collaborative technologies to play a larger role in these initiatives going forward.

A second finding of our research is that integration of distributed data is the most important enabler of big data analytics. Asked the meaning of big data analytics in terms of capabilities, the largest percentage (76%) of participants said it involves analyzing data from all sources rather than just one, while for 55 percent it means analyzing all of the data rather than just a sample of it. (We allowed multiple responses.) More than half (56%) told us they view big data as finding patterns in large and diverse data sets in Hadoop, which indicates the continuing influence of this original big data technology. A second tier of percentages emphasizes timeliness as an aspect of big data: doing real-time processing on streams of data (44%), visualizing large structured data sets in seconds (40%) and doing real-time scoring against a database record (36%).

The implications here are that the primary characteristic of big data analytics technology is the ability to analyze data from many data sources. This shows that companies today are focused on bringing together multiple information sources and secondarily being able to process all data rather than just a sample, as well as being able to do machine learning on especially large data sets. Fast processing and the ability to analyze streams of data are relegated to third position in these priorities. That suggests that the so-called three Vs of big data are confusing the discussion by prioritizing volume, velocity and variety all at once. For companies engaged in big data analytics today, sourcing and integration of various data sources in an expedient manner is the top priority, followed by the ideas of size and then speed of arrival of data.

Third, we found that usage is not relegated to particular industries, certain types of companies or certain functional areas. Among the 25 uses for big data analytics that participants said they are personally involved with, three of the four most often mentioned involve customers and sales: enabling cross-selling and up-selling (38%), understanding the customer better (32%) and optimizing pricing (28%). Meanwhile, optimizing IT operations ranked fifth (24%), though it was most often chosen by those in IT roles (76%). What is particularly fascinating, however, is that 17 of the 25 use cases were named by more than 10 percent of participants, which indicates that big data analytics has many uses.

The primary implication of this finding is that big data analytics is not following the famous technology adoption curves outlined in books such as Geoffrey Moore’s seminal work, “Crossing the Chasm.” That is, companies are not following a narrowly defined path that solves only one particular problem. Instead, they are creatively deploying technological innovations en route to a diverse set of outcomes. And this is occurring across organizational functions and industries, including conservative ones, which conflicts with conventional wisdom. For this reason, companies are more often looking across industries and functional disciplines as part of their due diligence on big data analytics to come up with unique applications that may yield competitive advantage or organizational efficiencies.

In summary, it has been difficult for companies to define what big data analytics actually means and how to prioritize their investments accordingly. Research such as ours can help organizations address this issue. While the discussion above outlines a few of the interesting findings of this research, it also yields many more insights, related to aspects as diverse as big data in the cloud, sandbox environments, embedded predictive analytics, the most important data sources in use, and the challenges of choosing an architecture and deploying big data analytic products. For a copy of the executive summary, download it directly from the Ventana Research community.

Regards,

Ventana Research

Ventana Research recently completed the most comprehensive evaluation of mobile business intelligence products and vendors available anywhere today. The evaluation assessed 16 technology vendors’ offerings on smartphones and tablets across the Apple, Google Android, Microsoft Surface and RIM BlackBerry platforms in seven key categories: usability, manageability, reliability, capability, adaptability, vendor validation, and TCO and ROI. The result is our Value Index for Mobile Business Intelligence in 2014. The analysis shows that the top supplier is MicroStrategy, which qualifies as a Hot vendor and is followed by 10 other Hot vendors: IBM, SAP, QlikTech, Information Builders, Yellowfin, Tableau Software, Roambi, SAS, Oracle and arcplan.

Our expertise, hands-on experience and the buyer input from our benchmark research on next-generation business intelligence and on information optimization informed our product evaluations in this new Value Index. The research examined business intelligence on mobile technology to determine organizations’ current and planned use and the capabilities required for successful deployment.

What we found was wide interest in mobile business intelligence and a desire to improve the use of information in 40 percent of organizations, though adoption is less pervasive than interest. Fewer than half of organizations currently access BI capabilities on mobile devices, but nearly three-quarters (71%) expect their mobile workforce to be able to access BI capabilities in the next 12 months. The research also shows strong executive support: Nearly half of executives said that mobility is very important to their BI processes.

Ease of access and use are important criteria in this Value Index because the largest percentage of organizations identified usability as an important factor in evaluations of mobile business intelligence applications. This is an emphasis that we find in most of our research, and in this case it also may reflect users’ experience with first-generation business intelligence on mobile devices; not all those applications were optimized for touch-screen interfaces and designed to support gestures. It is clear that today’s mobile workforce requires the ability to access and analyze data simply and in a straightforward manner, using an intuitive interface.

The top five companies’ products in our 2014 Mobile Business Intelligence Value Index all provide strong user experiences and functionality. MicroStrategy stood out across the board, finishing first in five categories and most notably in the areas of user experience, mobile application development and presentation of information. IBM, the second-place finisher, has made significant progress in mobile BI with six releases in the past year, adding support for Android, advanced security features and an extensible visualization library. SAP’s steady support for mobile access to the SAP BusinessObjects platform and to SAP Lumira, along with its integrated mobile device management software, helped produce high scores in various categories and put it in third place. QlikTech’s flexible offline deployment capabilities for the iPad and its high ranking in the assurance-related category of TCO and ROI secured it the fourth spot. Information Builders’ latest release of WebFOCUS, which renders content directly in HTML5 through its Active Technologies and Mobile Faves, delivers strong mobile capabilities and rounds out the top five. Other noteworthy innovations in mobile BI include Yellowfin’s collaboration technology and Roambi’s use of storyboarding in its Flow application.

Although there is some commonality in how vendors provide mobile access to data, there are many differences among their offerings that can make one a better fit than another for an organization’s particular needs. For example, companies that want their mobile workforce to be able to engage in root-cause discovery analysis may prefer tools from Tableau and QlikTech. For large companies looking for a custom application approach, MicroStrategy or Roambi may be good choices, while others looking for streamlined collaboration on mobile devices may prefer Yellowfin. Many companies may base the decision on mobile business intelligence on which vendor they currently have installed. Customers with large implementations from IBM, SAP or Information Builders will be reassured to find that these companies have made mobility a critical focus.

To learn more about this research and to download a free executive summary, please visit http://www.ventanaresearch.com/bivalueindex/.

Regards,

Tony Cosentino

Vice President and Research Director

As a new generation of business professionals embraces a new generation of technology, the line between people and their tools begins to blur. This shift comes as organizations become flatter and leaner and roles, context and responsibilities become intertwined. These changes have introduced faster and easier ways to bring information to users, in a context that makes it quicker to collaborate, assess and act. Today we see this in the prominent buying patterns for business intelligence and analytics software and an increased focus on the user experience. Almost two-thirds (63%) of participants in our benchmark research on next-generation business intelligence say that usability is the top purchase consideration for business intelligence software. In fact, usability is the driving factor in evaluating and selecting technology across all application and technology areas, according to our benchmark research.

In selecting and using technology, personas (that is, an idealized cohort of users) are particularly important, as they help business and IT assess where software will be used in the organization and define the role, responsibility and competency of users and the context of what they need and why. At the same time, personas help software companies understand the attitudinal, behavioral and demographic profile of target individuals and the specific experience that is not just attractive but essential to those users. For example, the mobile and collaborative intelligence capabilities needed by a field executive logging in from a tablet at a customer meeting are quite different from the analytic capabilities needed by an analyst trying to understand the causes of high customer churn rates and how to change that trend with a targeted marketing campaign.

Understanding this context-driven user experience is the first step toward defining the personas found in today’s range of analytics users. The key is to make the personas simple to understand but comprehensive enough to cover the diversity of needs for business analytic types within the organization. To help organizations be more effective in their analytic process and engagement of their resources and time, we recommend the following five analytical personas: (Note that in my years of segmentation work, I’ve found that the most important aspects are the number of segments and the names of those segments. To this end, I have chosen a simple number, five, and the most intuitive names I could find to represent each persona.)

Information Consumer: This persona is not technologically savvy and may even feel intimidated by technology. Information must be provided in a user-friendly fashion to minimize frustration. These users may rely on one or two tools that they use just well enough to do their jobs, which typically involves consuming information in presentations, reports, dashboards or other forms that are easy to read and interpret. They are oriented more to language than to numbers and in most cases would rather read or listen to information about the business. They can write a pertinent memo or email, make a convincing sales pitch or devise a brilliant strategy. Their typical role within the organization varies, but among this group is the high-ranking executive, including the CEO, for whom information is prepared. In the lines of business, this consumer may be a call center agent, a sales manager or a field service worker. In fact, in many companies, the information consumer makes up the majority of the organization. The information consumer usually can read Excel and PowerPoint documents but rarely works within them. This persona feels empowered by consumer-grade applications such as Google, Yelp and Facebook.

Knowledge Worker: Knowledge workers are business-, technology- and data-savvy and have domain knowledge. They interpret data in functional ways. These workers understand descriptive data but are not likely to take on data integration tasks or interpret advanced statistics (as in a regression analysis). In terms of tools, they can make sense of spreadsheets and, with minimal training, use the output of tools like business intelligence systems, pivot tables and visual discovery tools. They also actively participate in providing feedback and input to planning and business performance software. Typically, these individuals are in over their heads when asked to develop a pivot table or structure multidimensional data. In some instances, however, new discovery tools allow them to move beyond such limitations. The knowledge worker persona includes, but is not limited to, technology-savvy executives, line-of-business managers and directors, domain experts and operations managers. Since these workers focus on decision-making and business outcomes, analytics is an important part of their overall workflow but targeted at specific tasks. For analytical tools, this role may use applications with embedded analytics, analytic discovery and modeling approaches. Visual discovery tools and, in many instances, user-friendly SaaS applications are empowering the knowledge worker to be more analytically driven without IT involvement.

Analyst: Well versed in data, this persona often knows the business intelligence and analytics tools that pertain to the position and applies analytics to analyze various aspects of the business. These users are familiar with applications and systems and know how to retrieve and assemble data from them in many forms. They can also perform a range of data blending and data preparation tasks, and create dashboards, data visualizations and pivot tables with minimal or no training. They can interpret many types of data, including correlation and in some cases regression. The analyst’s role involves modeling and analytics, either within specific analytic software or within software used for business planning and enterprise performance management. More senior analysts focus on more advanced analytics, such as predictive analytics and data mining, to understand current patterns in data and predict future outcomes. These analysts might be called a split persona in terms of where their skills and roles are deployed in the organization. They may reside in IT, but many more are found on the business side, as they are accountable for the outcomes of the analytics. Analysts on the business side may not be expert in SQL or computer programming but may be adept with languages such as R or SAS. Those on the IT side are more familiar with SQL and the building of data models used in databases. With respect to data preparation, the IT organization looks at integration through the lens of ETL and associated tool sets, whereas the business side looks at it from a data-merge perspective and the creation of analytical data sets in places like spreadsheets.

The roles that represent this persona often are explicitly called analysts, with a prefix that in most cases reflects the department they work in, such as finance, marketing, sales or operations, though the prefix could also be corporate, customer, operational or another cross-departmental responsibility. The analytical tools they use almost always include the spreadsheet, as well as complementary business intelligence tools and a range of analytical tools like visual discovery and, in some cases, more advanced predictive analytics and statistical software. Visual discovery and commodity modeling approaches are empowering some analysts to move upstream from a role of data monger to a more interpretive decision-support position. For those already familiar with advanced modeling, today’s big data environments, including new sources of information and modern technology, provide the ability to build much more robust models and solve an entirely new class of business problems.

Publisher: Skilled in data and analytics, the publisher typically knows how to configure and operate business intelligence tools and publish information from them in dashboards or reports. They are typically skilled in the basics of spreadsheets and in publishing information to Microsoft Word or PowerPoint. These users not only can interpret many types of analytics but can also build and validate the data for their organizations. Similar to the analyst, the publisher may be considered a split persona, as these individuals may be in a business unit or IT. The IT-based publisher is more familiar with business intelligence processes and knows the data sources and how to get to data in the data warehouse or even big data sources. They may have basic configuration and scripting skills that enable them to produce outputs in several ways. They may also have basic SQL and relational data modeling skills that help them identify what can be published and determine how data can be combined through the BI tool or databases. The titles related to the publisher may include business intelligence manager, data analyst, or manager or director of data or information management. The common tools used by the publisher include business intelligence authoring tools, various visualization and analytic tools, and office productivity tools like Microsoft Office and Adobe Acrobat.

Data Geek: A data geek, who may be a data analyst or someone as sophisticated as a data scientist, has expert data management skills and an interdisciplinary approach to data that melds the split personas discussed at the analyst and senior analyst levels. The primary difference between the data geek and the analyst is that the latter usually focuses on either the IT side or the business side. A senior analyst with a Ph.D. in computer science understands relational data models and programming languages but may not understand advanced statistical models and statistical programming languages. Similarly, a Ph.D. in statistics understands advanced predictive models and associated tools but may not be prepared to write computer code. The data scientist understands not only advanced statistics and modeling but also enough about computer programming and systems, along with domain knowledge. The titles for this role vary but include chief analytics officer, enterprise data architect, data analyst, head of information science and even data scientist.

To align analytics and the associated software to individuals in the organization, businesses should use personas to identify who needs what set of capabilities to be effective. Organizations should also assess competency levels in their personas to avoid adopting software that is too complicated or difficult to use. In some cases individuals will fit multiple personas. Instead of wasting time, resources and financial capital, look to define what is needed and where training is required to ensure business and IT work collaboratively in business analytics. While some business analytics software is getting easier to use, many of the offerings are still difficult because they are still designed for IT or more sophisticated analysts. While these individuals are an important group, they represent only a small portion of the users who need analytic business tools.

The next generation of business intelligence and business analytics will in part address the need to more easily consume information through smartphones and tablets but will not overcome one of the biggest barriers to big data analytics: the skills gap. Our benchmark research on big data shows staffing (79%) and training (77%) are the two biggest challenges organizations face in efforts to take advantage of big data through analytics. In addition, a language barrier still exists in some organizations, where IT speaks in terms of TCO and efficiency while the business speaks in terms of effectiveness and outcomes or time to value, which I have written about previously. While all of these goals are important, the organization needs to cater to the metrics that are needed by its various personas. Such understanding starts with better defining the different personas and building more effective communication among the groups to ensure that they work together more collaboratively to achieve their respective goals and get the most value from business analytics.

Regards,

Tony Cosentino

VP and Research Director

Users of big data analytics are finally going public. At the Hadoop Summit last June, many vendors were still speaking of a large retailer or a big bank as users but could not publicly disclose their partnerships. Companies experimenting with big data analytics felt that their proof of concept was so innovative that once it moved into production, it would yield a competitive advantage to the early mover. Now many companies are speaking openly about what they have been up to in their business laboratories. I look forward to attending the 2013 Hadoop Summit in San Jose to see how much things have changed in just a single year for Hadoop-centered big data analytics.

Our benchmark research into operational intelligence, which I argue is another name for real-time big data analytics, shows diversity in big data analytics use cases by industry. The goals of operational intelligence are an interesting mix as the research shows relative parity among managing performance (59%), detecting fraud and security (59%), complying with regulations (58%) and managing risk (58%), but when we drill down into different industries there are some interesting nuances. For instance, healthcare and banking are driven much more by risk and regulatory compliance, services such as retail are driven more by performance, and manufacturing is driven more by cost reduction. All of these make sense given the nature of the businesses. Let’s look at them in more detail.

The retail industry, driven by market forces and facing discontinuous change, is adopting big data analytics out of competitive necessity. The discontinuity comes in the form of online shopping and the need for traditional retailers to supplement their brick-and-mortar locations. JCPenney and Macy’s provide a sharp contrast in how two retailers approached this challenge. A few years ago, the two companies eyed a similar competitive space, but since that time, Macy’s has implemented systems based on big data analytics and is now sourcing locally for online transactions and can optimize pricing of its more than 70 million SKUs in just one hour using SAS High Performance Analytics. The Macy’s approach has, in Sun Tzu-like fashion, turned the “showroom floor” disadvantage into a customer experience advantage. JCPenney, on the other hand, relied on gut-feel management decisions based on classic brand merchandising strategies and ended up alienating its customers and generating lawsuits and a well-publicized apology to its customers. Other companies, including Sears, are doing similarly innovative work with suppliers such as Teradata and startups like Datameer in data hub architectures built around Hadoop.

Healthcare is another interesting market for big data, but the dynamics that drive it are less about market forces and more about government intervention and compliance issues. Laws around HIPAA, the recent Affordable Care Act, OC-10 and the HITECH Act of 2009 all have implications for how these organizations implement technology and analytics. Our recent benchmark research on governance, risk and compliance indicates that many companies have significant concerns about compliance issues: 53 percent of participants said they are concerned about them, and 42 percent said they are very concerned. Electronic health records (EHRs) are moving these organizations to more patient-centric systems, and one goal of the Affordable Care Act is to use technology to produce better outcomes through what it calls meaningful use standards. Facing this tidal wave of change, companies including IBM analyze historical patterns and link them with real-time monitoring, helping hospitals save the lives of at-risk babies. This use case was made into a now-famous commercial by advertising firm Ogilvy about the so-called data babies. IBM has also shown how cognitive question-and-answer systems such as Watson assist doctors in diagnosis and treatment of patients.

Data blending, the ability to mash together different data sources without having to manipulate the underlying data models, is another analytical technique gaining significant traction. Kaiser Permanente is able to use tools from Alteryx, which I have assessed, to consolidate diverse data sources, including unstructured data, to streamline operations and improve customer service. The two organizations made a joint presentation similar to the one here at Alteryx’s user conference in March.
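
As a minimal sketch of the data-blending idea itself (not of Alteryx’s product), here is the pattern in Python with pandas; the file names and columns are hypothetical:

```python
# A minimal sketch of data blending: joining two sources on a shared key
# without restructuring the underlying data models.
import pandas as pd

# One source from an operational system, one from a survey export.
visits = pd.read_csv("clinic_visits.csv")        # visit_id, member_id, wait_minutes
feedback = pd.read_json("member_feedback.json")  # member_id, satisfaction

# Blend on the shared key and compute a quick operational metric.
blended = visits.merge(feedback, on="member_id", how="left")
summary = blended.groupby("satisfaction")["wait_minutes"].mean()
print(summary)
```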

Financial services, which my colleague Robert Kugel covers, is being driven by a combination of regulatory forces and competitive market forces on the sales end. Regulations produce a lag in the adoption of certain big data technologies, such as cloud computing, but areas such as fraud and risk management are being revolutionized by the ability, provided through in-memory systems, to look at every transaction rather than only a sampling of transactions through traditional audit processes. Furthermore, the ability to pair advanced analytical algorithms with in-memory real-time rules engines helps detect fraud as it occurs, so criminal activity may be stopped at the point of transaction. On a broader scale, new risk management frameworks are becoming the strategic and operational backbone for decision-making in financial services.
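
A minimal sketch of that pairing might look like the following in Python; the fields, thresholds and scoring function are hypothetical stand-ins for a real in-memory rules engine and a trained fraud model:

```python
# A minimal sketch of pairing hard rules with an analytic score to flag
# each transaction as it occurs.
from dataclasses import dataclass

@dataclass
class Transaction:
    account_id: str
    amount: float
    country: str
    minutes_since_last: float

def risk_score(tx: Transaction) -> float:
    """Stand-in for a trained model's fraud probability."""
    score = 0.0
    score += 0.4 if tx.amount > 5000 else 0.0
    score += 0.3 if tx.minutes_since_last < 1 else 0.0
    score += 0.3 if tx.country not in ("US", "CA") else 0.0
    return score

def evaluate(tx: Transaction) -> str:
    # Hard rule fires first; otherwise the analytic score decides.
    if tx.amount > 25000:
        return "block"
    return "review" if risk_score(tx) >= 0.6 else "approve"

print(evaluate(Transaction("A-1", 9000.0, "RU", 0.5)))  # -> "review"
```

In production, both the rules and the score would be evaluated in memory as each transaction streams through, which is what makes stopping fraud at the point of transaction possible.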

On the retail banking side, copious amounts of historical customer data from multiple banking channels combined with government data and social media data are providing banks the opportunity to do microsegmentation and create unprecedented customer intimacy. Big data approaches to micro-targeting and pricing algorithms, which Rob recently discussed in his blog on Nomis, enable banks and retailers alike to target individuals and customize pricing based on an individual’s propensity to act. While partnerships in the financial services arena are still held close to the vest, the universal financial services providers – Bank of America, Citigroup, JPMorgan Chase and Wells Fargo – are making considerable investments in all of the above-mentioned areas of big data analytics.

Industries other than retail, healthcare and banking are also seeing tangible value in big data analytics. Governments are using it to provide proactive monitoring and responses to catastrophic events. Product and design companies are leveraging big data analytics for everything from advertising attribution to crowdsourcing of new product innovation. Manufacturers are preventing downtime by studying interactions within systems and predicting machine failures before they occur. Airlines are recalibrating their flight routing systems in real time to avoid bad weather. From hospitality to telecommunications to entertainment and gaming, companies are publicizing their big data-related success stories.

Our research shows that until now, big data analytics has primarily been the domain of larger, digitally advanced enterprises. However, as use cases make their way through business and their tangible value is accepted, I anticipate that the activity around big data analytics will increase with companies that reside in the small and midsize business market. At this point, just about any company that is not considering how big data analytics may impact its business faces an unknown and uneasy future. What a difference a year makes, indeed.

Regards,

Tony Cosentino

VP and Research Director

Responding to the trend that businesses now ask less sophisticated users to perform analysis and rely on software to help them, Oracle recently announced a new release of its flagship Oracle BI Foundational Suite (OBIFS 11.1.1.7) as well as updates to Endeca, the discovery platform that Oracle bought in 2011. Endeca is part of a new class of tools that bring new capabilities in information discovery, self-service access and interactivity. Such approaches represent an important part of the evolution of business intelligence to business analytics, as I have noted in my agenda for 2013.

Oracle Business Intelligence Foundational Suite includes many components, among them Oracle Business Intelligence Enterprise Edition (OBIEE), Oracle Essbase and a scorecard and strategy application. OBIEE is the enabling foundation that federates queries across data sources and enables reporting across multiple platforms. Oracle Essbase is an in-memory OLAP tool that enables forecasting and planning, including what-if scenarios embedded in a range of Oracle BI Applications, which are sold separately. The suite, along with the Endeca software, is integrated with Exalytics, Oracle’s appliance for BI and analytics. Oracle’s appliance strategy, which I wrote about after Oracle World last year, invests heavily in the Sun Microsystems hardware acquired in 2010.

These updates are far-ranging and numerous (including more than 200 changes to the software). I’d like to point out some important pieces that advance Oracle’s position in the BI market. A visualization recommendations engine offers guidance on the type of visualization that may be appropriate for a user’s particular data. This feature, already sold by others in the market, may be considered a subset of the broader capability of guided analysis. Advanced visualization techniques have become more important for companies as they make it easier for users to understand data, and they are critical for competing with the likes of Tableau, a player in this space that I wrote about last year.

Another user-focused update related to visualization is performance tiles, which enable important KPIs to be displayed prominently within the context of the screen surface area. Performance tiles are a great way to start improving the static dashboards that my colleague Mark Smith has critiqued. From what I have seen, it is unclear to what degree the business user can define and change Oracle’s performance tile KPIs (for example, the red-flagged metrics assigned to the particular business user that appear within the scorecard function of the software) and how much the system can provide in a prescriptive analytic fashion. Other visualizations that have been added include waterfall charts, which enable dependency analysis; these are especially helpful for pricing analysis, showing users how changes in one dimension affect pricing on the whole. Another is MapViews, which supports manipulation and design for location analytics; our next-generation BI research finds that the capability to deploy geographic maps is important to BI in 47 percent of organizations and to visualize metrics associated with locations in 41 percent. Stack charts now provide auto-weighting for 100-percent sum analysis, which can be helpful for analytics such as attribution models. Breadcrumbs let users drill back through their navigation process, which helps them understand how a person came to a particular analytical conclusion. Finally, Trellis View actions provide contextual functionality to help turn data into action in an operational environment. These visualization advances are critical for Oracle’s big data efforts: visualization is a top-three big data capability not available in 37 percent of organizations, according to our big data research, and our latest technology innovation research on business analytics found presenting data visually to be the second most important capability, cited by 48 percent of organizations.
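
To illustrate the waterfall idea for pricing analysis, here is a minimal sketch using matplotlib rather than Oracle’s tooling; the price components are hypothetical:

```python
# A minimal sketch of a waterfall chart: each bar shows how one pricing
# component moves the running total from list price to net price.
import matplotlib.pyplot as plt

labels = ["List price", "Volume discount", "Promotion", "Freight", "Net price"]
changes = [100.0, -12.0, -8.0, 3.0]                # deltas from list to net
net = sum(changes)

bottoms, running = [], 0.0
for delta in changes:                              # each bar starts where the
    bottoms.append(running if delta >= 0 else running + delta)
    running += delta                               # previous one ended

plt.bar(labels[:-1], [abs(d) for d in changes], bottom=bottoms)
plt.bar(labels[-1], net)                           # final bar: the net price
plt.ylabel("Price ($)")
plt.title("Waterfall: how each component moves list price to net")
plt.show()
```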

The update to Oracle Smart View for Office also puts more capability in the hands of users. It natively integrates Excel and other Microsoft Office applications with operational BI dashboards so users can perform analysis and prepare ad-hoc reports directly within these desktop environments. This is an important advance for Oracle: our benchmark research on the use of spreadsheets across the enterprise found that the combination of BI and spreadsheets happens all the time or frequently in 74 percent of organizations. Collaboration with business intelligence is also essential, and tighter Office integration addresses a key use case: our next-generation business intelligence research found that using Microsoft Office for collaboration with BI is important to 36 percent of organizations.

Oracle’s efforts to evolve social collaboration through what it calls Oracle Social Network have advanced significantly, but there appears to be no short-term plan to integrate it with, and make it available through, the business intelligence offering. Our research finds that more than two-thirds (67%) of organizations rank collaboration as important, and embedding it within BI is a top need in 38 percent. Much of what Oracle already provides could easily be integrated to meet business demand for a range of people-based interactions that most organizations still struggle to manage through e-mail.

Oracle has extended OBIEE with Hadoop integration via a Hive connector that allows it to pull data into OBIEE from big data sources, while an MDX search function enabled by integration with the Endeca discovery tool allows OBIEE to do full-text search and data discovery. Connections to new data sources are critically important in today’s environment; our technology innovation research shows that retaining and analyzing more data is the number-one-ranked use for big data, cited by 29 percent of organizations. Federated data discovery is particularly important, as most companies are often unaware of their information assets and therefore unknowingly limit their analysis.
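
As an illustration of what “pulling data from Hive into a BI layer” amounts to, here is a minimal sketch using the open-source PyHive client; it is not Oracle’s connector, and the host, table and column names are hypothetical.

```python
from pyhive import hive

# Connect to a (hypothetical) Hive gateway on a Hadoop cluster.
conn = hive.Connection(host="hadoop-gateway.example.com", port=10000,
                       database="weblogs")
cursor = conn.cursor()

# Aggregate a big data source down to a BI-friendly summary.
cursor.execute("""
    SELECT page, COUNT(*) AS hits
    FROM clickstream
    GROUP BY page
    ORDER BY hits DESC
    LIMIT 20
""")
for page, hits in cursor.fetchall():
    print(page, hits)
conn.close()
```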

Beyond the core BI product, Oracle made significant advances with Endeca 3.0. Users can now analyze Excel files; this is an existing capability for other vendors, so it was important for Oracle to gain parity here. Beyond that, Endeca now comes with a native JavaScript Object Notation (JSON) reader and support for authorization standards, which furthers its ability to do contextual analysis and sentiment analysis on data in text and social media. Endeca also now can pull data from the Oracle BI server to marry with the analysis. Overall, the new version of Endeca enables business-driven information discovery, which is essential to relieve the stress on analysts and IT of creating and publishing information and insights for the business.
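
To show the sort of work a native JSON reader enables, here is a minimal sketch that parses social posts and attaches a crude sentiment score; it is not Endeca’s method, and the sample records and word lists are hypothetical stand-ins.

```python
import json

raw = '''[
  {"user": "a", "text": "Love the new dashboard, great release"},
  {"user": "b", "text": "Upgrade was slow and painful"}
]'''

POSITIVE = {"love", "great", "fast"}
NEGATIVE = {"slow", "painful", "broken"}

for post in json.loads(raw):
    # Crude lexicon score: positive hits minus negative hits.
    words = set(post["text"].lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    print(post["user"], score)  # a 2, b -2
```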

Oracle continues to invest in BI applications that supply prebuilt analytics; these packaged applications span the front office (sales and marketing), operations (procurement and supply chain) and the back office (finance and HR). Given this enterprise-wide support, Oracle’s BI can perform cross-functional analytics and deliver fast time to value, since users do not have to spend time building dashboards. Through interoperation with the company’s enterprise applications, customers can execute actions directly in applications such as PeopleSoft, JD Edwards or Oracle E-Business Suite. Oracle has also begun to leverage more of its scorecarding function, which enables KPI relationships to be mapped and information to be aggregated and trended. Scorecards are important for analytic cultures because they provide a common communication platform for executive decision-makers and allow ownership of metrics to be assigned.

I was surprised not to find much advancement in Oracle’s business intelligence efforts on smartphones and tablets. Our research finds that mobile business intelligence is important to 69 percent of organizations, yet 78 percent report that only some or none of their BI capabilities are available in their current deployments. Of those using mobile business intelligence, only 28 percent are satisfied. For years IT did not place a priority on mobile support of BI while business was clamoring for it; business is now more readily leading these efforts, with 52 percent planning new or expanded deployments on tablets and 32 percent on smartphones. To capture more opportunity in this highly competitive market, Oracle will need to advance its efforts significantly and make its capabilities freely available without passwords, as other BI providers have already done. It also will need to recognize that business is more interested in alerts and events delivered through notifications on mobile devices than in having the entire suite of BI capabilities replicated on them.
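
To make the alert-first design point concrete, here is a minimal sketch of evaluating a KPI against a threshold and pushing a short notification rather than replicating a dashboard; the function names and values are hypothetical placeholders.

```python
def check_kpi(name: str, value: float, floor: float, notify) -> None:
    """Notify only when the metric breaches its threshold."""
    if value < floor:
        notify(f"ALERT: {name} at {value:,.0f}, below floor {floor:,.0f}")

def send_push(message: str) -> None:
    # Stand-in for a real mobile push service (e.g., a vendor API).
    print("push ->", message)

check_kpi("Weekly revenue", 92_000, 100_000, send_push)
```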

Oracle has foundational positions in enterprise applications and database technology and has used them to drive significant success in BI. The company’s proprietary “walled garden” approach worked well for years, but now technology changes, including movements toward open source and cloud computing, threaten that entrenched position. Surprisingly, the company has moved slowly off of its traditional messaging stance targeted at the CIO, IT and the data center. That position focuses the company too much on the technology-driven 3 V’s of big data and analytics, and not enough on the business-driven 3 W’s that I advocate. As the industry moves into the age of analytics, where information is looked upon as a critical commodity and usability is the key to adoption (our research finds usability to be the top evaluation consideration in 63 percent of organizations), CIOs will need to move beyond an IT-centric approach to BI, as I have noted, and engage more with the requirements of the business. Oracle’s business intelligence strategy, and how well it addresses these business outcomes and use by all business users, is key to the company’s future. Organizations should examine these advancements to its BI offering closely to determine whether they can improve the value of information and big data.

Regards,

Tony Cosentino

VP and Research Director

SAP just released solid preliminary quarterly and annual revenue numbers, which in many ways can be attributed to a strong strategic vision around the HANA in-memory platform and strong execution throughout the organization. Akin to flying an airplane while simultaneously fixing it, SAP’s bold move to HANA may at some point see the company continuing to fly when other companies are forced to ground parts of their fleets.

Entering 2012, SAP’s HANA strategy was still nascent, and the company provided incentives to both customers and the channel to bring them along. At that point SAP also promoted John Schweitzer to senior vice president and general manager for business analytics. Schweitzer, an industry veteran who spent his career with Hyperion and Oracle before moving to SAP in 2010, gave a compelling analytics talk at SAPPHIRE NOW in early 2012 and was at the helm for the significant product launches during the course of the year.

One of these products is SAP Visual Intelligence, released in the spring of 2012. The product takes Business Objects Business Explorer and moves it to the desktop so that analysts can work without needing IT involved. Because it runs on HANA, it allows real-time visual exploration of data akin to what we have been seeing from companies such as Tableau, which recently passed the $100 million mark in revenue. As with many of SAP’s new offerings, running on top of HANA allows visual exploration of very large data sets; data sets of the same size may quickly overwhelm certain “in-memory” competitor systems. To build buzz and help develop “Visi,” as Visual Intelligence is also called, the company ran a Data Geek Challenge in which SAP encouraged users to develop big data visualizations on top of HANA.

Tools such as SAP Visual Intelligence begin to address the analytics usability challenge that I recently wrote about in The Brave New World of Business Intelligence, which focused on our recent research into next-generation business intelligence. One key takeaway from that research is that usability expectations for business intelligence are being set at a consumer level, but our enterprise information environments and processes are too fragmented to achieve the same level of usability, resulting in relatively high dissatisfaction with next-generation business tools. Despite that dissatisfaction, the high switching costs of enterprise BI mean that customers are essentially captive, and this gives incumbents time to adapt their approaches. There is no doubt that SAP looks to capitalize on this opportunity with agile development approaches and frequent iterations of the Visual Intelligence software.

Another key release in 2012 was Predictive Analytics, which became generally available late in the year after SAP TechEd in Madrid. With this release, SAP moves away from its partnership with IBM SPSS. The move makes sense, given that sticking with SPSS would have left SAP with a critical dependency. According to our benchmark research on predictive analytics, 68% of organizations see predictive analytics as a source of competitive advantage. Once again, HANA may prove to be a strong differentiator for SAP, in that the company will be able to leverage the in-memory system to visualize data and run predictive analytics on very large data sets and across different workloads. Furthermore, the SAP Predictive Analytics offering inherits data acquisition and data manipulation functionality from SAP Visual Intelligence. However, it cannot be installed on the same machine as Visi, according to a blog on the SAP Community Network.

SAP will need to work hard to build out predictive analytics capabilities to compete with the likes of SAS, IBM SPSS and other providers that have years of custom-developed vertical and line-of-business solution sets. Beyond these customized analytical assets, IBM and SAS will likely promote the idea of commodity models as such non-optimized modeling approaches become more important. Commodity models are a breed of “good enough” models that provide prediction and data reduction that is a step-function better than a purely random or uninformed decision. Where deep analytical skill sets are not available, sophisticated software can run through the data and match it to an appropriate model.
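
As a concrete illustration of the commodity-model idea, here is a minimal sketch using scikit-learn: an off-the-shelf, untuned model that beats an uninformed baseline. The synthetic data is a hypothetical stand-in for, say, a churn problem.

```python
from sklearn.datasets import make_classification
from sklearn.dummy import DummyClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real business data set.
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Uninformed decision: always predict the majority class.
baseline = DummyClassifier(strategy="most_frequent").fit(X_train, y_train)
# "Good enough" commodity model: default, untuned logistic regression.
commodity = LogisticRegression(max_iter=1000).fit(X_train, y_train)

print("uninformed baseline:", baseline.score(X_test, y_test))
print("commodity model:   ", commodity.score(X_test, y_test))
```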

In 2012, SAP also continued to develop targeted analytical solutions that bundle SAP BI with the Sybase IQ server, a columnar database. Column-oriented databases such as Sybase IQ, Vertica, ParAccel and Infobright have gained momentum over the last few years by organizing data in a way that allows much easier analytical access and better data compression. Instead of writing to disk row by row, columnar approaches write to disk column by column, allowing faster analytical cycles and reduced time to value.
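
To illustrate why column orientation favors analytics, here is a minimal sketch contrasting the two layouts: an aggregate touches only the column it needs, and repetitive columns compress well. The table and run-length encoding are simplified illustrations.

```python
# Row-oriented storage: each record stored together; an aggregate
# must read through whole records.
rows = [
    ("east", 2012, 120.0),
    ("east", 2012, 95.5),
    ("west", 2013, 101.25),
]
total_from_rows = sum(amount for (_region, _year, amount) in rows)

# Column-oriented storage: each attribute stored together; an
# aggregate touches only the one column it needs...
columns = {
    "region": ["east", "east", "west"],
    "year":   [2012, 2012, 2013],
    "amount": [120.0, 95.5, 101.25],
}
total_from_columns = sum(columns["amount"])

# ...and repetitive columns compress well, e.g. run-length encoding.
def rle(values):
    out, prev, count = [], values[0], 0
    for v in values:
        if v == prev:
            count += 1
        else:
            out.append((prev, count))
            prev, count = v, 1
    return out + [(prev, count)]

print(total_from_rows, total_from_columns)  # 316.75 316.75
print(rle(columns["year"]))                 # [(2012, 2), (2013, 1)]
```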

The competitive advantage of columnar databases, however, may be mitigated as in-memory approaches gain favor in the marketplace. For this reason, continued development on the Sybase IQ platform may be an intermediate step until the HANA analytics portfolio is built out, after which HANA will likely cannibalize parts of SAP’s own stack. SAP’s dual approach to big data analytics, with both in-memory and columnar technologies, offers a clear view of the classic Innovator’s Dilemma facing many technology suppliers today and of how SAP is dealing with it. It should be noted, however, that SAP is also working on an integrated portfolio approach and that Sybase IQ may actually be a better fit as data sets move to petabyte scale.

Another aspect of that same Innovator’s Dilemma is a fragmented choice environment as new technologies develop. Our research shows that the market is undecided on how it will roll out critical next-generation business intelligence capabilities such as collaboration. Just under two-fifths of participants in our Next-Generation Business Intelligence study (38%) prefer business intelligence applications as the primary access method for collaborative BI, but 36 percent prefer access through office productivity tools, and 34 percent prefer access through the applications themselves. (Not surprisingly, IT leans more toward providing tools within the existing landscape, while business users are more likely to want this capability within the context of the application.) This fragmented choice scenario carries over to analytics as well, where spreadsheets are still the dominant analytical tool in most organizations. Here at Ventana Research we are fielding more inquiries about application-embedded analytics and how these will play out in the organizational landscape. I anticipate this debate will continue through 2013, with different parts of the market providing solid arguments for each of the three camps. Since HANA uniquely provides both transactional and analytic processing in one engine, it will be interesting to look more closely at the HANA and Business Objects roadmap in 2013 to see how SAP positions itself in this debate. Furthermore, as I discuss in my 2013 Research Agenda blog post, disseminating insights within the organization is a big part of moving from insights to action, and business intelligence is still the primary vehicle for moving insight into the organization. For this reason, the natural path for many organizations may indeed be through their business intelligence systems.

SAP, clearly differentiating its strategic position, looks to continue innovating across its entire portfolio, including both applications and analytics, based on the HANA database. In the most recent quarter, SAP took a big step forward in this regard by porting its entire business suite to run on HANA, as my colleague Robert Kugel discussed in a recent blog.

While there are still some critical battles to be played out, one thing remains clear: SAP is one of the dominant players in business intelligence today. Our Business Intelligence Value Index has assessed SAP as a Hot Vendor and ranks it at the top. SAP aims to stay that way by continuing to innovate around HANA and giving its customers and prospects a seamless transition to next-generation analytics and technologies.

Regards,

Tony Cosentino
VP and Research Director
