
In his keynote speech at the sixth annual Tableau Customer Conference, company co-founder and CEO Christian Chabot borrowed Steve Jobs’ famous line that the computer “is the equivalent of a bicycle for our minds” to suggest that his company’s software is such a bicycle. He went on to build an argument about the nature of invention and Tableau’s place in it. The people who make great discoveries, Chabot said, start with both intuition and logic. This approach allows them to look at ideas and information from different perspectives and to see things that others don’t. In a similar vein, he went on, Tableau allows us to look at things differently, understand patterns and generate new ideas that might not arise using traditional tools. Chabot’s key point was profound: new technologies such as Tableau’s visual analytics software, which draw on new and big data sources of information, are enablers and accelerators of human understanding.

Tableau represents a new class of business intelligence (BI) software that is designed for business analytics, allowing users to visualize and interact with data in new ways without requiring that relationships in the data be predefined. This business analytics focus is critical: our research finds it is the top-ranked technology innovation in business today, identified as such by 39 percent of organizations. In traditional BI systems, data is modeled in so-called cubes or other predefined structures, which allow users to slice and dice data instantaneously and in a user-friendly fashion. The cube structure solves the problem of abstracting the complexity of the structured query language (SQL) of the database and the inordinate amount of time it can take to read data from a row-oriented database. However, with the cost of memory decreasing significantly, the advent of new column-oriented databases, and approaches such as VizQL (Tableau’s proprietary query language that allows for direct visual query of a database), traditional BI methods are now being challenged by new ones. Tableau has effectively exploited the exploratory aspects of data through its technological approach, even more so with the advent of many new big data sources that require more discovery-oriented methods.

After Chabot’s speech, Chris Stolte, Tableau’s co-founder and chief development officer, took the audience through the company’s product vision, which is centered on the themes of seamless access to data, visual analytics for everyone, ease of use and beauty, storytelling, and enterprise analytics everywhere. This is essential, as Tableau’s software performs what we classify as discovery methods for business analytics; of the four types my colleague has outlined, Tableau currently supports two, data discovery and visual discovery. Most important, Tableau is working to address a broader set of the user personas I have outlined to the industry, expanding further to analysts, publishers and data geeks. As part of Stolte’s address, the product team took the stage to discuss innovations coming in the Tableau 8.1 release scheduled for fall 2013 and the 8.2 release due early in 2014, all of which were publicly disclosed in the Internet broadcast of the keynote.

One of those innovations is a new connection interface that enables a workflow for connecting to data. It provides very light data integration capabilities with which users can clean and reshape data in a number of ways. The software automatically detects inner-join keys with a single click, and the new table shows up automatically. Users can easily switch between left and right joins as well as change the join field altogether. Once data is extracted and imported, new tools such as a data parser enable users to specify a particular date format. These data integration capabilities are admittedly lightweight compared with tools such as Informatica or Pentaho (which just released its 5.0 platform, integrated with its business analytics offering), but they are a welcome development for users who still spend more of their time cleaning, preparing and reviewing data than analyzing it. Our benchmark research on information management shows that dirty data is a barrier to information management in 58 percent of organizations. Tableau’s developments, and others in the area of information management, should continue to erode the entrenchment of tools inappropriately used for analytics, especially spreadsheets, whose use and misuse my colleague Robert Kugel has researched. These advancements in simpler access to data are critical, as 44 percent of organizations indicated that more time is spent on data-related activities than on analytic tasks.
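The kind of lightweight join selection and date parsing described here can be sketched outside Tableau. The following uses pandas as an illustrative stand-in (the tables, column names and date format are hypothetical; this is not how Tableau implements the feature):

```python
import pandas as pd

# Two hypothetical extracts that share a common key column.
orders = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "order_date": ["03/01/2013", "03/02/2013", "03/05/2013"],
    "amount": [120.0, 75.5, 210.0],
})
customers = pd.DataFrame({
    "customer_id": [1, 2, 4],
    "region": ["West", "East", "South"],
})

# Inner join on the shared key; switching to how="left" or how="right"
# mirrors changing the join type in a connection interface.
joined = orders.merge(customers, on="customer_id", how="inner")

# Parse dates by specifying an explicit format, as a data parser would.
joined["order_date"] = pd.to_datetime(joined["order_date"], format="%m/%d/%Y")

print(joined)
```

Changing the `how` argument is the programmatic equivalent of toggling join types in a visual interface, and mismatched keys (customer 3 and 4 above) drop out of an inner join just as they would there.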

A significant development in the 8.1 release of Tableau is the integration of R, the open source programming language for statistics and predictive analytics. Advances in the R language through the R community have been robust and continue to gain steam, as I discussed recently in an analysis of opportunities and barriers for R. Tableau users will still need to know the details of R, but now its output can be natively visualized in Tableau. Depending on their needs, users can gain a more integrated and advanced statistical experience with Tableau partner tools such as Alteryx, which both my colleague Richard Snow and I have written about this year. Alteryx integrates with R at a higher level of abstraction and also integrates directly with Tableau output. While R integration is important for Tableau in providing new capabilities, it should be noted that this is a single-threaded approach and is limited to running in memory. This will be a concern for those trying to analyze truly large data sets, since a single-threaded approach limits the modeler to about a terabyte of data. For now, Tableau likely will serve mostly as an R sandbox for sample data; when users need to move algorithms into production against larger data, they probably will have to use a parallelized environment. Other BI vendors, such as Information Builders, have already embedded R into BI products designed for analysts (in its case, WebFOCUS) in ways that hide the complexities of R.
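The division of labor that the R integration implies, fitting a statistical model in one runtime and visualizing its output in the BI tool, can be sketched generically. This minimal example uses Python’s NumPy in place of R, with invented data; only the model output, here a fitted trend and its residuals, would be handed off for visualization:

```python
import numpy as np

# Hypothetical monthly sales figures held in memory, like an R vector.
sales = np.array([10.2, 11.1, 12.8, 13.0, 14.7, 15.9])
months = np.arange(len(sales))

# Ordinary least-squares trend line: the kind of model output an
# external statistics runtime would hand back for visualization.
slope, intercept = np.polyfit(months, sales, 1)
fitted = intercept + slope * months

# Residuals show how far each month deviates from the trend;
# a BI layer would plot fitted values and residuals, not compute them.
residuals = sales - fitted
print(f"trend per month: {slope:.2f}")
```

Note that the entire data set sits in memory here, which is exactly why a single-threaded, in-memory approach caps the size of data that can be modeled this way.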

Beyond the R integration, Tableau showed useful descriptive analytic methods such as box plots and percentile aggregations. Forecast improvements facilitate changing prediction bands and adjusting seasonality factors. Different ranking methods can be used, and two-pass totals provide useful data views. While these analytic developments are nice, they are not groundbreaking. Tools like BIRT Analytics, Datawatch Panopticon and Tibco Spotfire are still ahead with their ability to visualize data models using methods such as decision trees and clustering. Meanwhile, SAP just acquired KXEN and will likely start to integrate predictive capabilities into SAP Lumira, its visual analytics platform. SAS is also integrating easy-to-use high-end analytics into its Visual Analytics tool, and IBM’s SPSS and Cognos Insight work together for advanced analytics. Our research on predictive analytics shows that classification trees (69%), followed by regression techniques and association rules (66% and 61%, respectively), are the statistical techniques most often used in organizations today. Tableau also indicated future investments in improving location and maps within its visualization. This goal aligns with our location analytics research, which found that 48 percent of businesses report that using location analytics significantly improves their business processes. Tableau’s advancements in visualizing analytics are great for analysts and data geeks but still beyond the competencies of information consumers. At the same time, Tableau has done what Microsoft has not done with Microsoft Excel: deliver simplicity in preparing analytics for interactive visualization and discovery. In addition, Tableau is easily accessible from mobile technology such as tablets, which is definitely not a strong spot for Microsoft.
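The box plots and percentile aggregations mentioned above reduce to a handful of quantile computations. A minimal sketch in Python (illustrative only, not Tableau’s internals; the values are invented):

```python
import numpy as np

# Hypothetical measure values for one dimension member.
values = np.array([4, 7, 9, 10, 11, 12, 15, 18, 21, 40], dtype=float)

# The summary numbers behind a standard box plot.
q1, median, q3 = np.percentile(values, [25, 50, 75])
iqr = q3 - q1                      # interquartile range (the "box")
lower_fence = q1 - 1.5 * iqr       # conventional whisker limits
upper_fence = q3 + 1.5 * iqr

# Points beyond the fences are drawn individually as outliers.
outliers = values[(values < lower_fence) | (values > upper_fence)]
print(median, iqr, outliers)
```

A percentile aggregation in a BI tool is the same `percentile` call applied per group, and the 1.5 × IQR fence is the common Tukey convention for flagging outliers.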

Making life easier for analysts and knowledge workers, Tableau now offers two-click copy and paste of dashboards between workbooks, a significant development that allows users to organize visualizations expediently. Users can store collections of data and visualizations in folders within the data window, where they can access all the components used for discovery, and search makes it easy to find any detail. Transparency features and quick filter formatting allow users to match brand logos and colors. Presentation mode allows full-screen centered display, and calendar controls let users select dates in ways familiar from other calendaring tools. What was surprising is that Tableau did not show how to present supporting information alongside a visualization, such as the free-form text analysts add when using Microsoft PowerPoint, or how it supports integrating content and documents beyond structured data. My colleague has pointed to the failures of business intelligence and what analysts need in order to provide more context and information around visualizations, and it appears Tableau is starting to address them.

Tableau developer Robert Kosara showed a Tableau 8.2 feature called Storypoints, which puts a navigator at the top of the screen and pulls together different visualizations to support a logical argument. This is important in addressing the visual anarchy my colleague has pointed out in today’s tools. Storypoints are linked directly to data visualizations, across which users can navigate to see varying states of a visualization. Storytelling is of keen interest to analysts because it is the primary way they prepare information and analytics for review and in support of an organization’s decision-making needs. Encapsulating observations in a story, though, requires more than navigation across states of a visualization with a descriptive caption; it should also support embedding descriptive information related to the visualization. Tableau has more to offer with its canvas layout and the embedding of other presentation components but did not spend much time outlining what is fully possible today. Storytelling and collaboration is a hot area of development, with multiple approaches coming to market, including those from Datameer, Roambi, Yellowfin and QlikTech (with its acquisition of NComVA, a Swedish visualization company). These approaches need to streamline the cumbersome process of copying data from Excel into PowerPoint, building charts and annotating slides. Tableau’s Storypoints, guided navigation of visualizations and dashboard copy and paste are good first steps that can be a superior alternative to a personal productivity approach with Microsoft Office, but Tableau will still need more depth to replicate the flexibility of Microsoft PowerPoint in particular.

The last area of development, and perhaps the most important for Tableau, is making the platform and tools more enterprise-ready. Security, availability, scalability and manageability are the hallmarks of an enterprise-grade application, and Tableau is advancing in each of these areas. The Tableau 8.1 release includes external load-balancing support and more dynamic support for host names. Companies using the SAML standard can administer a single sign-on that delegates authentication to an identity provider. IPv6 support for next-generation Internet applications and, perhaps most important from a scalability perspective, 64-bit architecture have been extended across the product line. (Relevant to this last development, Tableau Desktop in version 8.2 will run natively on the Apple Mac, which garnered perhaps the loudest cheer of the day from the mostly youthful attendees.) For proof of its scalability, Tableau pointed to Tableau Public, which fields more than 70,000 views each day. Furthermore, Tableau’s cloud implementation, Tableau Online, offers a multi-tenant architecture and a columnar database for scalability, performance and unified upgrades for cloud users. Providing a cloud deployment is critical according to our research, which found that a quarter of organizations prefer a cloud approach; however, cloud BI applications have seen slower adoption.

Enterprise implementations are the battleground on which Tableau wants to compete, and it is making inroads through rapid adoption by the analysts who are responsible for analytics across the organization. During the keynote, Chabot took direct aim at legacy business intelligence vendors, suggesting that using enterprise BI platform tools is akin to tying a brick to a pen and trying to draw. Enterprise BI platforms, he argued, are designed to work in opposition to great thinking; they are not iterative and exploratory in nature but rather operate in a predefined fashion. While this may be true in some cases, those same BI bricks are often the foundation of many organizations, and they are not always easy to remove. Last year, in analyzing Tableau 8.0, I argued that the company was not ready to compete for larger BI implementations. New developments coming this year address some of these shortcomings, but there is still the lingering question of the entrenchment of BI vendors and the metrics that are deployed broadly in organizations. Companies such as IBM, Oracle and SAP have a range of applications, including finance and planning, that reside at the heart of most organizations, and these applications can dictate the metrics and key indicators to which people and organizations are held accountable. For Tableau to replace them would require more integration with the established metrics and indicators managed within these tools and their associated databases. For large rollouts driven by well-defined parameterized reporting needs and interaction with enterprise applications, Tableau still has work to do. Furthermore, every enterprise BI vendor has its own visual offering and is putting money into catching up to Tableau.

In sum, Tableau’s best-in-class ease of use does serve as a bicycle for the analytical mind, and with its IPO this year, Tableau is pedaling as fast as ever to continue its innovations. We research thousands of BI deployments and recently awarded Cisco our Business Technology Leadership Award in Analytics for 2013 for its use of Tableau Software. Cisco, which has many business intelligence tools, uses Tableau to design analytics and visualize data in multiple areas of its business. Tableau’s ability to capture the hearts and minds of the analysts responsible for analytics, and to demonstrate business value in a short period of time, or what is called time-to-value (TTV), which as I have pointed out matters even more for big data, is why the company is growing rapidly and building a passionate community around its products. Our research finds that business is driving adoption of business analytics, which helps Tableau avoid the politics of IT while addressing the top selection criterion of usability. In addition, the wave of business improvement initiatives is changing how 60 percent of organizations select technology, with buyers no longer simply accepting the IT standard or existing technology approach. Buyers in both IT and business should pay close attention to these trends, and all organizations looking to compete with analytics through simple but elegant visualizations should consider Tableau’s offerings.


Tony Cosentino

VP and Research Director

As a new generation of business professionals embraces a new generation of technology, the line between people and their tools begins to blur. This shift comes as organizations become flatter and leaner and roles, context and responsibilities become intertwined. These changes have introduced faster and easier ways to bring information to users, in a context that makes it quicker to collaborate, assess and act. Today we see this in the prominent buying patterns for business intelligence and analytics software and an increased focus on the user experience. Almost two-thirds (63%) of participants in our benchmark research on next-generation business intelligence say that usability is the top purchase consideration for business intelligence software. In fact, usability is the driving factor in evaluating and selecting technology across all application and technology areas, according to our benchmark research.

In selecting and using technology, personas (that is, an idealized cohort of users) are particularly important, as they help business and IT assess where software will be used in the organization and define the role, responsibility and competency of users and the context of what they need and why. At the same time, personas help software companies understand the attitudinal, behavioral and demographic profile of target individuals and the specific experience that is not just attractive but essential to those users. For example, the mobile and collaborative intelligence capabilities needed by a field executive logging in from a tablet at a customer meeting are quite different from the analytic capabilities needed by an analyst trying to understand the causes of high customer churn rates and how to change that trend with a targeted marketing campaign.

Understanding this context-driven user experience is the first step toward defining the personas found in today’s range of analytics users. The key is to make the personas simple to understand but comprehensive enough to cover the diversity of needs for business analytic types within the organization. To help organizations be more effective in their analytic process and engagement of their resources and time, we recommend the following five analytical personas: (Note that in my years of segmentation work, I’ve found that the most important aspects are the number of segments and the names of those segments. To this end, I have chosen a simple number, five, and the most intuitive names I could find to represent each persona.)

Information Consumer: This persona is not technologically savvy and may even feel intimidated by technology. Information must be provided in a user-friendly fashion to minimize frustration. These users may rely on one or two tools that they use just well enough to do their jobs, which typically involves consuming information in presentations, reports, dashboards or other forms that are easy to read and interpret. They are oriented more to language than to numbers and in most cases would rather read or listen to information about the business. They can write a pertinent memo or email, make a convincing sales pitch or devise a brilliant strategy. Their typical role within the organization varies, but among this group is the high-ranking executive, including the CEO, for whom information is prepared. In the lines of business, this consumer may be a call center agent, a sales manager or a field service worker. In fact, in many companies, the information consumer makes up the majority of the organization. The information consumer usually can read Excel and PowerPoint documents but rarely works within them. This persona feels empowered by consumer-grade applications such as Google, Yelp and Facebook.

Knowledge Worker: Knowledge workers are business-, technology- and data-savvy and have domain knowledge. They interpret data in functional ways. These workers understand descriptive data but are not likely to take on data integration tasks or interpret advanced statistics (as in a regression analysis). In terms of tools, they can make sense of spreadsheets and, with minimal training, use the output of tools like business intelligence systems, pivot tables and visual discovery tools. They also actively participate in providing feedback and input to planning and business performance software. Typically, these individuals are in over their heads when asked to develop a pivot table or structure multidimensional data, although in some instances new discovery tools allow them to move beyond such limitations. The knowledge worker persona includes but is not limited to technology-savvy executives, line-of-business managers and directors, domain experts and operations managers. Since these workers focus on decision-making and business outcomes, analytics is an important part of their overall workflow but targeted at specific tasks. In this role they may use applications with embedded analytics, analytic discovery and modeling approaches. Visual discovery tools and, in many instances, user-friendly SaaS applications are empowering the knowledge worker to be more analytically driven without IT involvement.

Analyst: Well versed in data, this persona often knows the business intelligence and analytics tools that pertain to the position and applies analytics to various aspects of the business. These users are familiar with applications and systems and know how to retrieve and assemble data from them in many forms. They can also perform a range of data blending and data preparation tasks and create dashboards, data visualizations and pivot tables with minimal or no training. They can interpret many types of data, including correlation and in some cases regression. The analyst’s role involves modeling and analytics, either within specific analytic software or within software used for business planning and enterprise performance management. More senior analysts focus on more advanced analytics, such as predictive analytics and data mining, to understand current patterns in data and predict future outcomes. These analysts might be called a split persona in terms of where their skills and roles are deployed in the organization. They may reside in IT, but many more are found on the business side, as they are accountable for the outcomes of the analytics. Analysts on the business side may not be experts in SQL or computer programming but may be adept with languages such as R or SAS. Those on the IT side are more familiar with SQL and the building of data models used in databases. With respect to data preparation, the IT organization looks at integration through the lens of ETL and associated tool sets, whereas the business side looks at it from a data-merge perspective and the creation of analytical data sets in places like spreadsheets.

The roles that represent this persona often are explicitly called analysts with a prefix that in most cases is representative of the department they work from, such as finance, marketing, sales or operations but could have prefixes like corporate, customer, operational or other cross-departmental responsibilities. The analytical tools they use almost always include the spreadsheet, as well as complementary business intelligence tools and a range of analytical tools like visual discovery and in some cases more advanced predictive analytics and statistical software. Visual discovery and commodity modeling approaches are empowering some analyst workers to move upstream from a role of data monger to a more interpretive decision support position. For those already familiar with advanced modeling, today’s big data environments, including new sources of information and modern technology, are providing the ability to build much more robust models and solve an entirely new class of business problems.

Publisher: Skilled in data and analytics, the publisher typically knows how to configure and operate business intelligence tools and publish information from them in dashboards or reports. They are typically skilled in the basics of spreadsheets and publishing information to Microsoft Word or PowerPoint tools. These users not only can interpret many types of analytics but can also build and validate the data for their organizations. Similar to the analyst, the publisher may be considered a split persona, as these individuals may be in a business unit or IT. The IT-based publisher is more familiar with the business intelligence processes and knows the data sources and how to get to data from the data warehouse or even big data sources. They may have basic configuration and scripting skills that enable them to produce outputs in several ways. They may also have basic SQL and relational data modeling skills that help them identify what can be published and determine how data can be combined through the BI tool or databases. The titles related to publisher may include business intelligence manager, data analyst, or manager or director of data or information management. The common tools used by the publisher include business intelligence authoring tools, various visualization and analytic tools, and office productivity tools like Microsoft Office and Adobe Acrobat.

Data Geek: The data geek, who may be a data analyst or someone as sophisticated as a data scientist, has expert data management skills and an interdisciplinary approach to data that melds the split personas discussed at the analyst and senior analyst levels. The primary difference between the data geek and the analyst is that the latter usually focuses on either the IT side or the business side. A senior analyst with a Ph.D. in computer science understands relational data models and programming languages but may not understand advanced statistical models and statistical programming languages. Similarly, a Ph.D. in statistics understands advanced predictive models and associated tools but may not be prepared to write computer code. The data scientist understands both advanced statistics and modeling and enough about computer programming and systems, along with the relevant domain knowledge. The titles for this role vary but include chief analytics officer, enterprise data architect, data analyst, head of information science and even data scientist.

To align analytics and the associated software to individuals in the organization, businesses should use personas to identify who needs what set of capabilities to be effective. Organizations should also assess competency levels within their personas to avoid adopting software that is too complicated or difficult to use; in some cases individuals will fit multiple personas. Instead of wasting time, resources and financial capital, look to define what is needed and where training is needed to ensure business and IT work collaboratively in business analytics. While some business analytics software is getting easier to use, many of the offerings are still difficult because they are still designed for IT or more sophisticated analysts. While these individuals are an important group, they represent only a small portion of the users who need business analytics tools.

The next generation of business intelligence and business analytics will in part address the need to more easily consume information through smartphones and tablets but will not overcome one of the biggest barriers to big data analytics: the skills gap. Our benchmark research on big data shows staffing (79%) and training (77%) are the two biggest challenges organizations face in efforts to take advantage of big data through analytics. In addition, a language barrier still exists in some organizations, where IT speaks in terms of total cost of ownership (TCO) and efficiency while the business speaks in terms of effectiveness and outcomes, or time to value, which I have written about previously. While all of these goals are important, the organization needs to cater to the metrics needed by its various personas. Such understanding starts with better defining the different personas and building more effective communication among the groups to ensure that they work together collaboratively to achieve their respective goals and get the most value from business analytics.



Microsoft has been steadily pouring money into big data and business intelligence. The company of course owns the most widely used analytical tool in the world, Microsoft Excel, which our benchmark research into Spreadsheets in the Enterprise shows is not going away soon. User resistance (cited by 56% of participants) and lack of a business case (50%) are the most common reasons that spreadsheets are not being replaced in the enterprise. The challenge is ensuring that spreadsheets are not just used personally but are connected and secured within the enterprise to address consistency and a range of potential errors. These issues all add up to more work and maintenance, as my colleague has pointed out recently.

Along with Microsoft SQL Server and SharePoint, Excel is at the heart of the company’s BI strategy. In particular, PowerPivot, originally introduced as an add-on for Excel 2010 and built into Excel 2013, is a discovery tool that enables exploratory analytics and data mashups. PowerPivot uses an in-memory, column-store approach similar to other tools in the market. Its ability to access multiple data sources, including third-party and government data through Microsoft’s Azure Marketplace, enables a robust analytical experience.

Ultimately, information sources are more important than the tool sets used on them. With the Azure Marketplace and access to other new data sources such as Hadoop through its partnership with Hortonworks, as my colleague assessed, Microsoft is advancing in the big data space. Microsoft has partnered with Hortonworks to bring Hadoop data into the fold through HDInsight, which enables familiar Excel environments to access HDFS via HCatalog. This approach is similar to access methods used by other companies, including Teradata, which I wrote about last week. Microsoft stresses the 100 percent open source nature of the Hortonworks approach as a standards-based alternative to the multiple, more proprietary Hadoop distributions appearing throughout the industry. An important benefit for enterprises with Microsoft deployments is that Microsoft Active Directory adds security to HDInsight.

As my colleague Mark Smith recently pointed out about data discovery methods, the analytic discovery category is broad and includes visualization approaches. On the visualization side, Microsoft markets Power View, also part of Excel 2013, which provides visual analytics and navigation on top of Microsoft’s BI semantic model. Users also can annotate and highlight content and then embed it directly into PowerPoint presentations. This direct export feature is valuable because PowerPoint is still a critical communication vehicle in many organizations. Another visual tool, currently in preview, is the Excel add-in GeoFlow, which uses Bing Maps to render visually impressive temporal and geographic data in three dimensions. Such a 3-D visualization technique could be useful in many industries. Our research into next-generation business intelligence found that deploying geographic maps (47%) and visualizing metrics on them (41%) are becoming increasingly important, but Microsoft will need to further exploit location-based analytics and the need for interactivity.

Microsoft has a core advantage in being able to link its front-office tools such as Excel with its back-end systems such as SQL Server 2012 and SharePoint. In particular, by leveraging a common semantic model through Microsoft Analysis Services, in what Microsoft calls its Business Intelligence Semantic Model, users can set up a dynamic exploratory environment through Excel. Once users or analysts have developed a BI work product such as a report, they can publish it directly or through SharePoint. This integration enables business users to share data models and solutions and manage them in common, which applies to security controls as well as visibility into usage statistics that show when particular applications are gaining traction with organizational users.

Usability, which our benchmark research into next-generation business intelligence identifies as the number-one evaluation criterion in nearly two-thirds (64%) of organizations, is still a challenge for Microsoft. Excel power users will appreciate the solid capabilities of PowerPivot, but more casual users of Excel – the majority of business people – do not understand how to build pivot tables or formulas. Our research shows that only 11 percent of Excel users are power users and that most rate their skills as simply adequate (49%) rather than above average or excellent. While PowerView does add some capability, a number of other vendors of visual discovery products, such as Tableau, have focused on user experience from the ground up, so it is clear that Microsoft needs to address this shortcoming in its design environment.

When we consider more advanced analytic strategies and the inclusion of advanced algorithms, Microsoft's direction is not clear. Its Data Analysis eXpressions (DAX) language can help create custom measures and calculated fields, but it is a scripting language akin to MDX. This is useful for IT professionals who are familiar with such tools, but here too business-oriented users will be challenged to use it effectively.

A wild card in Microsoft's BI and analytics strategy is mobile technology. Currently, Microsoft is pursuing a build-once, deploy-anywhere model based on HTML5, and is a key member of the Worldwide Web Consortium (W3C) that is defining the standard. The HTML5 standard, which has just passed a big hurdle in reaching candidate recommendation status, is beginning to show value in the design of new applications that can be accessed through web browsers on smartphones and tablets. The HTML5 approach could be challenging, as our technology innovation research into mobile technology finds that more organizations (39%) prefer native mobile applications from vendors' application stores, compared to 33 percent that prefer a browser-based method; a fifth have no preference. However, the success or failure of its Windows 8-based Surface tablet will be the real barometer of Microsoft's mobile BI success, since integration with the Office franchise is a key differentiator. Early adoption of the tablet has not been strong, but Microsoft is said to be doubling down with a new version to be announced shortly. Success would put Office into the hands of the mobile workforce on a widespread basis via Microsoft devices, which could have far-reaching impacts for the mobile BI market.

As it stands now, however, Microsoft faces an uphill battle in establishing its mobile platform in a market dominated by Android and Apple iOS devices like the iPhone and iPad. If the Surface ultimately fails, Microsoft will likely have to open up Office to run on Android and iOS or risk losing its dominant position. My colleague is quite pessimistic about Microsoft's overall mobile technology efforts and its ability to overcome the reality of the existing market. Our technology innovation research into mobile technology finds that over half of organizations have a preference for their smartphone and tablet technology platform; the top-ranked smartphone platforms are Apple (50%), Android (27%) and RIM (17%), with Microsoft a distant fourth (5%), while for tablets the ranking is Apple (66%), Android (19%) and then Microsoft (8%). Based on these findings, Microsoft faces challenges both on the platform front and in adapting its technology to support the platforms that businesses prefer today.

Ultimately, Microsoft is trying to pull together different initiatives across multiple internal business units that are known for being very siloed and not organized well for customers. It has relied on its channel partners and customers to figure out not just how to make its products work together but also what is possible, since they are not always given clear guidance from Redmond. Recent efforts suggest that Microsoft is coming together to address the big data and business analytics challenge and the massive opportunity it represents. One area in which this is happening is Microsoft's cloud initiatives. Last year's announcement of Azure virtual machines establishes an infrastructure-as-a-service (IaaS) play for Microsoft and positions Windows Azure SQL Database as a service. This could make the back-end systems I've discussed available through a cloud-based offer, but currently this is only offered through the client version of the software.

For organizations that have already installed Microsoft as their primary BI platform and are looking for tight integration with an Excel-based discovery environment, the decision to move forward is relatively simple. The trade-off is that this package is still a bit IT-centric and may not attract as many in the larger body of business users as a more user-friendly discovery product might, nor address the failings of business intelligence. Furthermore, since Microsoft is not as engaged in direct support and service as other players in this market, it will need to move its traditionally technology-focused channel to help customers become more business savvy. For marketing and other business departments, especially in high-velocity industries where usability and time-to-value are at a premium and back-end integration is secondary, other tools will be worth a look. Microsoft has great potential, and with analytics being the top-ranked technology innovation priority among its customers, I hope that the many divisions inside the global software giant can finally come together to deliver a comprehensive approach.


Tony Cosentino

VP and Research Director

Actuate this week announced BIRT Analytics, firmly positioning itself to support a range of business analytics needs, from data discovery and visualization to data mining and predictive capabilities, which opens new avenues of growth. Actuate has long been a staple of large business intelligence deployments; in fact, the company says that ActuateOne delivers more insights to more people than all other BI applications combined. This is likely true, given that Actuate is embedded in major consumer applications across industries worldwide. This announcement builds on the big data advancements I assessed last year and can help the company further expand its technology value to business and IT.

Tools such as BIRT Analytics can change the organizational culture around data and analytics. They put the power of data discovery and data visualization into the hands of tool-savvy managers as well as business analysts. While Actuate has offered highly functional and interactive dashboards in the past, BIRT Analytics brings the usability dimension to a different level. Usability is of the highest importance for business intelligence software to 63 percent of organizations, according to our next-generation business intelligence benchmark research, and it is an area where BIRT Analytics and other tools in its class really show their value. The technology allows not just for visual data exploration, but also for new sources of data to be connected and analyzed without a predefined schema. This fits well with the current world of distributed computing, where everything can no longer be nicely modeled in one place. The software can gather data from different sources, including big data sources, flat files and traditional relational databases, and mash these up through visually appealing toolsets, allowing end-user analysts to bypass IT and avoid much of the data preparation that has been a hallmark of business intelligence in the past. In fact, our recent business technology innovation benchmark research shows that only a little more than half of companies are satisfied with their analytic processes, and 44 percent of organizations indicate that the most time-consuming part of the analytics process is data-related tasks, a pain point Actuate addresses with its ability to handle data efficiently.

Some of the advantages of the BIRT Analytics product are its fast in-memory engine, its ability to handle large amounts of data, and its more advanced analytic capabilities. The company's web site says it offers the fastest data loading tool in the industry with the FastDB main memory database system and an ability to explore 6 billion records in less than a second. These are impressive numbers, especially as we look at big data analytics, which often runs against terabytes of data. The usability of this tool's analytics features is particularly impressive. For instance, set analysis, clustering and predictive capabilities are all part of the software, allowing analysts who aren't necessarily data scientists to conduct advanced data analysis. These capabilities give tools like BIRT Analytics an advantage in the market since they offer simple, end-user-driven ways to produce market segmentation and forecasting reports. These advancements position BIRT Analytics to deliver new benefits; according to our benchmark research on predictive analytics, 68 percent of organizations see predictive analytics as a source of competitive advantage.

Actuate already ranked as a hot vendor in the 2012 Ventana Research Business Intelligence Value Index thanks to its enterprise-level reliability and validated deployments, and this release should further improve its ratings. In the short term, BIRT Analytics will certainly boost Actuate's market momentum, allow it to compete in areas where it would not have been seen before, and help it expand its value to existing customers.


Tony Cosentino

VP and Research Director

Did you catch all the big data analogies people used in 2012? There were many, like the refinement of oil analogy, or the spinning straw into gold analogy, and less useful but more entertaining ones, like big data is like a box of chocolates, or big data is like The Matrix (because “there’s no way Keanu Reeves learns Kung Fu in five seconds without using big data”).  I tend to like the water analogy, which I’ll use here to have a little fun and to briefly describe how I see the business analytics market in 2013.

2013 is about standing lakes of information that will turn into numerous tributaries. These various tributaries of profile, behavioral and attitudinal data will flow into the digital river of institutional knowledge. Analytics, built out by people, process, information and technology, will be the only banks high enough to control this vast river and funnel it through the duct of organizational culture and into an ocean of market advantage.

With this river of information as a backdrop, I’m excited to introduce the Ventana Research Business Analytics Research Agenda for 2013, focused on three themes:

Answering the W’s (the what, the so what, the now what and the then what)

The first and perhaps most important theme of the 2013 research agenda builds on answering the W's – the what, the so what, the now what and the then what – which was also the topic of one of my most widely read blog posts last year. In that piece I suggested that a shift in the discussion from the three V's to the four W's corresponds to the shift from a technologically oriented discussion to a business-oriented one. Volume, variety and velocity are well-known parameters of big data that help facilitate the technology discussion, but when we look at analytics and how it can drive success for an organization, we need to move to the so what, now what and then what of analytical insights, organizational decision-making and closed-loop processes. Our big data research found some significant gaps in what is available across the business analytics spectrum in organizations today, gaps that are fueling a new generation of technology for consuming big data.

Outcome-driven approaches are a good way of framing issues, given that business analytics, and in particular big data analytics, are such broad topics, yet the use cases are so specific. Our big data analytics benchmark research for 2013, which we will start shortly, will therefore look at specific benefits and supported business cases across industries and lines of business in order to assess best practices for big data analytics. The research will investigate the opportunities and barriers that exist today and explore what needs to happen for us to move from an early adopter market to an early majority market. The benchmark research will feed weighting algorithms into our Big Data Analytics Value Index, which will look at the analytics vendors that are tackling the formidable challenges of providing software to analyze large and multistructured datasets.

Disseminating insights within the organization is a big part of moving from insights to action, and business intelligence is still a primary vehicle for driving insight into the organization. While there is a lot to be said about mobile BI, collaborative BI, visual discovery and predictive analytics, core business intelligence systems remain at the heart of many organizations. Therefore we will continue our in-depth coverage of core business intelligence systems with our Business Intelligence Value Index, in the context of our next-generation business analytics benchmark research that will start in 2013.

Embracing next generation technology for business analytics

We’re beginning to see businesses embracing next-generation technology for business analytics. Collaborative business intelligence is a critical part of this conversation, both in terms of getting insights and in terms of making decisions. Last year’s next-generation business intelligence benchmark research showed us that the market is still undecided on how next-generation BI will be rolled out, with business applications being the preferred method, but only slightly more so than business intelligence or office productivity tools. In addition to collaboration, we will focus on mobile and location trends in our next-generation business analytics benchmark research and our new location analytics benchmark research, which we have already found addresses specific needs of business analysts. We see mobile business intelligence as a particularly hot area in 2013, and we are therefore breaking out mobile business intelligence vendors in this year’s Mobile Business Intelligence Value Index that we will conduct.

Another hot area of next-generation technology revolves around right-time data and real-time data and how they can be built into organizational workflows. As our operational intelligence benchmark research found, perceptions around OI and real-time data differ significantly between IT and business users. We will extend this discussion in the context of the big data analytics benchmark research, and specifically in the context of our Operational Intelligence Value Index that we will do in 2013.

Using analytical best practices across business and IT

Our final theme relates to the use of analytical best practices across business and IT. We’ll be looking at best practices for companies as they evolve to become analytics-driven organizations. In this context, we’ll look at exploratory analytics, visual discovery and even English-language representation of the analytics itself as approaches in our next-generation business analytics benchmark research, and at how these affect how we assess BI in our Business Intelligence Value Index. We’ll look at how organizations exploit predictive analytics on big data as a competitive advantage, as found in our predictive analytics benchmark, within the context of our big data analytics benchmark research. We’ll look at the hot areas of sales and customer analytics, including best practices and their intersection with cloud computing models. And we’ll look at previously untapped areas of analytics that are just now heating up, such as human capital analytics. In our human capital analytics benchmark research that will begin shortly, we’ll look across the landscape to assess not just analytics associated with core HR, but analytics around talent management and workforce optimization as well.

I see a high level of innovation and change going on in the business analytics market in 2013. Whenever an industry undergoes such change, high-quality primary research acts as a lighthouse for both customers and suppliers.  Companies can capitalize on all of the exciting developments in analytics, business intelligence and related areas of innovation to drive competitive advantage, but only if they understand the changes and potential value.

I am looking forward to providing a practical perspective on using all forms of business analytics to create value in organizations and to helping our Ventana Research community and clients.

Come read and download the full research agenda.


Tony Cosentino
VP and Research Director
