
Tableau Software’s annual conference, which company spokespeople reported had more than 10,000 attendees, filled the MGM Grand in Las Vegas. Various product announcements supported the company’s strategy to deliver value to analysts and users of visualization tools. Advances include new data preparation and integration features, advanced analytics and mapping. The company also announced the release of a stand-alone mobile application called Vizable. One key message management aimed to promote is that Tableau is more than just a visualization company.

Over the last few years Tableau has made strides in the analytics and business intelligence market with a user-centric philosophy and the ability to engage younger analysts who work in the lines of business rather than in IT. Usability continues to rank as the top criterion for selecting analytic and business intelligence software in all of our business analytics benchmark research. In this area Tableau has introduced innovations such as VizQL, originally developed at Stanford University, which links the ability to query a database with the ability to visualize data. This combination enables users not highly skilled in languages such as SQL or in proprietary business intelligence tools to create and share visually intuitive dashboards. The effect is to provide previously unavailable visibility into areas of business operations. Being able to see and compare performance across operations and people often increases communication and knowledge sharing.

Tableau 9, which I discussed when it was released in April 2015, introduced advances in analytic ease of use and performance, new APIs, data preparation, storyboarding and Project Elastic, the precursor to this year’s announcement of Vizable. Adoption of 9.x appears to be robust, given both the number of conference attendees and the increases in third-quarter revenue ($170 million) and new customers (3,100) reported to the financial markets.

As was the case last year, conference announcements included some developments already on the market as well as some still to come. Among the data preparation capabilities introduced are integration and automated spreadsheet cleanup. For the former, being able to join two data sets through a union function, which adds rows to form a single data set, and to integrate across databases by joining specific data fields gives users flexibility in combining, analyzing and visualizing multiple sets of data. For the latter, to automate the spreadsheet cleanup process Tableau examined usage patterns in Tableau Public to learn how users manually clean their spreadsheets, then applied machine-learning algorithms to help users automate those tasks. Being able to automatically scan Excel files to find subtables and transform data without manual calculations and parsing will save time for analysts who otherwise would have to do these tasks by hand. Our benchmark research into information optimization shows that data preparation consumes the largest portion of time spent on analytics in nearly half (47%) of organizations, a figure that rises to 59 percent in our latest data and analytics in the cloud benchmark research.
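The distinction between the two integration styles can be sketched in a few lines of JavaScript (illustrative only; the field names and data are invented, and this is not Tableau’s implementation): a union stacks rows from similarly shaped data sets, while a cross-database join matches rows on a shared field.

```javascript
// Two data sets with the same shape, e.g. quarterly extracts.
const q1Sales = [
  { region: "East", revenue: 100 },
  { region: "West", revenue: 80 },
];
const q2Sales = [
  { region: "East", revenue: 120 },
  { region: "West", revenue: 95 },
];

// Union: append rows to form a single, longer data set.
const allSales = [...q1Sales, ...q2Sales]; // 4 rows

// Join: combine fields from a second source by matching on a key field.
const regionManagers = [
  { region: "East", manager: "Kim" },
  { region: "West", manager: "Lee" },
];
const joined = allSales.map((row) => ({
  ...row,
  manager: regionManagers.find((m) => m.region === row.region)?.manager,
}));
```

The union changes the number of rows; the join changes the number of fields. Tableau exposes both as drag-and-drop operations rather than code.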

Advanced analytics is another area of innovation for Tableau. The company demonstrated developments in outlier detection and clustering analysis natively integrated with the software. Use of these features is straightforward and visually oriented, replacing the need for statistical charts with drag-and-drop manipulation. The software does not let users specify the number of segments or filter by the degree of the outliers, but the basic capability can reduce data sets to more manageable analytic sets and facilitate exploration of anomalous data points within large sets. Unlike interpreting the box plots introduced at last year’s conference, these tasks are more intuitive and better suited to business users of information.
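For readers unfamiliar with what outlier detection does under the hood, here is a minimal sketch using z-scores, one common method; Tableau has not disclosed which algorithm it uses natively, so treat this as a generic illustration rather than the product’s implementation.

```javascript
// Flag values more than `threshold` standard deviations from the mean.
function zScoreOutliers(values, threshold = 2) {
  const mean = values.reduce((a, b) => a + b, 0) / values.length;
  const sd = Math.sqrt(
    values.reduce((a, v) => a + (v - mean) ** 2, 0) / values.length
  );
  return values.filter((v) => Math.abs(v - mean) > threshold * sd);
}

zScoreOutliers([10, 12, 11, 9, 10, 58]); // flags 58
```

A visual tool performs the same computation behind a drag gesture and highlights the flagged points on the chart instead of returning an array.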

The company also demonstrated new mapping and geospatial features at the conference. Capabilities to analyze down to the ZIP code on a global basis, define custom territories, support geospatial files, integrate with the open source mapping platform Mapbox and perform calculations within the context of a digital map are all useful features for location analytics, which is becoming more important in areas such as customer analytics and digital devices connected in the emerging Internet of Things (IoT). Tableau is adding capabilities that participants most often cited as important in our research on location analytics: to provide geographic representation (72%), visualize metrics associated with locations (65%) and directly select and analyze locations on maps (61%).
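As a concrete example of a calculation performed within the context of a map, the great-circle distance between two points can be computed with the haversine formula. This is a generic sketch of the technique, not tied to Tableau’s or Mapbox’s APIs.

```javascript
// Great-circle distance between two latitude/longitude points, in km.
function haversineKm(lat1, lon1, lat2, lon2) {
  const toRad = (d) => (d * Math.PI) / 180;
  const R = 6371; // mean Earth radius, km
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Las Vegas to San Francisco, roughly 670 km.
haversineKm(36.17, -115.14, 37.77, -122.42);
```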

Tableau insists that its development of new capabilities is guided by customer requests. This provides a source of opportunities to address user needs, especially in the areas of data preparation, advanced analytics and location analytics. However, this strategy raises the question of whether it will ultimately put the company in conflict with the partners that have helped build the Tableau ecosystem and feed the company’s momentum thus far. Tableau is positioning its product as a fully featured analytic platform of the sort that I have outlined, but to achieve that it eventually will have to encroach on the capabilities that partners such as Alteryx, Datawatch, Informatica, Lavastorm, Paxata and Trifacta offer today. Another question is whether Tableau will continue its internal development strategy or opt to acquire companies that can broaden its capabilities, a gap that has hampered its overall value rating in our 2015 Analytics and Business Intelligence Value Index. In light of announcements at the conference, the path seems to be to develop these capabilities in-house. While there appears to be no immediate threat to the partnerships, continued development of some of these capabilities eventually will affect the partner business model in a more material way. Given that the majority of deals in its partner ecosystem flow through Tableau itself, many of the partners are vulnerable to these development efforts. In addition, I will be watching how aggressively Tableau helps to market Spark, the open source big data technology that I wrote about, as compared to some of the partner technologies that Spark threatens. Tableau has already built on Spark while some of its competitors have not, which may give Tableau a window of opportunity.

Going forward, integration with transactional systems and emerging cloud ecosystems is an area of Tableau’s that I will be watching. Given its architecture, it is not easy for Tableau to participate in the new generation of service-oriented architectures that characterize part of today’s cloud marketplace. For this reason, Tableau will need to continue to build out its own platform and the momentum of its ecosystem, which at this point does not appear to be a problem.

Finally, it will be interesting to see how Tableau eventually aligns its stand-alone data visualization application Vizable with its broader mobile strategy. We will look closely at the mobile market in our upcoming Mobile Analytics and Business Intelligence Value Index, due in the first half of 2016; our last analysis found Tableau in the middle of the pack among providers, but the company has made further investments since then.

We recommend that companies exploring analytics platforms, especially for on-premises and hosted cloud use, include Tableau on their short lists. Organizations considering an enterprise deployment of Tableau should look closely at how it aligns with their broader user requirements and whether its cloud strategy will meet their future needs. Furthermore, while the company has made improvements in manageability and performance, these can still be a concern in some circumstances. Tableau should also be evaluated with specific business objectives in mind and in conjunction with its partner ecosystem.


Ventana Research

Qlik was an early pioneer in developing a substantial market for a visual discovery tool that enables end users to easily access and manipulate analytics and data. Its QlikView application uses an associative experience that takes an in-memory, correlation-based approach to present a simpler design and user experience for analytics than previous tools. Driven by sales of QlikView, the company’s revenue has grown to more than $500 million, and, having originated in Sweden, it now has a global presence.

At its annual analyst event in New York the business intelligence and analytics vendor discussed recent product developments, in particular the release of Qlik Sense. It is a drag-and-drop visual analytics tool targeted at business users but scalable enough for enterprise use. Its aim is to give business users a simplified visual analytic experience that takes advantage of modern cloud technologies. Such a user experience is important; our benchmark research into next-generation business intelligence shows that usability is an important buying criterion for nearly two out of three (63%) companies. A couple of months ago, Qlik introduced Qlik Sense for desktop systems, and at the analyst event it announced general availability of the cloud and server editions.

According to our research into business technology innovation, analytics is the top initiative for new technology: 39 percent of organizations ranked it their number-one priority. Analytics includes exploratory and confirmatory approaches to analysis. Ventana Research refers to exploratory analytics as analytic discovery and segments it into four categories that my colleague Mark Smith has articulated. Qlik’s products belong in the analytic discovery category. With such a tool, users can investigate data sets in an intuitive and visual manner, often for root-cause analysis and decision support. This software market is relatively young, and competing companies are evolving and redesigning their products to suit changing tastes. Tableau, one of Qlik’s primary competitors, which I wrote about recently, is adapting its current platform to developments in hardware and in-memory processing, focusing on usability and opening up its APIs. Others have recently made their first moves into the market for visual discovery applications, including Information Builders and MicroStrategy. Companies such as Actuate, IBM, SAP, SAS and Tibco are focused on incorporating more advanced analytics into their discovery tools. For buyers, this competitive and fragmented market creates a challenge when comparing offerings.

A key differentiator is Qlik Sense’s new modern architecture, which is designed for cloud-based deployment and for embedding in other applications for specialized use. Its analytic engine plugs into a range of Web services. For instance, the Qlik Sense API enables the analytic engine to call a data set on the fly and allows the application to manipulate data in the context of a business process. An entire table can be delivered to node.js, which provides server-side JavaScript features and enables the Qlik Sense engine to take on an almost unlimited number of real-time connections by not blocking input and output. Previously, developers could write PHP scripts and pipe SQL to get the data; the resulting applications were viable but complex to build and maintain. Now all they need is JavaScript and HTML. The Qlik Sense architecture abstracts the complexity and allows JavaScript developers to make use of complex constructs without intricate knowledge of the database. The new architecture can decouple the Qlik engine from the visualizations themselves, so Web developers can define expressions and dimensions without going into the complexities of the server-side architecture. Furthermore, by decoupling the services, developers gain access to open source visualization technologies such as d3.js. Cloud-based business intelligence and extensible analytics are becoming a hot topic; I have written about this, including a glimpse of our newly announced benchmark research on the next generation of data and analytics in the cloud. From a business user’s perspective, these architectural changes may not mean much, but for developers, OEMs and UX design teams, they allow much faster time to value through a simpler component-based approach to utilizing the Qlik analytic engine and building visualizations.
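The benefit of the non-blocking model can be illustrated with a small node.js-style sketch. Here `fetchFromEngine` is a hypothetical stand-in for a real engine call, not part of the Qlik Sense API; the point is only that asynchronous requests are interleaved by the event loop rather than queued serially.

```javascript
// Stand-in for a real engine call; resolves after `ms` milliseconds.
function fetchFromEngine(query, ms) {
  return new Promise((resolve) =>
    setTimeout(() => resolve(`result:${query}`), ms)
  );
}

async function main() {
  const start = Date.now();
  // Three requests issued concurrently; total wait is roughly the
  // slowest single request, not the sum of all three.
  const results = await Promise.all([
    fetchFromEngine("sales", 100),
    fetchFromEngine("margin", 100),
    fetchFromEngine("orders", 100),
  ]);
  const elapsed = Date.now() - start;
  return { results, elapsed };
}
```

Because no request blocks the event loop while waiting on I/O, one process can hold many such connections open at once, which is the property the Qlik Sense engine exploits.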

The modern architecture of Qlik Sense, together with the company’s ecosystem of more than 1,000 partners and a professional services organization that has completed more than 2,700 consulting engagements, gives Qlik a competitive position. The service partner relationships, including those with major systems integrators, are key to the company’s future, since analytics is as much about change management as technology. Our research in analytics consistently shows that people and processes lag behind technology and information in analytics performance. Furthermore, in our benchmark research into big data analytics, the benefits most often mentioned as achieved are better communication and knowledge sharing (24%), better management and alignment of business goals (18%), and gaining competitive advantage (17%).

As tested on my desktop, Qlik Sense shows an intuitive interface with drag-and-drop capabilities for building analyses. Formulas are easy to incorporate as new measures, and the palette offers a variety of visualization options that automatically fit to the screen. The integration with QlikView is straightforward in that a data model from QlikView can be saved seamlessly and opened intact in Qlik Sense. The storyboard function allows multiple visualizations to be built into narratives and annotations to be added, including linkages with data. For instance, annotations can be added to specific inflection points in a trend line or to outliers that may need explanation. Since the approach is entirely HTML5-based, the visualizations are ready for deployment to mobile devices and responsive to various screen sizes, including newer smartphones, tablets and the new class of so-called phablets. In the evaluation of vendors in our Mobile Business Intelligence Value Index, Qlik ranked fourth overall.

In the software business, of course, technology advances alone don’t guarantee success. Qlik has struggled to clarify the positioning of its next-generation product and to communicate that it is not a replacement for QlikView. QlikView users are passionate about keeping their existing tool because they have already designed dashboards and calculations with it. Vendors should not underestimate user loyalty and adoption. Qlik therefore now promises to support both products for as long as the market continues to demand them. The majority of R&D investment will go into Qlik Sense as developers focus on surpassing the capabilities of QlikView. For now, the company will follow a bifurcated strategy in which the tools work together to meet the needs of various organizational personas. To me, this is the right strategy. There is no issue in being a two-product company, and the revised positioning of Qlik Sense complements QlikView on both the self-service side and the developer side. Qlik Sense is not yet as mature a product as QlikView, but from a business user’s perspective it is a simple and effective analysis tool for exploring data and building different data views. It is simpler because users do not need to script the data in order to create the specific views they deem necessary. As the product matures, I expect it to become more than an end user’s visual analysis tool, since the capabilities of Qlik Sense lend themselves to web-scale approaches. Over time, it will be interesting to see how the company harmonizes the two products and how quickly customers adopt Qlik Sense as a stand-alone tool.

For companies already using QlikView, Qlik Sense is an important addition to the portfolio. It will allow business users to become more engaged in exploring data and sharing ideas. Even for those not using QlikView, with its modern architecture and open approach to analytics, Qlik Sense can help future-proof an organization’s current business intelligence architecture. For those considering Qlik for the first time, the choice may be whether to bring in one or both products. Given the proven approach of QlikView, in the near term a combination approach may be the better solution in some organizations. Partners, content providers and ISVs should consider Qlik Branch, which provides resources for embedding Qlik Sense directly into applications. The site provides developer tools, community efforts such as d3.js integrations and synchronization with GitHub for sharing and branching of designs. For every class of user, Qlik Sense can be downloaded for free and tested directly on the desktop. Qlik has made significant strides with Qlik Sense, and it is worth a look for anybody interested in the cutting edge of analytics and business intelligence.


Ventana Research

Ventana Research recently completed the most comprehensive evaluation of mobile business intelligence products and vendors available anywhere today. The evaluation covers 16 technology vendors’ offerings on smartphones and tablets across Apple, Google Android, Microsoft Surface and RIM BlackBerry devices, assessed in seven key categories: usability, manageability, reliability, capability, adaptability, vendor validation, and TCO and ROI. The result is our Value Index for Mobile Business Intelligence in 2014. The analysis shows that the top supplier is MicroStrategy, which qualifies as a Hot vendor and is followed by 10 other Hot vendors: IBM, SAP, QlikTech, Information Builders, Yellowfin, Tableau Software, Roambi, SAS, Oracle and arcplan.

Our expertise, hands-on experience and the buyer findings from our benchmark research on next-generation business intelligence and on information optimization informed our product evaluations in this new Value Index. The research examined business intelligence on mobile technology to determine organizations’ current and planned use and the capabilities required for successful deployment.

What we found was wide interest in mobile business intelligence and a desire to improve the use of information in 40 percent of organizations, though adoption is less pervasive than interest. Fewer than half of organizations currently access BI capabilities on mobile devices, but nearly three-quarters (71%) expect their mobile workforce to be able to access BI capabilities in the next 12 months. The research also shows strong executive support: Nearly half of executives said that mobility is very important to their BI processes.

Ease of access and use is an important criterion in this Value Index because the largest percentage of organizations identified usability as an important factor in evaluations of mobile business intelligence applications. This is an emphasis that we find in most of our research, and in this case it also may reflect users’ experience with first-generation business intelligence on mobile devices; not all those applications were optimized for touch-screen interfaces and designed to support gestures. It is clear that today’s mobile workforce requires the ability to access and analyze data simply and in a straightforward manner, using an intuitive interface.

The top five companies’ products in our 2014 Mobile Business Intelligence Value Index all provide strong user experiences and functionality. MicroStrategy stood out across the board, finishing first in five categories and most notably in the areas of user experience, mobile application development and presentation of information. IBM, the second-place finisher, has made significant progress in mobile BI with six releases in the past year, adding support for Android, advanced security features and an extensible visualization library. SAP’s steady support for mobile access to the SAP BusinessObjects platform and to SAP Lumira, along with its integrated mobile device management software, helped produce high scores in various categories and put it in third place. QlikTech’s flexible offline deployment capabilities for the iPad and its high ranking in the assurance-related category of TCO and ROI secured it the fourth spot. With the latest release of WebFOCUS, which renders content directly in HTML5, and its Active Technologies and Mobile Faves, Information Builders delivers strong mobile capabilities and rounds out the top five. Other noteworthy innovations in mobile BI include Yellowfin’s collaboration technology and Roambi’s use of storyboarding in its Flow application.

Although there is some commonality in how vendors provide mobile access to data, there are many differences among their offerings that can make one a better fit than another for an organization’s particular needs. For example, companies that want their mobile workforce to be able to engage in root-cause discovery analysis may prefer tools from Tableau and QlikTech. For large companies looking for a custom application approach, MicroStrategy or Roambi may be good choices, while others looking for streamlined collaboration on mobile devices may prefer Yellowfin. Many companies may base their mobile business intelligence decision on the vendor they currently have installed. Customers with large implementations from IBM, SAP or Information Builders will be reassured to find that these companies have made mobility a critical focus.

To learn more about this research and to download a free executive summary, please visit


Tony Cosentino

Vice President and Research Director

Microsoft has been steadily pouring money into big data and business intelligence. The company of course owns the most widely used analytical tool in the world, Microsoft Excel, which our benchmark research into Spreadsheets in the Enterprise shows is not going away soon. User resistance (cited by 56% of participants) and lack of a business case (50%) are the most common reasons that spreadsheets are not being replaced in the enterprise. The challenge is ensuring that spreadsheets are not just used personally but are connected to and secured within the enterprise to address consistency and a range of potential errors. These issues all add up to more work and maintenance, as my colleague has pointed out recently.

Along with Microsoft SQL Server and SharePoint, Excel is at the heart of the company’s BI strategy. In particular, PowerPivot, originally introduced as an add-on for Excel 2010 and built into Excel 2013, is a discovery tool that enables exploratory analytics and data mashups. PowerPivot uses an in-memory, column store approach similar to other tools in the market. Its ability to access multiple data sources, including third-party and government data through Microsoft’s Azure Marketplace, enables a robust analytical experience.
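The column-store idea behind engines of this kind can be sketched simply. This is an illustration of the general technique, not Microsoft’s implementation: pivoting rows into per-column arrays lets an aggregate scan one contiguous array instead of every field of every row.

```javascript
// Row-oriented records, as a spreadsheet or RDBMS would hold them.
const rows = [
  { product: "A", units: 3, price: 10 },
  { product: "B", units: 5, price: 20 },
  { product: "A", units: 2, price: 10 },
];

// Pivot into columnar form: one array per field.
const columns = {};
for (const row of rows) {
  for (const [key, value] of Object.entries(row)) {
    if (!columns[key]) columns[key] = [];
    columns[key].push(value);
  }
}

// SUM(units) now scans a single contiguous array.
const totalUnits = columns.units.reduce((a, b) => a + b, 0); // 10
```

Column stores also compress well, since each array holds values of one type, which is part of why in-memory engines can handle large data sets.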

Ultimately, information sources are more important than the tool sets used on them. With the Azure Marketplace and access to other new data sources such as Hadoop through its partnership with Hortonworks, as my colleague assessed, Microsoft is advancing in the big data space. Microsoft has partnered with Hortonworks to bring Hadoop data into the fold through HDInsight, which enables familiar Excel environments to access HDFS via HCatalog. This approach is similar to access methods used by other companies, including Teradata, which I wrote about last week. Microsoft stresses the 100 percent open source nature of the Hortonworks approach as a standard alternative to the multiple, more proprietary Hadoop distributions appearing throughout the industry. An important benefit for enterprises with Microsoft deployments is that Microsoft Active Directory adds security to HDInsight.

As my colleague Mark Smith recently pointed out about data discovery methods, the analytic discovery category is broad and includes visualization approaches. On the visualization side, Microsoft markets Power View, also part of Excel 2013, which provides visual analytics and navigation on top of Microsoft’s BI semantic model. Users also can annotate and highlight content and then embed it directly into PowerPoint presentations. This direct export feature is valuable because PowerPoint is still a critical communication vehicle in many organizations. Another visual tool, currently in preview, is the Excel add-in GeoFlow, which uses Bing Maps to render visually impressive temporal and geographic data in three dimensions. Such a 3-D visualization technique could be useful in many industries. Our research into next-generation business intelligence found that deploying geographic maps (47%) and visualizing metrics on them (41%) are becoming increasingly important, but Microsoft will need to further exploit location-based analytics and support the need for interactivity.

Microsoft has a core advantage in being able to link its front-office tools such as Excel with its back-end systems such as SQL Server 2012 and SharePoint. In particular, by leveraging a common semantic model through Microsoft Analysis Services, in what Microsoft calls its Business Intelligence Semantic Model, users can set up a dynamic exploratory environment through Excel. Once users or analysts have developed a BI work product, such as a report, they can publish it directly or through SharePoint. This integration enables business users to share data models and solutions and manage them in common, which applies to security controls as well as visibility into usage statistics, showing when particular applications are gaining traction with organizational users.

Usability, which our benchmark research into next-generation business intelligence identifies as the number-one evaluation criterion in nearly two-thirds (64%) of organizations, is still a challenge for Microsoft. Excel power users will appreciate the solid capabilities of PowerPivot, but more casual users of Excel, the majority of business people, do not understand how to build pivot tables or formulas. Our research shows that only 11 percent of Excel users are power users and that most skill levels are simply adequate (49%) rather than above average or excellent. While Power View does add some capability, a number of other vendors of visual discovery products such as Tableau have focused on user experience from the ground up, so it is clear that Microsoft needs to address this shortcoming in its design environment.

When we consider more advanced analytic strategies and the inclusion of advanced algorithms, Microsoft’s direction is not clear. Its Data Analysis Expressions (DAX) language can help create custom measures and calculated fields, but it is a scripting language akin to MDX. This is useful for IT professionals who are familiar with such tools, but here, too, business-oriented users will be challenged to use it effectively.

A wild card in Microsoft’s BI and analytics strategy is mobile technology. Currently, Microsoft is pursuing a build-once, deploy-anywhere model based on HTML5, and it is a key member of the World Wide Web Consortium (W3C), which is defining the standard. The HTML5 standard, which has just cleared a big hurdle in reaching candidate recommendation status, is beginning to show value in the design of new applications that can be accessed through web browsers on smartphones and tablets. The HTML5 approach could be challenging, as our technology innovation research into mobile technology finds that more organizations (39%) prefer native mobile applications from vendors’ application stores, compared with 33 percent that prefer a web-browser-based method and a fifth with no preference. However, the success or failure of its Windows 8-based Surface tablet will be the real barometer of Microsoft’s mobile BI success, since its integration with the Office franchise is a key differentiator. Early adoption of the tablet has not been strong, but Microsoft is said to be doubling down with a new version to be announced shortly. Success would put Office into the hands of the mobile workforce on a widespread basis via Microsoft devices, which could have far-reaching impacts on the mobile BI market.

As it stands now, however, Microsoft faces an uphill battle in establishing its mobile platform in a market dominated by Android and Apple iOS devices like the iPhone and iPad. If the Surface ultimately fails, Microsoft will likely have to open up Office to run on Android and iOS or risk losing its dominant position. My colleague is quite pessimistic about Microsoft’s overall mobile technology efforts and its ability to overcome the reality of the existing market. Our technology innovation research into mobile technology finds that over half of organizations have a preference for their smartphone and tablet technology platform; the first-ranked smartphone platforms are Apple (50%), Android (27%) and RIM (17%), with Microsoft a distant fourth (5%), while for tablets the ranking is Apple (66%), Android (19%) and then Microsoft (8%). Based on these findings, Microsoft faces challenges both on the platform front and in adapting its technology to support the platforms that businesses prefer today.

Ultimately, Microsoft is trying to pull together different initiatives across multiple internal business units that are known for being siloed and not well organized for customers. It has relied on its channel partners and customers to figure out not just how to make the pieces work together but also what is possible, since they are not always given clear guidance from Redmond. Recent efforts suggest that Microsoft is coming together to address the big data and business analytics challenge and the massive opportunity it represents. One area in which this is happening is Microsoft’s cloud initiatives. Last year’s announcement of Azure virtual machines enables an infrastructure-as-a-service (IaaS) play for Microsoft and positions Windows Azure SQL Database as a service. This could make the back-end systems I’ve discussed available through a cloud-based offer, but currently this is only offered through the client version of the software.

For organizations that have already installed Microsoft as their primary BI platform and are looking for tight integration with an Excel-based discovery environment, the decision to move forward is relatively simple. The trade-off is that this package is still a bit IT-centric and may not attract as many of the larger body of business users as a more user-friendly discovery product might, nor address the failings of business intelligence. Furthermore, since Microsoft is not as engaged in direct support and service as other players in this market, it will need to move its traditionally technology-focused channel to help customers become more business-savvy. For marketing and other business departments, especially in high-velocity industries where usability and time to value are at a premium and back-end integration is secondary, other tools will be worth a look. Microsoft has great potential, and with analytics being the top-ranked technology innovation priority among its customers, I hope that the many divisions inside the global software giant can finally come together to deliver a comprehensive approach.


Tony Cosentino

VP and Research Director

Our benchmark research into business technology innovation found that analytics is the most important new technology for improving organizations’ performance; participants ranked big data only fifth out of six choices. This and other findings indicate that the best way for big data to contribute value to today’s organizations is to be paired with analytics. Recently, I wrote about what I call the four pillars of big data analytics, on which the technology must be built. These pillars are big data and information optimization, predictive analytics, right-time analytics, and the discovery and visualization of analytics. These components gave me a framework for looking at Teradata’s approach to big data analytics during the company’s analyst conference last week in La Jolla, Calif.

The essence of big data is to optimize the information used by the business for whatever the need, which my colleague has identified as a key value of these investments. Data diversity presents a challenge to most enterprise data warehouse architectures. Teradata has been dealing with large, complex sets of data for years, but today's different data types are forcing new modes of processing in enterprise data warehouses. Teradata is addressing this issue by focusing on a workload-specific architecture that aligns with MapReduce, statistics and SQL. Its Unified Data Architecture (UDA) incorporates the Hortonworks Hadoop distribution, the Aster Data platform and Teradata's stalwart RDBMS EDW. The Big Data Analytics appliance that encompasses the UDA framework won our annual innovation award in 2012. The system is connected through InfiniBand and accesses Hadoop's metadata layer directly through HCatalog. Bringing these pieces together represents the type of holistic thinking that is critical for handling big data analytics; at the same time there are some costs, as the system includes two MapReduce processing environments. For more on the UDA architecture, read my previous post on Teradata as well as my colleague Mark Smith's piece.

Predictive analytics is another foundational piece of big data analytics and one of the top priorities in organizations. However, according to our big data research, it is not available in 41 percent of organizations today. Teradata is addressing it in a number of ways, and at the conference Stephen Brobst, Teradata's CTO, likened big data analytics to a high-school chemistry classroom that has a chemical closet from which you pull the chemicals needed to perform an experiment in a separate work area. In this analogy, Hadoop and the RDBMS EDW are the chemical closet, and Aster Data provides the sandbox where the experiment is conducted. With multiple algorithms currently written into the platform and many more promised over the coming months, this sandbox provides a promising big data lab environment. The approach is SQL-centric and as such has its pros and cons. The obvious advantage is that SQL is a declarative language that is easier to learn than procedural languages, and an established skills base exists within most organizations. The disadvantage is that SQL is not the native tongue of many business analysts and statisticians. While it may be easy to call a function within the context of a SQL statement, the same person who can write the statement may not know when and where to call the function. One way for Teradata to expediently address this need is through its existing partnerships with companies like Alteryx, which I wrote about recently. Alteryx provides a user-friendly analytical workflow environment and is establishing a solid presence on the business side of the house. Teradata already works with predictive analytics providers like SAS but should further expand with companies like Revolution Analytics, which I have assessed, that are using R technology to support a new generation of tools.

Teradata is exploiting its advantage with algorithms such as nPath, which shows the path that a customer has taken to a particular outcome such as buying or not buying. According to our big data benchmark research, being able to conduct what-if analysis and predictive analytics are the two most desired capabilities not currently available with big data, as the chart shows. The algorithms that Teradata is building into Aster help address this challenge, but despite customer case studies shown at the conference, Teradata did not clearly demonstrate how this type of algorithm and others integrate seamlessly to address the overall customer experience or other business challenges. While presenters framed it in terms of improving churn and fraud models, and we can imagine how the handoffs might occur, the presentations were more technical in nature. As Teradata gains traction with these types of analytical approaches, it will behoove the company to show not just how the algorithms and SQL work but how they can be used by businesspeople and analysts who are not as technically savvy.
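
The core idea behind path-to-outcome analysis is simple enough to sketch. The following is only an illustration in Python with made-up clickstream data; Teradata's actual nPath is a SQL-MapReduce function with a much richer pattern language:

```python
from collections import Counter

# Hypothetical clickstream events per customer. nPath itself is an
# Aster SQL-MR function; this sketch only illustrates the underlying
# idea: counting the ordered steps that lead to an outcome event.
events = {
    "c1": ["home", "search", "product", "cart", "buy"],
    "c2": ["home", "product", "exit"],
    "c3": ["search", "product", "cart", "buy"],
    "c4": ["home", "search", "exit"],
}

def paths_to_outcome(events, outcome, depth=2):
    """Count the `depth` steps immediately preceding the outcome event."""
    counts = Counter()
    for path in events.values():
        if outcome in path:
            i = path.index(outcome)
            counts[tuple(path[max(0, i - depth):i])] += 1
    return counts

print(paths_to_outcome(events, "buy"))
# Both buyers arrived via product -> cart
```

A business user reading the result sees immediately which sequences precede a purchase, which is exactly the kind of framing, rather than the SQL itself, that would resonate with less technical analysts.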

Another key principle behind big data analytics is timeliness of the analytics. Given the nature of business intelligence and traditional EDW architectures, until now timeliness of analytics has been associated with how quickly queries run. This has been a strength of Teradata's MPP shared-nothing architecture, but other appliance architectures, such as those of Netezza and Greenplum, now challenge Teradata's hegemony in this area. Furthermore, trends in big data make the situation more complex. In particular, with very large data sets, many analytical environments have replaced traditional row-level access with column access. Column access is a more natural way for data to be accessed for analytics, since the system does not have to read through an entire row of data that may not be relevant to the task at hand. At the same time, column-level access has downsides, such as the reduced speed at which you can write to the system; also, as the data set used in the analysis expands to a high number of columns, it can become less efficient than row-level access. Teradata addresses this challenge by providing both row and column access through innovative proprietary access and computation techniques.
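
The row-versus-column trade-off can be sketched with plain Python structures. This is illustrative only, with hypothetical data; Teradata's hybrid row/column storage is proprietary and far more sophisticated:

```python
# Row store: each record is kept whole, so an aggregate over one
# field must still walk every full row.
rows = [
    {"id": 1, "region": "west", "revenue": 100.0},
    {"id": 2, "region": "east", "revenue": 250.0},
    {"id": 3, "region": "west", "revenue": 175.0},
]
total_rowwise = sum(r["revenue"] for r in rows)

# Column store: each column is a contiguous sequence, so the same
# aggregate touches only the one column it needs.
columns = {
    "id": [1, 2, 3],
    "region": ["west", "east", "west"],
    "revenue": [100.0, 250.0, 175.0],
}
total_columnar = sum(columns["revenue"])

# Writes show the opposite trade-off: appending one record is a
# single operation in the row store but one append per column here.
def insert(columns, record):
    for name, values in columns.items():
        values.append(record[name])

insert(columns, {"id": 4, "region": "east", "revenue": 90.0})
```

Both totals are the same, of course; the difference is how much data each layout must touch to produce them, which is why analytic scans favor columns while write-heavy workloads favor rows.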

Exploratory analytics on large, diverse data sets also has a timeliness imperative. Hadoop promises the ability to conduct iterative analysis on such data sets, which is the reason that companies store big data in the first place according to our big data benchmark research. Iterative analysis is akin to the way the human brain naturally functions, as one question naturally leads to another question. However, methods such as Hive, which allows an SQL-like method to access Hadoop data, can be very slow, sometimes taking hours to return a query. Aster enables much faster access and therefore provides a more dynamic interface for iterative analytics on big data.

Timeliness also has to do with incorporating big data in a stream-oriented environment; only 16 percent of organizations are very satisfied with the timeliness of events, according to our operational intelligence benchmark research. In use cases such as fraud and security, rule-based systems work with complex algorithmic functions to uncover criminal activity. While Teradata itself does not provide streaming or complex event processing (CEP) engines, it can provide the big data analytical sandbox and the algorithmic firepower these systems need. Teradata partners with major players in this space already but would be well served to partner further with CEP and other operational intelligence vendors to expand its footprint. These vendors will be covered in our upcoming Operational Intelligence Value Index, which is based on our operational intelligence benchmark research. The same research showed that analyzing business and IT events together is very important in 45 percent of organizations.

The visualization and discovery of analytics is the last foundational pillar, and here Teradata is still a work in progress. While some of the big data visualizations Aster generates show interesting charts, they lack the context to help people interpret them. Furthermore, the visualization is not as intuitive as it could be and requires writing and customizing SQL statements. To be fair, most visual discovery tools today are relationally oriented, and Teradata is trying to visualize large and diverse sets of data. Teradata also partners with companies including MicroStrategy and Tableau to provide more user-friendly interfaces. As Teradata pursues the big data analytics market, it will be important to demonstrate how it works with its partners to build a more robust and intuitive analytics workflow environment and visualization capability for the line-of-business user. Usability (63%) and functionality (49%) are the top two considerations when evaluating business intelligence systems, according to our research on next-generation business intelligence.

Like other large industry technology players, Teradata is adjusting to the changes brought by business technology innovation in just the last few years. Given its highly scalable databases and data modeling – areas that still represent the heart of most companies' information architectures – Teradata has the potential to pull everything together and leverage its current deployed base. Technologists looking at Teradata's new and evolving capabilities will need to understand the business use cases and share these with the people in charge of such initiatives. For business users, it is important to realize that big data is more than just visualizing disparate data sets and that greater value lies in setting up an efficient back-end process that applies the right architecture and tools to the right business problem.


Tony Cosentino
VP and Research Director

This year’s Inspire, Alteryx’s annual user conference, featured new developments around the company’s analytics platform. Alteryx CEO Dean Stoecker kicked off the event by talking about the promise of big data, the dissemination of analytics throughout the organization, and the data artisan as the “new boss.” Alteryx coined the term “data artisan” to represent the persona at the center of the company’s development and marketing efforts. My colleague Mark Smith wrote about the rise of the data artisan in his analysis of last year’s event.

President and COO George Mathew keynoted day two, getting into more specifics on the upcoming 8.5 product release. Advancements revolve around improvements in the analytical design environment, embedded search capabilities, the addition of interactive mapping and direct model output into Tableau. The goal is to provide an easier, more intuitive user experience. Our benchmark research into next-generation business intelligence shows buyers consider usability the top buying criterion, at 63 percent. The redesigned Alteryx interface boasts a new look for the icons and more standardization across different functional environments. Color coding of the toolbox groups tools according to function, such as data preparation, analytics and reporting. A new favorites function is another good addition, given that users tend to rely on the same tools depending on their role within the analytics value chain. Users can now look at workflows horizontally and not just vertically, and easily change the orientation if, for example, they are working on an Apple iPad. Version 8.5 allows embedded search and more streamlined navigation, and continues the focus on a role-based application that my colleague has been advocating for a while. According to the company, 94 percent of its user base demanded interactive mapping; that is now part of the product, letting users draw a polygon around an area of interest and then integrate it into the analytical application for runtime execution.

The highlight of the talk was the announcement of integration with Tableau 8.0 and the ability to write directly to the software without having to follow the cumbersome process of exporting a file and then reopening it in another application. Alteryx was an alpha partner and worked directly with the code base for Tableau 8.0, which I wrote up a few months ago. The partnership exemplifies the coopetition environment that many companies find themselves in today. While Tableau does some basic prediction and Alteryx does some basic visual reporting, the two companies' core competencies brought together into one workflow are much more powerful for the user. Another interesting aspect is the juxtaposition of the two user groups: the visually oriented Tableau group in San Diego seemed much younger and was certainly much louder at the reveals, while the analytically oriented Alteryx group was much more subdued.

Alteryx has been around since 1997, when it was called SRC. It grew up focused on location analytics, which allowed it to establish foundational analytic use cases in vertical areas such as real estate and retail. After changing the company name and focusing more on horizontal analytics, Alteryx is growing fast with backing from, interestingly enough, SAP Ventures. Since the company was already profitable, it used a modest infusion of capital to grow its product marketing and sales functions. The move seems to have paid off. Companies such as Dunkin' Brands and Redbox use Alteryx, and the company has made significant inroads with marketing services companies. A number of consulting companies, such as Absolute Data and Capgemini, are using Alteryx for customer and marketing analytics and other use cases. I had an interesting talk with the CEO of a small but important services firm who said that he is being asked to introduce innovative analytical approaches to much larger marketing services and market research firms. He told me that Alteryx is a key part of the solution he'll be introducing to enable things such as big data analytics.

Alteryx provides value in a few innovative ways that are not new to this release but are foundational to the company's business strategy. First, it marries data integration with analytics, which allows business users who have traditionally worked in a flat-file environment to pull from multiple data sources and integrate information within the context of the Alteryx application. Within that same environment, users can build analytic workflows and publish applications to a private or public cloud. This approach helps address the obstacles found in our research in big data analytics, where staffing (79%) and training (77%) top the list; Alteryx addresses them by giving business users more flexibility to engage in the analytic process.

Alteryx manages an analytics application store called the Analytics Gallery that crowdsources and shares user-created models. These analytical assets can be used internally within an organization or sold on the broader market. Proprietary algorithms can be secured through a black box approach, or made open to allow other users to tweak the analytic code. It’s similar to what companies like Datameer are doing on top of Hadoop, or Informatica in the cloud integration market. The store gives descriptions of what the applications do, such as fuzzy matching or target marketing. Being crowdsourced, the number of applications should proliferate over time, tracking advancements in the R open source project, since R is at the heart of the Alteryx analytic strategy and what it calls clear box analytics. The underlying algorithm is easily viewed and edited based on permissions established by the data artisan, similar to what we’ve seen with companies such as 1010data. Alteryx 8.5 works with R 3.0, the latest version. On the back end, Alteryx partners with enterprise data warehouse powerhouses such as Teradata, and works with the Hortonworks Hadoop distribution.

I encourage analysts of all stripes to take a look at the Alteryx portfolio. Perhaps start with the Analytics Gallery to get a flavor of what the company does and the type of analytics customers are building and using today. Alteryx can benefit analysts looking to move beyond the limitations of a flat-file analytics environment, and especially marketing analysts who want to marry third-party data from sources such as the US Census Bureau, Experian, TomTom or Salesforce, which Alteryx offers within its product. If you have not seen Alteryx, take a look at how it is changing the way analytic processes are designed and managed.


Tony Cosentino

VP and Research Director

SiSense gained a lot of traction last week at the Strata conference in San Jose as it broke records in the so-called 10x10x10 Challenge – analyzing 10 terabytes of data in 10 seconds on a $10,000 commodity machine – and earned the company's Prism product the Audience Choice Award. The Israel-based company, founded in 2005, has venture capital backing and is currently running at a profit, with customers in more than 50 countries and marquee accounts such as Target and Merck. Prism, its primary product, provides the entire business analytics stack, from ETL capabilities through data analysis and visualization. From the demonstrations I've seen, the toolset appears relatively user-friendly, which is important because usability is the top selection criterion in 63 percent of organizations, according to our next-generation business intelligence research.

Prism comprises three main components: ElastiCube Manager, Prism BI Studio and Prism Web. ElastiCube Manager provides a visual environment with which users can ingest data from multiple sources, define relationships between data and do transforms via SQL functions. Prism BI Studio provides a visual environment that lets customers build dashboards that link data and provide users with dynamic charts and interactivity. Finally, Prism Web provides web-based functionality for interacting, sharing dashboards and managing user profiles and accounts.

At the heart of Prism is the ElastiCube technology, which can query large volumes of data quickly. ElastiCube uses a columnar approach, which allows for fast compression. With SiSense, queries are optimized by the CPU itself; that is, the system decides in an ad hoc manner the most efficient way to use both disk and memory. Most other approaches on the market lean either toward a pure-play in-memory system or toward a columnar approach.
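
A hedged illustration of why columnar layouts compress well: low-cardinality or sorted columns collapse under simple schemes such as run-length encoding. SiSense's actual ElastiCube compression is proprietary; this Python sketch with invented data only shows the general principle:

```python
from itertools import groupby

# A region column with long runs of repeated values, the kind of
# low-cardinality data that dominates analytic tables.
region_column = ["west"] * 4 + ["east"] * 3 + ["north"] * 5

def rle_encode(column):
    """Collapse consecutive repeats into (value, run_length) pairs."""
    return [(value, len(list(run))) for value, run in groupby(column)]

def rle_decode(runs):
    """Expand the pairs back into the original column."""
    return [value for value, count in runs for _ in range(count)]

encoded = rle_encode(region_column)
assert rle_decode(encoded) == region_column  # lossless round trip
print(encoded)  # [('west', 4), ('east', 3), ('north', 5)]
```

Twelve stored values become three pairs; a row-oriented layout, which interleaves columns, rarely exposes runs like this, which is one reason columnar engines compress so much better.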

The company's approach to big data analytics reveals the chasm in understanding that exists between descriptive analytics and more advanced analytics such as we see with R, SAS and SPSS. When SiSense speaks of big data analytics, it means the ability to consume and explore very large data sets without predefining schemas. By doing away with schemas, the software does away with the need for a statistician, a data mining engineer or, for that matter, an IT person. Instead, organizations need analysts with a good understanding of the business, the data sources with which they are working and the basic characteristics of those data sets. SiSense does not do sophisticated predictive modeling or data mining, but rather root-cause and contextual analysis across diverse and potentially very large data sets.

SiSense today has a relatively small footprint and faces an uphill battle against entrenched BI and analytics players for enterprise deployments, but its easy-to-download-and-try approach will help it gain traction with analysts who are less loyal to the BI tools their IT departments have purchased. SiSense Vice President of Marketing Bruno Aziza, formerly with Microsoft's Business Intelligence group, and CEO Amit Bendov have been in the industry for a fair amount of time and understand this challenge. Their platform's road into the organization is more likely through business groups than IT. For this reason, SiSense's competitors are Tableau and QlikView on the discovery side, while products like SAP HANA and Actuate's BIRT Analytics, with their in-memory-plus-columnar approaches, are the closest competitors in terms of the technology for accessing and visualizing large data sets. The ability to access large data sets in a timely manner without the need for data scientists can help overcome the top challenges BI users have in the areas of staffing and real-time access, which we uncovered in our recent business technology innovation research.

SiSense has impressive technology and is getting good traction. It bears consideration by departments and mid-market organizations that need to perform analytics across growing volumes of data without relying on an IT department to support their needs.


Tony Cosentino
VP and Research Director

I recently returned from Sweden, where QlikTech International hosted its annual analyst “unsummit.” Much of the information I was exposed to was under NDA, so I cannot talk about it here. What I can discuss, and what in many ways may be more interesting and more important, is the company’s focus on culture and philosophy.

Arguably, culture and company philosophy can provide a competitive advantage by giving employees, customers and partners a vision and a sense of engagement with the product and the company itself. Engagement and software usability have become important topics as companies such as Tableau and QlikTech have brought user-friendly, visual and iterative analysis to software that has heretofore been the domain of a few tool-oriented analysts. As such tools continue to gain traction in organizations, I anticipate that these discussions around culture, engagement and the consumer-oriented idea of brand intimacy will become more important. In particular, I see two trends driving this: the consumerization of business technologies, which I discussed in a recent blog post, and demographic workforce shifts, which my colleague Robert Kugel discussed.

QlikTech CEO Lars Bjork opened with a talk about the company. He spoke of its origins and how the Swedish government gives favorable terms to Swedish startups but then extracts high interest after a few years. He used this story to show how his company overcame early-stage difficulties in repaying the loan, and how the Swedish government listened to its needs and worked to resolve the issues in a mutually beneficial way. QlikTech eventually turned out to be the most successful venture in which the Swedish government had ever engaged.

Bjork used this story as a jumping-off point to discuss the cultural backbone of the company, which is its Swedish heritage. Sweden, he suggested, is interesting in two ways: first, in its design principles of simplicity and sophistication, and second, in the consensus decision-making model in which its population engages.

Simplicity and sophistication are readily evident in Swedish architecture and furniture. QlikTech and its user base make a strong argument that these principles underpin its software as well. (Interestingly, two of the new breed of BI software vendors were born and grew up in Sweden – Spotfire in the north and QlikTech in the south.) QlikTech uses a simple but powerful color-oriented approach to help users understand data: selected values in a data set are highlighted in green, linked values are white, and excluded values are gray. This provides users an exploratory environment in which to iterate and visualize data. The approach was inspired by spreadsheet environments, where color coding is often used for analysis and alerting on differentiated variables. While QlikTech has advanced significantly from its spreadsheet roots, the color design principles remain intact and usability remains a key tenet of its strategy.
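
The green/white/gray scheme can be sketched in a few lines. This is only a minimal Python illustration of the associative idea with hypothetical field names; QlikView's real engine maintains these states across an entire data model, not one table:

```python
# Hypothetical sales records linking two fields.
sales = [
    {"region": "north", "product": "saw"},
    {"region": "north", "product": "drill"},
    {"region": "south", "product": "drill"},
    {"region": "south", "product": "hammer"},
]

def color_states(rows, field, selected_field, selected_values):
    """Split `field` values into linked (white) and excluded (gray)
    given a green selection on `selected_field`."""
    linked = {r[field] for r in rows if r[selected_field] in selected_values}
    excluded = {r[field] for r in rows} - linked
    return linked, excluded

# Selecting region 'north' (green) leaves its associated products
# white and everything else gray.
white, gray = color_states(sales, "product", "region", {"north"})
print(sorted(white))  # ['drill', 'saw']
print(sorted(gray))   # ['hammer']
```

Each click simply recomputes the linked and excluded sets, which is what makes the interface feel exploratory: the user sees at a glance not only what matches a selection but also what it rules out.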

Perhaps the more interesting cultural aspect of Sweden is its societal and political culture. It is a democracy with a measure of shared responsibility in government and society that isn't really found in the United States. The US is much more aligned with Teddy Roosevelt's idea of "rugged individualism" and the self-made man. In business, this American mindset tends to create a chain-of-command culture in which decision-making is pushed up the organizational pyramid and, once a decision is made, disseminated through the organization. European businesses, and Swedish organizations in particular, are arguably less hierarchical in nature.

This discussion of Sweden and its culture naturally flows into how business intelligence is becoming more collaborative by nature and how decisions are being pushed down through the organization. It also brings to light the different philosophies of locking down data and controlling data versus the ideas of openness and information sharing. As our benchmark study around next-generation business intelligence shows, these are key points for the future of business intelligence. As cultures become more open with mobility and collaboration, and decisions are moved down through organizations, employees become more empowered.

The entire premise, however, revolves around openness toward information sharing and a balance of controls within the organization. The idea of democratizing analytics makes a lot of sense, though democratization pushed to the extreme leads to anarchy. Much like Sweden, QlikTech intends to walk the fine line inspired by democratic ideals and consensus governance in its quest to bring analytics to the organizational masses. At the end of the day, this may just be the secret sauce that appeals to the millennial generation and beyond.


Tony Cosentino

VP & Research Director

Tableau Software is growing fast. The company has taken a "land and expand" strategy that drives what it calls the democratization of analytics within organizations. Tableau has enjoyed a first-mover advantage in exploratory analytics, called visual discovery, a growing type of business analytics that allows companies to easily visualize data in a descriptive manner, but it faces competition as deep-pocketed companies such as IBM and SAP become more aggressive in the space.

Tableau’s big advantages are that its software is easy to use, can access many data sources, and lacks the complexity of a traditional OLAP cube, which necessitates predefined schemas, materialized views and pre-aggregation of data. Our next-generation business intelligence benchmark research finds that usability is the most critical criterion in the choice of next-generation BI tools, and in usability, Tableau is an industry juggernaut. The company’s VizQL technology obviates the need for an analyst to understand technologies like SQL, and instead lets users explore data from multiple sources using drag-and-drop, point-and-click and pinch-and-tap techniques. No code, no complexity.

With Tableau's 8.0 release, code-named the Kraken and currently in beta, the story gets more compelling. The Kraken takes the software beyond the business user and into the IT department – the home of BI giants. New ease-of-use features such as better infographics, text justification and rich text formatting got applause at Tableau's customer conference in San Diego earlier this month, where Tableau announced three specific features that help equip it for battle with traditional BI vendors.

The first is the ability to author, customize and share reports through a browser interface. This brings much of the functionality that was available only on Tableau Desktop to the Tableau Server environment. It gives users anywhere, anytime access, and increases manageability within the environment. Visualizations can be shared through a link, then picked up by another author in an iterative and collaborative process. Administrators can make sure dashboard proliferation doesn’t overwhelm users.

One big advancement is Tableau’s ability to embed its software in other companies’ portals or web applications through a JavaScript API. Many companies should pick up on this advancement to partner with Tableau to embed analytics into their applications. Our research shows that the market has yet to decide how next-generation BI will be delivered, with approximately even splits between those expecting it to go through a BI application (38%), end-user application (34%) and office productivity suite (36%). Anecdotally, we are seeing an uptick in embedded BI arrangements, such as Workday embedding Datameer and Kronos embedding MicroStrategy. Given Tableau’s visualization sophistication, I anticipate it will get a lot of traction here.

At the conference Tableau announced support for additional data sources, including Google Analytics. The Google move was soon extended to include BigQuery, which is based on Google's Dremel real-time distributed query technology and works in conjunction with MapReduce. Cloudera recently announced a similar approach to big data ad hoc analytics with its Impala initiative, and it chose Tableau as the first company to integrate. These partnerships say a lot about Tableau's potential partnering power, which I anticipate will become a more important part of the company's overall strategy.

While the conference announcements were extensive and in many ways impressive, the battle for the hearts and minds of both IT departments and business users remains. Tableau comes from the business side, and what remains to be seen is how powerful the usability argument is in the face of the risk and compliance issues that face IT. Tableau may encounter resistance as it moves closer to the IT department in order to enable multidepartment rollouts. IT often has long-term relationships with large software providers, and these providers are now bringing their own tools to market, including SAP's Visual Intelligence (a.k.a. Visi) and IBM Cognos Insight, which I recently blogged about. In many ways it is easier for IT to convince business to use these tools than for business users to make the reverse argument to IT. The outcome of the battle depends on how quickly companies like SAP and IBM can catch up. Tableau says its R&D spending as a percentage of revenue is much higher than that of its competitors, but the question is whether that percentage is big enough to compete with the deep pockets that surround it. My assumption is that competitors will ultimately catch up, and then Tableau's success will come down to how large a footprint it has established and the loyalty of its user base.

In sum, Tableau has an impressive offering, and the 8.0 beta release is another step forward. The company is advancing a new era of interactive visual discovery of data. Its partnerships and links to multiple data sources make it difficult to ignore in the business intelligence space. The advancements mentioned above, as well as Tableau's focus on things such as cohort analytics and quasi-experimental design approaches, give the company a fair amount of runway with respect to core analytics in the organization. However, it needs to start putting more statistical prowess into the application, starting with basic descriptive statistics and significance testing such as t-tests and chi-squared tests. While it is great to have cool pictures and graphs, if users cannot find real differences in the data, the software's value is limited. Also, in order to meet its ambition of truly democratizing analytics, Tableau needs to build out or embed basic analytic training modules. This will be key in getting from the What of the data to the So What of the data. Addressing this skills gap, as I wrote in a blog post earlier this year, is one of the most important areas of focus for companies and suppliers playing in the analytics space. Suppliers that focus only on the tools themselves and ignore data sources and people will see diminishing returns.
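
To make concrete the kind of basic significance testing argued for above, here is a minimal Python sketch of Welch's two-sample t statistic using only the standard library and invented conversion metrics; a real implementation would also derive a p-value from the t distribution:

```python
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples with
    possibly unequal variances (sample variance, n-1 denominator)."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)
    return (mean(sample_a) - mean(sample_b)) / (va / na + vb / nb) ** 0.5

# Hypothetical metric for two customer segments.
conversion_a = [12.1, 11.8, 12.4, 12.0, 11.9]
conversion_b = [10.2, 10.8, 10.5, 10.1, 10.6]

t = welch_t(conversion_a, conversion_b)
print(round(t, 2))  # → 9.7; a |t| this large marks a real difference
```

This is exactly the question a chart alone cannot answer: whether the gap between two groups in the picture is signal or noise.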

Tableau is well on its way into IT departments with its latest advancements, but it still needs to better address things such as write-back, data management and higher-level analytics if it hopes to compete with broader BI portfolios. Competitors in this market are not standing still; they are beginning to morph into more operation-oriented analytical systems.

Business users and departments considering exploratory analytics tools for their companies should definitely consider Tableau. For IT departments with broader responsibility, Tableau is also worth a look. Tableau is a leader in this emerging space, and with its continued investment in R&D, its strengthening partnerships and a singular focus on bringing analytics to the business populace, it is addressing many core analytics needs within today's organizations. Tableau is an important company to watch.


Tony Cosentino

VP and Research Director
