
In 2014, IBM announced Watson Analytics, which uses machine learning and natural language processing to unify and simplify the user experience in each step of the analytics process: data acquisition, data preparation, analysis, dashboarding and storytelling. After a relatively short beta testing period involving more than 22,000 users, IBM released Watson Analytics for general availability in December. There are two editions: the “freemium” trial version allows 500MB of data storage and files of up to 100,000 rows and 50 columns; the personal edition is a monthly subscription that enables larger files and more storage.

Its initial release includes functions to explore, predict and assemble data. Many of the features are based on IBM’s SPSS Analytic Catalyst, which I wrote about and which won the 2013 Ventana Research Technology Innovation Award for business analytics. Once data is uploaded, the explore function enables users to analyze data in an iterative fashion using natural language processing and simple point-and-click actions. Algorithms decide the best fit for graphics based on the data, but users may choose other graphics as needed. An “insight bar” shows other relevant data that may contain insights such as potential market opportunities.

The ability to explore data through visualizations with minimal knowledge is a primary aim of modern analytics tools. With the explore function incorporating natural language processing, which other tools in the market lack, IBM makes analytics accessible to users without the need to drag and drop dimensions and measures across the screen. This feature should not be underestimated; usability is the buying criterion for analytics tools most widely cited in our benchmark research on next-generation business intelligence (by 63% of organizations).

The predict capability of Watson Analytics focuses on driver analysis, which is useful in a variety of circumstances such as sales win/loss, market lift, operations and churn analysis. In its simplest form, a driver analysis aims to understand causes and effects among multiple variables. This is a complex process that most organizations leave to their resident statistician or outsource to a professional analyst. By examining the underlying data characteristics, the predict function can address data sets, including what may be considered big data, with an appropriate algorithm. The benefit for nontechnical users is that Watson Analytics makes the decision on selecting the algorithm and presents results in a relatively nontechnical manner such as spiral diagrams or tree diagrams. Having absorbed the top-level information, users can drill down into the top key drivers. This ability enables users to see the relative influence of attributes and the interactions between them. Understanding interactions is an important part of driver analysis since causal variables often move together (a challenge known as multicollinearity) and it is sometimes hard to distinguish what is actually causing a particular outcome. For instance, an analysis may point to the customer service department as the primary driver of customer defection when the root cause is actually a product defect. Accepting this result, a company may mistakenly try to fix customer service when the product issue needs to be addressed. This approach also overcomes the challenge of Simpson’s paradox, in which a trend that appears in different groups of data disappears or reverses when these groups are combined. This is a hindrance for some visualization tools in the market.
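
To make the Simpson’s paradox point concrete, here is a minimal, hypothetical sketch in Python using pandas; it is unrelated to how Watson Analytics works internally, and all field names and values are invented. The relationship between the driver and the outcome is positive within each group but reverses when the groups are pooled, which is exactly the trap a grouping-aware driver analysis is meant to avoid.

```python
# Simpson's paradox in miniature: within-group trends reverse when groups are pooled.
import pandas as pd

df = pd.DataFrame({
    "group": ["A"] * 3 + ["B"] * 3,
    "discount": [1, 2, 3, 8, 9, 10],            # hypothetical driver variable
    "repeat_purchases": [10, 11, 12, 1, 2, 3],  # hypothetical outcome
})

# Correlation within each group: strongly positive
for name, g in df.groupby("group"):
    print(name, round(g["discount"].corr(g["repeat_purchases"]), 2))

# Correlation across the pooled data: strongly negative
print("pooled", round(df["discount"].corr(df["repeat_purchases"]), 2))
```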

Once users have analyzed the data sufficiently and want to create and share their analysis, the assemble function enables them to bring together various dashboard visualizations in a single screen. Currently, Watson Analytics does such sharing (as well as comments related to the visualizations) via email. In the future, it would be good to see capabilities such as annotation and cloud-based sharing in the product.

Full data preparation capabilities are not yet integrated into Watson Analytics. Currently, it includes a data quality report that gives confidence levels for the current data based on its cleanliness, and basic sort, transform and relabeling functions are incorporated as well. I assume that IBM has much more in the works here. For instance, its DataWorks cloud service offers APIs for some of the best data preparation and master data management available today. DataWorks can mask data at the source and do probabilistic matching against many sources, including both cloud and on-premises address data. This addresses a major challenge organizations face when they need to conduct analytics across many data sets. For instance, in multichannel marketing, each individual customer may have many email addresses as well as different mailing addresses, phone numbers and identifiers for social media. A so-called “golden record” needs to be created so all such information can be linked together. Conceptually, the data becomes one long row related to that golden record, rather than multiple unassociated rows of shorter length. This data needs to be brought into a company’s own internal systems, and personally identifiable information must be stripped out before anything moves into a public domain. In a probabilistic matching system, data is matched not on one field but through associations of data that give levels of certainty that records should be merged. This is different from past approaches and one of the reasons for significant innovation in the category. Multiple startups have entered the space to address the need for a better user experience in data preparation. Such needs have been documented as one of the foundational issues facing the world of big data. Our benchmark research into information optimization shows that data preparation (47%) and quality and consistency (45%) are the most time-consuming tasks for organizations in analytics.
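
As an illustration of the probabilistic matching idea described above, here is a simplified sketch; it is not IBM DataWorks or its APIs, and the records, fields, weights and threshold are all hypothetical. Several fields each contribute a similarity score, and the weighted combination gives a level of certainty that two records belong to the same golden record.

```python
# A toy probabilistic match: no exact key is required; field similarities combine
# into a confidence score that two records describe the same customer.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Rough string similarity between 0 and 1."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_score(rec1: dict, rec2: dict, weights: dict) -> float:
    """Weighted average of per-field similarities; values near 1.0 suggest a match."""
    total = sum(weights.values())
    return sum(similarity(rec1[f], rec2[f]) * w for f, w in weights.items()) / total

crm_record = {"name": "Jon Smith", "email": "jon.smith@example.com", "city": "Seattle"}
web_record = {"name": "Jonathan Smith", "email": "jsmith@example.com", "city": "Seattle"}

score = match_score(crm_record, web_record, weights={"name": 0.4, "email": 0.4, "city": 0.2})
if score > 0.75:  # threshold chosen for illustration only
    print(f"Merge into one golden record (confidence {score:.2f})")
```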

Watson Analytics is deployed on IBM’s SoftLayer cloud technology and is part of a push to move its analytic portfolio into the cloud. Early in 2015 the company plans to move its SPSS and Cognos products into the cloud via a managed service, thus offloading tasks such as setup, maintenance and disaster recovery management. Watson Analytics will be offered as a set of APIs much as the broader Watson cognitive computing platform has been. Last year, IBM said it would move almost all of its software portfolio to the cloud via its Bluemix service platform. These cloud efforts, coupled with the company’s substantial investment in partner programs with developers and universities around the world, suggest that Watson may power many next-generation cognitive computing applications, a market estimated to grow into the tens of billions of dollars in the next several years.

Overall, I expect Watson Analytics to gain more attention and adoption in 2015 and beyond. Its design philosophy and user experience are innovative, but work must be done in some areas to make it a tool that professionals use in their daily work. Given the resources IBM is putting into the product and the massive amounts of product feedback it is receiving, I expect initial release issues to be worked out quickly through the continuous release cycle. Once they are, Watson Analytics will raise the bar on self-service analytics.

Regards,

Ventana Research

Qlik was an early pioneer in developing a substantial market for a visual discovery tool that enables end users to easily access and manipulate analytics and data. Its QlikView application provides an associative experience, taking an in-memory, correlation-based approach that presents a simpler design and user experience for analytics than previous tools. Driven by sales of QlikView, the company’s revenue has grown to more than $500 million, and from its origins in Sweden it has built a global presence.

At its annual analyst event in New York the business intelligence and analytics vendor discussed recent product developments, in particular the release of Qlik Sense. It is a drag-and-drop visual analytics tool targeted at business users but scalable enough for enterprise use. Its aim is to give business users a simplified visual analytic experience that takes advantage of modern cloud technologies. Such a user experience is important; our benchmark research into next-generation business intelligence shows that usability is an important buying criterion for nearly two out of three (63%) companies. A couple of months ago, Qlik introduced Qlik Sense for desktop systems, and at the analyst event it announced general availability of the cloud and server editions.

According to our research into business technology innovation, analytics is the top initiative for new technology: 39 percent of organizations ranked it their number-one priority. Analytics includes exploratory and confirmatory approaches to analysis. Ventana Research refers to exploratory analytics as analytic discovery and segments it into four categories that my colleague Mark Smith has articulated. Qlik’s products belong in the analytic discovery category. The tool lets users investigate data sets in an intuitive and visual manner, often for root-cause analysis and decision support. This software market is relatively young, and competing companies are evolving and redesigning their products to suit changing tastes. Tableau, one of Qlik’s primary competitors, which I wrote about recently, is adapting its current platform to developments in hardware and in-memory processing, focusing on usability and opening up its APIs. Others have recently made their first moves into the market for visual discovery applications, including Information Builders and MicroStrategy. Companies such as Actuate, IBM, SAP, SAS and Tibco are focused on incorporating more advanced analytics in their discovery tools. For buyers, this competitive and fragmented market creates a challenge when comparing analytic discovery offerings.

A key differentiator is Qlik Sense’s new modern architecture, which is designed for cloud-based deployment and embedding in other applications for specialized use. Its analytic engine plugs into a range of Web services. For instance, the Qlik Sense API enables the analytic engine to call a data set on the fly and allow the application to manipulate data in the context of a business process. An entire table can be delivered to node.js, which extends the JavaScript API to offer server-side features and enables the Qlik Sense engine to take on an almost unlimited number of real-time connections by not blocking input and output. Previously developers could write PHP script and pipe SQL to get the data; the resulting applications were viable but complex to build and maintain. Now all they need is JavaScript and HTML. The Qlik Sense architecture abstracts the complexity and allows JavaScript developers to make use of complex constructs without intricate knowledge of the database. The new architecture can decouple the Qlik engine from the visualizations themselves, so Web developers can define expressions and dimensions without going into the complexities of the server-side architecture. Furthermore, by decoupling the services, developers gain access to open source visualization technologies such as d3.js. Cloud-based business intelligence and extensible analytics are becoming hot topics. I have written about this, including a glimpse of our newly announced benchmark research on the next generation of data and analytics in the cloud. From a business user’s perspective, these types of architectural changes may not mean much, but for developers, OEMs and UX design teams, they allow much faster time to value through a simpler component-based approach to using the Qlik analytic engine and building visualizations.
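
For readers who want a concrete picture of the decoupling idea, here is a generic sketch of the pattern, not Qlik’s actual engine or APIs: an analytic engine exposed as a small JSON service, so a thin JavaScript front end (a d3.js page, for example) can request aggregates without any knowledge of the underlying data store. The endpoint, data and aggregation are all invented for illustration.

```python
# A minimal decoupled "engine": aggregation happens server-side and is returned as
# JSON, which any front-end visualization layer could render. Illustrative only.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

SALES = [{"region": "EMEA", "amount": 120}, {"region": "EMEA", "amount": 80},
         {"region": "Americas", "amount": 200}]

class AggregateHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Aggregate on the server; the client only ever sees the result set.
        totals = {}
        for row in SALES:
            totals[row["region"]] = totals.get(row["region"], 0) + row["amount"]
        body = json.dumps(totals).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), AggregateHandler).serve_forever()
```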

The modern architecture of Qlik Sense, together with the company’s ecosystem of more than 1,000 partners and a professional services organization that has completed more than 2,700 consulting engagements, gives Qlik a strong competitive position. The service partner relationships, including those with major systems integrators, are key to the company’s future since analytics is as much about change management as technology. Our research in analytics consistently shows that people and processes lag behind technology and information in analytics performance. Furthermore, in our benchmark research into big data analytics, the benefits most often mentioned as achieved are better communication and knowledge sharing (24%), better management and alignment of business goals (18%), and gaining competitive advantage (17%).

As tested on my desktop, Qlik Sense shows an intuitive interface with drag-and-drop capabilities for building analyses. Formulas are easy to incorporate as new measures, and the palette offers a variety of visualization options that automatically fit to the screen. The integration with QlikView is straightforward in that a data model from QlikView can be saved seamlessly and opened intact in Qlik Sense. The storyboard function allows multiple visualizations to be built into narratives and annotations to be added, including linkages with data. For instance, annotations can be added to specific inflection points in a trend line or to outliers that may need explanation. Since the approach is all HTML5-based, the visualizations are ready for deployment to mobile devices and responsive to various screen sizes including newer smartphones, tablets and the new class of so-called phablets. In the evaluation of vendors in our Mobile Business Intelligence Value Index, Qlik ranked fourth overall.

In the software business, of course, technology advances alone don’t guarantee success. Qlik has struggled to clarify the positioning of its next-generation product and to communicate that it is not a replacement for QlikView. QlikView users are passionate about keeping their existing tool because they have already designed dashboards and calculations with it. Vendors should not underestimate user loyalty and adoption. Therefore Qlik now promises to support both products for as long as the market continues to demand them. The majority of R&D investment will go into Qlik Sense as developers focus on surpassing the capabilities of QlikView. For now, the company will follow a bifurcated strategy in which the tools work together to meet the needs of various organizational personas. To me, this is the right strategy. There is no issue in being a two-product company, and the revised positioning of Qlik Sense complements QlikView on both the self-service side and the developer side. Qlik Sense is not yet as mature a product as QlikView, but from a business user’s perspective it is a simple and effective analysis tool for exploring data and building different data views. It is simpler because users do not need to script the data in order to create the specific views they deem necessary. As the product matures, I expect it to become more than an end user’s visual analysis tool since the capabilities of Qlik Sense lend themselves to web-scale approaches. Over time, it will be interesting to see how the company harmonizes the two products and how quickly customers will adopt Qlik Sense as a stand-alone tool.

For companies already using QlikView, Qlik Sense is an important addition to the portfolio. It will allow business users to become more engaged in exploring data and sharing ideas. Even for those not using QlikView, with its modern architecture and open approach to analytics, Qlik Sense can help future-proof an organization’s current business intelligence architecture. For those considering Qlik for the first time, the choice may be whether to bring in one or both products. Given QlikView’s proven track record, a combination of the two may be the better solution for some organizations in the near term. Partners, content providers and ISVs should consider Qlik Branch, which provides resources for embedding Qlik Sense directly into applications. The site provides developer tools, community efforts such as d3.js integrations and synchronization with GitHub for sharing and branching of designs. For every class of user, Qlik Sense can be downloaded for free and tested directly on the desktop. Qlik has made significant strides with Qlik Sense, and it is worth a look for anybody interested in the cutting edge of analytics and business intelligence.

Regards,

Ventana Research

Tableau Software introduced its latest advancements in analytics and business intelligence software, along with its future plans, to more than 5,000 attendees at its annual user conference in its hometown of Seattle. The enthusiasm of the primarily millennial-age crowd reflected not only the success of the young company but also its aspirations. Both Tableau and the market for what Ventana Research calls visual and data discovery have experienced rapid growth that is likely to continue.

The company focuses on a mission of usability, which our benchmark research into next-generation business intelligence shows to be a top software buying criterion for more organizations (63%) than any other. Tableau introduced advances in this area including analytic ease of use, APIs, data preparation, storyboarding and mobile technology support as part of its user-centric product strategy. Without revealing specific timelines, executives said that the next major release, Tableau 9.0, likely will be available in the first half of 2015, as the CEO outlined in his keynote.

Chief Development Officer and co-founder Chris Stolte showed upcoming ease-of-use features such as the addition of Excel-like functionality within workbooks. Users can type a formula directly into a field and use auto-completion or drag and drop to bring in other fields that are components of the calculation. The new calculation can be saved as a metric and easily added to the Tableau data model. Other announcements included table calculations, geographic search capabilities, and radial and lasso selection on maps. The live demonstration between users onstage was seamless, created analytic flows the audience could follow, and reflected impressive navigation capabilities.

Stolte also demonstrated upcoming statistical capabilities. Box plots have been available since Tableau 8.1, but the capabilities have now been extended for comparative analysis across groups and for creating basic forecast models. Comparative descriptive analytics has been improved with drill-down features and calculations within tables. This is important since analysis between and within groups is necessary to use descriptive statistics to reveal business insights. Our research into big data analytics shows that some of the most important analytic approaches are descriptive in nature: pivot tables (48%), classification or decision trees (39%) and clustering (37%) are the methods most widely used for big data analytics.
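
As a small, generic example of what comparative descriptive analysis across groups looks like in practice, the sketch below uses pandas rather than Tableau and entirely hypothetical figures: the per-group quartiles are the same numbers a box plot draws, and a pivot-table view summarizes the same data another way.

```python
# Comparative descriptive statistics across groups: the summaries a box plot encodes.
import pandas as pd

orders = pd.DataFrame({
    "segment": ["SMB", "SMB", "SMB", "Enterprise", "Enterprise", "Enterprise"],
    "deal_size": [12, 18, 25, 90, 150, 210],  # hypothetical values, in $ thousands
})

# Per-group summary: count, mean, quartiles, min and max
print(orders.groupby("segment")["deal_size"].describe())

# Pivot-table view of the same data: median deal size by segment
print(orders.pivot_table(values="deal_size", index="segment", aggfunc="median"))
```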

When it comes to predictive analytics, however, Tableau is still somewhat limited. Companies such as IBM, Information Builders, MicroStrategy, SAS and SAP have focused more resources on incorporating advanced analytics in their discovery tools; Tableau has to catch up in this area. Forecasting of basic trend lines is a first step, but if the tool is meant for model builders, then I’d like to see more families of curves and algorithms to fit different data patterns such as seasonal variation. Business users, Tableau’s core audience, need automated modeling approaches that can evaluate the characteristics of the data and produce adequate models. How different stakeholders communicate around the statistical parameters and models is also unclear to me. Our research shows that summary statistics and model comparisons are important capabilities for administering and sharing predictive analytics. Overall, Tableau is making strides in both descriptive and predictive statistics and making them intuitive for users.
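
To illustrate what “more families of curves” could mean, here is a minimal sketch using synthetic monthly data and ordinary least squares in NumPy; it is not how Tableau’s trend lines are implemented. A linear trend plus a 12-month seasonal term fits seasonal data far better than a straight line alone.

```python
# Fit a trend plus a yearly seasonal component to synthetic monthly sales data.
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(36)  # three years of monthly observations
sales = 100 + 2 * months + 15 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 3, 36)

# Design matrix: intercept, linear trend, and sine/cosine terms for a 12-month cycle
X = np.column_stack([
    np.ones_like(months, dtype=float),
    months,
    np.sin(2 * np.pi * months / 12),
    np.cos(2 * np.pi * months / 12),
])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
print("trend per month:", round(coef[1], 2),
      "seasonal amplitude:", round(float(np.hypot(coef[2], coef[3])), 2))
```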

Presenters also introduced new data preparation capabilities for Excel imports, including the ability to view delimiters, split columns and even un-pivot data. The software also has some ability to clean up spreadsheets, such as getting rid of headers and footers. Truly dirty data, such as survey data captured in spreadsheets or created with custom calculations and nesting, is not the target here. The data preparation capabilities can’t compare with those provided by companies such as Alteryx, Informatica, Paxata, Pentaho, Tamr or Trifacta. However, it is useful to be able to quickly upload and clean a basic Excel document and then visualize it in a single application. According to our benchmark research on information optimization, data preparation (47%) and checking data for quality and consistency (45%) are the primary tasks on which analysts spend their time.
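
For readers unfamiliar with the un-pivot operation mentioned above, the following short pandas sketch (with made-up figures) shows the idea: a cross-tab spreadsheet with one column per month is reshaped into a tidy, row-per-observation table that is much easier to visualize.

```python
# Un-pivot (melt) a wide spreadsheet layout into one row per observation.
import pandas as pd

wide = pd.DataFrame({
    "region": ["East", "West"],
    "Jan": [100, 80],
    "Feb": [120, 95],
})

# melt() turns the month columns into (region, month, revenue) rows
tidy = wide.melt(id_vars="region", var_name="month", value_name="revenue")
print(tidy)
```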

Storytelling (which Tableau calls Storypoints) is an exciting area of development for the company. Introduced last year, it enables users to build a sequential narrative that includes graphics and text. New developments enable the user to view thumbnails of different visualizations and easily pull them into the story. Better control over calculations, fonts, colors and text positioning was also introduced. While these changes may seem minor, they are important to this kind of application. A major reason that most analysts take their data out of an analytic tool and put it into PowerPoint is to have this type of control and ease of use. While PowerPoint remains dominant when it comes to communicating analytic results in business, a new category of tools is challenging Microsoft’s preeminence in this area. Tableau Storypoints is one of the easiest to use in the market.

API advancements were discussed by Francois Ajenstat, senior director of product management, who suggested that in the future anything that can be done on Tableau Server will be possible through APIs. In this way different capabilities will be exposed so that other software can use them (Tableau visualizations, for example) within the context of its own workflows. Tableau has also added REST APIs, including JavaScript capabilities, which allow users to embed Tableau in applications to do such things as customize dashboards. The ability to visualize JSON data in Tableau is also a step forward since exploiting new data sources is the fastest way to gain business advantage from analytics. This capability was demonstrated using government data, which is commonly packaged in JSON format. As Tableau continues its API initiatives, we hope to see more advances in exposing APIs so Tableau can be integrated into governance workflows, which can be critical to enterprise implementations. APIs also can enable analytic workflow tools to more easily access the product so statistical modelers can understand the data prior to building models. While Tableau integrates on the back end with analytic tools such as Alteryx, the visual and data discovery steps that must precede statistical model building are still a challenge. Having more open APIs will open up opportunities for Tableau’s customers and could broaden its partner ecosystem.
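
To show why native JSON handling matters, here is a brief, hypothetical pandas sketch, separate from Tableau’s own implementation: nested records of the kind common in government open-data feeds are flattened into a table that is ready for charting. The agencies and figures are invented for illustration.

```python
# Flatten nested JSON records into a tabular, visualization-ready form.
import pandas as pd

records = [
    {"agency": "DOT", "year": 2014, "budget": {"requested": 90.9, "enacted": 89.5}},
    {"agency": "EPA", "year": 2014, "budget": {"requested": 8.2, "enacted": 8.1}},
]  # figures are made up for illustration

flat = pd.json_normalize(records)  # columns: agency, year, budget.requested, budget.enacted
print(flat)
```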

The company made other enterprise announcements such as Kerberos security support, easier content management, the ability to seamlessly integrate with Salesforce.com, and a parallelized approach to accessing very large data sets. These are all significant developments. As Tableau advances its enterprise vision and continues to expand at the edges of the organization, I expect it to compete in more enterprise deals. The challenge the company faces is still one of the enterprise data model. Short of displacing the enterprise data models that companies have invested in over the years, it will continue to be an uphill battle for Tableau to displace large incumbent BI vendors. Our research into information optimization shows that integration with security and user-access frameworks is the biggest technical challenge for optimizing information. For a deeper look at Tableau’s battle for the enterprise, please read my previous analysis.

Perhaps the most excitement from the audience came from the introduction of Project Elastic, a new mobile application with which users can automatically turn an email attachment in Excel into a visualization. The capability is native, so it works in offline mode and provides a fast and responsive experience. The new direction bifurcates Tableau’s mobile strategy, which heretofore was purely HTML5-based, introduced with Tableau 6.1. Tableau ranked seventh in our 2014 Mobile Business Intelligence Value Index.

Tableau has a keen vision of how it can carve out a place in the analytics market. Usability has a major role in building a following among users that should help it continue to grow. The Tableau cool factor won’t go unchallenged, however. Major players are introducing visual discovery products amid discussion about the need for more data governance in the enterprise and in the cloud; Tableau likely will have to blend into the fabric of analytics and BI in organizations. To do so, the company will need to forge significant partnerships and open its platform for easy access.

Organizations considering visual discovery software in the context of business and IT should include Tableau. For enterprise implementations, care should be taken to ensure Tableau can support the broader manageability and reliability requirements of larger-scale deployments. Visualization of data continues to be a critical method for understanding the challenges of global business but should not be the only analytic approach taken for all types of users. Tableau is on the leading edge of visual discovery and should not be overlooked.

Regards,

Ventana Research

Tibco’s recent acquisition of Jaspersoft helps the company fill out its portfolio of business intelligence (BI) and reporting software in an increasingly competitive marketplace. Tibco already offered a range of products in BI and analytics including Tibco Spotfire, an established product for visual data discovery. Jaspersoft and its open source Java reporting tool JasperReports have been around since 2001, and the company says it has 16 million product downloads worldwide, 140,000 production deployments and 2,000 commercial customers in 100 countries. Jaspersoft received attention recently for its partnership with Amazon Marketplace and the ability to embed its system into applications using a credit card and a few simple configuration steps. Embedding technology in this way is an area Tibco knows well from its history of integrating its software into enterprise architectures around the world.

The acquisition is significant given today’s advancements in the business intelligence market and the need for tools that serve a variety of users. In some ways the two technologies serve the same users, analysts and business users trying to make decisions with data, but how they approach this and how they support a broad set of roles and responsibilities differ. Tibco Spotfire, Tibco’s approach to business analytics, serves analytics and visualization, specializing in visual discovery and data exploration, while Jaspersoft addresses query and analysis, reporting, dashboards and other aspects of BI. According to our benchmark research on information optimization, the capabilities business users most often need are to drill into information within applications (37%), search for data (36%) and collaborate (27%). For analysts, the most necessary capabilities are extracting data (39%), designing and integrating metrics (37%) and developing policies and rules for access (34%). With Jaspersoft, Tibco can address both groups and also can embed intelligence and reporting capabilities into operationally oriented environments across a range of business applications.

The acquisition makes sense in that more capabilities are needed to address the expanding scope of business intelligence and analytics. In practice, it will be interesting to see how the open source community and culture of Jaspersoft mesh with the culture of Tibco’s Spotfire division. For now, Jaspersoft will continue as a separate division, so business likely will continue as usual until management decides on specific areas of integration. With respect to development efforts, it will be critical to blend the discovery capabilities of Tibco Spotfire with Jaspersoft’s reporting, which will be a formidable challenge. Another key to success will be how Tibco integrates both with the capabilities from Extended Results, a mobile business intelligence provider Tibco bought in 2013. Mobility is an area where Ventana Research found Jaspersoft significantly lacking, so the Extended Results capabilities should prove useful. Finally, Tibco’s event-enabled infrastructure will likely play a key role as the company continues to invest in operational intelligence for event-focused information gathering and delivery. Our operational intelligence research finds that a lack of integration between business intelligence tools such as Jaspersoft’s and event streams such as Tibco’s is a major challenge for more than half (51%) of organizations. This is a potential opportunity for Tibco as it looks at future integration of the technologies.

The Jaspersoft acquisition is not surprising given recent changes in the BI market. The category, which just a few years ago was considered mature and well-defined, is expanding to include areas such as analytic discovery tools, advanced analytics and big data. The union of Tibco Spotfire, which primarily targets line-of-business professionals from analysts to knowledge workers, and Jaspersoft, a more IT-centered company, reflects the need for the industry to bridge a divide that exists in many organizations where IT publishes dashboards and reports to the business. Our information optimization benchmark research documents this challenge of using information across business and IT: information management is most often (in 42% of organizations) a joint responsibility of IT and the lines of business, although IT is involved in some capacity in four-fifths of them. It remains to be seen whether the combined company can take on major competitors that have far more cash resources and take a similar approach.

Preliminary indicators show a good fit between these two organizations. Customers of each will be introduced to important new tools and capabilities from the other. One of the first likely moves for Tibco will be to introduce Jaspersoft’s 2,000 commercial customers and global presence to the broader portfolio. We advise those customers to evaluate what Tibco offers, especially Tibco Spotfire, which continues to be a leader in the visual data discovery market. Before investing, however, customers and prospects should demand clarity on the company’s plans for technical integration of analytics and how these will fit with organizations’ long-term business intelligence and analytics roadmaps. Tibco customers migrating to the cloud should investigate the work Jaspersoft is doing with companies like Amazon and consider whether the embedded approach to interactive reporting can fit with their analytics, cloud and application strategies.

This acquisition gives Tibco a significant opportunity to advance business analytics, but the company historically has not been as progressive in marketing and selling analytics as others in the market. Demand for visual discovery and big data analytics has grown dramatically; our research shows that more than three-quarters of organizations consider them important. Big data analytics and visualization is an area in which Spotfire innovated before the Tibco acquisition, but it has not seen its fair share of growth from these buying trends. Tibco now has the opportunity to provide analytics and BI that further leverage its entire portfolio of integration, event processing, cloud and social collaboration software; let’s see how it does. It needs to supercharge its analytics efforts significantly, in part by leveraging its new products from Jaspersoft.

Regards,

Tony Cosentino

VP and Research Director

I recently returned from Sweden, where QlikTech International hosted its annual analyst “unsummit.” Much of the information I was exposed to was under NDA, so I cannot talk about it here. What I can discuss, and what in many ways may be more interesting and more important, is the company’s focus on culture and philosophy.

Arguably, culture and company philosophy can provide a company a competitive advantage by giving its employees, customers and partners a vision and a sense of engagement with the product and the company itself. Engagement and software usability have become important topics as companies such as Tableau and QlikTech have brought user-friendly, visual and iterative analysis to software that has heretofore been the domain of a few tool-oriented analysts. As such tools continue to gain traction in organizations, I anticipate that these discussions around culture, engagement and the consumer-oriented idea of brand intimacy will become more important. In particular, I see two trends driving this: the consumerization of business technologies, which I discussed in a recent blog post, and demographic workforce shifts, which my colleague Robert Kugel discussed.

QlikTech CEO Lars Bjork gave the opening talk about the company. He spoke of the origins of the company and how the Swedish government gives favorable terms to Swedish startups but then extracts high interest after a few years. He used this story to show how his company was able to overcome early-stage difficulties in repaying the loan, and how the Swedish government listened to its needs and worked to resolve the issues in a mutually beneficial way. QlikTech eventually became the most successful venture in which the Swedish government had ever engaged.

Bjork used this story as a jumping-off point to discuss the cultural backbone of the company, which is its Swedish heritage.  Sweden, he suggested, is interesting in two ways. The first is its design principles of simplicity and sophistication. The second is the consensus decision-making models in which its population engages.

Simplicity and sophistication are readily evident in Swedish architecture and furniture. QlikTech and its user base make a strong argument that this is the underpinning of its software as well. (Interestingly, two of the new breed of BI software vendors were born and grew up in Sweden: Spotfire in the north and QlikTech in the south.) QlikTech uses a simple but powerful color-oriented approach to help users understand data. Selected values in a data set are highlighted in green, linked values are white, and excluded values are gray. This approach provides users an exploratory environment in which to iterate and visualize data. It was inspired by spreadsheet environments, where color coding is often used for analysis and alerting on differentiated variables. While QlikTech has advanced significantly from its spreadsheet roots, the color design principles remain intact and usability remains a key tenet of its strategy.
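
As a toy illustration of the associative idea behind that color coding, here is a short Python sketch, not QlikTech’s implementation: given a selection in one field (the “green” values), values in other fields are either associated (white) or excluded (gray). The data is invented.

```python
# Compute which values are associated with a selection and which are excluded.
rows = [
    {"region": "North", "product": "Bikes"},
    {"region": "North", "product": "Helmets"},
    {"region": "South", "product": "Surfboards"},
]

selection = {"region": "North"}  # the "green" selection
matching = [r for r in rows if all(r[f] == v for f, v in selection.items())]

associated = {r["product"] for r in matching}         # would be shown in white
excluded = {r["product"] for r in rows} - associated  # would be shown in gray
print("associated:", associated, "excluded:", excluded)
```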

Perhaps the more interesting cultural aspect of Sweden is its societal and political culture. It is a democracy with a measure of shared responsibility in government and society that isn’t really found in the United States. The US is much more aligned with Teddy Roosevelt’s idea of “rugged individualism” and of the self-made man. In business, this American mindset tends to create a chain-of-command culture in which decision-making is pushed up the organizational pyramid, and once a decision is made, it is disseminated through the organization. European businesses, and Swedish organizations in particular, are arguably less hierarchical in nature.

This discussion of Sweden and its culture naturally flows into how business intelligence is becoming more collaborative by nature and how decisions are being pushed down through the organization. It also brings to light the different philosophies of locking down data and controlling data versus the ideas of openness and information sharing. As our benchmark study around next-generation business intelligence shows, these are key points for the future of business intelligence. As cultures become more open with mobility and collaboration, and decisions are moved down through organizations, employees become more empowered.

The entire premise, however, revolves around openness toward information sharing and a balance of controls within the organization. The idea of democratizing analytics makes a lot of sense, though democratization pushed to the extreme leads to anarchy. Much like Sweden, QlikTech intends to walk that fine line, inspired by democratic ideals and consensus governance, in its quest to bring analytics to the organizational masses. At the end of the day, this may just be the secret sauce that appeals to the millennial generation and beyond.

Regards,

Tony Cosentino

VP & Research Director

Tableau Software is growing fast. It has taken a “land and expand” strategy that drives what it calls the democratization of analytics within organizations. Tableau has enjoyed first-mover advantage in the area of exploratory analytics called visual discovery, a growing type of business analytics that allows companies to easily visualize data in a descriptive manner, but the company is facing competition as deep-pocketed companies such as IBM, SAP and others become more aggressive in the space.

Tableau’s big advantages are that its software is easy to use, can access many data sources, and lacks the complexity of a traditional OLAP cube, which necessitates predefined schemas, materialized views and pre-aggregation of data. Our next-generation business intelligence benchmark research finds that usability is the most critical criterion in the choice of next-generation BI tools, and in usability, Tableau is an industry juggernaut. The company’s VizQL technology obviates the need for an analyst to understand technologies like SQL, and instead lets users explore data from multiple sources using drag-and-drop, point-and-click and pinch-and-tap techniques. No code, no complexity.

With Tableau’s 8.0 release, code-named the Kraken, currently in beta, the story gets more compelling. The Kraken takes the software beyond the business user and into the IT department, the home of BI giants. New ease-of-use features such as better infographics, text justification and rich text formatting got applause at Tableau’s customer conference in San Diego earlier this month, where Tableau announced three specific features that help equip it for battle with traditional BI vendors.

The first is the ability to author, customize and share reports through a browser interface. This brings much of the functionality that was available only on Tableau Desktop to the Tableau Server environment. It gives users anywhere, anytime access, and increases manageability within the environment. Visualizations can be shared through a link, then picked up by another author in an iterative and collaborative process. Administrators can make sure dashboard proliferation doesn’t overwhelm users.

One big advancement is Tableau’s ability to embed its software in other companies’ portals or web applications through a JavaScript API. Many companies should pick up on this advancement to partner with Tableau to embed analytics into their applications. Our research shows that the market has yet to decide how next-generation BI will be delivered, with approximately even splits between those expecting it to go through a BI application (38%), end-user application (34%) and office productivity suite (36%). Anecdotally, we are seeing an uptick in embedded BI arrangements, such as Workday embedding Datameer and Kronos embedding MicroStrategy. Given Tableau’s visualization sophistication, I anticipate it will get a lot of traction here.

Tableau announced support for Salesforce.com and Google Analytics at the conference. The Google move was soon extended to include Google’s BigQuery, which is based on Google’s Dremel real-time distributed query technology, which works in conjunction with Google’s MapReduce. Cloudera recently announced a similar approach to big-data ad-hoc analytics with its Impala initiative, and it chose Tableau as the first company to integrate. These partnerships say a lot about Tableau’s potential partnering power, which I anticipate will become a more important part of the company’s overall strategy.

While the conference announcements were extensive, and in many ways impressive, the battle for the hearts and minds of both IT departments and business users remains. Tableau comes from the business side, and it remains to be seen how powerful the usability argument is in the face of the risk and compliance issues that IT faces. Tableau may encounter resistance as it moves closer to the IT department in order to enable multidepartment rollouts. IT often has long-term relationships with large software providers, and these providers are now bringing their own tools to market, including SAP’s Visual Intelligence (a.k.a. Visi) and IBM Cognos Insight, which I recently blogged about. In many ways it is easier for IT to convince business to use these tools than for business users to make that argument to IT. The outcome of the battle depends on how quickly companies like SAP and IBM can catch up. Tableau says its R&D spending as a percentage of revenue is much higher than that of its competitors, but the question is whether that percentage is big enough to compete with the deep pockets of the competitors that surround it. My assumption is that competitors will ultimately catch up, and then Tableau’s success will come down to how large a footprint it has established and the loyalty of its user base.

In sum, Tableau has an impressive offering, and the 8.0 beta release is another step forward. The company is advancing a new era of interactive visual discovery of data. Its partnerships and links to multiple data sources make it difficult to ignore in the business intelligence space. The advancements mentioned above, as well as Tableau’s focus on things such as cohort analytics and quasi-experimental design approaches, give the company a fair amount of runway with respect to core analytics in the organization. However, it needs to start putting more statistical prowess into the application, starting with basic descriptive statistics and significance testing such as t-tests and chi-squared tests. While it is great to have cool pictures and graphs, if users cannot find real differences in the data, the software’s value is limited. Also, in order to meet its ambition of truly democratizing analytics, it needs to build out or embed basic analytic training modules. This will be key in getting from the What of the data to the So What of the data. Addressing this skills gap, as I wrote about in a blog post earlier this year, is one of the most important areas of focus for companies and suppliers playing in the analytics space. Suppliers that focus only on the tools themselves and ignore data sources and people aspects will see diminishing returns.
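
As an example of the kind of significance testing I have in mind, here is a brief sketch using SciPy with invented sample data; it is independent of Tableau’s product. A t-test compares means between two groups, and a chi-squared test checks whether counts in a two-by-two table differ from what chance would predict.

```python
# Basic significance testing: is an observed difference real or just noise?
from scipy import stats

region_a = [22, 25, 27, 30, 28, 26]  # hypothetical weekly sales samples
region_b = [19, 21, 24, 23, 20, 22]

t_stat, p_value = stats.ttest_ind(region_a, region_b)
print(f"t-test p-value: {p_value:.3f}")  # a small p-value suggests a real difference

# Chi-squared test on a 2x2 table of counts (e.g., churned vs. retained by plan)
observed = [[30, 70], [45, 55]]
chi2, p, dof, expected = stats.chi2_contingency(observed)
print(f"chi-squared p-value: {p:.3f}")
```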

Tableau is well on its way into IT departments with its latest advancements, but it still needs to better address things such as write-back, data management and higher-level analytics if it hopes to compete with broader BI portfolios. Competitors in this market are not standing still; they are beginning to morph into more operation-oriented analytical systems.

Business users and departments considering exploratory analytics tools for their companies should definitely consider Tableau. For IT departments with broader responsibility, Tableau is also worth a look. Tableau is a leader in this emerging space, and with its continued investment in R&D, its strengthening partnerships and a singular focus on bringing analytics to the business populace, it is addressing many core analytics needs within today’s organizations. Tableau is an important company to watch.

Regards,

Tony Cosentino

VP and Research Director
