You are currently browsing the monthly archive for December 2012.

I recently returned from Sweden, where QlikTech International hosted its annual analyst “unsummit.” Much of the information I was exposed to was under NDA, so I cannot talk about it here. What I can discuss, and what in many ways may be more interesting and more important, is the company’s focus on culture and philosophy.

Arguably, culture and company philosophy can provide a competitive advantage by giving employees, customers and partners a vision and a sense of engagement with the product and the company itself. Engagement and software usability have become important topics as companies such as Tableau and QlikTech have brought user-friendly, visual and iterative analysis to software that has heretofore been the domain of a few tool-oriented analysts. As such tools continue to gain traction in organizations, I anticipate that these discussions around culture, engagement and the consumer-oriented idea of brand intimacy will become more important. In particular, I see two trends driving this: the consumerization of business technologies, which I discussed in a recent blog post, and demographic workforce shifts, which my colleague Robert Kugel discussed.

QlikTech CEO Lars Bjork opened with a talk about the company. He spoke of its origins and how the Swedish government gives favorable terms to Swedish startups but then extracts high interest after a few years. He used this story to show how his company overcame early-stage difficulties in repaying the loan, and how the Swedish government listened to its needs and worked to resolve the issues in a mutually beneficial way. QlikTech eventually became the most successful venture in which the Swedish government had ever engaged.

Bjork used this story as a jumping-off point to discuss the cultural backbone of the company, which is its Swedish heritage.  Sweden, he suggested, is interesting in two ways. The first is its design principles of simplicity and sophistication. The second is the consensus decision-making models in which its population engages.

Simplicity and sophistication are readily evident in Swedish architecture and furniture, and QlikTech and its user base make a strong argument that they underpin its software as well. (Interestingly, two of the new breed of BI software vendors were born and grew up in Sweden – Spotfire in the north and QlikTech in the south.) QlikTech uses a simple but powerful color-oriented approach to help users understand data. Selected values in a data set are highlighted in green, linked values are white, and excluded values are gray. This provides users an exploratory environment in which to iterate on and visualize data. The approach was inspired by spreadsheet environments, where color coding is often used for analysis and alerting on differentiated variables. While QlikTech has advanced significantly from its spreadsheet roots, the color design principles remain intact and usability remains a key tenet of its strategy.
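The green/white/gray logic described above can be sketched in a few lines of Python. This is a toy illustration under my own assumptions about the data model – the field names and classification rules here are hypothetical, not QlikTech's actual implementation:

```python
# Toy data set: each record links a region to a product.
records = [
    {"region": "North", "product": "A"},
    {"region": "North", "product": "B"},
    {"region": "South", "product": "B"},
    {"region": "South", "product": "C"},
]

def color_values(selections):
    """Color every value of every field relative to the current selections:
    'green' = selected, 'white' = linked to the selection, 'gray' = excluded."""
    # Rows that survive the current selections.
    matching = [r for r in records
                if all(r[f] == v for f, v in selections.items())]
    colors = {}
    for field in records[0]:
        for value in {r[field] for r in records}:
            if selections.get(field) == value:
                colors[(field, value)] = "green"
            elif any(r[field] == value for r in matching):
                colors[(field, value)] = "white"
            else:
                colors[(field, value)] = "gray"
    return colors

colors = color_values({"region": "North"})
# Selecting the North region turns it green, leaves its linked
# products A and B white, and grays out everything else.
```

The exploratory value comes from re-running the classification on every click, so the user always sees which values remain linked to the current selection.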

Perhaps the more interesting cultural aspect of Sweden is its societal and political culture. It is a democracy with a measure of shared responsibility in government and society that isn’t really found in the United States. The US is much more aligned with Teddy Roosevelt’s idea of “rugged individualism” and the self-made man. In business, this American mindset tends to create a chain-of-command culture in which decision-making is pushed up the organizational pyramid; once a decision is made, it is disseminated through the organization. European businesses, and Swedish organizations in particular, are arguably less hierarchical.

This discussion of Sweden and its culture naturally flows into how business intelligence is becoming more collaborative by nature and how decisions are being pushed down through the organization. It also brings to light the different philosophies of locking down data and controlling data versus the ideas of openness and information sharing. As our benchmark study around next-generation business intelligence shows, these are key points for the future of business intelligence. As cultures become more open with mobility and collaboration, and decisions are moved down through organizations, employees become more empowered.

The entire premise, however, revolves around openness toward information sharing and a balance of controls within the organization. The idea of democratizing analytics makes a lot of sense, though democratization pushed to the extreme leads to anarchy. Much like Sweden, QlikTech intends to walk that fine line, inspired by democratic ideals and consensus governance, in its quest to bring analytics to the organizational masses. At the end of the day, this just may be the secret sauce that appeals to the millennial generation and beyond.


Tony Cosentino

VP & Research Director

As a market research practitioner and a technology industry analyst covering big data and business analytics, I enjoyed listening to other analysts discuss the market research industry in a webinar.  My own research augments and sometimes contrasts with that of the webinar participants.

In the webinar, Simon Chadwick spoke about data mining and the need to focus on the analytic skills gap. I’d maintain that businesses need to focus on traditional hypothesis-driven statistics in addition to data mining, especially when we start talking about predictive analytics. While data mining is directed from the data itself, statistics may be thought of as more hypothesis-driven. If you’re familiar with the SPSS tools that I recently assessed, you can see the difference by comparing SPSS Statistics and SPSS Modeler. Market research practitioners have traditionally been trained in the former, but not the latter. I go into more detail in a joint webinar I did with IBM on predictive analytics.
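The distinction can be made concrete with a short sketch. The data and group labels below are invented for illustration; the point is only that one approach starts from a stated hypothesis while the other lets an algorithm find structure:

```python
import numpy as np
from scipy import stats
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
group_a = rng.normal(10.0, 2.0, 100)  # e.g., spend of customers shown an offer
group_b = rng.normal(11.0, 2.0, 100)  # e.g., spend of customers not shown it

# Hypothesis-driven statistics: we posit H0 "the group means are equal"
# and test that specific claim.
t_stat, p_value = stats.ttest_ind(group_a, group_b)

# Data mining: pool the data with no prior hypothesis and let a
# clustering algorithm discover segments on its own.
pooled = np.concatenate([group_a, group_b]).reshape(-1, 1)
segments = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pooled)
```

Loosely, SPSS Statistics supports the first workflow and SPSS Modeler the second; the analyst asks a question in one case and interrogates the answer the data volunteers in the other.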

Folks who know data mining are in limited supply. They often come from the data warehouse or business intelligence worlds, which have not traditionally churned out deep analytical expertise. Data warehousing and business intelligence are often the domain of database administrators or folks who have a good understanding of structured query language (SQL). Some technology companies are trying to fill the skills gap related to big data (read: unstructured data) by taking advantage of these SQL skillsets. Teradata is moving in this direction with its Aster Data integration, as is Karmasphere with its toolset, but SQL is a declarative language, and while it fills some gaps in organizations’ ability to access big data, it has its own limitations.

To bring home the point of how important advanced analytic skillsets are to an organization, our benchmark research shows that companies are more satisfied (70% versus 59%) when their predictive analytics initiatives are led by analytics professionals rather than by the data warehouse team. (As an aside, our benchmark research into predictive analytics shows it to be a key area where marketing and sales are focusing their efforts right now, with social media analytics and attrition, response and attribution modeling as key components of the strategy.)

Since neither the technology industry nor the market research industry has trained analytics professionals especially well, we have a big shortage of folks who can lead big analytics initiatives. This skills gap is driving significant funding for companies such as Mu Sigma, Absolute Data and GoodData. Not surprisingly, these companies target the marketing and market research client-side professional. It’s not a big leap for analytics professionals in the market research industry who already know hypothesis-driven statistics to move into data mining and data modeling, but these skills generally exist on the market research supplier side, not inside client organizations themselves.

Technologists and market research practitioners have long lived in parallel universes.  In technology, we deal with tables, joins and the ETL process. In market research and analysis, we deal with datasets, merges and data preparation. When you think about it, these are the same things! The subtle difference is that technologists have had a data mining mindset and market researchers have had a hypothesis-driven mindset.

As analytics and data environments come together, market researchers need to get more into data mining and more comfortable with data modeling. At the same time, technologists need to get more into hypothesis-driven analytics. In the webinar, Chadwick mentioned the massive advances in technologies, which are apparent when we look at trends in in-database analytics and embedded analytics. We’re seeing a rise in the availability of complex algorithms and open source languages such as R; nevertheless, the three most-used algorithms in enterprises today are still the simpler ones: logistic regression, linear regression and decision trees. These should sound very familiar to those of you in the market research industry.
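All three of those workhorse algorithms are a few lines apiece in modern tooling. The sketch below uses scikit-learn on synthetic data; the feature names in the comments are illustrative assumptions, not part of any real data set:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))                 # e.g., recency, frequency, spend
y_cont = X @ [1.5, -0.5, 2.0] + rng.normal(scale=0.1, size=200)
y_bin = (y_cont > 0).astype(int)              # e.g., churned vs. retained

# Linear regression: predicts a continuous outcome.
linear = LinearRegression().fit(X, y_cont)

# Logistic regression: predicts a binary outcome as a probability.
logistic = LogisticRegression().fit(X, y_bin)

# Decision tree: yields rule-like segments that are easy to explain.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y_bin)
```

The survey researcher's familiar regression toolkit and the data miner's tree-based segmentation sit side by side in the same library, which is part of why the two mindsets are converging.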

Another part of the discussion focused on web analytics. I’d extend that to digital analytics and a new class of tools that go beyond cookie-driven web analytics by pulling from machine data, which is exposed in a variety of ways. The net impact is that rather than just looking at numbers of hits or click-through rates or transactions, we’re beginning to be able to see into the customer journey – sort of like doing a shop-along in retail, but in a digital space. This gives us great insight into the purchase funnel and competitive dynamics, the likes of which we just didn’t see before. As we marry this technology with analysis of offline behavior, things get even more interesting, but also more complex. We start to deal with privacy issues unless we invoke faceless types of analysis, but that limits us in our ability to market at an individual level. At the same time, attribution modeling becomes more complex given the increases in the number of both promotional and transaction channels.

Finally, my one disagreement with the webinar speakers is their assertion that there’s not enough investment in education going into schools. On the contrary, every major vendor in the technology space I speak with highlights the schools they are aligned with, and most of these folks talk about investing in these schools with respect to analytics training. I agree with webinar participant Lenny Murphy that academia is often slow to change; I often compare market research with academia, in fact. But schools are getting more private funding as the government pulls back its spending, and for this reason schools are becoming more responsive to the needs of private enterprise. It’s here where the skills gap will begin to be filled as schools move away from classical education underpinnings to be more aligned with the needs of the 21st century.


Tony Cosentino

VP & Research Director

We recently completed our benchmark research on next-generation business intelligence. Ventana Research looks at next-generation BI as traditional BI converging with new technologies such as mobility, collaboration and cloud computing. Just a few years ago business intelligence might have been considered a mature category with incremental growth, but now it’s growing in new directions, and it’s difficult today to call business intelligence mature.

One of the reasons for the dramatic change in business intelligence is the impact of consumer technologies in the workplace. Our study shows that 53 percent of companies are currently deploying or plan to deploy tablet computers in their BI environments. This trend is driven by executives who have started to bring their devices to work and are asking for support – the so-called BYOD movement.

From the BYOD trend, it is apparent that ease-of-use and integration expectations are being led at the consumer level. Think about how easy it is to do things on an application like Yelp, where social, local and mobile technologies come together in real time to offer insights on our choice of restaurants.

When we port these expectations into the business environment, however, the tools we have in place do not meet these expectations. In our study, only 28 percent say they are fully satisfied with mobile BI, and only 32 percent with collaborative BI. Furthermore, our maturity model shows that while the people and technology categories are mature, information and processes are immature and holding companies back with respect to next-generation BI. This makes sense, since people have the technology and are skilled at using it in consumer environments, but they lack integrated information in the workplace, as well as the processes they need to take advantage of next-generation BI capabilities. Until businesses can take advantage of the kind of integration available in the consumer environment, we will likely see the satisfaction with these technologies stay relatively low.

Unfortunately, we found no coalescence around any particular access method. Just under two-fifths of study participants (38%) prefer business intelligence applications as the primary access method for collaborative BI, but 36 percent prefer access through office productivity tools, and 34 percent prefer access through the applications themselves.

Clearly, next-generation business intelligence is extremely important and can provide real competitive advantages, but it is still a bit of a minefield. For this reason, we strongly encourage companies to look at their information environments, consider current role-based workflows, and develop solutions that fit as seamlessly as possible into their environments. The alternative – deploying next-generation BI in a horizontal manner without careful thought for how the technologies integrate with the surrounding people, process and information – is just asking for trouble.


Tony Cosentino

VP and Research Director

GoodData has been around since 2007, but it has seen especially explosive growth in the last year due to fast adoption of cloud business intelligence. In a recent meeting, Roman Stanek, the company’s CEO, told me GoodData has more than 6,000 customers, many of them household names. Given that he publicly stated a customer base of around 2,500 last year, the company’s growth appears to be on an exponential curve. This momentum is attracting significant investment: the company raised $25 million in a Series C round in July of this year, bringing the cumulative total invested in the business to more than $55 million.

BI has been a laggard when it comes to migration to the cloud, but things are starting to change. In our latest research on Next-Generation Business Intelligence, we see that 25 percent of customers now prefer to have BI provided as a service – for good reason. BI in the cloud goes beyond the traditional cloud advantages of reducing capital expenditures, faster time-to-value, and scalability; BI in the cloud allows for greater flexibility in adding new data sources, experimenting with analytical models and sharing data across the value chain. In this context, GoodData appears to be in exactly the right place at the right time.

In building out its portfolio, GoodData has followed CRM, sales and service – the applications that have had the most success in the cloud. The company’s portfolio consists of three customized SaaS offers, called “Bashes” (short for business mash-ups): GoodSales, GoodMarketing and GoodSubscription. The company says it also has applications in operations, HR and finance, but they are not readily apparent on its website. Outside of the core customized offers, GoodData offers companies the ability to build their own BI applications. In all, the company runs more than 12,000 data warehouses. The GoodData platform itself runs in the cloud on Amazon Web Services.

GoodMarketing Bash focuses on areas such as marketing mix, showing where marketing dollars are most effectively spent. This has traditionally been a challenge for marketers and is in many ways getting more difficult with the proliferation of promotional channels. Attribution model integrations for things like website and email marketing have proved elusive for marketing organizations. By bringing different data sources – including behavioral, attitudinal and profile data – together in a common repository, companies can realize significant value from their BI investments. Right now most companies still pursue such models at an aggregate level, but I anticipate that, as the data enables a more granular approach, share-of-wallet and other individual-level analytics, including omnichannel attribution models, will become more pervasive. The GoodMarketing Bash application recently received our 2012 Technology Innovation Award for its contribution to helping marketing organizations.

GoodSales Bash focuses on the purchase process funnel, consisting of brand awareness, consideration and purchase. This gives executives, marketers and sales representatives the ability to look at deals and see where to spend their time and energy. The company’s software can see how deals flow through the pipeline and where there is leakage or where deals are stalled. Metrics and benchmarks around sales win/loss provide information on what types of deals were lost and for what reasons. These types of analytics have traditionally been done separately and removed in time and space from actual sales activities. Embedding analytics directly into the sales process and allowing sales and management to work together creates a compelling time-to-value equation for companies.

GoodSubscription Bash focuses on subscription-based businesses. It looks at many of the classic loyalty metrics that drive these types of businesses, such as customer lifetime value, average revenue per customer, churn, engagement, acquisition costs and upsell opportunities. GoodSubscription is primarily targeted at content and online media companies.

It will be interesting to see what GoodData does with its new infusion of capital. I expect it will look to expand its footprint through a direct sales force at the enterprise level and will look to eat into some of the traditional enterprise BI pie. In order to do this, it will need to build a more vertically oriented portfolio of software that helps different departments in different industries define their metrics and their best analytical path to value.

Expansion on the partner side also seems to be an obvious course as companies look to OEM best-of-breed solutions in building their own offers. Our research shows that the market has yet to decide how next-generation BI will be delivered, with approximately even splits among those expecting it to go through a BI application (38%), an office productivity suite (36%) and an end-user application (34%). Anecdotally, we have recently seen an uptick in embedded BI arrangements, such as Tableau working with Google BigQuery and Cloudera’s Impala, and Workday embedding Datameer into its cloud application. As the land grab moves forward, GoodData seems to be well-positioned to form lucrative partnerships.

GoodData does have a couple of outstanding issues, but I believe the company will address them in the next year. Currently it has not articulated a well-defined mobile strategy. To some degree this makes sense, and is similar to what we have seen with other companies, but it must be addressed. Since there is no decisive direction for mobile in terms of HTML5 versus native deployment, a bet on one or the other is often too large for small companies to make; the choice is often to hold back until the market matures a little and shows a clearer direction. This, however, could cost GoodData a certain amount of market share in the interim as companies opt for native deployments of technologies like business intelligence.

The other outstanding issue is the aggregation and resale of data. As a prime mover in cloud BI and analytics, GoodData is positioned to pursue this angle, and as it adds customers, that position will only get stronger. The company must approach this opportunity carefully, since it is a sensitive area in which the idea of trust comes to the forefront. Trust is the least discussed but most important factor in both cloud computing and social media. Sharing of data is predicated on a fundamental but tacit trust relationship with the customer, and the value of sharing data must be made clear. Many industries do benchmarking based on anonymous data, but the data sources must agree to share their data. GoodData will eventually explore this avenue, but likely with a measure of prudence.


Tony Cosentino

VP and Research Director
