
PivotLink is a cloud-based provider of business intelligence and analytics that serves primarily retail companies. Its flagship product is Customer PerformanceMETRIX, which I covered in detail last year. Recently the company released an important update to the product, adding attribution modeling, a type of advanced analytic that allows marketers to optimize spending across channels. Such capabilities are particularly important for retailers. The explosion of purchase channels introduced by the Internet and competition from online retailers are forcing a more analytic approach to marketing, as organizations try to decide where marketing funds can be spent to best effect. Our benchmark research into predictive analytics shows that achieving competitive advantage is the number-one reason for implementing predictive analytics, chosen by two-thirds (68%) of all companies and by even more retail organizations.

Attribution modeling applied to marketing enables users to assign relative monetary and/or unit values to different marketing channels. With so many channels competing for their limited resources, it is difficult for marketers to defend the dollars they allot to each channel if they cannot provide analysis of the return on the investment. While attribution modeling has been around for a long time, the explosion of channels, creating what PivotLink calls omnichannel marketing, is a relatively recent phenomenon. In the past, marketing spend focused on just a few channels such as television, radio, newspapers and billboards, and marketers modeled spending through a type of attribution called market mix models (MMM). These models are built around aggregate data, which is adequate when you have just a few channels to calibrate but breaks down in the face of a broader environment. Furthermore, the MMM approach does not allow for sequencing of events, which is important in understanding how to direct spending to impact different parts of the purchase funnel. Newer data sources combined with attribution approaches like the ones PivotLink employs increase visibility into consumer behavior at the individual level, which enables a more finely grained approach. While market mix models will persist where only aggregate data is available, the collection of data in multiple forms (as with big data) will expand the use of individual-level models.

PivotLink’s approach allows marketers and analysts to address an important part of attribution modeling: how credit is assigned across channels. Until now, the first click and the last click typically have been given the greatest weight. The problem is that the first click can give undue weighting to the higher part of the funnel and the last click undue weighting to the lower end. For instance, customers may go to a display advertisement to become aware of an offer, but later do a search and buy shortly after. In this instance, the last-click model would likely give too much credit to the search and not enough credit to the display advertisement. While PivotLink does enable assignment by first click and last click (and by equal weighting as well), the option of custom weighting is the most compelling. After choosing that option from the drop-down menu, the marketer sees a slider in which weights can be assigned manually. This is often the preferred method of attribution in today’s business environment because it provides more flexibility and often better reflects the reality of a particular category; however, domain expertise is necessary to apportion the weights wisely. To answer this particular challenge, the PivotLink software offers guidance based on industry best practices on how to weight the credit assignment.
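To make the credit-assignment models concrete, here is a minimal sketch of how first-click, last-click, equal and custom weighting distribute revenue across an ordered path of touchpoints. The function name, weights and channel names are illustrative assumptions for explanation, not PivotLink's actual implementation.

```python
def attribute(touchpoints, revenue, model="last_click", custom_weights=None):
    """Distribute conversion revenue across an ordered list of channel touchpoints."""
    n = len(touchpoints)
    if model == "first_click":
        weights = [1.0] + [0.0] * (n - 1)      # all credit to the first touch
    elif model == "last_click":
        weights = [0.0] * (n - 1) + [1.0]      # all credit to the closing touch
    elif model == "equal":
        weights = [1.0 / n] * n                # credit split evenly
    elif model == "custom":
        total = sum(custom_weights)            # normalize manually set slider weights
        weights = [w / total for w in custom_weights]
    else:
        raise ValueError(f"unknown model: {model}")
    credit = {}
    for channel, w in zip(touchpoints, weights):
        credit[channel] = credit.get(channel, 0.0) + revenue * w
    return credit

# The display ad starts the journey; search closes the $100 sale.
path = ["display", "email", "search"]
print(attribute(path, 100.0, "last_click"))                  # search gets all the credit
print(attribute(path, 100.0, "custom", [0.5, 0.2, 0.3]))     # analyst-chosen weights
```

In this toy path, the last-click model awards the full $100 to search, illustrating exactly the distortion described above, while the custom weights let the analyst push half of the credit back to the awareness-building display ad.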

Being based in the cloud, PivotLink is able to maintain an aggressive release cycle. Rapid product development is important for the company as its competitive landscape becomes crowded: on-premises analytics providers are porting their applications to the cloud, and larger vendors are looking to the midmarket for incremental growth. PivotLink can counter this by continuing to focus on usability and on analytics applications for vertical industries. Attribution modeling is an important feature, and I expect to see PivotLink roll out other compelling analytics as well. Retailers looking for fast time-to-value in analytics and an intuitive system that requires neither a statistician nor IT involvement should consider PivotLink.


Tony Cosentino

VP and Research Director

I had a refreshing call this morning with a vendor that did not revolve around integration of systems, types of data, and the intricacies of NoSQL approaches. Instead, the discussion was about how its business users analyze an important and complex problem and how the company’s software enables that analysis. The topic of big data never came up, and it was not needed, because the conversation was business-driven and issue-specific.

By contrast, we get a lot of briefings that start with big data’s impact on business but devolve into details about how data is accessed and the technology architecture. Data access and integration are important, but when we talk about big data analytics, focusing on the business issues is even more critical. Our benchmark research into big data shows that most companies use big data for storage (95%) and reporting (94%), but far fewer use it for data mining (55%) or what-if scenario modeling (49%). That must change. Descriptive analysis of big data is quickly becoming table stakes; the real competitive value of big data analytics lies in the latter two categories.

Not every big data vendor drowns its message in technospeak. IBM, for instance, stokes the imagination with analytical systems such as Watson and does a good job of bringing its business-focused story to a diverse audience through its Global Business Services arm. Some newer players paint compelling pictures as well. Companies such as PivotLink, PlanView and SuccessFactors (now part of SAP) deliver analytics stories from different organizational perspectives. Part of their advantage is that they start from a cloud and application perspective, but they also tell the analytics story in context of business, not in context of technology.

Providing that business perspective is a more difficult task for BI companies that have been pitching their software to IT departments for years, but even some of these have managed to buck the trend. Alteryx, for instance, differentiates itself by putting forward compelling industry-specific use cases and espousing the concept of the data artisan; this right-brain/left-brain approach appeals to both the technical and business sides of the house. Datameer also does a good job of producing solid business use cases, and its recent advancements in visualization help the company paint the analytical picture from a business perspective. Unfortunately, other examples seem few and far between. Most companies are still caught pitching technology-centric solutions, despite the fact that in the new world of analytics it’s about business solutions, not features on a specification sheet.

This focus on business issues over technology is important because the business side of the house today controls more and more of the technology spending. While business managers understand business and often have a firm grasp of analytics, they don’t always understand or care about the intricacies of different processing techniques and data models. In our upcoming benchmark research on next-generation BI systems, whose data I’m currently analyzing, we see this power shift clearly. While IT still has veto power, decisions are being driven by business users and ratified at the top of the organization.

The Ventana Research Maturity Model from our business analytics benchmark research shows that the analytics category is still immature, with only 15 percent of companies reaching the innovative level. So how do we begin to change this dialog from a technology-driven discussion to a business-driven one? From the client perspective, it starts with a blue sky approach, since the technological limitations that drove the old world of analytics no longer exist. This blank canvas may be framed by metrics such as revenue, profit and share of wallet, but the frame is now extending into less tangible and forward-looking areas such as customer churn and brand equity. If these output metrics are the frame, then people, process, information and tools are the brushes with which we paint. The focal point of the piece is always the customer.

If a business has a hard time thinking in terms of a blank canvas, it can examine a number of existing cases that show the value of using big data analytics to help illuminate customer behavior, web usage, security, location, fraud, regulation and compliance. Some of the bigger ones are briefly discussed in my recent blog entry on predictive analytics.

The big data industry, if we can call it that, is quickly moving from a focus on the technology stack to a focus on tangible business outcomes and time-to-value (TTV). The innovations of the last few years have enabled companies to take a blue sky perspective and do things that they have never thought possible. The key is to start with the business problem you are looking to solve; the technology will work itself out from there.


Tony Cosentino

VP and Research Director
