
IBM’s Information on Demand conference last week took over the fifth-largest conference venue in the country, the Mandalay Bay Resort and Convention Center in Las Vegas. During the keynote at the end of day one, IBM demonstrated its Cognos portfolio, a family of products that helped IBM earn a Hot ranking in our 2012 Business Intelligence Value Index.

A relatively new addition to the portfolio is Cognos Insight, a personal desktop approach to exploratory analysis. While an early edition of Cognos Insight had a few challenges, the current release, Cognos Insight 10.2, is an improvement. Exploratory analysis tools like Cognos Insight are gaining momentum with business users because they generally do not require IT support and because they allow end users to visualize descriptive data and do root-cause analysis in an iterative, user-friendly manner.

The latest version of the tool is offered in more than 20 languages and incorporates key features to help business users expedite their analytical processes. Once users load data into the environment by dragging and dropping an .xls or .csv file, the software uses an intelligent metadata approach to process information hierarchies. For example, Insight would recognize a one-to-many relationship between customer ID and segment so the user would not have to pre-model the data. This metadata approach also allows for rollups into time frames such as month, quarter or year.
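
To make the idea concrete, here is a minimal sketch in Python of the kind of hierarchy inference and time rollups Cognos Insight automates behind the scenes. The column names and data are hypothetical, and the tool itself does this in its own engine rather than through code.

```python
import pandas as pd

# Hypothetical transaction extract of the kind a user might drag into the tool
df = pd.DataFrame({
    "customer_id": [101, 101, 102, 103, 103],
    "segment":     ["SMB", "SMB", "Enterprise", "SMB", "SMB"],
    "order_date":  pd.to_datetime(["2012-01-15", "2012-02-03", "2012-02-20",
                                   "2012-04-11", "2012-07-29"]),
    "revenue":     [1200.0, 950.0, 8800.0, 430.0, 610.0],
})

# Detect a one-to-many (segment -> customer_id) relationship: each customer
# maps to exactly one segment, so segment can sit above customer in a hierarchy
is_hierarchy = (df.groupby("customer_id")["segment"].nunique() == 1).all()
print("segment/customer_id form a hierarchy:", is_hierarchy)

# Roll transaction dates up into month, quarter and year levels
df["month"]   = df["order_date"].dt.to_period("M")
df["quarter"] = df["order_date"].dt.to_period("Q")
df["year"]    = df["order_date"].dt.to_period("Y")
print(df.groupby(["year", "quarter"])["revenue"].sum())
```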

What makes Cognos Insight particularly interesting is its writeback capability, which allows users to perform ad-hoc scenario planning and what-if analysis and return the results to a central data store where they can be shared with others. What-if analysis can, for example, explore how price elasticity affects demand to find an optimal price point that supports customer retention, or determine the impact of a proposed capital expenditure on production capacity. Cognos’ approach differs from desktop spreadsheet modeling in that information visualization is an integral part of data input and output. Moreover, when coupled with IBM’s Cognos TM1 business performance management application, Cognos Insight provides a distributed and integrated planning interface. This integration gives companies more reasons to put aside desktop spreadsheets for enterprise planning. Ventana Research’s benchmarks, such as our recently completed Integrated Business Planning benchmark research, consistently show that dedicated planning applications enable companies to plan and budget more accurately, in part because they make information created by individual planning silos more accessible and easier to aggregate into an enterprise-wide view.
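
As a simple illustration of the what-if analysis described above, the sketch below projects demand and revenue at candidate price points under an assumed price elasticity. All of the figures are hypothetical; in Cognos Insight this would be an interactive writeback scenario rather than a script.

```python
# What-if price analysis: given an assumed elasticity, project demand and
# revenue at candidate price points. All figures are hypothetical.
base_price, base_units = 20.00, 10_000
elasticity = -1.8  # assumed % change in demand per 1% change in price

def project(new_price):
    pct_price_change = (new_price - base_price) / base_price
    units = base_units * (1 + elasticity * pct_price_change)
    return units, units * new_price

for price in (18.00, 20.00, 22.00, 24.00):
    units, revenue = project(price)
    print(f"price ${price:5.2f} -> units {units:8.0f}, revenue ${revenue:10.2f}")
```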

Insight users collaborate by saving self-contained files to the desktop and emailing them to colleagues. If a report is deemed worthy of publication to a more formal library, users can publish it through tools such as Cognos Express, aimed at departments and midsize companies, or Cognos Enterprise, designed for larger, multi-departmental organizations. Here lies another key differentiator for the Cognos suite of products: It provides a process that allows for user-driven report creation and collaboration, yet also includes a centrally governed business intelligence environment.

IBM sees Cognos Insight both as an entry point for business users to do exploratory analysis and as a way to do analytical crowdsourcing within an organization. However, given its writeback capabilities, its integration with SPSS (which I recently wrote about) and its ties to IBM’s decision management systems, there is much more to this story. In a future blog entry, I plan to explore the broader Cognos portfolio with an eye to framing the larger IBM analytics approach, including how the Cognos family ties into big data and decision-making in an organization. In the meantime, I encourage you to download a free copy of Cognos Insight from IBM’s analytics community site, www.analyticszone.com.

Regards,

Tony Cosentino

VP and Research Director

Unlike other recent conferences that seem to focus almost exclusively on cloud computing, this week’s Teradata Partners Conference emphasized big data and analytics. The vision that Teradata lays out is one in which new technologies such as Apache Hadoop live side by side with more traditional enterprise data warehouses (EDW) and companies have the flexibility to define their own approaches to BI tools. This approach, at least in the near and medium terms, makes a lot of sense, and is backed by our own research into big data, which shows relational databases are still the predominant tool for delivering big data analytics and solutions to the enterprise. Companies have spent a lot of money on their current infrastructures, and not many have the stomach for a rip-and-replace strategy. Nor do most organizations have the tools and the skillsets yet to take full advantage of all of the newer approaches coming into the market around big data analytics.

A key part of Teradata’s big data and analytics strategy is the integration of Aster, which the company acquired about a year ago, into the Teradata portfolio. Aster offers some big advantages for accessing big data through commonly used and understood query approaches such as SQL. In fact, the SQL-H approach that Aster pioneered allows ANSI-standard SQL exploration of big data, and so far Aster is the only product on the market with such capabilities. SQL-H lets users employ familiar SQL approaches against the MapReduce framework and take advantage of the massively parallel processing nature of Hadoop. It does this by leveraging HCatalog, which abstracts a metadata layer from Hadoop and provides hooks for the Aster query engine. Because this approach uses standard SQL, it fits in well with existing BI tools and processes.

The Teradata approach relies heavily on Hadoop, the fastest-growing open source ecosystem around big data, but one that is still in its early stages; most organizations have not even put it through a proof-of-concept phase. Its most mature use case, and the one in which organizations seem to be deriving the most value, is as a supercharged landing strip and refinery for different types of data. Hadoop is very good at capturing and storing data, and at applying low-level math at extremely large scale in batch processes. The functional tasks it performs, such as filtering, sorting, counting and averaging, are valuable in deriving order out of the chaos of unstructured data. Once analysts apply some basic structure to the data, they can take an iterative approach to look at it in more advanced ways and develop more complex algorithms.
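
The batch operations described here are conceptually simple. The sketch below simulates, locally and in Python, the sort of filtering, counting and averaging Hadoop performs at scale; the tab-delimited record layout is hypothetical, and on a real cluster the same logic would be split into map and reduce steps.

```python
# Simulate batch filtering, counting and averaging over records read from
# stdin. The layout (category in field 2, numeric amount in field 5) is
# hypothetical; Hadoop would distribute this work across many nodes.
import sys
from collections import defaultdict

counts, totals = defaultdict(int), defaultdict(float)

for line in sys.stdin:
    fields = line.rstrip("\n").split("\t")
    if len(fields) < 6:
        continue                      # filter: drop malformed records
    category = fields[2]
    try:
        amount = float(fields[5])
    except ValueError:
        continue                      # filter: drop non-numeric amounts
    counts[category] += 1             # count per key
    totals[category] += amount        # sum per key

for category in sorted(counts):       # sort keys for ordered output
    avg = totals[category] / counts[category]
    print(f"{category}\t{counts[category]}\t{avg:.2f}")
```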

Last week Teradata announced the Teradata Aster Big Analytics Appliance. Developed in close partnership with Hortonworks, the appliance integrates hardware with software and aligns with Hadoop via Hortonworks’ HDP 1.1. The system provides more than 50 prebuilt MapReduce functions that are accessible in SQL. Because it uses standardized SQL, the system can leverage off-the-shelf BI tools and current ETL deployments. Another key feature is Teradata Viewpoint, which provides server management and monitoring for Teradata’s EDW platform and Teradata Aster; support for Hadoop is expected in early 2013. The appliance connects to the Teradata EDW platform via 40Gbps InfiniBand and the SQL-H, TD-Aster and TD-Hadoop connectors.

Teradata promises enterprise-class customer support for the Big Analytics Appliance, with three levels of support across the entire system, including Hadoop. Calls escalated beyond those levels, to level four, are routed to the appropriate centers of expertise, including platform engineering, Aster engineering and Hortonworks.

This enterprise assurance factor is a key part of Teradata’s analytics strategy and will be a key determinant of its future success. In discussions with customers at the conference, I got a positive feeling about Teradata’s ongoing commitment to tight integration within its systems, its approach to professional services and its general approach to customer support and satisfaction. Enterprise assurance is an intangible driver of purchasing behavior, but it can be an especially strong one in changing times. One challenge will be for Teradata to maintain this value as it grows. It will be incumbent upon its professional services divisions to attract and retain a high level of talent and to structure service delivery so that any growing pains are invisible from a customer perspective.

Teradata’s Big Analytics Appliance and big data analytics strategy provide a compelling story that addresses the analytics skills and staffing gap revealed by our big data research and promises high levels of customer assurance.

Regards,

Tony Cosentino

VP & Research Director

Our new world of multifaceted customer communications is driven by moments of interaction with the brand, often called moments of truth. Today’s call center analytics put companies in a position to manage these moments. Analytics that are specific to the call center include desktop analytics, event stream analytics, speech analytics, text analytics, cross-channel analytics and predictive modeling. These analytics, in turn, drive areas such as agent training and coaching, time and capacity optimization, customer satisfaction and loyalty, and cross-sell and upsell opportunities.

Our benchmark research into call center analytics shows that training and coaching are two of the biggest drivers of dissatisfaction among call center managers and executives, yet agent coaching is still a very manual process. According to our benchmark research on Agent Performance Management, 71 percent of agent coaching is triggered by the supervisor, and 86 percent of agent coaching is selected by the supervisor. This suggests that companies are using subjective criteria, without much analytical rigor, to drive some of their most important processes. With analytics, agent training can be driven in a more scientific manner rather than a one-size-fits-all approach: areas for improvement are defined systematically to trigger training, which can then be delivered at the desktop during agent idle time.
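
As a hedged illustration of what analytics-triggered coaching might look like, the sketch below flags agents whose metrics fall outside team norms and names the area to coach. The metrics, thresholds and agent data are all hypothetical.

```python
# Flag agents for coaching when a metric deviates more than one standard
# deviation from the team in the unfavorable direction. Hypothetical data.
from statistics import mean, stdev

agents = {
    "A100": {"aht_sec": 310, "fcr_rate": 0.82, "csat": 4.3},
    "A101": {"aht_sec": 455, "fcr_rate": 0.71, "csat": 3.6},
    "A102": {"aht_sec": 298, "fcr_rate": 0.88, "csat": 4.5},
    "A103": {"aht_sec": 372, "fcr_rate": 0.64, "csat": 3.9},
}

def flags(metric, higher_is_better):
    values = [a[metric] for a in agents.values()]
    mu, sigma = mean(values), stdev(values)
    for agent_id, m in agents.items():
        z = (m[metric] - mu) / sigma
        if (higher_is_better and z < -1.0) or (not higher_is_better and z > 1.0):
            yield agent_id, metric, round(z, 2)

coaching_queue = (list(flags("aht_sec", higher_is_better=False))
                  + list(flags("fcr_rate", higher_is_better=True))
                  + list(flags("csat", higher_is_better=True)))
print(coaching_queue)  # topics to deliver at the desktop during idle time
```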

Agent utilization and capacity planning are other key areas of concern among managers. Our research shows that 75 percent of call center managers and executives consider increasing agent utilization to be very important. In one use case, a company had forecast variation across locations in both hours and customer satisfaction, which resulted in significant overtime cost overruns as well as significant hits to overall customer satisfaction and loyalty scores. By implementing uniform technology and automated analytic processes across its call centers, the company achieved more than a 10 percent reduction in operating expenses, a positive return on investment within months, and an increase of more than 5 percent in customer satisfaction scores.

Companies should conduct a thorough assessment to see what kinds of analytics make sense in terms of business need and technology investment, but agent training and agent utilization are two areas of low-hanging fruit that can demonstrate the value of workforce analytics in the call center.

The key for the business case is to hit the high points: a 10 to 20 percent reduction in operational expenses, months rather than years to break even, and a significant increase in customer satisfaction and customer experience scores. This last factor is the most important, as it addresses the most common complaint I hear from executives regarding their voice-of-the-customer (VOC) programs: the needle on NPS (or satisfaction, loyalty index or another outcome metric) does not move.

Regards,

Tony Cosentino

VP & Research Director

I had the opportunity last week to visit PivotLink’s office in Bellevue, Washington, which houses the company’s development team and marketing leadership, to see its software. After taking the helm a little more than a year ago and putting a new team in place, CEO Bruce Armstrong has positioned the company above the fray of the crowded business intelligence software market. The company has smartly moved into the retail space with user-friendly tools that should appeal to mid-tier retailers, the market where it has historically been successful. Building on my earlier analysis of PivotLink and its advancement into analytics and cloud computing, this review focuses on its efforts to help the retail industry.

As we’ve discussed recently, marketing in retail environments is becoming more sophisticated thanks to forces such as cloud applications and shoppers’ mobile devices. This in turn is driving demand for a new class of analytics and technology to help with marketing optimization, attribution modeling, churn and share-of-wallet analytics, and merchandising analytics.

PivotLink addresses these challenges through software as a service (SaaS). Our recent benchmark research on business data in the cloud indicates that companies are increasingly adopting SaaS-based products across all lines of business. SaaS shifts capital expenditures to operational expenditures, reduces IT involvement and shortens time-to-value for technology adoption.

PivotLink’s RetailMETRIX provides 30 prebuilt reports and 60 best-in-class metrics that allow marketers to start using the tool right away. Some of its primary uses are to identify underperforming products or brands within a portfolio and to understand causal elements. This allows marketers to better anticipate customer demand and gives visibility into current trends in the supply chain.

Customer PerformanceMETRIX allows marketers to perform a range of analytics through user-friendly drill-downs, thereby producing ad hoc customer segmentations and enabling attribution and RFM analysis. Customer PerformanceMETRIX also integrates with third-party marketing systems to let users optimize campaigns against a particular merchandising strategy, then feed the results back into the PivotLink system in a closed-loop process.

DataCLOUD provides data enrichment services by attaching customer-level data such as household demographics and psychographics. It can take into account big data sources such as social media data, including product-level and store-level reviews, as well as traffic and weather patterns. Such information is becoming more important in retail analytics as it allows businesses to assess the causes behind store-level performance.

PivotLink also provides mobile analytics with native capabilities for Apple and Android tablets. In our benchmark research on information applications, 51 percent of participants said broader access to information on mobile technologies is important or very important. In addition, our soon-to-be-released benchmark research on next-generation business intelligence shows that many organizations use a broad spectrum of mobile devices. Users want all the capabilities inherent in those devices, which are available via native applications. Mobile capabilities should also appeal to PivotLink’s core constituency of retail customers, who often spend many hours out of the office traveling to various locations and suppliers.

PivotLink is in a position to capitalize on the changing environment for retail analytics. With most companies still using personal productivity tools to do their analysis, a turnkey SaaS solution with a low investment barrier makes a lot of sense in terms of time-to-value (TTV). Right now market momentum is shifting to cloud-based applications, but there are still very few pure-play cloud-based retail analytics vendors. PivotLink must move quickly before other cloud technology players seize the opportunity or larger vendors move downstream to make a heavier push into mid-tier retail analytics. I encourage retailers that are still doing analytics in spreadsheets to take a closer look at PivotLink and its approach to business analytics.

Regards,

Tony Cosentino

VP and Research Director

Our benchmark research into retail analytics finds that only 34 percent of retail companies are satisfied with the process they currently use to create analytics, a satisfaction score 10 percent lower than we found for all industries combined. The dissatisfaction is driven by underperforming technology that cannot keep up with the dramatic changes occurring in the retail industry. Retail analytics lags the broader business world, with 71 percent of retail companies still using spreadsheets as their primary analysis tool. This is significantly higher than in other industries and shows the immaturity of retail analytics.

While in the past retailers did not need to be on the cutting edge of analytics, dramatic changes occurring in retail are driving a new analytics imperative:

Manufacturers are forming direct relationships with consumers through communities and e-commerce. These relationships can extend into the store and influence buyers at the point of purchase. This “pull-through” strategy increases the power and brand equity of the supplier while weakening the position of the retailer. The dynamic is evident at JC Penney, which positions itself as a storefront for an entire portfolio of supplier brands. Whereas the retailer once owned the relationship with the consumer, that relationship is now shared between the retailer and its suppliers.

What this means for retail analytics: Our benchmark research shows retail has lagged behind other industries with respect to analytics. In the new co-opetition environment with suppliers, retailers must use analytics to compete. Their decreasing brand equity means that they need analytics not just for brand strategy and planning, but also in tactical areas such as merchandising and promotional management. At the same time, retailers are working with ever-increasing amounts of data that is often shared throughout the supply chain to build business cases and enrich the customer experience, and that data is ripe for analysis in service of business goals.

E-commerce is driving a convergence of offline and online retail consumer behavior, forcing change to a historically inert retail analytics culture. As we’ve all heard by now, online retailers such as Amazon threaten the business models of showroom retailers. Some old-line companies are dealing with the change by taking an “if you can’t beat ’em, join ’em” approach. Traditional brick-and-mortar company Walgreens, for instance, acquired Drugstore.com and put kiosks in its stores to let customers order out-of-stock items immediately at the same price. However, online retailers, instead of looking to move into a brick-and-mortar environment, are driving their business model back into the data center and forward onto mobile devices. Amazon, for instance, offers Amazon Web Services and the Kindle tablet.

What this means for retail analytics: Historically there has been a wall between the .com side of a company and the rest of the organization. Companies used mystery shoppers to check prices in physical trade areas and bots to do the same thing over the Internet. Now companies such as Sears are investing heavily to gain full digital transparency into the supply chain so that they can change pricing on the fly – a retailer may choose to undercut a competitor on a specific SKU, then, when its system finds a lack of competitor inventory for the item, automatically raise its price and its margin. Eventually the entire industry, including mid-tier retailers, will have to focus on how analytics can improve their business.
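
To illustrate the kind of rule this style of dynamic pricing implies, here is a simplified sketch: undercut the lowest in-stock competitor, but recover margin when competitors are out of stock. The competitor data, undercut percentage and margin floor are hypothetical.

```python
# Simplified repricing rule: undercut the lowest in-stock competitor price,
# but take full list price when no competitor has inventory, and never price
# below a margin floor. All inputs are hypothetical.
def reprice(our_cost, list_price, competitor_offers, undercut=0.02, floor_margin=0.10):
    in_stock = [price for price, available in competitor_offers if available]
    if in_stock:
        target = min(in_stock) * (1 - undercut)   # undercut the lowest rival
    else:
        target = list_price                       # no rivals in stock: full margin
    floor = our_cost * (1 + floor_margin)         # never sell below the margin floor
    return round(max(floor, min(target, list_price)), 2)

offers = [(24.99, True), (23.49, True), (22.99, False)]   # (price, in_stock)
print(reprice(our_cost=18.00, list_price=26.99, competitor_offers=offers))         # 23.02
print(reprice(our_cost=18.00, list_price=26.99, competitor_offers=[(22.99, False)]))  # 26.99
```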

Retailers are moving the focus of their strategy away from customer acquisition and toward customer retention. We see this change of focus both on the brick-and-mortar side, where loyalty card programs are becoming ubiquitous, and online via key technology enablers such as Google, whose I/O 2012 conference focused on the shift from online customer acquisition to online customer retention.

What this means for retail analytics: As data proliferates, businesses gain the ability to look more closely at how individuals contribute to a company’s revenue and profit. Traditional RFM and attribution approaches are becoming more precise as we move away from aggregate models and begin to look at particular consumer behavior. Analytics can help pinpoint changes in behavior that matter, and more importantly, indicate what organizations can do to retain desired customers or expand share-of-wallet. In addition, software to improve the customer experience within the context of a site visit is becoming more important. This sort of analytics, which might be called a type of online ethnography, is a powerful tool for improving the customer experience and increasing the stickiness of a retailer’s site.
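
For readers who want a concrete starting point, the sketch below shows a basic RFM scoring pass in Python: each customer is scored by quartile on recency, frequency and monetary value. The transaction data is hypothetical, and a real program would refine the cut-offs and segment definitions.

```python
# Basic RFM scoring with pandas on hypothetical transactions.
import pandas as pd

tx = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 3, 3, 4, 5, 5, 6],
    "order_date": pd.to_datetime([
        "2012-09-01", "2012-10-10", "2012-06-15", "2012-10-01", "2012-10-20",
        "2012-08-05", "2012-03-12", "2012-09-25", "2012-07-30", "2012-05-02"]),
    "amount": [120, 80, 45, 200, 150, 90, 30, 60, 75, 40],
})

now = pd.Timestamp("2012-10-31")
rfm = tx.groupby("customer_id").agg(
    recency=("order_date", lambda d: (now - d.max()).days),
    frequency=("order_date", "count"),
    monetary=("amount", "sum"),
)

# Quartile scores: lower recency is better; higher frequency/monetary are better.
rfm["r_score"] = pd.qcut(rfm["recency"], 4, labels=[4, 3, 2, 1]).astype(int)
rfm["f_score"] = pd.qcut(rfm["frequency"].rank(method="first"), 4, labels=[1, 2, 3, 4]).astype(int)
rfm["m_score"] = pd.qcut(rfm["monetary"], 4, labels=[1, 2, 3, 4]).astype(int)
rfm["rfm"] = rfm[["r_score", "f_score", "m_score"]].sum(axis=1)
print(rfm.sort_values("rfm", ascending=False))
```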

In sum, our research on retail analytics shows that outdated technological and analytical approaches still dominate the retail industry. At the same time, changes in the industry are forcing companies to rethink their strategies, and many are addressing these challenges by using analytics to attract and retain the most valued customers. For large firms the stakes are extremely high, and decisions about how to implement this strategy can determine not just profitability but potentially their future existence. Retail organizations need to consider investing in new approaches to analytics; for example, analytics provided via cloud computing and software as a service are becoming more pervasive and can help ensure that capabilities meet the needs of business roles. Such approaches are a step-function improvement over the Excel-based environments many retailers live in today.

Regards,

Tony Cosentino

Vice President and Research Director

At Oracle OpenWorld this week I focused on what the company is doing in business analytics, and in particular with its Exalytics In-Memory Machine. The Exalytics appliance is an impressive in-memory hardware approach to putting right-time analytics in the hands of end users, providing a full range of integrated analytic and visualization capabilities. Exalytics fits into the broader analytics portfolio by supporting the Oracle BI Foundation Suite, including OBIEE, Oracle’s formidable portfolio of business analytics and business performance applications, as well as interactive visualization and discovery capabilities.

Exalytics connects with an Exadata machine over a high-throughput InfiniBand link. The system leverages Oracle’s TimesTen In-Memory Database with columnar compression, originally built for online transaction processing in the telecommunications industry. Oracle’s Summary Advisor software provides a heuristic approach to making sure the right data is in memory at the right time, though customers at the conference mentioned that in their initial configuration they did not need to turn on Summary Advisor because they were able to load their entire datasets in memory. Oracle’s Essbase OLAP server is also integrated, but since it requires a MOLAP approach and computationally intensive pre-aggregation, one might question whether applications such as Oracle’s Hyperion planning and forecasting tools will truly provide analysis at the advertised “speed of thought.” To address this point, Oracle cites examples in the consumer packaged goods industry, where it has already demonstrated its ability to reduce complex planning scenario cycle times from 24 hours to four hours. Furthermore, I discussed with Paul Rodwick, Oracle’s vice president of product management for business intelligence, the potential for doing in-line planning, where integrated business planning and complex what-if modeling can be done on the fly. For more on this particular topic, please check out my colleague’s benchmark research on the fast, clean close.

Another important part of the Exalytics appliance is the Endeca discovery tool. Endeca, which Oracle acquired just a year ago, provides an exploratory, interactive approach to root-cause analysis and problem-solving without the traditional struggle of complex data modeling. It does this through a search technology that leverages key-value pairings in unstructured data, thereby deriving structure in the form of descriptive statistics such as recency and frequency. This type of tool democratizes analytics in an organization and puts power into the hands of line-of-business managers. The ability to navigate across data was the top-ranked business capability in our business analytics benchmark research. However, while Endeca is a discovery and analysis tool for unstructured and semi-structured data, it does not provide full sentiment analysis or deeper text analytics, as do tools such as IBM’s SPSS Text Analytics and SAS Social Conversation Center. For more advanced analytics and data mining, Oracle integrates its version of R for the enterprise on its Exadata Database Machine and its Big Data Appliance, Oracle’s integrated Hadoop approach based on the Cloudera stack.
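
As a rough illustration of the general idea behind this kind of faceted discovery, the sketch below derives simple structure (term frequency and recency) from semi-structured text records. It is a toy example, not Endeca’s engine or index format.

```python
# Derive term frequency and last-seen dates from hypothetical support notes,
# the sort of descriptive structure a discovery tool surfaces as facets.
from collections import Counter
from datetime import date

records = [  # (date, free-text note) -- hypothetical
    (date(2012, 10, 1), "battery drains fast after update"),
    (date(2012, 10, 5), "screen cracked, battery replaced"),
    (date(2012, 10, 9), "update failed, screen flickers"),
]

term_freq, term_last_seen = Counter(), {}
for seen_on, text in records:
    for term in set(text.replace(",", "").split()):
        term_freq[term] += 1
        term_last_seen[term] = max(term_last_seen.get(term, seen_on), seen_on)

for term, freq in term_freq.most_common(5):
    print(f"{term:10s} frequency={freq} last_seen={term_last_seen[term]}")
```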

At the conference, Oracle announced a few updates for Exalytics. The highlights include a new release of Oracle Business Intelligence 11.1.1.6.2BP1 optimized for Exalytics. The release includes new mobility and visualization capabilities, including trellis visualizations that allow users to see a large number of charts in a single view. These advanced capabilities are enabled on mobile devices through the increased speed provided by Exalytics. In addition, Endeca is now certified with Oracle Exalytics, and the TimesTen database has been certified with GoldenGate and Oracle Data Integrator, allowing the system and users to engage in more event-driven analytics and closed-loop processes. Finally, Hyperion Planning is certified on Exalytics.

On the upside, Oracle’s legacy platform support on Exalytics allows the company to leverage its entire portfolio and offer more than 80 prebuilt analytics applications that run on Exalytics without any changes. The platform also supports the full range of the Oracle BI Foundation Suite and provides a common platform and a common dimensional model. In this way, it aligns with overall business processes, content management systems and transactional BI systems. This alignment is especially attractive for companies that have a lot of Oracle software already installed and for companies looking to operationalize business analytics through event-based, closed-loop decision-making at the front lines of the organization. The speed of the machine, its near-real-time query performance, and its ability to deliver complex visualizations to mobile devices allow users to create use cases and ROI scenarios they could not before. For example, the San Diego Unified School District used Exalytics to push out performance dashboards across multiple device types to students and teachers, thereby increasing attendance and garnering more money from the state. The Endeca software lets users make qualitative and, to some degree, quantitative assessments from social data and overlay them with structured data. The fact that Exalytics comprises an integrated hardware and software stack makes it a turnkey solution that does not require expensive systems integration services. Each of these points makes Exalytics an interesting and even exciting investment possibility for companies.

On the downside, the integrated approach and vendor lock-in may discourage some companies concerned about Oracle’s high switching costs. This may pose a risk for Oracle as maturing alternatives become available and the economics of switching begin to make more sense. However, it is no easy task to switch away from Oracle, especially for companies with large Oracle database and enterprise application rollouts. The economics of loyalty are such that when customers are dissatisfied but captive, they remain loyal; as soon as a viable alternative comes on the market, however, they quickly defect. This defection threat for Oracle could come from dissatisfaction with configuration and ease-of-use issues, in combination with new offerings from large vendors such as SAP, with its HANA in-memory database appliance, and from hardware companies such as EMC that are moving up the stack into the software environment. To be fair, Oracle is addressing the usability issues by moving the user experience away from one-size-fits-all toward more of a persona- or role-based interface. Oracle will likely have time to do this, given the long tail of its existing database entrenchment and the breadth and depth of its analytics application portfolio.

With Exalytics, Oracle is addressing high market demand for right-time analytics and interactive visualization. I expect the appliance will continue to sell well, especially since it is plug-and-play and Oracle has such a large portfolio of analytic applications. For companies already running Oracle databases and Oracle applications, Exalytics is very likely a good investment. In particular, it makes sense for those that have not fully leveraged the power of analytics in their business and are therefore losing competitive position. Our benchmark research on analytics shows relatively low penetration of analytics software today, but investments are accelerating due to the large ROI companies are realizing from analytics. Where the payoffs are less obvious, or where lock-in is a concern, companies should take a hard look at exactly what they hope to accomplish with analytics and which tools best suit that need. The most successful companies start with the use case to justify the investment, then determine which technology makes the most sense.

In addition to this blog, please see Ventana Research’s broader coverage of Oracle OpenWorld.

Regards,

Tony Cosentino

VP and Research Director

IBM acquired SPSS in late 2009 and has been investing steadily in the business as a key component of its overall business analytics portfolio. Today, IBM SPSS provides an integrated approach to predictive analytics through four distinct software packages: SPSS Data Collection, SPSS Statistics, SPSS Modeler and SPSS Decision Management. IBM SPSS is also integrated with Cognos Insight, IBM’s entry into the visual discovery arena.

Our benchmark research into predictive analytics shows that companies are struggling with two core issues: a skills shortage related to predictive analytics and integration of predictive analytics into their information architecture. A preliminary look at IBM’s SPSS software makes it obvious to me that IBM is putting its full weight behind addressing both of these issues.

Predictive analytics is a hot term in business today, but there is still some debate about what it means. My blog entry on predictive analytics discusses findings from our research and the idea that the lines between predictive and descriptive analytics are becoming blurred. IBM provides an interesting take on this conversation by discussing predictive analytics in the context of data mining and statistics: it sees data mining as more bottom-up and exploratory in nature (though it can also be predictive) and statistics as a more top-down, hypothesis-driven approach (though it can also use descriptive techniques).

If you use SPSS Modeler, you don’t have to be a data scientist to participate in predictive analytics discussions. Once data is loaded into the tool and a few preliminary questions are answered about what you are trying to do, SPSS Modeler presents a choice of analytical techniques, such as CHAID, CART and others, and suggests the best approach based on multiple variables, such as the number of fields or predictive power. This is a big deal because business managers often have some idea of clustering, regression and cause-and-effect-type functions, but they don’t necessarily know the intricacies of the different techniques. With SPSS Modeler you don’t have to know all of those details to participate in these important discussions. SPSS Modeler can bridge the gap between statisticians and day-to-day line-of-business (LOB) analysts and decision-makers, and thus help close the analytics skills gap facing organizations today.
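
SPSS Modeler does this through its own interface, but purely to illustrate the underlying idea, here is a sketch that compares candidate techniques, including a CART-style decision tree, on cross-validated predictive power using scikit-learn and synthetic data. These are stand-in models, not IBM’s implementations.

```python
# Compare candidate modeling techniques on cross-validated AUC and rank them,
# the kind of "suggest the best approach" step a workbench automates.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier          # CART-style tree
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=2000, n_features=12, n_informative=5,
                           random_state=7)

candidates = {
    "CART-style decision tree": DecisionTreeClassifier(max_depth=5),
    "Logistic regression":      LogisticRegression(max_iter=1000),
    "Random forest":            RandomForestClassifier(n_estimators=100),
}

scores = {name: cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
          for name, model in candidates.items()}
for name, auc in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name:28s} AUC = {auc:.3f}")
```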

Another challenge for organizations is integrating multiple streams of data, including attitudinal data. Built-in survey data collection in SPSS Data Collection can fill in these blanks for analysts. Sometimes behavioral data reveals knowledge gaps that can be filled only with direct perceptual feedback from stakeholders, collected through a survey instrument. Trying to tell a story with only behavioral data can be like trying to guess the contents of a file based only on metadata descriptors such as file size, type, and when and how often the file was accessed. Similarly, social media data may provide some of the context, but it does not always give direct answers. We see LOB initiatives bringing together multiple streams of data, including attitudinal data such as brand perceptions or customer satisfaction. The data collection functionality allows managers, within the context of broader analytics initiatives, to bring such data directly into their models and even to do scoring for things such as customer or employee churn.
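
Below is a minimal sketch of the underlying data step: joining attitudinal survey responses to behavioral records so both streams can feed a single churn indicator. The column names and data are hypothetical; in practice SPSS Data Collection and Modeler handle this without code.

```python
# Join hypothetical survey (attitudinal) data to behavioral data and derive a
# crude at-risk flag that uses both streams.
import pandas as pd

behavior = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "logins_90d": [42, 3, 18, 0],
    "support_tickets": [1, 4, 0, 6],
})
survey = pd.DataFrame({
    "customer_id": [1, 2, 4],
    "satisfaction": [9, 4, 2],       # 0-10 survey rating
    "would_recommend": [1, 0, 0],
})

# Left join keeps customers who never answered the survey (satisfaction = NaN),
# a gap behavioral data alone cannot explain.
merged = behavior.merge(survey, on="customer_id", how="left")

# Illustrative churn-risk flag combining behavior and attitude.
merged["at_risk"] = ((merged["logins_90d"] < 5) |
                     (merged["satisfaction"].fillna(5) <= 4))
print(merged)
```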

I have not yet discussed IBM SPSS integration with decision systems and the idea of moving from the “so what” of analytics to the “now what” of decision-making. This is a critical component of a company’s analytics agenda, since operationalizing analytics necessitates that the model outcomes be pushed out to organizations’ front lines and then updated in a closed-loop manner. Such analytics are more and more often seen as a competitive advantage in today’s marketplace – but this is a separate discussion that I will address in a future blog entry.

SPSS has a significant presence across multiple industries, but it is ubiquitous in academia and in the market research industry. The market research industry is a particularly interesting foothold for IBM, as the market is estimated to be worth more than $30 billion globally, according to the Council of American Survey Research Organizations. By leveraging IBM and SPSS, companies gain access to a new breed of market research that merges forward-looking attitudinal data streams with behavioral data streams. The academic community’s loyalty to SPSS gives it an advantage similar to the one Apple enjoyed when it dominated academic institutions with the Macintosh computer. As people graduate familiar with certain platforms, they carry that loyalty with them into the business world. As spreadsheets are phased out as the primary modeling tool due to their limitations, IBM can capitalize on the change with continued investments in institutions of higher learning.

Companies looking to compete based on analytics should almost certainly consider IBM SPSS. This is especially true of companies that are looking to merge LOB expertise with custom analytical approaches, but that don’t necessarily want to write custom applications to accomplish these goals.

Regards,

Tony Cosentino

VP and Research Director
