
The emerging Internet of Things (IoT) extends digital connectivity to devices and sensors in homes, businesses, vehicles and potentially almost anywhere. This innovation enables devices designed for it to generate and transmit data about their operations; analytics using this data can facilitate monitoring and a range of automatic functions.

To perform these functions IoT requires what Ventana Research calls Operational Intelligence (OI), a discipline that has evolved from the capture and analysis of instrumentation, networking and machine-to-machine interactions of many types. We define operational intelligence as a set of event-centered information and analytic processes operating across an organization that enable people to use that event information to take effective actions and make optimal decisions. Our benchmark research into Operational Intelligence shows that organizations most often want to use such event-centric architectures for defining metrics (37%) and assigning thresholds for alerts (35%) and for more action-oriented processes of sending notifications to users (33%) and linking events to activities (27%).
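To make that event-centric pattern concrete, here is a minimal sketch of the kind of rule such an architecture automates: a threshold is assigned to a metric, and a notification fires when an incoming event exceeds it. The metric names, thresholds and notification function are hypothetical illustrations rather than any particular product's API.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Event:
    source: str   # e.g., a device or sensor identifier
    metric: str   # e.g., "temperature_c" (hypothetical metric name)
    value: float

def notify(message: str) -> None:
    # Stand-in for a real notification channel (email, pager, dashboard alert).
    print(f"ALERT: {message}")

# Thresholds assigned per metric; exceeding one triggers a notification.
THRESHOLDS = {"temperature_c": 80.0, "vibration_mm_s": 12.5}

def handle(event: Event, on_alert: Callable[[str], None] = notify) -> None:
    limit = THRESHOLDS.get(event.metric)
    if limit is not None and event.value > limit:
        # Linking the event to an activity: here we only notify, but this is
        # where a workflow could be started or a ticket opened.
        on_alert(f"{event.source} {event.metric}={event.value} exceeds {limit}")

handle(Event(source="pump-07", metric="temperature_c", value=85.2))
```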

In many industries, organizations can gain competitive advantage if they can reduce the elapsed time between an event occurring and actions taken or decisions made in response to it. Existing business intelligence (BI) tools provide useful analysis of and reporting on data drawn from previously recorded transactions, but to improve competitiveness and maximize efficiencies organizations are concluding that employees and processes – in IT, business operations and front-line customer sales, service and support – also need to be able to detect and respond to events as they happen. Our research into big data integration shows that nearly one in four companies currently integrate data into big data stores in real time. The challenge is to go further and act upon both the data that is stored and the data that is streaming in a timely manner.

The evolution of operational intelligence, especially in conjunction with IoT, is encouraging companies to revisit their priorities and spending for information technology and application management. However, sorting out the range of options poses a challenge for both business and IT leaders. Some see potential value in expanding their network infrastructure to support OI. Others are implementing event processing (EP) systems that employ new technology to detect meaningful patterns, anomalies and relationships among events. Increasingly, organizations are using dashboards, visualization and modeling to notify nontechnical people of events and enable them to understand their significance and take appropriate and immediate action.

As with any innovation, using OI for IoT may require substantial changes. These are among the challenges organizations face as they consider adopting operational intelligence:

  • They find it difficult to evaluate the business value of enabling real-time sensing of data and event streams using identification tags, agents and other systems embedded not only in physical locations like warehouses but also in business processes, networks, mobile devices, data appliances and other technologies.
  • They lack an IT architecture that can support and integrate these systems as the volume and frequency of information increase.
  • They are uncertain how to set reasonable business and IT expectations, priorities and implementation plans for important technologies that may conflict or overlap. These can include business intelligence, event processing, business process management, rules management, network upgrades and new or modified applications and databases.
  • They don’t understand how to create a personalized user experience that enables nontechnical employees in different roles to monitor data or event streams, identify significant changes, quickly understand the correlation between events and develop a context in which to determine the right decisions or actions to take.

Ventana Research has announced new benchmark research on The Internet of Things and Operational Intelligence that will identify trends and best practices associated with this technology and these processes. It will explore organizations’ experiences with initiatives related to events and data and with attempts to align IT projects, resources and spending with new business objectives that demand real-time intelligence and event-driven architectures. The research will investigate how organizations are increasing their responsiveness to events by rebalancing the roles of networks, applications and databases to reduce latency; it also will explore ways in which they are using sensor data and alerts to anticipate problematic events. We will benchmark the performance of organizations’ implementations, including IoT, event stream processing, event and activity monitoring, alerting, event modeling and workflow, and process and rules management.

As operational intelligence evolves as the core of IoT platforms, it is an important time to take a closer look at this emerging opportunity and challenge. For those interested in learning more or becoming involved in this upcoming research, please let me know.

Regards,

Ventana Research

At its annual industry analyst summit last month and in a more recent announcement of enterprise support for parallelizing the R language on its Aster Discovery Platform, Teradata showed that it is adapting to changes in database and analytics technologies. The presentations at the conference revealed a unified approach to data architectures and value propositions in a variety of uses including the Internet of Things, digital marketing and ETL offloading. In particular, the company provided updates on the state of its business as well as how the latest version of its database platform, Teradata 15.0, is addressing customers’ needs for big data. My colleague Mark Smith covered these announcements in depth. The introduction of scalable R support was discussed at the conference but not announced publicly until late last month.

Teradata now has a beta release of parallelized support for R, an open source programming language used significantly in universities and growing rapidly in enterprise use. One challenge is that R relies on a single-threaded, in-memory approach to analytics. Parallelizing R allows algorithms to run on much larger data sets, since they are no longer limited to the data that fits in a single machine's memory. For a broader discussion of the pros and cons of R and its evolution, see my analysis. Our benchmark research shows that organizations are counting on companies such as Teradata to provide a layer of abstraction that can simplify analytics on big data architectures. More than half (54%) of advanced analytics implementations are custom built, but in the future this percentage will go down to about one in three (36%).

Teradata’s R project has three parts. The first includes a Teradata Aster R library, which supplies more than 100 prebuilt R functions that hide complexity of the in-database implementation. The algorithms cover the most common big data analytic approaches in use today, which according to our big data analytics benchmark research are classification (used by 39% of organizations), clustering (37%), regression (35%), time series (32%) and affinity analysis (29%). Some use innovative approaches available in Aster such as Teradata’s patented nPath algorithm, which is useful in areas such as digital marketing. All of these functions will receive enterprise support from Teradata, likely through its professional services team.

The second part of the project involves the R parallel constructor. This component gives analysts and data scientists tools to build their own parallel algorithms based on the entire library of open source R algorithms. The framework follows the “split, apply and combine” paradigm, which is popular among the R community. While Teradata won’t support the algorithms themselves, this tool set is a key innovation that I have not yet seen from others in the market.
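Teradata has not published the constructor's API in this piece, but the "split, apply and combine" idea it follows can be illustrated with a short, hypothetical Python sketch: partition the data by a key, apply a function to each partition independently (and therefore in parallel), then combine the partial results.

```python
from concurrent.futures import ProcessPoolExecutor
from collections import defaultdict

def split(rows, key):
    """Partition rows by a key column (the 'split' step)."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[key]].append(row)
    return groups

def mean_spend(rows):
    """Per-partition computation (the 'apply' step)."""
    values = [r["spend"] for r in rows]
    return sum(values) / len(values)

def split_apply_combine(rows, key):
    groups = split(rows, key)
    # Each partition is independent, so the work can run in parallel.
    with ProcessPoolExecutor() as pool:
        results = pool.map(mean_spend, groups.values())
    # The 'combine' step: reassemble partial results into one answer.
    return dict(zip(groups.keys(), results))

if __name__ == "__main__":
    data = [
        {"segment": "A", "spend": 120.0},
        {"segment": "A", "spend": 80.0},
        {"segment": "B", "spend": 200.0},
    ]
    print(split_apply_combine(data, key="segment"))
```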

Finally, the R engine has been integrated with Teradata’s SNAP integration framework. The framework provides unified access to multiple workload specific engines such as relational (SQL), graph (SQL-GR), MapReduce (SQL-MR) and statistics. This is critical since the ultimate value of analytics rests in the information itself. By tying together multiple systems, Teradata enables a variety of analytic approaches. More importantly, the data sources that can be merged into the analysis can deliver competitive advantages. For example, JSON integration, recently announced, delivers information from a plethora of connected devices and detailed Web data.
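As an illustration of why JSON support matters for device data, the hedged sketch below flattens a nested JSON reading into the flat rows an analytic engine typically expects; the payload and field names are hypothetical and not Teradata's schema.

```python
import json

# A hypothetical device payload; real connected-device messages vary widely.
payload = """
{"device": "thermostat-42",
 "readings": [
   {"ts": "2014-05-01T10:00:00Z", "temp_c": 21.5},
   {"ts": "2014-05-01T10:05:00Z", "temp_c": 22.1}
 ]}
"""

def flatten(doc):
    """Turn one nested device document into flat, analysis-ready rows."""
    return [
        {"device": doc["device"], "ts": r["ts"], "temp_c": r["temp_c"]}
        for r in doc["readings"]
    ]

rows = flatten(json.loads(payload))
print(rows)
```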

Teradata is participating in industry discussions about both data management and analytics. As Mark Smith discussed, its unified approach to data architecture addresses challenges brought on by competing big data platforms such as Hadoop and by other NoSQL approaches, like the recently announced MongoDB partnership supporting JSON integration. These platforms access new information sources and help companies use analytics to indirectly increase revenues, reduce costs and improve operational efficiency. Analytics applied to big data serve a variety of uses, most often cross-selling and up-selling (for 38% of organizations), better understanding of individual customers (32%) and optimizing price (30%) and IT operations (24%). Teradata is active in these areas and is working in multiple industries such as financial services, retail, healthcare, communications, government, energy and utilities.

Current Teradata customers should evaluate the company’s broader analytic and platform portfolio, not just the database appliances. In the fragmented and diverse big data market, Teradata is sorting through the chaos to provide a roadmap for organizations from the largest down to the midsized. The Aster Discovery Platform can put power into the hands of analysts and statisticians who need not be data scientists. Business users from various departments, but especially high-level marketing groups that need to integrate multiple data sources for operational use, should take a close look at the Teradata Aster approach.

Regards,

Tony Cosentino

VP & Research Director

SAP recently presented its analytics and business intelligence roadmap and new innovations to about 1,700 customers and partners using SAP BusinessObjects at its SAP Insider event (#BI2014). SAP has one of the largest presences in business intelligence due to its installed base of SAP BusinessObjects customers. The company intends to defend its current position in the established business intelligence (BI) market while expanding in the areas of databases, discovery analytics and advanced analytics. As I discussed a year ago, SAP faces an innovator’s dilemma in parts of its portfolio, but it is working aggressively to get ahead of competitors.

One of the pressures that SAP faces is from a new class of software that is designed for business analytics and enables users to visualize and interact with data in new ways without relationships in the data being predefined. Our business technology innovation research shows that analytics is the top-ranked technology innovation in business today, rated first by 39 percent of organizations. In conventional BI systems, data is modeled in so-called cubes or other defined structures that allow users to slice and dice data quickly and easily. The cube structure solves the problem of abstracting the complexity of the structured query language (SQL) of the database and slashes the amount of time it takes to read data from a row-oriented database. However, as the cost of memory decreases significantly, enabling the use of new column-oriented databases, these methods of BI are being challenged. For SAP and other established business intelligence providers, this situation represents both an opportunity and a challenge. In responding, almost all of these BI companies have introduced some sort of visual discovery capability. SAP introduced SAP Lumira, formerly known as Visual Intelligence, 18 months ago to compete in this emerging segment, and it has gained traction in terms of downloads, which the company estimated at 365,000 in the fourth quarter of 2013.
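A toy example may help show why cheap memory and column orientation challenge the cube approach: when each attribute is stored as its own vector, an aggregate touches only the column it needs instead of scanning every field of every row. This is a conceptual sketch, not a description of any vendor's engine.

```python
# Toy contrast between row-oriented and column-oriented layouts.

rows = [  # row-oriented: each record stored together
    {"region": "East", "product": "X", "revenue": 100.0},
    {"region": "West", "product": "Y", "revenue": 250.0},
    {"region": "East", "product": "Y", "revenue": 175.0},
]

columns = {  # column-oriented: each attribute stored together
    "region": ["East", "West", "East"],
    "product": ["X", "Y", "Y"],
    "revenue": [100.0, 250.0, 175.0],
}

# Row layout: every field of every row is read to total one measure.
total_from_rows = sum(r["revenue"] for r in rows)

# Column layout: only the revenue vector is scanned.
total_from_columns = sum(columns["revenue"])

assert total_from_rows == total_from_columns == 525.0
```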

SAP and other large players in analytics are trying not just to catch up with visual discovery players such as Tableau but to make it a game of leapfrog. Toward that end, the capabilities of Lumira demonstrated at the Insider conference included information security and governance, advanced analytics, integrated data preparation, storyboarding and infographics; the aim is to create a differentiated position for the tool. For me, the storyboarding and infographics capabilities are about catching up, but being able to govern and secure today’s analytic platforms is a critical concern for organizations, and SAP means to capitalize on it. A major analytic announcement at the conference focused on the integration of Lumira with the BusinessObjects platform. Lumira users now can create content and save it to the BusinessObjects server, mash up data and deliver the results through a secure common interface.

Beyond the integration of security and governance with discovery analytics, the leapfrog approach centers on advanced analytics. SAP’s acquisition last year of KXEN and its initial integration with Lumira provide an advanced analytics tool that does not require a data scientist to use it. My coverage of KXEN prior to the acquisition revealed that the tool was user-friendly and broadly applicable especially in the area of marketing analytics. Used with Lumira, KXEN will ultimately provide front-end integration for in-database analytic approaches and for more advanced techniques. Currently, for data scientists to run advanced analytics on large data sets, SAP provides its own predictive analytic library (PAL), which runs natively on SAP HANA and offers commonly used algorithms such as clustering, classification and time-series. Integration with the R language is available through a wrapper approach, but the system overhead is greater when compared to the PAL approach on HANA.

The broader vision for Lumira and the BusinessObjects analytics platform is what SAP calls “collective intelligence,” which it described as “a Wikipedia for business” that provides a bidirectional analytic and communication platform. To achieve this lofty goal, SAP will need to continue to put resources into HANA and facilitate the integration of underlying data sources. Our recently released research on big data analytics shows that being able to analyze data from all data sources (selected by 75% of participants) is the most prevalent definition for big data analytics. To this end, SAP announced the idea of an “in-memory fabric” that allows virtual data access to multiple underlying data sources including big data platforms such as Hadoop. The key feature of this data federation approach is what the company calls smart data access (SDA). Instead of loading all data into memory, the virtualized system sets a proxy that points to where specific data is held. Using machine learning algorithms, it can determine how important information is based on the query patterns of users and load the most important data into memory. The approach will enable the company to analyze data on a massive scale, since it utilizes both HANA and the Sybase IQ columnar database, which the company says was just certified as the world’s largest data warehouse, at more than 12 petabytes. Others such as eBay and Teradata may beg to differ with that result based on another implementation, but nevertheless it is an impressive achievement.
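SAP has not detailed SDA's internals here, but the general idea — observe query patterns and promote only the most frequently requested data into memory, proxying the rest to its source — can be sketched as follows. The class, data set names and simple frequency counting (standing in for the machine learning SAP describes) are hypothetical.

```python
from collections import Counter

class FederatedStore:
    """Toy federation layer: data stays in remote sources unless promoted."""

    def __init__(self, remote_sources, capacity=2):
        self.remote = remote_sources   # e.g., {"sales_2013": [...], "weblogs": [...]}
        self.in_memory = {}            # promoted, fast-path data sets
        self.query_counts = Counter()
        self.capacity = capacity

    def query(self, dataset):
        self.query_counts[dataset] += 1
        self._rebalance()
        if dataset in self.in_memory:
            return self.in_memory[dataset]   # served from memory
        return self.remote[dataset]          # proxied to the underlying source

    def _rebalance(self):
        # Promote the most frequently queried data sets up to capacity;
        # a real system would use richer signals than raw counts.
        hot = [name for name, _ in self.query_counts.most_common(self.capacity)]
        self.in_memory = {name: self.remote[name] for name in hot}

store = FederatedStore({"sales_2013": [1, 2, 3], "weblogs": [4, 5, 6]})
for _ in range(3):
    store.query("sales_2013")
print("in memory:", list(store.in_memory))
```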

Another key announcement was SAP Business Warehouse (BW) 7.4, which now runs on top of HANA. This combination is likely to be popular because it enables migration of the underlying database without impacting business users. Such users store many of their KPIs and complex calculations in BW, and to uproot this system is untenable for many organizations. SAP’s ability to continue support for these users is therefore something of an imperative. The upgrade to 7.4 also provides advances in capability and usability. The ability to do complex calculations at the database level without impacting the application layer enables much faster time-to-value for SAP analytic applications. Relative to the in-memory fabric and SDA discussed above, BW users no longer need intimate knowledge of HANA SDA. The complete data model is now exposed to HANA as an information cube object, and HANA data can be reflected back into BW. To back it up, the company offered testimony from users. Representatives of Molson Coors said their new system took only a weekend to move into production (after six weeks of sandbox experiments and six weeks of development) and enables users to perform right-time financial reporting, rapid prototyping and customer sentiment analysis.

SAP’s advancements and portfolio expansion are necessary for it to continue in a leadership position, but the inherent risk is confusion among its customer and prospect base. SAP published its last statement of direction for analytic dashboards about this time last year, and according to company executives, it will be updated fairly soon, though they would not specify when. The many tools in the portfolio include Web Intelligence, Crystal Reports, Explorer, Xcelsius and now Lumira. SAP and its partners position the portfolio as a toolbox in which each tool is meant to solve a different organizational need. There is overlap among them, however, and the inherent complexity of the toolbox approach may not resonate well with business users who desire simplicity and timeliness.

SAP customers and others considering SAP should carefully examine how well these tools match the skills in their organizations. We encourage companies to look at the different organizational roles as analytic personas and try to understand which constituencies are served by which parts of the SAP portfolio. For instance, one of the most critical personas going forward is the Designer role, since usability is the top priority for organizational software according to our next-generation business intelligence research. Yet this role may become more difficult to fill over time, since trends such as mobility continue to add to the job requirements. SAP’s recent upgrade of Design Studio to address emerging needs such as mobility and mobile device management (MDM) may force some organizations to rebuild dashboards and expand their designer skill sets to include JavaScript and Cascading Style Sheets, but the ability to deliver multifunctional analytics across devices in a secure manner is becoming paramount. I note that SAP’s capabilities in this regard helped it score third overall in our 2014 Mobile Business Intelligence Value Index. Other key personas are the knowledge worker and the analyst. Our data analytics research shows that while SQL and Excel skills are abundant in organizations, statistical and mathematical skills are less common. SAP’s integration of KXEN into Lumira can help organizations develop these personas.

SAP is pursuing an expansive analytic strategy that includes not just traditional business intelligence but databases, discovery analytics and advanced analytics. Any company that has SAP installed, especially those with BusinessObjects or an SAP ERP system, should consider the broader analytic portfolio and how it can meet business goals. Even for new prospects, the portfolio can be compelling, and as the roadmap centered on Lumira develops, SAP may be able to take that big leap in the analytics market.

Regards,

Tony Cosentino

VP and Research Director

As a new generation of business professionals embraces a new generation of technology, the line between people and their tools begins to blur. This shift comes as organizations become flatter and leaner and roles, context and responsibilities become intertwined. These changes have introduced faster and easier ways to bring information to users, in a context that makes it quicker to collaborate, assess and act. Today we see this in the prominent buying patterns for business intelligence and analytics software and an increased focus on the user experience. Almost two-thirds (63%) of participants in our benchmark research on next-generation business intelligence say that usability is the top purchase consideration for business intelligence software. In fact, usability is the driving factor in evaluating and selecting technology across all application and technology areas, according to our benchmark research.

In selecting and using technology, personas (that is, an idealized cohort of users) are particularly important, as they help business and IT assess where software will be used in the organization and define the role, responsibility and competency of users and the context of what they need and why. At the same time, personas help software companies understand the attitudinal, behavioral and demographic profile of target individuals and the specific experience that is not just attractive but essential to those users. For example, the mobile and collaborative intelligence capabilities needed by a field executive logging in from a tablet at a customer meeting are quite different from the analytic capabilities needed by an analyst trying to understand the causes of high customer churn rates and how to change that trend with a targeted marketing campaign.

Understanding this context-driven user experience is the first step toward defining the personas found in today’s range of analytics users. The key is to make the personas simple to understand but comprehensive enough to cover the diversity of needs for business analytic types within the organization. To help organizations be more effective in their analytic process and engagement of their resources and time, we recommend the following five analytical personas: (Note that in my years of segmentation work, I’ve found that the most important aspects are the number of segments and the names of those segments. To this end, I have chosen a simple number, five, and the most intuitive names I could find to represent each persona.)

Information Consumer: This persona is not technologically savvy and may even feel intimidated by technology. Information must be provided in a user-friendly fashion to minimize frustration. These users may rely on one or two tools that they use just well enough to do their jobs, which typically involves consuming information in presentations, reports, dashboards or other forms that are easy to read and interpret. They are oriented more to language than to numbers and in most cases would rather read or listen to information about the business. They can write a pertinent memo or email, make a convincing sales pitch or devise a brilliant strategy. Their typical role within the organization varies, but among this group is the high-ranking executive, including the CEO, for whom information is prepared. In the lines of business, this consumer may be a call center agent, a sales manager or a field service worker. In fact, in many companies, the information consumer makes up the majority of the organization. The information consumer usually can read Excel and PowerPoint documents but rarely works within them. This persona feels empowered by consumer-grade applications such as Google, Yelp and Facebook.

Knowledge Worker: Knowledge workers are business-, technology- and data-savvy and have domain knowledge. They interpret data in functional ways. These workers understand descriptive data but are not likely to take on data integration tasks or interpret advanced statistics (as in a regression analysis). In terms of tools, they can make sense of spreadsheets and, with minimal training, use the output of tools like business intelligence systems, pivot tables and visual discovery tools. They also actively participate in providing feedback and input to planning and business performance software. Typically, these individuals are over their heads when they are asked to develop a pivot table or structure multidimensional data. In some instances, however, new discovery tools allow them to move beyond such limitations. The knowledge worker persona includes, but is not limited to, technology-savvy executives, line-of-business managers and directors, domain experts and operations managers. Since these workers focus on decision-making and business outcomes, analytics is an important part of their overall workflow but targeted at specific tasks. For analytical tools this role may use applications with embedded analytics, analytic discovery and modeling approaches. Visual discovery tools and, in many instances, user-friendly SaaS applications are empowering the knowledge worker to be more analytically driven without IT involvement.

Analyst: Well versed in data, this persona often knows the business intelligence and analytics tools that pertain to the position and applies analytics to analyze various aspects of the business. These users are familiar with applications and systems and know how to retrieve and assemble data from them in many forms. They can also perform a range of data blending and data preparation tasks, and create dashboards, data visualizations and pivot tables with minimal or no training. They can interpret many types of data, including correlation and in some cases regression. The analyst’s role involves modeling and analytics, either within specific analytic software or within software used for business planning and enterprise performance management. More senior analysts focus on more advanced analytics, such as predictive analytics and data mining, to understand current patterns in data and predict future outcomes. These analysts might be called a split persona in terms of where their skills and roles are deployed in the organization. They may reside in IT, but many more are found on the business side, as they are accountable for the business outcomes the analytics are tied to. Analysts on the business side may not be expert in SQL or computer programming but may be adept with languages such as R or SAS. Those on the IT side are more familiar with SQL and the building of data models used in databases. With respect to data preparation, the IT organization looks at integration through the lens of ETL and associated tool sets, whereas the business side looks at it from a data-merge perspective and the creation of analytical data sets in places like spreadsheets.

The roles that represent this persona often are explicitly called analysts with a prefix that in most cases is representative of the department they work from, such as finance, marketing, sales or operations but could have prefixes like corporate, customer, operational or other cross-departmental responsibilities. The analytical tools they use almost always include the spreadsheet, as well as complementary business intelligence tools and a range of analytical tools like visual discovery and in some cases more advanced predictive analytics and statistical software. Visual discovery and commodity modeling approaches are empowering some analyst workers to move upstream from a role of data monger to a more interpretive decision support position. For those already familiar with advanced modeling, today’s big data environments, including new sources of information and modern technology, are providing the ability to build much more robust models and solve an entirely new class of business problems.

Publisher: Skilled in data and analytics, the publisher typically knows how to configure and operate business intelligence tools and publish information from them in dashboards or reports. They are typically skilled in the basics of spreadsheets and publishing information to Microsoft Word or PowerPoint tools. These users not only can interpret many types of analytics but can also build and validate the data for their organizations. Similar to the analyst, the publisher may be considered a split persona, as these individuals may be in a business unit or IT. The IT-based publisher is more familiar with the business intelligence processes and knows the data sources and how to get to data from the data warehouse or even big data sources. They may have basic configuration and scripting skills that enable them to produce outputs in several ways. They may also have basic SQL and relational data modeling skills that help them identify what can be published and determine how data can be combined through the BI tool or databases. The titles related to publisher may include business intelligence manager, data analyst, or manager or director of data or information management. The common tools used by the publisher include business intelligence authoring tools, various visualization and analytic tools, and office productivity tools like Microsoft Office and Adobe Acrobat.

Data Geek: The data geek – a data analyst or potentially someone as sophisticated as a data scientist – has expert data management skills and an interdisciplinary approach to data that melds the split personas discussed at the analyst and senior analyst levels. The primary difference between the data geek and the analyst is that the latter usually focuses on either the IT side or the business side. A senior analyst with a Ph.D. in computer science understands relational data models and programming languages but may not understand advanced statistical models and statistical programming languages. Similarly, a Ph.D. in statistics understands advanced predictive models and associated tools but may not be prepared to write computer code. The data scientist understands both advanced statistics and modeling as well as enough about computer programming and systems, along with domain knowledge. The titles for this role vary but include chief analytics officer, enterprise data architect, data analyst, head of information science and even data scientist.

To align analytics and the associated software to individuals in the organization, businesses should use personas to best identify who needs what set of capabilities to be effective. Organizations should also assess competency levels in their personas to avoid adopting software that is too complicated or difficult to use. In some cases individuals will fit multiple personas. Instead of wasting time, resources and financial capital, look to define what is needed and where training is needed to ensure business and IT work collaboratively in business analytics. While some business analytics software is getting easier to use, many of the offerings are still difficult to use because they are still being designed for IT or more sophisticated analysts. While these individuals are an important group, they represent only a small portion of the users who need analytic business tools.

The next generation of business intelligence and business analytics will in part address the need to more easily consume information through smartphones and tablets but will not overcome one of the biggest barriers to big data analytics: the skills gap. Our benchmark research on big data shows staffing (79%) and training (77%) are the two biggest challenges organizations face in efforts to take advantage of big data through analytics. In addition, a language barrier still exists in some organizations, where IT speaks in terms of total cost of ownership and efficiency while the business speaks in terms of effectiveness and outcomes or time to value, which I have written about previously. While all of these goals are important, the organization needs to cater to the metrics that are needed by its various personas. Such understanding starts with better defining the different personas and building more effective communication among the groups to ensure that they work together more collaboratively to achieve their respective goals and get the most value from business analytics.

Regards,

Tony Cosentino

VP and Research Director

Microsoft has been steadily pouring money into big data and business intelligence. The company of course owns the most widely used analytical tool in the world, Microsoft Excel, which our benchmark research into Spreadsheets in the Enterprise shows is not going away soon. User resistance (cited by 56% of participants) and lack of a business case (50%) are the most common reasons that spreadsheets are not being replaced in the enterprise. The challenge is ensuring that spreadsheets are not just personally used but connected into and secured within the enterprise to address consistency and a range of potential errors. These issues all add up to more work and maintenance, as my colleague has pointed out recently.

Along with Microsoft SQL Server and SharePoint, Excel is at the heart of the company’s BI strategy. In particular, PowerPivot, originally introduced as an add-on for Excel 2010 and built into Excel 2013, is a discovery tool that enables exploratory analytics and data mashups. PowerPivot uses an in-memory, column store approach similar to other tools in the market. Its ability to access multiple data sources, including third-party and government data through Microsoft’s Azure Marketplace, enables a robust analytical experience.

Ultimately, information sources are more important than the tool sets used on them. With the Azure Marketplace and access to other new data sources such as Hadoop through a partnership with Hortonworks, as my colleague assessed, Microsoft is advancing in the big data space. Microsoft has partnered with Hortonworks to bring Hadoop data into the fold through HDInsight, which enables familiar Excel environments to access HDFS via HCatalog. This approach is similar to access methods utilized by other companies, including Teradata, which I wrote about last week. Microsoft stresses the 100 percent open source nature of the Hortonworks approach as a standard alternative to the multiple, more proprietary Hadoop distributions appearing throughout the industry. An important benefit for enterprises with Microsoft deployments is that Microsoft Active Directory adds security to HDInsight.

As my colleague Mark Smith recently pointed out about data discovery methods, the analytic discovery category is broad and includes visualization approaches. On the visualization side, Microsoft markets PowerView, also part of Excel 2013, which provides visual analytics and navigation on top of Microsoft’s BI semantic model. Users also can annotate and highlight content and then embed it directly into PowerPoint presentations. This direct export feature is valuable because PowerPoint is still a critical communication vehicle in many organizations. Another visual tool, currently in preview, is the Excel add-in GeoFlow, which uses Bing Maps to render visually impressive temporal and geographic data in three dimensions. Such a 3-D visualization technique could be useful in many industries. Our research into next-generation business intelligence found that deploying geographic maps (47%) and visualizing metrics on them (41%) are becoming increasingly important, but Microsoft will need to further exploit location-based analytics and the need for interactivity.

Microsoft has a core advantage in being able to link its front-office tools such as Excel with its back-end systems such as SQL Server 2012 and SharePoint. In particular, by leveraging a common semantic model through Microsoft Analysis Services – what Microsoft calls its Business Intelligence Semantic Model – users can set up a dynamic exploratory environment through Excel. Once users or analysts have developed a BI work product, such as a report, they can publish it directly or through SharePoint. This integration enables business users to share and manage data models and solutions in common, which applies to security controls as well as visibility into usage statistics that show when particular applications are gaining traction with organizational users.

Usability, which our benchmark research into next-generation business intelligence identifies as the number-one evaluation criterion in nearly two-thirds (64%) of organizations, is still a challenge for Microsoft. Excel power users will appreciate the solid capabilities of PowerPivot, but more casual users of Excel – the majority of business people – do not understand how to build pivot tables or formulas. Our research shows that only 11 percent of Excel users are power users and that most skill levels are simply adequate (49%) rather than above average or excellent. While PowerView does give some added capability, a number of other vendors of visual discovery products like Tableau have focused on user experience from the ground up, so it is clear that Microsoft needs to address this shortcoming in its design environment.

When we consider more advanced analytic strategies and the inclusion of advanced algorithms, Microsoft’s direction is not clear. Its Data Analysis Expressions (DAX) language can help create custom measures and calculated fields, but it is a scripting language akin to MDX. This is useful for IT professionals who are familiar with such tools, but here again business-oriented users will be challenged to use it effectively.
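Setting DAX syntax aside, it may help to show what a calculated field and a custom measure amount to conceptually. The pandas sketch below is only an analogy with hypothetical column names; it is not DAX and not Microsoft's implementation.

```python
import pandas as pd

orders = pd.DataFrame({
    "region":   ["East", "West", "East"],
    "units":    [10, 4, 6],
    "price":    [25.0, 40.0, 25.0],
    "discount": [0.0, 0.1, 0.05],
})

# Calculated field: a new column derived row by row from existing columns.
orders["net_revenue"] = orders["units"] * orders["price"] * (1 - orders["discount"])

# Custom measure: an aggregate defined once and evaluated per grouping.
revenue_by_region = orders.groupby("region")["net_revenue"].sum()
print(revenue_by_region)
```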

A wild card in Microsoft’s BI and analytics strategy is mobile technology. Currently, Microsoft is pursuing a build-once, deploy-anywhere model based on HTML5, and it is a key member of the World Wide Web Consortium (W3C) that is defining the standard. The HTML5 standard, which has just passed a big hurdle in reaching candidate recommendation status, is beginning to show value in the design of new applications that can be accessed through web browsers on smartphones and tablets. The HTML5 approach could be challenging, as our technology innovation research into mobile technology finds that more organizations (39%) prefer native mobile applications from vendors’ application stores than prefer a web-browser-based method (33%), with a fifth having no preference. The success or failure of its Windows 8-based Surface tablet, however, will be the real barometer of Microsoft’s mobile BI success, since its integration with the Office franchise is a key differentiator. Early adoption of the tablet has not been strong, but Microsoft is said to be doubling down with a new version to be announced shortly. Success would put Office into the hands of the mobile workforce on a widespread basis via Microsoft devices, which could have far-reaching impacts for the mobile BI market.

As it stands now, however, Microsoft faces an uphill battle in establishing its mobile platform in a market dominated by Android and Apple iOS devices like the iPhone and iPad. If the Surface ultimately fails, Microsoft will likely have to open up Office to run on Android and iOS or risk losing its dominant position. My colleague is quite pessimistic about Microsoft’s overall mobile technology efforts and its ability to overcome the reality of the existing market. Our technology innovation research into mobile technology finds that over half of organizations have a preference for their smartphone and tablet technology platform; the top-ranked smartphone platforms are Apple (50%), Android (27%) and RIM (17%), with Microsoft a distant fourth (5%), and for tablets the order is Apple (66%), Android (19%) and then Microsoft (8%). Based on these findings, Microsoft faces challenges both on the platform front and in adapting its technology to support the platforms businesses prefer today.

Microsoft is trying to pull together different initiatives across multiple internal business units that are known for being siloed and not well organized for customers. It has relied on its channel partners and customers to figure out not just how to make these products work together but also what is possible, since they are not always given clear guidance from Redmond. Recent efforts suggest that Microsoft is coming together to address the big data and business analytics challenge and the massive opportunity it represents. One area in which this is happening is Microsoft’s cloud initiatives. Last year’s announcements of Azure virtual machines enable an infrastructure-as-a-service (IaaS) play for Microsoft and position Windows Azure SQL Database as a service. This could make the back-end systems I’ve discussed available through a cloud-based offer, but currently this is only offered through the client version of the software.

For organizations that have already installed Microsoft as their primary BI platform and are looking for tight integration with an Excel-based discovery environment, the decision to move forward is relatively simple. The trade-off is that this package is still a bit IT-centric and may not attract as many of the larger body of business users as a more user-friendly discovery product might, nor address the failings of business intelligence. Furthermore, since Microsoft is not as engaged in direct support and service as other players in this market, it will need to move its traditionally technology-focused channel to help customers become more business-savvy. For marketing and other business departments, especially in high-velocity industries where usability and time-to-value are at a premium and back-end integration is secondary, other tools will be worth a look. Microsoft has great potential, and with analytics being the top-ranked technology innovation priority among its customers, I hope that the many divisions inside the global software giant can finally come together to deliver a comprehensive approach.

Regards,

Tony Cosentino

VP and Research Director

Platfora has gained a lot of buzz in the big data analytics market, primarily through word of mouth. Late last year the company took the covers off some impressive and potentially disruptive technology that takes aim at the broad BI and business analytics ecosystem, including the very foundation on which the industry is built. It recently demonstrated its software at the Strata Conference, in front of an audience fixated on big data.

Platfora looks to provide the underlying architecture of tomorrow’s BI systems and address the challenge of big data analytics. Our benchmark research shows that usability is one of the biggest hurdles facing next-generation BI systems; it was the top evaluation category in 63 percent of organizations. This is of specific concern when it comes to big data analytics and today’s Hadoop ecosystem, where many companies are taking a Field of Dreams approach – if you build it, they will come. That is, many companies are setting up Hadoop clusters, but users have no access to the underlying data and need data scientists to come in and painstakingly extract nuggets of value. Simply connecting Hadoop to applications via connectors does not work well since there is no good way to sort through the Hadoop data to decide what to move into a more production-oriented system.

Platfora promises to solve this problem by bypassing both traditional architectures and newer hybrid architectures and putting everything in Hadoop, from data capture to data preparation to analysis and visualization.

The challenge with traditional architectures, Platfora argues, is that they organize data in a predetermined manner, but today’s big data analytics environment dictates that organizations cannot determine in advance what they will need to explore in the future. If a user gets to a level of analysis that is not part of the current schema, someone in the organization must undertake a herculean effort to recreate the entire data model. It’s the ‘I don’t know what I don’t know’ challenge. In my blog post Big Data Analytics Faces a Chasm of Understanding, I discuss the distinction between exploratory and confirmatory analytics that marks the difference between 20th- and 21st-century approaches to business analytics. Businesses need both, but the nature of big data demands that the exploratory approach be given more weight.

Platfora stores data in Hadoop and works with all of the open source stacks, including those from Cloudera, Hortonworks and MapR, as well as EMC’s proprietary Pivotal HD distribution, announced just this week and assessed by my colleague. The secret sauce for Platfora is its ability to provide visibility into the underlying file system and provide a shopping-basket metaphor, where an analyst can choose different dimensions that are of interest. Through what the company calls Fractal Cache technology, which is a distributed query engine, it takes the data and creates the relationships on the fly in memory. This essentially provides an ad hoc data mart, which an analyst can then access to do slice-and-dice analysis with sub-second response times and solve exploratory analytics challenges. If an analyst drills down and finds that an interesting piece of information is not included in the model, he can have the software recreate the model on an ad hoc basis, which generally takes from minutes up to an hour, according to the company.
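Platfora's Fractal Cache is proprietary, so what follows is only a conceptual sketch of the effect it produces: from raw records, materialize just the dimensions the analyst selected into an in-memory structure that supports fast slice-and-dice, and rebuild it when the selection changes. The field names and the plain-Python aggregation are hypothetical illustrations, not the product's implementation.

```python
from collections import defaultdict

def build_ad_hoc_mart(raw_events, dimensions, measure):
    """Aggregate raw records over only the dimensions the analyst selected."""
    mart = defaultdict(float)
    for event in raw_events:
        key = tuple(event[d] for d in dimensions)
        mart[key] += event[measure]
    return mart

# Hypothetical raw web events of the kind that would sit in Hadoop.
events = [
    {"country": "US", "device": "mobile",  "campaign": "a", "revenue": 5.0},
    {"country": "US", "device": "desktop", "campaign": "a", "revenue": 9.0},
    {"country": "DE", "device": "mobile",  "campaign": "b", "revenue": 3.0},
]

# The analyst's "shopping basket" of dimensions; changing it rebuilds the mart.
mart = build_ad_hoc_mart(events, dimensions=["country", "device"], measure="revenue")
for key, total in mart.items():
    print(key, total)
```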

The software’s power and ease of use allow business analysts to expand the breadth of questions they can ask of the data without having to go back to IT. According to the company, it takes only a few hours of training on the system to get up and running. This is especially important given that our big data benchmark research, which assessed the challenges of Hadoop, found that staffing and training are among the biggest challenges organizations face today, cited by more than three-quarters of organizations. If Platfora can solve this conundrum and implement it within the enterprise, it will indeed start to move organizations beyond the technologically oriented three V’s discussion about big data into the business-oriented discussion around the three Ws.

The biggest challenge the company may face is institutional. Companies have spent billions of dollars implementing their current architectures, and relationships with software providers often run deep. Furthermore, the idea of such a system largely replacing traditional data warehouses threatens not only the competition, but perhaps the departments the company aims to sell into. Simply put, such a business-usable system obviates the need for an entire area of IT. Many firms, especially large ones, are inherently risk-averse, and this may be the biggest challenge facing Platfora. Other software providers have started with similar messaging out of the gate, but then shifted to more of a coexistence-messaging approach to gain traction in organizations. It will be interesting to see whether Platfora yields to these same headwinds as it moves through its beta phase and into GA.

In sum, Platfora is an exciting new company, and companies that are adopting Hadoop should look into the way it can drive big data analytics and maybe change the culture of their organizations.

Regards,

Tony Cosentino

VP and Research Director

LogiXML has been around for more than a decade, but has seen particularly robust growth in the past year. Recent announcements show the company with better than 100-percent year-over-year growth, driven by a 97 percent license renewal rate and new customer growth in SMB, departmental and OEM deployments. The 158-percent growth for the embedded analytics group for the fourth quarter on a year-over-year basis was particularly strong.

LogiXML categorizes its products into Logi Info, its flagship product targeted primarily at IT and developers; Logi Ad Hoc, targeted as a self-service tool for business users; and a suite of the two bundled with ETL capabilities. Just a few weeks ago, LogiXML released Logi Info v11. The product offers enhanced visualization capabilities, making analytics easier and more robust for end users. Building on an already solid product, the changes in Logi Info seem to be more evolutionary than revolutionary, and appear to be focused on keeping up with functionality necessary to compete in today’s BI market. With that said, the core product has advantages as simple, highly embeddable software. Its light footprint and ease of use are competitive advantages, allowing companies to quickly and easily put business intelligence capabilities into the hands of end users, as well as into the applications themselves. Usability is by far the number one criterion for companies choosing a business intelligence application, according to our benchmark research on next-generation business intelligence.

LogiXML’s ease of use for developers enables companies to tell a compelling time-to-value story for embedding intelligence directly into applications, and takes advantage of the market trend toward agile development. As I’ve written on numerous occasions and most recently in Big Data Faces a Chasm of Misunderstanding, the key challenge for organizations is overcoming the lack of an analytics-driven culture. This challenge is different for IT and for business users, as IT focuses on things such as standards, security and information management, while business users focus more on providing specific insights for business decisions. My colleague Mark Smith has also written about this environment and specifically about how BI tools are failing business. LogiXML’s ability to put BI directly into applications in an agile way allows the company to overcome many of these arguments, since applications fit directly into the role-based decision environments that drive companies today.

Interestingly, the company is said to be rebranding in 2013. Rebranding is always a large effort and often pays off well for a consumer brand or a broad business brand, but given LogiXML’s size and core advantage in the OEM market, the return on such a move is questionable. Perhaps LogiXML is looking to drive its brand into the business user side of the equation with an Intel Inside-type of strategy, or target business users themselves. Either way, the end-game here is unclear given a limited data discovery portfolio and no clear value path to sell directly to business end users. For LogiXML, this path is still primarily through IT and secondarily through business. Furthermore, many new business intelligence and analytic brands coming on the market also target the business end user, so the company will need to make a significant investment in basic awareness-building in order to gain parity with the current brand equity.

Branding aside, the challenge for the company’s continued trajectory of growth is twofold. First, it is reaching an inflection point in its number of employees. As a company reaches about 200 employees, which LogiXML is approaching now, company culture often transforms itself. With solid leadership and a strong roadmap, LogiXML will be able to overcome this first challenge. The second challenge is that it competes in a highly competitive segment against all the major stack players as well as with the more cost-oriented open source players. The expanding market landscape of business intelligence to business analytics, including data discovery and predictive analytics, introduces a host of new players and dilutes the broader opportunity as customers settle into their approaches.

In sum, LogiXML is often able to provide the path of least resistance and solid time-to-value for companies looking to quickly and effectively roll out business intelligence capabilities. In particular, as SaaS providers and OEM relationships continue to gain steam, the embeddable nature of LogiXML’s platform and the ability to construct applications at an elemental level gives the company a good market position from which to advance. Like many others in the increasingly crowded BI space, it will need to capitalize on its current market momentum and solidify its position before others start to invade more of its addressable space. Organizations looking for an integrated business intelligence platform and tools should assess the improvements in this latest release of LogiXML.

Regards,

Tony Cosentino

VP and Research Director

Did you catch all the big data analogies people used in 2012? There were many, like the refinement of oil analogy, or the spinning straw into gold analogy, and less useful but more entertaining ones, like big data is like a box of chocolates, or big data is like The Matrix (because “there’s no way Keanu Reeves learns Kung Fu in five seconds without using big data”).  I tend to like the water analogy, which I’ll use here to have a little fun and to briefly describe how I see the business analytics market in 2013.

2013 is about standing lakes of information that will turn into numerous tributaries. These various tributaries of profile, behavioral and attitudinal data will flow into the digital river of institutional knowledge. Analytics, built out by people, process, information and technology, will be the only banks high enough to control this vast river and funnel it through the duct of organizational culture and into an ocean of market advantage.

With this river of information as a backdrop, I’m excited to introduce the Ventana Research Business Analytics Research Agenda for 2013, focused on three themes:

Answering the W’s (the what, the so what, the now what and the then what)

The first and perhaps most important theme of the 2013 research agenda builds on answering the W’s – the what, the so what, the now what and the then what – which was also the topic of one of my most widely read blog posts last year. In that piece I suggested that a shift from discussing the three V’s to the four W’s corresponds to the shift from a technologically oriented discussion to a business-oriented one. Volume, variety and velocity are well-known parameters of big data that help facilitate the technology discussion, but when we look at analytics and how it can drive success for an organization, we need to move to the so what, now what and then what of analytical insights, organizational decision-making and closed-loop processes. Our big data research found significant gaps in the business analytics spectrum between what organizations need and what is available to them today, fueling a new generation of technology for consuming big data.

Outcome-driven approaches are a good way of framing issues, given that business analytics, and in particular big data analytics, are such broad topics, yet the use cases are so specific. Our big data analytics benchmark research for 2013, which we will start shortly, will therefore look at specific benefits and supported business cases across industries and lines of business in order to assess best practices for big data analytics. The research will investigate the opportunities and barriers that exist today and explore what needs to happen for us to move from an early adopter market to an early majority market. The benchmark research will feed weighting algorithms into our Big Data Analytics Value Index, which will look at the analytics vendors that are tackling the formidable challenges of providing software to analyze large and multistructured datasets.

Disseminating insights within the organization is a big part of moving from insights to action, and business intelligence is still a primary vehicle for driving insight into the organization. While there is a lot to be said about mobile BI, collaborative BI, visual discovery and predictive analytics, core business intelligence systems remain at the heart of many organizations. Therefore we will continue our in-depth coverage of core business intelligence systems with our Business Intelligence Value Index, in the context of our next-generation business analytics benchmark research that will start in 2013.

Embracing next generation technology for business analytics

We’re beginning to see businesses embracing next-generation technology for business analytics. Collaborative business intelligence is a critical part of this conversation, both in terms of getting insights and in terms of making decisions. Last year’s next-generation business intelligence benchmark research showed us that the market is still undecided on how next-generation BI will be rolled out, with business applications being the preferred method, but only slightly more so than through business intelligence or office productivity tools. In addition to collaboration, we will focus on mobile and location trends in our next-generation business analytics benchmark research and our new location analytics benchmark research, which we have already found addresses specific needs of business analysts. We see mobile business intelligence as a particularly hot area in 2013, and we are therefore breaking out mobile business intelligence vendors in this year’s Mobile Business Intelligence Value Index that we will conduct.

Another hot area of next-generation technology revolves around right-time and real-time data and how they can be built into organizational workflows. As our operational intelligence benchmark research found, perceptions of OI and real-time data differ significantly between IT and business users. We will extend this discussion in the context of the big data analytics benchmark research, and specifically in the context of our Operational Intelligence Value Index that we will conduct in 2013.

Using analytical best practices across business and IT

Our final theme relates to the use of analytical best practices across business and IT. We’ll be looking at best practices for companies as they evolve to become analytics-driven organizations. In this context, we’ll look at exploratory analytics, visual discovery and even plain-English representation of the analytics itself as approaches in our next-generation business analytics benchmark research, and at how these shape the way we assess BI in our Business Intelligence Value Index. We’ll look at how organizations exploit predictive analytics on big data as a competitive advantage, as found in our predictive analytics benchmark research, within the context of our big data analytics benchmark research. We’ll look at the hot areas of sales and customer analytics, including best practices and their intersection with cloud computing models. And we’ll look at previously untapped areas of analytics that are just now heating up, such as human capital analytics. In our human capital analytics benchmark research, which will begin shortly, we’ll look across the landscape to assess not just analytics associated with core HR, but analytics around talent management and workforce optimization as well.

I see a high level of innovation and change going on in the business analytics market in 2013. Whenever an industry undergoes such change, high-quality primary research acts as a lighthouse for both customers and suppliers.  Companies can capitalize on all of the exciting developments in analytics, business intelligence and related areas of innovation to drive competitive advantage, but only if they understand the changes and potential value.

I am looking forward to providing a practical perspective on using all forms of business analytics to create value in organizations, and to helping our Ventana Research community and clients do the same.

Come read and download the full research agenda.

Regards,

Tony Cosentino
VP and Research Director

At Oracle OpenWorld this week I focused on what the company is doing in business analytics, and in particular on what it is doing with its Exalytics In-Memory Machine. The Exalytics appliance is an impressive in-memory hardware approach to putting right-time analytics in the hands of end users by providing a full range of integrated analytic and visualization capabilities. Exalytics fits into the broader analytics portfolio by providing support for the Oracle BI Foundation Suite, including OBIEE, and for Oracle’s formidable portfolio of business analytics and business performance applications, as well as interactive visualization and discovery capabilities.

Exalytics connects with an Exadata machine over a high-throughput InfiniBand link. The system leverages Oracle’s TimesTen In-Memory Database with columnar compression, originally built for online transaction processing in the telecommunications industry. Oracle’s Summary Advisor software provides a heuristic approach to making sure the right data is in memory at the right time, though customers at the conference mentioned that in their initial configurations they did not need to turn on Summary Advisor since they were able to load their entire datasets in memory. Oracle’s Essbase OLAP server is also integrated, but since it requires a MOLAP approach and computationally intensive pre-aggregation, one might question whether applications such as Oracle’s Hyperion planning and forecasting tools will truly provide analysis at the advertised “speed of thought.” To address this point, Oracle points to examples in the consumer packaged goods industry, where it has already demonstrated its ability to reduce complex planning scenario cycle times from 24 hours down to four hours. Furthermore, I discussed with Paul Rodwick, Oracle’s vice president of product management for business intelligence, the potential for doing in-line planning, where integrated business planning and complex what-if modeling can be done on the fly. For more on this particular topic, please check out my colleague’s benchmark research on the fast clean close.
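Oracle has not detailed Summary Advisor’s internals here, so purely as a hypothetical sketch of the general idea – a usage-based heuristic that decides which aggregates to keep within a fixed memory budget – consider the Python snippet below. The aggregate names, sizes and query counts are invented for illustration; this is an analogue of heuristic data placement, not Oracle’s algorithm.

```python
# Hypothetical sketch of a usage-based placement heuristic: keep the
# aggregates with the best queries-served-per-GB ratio within a fixed
# memory budget. Illustrative only; not Oracle Summary Advisor's logic.

aggregates = {
    # name: (size in GB, queries observed per day) -- invented figures
    "sales_by_region_day": (12.0, 900),
    "sales_by_product_month": (4.0, 450),
    "inventory_by_site_day": (20.0, 120),
    "returns_by_reason_week": (1.5, 60),
}

memory_budget_gb = 20.0

# Rank candidates by "benefit density": queries served per GB consumed.
ranked = sorted(aggregates.items(),
                key=lambda item: item[1][1] / item[1][0],
                reverse=True)

in_memory, used_gb = [], 0.0
for name, (size_gb, _) in ranked:
    if used_gb + size_gb <= memory_budget_gb:
        in_memory.append(name)
        used_gb += size_gb

print(f"Kept in memory ({used_gb:.1f} GB): {in_memory}")
```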

Another important part of the Exalytics appliance is the Endeca discovery tool. Endeca, which Oracle acquired just a year ago, provides an exploratory interactive approach to root cause analysis and problem-solving without the traditional struggle of complex data modeling. It does this through a search technology that leverages key-value pairings in unstructured data, thereby deriving structure delivered in the form of descriptive statistics such as recency and frequency.  This type of tool democratizes analytics in an organization and puts power into the hands of line-of-business managers. The ability to navigate across data was the top-ranked business capability in our business analytics benchmark research. However, while Endeca is a discovery and analysis tool for unstructured and semi-structured data, it does not provide full sentiment analysis or deeper text analytics, as do tools such as IBM’s SPSS Text Analytics and SAS Social Conversation Center. For more advanced analytics and data mining, Oracle integrates its version of R for the enterprise on its Exadata Database Machine and its Big Data Appliance, Oracle’s integrated Hadoop approach based on the Cloudera stack.
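The descriptive statistics Endeca surfaces, such as recency and frequency, are easy to picture with a small example. The Python sketch below counts occurrences and most recent mentions per key from a hypothetical list of events extracted from semi-structured records; the data and field names are invented and this is not Endeca’s implementation.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical events extracted from semi-structured records:
# (key, timestamp) pairs, e.g. product mentions in support notes.
events = [
    ("product_a", datetime(2012, 9, 28)),
    ("product_b", datetime(2012, 8, 15)),
    ("product_a", datetime(2012, 9, 30)),
    ("product_a", datetime(2012, 10, 1)),
]

as_of = datetime(2012, 10, 2)
frequency = defaultdict(int)
last_seen = {}

for key, ts in events:
    frequency[key] += 1                      # how often the key appears
    if key not in last_seen or ts > last_seen[key]:
        last_seen[key] = ts                  # most recent occurrence

for key, count in frequency.items():
    recency_days = (as_of - last_seen[key]).days
    print(f"{key}: frequency={count}, recency={recency_days} days")
```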

At the conference, Oracle announced a few updates for Exalytics. The highlights include a new release of Oracle Business Intelligence 11.1.1.6.2BP1 optimized for Exalytics. The release includes new mobility and visualization capabilities, including trellis visualizations that allow users to see a large number of charts in a single view. These advanced capabilities are enabled on mobile devices through the increased speed provided by Exalytics. In addition, Endeca is now certified with Oracle Exalytics, and the TimesTen database has been certified with GoldenGate and Oracle Data Integrator, allowing the system and users to engage in more event-driven analytics and closed-loop processes. Finally, Hyperion Planning is certified on Exalytics.

On the upside, Oracle’s legacy platform support on Exalytics allows the company to leverage its entire portfolio and offer more than 80 prebuilt analytics applications ready to run on Exalytics without any changes. The platform also supports the full range of the Oracle BI Foundation Suite and provides a common platform and a common dimensional model. In this way, it provides alignment with overall business processes, content management systems and transactional BI systems. This alignment is especially attractive for companies that already have a lot of Oracle software installed and for companies looking to operationalize business analytics through event-based, closed-loop decision-making at the front lines of the organization. The speed of the machine, its near-real-time query speeds, and its ability to deliver complex visualizations to mobile devices allow users to create use cases and ROI scenarios they could not before. For example, the San Diego Unified School District was able to use Exalytics to push out performance dashboards across multiple device types to students and teachers, thereby increasing attendance and garnering more money from the state. The Endeca software lets users make qualitative and, to some degree, quantitative assessments from social data and overlay that with structured data. The fact that Exalytics comprises an integrated hardware and software stack makes it a turnkey solution that does not require expensive systems integration services. Each of these points makes Exalytics an interesting and even an exciting investment possibility for companies.

On the downside, the integrated approach and vendor lock-in may discourage some companies concerned about Oracle’s high switching costs. This may pose a risk for Oracle as maturing alternatives become available and the economics of switching begin to make more sense. However, it is no easy task to switch away from Oracle, especially for companies with large Oracle database and enterprise application rollouts. The economics of loyalty are such that when customers are dissatisfied but captive, they remain loyal; however, as soon as a viable alternative comes on the market, they quickly defect. This defection threat for Oracle could come from dissatisfaction with configuration and ease-of-use issues in combination with new offerings from large vendors such as SAP, with its HANA in-memory database appliance, and hardware companies such as EMC that are moving up the stack into the software environment. To be fair, Oracle is addressing the usability issues by moving the user experience away from “one size fits all” to more of a persona- or role-based interface. Oracle will likely have time to do this, given the long tail of its existing database entrenchment and the breadth and depth of its analytics application portfolio.

With Exalytics, Oracle is addressing high market demand for right-time analytics and interactive visualization. I expect this device will continue to sell well, especially since it is plug-and-play and Oracle has such a large portfolio of analytic applications. For companies already running Oracle databases and Oracle applications, Exalytics is very likely a good investment. In particular, it makes sense for those that have not fully leveraged the power of analytics in their business and that are therefore losing competitive position. Our benchmark research on analytics shows a relatively low penetration of analytics software today, but investments are accelerating due to the large ROI companies are realizing from analytics. In situations where the payoffs are less obvious, or where lock-in is a concern, companies should take a hard look at exactly what they hope to accomplish with analytics and which tools best suit that need. The most successful companies start with the use case to justify the investment, then determine which technology makes the most sense.

In addition to this blog, please see Ventana Research’s broader coverage of Oracle OpenWorld.

Regards,

Tony Cosentino

VP and Research Director

IBM acquired SPSS in late 2009 and has been investing steadily in the business as a key component of its overall business analytics portfolio. Today, IBM SPSS provides an integrated approach to predictive analytics through four distinct software packages: SPSS Data Collection, SPSS Statistics, SPSS Modeler and SPSS Decision Management. IBM SPSS is also integrated with Cognos Insight, IBM’s entry into the visual discovery arena.

Our benchmark research into predictive analytics shows that companies are struggling with two core issues: a skills shortage related to predictive analytics and integration of predictive analytics into their information architecture. A preliminary look at IBM’s SPSS software makes it obvious to me that IBM is putting its full weight behind addressing both of these issues.

Predictive analytics is a hot term in business today, but there is still some debate about what it means. My blog entry on predictive analytics discusses findings from our research and the idea that the lines between predictive and descriptive analytics are becoming blurred. IBM provides an interesting take on this conversation by discussing predictive analytics in the context of data mining and statistics. It sees data mining as more bottom-up and exploratory in nature (though it can also be predictive) and statistics as more of a top-down, hypothesis-driven approach (though it can also use descriptive techniques).
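To make the contrast concrete, here is a generic Python illustration of the two mindsets, using scikit-learn and SciPy rather than IBM’s tools: a bottom-up, exploratory pass lets customer segments emerge from the data, while a top-down pass starts from a stated hypothesis and tests it. The data is synthetic and the example is mine, not IBM’s.

```python
import numpy as np
from scipy import stats
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Bottom-up / exploratory (data mining mindset): let customer segments
# emerge from spend data without a prior hypothesis.
spend = np.concatenate([rng.normal(50, 5, 100), rng.normal(120, 10, 100)])
segments = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    spend.reshape(-1, 1))
print("Discovered segment sizes:", np.bincount(segments))

# Top-down / hypothesis-driven (statistics mindset): test a stated
# hypothesis, e.g. that customers who received a promotion spend more.
promo = rng.normal(60, 8, 80)
no_promo = rng.normal(55, 8, 80)
t_stat, p_value = stats.ttest_ind(promo, no_promo)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```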

If you use SPSS Modeler you don’t have to be a data scientist to participate in predictive analytics discussions. Once data is loaded into the modeler and a few preliminary questions are answered about what you are trying to do, SPSS Modeler presents a choice of analytical techniques, such as CHAID, CART and others, and suggests the best approach based on multiple factors, such as the number of fields or predictive power. This is a big deal because business managers often have some idea of clustering, regression and cause-and-effect-type functions, but they don’t necessarily know the intricacies of different techniques. With SPSS Modeler you don’t have to know all of these details yet can still participate in these important discussions. SPSS Modeler can bridge the gap between the statistician and the day-to-day line-of-business analyst and decision-maker, and thus help close the analytics skills gap facing organizations today.
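SPSS Modeler’s recommendation logic is proprietary, but the underlying idea – compare candidate techniques on predictive power and suggest the strongest – can be approximated in a few lines. The scikit-learn sketch below uses a synthetic dataset and generic models as stand-ins; it is an analogue of the workflow, not the SPSS algorithms.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a business dataset (e.g. churn flag plus predictors).
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Candidate techniques, analogous to a menu of CHAID, CART and others.
candidates = {
    "decision_tree": DecisionTreeClassifier(max_depth=5, random_state=0),
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=0),
}

# Rank candidates by cross-validated accuracy, a simple proxy for
# predictive power, and suggest the strongest one.
scores = {name: cross_val_score(model, X, y, cv=5).mean()
          for name, model in candidates.items()}
best = max(scores, key=scores.get)
print(scores)
print(f"Suggested technique: {best}")
```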

Another challenge for organizations is integrating multiple streams of data, including attitudinal data. Built-in survey data collection in SPSS Data Collection can fill in these blanks for analysts. Sometimes behavioral data reveals knowledge gaps that can only be filled with direct perceptual feedback from stakeholders collected through a survey instrument. Trying to tell a story with only behavioral data can be like trying to infer the actual contents of a file based only on metadata descriptors such as file size, type, and when and how often the file was accessed. Similarly, social media data may provide some of the context, but it does not always give direct answers. We see line-of-business initiatives bringing together multiple streams of data, including attitudinal data such as brand perceptions or customer satisfaction. The data collection functionality allows managers, within the context of broader analytics initiatives, to bring such data directly into their models and even to do scoring for things such as customer or employee churn.
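As a minimal sketch of blending the two streams, the pandas and scikit-learn snippet below joins hypothetical survey scores to behavioral records and fits a basic churn model; the column names and values are invented for illustration and do not reflect SPSS Data Collection’s schema or models.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical behavioral data pulled from transactional systems.
behavior = pd.DataFrame({
    "customer_id": [1, 2, 3, 4, 5, 6],
    "logins_last_30d": [25, 3, 14, 1, 18, 2],
    "support_tickets": [0, 4, 1, 5, 0, 3],
    "churned": [0, 1, 0, 1, 0, 1],
})

# Hypothetical attitudinal data collected via a survey instrument.
survey = pd.DataFrame({
    "customer_id": [1, 2, 3, 4, 5, 6],
    "satisfaction_score": [9, 3, 7, 2, 8, 4],
})

# Merge the behavioral and attitudinal streams and fit a simple churn model.
data = behavior.merge(survey, on="customer_id")
features = ["logins_last_30d", "support_tickets", "satisfaction_score"]
model = LogisticRegression().fit(data[features], data["churned"])

# Score churn risk (in practice, for customers who have not yet churned).
data["churn_risk"] = model.predict_proba(data[features])[:, 1]
print(data[["customer_id", "churn_risk"]])
```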

I have not yet discussed IBM SPSS integration with decision systems and the idea of moving from the “so what” of analytics to the “now what” of decision-making. This is a critical component of a company’s analytics agenda, since operationalizing analytics necessitates that the model outcomes be pushed out to organizations’ front lines and then updated in a closed-loop manner. Such analytics are more and more often seen as a competitive advantage in today’s marketplace – but this is a separate discussion that I will address in a future blog entry.

SPSS has a significant presence across multiple industries, but it is ubiquitous in academia and in the market research industry. The market research industry is a particularly interesting foothold for IBM, as the market is estimated to be worth over $30 billion globally, according to the Council of American Survey Research Organizations. By leveraging IBM and SPSS, companies gain access to a new breed of market research that helps merge forward-looking attitudinal data streams with behavioral data streams. The academic community’s loyalty to SPSS gives it an advantage similar to that of Apple when it dominated academic institutions with the Macintosh computer. As people graduate with familiarity with certain platforms, they carry this loyalty with them into the business world. As spreadsheets are phased out as the primary modeling tool due to their limitations, IBM can capitalize on the change with continued investments in institutions of higher learning.

Companies looking to compete based on analytics should almost certainly consider IBM SPSS. This is especially true of companies that are looking to merge LOB expertise with custom analytical approaches, but that don’t necessarily want to write custom applications to accomplish these goals.

Regards,

Tony Cosentino

VP and Research Director
