
IBM redesigned its business intelligence platform, now called IBM Cognos Analytics. Expected to be released by the end of 2015, the new version includes features to help end users model their own data without IT assistance while maintaining the centralized governance and security that the platform already has. Our benchmark research into information optimization shows that simplifying access to information is important to virtually all (97%) participating organizations, but it also finds that only one in four (25%) are satisfied with their current software for doing that. Simplification is a major theme of the IBM Cognos redesign.

The new IBM Cognos Analytics provides a completely Web-based environment with a user interface and security that are consistent across multiple devices and browsers. The redesigned interface follows IBM’s internal cultural shift to base product development first on the user experience and second on features and functionality. This may be a wise move, as our research across multiple analytic software categories finds usability to be the buying criterion organizations most often rank as important.

The redesign is based on the same design and self-service principles as IBM Watson Analytics, to which we awarded a 2015 Ventana Research Technology Innovation Award in business analytics. The redesign is most evident in the IBM Cognos Analytics authoring mode. The Report Studio and Cognos Workspace Advanced modules have been replaced with a simplified Web-based modeling environment. The extended capabilities of IBM Cognos 10.2.2 are still available, but they are now hidden and more logically arranged to provide easier user access. For example, the previous version of Cognos presented an intimidating display of tools for tasks such as fine-grained manipulation of reports; now these features are hidden but still easily accessible. If a user has difficulty finding a particular function, a “smart search” feature helps locate the correct menu to add it.

The new system indexes objects, including metadata, as they are created, providing a more robust search function suitable for nontechnical users in the lines of business. The search feature works with what IBM calls “intent-based modeling” so users can search for words or phrases – for example, revenue by unit or product costs – and be presented with only relevant tables and columns. The system can then automatically build a model by inferring relationships in the data. The result is that the person building the report need not manually design a multidimensional model of the data, so less skilled end users can serve themselves to build the data models that underpin dashboards and reports. Previously, end users were limited to parameterized reporting in which they could work only within the context of models previously designed by IT. Many analytics vendors have been late in exploiting the power of search and therefore may be missing a critical feature that customers desire. Ventana Research is a proponent of such capabilities; my colleague Mark Smith has written about them in the context of data discovery technology. Search is fundamental to user-friendly discovery systems, as is reflected in the success of companies such as Google and Splunk. As search becomes more sophisticated, increasingly based on machine-learning algorithms, we expect it to become a key requirement for new analytics and business intelligence systems.
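To make the intent-based search idea concrete, here is a minimal sketch in Python of matching a phrase against indexed table and column metadata. The catalog contents, column names and scoring heuristic are hypothetical illustrations, not IBM's implementation.

```python
# Hypothetical metadata index: table -> column names
catalog = {
    "sales_fact": ["revenue", "unit", "order_date", "product_id"],
    "product_dim": ["product_id", "product_name", "product_cost"],
    "hr_employees": ["employee_id", "salary", "hire_date"],
}

def find_relevant_tables(phrase, catalog):
    """Score each table by how many search terms appear in its column names."""
    terms = phrase.lower().split()
    scores = {}
    for table, columns in catalog.items():
        hits = sum(any(term in col for col in columns) for term in terms)
        if hits:
            scores[table] = hits
    # Highest-scoring tables become candidate sources for an inferred model
    return sorted(scores, key=scores.get, reverse=True)

print(find_relevant_tables("revenue by unit", catalog))   # ['sales_fact']
print(find_relevant_tables("product cost", catalog))      # ['product_dim', 'sales_fact']
```

A production system would of course weight matches, use synonyms and then infer join paths among the candidate tables, but the basic flow of searching metadata rather than data is the same.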

Furthering the self-service aspect is the ability for end users to access and combine multiple data sets. The previous version of IBM Cognos (10.2.2) allowed users to work with “personal data sets” such as .csv files, but they needed an IBM DB2 back end to house the files. Now such data sets can be uploaded and managed directly on the IBM Cognos Analytics server and accessed with the new Web-based authoring tool. Once data sets are uploaded they can be accessed and modeled like any other object to which the user has access. In this way, IBM Cognos Analytics addresses the “bring your own data” challenge in which data sources such as personal spreadsheets must be integrated into enterprise analytics and business intelligence systems.

After modeling the data, users can lay out new dashboards using drag-and-drop capabilities like those found in IBM Watson Analytics. Dashboards can be previewed and put into service for one-time use or put into production mode if the user has such privileges. As is the case with IBM Watson Analytics, newly designed dashboard components such as tables, charts and maps are automatically linked so that changes in one part of the dashboard automatically relate to other parts. This feature facilitates ease of use in designing dashboards. Some other tools in the market require widgets to be connected manually, which can be time-consuming and is an impediment to prototyping of dashboards.

The move to a more self-service orientation has long been in the works for IBM Cognos, so this release is an important one for IBM. The ability to automatically integrate and model data gives the IT department a more defensible position as other self-service tools that challenge the data access and preparation built into tools like IBM Cognos are introduced into the organization. This is becoming especially important as the number and complexity of data sources increase and the business needs them more rapidly. Our research into information optimization shows that most organizations need to integrate at least six data sources and some have 20 or more sources they need to bring together. This confirms our data and analytics in the cloud benchmark research, which finds data preparation to be a top priority in over half (55%) of organizations.

Over time, IBM intends to integrate the capabilities of Cognos Analytics with those of Watson Analytics. This is an important plan because IBM Watson Analytics has capabilities beyond those of self-service tools in the market today. In particular, the ability to explore unknown data relationships and do advanced analysis is a key differentiator for IBM Watson Analytics, as I have written. IBM Watson Analytics enables users to explore relationships in data that otherwise would not be noticeable, whereas IBM Cognos Analytics enables them to explore and put into production information based on predefined assumptions.

Going forward, I will be watching how IBM aligns Cognos Analytics with Watson Analytics, and in particular, how Cognos Analytics will fit into the IBM cloud ecosystem. Currently IBM Cognos Analytics is offered both on-premises and in a hosted cloud, but here also IBM is working to align it more closely with IBM Watson Analytics. Bringing in data preparation, data quality and MDM capabilities from the IBM DataWorks product could also benefit IBM Cognos Analytics users. IBM should emphasize the breadth of its portfolio of products including IBM Cognos TM1, IBM SPSS, IBM Watson Analytics and IBM DataWorks as it faces stiff competition in enterprise analytics and business intelligence from a host of analytics companies including new cloud-based ones. IBM is rated a Hot Vendor in our Ventana Research Analytics and Business Intelligence Value Index in part because of its overall portfolio.

For organizations already using IBM Cognos, the redesign addresses the need of end users to create their own dashboards while maintaining IT governance and control. The new interface may take some getting used to, but it is modern and more intuitive than previously. For companies new to IBM Cognos, as well as departments wanting to take a look at the platform, cloud options offer less risk. For those wanting early access to the new IBM Cognos Analytics, IBM has provided access to it on www.analyticszone.com. The changes I have noted move IBM Cognos Analytics closer to the advances in analytics as a whole, and I recommend that all these groups examine the new version.

Regards,

Ventana Research

Tableau Software’s annual conference, which company spokespeople reported had more than 10,000 attendees, filled the MGM Grand in Las Vegas. Various product announcements supported the company’s strategy to deliver value to analysts and users of visualization tools. Advances include new data preparation and integration features, advanced analytics and mapping. The company also announced the release of a stand-alone mobile application called Vizable. One key message management aimed to promote is that Tableau is more than just a visualization company.

Over the last few years Tableau has made strides in the analytics and business intelligence market with a user-centric philosophy and the ability to engage younger analysts who work in the lines of business rather than in IT. Usability continues to rank as the top criterion for selecting analytics and business intelligence software in all of our business analytics benchmark research. In this area Tableau has introduced innovations such as VizQL, originally developed at Stanford University, which links capabilities to query a database and to visualize data. This combination enables users not highly skilled in languages such as SQL or in proprietary business intelligence tools to create and share visually intuitive dashboards. The effect is to provide previously unavailable visibility into areas of their operations. The ability to see and compare performance across operations and people often increases communication and knowledge sharing.

Tableau 9, released in April 2015, which I discussed, introduced advances including analytic ease of use and performance, new APIs, data preparation, storyboarding and Project Elastic, the precursor to this year’s announcement of Vizable. Adoption of 9.x appears to be robust given both the number of conference attendees and increases in third-quarter revenue ($170 million) and new customers (3,100) reported to the financial markets.

As was the case last year, conference announcements included some developments already on the market as well as some still to come. Among the data preparation capabilities introduced are integration and automated spreadsheet cleanup. For the former, being able to join two data sets through a union function, which adds rows to form a single data set, and to integrate across databases by joining specific data fields gives users flexibility in combining, analyzing and visualizing multiple sets of data. For the latter, to automate the spreadsheet cleanup process Tableau examined usage patterns of Tableau Public to learn how users manually clean their spreadsheets, then used machine-learning algorithms to help users automate those tasks. Being able to automatically scan Excel files to find subtables and automatically transform data without manual calculations and parsing will save time for analysts who otherwise would have to do these tasks manually. Our benchmark research into information optimization shows that data preparation consumes the largest portion of time spent on analytics in nearly half (47%) of organizations; the figure is even higher, 59 percent, in our latest data and analytics in the cloud benchmark research.
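For readers less familiar with the two integration operations, the sketch below illustrates a union (adding rows) versus a cross-source join (adding columns) in generic terms using pandas; the data and field names are hypothetical, and this is not Tableau's internal mechanism.

```python
import pandas as pd

# Union: append rows from two structurally similar extracts into one data set
q1 = pd.DataFrame({"region": ["East", "West"], "sales": [120, 95]})
q2 = pd.DataFrame({"region": ["East", "West"], "sales": [140, 110]})
full_year = pd.concat([q1, q2], ignore_index=True)          # adds rows

# Cross-source join: combine fields from two sources on a shared key
orders = pd.DataFrame({"customer_id": [1, 2, 3], "amount": [250, 90, 410]})
crm = pd.DataFrame({"customer_id": [1, 2, 3], "segment": ["SMB", "Enterprise", "SMB"]})
enriched = orders.merge(crm, on="customer_id", how="left")   # adds columns

print(full_year)
print(enriched)
```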

Advanced analytics is another area of innovation for Tableau. The company demonstrated developments in outlier detection and clustering analysis natively integrated with the software. Use of these features is straightforward and visually oriented, replacing statistical charts with drag-and-drop manipulation. The software does not let users specify the number of segments or filter the degree of the outliers, but the basic capability can reduce data sets to more manageable analytic sets and facilitate exploration of anomalous data points within large sets. Unlike the interpretation of the box plots introduced at last year’s conference, these tasks are more intuitive and better suited to business users of information.
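As a rough illustration of what clustering and outlier detection do to a data set, the following Python sketch uses scikit-learn's k-means and isolation forest; these are generic stand-ins for the category of technique, not the algorithms Tableau has natively integrated, and the data and parameters are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Synthetic two-dimensional data: two dense groups plus a few stray points
data = np.vstack([
    rng.normal(loc=[0, 0], scale=0.5, size=(100, 2)),
    rng.normal(loc=[5, 5], scale=0.5, size=(100, 2)),
    [[10, -3], [-4, 9]],                     # anomalous points
])

# Clustering: reduce the data set to a small number of segments
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(data)

# Outlier detection: flag points that do not fit the overall distribution
outliers = IsolationForest(contamination=0.01, random_state=0).fit_predict(data)

print("cluster sizes:", np.bincount(clusters))
print("rows flagged as outliers:", np.where(outliers == -1)[0])
```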

The company also demonstrated new mapping and geospatial features at the conference. Capabilities to analyze down to the zip code on a global basis, define custom territories, support geospatial files, integrate with the open source mapping platform MapBox and perform calculations within the context of a digital map are all useful features for location analytics, which is becoming more important in areas such as customer analytics and digital devices connected in the emerging Internet of Things (IoT). Tableau is adding capabilities that participants most often cited as important in our research on location analytics: to provide geographic representation (72%), visualize metrics associated with locations (65%) and directly select and analyze locations on maps (61%).

Tableau insists that its development of new capabilities is guided by customer requests. This provides a source of opportunities to address user needs, especially in the areas of data preparation, advanced analytics and location analytics. However, this strategy raises the question of whether it will ultimately put the company in conflict with the partners that have helped build the Tableau ecosystem and feed the company’s momentum thus far. Tableau is positioning its product as a fully featured analytic platform of the sort that I have outlined, but to achieve that eventually it will have to encroach on the capabilities that partners such as Alteryx, Datawatch, Informatica, Lavastorm, Paxata and Trifacta offer today. Another question is whether Tableau will continue its internal development strategy or opt to acquire companies that can broaden its capabilities, the current limits of which hampered its overall rating in our 2015 Analytics and Business Intelligence Value Index. In light of announcements at the conference, the path seems to be to develop these capabilities in-house. While there appears to be no immediate threat to the partnerships, continued development of some of these capabilities eventually will affect the partner business model in a more material way. Given that the majority of deals in its partner ecosystem flow through Tableau itself, many of the partners are vulnerable to these development efforts. In addition, I will be watching how aggressively Tableau helps to market Spark, the open source big data technology that I wrote about, as compared to some of the partner technologies that Spark threatens. Tableau has already built on Spark while some of its competitors have not, which may give Tableau a window of opportunity.

Going forward, integration with transactional systems and emerging cloud ecosystems is an area for Tableau that I will be watching. Given its architecture it’s not easy for Tableau to participate in the new generation of service-oriented architectures that characterize part of today’s cloud marketplace. For this reason, Tableau will need to continue to build out its own platform and the momentum of its ecosystem – which at this point does not appear to be a problem.

Finally, it will be interesting to see how Tableau eventually aligns its stand-alone data visualization application Vizable with its broader mobile strategy. We will be looking closely at the mobile market in our upcoming Mobile Analytics and Business Intelligence Value Index in the first half of 2016; our last analysis found Tableau in the middle of the pack among providers, but it has made further investments since then.

We recommend that companies exploring analytics platforms, especially for on-premises and hosted cloud use, include Tableau on their short lists. Organizations considering deploying Tableau on an enterprise basis should look closely at how it aligns with their broader user requirements and whether its cloud strategy will meet their future needs. Furthermore, while the company has made improvements in manageability and performance, these can still be a concern in some circumstances. Tableau should also be evaluated with specific business objectives in mind and in conjunction with its partner ecosystem.

Regards,

Ventana Research

PentahoWorld 2015, Pentaho’s second annual user conference, held in mid-October, centered on the general availability of release 6.0 of its data integration and analytics platform and its acquisition by Hitachi Data Systems (HDS) earlier this year. Company spokespeople detailed the development of the product in relation to the roadmap laid out in 2014 and outlined plans for its integration with those of HDS and its parent Hitachi. They also discussed Pentaho’s and HDS’s shared intentions regarding the Internet of Things (IoT), particularly in telecommunications, healthcare, public infrastructure and IT analytics.

Pentaho competes on the basis of what it calls a “streamlined data refinery” that enables a flexible way to access, transform and integrate data and embed and present analytic data sets in usable formats without writing new code. In addition, it integrates a visual analytic workflow interface with a business intelligence front end including customization extensions; this is a differentiator for the company since much of the self-serve analytics market in which it competes is still dominated by separate point products.

Pentaho 6 aims to provide manageable and scalable self-service analytics. A key advance in the new version is what Pentaho calls “virtualized data sets” that logically aggregate multiple data sets according to transformations and integration specified by the Pentaho Data Integration (PDI) analytic workflow interface. This virtual approach allows the physical processing to be executed close to the data in various systems such as Hadoop or an RDBMS, which relieves users of the burden of having to continually move data back and forth between the query and response systems. In this way, logical data sets can be served up for consumption in Pentaho Analytics as well as other front-end interfaces in a timely and flexible manner.
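The idea of a virtualized data set can be illustrated, in greatly simplified form, as a logical view whose processing executes inside the source system rather than after the data is moved out. The sketch below uses SQLite and a hypothetical schema as a stand-in for Hadoop or an enterprise RDBMS; it is not Pentaho's actual mechanism.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE customers (customer_id INTEGER, region TEXT);
    INSERT INTO orders VALUES (1, 10, 250.0), (2, 11, 90.0), (3, 10, 410.0);
    INSERT INTO customers VALUES (10, 'East'), (11, 'West');

    -- The "virtual data set": integration logic defined once, executed in place
    CREATE VIEW revenue_by_region AS
    SELECT c.region, SUM(o.amount) AS revenue
    FROM orders o JOIN customers c ON o.customer_id = c.customer_id
    GROUP BY c.region;
""")

# A front-end tool only ever queries the logical data set
for row in con.execute("SELECT * FROM revenue_by_region"):
    print(row)
```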

One challenge that emerges when accessing multiple integrated and transformed data sets is data lineage. Tracking lineage is important to establish trust in the data among users by enabling them to ascertain the origin of data prior to transformation and integration. This is particularly useful in regulated industries that may need access to and tracking of source data to prove compliance. Lineage becomes even more complicated with event data, given the need to trace events completely back to their sources and the sheer number of them; our operational intelligence benchmark research, which examined operationally centered analytics and business intelligence, found this to be the case in over a third of organizations.

In addition, Pentaho 6 uses Simple Network Management Protocol (SNMP) to deliver application programming interface (API) extensions so that third-party tools can help provide governance lower in the system stack and further ensure the reliability of data. Our benchmark research consistently shows that manageability of systems is important to user organizations, particularly for big data environments.

The flexibility introduced with virtual tables and improvements in Pentaho 6.0 around in-line modeling (a concept I discussed after last year’s event) are two critical means to building self-service analytic environments. Marrying various data systems with different data models, sometimes referred to as big data integration, has proven to be a difficult challenge in such environments. Pentaho’s continued focus on big data integration and providing an integration backbone to the many business intelligence tools (in addition to its own) are potential competitive differentiators for the company. While analysts and users prefer integrated tool sets, today’s fragmented analytics market is increasingly dominated by separate tools that prepare data and surface data for consumption. Front-end tools alone cannot automate the big data integration process, which Pentaho PDI can do. Our research into big data integration shows the importance of eliminating manual tasks in this process: 78 percent of companies said it is important or very important to automate their big data integration processes. Pentaho’s ability to integrate with multiple visual analytics tools is important for the company, especially in light of the HDS accounts, which likely have a variety of front-end tools. In addition, the ability to provide an integrated front end can be attractive to independent software vendors, analytics services providers and certain end-user organizations that would like to embed both integration and visualization without having to license multiple products.

Going forward, Pentaho is focused on joint opportunities with HDS such as the emerging Internet of Things. Pentaho cites established industrial customers such as Halliburton, Intelligent Mechatronic Systems and Kirchoff Datensysteme Software as reference accounts for IoT. In addition, a conference participant from Caterpillar Marine Asset Intelligence shared how it embeds Pentaho to help analyze and predict equipment failure on maritime equipment. Pentaho’s ability to integrate and analyze multiple data sources is key to delivering business value in each of these environments, but the company also possesses a little-known asset in the Weka machine learning library, which is an integrated part of the product suite. Our research on next-generation predictive analytics finds that Weka is used by 5 percent of organizations, and many of the companies that use it are large or very large, which is Pentaho’s target market. Given the importance of machine learning in the IoT category, it will be interesting to see how Pentaho leverages this asset.

Also at the conference, an HDS spokesperson discussed its target markets for IoT, or what the company calls “social innovation.” These markets include telecommunications, healthcare, public infrastructure and IT analytics and reflect HDS’s customer base and the core businesses of its parent company Hitachi. Pentaho Data Integration is currently embedded within major customer environments such as Caterpillar, CERN, FINRA, Halliburton, NASDAQ, Sears and Staples, but not all of these companies fit directly into the IoT segments HDS outlined. While Hitachi’s core businesses provide fertile ground in which to grow its business, Pentaho will need to develop integration with the large industrial control systems already in place in those organizations.

The integration of Pentaho into HDS is a key priority. The 2,000-strong global sales force of HDS is now incented to sell Pentaho, and it will be important for the reps to include it as they discuss their accounts’ needs. While Pentaho’s portfolio can potentially broaden sales opportunities for HDS, big data software is a more consultative sale than the price-driven hardware and systems the sales force may be used to. Furthermore, the buying centers, which are shifting from IT to lines of business, can differ significantly based on the type of organization and its objectives. Addressing this will require significant training within the HDS sales force and with partner consulting channels. The joint sales efforts will be well served by emphasizing the “big data blueprints” developed by Pentaho over the last couple of years and developing new ones that speak to IoT and the combined capabilities of the two companies.

HDS says it will begin to embed Pentaho into its product portfolio but has promised to leave Pentaho’s roadmap intact. This is important because Pentaho has done a good job of listening to its customers and addressing the complexities that exist in big data and open source environments. As the next chapter unfolds, I will be looking at how the company integrates its platform with the HDS portfolio and expands it to deal with the complexities of IoT, which we will be investigating in an upcoming benchmark research study.

For organizations that need to use large-scale integrated data sets, Pentaho provides one of the most flexible yet mature tools in the market, and they should consider it. The analytics tool provides an integrated and embeddable front end that should be of particular interest to analytics services providers and independent software vendors seeking to make information management and data analytics core capabilities. For existing HDS customers, the Pentaho portfolio will open conversations in new areas of those organizations and potentially add considerable value within accounts.

Regards,

Ventana Research

The emerging Internet of Things (IoT) extends digital connectivity to devices and sensors in homes, businesses, vehicles and potentially almost anywhere. This innovation enables devices designed for it to generate and transmit data about their operations; analytics using this data can facilitate monitoring and a range of automatic functions.

To perform these functions IoT requires what Ventana Research calls Operational Intelligence (OI), a discipline that has evolved from the capture and analysis of instrumentation, networking and machine-to-machine interactions of many types. We define operational intelligence as a set of event-centered information and analytic processes operating across an organization that enable people to use that event information to take effective actions and make optimal decisions. Our benchmark research into Operational Intelligence shows that organizations most often want to use such event-centric architectures for defining metrics (37%) and assigning thresholds for alerts (35%) and for more action-oriented processes of sending notifications to users (33%) and linking events to activities (27%).
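A minimal sketch of that event-centric pattern, computing a metric over incoming events, comparing it to a threshold and sending a notification when the threshold is crossed, might look like the following Python fragment. The sensor fields, window size, threshold value and notify() stub are illustrative assumptions, not a reference architecture.

```python
from collections import deque

WINDOW = 5            # number of recent readings to average
THRESHOLD = 80.0      # alert when the rolling average exceeds this value

def notify(message):
    # Stand-in for email, SMS or a ticketing-system call
    print("ALERT:", message)

readings = deque(maxlen=WINDOW)

def on_event(sensor_id, value):
    """Handle one incoming event: update the metric and check the threshold."""
    readings.append(value)
    rolling_avg = sum(readings) / len(readings)
    if rolling_avg > THRESHOLD:
        notify(f"sensor {sensor_id}: rolling average {rolling_avg:.1f} exceeds {THRESHOLD}")

# Simulated event stream
for v in [70, 75, 78, 85, 92, 97]:
    on_event("pump-17", v)
```

Real operational intelligence platforms add durable event ingestion, correlation across many streams and workflow-driven responses, but the defining move is the same: acting on the metric as events arrive rather than after they are stored.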

In many industries, organizations can gain competitive advantage if they can reduce the elapsed time between an event occurring and actions taken or decisions made in response to it. Existing business intelligence (BI) tools provide useful analysis of and reporting on data drawn from previously recorded transactions, but to improve competitiveness and maximize efficiencies organizations are concluding that employees and processes – in IT, business operations and front-line customer sales, service and support – also need to be able to detect and respond to events as they happen. Our research into big data integration shows that nearly one in four companies currently integrate data into big data stores in real time. The challenge is to go further and act upon both the data that is stored and the data that is streaming in a timely manner.

The evolution of operational intelligence, especially in conjunction with IoT, is encouraging companies to revisit their priorities and spending for information technology and application management. However, sorting out the range of options poses a challenge for both business and IT leaders. Some see potential value in expanding their network infrastructure to support OI. Others are implementing event processing (EP) systems that employ new technology to detect meaningful patterns, anomalies and relationships among events. Increasingly, organizations are using dashboards, visualization and modeling to notify nontechnical people of events and enable them to understand their significance and take appropriate and immediate action.

As with any innovation, using OI for IoT may require substantial changes. These are among the challenges organizations face as they consider adopting operational intelligence:

  • They find it difficult to evaluate the business value of enabling real-time sensing of data and event streams using identification tags, agents and other systems embedded not only in physical locations like warehouses but also in business processes, networks, mobile devices, data appliances and other technologies.
  • They lack an IT architecture that can support and integrate these systems as the volume and frequency of information increase.
  • They are uncertain how to set reasonable business and IT expectations, priorities and implementation plans for important technologies that may conflict or overlap. These can include business intelligence, event processing, business process management, rules management, network upgrades and new or modified applications and databases.
  • They don’t understand how to create a personalized user experience that enables nontechnical employees in different roles to monitor data or event streams, identify significant changes, quickly understand the correlation between events and develop a context in which to determine the right decisions or actions to take.

Ventana Research has announced new benchmark research on The Internet of Things and Operational Intelligence that will identify trends and best practices associated with this technology and these processes. It will explore organizations’ experiences with initiatives related to events and data and with attempts to align IT projects, resources and spending with new business objectives that demand real-time intelligence and event-driven architectures. The research will investigate how organizations are increasing their responsiveness to events by rebalancing the roles of networks, applications and databases to reduce latency; it also will explore ways in which they are using sensor data and alerts to anticipate problematic events. We will benchmark the performance of organizations’ implementations, including IoT, event stream processing, event and activity monitoring, alerting, event modeling and workflow, and process and rules management.

As operational intelligence evolves as the core of IoT platforms, it is an important time to take a closer look at this emerging opportunity and challenge. For those interested in learning more or becoming involved in this upcoming research, please let me know.

Regards,

Ventana Research

Our benchmark research consistently shows that business analytics is the most significant technology trend in business today and that acquiring effective predictive analytics is organizations’ top priority for analytics. It enables them to look forward rather than backward and, participating organizations reported, leads to competitive advantage and operational efficiencies.

In our benchmark research on big data analytics, for example, 64 percent of organizations ranked predictive analytics as the most important analytics category for working with big data. Yet a majority indicated that they do not have enough experience in applying predictive analytics to business problems and lack training on the tools themselves.

Predictive analytics improves an organization’s ability to understand potential future outcomes of variables that matter. Its results enable an organization to decide correct courses of action in key areas of the business. Predictive analytics can enhance the people, process, information and technology components of an organization’s future performance.

In our most recent research on this topic, more than half (58%) of participants indicated that predictive analytics is very important to their organization, but only one in five said they are very satisfied with their use of those analytics. Furthermore, our research found that implementing predictive analytics would have a transformational impact in one-third of organizations and a significant positive impact in more than half of the rest.

In our new research project, The Next Generation of Predictive Analytics, we will revisit predictive analysis with an eye to determining how attitudes toward it have changed,  along with its current and planned use, and its importance in business. There are significant changes in this area, including where, how, why, and when predictive analytics are applied. We expect to find changes not only in forecasting and analyzing customer churn but also in operational use at the front lines of the organization and in improving the analytic process itself. The research will also look at the progress of emerging statistical languages such as R and Python, which I have written about.

As does big data analytics, predictive analytics involves sourcing data, creating models, deploying them and managing them to understand when an analytic model has become stale and ought to be revised or replaced. It should be obvious that only the most technically advanced users will be familiar with all this, so to achieve broad adoption, predictive analytics products must mask the complexity and be easy to use. Our research will determine the extent to which usability and manageability are being built into product offerings.
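To illustrate the model-management point, the sketch below trains a simple model, then checks whether its performance on newer data has degraded enough to call the model stale. The synthetic data, the choice of AUC as the metric and the tolerance value are assumptions for illustration, not a prescribed practice.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# "Deployment time": train a model and record its baseline performance
X_train = rng.normal(size=(500, 3))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 1] + rng.normal(0, 0.5, 500) > 0).astype(int)
model = LogisticRegression().fit(X_train, y_train)
baseline_auc = roc_auc_score(y_train, model.predict_proba(X_train)[:, 1])

# "Later": new labeled data arrives whose relationship to the target has shifted
X_new = rng.normal(size=(500, 3))
y_new = (X_new[:, 2] + rng.normal(0, 0.5, 500) > 0).astype(int)   # drifted pattern
current_auc = roc_auc_score(y_new, model.predict_proba(X_new)[:, 1])

TOLERANCE = 0.05
if current_auc < baseline_auc - TOLERANCE:
    print(f"model stale: AUC fell from {baseline_auc:.2f} to {current_auc:.2f}; retrain")
```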

The promise of predictive analytics, including competitive advantage (68%), new revenue opportunities (55%) and increased profitability (52%), is significant. But to realize the advantages of predictive analytics, companies must transform how they work. In terms of people and processes, a more collaborative strategy may be necessary. Analysts need tools and skills in order to use predictive analytics effectively. A new generation of technology is also becoming available that makes predictive analytics easier to apply, use and deploy into line-of-business processes. This will help organizations significantly, since there are not enough data scientists and specially trained predictive analytics professionals for every organization to hire or afford.

This benchmark research will look closely at the evolving use of predictive analytics to establish how it equips business to make decisions based on likely futures, not just the past.

Regards,

Tony Cosentino

VP & Research Director

Mobility continues to be a hot adoption area in business intelligence, according to our research across analytics and line-of-business departments. Nearly three-quarters (71%) of organizations said their mobile workforce would be able to access BI capabilities in the next 12 months, according to our next-generation mobile business intelligence research. Roambi, a provider of mobile business intelligence applications, has made important strides this year after moving to deploy its products in the cloud, a move that I covered previously. Roambi is rated one of the top providers of mobile business intelligence, or what we refer to as a Hot Vendor, in our Value Index.

Earlier this year, Roambi announced a partnership with Box, which offers cloud-based data storage and file sharing. More recently it began to catch up with the market by announcing support for the Android mobile operating system. Roambi has focused on the mobile BI market from its inception, first by building on the Apple iPhone’s small screen and then progressing to support the majority of mobile devices in corporations today.

The Box partnership, announced in March, enables joint Box and Roambi customers to visualize and interact with data stored on the Box file sharing system. Specifically, users are able to sync Roambi Analytics, the company’s visualization engine, and Roambi Flow, its storyboarding capability, with Box. The integration allows fast importation of Box source files and the ability to create and share interactive reports through Roambi Analytics and to create digital magazines and content through Roambi Flow. Data can be automatically refreshed in real time via Box Sync.

This partnership serves both companies, since the coupled services provide users with expanded capabilities and little overlap. Box’s rapid growth is being driven by its ease of use, open platform and cloud approach to file sharing. This is a natural platform for Roambi to build on to expand its customer base. For Box, Roambi’s dynamic visualization and collaboration capabilities address its business customers’ needs and increase its market opportunities. In our benchmark research on information optimization, 83 percent of organizations said it is important to have components that provide interactive capabilities to support presentation of information.

Roambi also works with application vendors in the CRM and ERP markets to integrate their applications. The same research shows that CRM (for 45% of organizations) and ERP (38%) are important application types to integrate with others, especially in areas such as sales and customer service. Apple’s recent announcement of a new SDK should facilitate process integration between software systems so that, for example, users can access and navigate applications such as those from Box and Roambi and transactional applications such as CRM and ERP in a single workflow. This capability can provide further usability advantages for Roambi, which scored the highest rating in this area in our Mobile Business Intelligence Value Index.

Roambi also announced mobile BI support for the Google Android operating system, which runs on a wide range of smartphones and tablets. It had delayed development for the Android platform, which required significant development resources and investment, as part of its strategy to maximize its business potential and relationship with Apple. The release is available in the Google Play store. The initial version includes four of Roambi’s most popular views: Card, Catalist, Layers and Superlist. Similar to its application for the Apple platform, security features for Android include Device Lockout, Remote Wipe and Application Passcode. Integration with Roambi’s partner Okta provides identity management services and gives access to any applications supported by Okta. Integration includes Active Directory (AD) and Lightweight Directory Access Protocol (LDAP). While Roambi Flow will not be available on Android out of the gate, the company says it will become available by the end of 2014.

Current approaches to mobile business intelligence applications on the market include native, Web-based and hybrid (a combination of both). We compare these approaches in detail in the executive summary of our Mobile Business Intelligence Value Index report. With the new Android support, Roambi takes a unique approach to the hybrid architecture that bypasses the browser completely. No data is cached in a browser; instead, data is loaded directly to the device and rendered natively through Roambi. From the user’s perspective, the benefit of this approach is performance, since interactivity does not rely on data traveling over a network. A second benefit is offline access to data, which is not available via non-native approaches. From the developer’s or information assembler’s perspective, testing across browsers is not needed since there is no data caching and the experience is the same regardless of the browser in use.

Our next-generation business intelligence research shows that executives strongly support mobile BI: Nearly half of them said that mobility is very important to their BI processes. Executives also favor Apple over Android devices, which is likely one of the key reasons for Apple’s dominance in the current business intelligence landscape. However, our research shows latent demand for Android devices in the business intelligence market, and given Android’s dominance in the consumer market as well as in places like Asia and other emerging markets, any company that seeks to take significant global market share will need to support both platforms. Our information optimization research found Google Android ranked second (24%) among smartphone platforms, behind Apple iPhone (59%). Android support is therefore an important evolution for Roambi in addressing increasing demand in the market for Android devices.

Usability has become the most important evaluation criterion; in the information optimization benchmark research it was selected as most important by over half (58%) of organizations. Roambi ranked highest in usability in the Ventana Research Mobile Business Intelligence Value Index, though its overall score was hampered somewhat by the lack of support for Android. With Android support, the company now addresses the need to support multiple devices and the so-called bring your own device (BYOD) practices that organizations increasingly allow. By addressing this as well as taking advantage of broader market developments such as the new Apple SDKs, Roambi continues to address what organizations find important today. Usability of business intelligence systems is a top priority for 63 percent of companies. Even our big data analytics research finds a growing level of importance for mobile access in almost half of organizations (46%). Any company that wants to get user-friendly business intelligence into the hands of its mobile workers quickly and effectively should have Roambi in its consideration set.

Regards,

Tony Cosentino

VP and Research Director

At its annual industry analyst summit last month and in a more recent announcement of enterprise support for parallelizing the R language on its Aster Discovery Platform, Teradata showed that it is adapting to changes in database and analytics technologies. The presentations at the conference revealed a unified approach to data architectures and value propositions in a variety of uses including the Internet of Things, digital marketing and ETL offloading. In particular, the company provided updates on the state of its business as well as how the latest version of its database platform, Teradata 15.0, is addressing customers’ needs for big data. My colleague Mark Smith covered these announcements in depth. The introduction of scalable R support was discussed at the conference but not announced publicly until late last month.

Teradata now has a beta release of parallelized support for R, an open source programming language used significantly in universities and growing rapidly in enterprise use. One challenge is that R relies on a single-thread, in-memory approach to analytics. Parallelization of R allows algorithms to run on much larger data sets since they are not limited to data stored in memory. For a broader discussion of the pros and cons of R and its evolution, see my analysis. Our benchmark research shows that organizations are counting on companies such as Teradata to provide a layer of abstraction that can simplify analytics on big data architectures. More than half (54%) of advanced analytics implementations are custom built, but in the future this percentage will go down to about one in three (36%).

Teradata’s R project has three parts. The first includes a Teradata Aster R library, which supplies more than 100 prebuilt R functions that hide complexity of the in-database implementation. The algorithms cover the most common big data analytic approaches in use today, which according to our big data analytics benchmark research are classification (used by 39% of organizations), clustering (37%), regression (35%), time series (32%) and affinity analysis (29%). Some use innovative approaches available in Aster such as Teradata’s patented nPath algorithm, which is useful in areas such as digital marketing. All of these functions will receive enterprise support from Teradata, likely through its professional services team.

The second part of the project involves the R parallel constructor. This component gives analysts and data scientists tools to build their own parallel algorithms based on the entire library of open source R algorithms. The framework follows the “split, apply and combine” paradigm, which is popular among the R community. While Teradata won’t support the algorithms themselves, this tool set is a key innovation that I have not yet seen from others in the market.
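For readers unfamiliar with the paradigm, the following sketch shows split, apply and combine in generic form, written in Python for brevity rather than R and run locally with multiprocessing instead of inside Teradata Aster; the data and per-group computation are hypothetical.

```python
from multiprocessing import Pool

rows = [
    ("East", 120.0), ("West", 95.0), ("East", 140.0),
    ("West", 110.0), ("North", 80.0), ("North", 60.0),
]

def split(rows):
    """Split: partition rows by group key."""
    groups = {}
    for key, value in rows:
        groups.setdefault(key, []).append(value)
    return list(groups.items())

def apply_fn(item):
    """Apply: run the per-group computation (here, a simple average)."""
    key, values = item
    return key, sum(values) / len(values)

if __name__ == "__main__":
    partitions = split(rows)
    with Pool(processes=3) as pool:           # each partition runs in parallel
        results = pool.map(apply_fn, partitions)
    combined = dict(results)                   # Combine: merge partial results
    print(combined)   # {'East': 130.0, 'West': 102.5, 'North': 70.0}
```

In a parallel database the split happens across nodes where the data already lives, which is what makes the approach attractive for data sets too large for single-threaded, in-memory R.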

Finally, the R engine has been integrated with Teradata’s SNAP integration framework. The framework provides unified access to multiple workload specific engines such as relational (SQL), graph (SQL-GR), MapReduce (SQL-MR) and statistics. This is critical since the ultimate value of analytics rests in the information itself. By tying together multiple systems, Teradata enables a variety of analytic approaches. More importantly, the data sources that can be merged into the analysis can deliver competitive advantages. For example, JSON integration, recently announced, delivers information from a plethora of connected devices and detailed Web data.

Teradata is participating in industry discussions about both data management and analytics. As Mark Smith discussed, its unified approach to data architecture addresses challenges brought on by competing big data platforms such as Hadoop and other NoSQL approaches, like the integration announced with MongoDB to support JSON. These platforms access new information sources and help companies use analytics to indirectly increase revenues, reduce costs and improve operational efficiency. Analytics applied to big data serve a variety of uses, most often cross-selling and up-selling (for 38% of organizations), better understanding of individual customers (32%) and optimizing price (30%) and IT operations (24%). Teradata is active in these areas and is working in multiple industries such as financial services, retail, healthcare, communications, government, energy and utilities.

Current Teradata customers should evaluate the company’s broader analytic and platform portfolio, not just the database appliances. In the fragmented and diverse big data market, Teradata is sorting through the chaos to provide a roadmap for the largest organizations down to midsized ones. The Aster Discovery Platform can put power into the hands of analysts and statisticians who need not be data scientists. Business users from various departments, but especially high-level marketing groups that need to integrate multiple data sources for operational use, should take a close look at the Teradata Aster approach.

Regards,

Tony Cosentino

VP & Research Director

Information Builders announced two major new products at its recent annual user summit. The first was InfoDiscovery, a tool for ad hoc data analysis and visual discovery. The second was iWay Sentinel, which allows administrators to manage applications in a proactive and dynamic manner. Being a privately held company, Information Builders is not a household name, but it is a major provider of highly scalable business intelligence (BI) and information management software to companies around the world.

This year’s announcements come one year after the release of WebFOCUS 8.0, which I wrote about at the time. Version 8.0 of this flagship BI product includes a significant overhaul of the underlying code base, and its biggest change is how it renders graphics, by putting the parameters of the HTML5 graph code directly inside the browser. This approach allows consistent representation of business intelligence graphics across multiple device environments including mobile ones. Our research into information optimization shows that mobile technology improves business performance significantly in one out of three organizations. The graphics capability helped Information Builders earn the rating of Hot Vendor in our latest Value Index on Mobile Business Intelligence. Combining analytics with transactional systems in a mobile environment is an increasingly important trend. Our research shows that mobile business intelligence is advancing quickly: Nearly three-quarters (71%) of participants said they expect their mobile workforce to have BI capabilities in the next 12 months.

WebFOCUS InfoDiscovery represents the company’s new offering in the self-service analytics market. For visual discovery it enables users to extract, blend and prepare data from various data sources such as spreadsheets, company databases and third-party sources. Once the analytic data set is created, users can drill down into the information in an underlying columnar database. They can define queries as they go and examine trends, correlations and anomalies in the data set. Users given permission can publish visualizations from their desktop to the server for others to view or build upon. Visualization is another area of increasing importance for organizations. Our research on big data analytics finds that data visualization has a number of benefits; the most often cited are faster analytics (by 49%), understanding content (48%), root-cause analysis (40%) and displaying multiple result sets at the same time (40%).

InfoDiscovery is Information Builders’ contender in the new breed of visual discovery products. The first generation of visual discovery products drew attention for their visual capabilities, ease of use and agility. More recently, established business intelligence vendors, of which Information Builders is one, have focused on developing visual discovery tools on the platforms of their well-known BI products, with the aim of taking advantage of their maturity. Currently this second wave of tools is still behind the first in ease of use and visual analysis but is advancing rapidly, and it can provide better data governance, version control, auditing and user security. For instance, InfoDiscovery uses the same metadata as the enterprise platform WebFOCUS 8, so objects from both InfoDiscovery and other WebFOCUS applications can be configured in the same user portal. When a business user selects a filter, the data updates across all the components in the dashboard. The HTML5 rendering engine, new in WebFOCUS 8.0, makes the dashboard available to various devices including tablets and smartphones.

The other major announcement at the conference, iWay Sentinel, is a real-time application monitoring tool that helps administrators manage resources across distributed systems. It works with iWay Service Manager, which is used to manage application workflows. iWay Sentinel allows multiple instances of Service Manager to be viewed and managed from a single Web interface, and administrators can address bottlenecks in system resources both manually and automatically. The tool belongs in the category we call operational intelligence, and as our research finds, activity and event monitoring is the most important use (for 62% of research participants), followed by alerting and notification.

Sentinel is an important product in the Information Builders portfolio for a couple of reasons. Tactically speaking, it enables large organizations that are running multiple implementations of iWay Service Manager to manage infrastructure resources in a flexible and streamlined manner. From a strategic perspective, it ties the company to the emerging Internet of Things (IoT), which connects devices and real-time application workflows across a distributed environment. In such an environment, rules and process flows must be monitored and coordinated in real time. Information is passed along an enterprise service bus that enables synchronous interaction of various application components. The IoT is used in multiple areas such as remote management of devices, telematics and fleet management, predictive maintenance, supply chain optimization and utilities monitoring. The challenge is that application software is often complex and its processes are interdependent. For this reason, most approaches to the IoT have been proprietary in nature. Even so, Information Builders has a large number of clients in various industries, especially retail, that may be interested in its approach.

Information Builders continues to innovate in the changing IT industry and business demand for analytics and data, building on its integration capabilities and its core business intelligence assets. The breadth and depth of its software portfolio enable the company to capitalize on these assets as demand shifts. For instance, temporal analysis is becoming more important; Information Builders has built that capability into its products for years. In addition, the company’s core software is hardened by years of meeting high-concurrency needs. Companies that have thousands of users need this type of scalable, battle-tested system.

Both iWay Sentinel and InfoDiscovery are in limited release currently and will be generally available later this year. Users of other Information Builders software should examine InfoDiscovery and assess its fit in their organizations. For business users it offers a self-service approach on the same platform as the WebFOCUS enterprise product. IT staff can uphold their governance and system management responsibilities through visibility and flexible control of the platform. For its part iWay Sentinel should interest companies that have to manage multiple instances of information applications and use iWay Service Manager. In particular, retailers, transportation companies and healthcare companies exploring IoT uses should consider how it can help.

Information Builders is exploiting the value of data through what we call information optimization and is finding continued growth in providing information applications that meet specific business and process needs. The company is also beginning to exploit big data sources and mobile technology but will need to invest further to ensure it can address the full spectrum of new business needs. I continue to recommend that any company that must serve a large workforce and needs to blend data and analytics for business intelligence or information applications consider Information Builders.

Regards,

Tony Cosentino

VP and Research Director

Tibco’s recent acquisition of Jaspersoft helps the company fill out its portfolio of business intelligence (BI) and reporting software in an increasingly competitive marketplace. Tibco already offered a range of products in BI and analytics including Tibco Spotfire, an established product for visual data discovery. Jaspersoft and its open source Java reporting tool JasperReports have been around since 2001, and the company says it has 16 million product downloads worldwide, 140,000 production deployments and 2,000 commercial customers in 100 countries. Jaspersoft received attention recently for its partnership with Amazon Marketplace and the ability to embed its system into applications using a credit card and a few simple configuration steps. This example of embedding the technology is an area that Tibco knows well from its history of integrating its technology into enterprise architecture across the planet.

The acquisition is significant given today’s advances in the business intelligence market and the need for tools to serve a variety of users. In some ways the two technologies serve the same users – analysts and business users trying to make decisions with data – but how they approach this and support a broad set of roles and responsibilities differs. Tibco Spotfire, Tibco’s approach to business analytics, handles analytics and visualization, specializing in visual discovery and data exploration, while Jaspersoft addresses query and analysis, reporting, dashboards and other aspects of BI. According to our benchmark research on information optimization, the capabilities business users most often need are to drill into information within applications (37%), search for data (36%) and collaborate (27%). For analysts, the most necessary capabilities are extracting data (39%), designing and integrating metrics (37%) and developing policies and rules for access (34%). With Jaspersoft, Tibco can address both groups and also can embed intelligence and reporting capabilities into operationally oriented environments across a range of business applications.

The acquisition makes sense in that more capabilities are needed to address the expanding scope of business intelligence and analytics. In practice, it will be interesting to see how the open source community and culture of Jaspersoft meshes with the culture of Tibco’s Spotfire division. For now, Jaspersoft will continue as a separate division, so business likely will continue as usual until management decides on specific areas of integration. With respect to development efforts, it will be critical to blend the discovery capabilities of Tibco Spotfire with Jaspersoft’s reporting, which will be a formidable challenge. Another key to success will be how Tibco integrates both with the capabilities from Extended Results, a mobile business intelligence provider Tibco bought in 2013. Mobility is an area where Ventana Research found Jaspersoft significantly lacking, so the Extended Results capabilities should prove useful. Finally, Tibco’s event-enabled infrastructure will likely play a key role as the company continues to invest in operational intelligence for event-focused information gathering and delivery. Our operational intelligence research found a lack of integration between business intelligence tools like Jaspersoft’s and event streams like Tibco’s to be a major challenge in over half (51%) of organizations. This is a potential opportunity for Tibco as it looks at future integration of the technologies.

The Jaspersoft acquisition is not surprising given recent changes in the BI market. The category, which just a few years ago was considered mature and well-defined, is expanding to include areas such as analytic discovery tools, advanced analytics and big data. The union of Tibco Spotfire, which primarily targets line-of-business professionals from analysts to knowledge workers, and Jaspersoft, a more IT-centered company, reflects the need for the industry to bridge a divide that exists in many organizations where IT publishes dashboards and reports to the business. The challenge of using information across business and IT is evident in our information optimization benchmark research, which shows that information management is most often (in 42% of organizations) a joint responsibility of IT and the lines of business, although IT is involved in some capacity in four-fifths of them. It remains to be seen whether the combined company can take on major competitors that take a similar approach and have far more cash resources.

Preliminary indicators show a good fit between these two organizations. Customers of each will be introduced to important new tools and capabilities from the other. One of the first likely moves for Tibco will be to introduce Jaspersoft’s 2,000 commercial customers and global presence to the broader Tibco portfolio. We advise those customers to evaluate what Tibco offers, especially Tibco Spotfire, which continues to be a leader in the visual data discovery market. Before investing, however, customers and prospects should demand clarity on the company’s plans for technical integration of analytics and how these will fit with their organizations’ long-term business intelligence and analytics roadmaps. Tibco customers migrating to the cloud should investigate the work Jaspersoft is doing with companies like Amazon and consider whether the embedded approach to interactive reporting can fit with their analytics, cloud and application strategies.

The opportunity for this acquisition to advance Tibco’s business analytics is significant, but the company has historically not been as progressive in its marketing and sales of analytics as others in the market. Our research shows that demand for visual discovery and big data analytics has grown dramatically, with more than three-quarters of organizations rating them important. Big data analytics and visualization is an area in which Spotfire innovated before the Tibco acquisition, yet it has not seen its fair share of growth from these buying trends. Tibco now has the opportunity to provide analytics and BI that further leverage its entire portfolio of integration, event processing, cloud and social collaboration software products; we will see how it does. To get there it needs to supercharge its analytics efforts significantly by leveraging its new products from Jaspersoft.

Regards,

Tony Cosentino

VP and Research Director

Alteryx has released version 9.0 of Alteryx Analytics, which provides a range of capabilities from data blending to predictive analytics, in advance of its annual user conference, Inspire 2014. I have covered the company for several years as it has emerged as a key player in providing a range of business analytics from predictive to big data analytics. The importance of this category is revealed by our latest benchmark research on big data analytics, which finds that predictive analytics is the most important type of big data analytics, ranked first by nearly half (47%) of research participants. Version 9.0 includes new capabilities and integration with a range of new information sources, including the ability to read and write IBM SPSS and SAS files for a range of analytic needs.

After attending Inspire 2013 last year, I wrote about capabilities that are enabling an emerging business role, which Alteryx calls the data artisan. The label refers to analysts who combine art and science in using analytics to help direct business outcomes. Alteryx takes an innovative and intuitive approach to analytic tasks, using workflows that link various data sources through in-memory computation and processing. It takes a “no code,” drag-and-drop approach to integrating data from files and databases, preparing data for analysis, and building and scoring predictive models to yield relevant results. Other vendors in the advanced analytics market are also applying this approach, but few mature tools are currently available. The output of Alteryx analytic processes can be shared automatically in numerous data formats, including direct export into visualization tools such as those from Qlik (new support) and Tableau. This can help users improve their predictive analytics capabilities and take action on the outcomes of analytics, which are the two capabilities most often cited in our research as needed to improve big data analytics.

Alteryx now works with Revolution Analytics to increase the scalability of its system to work with large data sets. The open source language R continues to gain popularity and is being embedded in many business intelligence tools, but it runs only on data that can be loaded into memory. Running only in memory does not address analytics on data sets that run into terabytes and hundreds of millions of values, and it potentially requires a sub-sampling approach to advanced analytics. With RevoScaleR, Revolution Analytics rewrites parts of R’s algorithms so that processing tasks can be parallelized and run in big data architectures such as Hadoop. Such capability is important for analytic problems including recommendation engines, unsupervised anomaly detection, some classification and regression problems, and some clustering problems. These analytic techniques are appropriate for some of the top business uses of big data analytics, which according to our research are cross-selling and up-selling (important to 38%), better understanding of individual customers (32%), analyzing all data rather than a sample (30%) and price optimization (28%). Alteryx Analytics automatically detects whether to use RevoScaleR or open source R algorithms. This approach simplifies the technical complexities of scaling R by providing a layer of abstraction for the analytic professional.
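
To make the idea of such an abstraction layer concrete, here is a minimal sketch in Python (not Alteryx’s or Revolution Analytics’ actual implementation) of how an engine might route a job to an in-memory path when the data fits in RAM and to a chunked, out-of-core path when it does not. The function names and the size threshold are assumptions for illustration only.

    import os

    # Assumed practical ceiling for in-memory processing; purely illustrative.
    IN_MEMORY_LIMIT_BYTES = 2 * 1024**3

    def fit_model(path, fit_in_memory, fit_out_of_core):
        """Route a modeling job to whichever engine suits the size of the input file."""
        if os.path.getsize(path) <= IN_MEMORY_LIMIT_BYTES:
            # Small enough to load whole: use the standard in-memory routine.
            return fit_in_memory(path)
        # Too large for memory: fall back to a chunked, parallelizable routine.
        return fit_out_of_core(path)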

Scoring – the ability to input a data record and receive the probability of a particular outcome – is an important if not well understood aspect of predictive analytics. Our research shows that companies that score models on a timely basis according to their needs get better organizational results than those that score all models the same way. Working with Revolution Analytics, Alteryx has enhanced scoring scalability for R algorithms with new capabilities that chunk data in a parallelized fashion. This approach bypasses the memory-only approach, enabling a theoretically unlimited number of scores to be processed. For large-scale implementations and consumer applications in industries such as retail, an important target market for Alteryx, these capabilities are becoming increasingly important.
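
As a rough illustration of chunked, parallel scoring – a sketch only, not Alteryx’s code – the pattern below reads input in fixed-size chunks so memory use stays bounded, scores each chunk in a worker process and appends the results to an output file. The file names and the stand-in scoring function are hypothetical.

    from concurrent.futures import ProcessPoolExecutor
    import pandas as pd

    def score_chunk(chunk: pd.DataFrame) -> pd.Series:
        # Stand-in for applying a trained model; a real job would call model.predict().
        return chunk.sum(axis=1, numeric_only=True)

    def score_file(in_path: str, out_path: str, chunksize: int = 100_000) -> None:
        reader = pd.read_csv(in_path, chunksize=chunksize)  # never holds the full file
        with ProcessPoolExecutor() as pool:
            for scores in pool.map(score_chunk, reader):
                scores.to_frame("score").to_csv(out_path, mode="a", header=False)

    if __name__ == "__main__":
        score_file("records.csv", "scores.csv")  # hypothetical file names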

Alteryx 9.0 also improves on open source R’s default approach to scoring, which is “all or nothing.” That is, if data is missing (a null value) or a new level for a categorical variable is not included in the original model, R will not score the model until the issue is addressed. This behavior is a particular problem for analysts who want to score data in small batches or individually. In contrast, Alteryx’s new “best effort” approach scores the records that can be run without incident, and those that cannot be run are returned with an error message. This adjustment is particularly important as companies start to deploy predictive analytics into areas such as call centers or within Web applications such as automatic quotes for insurance.
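
A minimal sketch of the “best effort” idea, assuming a generic model object with a predict() method and hypothetical feature and level lists, might look like the following in Python: it scores the records it can and flags the rest rather than failing the whole batch.

    import pandas as pd

    def best_effort_score(df, model, features, known_levels):
        """Score rows that are scorable; report why the others were skipped."""
        bad_null = df[features].isna().any(axis=1)
        bad_level = pd.Series(False, index=df.index)
        for col, levels in known_levels.items():
            bad_level |= ~df[col].isin(levels)
        scorable = ~(bad_null | bad_level)

        scores = pd.Series(pd.NA, index=df.index, dtype="object")
        errors = pd.Series("", index=df.index)
        if scorable.any():
            scores[scorable] = model.predict(df.loc[scorable, features])
        errors[bad_null] = "missing value"
        errors[bad_level & ~bad_null] = "unseen categorical level"
        return pd.DataFrame({"score": scores, "error": errors})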

Alteryx 9.0 also has new predictive modeling tools and functionality. A spline model helps address regression and classification problems that involve data reduction, nonlinear relationships and interactions among variables. It takes a clear-box approach to serve users with differing objectives and skill levels: the underpinnings of the model are exposed so that advanced users can modify it, while less sophisticated users can apply the model without necessarily understanding all of its intricacies. Other capabilities include a Gamma regression tool that lets users fit data to the Gamma family of distributions using the generalized linear modeling (GLM) framework. Heat plot tools for visualizing joint probability distributions, such as between customer income level and customer advocacy, and more robust A/B testing tools, which are particularly important in digital marketing analytics, are also part of the release.
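
For readers who want to see what Gamma-family GLM fitting looks like outside of Alteryx, the snippet below uses Python’s statsmodels library on synthetic data; it illustrates the modeling technique itself, not Alteryx’s tool.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    X = sm.add_constant(rng.uniform(1, 10, size=(200, 2)))   # two predictors plus an intercept
    mu = np.exp(X @ np.array([0.5, 0.2, -0.1]))              # positive-valued conditional mean
    y = rng.gamma(shape=2.0, scale=mu / 2.0)                 # Gamma-distributed response

    # Fit a Gamma GLM with a log link and print the coefficient estimates.
    gamma_glm = sm.GLM(y, X, family=sm.families.Gamma(link=sm.families.links.Log()))
    print(gamma_glm.fit().summary())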

At the same time, Alteryx has expanded its base of information sources. According to our research, working with all sources of data, not just one, is the most common definition of big data analytics, as stated by three-quarters (76%) of organizations. While structured data from transaction systems and so-called systems of record is still the most important, new data sources, including those coming from external sources, are becoming important. Our research shows that the most widely used external data sources are cloud applications (54%) and social media data (46%); five additional data sources, including Internet, consumer, market and government sources, are virtually tied in third position (with 39% to 42% each). Alteryx will need to be mindful of the best practices in big data analytics that I have outlined to ensure it can stay on top of a growing set of requirements to blend big data and apply a range of advanced analytics.

New connectors to the social media data provider Gnip give access to social media websites through a single API, and a DataSift (http://www.datasift.com) connector helps make social media more accessible and easier to analyze for any business need. Other new connectors in 9.0 include those for Foursquare, Google Analytics, Marketo, salesforce.com and Twitter. New data warehouse connectors include those for Amazon Redshift, HP Vertica, Microsoft SQL Server and Pivotal Greenplum. Access to SPSS and SAS data files also is introduced in this version; Alteryx hopes to break down the barriers to entry in accounts dominated by these advanced analytic stalwarts. With already existing connectors to major cloud and on-premises data sources, the company provides a robust integration platform for analytics.

Alteryx is on a solid growth curve, as evidenced by the increasing number of inquiries and my conversations with company executives. That is not surprising given the disruptive potential of the technology and its unique analytic workflow approach to data blending and advanced analytics. This data blending and workflow technology is not highlighted enough; it is one of the largest differentiators of the software, reducing data-related tasks such as preparing (47%) and reviewing (43%) data that our customer analytics research finds get in the way of analysts performing analytics. In addition, Alteryx’s ability to apply location analytics within its product is a key differentiator: our research finds that location analytics delivers far more value than simply viewing traditional visualizations and tables of data, and that it helps rapidly identify areas where customer experience and satisfaction can be improved, the top benefit found in our research. The flexible platform resonates particularly well with lines of business, especially in fast-moving, lightly regulated industries such as travel, retail and consumer goods where speed of analytics is critical. The work the company is doing with Revolution Analytics and the resulting ability to scale are important for advanced analytics that operate on big data. The ability to seamlessly connect and blend information sources is a critical capability for Alteryx, and it is a wise move to invest further in this area, but Alteryx will also need to examine where collaborative technology could help business people work together on analytics within the software. Alteryx will need to continue to adapt to market demand for analytics and keep its focus on varying lines of business so it can continue its growth. Just about any company involved in analytics today should evaluate Alteryx and see how its unique approach can streamline analytics.

Regards,

Tony Cosentino

VP and Research Director
