
SAS Institute, a long-established provider of analytics software, showed off its latest technology innovations and product road maps at its recent analyst conference. In a very competitive market, SAS is not standing still, and executives showed progress on the goals introduced at last year's conference, which I covered.

SAS's Visual Analytics software, integrated with an in-memory analytics engine called LASR, remains the company's flagship product in its modernized portfolio. CEO Jim Goodnight demonstrated Visual Analytics' sophisticated integration with statistical capabilities, which the company sees as a differentiator going forward. The product already provides automated charting capabilities, forecasting and scenario analysis, and SAS probably has been doing user-experience testing, since the visual interactivity is better than what I saw last year. SAS has put Visual Analytics on a six-month release cadence, which is a fast pace but necessary to keep up with the industry.

Visual discovery alone is becoming an ante in the analytics market, since just about every vendor has some sort of discovery product in its portfolio. For SAS to gain on its competitors, it must make advanced analytic capabilities part of the product. In this regard, Dr. Goodnight demonstrated the software's visual statistics capabilities, which can switch quickly from visual discovery into regression analysis, running multiple models simultaneously and then optimizing the best one. The statistical product is scheduled for availability in the second half of this year. With the ability to automatically create multiple models and output summary statistics and model parameters, users can create and optimize models in a more timely fashion, so the information can become actionable sooner. In our research on predictive analytics, more participants (68%) cited competitive advantage as a benefit of predictive analytics than any other benefit, and our research also shows that companies able to update their models daily or more often are very satisfied with their predictive analytics tools more often than others are. The ability to create models in an agile and timely manner is valuable for various uses in a range of industries.
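To make the general pattern concrete, here is a minimal sketch of fitting several candidate regression models on the same data and keeping the best performer. It uses Python and scikit-learn purely for illustration; the dataset and model choices are my own assumptions, not SAS's implementation.

```python
# Illustrative only: fit several candidate regression models and keep the
# best performer -- the general pattern described above, not SAS's engine.
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Ridge, Lasso
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

X, y = make_regression(n_samples=1000, n_features=20, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

candidates = {
    "ols": LinearRegression(),
    "ridge": Ridge(alpha=1.0),
    "lasso": Lasso(alpha=0.1),
}

scores = {}
for name, model in candidates.items():
    model.fit(X_train, y_train)                              # fit each model on the same data
    scores[name] = r2_score(y_test, model.predict(X_test))   # summary statistic per model

best = max(scores, key=scores.get)
print(scores, "-> best model:", best)
```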

SAS enables high-performance computing in three ways. The first is the more traditional grid approach, which distributes processing across multiple nodes. The second is the in-database approach, which allows SAS to run as a process inside the database. The third is extracting data and running it in-memory. The system has the flexibility to run on different large-scale database types such as MPP databases as well as on Hadoop infrastructure through Pig and Hive. This is important because for 64 percent of organizations, the ability to run predictive analytics on big data is a priority, according to our recently released research on big data analytics. SAS can run via MapReduce or directly access the underlying Hadoop Distributed File System and pull the data into LASR, the SAS in-memory system. SAS works with almost all commercial Hadoop implementations, including Cloudera, Hortonworks, EMC's Pivotal and IBM's InfoSphere BigInsights. The ability to put analytical processes into the MapReduce paradigm is compelling, as it enables predictive analytics on big data sets in Hadoop, though the immaturity of initiatives such as YARN may relegate the jobs to batch processing for the time being. The flexibility of LASR and the associated portfolio can help organizations overcome the challenge of architectural integration, which is the most widespread technological barrier to predictive analytics (for 55% of participants in that research). Of note, the SAS approach provides a purely analytical engine; since no SQL is involved in the algorithms, there is no SQL-related overhead, and processing runs directly on the supporting system's resources.
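As a rough illustration of the "query Hadoop, pull the result into memory for analysis" pattern discussed above, the sketch below issues a Hive query and loads the aggregated result into a local data frame. The PyHive library, host name and table are assumptions for the example; this is not SAS's LASR interface.

```python
# Illustrative sketch: run an aggregation through Hive (which compiles to
# cluster jobs) and pull the small result set into local memory for analysis.
import pandas as pd
from pyhive import hive  # assumes a reachable Hive service on a Hadoop cluster

conn = hive.Connection(host="hadoop-edge.example.com", port=10000, database="default")
cur = conn.cursor()

cur.execute(
    "SELECT customer_id, SUM(amount) AS total_spend "
    "FROM transactions GROUP BY customer_id"
)

# Once in memory, the aggregated result can feed further local modeling.
df = pd.DataFrame(cur.fetchall(), columns=["customer_id", "total_spend"])
print(df.describe())
```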

As well as innovating with Visual Analytics and Hadoop, SAS has a clear direction in its road map, intending to integrate the data integration and data quality aspects of the portfolio in a single workflow with the Visual Analytics product. Indeed, data preparation is still a key sticking point for organizations. According to our benchmark research on information optimization, time spent on analytic tasks is still consumed most by data preparation (for 47%) and data quality and consistency (45%). The most valuable task, interpretation of the data, ranks fourth at 33 percent of analytics time. This is a big area of opportunity in the market, as reflected by the flurry of funding for data preparation software companies in the fourth quarter of 2013. For further analysis of SAS's data management and big data efforts, please read my colleague Mark Smith's analysis.
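For readers less familiar with why preparation eats so much analyst time, here is a minimal, generic sketch of typical preparation and quality steps. The file, column names and rules are invented for the example and have nothing to do with SAS's tooling.

```python
# A generic sketch of common data preparation and quality steps:
# deduplication, type normalization, consistency fixes and a quality check.
import pandas as pd

raw = pd.read_csv("customers.csv")  # hypothetical input file

prepared = (
    raw.drop_duplicates(subset="customer_id")                                   # remove duplicate records
       .assign(signup_date=lambda d: pd.to_datetime(d["signup_date"], errors="coerce"))
       .assign(region=lambda d: d["region"].str.strip().str.upper())            # normalize for consistency
       .dropna(subset=["customer_id", "signup_date"])                           # drop rows missing key fields
)

# Simple quality report: share of missing values per column after preparation.
print(prepared.isna().mean().sort_values(ascending=False))
```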

Established relationships with companies like Teradata and a reinvigorated relationship with SAP position SAS to remain at the heart of enterprise analytic architectures. In particular, the co-development effort that allows the SAS predictive analytics workbench to run on top of SAP HANA is promising, though it raises the question of how aggressive SAP will be in advancing its own advanced analytic capabilities on HANA. One area where SAS could learn from SAP is its developer ecosystem. While SAP has thousands of developers building applications for HANA, SAS could do a better job of providing the tools developers need to extend the SAS platform. SAS has been able to prosper with a walled-garden approach, but the breadth and depth of innovation across the technology and analytics industry puts this type of strategy under pressure.

Overall, SAS impressed me with what it has accomplished in the past year and the direction it is heading in. The broad-based development efforts raise a final question of where the company should focus its resources. Based on its progress in the past year, it seems that a lot has gone into visual analytics, visual statistics, LASR and alignment with the Hadoop ecosystem. In 2014, the company will continue horizontal development, but there is a renewed focus on specific analytic solutions as well. At a minimum, the company has good momentum in retail, fraud and risk management, and manufacturing. I’m encouraged by this industry-centric direction because I think that the industry needs to move away from the technology-oriented V’s toward the business-oriented W’s.

For customers already using SAS, the company’s road map is designed to capture market advantage with minimal disruption to existing environments. In particular, focusing on solutions as well as technological depth and breadth is a viable strategy. While it still may make sense for customers to look around at the innovation occurring in analytics, moving to a new system will often incur high switching costs in productivity as well as money. For companies just starting out with visual discovery or predictive analytics, SAS Visual Analytics provides a good point of entry, and SAS has a vision for more advanced analytics down the road.

Regards,

Tony Cosentino

VP and Research Director

Actuate, the driving force behind the open source Eclipse Business Intelligence and Reporting Tools (BIRT) project, is positioning itself in the center of the big-data world through multiple partnerships with companies such as Cloudera, Hortonworks, KXEN, Pervasive and a number of OEMs. These agreements, following on its acquisition of Xenos a couple of years ago, help Actuate address some big issues in big data, involving enterprise integration and closed-loop operational systems that provide what my colleague Robert Kugel refers to as action-oriented information technology systems. Today, most initiatives in big data and Hadoop are still in the proof-of-concept stages or being implemented in organizational siloes. Actuate, with its enterprise orientation and federated architecture, is in a position to potentially advance these efforts in a variety of ways.

Actuate’s back-to-back announcements with Cloudera and Hortonworks give support to the 1.5 million BIRT developers and begin to solidify the ties between the BIRT and Hadoop open source communities. Actuate first announced support for Hadoop with the release of BIRT 3.7 last year when it gave BIRT developers access to Hadoop through Hive Query Language (HQL). Hive allows BIRT native access to Hadoop as a data source and provides a single interface for analysis and reporting across a variety of multistructured data sources. These steps answer the finding in our benchmark research on Hadoop and Information Management that users require more efficient methods to access Hadoop HDFS. The overall result of the partnerships is out-of-the-box access for the two leading distributions of Hadoop, enabling organizations to integrate big data more easily into their BIRT environments.
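To show what treating Hadoop as an ordinary tabular data source through HQL looks like in practice, the sketch below registers raw HDFS files as an external Hive table and then issues the kind of query a report's data set might run. The PyHive wrapper, host, paths and schema are assumptions for illustration; this is not BIRT's own code.

```python
# Illustrative only: expose delimited files already in HDFS as a Hive table,
# then query it the way a reporting data set would through HQL.
from pyhive import hive

conn = hive.Connection(host="hadoop-edge.example.com", port=10000)
cur = conn.cursor()

cur.execute("""
    CREATE EXTERNAL TABLE IF NOT EXISTS weblogs (
        ts STRING, user_id STRING, url STRING, status INT
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t'
    LOCATION '/data/weblogs'
""")

# The same HQL a report might issue against that table.
cur.execute("SELECT url, COUNT(*) AS hits FROM weblogs GROUP BY url ORDER BY hits DESC LIMIT 10")
for row in cur.fetchall():
    print(row)
```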

In the analytics and visualization space, Actuate signed agreements with Pervasive and KXEN. RushAnalyzer, Pervasive's predictive analytics tool that runs natively on Hadoop, will be integrated with the ActuateOne product suite. The combination of RushAnalyzer and ActuateOne gives business users the ability to transform and analyze big data, and puts advanced visualization capabilities in the hands of analysts. The KXEN partnership goes further into predictive analytics by integrating with KXEN's flagship product, InfiniteInsight. This gives users capabilities in key areas of predictive analytics such as cross-selling and up-selling, churn analytics, next-best-offer and market basket analysis. Our benchmark research in big data finds that 41 percent of organizations lack predictive analytics capability.
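Market basket analysis, one of the capabilities named above, reduces to counting item co-occurrences and computing support and lift. The self-contained sketch below illustrates the idea on invented data; it is not KXEN's or Pervasive's implementation.

```python
# A small, self-contained market basket sketch: support and lift for item
# pairs; high lift suggests a cross-sell or next-best-offer pairing.
from itertools import combinations
from collections import Counter

baskets = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "diapers", "beer"},
    {"bread", "milk"},
    {"butter", "milk"},
]
n = len(baskets)

item_counts = Counter(item for b in baskets for item in b)
pair_counts = Counter(pair for b in baskets for pair in combinations(sorted(b), 2))

for (a, b), count in pair_counts.items():
    support = count / n                                            # P(A and B)
    lift = support / ((item_counts[a] / n) * (item_counts[b] / n)) # P(A and B) / (P(A) * P(B))
    print(f"{a} & {b}: support={support:.2f}, lift={lift:.2f}")
```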

Actuate’s OnDemand software is delivered via both PaaS and SaaS. Here Actuate has signed deals with a number of companies, including BMC Software, Cisco, Computer Associates, GE Healthcare, Infor and Siemens, and more recently with Access Data, eMeter and Integrated Data Services. In all, Actuate has more than 200 OEM partnerships, which are of particular importance as companies and developers turn toward the cloud for big-data platform development. Our benchmark research in big data shows that while most current deployments are on-premises, hosted and SaaS deployments will grow faster moving forward.

In addition to the multiple partnerships, Actuate is positioned to ride the big-data wave with Xenos, a content management company it acquired in 2010. Integration with Xenos Enterprise Server allows BIRT developers to design a user front end for powerful parsing technology that enables mining of multistructured data buried in legacy documents and archives of statements, forms and records.
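The parsing described above amounts to extracting structured fields from semi-structured statement and form text. Here is a minimal sketch of that idea in Python with regular expressions; the document layout and field names are invented for the example and are not Xenos's technology.

```python
# Illustrative sketch: mine structured fields out of semi-structured
# legacy statement lines using a regular expression.
import re

statement_lines = [
    "2012-03-01  INV-10482  ACME CORP           1,250.00 USD",
    "2012-03-02  INV-10491  GLOBEX LLC            310.50 USD",
]

pattern = re.compile(
    r"(?P<date>\d{4}-\d{2}-\d{2})\s+"
    r"(?P<invoice>INV-\d+)\s+"
    r"(?P<payee>.+?)\s+"
    r"(?P<amount>[\d,]+\.\d{2})\s+(?P<currency>[A-Z]{3})"
)

records = []
for line in statement_lines:
    m = pattern.search(line)
    if m:
        rec = m.groupdict()
        rec["amount"] = float(rec["amount"].replace(",", ""))  # normalize to a number
        records.append(rec)

print(records)
```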

Overall, I see a three-pronged big-data strategy emerging from Actuate. On one front, it offers an enterprise business intelligence system that easily builds reports using any source of data. A large and growing developer community provides the company with the ability to explore all types of relationships and adjust quickly and nimbly to competitors. On the second level, Actuate provides OEM application developers the ability to invoke broad BI functionality within their own custom applications. This will likely prove to be more important as companies move to cloud-based technologies and closed-loop operational intelligence systems that can drive immediate action within a single desktop or mobile interface. In these first two areas, Actuate is on familiar ground in the enterprise, where it is well known among application developers.

It's on the third front, Actuate's Performance Analytics software, which my colleague Mark Smith wrote about earlier this year, that I think things get interesting for the company. This application focuses on the lines of business and competes with some discovery tools in the market that have already gained traction. Given the shifting landscape of the enterprise buying center with respect to big data, this group is very important, as our big data benchmark research has found. The key for Actuate will be to link the big data and predictive analytics capabilities it gains through partner relationships back to its growing business analytics environment, which operates from the web out to mobile environments. If the company can do so, it will position itself well in business environments where analytics need to be pushed out to support real-time and interactive decision-making by front-line managers.

Regards,

Tony Cosentino – VP & Research Director

As volumes of data grow in organizations, so does the number of Hadoop deployments, and as Hadoop becomes widespread, more organizations demand data analysis, ease of use and visualization of large data sets. In our benchmark research on Hadoop, 88 percent of organizations said analyzing Hadoop data is important, and in our research on business analytics 89 percent said it is important to make it simpler to provide analytics and metrics to all users who need them. As my colleague Mark Smith has noted, Datameer has an ambitious plan to tackle these issues. It aims to provide a single solution in lieu of the common three-step process involving data integration, a data warehouse and BI, giving analysts the ability to apply analytics and visualization to find the dynamic "why" behind data rather than just the static "what."

The Datameer approach places Hadoop at the center of the computing environment rather than looking at it as simply another data source. This, according to company officers, allows Datameer to analyze large, diverse data sets in ways that traditional approaches cannot, which in turn enables end users to answer questions that may have fallen outside of the purview of the standard information architecture. However, Datameer does not offer its software as a replacement for traditional systems but as a complement to them. The company positions its product to analyze interaction data and data relationships as a supplement to transactional data analysis; both are key types of big data that need analysis. Of course, given that most companies are not likely to rip and replace years of system investment and user loyalty, this coexistence strategy is a pragmatic one.

Datameer approaches analytics via a spreadsheet environment. This, too, is pragmatic because, as our business analytics benchmark research shows, spreadsheets are the number-one tool used to generate analytics (by 60% of organizations). Datameer provides descriptive analysis and an interactive dialog box for nested joins of large data sets, but the tool moves beyond traditional analysis with its ability to provide analytics for unstructured data. Path and pattern analyses enable discovery of patterns in massive data sets. Relational statistics, including different clustering techniques, allow for data reduction and latent variable groupings. Data parsing technology is a big part of unstructured data analysis, and Datameer provides prebuilt algorithms for social media text analytics and blogs, among other sources. In all, more than 200 prebuilt algorithms come standard in the Datameer tool set. In addition, users can access spreadsheet macros, open APIs to integrate functions and use the Predictive Model Markup Language (PMML) for model exchange.
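As a generic illustration of the data-reduction and grouping techniques just mentioned, the sketch below combines dimension reduction with clustering in Python using scikit-learn on synthetic data. It is a conceptual example, not Datameer's implementation.

```python
# Generic sketch: reduce many features to a few latent dimensions, then
# cluster the reduced data into groups -- the pattern described above.
from sklearn.datasets import make_blobs
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

X, _ = make_blobs(n_samples=500, n_features=20, centers=4, random_state=0)

reduced = PCA(n_components=2, random_state=0).fit_transform(X)                   # latent-variable style reduction
labels = KMeans(n_clusters=4, random_state=0, n_init=10).fit_predict(reduced)    # cluster groupings

print("reduced shape:", reduced.shape)
print("cluster sizes:", [int((labels == k).sum()) for k in range(4)])
```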

Datameer's latest version, 2.0, advances with a business infographics tool: a visualization layer that enables exploratory data analysis (EDA) through a standard library of widgets, including graphs, charts, diagrams, maps and word clouds. Visualization is one of the key areas lacking in big data deployments today. Analysts work in a free-form layout environment with an easy-to-use drag-and-drop paradigm. Datameer's WYSIWYG editor provides real-time management of the creation and layout of infographics, allowing analysts to see exactly what the end design will look like as they create it. The product also now delivers through HTML5, which allows cross-platform delivery to multiple environments. This is particularly important as Datameer is targeting the enterprise environment, and HTML5 provides a low-maintenance "build once, deploy anywhere" model for mobile platforms.
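For a sense of what "build a chart once, deliver it as HTML5" means in general, here is a minimal sketch using the Plotly library in Python; the data, file name and library choice are my own assumptions and have nothing to do with Datameer's infographics designer.

```python
# Illustrative only: build a chart once and write it out as a self-contained
# HTML5 file that renders in desktop and mobile browsers alike.
import plotly.express as px

data = {"region": ["North", "South", "East", "West"], "revenue": [120, 95, 140, 80]}
fig = px.bar(data, x="region", y="revenue", title="Revenue by region")

fig.write_html("revenue_by_region.html", include_plotlyjs="cdn")
```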

Datameer is an innovative company, but its charter is a big one, given that it competes at multiple levels of the value delivery chain. Its ability to seamlessly integrate analytics and visualization tools on the Hadoop platform is a unique value proposition; at the same time, it will likely need to put more effort into visualization to match what is available from other data discovery players. All in all, for enterprises looking to take advantage of large-scale data in the near term that don't want to wait for other vendors to provide integrated tools on top of Hadoop, Datameer is a company to consider.

Regards,

Tony Cosentino – VP & Research Director
