
At its Teradata Partners conference in Dallas, Teradata articulated a broader vision for big data and analytics. Its pitch centered on three areas – data warehousing, big data analytics and integrated marketing – that to some degree reflect Teradata's core market and its acquisitions in the last few years of companies such as Aprimo, which provides integrated marketing technology, and Aster, which provides big data analytics. The keynote showcased the company's leadership position in the increasingly complex world of open source database software, cloud computing and business analytics.

As I discussed in writing about the 2013 Hadoop Summit, Teradata has embraced technologies such as Hadoop that can be seen as both a threat and an opportunity to its status as a dominant database provider over the past 20 years. Its holistic architectural approach, appropriately named Unified Data Architecture (UDA), reflects an enlightened vision, but it relies on the ideas that separate database workloads will drive a unified logical architecture and that companies will continue to rely on today's major database vendors to provide leadership for the new integrated approach. Our big data benchmark research supports this overall position, since most big data strategies still rely on a blend of approaches including data warehouse appliances (35%), in-memory databases (34%), specialized databases (33%) and Hadoop (32%).

Teradata is one of the few companies that has the capability to produce a truly integrated platform, and we see evidence of this in its advances in UDA, the Seamless Network Analytics Processing (SNAP) Framework and the Teradata Aster 6 Discovery platform. I want to note that the premise behind UDA is that the complexities of the different big data approaches are abstracted from the user, who may access the data through Aster or another BI or visualization tool. This is important because it means that organizations and their users do not need to understand the complexities of the various emerging database approaches prior to using them for competitive advantage.

The developments in Aster 6 show the power of the platform to access new and different workloads for new analytic solutions. Teradata announced three key developments in the Aster platform just before the Partners conference. A graph engine has been added to complement the existing SQL and MapReduce engines. Graph analytics has not had as much exposure as other NoSQL technologies such as Hadoop or document databases, but it is beginning to gain traction for specific use cases where relationships are difficult to analyze with traditional analytics. For instance, any relationship network, including those in social media, telecommunications or healthcare, can use graph engines, but they also are being applied to basket analysis in retail and behind the scenes in areas such as master data management. The reason the graph approach can be considered better in these situations is that it is more efficient: it is easier to look at a graph of a social network and understand the relationships and what is occurring than it is to interpret the same data laid out in rows and columns. Similarly, using the ideas of nodes and edges, a graph database helps discover very complex patterns in the data that may not be obvious otherwise.
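To make the nodes-and-edges idea concrete, here is a minimal sketch in Python using the open source networkx library. The people and connections are hypothetical, and it illustrates only the graph way of thinking, not Teradata's graph engine itself.

```python
# A minimal sketch of graph-style relationship analysis using the open source
# networkx library. The people and connections below are hypothetical; the point
# is how naturally relationship questions map onto nodes and edges.
import networkx as nx

g = nx.Graph()
# Each edge represents an observed relationship, such as a call or a shared account.
g.add_edges_from([
    ("alice", "bob"),
    ("bob", "carol"),
    ("carol", "dave"),
    ("alice", "dave"),
    ("dave", "erin"),
])

# Who is most central to the network? Useful for influence or churn-spread analysis.
centrality = nx.degree_centrality(g)
print(sorted(centrality.items(), key=lambda kv: -kv[1]))

# How are two people connected? One call in a graph; a painful set of self-joins in rows and columns.
print(nx.shortest_path(g, "alice", "erin"))
```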

An integrated storage architecture compatible with HDFS, the Apache Hadoop file system, is another important development in Aster 6. It accommodates fast ingestion and preprocessing of multi-structured data. Perhaps the most important development in Aster 6 is the SNAP Framework, which integrates and optimizes execution of SQL queries across the different analytic engines. That is, Teradata has provided a layer of abstraction that removes the need for expertise in different flavors of NoSQL and puts it into SQL, a language that many data-oriented professionals understand.

Our big data benchmark research shows that staffing and training are major challenges to big data analytics for three-fourths of organizations today, and the advances in Aster 6 address the multiple analytical access points needed in today's big data environment. The three analytic access points that are the focus for Teradata are the data scientist, the analyst and the knowledge worker, as described in my recent post on the analytic personas that matter. For the first group, data scientists, there are open APIs and an integrated development environment (IDE) in which they can develop services directly against the unified data architecture. Analysts, who typically are less familiar with procedural programming approaches, can use the declarative paradigm of SQL to access data and call up functions within the unified data architecture. Some advanced algorithms are now included within Aster, as are a few big data visualizations such as the sankey diagram; on that topic, I think the best interactive sankey visualization for Aster is from Qlik Technologies, and it was showcased at the company's booth at the Teradata conference. The third persona and access point is the knowledge worker, who accesses big data through BI and visualization tools. Ultimately, the Aster 6 platform brings an impressively integrated access approach to big data analytics; we have not yet seen its equal elsewhere in the market.
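To give a sense of the kind of sankey view a knowledge worker might consume, here is a small sketch using the open source plotly library; the path stages and volumes are hypothetical, and this is not the Qlik Technologies or Aster implementation.

```python
# A self-contained sankey sketch using plotly; the stages and volumes are
# hypothetical and only illustrate the kind of path visualization described above.
import plotly.graph_objects as go

labels = ["Home", "Search", "Product", "Cart", "Purchase", "Abandon"]

fig = go.Figure(go.Sankey(
    node=dict(label=labels, pad=20, thickness=15),
    link=dict(
        source=[0, 1, 2, 2, 3, 3],   # indexes into labels
        target=[1, 2, 3, 5, 4, 5],
        value=[100, 80, 50, 30, 35, 15],
    ),
))
fig.update_layout(title_text="Hypothetical customer path volumes")
fig.show()
```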

A key challenge that Teradata faces as it repositions itself from a best-in-class database provider for data warehousing to a big data and big data analytics provider is to articulate clearly how everything fits together to serve the business analyst. For instance, Teradata relies on partners' tools, such as visual discovery products and analytical workflow tools like Alteryx, to tap into the power of its database, but it is hard to see how all of these tools use Aster 6. We saw the Aster 6 n-path analysis nicely displayed in an interactive sankey by QlikTech, which I recently assessed, and an n-path node within the Alteryx advanced analytics workflow, which I also analyzed, but it is unclear how an analyst without specific SQL skills can do more than that. Furthermore, Teradata announced that its database integrates with the full library of advanced analytics algorithms from Fuzzy Logix, and that through a partnership with Revolution Analytics, R algorithms can run directly in the parallelized environment of Teradata, but again it is unclear how this plays with the Aster 6 approach. This is not to downplay the integrations with Fuzzy Logix and Revolution Analytics; these are major announcements and should not be underestimated, especially for big data analytics. However, how these advancements align with the Aster approach and with the usability of advanced analytics is still unclear. Our research shows that usability is becoming the most important buying criterion across categories of software and types of purchasers. In the case of next-generation business intelligence, usability is the number-one buying criterion for nearly two out of three (64%) organizations. Nevertheless, Aster 6 provides an agile, powerful and multifaceted data discovery platform that addresses the skills gap, especially at the upper end of the analyst skills curve.
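For readers unfamiliar with what an n-path analysis produces, the following is a purely conceptual Python sketch, not Aster's SQL-MapReduce function: it scans each customer's ordered events for a pattern and counts the distinct paths. The events and sessions are hypothetical.

```python
# A conceptual stand-in for n-path style analysis: scan each customer's ordered
# event sequence for a pattern (a search that eventually leads to a purchase)
# and count the distinct paths taken. Events and sessions are hypothetical.
from collections import Counter

sessions = {
    "cust_1": ["home", "search", "product", "cart", "purchase"],
    "cust_2": ["home", "search", "product", "abandon"],
    "cust_3": ["search", "product", "cart", "purchase"],
}

def path_between(events, start="search", end="purchase"):
    """Return the sub-path from the first `start` event to the first `end` event, if any."""
    if start in events and end in events:
        i, j = events.index(start), events.index(end)
        if i < j:
            return tuple(events[i:j + 1])
    return None

path_counts = Counter(p for p in (path_between(e) for e in sessions.values()) if p)
for path, count in path_counts.most_common():
    print(" -> ".join(path), count)
```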

In an extension of this exploratory analytics position, Teradata also introduced cloud services. While we have seen vendors of BI and analytics as laggards in the cloud, it is increasingly difficult for them to ignore. Particular use cases are analytic sandboxes and exploratory analytics; in the cloud, users can add or reduce resources as needed to address the analytic needs of the organization. Teradata positions its cloud approach as TCO-neutral, meaning that once all of the associated expenses of running the service are included, it will be no more or less expensive than running it on-premises. This runs counter to a lot of industry talk about the inexpensive nature of Amazon's Redshift platform (based on the ParAccel MPP database that I wrote about). However, IT professionals who actually run databases are sophisticated enough to understand the cost drivers and know that a purely cost-based argument is a red herring. Network infrastructure costs, data governance, security and compliance all come into play, since these issues apply in the cloud much as they do on-premises. TCO-neutral is a reasonable position for Teradata since it shows that the company knows what it takes to deploy and run big data and analytics in the cloud. Although cloud providers market themselves as less expensive, there are still plenty of expenses associated with the cloud. The big differences are in the elasticity of the resources and in the way the cost is distributed as operational expenditures rather than capital expenditures. Buyers should consider all factors before making the data warehouse cloud decision, but overall cloud strategy and use case are two critical criteria.

Its cloud computing direction is emblematic of the analytics market position that Teradata aspires to occupy. For years it has under-promised and over-delivered. This company doesn't introduce products with a lot of hype and bugs and then ask the market to help fix them. Its reputation has earned it some of the biggest clients in the world and built a high level of trust, especially within IT departments. As companies become frustrated with the lack of governance and security and the proliferation of data silos that today's business-driven use of analytics spawns, I expect that the pendulum of power will swing back toward IT. It's hard to predict when this may happen, but Teradata will be particularly well positioned when it does. Until then, on the business side it will continue to compete with systems integration consulting firms and other giants vying for the high-level trusted-advisor position in today's enterprise. In this effort, Teradata has both industry and technical expertise and has established a center of excellence populated by some of the smartest minds in the big data world, including Scott Nau, Tasso Argyros and Bill Franks. I recommend Bill Franks' Taming the Big Data Tidal Wave as one of the most comprehensive and readable books on big data and analytics.

For large and midsize companies that are already Teradata customers, midsize companies with a cloud-first charter and any established organization rethinking its big data architecture, Teradata should be on the list of vendors to consider.

Regards,

Tony Cosentino

VP and Research Director

While I cover providers of business analytics software, it is also interesting to look at some companies that focus on the people, process and implementation aspects of big data and analytics. One such company is Nuevora, which uses a flexible platform to provide customized analytic solutions. I recently met the company's founder, Phani Nagarjuna, when we appeared on a panel at the Predictive Analytics World conference in San Diego.

Nuevora focuses on big data and analytics from the perspective of the analytic life cycle; that is, it helps companies bring together data and then process, visualize and model that data to reach specific business outcomes. Nuevora aims to package implementations of analytics for vertical industries by putting together data sources and analytical techniques and designing the package to be consumed by a target user group. While the core of the analytic service may be the same within an industry category, each solution is customized to the particulars of the client and its view of the market. Using particular information sources and models appropriate to their industry, customers can take advantage of advances in big data and analytics, including new data sources and technologies. For its part, Nuevora does not have to reinvent the wheel for each engagement. It has established patterns of data processing and prebuilt predictive analytics apps that are based on best practices and designed to solve specific problems within industry segments.

The service is currently delivered via a managed service on Nuevora servers called the Big Data Analytics & Apps Platform (nBAAP), but the company’s roadmap calls for more of a software as a service (SaaS) delivery model. Currently nBAAP uses Hadoop for data processing, R for predictive analytics and Tableau for visualizations. This approach brings together best-of-breed point solutions to address specific business issues. As a managed service, it has flexibility in design, and the company can reuse existing SAS and SPSS code for predictive models and can integrate with different BI tools depending on the customer’s environment.

Complementing the nBAAP approach is the Big Data & Analytics Maturity (nBAM) Assessment Framework. This is an industry-based consulting framework that guides companies through their analytic planning process by looking at organizational goals and objectives, establishing a baseline of the current environment, and putting forward a plan that aligns with the analytical frameworks and industry-centric approaches in nBAAP.

From an operating perspective, Nagarjuna, a native of India, taps analytics talent from universities there and places strategic solution consultants in client-facing roles in the United States. The company focuses primarily on big data analytics in marketing, which makes sense since, according to our benchmark research on predictive analytics, revenue-generating functions such as forecasting (cited by 72% of organizations) and marketing (67%) are the two primary use cases for predictive analytics. Nuevora has mapped multiple business processes, such as those involved in gaining a 360-degree view of the customer. For example, at a high level it divides marketing into areas such as retention, cross-sell and up-sell, profitability and customer lifetime value. These provide building blocks for the overall strategy of the organization, and each can be broken down into finer divisions, linkages and algorithms based on the industry. These building blocks also serve as the foundation for the deployment patterns of raw data and preselected data variables, metrics, models, visuals, model update guidelines and expected outcomes.

By providing preprocessing capabilities that automatically produce the analytic data set, then providing updated and optimized models, and finally enabling consumption of these models through the relevant user paradigm, Nuevora addresses some of the key challenges in analytics today. The first is data preparation, which our research shows takes from 40 to 60 percent of analysts’ time. The second is addressing outdated models. Our research on predictive analytics shows that companies that update their models often are much more satisfied with them than are those that do not. While the appropriate timing of model updates is relative to the business context and market changes, our research shows that about one month is optimal.
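To illustrate the general pattern described here – automated preparation of the analytic data set followed by a regular model refresh – the following is a minimal sketch using pandas and scikit-learn. The columns, churn label and cadence are hypothetical; this is not Nuevora's nBAAP code.

```python
# A minimal sketch of the prepare-then-refresh pattern: derive a model-ready
# data set from raw records, then re-fit the model on a regular cadence.
# Columns, labels and cadence are hypothetical illustrations only.
import pandas as pd
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

def build_analytic_dataset(raw: pd.DataFrame) -> pd.DataFrame:
    """Automated preprocessing: turn raw customer records into model-ready features."""
    features = pd.DataFrame({
        "tenure_months": raw["tenure_months"],
        "monthly_spend": raw["total_spend"] / raw["tenure_months"].clip(lower=1),
        "support_calls": raw["support_calls"].fillna(0),
    })
    features["churned"] = raw["churned"]
    return features

def refresh_model(dataset: pd.DataFrame) -> Pipeline:
    """Re-fit the churn model on the latest data; run on a regular (e.g., monthly) cadence."""
    model = Pipeline([("scale", StandardScaler()),
                      ("clf", LogisticRegression(max_iter=1000))])
    X = dataset.drop(columns="churned")
    y = dataset["churned"]
    return model.fit(X, y)

# raw = pd.read_csv("latest_extract.csv")   # hypothetical monthly extract
# model = refresh_model(build_analytic_dataset(raw))
```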

Midsize or larger companies looking to take advantage of big data and analytics matched with specific business outcomes, without having to hire data scientists and build a full solution internally, should consider Nuevora.

Regards,

Tony Cosentino

VP and Research Director

In our benchmark research on business technology innovation, organizations ranked analytics the number-one priority (for 39%) among six technology trends. Big data, perhaps because it is a more technical concept, ranked fifth, with 11 percent of organizations calling it a top innovation priority. But in this time of global business, nonstop communications and fierce competition, more organizations are finding that big data and analytics together can help them cope with constant change. They can help organizations face imperatives such as shortening time-to-value and becoming more agile and adaptive.

Using them properly can increase the potential for competitive advantage in business in various ways. For example, customer analytics, on which we have conducted benchmark research, is changing with access to new sources of information. In telecommunications, for instance, call detail records, which contain enormous amounts of data, are being sifted and combined with customer records to determine things such as quality of service at the individual level. These sources can be combined with various others to help determine a customer's value and propensity to churn. Machine learning can then be applied to figure out the best action to prevent churn at a cost commensurate with the value of the customer. Attribution modeling is changing as well with big data, especially since online tracking is no longer dominated by cookies and analysts need to join online and offline channels. Human capital analytics, on which we recently released new benchmark research, is a hot area as well. It includes analytics in the area of talent management, which is concerned with salaried (rather than hourly) employees and hard-to-find skill sets. It focuses on things like recruitment, career paths, retention and the entire employment life cycle (often beyond a single organization). Human capital analytics also includes workforce management, which uses big data and analytics to optimize scheduling and resource allocation and to match skills to particular tasks. Another area, operational analytics, often focuses on efficiency and can have a great impact on customer service through improvements in delivery times, system responsiveness or predicting outages before they occur. In the area of governance, risk and compliance, things such as fraud detection, system and network management, cybersecurity and portfolio risk all can be addressed using some form of big data and analytics.
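A compact sketch of the decision logic described above for telecommunications churn – score the propensity to churn, then weigh an intervention against the customer's value – might look like the following; the probabilities, customer values and offer cost are hypothetical.

```python
# A sketch of combining a churn propensity score with customer value to choose a
# retention action. The probabilities, values and costs are hypothetical.
def next_best_action(churn_probability: float,
                     customer_value: float,
                     offer_cost: float = 25.0,
                     save_rate: float = 0.30) -> str:
    """Intervene only when the expected value preserved exceeds the cost of the offer."""
    expected_loss = churn_probability * customer_value
    expected_benefit = save_rate * expected_loss
    return "send retention offer" if expected_benefit > offer_cost else "no action"

# Example: a customer worth $600 a year with a 40 percent modeled churn risk.
print(next_best_action(churn_probability=0.40, customer_value=600.0))
```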

New information and new technology are affecting almost every industry and every function, but in different ways and at different speeds. For instance, healthcare and banking are driven more than others by risk management and regulatory compliance, whereas services such as retail are driven more by sales, and manufacturing is driven more by efficiency and cost containment. Retail is facing a great deal of discontinuity as e-commerce, and Amazon in particular, has forced established retailers to completely rethink their strategies.

This discussion is not complete without addressing the impact of big data and analytics on organizations' people, processes and culture, which is a particular topic of my current research. We are seeing in all our research more empowerment of business users and business analysts. This is being driven by a number of internal factors such as industry competition, but also by the ability of business users to rent applications from the cloud without incurring significant capital expenses or depending on their IT groups. Until recently IT chose new tools and provisioned a company standard, but that is less true today. From an information and insights perspective, the power now lies with whoever owns the analytic agenda rather than the technology agenda, and the analytic agenda is owned by those with the most business savvy. At the same time, we see a people challenge, which according to our big data benchmark research manifests in the top two challenges to big data analytics: staffing (for 79% of organizations) and training (for 77%). Organizations have trouble finding qualified professionals to manage big data and providing training to those already on board. The finance department and other numbers-oriented functions, which have much of the analytic talent, are starting to exert influence in less analytically savvy parts of the organization such as human resources and marketing. Since having a complete view of the customer is vital today, the marketing organization is a candidate for big data initiatives, but only if the team is analytically savvy and can take advantage of new sources of information and new technologies. Otherwise, finance or IT will ultimately lead the new analytic efforts of the organization.

There is an interesting dynamic occurring here. Finance and IT are natural allies in that both are numbers- and tools-oriented. Since the 1990s they have led adoption and use of enterprise technologies such as ERP and business intelligence. The marketing department has had a different orientation, but the strength of marketing is its ability to produce top-line revenue by understanding and influencing the customer. I expect that when these forces come together and cross-pollinate the organization, we will start to see a real transformation in 21st century business.

As with many other trends, in their enthusiasm people offer big data as the answer to every question. I heard a great one-liner the other day: "If you are a bartender running low on gin, there's no need to worry about big data." I like this because it refocuses the discussion to look first at the business problems organizations are trying to solve and at the data later. I've written a few pieces trying to add some structure to how to think about big data analytics, such as the four pillars of big data analytics and moving from the technologically oriented big data Vs to the business-focused Ws. My colleague Mark Smith also has written about four types of discovery technology, an area that is critical to exploring the data that is becoming so abundant. Hopefully, these frameworks and approaches can help companies think through the challenges around big data and analytics and the transformation that ensues.

Regards,

Tony Cosentino

VP and Research Director
