
The challenge with discussing big data analytics is in cutting through the ambiguity that surrounds the term. People often focus on the 3 Vs of big data – volume, variety and velocity – which provides a good lens for big data technology, but only gets us part of the way to understanding big data analytics, and provides even less guidance on how to take advantage of big data analytics to unlock business value.

Part of the challenge of defining big data analytics is a lack of clarity around the big data analytics value chain – from data sources, to analytic scalability, to analytic processes and access methods. Our recent research on big data finds that many capabilities are still not available, from predictive analytics (41%) to visualization (37%). Moreover, organizations are unclear on how best to initiate changes in the way they approach data analysis to take advantage of big data and what processes and technologies they ought to be using. The growth in use of appliances, Hadoop and in-memory databases and the growing footprints of RDBMSes all add up to pressure to have more intelligent analytics, but the most direct and cost-effective path from here to there is unclear. What is certain is that as business analytics and big data increasingly merge, the potential for increased value is building expectations.

To understand the organizational chasm that exists with respect to big data analytics, it’s important to understand two foundational analytic approaches used in organizations today. Former Census Bureau Director Robert Groves’ ideas around designed data and organic data give us a great jumping-off point for this discussion, especially as it relates to big data analytics.

In Groves’ estimation, the 20th century was about designed data, or what might be considered hypothesis-driven data. With designed data we engage in analytics by establishing a hypothesis and collecting data to prove or disprove it. Designed data is at the heart of confirmatory analytics, where we go out and collect data that are relevant to the assumptions we have already made. Designed data is often considered the domain of the statistician, but it is also at the heart of structured databases, since we assume that all of our data can fit into columns and rows and be modeled in a relational manner.
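
To make the designed-data pattern concrete, here is a minimal, hypothetical sketch in Python of confirmatory analysis: state a hypothesis up front, collect two designed samples, and test it at a chosen significance level. The campaign figures and the 0.05 threshold are invented purely for illustration.

```python
# Minimal sketch of the designed-data / confirmatory pattern:
# state a hypothesis, collect only the data needed to test it,
# then accept or reject it at a chosen significance level.
# All figures below are hypothetical, purely for illustration.
from scipy import stats

# Hypothesis: the new campaign lifts average order value.
control = [42.0, 39.5, 44.1, 40.2, 41.7, 38.9, 43.3, 40.8]    # designed sample A
treatment = [45.2, 44.0, 47.3, 43.8, 46.1, 44.9, 45.7, 46.5]  # designed sample B

t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject the null hypothesis: the lift appears real.")
else:
    print("Fail to reject the null hypothesis.")
```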

In contrast to the designed data approach of the 20th century, the 21st century is about organic data. Organic data is not limited by a specific frame of reference that we apply to it, and because of this it grows without limits and without any structure other than that provided by randomness and probability. Organic data represents all data in the world, but for pragmatic reasons we may think of it as all the data we are able to instrument. RFID, GPS data, sensor data, sentiment data and various types of machine data are all organic data sources that may be characterized by context or by attributes such as data sparsity (also known as low-density data). Much like the interpretation of silence in a conversation, analyzing big data is as much about interpreting what exists between the lines as it is about what we can put on the line itself.

These two types of data and the analytics associated with them reveal the chasm that exists within organizations and shed light on the skills gap that our predictive analytics benchmark research shows to be the primary challenge for analytics in organizations today. This research finds inadequate support in many areas, including product training (26%) and applying analytics to business problems (23%).

On one side of the chasm are the business groups and the analysts who are aligned with Groves’ idea of designed data. These groups may encompass domain experts in areas such as finance or marketing, advanced Excel users, and even Ph.D.-level statisticians. These analysts serve organizational decision-makers and are tied closely to actionable insights that lead to specific business outcomes. The primary way they get work done is through a flat-file environment, as was outlined in some detail last week by my colleague Mark Smith. In this environment, Excel is often the lowest common denominator.

On the other side of the chasm are the IT and database professionals, where a different analytical culture and mindset exist. The priority challenge for this group is dealing with the three Vs and simply organizing data into a legitimate enterprise data set. This group is often more comfortable with large data sets and the machine learning approaches that are the hallmark of the organic data of the 21st century. Their analytical environment is different from that of their business counterparts; rather than Excel, it is SQL that is often the lowest common denominator.

As I wrote in a recent blog post, database professionals and business analytics practitioners have long lived in parallel universes. In technology, practitioners deal with tables, joins and the ETL process. In business analysis, practitioners deal with datasets, merges and data preparation. When you think about it, these are the same things. The subtle difference is that database professionals have had a data mining mindset, or, as Groves calls it, an organic data mindset, while the business analyst has had a designed data or statistics-driven mindset. The bigger differences revolve around the cultural mindset and the tools that are used to carry out the analytical objectives. These differences represent the current conundrum for organizations.
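
To make the point that these really are the same operations, here is a small, hypothetical sketch that combines a customer table with an order table twice – once as the analyst’s DataFrame merge and once as the database professional’s SQL join. The table and column names are invented for illustration.

```python
# A hypothetical illustration: the database professional's JOIN and the
# business analyst's merge produce the same combined dataset.
import sqlite3
import pandas as pd

customers = pd.DataFrame({"customer_id": [1, 2, 3],
                          "region": ["East", "West", "East"]})
orders = pd.DataFrame({"order_id": [10, 11, 12],
                       "customer_id": [1, 1, 3],
                       "amount": [250.0, 90.0, 400.0]})

# The analyst's view: a merge (data preparation on a flat file / DataFrame).
merged = customers.merge(orders, on="customer_id", how="inner")

# The database professional's view: a SQL join over the same tables.
con = sqlite3.connect(":memory:")
customers.to_sql("customers", con, index=False)
orders.to_sql("orders", con, index=False)
joined = pd.read_sql_query(
    """SELECT c.customer_id, c.region, o.order_id, o.amount
       FROM customers c JOIN orders o ON c.customer_id = o.customer_id""",
    con)

# Same rows, same meaning -- only the vocabulary and tooling differ.
print(merged)
print(joined)
```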

In a world of big data analytics, these two sides of the chasm are being pushed together in a shotgun wedding because the marriage of these groups is how competitive advantage is achieved. Both groups have critical contributions to make, but need to figure out how to work together before they can truly realize the benefits of big data analytics. The firms that understand that the merging of these different analytical cultures is the primary challenge facing the analytics organization, and that develop approaches that deal with this challenge, will take the lead in big data analytics. We already see this as a primary focus area for leading professional services organizations.

In my next analyst perspective on big data I will lay out some pragmatic approaches companies are using to address this big data analytics chasm; these also represent the focus of the benchmark research we’re currently designing to understand organizational best practices in big data analytics.

Regards,

Tony Cosentino

VP and Research Director

At Oracle OpenWorld this week I focused on what the company is doing in business analytics, and in particular on what it is doing with its Exalytics In-Memory Machine. The Exalytics appliance is an impressive in-memory hardware approach to putting right-time analytics in the hands of end users by providing a full range of integrated analytic and visualization capabilities. Exalytics fits into the broader analytics portfolio by providing support for the Oracle BI Foundational Suite, including OBIEE, and Oracle’s formidable portfolio of business analytics and business performance applications, as well as interactive visualizations and discovery capabilities.

Exalytics connects with an Exadata machine over a high-throughput InfiniBand link. The system leverages Oracle’s TimesTen In-Memory Database with columnar compression originally built for online transaction processing in the telecommunications industry. Oracle’s Summary Advisor software provides a heuristic approach to making sure the right data is in memory at the right time, though customers at the conference mentioned that in their initial configuration they did not need to turn on Summary Advisor since they were able to load their entire datasets in memory. Oracle’s Essbase OLAP server is also integrated, but since it requires a MOLAP approach and computationally intensive pre-aggregation, one might question whether applications such as Oracle’s Hyperion planning and forecasting tools will truly provide analysis at the advertised “speed of thought.” To address this latter point, Oracle points to examples in the consumer packaged goods industry, where it has already demonstrated its ability to reduce complex planning scenario cycle times from 24 hours down to four hours. Furthermore, I discussed with Paul Rodwick, Oracle’s vice president of product management for business intelligence, the potential for doing in-line planning, where integrated business planning and complex what-if modeling can be done on the fly. For more on this particular topic, please check out my colleague’s benchmark research on the fast clean close.
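
To give a sense of why columnar, in-memory stores can hold surprisingly large datasets, here is a generic, hypothetical sketch of dictionary encoding, one common columnar-compression technique. It illustrates the general idea only and says nothing about how TimesTen compresses data internally.

```python
# A generic sketch of dictionary encoding, a common columnar-compression
# technique: repeated string values in a column are replaced by small
# integer codes, shrinking the in-memory footprint. Illustrative only;
# this is not Oracle's implementation.
def dictionary_encode(column):
    """Return a value-to-code dictionary and the encoded column."""
    dictionary = {}
    codes = []
    for value in column:
        if value not in dictionary:
            dictionary[value] = len(dictionary)
        codes.append(dictionary[value])
    return dictionary, codes

region_column = ["East", "West", "East", "East", "West", "East"]
dictionary, codes = dictionary_encode(region_column)
print(dictionary)  # {'East': 0, 'West': 1}
print(codes)       # [0, 1, 0, 0, 1, 0] -- far smaller than the raw strings
```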

Another important part of the Exalytics appliance is the Endeca discovery tool. Endeca, which Oracle acquired just a year ago, provides an exploratory interactive approach to root cause analysis and problem-solving without the traditional struggle of complex data modeling. It does this through a search technology that leverages key-value pairings in unstructured data, thereby deriving structure delivered in the form of descriptive statistics such as recency and frequency.  This type of tool democratizes analytics in an organization and puts power into the hands of line-of-business managers. The ability to navigate across data was the top-ranked business capability in our business analytics benchmark research. However, while Endeca is a discovery and analysis tool for unstructured and semi-structured data, it does not provide full sentiment analysis or deeper text analytics, as do tools such as IBM’s SPSS Text Analytics and SAS Social Conversation Center. For more advanced analytics and data mining, Oracle integrates its version of R for the enterprise on its Exadata Database Machine and its Big Data Appliance, Oracle’s integrated Hadoop approach based on the Cloudera stack.
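
To illustrate the general idea – this is a toy sketch, not Endeca’s engine – here is how descriptive structure such as frequency and recency can be derived from loosely structured key-value records. The records themselves are hypothetical.

```python
# A toy sketch (not Endeca itself) of deriving descriptive structure --
# frequency and recency -- from loosely structured key-value records.
from collections import Counter
from datetime import date

# Hypothetical key-value records extracted from unstructured sources.
records = [
    {"term": "battery life", "date": date(2012, 9, 20)},
    {"term": "battery life", "date": date(2012, 9, 28)},
    {"term": "screen glare", "date": date(2012, 9, 5)},
    {"term": "battery life", "date": date(2012, 10, 1)},
]

frequency = Counter(r["term"] for r in records)
recency = {}
for r in records:
    recency[r["term"]] = max(recency.get(r["term"], r["date"]), r["date"])

print(frequency.most_common())  # how often each term appears
print(recency)                  # the most recent mention of each term
```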

At the conference, Oracle announced a few updates for Exalytics. The highlights include a new release of Oracle Business Intelligence 11.1.1.6.2BP1 optimized for Exalytics. The release includes new mobility and visualization capabilities, including trellis visualizations that allow users to see a large number of charts in a single view. These advanced capabilities are enabled on mobile devices through the increased speed provided by Exalytics. In addition, Endeca is now certified with Oracle Exalytics, and the TimesTen database has been certified with GoldenGate and Oracle Data Integrator, allowing the system and users to engage in more event-driven analytics and closed-loop processes. Finally, Hyperion Planning is certified on Exalytics.

On the upside, Oracle’s legacy platform support on Exalytics allows the company to leverage its entire portfolio and offer more than 80 prebuilt analytics applications ready to run on Exalytics without any changes. The platform also supports the full range of the Business Intelligence Foundational Suite and provides a common platform and a common dimensional model. In this way, it provides alignment with overall business processes, content management systems and transactional BI systems. This alignment is especially attractive for companies that have a lot of Oracle software already installed, and for companies looking to operationalize business analytics through event-based closed-loop decision-making at the front lines of the organization. The speed of the machine, with near-real-time queries and the ability to deliver complex visualizations to mobile devices, allows users to create use cases and ROI scenarios they could not before. For example, the San Diego Unified School District was able to use Exalytics to push out performance dashboards across multiple device types to students and teachers, thereby increasing attendance and garnering more money from the state. The Endeca software lets users make qualitative and, to some degree, quantitative assessments from social data and overlay them with structured data. The fact that Exalytics comprises an integrated hardware and software stack makes it a turnkey solution that does not require expensive systems integration services. Each of these points makes Exalytics an interesting and even an exciting investment possibility for companies.

On the downside, the integrated approach and vendor lock-in may discourage some companies concerned about Oracle’s high switching costs. This may pose a risk for Oracle as maturing alternatives become available and the economics of switching begin to make more sense. However, it is no easy task to switch away from Oracle, especially for companies with large Oracle database and enterprise application rollouts. The economics of loyalty are such that when customers are dissatisfied but captive, they remain loyal; however, as soon as a viable alternative comes on the market, they quickly defect. This defection threat for Oracle could come from dissatisfaction with configuration and ease-of-use issues in combination with new offerings from large vendors such as SAP, with its HANA in-memory database appliance, and hardware companies such as EMC that are moving up the stack into the software environment. To be fair, Oracle is addressing the usability issues by moving the user experience away from “one size fits all” to more of a persona- or role-based interface. Oracle will likely have time to do this, given the long tail of its existing database entrenchment and the breadth and depth of its analytics application portfolio.

With Exalytics, Oracle is addressing high market demand for right-time analytics and interactive visualization. I expect this device will continue to sell well, especially since it is plug-and-play and Oracle has such a large portfolio of analytic applications. For companies already running Oracle databases and Oracle applications, Exalytics is very likely a good investment. In particular, it makes sense for those that have not fully leveraged the power of analytics in their business and are therefore losing competitive position. Our benchmark research on analytics shows relatively low penetration for analytics software today, but investments are accelerating due to the large ROI companies are realizing from analytics. Where the payoff is less obvious, or where lock-in is a concern, companies should take a hard look at exactly what they hope to accomplish with analytics and which tools best suit that need. The most successful companies start with the use case to justify the investment, then determine which technology makes the most sense.

In addition to this blog, please see Ventana Research’s broader coverage of Oracle OpenWorld.

Regards,

Tony Cosentino

VP and Research Director
