
Ventana Research has been researching and advocating operational intelligence for the past 10 years, though not always under that name. The discussion was originally driven by the use of events and analytics in business process management and the need for hourly and daily operational business intelligence, but alignment with traditional BI architecture did not allow for a seamless system. A few years later the discussion shifted to business process management and companies’ ability to monitor and analyze BPM on top of their enterprise applications. Business activity monitoring became the vogue term, but it did not denote the action orientation needed to accurately describe this emerging area. By that point Ventana Research had already defined a category of technology and approaches that allow both monitoring and management of operational activities and systems, along with taking action on critical events. Today, Ventana Research defines operational intelligence as a set of event-centered information and analytics processes operating across the organization that enable people to take effective actions and make better decisions.

The challenge in defining a category in today’s enterprise software market is that prolific innovation is driving a fundamental reassessment of category taxonomies. It is nearly impossible to define a mutually exclusive and collectively exhaustive set of categories, and without that, categories and definitions will necessarily overlap. Take big data: when we ask our community for a definition, we get many perspectives on what big data represents.

Operational intelligence overlaps in many ways with big data. In technological terms, both deal with a diversity of data sources and data structures, both need to provide data in a timely manner, and both must deal with the exponential growth of data.

Business users and technologists also see both from different perspectives. Much like the blind men touching the elephant, each group assigns OI a purpose based on where it stands. The technologist looks at operational intelligence from a systems and network management perspective, while business users look at it from a business performance perspective. This is apparent in the data sources used for operational intelligence: IT places more importance on IT systems management (79% vs. 40% for business), while business places more importance on financial data (54% vs. 39% for IT) and customer data (40% vs. 27% for IT). Business is also more likely to use business intelligence tools for operational intelligence (50% vs. 43%), while IT is more likely to use specialized operational intelligence tools (17% vs. 9% for business).

The last and perhaps biggest parallel is that in both cases, the terms are general, but their implementations and business benefits are specific. The top use cases in our study for operational intelligence were managing performance (59%), fraud and security (59%), compliance (58%) and risk management (58%). Overall we see relative parity in the top four, but when we drill down by industry, in areas such as financial services, government, healthcare and manufacturing, we see many differences. We conclude that each industry has unique requirements for operational intelligence, and this is very similar to what we see with big data.

It is not surprising that our definition of operational intelligence is still evolving. As we move from the century of designed data to the century of organic data (terminology coined by Census Director Robert Groves), many of our traditional labels are being redefined. Business intelligence is beginning to overlap with categories such as big data, advanced analytics and operational intelligence. As I discussed in a recent blog post, The Brave New World of Business Intelligence, only a few years ago the business intelligence category was mature and showing merely incremental growth, but it is difficult to call BI mature any longer.

Based on the results of our latest operational intelligence benchmark research, we feel confident that our current definition encompasses the evolving state of the market. As operational intelligence advances, we will continue to help put a frame around it. For now, it acts very much like what might be called “right-time big data.”


Tony Cosentino

VP & Research Director

Over the years Tibco has provided infrastructure for enterprise data integration and has built a substantial installed base. Now the company positions itself as supplying next-generation analytics for big data through service-oriented architecture (SOA). SOA has been around for a while; Ventana Research has been tracking it since 2006 and conducted benchmark research on SOA. But it remains a vaguely understood technology. Our research shows that SOA is not clearly defined in the market and that interpretations vary across the software industry. The basic function of an SOA is to provide common components and a common implementation that enable programmers to plug in and share applications through open application programming interfaces (APIs). In recent years, SOA has morphed into more of a general approach than a fixed set of standards. SOA architectures (though not always called SOA) are at the heart of modern platforms such as Facebook and Amazon Web Services. In SOA Tibco competes with IBM and Oracle, among others.
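The plug-in-and-share idea at the core of SOA can be illustrated with a toy sketch: services publish themselves under well-known names, and consumers invoke them through a common interface without knowing anything about the implementation. The registry, service names and data below are invented for illustration; a real SOA would sit behind network APIs rather than in-process calls.

```python
class ServiceRegistry:
    """A toy stand-in for an SOA service bus or common API layer."""

    def __init__(self):
        self._services = {}

    def register(self, name, handler):
        # Each service exposes itself under a well-known name.
        self._services[name] = handler

    def call(self, name, **params):
        # Consumers depend only on the name and parameters (the contract),
        # not on how the service is implemented.
        return self._services[name](**params)


registry = ServiceRegistry()
# Hypothetical services; prices are in integer cents.
registry.register("pricing", lambda sku, qty: qty * {"A100": 999}.get(sku, 0))
registry.register("inventory", lambda sku: {"A100": 42}.get(sku, 0))

print(registry.call("pricing", sku="A100", qty=3))   # 2997
print(registry.call("inventory", sku="A100"))        # 42
```

Swapping a service's implementation requires no change on the consumer side, which is the property that lets independent applications plug into one another.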

The company’s promotion of SOA is unique; none of its competitors leads with it. Originally, SOA standards were based on the Simple Object Access Protocol (SOAP), but in the past few years standards based on Representational State Transfer (REST) have been gaining adoption. From an architectural and development perspective, however, SOA is still quite viable, and for this reason Tibco’s messaging strategy may succeed. Our research shows that as business use of mobile devices grows, so will adoption of distributed architectures such as SOA and cloud computing.
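The stylistic difference between the two standards is worth a minimal sketch: where SOAP wraps named operations in XML envelopes, REST addresses resources by path and lets the HTTP verb be the operation. The routes, resource names and data here are invented for illustration.

```python
# Toy in-memory "resource store"; a real service would back this with a database.
ORDERS = {1: {"status": "shipped"}}


def rest_dispatch(method, path):
    """Dispatch a request REST-style: the (verb, resource) pair is the contract."""
    if method == "GET" and path.startswith("/orders/"):
        order_id = int(path.rsplit("/", 1)[1])
        if order_id in ORDERS:
            return 200, ORDERS[order_id]
        return 404, {}          # unknown resource
    return 405, {}              # verb not supported for this resource


status, body = rest_dispatch("GET", "/orders/1")
print(status, body)  # 200 {'status': 'shipped'}
```

A SOAP equivalent would instead POST an XML envelope invoking a named `GetOrder` operation; REST's uniform verbs are a large part of why it spread with the web and mobile clients.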

From this SOA orientation, Tibco introduced Tibco Silver for cloud computing in 2009. While Silver was originally compared by some to Amazon’s EC2, it is more of a development environment that needs a public cloud infrastructure to run its applications. In fact, Tibco Silver now uses Amazon Web Services as its infrastructure provider to offer services such as its Spotfire product (which I will discuss later). Beyond Spotfire, Tibco Silver offers private cloud as well as public cloud services, including grid computing resources and platform as a service (PaaS). Tibco differentiates its cloud products in two ways. The first is dynamic provisioning, in which the system automatically provisions the resources necessary to meet demand from a growing customer base. The second is embedding the company’s complex event processing (CEP) software, BusinessEvents.

The BusinessEvents software is a key part of the company’s strategy. CEP, covered in our benchmark research on operational intelligence, is about processing and analyzing multiple data streams in real time. To illustrate, think of what a car’s computing system does at the moment of a collision. Reacting to a number of signals, the computer is able to process the input and command the airbag to deploy within a fraction of a second. This is CEP in a specific, contained environment. Tibco is one of the leaders in the CEP category and competes against companies such as IBM, Microsoft, Oracle and SAP. The company sells CEP to a range of industries. One of the prominent sectors is travel, where it helps airlines and railroads make real-time adjustments to factors such as schedules and personnel in response to changing weather conditions.
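The airbag illustration above can be reduced to a small sketch of the CEP pattern: watch a stream of events and fire an action when a condition holds across several correlated readings, not just one data point. This is a hypothetical example, not Tibco BusinessEvents; the window size, threshold and event values are made up.

```python
from collections import deque


def detect_collision(events, window=3, g_threshold=40.0):
    """Emit an action when average deceleration over a sliding window exceeds a threshold."""
    recent = deque(maxlen=window)   # sliding window over the event stream
    actions = []
    for t, decel_g in events:       # each event: (timestamp, deceleration in g)
        recent.append(decel_g)
        # The "complex" part of CEP: the decision correlates several events,
        # so a single noisy spike does not trigger the action.
        if len(recent) == window and sum(recent) / window > g_threshold:
            actions.append((t, "DEPLOY_AIRBAG"))
            recent.clear()          # act once per detected incident
    return actions


stream = [(0, 1.0), (1, 2.0), (2, 55.0), (3, 60.0), (4, 58.0)]
print(detect_collision(stream))  # [(4, 'DEPLOY_AIRBAG')]
```

Production CEP engines generalize this idea to declarative rules over many heterogeneous streams, but the shape — window, condition, action — is the same.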

The fastest-growing product in the Tibco portfolio is Tibco Spotfire, its visual discovery and analytics tool. Tibco positions Spotfire between what the company sees as standard BI reporting tools, such as IBM Cognos, Oracle OBIEE and SAP BusinessObjects, and statistical “heavy lifting” tools such as SAS, SPSS and the R packages. Spotfire 4.5, released in May, provides robust visualization and iterative analysis capabilities through its associative discovery model and in-memory processing engine. Spotfire is one of a growing class of data discovery tools that employ either an interactive visual approach or a search-based approach; it falls into the former category. A demonstration of the software impressed me with its ease of use, intuitive qualities and graphing of embedded predictive analytic functions.

It’s important to note that other companies are not standing still in this area. Almost all of the major players have products in the visualization space, including IBM Cognos Insight, MicroStrategy Visual Insight, Oracle’s integration of Endeca into the Exalytics platform and SAS Visual Analytics Explorer. Visualization features and functionality may have been a competitive advantage a year ago, but most companies are catching up, and basic visualization is now table stakes in a market where the leading edge is shifting toward collaboration and mobile access.

In short, Tibco’s strategy is to insist that SOA and CEP are essential to the near-real-time responses to changes in business conditions and customer demand that will convey competitive advantage in the future. These capabilities are part of the latest release of Spotfire 4.5, which also supports access to Hadoop data and can consume predictive analytics from providers such as MathWorks and SAS. The market seems to approve: Tibco’s stock price has risen 50 percent this year and retail revenue doubled year over year. Some of the strength in retail likely derives from Tibco providing Amazon’s next-best-offer (NBO) analytics, which it can use to pitch predictive analytics to other major retailers.
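The next-best-offer idea mentioned above can be sketched in its simplest form: recommend the item most often co-purchased with what a shopper already holds. The transaction data and item names below are invented; real NBO systems combine far richer signals (browsing, timing, margin) than raw co-purchase counts.

```python
from collections import Counter

# Hypothetical historical transactions, each a set of purchased items.
TRANSACTIONS = [
    {"camera", "sd_card"},
    {"camera", "tripod"},
    {"camera", "sd_card", "bag"},
    {"laptop", "bag"},
]


def next_best_offer(basket):
    """Return the item most frequently co-purchased with the current basket."""
    counts = Counter()
    for txn in TRANSACTIONS:
        if basket & txn:                  # transaction shares an item with the basket
            counts.update(txn - basket)   # count co-purchased items not already held
    return counts.most_common(1)[0][0] if counts else None


print(next_best_offer({"camera"}))  # sd_card
```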

Tibco is transitioning itself as well as its customers from a 20th century enterprise integration model to a 21st century analytics model. Organizations considering either stand-alone visual discovery tools or tools that integrate CEP into the analytical mix should look to Tibco. The company also offers a one-year trial of its Spotfire software that lets companies test-drive the product for an extended period.


Tony Cosentino

Vice President and Research Director
