It’s widely agreed that cloud computing is a major technology innovation. Many companies use cloud-based systems for specific business functions such as customer service, sales, marketing, finance and human resources. Analytics and business intelligence (BI), however, have not migrated to the cloud as quickly. But now cloud-based data and analytics products are becoming more common. This trend is most popular among technology companies, small and midsize businesses, and departments in larger ones, but there are examples of large companies moving their entire BI environments to the cloud. Our research into big data analytics shows that more than one-fourth of analytics initiatives for companies of all sizes are cloud-based.

Like other cloud-based applications, cloud analytics offers enhanced scalability and flexibility, affordability and IT staff optimization. Our research shows that in general the top benefits are lowered costs (for 40%), improved efficiency (39%) and better communication and knowledge sharing (34%). Using the cloud, organizations can take advantage of a sophisticated IT infrastructure without having to dedicate staff to install and support it. There is no need for comprehensive development and testing because the provider is responsible for maintaining and upgrading the application and the infrastructure. The cloud can also provide flexible infrastructure resources to support “sandbox” testing environments for advanced analytics deployments. Multitenant cloud deployments are more affordable because costs are shared across many companies. When used departmentally, application costs need not be capitalized but instead can be treated as operational expenditures. Capabilities can be put to use quickly, as vendors develop them, and updates need not disrupt use. Finally, some cloud-based interfaces are more intuitive for end users since they have been designed with the user experience in mind. Regarding cloud technology, our business technology innovation research finds that usability is the most important technology evaluation criterion (for 64% of participants), followed by reliability (54%) and capability.

For analytics and BI specifically, there are still issues holding back adoption. Our research finds that the primary reasons companies do not deploy cloud-based applications of any sort are security and compliance issues. For analytics and business intelligence, we can add data-related activities as another reason, since cloud-based approaches often require data integration and transmission of sensitive data across an external network, along with a range of data preparation. Such issues are especially prevalent for companies that have legacy BI tools using data models that have been distributed across their divisions. Often these organizations have defined their business logic and metrics calculations within the context of these tools. Furthermore, these tools may be integrated with other core applications such as forecasting and planning. Re-architecting such data models and metrics calculations is a challenge some companies are reluctant to undertake.

In addition, despite widespread use of some types of cloud-based systems, discussions of business intelligence in the cloud can be confusing for nontechnical businesspeople, especially when they involve information integration, the types of analytics to be performed and where the analytic processes will run. The first generation of cloud applications focused on end-user processes related to the various lines of business and largely ignored the complexities inherent in information integration and analytics. Organizations can no longer ignore these complexities, since doing so exacerbates the challenge of fragmented systems and distributed data. Buyers and architects should understand the benefits of analytics in the cloud and weigh them against the challenges described above.

Our upcoming benchmark research into data and analytics in the cloud will examine the current maturity of this market as well as opportunities and barriers to organizational adoption across lines of business and IT. It will evaluate cloud-based analytics in the context of trends such as big data, mobile technology and social collaboration as well as location intelligence and predictive analytics. It will consider how cloud computing enables these and other applications and identify leading indicators for adoption of cloud-based analytics. It also will examine how cloud deployment enables large-scale and streaming applications; for example, real-time processing of vast amounts of data from sensors and other semistructured sources (often referred to as the Internet of Things).

It is an exciting time to be studying this particular market as companies consider moving platforms to the cloud. I look forward to receiving qualified feedback as we begin this important benchmark research. Please get in touch if you have an interest in this area of our research.

Regards,

Tony Cosentino

VP and Research Director

Our benchmark research consistently shows that business analytics is the most significant technology trend in business today, and acquiring effective predictive analytics is organizations’ top priority for analytics. It enables them to look forward rather than backward and, participating organizations reported, leads to competitive advantage and operational efficiencies.

In our benchmark research on big data analytics, for example, 64 percent of organizations ranked predictive analytics as the most important analytics category for working with big data. Yet a majority indicated that they do not have enough experience in applying predictive analytics to business problems and lack training on the tools themselves.

Predictive analytics improves an organization’s ability to understand potential future outcomes of variables that matter. Its results enable an organization to decide on correct courses of action in key areas of the business. Predictive analytics can enhance the people, process, information and technology components of an organization’s future performance.

In our most recent research on this topic, more than half (58%) of participants indicated that predictive analytics is very important to their organization, but only one in five said they are very satisfied with their use of it. Furthermore, our research found that implementing predictive analytics would have a transformational impact in one-third of organizations and a significant positive impact in more than half of the rest.

In our new research project, The Next Generation of Predictive Analytics, we will revisit predictive analytics with an eye to determining how attitudes toward it have changed, along with its current and planned use and its importance in business. There are significant changes in this area, including where, how, why and when predictive analytics is applied. We expect to find changes not only in forecasting and analyzing customer churn but also in operational use at the front lines of the organization and in improving the analytic process itself. The research will also look at the progress of emerging statistical languages such as R and Python, which I have written about.

As does big data analytics, predictive analytics involves sourcing data, creating models, deploying them and managing them to understand when an analytic model has become stale and ought to be revised or replaced. It should be obvious that only the most technically advanced users will be familiar with all this, so to achieve broad adoption, predictive analytics products must mask the complexity and be easy to use. Our research will determine the extent to which usability and manageability are being built into product offerings.
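
To make that lifecycle concrete, here is a minimal sketch in Python with scikit-learn of one way to flag a stale model. The drift threshold, synthetic data and retraining trigger are illustrative assumptions of mine, not any vendor’s method.

```python
# Minimal sketch of a model-staleness check: flag the model for retraining
# when accuracy on recent data drops well below the accuracy measured at
# deployment. The 10-point tolerance and synthetic data are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Historical training data (e.g., last year's customer records).
X_train = rng.normal(size=(1000, 4))
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)

model = LogisticRegression().fit(X_train, y_train)
baseline = accuracy_score(y_train, model.predict(X_train))

# Recent labeled data; in production this arrives after deployment.
X_recent = rng.normal(loc=0.5, size=(200, 4))                  # drifted inputs
y_recent = (X_recent[:, 0] - X_recent[:, 2] > 0).astype(int)   # drifted relationship

recent = accuracy_score(y_recent, model.predict(X_recent))

if baseline - recent > 0.10:  # assumed tolerance
    print(f"Model looks stale ({baseline:.2f} -> {recent:.2f}); schedule retraining.")
else:
    print(f"Model still healthy ({recent:.2f}).")
```

Masking this kind of bookkeeping behind the product, rather than leaving it to each analyst, is exactly the usability and manageability question the research will probe.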

The promise of predictive analytics, including competitive advantage (68%), new revenue opportunities (55%) and increased profitability (52%), is significant. But to realize these advantages, companies must transform how they work. In terms of people and processes, a more collaborative strategy may be necessary. Analysts need tools and skills in order to use predictive analytics effectively. A new generation of technology is also becoming available that makes predictive analytics easier to apply, use and deploy into line-of-business processes. This will help organizations significantly, since there are not enough data scientists and specially trained predictive analytics professionals available, or affordable, for every organization to hire.

This benchmark research will look closely at the evolving use of predictive analytics to establish how it equips business to make decisions based on likely futures, not just the past.

Regards,

Tony Cosentino

VP & Research Director

Pentaho recently announced Pentaho 5.0, which represents a major advancement for this supplier of business analytics and data integration software as well as for the open source community that it supports and contributes to. In fact, with 250 new features and enhancements in the 5.0 release, it’s important not to lose the forest for the trees. Some of the highlights are a new user interface that caters to specific roles within the organization, tight integration with emerging databases such as MongoDB, and enhanced extensibility. With a $60 million funding round less than a year ago and growing market momentum around big data and analytics, it appears that Pentaho has doubled down at the right time in its efforts to balance the needs of the enterprise with those of the end user.

The 5.0 products have been completely redesigned for discovery analytics, content creation, accessibility and simplified administration. One key area of change is around roles, or what I call personas. I recently discussed the different analytic personas that are emerging in today’s data-driven organization, and Pentaho has done a good job of addressing these at each level of the organization. In particular, the system addresses each part of the analytic value chain, from data integration through to analytic discovery and visualization.

I was surprised by the usability of the visual analytics tool, which offers a host of capabilities that enable easy data exploration and visual discovery. Features such as drag-and-drop conditional formatting, including color coding, are simple, intuitive and powerful. Drop-down charting reveals an impressive list of visualizations that can be changed with a single click. Users will need to know the chart types by name, however, since no thumbnail visuals appear on mouse-over and there is no chart recommendation engine. But overall, the release’s ease-of-use developments are a major improvement in an already usable system that our firm rated Hot in the 2012 Value Index on Business Intelligence, putting Pentaho on par with other best-in-class tools. According to our benchmark research on next-generation business intelligence systems, usability is becoming more important in business intelligence and is the key buying criterion 63 percent of the time.

Advances from an enterprise perspective include features that will help IT manage the large volumes of data being introduced into the environment, through support of big data sources and streamlined automation of data integration. Capabilities such as job restart, rollback and load balancing are all included. Administrators can more easily configure and manage the system, including security levels, licensing and servers. In addition, new REST services APIs simplify the embedding of analytics and reporting into SaaS implementations. This last advancement in embedding matters; as I discussed in a recent piece, making analytics available anywhere is extremely important.
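
For readers unfamiliar with what embedding via REST looks like in practice, here is a minimal, generic sketch in Python. The host, path, parameters and credentials are placeholders of my own, not Pentaho’s documented API, which should be consulted directly.

```python
# Generic sketch of embedding server-rendered analytics content in a SaaS
# app via a REST API. The host, path and credentials are placeholders,
# not Pentaho's documented endpoints.
import requests

BASE_URL = "https://bi.example.com/api"   # hypothetical analytics server
AUTH = ("svc_account", "secret")          # hypothetical service credentials

def fetch_report(report_id: str, fmt: str = "html") -> bytes:
    """Request a rendered report so it can be embedded in our own UI."""
    resp = requests.get(
        f"{BASE_URL}/reports/{report_id}",
        params={"format": fmt},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.content

if __name__ == "__main__":
    html = fetch_report("quarterly-sales")
    print(f"Fetched {len(html)} bytes of embeddable report markup")
```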

No discussion of big data integration and analytics is complete without mention of Pentaho Data Integration (PDI), which I consider the crown jewel of the Pentaho portfolio. The value of PDI derives from its ability to put big data integration and business analytics in the same workflow. Its user-friendly graphical paradigm for data integration helps a range of IT staff and analysts blend data from multiple platforms at the semantic layer rather than at the user level. This enables centralized agreement on data definitions so companies can govern and secure their information environments. The Pentaho approach addresses the two biggest barriers to information management revealed in our benchmark research: data spread across too many systems (67%) and multiple versions of the truth (64%). While other tools on the market facilitate blending at the business-user level, there is an inherent danger in such an approach because each individual can create analysis according to the definition that best suits his or her argument. It is similar to the spreadsheet problem we have now, in which many analysts come together, each with a different understanding of the source data.
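
To illustrate the argument (and only the argument; this is not PDI), consider a minimal Python sketch in which one governed metric definition serves every consumer, which is what blending at the semantic layer buys you:

```python
# Minimal illustration of semantic-layer thinking: one shared metric
# definition that every report reuses, instead of each analyst re-deriving
# "net revenue" with a private spreadsheet formula. Data values are made up.
from dataclasses import dataclass

@dataclass
class Order:
    gross: float
    discounts: float
    returns: float

def net_revenue(orders: list[Order]) -> float:
    """The single, governed definition of net revenue."""
    return sum(o.gross - o.discounts - o.returns for o in orders)

orders = [Order(100.0, 5.0, 0.0), Order(250.0, 0.0, 25.0)]

# Both a finance report and a marketing dashboard call the same function,
# so they cannot disagree on what "net revenue" means.
print("Finance view:  ", net_revenue(orders))
print("Marketing view:", net_revenue(orders))
```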

Pentaho’s depth in data integration is robust, and it supports a rapidly expanding range of big data sources that are in use today or planned for use, including data warehouse appliances (35%), in-memory databases (34%), specialized DBMSs (33%) and Hadoop (32%), as found in our big data benchmark research. Beyond these big data and RDBMS sources, it has also expanded to non-SQL sources. The open source and pluggable nature of the Pentaho architecture allows community-driven evolution beyond traditional JDBC and ODBC drivers and gives an increasingly important leverage point for using its platform. For example, the just-announced MongoDB Connector enables deep integration that includes replica sets, tag sets and read and write preferences, as well as first-of-its-kind reporting on the MongoDB NoSQL database. MongoDB is a document database, a newer class of database that allows a more flexible, object-oriented approach to accessing new sources of information. The emergence of MongoDB mirrors that of newer, more flexible data formats such as JavaScript Object Notation (JSON). While reporting is still basic, I expect the initial integration with MongoDB to be just a first step for the Pentaho community in optimizing information around this big data store. Additionally, Pentaho announced new integration with Splunk, Amazon Redshift and Cloudera Impala, as well as certifications including MongoDB, Cassandra, Cloudera, Intel, Hortonworks and MapR.
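
For a flavor of what replica sets, tag sets and read preferences mean at the driver level, here is a generic pymongo sketch. It is not the Pentaho connector, and the hostnames, tags, database and fields are invented:

```python
# Generic pymongo sketch of the replica-set, tag-set and read-preference
# concepts mentioned above; hostnames, tag names and document fields are
# made-up examples, and this is not Pentaho's connector.
from pymongo import MongoClient
from pymongo.read_preferences import SecondaryPreferred

# Connect to a replica set; reads may be routed to a tagged secondary
# (e.g., an analytics node) so reporting doesn't load the primary.
client = MongoClient(
    "mongodb://node1.example.com,node2.example.com/?replicaSet=rs0"
)
db = client.get_database(
    "sales",
    read_preference=SecondaryPreferred(tag_sets=[{"use": "reporting"}]),
)

# A simple report over JSON documents: revenue by region.
pipeline = [{"$group": {"_id": "$region", "revenue": {"$sum": "$amount"}}}]
for row in db.orders.aggregate(pipeline):
    print(row["_id"], row["revenue"])
```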

Currently the analytics and BI market is bifurcated, with the so-called stack vendors occupying entrenched positions in many organizations and visual discovery vendors selling to business users through a viral, bottom-up strategy. Both sides are moving to the middle in their development efforts, addressing the data integration that is already built into the Pentaho approach. The challenge for the traditional enterprise BI vendors is to build flexible, user-friendly visual platforms, while for the newcomers it is applying structure and governance to their visually oriented information environments. Arguably, Pentaho is building its platform from the middle out. The company has done a good job of balancing usability with the governance and security models needed for a holistic approach that both IT and end users can support. Organizations that are looking for a unified data integration and business analytics approach for business and IT, including advanced analytics and embedded approaches to information-driven applications, should consider Pentaho.

Regards,

Tony Cosentino

VP and Research Director

This year’s Inspire, Alteryx’s annual user conference, featured new developments around the company’s analytics platform. Alteryx CEO Dean Stoecker kicked off the event by talking about the promise of big data, the dissemination of analytics throughout the organization, and the data artisan as the “new boss.” Alteryx coined the term “data artisan” to represent the persona at the center of the company’s development and marketing efforts. My colleague Mark Smith wrote about the rise of the data artisan in his analysis of last year’s event.

President and COO George Mathew keynoted day two, getting into more specifics on the upcoming 8.5 product release. Advancements revolve around improvements in the analytical design environment, embedded search capabilities, the addition of interactive mapping and direct model output into Tableau. The goal is to provide an easier, more intuitive user experience. Our benchmark research into next-generation business intelligence shows buyers consider usability the top buying criterion, at 63 percent. The redesigned Alteryx interface boasts a new look for the icons and more standardization across different functional environments. Color coding groups the toolbox according to function, such as data preparation, analytics and reporting. A new favorites function is another good addition, given that users tend to rely on the same tools depending on their role within the analytics value chain. Users can now look at workflows horizontally and not just vertically, and easily change the orientation if, for example, they are working on an Apple iPad. Version 8.5 allows embedded search and more streamlined navigation, and continues its focus on a role-based application, which my colleague has been advocating for a while. According to the company, 94 percent of its user base demanded interactive mapping; that is now part of the product, letting users draw a polygon around an area of interest and then integrate it into the analytical application for runtime execution.
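
As a rough illustration of the spatial selection that interactive mapping enables, here is a generic Python sketch using the shapely library; it is not Alteryx’s implementation, and the coordinates and records are invented:

```python
# Generic sketch of the spatial selection behind interactive mapping:
# keep only records that fall inside a user-drawn polygon. Illustration
# only; coordinates and store data are made up.
from shapely.geometry import Point, Polygon

# Polygon a user might draw around a trade area (lon, lat pairs).
trade_area = Polygon([(-117.2, 33.0), (-117.0, 33.0),
                      (-117.0, 33.2), (-117.2, 33.2)])

stores = [
    {"name": "Store A", "lon": -117.1, "lat": 33.1},   # inside
    {"name": "Store B", "lon": -116.8, "lat": 33.05},  # outside
]

selected = [s for s in stores
            if trade_area.contains(Point(s["lon"], s["lat"]))]
for s in selected:
    print("In trade area:", s["name"])
```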

The highlight of the talk was the announcement of integration with Tableau 8.0 and the ability to write directly to the software without having to follow the cumbersome process of exporting a file and then reopening it in another application. Alteryx was an alpha partner and worked directly with the code base for Tableau 8.0, which I wrote up a few months ago. The partnership exemplifies the coopetition environment many companies find themselves in today. While Tableau does some basic prediction and Alteryx does some basic visual reporting, bringing the companies’ core competencies together into one workflow is much more powerful for the user. Another interesting aspect is the juxtaposition of the two user groups: the visually oriented Tableau group in San Diego seemed much younger and was certainly much louder at the reveals, while the analytically oriented Alteryx group was much more subdued.

Alteryx has been around since 1997, when it was called SRC. It grew up focused on location analytics, which allowed it to establish foundational analytic use cases in vertical areas such as real estate and retail. After changing the company name and focusing more on horizontal analytics, Alteryx is growing fast with backing from, interestingly enough, SAP Ventures. Since the company was already profitable, it used a modest infusion of capital to grow its product marketing and sales functions. The move seems to have paid off. Companies such as Dunkin’ Brands and Redbox use Alteryx, and the company has made significant inroads with marketing services companies. A number of consulting companies, such as Absolute Data and Capgemini, are using Alteryx for customer and marketing analytics and other use cases. I had an interesting talk with the CEO of a small but important services firm who said that he is being asked to introduce innovative analytical approaches to much larger marketing services and market research firms. He told me that Alteryx is a key part of the solution he’ll be introducing to enable things such as big data analytics.

Alteryx provides value in a few innovative ways that are not new to this release but are foundational to the company’s business strategy. First, it marries data integration with analytics, which allows business users who have traditionally worked in a flat-file environment to pull from multiple data sources and integrate information within the context of the Alteryx application. Within that same environment, users can build analytic workflows and publish applications to a private or public cloud. This approach helps address the two biggest obstacles to big data analytics found in our research, staffing (79%) and training (77%), by giving business users more flexibility to engage in the analytic process.
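
A minimal pandas sketch suggests what such multisource blending replaces when analysts move beyond a single flat file; this is a generic illustration, not Alteryx itself, and the sources and columns are invented:

```python
# Minimal pandas sketch of blending two sources instead of working from one
# flat file. Generic illustration; data and columns are made up.
import pandas as pd

# Source 1: transactions exported from an operational system.
sales = pd.DataFrame({
    "store_id": [1, 1, 2],
    "amount": [120.0, 80.0, 200.0],
})

# Source 2: third-party demographics keyed by store.
demographics = pd.DataFrame({
    "store_id": [1, 2],
    "median_income": [61000, 83000],
})

# Blend on the shared key, then analyze the combined view.
blended = sales.merge(demographics, on="store_id", how="left")
summary = blended.groupby("median_income")["amount"].sum()
print(summary)
```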

Alteryx manages an analytics application store called the Analytics Gallery that crowdsources and shares user-created models. These analytical assets can be used internally within an organization or sold on the broader market. Proprietary algorithms can be secured through a black box approach, or made open to allow other users to tweak the analytic code. It’s similar to what companies like Datameer are doing on top of Hadoop, or Informatica in the cloud integration market. The store gives descriptions of what the applications do, such as fuzzy matching or target marketing. Because they are crowdsourced, the applications should proliferate over time, tracking advancements in the R open source project, since R is at the heart of the Alteryx analytic strategy and what it calls clear box analytics. The underlying algorithm is easily viewed and edited based on permissions established by the data artisan, similar to what we’ve seen with companies such as 1010data. Alteryx 8.5 works with R 3.0, the latest version. On the back end, Alteryx partners with enterprise data warehouse powerhouses such as Teradata, and works with the Hortonworks Hadoop distribution.
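
As a taste of what a gallery application such as fuzzy matching does under the hood, here is a minimal, generic sketch using Python’s standard difflib; the names and the 0.6 threshold are illustrative assumptions, not Alteryx’s algorithm:

```python
# Minimal, generic sketch of fuzzy matching, one of the app categories the
# Analytics Gallery describes. Uses Python's standard difflib; names and
# the 0.6 threshold are illustrative, not Alteryx's algorithm.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Ratio in [0, 1]; 1.0 means identical strings."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

crm_names = ["Acme Corporation", "Globex LLC"]
billing_names = ["ACME Corp.", "Globex, L.L.C.", "Initech"]

# Pair each CRM record with its best billing match above a threshold.
for name in crm_names:
    best = max(billing_names, key=lambda cand: similarity(name, cand))
    score = similarity(name, best)
    if score >= 0.6:
        print(f"{name!r} ~ {best!r} (score {score:.2f})")
```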

I encourage analysts of all stripes to take a look at the Alteryx portfolio. Perhaps start with the Analytics Gallery to get a flavor of what the company does and the type of analytics customers are building and using today. Alteryx can benefit analysts looking to move beyond the limitations of a flat-file analytics environment, and especially marketing analysts who want to marry third-party data from sources such as the US Census Bureau, Experian, TomTom or Salesforce, which Alteryx offers within its product. If you have not seen Alteryx, it is worth seeing how the company is changing the way analytic processes are designed and managed.

Regards,

Tony Cosentino

VP and Research Director
