Splunk’s annual gathering, this year called .conf 2015, hosted almost 4,000 Splunk customers, partners and employees in late September. It is one of the fastest-growing user conferences in the technology industry. The area dedicated to Splunk partners has grown from a handful of booths a few years ago to a vast showroom floor many times larger. While the conference’s main announcement was the release of Splunk Enterprise 6.3, its flagship platform, the progress the company is making in the related areas of machine learning and the Internet of Things (IoT) most caught my attention.

Splunk’s strength is its ability to index, normalize, correlate and query data throughout the technology stack, including applications, servers, networks and sensors. It uses distributed search that enables correlation and analysis of events across local- and wide-area networks without moving vast amounts of data. Its architectural approach unifies cloud and on-premises implementations and provides extensibility for developers building applications. Originally, Splunk provided an innovative way to troubleshoot complex technology issues, but over time new uses for Splunk-based data have emerged, including digital marketing analytics, cyber security, fraud prevention and connecting digital devices in the emerging Internet of Things. Ventana Research has covered Splunk since its establishment in the market, most recently in this analysis of mine.
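
To make that query model concrete, here is a minimal sketch of running a correlated search from Python using the open source splunk-sdk library. The host, credentials, index and field names are placeholder assumptions, not details from the conference.

```python
# Minimal sketch: querying Splunk from Python with the splunk-sdk library.
# Host, credentials and the index/sourcetype names are placeholder assumptions.
import splunklib.client as client
import splunklib.results as results

service = client.connect(
    host="localhost", port=8089,           # Splunk management port
    username="admin", password="changeme"  # placeholder credentials
)

# Correlate web errors by host in one search, letting the indexer do the
# heavy lifting instead of moving raw data across the network.
query = ('search index=web sourcetype=access_combined status>=500 '
         '| stats count by host')

for result in results.ResultsReader(service.jobs.oneshot(query)):
    if isinstance(result, dict):  # skip informational messages
        print(result["host"], result["count"])
```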

Splunk’s experience in dealing directly with distributed, time-series data and processes on a large scale puts it in position to address the Internet of Things from an industrial perspective. This sort of data is at the heart of large-scale industrial control systems, but it arrives in many different formats and over many different protocols. For instance, sensor technology and control systems invented 10 to 20 years ago use very different technology than modern systems. Furthermore, as with computer technology, there are multiple layers in stack models that have to communicate. Splunk’s tools help engineers and systems analysts cross-reference these disparate systems in the same way that it queries computer system and network data, even though the systems can be vastly different. To address this challenge, Splunk turns to its partners and its extensible platform. For example, Kepware has developed plug-ins that use its more than 150 communication drivers so users can stream real-time industrial sensor and machine data directly into the Splunk platform. Currently, the primary value drivers for organizations in the industrial IoT are operational efficiency, predictive maintenance and asset management. At the conference, Splunk showcased projects in these areas, including one with Target that uses Splunk to improve operations in robotics and manufacturing.
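
Kepware’s drivers handle this plumbing natively, but for a sense of how sensor data lands in Splunk, a common ingestion path is the HTTP Event Collector (HEC), introduced with Splunk 6.3. The sketch below posts one reading to it; the host, token and field names are invented for illustration.

```python
# Sketch: pushing an industrial sensor reading into Splunk via the
# HTTP Event Collector (HEC). Host, token and field names are assumptions.
import json
import time
import requests

HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"  # placeholder token

reading = {
    "sensor_id": "pump-14",
    "temperature_c": 71.3,
    "vibration_mm_s": 4.8,
}

resp = requests.post(
    HEC_URL,
    headers={"Authorization": f"Splunk {HEC_TOKEN}"},
    data=json.dumps({
        "time": time.time(),           # epoch timestamp for the event
        "sourcetype": "plc:telemetry",
        "event": reading,
    }),
    verify=False,  # self-signed certificates are common on lab instances
)
resp.raise_for_status()
```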

For its part, Splunk is taking a multipronged approach by acquiring companies, investing in internal development and enabling its partner ecosystem to build new products. One key enabler of its approach to IoT is machine learning algorithms built on the Splunk platform. In machine learning a model can use new data to continuously learn and adapt its answers to queries. This differs from conventional predictive analytics, in which users build models and validate them based on a particular sample; the model does not adapt over time. With machine learning, for instance, if a piece of equipment or an automobile shows a certain optimal pattern of operation over time, an algorithm can identify that pattern and build a model for how that system should behave. When the equipment begins to act in a less optimal or anomalous way, the system can alert a human operator that there may be a problem, or in a machine-to-machine situation, it can invoke a process to solve the problem or recalibrate the machine.
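
Splunk has not published the algorithms described here, so the following is only an illustrative Python sketch of the baseline idea: learn a normal operating pattern from history and alert when a new reading deviates from it. The sensor values and threshold are invented.

```python
# Illustrative sketch of baseline anomaly detection (not Splunk's actual
# algorithm): learn the normal operating pattern from history, then flag
# readings that deviate beyond a z-score threshold.
import statistics

def build_baseline(history):
    """Model 'normal' as the mean and spread of past readings."""
    return statistics.mean(history), statistics.stdev(history)

def is_anomalous(value, baseline, threshold=3.0):
    mean, stdev = baseline
    return abs(value - mean) > threshold * stdev

history = [70.8, 71.1, 70.9, 71.4, 71.0, 70.7, 71.2]  # past temperatures
baseline = build_baseline(history)

for reading in [71.3, 70.9, 84.6]:
    if is_anomalous(reading, baseline):
        print(f"ALERT: {reading} is outside the normal operating range")
        # in a machine-to-machine setup this could trigger recalibration
```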

Machine learning algorithms allow event processes to be audited, analyzed and acted upon in real time. They enable predictive capabilities for maintenance, transportation and logistics, and asset management and can also be applied in more people-oriented domains such as fraud prevention, security, business process improvement and digital products. IoT potentially can have a major impact on business processes, but only if organizations can realign systems toward discover-and-adapt rather than model-and-apply approaches. For instance, processes are often carried out in an uneven fashion different from the way the model was conceived and communicated through complex process documentation and systems. As more process flows are directly instrumented and more processes are carried out by machines, the ability to model directly based on the discovery of those event flows and to adapt to them (through human learning or machine learning) becomes key to improving organizational processes. Such realignment of business processes, however, often involves broad organizational transformation. Our benchmark research on operational intelligence shows that challenges associated with people and processes, rather than information and technology, most often hold back organizational improvement.

Two product announcements made at the conference illuminate the direction Splunk is taking with IoT and machine learning. The first is User Behavior Analytics (UBA), based on its acquisition of Caspida, which produces advanced algorithms that can detect anomalous behavior within a network. Such algorithms can model internal user behavior, and when behavior deviates from the established norm, they can generate an alert that can be addressed through investigative processes using Splunk Enterprise Security 4.0. Together, Splunk Enterprise Security 4.0 and UBA won the 2015 Ventana Research CIO Innovation Award. The acquisition of Caspida shows that Splunk is not afraid to acquire companies in niche areas where it can exploit its platform to deliver organizational value. I expect that we will see more such acquisitions of companies with high-value machine learning algorithms as Splunk carves out specific positions in emergent markets.
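
Caspida’s models are proprietary, so purely as a toy illustration of behavioral baselining, the sketch below profiles the hours at which each user typically logs in and flags logins far outside that profile. The users, hours and slack parameter are invented.

```python
# Toy illustration of a user-behavior baseline (not Caspida's algorithm):
# profile the hours at which each user normally logs in and flag logins
# that fall outside that profile.
from collections import defaultdict

login_history = [  # (user, hour-of-day) pairs from past events
    ("alice", 9), ("alice", 10), ("alice", 9), ("alice", 11),
    ("bob", 14), ("bob", 15), ("bob", 13),
]

profiles = defaultdict(set)
for user, hour in login_history:
    profiles[user].add(hour)

def flag_login(user, hour, slack=1):
    """Alert if the hour is not within `slack` hours of any seen hour."""
    if not any(abs(hour - h) <= slack for h in profiles[user]):
        print(f"ALERT: unusual login for {user} at {hour}:00")

flag_login("alice", 10)  # within her normal window, no alert
flag_login("alice", 3)   # ALERT: 3 a.m. is far from her baseline
```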

The other product announced is IT Service Intelligence (ITSI), which highlights machine learning algorithms alongside Splunk’s core capabilities. ITSI is an application in which end users deploy machine learning to see patterns in various IT service scenarios. It can inform and enable multiple business uses such as predictive maintenance, churn analysis, service-level agreements and chargebacks. Similar to UBA, it uses anomaly detection to point out issues and enables managers to view highly distributed processes such as claims processing in insurance companies. At this point, however, use of ITSI (like other areas of IoT) may encounter cultural and political issues as organizations deal with changes in the roles of IT and operations management. Splunk’s direction with ITSI shows that the company is staying close to its IT operations knitting as it builds out application software, but such development also puts Splunk into new competitive scenarios where legacy technology and processes may still be considered good enough.

We note that ITSI is built using Splunk’s Machine Learning Toolkit and showcase, which currently is in preview mode. The platform is an important development for the company and fills one of the gaps that I pointed out in its portfolio last year. Addressing this gap enables Splunk and its partners to create services that apply advanced analytics to big data, which almost half (45%) of organizations find important. I consider the use of predictive and advanced analytics on big data a killer application for big data; our benchmark research on big data analytics backs this claim: Predictive analytics is the type of analytics most (64%) organizations wish to pursue on big data.

Organizations currently looking at IoT use cases should consider Splunk’s strategy and tools in the context of the specific problems they need to address. Machine learning algorithms built for particular industries are key, so it is important to understand whether the problem can be addressed using prebuilt applications provided by Splunk or one of its partners, or whether the organization will need to build its own algorithms using the Splunk machine learning platform or alternatives. Evaluate both the platform capabilities and the instrumentation, the types of protocols and formats involved and how that data will be brought into the system and related in a uniform manner. Most of all, be sure the skills and processes in the organization align with the technology from an end-user and business perspective.

Regards,

Ventana Research

It’s widely agreed that cloud computing is a major technology innovation. Many companies use cloud-based systems for specific business functions such as customer service, sales, marketing, finance and human resources. More generally, however, analytics and business intelligence (BI) have not migrated to the cloud as quickly. But now cloud-based data and analytics products are becoming more common. This trend is most popular among technology companies, small and midsize businesses, and departments in larger ones, but there are examples of large companies moving their entire BI environments to the cloud. Our research into big data analytics shows that more than one-fourth of analytics initiatives for companies of all sizes are cloud-based.

Like other cloud-based applications, cloud analytics offers enhanced scalability and flexibility, affordability and IT staff optimization. Our research shows that in general the top benefits are lowered costs (for 40% of organizations), improved efficiency (39%) and better communication and knowledge sharing (34%). Using the cloud, organizations can use a sophisticated IT infrastructure without having to dedicate staff to install and support it. There is no need for comprehensive development and testing because the provider is responsible for maintaining and upgrading the application and the infrastructure. The cloud can also provide flexible infrastructure resources to support “sandbox” testing environments for advanced analytics deployments. Multitenant cloud deployments are more affordable because costs are shared across many companies. When used departmentally, application costs need not be capitalized but instead can be treated as operational expenditures. Capabilities can be put to use quickly, as vendors develop them, and updates need not disrupt use. Finally, some cloud-based interfaces are more intuitive for end users since they have been designed with the user experience in mind. Regarding cloud technology, our business technology innovation research finds that usability is the most important technology evaluation criterion (for 64% of participants), followed by reliability (54%) and capability.

For analytics and BI specifically, there are still issues holding back adoption. Our research finds that the primary reasons companies do not deploy cloud-based applications of any sort are security and compliance issues. For analytics and business intelligence, we can add data-related activities as another reason, since cloud-based approaches often require data integration and transmission of sensitive data across an external network, along with a range of data preparation. Such issues are especially prevalent for companies that have legacy BI tools using data models that have been distributed across their divisions. Often these organizations have defined their business logic and metrics calculations within the context of these tools. Furthermore, these tools may be integrated with other core applications such as forecasting and planning. To re-architect such data models and metrics calculations is a challenge some companies are reluctant to undertake.

In addition, despite widespread use of some types of cloud-based systems, for nontechnical business people discussions of business intelligence in the cloud can be confusing, especially when they involve information integration, the types of analytics to be performed and where the analytic processes will run. The first generation of cloud applications focused on end-user processes related to the various lines of business and largely ignored the complexities inherent in information integration and analytics. Organizations can no longer ignore these complexities, since doing so exacerbates the challenge of fragmented systems and distributed data. Buyers and architects should understand the benefits of analytics in the cloud and weigh these benefits against the challenges described above.

Our upcoming benchmark research into data and analytics in the cloud will examine the current maturity of this market as well as opportunities and barriers to organizational adoption across lines of business and IT. It will evaluate cloud-based analytics in the context of trends such as big data, mobile technology and social collaboration as well as location intelligence and predictive analytics. It will consider how cloud computing enables these and other applications and identify leading indicators for adoption of cloud-based analytics. It also will examine how cloud deployment enables large-scale and streaming applications. For example, it will examine real-time processing of vast amounts of data from sensors and other semistructured data (often referred to as the Internet of Things).

It is an exciting time to be studying this particular market as companies consider moving platforms to the cloud. I look forward to receiving any qualified feedback as we move forward to start this important benchmark research. Please get in touch if you have an interest in this area of our research.

Regards,

Tony Cosentino

VP and Research Director

At its annual industry analyst summit last month and in a more recent announcement of enterprise support for parallelizing the R language on its Aster Discovery Platform, Teradata showed that it is adapting to changes in database and analytics technologies. The presentations at the conference revealed a unified approach to data architectures and value propositions in a variety of uses including the Internet of Things, digital marketing and ETL offloading. In particular, the company provided updates on the state of its business as well as how the latest version of its database platform, Teradata 15.0, is addressing customers’ needs for big data. My colleague Mark Smith covered these announcements in depth. The introduction of scalable R support was discussed at the conference but not announced publicly until late last month.

Teradata now has a beta release of parallelized support for R, an open source programming language used significantly in universities and growing rapidly in enterprise use. One challenge is that R relies on a single-threaded, in-memory approach to analytics. Parallelization allows R algorithms to run on much larger data sets, since they are not limited to data stored in memory. For a broader discussion of the pros and cons of R and its evolution, see my analysis. Our benchmark research shows that organizations are counting on companies such as Teradata to provide a layer of abstraction that can simplify analytics on big data architectures. More than half (54%) of advanced analytics implementations are custom built, but in the future this percentage will drop to about one in three (36%).

Teradata’s R project has three parts. The first includes a Teradata Aster R library, which supplies more than 100 prebuilt R functions that hide the complexity of the in-database implementation. The algorithms cover the most common big data analytic approaches in use today, which according to our big data analytics benchmark research are classification (used by 39% of organizations), clustering (37%), regression (35%), time series (32%) and affinity analysis (29%). Some use innovative approaches available in Aster, such as Teradata’s patented nPath algorithm, which is useful in areas such as digital marketing. All of these functions will receive enterprise support from Teradata, likely through its professional services team.

The second part of the project involves the R parallel constructor. This component gives analysts and data scientists tools to build their own parallel algorithms based on the entire library of open source R algorithms. The framework follows the “split, apply and combine” paradigm, which is popular among the R community. While Teradata won’t support the algorithms themselves, this tool set is a key innovation that I have not yet seen from others in the market.
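
Teradata has not published the constructor’s API in this piece, but the split-apply-combine paradigm itself can be sketched in a few lines. Here Python’s multiprocessing stands in for the distributed engine; the data and grouping keys are invented.

```python
# Sketch of the split-apply-combine paradigm, with Python's multiprocessing
# standing in for a distributed engine: split the data into groups, apply
# a model to each group in parallel, combine the results.
from multiprocessing import Pool
from statistics import mean

readings = [("pump-1", 70.9), ("pump-2", 65.2), ("pump-1", 71.3),
            ("pump-2", 64.8), ("pump-1", 71.1)]

def split(rows):
    """Split: partition rows by key."""
    groups = {}
    for key, value in rows:
        groups.setdefault(key, []).append(value)
    return list(groups.items())

def apply_fn(group):
    """Apply: run once per group, in parallel; any per-group model fits here."""
    key, values = group
    return key, mean(values)

if __name__ == "__main__":
    with Pool() as pool:
        # Combine: gather per-group results into one answer.
        combined = dict(pool.map(apply_fn, split(readings)))
    print(combined)  # {'pump-1': 71.1, 'pump-2': 65.0}
```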

Finally, the R engine has been integrated with Teradata’s SNAP integration framework. The framework provides unified access to multiple workload specific engines such as relational (SQL), graph (SQL-GR), MapReduce (SQL-MR) and statistics. This is critical since the ultimate value of analytics rests in the information itself. By tying together multiple systems, Teradata enables a variety of analytic approaches. More importantly, the data sources that can be merged into the analysis can deliver competitive advantages. For example, JSON integration, recently announced, delivers information from a plethora of connected devices and detailed Web data.

Teradata is participating in industry discussions about both data management and analytics. As Mark Smith discussed, its unified approach to data architecture addresses challenges brought on by competing big data platforms such as Hadoop and other NoSQL approaches, including the recently announced partnership with MongoDB supporting JSON integration. These platforms access new information sources and help companies use analytics to indirectly increase revenues, reduce costs and improve operational efficiency. Analytics applied to big data serve a variety of uses, most often cross-selling and up-selling (for 38% of organizations), better understanding of individual customers (32%), optimizing price (30%) and IT operations (24%). Teradata is active in these areas and works in multiple industries including financial services, retail, healthcare, communications, government, energy and utilities.

Current Teradata customers should evaluate the company’s broader analytics and platform portfolio, not just the database appliances. In the fragmented and diverse big data market, Teradata is sorting through the chaos to provide a roadmap for the largest of organizations down to midsize ones. The Aster Discovery Platform can put power into the hands of analysts and statisticians who need not be data scientists. Business users from various departments, but especially high-level marketing groups that need to integrate multiple data sources for operational use, should take a close look at the Teradata Aster approach.

Regards,

Tony Cosentino

VP & Research Director

Information Builders announced two major new products at its recent annual user summit. The first was InfoDiscovery, a tool for ad hoc data analysis and visual discovery. The second was iWay Sentinel, which allows administrators to manage applications in a proactive and dynamic manner. As a privately held company, Information Builders is not a household name, but it is a major provider of highly scalable business intelligence (BI) and information management software to companies around the world.

This year’s announcements come one year after the release of WebFOCUS 8.0, which I wrote about at the time. Version 8.0 of this flagship BI product includes a significant overhaul of the underlying code base, and its biggest change is how it renders graphics, putting the parameters of the HTML5 graph code directly inside the browser. This approach allows consistent representation of business intelligence graphics across device environments, including mobile ones. Our research into information optimization shows that mobile technology improves business performance significantly in one out of three organizations. The graphics capability helped Information Builders earn the rating of Hot Vendor in our latest Value Index on Mobile Business Intelligence. Combining analytics with transactional systems in a mobile environment is an increasingly important trend. Our research shows that mobile business intelligence is advancing quickly: Nearly three-quarters (71%) of participants said they expect their mobile workforce to have BI capabilities in the next 12 months.

WebFOCUS InfoDiscovery represents the company’s new offering in the self-service analytics market. For visual discovery it enables users to extract, blend and prepare data from various data sources such as spreadsheets, company databases and third-party sources. Once the analytic data set is created, users can drill down into the information in an underlying columnar database. They can define queries as they go and examine trends, correlations and anomalies in the data set. Users given permission can publish visualizations from their desktops to the server for others to view or build upon. Visualization is another area of increasing importance for organizations. Our research on big data analytics finds that data visualization has a number of benefits; the most often cited are faster analytics (by 49%), understanding content (48%), root-cause analysis (40%) and displaying multiple result sets at the same time (40%).
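
InfoDiscovery performs these steps through its own interface; as a rough code analogy, the extract-blend-prepare flow resembles joining a spreadsheet export to a database table and deriving a measure. The file, table and column names below are invented for the example.

```python
# Rough analogy in code for the extract-blend-prepare step: join a
# spreadsheet export with a company database table, then aggregate.
# File names, table and columns are invented for the example.
import sqlite3
import pandas as pd

targets = pd.read_excel("regional_targets.xlsx")  # spreadsheet source
conn = sqlite3.connect("company.db")
sales = pd.read_sql("SELECT region, amount FROM sales", conn)  # database source

# Blend: aggregate sales per region and join with the targets sheet.
blended = (sales.groupby("region", as_index=False)["amount"].sum()
                .merge(targets, on="region"))
blended["pct_of_target"] = 100 * blended["amount"] / blended["target"]
print(blended.sort_values("pct_of_target"))
```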

InfoDiscovery is Information Builders’ contender in the new breed of visual discovery products. The first generation of visual discovery products drew attention for their visual capabilities, ease of use and agility. More recently, established business intelligence vendors, of which Information Builders is one, have focused on developing visual discovery tools on the platforms of their well-known BI products, with the aim of taking advantage of their maturity. Currently this second wave of tools is still behind the first in terms of ease of use and visual analysis but is advancing rapidly, and it can provide better data governance, version control, auditing and user security. For instance, InfoDiscovery uses the same metadata as the enterprise platform WebFOCUS 8, so objects from both InfoDiscovery and other WebFOCUS applications can be configured in the same user portal. When a business user selects a filter, the data updates across all the components in the dashboard. The HTML5 rendering engine, new in WebFOCUS 8.0, makes the dashboard available to various devices including tablets and smartphones.

The other major announcement at the conference, iWay Sentinel, is a real-time application monitoring tool that helps administrators manage resources across distributed systems. It works with iWay Service Manager, which is used to manage application workflows. iWay Sentinel allows multiple instances of Service Manager to be viewed and managed from a single Web interface, and administrators can address bottlenecks in system resources both manually and automatically. The tool belongs in the category we call operational intelligence, and as our research finds, activity and event monitoring is the most important use (for 62% of research participants), followed by alerting and notification.
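
The post does not describe Sentinel’s internals, but the general shape of this kind of resource monitoring is a polling loop that alerts or takes action when a threshold is crossed. The sketch below uses the psutil library with invented thresholds.

```python
# General shape of resource monitoring with alerting (not iWay Sentinel's
# implementation): poll system metrics and act when a threshold is crossed.
# Thresholds are invented; psutil is a common metrics library.
import time
import psutil

CPU_LIMIT = 90.0  # percent
MEM_LIMIT = 85.0  # percent

def check_once():
    cpu = psutil.cpu_percent(interval=1)
    mem = psutil.virtual_memory().percent
    if cpu > CPU_LIMIT or mem > MEM_LIMIT:
        # manual path: notify an administrator
        print(f"ALERT: cpu={cpu}% mem={mem}%")
        # automatic path: shed load, restart a worker, scale out, etc.

while True:
    check_once()
    time.sleep(30)  # poll every 30 seconds
```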

Sentinel is an important product in the Information Builders portfolio for a couple of reasons. Tactically speaking, it enables large organizations that are running multiple implementations of iWay Service Manager to manage infrastructure resources in a flexible and streamlined manner. From a strategic perspective, it ties the company to the emerging Internet of Things (IoT), which connects devices and real-time application workflows across a distributed environment. In such an environment, rules and process flows must be monitored and coordinated in real time. Information is passed along an enterprise service bus that enables synchronous interaction of various application components. IoT is used in multiple areas such as remote management of devices, telematics and fleet management, predictive maintenance, supply chain optimization and utilities monitoring. The challenge is that application software is often complex and its processes are interdependent. For this reason, most approaches to the IoT have been proprietary in nature. Even so, Information Builders has a large number of clients in various industries, especially retail, that may be interested in its approach.

Information Builders continues to innovate in the changing IT industry and business demand for analytics and data, building on its integration capabilities and its core business intelligence assets. The breadth and depth of its software portfolio enable the company to capitalize on these assets as demand shifts. For instance, temporal analysis is becoming more important; Information Builders has built that capability into its products for years. In addition, the company’s core software is hardened by years of meeting high-concurrency needs. Companies that have thousands of users need this type of scalable, battle-tested system.

Both iWay Sentinel and InfoDiscovery are in limited release currently and will be generally available later this year. Users of other Information Builders software should examine InfoDiscovery and assess its fit in their organizations. For business users it offers a self-service approach on the same platform as the WebFOCUS enterprise product. IT staff can uphold their governance and system management responsibilities through visibility and flexible control of the platform. For its part iWay Sentinel should interest companies that have to manage multiple instances of information applications and use iWay Service Manager. In particular, retailers, transportation companies and healthcare companies exploring IoT uses should consider how it can help.

Information Builders is exploiting the value of data through what it calls information optimization, and it is finding continued growth in providing information applications that meet specific business and process needs. The company is also beginning to exploit big data sources and mobile technology but will need to invest further to ensure it can address the full spectrum of new business needs. I continue to recommend that any company that must serve a large workforce and needs to blend data and analytics for business intelligence or information applications consider Information Builders.

Regards,

Tony Cosentino

VP and Research Director
