
Ventana Research recently completed the most comprehensive evaluation of mobile business intelligence products and vendors available anywhere today. The evaluation includes 16 technology vendors’ offerings on smartphones and tablets across Apple, Google Android, Microsoft Surface and RIM BlackBerry, assessed in seven key categories: usability, manageability, reliability, capability, adaptability, vendor validation, and TCO and ROI. The result is our Value Index for Mobile Business Intelligence in 2014. The analysis shows that the top supplier is MicroStrategy, which qualifies as a Hot vendor and is followed by 10 other Hot vendors: IBM, SAP, QlikTech, Information Builders, Yellowfin, Tableau Software, Roambi, SAS, Oracle and arcplan.

Our expertise, hands-on experience and the buyer research from our benchmark research on next-generation business intelligence and on information optimization informed our product evaluations in this new Value Index. The research examined business intelligence on mobile technology to determine organizations’ current and planned use and the capabilities required for successful deployment.

What we found was wide interest in mobile business intelligence and a desire to improve the use of information in 40 percent of organizations, though adoption is less pervasive than interest. Fewer than half of organizations currently access BI capabilities on mobile devices, but nearly three-quarters (71%) expect their mobile workforce to be able to access BI capabilities in the next 12 months. The research also shows strong executive support: Nearly half of executives said that mobility is very important to their BI processes.

Ease of access and use are important criteria in this Value Index because the largest percentage of organizations identified usability as an important factor in evaluations of mobile business intelligence applications. This emphasis appears in most of our research, and in this case it also may reflect users’ experience with first-generation business intelligence on mobile devices; not all those applications were optimized for touch-screen interfaces and designed to support gestures. It is clear that today’s mobile workforce requires the ability to access and analyze data simply and straightforwardly, using an intuitive interface.

The top five companies’ products in our 2014 Mobile Business Intelligence Value Index all provide strong user experiences and functionality. MicroStrategy stood out across the board, finishing first in five categories, most notably in user experience, mobile application development and presentation of information. IBM, the second-place finisher, has made significant progress in mobile BI with six releases in the past year, adding support for Android, advanced security features and an extensible visualization library. SAP’s steady support for mobile access to the SAP BusinessObjects platform and to SAP Lumira, along with its integrated mobile device management software, helped produce high scores in various categories and put it in third place. QlikTech’s flexible offline deployment capabilities for the iPad and its high ranking in the assurance-related category of TCO and ROI secured it the fourth spot. With the latest release of WebFOCUS, which renders content directly in HTML5, plus its Active Technologies and Mobile Faves, Information Builders delivers strong mobile capabilities and rounds out the top five. Other noteworthy innovations in mobile BI include Yellowfin’s collaboration technology and Roambi’s use of storyboarding in its Flow application.

Although there is some commonality in how vendors provide mobile access to data, there are many differences among their offerings that can make one a better fit than another for an organization’s particular needs. For example, companies that want their mobile workforce to be able to engage in root-cause discovery analysis may prefer tools from Tableau and QlikTech. For large companies looking for a custom application approach, MicroStrategy or Roambi may be good choices, while others looking for streamlined collaboration on mobile devices may prefer Yellowfin. Many companies may base the decision on mobile business intelligence on which vendor they currently have installed. Customers with large implementations from IBM, SAP or Information Builders will be reassured to find that these companies have made mobility a critical focus.

To learn more about this research and to download a free executive summary, please visit http://www.ventanaresearch.com/bivalueindex/.

Regards,

Tony Cosentino

Vice President and Research Director

Our benchmark research into business technology innovation found that analytics is the most important new technology for improving organizational performance; participants ranked big data only fifth out of six choices. This and other findings indicate that the best way for big data to contribute value to today’s organizations is to be paired with analytics. Recently, I wrote about what I call the four pillars of big data analytics, on which the technology must be built. These pillars are information optimization, predictive analytics, right-time analytics, and the discovery and visualization of analytics. They gave me a framework for looking at Teradata’s approach to big data analytics during the company’s analyst conference last week in La Jolla, Calif.

The essence of big data is to optimize the information used by the business for whatever type of need, which my colleague has identified as a key value of these investments. Data diversity presents a challenge to most enterprise data warehouse architectures. Teradata has been dealing with large, complex sets of data for years, but today’s different data types are forcing new modes of processing in enterprise data warehouses. Teradata is addressing this issue by focusing on a workload-specific architecture that aligns with MapReduce, statistics and SQL. Its Unified Data Architecture (UDA) incorporates the Hortonworks Hadoop distribution, the Aster Data platform and Teradata’s stalwart RDBMS EDW. The Big Data Analytics appliance that encompasses the UDA framework won our annual innovation award in 2012. The system is connected through InfiniBand and accesses Hadoop’s metadata layer directly through HCatalog. Bringing these pieces together represents the type of holistic thinking that is critical for handling big data analytics; at the same time there are some costs, as the system includes two MapReduce processing environments. For more on the UDA architecture, read my previous post on Teradata as well as my colleague Mark Smith’s piece.

Predictive analytics is another foundational piece of big data analytics and one of the top priorities in organizations. However, according to our big data research, it is not available in 41 percent of organizations today. Teradata is addressing it in a number of ways; at the conference Stephen Brobst, Teradata’s CTO, likened big data analytics to a high-school chemistry classroom with a chemical closet from which you pull the chemicals needed to perform an experiment in a separate work area. In this analogy, Hadoop and the RDBMS EDW are the chemical closet, and Aster Data provides the sandbox where the experiment is conducted. With multiple algorithms currently written into the platform and many more promised over the coming months, this sandbox provides a promising big data lab environment. The approach is SQL-centric and as such has its pros and cons. The obvious advantage is that SQL is a declarative language that is easier to learn than procedural languages, and an established skills base exists within most organizations. The disadvantage is that SQL is not the native tongue of many business analysts and statisticians. While it may be easy to call a function within the context of a SQL statement, the same person who can write the statement may not know when and where to call the function. One way for Teradata to expediently address this need is through its existing partnerships with companies like Alteryx, which I wrote about recently. Alteryx provides a user-friendly analytical workflow environment and is establishing a solid presence on the business side of the house. Teradata already works with predictive analytics providers like SAS but should expand further with companies such as Revolution Analytics, which I assessed, that are using R technology to support a new generation of tools.
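
To make the SQL-centric tradeoff concrete, here is a minimal sketch of the pattern: an analytic written in a procedural language is registered with the database and then invoked declaratively inside a SQL statement. It uses Python's sqlite3 module purely for illustration; the table and function names are hypothetical, and this is not Teradata's or Aster's actual API.

```python
import math
import sqlite3

# Hypothetical example data: a small orders table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("a", 10.0), ("a", 30.0), ("b", 5.0)])

# An "analytic" implemented in a procedural language...
def log_amount(x):
    return math.log(x)

# ...registered so SQL can call it by name.
conn.create_function("log_amount", 1, log_amount)

# The analyst invokes it declaratively, without seeing the implementation --
# easy to call, but knowing *when* to call it still requires statistical skill.
rows = conn.execute(
    "SELECT customer, SUM(log_amount(amount)) FROM orders GROUP BY customer"
).fetchall()
print(rows)
```
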

Teradata is exploiting its advantage with algorithms such as nPath, which shows the path that a customer has taken to a particular outcome such as buying or not buying. According to our big data benchmark research, being able to conduct what-if analysis and predictive analytics are the two most desired capabilities not currently available with big data, as the chart shows. The algorithms that Teradata is building into Aster help address this challenge, but despite customer case studies shown at the conference, Teradata did not clearly demonstrate how this type of algorithm and others seamlessly integrate to address the overall customer experience or other business challenges. While presenters described it in terms of improving churn and fraud models, and we can imagine how the handoffs might occur, the presentations were more technical in nature. As Teradata gains traction with these types of analytical approaches, it will behoove the company to show not just how the algorithm and SQL work but how they can be used by business people and analysts who are not as technically savvy.
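
As a rough illustration of what a path-to-outcome analysis in the spirit of nPath computes, the sketch below reconstructs each customer's ordered event sequence and counts the distinct paths that end in a purchase. The event data and logic are hypothetical, and this is not Teradata's nPath syntax.

```python
from collections import defaultdict

# Hypothetical clickstream: (customer, timestamp, event).
events = [
    ("c1", 1, "email"), ("c1", 2, "visit"), ("c1", 3, "buy"),
    ("c2", 1, "ad"),    ("c2", 2, "visit"), ("c2", 3, "abandon"),
    ("c3", 1, "email"), ("c3", 2, "visit"), ("c3", 3, "buy"),
]

# Rebuild each customer's ordered path.
paths = defaultdict(list)
for customer, ts, event in sorted(events, key=lambda e: (e[0], e[1])):
    paths[customer].append(event)

# Count the distinct paths that end in the outcome of interest.
outcome_paths = defaultdict(int)
for customer, seq in paths.items():
    if seq[-1] == "buy":
        outcome_paths[" -> ".join(seq)] += 1

print(dict(outcome_paths))
```

In a production system this counting would happen in-database over billions of rows; the point here is only the shape of the computation.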

Another key principle behind big data analytics is timeliness of the analytics. Given the nature of business intelligence and traditional EDW architectures, until now timeliness of analytics has been associated with how quickly queries run. This has been a strength of the Teradata MPP shared-nothing architecture, but other appliance architectures, such as those of Netezza and Greenplum, now challenge Teradata’s hegemony in this area. Furthermore, trends in big data make the situation more complex. In particular, with very large data sets, many analytical environments have replaced traditional row-level access with column access. Column access is a more natural way for data to be accessed for analytics, since the system does not have to read through an entire row of data that may not be relevant to the task at hand. At the same time, column-level access has downsides, such as the reduced speed at which you can write to the system; also, as the data set used in the analysis expands to a high number of columns, it can become less efficient than row-level access. Teradata addresses this challenge by providing both row and column access through innovative proprietary access and computation techniques.
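
A toy sketch makes the row-versus-column tradeoff tangible: an aggregate over one column out of four scans a quarter of the values in a columnar layout, while a row layout passes over every field of every row. The data and counts here are illustrative only.

```python
# Hypothetical table with four columns, stored two ways.
rows = [(i, f"name{i}", i * 1.5, i % 7) for i in range(1000)]  # row layout

cols = {  # column layout: one list per column
    "id":    [r[0] for r in rows],
    "name":  [r[1] for r in rows],
    "price": [r[2] for r in rows],
    "group": [r[3] for r in rows],
}

# Analytic query: SUM(price). The row store touches every field...
row_fields_touched = sum(len(r) for r in rows)   # 4,000 values scanned
row_total = sum(r[2] for r in rows)

# ...while the column store touches only the price column.
col_fields_touched = len(cols["price"])          # 1,000 values scanned
col_total = sum(cols["price"])

assert row_total == col_total  # same answer, a quarter of the data read
print(row_fields_touched, col_fields_touched)
```

The write-path penalty runs the other way: inserting one new record means appending to every column list rather than writing one contiguous row.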

Exploratory analytics on large, diverse data sets also has a timeliness imperative. Hadoop promises the ability to conduct iterative analysis on such data sets, which, according to our big data benchmark research, is the reason companies store big data in the first place. Iterative analysis is akin to the way the human brain naturally functions, as one question naturally leads to another. However, methods such as Hive, which allows an SQL-like method of access to Hadoop data, can be very slow, sometimes taking hours to return a query. Aster enables much faster access and therefore provides a more dynamic interface for iterative analytics on big data.

Timeliness also has to do with incorporating big data in a stream-oriented environment, and only 16 percent of organizations are very satisfied with the timeliness of events, according to our operational intelligence benchmark research. In use cases such as fraud and security, rule-based systems work with complex algorithmic functions to uncover criminal activity. While Teradata itself does not provide the streaming or complex event processing (CEP) engines, it can provide the big data analytical sandbox and algorithmic firepower necessary to supply the appropriate algorithms for these systems. Teradata partners with major players in this space already, but would be well served to partner further with CEP and other operational intelligence vendors to expand its footprint. These vendors will be covered in our upcoming Operational Intelligence Value Index, which is based on our operational intelligence benchmark research. That same research showed that analyzing business and IT events together was very important in 45 percent of organizations.

The visualization and discovery of analytics is the last foundational pillar, and here Teradata is still a work in progress. While some of the big data visualizations Aster generates show interesting charts, they lack context to help people interpret them. Furthermore, the visualization is not especially intuitive and requires writing and customizing SQL statements. To be fair, most visual discovery tools today are relationally oriented, while Teradata is trying to visualize large and diverse sets of data. Teradata also partners with companies including MicroStrategy and Tableau to provide more user-friendly interfaces. As Teradata pursues the big data analytics market, it will be important to demonstrate how it works with its partners to build a more robust and intuitive analytics workflow environment and visualization capability for the line-of-business user. Usability (63%) and functionality (49%) are the top two considerations when evaluating business intelligence systems, according to our research on next-generation business intelligence.

Like other large industry technology players, Teradata is adjusting to the changes brought by business technology innovation in just the last few years. Given its highly scalable databases and data modeling – areas that still represent the heart of most companies’ information architectures – Teradata has the potential to pull everything together and leverage its current deployed base. Technologists looking at Teradata’s new and evolving capabilities will need to understand the business use cases and share these with the people in charge of such initiatives. For business users, it is important to realize that big data is more than just visualizing disparate data sets; greater value lies in setting up an efficient back-end process that applies the right architecture and tools to the right business problem.

Regards,

Tony Cosentino
VP and Research Director

ParAccel is a well-funded big data startup, with $64 million invested in the firm so far. Only a few companies can top this level of startup funding, and most of them are service-based rather than product-based companies. Amazon has a 20 percent stake in the company and is making a big bet on its technology to run the Redshift cloud data warehouse initiative. MicroStrategy also uses ParAccel for its cloud offering but holds no equity in the company.

ParAccel provides a software-based analytical platform that competes in the database appliance market, and as many in the space are increasingly trying to do, it is building analytic processes on top of the platform. On the base level, ParAccel is a massively parallel processing (MPP) database with columnar compression support, which allows for very fast query and analysis times. It is offered either as software or in an appliance configuration which, as we’ll discuss in a moment, is a different approach than many others in the space are taking. It connects with Teradata, Hadoop, Oracle and Microsoft SQL Server databases as well as financial market data such as semi-structured trading data and NYSE data through what the company calls On Demand Integration (ODI). This allows joint analysis through SQL of relational and non-relational data sources. In-database analytics offer more than 600 functions (though places on the company’s website and datasheets still say just over 500).

The company’s latest release, ParAccel 4.0, introduced product enhancements around performance as well as reliability and scalability. Performance enhancements include advanced query optimization that is said to improve aggregation performance 20X by doing “sort-aware” aggregations that track data properties up and down the processing pipeline. ParAccel’s own High Speed Interconnect protocol has been further optimized, reducing data distribution overhead and speeding query processing. Version 4.0 also introduces new algorithms that exploit I/O patterns to pre-fetch data and store it in memory, which again speeds query processing and reduces I/O overhead. Scalability is addressed by enhancements that enable the system to handle 5,000 concurrent connections, supporting up to 38,000 users on a single system. Its hash join algorithms allow for complex analytics by letting the number of joins fit the complexity of the analytic. Finally, interactive workload management introduces a class of persistent queries that allows short-running and long-running queries to run side by side without impacting performance. This is particularly important, as the integration of on-demand data sources through the company’s ODI approach could otherwise interfere with more interactive user requirements.

The company separates its semi-annual database release cycle from the more iterative analytics release cycle. The new analytic functions released just last month include a number of interesting developments for the company. Text analytics for various feeds enables analysis across a variety of use cases, such as social media, customer comments, and insurance and warranty claims. In addition, functions such as sessionization and JSON parsing add a new dimension of analytics for ParAccel, as web data can now be analyzed. The new analytic capabilities allow the company to address a broad class of use cases such as “golden path analysis,” fraud detection, attribution modeling, segmentation and profiling. Interestingly, some of these use cases are of the same character as those seen in the Hadoop world.
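
For readers unfamiliar with the term, sessionization simply splits a user's clickstream into sessions wherever the gap between consecutive events exceeds a threshold. Here is a minimal sketch of the idea, with a hypothetical 30-minute gap and made-up data; it is not ParAccel's actual function.

```python
from datetime import datetime, timedelta

SESSION_GAP = timedelta(minutes=30)  # assumed threshold

clicks = [  # (user, timestamp), assumed sorted per user
    ("u1", datetime(2013, 4, 1, 9, 0)),
    ("u1", datetime(2013, 4, 1, 9, 10)),
    ("u1", datetime(2013, 4, 1, 10, 30)),  # > 30-minute gap: new session
    ("u1", datetime(2013, 4, 1, 10, 40)),
]

sessions = []
last_seen = {}   # last event time per user
session_id = {}  # running session counter per user
for user, ts in clicks:
    if user not in last_seen or ts - last_seen[user] > SESSION_GAP:
        session_id[user] = session_id.get(user, 0) + 1
    last_seen[user] = ts
    sessions.append((user, ts, session_id[user]))

print(sessions)
```

In-database, the same assignment is typically expressed as a windowed SQL function over ordered rows, which is what makes it attractive to run where the web data already lives.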

So where does ParAccel fit in the broader appliance landscape? According to our benchmark research on big data, more than 35 percent of businesses plan to use appliance technology, but the market is still fragmented. The appliance landscape can be broken down into categories that include hardware and software that run together, software that can be deployed across commodity hardware, and non-relational parallel processing paradigms such as Hadoop. This landscape gets especially interesting when we look at Amazon’s Redshift and the idea of elastic scalability on a relational data warehouse. The lack of elastic scalability in the data warehouse has been a big limitation for business; it has traditionally taken significant money, time and energy to implement.

With its “Right to Deploy” pricing strategy, ParAccel promises the same elasticity with its on-premises deployments. The new pricing policy removes the traditional per-node pricing obstacles by offering prices based on “unlimited data” and takes into consideration the types of analytics a company wants to deploy. This strategy may play well against companies that sell their appliances only bundled with hardware. Such vendors will have a difficult time matching ParAccel’s pricing because of their hardware-driven business model. While the offer is likely to get ParAccel invited into more consideration sets, it remains to be seen whether it wins more deals based on it.

Partnerships with Amazon and MicroStrategy to provide cloud infrastructure produce a halo effect for ParAccel, but the cloud approaches compete against ParAccel’s internal sales efforts. One of the key differentiators for ParAccel as it competes against the cloud version of itself will be the analytics stacked on top of the platform. Since neither the Redshift nor the MicroStrategy cloud offering currently licenses the upper parts of this value stack, customers and prospects will likely hear quite a bit about the library of 600-plus functions and the ability to address advanced analytics for clients. The extensible approach and the fact that the company has built analytics as a first-class object in its database allow the architecture to address speed, scalability and analytic complexity. The one potential drawback, depending on how you look at it, is that the statistical libraries are based on user-defined functions (UDFs) written in a procedural language. While the library integration is seamless to end users and scales well, if a company needs to customize the algorithms, data scientists must go into the underlying procedural programming language to make the changes. The upside is that the broad library of analytics can be used through the SQL paradigm.

While ParAccel aligns closely with the Hadoop ecosystem in order to source data, the company also seems to be welcoming opportunities to compete with Hadoop. Some of the use cases mentioned above, such as so-called “golden path analysis,” have been offered as key Hadoop analytic use cases. Furthermore, many Hadoop vendors are bringing the SQL access paradigm and traditional BI tools together with Hadoop to mitigate the skills gap in organizations. But if an MPP database like ParAccel that is built natively for relational data can also do big data analytics, and can deliver a more mature product with similar horizontal scalability and cost structure, the argument for standard SQL analytics on Hadoop becomes less compelling. If ParAccel is right, and SQL is the lingua franca of analytics, then it may be in a good position to fill the so-called skills gap. Our benchmark research on business technology innovation shows that the biggest challenge for organizations deploying big data today revolves around staffing and training, with more than 77 percent of companies claiming to be challenged in both categories.

ParAccel offers a unique approach in a crowded market. The new pricing policy is a brilliant stroke: it will not only get the company invited into more bid opportunities but also move client conversations away from the technology-oriented three Vs and toward analytics and the business-oriented three Ws. If the company puts pricing pressure on the integrated appliance vendors, it will be interesting to see whether any of those vendors begin to separate out their own software and allow it to run on commodity hardware. That would be a hard decision for them, since their underlying business models often rely on an integrated hardware/software strategy. With companies such as MicroStrategy and Amazon choosing ParAccel for their underlying analytical platforms, the company is one to watch. Depending on the use case and the organization, ParAccel’s in-database analytics should be readily considered and contrasted with other approaches.

Regards,

Tony Cosentino

VP and Research Director

MicroStrategy CEO Michael Saylor has a keen sense of where things are headed. He sees mobile and social as the two drivers of a world based largely in software. Last year I covered the announcements at the MicroStrategy events in Amsterdam and the vision Saylor put forth in his keynote speech. MicroStrategy World 2013 last month found the company delving into such diverse areas as identity management, marketing services and integrated point-of-sale applications. The uniting factor is mobile intelligence.

At the event, MicroStrategy highlighted three innovative product lines. Usher, announced in 2012, is a mobile identity management system that allows organizations to issue digital credentials to a mobile device. Alert provides a mobile shopper experience, including promotions, a product locator, transaction capabilities and receipt delivery. Wisdom, winner of the 2012 Ventana Research Technology Innovation Award for Social Media, mines social media data from Facebook to help drive brand insight. Along with large investments in cloud and mobile intelligence, these technologies illustrate where the company is headed.

In a breakout session provokingly titled “Beat Amazon and Google with Revolutionary Retail Apps for Your Store Operations,” MicroStrategy Vice President of Retail Frank Andryauskas brought the company’s technologies to life by outlining a typical in-store mobile purchase process. A customer may start by using Alert to engage social media while he looks at items on his phone or tablet and checks prices, sizes or availability within the application. Based on his selection, he may want recommendations through Wisdom for items that his friends like or that appeal to them because of their unique preferences. He could choose to purchase an item with a coupon promotion delivered through Alert, or have the item drop-shipped to his home or to the store.

On the back end, marketers can run purchase path analytics that tie the customer experience to the transaction. This in turn helps with promotional strategies that can influence purchase behavior at the store level. The key for the retailer, as well as for MicroStrategy, is to create customer value through an in-store and online experience that is differentiated from ones in other stores. The tools help retailers move beyond “showrooming” and leverage their physical assets to drive competitive advantage.

The MicroStrategy mobile retail vision gets even more compelling when you look at what’s going on with its customers, including large retailers that are using analytics to drive employee engagement in brick-and-mortar retail environments, which in turn can improve customer retention and increase share of wallet. The Container Store demonstrated how it uses MicroStrategy mobile BI to allow employees to view their performance as compared with their peers. This taps into a fundamental human desire to be on the leading part of a curve and never lag behind. Friendly competition between stores with similar footprints and trade areas can drive best-in-class store performance. It will be interesting to see whether MicroStrategy can leverage this gamelike approach across other industries, such as travel and tourism, government, manufacturing and healthcare.

MicroStrategy has a strong presence and compelling use cases in the pharmaceutical industry, with solutions around mobile sales force enablement, where operating smartphones and tablets is a priority today. This area can show tremendous productivity gains, as in-meeting effectiveness often requires fast and easy access to pricing, distribution and benchmark data. The ability to communicate with other team members in real time during the sales process and to conduct transactions on the spot can reduce sales cycle times. Ancillary benefits include providing an audit trail of the best sales processes and representatives, so that, much as in the retail environment, pharmaceutical companies can develop and replicate a best-in-class approach.

While the company’s long-range vision is solid, MicroStrategy may be too far ahead of the curve. The company is on the leading edge of mobile; it may have spent more than it had to in order to catch the mobile wave, but it is more ready than any other BI provider. With technologies such as Wisdom, Alert and Usher, it may be in a position similar to the one it was in a few years ago with mobile. Wisdom uses “like” data from Facebook to drive analytics, but how far can that data really get a marketer today? This innovation needs to pay more dividends for marketers, and it might in the future as Facebook introduces a categorical verb universe that denotes specific attitudes and purchase intent. Alert could be good for a midmarket retailer, if its value and ease of use are compelling enough for mobile users to download the application and sign up as a store customer. Usher is spot-on in its intent to manage digital identity, but uptake may be slow, since separating data about the user from data about the phone is challenging.

In sum, MicroStrategy is pressing its advantage in mobile intelligence solutions and is figuring out ways to drive that advantage into the mobile applications market. It is investing heavily in enterprise business intelligence applications in the cloud, where it already has more than 40 customers. It has an industry-leading business intelligence toolkit and was ranked as a Hot vendor in our 2012 Business Intelligence Value Index.

MicroStrategy has a lot going for it, but it is also placing a broad set of innovation bets relative to its size. In a recent interview, Saylor said, “If these things play out the way I expect, then we’re a $10 billion revenue company, out 10 years. If they don’t play out the way I expect, then whatever. We’ll muddle along and we’ll do what we’re going to do.” I’m inclined to agree.

Regards,

Tony Cosentino

VP & Research Director

MicroStrategy announced version 9.3 this month in Amsterdam, just ahead of MicroStrategy World, the company’s annual conference for the European market. Release 9.3 delivers significant updates in four main areas: big data, advanced analytics, automated administration and visual data discovery.

The announcements on the big data front have to do with bringing data together from disparate sources, enriching available data, and new report search capabilities. Providing more automated support for data access and preparation is critical, as both our benchmark research on big data and our predictive analytics benchmark research identified these as key obstacles to gaining business value from available data. The data source access improvements in 9.3 include improved access to departmental data, including data from spreadsheets and Salesforce.com, and from multidimensional sources such as Microsoft Analysis Services and Cognos TM1. The software can access data from SAP’s HANA appliance and use a Thrift connector to Hadoop distributions, including those of Cloudera and Amazon Web Services. The data enrichment enhancements include expansion of data based on ZIP code or date. Such location intelligence features address a hot area with great potential in database cleansing and enrichment. We’ll be exploring these trends in our upcoming benchmark research on location intelligence. MicroStrategy 9.3 also provides a Google-like function to discover reports and dashboards, so users don’t have to spend unnecessary time looking for reports or creating new ones.
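
The kind of date-based enrichment described here amounts to expanding a single value into derived attributes useful for analysis. A minimal sketch, with hypothetical field names (not MicroStrategy's implementation):

```python
from datetime import date

def enrich_date(d: date) -> dict:
    """Expand one date into analysis-ready attributes."""
    quarter = (d.month - 1) // 3 + 1
    return {
        "date": d.isoformat(),
        "year": d.year,
        "month": d.month,
        "quarter": f"Q{quarter}",
        "weekday": d.strftime("%A"),
    }

row = enrich_date(date(2012, 11, 5))
print(row)
```

ZIP-code enrichment works the same way in principle, except the derived attributes (city, state, demographics) come from a lookup table rather than arithmetic.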

With respect to Hadoop access, the company has four approaches. The first is to bring data from Hadoop into an in-memory structure for visual exploration and rapid prototyping using the imported data. This approach is interesting, but you still need to define your Hadoop queries before you do the analysis in memory, which takes away the exploratory element of big data. The second is to run freeform queries directly against Hadoop using Pig Latin or HiveQL. This gives users back the exploratory aspect but introduces complexity and sacrifices speed. The third is to model the data with a traditional multidimensional approach, while the fourth is to merge the Hadoop data with the enterprise data warehouse into a uniform view. The company says this last model is gaining traction with a number of its clients, which is in line with what we have been seeing from others in the space. Providing these options is critical, as our big data benchmark research found Hadoop to be one of the key technologies planned in almost a third of organizations (32%).

For advanced analytics, the new release integrates R statistical packages into the MicroStrategy BI platform, which allows for advanced in-database analytics with any available R algorithm, including many custom R developments. Version 9.3 supports the most-used algorithms straight out of the box, with more than 300 functions. While others have integrated R, few have gone as far as integrating the visualization aspects of R, as MicroStrategy does in this release. R is well known as an analytical tool, but most users don't know about its visualization capabilities. The R language is gaining traction in both the academic and business worlds, with universities, large government organizations and the pharmaceutical industry all showing significant support. This integration of predictive analytics into business intelligence is an important step for MicroStrategy; our predictive analytics benchmark research found that 58 percent of organizations consider this a priority.
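In-database integration of this kind means a statistical routine runs as part of a query rather than in a separate tool. To make that concrete, here is a minimal sketch (in plain Python, not R, and not MicroStrategy's implementation) of the kind of routine being exposed, a least-squares linear fit, one of the staple R computations:

```python
# A least-squares linear fit -- the kind of statistical routine an R
# integration exposes inside the BI platform. Pure Python, no packages.
def linear_fit(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Fit quarterly marketing spend against revenue (hypothetical figures).
slope, intercept = linear_fit([1, 2, 3, 4], [3, 5, 7, 9])
print(slope, intercept)  # 2.0 1.0
```

The point of the integration is that a business user never sees this code; the fitted trend simply appears as another metric in a report or dashboard.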

The third area of improvement is the introduction of System Manager, a GUI administrative workflow tool that the company claims will reduce operating costs by more than 50 percent. The tool allows users to create administrative workflows that combine MicroStrategy admin products and third-party tools to do things such as create an Amazon instance. Use cases include running MicroStrategy intelligence reports, scheduling daily report execution and migrating objects. The package is priced separately, which is reasonable since this is a capability most BI packages do not offer.

The fourth and final area of improvement involves the already formidable Visual Insight, a visual data discovery tool MicroStrategy introduced last year. Visual discovery tools continue to gain traction in the market due to their ease of use and their ability to give time back to analysts. Our benchmark research into big data found that visualization is a top priority and an unmet need in 37 percent of existing deployments, as is predictive analytics in 41 percent of organizations. The new capabilities of the 9.3 release include density maps, which help to highlight geographic concentration levels such as sales volume; network diagrams for analyzing web traffic, affinity marketing or market-basket data; and image layouts, which allow for visual mashups. Other enhancements to Visual Insight include a wizard that suggests appropriate visualizations based on the data, the ability to do rank filtering, and shortcuts to commonly used metrics such as counts, moving averages and running totals. Finally, the ease of creating and distributing dashboards is significantly improved. Drag-and-drop visualizations and visualization-to-visualization overlays are impressive, and I expect to see others try to emulate them in the future.
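The metric shortcuts mentioned above (counts, moving averages, running totals) are simple windowed computations. As an illustration of what such shortcuts compute, assuming nothing about MicroStrategy's internals:

```python
from itertools import accumulate

def moving_average(values, window):
    """Trailing moving average over a fixed window (shorter at the start)."""
    result = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1): i + 1]
        result.append(sum(chunk) / len(chunk))
    return result

sales = [10, 20, 30, 40]
print(list(accumulate(sales)))   # running total: [10, 30, 60, 100]
print(moving_average(sales, 2))  # [10.0, 15.0, 25.0, 35.0]
```

The value of a shortcut is that an analyst gets these derived columns with one click instead of writing the expression by hand.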

Mobile business intelligence wasn’t addressed directly in the 9.3 release, but MicroStrategy’s platform for mobile applications was the focus of the 9.2.1m release in January. Mobile intelligence is a big part of MicroStrategy’s strategy, and it was also a big part of the conference in Amsterdam. In a separate blog post, I wrote about Michael Saylor’s keynote speech, his new book, The Mobile Wave, and the company’s direction in mobile technology. MicroStrategy has been investing heavily in mobile for a while, especially around native support for Apple’s iOS.

In sum, the MicroStrategy 9.3 release is a big advancement for a firm already providing leadership in the analytics market. Given its advantage of being an enterprise platform now moving into discovery tools with Visual Insight, the company is likely in a better position to expand than many of the discovery players trying to move upstream into an enterprise role. The fact that the company has built the platform from the ground up also gives it an advantage over some of the larger players with less-than-organic growth strategies. For organizations with MicroStrategy already installed, the 9.3 upgrade (and any accompanying memory upgrades) makes plenty of sense. Any firm looking for deeper support of Hadoop, predictive analytics and visual discovery should examine the 9.3 release from MicroStrategy.

Regards,

Tony Cosentino – VP & Research Director

On the heels of the release of his new book, The Mobile Wave, MicroStrategy CEO Michael Saylor delivered an interesting keynote at MicroStrategy World in Amsterdam this past week. Unlike other keynotes we’ve seen at various supplier conferences, the presentation was not a sales pitch. There was no reference to the fact that the company was simultaneously launching MicroStrategy 9.3, a major new release of its flagship offering. The presentation focused almost entirely on the rise of mobile computing and its ability to change the world. Saylor sees the Apple iPad at the heart of the mobile revolution, and notes that BI capabilities delivered through the device are displacing paper and people within organizations. The iPad’s 10-inch screen, which can display 90 percent of printed pages, is the key for companies to unlock the shackles of the physical office environment. Between the lines, it’s easy to read that MicroStrategy is betting a lot on mobile and on the iPad.

Saylor’s argument against paper is relatively straightforward. For years we’ve been talking about the paperless office, but technology has not yet allowed us to get away from paper, and executives still use it for all types of reports and data. Business intelligence before mobile was restricted to columnar reporting, and business intelligence before device interactivity was a manual, paper-based process in which an executive asked an analyst to run a report to answer a question, then looked at the report on paper. The results often inspired other questions, sending the executive back to the analyst to run yet another report, and so on. Finally, once the executive’s questions were answered, he could ask an employee to take action based on his conclusions.

The iPad, Saylor argues, changes all of this, since iOS and the 10-inch screen allow us to look at standard-size documents and interact with company data. Given the revolutionary capabilities of mobile BI systems, an executive can interactively and visually query multiple data sources, get answers immediately, run his own scenarios, and take action, all from the sidelines of his kid’s soccer game. The executive, now doing the job of three people, is much more productive (if a bit lonelier).

How does MicroStrategy’s iPad-focused BI strategy stack up in the new mobile world that also contains tablets such as Google’s Nexus 7 and Microsoft’s Surface? In his presentation and over the course of the conference, Saylor took aim at the mobile strategies of a number of industry stalwarts, including Google and Microsoft. Microsoft in particular, he suggested, alienated both its customers and its partners with its recent preannouncement of the Surface tablet computer.

The most obvious competitor currently in the enterprise environment is Google’s Android, but the Android development community is focused on the smartphone, not the tablet. Google’s Nexus 7 suggests that the company is not keen to take on the iPad directly in the enterprise market; the 7-inch screen suggests consumer ambitions. One argument Saylor gives against Android is that it lacks tight enough integration between the hardware and the software for delivery on a 10-inch device. I’m curious whether this argument will still hold as Google starts to produce larger form-factor devices with tighter hardware and software integration, and as improved content parsing technologies allow for more information to be consumed on different-sized devices.

The more interesting enterprise play is Microsoft’s Surface tablet running Windows 8 on Intel chips. When it is finally introduced, the Microsoft advantages will be hard to ignore. By moving away from ARM-based chips, Microsoft will be able to provide full access to entrenched office productivity software, tight integration with other Windows-based hardware and software, and backward compatibility. Hewlett-Packard, in signing with Intel earlier this year, signaled its own move into Windows 8 tablets, and HP’s global distribution power could make this an important milestone. The challenge is whether businesses will embrace Microsoft’s tablet, and how many people will bring the technology into the workplace and expect business intelligence support for it.

Unlike Google today and Microsoft tomorrow, Apple takes a “walled garden” approach to its operating system and applications, and enterprise IT departments generally do not like this idea, especially as it relates to security. On the other hand, the developer community in this garden is huge, and the “bring your own device” (BYOD) trend is really helping drive iPhone and iPad into the corporate market. The most influential businesspeople and cultural icons in our society carry iPads, and corporations, much to the chagrin of IT departments all over the world, are being forced to deal with this phenomenon.

On a practical note, I had an opportunity to test-drive MicroStrategy’s platform for mobile applications. I built a number of interactive mobile dashboards for the iPad, the iPhone and my own smartphone running Android. While things worked well on the iPad and the iPhone, the Android applications had a lot of issues. I’m not sure whether this was due to MicroStrategy’s lack of focus on Android or to Android itself. What I do know is that MicroStrategy Mobile works well on the iPad; just about any user can create designs with minimal training, and not having to wait for coders is a huge advantage.

Nevertheless, an Apple-focused bet in the enterprise environment is a bit risky as new devices come onto the market. It will be interesting to look at MicroStrategy’s tack in the context of our upcoming benchmark research on next-generation business intelligence, which focuses on mobile and collaboration technologies in the enterprise BI environment.

Regards,

Tony Cosentino – VP & Research Director
