
LogiXML has been around for more than a decade, but has seen particularly robust growth in the past year. Recent announcements show the company with better than 100 percent year-over-year growth, driven by a 97 percent license renewal rate and new customer growth in SMB, departmental and OEM deployments. Growth of 158 percent year over year in the fourth quarter for the embedded analytics group was particularly strong.

LogiXML categorizes its products into Logi Info, its flagship product targeted primarily at IT and developers; Logi Ad Hoc, targeted as a self-service tool for business users; and a suite of the two bundled with ETL capabilities. Just a few weeks ago, LogiXML released Logi Info v11. The product offers enhanced visualization capabilities, making analytics easier and more robust for end users. Building on an already solid product, the changes in Logi Info seem to be more evolutionary than revolutionary, and appear to be focused on keeping up with the functionality necessary to compete in today’s BI market. That said, the core product has advantages as simple, highly embeddable software. Its light footprint and ease of use are competitive advantages, allowing companies to quickly and easily put business intelligence capabilities into the hands of end users, as well as into the applications themselves. Usability is by far the number one criterion for companies choosing a business intelligence application, according to our benchmark research on next-generation business intelligence.

LogiXML’s ease of use for developers enables companies to tell a compelling time-to-value story for embedding intelligence directly into applications, and it takes advantage of the market trend toward agile development. As I’ve written on numerous occasions, most recently in Big Data Faces a Chasm of Misunderstanding, the key challenge for organizations is overcoming the lack of an analytics-driven culture. This challenge differs for IT and for business users: IT focuses on standards, security and information management, while business users focus on gaining specific insights for business decisions. My colleague Mark Smith has also written about this environment, and specifically about how BI tools are failing business. LogiXML’s ability to put BI directly into applications in an agile way addresses many of these criticisms, since applications fit directly into the role-based decision environments that drive companies today.

Interestingly, the company is said to be rebranding in 2013. Rebranding is always a large effort, and it often pays off for a consumer brand or a broad business brand, but given LogiXML’s size and core advantage in the OEM market, the return on such a move is questionable. Perhaps LogiXML is looking to drive its brand into the business user side of the equation with an “Intel Inside” type of strategy, or to target business users directly. Either way, the end game here is unclear given a limited data discovery portfolio and no clear path to selling directly to business end users. For LogiXML, that path still runs primarily through IT and only secondarily through business. Furthermore, many new business intelligence and analytics brands coming on the market also target the business end user, so the company would need to make a significant investment in basic awareness-building to bring a new brand to parity with its current brand equity.

Branding aside, the challenge to the company’s continued growth trajectory is twofold. First, it is reaching an inflection point in its number of employees. As a company approaches about 200 employees, as LogiXML is now, company culture often transforms itself. With solid leadership and a strong roadmap, LogiXML should be able to overcome this first challenge. The second challenge is that it operates in a highly competitive segment, against all the major stack players as well as the more cost-oriented open source players. The expansion of the market from business intelligence to business analytics, including data discovery and predictive analytics, introduces a host of new players and dilutes the broader opportunity as customers settle into their approaches.

In sum, LogiXML is often able to provide the path of least resistance and solid time-to-value for companies looking to quickly and effectively roll out business intelligence capabilities. In particular, as SaaS providers and OEM relationships continue to gain steam, the embeddable nature of LogiXML’s platform and the ability to construct applications at an elemental level give the company a good market position from which to advance. Like many others in the increasingly crowded BI space, it will need to capitalize on its current momentum and solidify its position before others invade more of its addressable space. Organizations looking for an integrated business intelligence platform and tools should assess the improvements in this latest release from LogiXML.

Regards,

Tony Cosentino

VP and Research Director

The challenge with discussing big data analytics is in cutting through the ambiguity that surrounds the term. People often focus on the three Vs of big data – volume, variety and velocity – which provide a good lens for big data technology but get us only part of the way to understanding big data analytics, and provide even less guidance on how to take advantage of big data analytics to unlock business value.

Part of the challenge of defining big data analytics is a lack of clarity around the big data analytics value chain – from data sources, to analytic scalability, to analytic processes and access methods. Our recent research on big data finds many capabilities still not available, from predictive analytics (41%) to visualization (37%). Moreover, organizations are unclear on how best to change the way they approach data analysis to take advantage of big data, and on what processes and technologies they ought to use. The growing use of appliances, Hadoop and in-memory databases, and the expanding footprints of RDBMSs, all add up to pressure for more intelligent analytics, but the most direct and cost-effective path from here to there is unclear. What is certain is that as business analytics and big data increasingly merge, the potential for increased value is raising expectations.

To understand the organizational chasm that exists with respect to big data analytics, it’s important to understand two foundational analytic approaches in use in organizations today. Former Census Bureau Director Robert Groves’ ideas about designed data and organic data give us a good jumping-off point for this discussion, especially as it relates to big data analytics.

In Groves’ estimation, the 20th century was about designed data, or what might be considered hypothesis-driven data. With designed data we engage in analytics by establishing a hypothesis and collecting data to prove or disprove it. Designed data is at the heart of confirmatory analytics, where we go out and collect data relevant to the assumptions we have already made. Designed data is often considered the domain of the statistician, but it is also at the heart of structured databases, since we assume that all of our data can fit into columns and rows and be modeled in a relational manner.
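To make the designed-data approach concrete, here is a minimal sketch in Python of a confirmatory analysis. The promotion scenario and every figure in it are invented for illustration; what matters is the shape of the workflow: the hypothesis comes first, and data are collected only to confirm or reject it.

```python
# A minimal sketch of the designed-data (confirmatory) workflow:
# state a hypothesis up front, then test data gathered against it.
# All numbers below are illustrative, not real research data.
from scipy import stats

# Hypothesis: customers who saw the promotion spend more per visit.
control = [42.10, 38.75, 45.20, 40.05, 39.90, 44.30]   # no promotion
treated = [48.60, 44.15, 51.30, 46.80, 43.95, 50.10]   # saw promotion

t_stat, p_value = stats.ttest_ind(treated, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject the null hypothesis: the promotion effect is significant.")
else:
    print("Cannot reject the null hypothesis.")
```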

In contrast to the designed data approach of the 20th century, the 21st century is about organic data. Organic data is not limited by a specific frame of reference that we apply to it, and because of this it grows without limits and without any structure other than that provided by randomness and probability. Organic data represents all data in the world, but for pragmatic purposes we may think of it as all the data we are able to instrument. RFID, GPS data, sensor data, sentiment data and various types of machine data are all organic data sources that may be characterized by context or by attributes such as data sparsity (also known as low-density data). Much like the interpretation of silence in a conversation, analyzing big data is as much about interpreting what lies between the lines as it is about what we can put on the lines themselves.
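A small sketch can make the contrast tangible. Assuming a handful of invented machine events (no real deployment is depicted), it shows why organic data resists the columns-and-rows assumption: each record carries only the attributes it happens to have, so any single table built to hold the stream is mostly empty.

```python
# Organic data arrives as an open-ended event stream with no fixed
# schema; the events below are invented for illustration.
events = [
    {"device": "rfid-17", "ts": "2013-02-01T09:02:11", "dock": "A3"},
    {"device": "gps-042", "ts": "2013-02-01T09:02:14", "lat": 38.89, "lon": -77.03},
    {"device": "temp-09", "ts": "2013-02-01T09:05:40", "celsius": 4.2},
    {"device": "rfid-17", "ts": "2013-02-01T09:41:57", "dock": "B1"},
]

# Forcing the stream into one relational table exposes the sparsity:
# most cells are empty, and the gaps themselves carry information.
columns = sorted({key for event in events for key in event})
print(columns)
for event in events:
    print([event.get(col) for col in columns])  # mostly None: low-density data
```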

These two types of data and the analytics associated with them reveal the chasm that exists within organizations, and they shed light on the skills gap that our predictive analytics benchmark research shows to be the primary challenge for analytics in organizations today. This research finds inadequate support in many areas, including product training (26%) and applying predictive analytics to business problems (23%).

On one side of the chasm are the business groups and the analysts who are aligned with Groves’ idea of designed data. These groups may encompass domain experts in areas such as finance or marketing, advanced Excel users, and even Ph.D.-level statisticians. These analysts serve organizational decision-makers and are tied closely to actionable insights that lead to specific business outcomes. They primarily get work done in a flat-file environment, as my colleague Mark Smith outlined in some detail last week. In this environment, Excel is often the lowest common denominator.

On the other side of the chasm are the IT and database professionals, where a different analytical culture and mindset exist. The primary challenge for this group is dealing with the three Vs and simply organizing data into a legitimate enterprise data set. This group is often more comfortable with the large data sets and machine learning approaches that are the hallmark of the organic data of the 21st century. Their analytical environment differs from that of their business counterparts: rather than Excel, SQL is often the lowest common denominator.

As I wrote in a recent blog post, database professionals and business analytics practitioners have long lived in parallel universes. In technology, practitioners deal with tables, joins and the ETL process. In business analysis, practitioners deal with datasets, merges and data preparation. When you think about it, these are the same things. The subtle difference is that database professionals have had a data mining mindset or, as Groves calls it, an organic data mindset, while the business analyst has had a designed data, statistics-driven mindset. The bigger differences revolve around the cultural mindset and the tools used to carry out the analytical objectives. These differences represent the current conundrum for organizations.
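The equivalence is easy to demonstrate. The sketch below builds the same two hypothetical tables twice, once as dataframes and once in an in-memory SQLite database, and shows that the analyst’s merge and the database professional’s join produce identical results; only the vocabulary differs.

```python
# The same lookup in both dialects: the analyst's dataframe merge and
# the database professional's SQL join. All data here are hypothetical.
import sqlite3

import pandas as pd

orders = pd.DataFrame({"customer_id": [1, 2, 1], "amount": [250, 100, 75]})
customers = pd.DataFrame({"customer_id": [1, 2], "region": ["East", "West"]})

# The business analyst's idiom: a merge on a shared key.
merged = pd.merge(orders, customers, on="customer_id")

# The database professional's idiom: a join over the same data.
con = sqlite3.connect(":memory:")
orders.to_sql("orders", con, index=False)
customers.to_sql("customers", con, index=False)
joined = pd.read_sql_query(
    "SELECT o.customer_id, o.amount, c.region "
    "FROM orders o JOIN customers c ON o.customer_id = c.customer_id",
    con,
)

# Same rows either way, once row ordering (which SQL does not
# guarantee) is normalized away.
def normalize(df):
    return df.sort_values(["customer_id", "amount"]).reset_index(drop=True)

print(normalize(merged).equals(normalize(joined)))  # True
```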

In a world of big data analytics, these two sides of the chasm are being pushed together in a shotgun wedding, because the marriage of these groups is how competitive advantage will be achieved. Both groups have critical contributions to make, but they need to figure out how to work together before they can truly realize the benefits of big data analytics. The firms that understand that merging these different analytical cultures is the primary challenge facing the analytics organization, and that develop approaches to deal with it, will take the lead in big data analytics. We already see this as a primary focus area for leading professional services organizations.

In my next analyst perspective on big data I will lay out some pragmatic approaches companies are using to address this big data analytics chasm; these also represent the focus of the benchmark research we’re currently designing to understand organizational best practices in big data analytics.

Regards,

Tony Cosentino

VP and Research Director

MicroStrategy CEO Michael Saylor has a keen sense of where things are headed. He sees mobile and social as the two drivers of a world based largely in software. Last year I covered the announcements at the MicroStrategy events in Amsterdam and the vision Saylor put forth in his keynote speech. MicroStrategy World 2013 last month found the company delving into such diverse areas as identity management, marketing services and integrated point-of-sale applications. The uniting factor is mobile intelligence.

At the event, MicroStrategy highlighted three innovative product lines. Usher, announced in 2012, is a mobile identity management system that allows you to issue digital credentials on a mobile device. Alert provides a mobile shopper experience, including promotions, a product locator, transaction capabilities and receipt delivery. Wisdom, winner of the 2012 Ventana Research Technology Innovation Award for Social Media, mines social media data from Facebook to help drive brand insight. Along with large investments in cloud and mobile intelligence, these technologies illustrate where the company is headed.

In a breakout session provocatively titled “Beat Amazon and Google with Revolutionary Retail Apps for Your Store Operations,” MicroStrategy Vice President of Retail Frank Andryauskas brought the company’s technologies to life by outlining a typical in-store mobile purchase process. A customer might start by using Alert to engage social media while he looks at items on his phone or tablet and checks prices, sizes or availability within the application. Based on his selection, he might want recommendations through Wisdom for items that his friends like or that appeal to them because of their unique preferences. He could choose to purchase an item with a coupon promotion delivered through Alert, or have the item drop-shipped to his home or to the store.

On the back end, marketers can run purchase path analytics that tie the customer experience to the transaction. This in turn helps with promotional strategies that can influence purchase behavior at the store level. The key for the retailer, as well as for MicroStrategy, is to create customer value through an in-store and online experience that is differentiated from those of other stores. The tools help retailers move beyond “showrooming” and leverage their physical assets to drive competitive advantage.

The MicroStrategy mobile retail vision gets even more compelling when you look at what’s going on with its customers, including large retailers that are using analytics to drive things such as employee engagement in a brick-and-mortar retail environment, which in turn can improve customer retention and increase share of wallet. The Container Store demonstrated how it uses MicroStrategy mobile BI to let employees view their performance compared to that of their peers. This taps into a fundamental human need to be on the leading part of a curve and never lag behind. Friendly competition between stores with similar footprints and trade areas can drive best-in-class store performance. It will be interesting to see whether MicroStrategy can extend this gamification approach to other industries, such as travel and tourism, government, manufacturing and healthcare.

MicroStrategy also has a strong presence and compelling use cases in the pharmaceuticals industry, with solutions around mobile sales force enablement, where operating smartphones and tablets is a priority today. This area can show tremendous productivity gains, as in-meeting effectiveness often requires fast and easy access to pricing, distribution and benchmark data. The ability to communicate with other team members in real time during the sales process and to conduct transactions on the spot can reduce sales cycle times. Ancillary benefits include providing an audit trail of the best sales processes and representatives so that, much like in the retail environment, pharmaceutical companies can develop and replicate a best-in-class approach.

While the company’s long-range vision is solid, MicroStrategy may be too far ahead of the curve. I would argue that the company is on the leading edge of mobile, and while it may have spent more than it needed to in order to catch the mobile wave, it is more ready for that wave than any other BI provider. With technologies such as Wisdom, Alert and Usher, it may be in a position similar to the one it occupied a few years ago with mobile. Wisdom uses “like” data from Facebook to drive analytics, but how far can that data really get a marketer today? This innovation needs to pay more dividends for marketers, and it might in the future if Facebook introduces a categorical verb universe that denotes specific attitudes and purchase intent. Alert could be good for a mid-market retailer, if its value and ease of use are compelling enough for mobile users to download the application and sign up as store customers. Usher is spot-on in its intent to manage digital identity, but uptake may be slow, since separating data about the user from data about the phone is challenging.

In sum, MicroStrategy is pressing its advantage in mobile intelligence solutions and is figuring out ways to drive that advantage into the mobile applications market. It is investing heavily in enterprise business intelligence applications in the cloud, where it already has more than 40 customers. It has an industry-leading business intelligence toolkit and was ranked as a hot vendor in our 2012 Business Intelligence Value Index.

MicroStrategy has a lot going for it, but it is also placing a broad set of innovation bets relative to its size. In a recent interview, Saylor said, “If these things play out the way I expect, then we’re a $10 billion revenue company, out 10 years. If they don’t play out the way I expect, then whatever. We’ll muddle along and we’ll do what we’re going to do.” I’m inclined to agree.

Regards,

Tony Cosentino

VP & Research Director
