
In many organizations, advanced analytics groups and IT are separate, and there often is a chasm of understanding between them, as I have noted. A key finding in our benchmark research on big data analytics is that communication and knowledge sharing are a top benefit of big data analytics initiatives, but often a latent one. That is, prior to deployment, communication and knowledge sharing are deemed a marginal benefit, but once the program is deployed they are deemed a top benefit. From a tactical viewpoint, organizations may not spend enough time defining a common vocabulary for big data analytics prior to starting the program; our research shows that fewer than half of organizations have agreement on the definition of big data analytics. It makes sense, therefore, that along with a technical infrastructure and management processes, explicit communication processes at the beginning of a big data analytics program can increase the chance of success. We found these qualities in the Chorus platform of Alpine Data Labs, which received the Ventana Research Technology Innovation Award for Predictive Analytics in September 2014.

Alpine Chorus 5.0, the company’s flagship product, addresses the big data analytics communication challenge by providing a user-friendly platform for multiple roles in an organization to build and collaborate on analytic projects. Chorus helps organizations manage the analytic life cycle from discovery and data preparation through model development and model deployment. It brings together analytics professionals via activity streams for rapid collaboration and workspaces that encourage projects to be managed in a uniform manner. While activity streams enable group communication via short messages and file sharing, workspaces allow each analytic project to be managed separately with capabilities for project summary, tracking and data source mapping. These functions are particularly valuable as organizations embark on multiple analytic initiatives and need to track and share information about models as well as the multitude of data sources feeding the models.

The Alpine platform addresses the challenge of processing big data by parallelizing algorithms to run across big data platforms such as Hadoop, making big data accessible to a wide audience of users. The platform supports most analytic databases and all major Hadoop distributions. Alpine was an early adopter of Apache Spark, an open source in-memory data processing framework that one day may replace the original map-reduce processing paradigm of Hadoop. Alpine Data Labs has been certified by Databricks, the primary contributor to the Spark project, which is responsible for 75 percent of the code added in the past year. With Spark, Alpine’s analytic models such as logistic regression run in a fraction of the time previously possible, and new approaches become practical, such as one the company calls Sequoia Forest, a machine learning method that is a more robust version of random forest analysis. Our big data analytics research shows that predictive analytics is a top priority for about two-thirds (64%) of organizations, but they often lack the skills to deploy a fully customized approach. This is likely a reason that companies now are looking for more packaged approaches to implementing big data analytics (44%) than custom approaches (36%), according to our research. Alpine taps into this trend by delivering advanced analytics directly in Hadoop and the HDFS file system with its in-cluster analytic capabilities that address the complex parallel processing tasks needed to run in distributed environments such as Hadoop.
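To make the parallelization point concrete, here is a minimal sketch, not Alpine’s implementation, of how a logistic regression can be expressed against Spark’s MLlib so that training is distributed across a cluster; the file path and column names are hypothetical.

```python
# Minimal sketch: logistic regression parallelized by Spark MLlib.
# Not Alpine's code; the file path and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("churn-model").getOrCreate()

# Load a (hypothetical) customer table stored on HDFS.
df = spark.read.parquet("hdfs:///data/customers.parquet")

# Combine predictor columns into a single feature vector.
assembler = VectorAssembler(
    inputCols=["tenure_months", "monthly_spend", "support_calls"],
    outputCol="features")
train = assembler.transform(df)

# Fit the model; Spark distributes the optimization across the cluster.
model = LogisticRegression(featuresCol="features", labelCol="churned").fit(train)
print(model.coefficients, model.intercept)

spark.stop()
```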

A key differentiator for Alpine is usability. Its graphical user interface provides a visual analytic workflow experience built on popular algorithms to deliver transformation capabilities and predictive analytics on big data. The platform supports scripts in the R language, which can be cut and pasted into the workflow development studio; custom operators for more advanced users; and Predictive Model Markup Language (PMML), which enables extensible model sharing and scoring across different systems. The complexities of the underlying data stores and databases, as well as the orchestration of the analytic workflow, are abstracted from the user. Using it, an analyst or statistician does not need to know programming languages or the intricacies of the database technology to build analytic models and workflows.
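For readers unfamiliar with PMML, the hedged sketch below shows the kind of model interchange the standard enables, using the open source sklearn2pmml package rather than Alpine’s own tooling (an assumption for illustration only): a model trained in one environment is serialized to PMML so any compliant engine can score it.

```python
# Sketch of PMML model interchange, assuming the open source sklearn2pmml
# package (not part of Alpine's product); it relies on a Java runtime for
# the underlying JPMML converter.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn2pmml import sklearn2pmml
from sklearn2pmml.pipeline import PMMLPipeline

X, y = load_iris(return_X_y=True)

# Wrap the estimator in a PMML-aware pipeline and train it.
pipeline = PMMLPipeline([("classifier", LogisticRegression(max_iter=200))])
pipeline.fit(X, y)

# Serialize to PMML; any PMML-compliant scoring engine can now run the model.
sklearn2pmml(pipeline, "iris_logistic.pmml")
```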

It will be interesting to see what direction Alpine takes as the big data industry continues to evolve; currently there are many point tools, each strong in a specific area of the analytic process. For many of the analytic tools currently available in the market, co-opetition among vendors prevails, in which partner ecosystems compete with stack-oriented approaches. The decisions vendors make in terms of partnering as well as research and development are often a function of these market dynamics, and buyers should be keenly aware of who aligns with whom. For example, Alpine currently partners with Qlik and Tableau for data visualization but also offers its own data visualization tool. Similarly, it offers data transformation capabilities, but its toolbox could be complemented by data preparation and master data solutions. This emerging area of self-service data preparation is important to line-of-business analysts, as my colleague Mark Smith recently discussed.

Alpine Labs is one of many companies that have been gaining traction in the booming analytics market. With a cadre of large clients and venture capital backing of US$23 million in series A and B, Alpine competes in an increasingly crowded and diverse big data analytics market. The management team includes industry veterans Joe Otto and Steve Hillion. Alpine seems to be particularly well suited for customers that have a clear understanding of the challenges of advanced analytics and are committed to using it with big data to gain a competitive advantage. Competitive advantage is the benefit cited most often, by more than two-thirds (68%) of organizations, according to our predictive analytics benchmark research. A key differentiator for Alpine Labs is the collaboration platform, which helps companies clear the communication hurdle discussed above and address the advanced analytics skills gap at the same time. The collaboration assets embedded into the application and the usability of the visual workflow process enable the product to meet a host of needs in predictive analytics. This platform approach to analytics is often missing in organizations grounded in individual processes and spreadsheet approaches. Companies seeking to use big data with advanced analytics tools should include Alpine Labs in their consideration.

Regards,

Ventana Research

The idea of not focusing on innovation is heretical in today’s business culture and media. Yet a recent article in The New Yorker suggests that today’s society focuses too much on innovation and technology. The same may be true of technology in business organizations. Our research provides evidence for this claim.

My analysis of our benchmark research into information optimization shows that organizations perform better in the technology and information dimensions than in the people and process dimensions. They face a flood of information that continues to increase in volume and frequency and must use technology to manage and analyze it in the hope of improving their decision-making and competitiveness. It is understandable that many see this as foremost an IT issue. But proficiency in the use of technology and even statistical knowledge are not the only capabilities needed to optimize an organization’s use of information and analytics. Organizations also need a framework that complements the usual analytical modeling to ensure that analytics are used correctly and deliver the desired results. Without a process for getting to the right question, users can go off in the wrong direction, producing results that cannot solve the problem.

In terms of business analytics strategy, getting to the right question is a matter of defining goals and terms; when this is done properly, the “noise” of differing meanings is reduced and people can work together efficiently. As we all know, many terms, especially new ones, mean different things to different people, and this can be an impediment to teamwork and to achieving business goals. Our research into big data analytics shows a significant gap in understanding here: Fewer than half of organizations have internal agreement on what big data analytics is. This lack of agreement is a barrier to building a strong analytic process. The best practice is to take time to discover what people really want to know; describing something in detail ensures that everyone is on the same page. Strategic listening is a critical skill, and done right it enables analysts to identify, craft and focus the questions that the organization needs answered through the analytic process.

To develop an effective process and create an adaptive mindset, organizations should instill a Bayesian sensibility. Bayesian analysis, also called posterior probability analysis, works backward from observed outcomes to infer the probability of the conditions that produced them. In a practical sense, it is about updating a hypothesis when given new information; it is about taking all available information and finding where it converges. This is a flexible approach in which beliefs are updated as new information is presented; it values both data and intuition. This mindset also instills strategic listening into the team and into the organization.
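A small numerical example makes the updating mechanic concrete. Suppose an analyst believes there is a 30 percent chance a campaign lifts sales and then sees a test result that is twice as likely under “lift” as under “no lift”; the numbers below are invented purely for illustration.

```python
# Toy Bayesian update: a prior belief revised by new evidence.
# All numbers are invented for illustration.
prior_lift = 0.30                 # P(campaign lifts sales) before seeing the test
p_evidence_given_lift = 0.60      # P(observed test result | lift)
p_evidence_given_no_lift = 0.30   # P(observed test result | no lift)

# Bayes' rule: posterior = likelihood * prior / total probability of the evidence
evidence = (p_evidence_given_lift * prior_lift
            + p_evidence_given_no_lift * (1 - prior_lift))
posterior_lift = p_evidence_given_lift * prior_lift / evidence

print(round(posterior_lift, 3))   # 0.462 -- belief rises, but far from certainty
```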

For business analytics, the more you know about the category you’re dealing with, the easier it is to separate valuable information and hypotheses from the rest. Category knowledge allows you to look at the data from a different perspective and bring existing knowledge to bear. This in and of itself is a Bayesian approach, and it allows the analyst to iteratively take the investigation in the right direction. This is not to say that intuition should be the analytic starting point. Data is the starting point, but a hypothesis is needed to make sense of the data. Physicist Enrico Fermi pointed out that measurement is the reduction of uncertainty. Analysts should start with a hypothesis and try to disprove it rather than prove it. From there, iteration is needed to come as close to the truth as possible. Starting with a gut feeling and trying to prove it is the wrong approach: the results are rarely surprising, and the analysis is likely to add nothing new. Let the data guide the analysis rather than allowing predetermined beliefs to steer it. Technological innovations in exploratory analytics and machine learning support this idea and encourage a data-driven approach.
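As a simple illustration of starting with a hypothesis and trying to disprove it, the sketch below frames the question as a null hypothesis and lets the data speak; the groups and figures are simulated, not drawn from our research.

```python
# Sketch: state a null hypothesis ("the two regions convert at the same rate")
# and try to reject it with data. The data here are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
region_a = rng.binomial(1, 0.11, size=2000)   # simulated conversions, region A
region_b = rng.binomial(1, 0.13, size=2000)   # simulated conversions, region B

# Two-sample t-test on the conversion indicators (a common quick check).
t_stat, p_value = stats.ttest_ind(region_a, region_b)

# A small p-value is evidence against the null, not proof of the alternative.
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```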

Bayesian analysis has had a great impact not only on statistics and market insights in recent years but also on how we view important historical events. It is consistent with modern thinking in the fields of technology and machine learning, as well as behavioral economics. For those interested in how the Bayesian philosophy is taking hold in many different disciplines, I recommend the book The Theory That Would Not Die by Sharon Bertsch McGrayne.

A good analytic process, however, needs more than a sensibility for how to derive and think about questions; it needs a tangible method to address the questions and derive business value from the answers. The method I propose can be framed in four steps: what, so what, now what and then what. Moving beyond the “what” (i.e., measurement and data) to the “so what” (i.e., insights) should be a goal of any analysis, yet many organizations are still turning out analysis that does nothing more than state the facts. Maybe 54 percent of people in a study prefer white houses, but why does anyone care? Analysis must move beyond mere findings to answer critical business questions and provide informed insights, implications and ideally full recommendations. That said, if organizations cannot get the instrumentation and the data right, findings and recommendations are subject to scrutiny.

The analytics professional should make sure that the findings, implications and recommendations of the analysis are heard by strategic and operational decision-makers. This is the “now what” step and includes business planning and implementation decisions that are driven by the analytic insights. If those insights do not lead to decision-making or action, the analytic effort has no value. There are a number of things that the analyst can do to make the information heard. A compelling story line that incorporates storytelling techniques, animation and dynamic presentation is a good start. Depending on the size of the initiative, professional videography, implementation of learning systems and change management tools also may be used.

The “then what” represents a closed-loop process in which insights and new data are fed back into the organization’s operational systems. This can take the form of institutional knowledge and learning in the usual human sense, which is imperative in organizations. Our benchmark research into big data and business analytics shows a need for this: Skills and training are substantial obstacles to using big data (for 79%) and analytics (77%) in organizations. The process is also similar to machine learning: as new information is brought into the organization, the organization as a whole learns and adapts to current business conditions. This is the goal of the closed-loop analytic process.

Our business technology innovation research finds analytics among the top three priorities in three out of four (74%) organizations; collaboration is a top-three priority in 59 percent. Both analytics and collaboration have a process orientation that uses technology as an enabler of the process. The sooner organizations implement a process framework, the sooner they can achieve success in their analytic efforts. To implement a successful framework such as the one described above, organizations must realize that innovation is not the top priority; rather, they need the ability to use innovation to support an adaptable analytic process. The benefits will be wide-ranging, including better understanding of objectives, more targeted analysis, greater analytical depth and analytical initiatives that have a real impact on decision-making.

Regards,

Ventana Research

Oracle is one of the world’s largest business intelligence and analytics software companies. Its products range from middleware, back-end databases and ETL tools to business intelligence applications and cloud platforms, and it is well established in many corporate and government accounts. A key to Oracle’s ongoing success is in transitioning its business intelligence and analytics portfolio to self-service, big data and cloud deployments. To that end, three areas in which the company has innovated are fast, scalable access for transaction data; exploratory data access for less structured data; and cloud-based business intelligence.

 Providing users with access to structured data in an expedient and governed fashion continues to be a necessity for companies. Our benchmark research into information optimization finds drilling into information within applications (37%) and search (36%) to be the capabilities most needed for end users in business.

To provide them, Oracle enhanced its database in Oracle 12c, which was released in 2013. The key innovation is to enable both transaction processing and analytic processing workloads on the same system. Using in-memory instruction sets on the processor, the system can run calculations quickly without changing the application data. The result is that end users can explore large amounts of information in the context of all data and applications running on the 12c platform. These applications include Oracle’s growing cadre of cloud-based applications. The value of this is evident in our big data analytics benchmark research, which finds that the number-one source of big data is transactional data from applications, mentioned by 60 percent of participants.

Search and interactive analysis of structured data are addressed by Oracle Business Intelligence Enterprise Edition (OBIEE) through a new visualization interface that applies assets Oracle acquired from Endeca in 2011. (Currently, this approach is available in Business Intelligence Cloud Service, which I discuss below.) To run fast queries of large data sets, columnar compression can be implemented by small code changes in the Oracle SQL Developer interface. These changes use the innovation in 12c discussed above and would be implemented by users familiar with SQL. Previously, IT professionals had to spend significant time constructing aggregate data and tuning the database so users could quickly access data. Otherwise, transactional databases take a long time to query because they are row-oriented and a query literally must go through every row of data to return analytic results. With columnar compression, end users can explore and interact with data in a much faster, less limited fashion. With the new approach, users no longer need to walk down each hierarchy but can drag and drop or right-click to see the hierarchy definition. Drag-and-drop and brushing features enable exploration and uniform updates across all visualizations on the screen. Under the covers, the database is doing some heavy lifting, often joining five to 10 tables to compute the query in near real time. The ability to do correlations on large data sets in near real time is a critical enabler of data exploration since it allows questions to be asked and answered one after another rather than asking users to predefine what those questions might be. This type of analytic discovery enables much faster time to value, especially when providing root-cause analysis for decision-making.
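As an illustration of how small those code changes can be, the hedged sketch below uses the cx_Oracle Python driver to mark a hypothetical table for the in-memory columnar store and then run an analytic query against it; the table, columns and connection details are invented, and the INMEMORY clause assumes the Database In-Memory option introduced with 12c.

```python
# Hedged sketch: enabling columnar, in-memory access for a table and querying
# it, using the cx_Oracle driver. Table, columns and connection details are
# hypothetical; INMEMORY assumes the 12c Database In-Memory option.
import cx_Oracle

conn = cx_Oracle.connect("analyst", "password", "dbhost/orclpdb")
cur = conn.cursor()

# Populate the in-memory column store for this table (a one-line change).
cur.execute("ALTER TABLE sales INMEMORY")

# Analytic query that benefits from columnar, in-memory scans.
cur.execute("""
    SELECT region, SUM(amount)
    FROM sales
    GROUP BY region
""")
for region, total in cur:
    print(region, total)

cur.close()
conn.close()
```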

Oracle also provides Big Data SQL, a query approach that enables analysis of unstructured data on systems such as Hadoop. The model uses what Oracle calls query franchising rather than query federation, in which processing is done in each system’s native SQL dialect and the various dialects must be translated and combined into one. With franchising, Oracle SQL runs natively inside each of the systems. This approach applies Oracle SQL to big data systems and offloads queries to the compute nodes or storage servers of the big data system. It also maintains the security and speed needed to do exploration on less structured data sources such as JSON, which the 12c database supports natively. In this way Oracle provides security and manageability within the big data environment. Looking beyond structured data is key for organizations today. Our research shows that analyzing data from all sources is how three-fourths (76%) of organizations define big data analytics.

To visualize and explore big data, Oracle offers Big Data Discovery, which browses Hadoop and NoSQL stores and samples and profiles data automatically to create catalogs. Users can explore important attributes through visualization as well as common search techniques. The system currently supports capabilities such as string transformations, variable grouping, geotagging and text enrichment that assist in data preparation. This is a good start at addressing exploration of big data sources, but to better compete in this space, Oracle should offer more usable interfaces and more capabilities for both data preparation and visualization. For example, visualizations such as decision trees and correlation matrices are important to help end users make sense of big data and do not appear to be included in the tool.

The third analytic focus, and the catalyst of the innovations discussed above, is Oracle’s move to the cloud. In September 2014, Oracle released BI Cloud Service (BICS), which helps business users access Oracle BI systems in a self-service manner with limited help from IT. Cloud computing has been a major priority for Oracle in the past few years, not just for its applications but for its entire technology stack. With BICS, Oracle offers a stand-alone product with which a departmental workgroup can insert analytics directly into its cloud applications. When BICS is coupled with the Data-as-a-Service (DaaS) offering, which accesses internal data as well as third-party data sources in the cloud, Oracle is able to deliver cross-channel analysis and identity-as-data. Cross-channel analysis and identity management are important in cloud analytics from both business and privacy and security perspectives.

In particular, such tools can help tie together and thus simplify the complex task of managing multichannel marketing. Availability and simplicity in analytics tools are priorities for marketing organizations. Our research into next-generation customer analytics shows that for most organizations data not being readily available (63%) and difficulty in maintaining customer analytics systems (56%) are top challenges.

 Oracle is not the first vendor to offer self-service discovery and flexible data preparation, but BICS begins its movement from the previous generation of BI technology to the next. BICS puts Oracle Transactional Business Intelligence (OTBI) in the cloud as a first step toward integration with vertical applications in the lines of business. It lays the groundwork for cross-functional analysis in the cloud.

We don’t expect BICS to compete immediately with more user-friendly analytic tools designed for business analysts or with well-established cloud BI players. Designers still must be trained in Oracle tools, and for this reason it appears that the tool, at least in its first iteration, is targeted only at Oracle’s OBIEE customers seeking a departmental solution that limits IT involvement. Oracle should continue to address usability for both end users and designers. BICS also should connect to more data sources, including Oracle Essbase. It currently comes bundled with Oracle Database Schema Service, which acts as the sole data source but does not directly connect with any other database. Furthermore, data movement is not streamlined in the first iteration, and replication of data is often necessary.

Overall, Oracle’s moves in business intelligence and analytics make sense because they use the same semantic models in the cloud as the analytic applications that many very large companies use today and won’t abandon soon. Furthermore, given Oracle’s growing portfolio of cloud applications and the integration of analytics into these transactional applications through OTBI, Oracle can leverage cloud application differentiation even for companies not yet using Oracle. If Oracle can align its self-service discovery and big data tools with its current portfolio in a reasonably timely fashion, current customers will not turn away from their Oracle investments. In particular, those with an Oracle-centric cloud roadmap will have no reason to switch. We note that the market for cloud-based business intelligence and analytics applications is still developing. Our previous research showed that business intelligence had been a laggard in the cloud in comparison to genres such as human capital management, marketing, sales and customer service. We are examining trends in our forthcoming data and analytics in the cloud benchmark research, which will evaluate both the current state of such software and where the industry likely is heading in 2015 and beyond. For organizations shifting to cloud platforms, Oracle has a very progressive cloud computing portfolio that my colleague has assessed, and it has created a path by investing in its Platform-as-a-Service (PaaS) and DaaS offerings. Its goal is to provide uniform capabilities across mobility, collaboration, big data and analytics so that all Oracle applications are consistent for users and can be extended easily by developers. However, Oracle competes against many cloud computing heavyweights such as Amazon Web Services, IBM and Microsoft, so achieving success through significant growth has its challenges. Oracle customers generally and OBIEE customers especially should investigate these innovations in the context of their own roadmaps for big data analytics, cloud computing and self-service access to analytics.

 Regards,

Ventana Research
