Sunday 27 November 2011

Web analytics (Google Analytics vs Omniture)

  • Campaign reporting. GA supports multiple dimensions, in particular motion charts, advanced segments, and the various metric-specific multi-dimensional views. I believe Omniture is weaker in this regard because of GA’s ability to visualize data in cross-tab (pivot) and related view formats (a minimal cross-tab sketch follows this list). In other words, it’s easier to uncover trends in GA than by hunting through Omniture.
  • Integration. This is usually a red herring across most Internet marketing companies. The most important integration in analytics and PPC software is with CRM systems (salesforce.com, Eloqua, Microsoft Dynamics, SugarCRM, etc.) and offline conversions. This almost always requires some custom work, since every company has a different underlying data model (which it should), as well as a different sales funnel and attribution scheme. The collection, integration, and weighting of this data is not an out-of-the-box software module, but an exercise in sophisticated marketing analytics. Online conversion tracking is relatively simple for all enterprise-level analytics tools, whether using a method like Google URL Builder or cookie tracking (see the URL-tagging sketch after this list). Google has a significant advantage in tracking activity from and on Facebook, despite the marketing efforts put forth by Omniture.
  • Funnel tracking. Omniture does allow for multiple paths. Our view is that the more sophisticated method is to measure event-level attribution (page or click), rather than force the analyst (you) to define each particular path to analyze. The traditional slice-and-dice approach is a needle-in-a-haystack exercise; you should prefer your analytics tool to do the legwork and tell you which combinations of pages lead to a conversion or a poor user experience. We are not aware of any clickstream analytics tool that does this out of the box. With the number of possible combinations of attributes, events, and pages, you need click-level data and a correlation algorithm to pull out the right combination of trends to view. You cannot do this from within GA yourself, because you need the raw data to do the calculation (see the log-correlation sketch after this list). That said, log file parsing is probably the most practical solution here if you want to go that far in analytics, given that Omniture doesn’t know how to do it (we’ve had multiple calls with their top people and they are stumped).
  • User tagging. Omniture does allow more variables to be stored. You’ll want to consider which use cases you have that cannot be solved via an advanced segment and parsing URLs. If you’re interested in Omniture’s solution, please read the chapter in their implementation guide; it’s a confusing read, but they do allow collection of personal data. Google doesn’t allow collection of such data for privacy reasons. We’re not sure about auditing requirements: any certification of data accuracy would have to rely on click-level data out of your logs, which Google can’t provide (unless you have the old Urchin, which is not recommended).
  • Goal tracking. Google has recently expanded from 4 goals to 20 goals. Most companies misuse goal-setting, confusing segments and points within the funnel with goals. The more goals you have, the more complex the attribution. It’s hard enough to do attribution when you have only one goal and many events across which to allocate credit (see the attribution sketch after this list); now try matrix attribution with many goals and many events. To the best of our knowledge, almost nobody has single-goal attribution down, so matrix attribution is not even in the vernacular of analytics yet.
  • Page overlays. A cool tool with wow factor for both GA and Omniture, but usually not usable because of tracking problems and multiple links on a page that share the same URL. On the latter, say a particular page has two links to another page (a top-nav and a footer-nav link); if they have the same destination URL, you won’t be able to tell which one drove the click. We have rarely found the visual overlays to offer accurate data.
  • Data freshness. Generally a 2-10 hour delay on Google. Data freshness matters most when you have events that require real-time optimization. Keep in mind that PPC data may be on a full-day lag, and you’re limited by your weakest link: if your web analytics is only 30 minutes behind, but your PPC and CRM are 4 hours behind, you’re really 4 hours behind (or you’re making inaccurate decisions). Further, statistical significance requires that you gather enough data to determine what’s going on (see the lag sketch after this list). At Yahoo!, we decided that a 3-day reporting delay (because we needed 2.5 days to crunch attribution) was worth the trade-off of speed versus effective optimization. You’ll have to decide what data you really need at what frequency.
  • Independence. Several of the government agencies we have talked to don’t use Google Analytics because open source is considered off-limits. Some major advertisers don’t use GA because of the potential conflict of interest in having your analytics tracked by the same place you spend your money. And there are the “tin foil hat” and anti-monopoly people who in general don’t believe you should have your analytics, PPC, landing page testing, mail, and so forth with the same company. Given practical realities, we don’t think this is an issue right now.
  • Effectiveness. Google AdWords is going to have more effective tools than 3rd-party tools (effective meaning increasing profits, as opposed to letting you create more reports); they have to, because they have the advantage of more data. Case in point: Google’s Conversion Optimizer versus any bid management tool. With the exception of folks like ClickEquations (a market leader who is good, but not great), in our opinion nobody yet has a sophisticated method of bid management.
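
The cross-tab view mentioned under campaign reporting can be approximated outside any tool. Here is a minimal sketch in Python/pandas, assuming a hypothetical export with campaign, medium, visits, and conversions columns (these column names are assumptions, not an actual GA export format):

    import pandas as pd

    # Hypothetical campaign export; the schema is made up for illustration.
    data = pd.DataFrame([
        {"campaign": "spring_sale", "medium": "cpc",   "visits": 1200, "conversions": 48},
        {"campaign": "spring_sale", "medium": "email", "visits": 800,  "conversions": 40},
        {"campaign": "webinar",     "medium": "cpc",   "visits": 300,  "conversions": 6},
        {"campaign": "webinar",     "medium": "email", "visits": 500,  "conversions": 25},
    ])

    # Pivot campaigns against mediums with conversion rate as the metric --
    # the kind of view that surfaces trends a flat report hides.
    data["conv_rate"] = data["conversions"] / data["visits"]
    pivot = data.pivot_table(index="campaign", columns="medium", values="conv_rate")
    print(pivot.round(3))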
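
For the URL-tagging side of integration, here is a minimal sketch of Google URL Builder-style campaign tagging: appending the standard utm_* parameters to a landing-page URL so the visit can later be joined to CRM records. The example URL and campaign names are made up; the CRM join itself is the custom work described above.

    from urllib.parse import urlencode, urlparse, urlunparse

    def tag_url(url: str, source: str, medium: str, campaign: str) -> str:
        # Append the standard GA campaign parameters to a landing-page URL.
        parts = urlparse(url)
        params = urlencode({
            "utm_source": source,
            "utm_medium": medium,
            "utm_campaign": campaign,
        })
        query = f"{parts.query}&{params}" if parts.query else params
        return urlunparse(parts._replace(query=query))

    print(tag_url("https://example.com/landing", "google", "cpc", "spring_sale"))
    # https://example.com/landing?utm_source=google&utm_medium=cpc&utm_campaign=spring_sale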
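
The log-correlation idea from the funnel-tracking point, sketched in Python: given raw, click-level sessions (the data GA’s canned reports do not expose), measure how much each page lifts conversion probability relative to the baseline. The session data is hypothetical and the statistic is deliberately crude; a real implementation would parse log files and apply a proper correlation or significance test.

    from collections import defaultdict

    # Each session: (pages visited, whether it converted). Made-up data.
    sessions = [
        (["/home", "/pricing", "/signup"], True),
        (["/home", "/blog"],               False),
        (["/home", "/pricing"],            False),
        (["/pricing", "/signup"],          True),
        (["/home", "/blog", "/pricing"],   True),
    ]

    baseline = sum(converted for _, converted in sessions) / len(sessions)
    seen = defaultdict(lambda: [0, 0])  # page -> [sessions containing it, conversions]

    for pages, converted in sessions:
        for page in set(pages):
            seen[page][0] += 1
            seen[page][1] += converted

    # Pages with a large positive lift are candidates for driving conversion.
    for page, (n, conv) in sorted(seen.items()):
        print(f"{page:10s} rate={conv / n:.2f} lift vs baseline={conv / n - baseline:+.2f}")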
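
The attribution sketch referenced under goal tracking: with a single goal, the problem is already how to split credit for one conversion across the events that preceded it. Two common rules are shown below; the touchpoints and conversion value are invented for illustration.

    # Hypothetical touchpoints preceding one conversion, in order.
    touches = ["cpc:brand", "email:newsletter", "cpc:generic"]
    conversion_value = 90.0

    # Last-touch: all credit to the final event before the goal.
    last_touch = {t: 0.0 for t in touches}
    last_touch[touches[-1]] = conversion_value

    # Linear: equal credit to every event.
    linear = {t: conversion_value / len(touches) for t in touches}

    print("last-touch:", last_touch)
    print("linear:    ", linear)

Matrix attribution would repeat this allocation across many goals at once, with each event earning differently weighted credit toward each goal, which is why the complexity grows so quickly.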
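
The lag sketch from the data-freshness point: your effective decision lag is the slowest feed in the pipeline, plus however long it takes to collect enough data. All numbers here are hypothetical, and the 100-conversion threshold is a crude stand-in for a real significance calculation.

    # Hours of delay for each data feed; made-up values.
    feed_lag_hours = {"web_analytics": 0.5, "ppc": 24.0, "crm": 4.0}
    effective_lag = max(feed_lag_hours.values())
    weakest = max(feed_lag_hours, key=feed_lag_hours.get)

    # Rough time needed to observe ~100 conversions at a given traffic rate.
    visits_per_hour = 400
    conversion_rate = 0.02
    window_hours = 100 / (visits_per_hour * conversion_rate)

    print(f"Pipeline lag: {effective_lag:.1f} h (weakest link: {weakest})")
    print(f"Data window needed: {window_hours:.1f} h")
    print(f"Earliest sound decision: {effective_lag + window_hours:.1f} h after the fact")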
