This short-and-sweet post will answer the following questions, all of which have been asked of me in the last month:
- Why are my Product Views in SiteCatalyst inflated?
- How do I see Product Views in a conversion funnel?
- Why did our Solution Design Document have both the "prodView" event AND a custom "Product View" event?
There's one answer to all three questions: prodView does not behave like other events. prodView acts for the product variable almost like "instances" does for eVars. This can confuse people who don't know what they are looking at:
For one, it will not show up in non-product reports: if you want to see Product Views in a conversion funnel, perhaps broken down by campaign, you won't be able to.
Second, s.events="prodView" doesn't have to appear anywhere in your code for the prodView event to be set: any time s.products is set without a corresponding event, prodView is recorded automatically. This means that if you use the products variable anywhere on your site, your Product Views report may contain data (and inflated counts) even though you never explicitly set the event. It also makes debugging tricky.
My recommendation? Use a custom event for product views. If you still want to take advantage of the out-of-the-box SiteCatalyst reports that come from the prodView event, you can set it alongside your custom event (s.events="prodView,event5") and get the best of both worlds. Just be aware that you will then have two reports that may not match each other perfectly and may confuse some of your SiteCatalyst users (documentation is your friend; take advantage of the SiteCatalyst feature that lets you attach notes to reports that need clarification).
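To make this concrete, here is a minimal sketch of the two tagging patterns. The `s` object here is a bare stand-in for the real SiteCatalyst object created by your s_code.js file, and `event5` is a hypothetical custom event number; substitute whatever event your Solution Design Document assigns.

```javascript
// Stand-in for the SiteCatalyst "s" object (normally created by s_code.js).
var s = { events: "", products: "" };

// Pattern that inflates counts: setting s.products with NO corresponding
// event causes SiteCatalyst to record a prodView automatically.
s.products = ";SKU1234"; // syntax: category;product;quantity;price

// Recommended pattern: declare events explicitly, pairing prodView with a
// hypothetical custom event (event5) so product views also show up in
// funnels and non-product breakdowns.
s.events = "prodView,event5";

// On a real page, s.t() sends the image request; guard it in this sketch.
if (typeof s.t === "function") {
  s.t();
}
```

With the explicit pattern, the custom event5 report can be used in conversion funnels and campaign breakdowns, while prodView continues to feed the out-of-the-box product reports.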
Mistake #1: Hoarding data
You've heard of report monkeys—I'm going to tell you about data squirrels. Data squirrels spend lots of energy collecting data; then they hoard it. They carry it around in their cheeks, feeling well-prepared for the winter, but their mouths are so full they can't process and swallow their hoarded goods.
I'm thinking of one former client who was using nearly every variable available at the time: virtually every report in SiteCatalyst was filled with data. In line with "best practices", they had a weekly meeting to discuss their implementation and keep it up to date; the data was tested, current, and abundant. Their dashboards were vibrant and plentiful. But they were so busy collecting and hoarding data that they had no time to process and analyze it. Taking action on the data was never part of the process.
For all of their effort, how much value do you think they were getting out of their analytics tool?
Sidenote: Data Squirrels and Report Monkeys make great friends. The Squirrel hands data to the Monkey, and the Monkey hands "reporting requirements" back to the Squirrel. This cycle can go on indefinitely without any real value.
Mistake #2: Believing the work is done when the implementation is complete
You know what makes the web analyst in me want to go cry in a corner? This all-too-common situation:
A major company spends thousands of dollars and dozens of man-hours on an implementation project to grab and report on some new data. The implementation consultant gets the data, shows them the working reports, signs off on the project, then walks away. Six months later, the company calls the consultant and says, "This report doesn't work anymore; someone disabled the variable five months ago and it took us until now to notice. Can we bring you back on to fix the problem?" (Presumably so the data can go another five months without being used.)
If you can go five months without looking at a report, why-oh-why did you throw money at someone to make the report available? Sadly, I know the answer: it's easier to throw money at something and feel a sense of accomplishment than it is to spend time analyzing and making potentially risky decisions based on the data.
It's a lovely feeling to wrap up an implementation and feel like the hard part is done. But if you really want value out of your tool, the work is far from done. Fortunately, the post-implementation work is the fun stuff: the stuff that doesn't depend on your IT department to roll out new code; the stuff that has the potential to deliver actual, provable value for your company.
On the day your implementation project is completed, block out a few hours two weeks or a month out to play with the data until you get an action out of it. Schedule a meeting with the analyst-minded folks in your organization and go over the data, NOT to discuss how complete your implementation is, but to discuss what insight the data has given you and what the next steps are. Don't let USING your data fall down the priority list just because it doesn't have a project and a budget attached to it.
Mistake #3: Repeating this phrase frequently during the implementation process: "I want to track..."
Data collection is a means, not an end. Replace "I want to track..." with phrases that lead to action:
- I want to HYPOTHESIZE...
- I want to TEST...
- I want to COMPARE...
- I want to BRAINSTORM...
- I want to OPTIMIZE...
- I want to DECIDE...
- I want to ACT UPON...
"Tracking" or merely collecting data should never be the goal or even a main focus. ALWAYS implement with this question in mind: "How am I going to USE this data to optimize my site?"
Mistake #4: Failing to evolve
A major healthcare website recently hired an agency with which I am familiar. The client has used Omniture for years and noticed a few things that were a little off in their reports (have I mentioned how much I hate the Form Analysis plugin?), so they hired this agency to do an implementation audit. As they signed the implementation audit contract, they said, "You'll notice we don't use conversion variables or events, and that's fine. We're really content with just the traffic data; just help us make sure the traffic data is coming through properly." In other words, "This solution worked for us four years ago; please just keep it working as is."
Oh, how this breaks my heart! Because I know a year from now they might say, "Gee, we're spending a chunk of money on our web analytics, but we've never done a thing with the data. Let's bring on an expert to help us analyze." And that expert will say, "Analyze? Analyze what?" Then they'll need to re-implement, then wait even more time before they have real, actionable data.
If you're using the same set of reports that you were 18 months ago, you are very likely losing value on your implementation. That is, unless your website hasn't changed, in which case there may be bigger problems. The web is constantly evolving: so should your site, and so should your implementation.
The problem, of course, is finding a balance between "We don't need any newfangled reports" and "Let's track everything in case someday we might need it". The best way around this? Any time you are changing your implementation, don't think of what data you want to track now (again: never think "I want to track"). Think of which reports you've actively used in the last 3 months. Think of what reports you will use in the next 3 months. If current reports don't fall in that list, scrap them or at least hide them from your menu so they don't distract you.
See what analytics trends are on the rise. Check out some of the top blogs in the industry, particularly your vendor's blog, like Omniture's Industry Insights, to see what's up-and-coming in analytics. If you're hiring an implementation consultant, whether from a vendor or an agency, don't just tell them what reports you'd like. Ask them which cool reports others have been using. Use their experience. They may be content to take orders or fulfill the requirements of the contract (which is usually written by salespeople, not analysts), but it's very likely they have great ideas if you'll give them a couple of hours to incorporate them.
I write this post as a step in the direction of repentance. I have been a Data Squirrel enabler, and I know it. Learn from my past and don't allow these mistakes to diminish the value of your implementation.