Mistake #1: Hoarding data
You've heard of report monkeys—I'm going to tell you about data squirrels. Data squirrels spend lots of energy collecting data; then they hoard it. They carry it around in their cheeks, feeling well-prepared for the winter, but their mouths are so full they can't process and swallow their hoarded goods.
I'm thinking of one former client who was using nearly every single variable available at the time: virtually every available report in SiteCatalyst was filled with data. In keeping with "Best Practices", they held a weekly meeting to discuss their implementation and keep it up-to-date; the data was tested, current, and abundant. Their dashboards were vibrant and plentiful. But they were so busy collecting and hoarding data that they had no time to process and analyze it. The idea of taking action on the data was never part of the process.
For all of their effort, how much value do you think they were getting out of their analytics tool?
Sidenote: Data Squirrels and Report Monkeys make great friends. The Squirrel hands data to the Monkey and the Monkey hands "reporting requirements" back to the Squirrel. This cycle can go on indefinitely without any real value.
Mistake #2: Believing the work is done when the implementation is complete
You know what makes the web analyst in me want to go cry in a corner? This all-too-common situation:
A major company spends thousands of dollars and dozens of man-hours on an implementation project to grab and report on some new data. The implementation consultant gets the data, shows them the working reports, signs off on the project, then walks away. Six months later, the company calls the consultant and says "this report doesn't work anymore, someone disabled the variable five months ago and it took us until now to notice. Can we bring you back on to fix the problem?" (Presumably, so the data can go another 5 months without being used).
If you can go 5 months without looking at a report, why-oh-why did you throw money at someone to make the report available? Sadly, I know the answer: it's easier to throw money at something and feel that sense of accomplishment than it is to spend time analyzing and making potentially risky decisions based on the data.
It's a lovely feeling to wrap up an implementation and feel like the hard part is done. But if you really want value out of your tool, the work is far from done. Fortunately, the post-implementation work is the fun stuff: the stuff that doesn't depend on your IT department to roll out new code; the stuff that has the potential to realize actual, provable value for your company.
If you must, on the day your implementation project is completed, block out a few hours two weeks or a month out to play with the data until you get an action out of it. Schedule a meeting with the analyst-minded folks of your organization and go over the data: NOT to discuss how complete your implementation is, but to discuss what insight the data has given you and what the next steps are. Don't let USING your data fall down the priority list just because it doesn't have a project and a budget attached to it.
Mistake #3: Repeating this phrase frequently during the implementation process: "I want to track..."
•I want to HYPOTHESIZE...
•I want to TEST...
•I want to COMPARE...
•I want to BRAINSTORM...
•I want to OPTIMIZE...
•I want to DECIDE...
•I want to ACT UPON...
"Tracking" or merely collecting data should never be the goal or even a main focus. ALWAYS implement with this question in mind: "How am I going to USE this data to optimize my site?"
Mistake #4: Failing to evolve
A major healthcare website recently hired an agency with which I am familiar. The client has used Omniture for years and noticed a few things that were a little off in their reports (have I mentioned how much I hate the Form Analysis plugin?), so they hired this agency to do an implementation audit. As they signed the audit contract, they said, "You'll notice we don't use conversion variables or events, and that's fine. We're really content with just the traffic data; just help us make sure the traffic data is coming through properly." In other words, "This solution worked for us four years ago, please just keep it working as is."
Oh, how this breaks my heart! Because I know a year from now they might say, "Gee, we're spending a chunk of money on our web analytics, but we've never done a thing with the data. Let's bring on an expert to help us analyze". And that expert will say "analyze? analyze what?" Then they'll need to re-implement, then wait even more time before they have real, actionable data.
If you're using the same set of reports that you were 18 months ago, you are very likely losing value on your implementation. That is, unless your website hasn't changed, in which case there may be bigger problems. The web is constantly evolving: so should your site, and so should your implementation.
The problem, of course, is finding a balance between "We don't need any newfangled reports" and "Let's track everything in case someday we might need it". The best way around this? Any time you are changing your implementation, don't think of what data you want to track now (again: never think "I want to track"). Think of which reports you've actively used in the last 3 months. Think of what reports you will use in the next 3 months. If current reports don't fall in that list, scrap them or at least hide them from your menu so they don't distract you.
See what analytics trends are on the rise. Check out some of the top blogs in the industry, particularly your vendor's blog, like Omniture's Industry Insights, to see what's up-and-coming in analytics. If you're hiring an implementation consultant, whether from a vendor or an agency, don't just tell them what reports you'd like. Ask them which cool reports others have been using. Use their experience. They may be content to take orders or fulfill the requirements of the contract (which are usually written by salespeople, not analysts), but it's very likely they have great ideas if you'll give them a couple of hours to incorporate them.
I write this post as a step in the direction of repentance. I have been a Data Squirrel enabler, and I know it. Learn from my past and don't allow these mistakes to diminish the value of your implementation.
Because right now my role is mostly in implementation, it sometimes feels like the world of analytics is nothing but figuring out how to report- either how to get the data in there or how to present it. I want to see some action!
I challenge you all to spend a mere 30 minutes in your tool of choice to find one- just ONE- actionable piece of data, and *here's the kicker*: actually take steps towards that action (even if it's just making a plan). I don't expect the action to only take 30 minutes, but you should definitely be able to find your piece of data and start a plan in that time.
Here are a few ideas that range from simple to ambitious to help you get started:
- Look at your Error Pages report and fix the most-clicked broken link. If needed, use a pathing report to find the page sending people to your error page. This is practically a freebie.
- Look at your top internal search keyword. Figure out a way to make content on that topic more easily findable from your homepage. Ask yourself: would this make a good internal promotion?
- Look at your top 5 highest-traffic landing pages, then see which is converting the least. Make a hypothesis about what could improve (compare to highest-converting page if you need ideas), then make a game plan to A/B test it.
- See which paid search keyword has the highest bounce rate. Hypothesize about how to make your landing page appeal more to users clicking on that keyword, or reword your keyword so it brings in more qualified traffic. Make an A/B test out of it.
- Think of the one thing on your site you wish users were doing more of. Now put it in a fall-out report. Find the point of highest abandonment. Hypothesize about why users are falling out. Test it.
- Find a low-performing call-to-action. Figure out a different way to present it: perhaps a different graphic or reworded text. Test it. (Are you noticing a "test it" theme, here?)
- Take your highest-performing campaign. Play with segments until you see who the campaign appeals to the most. Earmark that segment for more marketing efforts.
- Find a video with a high conversion rate. Feature it in an area with higher visibility.
- Look at your top Organic Search Terms. Do you see a lot of your brand name in there? Find a high-converting product page and focus some SEO efforts there so you can reach users looking for your products, not your brand.
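If you like poking at raw exports as much as clicking around reports, the fall-out idea above boils down to very simple arithmetic: compare each funnel step to the one before it and find the biggest drop. Here's a minimal sketch; the step names and visitor counts are made-up sample data, not any vendor's export format.

```python
# Sketch: locate the biggest drop-off point in a conversion funnel.
# The steps and counts below are hypothetical sample data; in practice
# you'd pull them from your analytics tool's fall-out report or export.

funnel = [
    ("Landing page", 10000),
    ("Product page", 6200),
    ("Cart", 1800),
    ("Checkout", 1500),
    ("Confirmation", 1200),
]

worst_step, worst_rate = None, 0.0
for (prev_name, prev_count), (name, count) in zip(funnel, funnel[1:]):
    drop = 1 - count / prev_count  # share of visitors lost at this step
    print(f"{prev_name} -> {name}: {drop:.0%} abandon")
    if drop > worst_rate:
        worst_step, worst_rate = f"{prev_name} -> {name}", drop

print(f"Biggest fall-out: {worst_step} ({worst_rate:.0%})")
```

With this sample data, the Product page to Cart step loses about 71% of visitors, so that's where you'd start hypothesizing and testing. The same loop works for any ordered set of steps you can get counts for.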
If you reach the end of your 30 minutes with no action plan, don't give up. Spend some time finding a recent success or failure. What's trending up? What's trending down? Try segmenting the data different ways. Make some theories, then plan some tests. Not to sound like a broken record here, but you can't go awry with a well-executed test.
I'll happily take suggestions for more ideas, too. I'd love to make one huge long list of your ideas for actionable data.
Ready? Go team!
When you're done, please come back here and tell me how it went.
PS: If you can't find one piece of actionable data to act on, then either you need to revamp your implementation, or you should congratulate yourself on a perfect website and implementation. In which case, you have free time on your hands to volunteer at the Analysis Exchange!