Last year, Todd Belcher and I started the Boston Digital Analytics meetup in order to bring together our peers in the marketing analytics industry for networking and knowledge sharing. This month, we’re hosting our 5th Web Analytics Wednesday on 8/24 at Northeastern where Sharon Bernstein will be presenting on the topic of Data Storytelling. If you’re in the Boston area, come out and meet some local analytics enthusiasts!
When I send out weekly performance summaries to my clients, I often focus on just a few key takeaways and insights. For instance:
Campaign A is generating leads at $5/lead while Campaign B is converting at $15/lead. I’ve shifted most of the budget from Campaign B to Campaign A, but started an A/B test on Campaign B’s landing page to see if its performance can be improved.
These reports focus on what happened and what is about to happen. What’s missing in these emails, and discussions around measurement in general, is what didn’t happen. In other words, what mistakes did we avoid because we had data pointing us in another direction?
While listening to an interview with an analytics vendor the other day, I heard a phrase that is repeated far too often and can be summarized as “… we flag users as highly engaged when they type your website address directly into their browser”. The problem with this statement is nuanced, but it will become clear in a second.
*autotrack.js has been updated! See this post for more information*
Just 2 days ago, the Google Analytics team released a plug-in called autotrack that packs a lot of new functionality into Google Analytics. The thinking behind autotrack is that far too many clients deploy the default GA snippet and stop there. By packaging advanced GA tracking capabilities into one easily deployed plug-in, autotrack helps clients get more value out of their GA accounts. These tracking capabilities include:
- Event Tracking – Track when users click on any HTML element with a certain data attribute
- Media Query Tracking – Track breakpoints, orientation, and resolution
- Outbound Form Tracking – Track when users submit forms that land them off-site
- Outbound Link Tracking – Track when users click on an outbound link
- Session Duration Tracking – More accurately track the duration of sessions by firing an event when the user closes their browser window
- Social Button Tracking – Track when users click on social sharing buttons
- URL Change Tracking – Track when the URL changes but the page does not refresh (important for single page applications)
While utilizing these enhancements goes beyond a “beginner” understanding of GA, packaging them up in one easy-to-use plug-in brings them down from the clouds and into the hands of anyone with a basic understanding of GA and HTML.
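For reference, here is roughly what deployment looked like at the original release, when a single require enabled every plugin (the updated version noted above moved to requiring plugins individually). The property ID is a placeholder, and the `ga` command queue is stubbed the same way the standard snippet defines it:

```javascript
// The standard analytics.js snippet defines `ga` as a command queue
// that buffers calls until the library loads; the same stub lets this
// sketch run stand-alone. 'UA-XXXXX-Y' is a placeholder property ID.
var ga = function () { (ga.q = ga.q || []).push(arguments); };

ga('create', 'UA-XXXXX-Y', 'auto');
ga('require', 'autotrack'); // original release: one require enabled all plugins
ga('send', 'pageview');

// The page must also load both libraries, e.g.:
// <script async src="https://www.google-analytics.com/analytics.js"></script>
// <script async src="path/to/autotrack.js"></script>
```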
For the many customers of Brightcove’s video platform, understanding user engagement (e.g., play, pause, percentage watched) with their videos is key. And while Brightcove offers reports showing user engagement, there are many advantages to tying that data into a broader analytics platform such as Google Analytics. One particular advantage of this approach is that AdWords remarketing lists can be generated based on video engagement. Did a user watch your video but not convert? Remarket to them!
I recently completed a Brightcove/GA integration and learned some things along the way worth sharing.
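As a rough sketch of the idea, here is one way to forward player events into GA as events. This assumes the video.js-based Brightcove player and the analytics.js `ga()` queue are both loaded on the page; `milestoneCrossed` and the milestone thresholds are my own helpers, not part of either API:

```javascript
var MILESTONES = [25, 50, 75, 100];

// Return the highest percent-watched milestone just crossed that has
// not already fired, or null.
function milestoneCrossed(currentSeconds, durationSeconds, alreadyFired) {
  if (!durationSeconds) return null;
  var pct = (currentSeconds / durationSeconds) * 100;
  for (var i = MILESTONES.length - 1; i >= 0; i--) {
    if (pct >= MILESTONES[i] && alreadyFired.indexOf(MILESTONES[i]) === -1) {
      return MILESTONES[i];
    }
  }
  return null;
}

// In the page, the wiring might look like this ('myPlayerID' is a placeholder):
// videojs('myPlayerID').ready(function () {
//   var player = this, fired = [];
//   player.on('play', function () {
//     ga('send', 'event', 'Video', 'play', document.title);
//   });
//   player.on('timeupdate', function () {
//     var m = milestoneCrossed(player.currentTime(), player.duration(), fired);
//     if (m !== null) {
//       fired.push(m);
//       ga('send', 'event', 'Video', 'percent-watched', String(m));
//     }
//   });
// });
```

Sending percent-watched as GA events is what makes the remarketing use case possible, since remarketing lists can then be defined on those event conditions.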
I recently completed a project with MIT Press Journals (MITPJ) as part of the Analysis Exchange, an online marketplace that connects mentors and students in order to provide free web analytics services to non-profits. The program is intended as a vehicle for providing recent entrants into the web analytics field with real-world experience while assisting non-profits with services that are typically out of reach due to budget and staffing considerations. It’s a great program that I can’t recommend enough, so I’ve put together the following case study that can hopefully inspire others to get involved. With permission from MIT Press, I’ve released all the deliverables for the project on Google Drive so that others can borrow/steal from our work as much as possible.
The project spanned 4 weeks and can be broken out into the following 4 phases:
- Week 1 – Understand
- Week 2 – Educate
- Week 3 – Analyze
- Week 4 – Recommend
Anyone responsible for measuring campaign performance has likely run into incomplete or missing data caused by improper campaign tagging. This can show up in the form of (not set) values in Google Analytics or in artificially inflated (direct / none) traffic.
The typical solution is to ask that everyone in your organization generate campaign links with UTM parameters, such as: https://example.com/?utm_source=newsletter&utm_medium=email&utm_campaign=spring-sale
However, these links and parameters are often cumbersome to generate and are prone to human error. Any typos will result in misleading data in your reports.
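Since hand-typed UTM parameters invite typos, one mitigation is to generate the links in code. A minimal sketch — the function name and input shape are my own, though the utm_* parameter names are the standard GA campaign fields:

```javascript
// Build a campaign URL from structured inputs so typos can't creep in.
function buildCampaignUrl(baseUrl, campaign) {
  var url = new URL(baseUrl);
  // Lowercase values so reports don't split on "Email" vs "email".
  url.searchParams.set('utm_source', campaign.source.toLowerCase());
  url.searchParams.set('utm_medium', campaign.medium.toLowerCase());
  url.searchParams.set('utm_campaign', campaign.name.toLowerCase());
  if (campaign.term) url.searchParams.set('utm_term', campaign.term.toLowerCase());
  if (campaign.content) url.searchParams.set('utm_content', campaign.content.toLowerCase());
  return url.toString();
}

// Example:
// buildCampaignUrl('https://example.com/', {
//   source: 'Newsletter', medium: 'Email', name: 'Spring-Sale'
// });
```

A shared helper like this (or a spreadsheet backed by the same logic) also enforces consistent naming conventions across the team, which matters as much as avoiding typos.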
Lately I’ve been revving up to run a design sprint with a local agency interested in redefining their client on-boarding and discovery process. The timing lined up well as my copy of Design Sprint arrived just last week. Authored by local Boston luminaries C. Todd Lombardo, Richard Banfield, and their NYC compatriot Trace Wax, Design Sprint is a practical guide for running sprints within any sized organization.
There’s a section in the book on sprint supplies which I used as a checklist for my own sprint. After loading up my shopping cart on Amazon, I thought, “Wouldn’t it be nice if these were helpfully collated in a public Amazon wishlist?” After some quick Googling, I decided I might be the first to have this idea (please correct me if I’m wrong!).
Without further ado, here’s my Amazon Wishlist for design sprint supplies.
- Water-based flip chart markers (much better than alcohol-based markers on easel pads)
- Self-stick easel pads
- Large and small sticky dots for voting
- 8″ Time timer
- 8.5″ x 11″ Printer paper
- Adhesive putty tabs
- 4×6 multi-color post-its (slightly larger than typical post-its)
- Felt-tipped black pens
- Dry-erase kit
Leave a comment if you have any suggestions for additional supplies and happy sprinting!
The question came up recently within the Digital Analytics Association member forum whether one could interchange Google Analytics’ “Unique Views” metric, found when viewing a Content Group, with the “Sessions” metric found when using Advanced Segments. The member was asking because they had a preference for using Advanced Segments, but wanted to make sure they were comparing apples to apples. The question boiled down to this:
If I create a content group defined as “contains /blog/” and an advanced segment defined as “sessions that viewed pages which contain /blog/”, will the “Unique Views” metric for the content group be the same as the “Sessions” metric for the advanced segment?
The answer is yes. Let’s look into why.
When you create a content group and view the “All Pages” report with your content group selected, you’ll notice that the first 2 metrics displayed are “Pageviews” and “Unique Views”. Don’t be thrown off here. You may be familiar with “Pageviews”, but “Unique Views” is not “Unique Pageviews”. It’s a metric only available when viewing content groups.
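To make the equivalence concrete, here is a toy illustration with my own sample data (not GA internals): a content group’s “Unique Views” counts each session at most once, which is exactly what the segment’s “Sessions” metric counts.

```javascript
var sessions = [
  { id: 's1', pages: ['/home', '/blog/a', '/blog/b'] }, // viewed blog twice
  { id: 's2', pages: ['/home', '/pricing'] },           // never viewed blog
  { id: 's3', pages: ['/blog/a'] }
];

function inBlogGroup(page) { return page.indexOf('/blog/') === 0; }

// "Pageviews" for the group: every matching hit across all sessions.
var pageviews = sessions.reduce(function (total, s) {
  return total + s.pages.filter(inBlogGroup).length;
}, 0); // 3

// "Unique Views": sessions where the group was viewed at least once --
// the same count as a segment of "sessions that viewed a page containing /blog/".
var uniqueViews = sessions.filter(function (s) {
  return s.pages.some(inBlogGroup);
}).length; // 2
```

Session s1 contributes two pageviews but only one unique view, which is why the two metrics diverge while “Unique Views” and the segment’s “Sessions” stay in lockstep.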