



Mobile App & Live Streaming Analytics: A Case Study with Hope Channel International, Inc.


Over the last 3 years, Noise to Signal has had the pleasure of designing and implementing a robust analytics system for Hope Channel International, the media arm of the Seventh-day Adventist Church. Hope Channel’s shows focus on faith, health, and community and reach millions of viewers across dozens of countries in just as many languages. When I was introduced to Hope Channel in 2017, they didn’t have any hard data related to video performance or live-stream viewership. They were, in effect, flying blind when it came to scheduling and programming decisions. Today, they have near real-time access to granular data on show performance and viewership trends.

I was introduced to Hope Channel by way of Brightcove, a Boston-based company that provides an ecosystem of software and services supporting online video. At that point, Hope Channel was about to launch an ambitious undertaking with Brightcove: the development of multimedia apps targeting 6 different mobile devices and set-top boxes (e.g., Roku). These apps would give Hope Channel over-the-top distribution capabilities, resulting in a closer connection between the church and its viewers. In this project, Hope Channel leadership saw an opportunity to finally collect the viewership analytics they were lacking and welcomed the introduction to Noise to Signal. That relationship has culminated in two major projects that I’ll describe here: 1) app analytics and 2) live stream analytics.

App Analytics

A key benefit of the over-the-top media distribution model is that the platform developer can own the user’s experience rather than relying on third-party broadcasters. With this in mind, Hope Channel targeted 6 different app platforms that were likely to reach the broadest set of viewers: iPhone, Apple TV, Android, Android TV, Fire TV, and Roku. This presented an analytics design challenge: How would Noise to Signal ensure that the final reports showed data consolidated across all devices?

To solve this challenge, Noise to Signal made clever use of Google Analytics (GA), Google Firebase, Google Tag Manager (GTM), Google Data Studio, and GA’s Measurement Protocol. The final product is a Data Studio dashboard that consolidates the data collected from each app and can be partitioned by dimensions such as app, region, language, date, and show. Example reporting events include app installation, app open, screen view, video-on-demand video play, live-stream video play, and language change, among others.

App Analytics Dashboard

The technical details are below, but it’s worth mentioning first the amount of documentation and communication that was necessary for each of the composite pieces described below to work together. Brightcove managed separate developers for each app platform, Hope Channel assisted with requirements and user acceptance testing, and two consultants from Noise to Signal provided project management and analytics implementation services. Each reporting event was painstakingly designed to behave consistently across apps and output the same data format. This meant validating the technical feasibility of each event across all app platforms and delivering precise instructions to each development team. The final outcome was truly a team effort.
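To make that consistency concrete, here is a minimal sketch of what one shared event definition might have looked like. The event and field names below are illustrative stand-ins, not the exact specification used in the project.

```typescript
// Hypothetical, simplified shape of the common event payload every app was
// asked to emit, regardless of platform. Real names lived in shared documentation.
type AppEventName =
  | 'app_install'
  | 'app_open'
  | 'screen_view'
  | 'vod_play'
  | 'live_stream_play'
  | 'language_change';

interface AppEvent {
  name: AppEventName;
  appName: string;        // e.g. "Hope Channel Roku"
  language: string;       // e.g. "pt"
  affiliateName?: string; // e.g. "Hope Channel Brazil"
  showTitle?: string;     // play events only
  episodeTitle?: string;  // play events only
}
```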

Technical Details (if you are so inclined)

As mentioned previously, a key challenge was ensuring that data from each app could be collected consistently, then stored in and reported from a single data source. The following describes how each technology was deployed to accomplish this purpose.

App Analytics System Diagram
  1. The Fire TV and Android TV implementations were perhaps the easiest to design because Brightcove developed these solutions as a wrapper around a webpage. Users of these devices would load a Fire TV “channel” and would be unaware that the content displayed was basic HTML. GA and GTM were built with websites in mind, which made them a natural fit. However, because we were blending this “website” data with “mobile app” data from other sources, we had to find a way to transform our “website” hits into “mobile app” hits. To do this, we used custom tasks and some clever JavaScript to modify the hits as they were transported to GA (a sketch of this approach appears after this list).
  2. The iOS and Android solutions benefited from the fact that both platforms are supported by Google Firebase, which provides a suite of mobile app development accelerators including crash reporting and analytics. Furthermore, Google provides a seamless integration between GTM’s mobile SDK and Firebase whereby Firebase events are automatically converted into GTM events. Those GTM events were then converted into GA hits. While this implementation led to a longer chain of events (Firebase → GTM → GA), Hope Channel was able to reap the benefits of Firebase, which include, among other things, the ability to segment viewers and generate lookalike audiences in Google Ads (an example event call is sketched after this list).
  3. Roku and tvOS presented a conundrum. Neither is supported by Google Firebase or GTM, which meant the developers would need to work with GA’s Measurement Protocol directly. Fortunately, however, we were able to find community-built libraries for each platform that could be modified to suit our needs (a sketch of the resulting hit format also follows this list). This produced a more challenging documentation, implementation, and testing process, as each hit was built from scratch with parameters that shifted depending on the situation. It was during this process that I became all too familiar with the Charles HTTP proxy, which is often seen as a rite of passage in the analytics and testing world. A tool used only by the downtrodden and desperate!
  4. The GA configuration, as mentioned above, was set to focus on mobile app analytics. This meant that “pageviews” were replaced with “screenviews,” among other changes. The brunt of the reporting depended on custom dimensions such as “App Name,” “Episode Title,” “Show Title,” and “Affiliate Name”: 18 custom dimensions and 3 custom metrics in total. By including the “App Name” as a custom dimension, filters could be constructed that produced unique “views” for each app platform. A single, consolidated view could then be maintained to merge all app data.
  5. There were two categories of end users for the data collected in the previous steps: Hope Channel’s administrators and their local “affiliate” program managers. The administrators were given a dashboard that showed data collected across all apps and all affiliates. Using “report filters,” individual dashboards were then created for each affiliate program manager showing data specific to their affiliate station only (e.g., Poland or Brazil).
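As a taste of item 1’s customTask idea, here is a minimal sketch, assuming analytics.js is handling the hits on the Fire TV “webpage” and using an illustrative app name; it is not the production code.

```typescript
// Sketch: rewrite the outgoing analytics.js payload so a "website" pageview hit
// reaches GA as a mobile-app screenview hit. App name below is illustrative.
declare const ga: (command: string, field: string, value: unknown) => void;

ga('set', 'customTask', (model: any) => {
  const originalSendHitTask = model.get('sendHitTask');
  model.set('sendHitTask', (hitModel: any) => {
    let payload: string = hitModel.get('hitPayload');
    payload = payload.replace(/(^|&)t=pageview(&|$)/, '$1t=screenview$2'); // swap hit type
    payload = payload.replace(/(^|&)dp=/, '$1cd=');                        // page path becomes screen name
    payload += '&an=' + encodeURIComponent('Hope Channel Fire TV');        // app name, required for app hits
    hitModel.set('hitPayload', payload, true);
    originalSendHitTask(hitModel);
  });
});
```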
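For the iOS and Android path in item 2, the app-side work reduces to logging a Firebase event with the agreed-upon parameters; Firebase and GTM handle the rest of the chain. The sketch below uses the Firebase Web SDK purely to show the call shape (the actual apps used the native Swift and Kotlin SDKs), and the event and parameter names are illustrative.

```typescript
import { initializeApp } from 'firebase/app';
import { getAnalytics, logEvent } from 'firebase/analytics';

// Placeholder Firebase project config; real values come from the Firebase console.
const app = initializeApp({ apiKey: 'placeholder', appId: 'placeholder', projectId: 'placeholder' });
const analytics = getAnalytics(app);

// Log the agreed-upon event; Firebase forwards it to GTM, which maps it to a GA hit.
logEvent(analytics, 'vod_play', {
  show_title: 'Example Show',       // surfaced downstream as the "Show Title" custom dimension
  episode_title: 'Example Episode',
  language: 'en',
});
```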
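Item 3’s Roku and tvOS clients had to assemble Measurement Protocol hits themselves. The sketch below shows the general shape of such a screenview hit, written in TypeScript for readability (the real implementations were in BrightScript and Swift); the property ID, client ID, and custom dimension indexes are placeholders.

```typescript
// Hand-built Measurement Protocol screenview hit (Universal Analytics).
const params = new URLSearchParams({
  v: '1',                         // Measurement Protocol version
  tid: 'UA-XXXXXXX-X',            // GA property ID (placeholder)
  cid: '0f3a9b2c-example-client', // anonymous client ID generated by the app
  t: 'screenview',                // app-style hit type
  an: 'Hope Channel Roku',        // app name
  av: '1.4.2',                    // app version
  cd: 'Live Stream Player',       // screen name
  cd1: 'Hope Channel Roku',       // custom dimension: App Name (index illustrative)
  cd2: 'Example Show',            // custom dimension: Show Title (index illustrative)
});

// Fire-and-forget POST to Google Analytics' collection endpoint.
fetch('https://www.google-analytics.com/collect', {
  method: 'POST',
  headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
  body: params.toString(),
});
```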

Live Stream Analytics

The work above met a majority of Hope Channel’s requirements, but we knew there would be a major reporting gap related to live stream analytics. Our reports showed when users watched live stream shows but not what they watched. This was unfortunate given that live stream play events were 30x more frequent than video-on-demand plays. The underlying problem was that the apps had no knowledge of what shows were playing on any live stream at any given time. The only system with that knowledge was the live stream broadcast server which, at that time, had no structured method of sharing its information with other systems.

This second phase of work focused on closing this gap. We devised a system whereby the individual apps would send out pings every 20 seconds to a custom data collection endpoint built in Google Cloud Platform. These pings answered “when” users were engaging with a live stream channel. The “what” question would be answered by a custom API, built by Hope Channel, that accessed their live stream broadcast schedule. With these two pieces of information available in a structured manner, Noise to Signal was then able to stitch these data points together and provide robust analytics related to live stream viewership and show popularity. Data Studio was once again deployed as the reporting method while Google BigQuery was used to store the raw and stitched data.
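A minimal sketch of that ping loop is shown below; the endpoint URL, channel ID, and field names are hypothetical stand-ins for the real ones, and each app implemented the equivalent logic natively.

```typescript
// Hypothetical collection endpoint (an HTTP Cloud Function in practice).
const PING_ENDPOINT = 'https://example-region-project.cloudfunctions.net/collectLiveStreamPing';

function sendLiveStreamPing(channelId: string, clientId: string): void {
  const ping = {
    timestamp: new Date().toISOString(), // the key field for stitching to the schedule
    clientId,                            // anonymous per-install ID
    appVersion: '1.4.2',
    channelId,                           // which live stream channel is playing
    language: 'en',
    country: 'US',
  };
  fetch(PING_ENDPOINT, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(ping),
  });
}

// While a live stream is playing, ping every 20 seconds.
const pingTimer = setInterval(() => sendLiveStreamPing('example-channel-id', 'example-client-id'), 20_000);
// clearInterval(pingTimer) when playback stops.
```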

Live Stream Analytics – Viewers per Minute

Technical Details (if you are so inclined)

The key challenge in designing this solution was one of scale. Our estimated data collection rates indicated that GA, with its 20 million hits per month limit, would not be an appropriate solution. We also had to consider how the collection rate might increase as Hope Channel continued to promote their newly developed apps. To address these challenges, we turned to Google Cloud Platform and specifically Cloud Functions and BigQuery.

  1. As part of this project, each of the 6 apps was updated by Brightcove to implement the 20-second ping procedure. Each ping included a small set of attributes to aid in downstream reporting: timestamp (most importantly), client ID, app version, live stream channel ID, language, and country.
  2. The broadcast system provided a key piece of information: what shows play at what time on what live stream channels. This data was exposed by a custom API, built by Hope Channel, that accepted a date range and channel ID and returned the appropriate broadcast schedule.
  3. Noise to Signal’s implementation work began by constructing two HTTP Cloud Functions: one built to collect the app pings and another to collect the broadcast schedule (a sketch of the ping collector follows this list). If you aren’t familiar with Cloud Functions, you may be familiar with Amazon’s Lambda solution or, more generally, the concept of serverless computing. Once the data is collected, it’s transformed and written to BigQuery. One main benefit of Cloud Functions is their ability to automatically scale based on demand. The chart below shows 165 virtual machines spun up at one time to collect data! One reasonable question may be, “but how much does it cost?” While I can’t go into specifics, the answer is: not a lot. More details can be found on the Cloud Functions pricing page.
Number of Concurrent Virtual Machines (Google Cloud Functions)
  4. Another question of scale was related to data storage. Given the volume of data (already known to be greater than 20 million hits per month), where would it be stored, and could reports be generated in a reasonable amount of time? Here we turned to the combination of BigQuery and Data Studio. BigQuery partitioned tables were created to collect the raw live stream data and broadcast schedules. Scheduled queries were then constructed to merge this data together (a simplified version of that query follows this list). This is a computationally expensive operation that determines whether each live stream ping’s timestamp falls between the start and end times of any particular show. By computing this in advance and storing the results, downstream reports load faster. Finally, views were created to provide the metrics and dimensions used in the Google Data Studio reports.
  5. Finally, Data Studio was implemented as the dashboard and reporting solution. Importantly, a Data Studio parameter was created that controls the time zone displayed in the report (a feature I would love to see built into Data Studio; you can upvote the feature request if you agree). The final dashboard is able to calculate important metrics such as average minutes watched per viewer and average number of shows watched per viewer, as well as minute-by-minute viewing trends split by app, country, or language.
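To make item 3 concrete, here is a minimal sketch of what the ping-collecting Cloud Function could look like in Node.js/TypeScript; the dataset, table, and field names are illustrative rather than the project’s actual schema.

```typescript
import * as functions from '@google-cloud/functions-framework';
import { BigQuery } from '@google-cloud/bigquery';

const bigquery = new BigQuery();

// HTTP-triggered Cloud Function: validate the incoming ping and stream it to BigQuery.
functions.http('collectLiveStreamPing', async (req, res) => {
  const { timestamp, clientId, appVersion, channelId, language, country } = req.body;

  // Light validation before writing the row.
  if (!timestamp || !clientId || !channelId) {
    res.status(400).send('Missing required fields');
    return;
  }

  await bigquery
    .dataset('live_stream') // illustrative dataset name
    .table('pings')         // illustrative (date-partitioned) table
    .insert([{ timestamp, clientId, appVersion, channelId, language, country }]);

  res.status(204).send('');
});
```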
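And for item 4, here is a simplified version of the stitching logic, again with illustrative table and column names; in production this ran as a BigQuery scheduled query, but the same SQL can be submitted through the Node.js client as shown.

```typescript
import { BigQuery } from '@google-cloud/bigquery';

// Simplified stitching query: match each ping's timestamp to the show airing on
// the same channel at that moment.
const stitchQuery = `
  SELECT
    p.timestamp, p.clientId, p.channelId, p.country, p.language,
    s.show_title, s.episode_title
  FROM \`live_stream.pings\` AS p
  JOIN \`live_stream.broadcast_schedule\` AS s
    ON p.channelId = s.channel_id
   AND p.timestamp >= s.start_time
   AND p.timestamp <  s.end_time
  WHERE DATE(p.timestamp) = CURRENT_DATE()
`;

async function runStitch(): Promise<void> {
  const bigquery = new BigQuery();
  // Write the precomputed results to a destination table so downstream reports stay fast.
  const [job] = await bigquery.createQueryJob({
    query: stitchQuery,
    destination: bigquery.dataset('live_stream').table('stitched_pings'),
    writeDisposition: 'WRITE_APPEND',
  });
  await job.getQueryResults();
}
```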

Conclusion

The impact of this project goes beyond generating a few visually appealing reports. Hope Channel employees work hard every day to produce the best content possible for their viewers. These reports validate that work and bring a deeper sense of its impact. Most importantly, this analytics system creates opportunities for better organizational learning and decision making.

From our interactions with Hope Channel and Brightcove, to the technical challenges overcome and the final work product, this has been by far the most rewarding project Noise to Signal has taken on to date.

Many of the concepts presented in this case study are applicable to areas outside of media broadcasting. Do you or someone you know need a custom-built analytics system that stitches together data from multiple sources? If so, reach out!




Author

Adam Ribaudo


Adam Ribaudo is the owner and founder of Noise to Signal LLC. He works with clients to ensure that their marketing technologies work together to provide measurable outcomes.



