Target audiences: Measures of success and tracking progress in closing the gaps
Douglas K. Smith, Quentin Hope, Tim Griggs, Knight-Lenfest Newsroom Initiative

This is an excerpt from “Table Stakes: A Manual for Getting in the Game of News,” published Nov. 14, 2017.
This section describes ways to track progress and measure success at mastering how to use targeted content to serve targeted audiences. The first five steps (a through e) guide how folks in your newsroom (e.g. audience teams, audience developers and others) can work with analytics and technology colleagues to design, build, test and continuously improve data-driven metrics and related scorecards to support excelling at this Table Stake #1.
Part f below can help managers distinguish the data, analytics, scorecards and managerial reviews that drive improved performance and skill results across the three time horizons in which real performance and change happen: (1) weekly/biweekly sprints by which teams progress; (2) monthly/bimonthly progress reviews of team performance by senior leaders; and (3) one-to-three-month periods within which individuals set and achieve personal performance and skill goals.
Finally, part g can help senior news leaders see and manage targeted audience efforts as a portfolio within which to set and achieve overall news enterprise goals.
a) You cannot become audience-driven without data and without audience teams that use that data
Being data-driven goes hand-in-hand with audience-first approaches. Data provide audience teams a direct view into the size, behaviors and preferences of the target audiences they serve.
Digital audience data has been available to newsrooms for years, and its use and reach continue to spread. Yet the sheer volume of data, plus the variety of available metrics, can sow confusion and a lack of confidence about what data to look at and what to do with it. Too often, a focus on one metric in one conversation bears no relation to the focus on a different metric in another. Or conversations happen at either the very micro or the very macro level.
Target audience teams have an opportunity to give more effective focus and purpose to using audience data to drive audience growth. Such teams are small enough to use and learn from data in more practical, actionable ways than, say, top newsroom leaders trying to make sense of all of the newsroom’s audience data en masse. Audience teams also provide a practical, actionable context for individual reporters to translate audience results into team strategies and approaches, as opposed to merely looking at individual data and celebrating or despairing over whether it’s good or bad news.
Teams are better positioned to make choices, act on them and learn from data about what works, what doesn’t and how to adjust moving forward.
b) Marry clear audience purposes to the audience data you choose, share and use
In the absence of clear and shared audience purposes, objectives and goals, you risk drowning in the wide array of digital audience metrics, using data that reveals too little insight, or both. Your audience purposes emerge from exploring the questions that data can best help answer. These questions start at a high level and drill down to specifics:
- Very broad questions about the basic dimensions of the audience – its size, location, behaviors, sources and screen use;
- More specific questions that dig further into types of audience measures – e.g., “consumption,” “attention” and “loyalty” when looking further into the audience’s behaviors;
- Still more specific questions within these types of measurement to select particular metrics – for example, choosing “time spent per unique” to measure how much total attention you get from an audience member over a given span of time.
Together, these questions give audience teams a typology of measures to customize based on team purposes, objectives and goals.
This typology is useful in two ways. First, it provides a mental model for thinking about audience metrics overall and how individual metrics fit the larger picture. Second, it keeps clear why teams track any given metric – that is, the questions teams seek to answer. If there’s no clear and important question, there’s no value in tracking the metric.
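To make the typology concrete, here is a minimal sketch in Python of what one team’s customized typology might look like as a simple data structure. The dimensions follow the questions above; the specific metric names (e.g. time_spent_per_unique) are hypothetical placeholders for whatever your analytics system actually provides, not measures prescribed by this manual.

```python
# A hypothetical typology of audience measures: each dimension pairs the
# question the team wants answered with the metrics chosen to answer it.
# Metric names are illustrative placeholders, not a fixed standard.
AUDIENCE_TYPOLOGY = {
    "size": {
        "question": "How big is our audience?",
        "metrics": ["unique_visitors", "sessions"],
    },
    "location": {
        "question": "How much of our audience is in-market?",
        "metrics": ["in_market_uniques", "out_of_market_uniques"],
    },
    "behaviors": {
        "question": "How much consumption, attention and loyalty do we earn?",
        "metrics": ["pageviews_per_visit", "time_spent_per_unique", "visits_per_unique"],
    },
    "sources": {
        "question": "Where does our audience come from?",
        "metrics": ["search_share", "social_share", "direct_share"],
    },
    "screens": {
        "question": "On which devices is our content used?",
        "metrics": ["mobile_share", "desktop_share", "app_share"],
    },
}

# If a metric cannot be traced back to a clear, important question, the
# typology makes that visible, and the metric becomes a candidate for
# dropping from the scorecard.
```

Writing the typology down this way enforces the rule above: every tracked metric must answer a question the team actually cares about.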
In this typology, take special note of the first dimension, audience location. It is key to distinguish your in-market audience, since the primary purpose of audience teams is to grow an engaged and loyal local audience rather than to attract a churning, fly-by, high-bounce-rate audience from anywhere.
Yes, out-of-market traffic and page views have inventory value for programmatic advertising. That is a piece of the revenue puzzle to solve. Still, audience teams must go beyond programmatic to build local audiences from whom revenue can be earned in multiple ways: higher value advertising and sponsorship opportunities, local sponsored content, subscriptions, events, and more. (See Table Stakes #4 and #5 for more on revenue possibilities and practices for realizing them.)
Moreover, in-market and out-of-market audiences likely have different behaviors. Blending the data will distort the profile of the in-market audience and make it difficult to accurately understand local audience behaviors, needs and interests.
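To illustrate the in-market/out-of-market split, here is a minimal sketch in Python. It assumes page-view records carrying a designated-market-area field; the field name, market names and numbers are all hypothetical, and in practice this segmentation would typically be configured in your analytics system rather than computed by hand.

```python
# Hypothetical page-view records; "dma" (designated market area) stands
# in for whatever geography field your analytics system exposes.
pageviews = [
    {"user_id": "u1", "dma": "Hometown", "seconds": 95},
    {"user_id": "u2", "dma": "Hometown", "seconds": 240},
    {"user_id": "u3", "dma": "Elsewhere", "seconds": 4},
]

HOME_MARKET = "Hometown"  # assumption: a single home market

# Report each segment separately so fly-by, out-of-market traffic
# cannot distort the profile of the local audience.
for label, keep in [("in-market", True), ("out-of-market", False)]:
    segment = [pv for pv in pageviews if (pv["dma"] == HOME_MARKET) == keep]
    uniques = len({pv["user_id"] for pv in segment})
    seconds_per_unique = sum(pv["seconds"] for pv in segment) / max(uniques, 1)
    print(f"{label}: {uniques} uniques, {seconds_per_unique:.0f} seconds per unique")
```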
Note to readers
This view of digital metrics and analytics is from an audience perspective. Table Stake #2 provides another view of digital measures from a platform perspective for managing platform performance, though many of the same measures and metrics are involved. And Table Stake #4 focuses on metrics for “funneling” fly-by readers and viewers to become loyal audiences with a “willingness to pay” in one form or another.
c) Ask audience teams and analytics and technology folks to work together to develop and use audience team scorecards
Having a good understanding of audience data and being able to view it in an analytics system is a start, but it won’t do much to drive the performance of your audience teams. To be really useful, the data needs to be fashioned into a scorecard, a tool that serves multiple purposes. A well-developed scorecard:
- Tells the story of the audience and the audience behaviors you want to grow, and tracks your progress in reaching set objectives;
- Focuses on the questions and measures that matter most and provides a logical path for drilling deeper on questions as needed with additional measures;
- Serves as the focal point for regular team meetings to review audience results, to see and discuss what is and isn’t working, and to identify the next round of improvements to make.
Scorecards don’t have to be elaborate. In the beginning, for example, scorecards might comprise a handful of key measures manually pulled and tracked each week on a spreadsheet. That can be enough to get started with weekly review meetings. The important thing is to have and use some sort of scorecard with the team from the start and to build from there.
With this in mind, here’s guidance for developing scorecards audience teams should use to continually improve performance.
d) Determine the scorecard specs
While you might start with a manually kept scorecard, you will soon want to shift to one that is automatically generated by whatever analytics system your newsroom uses. In making that shift, develop the specifications for your scorecard through:
- Selecting the audience measures to track. Use the typology of measures to pick wisely among the range of possible things to track. In the beginning, focus on fewer rather than more. Recognize that your choices may change over time as the audience teams evolve. For example, audience teams might initially choose to build traffic and engagement and later move to increasing reach, loyalty and monetization.
- Adding content measures. Content helps drive audience. So make sure the team’s scorecard includes some basic measures to track how the volume and effectiveness of the content published by the team are affecting audience numbers. These can include the following (a computational sketch follows this list):
- The count of stories published by all the team’s contributors;
- The percentage of stories with unique visitors above a threshold number (e.g. % of stories with >= 3,000 uniques, with the threshold set as a target that’s well above the current average);
- The percentage of stories with unique visitors below a threshold number (e.g. % of stories with <= 100 uniques, with the threshold set at a level deemed not to have been worth the time and effort).
- Setting the reporting frequency for each measure. As the list of measures gets longer, you can and should distinguish between tracking measures and diagnostic measures. Tracking measures are those your team reviews to monitor progress, while diagnostic measures are those the team uses to understand what is happening and why.
- Tracking measures are a smaller set of measures that teams likely should review once a week – long enough to smooth out day-to-day variations yet not so long as to preclude action (monthly intervals, for example, might cause the team to lose track of what to do and why).
- Diagnostic measures are reviewed to drill down on more specific questions and to analyze particular issues and opportunities. They are a longer list of measures that facilitate and support teams taking deeper dives. Consequently, monthly time intervals are quite likely more useful than weekly ones.
- Including goals for the measures. Scorecards must include the current goals set for the team. Otherwise, they are no more than a table of numbers. Goals should cover all of the team’s tracking measures and perhaps some of the diagnostic measures. Time frames for goals will include but also extend beyond weekly intervals.
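As referenced under “Adding content measures” above, here is a minimal sketch in Python of those content measures. It assumes a simple list of per-story records with unique-visitor counts; the story slugs and numbers are invented, and the thresholds are the hypothetical examples from the text.

```python
# Hypothetical per-story records for one audience team's output.
stories = [
    {"slug": "school-board-vote", "uniques": 4200},
    {"slug": "weekend-events-roundup", "uniques": 2900},
    {"slug": "council-agenda-brief", "uniques": 80},
]

HIGH_THRESHOLD = 3000  # a target set well above the current average
LOW_THRESHOLD = 100    # below this, deemed not worth the time and effort

total = len(stories)
pct_high = 100 * sum(s["uniques"] >= HIGH_THRESHOLD for s in stories) / total
pct_low = 100 * sum(s["uniques"] <= LOW_THRESHOLD for s in stories) / total

print(f"Stories published: {total}")
print(f"% of stories >= {HIGH_THRESHOLD:,} uniques: {pct_high:.0f}%")
print(f"% of stories <= {LOW_THRESHOLD:,} uniques: {pct_low:.0f}%")
```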
e) Work with your data/analytics colleagues to ensure the scorecard meets your chosen specs
It is key to include your data, analytics and technology colleagues from the beginning of your scorecard discussions. In particular, engage them about:
- Capturing all the data. Make sure to capture data from all the platforms where the audience team’s content is published. This includes your own websites, mobile sites and apps. To the degree possible, it should also include available data from other platforms where the team’s content is read and viewed directly (for more, see Table Stake #2).
- Providing multiple data views in the same format. Three different cuts of the data going into the scorecard are needed for three different purposes. All three versions should look the same. This is important for ease and speed of viewing and to reinforce the audience story reflected in the main scorecard.
- Team performance: how the entire team has done on key measures. This version is the main scorecard and is used for the weekly team meetings and deeper-dive monthly sessions.
- Individual performance: how each individual contributor to the team has done on key measures. Among other things, this view is key for one-on-one review and coaching sessions and for evaluating relative performance across the team’s content contributors.
- Content performance: This version helps the team better understand what kinds of content and story forms work best or not.
- Displaying trending. Teams use the scorecard to drive growth and improvement, which means scorecards must include rates of change from period to period. This can be done numerically with percentage changes from the last report or against a rolling average for the measure (e.g. the rolling average of unique users for the last 4 weeks; a computational sketch follows this list). Showing a graph of how things trend over an extended period is the best way to see what’s happening.
- Ensuring usefulness to the team. Just as with a website or app, the user experience is important. Is everything clearly and meaningfully labeled? Is there a good flow to the order and layout? It’s worth first doing a wireframe mock-up, just as you would for a website. Similarly, for viewing the scorecard online, lay out the navigation and the places where you’d most want to drill down for detail.
- Providing access. To be as useful as possible, make sure folks have access to the scorecard by desktop, mobile device and printout. Managing who has access is also important to think through. Open, broad access is a better rule than closed, narrow access.
- Making trade-offs. You will probably confront challenges around data availability, data quality and the reporting limitations of your analytics systems. Trade-offs, next-best alternatives and workarounds will be required. So the better your analytics folks understand your scorecard needs, the better and more pragmatic the choices and approaches will be.
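As referenced under “Displaying trending” above, here is a minimal sketch in Python of the period-over-period and rolling-average math. The weekly unique-visitor counts are invented for illustration, and the 4-week window follows the example in the text.

```python
# Hypothetical weekly unique-visitor counts for one tracking measure.
weekly_uniques = [18200, 19400, 17800, 21000, 22300, 21700]

WINDOW = 4  # rolling-average window, per the 4-week example above

for week, value in enumerate(weekly_uniques, start=1):
    prior = weekly_uniques[week - 2] if week > 1 else None
    change = f"{100 * (value - prior) / prior:+.1f}%" if prior else "n/a"
    recent = weekly_uniques[max(0, week - WINDOW):week]
    rolling = sum(recent) / len(recent)
    print(f"week {week}: {value:,} uniques, vs. last week {change}, "
          f"{len(recent)}-week avg {rolling:,.0f}")
```

Even with the numbers in hand, a simple line graph of the same series remains the fastest way for a team to see the trend.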
f) Use three time horizons to guide and manage audience team performance
Senior leaders along with audience teams and team leaders must use the goals and scorecards to drive performance, skill building and change. Three time horizons or cycles are key to guiding and managing overall progress:
1. Weekly (or biweekly) team meetings to stay focused on design-do experimentation and improvement
Weekly audience team meetings establish a performance rhythm that keeps the design-do cycle moving. Daily meetings don’t allow enough time to make choices, act on them and see results. Yet if meetings happen less often than every two weeks, too much time lapses between check-ins, disrupting the rhythm of progress. Following a set agenda keeps these meetings brief and effective.
- Review scorecard results and trends for the prior 7-day (or 14-day) period. Identify and discuss what did and didn’t work based on the audience data, what did and didn’t happen as intended, and what insights can be gleaned and acted upon.
- Identify and check in on the progress of “sprints” to improve performance in focused ways. “Sprints” come from the world of agile software development, where work is broken down into short, iterative cycles, with each cycle sharply focused in terms of activity and objectives. These short cycles typically last one to two weeks, sometimes as long as four. For example:
- Focusing on a particular skill and practice across the team (e.g. writing better social heads);
- Trying quick experiments to test specific ideas (e.g. a new type of aggregation);
- Asking team members to adopt practices that have shown progress, whether those practices arose within or beyond the team (e.g. story forms that have proven successful in other audience teams).
- Calendar ahead for content and coverage, including the content flow for the next one to two weeks, further-out content items that are emerging possibilities or date-pegged opportunities, and planning lead times for stories requiring early coordination with specialists (e.g. visually rich story projects).
2. Monthly (bimonthly or every-six-week) reviews with senior newsroom leaders
The senior leaders to whom the audience teams are accountable should gather the teams every 4 to 8 weeks to (1) evaluate progress, (2) discuss and identify insights from design/do sprints, (3) raise and resolve issues and challenges needing senior management input or decisions, and (4) do deeper dives into particular issues or opportunities that benefit from more senior perspectives or cross-team sharing about what’s working versus not working.
In addition, senior leaders need to set aside time to review skill-building efforts, including candid discussions about gaps between the skills, attitudes, behaviors and working relationships teams need for success versus the current reality – and what steps have been or ought to be taken to close those gaps (see chapter entitled “Shaping The Right Staff Roles And Skills For Your Newsroom”).
And, at least 2 to 4 times each year, senior newsroom leaders should invite the enterprise’s top management team to join for a more thorough, strategic review of how well each audience team’s “content plus” strategy is serving the target audience as well as generating revenues and otherwise supporting enterprise results.
For these reviews, pull a report of every article published over a recent period (e.g. the last 90 days) that has also been live for a minimum amount of time (e.g. 7 days). With each story, include the traffic data for two or three key audience measures from the audience team’s scorecard (e.g. total uniques and average time spent). Then sort the stories by each of the measures to create ranked lists per measure (e.g. one for total uniques and another for average time spent; a sketch of this report follows the list below). In reviewing these lists, look for:
- Consistently low-performing types of stories to drop (or to cover in more effective ways);
- New types of content and story forms that are performing well and worth doing more of;
- Stories with “middling” performance that could be boosted to the “top 20%” with some reframing and better production;
- Content with good evergreen or long-tail performance that can be produced during slower periods or assigned to freelancers.
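Here is the sketch referenced above: a minimal Python version of the ranked-list report. The per-story records and field names are hypothetical placeholders for whatever your analytics export actually contains.

```python
# Hypothetical story records from a 90-day export, each live at least
# 7 days, with two key measures from the audience team's scorecard.
stories = [
    {"slug": "school-board-vote", "uniques": 4200, "avg_seconds": 85},
    {"slug": "weekend-events-roundup", "uniques": 2900, "avg_seconds": 40},
    {"slug": "zoning-explainer", "uniques": 1500, "avg_seconds": 210},
    {"slug": "council-agenda-brief", "uniques": 80, "avg_seconds": 15},
]

# Sort the same stories by each measure to create one ranked list per
# measure; reviewers then scan the tops and bottoms of each list.
for measure in ("uniques", "avg_seconds"):
    ranked = sorted(stories, key=lambda s: s[measure], reverse=True)
    print(f"\nRanked by {measure}:")
    for story in ranked:
        print(f"  {story['slug']}: {story[measure]:,}")
```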
In addition, make sure to review how the team is doing on its goals for the blended cost and value of content. And make sure the top management of your enterprise understands and weighs in on how each team’s “content plus” strategy might get adjusted to better link to revenues.
3. One-on-one sessions to drive performance of individual team members
These are coaching sessions, not formal performance reviews. The frequency can vary from every 30 to every 90 days, depending on the individual being reviewed. The agenda and objectives include:
- Reviewing progress against agreed upon improvement actions identified in the last one-on-one session – what did and didn’t get done, what seems to be working and what’s still a struggle.
- Looking at the individual’s most recent audience scorecard and discussing recent results: improvements, shortfalls and areas to work on moving forward, including how the individual can repeat and amplify successes.
- Discussing some recent stories – some that performed well and some that did not – and identifying specific things that did and did not work in the stories ranging from the story idea itself to specific aspects (e.g. the cropping of a photo) to publishing time and platform.
- Linking all of these discussions to the individual’s skill, attitude, behavior and/or working relationship gaps/shortfalls to be closed.
- Agreeing on specific improvement opportunities for the coming period, making sure to ask about and discuss where, how and from whom the individual can get help. Remember to keep notes so you can come back to items at the start of the next review session.
Use this same approach with freelancers who are essential to the team’s success. Freelancers will appreciate the feedback and discussion since they are less likely to get it elsewhere. (And, if they don’t appreciate the feedback, it’s a sign you should seriously consider cutting ties.) It also can help motivate freelancers to make your newsroom their top client priority. Finally, if the team has any critical content partnerships with other organizations, it’s key to find ways to review those as well.
g) Manage across the portfolio of target audience teams
From the perspective of the entire newsroom – indeed the entire news enterprise – target audience teams comprise a portfolio that senior leaders must manage and migrate toward audience and enterprise success. Each audience team is best viewed as a mini-enterprise that, when fully formed, has a business strategy and plan for success – not just a content and audience plan. (See Table Stake #7: Drive Audience Growth and Profitability From A “Mini-Publisher” Perspective)
Managing this portfolio of mini-enterprises involves:
- Setting performance expectations for the portfolio as a whole that, in turn, get allocated to the audience teams as well as the entire newsroom. (Newsroom-wide audience development can be tracked with a newsroom scorecard using the same design as the audience teams’ but inclusive of all traffic.)
- Regularly reviewing the performance of the audience teams collectively using the common scorecards that have been developed for them. Such reviews should include the participation of groups beyond the newsroom – sales, marketing, events, technology, etc. – that need to be actively working with and across the audience teams.
- Assessing where to add or shift reporting and other resources across the target audience teams based on their performance and opportunities.
- Identifying where target audience teams need to be added, redefined or reconfigured.
- Working with the publisher, marketing, ad sales and others to convert the growth in local audiences into revenue opportunities and results.