A resource for news innovators powered by American Press Institute
Complexity: Intermediate

Embrace platforms: Measures of success and tracking progress in closing the gaps

There's a wide variety of data available to monitor platform performance. Avoid unnecessary complexity, and also choose and monitor indicators for aspects that analytics data cannot address – such as user experience, revenue and costs.

This is an excerpt from “Table Stakes: A Manual for Getting in the Game of News,” published Nov. 14, 2017. Read more excerpts here.

You can choose among a rich array of available data and analytics to monitor platform performance. In doing this, though, avoid complexity – and the confusion, loss of focus, and lack of action arising from ill-defined, ill-chosen data and analytic approaches.

In addition, choose and monitor metrics/indicators for aspects of a platform business plan that are not addressable by analytics data – things related to the quality of the user experience, revenues, costs and so forth.

Platform measures

At the broadest level, there are two key categories for platform metrics:

  • Analytics-based metrics. The familiar digital metrics of uniques, shares, time spent and the like focus on audience traffic and the audience’s interactions with the platform. They help you get continuously better at using a platform to grow and engage audiences every day. They are operationally oriented.
  • Managerially focused metrics. Managerial measures guide platform owners in managing platforms as businesses. They help you plan, implement and monitor performance – audience growth and retention, profitability, market performance vis-à-vis competitors, etc. – in ways that let you compare platform versus platform as well as assess the portfolio of platforms as a whole. They are less operational and more strategic.

Analytics-based measures

Four types of analytic metrics help you focus on the audience’s use of the platform:

  • Reach: how many individuals a platform might help you reach. For example, Facebook might help you reach, say, 2 million users in your metro. That doesn’t mean your efforts actually reach that number. But they might.
  • Consumption, time and attention: how many users you actually reach and, for those users, how much of their time you garner and the “stickiness” of their visits – how well you keep them with you in a session.
  • Engagement actions: what actions those you reach take beyond just reading and viewing your content – for example, do they click on ads, provide you data or sign up for subscriptions?
  • Loyalty and habituation: how often those you reach come back in a given period of time; how much of a habit you are in their lives; how often they share your content with others.

The following typology provides a simple, illustrative overview of analytic-based measures. You and your colleagues should use it to gain a basic orientation about the nature of choices at hand. Your objective here is to become conversant with the basic choices so that you can better do the more complicated work of actually choosing the specific metrics that work with different platforms. For that work, see TS#4.

Typology of Platform Metrics

  • Reach
    • unique users: in-market
    • sources of users: by referral source
    • devices used: by device
  • Consumption, time and attention
    • click-throughs: total click-throughs (referrals); click-through rate
    • content views: total page views; views per user session
    • time spent: by content item; per user session; bounce rate; completion rates
  • Engagement
    • likes/recommends: total for site/brand; rate per content piece
    • shares: total count; rate per content piece
    • comments: total count; rate per content piece
    • survey responses
    • self-identification: registrations
  • Loyalty and habituation
    • visits per unique user
    • % of unique users visiting at or above a threshold number of times (e.g., 9 or more visits per month)

All measures apply within a defined time period.

Managerially focused measures

Platform owners use managerial measures to manage platforms as businesses. They help measure whether overall platform performance is delivering what the business needs for success: audience, market share/standing, revenue, productivity/costs and profitability:

  • Audience hurdle measures. By ‘hurdle,’ we mean a threshold level above which you deem performance good and below which not good. For example, you might use a hurdle of X thousand unique visitors to distinguish platforms that perform well versus those that do not.

Platform owners must set hurdles because, without them, you fail to take a position on what is good versus not-good performance. This is true for the platforms you own and truer still for platforms owned by others, where less information is available.

Hurdles help you drive improved performance through comparing and contrasting different efforts on the same platform. Imagine, for example, that Facebook posts published over the past 30 days have ranged from zero uniques to 5,000 uniques and zero time to 3 minutes of engagement. You might divide up this full range into quartiles or quintiles of performance – in the case of quintiles, you figure out the average number of uniques and engagement time for the top fifth, second from top fifth and so on down to bottom fifth.

With this information, you as platform owner might set a hurdle (threshold). For example, you let folks know you’re seeking ‘desired performance’ equal to the averages for the top fifth – say, 3,500 uniques and 2 minutes of engagement.

You and your colleagues can then monitor against this goal over whatever time horizon you think best: the next 30 days… the next 60 days… the next 90 days … and so on.

As platform owner, you monitor how often content reaches into the top fifth – and work with folks to understand why versus why not so that it happens more often.
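As a rough sketch of the quintile approach above, the calculation looks like this (the sample uniques figures are invented for illustration):

```python
# Split recent post performance into quintiles and use the top
# fifth's average as the hurdle. The uniques figures are invented.

def quintile_averages(values):
    """Rank values high to low, split into fifths, average each fifth."""
    ranked = sorted(values, reverse=True)
    n = len(ranked)
    fifths = [ranked[i * n // 5:(i + 1) * n // 5] for i in range(5)]
    return [sum(f) / len(f) for f in fifths]

# Uniques for the last 10 Facebook posts (illustrative only):
uniques = [5000, 4200, 3100, 2800, 2300, 1900, 1200, 800, 400, 0]

averages = quintile_averages(uniques)
hurdle = averages[0]  # 'desired performance' = average of the top fifth
```

The same calculation can be run on engagement time to set the paired time hurdle.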

(Please note: If this sort of internal hurdle is happening for the first time in your newsroom, you’ll probably want to make clear the objectives are about learning – and not immediately about individual performance. Yes, with time, individual performance will come under scrutiny. But not until you and others learn more about how to improve results. Failure to manage this distinction can lead to anxieties that preclude instead of advance learning.)

It’s worth noting that audience teams – and the mini-publishers managing them – might also set internal platform hurdles for their teams in ways that help platform owners succeed as well. For example, food-related content might consistently outperform town hall meeting content on Instagram. This could mean setting a lower, yet still ‘good,’ hurdle for town hall content and a higher hurdle for food content.

And, the hurdle you use might arise externally as well. The above Facebook example linked to internal numbers. It might, though, have come from the performance of competitors (for example, perhaps you know a local TV station routinely exceeds 3,500 uniques and 2 minutes engagement time – and that is the source of your designated hurdle/threshold).

Using this approach helps you build teams and individuals whose work consistently exceeds the hurdle level. It helps you avoid getting caught in the trap of focusing on big-hit, viral stories. And it provides you with performance data with which to address and make choices about low-performing content – whether, why and how much of that content to create.


Dallas made good use of internal hurdles in all these ways. They developed a careful index of factors, then set hurdle goals that differed for different content types. When they first rolled this out, they made clear it was about learning, not individual performance – while also alerting folks that the individual-performance aspects would emerge after they’d learned much more about what works and what doesn’t.

  • Local market standing and share. These measure how well you do on particular platforms relative to other publishers in your market:
    • Local market standing: an estimation of where you rank among competitors in your local market on the platform
    • Local market share: a rough estimation of how much of the local market (or the target market within the local market) you are reaching through the platform based on demographic data for the local market.
  • Revenue attributable to platforms. Platform owners running platform businesses need to understand how much revenue their platforms generate – or help generate. Hence the word ‘attributable,’ which can be direct, indirect or intangible yet still important:
    • Direct: on-site advertising and sponsorships; revenue shares from others’ sites
    • Indirect: ad revenue derived from traffic sent to a directly monetized site
    • Intangible: reach, brand discovery and brand reinforcement

Please note: How platform owners set and monitor revenue and revenue-related (e.g., brand discovery) goals is essential to running the platform as a business. Such revenue goals and approaches define the core objectives of the ‘business’ – and how those objectives relate to serving audiences and at what cost.

For example, say you set an overall revenue target of $10,000 from ad revenue generated by Facebook referrals. What reach, click-through rate, share rate and share click-through rate would be needed to generate the main-website page views that, at a given CPM, add up to $10,000? Once you’ve figured that out, set those as targets.
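One way to sketch that back-calculation – every rate and the CPM below is an illustrative assumption, not a figure from the text:

```python
# Work backward from a revenue goal to audience targets.
# Every specific number here is an illustrative assumption.

def required_targets(revenue_goal, cpm, ad_units_per_page,
                     pages_per_visit, click_through_rate):
    """Back-solve reach/traffic targets from an ad-revenue goal."""
    revenue_per_page_view = ad_units_per_page * cpm / 1000  # CPM is per 1,000
    page_views = revenue_goal / revenue_per_page_view
    referral_visits = page_views / pages_per_visit
    people_reached = referral_visits / click_through_rate
    return {"page_views": page_views,
            "referral_visits": referral_visits,
            "people_reached": people_reached}

targets = required_targets(
    revenue_goal=10_000,      # the $10,000 goal from the text
    cpm=10.00,                # assumed $10 CPM
    ad_units_per_page=4,      # assumed 4 ad units per page
    pages_per_visit=2,        # assumed pages viewed per referred visit
    click_through_rate=0.05,  # assumed 5% of people reached click through
)
```

Running the numbers this way makes clear which lever – reach, click-through rate or monetization per page – is doing the work, and which targets are realistic for your market.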

  • Platform productivity. Productivity is a business term meaning: how much output do we get from a given input? For example, productivity might estimate the “audience yield” you get from the content your newsroom publishes on the platform. It is best expressed as a ratio: audience yield versus staff time involved, cost to produce content or whatever other input you choose. Imagine, for example, that you have 50 folks in your newsroom and they average 2 hours per week posting to Facebook. That’s 100 hours per week – or, if we use 50 weeks per year, 5,000 hours per year. Either way, you can now construct a productivity ratio by choosing the relevant audience metric as the output measure: “We get X uniques for our 5,000 hours, we get Y engagement time for our 5,000 hours …” and so forth. Platform owners can also translate the 5,000 hours into costs using wages, salaries, benefits and other direct costs (that is, not pure overhead). Perhaps the 5,000 hours equates to, say, $150,000. With either of these in hand – hours or costs – platform owners can set and monitor productivity goals.
  • Platform contribution. By attributing revenue and costs to platforms, platform owners can gauge how much the platform is contributing to the total enterprise’s economic performance. And, when this is done for all platforms, your enterprise will have a good indicator of overall platform portfolio performance as well as platform versus platform performance.
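The productivity and contribution arithmetic above can be sketched as follows. The hours and cost figures come from the text's example; the uniques and revenue figures are assumed for illustration:

```python
# Productivity ratio and platform contribution, per the example above.

staff = 50                # newsroom staff posting to the platform
hours_per_week = 2        # average hours per person per week
weeks_per_year = 50
annual_hours = staff * hours_per_week * weeks_per_year  # 5,000 hours

annual_cost = 150_000     # the text's example loaded cost for those hours
cost_per_hour = annual_cost / annual_hours              # $30 per hour

# Output side of the productivity ratio (assumed figure, not from the text):
annual_uniques = 1_200_000
uniques_per_hour = annual_uniques / annual_hours

# Platform contribution: revenue attributed to the platform minus
# costs attributed to it (revenue figure assumed):
attributed_revenue = 400_000
contribution = attributed_revenue - annual_cost
```

Either denominator – hours or dollars – works, as long as it is applied consistently across platforms so the ratios are comparable.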

Platform scorecards and sharing best practices

As described in TS#7, platform owners are encouraged to be ‘mini-publishers’ who manage key platforms as businesses. For the most important platforms, platform owners should blend analytics metrics with managerial metrics to create and use a scorecard. It is less important – and could be too much work – for platform owners to keep scorecards for less critical platforms, including those still being experimented with. For those platforms, though, it’s essential to hold regularly scheduled group meetings to share best practices: what’s working, what’s not, why, and agreed-upon next steps.

Meanwhile, platform owners as a group along with top news enterprise leaders must manage the overall portfolio of platforms – including both key ones for which there are scorecards as well as less critical ones for which only best practice sharing gets done.


The steps and guidance for developing and using platform scorecards parallel those described for target audiences in TS#1:

  • Choosing the scorecard measures: use the previous sections on analytics measures plus managerial measures to identify what to include in your platform scorecard.
  • Setting the measurement interval: measurement intervals will vary. For example, analytics measures could be monitored daily, while some managerial measures are best reviewed weekly, monthly or even quarterly. Essentially, as a platform owner, you want to allow enough time between measurements and reviews to make progress against the goals you set.
  • Constructing the scorecard itself: See Table Stake 7 for guidance on this.
  • Setting objectives for the platform: it’s key to set SMART, outcome-based goals that describe what success looks like on an ongoing basis.

Qualitative best practice sharing

Platform owners and their colleagues must proactively seek out and share examples of what’s working well – and instructive examples of what is not. There is a range of ways to do this: (1) weekly emails; (2) incorporation into daily meetings; (3) posting on an internal wiki or knowledge-sharing platform; and (4) Slack or Slack-like channels. In addition, platform owners and colleagues should encourage informal sharing of what works and what doesn’t – essentially, seek out folks and ask them to find ways of sharing on their own. Make sure to share the approaches and techniques used as well as the results, along with the specific lessons learned and any ongoing questions worth pursuing. And don’t forget to celebrate the names of those who have acted – even in cases where a failure can be celebrated as a source of learning.

Managing platform performance with the scorecards

Use scorecards to track performance, identify issues needing corrective action and spot opportunities for improved performance. In the case of platform scorecards, this should be done on four levels.

  • The platform level. The owner for each given platform should spend a half hour each week tracking recent performance and spotting immediate issues of concern or opportunities. Every two to four weeks, it’s worth spending more time to dig deeper into the data and identify clear next steps to improve performance. Both levels of review should be scheduled on a calendar.
  • The target-audience team level. Platform owners and audience teams should regularly hold joint reviews to reflect on – and identify key next steps for – how audience team efforts perform on the platforms. While the timing of these reviews will vary, it’s key to make sure every audience team using the platform meets with the platform owner at least every two to three months.
  • The newsroom level, across the portfolio of platforms. Newsroom leaders should gather all platform owners about every four weeks to focus on cross-platform issues and opportunities, and to coordinate joint efforts to improve platform use (e.g., rotating responsibility for best-practice sharing). Also use these meetings to discuss, review and evolve the role of platform owner, and to share experiences and ideas for being more effective in the role.
  • The enterprise level across the portfolio of platforms. Every three to six months, the top management of the enterprise should gather platform owners along with appropriate folks from editorial, technology, business development, sales, marketing and others to focus on current platform-by-platform performance, investments (or disinvestments) needed among current platforms, experiments with new platforms and launches of new platforms with tested audience growth potential.

This last level of review (and resulting action) is the essence of platform portfolio management. Given the limited capacity of any metro newsroom to publish on multiple platforms, this level of management is essential for making sure that capacity is deployed to publishing in the most effective ways for building a larger, more engaged and more loyal local audience to be served and from which revenue can be earned.