
Postmedia uses junk science to make its case for digital millions

In policy debates, ‘junk science’ should be taken with not just a grain of salt, but several kilos of it



Non-scholars tend to be impressed by studies that seem scientific but on closer inspection simply don’t hold water. Such studies are often offered by our news media in pursuit of government largesse, and our politicians and bureaucrats usually fall for them. A good example was the National Post op-ed last week headlined “Big Tech is profiting immensely from news,” which pointed to a Swiss study that purported to measure how much Google makes from news links there. The study, produced by a behavioural economics consulting firm hired by the Swiss publishers association, found that Google searches involving media content generate about $592 million in annual revenue, or about $69 for every member of the population. It somehow concluded that publishers should receive 40 percent of that revenue (not profit), or about $237 million a year.
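As a rough check of how those figures hang together (assuming Switzerland’s population of roughly 8.6 million, a number not stated in the study as quoted):

\[ \$592\text{M} \div 8.6\text{M people} \approx \$69 \text{ per person}, \qquad \$592\text{M} \times 0.40 \approx \$237\text{M per year}. \]

The arithmetic is internally consistent; it is the premises behind the 40 percent figure that the rest of this piece takes apart.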

Switzerland is considering legislation that would have much the same effect as our Online News Act, for which Postmedia campaigned, and its government is accepting submissions from stakeholders until September 15. The study, however, is fatally flawed by its methodology, which amounts to little more than a couple of analogies, both of which fall flat. The first is to Google’s own AdSense program, which sells ads for websites, matching them to readers based on the vast amount of data Google has gathered on computer users. “In analogy to the Google AdSense program,” the study said, “the fair and industry-standard division ratio is between 32% and 49% (on average 40%), which are passed on to the media.”

The only problem is that sending readers (and potential subscribers) to a news website from search results is hardly analogous to selling ads for that website. When Google sells ads for a website, it provides a service for which it charges a percentage of the proceeds. That percentage is no business of government, as it is a private commercial transaction. You always have the option of hiring someone else to sell your ads, or you could (shudder) sell them yourself. The study admits that “Google is not a monopolist and has serious competition” in the online advertising market. Google is by far the most popular provider of this service, however, perhaps because it has the most data. There’s nothing unfair about that. It has invested handsomely in hardware and software, not to mention intellectual property, to achieve its success.

The other analogy used by the Swiss study is even more inapt. “Another example of revenue sharing is the Microsoft Content Network,” it states. “Microsoft integrates current content from media companies on its websites, for example on MSN.com… In the process, Microsoft offers content providers (publishers) a revenue share of 60 percent.” The trouble this time is that unlike Google, MSN doesn’t send readers to the publisher’s website to read an article in full and be exposed to its surrounding ads. Instead, it copies the content to its own website, places its own ads around it, and shares the resulting revenue with the content creator. Click on any story on the MSN portal and see for yourself.

Rather than conceding that they should pay publishers for sending readers (and potential subscribers) to news media websites, where those readers are exposed to the publishers’ ads, Meta and Google argue that the value proposition runs in the opposite direction: they are doing publishers a big favour by sending them free traffic. Meta estimates that it sends Canadian publishers more than 1.9 billion clicks a year worth more than $230 million. Google’s estimate is even higher, at $250 million.
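Taken at face value, Meta’s estimate implies a per-click value of roughly (a back-of-the-envelope division, not a figure offered by either company):

\[ \$230\text{M} \div 1.9\text{B clicks} \approx \$0.12 \text{ per click}. \]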

It should come as no surprise that a study bought and paid for by an interested party is almost laughably flawed. This is what we call “junk science,” and it surrounds us. A similar study a few years ago in the United States estimated that Google was raking in a whopping $4.7 billion annually from news searches there. That brought howls of laughter from Harvard’s Nieman Lab website, which quipped that it was “based on math reasoning that would be embarrassing from a bright middle schooler.”

I have been railing for years against the use of junk science in media policy debates, and economists are the worst culprits. Their fancy mathematical studies rarely measure what they claim to because they suffer from the basic flaw of “garbage in, garbage out.” My favourite example of a gaping hole in economic research offered up by Postmedia to justify its plundering ways came in 2014, after it took over the Sun Media chain and gained daily newspaper monopolies in four of Canada’s six largest markets. National Post columnist John Ivison pointed to a study by economists that examined the impact of Canadian newspaper mergers in the late 1990s on circulation prices and advertising rates and concluded that “consolidation in the newspaper market does not imply an ability by publishers to set higher prices.” Since 70 years of media economics research in my field of communication had found exactly the opposite, I contacted one of the authors, who told me they had only looked at studies in economics, a field that only began studying communication in the 1990s.

The worst example, however, is the Local News Map (LNM) produced at Toronto Metropolitan University’s journalism school, which magically found the number of community newspapers to be falling fast despite an annual industry inventory that found it fairly steady until the pandemic. The LNM was first cited in the 2016 think tank report The Shattered Mirror, which was influential in the $595 million news media bailout that expires next year and whose author was married to the head of the journalism school. How could the LNM produce data so contrary to more regularly collected numbers? Its methodology was “crowdsourcing,” meaning that members of the public could post map data, which left it wide open to fiddling. Project staff defended the discrepancies in their research, saying “the data are at odds because they measure different phenomena.”

Its data were influential in the bailout, as an LNM study published in 2020 counted 92 mentions of its research in news reports before the bailout was announced in 2018, plus another 71 over the following year while it was being finalized. The study seemed to admit that, far from being neutral, its research had a policy purpose. “At a time when funders are increasingly demanding evidence that research dollars are well spent,” it said, “map data were incorporated into news and social media content that helped push the news industry’s problems onto the government’s policy-making agenda.”

LNM data are still taken as gospel by government in allocating hundreds of millions of dollars to news media. According to Heritage ministry staff, the federal estimate that more than 450 news outlets have closed since 2008, offered in tabling Bill C-18 (the Online News Act) last year, came from the LNM report of August 2021.

Junk science is the problem you get from research funded by interested parties, and in policy debates it should be taken with not just a grain of salt, but several kilos of it.

Marc Edge is a journalism researcher and author who lives in Ladysmith, BC. His books and articles can be found online at www.marcedge.com.
