We hear much about the value of fact-based evidence in swaying opinion, be it public or political. Organisations wanting to enhance their reputation or influence the shape of regulation often commission surveys or studies, then trumpet the findings.
But a recent Wall Street Journal piece explaining how companies are paying academics to produce research raised questions about this approach.
A straw poll of leading journalists covering EU policy confirmed, on background, that for media there are ‘facts’ and facts. Our small but influential sample shows journalists take a hard-nosed approach, digging beneath the PR surface to know who is behind the findings and whether they are truly ground-breaking.
Q: When you’re sent a press release on research findings, what checklist do you use to decide whether it’s credible and authoritative?
A: I don’t have a set checklist, but my first reflex is to check the source: Does it come from an official source, an institution? Is it a research institute or an academic source, a university? How authoritative are they? If it’s a private study, how independent is it? Most importantly, who financed it? If it’s a survey, I also check how large the sample was and how representative it is. The questions can sometimes be biased, so I briefly look into this as well.
A: I don’t have a checklist per se, but I check whether I know them, any university or company they are linked to, and whether it fits within the realm of my previous understanding.
A: I generally ignore press releases on research findings, as I assume they back the views of the company that sponsored them.
A: I’m often familiar with the sender. If that’s not the case and I’m interested in using the document, I’ll either double-check the organisation’s website to verify the release or call directly.
Q: What’s your general experience of such material? Any good or bad examples you can cite?
A: It can be good, but I’m also aware you can pay people to write nearly anything.
A: The recent Burson-Marsteller survey on media influence in Brussels is a good example of a highly questionable study. The sample was very small, not even representative of the Brussels bubble (which itself is not representative of anything). And the choice of media was also questionable: all of them were English-speaking publications. Not a single German or French newspaper was represented, which is kind of strange since the BBC and the FT are listed.
A: Findings are never surprising if you know the views of the sponsoring company, and I don’t have time to read every study that is sent to me.
A: Studies like public opinion polls from Pew are very helpful and can be used as information titbits in articles. Similarly, research looking into industries, trade patterns and economic developments, balances and imbalances is often helpful, such as the work from consulting firms like McKinsey and EY, or from think tanks like MERICS. The same goes for law firm papers on various issues, from trade deals to ECJ rulings and legislative proposals.
A: The vast majority of alleged research and studies are done by companies wanting to get their name, product or service out there. They want free publicity. Many just state the bleeding obvious and are backward-looking.
A: To be honest, these things always look like blatant PR, so I tend to avoid them. But they are good for starting conversations. That Durex global sex survey is always worth a click.
Q: What advice do you have to any organisation considering commissioning a study?
A: Pick a credible research organisation. In the energy/environment field, CE Delft and Ecofys come across as authoritative private research institutes or consultancies. Otherwise, there are plenty of official institutes and organisations at national or EU level which offer strong guarantees of independence.
A: You’re better off funding something at arm’s length and letting them publicise it.
A: Be transparent but don’t expect media to pick it up.
A: Make it informative, data-driven and accessible. Often, journalists will look for figures such as trade volumes or output for comparisons and background, and studies that present those details in an interesting package are useful.
A: Bin it, unless there’s something really new or major.
Q: What should PR people remember when presenting the findings to journalists?
A: PR will be taken for what it is: PR. Try to bring in other organisations to back your study or to comment on it. This will strengthen its credibility.
A: Focus on what is truly new. If there are useful background links, include them in emails.
A: That it’s mostly not news…
A: Try to tailor it directly to a journalist, looking for a specific angle that would interest her or him.
Q: Does your media outlet have a policy on citing such studies / research?
A: No, but we use them a lot.
A: We normally wouldn’t cite a study first, but I have done so in the past. The philosophy is usually to quote the decision-maker where possible.
A: We generally do not pick up studies unless they are from a governmental agency (WHO, EEA, etc.). Citing them in stories is fine; just apply the usual judgment (does the person have an axe to grind?) and say if it was commissioned by someone in particular.
A: If we use it, we cite it in full, including the organisation and, if available, author names. If there are big shots co-authoring the piece, even better. We use them on merit.
A: Basically, we have stopped covering studies and research unless they come from a major, independent institution that is saying something new.