The trust gap: what happens when we defy the algorithms?
It seems anyone and anything – from the most powerful people in the world to the algorithms that control our social media feeds – has been playing fast and loose with the facts and our data, leaving us wondering who and what to believe. How much is technology contributing, and can we regain control of our digital lives?
We trust what (and who) we know
According to this year’s Edelman Trust Barometer, our faith in major institutions is at an all-time low and most of us feel that ‘the system’ is failing us. The picture’s even bleaker when you look at the results from Edelman’s survey of 16–18-year-olds. This generation don’t trust the government and, perhaps surprisingly, they think that the pace of digital innovation is too fast. They see family, friends and their own generation as most trustworthy when it comes to protecting their future interests.
Being reminded of people’s intrinsic trust in those closest to them takes me back to my BBC Local Radio training nearly 30 years ago. As a rookie broadcaster, I was encouraged to address ‘the listener’, to think of one person tuning in, not hundreds. Some of us even went to the extent of drawing a face on a fire extinguisher in the corner of the studio and talking to it, as though it were our oldest friend! It was widely understood that the more intimate our dialogue with the listener, the more appealing, engaging and relevant our stories would be. Beyond that, and along with almost all local TV, radio and newspapers at the time, 95% of our stories were sourced from within 40 miles of the newsroom. We focused more or less exclusively on our own patch. What happened in London felt irrelevant, a million miles away.
There are proven psychological roots for our interest in events that happen in the places where we live, and for trusting stories that we hear from our nearest and dearest. The messenger is arguably just as important as the message. But in the past 20 years, the way that information is communicated has changed beyond recognition. For most of that time, local newspapers have been in serious decline. Sharply falling sales have led to a slump in advertising revenues, and where local papers still exist, there is an increasing reliance on syndicated copy at the expense of original, local journalism. The same is true in the USA, argued Kathleen McLaughlin in the Guardian this January. “In the wake of the most divisive presidential election in recent memory, and the midst of many hand-wringing treatises on the state of journalism,” she writes, “we’ve somehow overlooked what happened with local news, the place where most Americans used to get the bulk of their information. The scaffolding of American journalism, a basic bulwark in our apparently delicate system, is crumbling.”
Of course, many of the same regional and local papers that are no longer popular on the newsstand have burgeoning online audiences and are supported by advertising. But when clicks equal advertising revenue, there’s a natural cause-and-effect in the stories that make it online – one that has given rise to the idea of clickbait. This is defined by Wikipedia as “web content that is aimed at generating online advertising revenue, especially at the expense of quality or accuracy, relying on sensationalist headlines or eye-catching thumbnail pictures to attract click-throughs.”
Our lives online
Today we receive more of our news and information from online sources that are outside the mainstream. Beyond the familiar voice of the local paper or radio station, we’re being introduced to people and publications online that sound like us, and reflect our interests and views. Social networks are using increasingly sophisticated algorithms to ‘ape’ the kinds of behaviour that we would expect to see and hear from those closest to us. They know what we’re interested in, and what we’re not. They know where we live, whether we’re fat or thin, young or old, and they will supply us with content and information that seems to follow our interests and match our needs. The material often addresses us in a friendly ‘voice’, and sometimes with a nice regional accent to make us feel at home. While most of us are aware that these algorithms are put to work in the interests of online advertising, it’s perhaps less well understood that they are also applied to user-generated and editorial content.
The social media echo chamber
It could be argued that this is simply about understanding the audience (or customer) and tailoring messages to suit them. Yet now it looks like there’s something else at play that goes beyond the desire to merely tailor the message to the recipient. Filtering (by algorithms) means it’s quite possible that we might never see or hear things online that contradict our own world view. This has been called the social media echo chamber and it creates the possibility of a world where people are no more than targets to be hit, votes to be won and money to be made.
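To make the echo-chamber effect concrete, here is a toy sketch in Python of the kind of engagement-driven filtering described above. It is purely illustrative – the headlines, topics, interest scores and cutoff are all invented, and no real platform’s algorithm is this simple – but it shows how ranking purely by inferred interest can mean a contradictory view never reaches the feed at all.

```python
# Toy illustration of algorithmic filtering: rank items by how well they
# match a user's inferred interests, and surface only the best matches.
# All items, topics and scores below are invented for this sketch.

def build_feed(items, interests, limit=3):
    """Return the headlines that best match the user's inferred interests."""
    def relevance(item):
        # Score an item by summing the user's interest in each of its topics.
        return sum(interests.get(topic, 0) for topic in item["topics"])
    ranked = sorted(items, key=relevance, reverse=True)
    return [item["headline"] for item in ranked[:limit]]

items = [
    {"headline": "Local team wins cup", "topics": ["sport", "local"]},
    {"headline": "Council debates budget", "topics": ["politics", "local"]},
    {"headline": "Opposing view on policy X", "topics": ["politics", "opposing"]},
    {"headline": "Celebrity diet secrets", "topics": ["lifestyle"]},
]

# A profile inferred from past clicks: note there is no score at all
# for 'opposing' topics, so they always rank last.
interests = {"sport": 0.9, "local": 0.7, "lifestyle": 0.4}

print(build_feed(items, interests))
# -> ['Local team wins cup', 'Council debates budget', 'Celebrity diet secrets']
# The opposing view scores zero, sinks to the bottom and is never shown:
# the echo chamber emerges from the ranking itself, not from any explicit ban.
```

Nothing in this sketch sets out to censor; the filtering is simply a by-product of optimising for what the user already engages with.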
So far, so utterly depressing. Yet if Edelman’s Trust Barometer is to be believed, perhaps more and more of us are seeing these effects for what they are. If companies, governments, the media and NGOs were doing a good job of manufacturing the authentic, familial voices that we trust, of understanding people, and of drowning out alternative points of view, surely we’d be trusting them more, not less? Perhaps we actually feel manipulated, misled and lied to. As in The Truman Show, you don’t need to row too long before you hit the edge of this particular set!
Reasons to be cheerful
Personally, I see evidence of change in the air. I wholeheartedly agree with Sir Tim Berners-Lee who, writing in the Guardian on 11 March, said that we need to take action to protect the web and our own personal data against abuse. “It’s too easy to spread misinformation on the web”, he says. Tim is President and co-founder of the Open Data Institute (ODI), who have been our client since 2012 and have never wavered in their mission to promote data literacy, education and awareness. We can expect to see a redoubling of their efforts in this space during 2017. Slowly but surely, I think it’s being acknowledged that too much control over our data has been placed in the hands of a few, very powerful interests, and that it’s time to take back agency over our information and how it is used. Even the social media behemoths are beginning to acknowledge that they need to take some ownership of the problem, with Facebook recently announcing it has introduced new features to help users spot fake news online.
Among our wider clients at Thwaites, we’re seeing a new, healthier relationship with the digital world emerging and a better understanding of data. Online tools might provide the mechanisms for communicating effectively with lots of people but they don’t replace relationships and networks. Increasingly, we’re being approached by organisations that want our help in making authentic connections with their stakeholders and customers both on and offline. We’ll share more about how we’re doing this in a forthcoming blog.
I feel that until recently, we have been swept along on a tide of data-driven innovation where the knowledge and power to manage the world that’s being created has been held by a very few individuals and organisations. Now, while we might not know exactly how the algorithms work, we’re increasingly aware that they are being put to work. And we’re starting to question just how much of what we’re receiving is filtered, designed to push our buttons, and playing to our human inclination to believe what seems familiar and relevant. It’s a leap from where we are now to fully reclaiming agency over our online lives and realising Tim Berners-Lee’s vision of a web ‘for everyone’, but I feel we’re close to being on the right track. Maybe, just maybe, when Christof, the all-controlling director in The Truman Show, says “we accept the reality of the world with which we’re presented. It’s as simple as that”, he might have been wrong after all.
If you’re intrigued about what this means for your audience and how you can explore more authentic connections, drop us a line.