THE TRUST ECONOMY
Technology has transformed the way we live our lives. Whether it’s buying groceries online, getting into cars with people we don’t know, or holidaying in the homes of people we have never met, the internet has opened up a world of opportunity for everyone.
It has also created a new currency that goes beyond hard cash: trust.
As Carolyn Jameson, chief trust officer at online review platform Trustpilot, says: “People are now more accepting of doing things and are a bit braver online. If you consider some of the companies that didn't exist many years ago, like Airbnb or Uber, it's a huge confidence and trust leap to go and get into a taxi or stay with somebody you don't know.”
Back in pre-internet times, businesses gained trust partly from the quality of the goods or services they provided and partly from working hard to ensure their physical environments and personnel carried an air of trustworthiness.
Indeed, Graeme Jones, chairman and non-executive director of new SME bank AlbaCo Ltd, says that when he started work in the 1980s, staff in the traditional high street banks and building societies were often seen as both trusted advisers and active members of the local community.
“We would meet our customers in the branches, high street or in their homes and became an integral part of the local community. We would build lasting trust, and indeed friendships, over many years,” he says. “Branch buildings were also visible and substantive, which helped to build a sense of permanence, security and trust in the brand, further added to by the known and trusted staff working within the building. Staff were often volunteers for community charities and organisations, all leading to trust in both the individuals and the institutions.”
In the online world today that traditional approach has all but evaporated, with some notable exceptions. Trust therefore needs to be built differently when there is no building to visit and no actual person to interact with. In the online world, trust is in a sense much simpler to define but also much easier to lose. As Jones says: “If your proposition is going to be completely online, it needs to work 24/7 and be easy to use. Clearly there has to be trust for people to choose to bank online, but it's so easily lost should there be outages in online services, data leaks compromising privacy, or a service that doesn't quite perform as expected. In the online world it doesn't take an awful lot for that trust to be badly damaged or indeed lost.”
Financial institutions have worked hard to ensure their technology works in a way that earns their customers’ trust, but the flipside is that as fintech has become more sophisticated, so too have the people involved in the kind of fraudulent activity that erodes that trust. James Darbyshire, chief counsel at the Financial Services Compensation Scheme (FSCS), notes that online fraud has “mushroomed” to the point that, of the £2bn lost to fraud last year, 80 per cent was lost in online scams. Some of that fraud is conducted via websites designed to replicate those of genuine financial institutions, some via sites that gain trust by fraudulently claiming to be FSCS protected, and some, as Darbyshire says, by “scammers pretending to be your bank and getting access to your computer as a result of talking to you about something happening with your bank account”.
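To make the lookalike-website problem concrete, here is a minimal sketch of one signal a fraud-detection team might compute: how closely a candidate domain's spelling resembles a genuine institution's. The domain names and the 0.7 threshold below are hypothetical, chosen purely for illustration; real systems combine many more signals than edit similarity alone.

```python
# Illustrative sketch: flag domains whose spelling closely mimics a
# genuine institution's domain. All names and thresholds are invented.
from difflib import SequenceMatcher

GENUINE_DOMAINS = ["examplebank.co.uk", "fscs.org.uk"]

def most_similar(candidate: str) -> tuple[str, float]:
    """Return the genuine domain the candidate most resembles and a
    0-to-1 similarity ratio."""
    ratios = {d: SequenceMatcher(None, candidate, d).ratio()
              for d in GENUINE_DOMAINS}
    best = max(ratios, key=ratios.get)
    return best, ratios[best]

domain, score = most_similar("examp1ebank-secure.co.uk")
if score > 0.7:  # arbitrary illustrative threshold
    print(f"Possible lookalike of {domain} (similarity {score:.2f})")
```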
“It’s a big, big challenge because financial sophistication and literacy is still pretty low in this country,” he says. “There’s this tension between giving consumers choice, but also whether you can really protect them, or alternatively whether they should be responsible for their own decisions. That balance, I think, is always tricky. My view is that you need to find a path where consumers are able to make informed choices (through a combination of both what regulated firms should tell them, and their own understanding of financial services products), and therefore reduce the chances of consumer harm as a result.”
It's an attention economy, yet these platforms have become so integrated into our lives we're torn by this dichotomy between having to trust them because we need them, but also simultaneously thinking of them as toxic.
IAN STEVENSON - CHIEF EXECUTIVE, CYACOMB
Trust in the financial context is related to something tangible: if an organisation shows it is not just looking after its customers’ assets but is looking after them well, then it is likely to earn their trust. Elsewhere in the online world things can get far more complicated. Ian Stevenson, chief executive of forensic software firm Cyacomb, says people are, whether justifiably or not, becoming increasingly distrustful of all online operators because “they are having experiences online that they find deeply unpleasant”.
“There are some really complicated issues around online safety that relate to freedom of speech and ensuring users have adequate privacy, but people are having these negative experiences across a wide range of different platforms. Generally there is a sense from most people that they’ve been kind of abandoned,” he says. “How can we have faith that the services we're using are working for us rather than against us? There is a statement that goes something along the lines of ‘if you're not paying for a service, then you are the product’. If you’re using Gmail, for example, you know all sorts of preference data is being extracted from your emails. It is perhaps anonymised, but nonetheless it is used to help target advertising, understand demographics, behaviours, interests and so on. If you’re using Facebook, you know it's free, but your attention is the product. It’s the same with Instagram, the same with TikTok. It's an attention economy, yet these platforms have become so integrated into our lives we're torn by this dichotomy between having to trust them because we need them, but also simultaneously thinking of them as toxic.”
All businesses are now viewed as technology businesses to some degree, whether because they have a website through which to interface with their customers or because their entire proposition is delivered online. That is a relatively new phenomenon, though, and as a result there is only a limited amount of legislation designed specifically to deal with online harms. The European Union's Digital Services Act, which aims to tighten up the regulation of illegal content, advertising and misinformation, came into force last year, while in the UK the government last year introduced its Online Safety Bill and is expected this year to bring forward its planned Digital Markets, Competition and Consumers Bill.
The former, which is currently making its way through the House of Lords, seeks to address concerns about the way tech businesses handle harmful content and aims to ensure that users are safe when using online services, though concerns have been raised that its scope has already been watered down after a requirement for big tech companies to take down “legal but harmful” content was dropped. The latter, which the government consulted on in 2021, will increase the CMA's powers to regulate the largest tech companies with “strategic market status” and is expected to significantly strengthen consumer protections.
Stevenson notes that campaigners such as the parents of 14-year-old Molly Russell, who took her own life after viewing images of suicide and self-harm online, are unlikely to be satisfied with the final form the Online Safety Bill takes. On the other side, tech businesses are advocating for the legislation to be as light-touch as possible, meaning a middle ground will have to be found. “Designing good legislation means basically making everybody slightly unhappy,” Stevenson says. “There is no bill that can possibly get passed in the UK parliament that will both satisfy Molly Russell's parents that enough is being done and simultaneously be light-touch enough for some of the big tech companies. Success for the bill probably means trying to leave everybody only slightly unhappy instead of some people monstrously unhappy.”
Callum Sinclair, partner and head of the technology and commercial division at Burness Paull, observes that “there has always been a challenge for regulators in keeping up with the pace of change in technology, and in understanding the fundamentals and risks of new technology so that they can make ‘good law’. That is not getting any easier, but it is critically important: carefully considered regulation, appropriately enforced, provides an essential foundation upon which trust can be built.”
AI tends to be sensationalised… The truth is, it's been with us for decades. Many people think of it as like Skynet from the Terminator franchise and machines wreaking destruction, but fundamentally AI is just a tool that makes inferences from large datasets.
ERIC GROUSE - VICE PRESIDENT & DEPUTY GENERAL COUNSEL FOR EMEA AT SONY INTERACTIVE ENTERTAINMENT (SIE)
Without such laws being in place, organisations have had to find their own ways to protect customers and build trust. Eric Grouse, vice president & deputy general counsel for EMEA at Sony Interactive Entertainment (SIE), makers of the PlayStation, says the number one priority for an organisation such as his is to earn the trust of customers by ensuring “every player has a safe, fun experience”. With video games now frequently incorporating multiplayer and interactive experiences, ensuring player interactions are “safe and positive” is key.
“We set clear policies and provide proactive tools for all users, but especially for parents to configure to control their children's video gaming experience,” he says. “We've been focused on the player experience and enhancing our capabilities for years, since prior to the EU Digital Services Act and the Online Safety Bill. These issues have been very much in the public eye and legislated on for the last two or three years, but prior to that for many years we've been increasingly improving and building our functionality to ensure that our users can play in a safe and fun manner.”
As the government has discovered during the passage of the Online Safety Bill, it is a fine balance to strike. Grouse says SIE earns the confidence of its customers through its longstanding commitment to creating safe and respectful spaces for players. It has done this by adopting policies and practices to protect players from illegal and harmful content and harassment, including a Community Code of Conduct, a Hate Speech Policy and PlayStation Network Rules that make clear the consequences of violations; by empowering players and parents to understand and control their video gaming experience through tools such as parental controls; and by continually investing in technology to help thwart improper conduct and content before a player comes to harm.
“SIE uses sophisticated communication filtering technology to proactively protect players, reporting mechanisms proximate to where activity takes place that allow players to report inappropriate content or conduct on the PlayStation Network, and human moderators to review each report that is submitted, for example, a report of someone bullying a player in a voice chat,” he says. “We're very clear in our terms that we have standards of behaviour, and we monitor our services to ensure those standards of behaviour are met using certain automated tools, reporting functionality and moderation. If it's against our guidelines there are penalties. When a violation is found, SIE implements an appropriate sanction and notifies the offending player. SIE utilises graduated sanctions for players who violate PSN rules, ranging from written warnings to console suspensions, depending on the nature of the prohibited conduct and the player’s past moderation history.”
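The graduated sanctions Grouse describes amount to an escalation ladder that weighs both the severity of the conduct and the player's moderation history. The sketch below is one illustrative reading of that description; the sanction names, severity rule and data structures are invented, not SIE's actual systems.

```python
# A hypothetical escalation ladder, based only on the description
# above; sanction names and the severity rule are invented.
from dataclasses import dataclass

LADDER = ["written warning", "temporary suspension", "console suspension"]

@dataclass
class ModerationHistory:
    player_id: str
    past_violations: int = 0

def apply_sanction(history: ModerationHistory, severe: bool) -> str:
    """Escalate with each repeat violation; in this sketch, severe
    conduct jumps straight to the harshest sanction."""
    if severe:
        step = len(LADDER) - 1
    else:
        step = min(history.past_violations, len(LADDER) - 1)
    history.past_violations += 1
    return LADDER[step]

player = ModerationHistory("player_123")
print(apply_sanction(player, severe=False))  # written warning
print(apply_sanction(player, severe=False))  # temporary suspension
```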
Some of that monitoring is done by a team of humans, some by artificial intelligence (AI) and machine learning, something Grouse says can be disconcerting for the public because there is a fundamental misunderstanding about what AI actually is. “AI tends to be sensationalised,” he says. “The truth is, it's been with us for decades. Many people think of it as like Skynet from the Terminator franchise and machines wreaking destruction, but fundamentally AI is just a tool that makes inferences from large datasets.”
For Jameson at Trustpilot, AI has an important role to play in helping identify which reviews uploaded to its site are real and which are fake. “We partner with a company that looks at behavioural factors and can draw connections between the businesses buying reviews and the people writing those reviews — even where they're trying to disguise themselves — and that can draw out these networks. That's been hugely effective because review sellers are a big problem. […] We use artificial intelligence and machine learning to train our detection models, which are run over every review.” For a business whose entire purpose is helping other businesses gain the trust of consumers, that is key. As Jameson says: “It's really important that consumers understand what they're looking at.”
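The network detection Jameson describes can be pictured with a toy example: if the same reviewer accounts keep appearing across otherwise unrelated businesses, those shared accounts link the businesses into one suspicious cluster. The sketch below uses invented data and an arbitrary rule of two shared reviewers; it illustrates the general idea only and is not Trustpilot's actual model.

```python
# Toy illustration of pulling out a "review network": businesses that
# share suspiciously many reviewer accounts get linked, then clusters
# of linked businesses are walked out of the graph. All data invented.
from collections import defaultdict
from itertools import combinations

# business -> reviewer accounts seen on its page (hypothetical)
reviews = {
    "shop_a": {"rev_1", "rev_2", "rev_3"},
    "shop_b": {"rev_2", "rev_3", "rev_9"},
    "shop_c": {"rev_7"},
}

# Link any two businesses sharing at least two reviewer accounts.
links = defaultdict(set)
for b1, b2 in combinations(reviews, 2):
    if len(reviews[b1] & reviews[b2]) >= 2:
        links[b1].add(b2)
        links[b2].add(b1)

def cluster(start: str) -> set[str]:
    """Depth-first walk collecting every business linked to `start`."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(links[node])
    return seen

print(cluster("shop_a"))  # {'shop_a', 'shop_b'}: a candidate network
```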
Jameson stresses the important role organisations such as Trustpilot play in ensuring the public can trust the information they see online, especially as more and more people have become comfortable living their lives in a tech-enabled way. “That can lead them into situations where they don't always think to check and they can get into a situation where something bad happens,” she says. However, she warns that, if too much focus is put on the downsides technology can bring, it risks eclipsing its many positives.
“One of the negative things we’re dealing with now is this sentiment that all tech companies are bad,” she says. “That slightly troubles me because they're not all bad, and actually there are some that are capable of doing some very good things. You see regulators at the minute who are really focused on looking at online platforms, but if we just continue with this negativity I'm not sure it will help any of us as consumers or society.”
Sinclair from Burness Paull agrees: “A balanced and informed public discourse about the trust economy we now live in will be a very important element in tackling some of society’s major challenges over the coming decade. Educating our citizens has to start from a very young age – as anyone with children will know, toddlers very quickly become accomplished users of tablet devices!
“Building products and services with not only security and privacy by design, but ethics and transparency by design, will be core to this. In an increasingly complex world, well-designed regulation will help to foster and improve trust in technology businesses of all flavours and provide a solid foundation for sustainable growth. Legal advisers must play their part in supporting their clients to adopt these principles at every stage of their business.”
As a digitally native legal firm, we give thoughtful and precisely informed advice, whether your business creates, sells or is enabled by technology.
To discuss how to ensure you have the right assurances and protections in place to win earned trust and benefit from growth in the trust economy, get in touch. We’d love to have a conversation.
CALLUM SINCLAIR - HEAD OF TECHNOLOGY & COMMERCIAL
callum.sinclair@burnesspaull.com
+44 (0)141 273 6882
+44 (0)7391 405 414