Technology has created bad trust habits in all of us. We shouldn’t be tricked into giving tech our trust, but that’s exactly what happens when everything is about making life easier.

During this lockdown period, I’ve been thinking a lot about the difference between states and habits. Since the outbreak of COVID-19, we’ve all learned what proper hand washing looks like, how to reduce the risk of spreading disease when we’re in public and what kinds of places to avoid.

For many of us, health used to be a state that we enjoyed without having to develop too many of the habits that help guarantee that health. We’ve had all the benefit without the effort. However, we’re now recognising that if we want the best chance of maintaining our health in a time of uncertainty, we need to be intentional about the habits and behaviours we develop.

I think it’s helpful to think about trust in the same way. For many people and organisations, being trusted is a state: we want to be in a situation where people have high confidence in us. What we haven’t always done is think about the intentional practices, behaviours and habits that are likely to secure trust in times of crisis.

This is particularly true for technology and tech companies, which have enjoyed a disproportionately high level of trust for a simple reason: they make our lives easier. The convenience technology offers means we’re likely to keep engaging with it, even when there are very good reasons not to.

Take Uber, for example. Uber is highly reliable and very convenient, which means people are willing to get into cars with complete strangers. Their behaviour indicates they trust the service, even if they say they don’t (surveys suggest people rate taxi drivers as more trustworthy than Uber drivers). This kind of behavioural trust, born of convenience, holds even when people have very good reasons not to ride.

In 2016, in Kalamazoo, Michigan, Jason Dalton – an Uber driver – murdered six people whilst picking up fares on his nightly Uber shift. As news broke that a suspected murderer was collecting riders via Uber, people continued to use the service. One rider who caught a ride with Dalton (and thankfully survived) actually asked him, ‘you’re not the one who’s been driving around killing people, are you?’. Despite riders being aware of a real threat to life, the convenience of a cheap ride home was enough to secure their trust in Uber.

Of course, it’s only trust of a certain kind. The trust we confer on convenient technology isn’t genuine trust – where we rationally, consciously believe that our interests align with those of the tech developers and that they want to take care of us. It’s implied trust: whether or not we believe the technology will deliver, we act as though it will.

This is the kind of trust we show in large tech platforms like Facebook. A 2018 YouGov survey commissioned by the Huffington Post found 66% of Facebook users have little to no trust in the platform’s use of their data. Despite this, those users have given Facebook their data, and continue to do so: the kind of trust we extend when something convenient is on offer.

We cannot overstate the significance of convenience as a trust lubricant. Trust expert Rachel Botsman, author of Who Can You Trust?, argues that “Money is the currency of transactions. Trust is the currency of interactions.” We need to add another layer to this: trust is the currency of conscious interactions, but convenience is the currency of the unthinking consumer (and we are all, at times, unthinking consumers).

This generates some real challenges for tech companies. It’s easy to use convenience to secure behavioural trust – to be in the ‘state’ of trust with customers – so that they’ll use your services, hand over data or spend their money, without developing the habits that generate rational, genuine trust. It’s easier to be trusted than to be trustworthy, but it might also be less valuable in the long run.

Moreover, the tendency to reward the convenience-seeking part of ourselves might generate problems with a very long tail. Some problems are not easily solved: there is no app for wealth inequality, climate change or discrimination. Many of our problems require a willingness to persevere; whilst technology can help, and might ease the symptoms of some of these issues, addressing the underlying causes requires rethinking our social, political and economic beliefs. Technology alone cannot get us there.

And yet, we continually look to technology as a solution for these woes. The Australian government’s first response to climate change after the 2020 bushfires was a large-scale investment in new climate technologies. Several people have released ‘consent apps’, aimed at preventing rape and false rape claims by having people sign a waiver to confirm they’ve consented to sex. There is no app to solve misogyny; no one technology that will fix our approach to the environment.

The reality is, trading on convenience can make us lazy – not just as individuals, but as a society. It’s bad for us. Moreover, it’s bad for business.

Although people often make decisions based on convenience, they pass judgements based on trust. This means they can come to feel duped, exploited or betrayed, as though they were tricked into signing up to something just because it was convenient at the time.

This makes for a fickle customer and an unreliable basis on which to build a business. Recognising this, a number of successful organisations are now seeking to build genuine trust. Salesforce CEO Marc Benioff recently stated, “trust has to be the highest value in your company, and if it’s not, something bad is going to happen to you.”

However, for people to trust you, they need to be able to slow down, think and form an ethical judgement. Today, one of the major goals of technology is to be frictionless. Hopefully by now you can see why that’s an unwise goal. If you want people to genuinely trust you, you can’t give them a perfectly seamless experience. You need to create some friction.

Remember, convenience can be a lubricant. It might help you get people through the door more quickly, but it makes them slippery and hard to hold on to.

If you found this article interesting, download our paper, Ethical By Design: Principles for Good Technology, for free to further explore ethical tech. Learn the principles you need to consider when designing ethical technology, how to balance the intentions of design and use, and the rules of thumb that prevent ethical missteps. Understand how to break down some of the biggest challenges and explore a new way of thinking for creating purpose-based design.