Don't harm robots

HitchBOT was a cute hitchhiking robot, assembled from odds and ends, created as an experiment to see how humans respond to new technology. Two weeks into its journey across the United States, it was beheaded in an act of vandalism.

For most of its year-long “world tour” in 2015, the Wellington-boot-wearing robot was met with kindness, appearing in “selfies” with the people who picked it up by the side of the road and took it to football games and art galleries.

However, the destruction of HitchBOT points to a darker side of human psychology – where some people will act out their more violent and anti-social instincts on a piece of human-like technology.


A target for violence

Manufacturers of robots are well aware that their products can become targets, with plenty of reports of wilful damage. Here’s a brief rundown of the kinds of bullying humans have inflicted on our robotic counterparts in recent years.

  • The makers of a wheeled robot that delivers takeaway food in business parks reported that people kick or flip over the machines for no apparent reason.
  • Homeless people in the US threw a tarpaulin over a patrolling security robot in a carpark and smeared barbeque sauce over its lenses.
  • Google’s self-driving cars have been attacked.
  • Children in Japan have reportedly attacked robots in shopping malls, prompting their designers to write programs that help the machines steer clear of small children.
  • Less than 24 hours after its launch, Microsoft’s chatbot “Tay” had been corrupted into a racist by social media users who encouraged its antisocial pronouncements.

Researchers speculated to the Boston Globe that the motives for these attacks could be boredom or annoyance at how the technology was being used. When you look at those examples together, is it fair to ask whether we are becoming brutes?

Programming for human behaviour

While manufacturers want us to be kind to their robots, researchers are examining the ways human behaviour is changing in response to the use of technology.

Take the style of discourse on social media, for example. You don’t have to spend long on a Facebook or Twitter discussion before you are confronted with an example of written aggression.

“I think people’s communications skills have deteriorated enormously because of the digital age,” says Tania de Jong, founder and executive producer of the Creative Innovation summit, which will be held in Melbourne in April.

“It is like people slapping each other – slap, slap, slap. It is like common courtesies that we took for granted as human beings are being bypassed in some way.”

Clinical psychologist Louise Remond says words typed online are easily misinterpreted. “The verbal component is only 7 per cent of the whole message and the other components are the tone and the body language and those things you get from interacting with a person.”

The dark power of anonymity

The disinhibition of anonymity, where people will say things they would never utter if they knew they were being identified and observed, is another factor in poor online behaviour, says Remond, who works at the Kidman Centre in Sydney. “Even when people are identifiable, they sometimes lose sight of how many people can see their messages.”

Text messaging is abbreviated communication, says Dr Robyn Johns, Senior Lecturer in Human Resource Management at the University of Technology, Sydney. “So you lose that tone and the intention around it and it can come across as being quite coarse,” she says.

Is civility at risk?

If we are rude to machines, will we be rude to each other?

If you drop your usual polite attitude when dealing with a taxi-ordering chatbot, are you more likely to treat a human the same way? Possibly, says de Jong. The experience of call centre workers could be a bad omen: “A lot of people are rude to those workers, but polite to the people who work with them.”

“Perhaps there is a case to be made that we all need to be a lot more respectful,” says de Jong, who founded the non-profit Creativity Australia, which aims to unlock the creativity of employees.

“As a general rule, if we are going to act with integrity as whole human beings, we are not going to have different ways of talking to different things.”


The COO of “empathetic AI” company Sensum, Ben Bland, recently wrote that his company’s rule-of-thumb is to apply the principle of “don’t be a dick” to its interactions with AI.

“ … we should consider if being mean to machines will encourage us to become meaner people in general. But whether or not treating [digital personal assistant] Alexa like a disobedient slave will cause us to become bad neighbours, there’s a stickier aspect to this problem. What happens when AI is blended with ourselves?” he asks in a column published on Medium.com.

“With the adoption of tools such as intelligent prosthetics, the line between human and machine is increasingly blurry. We may have to consider the social consequences of every interaction, between both natural and artificial entities, because it might soon be difficult or unethical to tell the difference.”

Research Specialist at the MIT Media Lab, Dr Kate Darling, told CBC news in 2016 that research shows a relationship between people’s tendencies for empathy and the way they treat a robot.

“You know how it’s a red flag if your date is nice to you, but rude to the waiter? Maybe if your date is mean to Siri, you should not go on another date with that person.”

Research fellow at MIT Sloan School’s Center for Digital Business, Michael Schrage, has forecast that “ … being bad to bots will become professionally and socially taboo in tomorrow’s workplace”.

“When ‘deep learning’ devices emotionally resonate with their users, mistreating them feels less like breaking one’s mobile phone than kicking a kitten. The former earns a reprimand; the latter gets you fired,” he writes in the Harvard Business Review.

Need to practise human-to-human skills

Johns says we are approaching a “tipping point” where that online style of behaviour bleeds into face-to-face interactions.

“There seems to be a lot more discussion around people not being able to communicate face-to-face,” she says.

When she was consulting to a large fast food provider recently, managers told her they had trouble getting young workers to interact with older customers who wanted help with the automated ordering system.

“They [the workers] hate that. They don’t want to talk to anyone. They run and hide behind the counter,” says Johns, who holds a PhD and has a background in human resources.

The young workers vie for positions “behind the scenes” whereas, previously, the serving positions were the most sought-after.

Johns says she expects to see etiquette classes making a comeback as employers and universities take responsibility for training people to communicate clearly, confidently and politely.

“I see it with graduating students, those who are able to communicate and present well are the first to get the jobs,” she says.

We watch and learn

Remond specialises in dealing with young people – immersed in cyber worlds since a very young age – and says there is a human instinct to connect with others, but the skills have to be practised.

“There is an element of hardwiring in all of us to be empathetic and respond to social cues,” she says.

Young people can practise social skills in a variety of real-life environments, rather than merely absorbing the poor role models they find on reality television shows.

“There are a lot of other influences. We learn so much from the social modelling of other people. You can walk into a work environment and watch how other people interact with each other at lunchtime.”

Remond says employers should ensure people who work remotely have opportunities to reconnect face-to-face. “If you are part of a team, you are going to work at your best when you feel a genuine connection with these people and you feel like you trust them and you feel like you can engage with them.”


This article was originally written for The Ethics Alliance, a corporate membership program.
