There's a reason why humans find it hard to trust machines. It's called Algorithm Aversion.

Till Grusche
February 26, 2016

From Tesla’s Elon Musk and investor Peter Thiel backing “OpenAI,” a non-commercial research lab that studies the benefits and downsides of machine learning, to Mark Zuckerberg’s intention to build a fully-equipped personal assistant, major tech companies are heavily invested in the utopian vision of truly helpful artificial intelligence (AI).

But can they create machines that people trust? To explain the challenge, researchers point to a phenomenon called algorithm aversion: even when an algorithm consistently beats human judgment, people prefer to go with their gut—especially once they’ve seen the algorithm fail, even just once. Mistakes are human, but society does not grant the same leeway to a machine. So it seems we will have to reach a point where we recognize ourselves in the machine before we truly trust it.

Gregor Hochmuth, a New York–based data scientist who has worked with Google and Instagram, and artist Jonathan Harris showed that we’re not quite there yet. For a recent digital project called “Network Effect,” they crawled more than 10,000 videos, detected 100 everyday human movements—such as eating, swimming, or kissing—and connected them with queries in Google News and Twitter. “Network Effect” is an immersive infographic that users are encouraged to click through, with a seemingly endless number of layers that feels like a journey through a schizophrenic brain. Hochmuth and Harris’s initial intention was to show human life and create an empathetic library of everything humans do on the internet. But the result turned out to be the opposite: the data collection never felt warm or human.


We’re still in the early days of digital transformation. Tools will improve, and people’s algorithm aversion will most likely soften over time—especially when they realize that these algorithms could be helpful. As we already see today with digital-native youth, long-held beliefs can change rapidly and within one generation.

Meanwhile, technology brands can use distinct strategies to make people trust their services more. One of the most surprising—at least for an internet industry built around the idea of scaling everything—is to focus on things that don’t scale. Airbnb’s photographer program is one such example.

By investing in an analog process that sends professional photographers to hosts to create a new category of Airbnb-approved photography, the startup was able to build enormous trust with travelers. In line with this, Airbnb has developed a growth mantra: focus on 100 people who love you rather than 100,000 people who like you. Christopher Cederskog, the long-time regional manager for Airbnb in Germany and Central & South-Eastern Europe, explained it as the idea of fostering true engagement through activism and creating a human bond between a service and its users.
