Smart Design Means Treating Your Users Like Friends.

Trust and transparency are key to successful anticipatory experiences.

Gina Pensiero
August 3, 2015

It was a chilly evening in late 2013 and I was tired. After a red-eye from San Francisco and a long day at a conference, I was in a Park Slope bar, waiting for a friend. I sipped my beer and fiddled with my phone, mostly just to keep from falling asleep.

I opened my Google Now app. Its cheerful cards brought me up to speed on my own life, surfacing tomorrow’s calendar items, the current weather in Brooklyn and a quick route back to my hotel. But as I scrolled down, I felt a chill—the next card directed me back to my old Brooklyn apartment, a place I hadn’t lived in for nearly three years. “8 min to 50 Irving Pl, Brooklyn, NY 11238. Light traffic on Atlantic Ave,” Google helpfully noted.

At first, a wave of nostalgia washed over me. I missed living in New York, and as I sat alone in the corner of a dark bar, I felt an odd kinship with Google. Google gets it. Google knows what I’m going through right now.

It's undeniable: more data than ever before is being collected and made available to all kinds of entities, from businesses to governments. As technology improves, an appetite for personalization grows and a new generation of digital natives enters early adulthood. The landscape around data storage, privacy and digital intimacy is constantly evolving. 

Still, there is a creep line, the point at which personalized digital data goes from being a benefit to an uncomfortable and invasive experience for the user. As digital embeds itself more seamlessly into our lives and seeps into the background, designers, marketers and technologists will have to become comfortable walking the tightrope between consumer privacy concerns and business demands, striking a balance that keeps personalization useful.

Designing for Intimacy

The assumption has been that if users get enough value from providing personal details, they will continue to provide them, whether actively (volunteering personal information during onboarding) or passively (letting past search history improve their personalized results). But the public is also becoming more aware of the sophisticated tools being used to track and aggregate their personal data. These tactics have become ubiquitous, treated as the unpleasant but necessary cost of doing business on the Internet.

Many rebut the growing user concern over privacy with this data-for-value exchange argument. We see users saying one thing: “I’m concerned about privacy,” and doing another: “but I won’t change my habits because they provide convenience or value.”

Because users aren’t actively bailing, the prevailing attitude in the tech community has been one of apathy. But it’s time to think about the long game on this issue: the constant low-level sense of unease is bad for business, manifesting at best in distrust and at worst in negative consumer attitudes, reduced usage and stifled growth.

After examining the common causes of user unease, including ad retargeting, geo-tracking and targeting, wonky recommendation-engine algorithms and online behavioral tracking, we found that what often set people off was a digital property’s failure to grasp the principles of human empathy and intimacy. When Facebook recommends friending a recent ex-boyfriend’s new girlfriend, or serves up a “year in review”-style montage featuring a recently deceased family member, digital’s lack of grace really gets to us.

"As our ability to target and personalize becomes more sophisticated, we should be designing more sophisticated systems of intimacy into our products."

Humans tend to know better. So, if a lack of empathy is the problem, can we learn something from the principles of human intimacy and apply that to digital experiences to make them less creepy? 

Intimacy is achieved in a relationship that contains affection, trust and reciprocity. It follows a cadence that can be felt within blossoming friendships and good first dates. Personal information is traded and trust is built. For the most part, we know these things instinctively: reveal nothing while your date shares all, and you’ll close the door on growth. Reveal too much before finishing the first drink and you’ll scare them away. 

As our ability to target and personalize becomes more sophisticated, we should be designing more sophisticated systems of intimacy into our products. How can we, as strategists, designers and marketers working in digital, imbue the products we create with affection, trust and reciprocity? 

Affection means that the experience must be beneficial to the user.

An individual must feel that they get something out of receiving targeted information, and must understand that the intent behind serving it is benevolent. That understanding creates goodwill and affection rather than misgiving.

Trust means that the experience must be secure.

A user must trust that their information is safe and that its privacy is respected, both in terms of the organization’s internal policies on consumer-data usage and its protection against external threats such as hacking.

Reciprocity means that the experience must be equitable.

An individual must feel that the relationship is mutual and that there is equity in the information being exchanged.

OK, that’s all well and good. But what does this actually mean when we get into the nuts and bolts of the design process? Here are some more specific guidelines to keep in mind:

Respect the user.

Digital experiences should be built with an eye towards growing and maintaining relationships with users over time. Not every business needs to be friends with its customers, but friendship is a useful model to keep in mind when deciding whether or not to use personal data to communicate, target or enhance experiences. Just because you can do something with technology doesn’t mean you should. Smart brands need to examine whether the benefits of personalized recommendations are worth potentially creeping users out, and whether they justify the cost and technical effort involved.

Consider creepiness in the user journey.

When developing a user journey for the design, consider the use cases where unintentional creepiness might occur. If the user were to receive a particular message at work, how might that feel different than if they were at home eating dinner with their family, on vacation, or even in bed with their partner? Consumer mindset and user goals can greatly change the experience.
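To make that concrete, here is a minimal sketch in TypeScript of deferring sensitive, personalized messages when context signals suggest a bad moment. Everything in it is a hypothetical assumption for illustration: the categories, the context signals and the thresholds are invented, not drawn from any real product.

```typescript
// Hypothetical sketch: defer sensitive personalized messages when
// the user's inferred context suggests they would land badly.

type MessageCategory = "generic" | "location" | "relationship" | "health";

interface UserContext {
  localHour: number;          // 0-23, from the device clock
  atKnownWorkplace: boolean;  // inferred from a user-confirmed place, if any
  traveling: boolean;         // e.g., far from the user's usual region
}

// Categories we treat as sensitive enough to defer in risky contexts.
const SENSITIVE: MessageCategory[] = ["relationship", "health"];

function shouldDeliverNow(category: MessageCategory, ctx: UserContext): boolean {
  if (!SENSITIVE.includes(category)) return true;
  if (ctx.atKnownWorkplace) return false;                      // not at work
  if (ctx.localHour < 8 || ctx.localHour >= 22) return false;  // not late at night
  if (ctx.traveling) return false;                             // mindset changes on the road
  return true;
}
```

The point isn’t these particular rules; it’s that the delivery decision has a seam where context can be consulted at all.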

Make privacy policies easy to understand—and set.

In addition to the mandatory legalese, create a CliffsNotes version of your privacy policy that hits the most important points. Write it in plain language that’s easy to understand. It’s also important to grant easy access to privacy settings. Build trust by letting users decide with whom to share what information.
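One minimal way to model that choice, sketched in hypothetical TypeScript (the data categories and audience tiers here are invented for illustration):

```typescript
// Hypothetical model for user-settable sharing controls: each
// category of personal data gets its own audience setting.

type Audience = "only-me" | "friends" | "everyone";
type DataCategory = "location" | "birthday" | "contactInfo" | "activityHistory";
type PrivacySettings = Record<DataCategory, Audience>;

// Conservative defaults; the user can widen each one deliberately.
const defaults: PrivacySettings = {
  location: "only-me",
  birthday: "friends",
  contactInfo: "only-me",
  activityHistory: "only-me",
};

function canView(
  settings: PrivacySettings,
  category: DataCategory,
  viewerIsFriend: boolean
): boolean {
  const audience = settings[category];
  if (audience === "everyone") return true;
  if (audience === "friends") return viewerIsFriend;
  return false; // "only-me"
}
```

Defaulting every category to the most private setting leaves the act of widening an audience in the user’s hands, which is itself a trust signal.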

Provide a paid option.

Allow users to avoid sharing personal information and seeing unwanted targeted ads by giving them the option to pay a one-time or subscription fee. Media and entertainment companies and mobile applications have already leveraged this model successfully.
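In code, the gate can be as simple as the following sketch; the Plan type and the loadAdTargetingScripts() helper are invented stand-ins, assuming some script loader exists in the product:

```typescript
// Hypothetical sketch: third-party trackers and ad targeting load
// only for users who haven't paid to opt out of them.

type Plan = "free" | "paid";

interface Session {
  plan: Plan;
}

// Illustrative stand-in for whatever loads third-party trackers.
declare function loadAdTargetingScripts(): void;

function initTracking(session: Session): void {
  if (session.plan === "paid") {
    // Paying users have bought their way out of behavioral tracking:
    // load nothing beyond what the product strictly needs.
    return;
  }
  loadAdTargetingScripts();
}
```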

Be transparent.

When requesting personal information, tell the user why you’re asking for it, especially when the information will be used to provide a better experience for the user. Divulging intent helps to create trust and intimacy—and creates an environment where users feel more at ease with what they’re providing and why it will help them. 
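One way to bake that intent into the design itself is to attach a plain-language reason to every request for personal data, so the “why” ships with the form field. A hypothetical sketch; the field names and copy are invented for illustration:

```typescript
// Hypothetical sketch: every request for personal data carries a
// plain-language reason, shown alongside the input field.

interface DataRequest {
  field: string;     // e.g., "homeAirport"
  label: string;     // shown next to the input
  whyWeAsk: string;  // the user benefit, stated up front
  required: boolean;
}

const onboardingRequests: DataRequest[] = [
  {
    field: "homeAirport",
    label: "Home airport",
    whyWeAsk: "So we can show fares from where you actually fly.",
    required: false,
  },
  {
    field: "birthday",
    label: "Date of birth",
    whyWeAsk: "Some fares are age-restricted; we only use this to check eligibility.",
    required: false,
  },
];
```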

While AT&T retired its Yelp-like Buzz.com initiative, its registration and privacy policy still stands as an exemplar of transparency, user control, and readability. Registration required users to choose the degree to which identifying information is shared, and AT&T made it clear what types of information was shared and with whom. AT&T also worked to build user control features into the experience itself, rather than build a product then work with a team of lawyers to draft a privacy policy post hoc. 

Go beyond opt-out.

While most consumers don’t take active steps to hide their online behavior, organizations should give users greater control over their own data. Google and Facebook have been criticized for the breadth and depth of the data they collect on their users, but both offer users ways to exercise more control over how it is used. Facebook offers tiered options for what gets shared with groups of friends and the wider public. Likewise, Google lets users see, modify and download their web history.
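A minimal sketch of the “see and download” half of that idea, with an invented fetchRecordsFor() standing in for whatever persistence layer actually holds the data:

```typescript
// Hypothetical self-service export: let users see and download
// everything the product has stored about them.

interface StoredRecord {
  category: string;    // e.g., "searchHistory"
  collectedAt: string; // ISO 8601 timestamp
  value: unknown;
}

interface UserDataExport {
  userId: string;
  generatedAt: string;
  records: StoredRecord[];
}

// Illustrative stand-in for the persistence layer; not a real API.
declare function fetchRecordsFor(userId: string): Promise<StoredRecord[]>;

async function exportUserData(userId: string): Promise<UserDataExport> {
  const records = await fetchRecordsFor(userId);
  return { userId, generatedAt: new Date().toISOString(), records };
}
```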

Ease into self-disclosure.

Think about how trusted relationships develop in “the real world.” Provide and solicit disclosure following a similar cadence: begin by requesting and exposing less information, then build on the initial relationship. Another useful trick for making digital interactions feel more natural is to complement recommendation algorithms with human-curated lists, which reminds users that there are humans on the other side of the experience.
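To make the cadence concrete, here is a hypothetical progressive-profiling sketch; the stage thresholds and field names are invented for illustration, and a real product would tune them to its own relationship curve:

```typescript
// Hypothetical sketch of a disclosure cadence: ask for more
// personal detail only after the relationship has earned it.

interface Relationship {
  sessionsCompleted: number;     // rough proxy for trust built so far
  optionalFieldsShared: number;  // how many optional asks were answered
}

// Each stage unlocks a slightly more personal request, in order.
const disclosureStages = [
  { minSessions: 0, ask: "email" },
  { minSessions: 3, ask: "interests" },
  { minSessions: 10, ask: "location" },
];

function nextThingToAsk(rel: Relationship): string | null {
  const stage = disclosureStages[rel.optionalFieldsShared];
  if (!stage) return null; // nothing left to request
  return rel.sessionsCompleted >= stage.minSessions ? stage.ask : null;
}
```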

"As users share more of their personal data, digital properties often miss out on the nuances of human intimacy."

In many ways, our technology has already come to know us more broadly, more deeply and more honestly than our close friends do. It knows our shopping patterns, financial status, contacts and even the content of our communications. It knows what we search for late at night when we are sad or bored or lonely. It knows us at our most hopeful and our most vulnerable.

As users share more of their personal data, digital properties often miss out on the nuances of human intimacy. Technology has no social graces. It is not human, and it is not subtle. Even with access to all the data in the world, an experience spun from an algorithm can’t know when a user feels uncomfortable. Aggregating everything I’ve ever liked, everywhere I’ve ever been, everyone I’ve ever crossed paths with, well, that’s never going to add up to true intimacy. Intimacy is a subtler, softer, more human concern. It’s an integral part of a truly personalized experience.