In Facebook We Don’t Trust…But Do We Buy?


Posted by Neil Wilson on 5/21/12 5:33 AM

There’s been lots of discussion lately about Facebook’s recent IPO: skeptical reactions to their $104B valuation, questions about the long-term viability of their business model, and increased privacy concerns for Facebook users.

Many of the conversations revolve around Facebook’s use of personal information for more targeted advertising, and how that impacts ad revenues. Last year, Facebook earned $3.2B delivering ads globally, and advertising accounted for 82% of their total revenue last quarter. It keeps the lights on today, and it’s the main strategy for their future.

So, for Facebook to ultimately be worth $104B, they really need to up their advertising game. And that’s exactly what they plan to do.

Facebook has been collecting lots of data about you—some of which you intentionally provide (your username, birthday, cover photo, status updates, etc.), and some of which you may not even think about (the locations of your status updates, what your friends say about you in their posts, the music you like, every picture you post, etc.). It all goes into the algorithm to predict your likes and preferences—and what you might be willing to buy.

In fact, a couple of weeks ago, Facebook further refined their privacy policy to clarify some items, including some notable ones:

  1. Facebook is always tracking. “We use the information we receive to deliver ads and to make them more relevant to you. This includes all of the things you do and share on Facebook, such as the Pages you like or the key words from your stories, and the things we infer from your use of Facebook.” It all goes into the super algorithm to define what you might buy.
  2. Facebook is expanding beyond Facebook. They can use your data to deliver ads to you on Facebook (like they do now) and even on other websites.

Another interesting item in the 14-page Privacy Policy is the Ads + Social Context section. It basically states that Facebook can deliver ads that relate to a friend’s post. To use their example, “an ad for a sushi restaurant may be paired with a news story that one of your friends likes that restaurant’s Facebook page.” This is clearly part of their long-term strategy—the word-of-mouth effect. Any time you “like” something, they are free to use you as that thing’s brand ambassador to your friends.

This intense knowledge of 900M users makes Facebook well poised to create a more robust, expanded, targeted ad network. With the IPO, they will be more revenue hungry, so the targeted ads—especially the ones with social context—will only continue to grow in frequency and creativity. Arguably, having investors will also make Facebook more accountable and attuned to privacy issues, so in theory, using this data to target ads should be done responsibly.

The problem, though, is that Facebook hasn’t always been above board in how they use your data. They’re up front now, because they have to be, but there is a backlash from people who didn’t know their data was already being used that way. It’s no surprise, then, that there’s a general trust deficit with Facebook. It’s only been six months since the FTC slapped their wrist for privacy violations.

Several surveys have been released in the past weeks that highlight this lack of trust. In one survey, 59% of Facebook users said they trust Facebook “only a little” or “not at all” to keep their personal information private. Only 13% trust Facebook “completely” or “a lot.”

But, how much does trust matter? 900M users are active on Facebook, and how many of them have even read the privacy policies? If people care about trust, does it impact their behavior?

I think it matters a lot when consumers are considering a purchase. As we’ve been saying at UnboundID, Trust + Knowledge = Loyalty.

You earn a customer’s trust by being transparent, empowering them to choose how their data is shared, and ideally giving them value for it (e.g., opt-in programs or membership cards). A company can then convert that data into knowledge to improve the customer experience. That’s good for everybody. Facebook says they also use your data to make Facebook a better experience, but the Facebook user never explicitly opted in to enable this benefit.

Targeted services can ultimately be good for users—they are convenient, make choices easier, and add value to our experiences. For example, I trust my mobile carrier. They have a lot of data about me—geolocation data (where I live and where I frequently go), billing, customer service, the apps I use, the phone numbers I call, etc. That data is currently static. But let’s say I gave my carrier permission to create an app that could order my food while I’m traveling. If I’m in a hurry at the airport, they could connect the dots with my data and, with my permission, order food so I can make my next flight. My data would then be able to help me.

This is where the rubber meets the road for Facebook. If consumers don’t ultimately buy the advertised items, advertisers will eventually stop investing in ads. According to the AP-CNBC poll, this is troubling for Facebook because “only 12 percent would feel safe making purchases through the site. Even Facebook’s most dedicated users are wary — half of those who use the site daily say they wouldn’t feel safe buying things on the network.”

As a medium for viewing ads that lead to purchases on other sites, is Facebook effective? Most users suggest not: “57 percent of users say they never click on ads or on Facebook’s sponsored content. About another quarter say they rarely do.” I personally have clicked on only one ad since I joined Facebook seven years ago.

GM made a timely announcement last week that they are pulling $10M in advertising because Facebook ads just weren’t paying off. That may be an isolated incident, but it all comes down to customers’ willingness to buy through the Facebook medium.

Facebook has yet to unleash the full impact of their ad network, so the situation will likely change soon. But will that change consumers’ reluctance to purchase, or even to view ads?

It’s a good thing for Mark Zuckerberg that Facebook wasn’t valued on trust. But, maybe that should have been part of the equation…because trust is a factor for generating revenue.


Topics: Privacy and Preference Management