Balancing User Privacy and Data Monetization

I don’t remember the last time I signed up for a service expecting to pay for it. There might have been a freemium component involved (use the basic service for free, receive constant offers to upgrade), but it certainly did not require payment up front. The assumption was pretty evident – they were going to use my data to generate revenue and, hopefully, make my experience better.

As the number of connected individuals increases, so does the volume of data and the need to make sense of it. In parallel, data-collection-as-a-revenue-stream (DCaaRS – see, we can make up acronyms too) becomes more popular as well.

After all, in our current digital age, services (like Facebook, LinkedIn, etc.) seem free, but we’re paying with our data. Users understand this and are right to be a little hesitant. How do any of us know that a particular app maker is ethically respecting our data? 

“How do any of us know that a particular app maker is ethically respecting our data?”

In truth, companies have been packaging and selling consumer information since at least the 1960s. Acxiom was one of the first: it anonymizes personally identifiable information and sells it in bundles to anybody with a credit card. App makers, publishers, game developers – really everybody except the brands themselves – have begun to rely on the same model: collect as much information as possible about a user, de-personalize the data, package it, and sell it to the highest bidder, usually a data broker. Sound creepy? These are the same companies under congressional investigation.

Privacy is not a nice-to-have anymore – it’s a must. Users will continue to make decisions that serve their best interests. The rise of anonymity apps, the shift toward privacy on Instagram, and the backlash over Facebook’s News Feed experiment all show that we expect a certain degree of privacy in our online decision making, and big data, no matter how profitable, shouldn’t undermine our personal agency.

But, we all also understand the need for businesses to sustain themselves, which means that this is about striking a balance between the profitability of data at scale and our collective basic expectations of privacy. 

The best way to strike that balance? Make sure that any data you collect is only used to improve my experience. Instead of talking about experience in the general sense (mostly “better” ads), let’s talk about tangible things a business can do to improve how I interact with it. To begin, let’s just call big data the CRM of the future.

Customer Experience That Makes a Difference

Let’s say you own a shopping application and ask for my birthday. Make sure you send me a coupon for a product on or around my birthday. You get valuable demographic data, and I receive a discount. It’s a fair trade.

You ask to know where I live? Make sure you don’t send me terrible ad network ads allegedly based on my location – do something with it. I don’t even mind if you build an internal ad network that lets advertisers target by location. Filter out the advertisers that don’t make sense for your brand and include only the ones that do, like HEB or Whole Foods if I’m in Austin. Provide more value to your advertisers and more relevance to me. That’s a fair trade.

You get the idea – don’t sell my data for the sake of selling data. Use it to give me something back, and I’ll stick with you. 

“Don’t sell my data for the sake of selling data. Use it to give me something back, and I’ll stick with you.”

See, the balancing act isn’t all that difficult. In fact, it only requires two things: 

  1. Respect my information, not because you have to, but because you should. Don’t pawn user data off to whoever will pay the most or allegedly give you the most additional data in exchange. Know that you and I have a contract to use this information, and I don’t expect you to strip my name out of it, bundle it with data about other people, and sell it. That wasn’t part of the deal.
  2. Use my information to improve my experience. Forget the mythical marketing notion of “1:1 marketing at scale at the right place, right time, and the right emotional state.” It’s acceptable to speak with consumers as segments or groups who share common characteristics. This alleviates the strain on your organization, prevents the purchase of yet another analytics tool, keeps the ad networks out, and lets you take a step back to truly understand what people want. Build your strategy around that.

At the end of the day, data collection is fine as long as the information is used productively. There is a delicate balance between collecting as much information as possible to sell and using that information to meaningfully improve a person’s experience. Don’t be creepy and sell data. Your users may have agreed to broad terms of service, but they certainly did not expect their information to simply be sold. If you are going to monetize it, offer something back to the user, and for the sake of everyone, don’t just say “experience” and then fail to deliver. Too many companies are already playing that game, and believe it when I write it: it backfires.