It Wasn’t Me – Who Is To Blame When Data Goes Wrong?

When there is a data breach, or an algorithm spews out less-than-favorable results, who is the first person to blame? Typically, it's the CEO.

Even though data collection is now a routine part of business strategy, few companies have considered their own ethical boundaries – at least not until it's too late. Take Target, for example, whose breach compromised over 40 million records and eventually resulted in the resignation of its CEO.

Whether it's data breaches or discriminatory algorithms, there is an obvious need for a defined role solely dedicated to navigating this murky, uncharted space – and it goes far beyond the security officer required for compliance.

When consumers find out that you're a "data-driven" company, there is a chance they will delete your app, cancel their subscription or block all of your ads. So what can companies do to be seen favorably by their customers and regarded as responsible stewards of data?

1.) Strike the right balance between being ethical and profitable.

Marketers need to be clear on how much data they need to be successful, without crossing ethical lines to get it. Are 100 data points on a person enough to scale your marketing efforts, or do you need 1,000 "just in case" – and are you willing to do anything to collect them? The onus is on marketers to be extremely clear about what data is necessary, what the most ethical ways to collect it are, and what data is dispensable – a distinction you can even encode explicitly in your collection pipeline (as sketched below).

Jeff Tanner, Marketing Professor at Baylor University, encourages marketers to develop trust first and to keep monitoring that trust as they collect more information. As trust with customers deepens, Tanner suggests going further and asking them for certain psychographic data points, like details about their personalities, values and lifestyles.
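
One hypothetical way to make "what data is necessary" concrete is an explicit allow-list of approved fields, with everything else dropped before it ever reaches the marketing stack. The field names below are purely illustrative assumptions, not a prescription:

```python
# Hypothetical data-minimization filter: only fields the team has explicitly
# justified ("approved") are kept; everything else is dropped at collection time.

APPROVED_FIELDS = {"email", "zip_code", "purchase_history"}  # illustrative only

def minimize(record: dict) -> dict:
    """Return only the approved subset of a collected customer record."""
    dropped = [field for field in record if field not in APPROVED_FIELDS]
    if dropped:
        print(f"Dropping unapproved fields: {dropped}")
    return {k: v for k, v in record.items() if k in APPROVED_FIELDS}

# Example: the raw record carries more than the campaign actually needs.
raw = {"email": "jane@example.com", "zip_code": "97201", "precise_location": "45.5,-122.6"}
print(minimize(raw))  # {'email': 'jane@example.com', 'zip_code': '97201'}
```

The design choice is simple: if a field isn't on the approved list, a marketer has to argue for adding it, rather than collecting first and justifying later.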

2.) Set up company guidelines clearly stating the ‘right’ and ‘wrong’ ways to use data.

Having access to so much information can be a marketer's dream – and a nightmare if used incorrectly. Aside from standard security policies, is there someone at the company keeping a watchful eye on what the algorithms are saying about your customers? Better yet, is there someone holding the data science team accountable and making sure it is drawing conclusions from the right kinds of data?

Take the Uber situation in Australia last year. During the hostage crisis in Sydney, people were fleeing the area and Uber's algorithms instantly increased prices in response to the skyrocketing demand, souring many patrons on the ride-sharing service. While Uber acknowledged that the price surge was the result of an algorithm responding to increased demand, it also said it kept prices intentionally high to motivate drivers to enter the danger zone. Having someone monitor turnkey algorithms that may not apply in every situation can keep the company adaptable when unique situations arise.
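
As a rough sketch of what that human check might look like in code, the snippet below caps an automated surge multiplier whenever an operator has flagged an emergency in an area. The function names, areas and threshold are assumptions for illustration, not Uber's actual system:

```python
# Hypothetical guard around a turnkey surge-pricing algorithm: if a human operator
# has flagged an emergency in an area, the automated multiplier is capped instead
# of being applied blindly.

EMERGENCY_AREAS = {"sydney_cbd"}   # flagged by a human operator, not the algorithm
EMERGENCY_SURGE_CAP = 1.0          # illustrative: no surge pricing during an emergency

def demand_based_multiplier(demand: int, supply: int) -> float:
    """Naive stand-in for a demand-responsive pricing algorithm."""
    return max(1.0, demand / max(supply, 1))

def effective_multiplier(area: str, demand: int, supply: int) -> float:
    multiplier = demand_based_multiplier(demand, supply)
    if area in EMERGENCY_AREAS:
        return min(multiplier, EMERGENCY_SURGE_CAP)  # human override wins
    return multiplier

print(effective_multiplier("sydney_cbd", demand=500, supply=80))  # 1.0, capped
print(effective_multiplier("elsewhere", demand=500, supply=80))   # 6.25, normal surge
```

The point is not the specific cap, but that the override lives outside the algorithm and is owned by a person who is accountable for when it applies.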

3.) Keep trust between you and your customers strong.

When consumers heard that Samsung's smart TVs could listen in on them in their own living rooms, trust between consumers and the electronics giant plummeted, and sales suffered with it. Companies need someone in place who not only oversees a viable privacy policy, but who also researches and understands their customers' values.

Splitting responsibility among the CEO, CTO and CMO only leaves room for finger-pointing. Companies need to decide who is in charge of data ethics and provide clear guidelines on what to do when algorithms go wild.