Big data is certainly a buzzword, but it’s not going anywhere. Instead, big data is only getting bigger, and what we consider a massive amount of 1s and 0s now will be minuscule compared to what we are computing in the future. That is simply a fact.
As wearable tech becomes more commonplace, as virtual reality begins to seep into our regular lives (and it will, if Facebook knows anything), as digital monetary systems begin to seriously take hold, as our on- and offline lives seamlessly collide, big data will turn into massive data — personalizing all of our experiences, from when we sign on to our social media accounts to when we buy our first house.
We’re talking about predictive analysis here, the ability to determine, based on past events, what an individual’s future actions will look like. Of course, predictive analysis isn’t perfect now, nor is it likely it ever will be, but we’re getting closer to understanding what it takes to look into the digital crystal ball, if you will. After all, humans are creatures of habit, and habits are blatantly obvious when run through an algorithm — which is why security needs to be big data’s biggest concern right now.
According to findings from a January 2014 survey conducted by the Pew Research Center, 21% of Americans said they have had their online accounts compromised at one point or another. Another 18% said that they have had important personal information stolen, including Social Security numbers, credit card numbers, or bank account information. That figure is up from the 11% who said the same in July 2013.
“After all, humans are creatures of habit, and habits are blatantly obvious when run through an algorithm.”
Security on the web is getting harder to guarantee — the Heartbleed bug made that painfully obvious — and with such large amounts of information being collected, stored, and analyzed via big data platforms like Hadoop or the Digital Genome, ensuring security is only going to get harder.
But big data needs to have security on lockdown in order to gain the trust of consumers, many of whom have felt violated by so many modern-day scandals (looking at you, NSA). Otherwise, big data will indeed die away, a remnant of the web we should have created but failed to properly safeguard.
At Umbel, we weren’t affected by the Heartbleed bug, so absolutely zero percent of our customers’ data was leaked or ever in jeopardy — and our first responders made sure of it. Of course, we don’t just protect our customers’ data from glitches or bugs — we protect it from the get-go, so that no one ever has to worry about something, or someone, lurking in the background.
These are our security and user privacy promises, and they should be those of every big data company out there.
First-Party Data
Freedom is the Right to Choose
Collecting data, no matter where it comes from, is not a take-all game. Users need to be afforded the right to choose — even if that means they choose not to share their information.
Ensuring this is easy: it simply requires an opt-out option, which allows users to decide for themselves which companies have access to their data points. This concept is fundamental in a democracy and applies to every aspect of our personal lives. As our digital fingerprints become just as prominent, recordable, and identifiable as we are offline, it is the responsibility of society — and thus of businesses in that society — to make sure those records are protected from those who would use the information in harmful, or merely unsolicited, ways.
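As a rough illustration of what honoring an opt-out might look like in practice, a collection pipeline can check the user’s recorded choice before storing anything at all. This is a minimal sketch, assuming an in-memory registry; the names `opt_out` and `collect_profile` are hypothetical, not part of any real platform:

```python
from typing import Optional

# Hypothetical in-memory registry of users who have declined to share data.
OPTED_OUT: set = set()

def opt_out(user_id: str) -> None:
    """Record a user's choice not to share their information."""
    OPTED_OUT.add(user_id)

def collect_profile(user_id: str, raw_profile: dict) -> Optional[dict]:
    """Store a user's data points only if they have not opted out."""
    if user_id in OPTED_OUT:
        return None  # respect the user's choice: collect nothing at all
    return dict(raw_profile)

opt_out("user-42")
```

The point of the sketch is simply that the consent check happens before collection, not after — data from an opted-out user never enters the system in the first place.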
Only Take What You Need
When it comes to big data, not every company needs every single data point. Big data is only as useful as the questions you ask of it — which means companies need to start with a desired business outcome if big data is to deliver the desired ROI. And when a company knows what it wants from the data, it can narrow down what it asks of its users. Do you need to know geo-location? Is it pivotal that you have access to their social media posts? Is their year of birth truly important?
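One way to put “only take what you need” into practice is a field whitelist: decide up front which data points the business question requires, and drop everything else at collection time. A minimal sketch, where the field names and the `minimize` helper are illustrative only:

```python
# Hypothetical whitelist: suppose the business question is age-bracket
# analysis, so only these two fields are actually required.
NEEDED_FIELDS = {"user_id", "birth_year"}

def minimize(record: dict) -> dict:
    """Keep only the data points the analysis requires; drop the rest."""
    return {k: v for k, v in record.items() if k in NEEDED_FIELDS}

full = {
    "user_id": "u1",
    "birth_year": 1990,
    "geo": "30.27,-97.74",   # collected but not needed -> discarded
    "posts": ["..."],        # collected but not needed -> discarded
}
minimal = minimize(full)
```

Everything outside the whitelist is discarded before storage, so the company never holds data it has no stated use for.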
See the example above — NFL.com indicates exactly which permissions it wants, and if for some reason that seems too permissive, you can always choose to opt out (cancel). The less a brand takes, the more privacy a user retains — and the more likely they are to share their data points, which leads to the next point…
Transparency is Key
In the effort to protect data rights, transparency is key. What will you be doing with a user’s data? How might that change over time? How will you alert users to data breaches (should they happen) or to changes in your use policy? In protecting data rights, your users become an extension of your brand — and thus an asset to secure. When your users trust you, when they feel their data is safe in your hands, your ability to grow your bottom line and ROI is unstoppable — and your users will back you, promote you, and help make sure you are successful, because they have a stake in your company, too.
Having processes set up in advance to let users know you are on their side, especially when they hand over information as valuable as their social identity data, is essential to building the Internet of the future.
Connect with us for a walk-through of our Digital Genome technology and let us show you what protecting data rights looks like.