The Big Data Evolution

Moving Toward a Marriage of Ethics and Innovation

The excitement about big data’s potential is palpable these days. It’s often referred to as a “revolution,” and there are loads of articles proclaiming its transformative power. New metaphors keep popping up to help explain what we can learn from it: it’s like frozen yogurt, it’s like a paper shredder. There are at least a couple dozen TED talks dedicated to it. So, overexposed? Sure. But overhyped? Probably not.

Big data has already changed the way we live.

There’s no denying big data’s ability to offer businesses previously out-of-reach insights—about value chains, about new markets, about trends, and even about the habits of individual customers. It’s allowed manufacturers to streamline processes, increase output, and cut costs. It’s enabled the healthcare industry to offer new products and services that improve quality of life, and in some cases even help predict epidemics and hasten cures. And corporations are investing billions: Google’s recent purchase of home automation company Nest gives it a direct line to new kinds of in-home data, and Facebook’s $19 billion acquisition of WhatsApp gives it better access to all-important mobile data.

From Amazon recommending the books you didn’t know you wanted to read, to Spotify and Pandora curating playlists for your every mood, data allows companies to close the gap between what customers expect from brands and what those brands are offering. Customers today want individual attention, and companies are responding with an “every customer’s a VIP” treatment made possible by big data. Data allows retailers to improve a customer’s entire journey and experience at each touchpoint based on their behavior, and path analysis reveals the routes users take so that shopping experiences can be tailored to each of them. In the home, data is being used to offer new levels of customization and precision in everything from thermostats to fridges, and now there are even social robots that can help around the house—managing your communications, taking family photos, even keeping your children company by reading them stories. More and more, as the notion of a quantified self takes hold, consumers are managing their own data: from Fitbit and UP, which allow users to track and share performance stats, to FitBark and Whistle, which let you keep up on (you guessed it) your pet’s data.
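
To make the path-analysis idea concrete, here is a minimal sketch in Python, assuming a hypothetical clickstream log of page views per session (the session data and page names are illustrative, not any particular retailer’s):

from collections import Counter

# Hypothetical clickstream: one ordered list of page views per session.
sessions = [
    ["home", "search", "product", "cart", "checkout"],
    ["home", "product", "cart"],
    ["home", "search", "product", "cart", "checkout"],
]

# Count every consecutive three-page route users actually take.
paths = Counter(
    tuple(session[i:i + 3])
    for session in sessions
    for i in range(len(session) - 2)
)

# The most frequent routes show where to smooth the journey.
for path, count in paths.most_common(3):
    print(" -> ".join(path), count)

Real pipelines run the same counting idea over millions of sessions, but the principle is just this: find the routes people actually take, then design around them.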

And consider the effect data can have on our infrastructure. A few years ago GE began harnessing data from trains and planes in order to create what it calls an Industrial Internet—connecting machines in order to increase efficiency and predict malfunctions. Or look at how IBM’s Smarter Cities initiative has been using data to identify congestion and improve the flow of traffic. Or how data generated from ticketing systems, social media, and sensors on vehicles and traffic signals has allowed Transport for London to track user journeys in order to improve efficiency and response times, and to get the most up-to-date information to riders. And then there are police departments, many of which now use data to identify patterns and solve crimes faster: Durham, NC, for example, has used predictive analytics to reduce violent crime by 50%.

In short, big data has already shown the ability to improve our commutes, to keep us safer and healthier, to allow us deeper interaction with our appliances, our pets and our robots. And it has the potential to enhance nearly all interactions between a person and a brand.

The biggest challenge? Trust.

But to do all this, a ton of data is needed, from a wide array of sources. This goes way beyond, say, transactional history and demographics. And it takes sophisticated analysis to make sense of it all, which can be a significant hurdle—especially for the average company that doesn’t have an in-house analytics infrastructure and a team of data scientists on the payroll. What’s more, the data is often owned by any number of third parties, and corralling and sharing it all is a real challenge.

But the deeper challenges center on trust and ethics. Companies have always used data and analytics, but the ethical questions have become far more urgent and complex because of the sheer volume of data we’re now dealing with, and because of how much easier technology has made its collection and use. It’s no mystery that data theft is a huge problem (the fastest growing crime in the U.S., in fact) and that large companies are particularly susceptible to security breaches (just ask JP Morgan, whose breach affected more than 70 million households, or Sony Pictures, which had nearly 50,000 Social Security numbers exposed, to name two recent cases). What’s far less clear cut, both legally and ethically, is what exactly companies should or must do to protect people, how much data is too much, and what’s fair game versus what’s off limits.

The 2012 Consumer Privacy Bill of Rights does lay down some clear guidelines, and in response a number of companies have pledged to offer consumers a “do not track” option, but as of now all this is more of a request than a legal obligation. (Other parts of the world—from the EU to South Africa to South Korea—have far stricter and clearer data protection rules.) The fact is, many average citizens are wary of big data, because they don’t trust the entities who so badly want their personal information.

The Brooklyn electronic music project Big Data had a breakout track last year called “Dangerous,” with lyrics that seem to speak to these concerns: “…peeping through the floor, it’s like they know… they’re right under my bed, they’re on patrol…” Trust in government and large corporations is at an all-time low. In fact, only around 5% of people are “very confident” that data collected by companies will remain private and secure, and this dips to 2% for search engines and 1% for social media sites. Yet people have come more and more to accept personally targeted marketing, or even to actively embrace it (a 2014 report suggests that a sizable share of consumers want companies to know their history), even if they don’t completely understand what kind of data is being collected. According to a recent study for the Harvard Business Review, only about a quarter of people know that their social network friends list and location are being noted, and only 14% realize that their web histories could be fair game. It makes for an odd mixture of complacency and alarm.

And this isn’t the only paradox of the big data revolution. A recent article in the Stanford Law Review pointed out a few others, around transparency and power. For one thing, companies are collecting all this private information about average consumers, yet the average consumer is either ignorant of, or actively prevented from knowing about, what’s being done with their data. And while big data promises great empowerment for the individual consumer, it tends to further concentrate power in the hands of the largest corporations.

It’s no wonder that so many people are pushing for a whole new ethical framework, and that governments are now getting actively involved. Innovation is important. Efficiency is important. Offering customers the best, most personal experience is important. But do we need to sacrifice our values along the way?

The good news: we should be able to have it all.

Although innovation and ethics may sometimes seem in competition when it comes to big data, it’s actually possible to collect and analyze data in a way that respects people’s privacy and increases their trust, without the data losing any of its power. What’s more, this heightened trust should also lead to more opportunities for collaboration and an increase in the pace of innovation.

OK. But how do we get there?

Be transparent

All this opportunity begins with business leaders promoting transparency and fostering trust: getting more explicit permission from consumers, and being as specific as possible about what’s being done with their data. This goes well beyond a disclosure or terms-and-conditions agreement and gets into actual education and informed consent. Open and honest information sharing will increase customer trust and loyalty, and as a result will likely grant companies deeper access to data. The risks of shrouding data practices in secrecy, on the other hand, are real and costly: customer backlash and the kind of distrust that will actively prevent you from collecting the data you want. Facebook has experienced this more than once, and even Zuckerberg spoke not long ago about the need for more trust between Facebook and its users: “We need to give people more control over their information so that everyone feels comfortable using these products.”
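
As a rough illustration of what explicit, specific permission can look like in practice, here is a minimal sketch, assuming a hypothetical per-purpose consent record; the names and fields are illustrative, not any standard or regulation:

from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical record: consent is granted per purpose, not as one blanket
# opt-in, so the company can say specifically what each piece of data is for.
@dataclass
class Consent:
    user_id: str
    purpose: str          # e.g. "recommendations", "third_party_sharing"
    granted: bool
    recorded_at: datetime

def may_use(consents: list[Consent], user_id: str, purpose: str) -> bool:
    """Use data only when the user has explicitly opted in to this purpose."""
    return any(
        c.granted and c.user_id == user_id and c.purpose == purpose
        for c in consents
    )

log = [Consent("u42", "recommendations", True, datetime.now(timezone.utc))]
print(may_use(log, "u42", "recommendations"))      # True
print(may_use(log, "u42", "third_party_sharing"))  # False: never assumed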

Provide direct value

When consumers are given a measure of control and some genuine education, they can better understand the value of their data—the value to companies, yes, but also the value to themselves. This is the key point: collecting data from consumers should provide them direct value, and since some data is more valuable to companies than other data (the deepest and most personal info), consumers should see corresponding value for giving it up. This direct value can take the form of a consumer’s data improving a product they actually use, like Netflix’s refined recommendation algorithms, or enabling a service that makes their life more convenient, like Delta’s app that lets passengers track their bags. Having one’s data sold off to a third party, on the other hand, usually provides little or no value to the consumer. The more transparent companies are about what’s being done with the data they collect, and what value they’re providing in exchange, the more trust they earn. And with trust, consumers will be willing to share much more. In a relationship like this, it becomes mutually beneficial to offer people full access to their information and the ability to manage their data.
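
In principle, “full access and the ability to manage their data” can be as simple as export and delete operations exposed directly to the user. A minimal sketch, assuming a hypothetical in-memory profile store (a real system would sit on the production database):

# Hypothetical profile store; keys and values are illustrative only.
profiles = {"u42": {"listening_history": ["jazz", "ambient"], "email": "a@example.com"}}

def export_my_data(user_id: str) -> dict:
    """Let users see everything held about them."""
    return dict(profiles.get(user_id, {}))

def delete_my_field(user_id: str, field: str) -> None:
    """Let users remove any piece of it, not just file a request."""
    profiles.get(user_id, {}).pop(field, None)

print(export_my_data("u42"))
delete_my_field("u42", "email")
print(export_my_data("u42"))  # the email is gone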

Know when to say when

Currently, most people don’t have much ability to manage their data, and it’s still fairly easy for certain companies to collect massive amounts of it without people ever knowing. But that doesn’t make it a wise course. It doesn’t mean you should just dig and dig and dig. For one thing, too much data can easily lead to analysis paralysis. Airlines, which have access to tons of data, have already realized this and are being much more selective about what they target. There’s also real risk in collecting data too deeply on individuals, and in targeting them too closely—a risk that ranges from irking your customers all the way to inadvertently exposing things about them that they want kept private (sexuality, health, political leanings, etc.). Target ran into this quite infamously in 2012, when it used “pregnancy prediction” scores to target customers and ended up revealing a young woman’s pregnancy to her father. It shouldn’t take stories like this to remind us that there are real people behind all this data, and that we need to be more conscious about what kinds of data we collect, and especially about what we do with it. This takes forethought and planning; it takes an investment in responsible analysis and storage; and it takes instituting guidelines that CEOs can champion.
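
One concrete way to “know when to say when” is to collect by allowlist rather than by default. A minimal sketch, with hypothetical field names chosen for illustration:

# Hypothetical allowlist: collect only the fields the product actually needs,
# and drop sensitive attributes before anything is stored or analyzed.
ALLOWED_FIELDS = {"user_id", "page", "timestamp", "device_type"}

def minimize(event: dict) -> dict:
    """Strip every field not on the allowlist, rather than storing by default."""
    return {k: v for k, v in event.items() if k in ALLOWED_FIELDS}

raw = {
    "user_id": "u42",
    "page": "/products/strollers",
    "timestamp": "2014-06-01T12:00:00Z",
    "health_status": "pregnant",   # the kind of inference that burned Target
    "device_type": "mobile",
}
print(minimize(raw))  # the sensitive field never reaches storage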

It also takes empathy and a little common sense. Sure, it might seem like a grand idea to offer people a snapshot from exactly one year ago on social media, or to run a year-in-review slideshow for them. But how often will you be forcing people to relive tragedies? Again, just because data has made something new and exciting possible, it doesn’t mean we need to go there—not without thinking it through very carefully. It’s really easy to get caught up and assume too much intimacy with the user, and before you know it you’re stepping over the line from helpful into distressing.

Leading the way forward.

There’s so much to be gained for those willing to lead this revolution by example—to help evolve the way we collect and handle people’s data and, by extension, how we treat the people themselves. As a leading digital innovation agency focused on web and app development, we are always thinking about the implications of the products we design. Yes, it’s still possible for companies to get away with a certain amount of secrecy and carelessness in how they collect and use data. But things are beginning to shift. There are calls for a new ethical framework coming from governments, consumers, and even many business leaders. The way forward is all about trust. Which is perfect, because there’s nothing more important to a brand.
