The Internet of People: 4 key principles for analyzing personal data


Recently, the Insight Centre for Data Analytics (an Irish research initiative) announced its intention to establish a “Magna Carta for Data Ethics” that attempts to balance the privacy worries of the public with the interests of big business. “Existing ideas of ownership and privacy are not relevant in the data space,” said Oliver Daniels, Insight’s CEO.

In an “Always On Society,” where nearly everything we do is or soon will be tracked, establishing some ground rules is a good idea. We are all part of what I call the “Internet of People”: a network of entities participating in the digital environment, defined by our data and the models that analyze it. But there is no Bill of Rights or Magna Carta yet for how the data that defines us can be used.

The debate will largely hinge on one major issue: Who benefits from the use of the data? Is it used to improve society as a whole? Or is it used to criticize, penalize, and discriminate based on these new sources of information?

When your car tells on you

Take the scenario where cars become data transmitters. What if I speed and my car relays that information to the police? Would I have to give permission for that to happen? Maybe I trust the police — but what about my insurance company? Do I trust it to not raise my premiums every time I go five miles over the speed limit?

Perhaps the data from my car could contribute to some social good. Perhaps it can be used to create more efficient traffic patterns, or to optimize crosswalks to increase safety for kids walking to school. As a driver, I have to weigh that potential good against the potential cost to me personally if my insurance premiums increase.

Today, individual drivers choose whether to transmit data on their driving habits to their insurers. But how long will that last before we decide this should be an “opt-out” system rather than an “opt-in” one? If the risk of rising premiums makes us better drivers, why leave it up to the driver’s discretion?

Whose data is it?

This goes to the trade-off between the benefits of data sharing and the question of privacy. If it’s my data, don’t I own it? And shouldn’t I be the primary beneficiary of its use?

Several personal information management (PIM) companies now cropping up, such as It’s Really About Me, would argue yes. But maybe it’s not that black and white. If I’m driving on a public road maintained by the state, then perhaps my clocked speed is not entirely my private data.

The privacy debate is even more meaningful in the case of something personal like medical data. Sharing it with one’s doctor is crucial to receiving proper medical care. It makes sense that we want to incentivize disclosure so that individuals can get earlier and less-expensive treatments and learn better habits sooner.

But many medical conditions cannot be helped with behavioral changes. We can’t change our DNA, or our family’s cancer history. The same data that help your physician return you to health could also prevent you from getting life insurance, or make it more expensive. Does sharing it with your doctor but withholding it from your insurer make you liable?

Punished in advance?

Perhaps the thorniest issue around the Internet of People involves predictive analytics. If Big Data is being used to predict behavior, are we punishing people for things they haven’t yet done, and may never do, or denying them opportunities before they have had the chance to act? This runs counter to the fundamentals of the legal code, and yet the practice is growing, from credit risk analytics to health risk analyses to predictive policing.

A few sound principles

What’s clear from this discussion is that the ground rules are still far from settled. As a society, we have not decided how to balance the rights of individuals, businesses and society when it comes to data sharing. But since FICO has been thinking about this for a relatively long time (nearly 60 years now), let me offer what I believe are the four key practices for anyone storing, sharing or analyzing data:

  1. Establish trust and build confidence, in part through understandable transparency around what data you hold and what you do with it.
  2. Provide demonstrable value – make it clear what the consumer gets from your data use.
  3. Engage in dialogue with your customers – don’t sit behind the two-way mirror.
  4. Help people who decide to change their behavior. In other words, don’t just monitor, report and penalize – help people drive better, live healthier, spend more wisely, etc.

Following these four principles will help the industry and its participants navigate the stormy debates to come over data use and the Internet of People.

Bio: Dr. Andrew Jennings is FICO’s chief analytics officer and head of FICO Labs. He has held a number of leadership positions at FICO since joining the company in 1994, and was previously a lecturer in economics and econometrics at the University of Nottingham. He holds a BA and a Ph.D. in economics and an MSc in agricultural economics. He blogs at www.fico.com/en/blogs/.
