Machine Learning Times
This excerpt is from LinkedIn. To view the whole article, click here.

The Big Risks of Big Data Mining

“Every step you take, I’ll be watching you” – when Sting wrote those lyrics back in the ’80s, he most certainly wasn’t thinking about digital data collection. But whether we realize it or not, every digital step we take is indeed being watched—with the resulting data providing a frightening wealth of information about our lives. Those of us who work in the space know: every card transaction, every website visited, every online social interaction, even our movements and exact location are routinely collected and analyzed to build up a picture of our habits and preferences. While the insights that the data provides can bring benefits for consumers and marketers alike, the mining of big data also poses risks that business leaders would be foolish to ignore.

The collection of so much data has the very real potential to ignite new privacy and ethical firestorms of a kind corporations haven’t needed to pay much attention to in the past.

In a 2014 report, Big Data’s Big Meaning for Marketing, Forrester highlighted three main areas of risk businesses should be aware of:

Personal data protection. Existing methods of protecting the identity of individuals may no longer be sufficient in the era of big data. Forrester cited the example of Netflix, which was sued for releasing data after researchers at the University of Texas were able to positively identify individuals from supposedly “anonymous” ratings.

Financial liabilities. The full extent of any financial liabilities for big data practices is unknown and at present unquantifiable (italics mine). Lawsuits against organizations that have data breaches or are perceived to be misusing data are just beginning. Those who collect and use data need to be aware of relevant legislation and the potential for increased costs if they get it wrong.

Ethical dilemmas. New ethical dilemmas are being created by the analysis of big data. Just because something can be predicted, should that information be used, or that prediction acted upon?

Take, for instance, this example of how healthcare providers are mining data to predict our health needs, then judge for yourself where the ethical boundaries should lie.

Bloomberg Business reported last year that Carolinas HealthCare System, operator of more than 900 care centers, began purchasing data to identify high-risk patients. Why? So they could intervene in an attempt to prevent potential health problems from developing. My alarm bells are already going off!

The data is collected from credit card purchases, store loyalty programs, and other public records. In theory, medical practitioners can learn more about their patients—and their patients’ lifestyles—from their shopping habits than from brief, or sometimes non-existent, consultations. Although the data doesn’t yet identify individual purchases, it does provide a risk score doctors can use to highlight potential problems. Anyone remember the story from a couple of years ago about the dad who discovered his teen daughter was pregnant because Target mined her purchase data and sent her ads for baby products? This stuff, and the data mining capabilities that brands command, isn’t new news.

While some patients might welcome a proactive approach like the Carolinas HealthCare example cited above, there will be many (me included) who will see it as an invasion of their privacy. Some health advocates also fear an erosion of the patient-doctor relationship if medical professionals begin to interrogate apparently healthy patients about the consequences of their perceived lifestyle choices. Then, when you extrapolate from that and factor in insurance companies having access to this kind of data, a whole different level of creepiness enters the equation. Because, of course, insurance companies always have our best interests at heart, right? It’s not a giant leap to envision a future where people are turned away from health insurers based solely on risk scores developed from this type of data mining. It’s definitely a slippery slope.

By: Shelly DeMotte Kramer, Social Media Savvy Geek
Originally published at www.linkedin.com