Little privacy in the age of big data

With massive amounts of our personal data now being routinely collected and stored, privacy breaches are almost inevitable

In the era of big data, the battle for privacy has already been fought and lost – personal data is routinely collected and traded in the new economy and there are few effective controls over how it is used or secured. Data researchers and analysts now say that it’s time for legislation to reclaim some of that privacy and ensure that any data that is collected remains secure.

“We have become the product,” says Rob Livingstone, a fellow of the University of Technology and the head of a business advisory firm.

“We are being productised and sold to anyone,” he said. “We’re being monetised, in essence. We are being mobilised as products, with the inducement being the services we use, such as Facebook and Twitter.”

However, Livingstone says the dilemma facing regulators is how they can regulate the collection, storage and trading of personal data on the internet, when all of these activities, and the corporations themselves, operate across multiple continents and jurisdictions.

The task of reclaiming some semblance of privacy is all the more urgent because the rate at which personal data is being collected is accelerating. The buzz around big data is attracting millions of dollars from investors and brands hoping to turn a profit, while intelligence agencies are also furiously collecting information about our online activities for much different purposes.

And alongside these, there are also the black market operators who make millions of dollars a year out of things like identity theft and matching disparate data sets across the web to help identify people who might be suitable targets for a scam.

Add to that emerging technologies like Google Glass and facial recognition technology (which, by the way, is already being deployed by Facebook and shared with Australian state and federal police) and you’ve got a recipe for ubiquitous mass online surveillance, not just by intelligence agencies but by everyone. And it’s unclear how all this will be used in the near or long-term future.

While many of us don’t think much of sharing our details online now, Livingstone says the danger is that our society may relinquish its privacy rights “without due regard for future consequences” and without debate over the implications.

Eerke Boiten, a senior lecturer at the University of Kent, advocates the creation of standards to better protect the data that is collected and exchanged online.

“You could make it compulsory for web servers to use the secure HTTPS prefix rather than the HTTP default,” he said. “A lot of encryption capabilities on existing mobile and tablet technology are switched off by default too. You could change that.”
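
To make that suggestion concrete, here is a minimal sketch, not drawn from the article, of the kind of “secure by default” behaviour Boiten describes: a plain-HTTP listener whose only job is to redirect visitors to the HTTPS version of a site. The hostname and port below are placeholder values.

```python
# Minimal sketch: a plain-HTTP listener that redirects every request to HTTPS.
# "example.com" and port 8080 are placeholders, not details from the article.
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectToHTTPS(BaseHTTPRequestHandler):
    def do_GET(self):
        # 301 (Moved Permanently) tells browsers to remember the secure address.
        self.send_response(301)
        self.send_header("Location", "https://example.com" + self.path)
        self.end_headers()

if __name__ == "__main__":
    # Listen on the insecure port and push every visitor to the HTTPS site.
    HTTPServer(("", 8080), RedirectToHTTPS).serve_forever()
```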

But as to who should create these standards and who gets to decide what the standards include “is a political question”, Boiten says.

“You could mandate it through government, though it should be mandated through the market, if enough people want it badly enough.”

But that requires enough people to know and understand what they’re asking for, and then you’re back to the problem of protecting people from their own ignorance when it comes to privacy and security.

But then again, in the UK there has already been a consumer backlash against revelations that medical data was being onsold to marketers without the knowledge or consent of the people it belonged to.

But hoping and waiting for the Australian population to take notice may not be the most efficient way to get things done. Boiten said Australia’s data protection needed to be “stronger and with more teeth”.

New privacy principles were recently passed into law requiring all businesses earning more than $3m annually to disclose to customers how their information was being stored and used. However, the new legislation stopped short of mandating compulsory data breach notifications for businesses that fall victim to security violations.

A bill that would make it illegal to hide security problems was set to pass into law last year, but it failed to make it through both houses of parliament before the election. And since the Coalition took power, the legislation has stalled.

Boiten agreed with Livingstone’s assessment that companies are simply not capable of protecting their own data.

“The technology is not good enough yet,” he said. “Yes, there are crypto solutions, but they get implemented badly, they get used badly, they are inconvenient to use, and they require lots of time and money in training.

“eBay is still in business. Target is still in business, even though they both lost a significant amount of data. The incentives on companies to get it right are not strong enough.”

Or is volunteering our personal data simply the price we pay for free services?

As the old saying goes, if you’re not paying for it, you’re the product. Paul Greenberg, head of the National Online Retailers Association, told Guardian Australia that he was happy to share his data “so long as I’m getting something back”.

“Information can be priced,” he said, “and it always has had a price. I don’t see why that will change.”

“Knowledge is power, but the transaction between consumers and services won’t be about monetary terms; it’ll be about relevance. I’d happily trade you relevance for information. And I reckon a lot of people will also.”

Greenberg said the privacy concerns over big data are “completely overstated”.

“Maybe I don’t want people to know about my love life, but I’m happy to put a price on my privacy,” he said. “As long as I get an ROI, whether that’s in the form of relevant content, or customised goods or services”.

Greenberg, however, admitted that even the most successful corporations have difficulty protecting their customers’ information from security breaches and that something needed to be done to protect them.

But then there’s the national security argument for big data. Edward Snowden revealed last year that the National Security Agency, GCHQ and ASIO (among others) were, and are, conducting wholesale surveillance of their citizens online, sucking up all the noise in the hope that their sophisticated technology can detect patterns or anomalies that could help prevent acts of terror.

Boiten argued that there was little evidence to suggest mass data surveillance was efficient or effective.

“You’re doing mass data collation, but what evidence do we have that it works?” he asked.

It was revealed in the wake of the Boston bombings that the FBI had been contacted by Russian authorities about the Tsarnaev brothers roughly a year before the attack, warning that the two were potentially dangerous to America’s national security, but the brothers were determined not to be a threat.

A number of warning signs were also found in the computer networks of several US intelligence agencies after 9/11. The failure to properly implement and monitor software that had been deployed to detect terrorist activity was at least partially to blame.

“Data surveillance didn’t prevent the Boston bombings. We know which bit of information probably should have, but didn’t, flow from one place to another, which could have prevented the 9/11 attacks.

“Intelligence agencies like to claim they also prevented some big things which they weren’t able to tell us about because that would give their agents or trade secrets away, but now security is increasingly a trust game. Do those programs exist? Did they ever help to prevent an attack? We’ll never know.”

By Claire Porter
Originally published at theguardian.com
