
How is your data being used, or misused?

Spoilers for season 2 of Westworld

The second season of Westworld wrapped up last weekend.  Throughout most of its run, a central plot point has been Delos, the company behind the show's decidedly non-virtual park, secretly monitoring the brains of its human guests in order to replicate and store them as data.

(At least, that's what I think happened; it was kind of confusing.)

In comparison, data misuse by actual, real-world organizations might not seem as existentially egregious.  Nevertheless, careless exploitation of customer data is still a pressing concern.  Invasions of privacy and harvesting of personally identifiable information almost seem to be a core tenet of online ad-driven business strategies.  Going even further, Vijay Sundaram at CustomerThink argues that this model and data privacy are fundamentally incompatible.  When advertisers pay for customer data, especially data gathered through "free" products, the greater the detail, the better.  As the recent Cambridge Analytica scandal revealed, this includes data not just from consumers themselves but from their friends and family as well.

In the case of the Facebook incident, Cambridge Analytica obtained information on 87 million people from data harvested via a mere 270,000 users.
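Back-of-the-envelope, those figures mean each consenting user exposed data on more than 300 other people.  A minimal sketch of that arithmetic (the function name is mine, purely for illustration):

```python
def amplification_factor(people_exposed: int, consenting_users: int) -> float:
    """How many people's data were exposed per user who actually consented."""
    return people_exposed / consenting_users

# Figures reported for the Facebook / Cambridge Analytica incident.
exposed = 87_000_000
consented = 270_000

factor = amplification_factor(exposed, consented)
print(f"Each consenting user exposed roughly {factor:.0f} other people.")
```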

Consumers and governments, however, are becoming increasingly intolerant of such cavalier misuse of their data.  Hence the GDPR and similar initiatives in other countries.  In this shifting environment, it is key for companies to have privacy policies that are both transparent about their data collection and minimalist, accumulating only the data necessary for business.  Usually, for example, this excludes items like addresses, phone numbers, GPS location, or marital status.

An interesting example currently circulating is that of HealthEngine, a healthcare appointment booking startup.  HealthEngine has come under fire for sharing user medical data, which users must supply in order to use the service, with a personal injury law firm, Slater and Gordon.

This practice is indirectly alluded to in the fine print: “Our Promise: Your privacy is important to us.  HealthEngine will not provide your personal information to a third party without your express consent except as required or permitted by law, or in those circumstances described in our privacy policy.”

The policy itself elaborates on what data the company collects and whom it collects it from.  The former indeed includes addresses, emails, phone numbers, gender, GPS data, marital status, occupation, cultural background, and photographs.  The latter ranges from family members and doctors to users' social media accounts.  In this situation, a degree of transparency does seem to be present.  Some of this data, however, which HealthEngine claims is "only for the purpose of providing goods or services to us," is still of unclear necessity.  Moreover, there does not appear to be any way for users to opt out of having their data shared with third parties, which raises red flags.

Another clause states that “Some third-party service providers used by HealthEngine may store your personal information on servers located overseas; however, they must also meet our requirements for privacy and data security.”  This raises my suspicions further.

So, what do we have here?  To my eyes, a company not going far enough to keep consumer data secure, whether judged by savvy users' concerns or by regulations like the GDPR.  Businesses may not yet be storing your brain chemistry in order to upload your mind into an android, but too many still show too little effort to indicate that they take data security or the threat landscape seriously.  If a company wants to survive and flourish, this will have to change.

 

By: Jonathan Weicher, post on June 27, 2018
Originally published at: http://www.netlibsecurity.com
Copyright: NetLib Security

 
