Breaches and data leaks occur with disappointing frequency, with millions of records falling into the wrong hands or being misused. The data typically falls into a few broad categories: credentials (stolen passwords), financial details (credit card or banking data), or personal information.
The impact of compromised credentials or payment information can be significant, but there are well-established recovery controls in place: passwords can be changed, and new credit cards can be issued with relative ease.
Personal information, however, such as date of birth, social security number, address, or fingerprints, is largely static and cannot be so easily changed.
That said, even when large swathes of personal information are leaked, people often react with apathy. After all, most of this information is publicly available somewhere.
People also rarely see a personal impact from these breaches. If a fraudster takes out a mortgage using someone else's details, it is nearly impossible to pin down where those details were obtained. So it is treated like any other fraud: there is generally an established process, and life moves on.
While personal data is static, and often public, there is an underlying layer of what I'll refer to as personality data - data about *who* you are, rather than what you are.
Advertisers and marketers have long tried to understand 'who' their customers are. Big data has allowed information to be collected and analysed at scale.
Many with access to a dataset of habits or personality traits will use it to improve products and offerings. Video streaming sites like YouTube, Netflix, and Amazon Video can recommend shows based on what you, and others with similar interests, have watched. It also allows companies to analyse which shows are most popular.
Shopping advertisers use similar algorithms to predict buying patterns. For example, if you buy a pair of leather shoes, it may recommend buying some shoe polish.
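The "customers who bought X also bought Y" pattern can be sketched as simple co-occurrence counting over purchase baskets. A minimal illustration follows; the basket data and item names are entirely hypothetical, and real recommenders use far more sophisticated collaborative filtering:

```python
from collections import Counter
from itertools import combinations

# Hypothetical purchase histories; items are illustrative only.
baskets = [
    {"leather shoes", "shoe polish", "socks"},
    {"leather shoes", "shoe polish"},
    {"leather shoes", "belt"},
    {"socks", "belt"},
]

# Count how often each pair of items appears in the same basket.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

def recommend(item, top_n=2):
    """Return the items most often bought alongside `item`."""
    scores = Counter()
    for (a, b), count in pair_counts.items():
        if a == item:
            scores[b] += count
        elif b == item:
            scores[a] += count
    return [other for other, _ in scores.most_common(top_n)]

print(recommend("leather shoes"))  # shoe polish ranks first
```

With this toy data, shoe polish co-occurs with leather shoes twice, so it tops the recommendation list; the same idea, scaled to millions of baskets, underpins the upselling described above.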
These are all legitimate and beneficial uses. Although far from perfect, they generally work well; helping the retailer upsell, prolonging the viewer’s attention to a site, improving consumer convenience.
Nevertheless, some do cross the line in the amount of data they collect.
In his keynote at the Entertainment Finance Forum, Mitch Lowe, CEO of the now-defunct MoviePass, stated: "We get an enormous amount of information...we watch how you drive from home to the movies. We watch where you go afterwards."
The company maintained that the information was collected for future services and to bring greater convenience to its customers. That may well be the case, but transparency matters: it is one of the underlying principles of GDPR, under which data should only be used for the purposes for which it was collected.
When data manipulates you
In her 2013 paper, Analyzing the chemistry of data, Wendy Nather asked whether data should be treated like dangerous chemicals. Individual data elements may be inert, but combined they form a toxic mix, understanding and exposing individuals more than they understand themselves.
When huge amounts of data on individuals are aggregated in one place, it becomes possible to use that data to manipulate them.
“The things you own end up owning you” – Tyler Durden, Fight Club
As Alexander Nix, CEO of the now-defunct Cambridge Analytica, described in his keynote at the Online Marketing Rockstars conference in 2017, two individuals may look similar - same age and gender, similar incomes and families, subscribed to the same newspapers - yet have very different personalities.
[Slide from Alexander Nix's keynote at OMR 2017]
Cambridge Analytica used the OCEAN personality model:
Openness: Do they enjoy new experiences?
Conscientiousness: Do they prefer plans and order?
Extraversion: Do they like spending time with others?
Agreeableness: Do they put other people’s needs before their own?
Neuroticism: Do they tend to worry a lot?
By gathering such information, it becomes possible to understand what really drives people, and to target them accordingly.
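The targeting step can be made concrete with a crude sketch. The scores and messages below are entirely hypothetical illustrations of trait-based message selection, not Cambridge Analytica's actual method:

```python
# Hypothetical OCEAN profile: one score from 0.0 to 1.0 per trait.
profile = {
    "openness": 0.2,
    "conscientiousness": 0.9,
    "extraversion": 0.4,
    "agreeableness": 0.6,
    "neuroticism": 0.8,
}

# Illustrative messaging angles keyed by a person's dominant trait.
messages = {
    "openness": "Try something completely new.",
    "conscientiousness": "A sensible plan you can rely on.",
    "extraversion": "Join thousands of others.",
    "agreeableness": "Help the people around you.",
    "neuroticism": "Protect yourself before it's too late.",
}

def target_message(profile):
    """Pick the message matching the person's strongest trait."""
    dominant = max(profile, key=profile.get)
    return messages[dominant]

print(target_message(profile))  # "A sensible plan you can rely on."
```

The same content, framed five different ways, can be routed to whichever framing each individual is most susceptible to; that is the essence of psychographic targeting.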
It’s worth bearing in mind that targeted pushing of content is not unique to Cambridge Analytica. It is similar to how a video provider would recommend a video, and it’s what nearly all social media networks do when they push content in a non-chronological order, or through advertising. In the physical world, it’s why supermarkets stock their preferred brands on shelves at eye level.
There are only two major differences in what Cambridge Analytica allegedly did. The first was the way it obtained data profiles from an estimated 50m Facebook users, violating its agreement with Facebook. The second was that people felt manipulated at scale, believing the company had fundamentally undermined the democratic process in several countries.
What this means during a pandemic
We’re in unprecedented times, even without taking the pandemic into account. Big data has made possible today what we once saw only in movies.
With most people socially distancing, having services on demand, being able to shop without leaving the house, and having a variety of games and apps available is a good thing. It has probably kept many sane during this period.
But are we too willing to give up information for the sake of convenience? That, too, is a double-edged sword. On one hand, you have cases like Target, which made headlines after it inferred that a teenage girl was pregnant before her parents knew. On the other, imagine the same technology being able to warn you of an impending heart attack before any physical symptoms occurred.
Technology is a tool and can be used for good or nefarious purposes.
Should social distancing apps be widely deployed? Is the trade-off between privacy and the safety of the wider public worth it? What will happen to these apps and the collected data once the pandemic is over? Could this information be used to manipulate you?
There is no straightforward answer. People should be made aware, early and often, of what data they are giving up in exchange for a service, whether static personal data or personality data. Only through informed debate can people make risk-based decisions that are best for them.
Author: Javvad Malik, Security Awareness Advocate at KnowBe4