THE GOOD NEWS:


Being aware of how social media uses your information is the first step to protecting yourself online.

What state should you move to based on your personality? What character on “Downton Abbey” would you be? What breed of dog is best for you?

Some enormous percentage of Facebook’s 2.13 billion users must have seen Facebook friends sharing results of various online quizzes. They are sometimes annoying, senseless, and a total waste of time. But they are irresistible. Besides, you’re only sharing the results with your family and friends. There’s nothing more innocent, right?

Wrong.

Facebook is in the business of exploiting your data. The company is worth billions of dollars because it harvests your data and sells it to advertisers. Users are encouraged to like, share, and comment their lives away in the name of staying connected to family and friends. However, as an ethical hacker, security researcher, and data analyst, I know that there is a lot more to the story. The bedrock of modern democracy is at stake.

You are being psychographically profiled

Most people have heard of demographics — the term used by advertisers to slice up a market by age, gender, ethnicity, and other variables to help them understand customers. In contrast, psychographics measure people’s personality, values, opinions, attitudes, interests, and lifestyles. They help advertisers understand the way you act and who you are.

Historically, psychographic data were much harder to collect and act on than demographics. Today, Facebook is the world’s largest treasure trove of this data. Every day, billions of people give the company huge amounts of information about their lives and dreams.

This isn’t a problem when the data are used ethically, like when a company shows you an ad for a pair of sunglasses you recently searched for.

However, it matters a lot when the data are used maliciously — segmenting society into disconnected echo chambers and custom-crafting misleading messages to manipulate individuals’ opinions and actions.

That’s exactly what Facebook allowed to happen.

Quizzes, reading your mind, and predicting your politics

Recent reports have revealed how Cambridge Analytica, a U.K.-based company owned by an enigmatic billionaire and led at the time by candidate Donald Trump’s key adviser Steve Bannon, used psychographic data from Facebook to profile American voters in the months before the 2016 presidential election.

Why? To target them with personalized political messages and influence their voting behavior.

A whistleblower from Cambridge Analytica, Christopher Wylie, described in detail how the company exploited Facebook users by harvesting their data and building models to “target their inner demons.”

How did Facebook let this happen?

The company does more than just sell your data. Since the early 2000s, Facebook has provided access to academic researchers seeking to study you. Many psychologists and social scientists have made their careers analyzing ways to predict your personality and ideologies by asking simple questions. These questions, like the ones used in social media quizzes, do not appear to have obvious connections to politics. Even a decision like which web browser you are using to read this article is filled with clues about your personality.

In 2015, Facebook gave academic researcher Aleksandr Kogan permission to develop a quiz of his own. Like other quizzes, his was able to capture all of your public information, including name, profile picture, age, gender, and birthday. But it didn't stop there. It also collected everything you'd ever posted on your timeline, your entire friends list, all of your photos and the photos you're tagged in, your education history, hometown and current city, everything you'd ever liked, and information about the device you were using (including your web browser and preferred language).
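The breadth of that access can be sketched in code. The permission names below follow Facebook's Graph API v2.x conventions of that era; the app ID and redirect URI are hypothetical placeholders. This is an illustration of the kind of login-dialog request a quiz app could make, not a reconstruction of Kogan's actual app.

```python
from urllib.parse import urlencode

# Permission scopes roughly matching the data categories described above.
# Scope names follow Facebook Graph API v2.x conventions; the app ID and
# redirect URI below are invented placeholders.
SCOPES = [
    "public_profile",           # name, profile picture, age range, gender
    "user_birthday",
    "user_posts",               # everything posted on the timeline
    "user_friends",             # the entire friends list
    "user_photos",              # photos and tagged photos
    "user_education_history",
    "user_hometown",
    "user_location",            # current city
    "user_likes",               # everything ever liked
]

def build_oauth_url(app_id: str, redirect_uri: str) -> str:
    """Build the login-dialog URL a quiz app would send users to."""
    params = {
        "client_id": app_id,
        "redirect_uri": redirect_uri,
        "scope": ",".join(SCOPES),
        "response_type": "token",
    }
    return "https://www.facebook.com/dialog/oauth?" + urlencode(params)

url = build_oauth_url("123456789", "https://example-quiz.test/callback")
print(url)
```

A user clicking "Allow" on that single dialog granted the app everything listed in the scopes, which is why one quiz could sweep up so much.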

Kogan shared the data he collected with Cambridge Analytica, which was against Facebook policy — but apparently, the company rarely enforced its rules.

Going shopping for impressionable users

Analyzing these data, Cambridge Analytica determined topics that would intrigue users; which kinds of political messaging users were susceptible to; how to frame the messages, content, and tone that would motivate users; and how to get them to share it with others. It compiled a shopping list of traits that could be predicted about voters.
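To make the "shopping list" idea concrete, here is a toy sketch, emphatically not Cambridge Analytica's actual model: it maps a user's page likes to crude personality-trait scores, in the spirit of published research on predicting traits from likes. All page names and weights are invented for illustration.

```python
from collections import defaultdict

# Invented example weights linking pages a user likes to trait scores.
# Real models were fit to survey data; these numbers are illustrative only.
TRAIT_WEIGHTS = {
    "outdoor_adventure_page": {"openness": 0.4, "extraversion": 0.2},
    "true_crime_page":        {"neuroticism": 0.3},
    "poetry_page":            {"openness": 0.5},
}

def score_traits(likes):
    """Sum per-trait weights over the pages a user has liked."""
    scores = defaultdict(float)
    for page in likes:
        for trait, weight in TRAIT_WEIGHTS.get(page, {}).items():
            scores[trait] += weight
    return dict(scores)

profile = score_traits(["poetry_page", "outdoor_adventure_page"])
print(profile)  # openness is roughly 0.9 (poetry 0.5 + outdoor 0.4)
```

Scored at scale across millions of profiles, even a crude model like this lets a campaign sort voters into buckets and pick which message frame each bucket sees.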

Then the company was able to create websites, ads, and blogs that would attract Facebook users and encourage them to spread the word. In Wylie’s words: “They see it … they click it … they go down the rabbit hole.”

This is how American voters were targeted with fake news, misleading information, and contradictory messages intended to influence how they voted — or if they voted at all.

This is how Facebook users’ relationships with family and friends are being exploited for monetary profit and political gain.

Knowingly putting users at risk

Facebook could have done more to protect users.

The company encouraged developers to build apps for its platform. In return, the apps had access to vast amounts of user data, supposedly subject to those rules that were rarely enforced. And because Facebook collected 30 percent of payments made through the apps, the company had a business incentive to encourage more apps doing more things.

People who didn’t fill out quizzes were vulnerable, too. Facebook allowed companies like Cambridge Analytica to collect personal data of friends of quiz takers without their knowledge or consent. Tens of millions of people’s data were harvested — and many more Facebook users could have been affected by other apps.

Changing culture and politics

In a video interview with the Observer, Wylie explained that “Politics flows from culture … you have to change the people in order to change [the] culture.”

That's exactly what Facebook enabled Cambridge Analytica to do. In 2017, Cambridge Analytica's CEO boasted publicly that the company was "able to use data to identify … very large quantities of persuadable voters … that could be influenced to vote for the Trump campaign."

To exert that influence, Cambridge Analytica — which claims to have 5,000 data points on every American — used people’s data to psychologically nudge them to alter their behaviors in predictable ways.

This included what became known as “fake news.” In an undercover investigation, Britain’s Channel 4 recorded Cambridge Analytica executives expressing their willingness to disseminate misinformation, with its CEO saying that “these are things that don’t necessarily need to be true, as long as they’re believed.”

U.S. society was unprepared: 62% of American adults get news on social media, and many people who see fake news stories report that they believe them. So Cambridge Analytica’s tactics worked: 115 pro-Trump fake stories were shared on Facebook a total of 30 million times. In fact, the most popular fake news stories were more widely shared on Facebook than the most popular mainstream news stories.

For this psychological warfare, the Trump campaign paid Cambridge Analytica millions of dollars.

A healthy dose of skepticism

U.S. history is filled with stories of people sharing their thoughts in the public square. If interested, a passerby could come and listen, sharing in the experience of the narrative.

By combining psychographic profiling, analysis of big data, and ad micro-targeting, public discourse in the U.S. has entered a new era. What used to be a public exchange of information and democratic dialogue is now a customized whisper campaign: Groups both ethical and malicious can divide Americans, whispering into the ear of each and every user, nudging them based on their fears, and encouraging them to whisper to others who share those fears.

A Cambridge Analytica executive explained:

“There are two fundamental human drivers … hopes and fears … and many of those are unspoken and even unconscious. You didn’t know that was a fear until you saw something that evoked that reaction from you. Our job is … to understand those really deep-seated underlying fears, concerns. It’s no good fighting an election campaign on the facts because actually, it’s all about emotion.”

The information that you shared on Facebook exposed your hopes and fears. That innocent-looking Facebook quiz isn’t so innocent.

The problem isn't just that these psychographic data were exploited on a massive scale. It's that platforms like Facebook enable people's data to be used in ways that take power away from voters and give it to data-analyzing campaigners.

In my view, this kills democracy. Even Facebook can see that, saying in January that at its worst, social media “allows people to spread misinformation and corrode democracy.”

My advice: Use Facebook with a healthy dose of skepticism.
