Why Conspiracy Theories Work so Well on Facebook


Panic attacks were common. Employees joked about suicide, had sex in the stairwells, and smoked weed on break. A disturbing investigative report by the Verge last week revealed that some of Facebook’s contract moderators—who are tasked with keeping content like beheadings, bestiality, and racism out of your news feed—have turned to extreme coping mechanisms to get through the workday. Some contractors have been profoundly affected by the content they’re exposed to, which may have implications for the rest of us who have grown accustomed to scrolling past sketchy links in our news feeds.

For some workers, repeatedly viewing conspiracy videos and memes became a gateway to embracing conspiracy theories themselves. Contractors told the Verge’s Casey Newton that some of their peers would “embrace fringe views” after days spent combing through conspiracy videos and memes. “One auditor walks the floor promoting the idea that the Earth is flat,” Newton wrote.

Because most misinformation isn’t banned by the platform’s rules, Facebook generally won’t remove it outright, arguing instead that it should be countered with factual reports. Such efforts have not been totally successful: A recent article in the Guardian found that “search results for groups and pages with information about vaccines were dominated by anti-vaccination propaganda” on the social network.


Of course, this isn’t solely Facebook’s problem. A QAnon conspiracy book, abetted by Amazon’s algorithms, is rapidly climbing the site’s sales charts. YouTube announced Friday that it will withhold ads from all videos that contain Momo, a ghoulish figure some parents worry, needlessly, will instruct their children to self-harm or kill themselves in a viral “challenge.” That hasn’t stopped all kinds of Momo content from spreading, of course.

What causes an otherwise logical person to accept any of this? If Facebook’s moderators—the ones supposedly trained to view this troubling material—are so viscerally affected, what hope is there for the rest of us?

A few psychological factors may be at play. People tend to develop a preference for ideas they are familiar with, a social science phenomenon known as the mere-exposure effect or the familiarity principle. Ideas, in this framing, work a bit like infectious viruses.

“The more often you see it, the more familiar something is, and the more familiar something is, the more believable it is,” says Jeff Hancock, communication professor and founding director of the Stanford Social Media Lab.

Conspiracy content is engineered to be persuasive. People accept these theories because they help make sense of a world that feels random; even if they seem far-fetched to the rest of us, they can offer some sense of comfort or security. And seeing those theories repeatedly pop up in your Facebook news feed “starts to undermine the sense that they are fringe,” says James Grimmelmann, a professor at Cornell Law School who studies internet law and social networks.

While a Facebook user may be outraged the first time they see an objectionable video in their feed, it’s unlikely that they’ll muster an equally emotional negative reaction if that experience is repeated again and again. With enough viewings, they may find the messaging palatable or, at the very least, less jarring. Open the door to your mind a crack, and conspiracy can slink in.

“Repeated exposure could also make the moderators think that the conspiracy theory is more prevalent than it is actually in the population and therefore makes them slowly adhere to it,” says Dominique Brossard, professor in the life science communication department at University of Wisconsin-Madison. “This is why social norms are so powerful.”

Should those beliefs become instilled, people will work to reinforce them, simply because we tend to work hard to defend the views we hold. “Once you start to believe, confirmation bias kicks in,” says Hancock, referring to the human tendency to seek out information or news sources that support preexisting beliefs.

A smaller contributing factor may be the psychological phenomenon known as emotional contagion, which is when feelings or emotions spread from person to person. You may not know the term, but you know the feeling. Unconsciously mimicking the facial expression or mirroring the mood of someone you are spending time with are both examples of emotional contagion.


It could be that conspiracy videos instill or spread fear in viewers, who after mirroring that emotion, seek to attribute that unexplained fear to a tangible source. “‘I’m feeling weird and gross and scared. Why is that?’” asks Hancock as a hypothetical. “‘That’s probably because the government is against us.’ You’re rationalizing why you’re upset”—even though what made you upset in the first place was the conspiracy content.

While Facebook moderators’ extreme psychological distress is likely linked to the high quantity of disturbing content they consume on the job—one moderator who spoke with the Verge sifted through up to 400 posts a day—Facebook may still serve conspiratorial or outright false videos, news, and memes to the rest of us.

What does our collective exposure to conspiracy content on Facebook mean for public mental health at large? Are we all just clicks away from identifying as “flat-Earthers”—or worse?

Joseph Walther, director of the Center for Information Technology and Society at the University of California, Santa Barbara, cautioned against extrapolating from the extreme experience of Facebook moderators. If the average social media user is akin to a casual smoker, moderators are consuming the equivalent of two packs a day.

“The moderators you describe, they’re clearly under duress,” Walther tells OneZero. “Their level of exposure to these messages isn’t normal. Their experience doesn’t map onto the rest of us.”

The ultimate difference between individuals who are unaffected by conspiracy content and those who become indoctrinated may come down to what’s happening to them offline. A major blow to an individual’s ego, such as being fired from a job or rejected by a romantic partner, can draw some people toward racism or conspiracy theories, which Walther says are often intertwined. When they’re hurting, humans like to find a scapegoat.

“If they’re already prone to racist ideology, and perceive some threat to the status or purity or privilege of their own race or sex, and then experience a particular vulnerability, conspiracies could be especially appealing,” he says. “Conspiracies exonerate them for their own misfortune.”


While the general public may be less at risk for conspiracy theory indoctrination than Facebook moderators are, that doesn’t mean that a public health risk doesn’t exist.

Petter Tornberg, a sociologist at the University of Amsterdam in the Netherlands, compares the spread of misinformation in a social network to a wildfire. “An echo chamber has the same effect as a dry pile of tinder in the forest,” Tornberg tells OneZero. “It provides the fuel for an initial small flame, which can spread to larger sticks, branches, trees, to finally engulf the forest.”

That social networks like Facebook tend to create an echo chamber—or what Eli Pariser famously called a “filter bubble”—isn’t by accident. It’s the inevitable result of the attention economy.

“Since online platforms are advertisement-based, they are operating in an environment in which they are competing for attention,” Tornberg says. Outrage and anger are efficient ways of maximizing viewers’ attention—people interact with and share sensationalist coverage to such an extent that a cottage industry of spammers has cropped up on social media, where “junk news” can drive ad impressions. There’s little incentive for platforms like Facebook to discourage users from gravitating toward small interest groups where outrageous material is shared.


“Social media do not necessarily have to be designed in a way that pushes us toward tribalization and self-segregation,” says Tornberg. “They could also be designed to help form the foundation for a common world.” For example, he points to Reddit’s “Change My View” community, where users post opinions they accept may be flawed in hopes of having a discussion and understanding other perspectives.

“But that would require different incentives for the companies behind them.” And right now, social networks like Facebook—which made $22 billion in 2018, an increase of nearly 40 percent from the previous year—are doing a very effective job maximizing the current economic incentives.


Automating content moderation would spare humans from the potentially damaging effects of viewing high quantities of traumatic videos and conspiracy theory posts. But while artificial intelligence may seem like an obvious solution for reducing moderators’ and the public’s exposure to conspiracy theories—albeit at the ultimate cost of many of those moderators’ jobs—Hancock says that the technology isn’t there yet.

“Humans are developing and producing this content,” he says. “For a good chunk of time, humans will be the only ones who can make these judgments.” This is in part because flagged posts on Facebook are, by definition, atypical, and current A.I. does not do well with the atypical. “It’s hard to anticipate a lot of the kinds of content that they see,” Hancock says. “Part of it is because it’s fringe stuff.” Fringe content surprises us because it is unusual, and computers are trained to react to situations that they have encountered before. Training a machine to react to yet-to-be-imagined conspiracies is practically impossible at this point.

For Grimmelmann, the existence of content moderation as a profession is a necessary evil for protecting the mental health of the country at large—as long as the country remains addicted to social media in its current form. “This is just capitalism,” he says. “There are unpleasant jobs, like cleaning sewage. We want the kinds of things that those jobs support.”

Just because an unpleasant job is socially necessary, however, doesn’t mean that employers should do nothing to ameliorate those working conditions. Grimmelmann believes the public should pressure platforms like Facebook to commit to keeping conspiracy theories out of users’ news feeds and to making sure their content moderators have the support they need to protect the rest of us, including a reduced workload, more emotional support, and better pay. (Per the Verge, Cognizant’s contractors make $28,800 a year.)

And it means that more people, not fewer, will need to take part in content moderation work. “Unfortunately this is going to have to be a lot bigger industry,” says Hancock. What we’re seeing now is just the beginning. Facebook alone says it employs 30,000 trust and safety workers, including contractors. Half of them work in content moderation, and investment in content review will almost certainly continue to swell.
“We’ve more than doubled this team each of the last two years, adding thousands of people to these efforts,” Justin Osofsky, Facebook’s vice president of Global Operations, wrote in a post published to Facebook on Feb. 25.

So, just as people created these problems to begin with—by exploiting a platform to spread conspiracy theories and other disturbing material—people will have to fix them. No automated process could anticipate every permutation of junk content we’re capable of coming up with. Or, as Hancock puts it: “I don’t think humans will ever be out.”

Google found it paid men less than women for the same job


The story we’re used to hearing is that women get paid less than men. In Google’s case, according to its own internal pay audit, it turned out that male-identified Level 4 Software Engineers received less money than women in that same role. That led to Google paying $9.7 million to adjust pay for 10,677 employees.

It’s not clear how many of the employees who received pay adjustments were men (TechCrunch reached out to Google about this, but the company declined to share any additional data), but Google does cite the underpaying of men as a reason why the company paid more in adjustments for 2018 than in 2017. But The New York Times reports men received a disproportionately higher percentage of the money.

For 2017, Google paid just $270,000 to close any wage gaps for 228 employees across six job groups. Google also cited its new-hire analysis as a reason why the company had to make more adjustments. The analysis, which entailed looking for discrepancies in offers to new hires, accounted for 49 percent of the total amount spent on adjustments.

“Our pay equity analysis ensures that compensation is fair for employees in the same job, at the same level, location and performance,” Google Lead Analyst for Pay Equity and People Analytics Lauren Barbato wrote in a blog post. “But we know that’s only part of the story. Because leveling, performance ratings, and promotion impact pay, this year, we are undertaking a comprehensive review of these processes to make sure the outcomes are fair and equitable for all employees.”

Meanwhile, Google is still battling a class-action pay discrimination lawsuit and is the subject of a Labor Department investigation pertaining to compensation data.

A second patient appears to be cured of HIV, sparking new hope for a novel treatment


The French government has unveiled a complete overhaul of the French Tech Visa for employees working for a tech company. And France is taking a contrarian stance by making it easier to come work in France. Let’s start with the big number. According to French Tech Mission director Kat Borlongan, there are more than 10,000 startups that meet the requirements to access the French Tech Visa and hire foreign employees more easily. (And if you live in the European Union, you don’t need a visa, of course.)

I asked Borlongan why it was important to overhaul the French Tech Visa. “Because our startups needed it,” she told me. “There are two dimensions to that. There’s the economic supply-demand part — all the high-growth startups we interviewed pretty unanimously said that hiring was their number one priority and that they were looking for profiles that weren’t readily available in France.”

“The second is cultural. As strong an ecosystem as the French Tech is becoming, it’s still perceived as overwhelmingly French. To succeed globally, we need to become global ourselves, in terms of team composition, mindset, markets, etc.”

Unlike many American visas, you don’t need to prove that you’ve been looking for candidates in France. You don’t need to pay crazy-high immigration lawyer fees — the French Tech Visa costs €368 in administrative fees. Future employees don’t need to meet any diploma requirement.

The previous version of the visa was limited to roughly 100 companies that were selected as part of the Pass French Tech program. Employees also had to graduate with a master’s degree. So it’s a huge change.

And it’s a pretty sweet deal for foreign employees as well. Your visa is valid for four years and renewable after that. You don’t have to stay in the same company — you can work for another company and keep your visa. Your family also gets visas so they can come with you.

If your startup has raised money from a VC fund, has been part of an accelerator, has received state funding or has the JEI status, then you’re eligible.

La French Tech and the French government have created various lists of VC funds, accelerators, grants, etc. If you meet one of those conditions, you can apply to the visa program. You’ll find most VC funds and accelerators based in France (but not all of them), as well as a few foreign companies (Y Combinator, 500 Startups, Techstars, Entrepreneur First, Plug and Play, Startupbootcamp). Those lists will be updated multiple times per year.

Startups that want to take advantage of the French Tech Visa need to complete an online form first — the full list of VC funds and accelerators is embedded in the form. Future employees can then get their visa from their home country at the French Consulate. The French tech ecosystem has been growing rapidly. And many French startups have chosen to work in English and hire foreign talent. Tech talent is becoming a global talent pool, so this visa scheme is essential for the future of the French tech ecosystem.

Google reveals “high severity” flaw in macOS kernel

Google’s Project Zero team is well-known for its knack for finding security flaws in the company’s own products as well as those manufactured by other firms. Its members locate flaws in software, privately report them to the manufacturers, and give them 90 days to resolve the problem before publicly disclosing it.

Last year, the team revealed vulnerabilities in Windows 10 S and Microsoft Edge. Now, it has exposed a “high severity” flaw in macOS’ kernel.

A security researcher from Google’s Project Zero has discovered a flaw in how macOS’ kernel, XNU, implements copy-on-write (COW) behavior: for COW to be safe, it is essential that the copied memory cannot be modified by the source process without the copy being made aware of it. While COW is a resource-management technique that is not inherently flawed, it appears that Apple’s implementation of it certainly is.

Project Zero found that if a user-owned mounted filesystem image is modified, the virtual management subsystem is not informed of the changes, which means that an attacker can potentially take malicious actions without the mounted filesystem knowing about it. The detailed explanation can be found below:

This copy-on-write behavior works not only with anonymous memory, but also with file mappings. This means that, after the destination process has started reading from the transferred memory area, memory pressure can cause the pages holding the transferred memory to be evicted from the page cache. Later, when the evicted pages are needed again, they can be reloaded from the backing filesystem.

This means that if an attacker can mutate an on-disk file without informing the virtual management subsystem, this is a security bug. MacOS permits normal users to mount filesystem images. When a mounted filesystem image is mutated directly (e.g. by calling pwrite() on the filesystem image), this information is not propagated into the mounted filesystem.
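Copy-on-write itself is a standard resource-management trick: two logical copies share the same underlying bytes until one side writes, at which point the writer quietly gets its own private copy. The toy Python model below is purely illustrative (real COW happens per page inside the kernel's virtual memory system, and the `CowBuffer` class is invented for this sketch); it shows both the normal behavior and the class of hazard Project Zero describes, where the shared backing store is mutated directly, bypassing the COW bookkeeping.

```python
class CowBuffer:
    """Toy model of copy-on-write sharing.

    Illustrative only: real COW operates on memory pages inside the
    kernel, not on user-space buffers like this.
    """

    def __init__(self, data: bytearray):
        self._data = data      # possibly shared with other CowBuffers
        self._shared = False

    def snapshot(self) -> "CowBuffer":
        # A "copy" is instant: both buffers point at the same bytes.
        clone = CowBuffer(self._data)
        self._shared = clone._shared = True
        return clone

    def write(self, i: int, value: int) -> None:
        if self._shared:
            # First write triggers the actual copy ("copy on write").
            self._data = bytearray(self._data)
            self._shared = False
        self._data[i] = value


original = CowBuffer(bytearray(b"hello"))
copy = original.snapshot()

copy.write(0, ord("j"))        # writer gets a private copy
print(bytes(original._data))   # b'hello' -- the source is untouched

# The Project Zero bug is the analogue of what happens next: mutating
# the shared backing store directly (like pwrite() on a mounted image),
# so the COW bookkeeping never notices.
copy2 = original.snapshot()
original._data[0] = ord("x")   # no write() call, so no copy is made
print(bytes(copy2._data))      # b'xello' -- the "copy" silently changed
```

The second half is the security-relevant part: as long as every mutation goes through the COW machinery, the copies stay consistent; a mutation path the machinery doesn't see breaks the guarantee.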

The researcher informed Apple about the flaw back in November 2018, but the company has yet to fix it even after the 90-day deadline passed, which is why the bug is now being made public with a “high severity” label. That said, Apple has accepted the problem and is working with Project Zero on a patch for a future macOS release. A proof-of-concept that demonstrates the problem is available on the dedicated webpage.

Facebook Plans to Become World’s Biggest Central Bank

Facebook is making its own digital currency and could become the largest bank in the world. Here’s why…

“Facebook is looking at pegging the value of its coin to a basket of different foreign currencies…held in Facebook bank accounts.”

— “Facebook and Telegram Are Hoping to Succeed Where Bitcoin Failed,” by Nathaniel Popper and Mike Isaac, The New York Times, February 28, 2019

According to The New York Times (NYT), Facebook is creating its own cryptocurrency. It would be used on WhatsApp, which Facebook owns, to facilitate transactions between users.

Facebook’s move is clearly meant to counter the threat from up-and-coming messaging rivals Telegram and Signal. NYT said that Facebook’s secret effort to build its own cryptocurrency “started last year after Telegram raised an eye-popping $1.7 billion to fund its cryptocurrency project”.

Between Messenger, WhatsApp and Instagram, which Facebook owns, there are a collective 2.7 billion users. If Facebook decides to back the value of its own digital currency with a basket of foreign currencies, then it could potentially become the largest central bank in the world—because that’s what central banks do: print money backed by a basket of foreign currency reserves. Not only would this be monumental in world economic history, it would also become a serious and rapid threat to the existing giants of the finance industry.
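In principle, the value of a basket-pegged coin is just a weighted sum of the reference currencies' values. The sketch below illustrates the arithmetic; the currencies, weights, and exchange rates are invented for the example, since the NYT report gives no details of Facebook's actual design.

```python
# Hypothetical basket peg: all figures here are invented for illustration.
BASKET_WEIGHTS = {"USD": 0.50, "EUR": 0.30, "JPY": 0.20}  # basket composition
USD_PER_UNIT = {"USD": 1.00, "EUR": 1.13, "JPY": 0.0090}  # illustrative FX rates


def coin_value_usd(weights: dict, rates: dict) -> float:
    """USD value of one coin backed by a weighted currency basket."""
    return sum(w * rates[ccy] for ccy, w in weights.items())


print(round(coin_value_usd(BASKET_WEIGHTS, USD_PER_UNIT), 4))  # 0.8408
```

As the reference currencies fluctuate against each other, the issuer would have to hold matching reserves (or rebalance them) to keep each coin redeemable at the basket value, which is exactly the reserve-management role central banks play.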

Bye bye Visa and MasterCard

The two largest credit card companies in the world are Visa and MasterCard. They provide us with the ability to transact without carrying cash. They then settle those transactions with our banks. If Facebook issued its own digital currency, and all its users had a Facebook mobile wallet with Facebook coins in it, then the need for credit cards would diminish more and more.

You make a purchase online, you pay for it with Facebook coins. Many apps have already integrated user sign-in with Facebook accounts. Payment is just another step away. You buy a latte at Starbucks, you tap your phone at the counter to pay with Facebook coins instead of Apple Pay or Google Wallet. Facebook will definitely roll out marketing incentives for users to pay with their Facebook mobile wallets at the point of purchase. This is how existing mobile wallets battle it out for market share.

And if Facebook is backing the conversion value of its coins with real-world money, few would see the need to convert their Facebook coins in and out of their Facebook mobile wallets frequently. But why would you buy Facebook coins in the first place? Because of the big difference it would make to e-commerce.

E-Commerce F.0

When Facebook coins launch, it could bring e-commerce to a whole new level. Facebook will achieve its dream of online merchants being able to post and sell items directly within its apps, down to the payment process. No more clicking links to third-party sites. No more typing in credit card or PayPal account details. Online shopping will become a two-step process in Facebook apps: I see, I like, I click buy, I hit confirm. Facebook coins move from my mobile wallet to the seller immediately.

There will be no exchange rate spreads and far lower transaction fees. Social networks, instant messaging and e-commerce will become truly integrated. Consumers will come to expect the convenience of two-click transactions. Existing e-commerce platforms using traditional payment gateways will have to adapt to compete.

Note: The above scenario has been the case in China with WeChat wallet and Alipay for many years now.

Facebook becomes the biggest currency in the world

The NYT report said that “the Facebook project is far enough along that the social networking giant has held conversations with cryptocurrency exchanges about selling the Facebook coin to consumers.” This could potentially create the biggest private currency exchange market in the world.

The world’s population is about 7.7 billion people. China is the most populous country in the world at 1.4 billion. WhatsApp alone has 1.5 billion users. Put Facebook Messenger and Instagram into the equation, and no other currency has anything close to a potential user base of 2.7 billion. That’s what the Facebook coin could become — the most held and used currency in the world! It will spawn a foreign exchange market the size of which the world has never seen. Facebook could potentially become more powerful than any other central bank.

At the same time, as Facebook coins become more and more frequently traded by users, a market for Facebook coin derivatives will emerge — Facebook coin futures, forward contracts, options, interest rate swaps, etc.

Facebook the next Silk Road?

The NYT article also said, “The big question facing Facebook is how much control it would retain over the digital coin. Working with cryptocurrency exchanges would take at least some of the regulatory burden off Facebook… but… it will be harder for the company to make money from transaction fees and easier for criminals to use the coin for illegal purposes.”

Facebook faces a real dilemma here. It has gotten a lot of public backlash for selling user data in recent years. But if it swings to the other extreme and guarantees absolute privacy like rivals Telegram and Signal, then it could potentially become another Silk Road, the internet black market that first popularized Bitcoin when users started using it to buy and sell drugs. NYT says Facebook coins could be issued as early as the first half of this year. The world will know the answers to these questions soon.