Cambridge Analytica may technically no longer exist, but revelations about its conduct before and during the 2016 election continue to raise concerns about how social media can be used to undermine fundamental democratic processes.
On Wednesday, Cambridge Analytica whistleblower Christopher Wylie testified before the Senate Judiciary Committee. The wide-ranging, three-hour exchange with the company’s former research director did little to allay fears.
The panel also featured Mark Jamison, a visiting scholar at the American Enterprise Institute, and Eitan Hersh, a professor at Tufts University and an expert on voter behavior and election strategies.
Here are some of the major points of discussion:
There were targeted efforts to suppress voters by race.
Wylie testified that Cambridge Analytica actively worked to undermine voters of color and foster political disengagement ― a tactic that figured increasingly in internal conversations just before Wylie left the firm in 2014, though he says he didn’t personally oversee or participate in any of those programs.
According to Wylie, Cambridge Analytica’s algorithms could very accurately predict race, and then go a step further by serving differentiated messages within that group. So instead of serving similar, generic messages to a large group of black people, for example, Cambridge Analytica could isolate and target specific individuals within that group who displayed certain character traits.
“When you pull a random sample of African Americans, they’re all different people,” he said. “Understanding their internal characteristics is a very powerful thing. You don’t treat them just as a black person, you treat them as [individuals].”
Cambridge Analytica likely made similar efforts to suppress the votes of other communities, Wylie said, targeting “anybody with characteristics that would lead them to vote for the Democratic Party.”
The Mercers carried a lot of clout within Cambridge Analytica ― and there’s a chance they violated campaign finance law.
Robert Mercer is a billionaire megadonor who supports conservative causes. The Mercer family initially backed Sen. Ted Cruz (R-Texas) in the 2016 Republican primary, then shifted their support to Donald Trump in the general election.
In 2013, at the urging of Steve Bannon ― who would later serve as White House strategist in the Trump administration ― Mercer invested tens of millions of dollars in Cambridge Analytica to effectively create an American shell company for its British parent, SCL Group.
Sen. Dianne Feinstein (D-Calif.) posited on Wednesday that Mercer’s move gave Cambridge Analytica the appearance of following U.S. election law, which prohibits foreign firms from working on U.S. elections. But Wylie testified that Cambridge Analytica nevertheless featured numerous British staffers, in violation of a 2014 memo from outside legal counsel warning it not to do so.
What’s more, Mercer’s funding came on the condition that Cambridge Analytica work only with Republicans. Sen. Richard Blumenthal (D-Conn.) wondered Wednesday if the “investments” may have been a deliberate attempt to circumvent campaign finance laws.
“Whenever Mercer invested money, it was for [research and development] which ultimately was to the benefit of its clients ― various PACs and campaigns,” Wylie told Blumenthal. He added that Mercer’s funds allowed Cambridge Analytica “to work on projects and charge clients substantially less money for the work that was being done than it would have actually cost had the clients been paying for it themselves.”
After Mercer put money in, “the only restriction was that we not work with Democrats,” he said.
Cambridge Analytica had its hands in a lot of cookie jars ― many of them Russian.
According to Wednesday’s testimony, Cambridge Analytica’s parent company, SCL Group, engaged yet another subsidiary, named AggregateIQ, to bolster the Brexit campaign. Per Wylie, the campaign groups Vote Leave, BeLeave and Veterans for Britain all had contracts with the company. (Separately Wednesday, The Guardian reported that Vote Leave and BeLeave both used the same data sets to target Facebook users, indicating a higher level of coordination than previously realized.)
Wylie also says Cambridge Analytica enlisted the help of Black Cube, an Israeli private intelligence group, to conduct espionage work in Nigeria.
And Cambridge Analytica has numerous ties to Russia. In 2014, Alexander Nix, Cambridge’s CEO, met with executives from Lukoil, Russia’s second-largest oil company, to discuss the firm’s political work in Nigeria and elsewhere.
The firm was also in contact with WikiLeaks founder Julian Assange, though Feinstein acknowledged Wednesday that “the extent of Cambridge Analytica’s connection to WikiLeaks and other Russian interests” isn’t clear.
Wylie said Cambridge Analytica also engaged with contractors who had advanced pro-Russian messages in Eastern Europe, at one point collecting data on how Americans viewed the leadership style of Russian President Vladimir Putin and what their thoughts were on issues “relating to Russian expansionism.”
“There was a lot of contact with Russian companies that made it known this research was being done,” Wylie said. “A lot of noise was being made to companies and individuals who were connected to the Russian government.”
Did it even work?
Probably ― though it’s not clear to what extent.
Armed with enough data points from Facebook, Wylie said, Cambridge Analytica could target people remarkably well. Based on as few as 100 of a person’s Facebook “likes,” he said, “you can get to the same level of accuracy predicting personality traits as your spouse.”
Hersh, the Tufts professor, disputed the idea that predictive capacity translates to persuasive advertising, pointing out that political campaigns have a long history of engaging in tactics that don’t work particularly well.
“Every election brings exaggerated claims about the effects of the latest technologies,” Hersh said, noting it’s in the interest of campaign consultants like Cambridge Analytica to embellish their role in a candidate’s victory. “From everything I’ve publicly seen about [the firm], I’m skeptical its strategies were unusually effective.”
Sen. John Kennedy (R-La.), however, called Hersh’s argument “rubbish,” saying advertising can absolutely have a lasting impact when done well.
“I see kids walking around all the time saying ‘Dilly dilly,’” Kennedy said, referring to a popular tag line from a recent series of beer commercials. “They didn’t just dream that up.”
The Senate Judiciary Committee would like to ask Steve Bannon a few questions.
Wylie left Cambridge Analytica in 2014 and was therefore unable to answer specific questions about the firm’s work during the 2016 election. Instead, he repeatedly referred the senators to Bannon, who helped launch Cambridge Analytica and was reportedly deeply involved in its early efforts to collect Facebook data and use it to target and influence voters.
Bannon served as Trump’s chief campaign strategist during the 2016 election, then assumed a similar role in the White House until his ouster last summer. Per Wylie, Bannon was in charge when Cambridge Analytica began touting its voter suppression tactics to candidates.
Bannon was also behind crafting messages like “drain the swamp” that would ultimately become focal points of the Trump campaign.
“The company learned [there] were segments of the population that responded to messages like ‘Drain the swamp,’ or images of walls, or indeed paranoia about the ‘deep state,’ that weren’t necessarily reflected in mainstream polling or mainstream political discourse,” Wylie said, adding that Bannon “saw cultural warfare as the means to create enduring change in American politics.”
It seems pretty clear: Social media hurts democracy.
Amid much uncertainty about Cambridge Analytica and just how much influence it wielded in 2016, most of the experts testifying Wednesday agreed on one thing: Social media is bad for democracy.
“We’re seeing a resegregation of society that’s catalyzed by algorithms,” Wylie said. Sites like Facebook reward informational echo chambers where partisan views are reinforced instead of challenged. “Instead of a common fabric,” he said, “we’re tearing that fabric apart.”
The further people drift from the mainstream, the more they’re susceptible to absurd conspiracy theories and increasingly vitriolic messaging.
Hersh agreed. “We have a basic human response that we are attracted to provocation and extremism,” he said, “and online platforms are encouraging that behavior. We aren’t drawn to things that are truthful. We’re drawn to what we want.”