- Vanity Fair – “Each time a fresh scandal plays out at Facebook, the company positions itself as playing catch-up. Apparently alarmed and dismayed at the ready spread of fake news on its platform, Facebook de-emphasized posts from media companies. Faced with a decline in user trust, Facebook admitted it might be inherently damaging to democracy. And when a Cambridge Analytica contractor was found to have siphoned data from millions of Facebook users through an app, the company retroactively promised to fix the problem. But an internal memo written in June 2016 by top Facebook executive Andrew Bosworth, a close confidant of Mark Zuckerberg’s, and published by BuzzFeed News on Thursday, all but blows up the naïveté the company has cultivated, suggesting that Facebook has always been aware of the potential for bad actors to abuse it, and has actively chosen to conceal those concerns from the public. In the memo, Bosworth argued that Facebook could very well be used to bully or to coordinate acts of terrorism, but that the company’s mission—connecting people—should take precedence over the possibility of such negative outcomes…After BuzzFeed published the memo, Bosworth—who is now the V.P. of consumer hardware at Facebook, and who has taken to Twitter time and time again to defend the company as it faces new rounds of controversies—said he didn’t agree with his own thoughts, even as he wrote them. “The purpose of this post, like many others I have written internally, was to bring to the surface issues I felt deserved more discussion with the broader company,” he said. In other words, he argued, the post was a thought exercise meant to provoke, rather than a legitimately held viewpoint. Zuckerberg, who has been relatively quiet following his mea culpa tour last week, distanced himself from Boz’s comments. “Boz is a talented leader who says many provocative things,” he told BuzzFeed. “This was one that most people at Facebook including myself disagreed with strongly. We’ve never believed the ends justify the means.””
- NiemanLab – This is how Cambridge Analytica’s Facebook targeting model really worked — according to the person who built it. “The method was similar to the one Netflix uses to recommend movies — no crystal ball, but good enough to make an effective political tool. The researcher whose work is at the center of the Facebook–Cambridge Analytica data analysis and political advertising uproar has revealed that his method worked much like the one Netflix uses to recommend movies. In an email to me, Cambridge University scholar Aleksandr Kogan explained how his statistical model processed Facebook data for Cambridge Analytica. The accuracy he claims suggests it works about as well as established voter-targeting methods based on demographics like race, age, and gender. If confirmed, Kogan’s account would mean the digital modeling Cambridge Analytica used was hardly the virtual crystal ball a few have claimed. Yet the numbers Kogan provides also show what is — and isn’t — actually possible by combining personal data with machine learning for political ends. Regarding one key public concern, though, Kogan’s numbers suggest that information on users’ personalities or “psychographics” was just a modest part of how the model targeted citizens. It was not a personality model, strictly speaking, but rather one that boiled down demographics, social influences, personality, and everything else into a big correlated lump. This soak-up-all-the-correlation-and-call-it-personality approach seems to have created a valuable campaign tool, even if the product being sold wasn’t quite as it was billed.”
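The Netflix analogy maps onto a standard two-step pipeline: compress a sparse user-by-item matrix (here, Facebook page likes rather than movie ratings) into a handful of latent factors, then fit a simple model from those factors to whatever trait you want to predict. The sketch below is a minimal illustration of that general technique in Python, on synthetic data; every dimension, parameter, and variable name is an assumption chosen for clarity, not a reconstruction of Kogan’s actual code or data.

```python
# Illustrative sketch of a Netflix-style trait model: reduce a sparse
# user x page-likes matrix to a few latent factors, then regress a
# target trait (personality score, political lean, etc.) onto them.
# All data and parameters below are synthetic placeholders.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_users, n_pages, n_factors = 2_000, 1_000, 30

# Synthetic low-rank "likes" data: users and pages share hidden factors,
# which is what makes recommender-style recovery possible at all.
user_f = rng.normal(size=(n_users, n_factors))
page_f = rng.normal(size=(n_pages, n_factors))
noise = rng.normal(scale=4.0, size=(n_users, n_pages))
likes = (user_f @ page_f.T + noise > 2.0).astype(float)

# The trait is correlated with the same hidden factors plus noise --
# the "big correlated lump" the article describes, in which demographics,
# social influences, and personality are not separated out.
trait = user_f @ rng.normal(size=n_factors) + rng.normal(scale=2.0, size=n_users)

# Step 1: compress likes into latent dimensions (the analogue of the
# matrix-factorization step in a movie recommender).
latent = TruncatedSVD(n_components=n_factors, random_state=0).fit_transform(likes)

# Step 2: fit a regularized linear map from latent factors to the trait.
X_tr, X_te, y_tr, y_te = train_test_split(latent, trait, random_state=0)
model = Ridge(alpha=1.0).fit(X_tr, y_tr)

# Held-out correlation between predicted and true trait scores -- the
# kind of accuracy figure the article says Kogan reported.
r = np.corrcoef(model.predict(X_te), y_te)[0, 1]
print(f"held-out correlation: {r:.2f}")
```

Nothing in this pipeline is a crystal ball: the correlation it achieves is bounded by how much of the trait’s variance actually shows up in liking behavior, which is why the article likens the result to conventional demographic targeting rather than to mind reading.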