
Whistleblower turns up the heat on Facebook and Instagram

By Mathew Ingram

Last month, the Wall Street Journal published a series of investigative news stories about Facebook, alleging a pattern of questionable behavior at the social network and its photo-sharing service, Instagram. One said that changes to the Facebook news feed algorithm, which were purportedly designed to improve the news-reading experience, actually had the opposite effect and "turned it into an angrier place." Another said that the company knew about the negative effects Instagram had on the mental health of young girls—researchers working at Facebook had repeatedly mentioned it during briefings with senior executives—but Facebook took little or no action. Other Journal stories revealed a little-known feature that allowed celebrities to avoid responsibility for breaching Facebook's rules, and claimed that the company knew its services were being used by drug cartels and human trafficking networks, but routinely failed to do anything to stop it. Facebook, for its part, responded that the Journal stories were inaccurate, and that it cares deeply about the effect its products have on users, including young girls.

The Journal reports were all based on what the paper called "an extensive array of internal company communications" provided by a whistleblower, a former Facebook staffer who copied the documents before quitting the company because they disagreed with its behavior. Last Sunday, on 60 Minutes, the whistleblower revealed herself to be Frances Haugen, a former product manager at Facebook who has also worked for Google, Pinterest, Yelp, and a number of other technology companies. On Tuesday, Haugen testified before the Senate subcommittee on consumer protection, product safety, and data security about the potential dangers of Instagram for young users. (Haugen also posted her testimony to her personal website.) In both her 60 Minutes interview and her congressional testimony, Haugen made the same central point: Facebook knew about the dangers of the recommendation algorithms that power its social network and Instagram, but chose to do nothing.

“The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people,” Haugen said during her testimony to the Senate subcommittee. She encouraged Congress to take action by regulating Facebook, comparing it to the tobacco and automobile industries, which the government further regulated in order to protect consumers from harm. One of the big challenges with Facebook, she argued, is that legislators don’t have any idea how the company's products work because it is reluctant to share data from its own internal research and to provide data to outside scientists. “This inability to see into Facebook’s actual systems and confirm they work as communicated is like the Department of Transportation regulating cars by only watching them drive down the highway,” Haugen told the subcommittee.

Many critics of Facebook, including law professor and former Congressional candidate Zephyr Teachout (who spoke with CJR as part of a discussion series on our Galley platform), have argued that antitrust action is the only solution to the problems the company creates, and that it needs to be broken up and forced to sell off its subsidiaries, including Instagram and WhatsApp, its messaging service. Surprisingly, Haugen said she disagrees with this approach. “I’m actually against the breaking-up of Facebook,” she said in Tuesday’s hearing. “If you split Facebook and Instagram apart, it’s likely that most advertising dollars would go to Instagram, and Facebook will continue to be this Frankenstein that is endangering lives around the world”—without, she noted, the necessary funds to pay for the content moderation and other work required. 

Haugen said she believes that “regulatory oversight and finding collaborative solutions with Congress is going to be key, because these systems are going to continue to exist and be dangerous.” Some industry watchers, however, say Haugen's proposal for government regulation would prove to be a favorable outcome for Facebook; there would be no expensive breakup, and oversight would be susceptible to lobbying, enabling the company to shape regulations to its own benefit.

Since Haugen’s 60 Minutes appearance, the company has sought to discredit its former employee, suggesting she lacked sufficient authority to be credible—even though the bulk of her disclosures drew on the company's own research. Lena Pietsch, Facebook's director of policy communications, dismissed Haugen as “a former product manager who worked for the company for less than two years, had no direct reports [and] never attended a decision-point meeting with C-level executives." Facebook CEO Mark Zuckerberg, who has stayed out of the limelight over the past few weeks, said in a Facebook post that he and the rest of the company care deeply about safety, well-being, and mental health. "It's difficult to see coverage that misrepresents our work and our motives," he wrote. “At the most basic level, I think most of us just don't recognize the false picture of the company that is being painted.”

Below, more on Facebook:

  • In the wake of the Journal's reporting and Haugen's congressional testimony, some critics have suggested that if regulators can't find a way to hold Facebook responsible for the actual content it hosts—because of the legal protections contained in Section 230 of the Communications Decency Act—then they might be able to hold it responsible for the way its algorithms promote certain kinds of content. Researcher Daphne Keller, however, argues that while this might sound like a great idea, it's likely to be considerably harder than it sounds because of the First Amendment, and the way that courts have ruled on similar attempts to govern algorithmic behavior.
  • Nathaniel Persily, a professor of law and director of the Stanford Cyber Policy Center, is asking Congress to pass a law that would grant researchers access to information from Facebook about how its services impact society. In an op-ed for the Washington Post, Persily writes that he resigned last year as co-chair of Social Science One, a partnership between researchers and Facebook, because of what he said was "years of frustration" over broken promises to share more data. "When Facebook did finally give researchers access to data, it ended up having significant errors—a problem that was discovered only after researchers had spent hundreds of hours analyzing it, and in some cases publishing their findings."

Under the Influence

by E. Tammy Kim

Korea has a disinformation problem

Other notable stories: 

  • Meredith, which owns a stable of magazines including People and Better Homes & Gardens, is being acquired by Barry Diller's IAC holding company, and will be merged with IAC's Dotdash digital content group in a deal that is valued at $2.7 billion. Meredith acquired Time Inc. for $1.85 billion in 2018, and later sold Time magazine to Salesforce CEO Marc Benioff; it also sold off Sports Illustrated, Fortune, and Money. Dotdash owns more than a dozen branded websites that post content related to health, finance, and lifestyle; they include Investopedia, Serious Eats, Treehugger, and Brides.
  • Wired magazine looks at how the International Consortium of Investigative Journalists coordinated reporting on the Pandora Papers document leak, which included almost three terabytes of data. "The Pandora paper revelations came from an unfathomably big tranche of documents: 2.94 terabytes of data in all, 11.9 million records and documents dating back to the 1970s," the magazine reports. "But how do you handle a massive leak of such size securely, when documents come in all sizes and formats, some dating back five decades?"
  • The British Broadcasting Corp., ITV, Channel 4 and ViacomCBS are building a shared service that would better promote their streaming brands, according to a report from Bloomberg. The broadcasters are developing a common platform in order to defend themselves against US tech giants and a planned overhaul of British TV laws, the Bloomberg report states. "The work is being loosely organized through the company Digital UK, owned by the BBC, ITV and Channel 4. The idea is to stay relevant and present a united front in negotiations with the new gatekeepers of streaming TV: Silicon Valley operating systems like Alphabet Inc.’s Android and smart TV manufacturers such as Samsung Electronics."
  • Bustle Digital, which revived Gawker this year after a couple of false starts, has rolled out a revamped version of Mic, another media asset that Bustle bought after it filed for bankruptcy. “We are a place you can read a review of Lil Nas X’s new album and also a column about the existential feelings around climate change,” Shanté Cosme, Mic’s editor in chief, said in an interview with the New York Times. The makeover was led by Cosme and Joshua Topolsky, a chief content officer at Bustle Digital and the founder of The Outline, another New York-based media startup that was acquired by Bustle after it failed.
  • Charles McPhedran writes for CJR about a media war taking place in Belarus, featuring two exiles who have created the world’s largest Telegram channel, with over a million subscribers, and two popular YouTube channels. "Nexta’s hyperactive mixture of pointed, sometimes vulgar, videos, reader-generated exclusives, and calls to protest helped launch a street movement that posed one of the most serious threats to the grasp on power exerted by Lukashenko, Eastern Europe’s longest-lasting and perhaps fiercest dictator."
Questions or comments about what you’d like to read with your coffee?
Reach today's newsletter editor, Mathew Ingram, at
Our weekly podcast on media news, The Kicker, is available on Apple Podcasts, Stitcher, and SoundCloud.

Catch up with all of our coverage at
You are receiving this because you signed up for CJR’s regular email newsletter. You can unsubscribe from this list.

Columbia Journalism Review
801 Pulitzer Hall
2950 Broadway
New York, NY 10027