May 17, 2020

President Trump's decision to fire State Department Inspector General Steve Linick on Friday came on the advice of Secretary of State Mike Pompeo, a White House official said Saturday.

The move immediately drew sharp criticism from Democrats, who consider the ouster a retaliatory act; Linick was reportedly looking into Pompeo's alleged misuse of a department appointee to perform personal tasks for him and his wife. The firing also comes on the heels of several other federal watchdog dismissals in recent months.

It wasn't only Democrats who seemed unsatisfied with Trump's decision, though. While the president said he no longer had confidence in Linick, an Obama appointee, Sen. Chuck Grassley (R-Iowa), co-chair of the Whistleblower Protection Caucus, said Saturday that Congress is entitled to a more thorough explanation, noting that inspectors general are "crucial in correcting government failures and promoting the accountability that the American people deserve." He said Trump's reasoning, as it stands, "simply is not sufficient."

Grassley's Democratic colleagues, Rep. Eliot Engel (D-N.Y.) and Sen. Bob Menendez (D-N.J.), took things a step further. They sent letters to the White House demanding officials hand over all records related to Linick's firing, adding that they plan to "look deeply into this matter." Read more at NBC News and The Associated Press. Tim O'Donnell

January 25, 2019

YouTube says it will adjust its algorithm to crack down on conspiracy theory videos that may be harmful to users.

In a blog post Friday, the site said that it will "begin reducing recommendations of borderline content and content that could misinform users in harmful ways," such as flat Earth theories or 9/11 conspiracy theory videos. YouTube won't actually remove anything, but the algorithm change will affect how often such content pops up as a recommendation for a user, or in the "next up" tab after they finish watching a video.

That doesn't mean YouTube will never recommend videos like these, though: the site says only that it will "limit" the recommendations, noting that "these videos may appear in recommendations for channel subscribers and in search results."

Still, the announcement was seen as a necessary move for the company, especially after a BuzzFeed News investigation published Thursday showed how YouTube can steer users from legitimate news toward extremist or conspiracy theory videos. For example, watching a BBC News video of a speech by House Speaker Nancy Pelosi (D-Calif.) eventually led a user to videos about the QAnon conspiracy theory. Charlie Warzel, one of the reporters behind the BuzzFeed report, wrote Friday that YouTube's algorithm change seems "like a really meaningful step forward," while noting that "no credit is due until real people see meaningful changes." Brendan Morrow
