Breaking news. Just moments ago, Facebook announced it is getting rid of fact checkers and also making major changes to what content is allowed. This is just the latest move as Facebook's parent company, Meta, and founder Mark Zuckerberg work to very publicly win favor with Donald Trump.
The recent elections also feel like a cultural tipping point towards once again prioritizing speech. So we're going to get back to our roots and focus on reducing mistakes, simplifying our policies, and restoring free expression on our platforms. More specifically, here's what we're going to do.
First, we're going to get rid of fact checkers and replace them with community notes, similar to X, starting in the US. We tried in good faith to address those concerns without becoming the arbiters of truth.
But the fact checkers have just been too politically biased and have destroyed more trust than they've created, especially in the US. Wow. Getting rid of fact checkers. Claire Duffy, you can see her right here, is here with us now.
So what's going on here? Yeah, John, this is a major reversal. Let's not forget that the reason that Facebook and Meta introduced this third party fact checking system in the first place was because it was accused of allowing foreign actors to spread disinformation and discord around American political elections.
Now the company is saying it is getting rid of those third party fact checkers and will introduce this system called community notes, which is something that Elon Musk has also rolled out on X. It lets users add user-generated context labels to posts, but you're relying on users to do that and to bring in the information. This was announced on Fox and Friends by Joel Kaplan, who is one of the company's top Republicans.
Just last week, we talked about how he was elevated to the company's top policy job. Here's how he described the system. And so somebody can write a note.
And then the way it works is, different people on the platform can sort of vote on that note. And if you get people who usually disagree who all say, yeah, that sounds right, then that note gets put on the post and people see it. X has been doing it for a while.
We think it's working really well, and we're going to adopt that system. So Meta is also announcing a number of other changes to its moderation practices. It's going to scale back some of the automatic filters that catch bad content.
It says it will keep those for things like terrorism and child sexual exploitation, but for lower-severity violations, it's going to ask users to report them before it evaluates them. The company is also moving one of its key safety teams from California to Texas.
Again, an effort to sort of appeal to perhaps users in that part of the country. And Zuckerberg said that this is going to allow more bad things to make it onto the platform. He acknowledged that, but he said it will help with free expression.
A couple points I want to make here. Joel Kaplan announced that on Fox News. Again, everything very public, unsubtle.
I think that's what Meta is doing here, and I think they would perhaps admit to that; I don't think they would run away from it. And the New York Times is reporting that the Trump administration got a heads up about this policy shift beforehand.
So, again, seems to be part of this larger, very public effort to curry favor. Yeah. And Kaplan, did he acknowledge that it was in part because of this change in administration that they are making these changes?
He said that over the last four years, the company feels like it's experienced pressure to crack down on more content, and that the Trump administration offers them an opportunity to pull back a bit there. So he is acknowledging it. I will note, too, the company announced yesterday that it's adding three new board members, including UFC CEO Dana White, who we know is a Trump supporter.
He was stumping for Trump on the campaign trail. So it does feel like this broader shift at the company completely unsubtle. All right, Claire Duffy, thank you very much.
Meta announced this morning that it's making major changes to its content moderation on Facebook and Instagram. Mark Zuckerberg says they're getting rid of fact checkers, replacing them with user-generated community notes, similar to what we know Elon Musk has done with X. The way Mark Zuckerberg put it in his announcement is that the fact checkers were too politically biased, and he's now attempting to restore free expression. The New York Times is reporting, Congressman, that the Trump administration got a heads up about this move.
What do you think of this? Look, it's a lack of profiles in courage. Would Zuckerberg do this if Trump had lost the election?
Is this just an attempt, blatantly, to appease the president going in so he's not attacked? Let's just remember, in 2017, the entire intelligence community said that the Russians attacked our democratic process through social media, and they did it to help the Trump campaign. So social media is an extraordinary weapon against our democracy.
There is a balance to protect our First Amendment rights. But when you take away the guardrails and you see the far-right connections here, it's a great concern. So what can you do about, I'll say, the Elon Musk iteration of the federal government, if that's the direction it's going? He's not elected, not going into the government.
What direction is this heading? Look, what we can do as Democrats is draw attention to what he's done, the fact that, obviously, he wasn't elected, and how this impacts them. When disaster relief was taken out of that first bill, all we could do was communicate to the American people how much it would cost each state, and that disaster relief had to get put back in the next bill.
And it did. But that's all we can do: communicate directly with the American people and our colleagues and hope that they understand that Mr. Musk, while wealthy, is not particularly intelligent about a lot of items. If you're so rich, why aren't you smart?
And he's not connected to the American people; he has a disconnect and isn't really concerned with their needs. So I think those are the limits of what we can do, beyond where we vote in Congress.
Congressman Mike Quigley, thank you for coming in. Let me ask you a question, no pun intended here. Maybe pun intended.
How do you feel about facts? How do I feel? I'm pro-facts. I mean, yeah, look, I'm an engineer.
I'm a numbers guy. I'm a spreadsheet guy. I'm a firm believer that you look at the data, you make decisions, you be super transparent.
Especially for those of us who hold public office, you explain to folks why you're making the decisions you are, based on data, and you move forward. So some things are objectively true.
Other things are objectively not true. Last time I checked, yes. Okay, so the reason I'm asking you is because Meta, the parent company of Facebook, announced this morning that it's getting rid of its fact checkers.
Great. Good. Nobody believes them.
No. And in their own words, there was severe political bias there. If there was a conservative group that had fact checkers, I'd say there's political bias.
If there's a liberal group, if there's social media groups, there's always going to be a bias in what you do. I think in the social media world, the political bias got very, very heavy. They acknowledge it.
They're going to make a change. Can you get better fact checkers, then, rather than getting rid of them altogether? How about not worrying about fact checking?
How about, well, you just told me that there are things that are objectively true and objectively not true. They are. But is it social media's role to prove to you what is true and what is not, or is its role to be an open platform for discussion, debate, opinion, you know, whether folks believe something or not?
I would say that's really more the role of social media, not to be the police of what's true. Without using words like policing, I'm glad we're having this discussion, because it's an important one here. Whose responsibility is it, then, to talk about what's true and not true?
Yours, mine, my kids, our communities? We all have that personal responsibility. It's not the government's.
Oh, there's a freedom of speech aspect to this, of course, but it's not the government's role to tell you in absolute terms what is true and what is not true, what you should believe, or to judge you on what you believe.
Yeah. We're not talking about the government. We're talking about Facebook.
Right? It's a private business. Is it in their business model that we are going to be the police of truth in America?
And when people say something that isn't true, we're going to immediately correct it or ban them? No, of course not. I'm going to guess, I haven't seen the business plan for Facebook, but I'm guessing that Meta doesn't have that in it.
So, you know, when you go on Facebook, you don't know whether what you're reading is true. And that's why you shouldn't use just Facebook, or just Truth Social, or just social media, or just CNN, or just ABC. You use your experiences to understand what you're getting.
And I'll say this about the next generation: I think we're having this tough debate because social media is new for us, right? Our generation invented social media. We didn't have the maturity and the responsibility to handle it appropriately.
I think the next generation does. I think they know the vast majority of what they're seeing online isn't necessarily truthful. There is bias on both sides.