Zuckerberg denies Facebook News Feed bubble impacted the election
In the aftermath of the U.S. Presidential election, Facebook founder Mark Zuckerberg took the stage at Techonomy16 to address concerns that the company didn't do enough to stop the proliferation of fake news on News Feed.
Zuckerberg insisted that more can always be done to improve the quality of the News Feed experience, but that Facebook could not have influenced the outcome of the election.
"Personally, I think the idea that fake news on Facebook, of which it's a very small amount of the content, influenced the election in any way is a pretty crazy idea," Zuckerberg said.
He continued by saying people are looking for a narrative to explain the election. However, he believes that a narrative that implicitly presumes Trump supporters were dumb enough to be manipulated by Facebook is insulting to those voters. In his view, News Feed was just as likely to surface fake news about Clinton, and the narrative ignores that Trump supporters ultimately believed their candidate could bring them a better life.
"People are smart and they understand what's important to them," noted Zuckerberg.
Rather than placing blame on the accessibility of facts, he pointed to content engagement as its own problem. Zuckerberg noted that Trump's posts got more engagement than Clinton's on Facebook.
Facebook research shows that nearly everyone on the platform is connected with at least someone who holds opposing ideological views. The real question for Zuckerberg is how to influence the way people react when they encounter a post they disagree with. The key is to stop them from tuning it out.
To get there, Facebook is making efforts to involve humans more deeply in the creation of the ranking algorithms the company uses for content. News Feed now has a human quality panel that is used to hone rankings. Panelists are given stories and asked to rank them, giving Facebook a better idea of what makes a particular story fulfilling for the user.
Zuckerberg had previously only addressed the election in a Facebook post featuring a photograph of his daughter Max. He noted at that time that, "We are all blessed to have the ability to make the world better, and we have the responsibility to do it," but didn't elaborate on what that meant specifically for him and his company.
Adam Mosseri, VP of Product Management for News Feed, echoed much of what Zuckerberg said earlier today in a statement to TechCrunch, though his brief comments were notably less skeptical of the importance of removing propaganda.
"We understand there's so much more we need to do, and that is why it's important that we keep improving our ability to detect misinformation," Mosseri noted.
Despite all of the global concern about Trump's win, Zuckerberg did take a moment to make it clear that he doesn't believe any single person can fundamentally alter the arc of technological innovation.
The following transcription has been edited somewhat for readability. It contains the largest chunk of Zuckerberg's commentary about the role of News Feed in the election. If you want to see the entire talk, you can watch it in its entirety here.
So when it comes to News Feed ranking, I actually think we are very transparent. Every time we add a new signal or make a change, we publish that, right? We explain why we are doing it and what signal we are adding, and we bring people in to talk to them about it. You know, that stuff is right there, and we will continue to do that. That's a big part of what we do, and we take that seriously.
I've seen some of the stories you are talking about around this election, and personally I think the idea that fake news on Facebook, of which it's a very small amount of the content, influenced the election in any way is a pretty crazy idea.
You know, voters make decisions based on their lived experience. We really believe in people. You don't generally go wrong when you trust that people understand what they care about and what's important to them, and you build systems that reflect that.
Part of what I think is going on here is people are trying to understand the result of the election, but I do think that there is a certain profound lack of empathy in asserting that the only reason someone could have voted the way they did is because they saw some fake news. If you believe that, then I don't think you have internalized the message that Trump supporters are trying to send in this election.
The quickest way to refute the idea that this surely had an impact is: why would you think there would be fake news on one side but not the other? We know, we study this, we know that it's a very small volume of anything. Hoaxes are not new on Facebook. There have been hoaxes on the internet and there have been hoaxes before. We do our best to make it so that people can report that, and so that we can, as I said before, show people the most meaningful content that we can.
We've studied this a lot because, as you can imagine, I really care about this. I want what we do to have a good impact on the world. I want people to have a diversity of information. So this is why we study this stuff, to make sure we are having that positive impact. For whatever reason, all the research suggests that this isn't really a problem, and I can go into that in a second, but for whatever reason we have had a really hard time getting that out.
But here is the historical analogy that I think is useful on this. If you go back 20 years and look at the media landscape, there were a few major TV networks in any given local area. There were a few major newspapers that had an editorial opinion, and those were the opinions that you got all your news filtered through.
Regardless of what political leaning you have on Facebook, or what your background is, all the research shows that almost everyone has some friends who are on the other side. Even if you're a Democrat and 90 percent of your friends are Democrats, 10 percent of your friends are Republicans. Even if you live in some state or some country, you're going to know some people who live in a different state or a different country.
So what we observed, and you can go through everything — you can go through religion, you can go through ethnic background, all of these different things. In a lot of cases, the majority of someone's friends might fit their beliefs, but there are always some outliers. That means that the media diversity and the diversity of information you are getting through a social system like Facebook is going to be inherently more diverse than what you would have gotten through watching one of the three news stations and sticking with that, and having that be your newspaper or your TV station 20 years ago.
The research also shows something which is a little bit less inspiring, which is that we analyze not only people's exposure in News Feed to content from different points of view, but also what people click on and engage with.
By far, the biggest filter in the system is not that the content isn't there, or that you don't have friends who support the other candidate or who are from another religion, but that you simply tune it out when you see it. So you have your world view, and you go through, and I think we would all be surprised how many things that don't conform to our world view we just tune out. We just don't click on them, and, you know, I don't know what to do about that. We should work on that.
Presenting people with a diversity of information is an important problem in the world, and one I hope we can make more progress on. But right now, the problem isn't that the diverse information isn't there — it's actually, by any study, more present than in traditional media of the last generation — but we haven't gotten people to engage with it in higher proportions.