
Facebook Keeps Data Secret, Letting Conservative Bias Claims Persist

The thumbs up Like logo is shown on a sign at Facebook headquarters in Menlo Park, Calif.
Jeff Chiu/AP
“You see these crazy numbers on CrowdTangle, but you don’t see how many people are engaging with this compared with the rest of the platform,” Allen said.

Another point researchers raise: All engagement is not created equal.

Users could “hate-like” a post, or click Like as a way of bookmarking, or leave a reaction expressing disgust, not support. Take, for example, the laughing-face emoji.

“It could mean, ‘I agree with this’ or ‘This is so hilariously untrue,’” said data scientist Qian. “It’s just hard to know what people actually mean by those reactions.”

It’s also hard to tell whether people or automated bots are generating all the likes, comments and shares. Former Facebook research scientist Solomon Messing conducted a 2018 study of Twitter that found bots were likely responsible for 66% of link shares on the platform. The tactic is employed on Facebook, too.

“What Facebook calls ‘inauthentic behavior’ and other borderline scam-like activity are unfortunately common, and you can buy fake engagement easily on a number of websites,” Messing said.

Brendan Nyhan, a political scientist at Dartmouth College, is also wary about drawing any big conclusions from CrowdTangle.

“You can’t judge anything about American movies by looking at the top ten box office hits of all time,” Nyhan said. “That’s not a great way of understanding what people are actually watching. There’s the same risk here.”

‘Concerned about being seen as on the side of liberals’

Experts agree that a much better measure would be a by-the-numbers rundown of what posts are reaching the most people. So why doesn’t Facebook reveal that data?

In a Twitter thread back in July, John Hegeman, the head of Facebook’s News Feed, offered one sample of such a list, saying it is “not as partisan as lists compiled with CrowdTangle data suggest.”

But when asked why Facebook doesn’t share that broader data with the public, Hegeman did not reply.

It could be, some experts say, that Facebook fears the data would be used as ammunition against the company at a time when Congress and the Trump administration are threatening to rein in the power of Big Tech.

“They are incredibly concerned about being seen as on the side of liberals. That is against the profit motive of their business,” Dartmouth’s Nyhan said of Facebook executives. “I don’t see any reason to think that they have a secret, hidden liberal agenda, but they are just so unwilling to be transparent.”

Facebook has been more forthcoming with some academic researchers looking at how social media affects elections and democracy. In April 2019, it announced a partnership that would give 60 scholars access to more data, including the background and political affiliation of people who are engaging with content.

One of those researchers is University of Pennsylvania data scientist Duncan Watts.

“Mostly it’s mainstream content,” he said of the most viewed and clicked-on posts. “If anything, there is a bias in favor of conservative content.”

While Facebook posts from national television networks and major newspapers get the most clicks, partisan outlets like the Daily Wire and Breitbart routinely show up in top spots, too.

“That should be so marginal that it has no relevance at all,” Watts said of the right-wing content. “The fact that it is showing up at all is troubling.”

‘More false and misleading content on the right’

Accusations from Trump and other Republicans in Washington that Facebook is a biased referee of its content tend to flare up when the social network takes action against conservative-leaning posts that violate its policies.

Researchers say there is a reason why most of the high-profile examples of content warnings and removals target conservative content.

“That is a result of there just being more false and misleading content on the right,” said researcher Allen. “There are bad actors on the left, but the ecosystem on the right is just much more mature and popular.”

Facebook’s algorithms could also be helping more people see right-wing content that’s meant to evoke passionate reactions, she added.

Because of the sheer amount of envelope-pushing conservative content, some of it veering into the realm of conspiracy theories, Facebook’s moderation of it is also greater.

Or as Nyhan put it: “When reality is asymmetric, enforcement may be asymmetric. That doesn’t necessarily indicate a bias.”

The attacks on Facebook over perceived prejudice against conservatives have helped fuel the push in Congress and the White House to reform Section 230 of the Communications Decency Act of 1996, which shields platforms from lawsuits over what users post and gives tech companies the freedom to police their sites as they see fit.

Joe Osborne, a Facebook spokesman, said in a statement that the social network’s content moderation policies are applied fairly across the board.

“While many Republicans think we should do one thing, many Democrats think we should do the exact opposite. We’ve faced criticism from Republicans for being biased against conservatives and from Democrats for not taking more steps to restrict the exact same content. Our job is to create one consistent set of rules that applies equally to everyone.”

Osborne confirmed that Facebook is exploring ways to make more data available in the platform’s public tools, but he declined to elaborate.

Watts, the University of Pennsylvania data scientist who studies social media, said Facebook is sensitive to Republican criticism, but no matter what decisions it makes, the attacks will continue.

“Facebook could end up responding in a way to accommodate the right, but the right will never be appeased,” Watts said. “So it’s this constant pressure of ‘you have to give us more, you have to give us more.’ And it creates a situation where there’s no way to win arguments based on evidence, because they can just say, ‘Well, I don’t trust you.’”
