Facebook Papers | How Algorithms promoted Polarization and Hatred | Dhruv Rathee - YouTube

Channel: Dhruv Rathee

[0]
Hello, friends!
[0]
One of the largest social media platforms in the world,
[3]
Facebook is facing some serious allegations.
[6]
It's being said that Facebook has encouraged hate speech and violence
[11]
in several countries across the globe.
[12]
That it has indirectly supported extremist groups and misinformation,
[16]
and that because of Facebook's algorithm, in many places around the world,
[20]
instances of riots and violence were seen in real life.
[23]
These allegations were made by none other than an ex-employee of Facebook.
[29]
Thousands of internal documents of Facebook have been leaked.
[32]
That's how we came to know all this.
[34]
These leaked documents are now known as the 'Facebook Papers.'
[40]
"Pressure has been mounting on tech giant Facebook
[43]
after a former employee, Frances Haugen,
[45]
turned whistle-blower."
[46]
"...has told US lawmakers,
[48]
that she believes Facebook's products
[49]
harm children, stoke division and weaken democracy."
[52]
"My fear is that without action,
[54]
divisive and extremist behaviour that we see today
[56]
are only the beginning."
[57]
"Facebook CEO Mark Zuckerberg is responding to the leaked documents
[60]
that suggest that the platform took little action
[63]
to address its role in spreading misinformation."
[72]
The first question to arise is
[73]
who leaked these internal documents?
[76]
Who is the whistle-blower?
[77]
The answer to it, friends, is Frances Haugen.
[80]
This 37-year-old woman,
[82]
who is an engineer,
[83]
had a really successful tech career.
[86]
She has been a product manager at multiple large companies.
[89]
Like Google, Pinterest and Yelp.
[91]
According to her LinkedIn profile,
[93]
she has done a management program at Harvard Business School.
[96]
And then went on to co-found a dating platform.
[99]
In 2018, Facebook had approached her
[102]
to join their company.
[103]
She had insisted then that she would work
[106]
only in a role where she could counter fake information.
[111]
A job related to democracy.
[113]
Facebook had then employed her in 2019.
[116]
In their Civic Integrity team.
[118]
This team was created by Facebook,
[120]
to oversee election interference issues all over the world.
[123]
But this team was disbanded,
[126]
after the 2020 US Presidential Election.
[130]
When this happened, it bothered Frances,
[132]
and she started feeling disillusioned.
[134]
She felt that Facebook wasn't serious about countering election interference.
[139]
But she still continued working at the company.
[141]
But with the passage of time,
[143]
her frustration only grew.
[145]
She saw Facebook's inaction
[148]
about countering misinformation.
[150]
Because of the misinformation spread on Facebook,
[153]
cases of violence were seen around the world.
[155]
When she had enough of that, finally in May 2021,
[160]
she resigned from her job.
[162]
But some weeks before she resigned,
[164]
she copied and downloaded thousands of internal documents from the company's computers.
[170]
And she decided to become a whistle-blower,
[173]
and reveal this to the world.
[176]
After doing so, she reached out to
[178]
a journalist at the Wall Street Journal.
[180]
But as I told you in the video on Pandora Papers,
[183]
when there are so many documents,
[185]
one person can't analyse all the documents alone.
[187]
Help from many other people is needed.
[190]
That's why, in this case too,
[192]
17 other US news organisations came together,
[196]
and formed a consortium
[198]
and they were all given access to the documents.
[200]
The organisations then started analysing the documents.
[204]
Washington Post, New York Times, Associated Press,
[207]
The Atlantic, Reuters, The Wire, CNN were among the big names.
[210]
Additionally, a consortium of some European news agencies was also formed.
[214]
They were given all the documents.
[216]
They analysed these and then
[218]
this revelation was brought to the public.
[220]
This project was named The Facebook Papers.
[223]
To show how serious she is about the allegations,
[227]
not only did Frances approach the media,
[229]
but she also filed 8 complaints against Facebook with the US Securities and Exchange Commission.
[234]
Apart from this, on 5th October,
[236]
she appeared before a US Senate committee
[239]
to record her testimony.
[240]
Frances has said that through this revelation,
[243]
she isn't aiming to defame Facebook.
[246]
Instead, she wants Facebook to acknowledge these issues
[249]
and to rectify the internal problems.
[252]
Because of Facebook's algorithm,
[254]
democracy is literally in danger all around the world.
[257]
Children are in danger.
[258]
And violence is spreading globally.
[260]
"Having worked in four different types of social networks,
[263]
I understand how complex and nuanced these problems are.
[266]
However,
[267]
the choices being made inside Facebook are disastrous.
[270]
For our children, our public safety, our privacy and for our democracy.
[274]
And that is why we must demand Facebook make changes."
[277]
How exactly is this happening?
[279]
Come, let's talk about what's revealed in the Facebook Papers.
[282]
In February 2019, Facebook's researchers set up a dummy account,
[288]
to see the experience of an Indian user.
[291]
The aim was to see what the Facebook algorithm would recommend to this user.
[298]
And the recommendations that they saw,
[300]
were shocking.
[301]
This test account,
[303]
within 3 weeks,
[305]
was inundated with fake news and inflammatory content in its newsfeed.
[311]
Violence, photos of beheadings,
[313]
photos of corpses,
[315]
within 3 weeks, Facebook started recommending these to this dummy account.
[320]
This showed that Facebook's algorithm recommends such hateful and violent things on its own.
[328]
If you remember, around February 2019, the Pulwama attack had happened.
[331]
So this experiment was closely tied to it.
[334]
Because in this experiment it was found that
[336]
the hateful and violent posts
[338]
were often against Pakistan.
[340]
Several images with fake news were being circulated
[343]
with fake photos of bombs
[346]
and the crudest language was being used against the people of Pakistan.
[349]
Against a specific community.
[351]
The researchers also observed that
[353]
on their video service Facebook Watch,
[355]
the feed for videos on Facebook,
[358]
if the user didn't indicate the type of videos they wanted to watch
[361]
by liking different videos,
[363]
then what the algorithm automatically recommended
[366]
were pornographic videos.
[368]
Softcore porn.
[369]
This was documented in an internal report.
[377]
Among the thousands of documents leaked by Frances Haugen,
[381]
this report is one of them.
[384]
In its response, Facebook's spokesperson has said
[387]
that Facebook always tries to
[389]
minimise hate speech on its platform.
[391]
They said that after this report,
[394]
Facebook had made changes to their algorithm to improve the system.
[397]
It sounds fair.
[399]
It sounds like Facebook took action after the research.
[403]
But many other documents in the Facebook Papers,
[406]
show that
[408]
the reality isn't what Facebook is depicting.
[410]
The first thing to be noted was that
[412]
of the money spent by Facebook
[414]
for fighting misinformation,
[416]
for fighting fake news,
[418]
87% of their budget
[422]
is spent on only one country.
[424]
The United States.
[425]
Only 13% of their budget is used
[428]
to fight fake news in the rest of the world.
[431]
Inequality and injustice like this,
[433]
can be seen at many places in the world
[436]
in many aspects.
[438]
But for jobs at least,
[440]
you have the Hirect app.
[442]
Where you can talk directly to the founders and CXOs of more than 30,000 start-ups.
[447]
The job postings on the Hirect app
[450]
are 100% verified.
[452]
There are no consultancies or middlemen.
[454]
And Hirect claims that all the postings are scam free
[457]
and spam-free.
[458]
If you are a job seeker,
[459]
you're looking for jobs,
[461]
then you can install this app for free
[462]
through the link given in the description below.
[464]
And it is very easy to use the app.
[466]
Install this app.
[467]
And then you'd have to answer some questions
[469]
to create your profile
[471]
and based on your profile,
[474]
the algorithm of the app will match you with the recruiters
[477]
where you can potentially get jobs.
[480]
So do check it out
[481]
and here, I'd also like to thank the Hirect app for sponsoring this video.
[485]
Coming back to the topic.
[486]
Secondly, these documents have revealed that
[489]
even if Facebook wanted to fight against hate speech,
[492]
it couldn't do so
[493]
because in several languages,
[495]
the technology wasn't available to Facebook
[497]
to detect violence and hate speech.
[499]
For example, for Hindi and Bangla,
[502]
Facebook did not have classifiers.
[504]
What are classifiers?
[505]
These are basically Facebook's language algorithms
[508]
that could automatically detect hate speech
[512]
and violent posts.
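To make the idea concrete, here is a minimal, purely illustrative sketch in Python of how such a classifier can work: a small model is trained on labelled example posts and then used to flag new posts automatically. The library choice, the toy posts and the labels are all assumptions for illustration; this is not Facebook's actual system.

```python
# Toy sketch of a violence/hate-speech classifier (NOT Facebook's system).
# Assumes scikit-learn is installed; the example posts and labels are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled training posts: 1 = violence/incitement, 0 = benign.
texts = [
    "attack them and burn their homes",
    "they deserve to be beaten",
    "had a lovely lunch with my friends",
    "beautiful weather for a walk today",
]
labels = [1, 1, 0, 0]

# TF-IDF features + logistic regression: a classic baseline text classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# New posts can now be scored automatically and flagged for review.
print(model.predict(["burn them all down"]))        # likely [1]
print(model.predict(["lunch with friends today"]))  # likely [0]
```

A classifier like this only works if it has been trained on labelled examples in the target language, which is why separate classifiers have to be built for Hindi, Bangla and every other language.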
[513]
Facebook has admitted
[515]
that for Hindi and Bangla languages,
[517]
they put up classifiers for the violence and incitement categories
[522]
only in early 2021.
[525]
At the beginning of this year.
[527]
Before this,
[528]
their algorithms couldn't even recognise the violent posts in Hindi and Bangla.
[533]
And even today, as per Facebook's spokesperson,
[536]
they have classifiers for only 5 Indian languages.
[541]
Whereas Facebook is present in more than 20 languages in India.
[545]
It means that spreading violence and hate speech on Facebook in those languages
[550]
is very easy and carries no consequences.
[552]
In its defence,
[554]
Facebook claims that even though they do not have the technology or the algorithms,
[557]
to fight hate speech,
[559]
they have a dedicated team of 15,000 people for India
[563]
for content review
[565]
that covers 20 Indian languages.
[566]
As an example, they've said
[568]
that from May 15th, 2021,
[570]
to August 31st 2021,
[572]
their team proactively removed 877,000 pieces of hate speech content in India.
[580]
When Facebook acknowledges
[582]
that their team of humans has helped remove hate speech from their platform,
[585]
what's the issue?
[586]
The issue, friends, is that
[588]
the next thing exposed in the Facebook Papers is that
[592]
Facebook isn't impartial in removing all types of hate speech.
[596]
Facebook lets some hate speech remain on the platform.
[601]
This was revealed in an internal study.
[610]
The Wall Street Journal quotes this study:
[613]
"much of the content posted by users, groups and pages
[617]
from the Hindu nationalist Rashtriya Swayamsevak Sangh group or RSS
[621]
is never flagged."
[622]
Meaning that much of the content posted by the RSS group,
[626]
is never flagged.
[628]
And why?
[628]
Because Facebook has designated RSS
[630]
as not to be removed
[632]
because of 'political sensitivities.'
[633]
What are these 'political sensitivities?'
[635]
The New York Times reports
[637]
that this basically refers to Facebook's concerns
[639]
about its operations in the country.
[641]
Facebook is worried that they wouldn't be able to properly continue their business in India
[645]
if it started flagging such content.
[647]
Another example that was revealed was
[649]
the violent content of Bajrang Dal.
[651]
Even before this, a similar disclosure had been made.
[655]
In August 2020.
[656]
The Wall Street Journal had written an article regarding this.
[659]
A BJP politician had given an extremely violent hate speech against Muslims.
[663]
Who was the politician?
[664]
It wasn't Anurag Thakur, Pravesh Mishra, Adityanath, or Tejasvi Surya.
[668]
Instead, it was T. Raja Singh.
[670]
He had literally threatened to shoot people.
[672]
Some Facebook employees felt that as per the rules,
[675]
he should be labelled as a 'dangerous individual'
[678]
and banned from the website.
[679]
Facebook employees were about to ban him,
[682]
but Facebook India's top Public Policy Executive,
[684]
Ankhi Das stopped them.
[686]
Apparently, the article quotes Ankhi Das
[689]
as having told her employees that
[691]
"...punishing violations by politicians from Mr Modi鈥檚 party
[693]
would damage the company's business prospects in the country."
[696]
They were literally saying that
[698]
they cannot remove the hate speech
[701]
or else their business prospects in India would be affected.
[704]
When this exposé took place,
[705]
2 months later,
[706]
Ankhi Das resigned from Facebook.
[709]
She said that wanting to pursue her interest in public service
[713]
was the reason for her resignation.
[714]
In December 2020,
[716]
another article was published in the Wall Street Journal.
[718]
It claimed that Facebook's safety team
[720]
had reached the conclusion that
[722]
Bajrang Dal had supported violence in India.
[725]
That's why it should be designated as a 'dangerous organisation.'
[729]
And the organisation should be banned from Facebook.
[732]
But still, this action wasn't taken.
[735]
'Fearing possible attacks.'
[736]
Facebook was scared yet again.
[738]
For the record, let me tell you, friends,
[740]
this isn't happening only in India,
[742]
similar stories regarding Facebook are heard
[745]
from many countries across the world.
[746]
For example, take the case of Vietnam.
[749]
The ruling party in Vietnam is the Communist Party.
[751]
The voices of its opposition are being similarly silenced.
[754]
There also, Facebook has allegedly
[757]
banned the people speaking against the Communist Party from its platform
[761]
or removed their opinions from Facebook.
[764]
Because Facebook was worried
[766]
about Vietnam banning Facebook from the country.
[769]
Now hearing this, you might say that Facebook isn't really at fault.
[772]
Facebook is forced to do so.
[774]
To protect democracy and people's lives
[778]
should Facebook lose out on a huge market?
[781]
It was a question of Facebook's survival.
[784]
Facebook as a company wants to survive,
[786]
that's why it is forced to do such things.
[788]
If you think like this,
[789]
then here is the next allegation.
[791]
It is said that Facebook has intentionally
[793]
chosen profits over people.
[796]
How?
[796]
Frances Haugen has revealed that around 2018,
[799]
Facebook changed its news feed algorithm.
[801]
Earlier,
[802]
the posts were sorted by recency.
[807]
The most recent post by your friends
[810]
would be at the top of the news feed.
[812]
But now, Facebook has changed it to
[814]
sorting by activity by default.
[817]
Meaning the post with the most likes and comments,
[819]
would be at the top.
[821]
And the posts that would be more relevant based on your likes,
[824]
would also be at the top.
[826]
It means that what you see on your news feed now,
[829]
is according to your likes.
[830]
You see the things that you already like.
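Here is a minimal sketch, in Python with invented posts and invented weights, of the difference between the two orderings being described: sorting by recency versus ranking by activity and relevance. The scoring formula is an assumption for illustration, not Facebook's actual algorithm.

```python
# Toy sketch contrasting a chronological feed with an engagement-ranked feed.
# The posts, weights and relevance flag are all invented for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    age_hours: float
    likes: int
    comments: int
    matches_your_interests: bool  # hypothetical "relevant to your likes" signal

posts = [
    Post("friend_a",     age_hours=1,  likes=3,    comments=0,   matches_your_interests=False),
    Post("outrage_page", age_hours=30, likes=5000, comments=900, matches_your_interests=True),
    Post("friend_b",     age_hours=5,  likes=40,   comments=10,  matches_your_interests=True),
]

# Old behaviour described above: newest posts at the top.
chronological = sorted(posts, key=lambda p: p.age_hours)

# New behaviour described above: posts with the most activity, and posts
# aligned with what you already like, float to the top.
def activity_score(p: Post) -> float:
    score = p.likes + 3 * p.comments   # comments weighted higher (assumed)
    if p.matches_your_interests:
        score *= 2                     # relevance boost (assumed)
    return score

by_activity = sorted(posts, key=activity_score, reverse=True)

print([p.author for p in chronological])  # ['friend_a', 'friend_b', 'outrage_page']
print([p.author for p in by_activity])    # ['outrage_page', 'friend_b', 'friend_a']
```

Under the second ordering, whatever draws the most reactions rises to the top, regardless of when it was posted or who posted it.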
[833]
The political opinions that you already have
[835]
are the ones you'd see on your news feed.
[838]
Earlier when everything was sorted by time,
[842]
you'd often see the counter-opinions from people.
[845]
Things with which you might disagree.
[847]
Because everything is being aligned with your likes,
[850]
social media bubbles are being created.
[852]
It creates polarization.
[854]
This was explained very well
[855]
in the Netflix documentary, The Social Dilemma.
[858]
And because Facebook promotes those posts with more likes and comments,
[863]
it increases the chances of breeding negativity as well.
[866]
Think about it.
[868]
On which post would you comment more?
[870]
A post that is happy,
[873]
that pleases you, a post by your friend?
[876]
Or a post that is very debatable?
[879]
With a lot of controversy going on in the comments,
[882]
and you write your opinion there
[883]
and another person comments on your opinion
[886]
you reply, they reply, an argument ensues,
[888]
there's debate, controversy and anger.
[891]
Posts like these get more comments.
[893]
And that is why those are promoted more by the algorithms.
[896]
I'd say that this thing is present on YouTube too, to an extent
[900]
because the videos which do not have much controversy in them
[903]
without much racy content,
[906]
like my video on Ravi Dahiya,
[907]
or the video on Vrikshit Foundation,
[909]
get the least amount of views.
[911]
And the videos which are full of controversy,
[914]
get the most views.
[915]
Like the video on Kangana Ranaut,
[917]
or the video on Aryan Khan.
[919]
You can say that it is human nature,
[921]
that people are more attracted to controversy.
[923]
A big example of this is the QAnon conspiracy theory on Facebook.
[927]
This came up in the USA around 2017.
[930]
This conspiracy theory says that the world is controlled by a secret group
[935]
who worship demons.
[937]
Its alleged members include Barack Obama, George Soros,
[939]
Tom Hanks and the Dalai Lama.
[941]
Researchers have discovered that if you
[942]
liked Donald Trump's page,
[944]
a few days later, QAnon conspiracy theory pages were suggested as well.
[949]
Automatically by Facebook's algorithm.
[951]
It isn't that the algorithm was developed to do these underhanded things.
[955]
The algorithm is merely designed around human nature.
[957]
The things that you like are the things you're shown repeatedly.
[961]
This leads to polarization.
[963]
And this is how you get sucked into the circle of conspiracy theories.
[966]
Because in that group, you wouldn't find anyone
[969]
who would present a counter opinion.
[970]
Because if someone were against it,
[972]
they'd be shown other things that they like instead.
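As a rough illustration of how liking one page can pull in suggestions like these, here is a tiny "people who liked this page also liked" recommender in Python. The users, the pages and the co-like logic are all invented; real recommendation systems are far more complex, but the feedback loop is the same.

```python
# Toy "people who liked X also liked Y" recommender (invented data).
# The suggestion depends only on what co-occurs in other users' likes,
# not on whether the suggested page's content is true or safe.
from collections import Counter

# Hypothetical like histories of other users.
user_likes = {
    "user_1": {"PoliticianPage", "ConspiracyPage"},
    "user_2": {"PoliticianPage", "ConspiracyPage"},
    "user_3": {"PoliticianPage"},
    "user_4": {"CookingPage"},
}

def recommend(liked_page, top_n=1):
    """Suggest the pages most often liked together with `liked_page`."""
    co_likes = Counter()
    for pages in user_likes.values():
        if liked_page in pages:
            co_likes.update(pages - {liked_page})
    return [page for page, _ in co_likes.most_common(top_n)]

# Like PoliticianPage, and the most co-liked page gets pushed at you next.
print(recommend("PoliticianPage"))  # ['ConspiracyPage']
```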
[975]
Eventually, what was the consequence of this?
[977]
We witnessed the Capitol riots in the USA.
[981]
A coup in Myanmar.
[983]
Ethnic violence in Ethiopia.
[986]
Experts believe that these three examples
[989]
were the consequences of Facebook's algorithm.
[992]
Many Facebook employees felt frustrated about this
[996]
and many had resigned from their jobs at Facebook.
[998]
Other than these, the internal documents also reveal
[1001]
how body image issues in girls
[1005]
were worsened by Facebook.
[1006]
1 in 3 teenage girls,
[1009]
after joining Facebook,
[1010]
feel more insecure
[1012]
about their body image.
[1014]
There was an internal project at Facebook in 2019,
[1017]
named Project Daisy,
[1018]
that aimed at improving people's mental health.
[1021]
How?
[1022]
A presentation was shown to Mark Zuckerberg.
[1025]
With the suggestion to hide the number of likes.
[1029]
If people couldn't see the like counts,
[1031]
people would feel less insecure
[1033]
about sharing their posts.
[1035]
It would even lower anxiety
[1037]
among the teenagers and young people.
[1039]
But this plan was later discarded.
[1042]
It was rejected with the claim that it was 'ineffective.'
[1045]
Additionally, there was also a complaint
[1047]
that in 2019, Facebook and Instagram knew
[1050]
that their platforms were being used
[1052]
even for human trafficking.
[1054]
There are so many things revealed by the internal documents
[1058]
that this video could go on for hours.
[1059]
But for now, I'd like to stop here.
[1062]
Facebook has denied these allegations.
[1065]
Although they haven't claimed that the internal documents are fake,
[1068]
instead, they've said that these documents were cherry-picked
[1072]
to defame Facebook.
[1075]
But the whistle-blower has said that
[1077]
there was nothing to cherry-pick,
[1080]
there were only cherries.
[1081]
Facebook has acknowledged that
[1082]
they need to bring more regulations and improvements
[1086]
to make their platform better.
[1088]
Talking about solutions, it is very clear here
[1090]
that Facebook has to take action on these problems as soon as possible.
[1094]
To change their algorithms.
[1096]
Because lives are literally at stake here.
[1098]
But what can you do as an individual?
[1100]
For this, friends, my suggestion would be:
[1103]
when you surf social media,
[1105]
when you're scrolling,
[1106]
do so consciously.
[1108]
Think about it and realise that
[1109]
what you are shown on your news feed,
[1112]
is according to your likes and dislikes.
[1114]
Ensure that you are not going down the rabbit hole of a conspiracy,
[1118]
Getting enraged against a community or some people,
[1122]
or being violently provoked against them.
[1126]
Look into yourself.
[1128]
You may think that you are making a conscious decision while scrolling,
[1132]
but remember, it is their algorithm
[1134]
that is perhaps making you their puppet.
[1137]
Always be aware.
[1138]
This would be my only advice here.
[1140]
Thank you very much!