How TikTok's Algorithm Figures You Out | WSJ - YouTube

[0]
- Okay, has anyone else noticed that your For You page
[2]
has been a little too accurate lately?
[5]
It hasn't been things that I'll google
[6]
or I talk about.
[7]
It's been thoughts.
[9]
- TikTok knows everything about us.
[11]
♪ Hold up, don't scroll ♪
[12]
♪ Lemme ask you something first ♪
[13]
♪ Can someone please explain ♪
[14]
♪ How this algorithm works ♪
[15]
- [Narrator] TikTok users often wonder
[16]
how the world's fastest-growing social network seems
[19]
to know them so well.
[20]
- Is TikTok secretly listening to us
[22]
while we're watching videos?
[23]
I don't know.
[24]
(upbeat music)
[25]
- [Narrator] The answer to how this app gets to know you
[26]
so intimately is a highly secretive algorithm,
[30]
long guarded by TikTok's
[32]
China-based parent company, ByteDance.
[35]
- TikTok has been so successful
[36]
in terms of implementing their algorithms.
[39]
- TikTok's algorithm could influence the thinking
[41]
of US youth.
[43]
(upbeat music)
[44]
- [Narrator] To understand how it knows users so well,
[46]
The Wall Street Journal created
[47]
over 100 automated TikTok accounts or bots
[51]
that watched hundreds of thousands of videos on the app.
[54]
We also spoke to current and former executives
[57]
at the company.
[58]
Officially, the company says that shares,
[60]
likes, follows and what you watch all play a role
[64]
in what TikTok shows you.
[66]
We found that TikTok only needs one
[68]
of these to figure you out:
[70]
how long you linger over a piece of content.
[73]
Every second you hesitate or rewatch,
[75]
the app is tracking you.
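To make the narration concrete: a minimal sketch, in Python, of what tracking this single signal could look like. TikTok has not published its implementation; every name below is a hypothetical illustration of logging watch time and rewatches per video.

```python
# Hypothetical sketch of the one signal the Journal found mattered most:
# how long a user lingers on each video. Not TikTok's actual code.
from dataclasses import dataclass, field

@dataclass
class LingerTracker:
    # video_id -> total seconds watched, and number of plays
    seconds_watched: dict = field(default_factory=dict)
    play_counts: dict = field(default_factory=dict)

    def record_play(self, video_id: str, seconds: float) -> None:
        """Log one viewing of a video; rewatches accumulate."""
        self.seconds_watched[video_id] = self.seconds_watched.get(video_id, 0.0) + seconds
        self.play_counts[video_id] = self.play_counts.get(video_id, 0) + 1

    def engagement(self, video_id: str, video_length: float) -> float:
        """Watch time relative to video length; above 1.0 implies a rewatch."""
        return self.seconds_watched.get(video_id, 0.0) / video_length

tracker = LingerTracker()
tracker.record_play("sad_breakup_clip", 35.0)   # watched once...
tracker.record_play("sad_breakup_clip", 35.0)   # ...then rewatched
print(tracker.engagement("sad_breakup_clip", 35.0))  # -> 2.0
```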
[77]
- I just wanna quiet the noise.
[78]
- [Narrator] Through this one powerful signal,
[80]
TikTok learns your most hidden interests and emotions.
[83]
And drives you deep into rabbit holes of content
[86]
that are hard to escape.
[88]
(noise of voices)
[94]
♪ I'm happy, happy guy ♪
[96]
♪ Oh, just a happy, happy, happy ♪
[97]
- [Narrator] The TikTok experience starts
[99]
the same way for everyone.
[101]
Open the app and you'll immediately see an endless string
[103]
of videos in your For You feed.
[106]
Take this new user,
[107]
a 24-year-old from Henry County, Kentucky.
[110]
(jumbled music plays)
[112]
TikTok starts by serving the account a selection
[115]
of very popular videos vetted
[116]
by app moderators.
[118]
Is this person religious?
[119]
- Because I still have a purpose.
[121]
And you still hold a plan for my life.
[123]
- [Narrator] Do they wanna participate in viral dances?
[126]
(upbeat music)
[128]
♪ Hey, hey, who ♪
[129]
(upbeat music)
[133]
- [Narrator] Are they feeling down lately?
[134]
- [TikToker] Just remember, I loved you once.
[138]
And that love goes for a friend,
[140]
family, or any relationship.
[143]
- [Narrator] What TikTok doesn't know
[144]
is that the 24-year-old from Kentucky
[146]
isn't a person at all.
[148]
It's one of the bot accounts programmed
[150]
by The Wall Street Journal.
[152]
Let's call it kentucky_96.
[155]
We set up these accounts to understand
[157]
how TikTok figures out your unexpressed interests.
[160]
We assigned each bot a date of birth and an IP address,
[164]
which told TikTok their location.
[166]
None were given a gender.
[169]
We gave each bot, or user, interests,
[171]
but those interests were never entered into the app.
[175]
The only way our users expressed their interests
[177]
was by rewatching or pausing on videos
[180]
with related hashtags or images.
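The Journal has not released its bot code; what follows is a hedged sketch of the one behavior described here, assuming a bot that rewatches any video whose hashtags overlap its assigned interests and swipes past everything else. The hashtag set is invented for illustration.

```python
# Hypothetical reconstruction of the bots' only behavior: linger on
# videos matching the assigned interests, swipe past everything else.

INTEREST_HASHTAGS = {"#sad", "#depression", "#mentalhealth"}  # e.g., kentucky_96

def react(video_hashtags: set[str]) -> str:
    """Return the bot's action for one video in the For You feed."""
    if video_hashtags & INTEREST_HASHTAGS:
        return "rewatch"   # linger: the only signal the bot ever sends
    return "swipe"         # skip quickly, expressing no interest

print(react({"#funny", "#fail"}))     # -> swipe
print(react({"#breakup", "#sad"}))    # -> rewatch
```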
[183]
Some were into extreme sports.
[186]
Others were interested in forestry.
[190]
Or dance.
[192]
Or astrology.
[193]
- I'm not the babysitter, I'm not the parent.
[196]
- [Narrator] Or some other topic.
[197]
- [TikToker] Keep scrolling if you hate animals.
[200]
- [Narrator] For all our accounts,
[201]
we found that TikTok draws users in at first
[204]
by serving a wide variety of videos.
[206]
Many with millions of views.
[209]
Then as the algorithm sees what you respond to,
[212]
the selection of videos narrows and the view counts
[214]
get lower and lower
[216]
with fewer of them vetted by moderators
[218]
to see if they violate TikTok's terms of service.
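One generic recommender pattern consistent with this observed behavior is explore-then-exploit: heavy sampling from a popular, moderated pool at first, shifting toward a personalized niche pool as the account racks up views. This is an assumption about the shape of the behavior, not TikTok's disclosed method; the pool names and decay rate below are invented.

```python
# A generic epsilon-greedy decay, sketched to match the funnel the
# Journal observed: broad and popular early, narrow and niche later.
import random

def pick_video(popular_pool, niche_pool, videos_seen, decay=0.01):
    """Explore popular content early; exploit niche matches later."""
    explore_prob = max(0.05, 1.0 - decay * videos_seen)
    pool = popular_pool if random.random() < explore_prob else niche_pool
    return random.choice(pool)

popular = ["viral_dance", "cute_dog", "funny_fail"]       # vetted, high views
niche = ["sad_poem", "breakup_audio", "depression_vent"]  # lower views, less vetted
for n in (0, 50, 150):
    print(n, pick_video(popular, niche, n))
```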
[222]
We reviewed our experiment
[223]
and its results with a data scientist,
[226]
and algorithm expert Guillaume Chaslot,
[228]
a former Google engineer
[230]
who worked on YouTube's algorithm.
[232]
He's now an advocate for algorithm transparency.
[235]
He says TikTok is different
[236]
from other social media platforms.
[238]
- The algorithm on TikTok can get much more powerful
[242]
and it can be able to learn
[244]
your vulnerabilities much faster.
[246]
- [Narrator] In fact, TikTok fully learned many
[248]
of our accounts' interests in less than two hours.
[252]
Some it figured out in less than 40 minutes.
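The narration doesn't spell out the exact criterion for "figured out." One plausible way to operationalize it is the point where interest-matching videos come to dominate a trailing window of the feed; the window size and threshold below are assumptions.

```python
# A hypothetical detector for when the feed "locks on" to an interest:
# the trailing-window share of matching videos crosses a threshold.
def locked_on(feed_topics, interest="depression", window=50, threshold=0.5):
    """Return the index at which the trailing-window share of the
    assigned interest first reaches the threshold, else None."""
    for i in range(window, len(feed_topics) + 1):
        recent = feed_topics[i - window:i]
        share = sum(t == interest for t in recent) / window
        if share >= threshold:
            return i
    return None

# Simulated feed: varied at first, then dominated by one topic.
feed = ["dance", "pets", "depression"] * 30 + ["depression"] * 60
print(locked_on(feed))  # index of the video where the feed tipped over
```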
[255]
- [Guillaume] On YouTube, more than 70% of the views come
[258]
from the recommendation engine.
[260]
So it's already huge.
[262]
But on TikTok, it's even worse.
[264]
It's probably like 90-95%
[267]
of the content that is seen
[269]
comes from the recommendation engine.
[272]
(dramatic music)
[274]
- [Narrator] This is a visualization made
[275]
from hashtags attached to the videos our bots watched.
[279]
Think of it as a partial view
[280]
of the universe of TikTok content.
[283]
Here's where we found dance videos,
[286]
over here are the cooking videos.
[289]
The spindly arms stretching out of the center
[291]
represent niche content areas.
[293]
This arm starts with general videos of cute animals
[297]
but if we follow it out to the end,
[298]
we find more specific videos for enthusiasts
[301]
of French bulldogs.
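A map like this can be built by treating hashtags as nodes and linking tags that appear on the same video: heavily co-occurring tags cluster into the mainstream core, while chains of weakly linked tags form the niche "arms." A minimal co-occurrence sketch, with invented example data:

```python
# Build hashtag co-occurrence counts from watched videos; these counts
# are the edge weights a graph layout like the Journal's would use.
from itertools import combinations
from collections import Counter

def hashtag_graph(videos: list) -> Counter:
    """Count how often each pair of hashtags appears on the same video."""
    edges = Counter()
    for tags in videos:
        for a, b in combinations(sorted(tags), 2):
            edges[(a, b)] += 1
    return edges

watched = [
    {"#dance", "#viral"},
    {"#dogs", "#cute"},
    {"#dogs", "#frenchbulldog"},             # one step down a niche arm
    {"#frenchbulldog", "#frenchieowner"},    # the far end of the arm
]
for pair, count in hashtag_graph(watched).items():
    print(pair, count)
```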
[303]
As kentucky_96 starts its journey,
[305]
it moves around within the mainstream
[307]
where TikTok is trying to puzzle out what it wants.
[311]
We programmed kentucky_96
[313]
to be interested in sadness and depression.
[316]
Let's see how long it takes TikTok to figure that out.
[319]
- [TikToker] Life doesn't happen to you,
[321]
life happens for you.
[323]
So if life is taking people away from your life,
[325]
and putting new ones in.
[327]
- [Narrator] Less than three minutes into using TikTok,
[329]
at its 15th video,
[332]
kentucky_96 pauses on this.
[334]
- [TikToker] And that love goes for a friend,
[336]
family or any relationship.
[338]
- [Narrator] Kentucky_96 watches the 35-second video twice.
[342]
Here TikTok gets its first inkling
[344]
that perhaps the new user is feeling down lately.
[346]
- [TikToker] Whoever comes, let them come.
[348]
Whoever stays, let them stay.
[350]
Whoever goes, let them go.
[351]
- [Narrator] The information contained in this single video
[354]
provided the app with important clues:
[357]
the author of the video, the audio track,
[360]
the video description, the hashtags.
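As a hedged illustration of how those four clues could feed a profile, the sketch below credits every feature of a lingered-on video in proportion to dwell time. This is a toy model of the idea in the narration, not TikTok's disclosed design; all names and data are hypothetical.

```python
# Toy interest profile: every clue (author, audio, hashtags, description
# words) of a lingered-on video gets its weight bumped by watch time.
from collections import defaultdict

profile = defaultdict(float)  # feature -> learned interest weight

def update_profile(video: dict, watch_seconds: float, length: float) -> None:
    """Credit every clue in the video with the viewer's dwell time."""
    signal = watch_seconds / length  # 2.0 means the clip was watched twice
    features = (
        [f"author:{video['author']}", f"audio:{video['audio']}"]
        + [f"tag:{t}" for t in video["hashtags"]]
        + [f"word:{w}" for w in video["description"].lower().split()]
    )
    for f in features:
        profile[f] += signal

# The 35-second video kentucky_96 watched twice, as invented data.
update_profile(
    {"author": "@sadquotes", "audio": "slow_piano",
     "hashtags": ["#sad"], "description": "i loved you once"},
    watch_seconds=70.0, length=35.0,
)
print(dict(profile))
```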
[364]
After kentucky_96's first sad video,
[367]
TikTok serves another one 23 videos later,
[370]
or after about four more minutes of watching.
[373]
- [TikToker] I'll leave you alone from now on
[374]
if that's what you want.
[376]
- [Narrator] This one is a breakup video
[377]
with the hashtag #sad.
[381]
- [TikToker] Do you know why I leave you alone?
[383]
'Cause I care about your feelings more than mine.
[385]
- [Narrator] TikTok's still trying to suss out this new user
[388]
with more high view count videos.
[391]
Does the new user wanna watch videos about friendship?
[393]
(upbeat music)
[397]
(playful music)
[398]
Or to laugh at funny fail videos?
[400]
♪ Oh no, oh no, no, no, no, no ♪
[403]
- [TikToker] Nobody's gonna know.
[404]
- [TikToker] They're gonna know.
[405]
(dramatic music)
[406]
- [Narrator] Or do they like videos about home repairs?
[408]
(dramatic music)
[413]
(upbeat music)
[415]
Other information from your phone,
[416]
including location, can impact the videos
[419]
that are shown in a user's feed.
[420]
- And as a Kentuckian,
[422]
I never thought I'd lose my freedom
[424]
over a virus.
[426]
- [Narrator] For instance, kentucky_96 saw lots of videos
[428]
about Kentucky, but whether or not it keeps showing you
[431]
that type of video depends on your response to it.
[434]
A TikTok spokeswoman said the app
[436]
does not listen to your microphone
[437]
or read text messages to serve you personalized videos.
[441]
(pensive music)
[443]
At video 57, kentucky_96 keeps watching a video
[447]
about heartbreak and hurt feelings.
[449]
(pensive music)
[455]
And then at video 60,
[457]
watches one about emotional pain.
[460]
(dramatic music)
[463]
Based on the videos it's watched so far,
[464]
TikTok thinks that maybe this user wants
[466]
to see more about love, breakups and dating.
[470]
So at about 80 videos and 15 minutes in,
[473]
the app starts serving more about relationships.
[476]
But kentucky_96 isn't interested.
[478]
- [TikToker] Your voice, your smile,
[480]
your eyes, your laugh.
[484]
- [Narrator] The user instead pauses on one
[486]
about mental health.
[491]
Then quickly swipes past videos about missing an ex.
[494]
- [TikToker] I miss you and I like having you around.
[498]
- [Narrator] Advice about moving on
[499]
and how to hold a lover's interest.
[501]
- [TikToker] He spends more time on his phone
[503]
when he's around you.
[506]
- [Narrator] Kentucky_96 lingers over this video
[508]
containing the hashtag #depression.
[510]
- [TikToker] Something's wrong with me.
[512]
(sad music)
[515]
And these videos about suffering from anxiety.
[520]
- [TikToker] It's like a reward.
[522]
- [Narrator] 224 videos into the bot's overall journey,
[526]
or about 36 minutes of total watch time,
[529]
TikTok's understanding of kentucky_96 takes shape.
[533]
Videos about depression and mental health struggles
[536]
outnumber those about relationships and breakups.
[539]
From here on, kentucky_96's feed is a deluge
[543]
of depressive content.
[544]
93% of videos shown to the account
[547]
are about sadness or depression.
[550]
- People been lookin' at me.
[552]
I'm just like what you looking at?
[554]
- [Narrator] A TikTok spokeswoman said
[555]
that some of the remaining 7% of videos
[558]
are meant to help the user discover different content.
[561]
But for kentucky_96, such videos were few and far between.
[565]
The majority of videos it was shown outside
[567]
of its depressive rabbit hole were ads.
[570]
A TikTok spokeswoman said that the simulated activity
[573]
generated by The Wall Street Journal's bots
[575]
is not representative of real user behavior
[578]
because humans have a diverse set of interests.
[581]
But even some of our accounts
[582]
with diverse interests rabbit holed.
[586]
We showed our data and many of the videos seen
[588]
by kentucky_96 to Chaslot.
[591]
- What we see on TikTok
[592]
is a bit the same as what we saw on YouTube.
[595]
So basically, the algorithm is detecting
[598]
that this depressing content
[600]
is useful to create engagement
[602]
and pushes depressing content.
[604]
So the algorithm is pushing people
[606]
towards more and more extreme content,
[608]
so it can push them toward more and more watch time.
[614]
- [Narrator] TikTok also says it allows you
[615]
to see less of something
[616]
by selecting the Not Interested button,
[619]
but Chaslot says that's not enough.
[621]
- The algorithm is able to find the piece
[624]
of content that you're vulnerable to,
[626]
that will make you click,
[627]
that will make you watch
[629]
but it doesn't mean you really like it
[631]
and that it's the content that you enjoy the most.
[634]
It's just the content that's the most likely
[637]
to make you stay on the platform.
[639]
- [Narrator] Our bots only escaped rabbit holes
[641]
when we changed their viewing interests.
[643]
When we told one bot to stop watching videos
[646]
about ADHD, the algorithm cut back on that content.
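A hedged sketch of that cut-back, assuming the simplest possible mechanism: topic weights decay each round unless fresh watch time re-credits them. The decay rate and starting weights are invented.

```python
# Hypothetical weight decay: a topic that stops earning watch time
# gradually drops out of the feed, as the Journal saw with ADHD videos.
def decay_step(weights, engaged_topic=None, rate=0.9):
    """Multiply every topic weight by the decay rate, then re-credit
    whichever topic (if any) the user engaged with this round."""
    for topic in weights:
        weights[topic] *= rate
    if engaged_topic is not None:
        weights[engaged_topic] += 1.0

weights = {"adhd": 10.0, "astrology": 1.0}
for _ in range(20):                    # the bot stops engaging with ADHD
    decay_step(weights, engaged_topic="astrology")
print({t: round(w, 2) for t, w in weights.items()})  # adhd weight collapses
```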
[650]
Still, many of The Journal's bots
[652]
were rapidly pushed deep into rabbit holes.
[655]
TikTok learned our bots' most far-flung interests,
[658]
like astrology, but even bots
[661]
with general mainstream interests got pushed
[663]
to the margin as the recommendations
[665]
got more personalized and narrow.
[669]
Bots with an interest in sexual content
[671]
wound up way out here
[673]
watching hashtag #kinktok videos
[675]
about sexual power dynamics.
[679]
And our bot with a general interest in politics
[682]
wound up being served videos
[683]
about election conspiracies and QAnon.
[687]
- Impeach Biden and Kamala.
[691]
It goes from the House to the Senate.
[692]
- [Narrator] Deep in the niche worlds of TikTok,
[694]
users are more likely to encounter
[695]
potentially harmful content
[698]
that is less vetted by moderators
[700]
and violates the app's terms of service.
[704]
- [TikToker] Make them angry and sad.
[706]
They would be so much happier without you.
[708]
- [Narrator] A TikTok spokeswoman said
[709]
that the company catches a lot of banned content,
[712]
which passes through both computer analysis
[714]
and human moderators.
[716]
It also reviews videos reported by users.
[720]
- [TikToker] Don't tell them goodbye, just go.
[722]
- [TikToker] It's not my time.
[723]
Pew.
[724]
(TikToker laughing)
[725]
- [Narrator] There's a lot of fun, silly
[727]
and life-affirming content on TikTok.
[730]
(TikToker screaming)
[731]
But while TikTok can draw out what makes you laugh,
[734]
(TikToker humming)
[735]
it can also make you wallow in your darkest thoughts.
[738]
- Turning the pain from mental to physical works.
[741]
- [Narrator] Without ever needing to eavesdrop on you
[743]
or collect any personal information about you.
[746]
- I've never attempted suicide or anything.
[749]
I've never let it get that far.
[751]
- Whether it's on TikTok,
[752]
on Facebook, on YouTube,
[753]
we're interacting with algorithms
[755]
in our everyday life more and more.
[759]
We are training them and they're training us.
[763]
So we have to study this
[764]
so we understand it better
[766]
and we don't let it go in directions
[769]
that are harmful to society
[771]
or to certain groups of people.
[774]
(dramatic music)