TikTok CEO Shou Chew on Its Future — and What Makes Its Algorithm Different | Live at TED2023

TED
21 Apr 2023 · 39:19

Summary

TLDR: In this conversation, Chris Anderson talks with TikTok CEO Shou Chew. Shou Chew shares how he came to join TikTok and explains the algorithm and recommendation system behind the platform's success. He describes TikTok's mission — to inspire creativity and bring joy — and discusses how the platform keeps users safe, especially teenagers. He also covers TikTok's US data-security efforts, including Project Texas, which stores US user data inside the United States. The conversation highlights TikTok's impact on users and content creators and the measures the company is taking to address privacy and data-security challenges.

Takeaways

  • 🎉 Chris Anderson jokes that Shou Chew achieved bipartisan consensus at the US congressional hearing — largely that TikTok should be banned.
  • 📱 Shou Chew shares how he joined TikTok and recounts the platform's founding story.
  • 🔍 TikTok's recommendation algorithm suggests content based on users' interest signals rather than their social graph.
  • 🌍 TikTok gives ordinary people a stage to showcase their talent and a chance to be discovered and succeed.
  • 👨‍🏫 Teachers and educational content, especially STEM content, are widely popular on TikTok.
  • 📉 TikTok takes multiple measures to prevent overuse, including time limits and break-reminder videos.
  • 🛡️ TikTok has clear community guidelines and uses both machines and human reviewers to moderate and remove harmful content.
  • 🗣️ TikTok's transparency measures include letting third parties review its source code, to safeguard free expression and prevent government interference.
  • 🔒 Shou Chew describes Project Texas, which stores US user data within the United States to improve data security.
  • 🌟 Shou Chew outlines TikTok's vision for the next five years: more opportunities for discovery, creation, and connecting communities.

Q & A

  • Question 1: How did Shou Chew come to join TikTok?

    - Shou Chew explains that he is from Singapore. About ten years ago he met two engineers building a product whose idea was to recommend content based on what users liked rather than who they knew. About five years ago, with the rise of 4G and short video, TikTok was born. A couple of years ago he got the opportunity to run the company, and he says it still excites him every day.

  • Question 2: What were the key factors behind TikTok's success?

    - Shou Chew attributes TikTok's success largely to its recommendation algorithm, which suggests content based on users' interest signals. The short-video format and the design optimized for smartphones were also important factors.

  • Question 3: How does TikTok's recommendation algorithm work?

    - Shou Chew explains that the algorithm is essentially pattern recognition over users' interest signals. For example, if a user likes certain videos, the algorithm combines those interest signals to recommend more content of a similar kind.
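Later in the talk, Chew illustrates this with an example ("if you liked videos one, two, three and four, and I like videos one, two, three and five..."), which amounts to overlap-based collaborative filtering. Below is a minimal sketch of that idea; the likes table and all user and video IDs are hypothetical, purely for illustration:

```python
# Minimal sketch of overlap-based collaborative filtering, as described
# in the talk: users who liked the same videos get each other's other
# likes recommended. All user/video IDs here are hypothetical.

def recommend(target, likes, min_overlap=2):
    """Recommend videos liked by users whose likes overlap with target's."""
    mine = likes[target]
    scores = {}
    for user, theirs in likes.items():
        if user == target:
            continue
        overlap = len(mine & theirs)      # shared interest signals
        if overlap < min_overlap:
            continue
        for video in theirs - mine:       # videos target hasn't seen
            scores[video] = scores.get(video, 0) + overlap
    # Highest-scoring unseen videos first
    return sorted(scores, key=scores.get, reverse=True)

likes = {
    "you": {1, 2, 3, 4},
    "me":  {1, 2, 3, 5},
    "him": {1, 2, 3, 6},
}

print(recommend("you", likes))  # videos 5 and 6 surface for "you"
```

Because "you", "me", and "him" all liked videos 1, 2, and 3, each is shown the others' remaining likes — exactly the "he's going to be shown four, five, six, and so are we" behavior Chew describes, repeated at scale and in real time.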

  • Question 4: How does TikTok keep users safe, especially teenagers?

    - Shou Chew says TikTok has clear community guidelines that prohibit violence, pornography, and other harmful content. For minors the experience is restricted: users under 16 cannot use instant messaging, users under 18 cannot livestream, and parents can manage their teens' usage through the Family Pairing feature.

  • Question 5: How does TikTok handle harmful content on the platform?

    - Shou Chew says TikTok has a team of tens of thousands of people, working alongside machines, dedicated to identifying and removing harmful content. TikTok also partners with creators on educational videos that warn users about the risks of dangerous behavior.

  • Question 6: What measures has TikTok taken on data privacy and protection?

    - Shou Chew describes Project Texas, which stores US user data inside the United States, managed and overseen by the American company Oracle. Third-party auditors are also allowed to review the source code, to ensure transparency and data security.

  • Question 7: How does TikTok support content creators?

    - Shou Chew says TikTok gives every user a stage to showcase their talent, whether or not they already have followers. As a result, many talented but previously unknown people have found audiences and success on TikTok.

  • Question 8: How does TikTok help small businesses?

    - Shou Chew cites a restaurant owner in Phoenix who drew in a large customer base by posting on TikTok, earning roughly a million dollars in revenue through the platform last year. TikTok, he says, has opened new business opportunities for many small businesses.

  • Question 9: How does TikTok address teenagers' dependence on screen time?

    - Shou Chew says TikTok proactively serves break-reminder videos when users have been on the app too long. For users under 18, a 60-minute daily time limit is set by default, and parents can manage screen time further through Family Pairing.
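The rules described in the talk — a 60-minute default for under-18 accounts, parent- or user-set overrides, and break reminders that arrive sooner late at night — can be pictured as a small policy function. This is only an illustrative sketch; the reminder thresholds are invented for the example, not TikTok's actual values:

```python
from datetime import time

# Hypothetical sketch of the screen-time policy described in the talk.
# Only the 60-minute under-18 default comes from the source; the
# reminder thresholds below are invented for illustration.

DEFAULT_TEEN_LIMIT_MIN = 60

def daily_limit(age, custom_limit=None):
    """Return the daily limit in minutes (None = no limit)."""
    if custom_limit is not None:   # set by the user, or by a parent via Family Pairing
        return custom_limit
    return DEFAULT_TEEN_LIMIT_MIN if age < 18 else None

def should_remind(minutes_watched, now):
    """Serve a take-a-break video sooner during late-night hours."""
    threshold = 40 if now >= time(22, 0) else 90   # illustrative values
    return minutes_watched >= threshold

print(daily_limit(16))                  # 60
print(should_remind(45, time(23, 30)))  # True: past 10pm, reminder comes sooner
```

The design point Chew makes is that the limit is a default, not a hard cap: parents and users can adjust it, which is why it is modeled here as an overridable parameter.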

  • Question 10: What is TikTok's vision for the next five years?

    - Shou Chew says TikTok's vision is to keep providing a window to discover, a canvas to create, and bridges to connect. He hopes new technologies and AI will help users create more, deepen connections between users, and help more businesses succeed through TikTok.

Outlines

00:00

😀 Welcome and Background

Chris Anderson welcomes Shou Chew and congratulates him on the congressional hearing, noting wryly that the bipartisan consensus it produced was that TikTok must be banned. Chris then asks about Shou's background and how he joined TikTok. Shou Chew introduces himself, recounts meeting the two engineers who were building the product that became TikTok's precursor, and describes the platform's early growth and how he came to run the company.

05:01

🤔 The Secret of TikTok's Success

Chris Anderson probes why TikTok is so successful and so addictive. Shou Chew explains the company's mission — to inspire creativity and bring joy — and describes how the recommendation algorithm serves content based on users' interests. He highlights TikTok's distinctive discovery engine and machine-learning algorithms, and cites creators like Khaby as examples of finding success on the platform.

10:02

📊 How the Recommendation Algorithm Works

Chris Anderson digs further into TikTok's recommendation algorithm. Shou Chew gives a simplified account: collaborative filtering over users' interaction data surfaces relevant content. They discuss the importance of optimizing for the smartphone format and short video, and how recommendations connect users and help them discover communities. Shou Chew also shares creator success stories, stressing that TikTok gives ordinary people a stage for their talent.

15:02

🛡️ Community Guidelines and Content Moderation

They discuss how TikTok handles bad actors and harmful content. Shou Chew explains TikTok's community guidelines and how a combination of human reviewers and machines moderates and removes violating content. He stresses the importance of differentiated experiences for different age groups and details the protections for minors, such as time limits and parental controls.

20:05

⏳ Managing Screen Time and Healthy Habits

Chris Anderson raises the question of user addiction, especially among young users. Shou Chew explains how TikTok manages screen time with break reminders and time limits. He stresses that TikTok's goal is not to maximize time spent but to ensure users have a healthy relationship with the platform, and notes that the default 60-minute limit was developed in consultation with experts.

25:05

👮 Data Privacy and Government-Interference Concerns

They discuss US congressional concerns about TikTok's data privacy and potential government interference. Shou Chew describes Project Texas, which stores US user data in the United States under the management of the American company Oracle, to ensure data security and transparency. He outlines TikTok's data-protection efforts and explains how third-party audits and transparency reporting guard against any form of government manipulation.

30:05

🔍 Transparency and Algorithm Oversight

Shou Chew details TikTok's transparency measures, including third-party review of its source code and tools for researchers, to ensure the fairness and transparency of the platform's content and algorithms. He reiterates the commitment to preventing government manipulation and stresses the importance of reducing risk through transparency and third-party oversight.

35:06

🌐 Future Vision and Platform Impact

In the closing discussion, Chris Anderson asks about Shou Chew's vision for TikTok's future. Shou Chew describes the platform's long-term goals around discovering new things, inspiring creativity, and connecting communities. He notes the broad reach of science, book, and cooking content on the platform, and emphasizes that TikTok gives ordinary people a stage for their talent and new business opportunities. The conversation ends with the two filming a TikTok video together.

Keywords

💡TikTok

TikTok is a short-video social platform developed by ByteDance. Its recommendation algorithm is based on users' interests rather than the social graph, which lets it quickly learn what users like and serve relevant content. The talk notes that TikTok has let many ordinary people showcase their talent and get noticed — for example, Khaby, a former factory worker, amassed 160 million followers by posting funny videos.

💡Recommendation Algorithm

The recommendation algorithm is TikTok's core technology: machine learning recommends content based on users' interest signals (watching, liking, sharing, and so on). The talk explains how the algorithm recognizes patterns in user interests and recommends relevant content in real time, improving the user experience.

💡User-Generated Content

User-generated content (UGC) is content created and uploaded by users themselves, such as videos and images. TikTok is built around UGC, letting anyone with talent showcase themselves and gain a wide audience. The talk cites success stories such as a restaurant owner who increased his revenue by posting videos.

💡Addictiveness

Addictiveness refers to a strong dependence on a behavior or product. The talk discusses how TikTok's recommendation algorithm, by identifying and serving content users find interesting, can lead them to spend large amounts of time on the platform, raising concerns about addiction.

💡Data Privacy

Data privacy refers to the protection of users' personal data. The talk notes that, to address US concerns about data privacy, TikTok launched Project Texas, storing US user data on US-based servers managed by Oracle to mitigate the risk of access by the Chinese government.

💡Creators

Creators are the people who publish content on TikTok. The talk repeatedly notes that TikTok gives ordinary people a stage to be discovered and recognized — for example, teachers who use TikTok to reach students far beyond their classrooms have built large followings.

💡Community Guidelines

Community guidelines are the rules and policies TikTok sets to govern content and behavior on the platform. The talk notes that these guidelines spell out prohibited content such as pornography and violence, and impose different restrictions on different age groups to keep the platform safe.

💡Project Texas

Project Texas is TikTok's initiative to address the US government's data-security concerns. It stores US user data on Oracle's US servers, managed by US personnel, to ensure data security and privacy. The talk describes it as an unprecedented project to strengthen data protection and transparency.

💡Artificial Intelligence

Artificial intelligence (AI) refers to technology that simulates human intelligence. The talk explains how TikTok uses AI to power its recommendation algorithm, analyzing behavioral data to deliver personalized content recommendations and improve the user experience.

💡Transparency

Transparency refers to openness about a company's operations and data handling. The talk notes that TikTok allows third parties to review its source code and outputs to keep the platform's operations transparent and guard against government interference or manipulation, particularly around data privacy and content moderation.

Highlights

Chris Anderson congratulates Shou Chew on achieving bipartisan consensus in US politics — even if that consensus was largely "we must ban TikTok."

Shou Chew recounts how he joined TikTok: meeting two engineers ten years ago, and TikTok's birth five years ago in the era of 4G.

Shou Chew describes TikTok's mission — to inspire creativity and bring joy — and its vision: a window to discover, a canvas to create, and bridges to connect.

Shou Chew explains how TikTok's recommendation algorithm uses machine learning to quickly learn users' interest signals and surface relevant content.

Shou Chew notes that the recommendation algorithm is fundamentally math: pattern recognition that shows users content they are likely to enjoy, rather than relying on the people they already know.

Shou Chew shares how TikTok helps ordinary people showcase their talent, citing Khaby, a factory worker who became TikTok's biggest creator with 160 million followers.

Shou Chew explains how TikTok gives small businesses and individual creators a platform to be discovered, citing a restaurant that found success through TikTok.

Shou Chew stresses that TikTok is not simply chasing time spent: the platform proactively reminds users to take breaks and sets a default 60-minute time limit for teenagers.

Shou Chew discusses TikTok's efforts to protect teenage users, including parental-control tools and feature restrictions for minors.

Shou Chew discusses Project Texas, TikTok's plan to protect US user data, and explains the security measures around storing that data in Oracle's US cloud infrastructure.

Shou Chew says TikTok offers unprecedented transparency to guard against government manipulation, including letting third parties review its source code.

Shou Chew explains how transparency and third-party monitoring ensure the platform cannot be manipulated by any government, calling this part of the company's mission.

Shou Chew shares TikTok's success in promoting STEM content, noting over 116 billion cumulative views of STEM content globally.

Shou Chew emphasizes TikTok's mission of inspiring creativity and bringing joy through discovery, creation, and connection, and shares the platform's vision for the future.

At the end of the TED conversation, Shou Chew films a TikTok video with the audience and expresses confidence in the platform's positive impact.

Transcripts

00:00

Chris Anderson: It's very nice to have you here.

00:03

Let's see.

00:04

First of all, congratulations.

00:05

You really pulled off something remarkable on that grilling,

00:08

you achieved something that very few people do,

00:10

which was, you pulled off a kind of, a bipartisan consensus in US politics.

00:15

It was great.

00:16

(Laughter)

00:18

The bad news was that that consensus largely seemed to be:

00:21

"We must ban TikTok."

00:23

So we're going to come to that in a bit.

00:25

And I'm curious, but before we go there, we need to know about you.

00:30

You seem to me like a remarkable person.

00:33

I want to know a bit of your story

00:35

and how you came to TikTok in the first place.

00:37

Shou Chew: Thank you, Chris.

00:39

Before we do that, can I just check, need to know my audience,

00:42

how many of you here use TikTok?

00:45

Oh, thank you.

00:46

For those who don’t, the Wi-Fi is free.

00:48

(Laughter)

00:51

CA: There’s another question, which is,

00:53

how many of you here have had your lives touched through TikTok,

00:56

through your kids and other people in your lives?

01:00

SC: Oh, that's great to see.

01:01

CA: It's basically, if you're alive,

01:03

you have had some kind of contact with TikTok at this point.

01:06

So tell us about you.

01:08

SC: So my name is Shou, and I’m from Singapore.

01:11

Roughly 10 years ago,

01:13

I met with two engineers who were building a product.

01:17

And the idea behind this was to build a product

01:20

that recommended content to people not based on who they knew,

01:25

which was, if you think about it, 10 years ago,

01:28

the social graph was all the rage.

01:31

And the idea was, you know,

01:32

your content and the feed that you saw should be based on people that you knew.

01:36

But 10 years ago,

01:38

these two engineers thought about something different,

01:40

which is, instead of showing you --

01:44

instead of showing you people you knew,

01:46

why don't we show you content that you liked?

01:49

And that's sort of the genesis and the birth

01:51

of the early iterations of TikTok.

01:54

And about five years ago,

01:56

with the advent of 4G, short video, mobile phone penetration,

02:01

TikTok was born.

02:03

And a couple of years ago,

02:04

I had the opportunity to run this company,

02:07

and it still excites me every single day.

02:09

CA: So I want to dig in a little more into this,

02:12

about what was it that made this take-off so explosive?

02:15

Because the language I hear from people who spent time on it,

02:19

it's sort of like I mean,

02:21

it is a different level of addiction to other media out there.

02:27

And I don't necessarily mean this in a good way, we'll be coming on to it.

02:30

There’s good and bad things about this type of addiction.

02:33

But it’s the feeling

02:34

that within a couple of days of experience of TikTok,

02:37

it knows you and it surprises you

02:39

with things that you didn't know you were going to be interested in,

02:42

but you are.

02:44

How?

02:45

Is it really just, instead of the social graph --

02:48

What are these algorithms doing?

02:50

SC: I think to describe this, to begin to answer your question,

02:54

we have to talk about the mission of the company.

02:56

Now the mission is to inspire creativity and to bring joy.

03:00

And I think missions for companies like ours [are] really important.

03:04

Because you have product managers working on the product every single day,

03:07

and they need to have a North Star, you know,

03:09

something to sort of, work towards together.

03:12

Now, based on this mission,

03:13

our vision is to provide three things to our users.

03:17

We want to provide a window to discover,

03:19

and I’ll talk about discovery, you talked about this, in a second.

03:22

We want to give them a canvas to create,

03:24

which is going to be really exciting with new technologies in AI

03:28

that are going to help people create new things.

03:32

And the final thing is bridges for people to connect.

03:35

So that's sort of the vision of what we're trying to build.

03:38

Now what really makes TikTok very unique and very different

03:42

is the whole discovery engine behind it.

03:45

So there are earlier apps that I have a lot of respect for,

03:48

but they were built for a different purpose.

03:50

For example, in the era of search, you know,

03:53

there was an app that was built for people who wanted to search things

03:57

so that is more easily found.

04:00

And then in the era of social graphs,

04:02

it was about connecting people and their followers.

04:05

Now what we have done is that ... based on our machine-learning algorithms,

04:09

we're showing people what they liked.

04:11

And what this means is that we have given the everyday person

04:15

a platform to be discovered.

04:16

If you have talent, it is very, very easy to get discovered on TikTok.

04:20

And I'll just give you one example of this.

04:23

The biggest creator on TikTok is a guy called Khaby.

04:27

Khaby was from Senegal,

04:29

he lives in Italy, he was a factory worker.

04:32

He, for the longest time, didn't even speak in any of his videos.

04:37

But what he did was he had talent.

04:39

He was funny, he had a good expression,

04:41

he had creativity, so he kept posting.

04:44

And today he has 160 million followers on our platform.

04:47

So every single day we hear stories like that,

04:49

businesses, people with talent.

04:52

And I think it's very freeing to have a platform

04:55

where, as long as you have talent, you're going to be heard

04:58

and you have the chance to succeed.

05:00

And that's what we're providing to our users.

05:02

CA: So this is the amazing thing to me.

05:04

Like, most of us have grown up with, say, network television,

05:09

where, for decades you've had thousands of brilliant, creative people

05:12

toiling in the trenches,

05:14

trying to imagine stuff that will be amazing for an audience.

05:18

And none of them ever remotely came up with anything

05:22

that looked like many of your creators.

05:27

So these algorithms,

05:29

just by observing people's behavior and what they look like,

05:33

have discovered things that thousands of brilliant humans never discovered.

05:37

Tell me some of the things that it is looking at.

05:40

So obvious things, like if someone presses like

05:42

or stays on a video for a long time,

05:44

that gives you a clue, "more like that."

05:47

But is it subject matter?

05:48

What are the array of things

05:50

that you have noticed that you can actually track

05:53

that provide useful clues?

05:55

SC: I'm going to simplify this a lot,

05:57

but the machine learning, the recommendation algorithm

06:00

is really just math.

06:02

So, for example, if you liked videos one, two, three and four,

06:06

and I like videos one, two, three and five,

06:08

maybe he liked videos one, two, three and six.

06:11

Now what's going to happen is,

06:12

because we like one, two, three at the same time,

06:15

he's going to be shown four, five, six, and so are we.

06:18

And you can think about this repeated at scale in real time

06:21

across more than a billion people.

06:23

That's basically what it is, it's math.

06:25

And of course, you know,

06:26

AI and machine learning has allowed this to be done

06:29

at a very, very big scale.

06:31

And what we have seen, the result of this,

06:33

is that it learns the interest signals

06:36

that people exhibit very quickly

06:38

and shows you content that's really relevant for you

06:41

in a very quick way.

06:44

CA: So it's a form of collaborative filtering, from what you're saying.

06:47

The theory behind it is that these humans are weird people,

06:50

we don't really know what they're interested in,

06:52

but if we see that one human is interested,

06:54

with an overlap of someone else, chances are, you know,

06:57

you could make use of the other pieces

06:59

that are in that overlapped human's repertoire to feed them,

07:05

and they'll be surprised.

07:06

But the reason they like it is because their pal also liked it.

07:10

SC: It's pattern recognition based on your interest signals.

07:13

And I think the other thing here

07:15

is that we don't actually ask you 20 questions

07:17

on whether you like a piece of content, you know, what are your interests,

07:21

we don't do that.

07:22

We built that experience organically into the app experience.

07:25

So you are voting with your thumbs by watching a video,

07:29

by swiping it, by liking it, by sharing it,

07:32

you are basically exhibiting interest signals.

07:34

And what it does mathematically is to take those signals,

07:37

put it in a formula and then matches it through pattern recognition.

07:40

That's basically the idea behind it.

07:43

CA: I mean, lots of start-ups have tried to use these types of techniques.

07:49

I'm wondering what else played a role early on?

07:51

I mean, how big a deal was it,

07:52

that from the get-go you were optimizing for smartphones

07:56

so that videos were shot in portrait format

07:58

and they were short.

08:00

Was that an early distinguishing thing that mattered?

08:03

SC: I think we were the first to really try this at scale.

08:06

You know, the recommendation algorithm is a very important reason

08:10

as to why the platform is so popular among so many people.

08:15

But beyond that, you know, you mentioned the format itself.

08:20

So we talked about the vision of the company,

08:22

which is to have a window to discover.

08:24

And if you just open the app for the first time,

08:26

you'll see that it takes up your whole screen.

08:28

So that's the window that we want.

08:30

You can imagine a lot of people using that window

08:32

to discover new things in their lives.

08:34

Then, you know, through this recommendation algorithm,

08:37

we have found that it connects people together.

08:40

People find communities,

08:41

and I've heard so many stories of people who have found their communities

08:45

because of the content that they're posting.

08:47

Now, I'll give you an example.

08:49

I was in DC recently, and I met with a bunch of creators.

08:53

CA: I heard.

08:54

(Laughter)

08:56

SC: One of them was sitting next to me at a dinner,

08:58

his name is Samuel.

08:59

He runs a restaurant in Phoenix, Arizona, and it's a taco restaurant.

09:04

He told me he has never done this before, first venture.

09:08

He started posting all this content on TikTok,

09:10

and I saw his content,

09:12

I was hungry after looking at it, it's great content.

09:16

And he's generated so much interest in his business,

09:19

that last year he made something like a million dollars in revenue

09:22

just via TikTok.

09:23

One restaurant.

09:24

And again and again, I hear these stories,

09:27

you know, by connecting people together,

09:29

by giving people the window to discover,

09:31

we have given many small businesses and many people, your common person,

09:36

a voice that they will never otherwise have.

09:38

And I think that's the power of the platform.

09:41

CA: So you definitely have identified early

09:43

just how we're social creatures, we need affirmation.

09:47

I've heard a story,

09:48

and you can tell me whether true or not,

09:50

that one of the keys to your early liftoff

09:53

was that you wanted to persuade creators who were trying out TikTok

09:57

that this was a platform where they would get response,

10:01

early on, when you're trying to grow something,

10:03

the numbers aren't there for response.

10:05

So you had the brilliant idea of goosing those numbers a bit,

10:08

basically finding ways to give people, you know,

10:11

a bigger sense of like, more likes,

10:13

more engagement than was actually the case,

10:15

by using AI agents somehow in the process.

10:19

Is that a brilliant idea, or is that just a myth?

10:22

SC: I would describe it in a different way.

10:26

So there are other platforms that exist before TikTok.

10:29

And if you think about those platforms,

10:31

you sort of have to be famous already in order to get followers.

10:35

Because the way it’s built is that people come and follow people.

10:39

And if you aren't already famous,

10:41

the chances that you get discovered are very, very low.

10:44

Now, what we have done, again,

10:46

because of the difference in the way we're recommending content,

10:49

is that we have given anyone,

10:51

any single person with enough talent a stage to be able to be discovered.

10:56

And I think that actually is the single, probably the most important thing

11:00

contributing to the growth of the platform.

11:02

And again and again, you will hear stories from people who use the platform,

11:06

who post regularly on it,

11:07

that if they have something they want to say,

11:10

the platform gives them the chance and the stage

11:12

to connect with their audience

11:14

in a way that I think no other product in the past has ever offered them.

11:19

CA: So I'm just trying to play back what you said there.

11:21

You said you were describing a different way what I said.

11:25

Is it then the case that like, to give someone a decent chance,

11:28

someone who's brilliant but doesn't come with any followers initially,

11:32

that you've got some technique to identify talent

11:35

and that you will almost encourage them,

11:38

you will give them some kind of, you know,

11:40

artificially increase the number of followers or likes

11:43

or whatever that they have,

11:44

so that others are encouraged to go,

11:46

"Wow, there's something there."

11:48

Like it's this idea of critical mass that kind of, every entrepreneur,

11:51

every party planner kind of knows about of

11:53

"No, no, this is the hot place in town, everyone come,"

11:56

and that that is how you actually gain critical mass?

11:58

SC: We want to make sure that every person who posts a video

12:02

is given an equal chance to be able to have some audience to begin with.

12:07

But this idea that you are maybe alluding to,

12:10

that we can get people to like something,

12:13

it doesn't really work like that.

12:15

CA: Could you get AI agents to like something?

12:18

Could you seed the network with extra AI agents that could kind of, you know,

12:22

give someone early encouragement?

12:24

SC: Ultimately, what the machine does is it recognizes people's interests.

12:29

So if you post something that's not interesting to a lot of people,

12:32

even if you gave it a lot of exposure,

12:34

you're not going to get the virality that you want.

12:36

So it's a lot of ...

12:38

There is no push here.

12:41

It's not like you can go and push something,

12:44

because I like Chris, I'm going to push your content,

12:46

it doesn't work like that.

12:47

You've got to have a message that resonates with people,

12:50

and if it does,

12:51

then it will automatically just have the virality itself.

12:54

That's the beauty of user-generated content.

12:56

It's not something that can be engineered or over-thought.

13:00

It really is something that has to resonate with the audience.

13:04

And if it does, then it goes viral.

13:06

CA: Speaking privately with an investor who knows your company quite well,

13:10

who said that actually the level of sophistication

13:15

of the algorithms you have going

13:16

is just another order of magnitude

13:18

to what competitors like, you know, Facebook or YouTube have going.

13:23

Is that just hype or do you really believe you --

13:27

like, how complex are these algorithms?

13:31

SC: Well, I think in terms of complexity,

13:33

there are many companies who have a lot of resources

13:35

and a lot of talent.

13:36

They will figure out even the most complex algorithms.

13:39

I think what is very different is your mission of your company,

13:43

how you started the company.

13:45

Like I said, you know, we started with this idea

13:47

that this was the main use case.

13:50

The most important use case is you come and you get to see recommended content.

13:54

Now for some other apps out there,

13:56

they are very significant and have a lot of users,

13:59

they are built for a different original purpose.

14:02

And if you are built for something different,

14:04

then your users are used to that

14:05

because the community comes in and they expect that sort of experience.

14:09

So I think the pivot away from that

14:10

is not really just a matter of engineering and algorithms,

14:14

it’s a matter of what your company is built to begin with.

14:18

Which is why I started this by saying you need to have a vision,

14:21

you need to have a mission, and that's the North Star.

14:23

You can't just shift it halfway.

14:25

CA: Right.

14:26

And is it fair to say

14:27

that because your start point has been interest algorithms

14:30

rather than social graph algorithms,

14:32

you've been able to avoid some of the worst of the sort of,

14:35

the filter bubbles that have happened in other social media

14:37

where you have tribes kind of declaring war on each other effectively.

14:41

And so much of the noise and energy is around that.

14:45

Do you believe that you've largely avoided that on TikTok?

14:48

SC: The diversity of content that our users see is very key.

14:52

You know, in order for the discovery -- the mission is to discover --

14:56

sorry, the vision is to discover.

14:59

So in order to facilitate that,

15:00

it is very important to us

15:02

that what the users see is a diversity of content.

15:06

Now, generally speaking, you know,

15:08

there are certain issues that you mentioned

15:10

that the industry faces, you know.

15:12

There are some bad actors who come on the internet,

15:14

they post bad content.

15:16

Now our approach is that we have very clear community guidelines.

15:20

We're very transparent about what is allowed

15:22

and what is not allowed on our platform.

15:24

No executives make any ad hoc decisions.

15:27

And based on that,

15:28

we have built a team that is tens of thousands of people plus machines

15:33

in order to identify content that is bad

15:35

and actively and proactively remove it from the platform.

15:38

CA: Talk about what some of those key guidelines are.

15:41

SC: We have it published on our website.

15:43

In March, we just iterated a new version to make it more readable.

15:49

So there are many things like, for example, no pornography,

15:53

clearly no child sexual abuse material and other bad things,

15:56

no violence, for example.

15:58

We also make it clear that it's a differentiated experience

16:01

if you're below 18 years old.

16:03

So if you're below 18 years old, for example,

16:05

your entire app experience is actually more restricted.

16:09

We don't allow, as an example,

16:11

users below 16, by default, to go viral.

16:15

We don't allow that.

16:16

If you're below 16,

16:18

we don’t allow you to use the instant messaging feature in app.

16:22

If you’re below 18, we don’t allow you to use the livestreaming features.

16:26

And of course, we give parents a whole set of tools

16:28

to control their teenagers’ experience as well.

16:31

CA: How do you know the age of your users?

16:34

SC: In our industry, we rely mainly on something called age gating,

16:38

which is when you sign up for the app for the first time

16:41

and we ask you for the age.

16:42

Now, beyond that,

16:44

we also have built tools to go through your public profile for example,

16:49

when you post a video,

16:50

we try to match the age that you said with the video that you just posted.

16:55

Now, there are questions of can we do more?

16:57

And the question always has, for every company, by the way,

17:00

in our industry, has to be balanced with privacy.

17:03

Now, if, for example, we scan the faces of every single user,

17:09

then we will significantly increase the ability to tell their age.

17:13

But we will also significantly increase the amount of data

17:16

that we collect on you.

17:17

Now, we don't want to collect data.

17:19

We don't want to scan data on your face to collect that.

17:21

So that balance has to be maintained,

17:23

and it's a challenge that we are working through

17:26

together with industry, together with the regulators as well.

17:29

CA: So look, one thing that is unquestionable

17:32

is that you have created a platform for literally millions of people

17:35

who never thought they were going to be a content creator.

17:38

You've given them an audience.

17:40

I'd actually like to hear from you one other favorite example

17:43

of someone who TikTok has given an audience to

17:46

that never had that before.

17:47

SC: So when again,

17:49

when I travel around the world,

17:51

I meet with a whole bunch of creators on our platform.

17:55

I was in South Korea just yesterday, and before that I met with -- yes,

18:01

before that I met with a bunch of --

18:02

People don't expect, for example, teachers.

18:04

There is an English teacher from Arkansas.

18:08

Her name is Claudine, and I met her in person.

18:11

She uses our platform to reach out to students.

18:14

There is another teacher called Chemical Kim.

18:17

And Chemical Kim teaches chemistry.

18:20

What she does is she uses our platform

18:22

to reach out to a much broader student base

18:25

than she has in her classroom.

18:26

And they're both very, very popular.

18:28

You know, in fact,

18:29

what we have realized is that STEM content

18:34

has over 116 billion views on our platform globally.

18:39

And it's so significant --

18:40

CA: In a year?

18:41

SC: Cumulatively.

18:43

CA: [116] billion.

18:44

SC: It's so significant, that in the US we have started testing,

18:47

creating a feed just for STEM content.

18:50

Just for STEM content.

18:52

I’ve been using it for a while, and I learned something new.

18:55

You want to know what it is?

18:56

Apparently if you flip an egg on your tray,

19:00

the egg will last longer.

19:02

It's science,

19:03

there’s a whole video on this, I learned this on TikTok.

19:06

You can search for this.

19:07

CA: You want to know something else about an egg?

19:10

If you put it in just one hand and squeeze it as hard as you can,

19:13

it will never break.

19:14

SC: Yes, I think I read about that, too.

19:16

CA: It's not true.

19:17

(Laughter)

19:18

SC: We can search for it.

19:21

CA: But look, here's the flip side to all this amazingness.

19:24

And honestly, this is the key thing,

19:26

that I want to have an honest, heart-to-heart conversation with you

19:31

because it's such an important issue,

19:33

this question of human addiction.

19:35

You know, we are ...

19:38

animals with a prefrontal cortex.

19:42

That's how I think of us.

19:43

We have these addictive instincts that go back millions of years,

19:48

and we often are in the mode of trying to modulate our own behavior.

19:54

It turns out that the internet is incredibly good

19:59

at activating our animal cells

20:01

and getting them so damn excited.

20:04

And your company, the company you've built,

20:07

is better at it than any other company on the planet, I think.

20:12

So what are the risks of this?

20:16

I mean, how ...

20:17

From a company point of view, for example,

20:20

it's in your interest to have people on there as long as possible.

20:24

So some would say, as a first pass,

20:26

you want people to be addicted as long as possible.

20:28

That's how advertising money will flow and so forth,

20:32

and that's how your creators will be delighted.

20:36

What is too much?

20:37

SC: I don't actually agree with that.

20:40

You know, as a company,

20:41

our goal is not to optimize and maximize time spent.

20:45

It is not.

20:46

In fact, in order to address people spending too much time on our platform,

20:50

we have done a number of things.

20:52

I was just speaking with some of your colleagues backstage.

20:55

One of them told me she has encountered this as well.

20:58

If you spend too much time on our platform,

21:00

we will proactively send you videos to tell you to get off the platform.

21:05

We will.

21:06

And depending on the time of the day,

21:08

if it's late at night, it will come sooner.

21:10

We have also built in tools to limit,

21:12

if you're below 18 years old, by default,

21:15

we set a 60-minute default time limit.

21:18

CA: How many?

21:19

SC: Sixty minutes.

21:20

And we've given parents tools and yourself tools,

21:22

if you go to settings, you can set your own time limit.

21:25

We've given parents tools so that you can pair,

21:27

for the parents who don't know this, go to settings, family pairing,

21:31

you can pair your phone with your teenager's phone

21:33

and set the time limit.

21:34

And we really encourage parents to have these conversations with their teenagers

21:38

on what is the right amount of screen time.

21:40

I think there’s a healthy relationship that you should have with your screen,

21:44

and as a business, we believe that that balance needs to be met.

21:47

So it's not true that we just want to maximize time spent.

21:50

CA: If you were advising parents here

21:52

what time they should actually recommend to their teenagers,

21:56

what do you think is the right setting?

21:57

SC: Well, 60 minutes,

21:59

we did not come up with it ourselves.

22:00

So I went to the Digital Wellness Lab at the Boston Children's Hospital,

22:04

and we had this conversation with them.

22:06

And 60 minutes was the recommendation that they gave to us,

22:09

which is why we built this into the app.

22:10

So 60 minutes, take it for what it is,

22:12

it's something that we've had some discussions with experts about.

22:16

But I think for all parents here,

22:17

it is very important to have these conversations with your teenage children

22:21

and help them develop a healthy relationship with screens.

22:26

I think we live in an age where it's completely inevitable

22:29

that we're going to interact with screens and digital content,

22:34

but I think we should develop healthy habits early on in life,

22:37

and that's something I would encourage.

22:39

CA: Curious to ask the audience,

22:42

which of you have ever had that video on TikTok appear

22:46

saying, “Come off.”

22:48

OK, I mean ...

22:50

So maybe a third of the audience seem to be active TikTok users,

22:54

and about 20 people maybe put their hands up there.

22:58

Are you sure that --

23:00

like, it feels to me like this is a great thing to have,

23:05

but are you ...

23:07

isn't there always going to be a temptation

23:09

in any given quarter or whatever,

23:11

to just push it a bit at the boundary

23:13

and just dial back a bit on that

23:15

so that you can hit revenue goals, etc?

23:19

Are you saying that this is used scrupulously?

23:22

SC: I think, you know, in terms ...

23:25

Even if you think about it from a commercial point of view,

23:28

it is always best when your customers have a very healthy relationship

23:31

with your product.

23:32

It's always best when it's healthy.

23:34

So if you think about very short-term retention, maybe,

23:37

but that's not the way we think about it.

23:40

If you think about it from a longer-term perspective,

23:42

what you really want to have is a healthy relationship, you know.

23:45

You don’t want people to develop very unhealthy habits,

23:48

and then at some point they're going to drop it.

23:51

So I think everything in moderation.

23:53

CA: There's a claim out there that in China,

23:56

there are much more rigorous standards imposed on the amount of time

24:01

that children, especially, can spend on the TikTok equivalent of that.

24:06

SC: That is unfortunately a misconception.

24:11

So that experience that is being mentioned for Douyin,

24:14

which is a different app,

24:15

is for an under-14 experience.

24:18

Now, if you compare that in the United States,

24:20

we have an under-13 experience in the US.

24:23

It's only available in the US, it's not available here in Canada,

24:26

we just don't allow it here.

24:28

If you look at the under-13 experience in the US,

24:30

it's much more restricted than the under-14 experience in China.

24:34

It's so restrictive,

24:35

that every single piece of content is vetted

24:38

by our third-party child safety expert.

24:43

And we don't allow any under-13s in the US to publish,

24:47

we don’t allow them to post,

24:49

and we don't allow them to use a lot of features.

24:51

So I think that that report, I've seen that report too,

24:54

it's not doing a fair comparison.

24:56

CA: What do you make of this issue?

24:58

You know, you've got these millions of content creators

25:00

and all of them, in a sense, are in a race for attention,

25:05

and that race can pull them in certain directions.

25:07

So, for example, teenage girls on TikTok,

25:12

sometimes people worry that, to win attention,

25:15

they've discovered that by being more sexual

25:18

that they can gain extra viewers.

25:20

Is this a concern?

25:21

Is there anything you can do about this?

25:23

SC: We address this in our community guidelines as well.

25:28

You know, if you look at sort of the sexualized content on our guidelines,

25:32

if you’re below a certain age,

25:33

you know, for certain themes that are mature,

25:37

we actually remove that from your experience.

25:39

Again, I come back to this,

25:40

you know, we want to have a safe platform.

25:43

In fact, at my congressional hearing,

25:44

I made four commitments to our users and to the politicians in the US.

25:48

And the first one is that we take safety, especially for teenagers,

25:53

extremely seriously,

25:54

and we will continue to prioritize that.

25:56

You know, I believe that we need to give our teenage users,

26:00

and our users in general,

26:01

a very safe experience,

26:03

because if we don't do that,

26:04

then we cannot fulfill --

26:05

the mission is to inspire creativity and to bring joy.

26:08

If they don't feel safe, I cannot fulfill my mission.

26:11

So it's all very organic to me as a business

26:14

to make sure I do that.

26:15

CA: But in the strange interacting world of human psychology and so forth,

26:19

weird memes can take off.

26:20

I mean, you had this outbreak a couple years back

26:23

with these devious licks where kids were competing with each other

26:26

to do vandalism in schools and, you know,

26:29

get lots of followers from it.

26:30

How on Earth do you battle something like that?

26:33

SC: So dangerous challenges are not allowed on our platform.

26:36

If you look at our guidelines, it's violative.

26:39

We proactively invest resources to identify them

26:43

and remove them from our platform.

26:45

In fact, if you search for dangerous challenges on our platform today,

26:48

we will redirect you to a safety resource page.

26:51

And we actually worked with some creators as well to come up with campaigns.

26:54

This is another campaign.

26:56

It's the "Stop, Think, Decide Before You Act" campaign

26:59

where we work with the creators to produce videos,

27:01

to explain to people that some things are dangerous,

27:04

please don't do it.

27:05

And we post these videos actively on our platform as well.

27:09

CA: That's cool.

27:11

And you've got lots of employees.

27:12

I mean, how many employees do you have

27:14

who are specifically looking at these content moderation things,

27:18

or is that the wrong question?

27:20

Are they mostly identified by AI initially

27:22

and then you have a group who are overseeing

27:25

and making the final decision?

27:27

SC: The group is based in Ireland and it's a lot of people,

27:31

it's tens of thousands of people.

27:32

CA: Tens of thousands?

27:33

SC: It's one of the most important cost items on my P&L,

27:38

and I think it's completely worth it.

27:39

Now, most of the moderation has to be done by machines.

27:42

The machines are good, they're quite good,

27:45

but they're not as good as, you know,

27:47

they're not perfect at this point.

27:48

So you have to complement them with a lot of human beings today.

27:51

And I think, by the way, a lot of the progress in AI in general

27:56

is making that kind of content moderation capabilities a lot better.

28:00

So we're going to get more precise.

28:02

You know, we’re going to get more specific.

28:04

And it’s going to be able to handle larger scale.

28:08

And that's something I think that I'm personally looking forward to.

28:13

CA: What about this perceived huge downside

28:18

of use of Instagram, certainly, and I think TikTok as well.

28:21

People worry that you are amplifying insecurities,

28:25

especially of teenagers

28:27

and perhaps especially of teenage girls.

28:29

They see these amazing people on there doing amazing things,

28:33

they feel inadequate,

28:34

there's all these reported cases of depression, insecurity,

28:38

suicide and so forth.

28:39

SC: I take this extremely seriously.

28:42

So in our guidelines,

28:45

for certain themes that we think are mature and not suitable for teenagers,

28:50

we actually proactively remove it from their experience.

28:54

At the same time, if you search certain terms,

28:56

we will make sure that you get redirected to a resource safety page.

29:01

Now we are always working with experts to understand some of these new trends

29:05

that could emerge

29:06

and proactively try to manage them, if that makes sense.

29:10

Now, this is a problem that predates us,

29:13

that predates TikTok.

29:14

It actually predates the internet.

29:16

But it's our responsibility to make sure

29:18

that we invest enough to understand and to address the concerns,

29:21

to keep the experience as safe as possible

29:23

for as many people as possible.

29:26

CA: Now, in Congress,

29:27

the main concern seemed to be not so much what we've talked about,

29:30

but data, the data of users,

29:33

the fact that you're owned by ByteDance, a Chinese company,

29:36

and the concern that at any moment

29:39

Chinese government might require or ask for data.

29:43

And in fact, there have been instances

29:45

where, I think you've confirmed,

29:46

that some data of journalists on the platform

29:50

was made available to ByteDance's engineers

29:53

and from there, who knows what.

29:56

Now, your response to this was to have this Project Texas,

30:00

where you're moving data to be controlled by Oracle here in the US.

30:05

Can you talk about that project and why, if you believe it so,

30:10

why we should not worry so much about this issue?

30:13

SC: I will say a couple of things about this, if you don't mind.

30:16

The first thing I would say is that the internet is built

30:19

on global interoperability,

30:20

and we are not the only company that relies on the global talent pool

30:24

to make our products as good as possible.

30:26

Technology is a very collaborative effort.

30:28

I think many people here would say the same thing.

30:32

So we are not the first company to have engineers in all countries,

30:35

including in China.

30:36

We're not the first one.

30:38

Now, I understand some of these concerns.

30:40

You know, the data access by employees is not data accessed by government.

30:44

This is very different, and there’s a clear difference in this.

30:47

But we hear the concerns that are raised in the United States.

30:50

We did not try to avoid discussing it.

30:54

We did not try to argue our way out of it.

30:57

What we did was we built an unprecedented project

31:00

where we localize American data to be stored on American soil

31:04

by an American company overseen by American personnel.

31:08

So this kind of protection for American data

31:11

is beyond what any other company in our industry has ever done.

31:16

Well, money is not the only issue here,

31:19

but it's very expensive to build something like that.

31:21

And more importantly, you know,

31:23

we are basically localizing data in a way that no other company has done.

31:28

So we need to be very careful that whilst we are pursuing

31:32

what we call digital sovereignty in the US

31:35

and we are also doing a version of this in Europe,

31:38

that we don't balkanize the internet.

31:39

Now we are the first to do it.

31:41

And I expect that, you know,

31:42

other companies are probably looking at this

31:45

and trying to figure out how you balance protecting data,

31:49

you know, to make sure that everybody feels secure about it

31:52

while at the same time allowing for interoperability

31:55

to continue to happen,

31:56

because that's what makes technology and the internet so great.

31:59

So that's something that we are doing.

32:01

CA: How far are you along that journey with Project Texas?

32:04

SC: We are very, very far along today.

32:05

CA: When will there be a clear, you know,

32:09

here it is, it’s done, it’s firewalled, this data is protected?

32:12

SC: Today, by default, all new US data

32:16

is already stored in the Oracle cloud infrastructure.

32:18

So it's in this protected US environment that we talked about.

32:23

We still have some legacy data to delete in our own servers in Virginia

32:27

and in Singapore.

32:28

Our data has never been stored in China, by the way.

32:31

That deletion is a very big engineering effort.

32:33

So as we said, as I said at the hearing,

32:36

it's going to take us a while to delete them,

32:38

but I expect it to be done this year.

32:43

CA: How much power do you have

32:46

over your own ability to control certain things?

32:49

So, for example, suppose that, for whatever reason,

32:52

the Chinese government was to look at an upcoming US election and say,

32:57

"You know what, we would like this party to win," let's say,

33:01

or "We would like civil war to break out" or whatever.

33:05

How ...

33:06

"And we could do this

33:07

by amplifying the content of certain troublemaking, disturbing people,

33:12

causing uncertainty, spreading misinformation," etc.

33:16

If you were required via ByteDance to do this,

33:21

like, first of all, is there a pathway where theoretically that is possible?

33:26

What's your personal line in the sand on this?

33:30

SC: So during the congressional hearing,

33:32

I made four commitments,

33:34

we talked about the first one, which is safety.

33:36

The third one is to keep TikTok a place of freedom of expression.

33:39

By the way, if you go on TikTok today,

33:41

you can search for anything you want,

33:43

as long as it doesn't violate our community guidelines.

33:46

And to keep it free from any government manipulation.

33:49

And the fourth one is transparency and third-party monitoring.

33:53

So the way we are trying to address this concern

33:55

is an unprecedented amount of transparency.

33:58

What do I mean by this?

33:59

We're actually allowing third-party reviewers

34:03

to come in and review our source code.

34:05

I don't know any other company that does this, by the way.

34:08

Because everything, as you know, is driven by code.

34:11

So to allow someone else to review the source code

34:13

is to give this a significant amount of transparency

34:16

to ensure that the scenarios that you described

34:19

that are highly hypothetical, cannot happen on our platform.

34:23

Now, at the same time,

34:24

we are releasing more research tools for researchers

34:28

so that they can study the output.

34:29

So the source code is the input.

34:32

We are also allowing researchers to study the output,

34:34

which is the content on our platform.

34:36

I think the easiest way to sort of fend this off is transparency.

34:40

You know, we give people access to monitor us,

34:43

and we just make it very, very transparent.

34:45

And that's our approach to the problem.

34:47

CA: So you will say directly to this group

34:49

that the scenario I talked about,

34:51

of theoretical Chinese government interference in an American election,

34:56

you can say that will not happen?

34:59

SC: I can say that we are building all the tools

35:02

to prevent any of these actions from happening.

35:05

And I'm very confident that with an unprecedented amount of transparency

35:10

that we're giving on the platform,

35:11

we can reduce this risk to as low as zero as possible.

35:17

CA: To as low as zero as possible.

35:20

SC: To as close to zero as possible.

35:22

CA: As close to zero as possible.

35:25

That's fairly reassuring.

35:27

Fairly.

35:28

(Laughter)

35:33

I mean, how would the world know?

35:34

If you discovered this or you thought you had to do it,

35:37

is this a line in the sand for you?

35:40

Like, are you in a situation you would not let the company that you know now

35:44

and that you are running do this?

35:46

SC: Absolutely.

35:47

That's the reason why we're letting third parties monitor,

35:50

because if they find out, you know, they will disclose this.

35:52

We also have transparency reports, by the way,

35:55

where we talk about a whole bunch of things,

35:57

the content that we remove, you know, that violates our guidelines,

36:00

government requests.

36:01

You know, it's all published online.

36:03

All you have to do is search for it.

36:05

CA: So you're super compelling

36:06

and likable as a CEO, I have to say.

36:08

And I would like to, as we wrap this up,

36:11

I'd like to give you a chance just to paint, like, what's the vision?

36:14

As you look at what TikTok could be,

36:18

let's move the clock out, say, five years from now.

36:21

How should we think about your contribution to our collective future?

36:26

SC: I think it's still down to the vision that we have.

36:29

So in terms of the window of discovery,

36:31

I think there's a huge benefit to the world

36:34

when people can discover new things.

36:36

You know, people think that TikTok is all about dancing and singing,

36:39

and there’s nothing wrong with that, because it’s super fun.

36:42

There's still a lot of that,

36:44

but we're seeing science content, STEM content,

36:46

have you heard about BookTok?

36:48

It's a viral trend that talks about books

36:51

and encourages people to read.

36:53

That BookTok has 120 billion views globally,

36:57

120 billion.

36:58

CA: Billion, with a B.

36:59

SC: People are learning how to cook,

37:02

people are learning about science,

37:03

people are learning how to golf --

37:05

well, people are watching videos on golfing, I guess.

37:07

(Laughter)

37:10

I haven't gotten better by looking at the videos.

37:13

I think there's a huge, huge opportunity here on discovery

37:17

and giving the everyday person a voice.

37:19

If you talk to our creators, you know,

37:21

a lot of people will tell you this again and again, that before TikTok,

37:24

they would never have been discovered.

37:26

And we have given them the platform to do that.

37:28

And it's important to maintain that.

37:30

Then we talk about creation.

37:31

You know, there’s all this new technology coming in with AI-generated content

37:36

that will help people create even more creative content.

37:40

I think there's going to be a collaboration between,

37:42

and I think there's a speaker who is going to talk about this,

37:45

between people and AI

37:46

where they can unleash their creativity in a different way.

37:49

You know, like for example, I'm terrible at drawing personally,

37:52

but if I had some AI to help me,

37:55

then maybe I can express myself even better.

37:57

Then we talk about bridges to connect

37:59

and connecting people and the communities together.

38:02

This could be products, this could be commerce,

38:04

five million businesses in the US benefit from TikTok today.

38:10

I think we can get that number to a much higher number.

38:12

And of course, if you look around the world, including in Canada,

38:15

that number is going to be massive.

38:17

So I think these are the biggest opportunities that we have,

38:20

and it's really very exciting.

38:21

CA: So courtesy of your experience in Congress,

38:24

you actually became a bit of a TikTok star yourself, I think.

38:28

Some of your videos have gone viral.

38:31

You've got your phone with you.

38:32

Do you want to make a little TikTok video right now?

38:35

Let's do this.

38:36

SC: If you don't mind ...

38:37

CA: What do you think, should we do this?

38:39

SC: We're just going to do a selfie together, how's that?

38:42

So why don't we just say "Hi."

38:44

Hi!

38:45

Audience: Hi!

38:46

CA: Hello from TED.

38:48

SC: All right, thank you, I hope it goes viral.

38:50

(Laughter)

38:53

CA: If that one goes viral, I think I've given up on your algorithm, actually.

38:57

(Laughter)

39:00

Shou Chew, you're one of the most influential

39:03

and powerful people in the world, whether you know it or not.

39:06

And I really appreciate you coming and sharing your vision.

39:08

I really, really hope the upside of what you're talking about comes about.

39:12

Thank you so much for coming today.

39:14

SC: Thank you, Chris.

39:15

CA: It's really interesting.

39:16

(Applause)