The Race For AI Robots Just Got Real (OpenAI, NVIDIA and more)

ColdFusion
25 Mar 2024 · 21:26

Summary

TLDR: The script discusses advancements in robotics and AI, highlighting the launch of Figure1, a humanoid robot by tech startup Figure AI. Unlike previous robots, Figure1 can hold human-like conversations and respond to external stimuli without a human operator, thanks to its integration with OpenAI. The robot's capabilities, the company's rapid growth, and the potential impact on the labor force are explored. The script also touches on other recent developments in the robotics industry, such as Agility Robotics' Amazon-backed Digit and Apptronik's warehouse robot, and contemplates the future of human-robot interaction across various sectors.

Takeaways

  • 🤖 The tech startup Figure launched a humanoid robot named Figure1, capable of human-style conversations and tasks.
  • 🚀 Figure1's demonstration showcased its ability to perform tasks like picking up trash and organizing dishes without human operation.
  • 💡 Figure AI, founded in 2022, focuses on general-purpose humanoid robots and has quickly become a unicorn with significant investments.
  • 🌟 Key investors in Figure AI include Microsoft, Nvidia, Jeff Bezos, and Intel Capital, highlighting the industry's interest in advanced robotics.
  • 📈 Figure's mission is to expand human capabilities through advanced AI, targeting the global labor shortage and essential roles in warehousing, transportation, and retail.
  • 🤖 Figure1's advanced capabilities come from a partnership with OpenAI, enhancing its natural language processing and computer vision.
  • 🔍 The robot's actions are learned, not teleoperated, and its control loop runs at 200 Hz, though there is still noticeable latency in conversational responses.
  • 🏭 Figure1 is set to be tested in a BMW manufacturing plant, indicating a move towards real-world application and potential integration into industries.
  • 🌐 The robotics industry is experiencing a boom, with companies like Agility Robotics, Apptronik, Toyota, and Hyundai also developing advanced robots.
  • 🔮 Nvidia's announcement of Project GR00T suggests a push towards a foundational model for humanoid robots, indicating a potential industry standard.
  • 🌍 The impact of AI and robotics on the workforce is significant, with predictions of job displacement and a shift towards automation of automation.

Q & A

  • What is the main topic of discussion in the Cold Fusion episode?

    -The main topic of discussion is the launch of a humanoid robot called Figure1 by a tech startup named Figure, its capabilities, and the implications of such advancements in robotics and AI on the workforce and society.

  • When was the episode of The Jetsons, featuring Rosie the robot, first aired?

    -The episode of The Jetsons featuring Rosie the robot was first aired on September 23rd, 1962.

  • What are some of the tasks that Figure1 is shown to perform in the demo?

    -In the demo, Figure1 is shown to perform tasks such as picking up an apple, disposing of trash, and placing dishes on a drying rack.

  • How does Figure1 process and respond to human interactions?

    -Figure1 processes human interactions by capturing transcribed speech and images from its onboard microphones and cameras. It feeds this context to a large multimodal model trained by OpenAI that understands both text and images; the model decides which learned, closed-loop behavior to run, loads the corresponding neural network policy onto the GPU, and executes it.
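
    To make the described pipeline concrete, here is a minimal Python sketch of a perceive–reason–act loop of that general shape. It is an illustration only: names like `camera`, `microphone`, `vlm`, and `policies` are hypothetical placeholders, not Figure's or OpenAI's actual interfaces, whose internals have not been published.

```python
import time

def control_loop(camera, microphone, vlm, policies, rate_hz=200):
    """Hypothetical perceive -> reason -> act loop; all objects are placeholders."""
    history = []  # running conversation context: transcribed text plus images
    while True:
        image = camera.capture()
        text = microphone.transcribe()            # speech-to-text
        if text:
            history.append({"text": text, "image": image})
            # The multimodal model reads the whole conversation and picks a behavior.
            decision = vlm.decide(history)
            policy = policies[decision.behavior]  # load the learned closed-loop policy
            # Run the chosen behavior at the fast control rate (~200 Hz).
            while not policy.done():
                policy.step(camera.capture())
                time.sleep(1.0 / rate_hz)
```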

  • What is Figure AI's mission according to their website?

    -Figure AI's mission is to expand human capabilities through advanced AI, specifically targeting the labor force to address the global labor shortage.

  • Who is the founder of Figure AI and what is his background?

    -Figure AI was founded by Brett Adcock, who previously founded two other companies, Vettery and Archer Aviation. Vettery is an online talent marketplace that was acquired in 2018 for $100 million, and Archer Aviation is a publicly traded company working on electric vertical takeoff and landing aircraft.

  • What is the significance of Figure's partnership with OpenAI?

    -The partnership with OpenAI helps accelerate Figure's commercial timeline by enhancing the capabilities of its robots to process and reason from natural language using OpenAI's research and large multimodal models. This allows for full conversations with the robot and potentially improves its computer vision.
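
    As an illustration of the "process and reason from natural language" step, here is a hedged sketch of the kind of multimodal request such an integration might make, written against the public OpenAI Python SDK. The model choice and prompt are placeholders; the actual details of the Figure–OpenAI integration are not public.

```python
import base64
from openai import OpenAI

def ask_about_scene(image_bytes: bytes, question: str) -> str:
    """Send one camera frame plus a question to a vision-capable chat model."""
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    image_b64 = base64.b64encode(image_bytes).decode()
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder choice of vision-capable model
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": question},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content

# e.g. ask_about_scene(frame, "Where should the dishes in front of you go next?")
```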

  • How does Figure1's reaction time compare to human and animal reaction times?

    -Figure1's mechanical control loop runs at 200 Hz, meaning it issues motor-command updates every 5 milliseconds. In comparison, humans have a reaction time of around 250 milliseconds, and a cat's reaction time is approximately 30 milliseconds.
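
    The 5-millisecond figure follows directly from the 200 Hz control rate; a quick back-of-the-envelope check:

```python
control_rate_hz = 200
period_ms = 1000 / control_rate_hz     # 5.0 ms between motor-command updates
human_reaction_ms = 250                # approximate human reaction time
cat_reaction_ms = 30                   # approximate figure quoted in the video

print(period_ms)                        # 5.0
print(human_reaction_ms / period_ms)    # 50.0 -> ~50x shorter than a human reaction
print(cat_reaction_ms / period_ms)      # 6.0  -> ~6x shorter than a cat's reaction
```

    Strictly speaking, 200 Hz is how often the robot refreshes its motor commands, while the human and cat numbers are stimulus-to-response reaction times, so the comparison is indicative rather than exact.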

  • What is the significance of Nvidia's announcement regarding Project GR00T?

    -Nvidia's announcement of Project GR00T introduces a general-purpose foundational model for humanoid robots. It aims to help robots understand natural language and emulate movements by observing human actions, quickly learning the coordination, dexterity, and other skills needed to interact with the real world.
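
    Since GR00T's training code has not been released, the following is only a generic behavior-cloning sketch of the "emulate movements by observing human actions" idea, written with PyTorch; the network shape and data format are illustrative assumptions, not NVIDIA's implementation.

```python
import torch
import torch.nn as nn

class Policy(nn.Module):
    """Maps an observation vector to a motor-action vector."""
    def __init__(self, obs_dim: int, act_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, 256), nn.ReLU(),
            nn.Linear(256, act_dim),
        )

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        return self.net(obs)

def behavior_cloning(policy: Policy, demos, epochs: int = 10, lr: float = 1e-3) -> Policy:
    """demos: iterable of (observation, expert_action) tensor pairs from human demonstrations."""
    opt = torch.optim.Adam(policy.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for obs, expert_action in demos:
            loss = loss_fn(policy(obs), expert_action)  # imitate the demonstrated action
            opt.zero_grad()
            loss.backward()
            opt.step()
    return policy
```

    In practice a foundation model like GR00T would also pretrain on video and rely heavily on simulation, but the core "learn to copy demonstrated actions" loop looks roughly like this.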

  • What are some of the other companies and robots mentioned in the script?

    -The script mentions Boston Dynamics, Agility Robotics (with their robot Digit), Apptronik, Toyota, and Hyundai. It also discusses Tesla's Optimus robot and Nvidia's involvement in robotics through Project GR00T.

  • What are the potential future implications of widespread humanoid robot adoption?

    -The potential future implications include the replacement of human labor in various sectors, the possibility of alleviating humans from manual tasks, and the potential military applications of humanoid robots. There are also concerns about the impact on the job market and the need for societal adaptation to these technological changes.

Outlines

00:00

🤖 The Emergence of Humanoid Robots

This paragraph introduces the concept of humanoid robots becoming a part of everyday life, drawing parallels to the animated sitcom 'The Jetsons'. It discusses the launch of 'Figure1', a humanoid robot by tech startup Figure, capable of conversing with humans and responding to stimuli. The segment also raises questions about the future of robotics and AI, and their impact on society and the workforce.

05:02

🚀 Figure1's Technological Breakthroughs

The second paragraph delves into the technical aspects of Figure1, highlighting its ability to process natural language and perform tasks through its collaboration with OpenAI. It explains how the robot learns and executes actions using neural networks and multimodal models, emphasizing the impressive speed and precision of its movements. The paragraph also touches on the human-like imperfections in the robot's speech, adding to its realistic interaction.

10:04

🏭 Addressing the Global Labor Shortage

This section discusses the mission of Figure AI to address the global labor shortage by deploying humanoid robots in the workforce. It provides background on the company's rapid growth and funding, as well as the vision of its founder, Brett Adcock. The paragraph also compares Figure1 with other robots in the industry, noting its unique capabilities in natural language processing and intricate task handling.

15:07

🌐 The Future of Robotics and AI

The fourth paragraph explores the broader implications of advancements in robotics and AI, discussing the potential for automation to replace manual and cognitive jobs. It mentions the investment and interest from major tech companies like Nvidia and the development of industry standards. The segment also highlights the role of generative AI in enabling robots to understand their environment dynamically.

20:08

📈 The Implications for Society and the Workforce

The final paragraph reflects on the societal and economic impacts of humanoid robots, questioning whether they will replace humans entirely or just alleviate us from manual tasks. It mentions the potential military interest in such technology and acknowledges the split opinions on the future of robotics. The author expresses a commitment to closely monitor the developments in this field.

Keywords

💡Humanoid Robots

Humanoid robots are machines designed to mimic human form and function. In the video, the humanoid robot Figure1 is introduced, capable of having natural language conversations and performing tasks such as picking up trash or placing dishes on a drying rack. The development and integration of such robots into various industries are central to the video's theme of exploring the future of robotics.

💡Artificial Intelligence (AI)

Artificial Intelligence refers to the simulation of human intelligence in machines that are programmed to think and learn like humans. In the context of the video, AI is crucial for enabling robots to understand and process natural language, respond to external stimuli, and carry out tasks autonomously. The integration of AI is what sets Figure1 apart from previous robotic models.

💡Labor Shortage

Labor shortage refers to a situation where there are more job openings than there are workers to fill them. The video discusses how humanoid robots like Figure1 are being developed to address this issue by taking on roles in sectors with significant worker shortages, such as warehouse and transportation jobs.

💡Natural Language Processing (NLP)

Natural Language Processing is a subfield of AI concerned with the interaction between computers and humans through natural language. In the video, NLP is a key technology that enables robots like Figure1 to understand and generate human language, facilitating more intuitive and effective communication between humans and robots.

💡Investment

Investment in the context of the video refers to the financial backing received by Figure AI, which has allowed the startup to develop and advance its humanoid robot technology. The investment from major tech companies like Microsoft, Nvidia, and Intel Capital signifies the industry's belief in the potential of Figure1 and similar technologies.

💡Automation

Automation refers to the process of making a task or system operate automatically. In the video, automation is discussed as a key driver in the development of robotics, with the potential to replace or augment human labor in various industries. The advent of humanoid robots is seen as a significant step forward in automation technology.

💡Machine Learning

Machine Learning is a subset of AI that allows computers to learn from and make decisions based on data. In the video, machine learning is essential for robots to improve their performance over time, adapt to new tasks, and understand complex human behaviors and language.

💡Generative AI

Generative AI refers to AI systems that can create new content or data, such as writing code or generating images. In the video, generative AI is highlighted as a transformative technology that enables robots to dynamically understand and adjust to their environment without the need for hard coding, which was a limitation in previous robotic developments.
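
As a toy illustration of that shift away from hard coding, compare a hand-written placement rule with asking a multimodal model to decide from the scene itself; `vision_language_model` here is a hypothetical placeholder, not a specific product API.

```python
# Old style: every object and every placement rule spelled out by hand.
def place_object_hardcoded(obj: str) -> str:
    rules = {"plate": "drying rack", "cup": "drying rack", "apple": "fruit bowl"}
    if obj not in rules:
        raise ValueError(f"no rule for {obj}")   # anything unanticipated simply fails
    return rules[obj]

# Generative-AI style: let a multimodal model reason about the scene dynamically.
def place_object_learned(image, obj: str, vision_language_model) -> str:
    prompt = f"Looking at this scene, where should the {obj} go next?"
    return vision_language_model.ask(image=image, text=prompt)
```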

💡Future of Robotics

The future of robotics encompasses the potential developments and applications of robotic technology in the coming years. The video explores the implications of increasingly sophisticated and autonomous robots, including their potential to replace human labor, assist in dangerous tasks, and revolutionize industries.

💡Ethical Considerations

Ethical considerations involve the moral implications and responsibilities associated with the development and use of technology. In the context of the video, these considerations are crucial as humanoid robots become more prevalent, potentially replacing human jobs and raising questions about the role of machines in society.

Highlights

The Jetsons episode from 1962 depicted a future with humanoid robots, and now we are one step closer to that reality with the launch of Figure1 by tech startup Figure.

Figure1 is a humanoid robot capable of having human-style conversations and performing tasks autonomously, thanks to its integration with OpenAI.

The demonstration video is, according to the company, unaltered and shown in real time, showcasing the robot's ability to understand and respond to commands; some parts were shortened only for brevity.

Figure AI, founded in 2022, has rapidly gained traction, raising $675 million and achieving unicorn status with investments from major tech companies like Microsoft, Nvidia, and Intel Capital.

Figure's mission is to expand human capabilities through advanced AI, targeting the labor force to address the global labor shortage.

The company was founded by Brett Adcock, who has a history of successful tech startups, including an online talent marketplace and a company working on electric aircraft.

Figure1's abilities include understanding natural language, processing visual inputs, and performing actions with impressive speed and precision.

The robot issues motor commands at 200 Hz (every 5 milliseconds), far faster than typical human reaction times, letting it make the rapid micro-adjustments that complex tasks require.

Despite the impressive capabilities, there are still challenges to overcome, such as the latency in voice command responses and the robot's walking abilities.

The robotics industry is experiencing a boom, with companies like Amazon and Toyota investing in warehouse and logistics robots capable of performing human tasks.

Nvidia's Project GR00T aims to create a foundational model for humanoid robots, enabling them to understand language and learn skills quickly to interact with the real world.

The advancements in AI and robotics are expected to lead to significant changes in the job market, with estimates suggesting over 300 million jobs could be affected by AI advancements.

Figure has already signed an agreement with BMW to test Figure1 in their manufacturing plant, indicating a move towards real-world applications of the technology.

The future of humanoid robots raises questions about their role in society, whether they will replace humans entirely or just assist with manual tasks.

The development and adoption of humanoid robots are likely to accelerate, with continuous improvements expected from companies like Figure and others in the industry.

The potential applications of humanoid robots extend beyond domestic and industrial use, with possibilities in areas like the military and space exploration.

The robotics and AI advancements represent a shift towards the automation of automation, changing the landscape of jobs and the nature of work.

The future impact of these technologies on society will depend on how we collectively navigate the challenges and opportunities they present.

Transcripts

00:00

fascinating crazy maybe a little scary I

00:04

mean it sounds like we're talking about

00:05

not just using something to help humans

00:08

do their jobs but just replacing it

00:09

entirely could this be a new normal on a

00:13

on a large scale hi welcome to another

00:15

episode of Cold Fusion this scene is

00:18

from the popular animated American

00:19

sitcom the Jetsons this episode titled

00:22

Rosie the robot first aired on September

00:25

23rd 1962 the main character in the

00:28

scene is presented with a few variations

00:30

of a robot one that's responsive to

00:32

commands will do housework and basically

00:35

work as a personal assistant 61 years on

00:38

and we're one step closer to seeing that

00:40

reality this week a tech startup called

00:43

Figure launched a humanoid robot called

00:45

figure1 you can have conversations with

00:48

it thanks to their integration with open

00:50

AI now at this point you might be

00:52

thinking we've already seen robots like

00:54

this over the years whether it's from

00:56

Boston Dynamics or Tesla how is this any

00:59

different well first let's take a look

01:01

at the demo of the robot before we get

01:03

into the weeds keep in mind that this

01:05

demo as claimed by the company is

01:07

unaltered and in real time but for the

01:09

purposes of brevity I've shortened some

01:11

parts all right so let's watch great can

01:14

I have something to

01:15

eat sure

01:23

thing great can you explain why you did

01:28

what you just did while you pick up this

01:30

trash on it so I gave you the Apple

01:33

because it's the only uh edible item I

01:36

could provide you with from the

01:39

table so based on the scene right now

01:42

where do you think the dishes in front

01:44

of you go

01:45

next the dishes on the table like that

01:48

plate and cup are likely to go into the

01:50

drying rack next great can you put them

01:54

there of

01:58

course

02:03

after seeing how well it did for those

02:05

tasks and the human style conversation

02:07

some of you will be impressed some

02:09

excited and some even a little bit

02:11

scared according to the company this bot

02:14

isn't being operated by anyone it's able

02:17

to carry out full conversations with

02:18

people and respond to external stimuli

02:21

and that right there is one of the

02:23

primary differences between figure 01

02:25

and the other Android robots we've seen

02:27

so far but let's back up for a second

02:30

how did a startup only founded in 2022

02:33

make such Headway who is backing figure

02:36

and what are their

02:37

intentions given what Nvidia just

02:39

announced in the past week are we moving

02:41

towards an AI-driven robotics race and

02:44

most importantly how do these advances

02:47

in robotics and AI impact all of us

02:50

we'll take a look at all of that and

02:51

have a Roundup of all the latest robots

02:53

in this episode so sit back relax into

02:56

your docking station as we figure out

02:58

the future of robotics

03:03

you are watching cold fusion

03:11

TV it all started in the wake of COVID at

03:15

this time robotic sales began surging

03:17

hitting a record high in 2022 as the

03:20

workforce retreated these robots were

03:22

the kind that were used in warehouses

03:24

and assembly lines the exact sort of

03:26

predictable environment that was the

03:28

lowest hanging fruit to train and deploy

03:30

robots and amongst this backdrop comes

03:33

figure AI founded in 2022 figure AI Inc

03:37

is a California based Tech startup with

03:39

a focus on general purpose humanoid

03:41

robots within just 1 year of its

03:44

Inception the company raised $70 million

03:47

but now they've recently raised a

03:49

whopping 675 million and now have a

03:51

valuation of 2.6 billion officially a

03:55

unicorn and you know it's serious

03:57

business when the likes of Microsoft

03:59

Nvidia Jeff Bezos Intel capital and many

04:02

others have become investors figure's

04:04

Mission according to their website is to

04:06

quote expand human capabilities through

04:09

advanced AI they aim to do so by

04:11

specifically targeting the labor force

04:14

according to figure the labor force is

04:16

shrinking they mentioned that quote

04:18

there are 10 million unfilled jobs in

04:20

the United States 7 million of those job

04:22

openings are for essential roles such as

04:24

Warehouse transportation and Retail and

04:26

there are only 6 million people

04:28

available to fill these open positions

04:30

while key Warehouse suppliers predict

04:32

they will run out of people to hire by

04:34

2024 quote as we reach the upper limits

04:37

of our production capability humanoids

04:39

will join the workforce with the ability

04:41

to think learn and interact with their

04:43

environment safely alongside

04:46

[Music]

04:51

us figure founder Brett Adcock is no

04:54

stranger to the tech industry he's

04:57

previously founded two other companies

04:58

Vettery and Archer Aviation Vettery is an

05:02

online Talent Marketplace and in 2018

05:04

they were acquired for 100 million

05:06

Archer Aviation is now publicly traded

05:09

and they're working on building electric

05:10

vertical takeoff and Landing aircraft so

05:13

with his experience Adcock has built a

05:16

team of Industry veterans including

05:17

professionals who have worked at Boston

05:19

Dynamics Google Deep Mind and

05:22

Tesla perhaps their biggest breakthrough

05:25

has come in the form of their

05:26

partnership with open AI in a press

05:28

release in February 2024 figure announced

05:31

their collaboration and it will help

05:33

quote accelerate figure's commercial

05:34

timeline by enhancing the capabilities

05:36

of its robots to process and reason from

05:39

natural language using open AI research

05:41

and large multimodal models in simple

05:44

terms it allows for one to have full

05:46

conversations with the robot and I

05:47

suspect will also make it better at

05:49

computer vision so ultimately the idea

05:52

is simple build autonomous humanoid

05:54

robots to address the global labor

05:56

shortage and to do just that figures

05:59

first real robot is

06:01

[Music]

06:03

here let's start by looking at how the

06:06

robot actually works first of all every

06:08

single action and behavior in the video

06:11

was learned and not teleoperated the

06:13

robot is equipped with cameras and

06:15

microphones so when someone speaks and

06:17

asks it to do something the transcribed

06:19

texts and images from the cameras are

06:21

captured to a large multimodal model

06:23

which is trained by open AI to

06:25

understand both text and images the

06:27

model processes the full conversation

06:29

including previously shared images it

06:31

then decides which specifically learned

06:33

closed-loop action to take by selecting

06:35

and loading the relevant neural network

06:37

configurations onto the GPU and then

06:39

executing it the mechanical action of

06:42

figure 01 is rapid at 200 HZ which

06:45

basically means that after it knows what

06:46

to do it sends back instructions at 200

06:49

times per second that's 5 milliseconds

06:52

of reaction time for all those micro

06:53

adjustments humans are in the order of

06:55

250 milliseconds and fun fact a cat's

06:58

reaction time is around 30

07:00

milliseconds so essentially they are

07:03

using a combination of different neural

07:04

networks to sort through audio and

07:06

visual inputs understand them and then

07:08

do the appropriate actions it's actually

07:11

kind of amazing all right so how do you

07:13

think you did I I think I did pretty

07:16

well the apple found its new owner the

07:18

trash is gone and the tableware is

07:20

right where it belongs did you notice

07:22

the speech hesitation there it's the

07:24

little humanlike imperfections that

07:26

allows us as viewers to break the

07:28

illusion that it's an inanimate robot

07:30

we've already seen this in chat GPT

07:32

voice hey chat GPT can you describe a

07:36

nice kitchen scene but make it very

07:38

casual and breathe and say um and and

07:41

things that sound very natural while

07:43

you're

07:46

describing sure

07:48

thing picture this you're in this cozy

07:51

kitchen right the sun's peeking through

07:53

the curtains giving that warm glow you

07:55

can almost smell the aroma of fresh

07:57

coffee brewing um there's this gentle

08:00

hum of the fridge in the background kind

08:02

of soothing you know and uh maybe

08:05

there's a bit of chatter as someone's

08:07

flipping Pancakes on the stove it's just

08:09

this laidback Vibe like time's taking a

08:12

chill pill to see it play out in a real

08:14

robot was only the next

08:16

Frontier as aside in an episode I made

08:19

back in late 2021 I talked about how it

08:21

was only a matter of time before a

08:23

company cracked the code of mixing

08:25

dexterity mobility and natural language

08:27

into a single robot and through 3 years

08:29

later I think we're getting close but

08:32

all that being said as you can see

08:34

there's still some latency between

08:36

talking and getting a response and I

08:38

think that's from open AI side mainly

08:40

although we don't know the intricate

08:41

details of the integration voiced

08:43

instructions probably still have to go

08:45

up to a server and come back I've

08:47

noticed the same delay issue when

08:49

playing around with chat GPT voice so

08:51

there's still some work to be done if

08:53

these robots are going to be used in

08:54

large scale commercial applications

08:56

imagine an emergency response robot just

08:58

standing there for 10 seconds while

09:00

someone's having a heart

09:05

attack others are simply not impressed

09:08

months ago figure showed a demo of the

09:10

robot making coffee some people back

09:12

then asked if the robots are also

09:14

learning bad habits from the training

09:15

data the figure robot is also not the

09:18

best at walking but we can't be too

09:20

harsh it's still an early stage product

09:22

as I always say criticizing is easy but

09:25

building is hard with also such an

09:28

incredible amount of investment and very

09:30

few demo videos to show some have even

09:32

drawn the comparison to a certain Trevor

09:34

Milton if you're a regular cold fusion

09:36

viewer you'll know what that's all about

09:38

but obviously this is way too soon for

09:40

that

09:45

[Music]

09:46

conversation broadly speaking it's an

09:48

exciting time for the robotics industry

09:50

so the playing field for figure AI isn't

09:52

going to be Barren by any means let's

09:55

take a quick look at the latest in the

09:57

industry first up this digit which is

09:59

made by agility Robotics and backed by

10:01

Amazon before now robots could only

10:04

replace part of a human job enough to

10:06

make coffee or an omelet but this is

10:10

digit a robot it's maker say is ready to

10:12

do what human warehouse workers do and

10:15

they can lift about 35 lbs you know the

10:17

the robot weighs about what a person

10:18

would weigh it can reach about the size

10:20

of what a person can reach you know

10:22

we're really going for that kind of like

10:23

OSHA regulations and requirements for

10:25

the robot agility robotics is funded in

10:27

part by Amazon which is investing a

10:29

billion dollars in new Industrial

10:31

Technologies the fact that it can enter

10:33

places that were designed for humans um

10:36

and solve those challenges that exist

10:38

without having to rebuild uh the

10:39

environment for Automation and it

10:42

doesn't quit and it doesn't quit there

10:44

are about

10:45

$250,000 so they're not inexpensive but

10:48

you know over just a few years they're

10:51

worth it the digit robots will be

10:53

deployed in Amazon warehouses and the

10:55

company aims to produce 10,000 units per

10:58

year so Jake fascinating crazy maybe a

11:02

little scary I mean it sounds like we're

11:03

talking about not just using something

11:05

to help humans do their jobs but just

11:08

replacing it entirely could this be a

11:10

new normal on a on a large scale I mean

11:12

this is the thing right as a tech

11:13

correspondent for years I've had

11:15

companies tell me you know well this

11:16

technology is for augmenting a human or

11:18

making a human's job a little bit easier

11:20

in this case it is truly right out and

11:22

out there to replace a human worker

11:25

Amazon warehouse workers you know that's

11:26

one of the largest employers in America

11:28

people really go for that job and so you

11:31

know we're really I think at this sort

11:32

of historic turning point where suddenly

11:34

it's no longer the case of of of you

11:36

know when could technology replace a set

11:39

of jobs but this is truly something

11:41

that's built to do that next up Apptronik

11:44

is designed to move Goods around a

11:45

warehouse and can work for 22 hours per

11:48

day all you need to do is swap out the

11:50

battery every few hours deliveries start

11:52

in 2025 the company's also working with

11:55

NASA on a robot called Valkyrie and that

11:57

aims to work in dangerous environments

11:59

and perhaps even future space

12:01

walks Toyota and Hyundai are also

12:04

currently working on robots of their own

12:06

but I'd say less seriously than the

12:07

others compared to other robots as

12:09

mentioned earlier figure 01's key

12:11

difference is the ability to directly

12:13

Converse in natural language in order to

12:15

get tasks done for you it's not a new

12:17

concept as about a year ago on Cold

12:19

Fusion we covered a project from Google

12:21

that did the same thing at that time we

12:24

saw that the concept was still in its

12:25

infancy and the robot was quite slow

12:28

earlier this year here we saw Elon Musk

12:30

sharing a video of Tesla's Optimus robot

12:32

folding some clothing at first glance it

12:35

was impressive but the curtains came

12:36

down later when Elon added that Optimus

12:39

couldn't do it autonomously but since

12:41

then there's been footage of the Tesla

12:43

bot carrying boxes watering plants

12:45

walking crouching handling delicate

12:48

objects such as eggs and Performing

12:50

other domestic activities interestingly

12:52

these abilities come from the knowledge

12:54

and benefits from Tesla's self-driving

12:56

efforts which is pretty cool but again

12:58

as far as we know it lacks that natural

13:00

language

13:02

aspect perhaps the atlas robot by Boston

13:04

Dynamics is the most agile robot we've

13:06

ever seen we've seen it organized

13:09

objects and even perform back flips but

13:11

is this and the figure1 robot really the

13:13

same thing Boston dynamics's Atlas is

13:16

definitely more focused on Mobility

13:18

whereas figure seems to be suited for

13:20

intricate tasks there's more of a focus

13:22

on finger dexterity alluded to by the

13:25

five fingers with three joints on them

13:27

and for factory deployment this this

13:29

could make more sense in a lot of cases

13:32

assembly line work is done by sitting

13:33

down or being stationary at a specific

13:36

location for long periods of time and as

13:38

we were working on this video Nvidia

13:40

announced project Groot a general

13:42

purpose foundational model for humanoid

13:44

robots it's meant to help humanoid

13:46

robots quote understand natural language

13:49

and emulate movements by observing human

13:51

actions quickly learning coordination

13:53

dexterity and other skills in order to

13:55

navigate adapt and interact with the

13:57

real world

14:01

[Applause]

14:05

about the same

14:06

size the intersection of computer

14:09

Graphics physics artificial intelligence

14:12

it all came to bear at this moment well

14:16

I think we have some special

14:19

guests do

14:24

we little Jetson robotics computers

14:27

inside they learn to walk in Isaac Sim

14:32

let's

14:33

go five things where you

14:38

going I sit right

14:42

here the system removes some of the leg

14:45

work from training robots in the real

14:46

world via simulation the lessons and

14:49

skills learned can be put into a

14:50

knowledge base and transferred to other

14:52

robots for their part Nvidia wants Groot

14:54

machines to be able to understand human

14:57

language sense and navigate the world

14:59

around themselves uh and Carry Out

15:02

arbitrary tasks the training part of the

15:04

stack starts with video so the model can

15:06

sort of intuitively understand physics

15:09

then accurately modeling the robots in

15:10

Virtual space in order to teach the

15:13

model how to move around and interact

15:15

with terrain objects or other robots

15:17

Jensen described this as a gym here the

15:21

model can learn Basics far quicker than

15:23

in the real world as we've covered in

15:26

another episode on the history and size

15:27

of Nvidia they're already turning heads with

15:29

their stocks growing at unprecedented

15:31

levels this year I'll leave a link to

15:33

that episode in the description as well

15:35

the company has also mentioned that

15:37

they're working with Boston Dynamics

15:38

agility Robotics Apptronik and of course

15:41

figure Ai and just thinking about this I

15:45

can already see an industry standard

15:47

forming open AI provides the

15:49

foundational models Nvidia provides some

15:51

of the processing hardware and software

15:53

packages and finally there's going to be

15:55

a plethora of different robot bodies by

15:57

different vendors so why is this all

16:00

happening now well the reason that a lot

16:02

of this stuff didn't work in previous

16:03

decades was because the instructions for

16:06

the robots had to be laboriously hand

16:08

coded now with generative Ai and the

16:11

latest generations of models robots are

16:14

beginning to be able to dynamically

16:15

understand their environment and adjust

16:17

seamlessly with no hard coding the

16:20

amount of change we're going to see over

16:21

the next 5 years 10 years will dwarf

16:26

everything that's happened over the last

16:27

30 we went from automating pen and paper

16:31

right with spreadsheets to connecting

16:33

PCs into local networks to connecting

16:36

networks to get the internet to getting

16:38

the network effect of communicating

16:40

people globally but now we're

16:41

introducing machine intelligence you know

16:43

artificial intelligence machine learning

16:44

deep learning neural networks Etc so and

16:47

what that enables is the automation of

16:49

automation right and so the people who

16:52

were writing software particularly at

16:55

the lower end unless you are doing these

16:57

Advanced things they're gone right the

16:59

people that s is writing itself in doing

17:01

itself right it's just math programmer

17:03

back in the day so when I was writing

17:04

code it was the algorithms were if this

17:07

than that right ex or whatever it may be

17:09

right but you had to guess right you had

17:12

to use your best instincts then it

17:14

got smarter and smarter and smarter and

17:15

libraries to do bigger and better things

17:17

all that is being

17:19

automated the progress of automation has

17:22

been a funny story to watch unfold if

17:24

you've been following the tech World

17:25

closely for long enough you'd remember

17:27

all of those discussions over a decade

17:29

ago about how manual labor jobs would

17:31

eventually be taken by robots but as

17:33

we've seen since the AI boom a couple of

17:35

years ago it was the cognitive jobs that

17:38

were the first to be impacted in 2017

17:41

only one in five companies used AI

17:43

according to McKinsey now half of all

17:45

companies

17:46

do we've already seen many companies

17:49

laying off thousands of employees in

17:51

part due to technological advancements

17:53

like AI is completely the opposite of

17:55

what most people thought would happen

17:57

Goldman Sachs last year reported

17:58

that over 300 million jobs

18:00

could be lost or diminished by AI most

18:03

at risk are legal architecture and

18:05

engineering business and financial

18:07

operations management sales Health Care

18:11

Art and Design Goldman Sachs also thinks

18:14

that the price of robots and their

18:15

constituent components will drop

18:17

drastically and they've already dropped

18:19

drastically since last

18:21

year unlike humans robots don't need to

18:24

take a break they don't need to sleep go

18:26

on holidays strike and don't complain so

18:30

as soon as these robots are feasible and

18:31

cheap enough companies will be chomping

18:33

at the bit to install them so what do we

18:36

do about all of this well it's your

18:38

lucky day because I've already done an

18:39

episode on life after automation it's a

18:42

huge topic and will just be way too long

18:44

to get into in this episode so I'll

18:45

leave another link to that

18:49

below okay so back to figure one at this

18:52

point you might think all right the

18:54

figure demo looks cool but surely it's

18:57

going to be a long time until these

18:58

human robots are in production well it

19:00

seems like figure didn't want to waste

19:02

any time even before the demo earlier in

19:05

the year figure signed an agreement with

19:07

BMW manufacturing they're going to test

19:09

the figure 01 robot at the car makers

19:11

plant in Spartanburg South Carolina but

19:13

we're yet to get reports on how the 01

19:15

is performing and what tasks it's going

19:17

to do exactly but I still think that's

19:19

pretty

19:20

[Music]

19:22

interesting humanoid robots like the

19:25

figure one are a future conundrum will

19:27

they replace humans completely or just

19:29

alleviate us from menial tasks leaving

19:31

the more important stuff for humans but

19:33

on the other hand of course the military

19:36

would probably want to have a sniff at

19:37

humanoid robots in the future too

19:39

looking at consumer products though

19:41

we're pretty far away from getting a

19:42

highly functional robot that's cheap

19:44

that you can buy at your local store but

19:46

the day will probably arrive figure 01

19:50

is the worst robot from this company so

19:52

there's only going to be Improvement

19:54

aside from that the competition is

19:55

heating up so if it's not them it's

19:57

going to be another company that's going

19:59

to crack the code from the broadest

20:01

perspective possible I'm sure opinions

20:03

from all of you will be split for the

20:05

engineering types this is an excellent

20:07

display of human Ingenuity for the

20:10

Skeptics they'll say it's all pointless

20:12

and we should focus our efforts

20:13

elsewhere the optimists will see this as

20:16

an exciting New Era for Humanity free of

20:18

drudgery regardless whether that future

20:21

turns into the Jetsons the Terminator or

20:24

Utopia we'll just have to let Humanity

20:26

collectively figure that one out all I

20:28

know is that I'll be watching this space

20:31

closely and that's where we are with

20:33

figure one and Robotics today I hope you

20:35

enjoyed that or found it interesting and

20:37

if you did feel free to share it with

20:38

someone who would find it interesting as

20:40

well all right so that's about it for me

20:42

my name is Dagogo and you've been

20:44

watching cold fusion and if you're new

20:46

feel free to subscribe if you like it's

20:48

free cheers guys have a good

20:51

[Music]

20:57

one

20:58

[Music]

21:19

cold fusion it's new

21:24

thinking


Related tags
Robotics, Artificial Intelligence, Labor Automation, Humanoid Robots, Tech Innovation, Figure1, Natural Language Processing, FutureOfWork, Investment, Industry Advancement