AWS CEO Adam Selipsky On Amazon’s $100 Million Investment Into Generative AI

CNBC
23 Jun 2023 · 27:18

TLDR: Adam Selipsky, CEO of AWS, discusses Amazon's generative AI strategy, emphasizing the company's long-standing commitment to machine learning and its extensive use of large language models. He highlights the launch of SageMaker, AWS's machine learning platform, and the development of custom chips like Trainium and Inferentia to support AI operations. Selipsky also mentions the new $100 million generative AI innovation center and Amazon Bedrock, a managed service offering diverse AI models. He asserts AWS's position as a leader in cloud computing and generative AI, focusing on customer needs and democratizing AI technology.

Takeaways

  • 🤖 Amazon has been investing in machine learning for about 25 years, starting with personalization on its website.
  • 🚀 In 2017, AWS launched SageMaker, a machine learning platform, which now serves over 100,000 customers.
  • 🌐 AWS claims to host the majority of machine learning happening in the cloud, across all cloud providers.
  • 🧠 Amazon has developed its own Large Language Models (LLMs) and has been implementing them for years, including in Alexa and retail website search.
  • 💡 AWS is taking a full stack view of generative AI, from designing its own chips like Trainium for training and Inferentia for inference, to providing cloud compute capacity for AI models.
  • 🔧 Amazon's in-house chips are designed to offer industry-leading price-performance for AI training and inference.
  • 📈 Amazon is investing $100 million into a new generative AI innovation center to work directly with customers on AI solutions.
  • 🛠️ Amazon Bedrock is a managed service for generative AI, offering a choice of Amazon's own models and those from leading startups.
  • 🌟 AWS emphasizes the importance of choice in AI, believing that different customers and use cases will require different solutions.
  • 💼 Amazon believes that its long-term focus and customer obsession will keep it at the forefront of the cloud and AI industries.
  • 🏢 Despite economic uncertainty and cost-optimization efforts by customers, AWS sees robust growth ahead for cloud adoption and generative AI applications.

Q & A

  • How long has Amazon been involved in machine learning and AI?

    -Amazon has been involved in machine learning and AI for approximately 25 years, starting with personalization on the Amazon website.

  • What was the significance of AWS launching SageMaker in 2017?

    -The launch of SageMaker in 2017 marked the introduction of Amazon's machine learning platform, which now serves over 100,000 customers running machine learning workloads, indicating that a significant portion of cloud-based machine learning happens on AWS.

  • What are the different forms of generative AI that Amazon has been working on?

    -Amazon has been working on various forms of generative AI, including large language models (LLMs) and image models, with applications in retail website search and voice responses for Alexa, among others.

  • How does AWS approach the development and implementation of generative AI?

    -AWS takes a full stack view of generative AI, focusing on model development, chip design for training and inference, and providing customers with a choice of solutions, including their own chips like Trainium and Inferentia, as well as GPUs.

  • What is the purpose of Amazon's investment in a new generative AI innovation center?

    -The investment in the generative AI innovation center aims to bring Amazon's expertise, including data scientists and engineers, together with customers to understand their problems and design specific solutions utilizing generative AI on AWS.

  • How does the Amazon Bedrock service democratize AI?

    -Amazon Bedrock is a managed service for generative AI that provides customers with access to a variety of models, including Amazon's own Titan models and those from leading startups, thus democratizing AI by offering choices and enabling customers to experiment and find the best fit for their needs.

  • What is AWS's strategy in response to Wall Street's skepticism about Amazon's generative AI capabilities?

    -AWS focuses on customer needs and has been working on AI for a long time, accumulating extensive experience and developing a full stack of capabilities. They believe in providing customers with choices and solutions rather than focusing on perceptions in the market.

  • How does AWS plan to maintain its leadership in the cloud market amidst competition and growth slowdowns?

    -AWS plans to maintain its leadership by continuing to innovate faster than competitors, providing the best security, operational excellence, and the broadest and deepest set of capabilities, including in generative AI.

  • What is the role of ARM in Amazon's chip design efforts?

    -ARM is a critical design partner for Amazon in its in-house chip design efforts, including the development of its general-purpose Graviton chips, which are more energy-efficient than equivalent x86-based chips.

  • How does the demand for chips like Nvidia's H100 affect Amazon's CapEx plans?

    -The high demand for chips used in generative AI does affect Amazon's CapEx plans, as they will need to invest in infrastructure and servers to accommodate the shift. However, Amazon's own chip designs and supply chain control provide them with a significant advantage in meeting customer demands.

  • What is the significance of Amazon's CodeWhisperer tool in the context of generative AI?

    -CodeWhisperer is a coding companion developed by Amazon that uses generative AI to reduce the time to complete coding tasks by up to 57%. It exemplifies Amazon's practical application of generative AI to improve efficiency and productivity.

Outlines

00:00

🤖 Introduction to Amazon's Generative AI Strategy

Adam Selipsky, CEO of AWS, discusses Amazon's journey in machine learning, starting with personalization on the Amazon website and leading up to the launch of SageMaker in 2017. He emphasizes AWS's position as a leader in cloud-based machine learning, with over 100,000 customers. Selipsky also talks about Amazon's development and implementation of large language models (LLMs) and generative AI, highlighting their use in retail website search and Alexa's voice responses. He mentions that Amazon has been working on LLMs for years, predating the surge in popularity of models like ChatGPT. Selipsky outlines AWS's comprehensive approach to generative AI, including designing its own chips for training and inference, offering a range of options to customers, and the introduction of Amazon Bedrock, a managed service for generative AI.

05:02

🚀 Amazon's Investment in Generative AI Innovation

Selipsky addresses the market's perception of AWS's position in generative AI, particularly in comparison to other tech giants like Google and Microsoft. He clarifies that AWS's $100 million investment into a new generative AI innovation center is just one part of their broader AI efforts. He emphasizes that Amazon has spent billions on AI over the years and that their investment is not limited to this single figure. Selipsky also introduces Amazon Bedrock, which aims to democratize AI by providing customers with a choice of models, including Amazon's own Titan models and those from other startups. He stresses the importance of choice and experimentation in the AI space and positions AWS as a leader in providing these options to customers.

10:02

🌟 Amazon's Focus on Customer Needs in AI

The conversation shifts to how Amazon's generative AI innovation center will directly engage with customers, utilizing experts like data scientists and engineers to understand and solve their problems. Selipsky counters Wall Street's skepticism about Amazon's AI capabilities by highlighting the company's extensive experience and expertise in AI. He also mentions Amazon's launch of CodeWhisperer, a coding companion that significantly reduces development time. Selipsky asserts that AWS's unique full-stack capabilities, including chip design and a choice-oriented service model, position it favorably compared to competitors like Microsoft and Google. He emphasizes that AWS is focused on meeting customer needs and building for the future, rather than immediate recognition or comparison with other companies.

15:04

📈 The Future of Cloud and AI at AWS

Selipsky discusses the evolution of the cloud market and AWS's position as a leader. He addresses concerns about slowing growth in the cloud sector and the potential impact of generative AI on AWS's business. He argues that the cloud is essential for generative AI due to the significant computing and storage requirements, and that AWS's focus on security, operational excellence, and a broad range of capabilities will continue to attract customers. Selipsky also talks about the economic uncertainty affecting tech companies and how AWS is helping customers optimize costs. He expresses optimism about the future growth of cloud computing and AWS's role in it, despite predictions from analysts like Morgan Stanley about potential shifts in market share.

20:07

💡 Collaboration with ARM and AWS's Chip Strategy

Selipsky elaborates on AWS's partnership with ARM, a key player in the chip industry, and how it has evolved over the years. He discusses the development of AWS's own chips, including the energy-efficient Graviton series, and the importance of these efforts in terms of sustainability and performance. Selipsky stresses the value of choice in the chip market and AWS's commitment to ensuring a consistent supply chain for its customers. He also addresses the potential impact of the chip shortage on AWS's capital expenditure plans and the company's strategy to meet the growing demand for generative AI compute capacity.

Keywords

💡Generative AI

Generative AI refers to artificial intelligence systems that can create new content, such as text, images, or audio. In the context of the video, it is a key focus area for Amazon, with the development of large language models and other models that can be applied in various industries beyond just chat applications. The video discusses Amazon's investment in generative AI, including the creation of an innovation center and the development of specific chips for AI training and inference.

💡SageMaker

SageMaker is Amazon Web Services' (AWS) machine learning platform, which allows users to build, train, and deploy machine learning models quickly. The video highlights that SageMaker has over 100,000 customers and is a significant part of AWS's generative AI strategy, as it facilitates the adoption of machine learning in the cloud across various sectors.
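To make that "build, train, and deploy" workflow concrete, here is a minimal sketch using the SageMaker Python SDK. The training script, S3 path, IAM role, and framework versions below are placeholder assumptions for illustration, not details from the interview.

```python
# Minimal sketch: training and deploying a model with the SageMaker Python SDK.
# The entry-point script, S3 path, and IAM role ARN are hypothetical placeholders.
import sagemaker
from sagemaker.pytorch import PyTorch

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # placeholder role

estimator = PyTorch(
    entry_point="train.py",        # your training script (placeholder)
    role=role,
    framework_version="2.0",       # versions available may differ in your account
    py_version="py310",
    instance_type="ml.m5.xlarge",  # managed training instance
    instance_count=1,
    sagemaker_session=session,
)

# Launch a managed training job, then host the trained model on a real-time endpoint.
estimator.fit({"training": "s3://my-bucket/training-data/"})
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")
```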

💡Large Language Models (LLMs)

Large Language Models, or LLMs, are AI models that process and generate human-like text based on the input they receive. In the video, LLMs are a specific type of generative AI discussed, which Amazon has been developing and implementing in retail website search and voice responses for Alexa, among other applications.

💡Trainium and Inferentia Chips

Trainium and Inferentia are custom-designed chips by Amazon specifically for machine learning tasks. Trainium chips are used for training AI models, while Inferentia chips are used for running models in production. These chips are highlighted in the video as a part of Amazon's strategy to offer industry-leading price-performance for AI workloads and to ensure a robust supply chain for the necessary hardware.
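As an illustration of how these chips surface to developers, here is a minimal sketch of compiling a PyTorch model for Inferentia2 with the AWS Neuron SDK. It assumes an Inf2 instance with the torch-neuronx package installed, and the ResNet model is just a stand-in; none of these specifics come from the interview.

```python
# Minimal sketch: compiling a PyTorch model for AWS Inferentia2 with the Neuron SDK.
# Assumes an inf2 instance with torch-neuronx installed; the model is a placeholder.
import torch
import torch_neuronx
from torchvision.models import resnet50

model = resnet50(weights=None).eval()     # placeholder model
example = torch.rand(1, 3, 224, 224)      # example input used for tracing

# Ahead-of-time compile the model for the NeuronCores on the instance.
neuron_model = torch_neuronx.trace(model, example)

# Inference then looks like ordinary PyTorch; the compiled graph runs on Inferentia.
with torch.no_grad():
    logits = neuron_model(example)
print(logits.shape)
```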

💡Amazon Bedrock

Amazon Bedrock is a managed service mentioned in the video that is designed to democratize AI by providing users with access to a variety of generative AI models. It is part of Amazon's effort to offer choice in AI solutions and is currently in private preview. Bedrock will include Amazon's own Titan models as well as models from leading startups, emphasizing the importance of choice in AI applications.
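For a sense of what a "managed service with a choice of models" looks like in practice, here is a minimal sketch of calling an Amazon Titan text model through Bedrock with boto3. Bedrock was still in preview at the time of the interview, so the client name, model ID, and request format shown reflect the API as later made generally available and are assumptions here.

```python
# Minimal sketch: invoking an Amazon Titan text model via Bedrock with boto3.
# Model ID and request body format are assumptions based on the later GA service.
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

body = json.dumps({
    "inputText": "Summarize the benefits of managed generative AI services.",
    "textGenerationConfig": {"maxTokenCount": 256, "temperature": 0.5},
})

response = client.invoke_model(
    modelId="amazon.titan-text-express-v1",  # one of several selectable models
    contentType="application/json",
    accept="application/json",
    body=body,
)

result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```

Swapping the modelId (with a provider-specific request body) is how a customer would experiment with a different provider's model behind the same service, which is the "choice" Selipsky emphasizes.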

💡AI Democratization

AI democratization refers to making artificial intelligence technologies accessible and affordable to a wide range of users, not just a select few. In the video, this concept is central to Amazon's strategy, with the aim of providing customers with various options for AI models and services, thus enabling more businesses to leverage AI in their operations and innovation.

💡Cloud Computing

Cloud computing is the delivery of computing services over the internet, allowing for the storage, processing, and management of data remotely. The video discusses the importance of cloud computing in the context of generative AI, emphasizing that the cloud is essential for the significant computing and storage needs of AI applications and is a key area where AWS sees growth and investment.

💡Graviton Chips

Graviton chips are Amazon's own general-purpose processors designed to power its cloud computing services. The video mentions Graviton3, the third generation of these chips, which is more energy-efficient than equivalent x86-based chips. This highlights Amazon's commitment to innovation in chip design and sustainability in its cloud infrastructure.
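As a small illustration, here is a sketch of launching a Graviton3-based EC2 instance with boto3. The region, the SSM parameter name used to look up an arm64 AMI, and the instance size are assumptions for the example and may differ in your account.

```python
# Minimal sketch: launching a Graviton3-based EC2 instance (m7g family) with boto3.
# The SSM parameter for the arm64 Amazon Linux 2023 AMI is an assumption; substitute
# any arm64 AMI you actually use.
import boto3

ssm = boto3.client("ssm", region_name="us-east-1")
ec2 = boto3.client("ec2", region_name="us-east-1")

# Look up a current arm64 (Graviton-compatible) Amazon Linux AMI.
ami_id = ssm.get_parameter(
    Name="/aws/service/ami-amazon-linux-latest/al2023-ami-kernel-default-arm64"
)["Parameter"]["Value"]

# m7g instances run on Graviton3; the equivalent x86 workload would use e.g. m7i.
ec2.run_instances(
    ImageId=ami_id,
    InstanceType="m7g.large",
    MinCount=1,
    MaxCount=1,
)
```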

💡ARM

ARM is a semiconductor and software design company whose technology is used in a wide range of processors. In the video, ARM is noted as a critical partner for Amazon in the design and development of their in-house chips, including the Graviton series. The collaboration with ARM is important for Amazon to maintain control over their supply chain and to continue to innovate in chip design.

💡Investment in AI

The video discusses Amazon's significant investment in AI, including the establishment of a new generative AI innovation center with a $100 million investment. This highlights Amazon's commitment to advancing AI technologies and working directly with customers to solve problems using generative AI, reflecting the company's broader strategy to integrate AI into various aspects of its business and services.

Highlights

Amazon has been engaged in machine learning for approximately 25 years, initially focusing on personalization for the Amazon website.

In 2017, AWS launched SageMaker, a machine learning platform that currently has over 100,000 customers.

The majority of machine learning in the cloud happens on AWS, making it a central hub for such activities.

Amazon has developed its own Large Language Models (LLMs) and has been implementing them for quite some time, including in retail website search and Alexa's voice responses.

Amazon's investment in generative AI includes designing their own chips, Trainium for training models and Inferentia for running models in production.

Amazon's chips are designed to provide industry-leading price-performance for AI and machine learning tasks.

AWS offers a full stack view of generative AI, from model development to deployment and inference.

Amazon is investing $100 million in a new generative AI innovation center focused on technology development and direct customer engagement.

Amazon's approach to AI involves providing customers with choices, including access to various models and the ability to build their own.

Amazon Bedrock is a managed service for generative AI that will offer a range of models from Amazon and other startups.

Amazon has accumulated extensive experience in AI and has thousands of practitioners working on generative AI specifically.

Amazon's generative AI efforts include CodeWhisperer, a coding companion that significantly reduces the time to complete coding tasks.

Amazon is focused on a broad range of applications for generative AI, aiming to innovate across various sectors of the economy.

AWS's market leadership in cloud services is bolstered by its focus on customer needs and innovation in generative AI.

Amazon's strategy includes democratizing AI and providing a variety of solutions to meet the diverse needs of different customers.

Amazon is committed to fair competition and allowing customers to choose the best solutions for their needs.

Amazon's in-house chip development efforts with ARM as a partner have resulted in energy-efficient general-purpose chips like Graviton.

Amazon continues to invest in generative AI and cloud infrastructure, expecting robust growth and innovation in the future.