The RIGHT WAY To Build AI Agents with CrewAI (BONUS: 100% Local)

Matthew Berman
15 Apr 2024 · 19:46

TLDR: This video tutorial demonstrates the optimal setup for a CrewAI team using Lightning AI, a cloud-based code editor that facilitates collaboration and open-source model integration. The presenter guides viewers through creating a modular financial analyst crew, defining tasks and agents in YAML files, and utilizing Gradio to expose an API. The process showcases the ease of environment management, automatic saving, and the ability to switch between different AI models like GPT-4 and Mixtral, all powered by Lightning AI's GPUs for a fast and efficient workflow.

Takeaways

  • 🌟 The video demonstrates the optimal setup for a Crew AI team using Lightning AI, a cloud-based code editor that facilitates collaboration and open-source model integration.
  • 🛠️ The presenter introduces a modular structure for the Crew AI codebase, utilizing YAML to define agents and tasks, which simplifies the main.py file.
  • 📁 The process involves creating a new studio in Lightning AI, setting up a source folder, and organizing tasks and agents within specific subfolders for clarity and structure.
  • 🔍 The 'research company task' is defined to gather stock information using a search tool, aiming to prepare data for informed analysis of a company's stock performance.
  • 📊 The 'analyze company task' is outlined to include various financial metrics for a comprehensive financial analysis, using the provided financial information.
  • 👥 Two agents are created: 'company researcher' and 'company analyst', each with specific roles and goals aligned with the tasks, and both set to not delegate tasks.
  • 📝 The script details the creation of a main file that integrates all agents and tasks, showcasing how to import necessary libraries and set up the Crew AI project structure.
  • 🔗 The video explains how to use LangChain and Gradio to power the CrewAI team with different models, including the option to use open-source models with Lightning AI's GPU support.
  • 🚀 The presenter successfully demonstrates running the CrewAI team with Groq, a fast LLM inference service, and then runs it with an open-source model, Mixtral, powered by Lightning AI's infrastructure.
  • 🔧 The process includes troubleshooting steps such as installing necessary packages and adjusting the code to accommodate different AI models.
  • 🎉 The final result showcases the CrewAI team functioning with the open-source model, Mixtral, and emphasizes the speed and effectiveness of the setup.

Q & A

  • What is the main topic of the video?

    -The main topic of the video is demonstrating the optimal way to set up a CrewAI team using Lightning AI, a cloud-based code editor, together with open-source models.

  • Who is the founder of Crew AI and what role does he play in the video?

    -The founder of CrewAI (João Moura) is not explicitly named in the transcript, but he has shown what the future of a CrewAI codebase looks like, and the video demonstrates that structure.

  • What is Lightning AI and how does it benefit the process shown in the video?

    -Lightning AI is a cloud-based code editor that allows collaborating on code in the cloud and powering open-source models. It benefits the process by providing a fresh environment every time, saving time on Python environment management.

  • What is the structure of the Crew AI code framework described in the video?

    -The structure of the Crew AI code framework is modular, with separate areas for tools, YAML files to define agents and tasks, and everything piping into a very short main.py file.

  • How does the video guide the creation of a new studio in Lightning AI?

    -The video guides the creation of a new studio by instructing viewers to sign up for a Lightning account, click 'New Studio', and then 'Start' to create a code studio in the cloud.

  • What is the purpose of creating a 'source' folder and a 'financial analyst crew' folder within it?

    -The 'source' folder and the 'financial analyst crew' folder are created to structure the project and organize the codebase, making it easier to manage and develop the AI team.
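
For readers who want to mirror that layout locally, here is a minimal sketch that creates the same kind of project skeleton; the folder and file names are assumptions inferred from this summary rather than taken verbatim from the video.

```python
from pathlib import Path

# Hypothetical project skeleton; the names used in the video may differ slightly.
root = Path("financial_analyst_crew")
package_dir = root / "src" / "financial_analyst_crew"
config_dir = package_dir / "config"
config_dir.mkdir(parents=True, exist_ok=True)

# Empty placeholders for the files discussed in the video.
for name in ("main.py", "crew.py"):
    (package_dir / name).touch()
for name in ("agents.yml", "tasks.yml"):
    (config_dir / name).touch()

print(sorted(str(p) for p in root.rglob("*")))
```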

  • What is the significance of using YAML files to define agents and tasks?

    -Using YAML files to define agents and tasks allows for a clear and organized way to structure the codebase, making it easier to manage and potentially enabling the automatic creation of an API based on this structure.

  • What is the role of the 'agents.yml' and 'tasks.yml' files in the video?

    -The 'agents.yml' and 'tasks.yml' files are used to define the agents and tasks for the Crew AI team. They specify the roles, goals, and expected outputs of each agent and task.
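
As a rough illustration of what those files might contain, the snippet below embeds a hypothetical task and agent definition and parses them with PyYAML; the keys (description, expected_output, role, goal, backstory) follow common CrewAI conventions, but the exact wording is assumed rather than quoted from the video.

```python
import yaml  # pip install pyyaml

# Hypothetical contents for tasks.yml and agents.yml, assumed from this summary.
tasks_yml = """
research_company_task:
  description: >
    Gather recent stock and news information about {company_name}
    using the search tool so it can be analyzed.
  expected_output: >
    Raw financial data and headlines about {company_name}'s stock performance.
"""

agents_yml = """
company_researcher:
  role: Company Researcher
  goal: Collect the financial information needed to analyze {company_name}
  backstory: An expert at quickly finding reliable financial data.
"""

tasks = yaml.safe_load(tasks_yml)
agents = yaml.safe_load(agents_yml)
print(tasks["research_company_task"]["expected_output"])
print(agents["company_researcher"]["role"])
```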

  • How does the video demonstrate the use of an open-source model with Lightning AI?

    -The video demonstrates the use of an open-source model by showing how to power the CrewAI team with Mixtral or Mistral, exposing an API endpoint for the model and pointing the crew, as defined in the YAML files, at that endpoint.
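
A minimal sketch of that last step, assuming the Lightning-hosted model exposes an OpenAI-compatible endpoint; the URL, API key, and model name below are placeholders rather than values shown in the video.

```python
from langchain_openai import ChatOpenAI  # pip install langchain-openai

# Point a standard OpenAI-style client at the self-hosted endpoint.
local_llm = ChatOpenAI(
    base_url="http://localhost:8000/v1",  # assumed Lightning Studio endpoint
    api_key="not-needed-locally",         # many local servers ignore the key
    model="mixtral-8x7b-instruct",        # illustrative model name
    temperature=0,
)

print(local_llm.invoke("Summarize Tesla's business in one sentence.").content)
```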

  • What is the final step shown in the video for running the Crew AI team?

    -The final step shown in the video is using Poetry to put everything together and run the project. This includes installing Poetry, creating a 'pyproject.toml' file, locking dependencies, and running the 'financial analyst crew' using Poetry.
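
For reference, the pyproject.toml behind that Poetry workflow might look roughly like the configuration embedded below; the package names, versions, and script entry point are assumptions, shown as a string so Python's standard-library TOML parser can inspect them.

```python
import tomllib  # standard library in Python 3.11+

# Hypothetical pyproject.toml contents; names and versions are illustrative only.
pyproject = """
[tool.poetry]
name = "financial_analyst_crew"
version = "0.1.0"
description = "CrewAI financial analyst example"
authors = ["Your Name <you@example.com>"]

[tool.poetry.dependencies]
python = ">=3.10,<3.13"
crewai = "*"
langchain-groq = "*"

[tool.poetry.scripts]
financial_analyst_crew = "financial_analyst_crew.main:run"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
"""

config = tomllib.loads(pyproject)
print(config["tool"]["poetry"]["scripts"])  # the command that `poetry run` exposes
```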

Outlines

00:00

🚀 Introduction to Setting Up a Crew AI Team with Lightning AI

The video begins with the host introducing the topic of setting up a Crew AI team, emphasizing the optimal way to do so as demonstrated by the Crew AI founder. The host outlines the use of Lightning AI, a cloud-based code editor that facilitates collaboration and the powering of open-source models. The session is sponsored by Lightning AI, and the host expresses excitement about building the team on this platform. The process starts with creating a new studio on Lightning, which simplifies Python environment management. The structure of the Crew AI codebase is modular, utilizing YAML for defining agents and tasks, all feeding into a concise main.py file.

05:01

📚 Defining Tasks and Agents for the Financial Analyst Crew

The host proceeds to define two tasks for the Crew AI team: researching a specific company and analyzing the company's financial performance. Each task is detailed in a YAML file, specifying parameters like the description, expected output, and variables. The host then moves on to defining agents, creating two distinct agents tailored to each task: a 'company researcher' and a 'company analyst'. These agents are characterized by their roles, goals, and backstories. The host also highlights the benefits of explicit structuring, which may soon allow for API creation based on the defined structure.

10:02

💻 Main File Creation and Integration of Agents and Tasks

The host demonstrates the creation of the main crew file, 'crew.py', which integrates all the previously defined agents and tasks. The file begins by importing the necessary classes from the CrewAI framework and LangChain for model integration. A 'FinancialAnalystCrew' class is defined, which loads the agents and tasks and configures the LLM with temperature and model-name parameters, specifying Mixtral as the model. The agents and tasks are instantiated within this class. The host also notes the convenience of Lightning AI's automatic saving feature and the ability to resume work from where one left off.
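
A condensed sketch of what such a crew.py could look like, using CrewAI's YAML-driven project decorators and LangChain's Groq integration; the class, method, and config-key names are assumptions based on this summary, and the Groq model ID is illustrative.

```python
from crewai import Agent, Crew, Process, Task
from crewai.project import CrewBase, agent, crew, task
from langchain_groq import ChatGroq  # pip install langchain-groq


@CrewBase
class FinancialAnalystCrew:
    """Financial analyst crew assembled from the YAML config files."""

    agents_config = "config/agents.yml"
    tasks_config = "config/tasks.yml"

    def __init__(self) -> None:
        # Requires GROQ_API_KEY in the environment; the model ID is illustrative.
        self.llm = ChatGroq(temperature=0, model_name="mixtral-8x7b-32768")

    @agent
    def company_researcher(self) -> Agent:
        return Agent(config=self.agents_config["company_researcher"], llm=self.llm)

    @agent
    def company_analyst(self) -> Agent:
        return Agent(config=self.agents_config["company_analyst"], llm=self.llm)

    @task
    def research_company_task(self) -> Task:
        return Task(config=self.tasks_config["research_company_task"],
                    agent=self.company_researcher())

    @task
    def analyze_company_task(self) -> Task:
        return Task(config=self.tasks_config["analyze_company_task"],
                    agent=self.company_analyst())

    @crew
    def crew(self) -> Crew:
        # Agents and tasks registered by the decorators above run sequentially.
        return Crew(agents=self.agents, tasks=self.tasks,
                    process=Process.sequential, verbose=True)
```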

15:02

🔌 Running the Crew with Groq and Exploring Open-Source Model Integration

The video concludes with the host running the CrewAI team using Groq, showcasing the speed and efficiency of the process. The output includes a detailed financial analysis and metrics for Tesla as an example. The host then guides viewers on how to run the crew with an open-source model powered by Lightning AI's GPUs. This involves using the Lightning Studio templates to set up an open-source model, exposing an API endpoint, and pointing the crew at that endpoint. The host successfully demonstrates the integration and confirms the model's functionality, encouraging viewers to like and subscribe for more content.

Keywords

💡Crew AI

Crew AI refers to a framework for building AI agents that work collaboratively to perform tasks. In the context of the video, it is about setting up a team of AI agents using the Crew AI codebase, which is demonstrated by the founder. The video shows how to structure and build a 'financial analyst crew' using this framework, which is crucial for understanding the workflow and the collaborative nature of AI agents in the video.

💡Lightning AI

Lightning AI is a cloud-based code editor that enables collaboration on code in the cloud. It is highlighted in the video for its ability to streamline Python environment management and for powering open-source models. The video demonstrates using Lightning AI to build the Crew AI team, emphasizing its convenience and efficiency in creating and managing AI projects.

💡YAML

YAML, which stands for 'YAML Ain't Markup Language,' is a human-readable data serialization standard used for configuration files and in applications where data is being stored or transmitted. In the video, YAML is used to define agents and tasks for the Crew AI team, which helps in structuring the workflow and the interactions between different AI components.

💡Main.py

Main.py is a Python file that typically serves as the entry point for a Python application. In the video, it is mentioned as a short file that aggregates all the tasks and agents into a cohesive workflow. It is a critical component as it ties together the various elements of the Crew AI project.
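
A plausible shape for that entry point, assuming the crew class sketched earlier and a company_name input variable; both names are assumptions drawn from this summary rather than the video's exact code.

```python
# main.py - assumed entry point; names mirror the hypothetical crew.py sketch above.
from financial_analyst_crew.crew import FinancialAnalystCrew


def run() -> None:
    # The inputs dict fills the {company_name} placeholders in the YAML config.
    inputs = {"company_name": "Tesla"}
    FinancialAnalystCrew().crew().kickoff(inputs=inputs)


if __name__ == "__main__":
    run()
```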

💡Financial Analyst Crew

The term 'Financial Analyst Crew' refers to a specific type of AI team created in the video, designed to perform financial analysis tasks. It is an example of how to apply the Crew AI framework to a practical use case, focusing on researching and analyzing company financials, which is central to the video's demonstration.

💡Groq

Groq is a platform that serves open models like Mixtral through very fast inference hardware. In the video it is used, via LangChain, to power the AI agents, and it is mentioned as preferred over GPT-4 for its speed and effectiveness. The video shows how to plug Groq into the CrewAI framework, demonstrating the flexibility to use different models and providers for different tasks or agents.

💡LangChain

LangChain is a library used in the video to connect language models, such as those served by Groq, to the CrewAI framework. It acts as a bridge between a model's capabilities and the tasks the agents need to perform, which makes it easy to swap models without restructuring the project.

💡Mixtral

Mixtral is an open-source model mentioned in the video as an option for powering the AI agents. The script discusses using Mixtral through an API exposed by an open-source model server running on Lightning AI's GPUs, showing the versatility in choosing the right model for the task.

💡API

API stands for Application Programming Interface, a set of rules and protocols for building and interacting with software applications. In the video, the presenter shows how to expose an API endpoint for an open-source model like Mixtral, which can then be used to power the CrewAI agents, demonstrating the integration of external services into the AI workflow.
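
One way to sanity-check such an endpoint before wiring it into the crew is to call it with the standard OpenAI Python client; the URL, key, and model name below are placeholders.

```python
from openai import OpenAI  # pip install openai

# Placeholder endpoint; a local OpenAI-compatible server usually ignores the key.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="mixtral-8x7b-instruct",  # illustrative model name
    messages=[{"role": "user", "content": "Reply with 'ok' if you can read this."}],
)
print(response.choices[0].message.content)
```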

💡Poetry

Poetry is a dependency manager for Python packages. In the video, it is used to manage the project's dependencies, ensuring that all necessary libraries are installed and configured correctly. This is an important step in setting up and running the Crew AI project, as it handles the complexities of package management.

Highlights

Optimal setup for a crew AI team demonstrated using Lightning AI.

Lightning AI is a cloud-based code editor that facilitates collaboration and open-source model powering.

Introduction to building a CrewAI team with GPT-4 and Mixtral or Mistral.

Lightning AI sponsorship and excitement for building a team on their platform.

Creating a new studio in Lightning to start building the crew AI code framework.

Advantages of using Lightning for environment management and avoiding Python environment setup.

Structuring the crew with a modular approach using separate areas for tools and YAML for defining agents and tasks.

Explanation of creating a source folder and naming a new crew within it.

Creating a config folder for task and agent definitions in YAML files.

Writing tasks with descriptions and expected outputs for research and analysis of a company.

Using Claude 3 to generate a list of financial analysis metrics and incorporating them into the task description.

Defining agents with roles, goals, and settings like allow delegation and verbose.

Automatic saving feature of Lightning AI and its convenience for developers.

Creating the main file that integrates all agents and tasks for the crew.

Importing libraries and setting up the crew base with agents and tasks in the main file.

Using Gradio to power the crew with an open-source model and exposing an API endpoint.

Demonstration of running the crew with Groq and then with an open-source model powered by Lightning AI GPUs.

Using Studio Templates in Lightning to set up a preconfigured environment for running models like Mixtral.

Installing and using API Builder to expose an OpenAI-compatible API endpoint.

Integrating the exposed API endpoint with the crew model to power it with an open-source model.

Final demonstration of the crew working with the open-source model and the speed of execution.