This project demonstrates the use of the CrewAI framework to automate the creation of Instagram content. CrewAI coordinates autonomous AI agents, allowing them to work together and efficiently handle complex tasks.
- CrewAI Framework
- Executing the Script
- In-Depth Explanation
- Utilizing Local Models with Ollama
- License
CrewAI is designed to facilitate the collaboration of role-playing AI agents. In this instance, these agents collaborate to turn your idea into Instagram content.
This example defaults to Llama 3.1 via Ollama, so you should install Ollama and pull the Llama 3.1 model. You can switch the model by modifying the `MODEL` environment variable in the `.env` file.
- Set Up Environment: Duplicate `.env.example` as `.env` and configure the environment variables for Browserless and Serper (an illustrative `.env` follows these steps).
- Install Required Packages: Execute `poetry install --no-root`.
- Run the Script: Execute `python main.py` and provide your idea when prompted.
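A filled-in `.env` might look like the sketch below. Only the `MODEL` variable is named by this README; the exact key names for Serper and Browserless (and the expected format of the `MODEL` value) come from `.env.example`, so the names and values shown here are placeholders:

```
MODEL=llama3.1
SERPER_API_KEY=your-serper-api-key
BROWSERLESS_API_KEY=your-browserless-api-key
```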
- Script Execution: Run `python main.py` and provide your idea when prompted. The script will use the CrewAI framework to process the idea and generate the Instagram content.
- Core Components (a minimal sketch of how they fit together follows this list):
  - `./main.py`: Primary script file.
  - `./tasks.py`: File containing task prompts.
  - `./agents.py`: File for agent creation.
  - `./tools/`: Directory with tool classes used by the agents.
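The sketch below illustrates, with assumed names, how these pieces typically fit together in a CrewAI script: agents are declared with roles and goals, tasks are bound to agents, and `main.py` assembles them into a crew and kicks it off with your idea. The agent, task, and prompt details here are placeholders, not the ones actually defined in `agents.py` and `tasks.py`.

```python
from crewai import Agent, Task, Crew

# Illustrative agent; the real roles, goals, and backstories live in agents.py.
copywriter = Agent(
    role="Instagram Copywriter",
    goal="Write an engaging caption for the given product idea",
    backstory="You specialize in short, punchy social media copy.",
    verbose=True,
)

# Illustrative task; the real task prompts live in tasks.py.
caption_task = Task(
    description="Write an Instagram caption for this idea: {idea}",
    expected_output="A short caption plus a handful of relevant hashtags",
    agent=copywriter,
)

# main.py wires agents and tasks into a crew and runs it with the user's input.
crew = Crew(agents=[copywriter], tasks=[caption_task])
result = crew.kickoff(inputs={"idea": "a reusable water bottle"})
print(result)
```

The actual script defines its own agents and tasks and equips them with the tool classes in `./tools/`, but the assembly pattern is the same.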
This example operates entirely with local models. The CrewAI framework supports integration with both proprietary and local models, using tools like Ollama for enhanced flexibility and customization. This allows you to use your own models, which can be particularly beneficial for specialized tasks or data privacy concerns.
- Install Ollama: Ensure Ollama is installed in your environment. Follow Ollama's installation guide for detailed instructions.
- Set Up Ollama: Configure Ollama to work with your local model. You may need to adjust the model using a Modelfile; experimenting with `top_p` and `temperature` is recommended (see the example after this list).
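As an illustration only, a Modelfile that tunes a Llama 3.1 base model might look like the following; the parameter values are arbitrary starting points rather than recommendations from this project:

```
FROM llama3.1
PARAMETER temperature 0.6
PARAMETER top_p 0.9
```

You can register it with `ollama create <your-model-name> -f Modelfile` and then point the `MODEL` variable in `.env` at that name.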
This project is distributed under the MIT License.