Unleash the Power of DeerFlow: A Free, Local Deep Research Agent
Discover DeerFlow: a free, open-source deep research agent that rivals Manus AI and GenSpark. Learn how to install and use this powerful local AI tool for tasks like web search, data analysis, and content creation.
June 3, 2025

Unlock the power of AI-driven research with DeerFlow, a fully open-source, locally hosted deep research agent that can tackle a wide range of tasks. It combines large language models, specialized tools, and human-in-the-loop collaboration to streamline your research workflow and surface new insights.
Powerful AI Agent with Endless Capabilities
Multi-Agent Architecture for Seamless Research
Open-Source Models and API Integrations for Flexibility
Two Installation Options: Source Code and Docker
Configuring the Environment and Language Model
Running the DeerFlow Agent: Console and Web UI
Conclusion
Powerful AI Agent with Endless Capabilities
DeerFlow is a community-driven deep research framework that combines large language models with specialized tools for tasks like web search, crawling, and Python execution. It aims to match the capabilities of advanced deep research agents like Manus AI and GenSpark, with the added benefits of being fully open-source and locally hosted.
Key features of DeerFlow include:
- Multi-Agent Architecture: DeerFlow uses a coordinator, planner, research team, and reporter to manage the research process efficiently.
- Open-Source Models: DeerFlow works with locally hosted open-source models served through Ollama, as well as any OpenAI-compatible API.
- Integrated Tools: DeerFlow integrates tools like search, retrieval, web scraping, and MCP (Model Context Protocol) servers to extend its research capabilities.
- Human Collaboration: DeerFlow allows for human-in-the-loop interactions, including report post-editing and content creation.
- Easy Installation: DeerFlow can be installed either from source or using Docker, making it accessible to a wide range of users.
With its powerful features and open-source nature, DeerFlow offers a compelling alternative to proprietary deep research agents, empowering users to harness advanced AI for their research and exploration needs.
Multi-Agent Architecture for Seamless Research
DeerFlow's multi-agent architecture is designed to provide a seamless and efficient research experience. The key components of this architecture are:
- Coordinator: Engages with the user prompt and hands it off to the Planner.
- Planner: Drafts the research plan and refines it based on human feedback in an iterative loop.
- Research Team: Develops and executes the plan, using the Research Agent and the Coder (when coding is required).
- Reporter: Compiles the research results and sends the final output back to the user.
This multi-agent approach lets DeerFlow leverage the strengths of different AI models and tools, such as large language models, web search, web crawling, and Python execution. The system can be powered by locally hosted models (via Ollama) or by hosted APIs (such as OpenRouter), providing flexibility and accessibility.
Key features of the multi-agent architecture include:
- Open-Source Models: Ability to use open-source models, as well as any OpenAI-compatible API.
- Integrated Tools: Integration of tools like search, retrieval, web scraping, and MCP servers to enhance the agent's capabilities.
- Human Collaboration: Support for human-in-the-loop feedback and report post-editing.
- Content Creation: Capability to generate outputs like podcasts and presentations, similar to Manus AI.
The multi-agent design ensures a seamless and efficient research workflow, leveraging the strengths of various AI components to deliver comprehensive and accurate results.
Open-Source Models and API Integrations for Flexibility
DeerFlow offers flexibility through its support for open-source models and API integrations. This allows users to leverage a wide range of language models and specialized tools to enhance the agent's capabilities.
The agent supports OpenAI-compatible models, which are configured in the `conf.yaml` file. Users can choose models like DeepSeek, or Ollama models that run locally. Alternatively, DeerFlow also supports free models from OpenRouter, which can be set up by providing the API key.
In addition to language models, DeerFlow integrates with various APIs and tools to expand its research and task-completion abilities. This includes integrations with search engines like Brave Search and DuckDuckGo, as well as web scraping and Python execution capabilities. The agent can also leverage Retrieval-Augmented Generation (RAG) to enhance its research and content generation.
The flexibility provided by DeerFlow's open-source models and API integrations allows users to tailor the agent's capabilities to their specific needs, making it a powerful and versatile deep research tool.
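As a rough illustration, a minimal `conf.yaml` entry for an OpenAI-compatible provider such as OpenRouter might look like the sketch below. The `BASIC_MODEL` block name follows DeerFlow's example configuration, and the model ID is only a placeholder, so check the project's configuration guide for the exact keys your version expects.

```yaml
# conf.yaml — minimal sketch for an OpenAI-compatible provider (e.g. OpenRouter).
# Block and field names follow DeerFlow's example config; verify against your version.
BASIC_MODEL:
  base_url: "https://openrouter.ai/api/v1"      # any OpenAI-compatible endpoint
  model: "deepseek/deepseek-chat-v3-0324:free"  # placeholder OpenRouter model ID
  api_key: "sk-or-..."                          # your OpenRouter API key
```

For a locally hosted model, the same block would instead point `base_url` at your Ollama server (typically `http://localhost:11434`) and name the local model you have pulled.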
Two Installation Options: Source Code and Docker
There are two ways to install DeerFlow, the open-source deep research agent:
- Using the Source Code:
  - Prerequisites: Python (latest version), Node.js, npm, nvm, git, and VS Code.
  - Steps:
    - Clone the repository using the provided command.
    - Navigate to the `deer-flow` directory.
    - Rename the `.env.example` file and configure the necessary API keys (e.g., Tavily, Brave Search, Volcengine).
    - Modify the `conf.yaml` file to set up the large language model (e.g., DeepSeek, OpenAI-compatible models, OpenRouter models).
    - Install the required dependencies using the provided commands.
    - Start the backend and frontend servers using the specified commands.
- Using Docker:
  - Prerequisites: Docker Desktop.
  - Steps:
    - Modify the configurations in the Docker setup.
    - Use the provided Docker commands to set up and run DeerFlow (see the sketch at the end of this section).
The Docker installation is generally simpler, as it eliminates the need to manage dependencies and configurations manually. However, if you prefer to work with the source code, the step-by-step instructions below walk you through the installation process.
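For reference, a typical Docker-based setup looks roughly like the sketch below. It assumes the upstream bytedance/deer-flow repository and its bundled Docker Compose setup; file, image, and service names may differ in your checkout, so treat this as illustrative rather than exact.

```bash
# Sketch of the Docker route — assumes the repo's Docker Compose setup; adjust to your version.
git clone https://github.com/bytedance/deer-flow.git
cd deer-flow
cp .env.example .env              # add your Tavily / Brave Search / Volcengine keys
cp conf.yaml.example conf.yaml    # point the agent at your LLM of choice
docker compose build              # build the backend and web images
docker compose up -d              # start DeerFlow; the web UI is served at http://localhost:3000
```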
Configuring the Environment and Language Model
To set up DeerFlow, the local deep research agent, follow these steps:
- Prerequisites:
  - Ensure you have Python, Node.js, npm, and git installed on your system.
  - Install Visual Studio Code (VS Code) for easy configuration.
- Clone the Repository:
  - Open your command prompt/terminal and run the following command to clone the DeerFlow repository: `git clone https://github.com/bytedance/deer-flow.git`
  - Navigate to the cloned `deer-flow` directory using the command `cd deer-flow`.
- Configure API Keys:
  - In VS Code, open the cloned `deer-flow` folder.
  - Rename the `.env.example` file to `.env`.
  - In the `.env` file, provide the necessary API keys for services like Tavily, Brave Search, and Volcengine (if you plan to use them); a sketch of what this file can look like follows below.
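A minimal `.env` might look like the sketch below. The variable names are assumptions based on the services named above and on DeerFlow's example environment file, so copy the exact keys from `.env.example` in your checkout.

```
# .env — illustrative sketch; take the exact variable names from .env.example.
SEARCH_API=tavily                      # which search backend to use
TAVILY_API_KEY=tvly-xxxxxxxxxxxxxxxx   # Tavily search
BRAVE_SEARCH_API_KEY=BSAxxxxxxxxxxxx   # optional: Brave Search
VOLCENGINE_TTS_APPID=xxxxxxxx          # optional: Volcengine text-to-speech (for podcast output)
VOLCENGINE_TTS_ACCESS_TOKEN=xxxxxxxx
```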
- Set up the Language Model:
  - Open the `conf.yaml` file in VS Code.
  - Read the configuration guide to understand how to switch between language models and set the appropriate parameters.
  - For example, to use the DeepSeek model, update the `base_url`, `model`, and `api_key` fields accordingly (see the sketch below).
  - Save the changes to the `conf.yaml` file.
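Continuing the DeepSeek example, the relevant block might look like the following sketch. The `BASIC_MODEL` block name follows DeerFlow's example configuration, and the endpoint and model name are DeepSeek's public API defaults, so verify both against the configuration guide.

```yaml
# conf.yaml — illustrative DeepSeek setup; confirm key names against the configuration guide.
BASIC_MODEL:
  base_url: "https://api.deepseek.com"   # DeepSeek's OpenAI-compatible endpoint
  model: "deepseek-chat"                 # DeepSeek's general-purpose chat model
  api_key: "sk-xxxxxxxxxxxxxxxx"         # your DeepSeek API key
```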
- Install Dependencies:
  - From the project root, install the Python dependencies; the project uses `uv`, so `uv sync` is the typical command.
  - Optionally, install `marp-cli` for presentation generation. On macOS you can run `brew install marp-cli`; on Windows or Linux, use your platform's package manager.
  - The web UI's dependencies are installed separately with `npm install` inside the `web` directory (see the sketch just below and the next step).
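Put together, the dependency setup on macOS might look like this sketch (`uv sync` and `marp-cli` follow the project README; swap in your own package manager on other platforms):

```bash
# Run from the deer-flow project root — macOS example.
uv sync                  # install the Python/backend dependencies
brew install marp-cli    # optional: only needed for presentation (PPT) generation
cd web && npm install    # install the web UI dependencies
```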
- Run the Application:
  - To start the web-based user interface, navigate to the `web` directory and run the following command: `npm run start`
  - Alternatively, you can run the console-based interface directly from the root directory using the command: `uv run main.py`
Now you have DeerFlow, your local deep research agent, set up and ready to use. You can start exploring its capabilities by providing prompts and watching it perform research, analysis, and report generation.
Running the DeerFlow Agent: Console and Web UI
To run the DeerFlow agent, you have two options: the console UI and the web UI.
Console UI
- After cloning the repository and configuring the necessary API keys, you can run the agent using the following command: `uv run main.py`
  This starts the agent in the console UI, allowing you to interact with it directly through the command prompt.
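You can also pass a research question directly on the command line; the query argument follows the project README, and the question below is only an illustration:

```bash
# Console UI with an inline query — the prompt text is purely illustrative.
uv run main.py "What factors are driving the adoption of open-source deep research agents?"
```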
Web UI
- To use the web UI, navigate to the `web` directory within the cloned repository: `cd web`
- Install the dependencies for the web UI: `npm install`
- Start the backend and frontend servers: `npm run start`
  This starts the DeerFlow agent's backend and frontend servers, and you can access the web UI by opening your web browser and navigating to `http://localhost:3000`.
- From the web UI, you can send prompts to the DeerFlow agent and view the generated responses.
Both the console UI and the web UI provide access to the DeerFlow agent's deep research capabilities, allowing you to leverage its powerful features for your research and analysis tasks.
Conclusion
DeerFlow is a powerful, open-source deep research agent that provides a comprehensive solution for conducting in-depth research and analysis. It combines large language models with specialized tools for tasks like web search, crawling, and Python execution, allowing users to replicate the capabilities of advanced AI agents like Manus AI or GenSpark.
The key highlights of DeerFlow include:
- Local and Open-Source: DeerFlow is a fully local and open-source solution, making it accessible and customizable for users.
- Flexible Model Integration: DeerFlow supports OpenAI-compatible models, as well as free models from OpenRouter, giving users a range of options to suit their needs.
- Comprehensive Capabilities: DeerFlow can perform tasks like web search, content summarization, and even code generation, making it a versatile tool for research and analysis.
- Easy Installation: DeerFlow can be installed either from source or using Docker, providing a straightforward setup process.
Overall, DeerFlow is a powerful and user-friendly deep research agent that offers a compelling alternative to proprietary solutions. Its open-source nature and flexible model integration make it a valuable tool for researchers, developers, and anyone looking to conduct in-depth analysis and exploration.