An open-source clone of OpenAI's Deep Research experiment. Instead of using a fine-tuned version of o3, this method uses Firecrawl's extract + search with a reasoning model to perform deep research on the web.
Check out the demo here
- Firecrawl Search + Extract
- Feed real-time data to the AI via search
- Extract structured data from multiple websites via extract
- Next.js App Router
- Advanced routing for seamless navigation and performance
- React Server Components (RSCs) and Server Actions for server-side rendering and increased performance
- AI SDK
- Unified API for generating text, structured objects, and tool calls with LLMs
- Hooks for building dynamic chat and generative user interfaces
- Supports OpenAI (default), Anthropic, Cohere, and other model providers
- shadcn/ui
- Styling with Tailwind CSS
- Component primitives from Radix UI for accessibility and flexibility
- Data Persistence
- PostgreSQL database (via Docker) for saving chat history and user data
- In-memory rate limiting for API protection
Both services (PostgreSQL and Redis) are automatically configured when you run the application using Docker Compose:
docker-compose up -d
Environment variables for database connections:
# PostgreSQL connection
DATABASE_URL="postgresql://postgres:postgres@localhost:5432/chatdb"
# Redis connection
REDIS_URL="redis://localhost:6379"
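For reference, a docker-compose.yml matching these connection strings could look roughly like the sketch below. The image tags, service names, and volume name here are assumptions for illustration, not necessarily what this repo ships:

```yaml
services:
  postgres:
    image: postgres:16
    environment:
      # Matches DATABASE_URL: postgresql://postgres:postgres@localhost:5432/chatdb
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: chatdb
    ports:
      - "5432:5432"
    volumes:
      - pgdata:/var/lib/postgresql/data
  redis:
    image: redis:7
    ports:
      # Matches REDIS_URL: redis://localhost:6379
      - "6379:6379"
volumes:
  pgdata:
```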
# Start all services (PostgreSQL and Redis)
docker-compose up -d
# Stop all services
docker-compose down
# View logs for all services
docker-compose logs -f
# View logs for specific service
docker-compose logs -f postgres
docker-compose logs -f redis
# Stop services but keep volumes
docker-compose down
# Stop services and remove volumes
docker-compose down -v
# Access database CLIs
docker-compose exec postgres psql -U postgres -d chatdb
docker-compose exec redis redis-cli
- Local file storage for uploads
- NextAuth.js
- Simple and secure authentication
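The in-memory rate limiting mentioned under Data Persistence can be sketched as a simple fixed-window counter. This is a hypothetical illustration, not the repo's actual implementation:

```typescript
// Sketch of a fixed-window in-memory rate limiter keyed by, e.g., client IP.
// State lives in process memory, so it resets on restart and is per-instance.
type WindowState = { count: number; windowStart: number };

class InMemoryRateLimiter {
  private states = new Map<string, WindowState>();

  constructor(private limit: number, private windowMs: number) {}

  // Returns true if the request is allowed for this key in the current window.
  allow(key: string, now: number = Date.now()): boolean {
    const state = this.states.get(key);
    // No state yet, or the previous window expired: start a fresh window.
    if (!state || now - state.windowStart >= this.windowMs) {
      this.states.set(key, { count: 1, windowStart: now });
      return true;
    }
    // Still inside the window: allow until the limit is reached.
    if (state.count < this.limit) {
      state.count += 1;
      return true;
    }
    return false;
  }
}
```

A per-route instance (e.g., 10 requests per minute per IP) would be consulted at the top of an API handler before doing any work.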
This template ships with OpenAI gpt-4o as the default. However, with the AI SDK, you can switch LLM providers to OpenAI, Anthropic, Cohere, and many more with just a few lines of code.
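For illustration, a provider swap with the AI SDK might look like the sketch below. The provider packages are the AI SDK's own (`@ai-sdk/openai`, `@ai-sdk/anthropic`); the model IDs and prompt are examples, and this is not the repo's actual code:

```typescript
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';
// import { anthropic } from '@ai-sdk/anthropic';

// Switching providers is a one-line change to the `model` field.
const { text } = await generateText({
  model: openai('gpt-4o'),
  // model: anthropic('claude-3-5-sonnet-latest'),
  prompt: 'Summarize the latest findings on solid-state batteries.',
});
```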
This repo is compatible with OpenRouter and OpenAI. To use OpenRouter, set the OPENROUTER_API_KEY environment variable.
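For example, in your environment file (the .env.local file name follows the Next.js convention; the key name comes from this README, and the value shown is a placeholder):

```bash
OPENROUTER_API_KEY="your-openrouter-api-key"
```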
Before starting, make sure you have:
- Conda installed
- Docker installed and running
The initialization script will automatically:
- Create a conda environment with Python and Node.js
- Install pnpm
- Install all required dependencies
- Set up database containers (PostgreSQL and Redis)
The application uses two configuration files:
env-config.xml for the environment name:
<?xml version="1.0" encoding="UTF-8"?>
<config>
<environment>
<name>open-deep-research</name>
</environment>
</config>
environment.yml for the conda environment setup:
name: open-deep-research
channels:
- conda-forge
- defaults
dependencies:
- python=3.11
- nodejs=20
- npm=10
- pip
The application comes with two utility scripts:
The initialization script (scripts/init.sh) performs first-time setup and should be run only once. It will:
- Create a new conda environment
- Install all dependencies
- Set up the database
- Create initial configurations
chmod +x scripts/init.sh
./scripts/init.sh
If you need to start fresh, run the initialization script again. It will:
- Reset your conda environment
- Reset your database
- Reset your configurations
- Create new environment files
The script will ask for confirmation before proceeding.
The start script (scripts/start.sh) starts the application for regular use. It will:
- Activate the conda environment
- Check if services are running
- Start the web server
chmod +x scripts/start.sh
./scripts/start.sh
Use this script for your daily development work. It's safe to run multiple times and won't reset your environment.
- Make the scripts executable:
chmod +x scripts/init.sh scripts/start.sh
- Run the initialization script (only once):
./scripts/init.sh
- For subsequent runs, use the start script:
./scripts/start.sh