
Commit 0dec05a: ReadMe update
1 parent eefca9e

File tree

10 files changed: +117, -720 lines changed


.github/workflows/build-release.yml

Whitespace-only changes.

README.md

Lines changed: 117 additions & 38 deletions
@@ -51,6 +51,14 @@
 
 BeautifyOllama is an open-source web interface that enhances your local Ollama AI model interactions with a beautiful, modern design. Built with cutting-edge web technologies, it provides a seamless chat experience with stunning visual effects and enterprise-grade functionality.
 
+**🎉 Latest Updates (v1.6.9)**
+- **🔧 Custom Port Support** - Configure Ollama to run on any port
+- **⚙️ Comprehensive Settings Panel** - Complete Ollama service management
+- **🔍 Enhanced Web Search** - Improved search with source tracking and reliability
+- **📱 Cross-Platform Desktop App** - Native desktop application with Tauri
+- **🧠 Advanced AI Features** - Thinking mode, verbose stats, and conversation management
+- **🛠️ Developer Experience** - Improved error handling and Windows compatibility
+
 > **⚠️ Early Development Notice**
 > This project is in active development. Features and APIs may change. We welcome contributions and feedback from the community.
 
@@ -66,27 +74,32 @@ https://github.com/user-attachments/assets/8ed11232-de9c-469b-b332-143ca41daf15
 ## ✨ Features
 
 ### Current Features
-- **🔍 Intelligent Web Search** - Real-time internet search with SearxNG integration
-- **🧠 Thinking Mode Control** - Toggle AI reasoning traces on/off
-- **🌐 Multi-Engine Fallback** - Multiple SearxNG instances for reliability
+- **🔍 Intelligent Web Search** - Real-time internet search with SearxNG integration and source tracking
+- **🧠 Thinking Mode Control** - Toggle AI reasoning traces on/off with clean rendering
+- **🌐 Multi-Engine Fallback** - Multiple SearxNG instances for reliability and uptime
+- **🔧 Custom Port Support** - Configure Ollama to run on any port (not just 11434)
+- **⚙️ Comprehensive Settings** - Complete Ollama management, model downloads, and configuration
 - **🎬 Animated Shine Borders** - Eye-catching animated message borders with color cycling
 - **📱 Responsive Design** - Mobile-first approach with seamless cross-device compatibility
 - **🌙 Theme System** - Dark/light mode with system preference detection
-- **⚡ Real-time Streaming** - Live response streaming from Ollama models
+- **⚡ Real-time Streaming** - Live response streaming from Ollama models with typing effects
 - **🎯 Clean Interface** - Simplified message rendering focused on readability
-- **🔄 Model Management** - Easy switching between available Ollama models
+- **🔄 Advanced Model Management** - Download, delete, and switch between Ollama models
+- **📊 Verbose Statistics** - Toggle detailed timing and performance stats for responses
+- **💬 Conversation Management** - Persistent chat history with sidebar navigation
+- **🖥️ Cross-Platform Support** - Windows, macOS, and Linux compatibility with platform-specific optimizations
 - **⌨️ Smart Input** - Keyboard shortcuts (Enter to send, Shift+Enter for newlines)
-- **🎨 Modern UI/UX** - Glassmorphism effects and smooth micro-animations
+- **🎨 Modern UI/UX** - Glassmorphism effects, smooth micro-animations, and polished design
 
 ### 🚧 Upcoming Features
-- **🔐 API Key Management** - Secure storage and management of API credentials
-- **💾 Conversation History** - Persistent chat history with search functionality
-- **🔧 Advanced Settings** - Customizable model parameters and system prompts
-- **📁 File Upload Support** - Document and image processing capabilities
+- **📁 File Upload Support** - Document and image processing capabilities
 - **🌐 Multi-language Support** - Internationalization for global users
-- **📊 Usage Analytics** - Token usage tracking and conversation insights
+- **📊 Advanced Usage Analytics** - Enhanced token usage tracking and conversation insights
 - **🔌 Plugin System** - Extensible architecture for third-party integrations
 - **☁️ Cloud Sync** - Optional cloud backup for conversations and settings
+- **🔐 Multi-API Support** - Integration with OpenAI, Anthropic, and other AI providers
+- **🎯 Advanced Prompt Templates** - Pre-built and custom prompt management
+- **🔒 Enhanced Security** - API key encryption and secure credential storage
 
 ## 🚀 Installation
 
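The "Real-time Streaming" feature above consumes Ollama's streaming responses, which arrive as newline-delimited JSON chunks. A minimal sketch of accumulating such a stream; the `ChatChunk` shape follows Ollama's documented `/api/chat` streaming format, but verify it against your Ollama version, and `accumulateStream` is a hypothetical helper, not code from this repository:

```typescript
// Each streamed line is one JSON object; chunks with a message carry a
// fragment of the reply, and the final chunk has done: true.
interface ChatChunk {
  message?: { content: string };
  done: boolean;
}

function accumulateStream(ndjson: string): string {
  return ndjson
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as ChatChunk)
    .map((chunk) => chunk.message?.content ?? "")
    .join("");
}
```

In a real client this would run incrementally over a `fetch` body reader rather than on a complete string, updating the UI as each chunk lands (the "typing effect" mentioned in the feature list).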
@@ -119,26 +132,26 @@ ollama pull llama2
 ollama pull codellama
 ollama pull mistral
 
-# For web search feature, also pull:
-ollama pull qwen3:0.6b
+# For web search feature, also pull a small model:
+ollama pull qwen2:0.5b
 
 # Verify installation
 ollama list
 ```
 
 ### Step 3: Setup Web Search (Optional)
 
-For enhanced web search capabilities, set up a local SearxNG instance:
+For enhanced web search capabilities, BeautifyOllama includes integrated Python-based web search:
 
 ```bash
-# Quick setup with provided script
-./setup-searxng.sh
-
-# Or manually install Python dependencies
+# Install Python dependencies for web search
 pip install ollama requests
+
+# The web search feature uses multiple SearxNG instances
+# No additional setup required - it's built-in!
 ```
 
-For detailed web search setup, see [Web Search Integration Guide](WEB_SEARCH_INTEGRATION.md).
+For detailed web search setup and configuration, see [Web Search Integration Guide](WEB_SEARCH_INTEGRATION.md).
 
 ### Step 4: Install BeautifyOllama
 
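The built-in web search described in Step 3 queries SearxNG instances and falls back across several of them. A rough sketch of building such a query URL, assuming the common SearxNG JSON endpoint `GET /search?q=<query>&format=json` (the instance URL in the usage note is a placeholder, not an endorsed server):

```typescript
// Build a SearxNG JSON-API query URL for one instance.
// A caller would try each configured instance in order until one responds.
function buildSearchUrl(instance: string, query: string): string {
  const url = new URL("/search", instance);
  url.searchParams.set("q", query);       // search terms, URL-encoded
  url.searchParams.set("format", "json"); // ask for machine-readable results
  return url.toString();
}
```

For example, `buildSearchUrl("https://searx.example.org", "ollama gpu")` yields `https://searx.example.org/search?q=ollama+gpu&format=json`.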
@@ -176,12 +189,16 @@ Create a `.env.local` file in the project root:
 # Ollama Configuration
 NEXT_PUBLIC_OLLAMA_API_URL=http://localhost:11434
 NEXT_PUBLIC_DEFAULT_MODEL=llama2
+OLLAMA_PORT=11434
 
-# Feature Flags (Coming Soon)
+# Web Search Configuration
+SEARXNG_INSTANCES=https://search.example.com,https://searx.example.org
+
+# Feature Flags
 NEXT_PUBLIC_ENABLE_ANALYTICS=false
 NEXT_PUBLIC_ENABLE_CLOUD_SYNC=false
 
-# API Keys (Future Feature)
+# Future API Keys
 # OPENAI_API_KEY=your_openai_key_here
 # ANTHROPIC_API_KEY=your_anthropic_key_here
 ```
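The `.env.local` variables in this hunk can be folded into one runtime config object at startup. A hypothetical sketch, where the variable names come from the diff but the defaulting and comma-splitting logic is an assumption, not the app's actual code:

```typescript
// Derive a config object from environment variables shown in .env.local.
// Defaults mirror the documented values (port 11434, model llama2).
interface OllamaConfig {
  apiUrl: string;
  defaultModel: string;
  searxngInstances: string[];
}

function loadConfig(env: Record<string, string | undefined>): OllamaConfig {
  return {
    apiUrl: env.NEXT_PUBLIC_OLLAMA_API_URL ?? "http://localhost:11434",
    defaultModel: env.NEXT_PUBLIC_DEFAULT_MODEL ?? "llama2",
    // SEARXNG_INSTANCES is a comma-separated list of instance URLs.
    searxngInstances: (env.SEARXNG_INSTANCES ?? "")
      .split(",")
      .map((s) => s.trim())
      .filter((s) => s.length > 0),
  };
}
```

In a Next.js app this would be called with `process.env`; keeping the parsing in one place makes the custom-port and multi-instance settings easy to override.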
@@ -206,8 +223,22 @@ export const ollamaConfig = {
 1. **Start a Conversation**: Type your message in the input field
 2. **Send Messages**: Press `Enter` or click the send button
 3. **New Lines**: Use `Shift + Enter` for multi-line messages
-4. **Switch Models**: Use the model selector in the sidebar
+4. **Switch Models**: Use the model selector in the header
 5. **Theme Toggle**: Click the theme button to switch between light/dark modes
+6. **Enable Features**: Use the toggle buttons below the input for:
+   - **Stats Mode**: View detailed response timing and performance
+   - **Thinking Mode**: See AI reasoning process (when supported)
+   - **Web Search**: Include real-time internet search in responses
+
+### Advanced Features
+
+- **Settings Panel**: Click the gear icon to access:
+  - **Connection Settings**: Configure custom Ollama ports
+  - **Model Management**: Download new models or delete existing ones
+  - **Service Control**: Start/stop Ollama service
+  - **Command Logs**: View detailed operation logs
+- **Conversation Management**: Navigate between chats using the sidebar
+- **Response Features**: View sources for web search results and detailed statistics
 
 ### Mobile Usage
 
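The Enter-to-send versus Shift+Enter-for-newline behavior in the usage steps reduces to a small keyboard predicate. A minimal sketch; `shouldSend` is a hypothetical helper name, not taken from the codebase:

```typescript
// Decide whether a keydown in the chat input should submit the message.
// Enter alone sends; Shift+Enter falls through and inserts a newline.
function shouldSend(key: string, shiftKey: boolean): boolean {
  return key === "Enter" && !shiftKey;
}
```

Wired into an `onKeyDown` handler, a `true` result would trigger `event.preventDefault()` and submit; any other key combination lets the textarea behave normally.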
@@ -222,11 +253,13 @@ export const ollamaConfig = {
 | Layer | Technology | Purpose |
 |-------|------------|---------|
 | **Frontend** | Next.js 15 + React 19 | Modern React framework with App Router |
+| **Backend** | Tauri + Rust | Native desktop integration and system calls |
 | **Styling** | TailwindCSS 4 | Utility-first CSS framework |
 | **Animation** | Framer Motion | Smooth animations and transitions |
-| **Language** | TypeScript | Type safety and developer experience |
+| **Language** | TypeScript + Rust | Type safety and high-performance backend |
 | **State Management** | React Hooks | Local state management |
 | **Theme** | next-themes | Dark/light mode functionality |
+| **Search** | Python + SearxNG | Integrated web search capabilities |
 
 ### Project Structure
 
@@ -236,16 +269,27 @@ beautifyollama/
 │   ├── app/                    # Next.js App Router
 │   │   ├── globals.css         # Global styles
 │   │   ├── layout.tsx          # Root layout
-│   │   └── page.tsx            # Home page
+│   │   ├── page.tsx            # Home page
+│   │   └── services/           # Service layer
+│   │       └── ollamaService.ts
 │   ├── components/             # React components
 │   │   ├── Chat.tsx            # Main chat interface
-│   │   ├── ShineBorder.tsx     # Animated border component
+│   │   ├── Settings.tsx        # Settings modal
 │   │   ├── MarkdownRenderer.tsx
+│   │   ├── ThinkingRenderer.tsx
 │   │   └── ui/                 # Reusable UI components
 │   ├── config/                 # Configuration files
 │   ├── hooks/                  # Custom React hooks
 │   ├── lib/                    # Utility functions
 │   └── types/                  # TypeScript type definitions
+├── src-tauri/                  # Tauri Rust backend
+│   ├── src/
+│   │   ├── main.rs             # Main Tauri entry
+│   │   └── lib.rs              # Core backend logic
+│   └── Cargo.toml              # Rust dependencies
+├── tools/                      # External tools
+│   ├── web-search/             # Python web search integration
+│   └── README.md               # Tools documentation
 ├── public/                     # Static assets
 ├── docs/                       # Documentation
 └── tests/                      # Test files
@@ -297,29 +341,37 @@ We welcome contributions from the community! BeautifyOllama is an early-stage pr
 
 ```bash
 npm run dev          # Start development server
-npm run build # Build for production
+npm run build        # Build for production
 npm run start        # Start production server
 npm run lint         # Run ESLint
 npm run type-check   # Run TypeScript compiler
+npm run tauri dev    # Start Tauri development mode
+npm run tauri build  # Build Tauri desktop application
 npm test             # Run tests (when available)
 ```
 
 ## 🛣️ Roadmap
 
 ### Phase 1: Core Features (Current)
-- [x] Basic chat interface
-- [x] Ollama integration
-- [x] Theme system
-- [x] Responsive design
-- [ ] Enhanced error handling
-- [ ] Performance optimizations
+- [x] Basic chat interface with real-time streaming
+- [x] Ollama integration with custom port support
+- [x] Theme system (dark/light mode)
+- [x] Responsive design for all devices
+- [x] Web search integration with SearxNG
+- [x] Comprehensive settings and model management
+- [x] Conversation history and management
+- [x] Advanced thinking and verbose modes
+- [x] Cross-platform support (Windows, macOS, Linux)
+- [ ] Enhanced error handling and user feedback
+- [ ] Performance optimizations for large conversations
 
 ### Phase 2: Advanced Features (Next)
-- [ ] API key management system
-- [ ] Conversation history persistence
-- [ ] File upload and processing
-- [ ] Advanced model settings
-- [ ] Export/import conversations
+- [ ] File upload and document processing
+- [ ] Advanced prompt templates and management
+- [ ] Export/import conversations (JSON, Markdown)
+- [ ] Custom model parameter configuration
+- [ ] Plugin architecture foundation
+- [ ] Enhanced search within conversation history
 
 ### Phase 3: Enterprise Features (Future)
 - [ ] Multi-user support
@@ -339,10 +391,14 @@ npm test             # Run tests (when available)
 | Feature | Status | Priority |
 |---------|--------|----------|
 | Core Chat | ✅ Complete | High |
+| Web Search | ✅ Complete | High |
+| Settings Panel | ✅ Complete | High |
+| Model Management | ✅ Complete | High |
 | Theme System | ✅ Complete | High |
 | Mobile Support | ✅ Complete | High |
-| API Keys | 🚧 In Progress | High |
+| Custom Ports | ✅ Complete | Medium |
 | File Upload | 📋 Planned | Medium |
+| Multi-API Support | 📋 Planned | Medium |
 | Cloud Sync | 📋 Planned | Low |
 
 ## 🐛 Troubleshooting
@@ -359,6 +415,26 @@ ollama list
 
 # Test API endpoint
 curl http://localhost:11434/api/tags
+
+# For custom ports, test the specific port
+curl http://localhost:YOUR_PORT/api/tags
+```
+
+**Model Loading Issues (Windows)**
+```bash
+# Force refresh models in the app or try:
+ollama list
+ollama pull llama2
+# Then refresh the model list in BeautifyOllama settings
+```
+
+**Web Search Not Working**
+```bash
+# Ensure Python dependencies are installed
+pip install ollama requests
+
+# Check if SearxNG instances are accessible
+# The app will automatically try multiple instances
 ```
 
 **Build Errors**
@@ -369,6 +445,9 @@ rm -rf .next
 
 # Reinstall dependencies
 rm -rf node_modules package-lock.json
 npm install
+
+# For Tauri build issues
+cd src-tauri && cargo clean
 ```
 
 **Hydration Errors**
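The troubleshooting hunks verify connectivity with `curl http://localhost:<port>/api/tags`, which returns JSON describing the installed models. A sketch of extracting model names from that payload on the client side, assuming the `{ "models": [{ "name": ... }] }` shape from Ollama's API documentation (verify against your Ollama version):

```typescript
// Minimal typing for the parts of the /api/tags response used here;
// the real response carries more fields (size, digest, modified_at).
interface TagsResponse {
  models: { name: string }[];
}

function listModelNames(payload: string): string[] {
  const parsed = JSON.parse(payload) as TagsResponse;
  return parsed.models.map((m) => m.name);
}
```

A settings panel like the one added in this commit could use a function of this shape to populate its model selector after each refresh.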
