
bipark/mac_llm_client


✨ LLM-K - Multi LLM Client ✨

Multi-platform LLM Mac client supporting Ollama, LM Studio, Claude, and OpenAI

한국어 | 日本語 | 中文


LLM-K is a Mac client app that connects to various LLM services, including Ollama, LM Studio, Claude, and OpenAI. You can build it from source or download LLM-K from the Apple App Store.

Introduction

LLM-K is a versatile client that supports multiple LLM platforms:

  • Ollama: Open source software for running LLMs locally
  • LM Studio: Local LLM platform with various model support
  • Claude: Anthropic's advanced AI model
  • OpenAI: Leading AI platform including GPT models


Key features

  • Multiple LLM platform support: Ollama, LM Studio, Claude, and OpenAI
  • Selective Service Display: Choose which LLM services to show in the model selection menu
  • Remote LLM access: Connect to the Ollama/LM Studio host via IP address
  • Custom prompts: set your own custom instructions
  • Open source model support: Deepseek, Llama, Gemma, Qwen, Mistral, and more
  • Advanced model parameters: Temperature, Top P, Top K controls with intuitive sliders
  • Connection testing: Built-in server connection status checker
  • Multi-format file support: Images, PDF documents, and text files
  • Image recognition (on models that support it)
  • Intuitive chat-like UI
  • Conversation history: save and manage chat sessions
  • Localized in Korean, English, Japanese, and Chinese
  • Markdown rendering support
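The Temperature, Top P, and Top K sliders above correspond to the sampling options the underlying services accept. A minimal sketch in Python of what a client like LLM-K might send to a local Ollama server's `/api/chat` endpoint (the model name and values are illustrative, not the app's actual defaults):

```python
import json

def build_chat_request(model, prompt, temperature=0.7, top_p=0.9, top_k=40):
    """Build the JSON body for an Ollama /api/chat request with sampling options."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
        # Sampling parameters exposed as sliders in the client UI
        "options": {"temperature": temperature, "top_p": top_p, "top_k": top_k},
    }

body = build_chat_request("llama3", "Hello!", temperature=0.2)
print(json.dumps(body, indent=2))
```

Claude and OpenAI accept analogous parameters in their own request formats, so one set of sliders can drive all four backends.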


How to use

  1. Choose your preferred LLM platform:
    • For Ollama: Install Ollama on your computer (Ollama Download)
    • For LM Studio: Install LM Studio (LM Studio Website)
    • For Claude/OpenAI: Obtain API keys from respective platforms
  2. Download the source and build it with Xcode, or download the LLM-K app from the App Store
  3. Configure your chosen platform:
    • For Ollama/LM Studio: Install desired models
    • For Claude/OpenAI: Enter your API keys in settings
  4. For local LLMs (Ollama/LM Studio), configure remote access if needed
  5. Launch LLM-K and select your preferred service and model
  6. Start your conversation!
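For step 4, remote access means pointing the client at the host machine's IP address. A sketch of the URL construction involved, assuming the servers' default ports (11434 for Ollama, 1234 for LM Studio) and their standard model-listing endpoints, which a connection test could probe:

```python
# Default API ports for the two local LLM servers
DEFAULT_PORTS = {"ollama": 11434, "lmstudio": 1234}

def base_url(service, host="127.0.0.1"):
    """Base URL for a local or remote LLM host reachable by IP address."""
    return f"http://{host}:{DEFAULT_PORTS[service]}"

def model_list_url(service, host="127.0.0.1"):
    # Ollama lists installed models at /api/tags;
    # LM Studio exposes an OpenAI-compatible /v1/models endpoint.
    path = "/api/tags" if service == "ollama" else "/v1/models"
    return base_url(service, host) + path

print(model_list_url("ollama", "192.168.0.10"))
# → http://192.168.0.10:11434/api/tags
```

A successful GET against that URL confirms the server is reachable, which is essentially what a built-in connection checker needs to verify.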

System requirements

  • For local LLMs: a machine (local or remote) with Ollama or LM Studio installed
  • For cloud LLMs: Valid API keys for Claude or OpenAI
  • Network connection

Advantages

  • Multi-platform support for both local and cloud-based LLMs
  • Flexible service selection for streamlined interface
  • Advanced AI features available through various platforms
  • Privacy-protected options (local LLMs)
  • Versatile for programming, creative work, casual questions, etc.
  • Organized conversation management

Notes

  • Local LLM features require Ollama or LM Studio installation
  • API keys required for Claude and OpenAI services
  • You are responsible for managing your local LLM hosts and API keys securely
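One common way to manage API keys securely is to keep them out of source code and read them from the environment. A hedged sketch (the `EXAMPLE_API_KEY` variable name and value are purely illustrative; real services use their own conventional names such as `OPENAI_API_KEY`):

```python
import os

def load_api_key(env_var):
    """Read an API key from the environment instead of hardcoding it."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"{env_var} is not set; export it before launching")
    return key

# For illustration only -- in practice the key is exported in your shell
os.environ["EXAMPLE_API_KEY"] = "sk-demo"
print(load_api_key("EXAMPLE_API_KEY"))
```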

Download App

License

LLM-K is licensed under the GNU license. For more information, please refer to the LICENSE file.

Contact

For questions or bug reports about LLM-K, please send an email to [email protected].
