
VinsonGuo/LlamaChat

LlamaChat is a mobile application that lets users chat with locally running large language models (LLMs) directly on their devices. It provides an intuitive interface for downloading, managing, and interacting with LLMs, enabling on-device AI chat without requiring internet connectivity for inference.


LlamaChat is a cross-platform mobile application built with React Native that brings the power of large language models to mobile devices. Unlike cloud-based AI chat applications, LlamaChat runs models directly on-device (a rough sketch of a single chat turn follows the list below), providing:

  • Privacy: All conversations and processing stay on your device
  • Offline functionality: No internet required for generating responses
  • Customizability: Control model parameters and system prompts
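
For illustration, here is a minimal TypeScript sketch of what a single on-device chat turn could look like. The binding interface and the names and parameters used below (LlamaBinding, LlamaContext, initContext, completion, chatOnce, the model path) are hypothetical placeholders standing in for a native llama.cpp-style bridge; they are not LlamaChat's actual API.

```typescript
// Hypothetical shape of a native on-device LLM binding (e.g. a llama.cpp
// bridge exposed to React Native). Names and options are illustrative only.
interface LlamaContext {
  completion(opts: {
    prompt: string;
    n_predict?: number;   // max tokens to generate
    temperature?: number; // sampling temperature
  }): Promise<{ text: string }>;
  release(): Promise<void>;
}

interface LlamaBinding {
  // Loads a model file that was previously downloaded to local storage.
  initContext(opts: { modelPath: string; n_ctx?: number }): Promise<LlamaContext>;
}

// One chat turn, fully on-device: combine a system prompt with the user
// message, run local inference, and return the generated text. No network
// request is involved at any point.
async function chatOnce(
  llama: LlamaBinding,
  modelPath: string,
  systemPrompt: string,
  userMessage: string,
): Promise<string> {
  const ctx = await llama.initContext({ modelPath, n_ctx: 2048 });
  try {
    const prompt = `${systemPrompt}\n\nUser: ${userMessage}\nAssistant:`;
    const { text } = await ctx.completion({
      prompt,
      n_predict: 256,
      temperature: 0.7, // a user-adjustable model parameter
    });
    return text.trim();
  } finally {
    await ctx.release(); // free native memory when the turn is done
  }
}

// Example usage (hypothetical model path, binding supplied by the app's
// native module):
// const reply = await chatOnce(llama, '/data/models/model.gguf',
//   'You are a helpful assistant.', 'Hello!');
```

The point of the sketch is that the system prompt and sampling parameters are plain local configuration values passed to the native binding, rather than fields in a request to a remote API.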

Readme: https://deepwiki.com/VinsonGuo/LlamaChat
