A secure Flutter desktop app that connects Auth0 authentication with local Ollama AI models via encrypted tunneling. Access your private AI instances remotely while keeping your data on your own hardware.


CloudToLocalLLM


A privacy-first platform to manage and run powerful Large Language Models (LLMs) locally, with an optional cloud relay for seamless remote access.


🚀 Overview

CloudToLocalLLM bridges the gap between secure local AI execution and the convenience of cloud-based management. Designed for privacy-conscious users and businesses, it allows you to run models like Llama 3 and Mistral entirely on your own hardware while offering an optional, secure pathway for remote interaction.

Note: This project is currently in heavy development and should be considered early access.

✨ Key Features

  • 🔒 Privacy-First: Run models locally using Ollama. Your data stays on your device by default.
  • 💻 Cross-Platform: Native support for Windows and Linux, with a responsive Web interface.
  • ⚡ Hybrid Architecture: Seamlessly switch between local execution and the optional cloud relay as needed.
  • 🔌 Extensible: Integrated with LangChain for advanced AI workflows.
  • ☁️ Cloud Infrastructure: Deployed on AWS EKS for scalable management.
  • 🏠 Self-Hosted: Easily deploy your own instance on any Linux VPS using Docker Compose.
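
The self-hosted path above could be sketched as a Docker Compose file. Note: the service names, ports, build path, and the OLLAMA_URL variable below are illustrative assumptions, not the project's actual compose file; only the ollama/ollama image is a real public image.

```yaml
# Hypothetical self-hosted sketch -- service names, ports, and env
# vars are assumptions for illustration.
services:
  ollama:
    image: ollama/ollama            # runs the local LLM engine
    volumes:
      - ollama-data:/root/.ollama   # persist pulled models across restarts
    ports:
      - "11434:11434"               # Ollama's default API port
  api-backend:
    build: ./services/api-backend   # backend path from this repo's layout
    environment:
      - OLLAMA_URL=http://ollama:11434  # assumed variable name
    ports:
      - "8080:8080"                 # assumed backend port
volumes:
  ollama-data:
```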

📋 Prerequisites

To use CloudToLocalLLM locally:

  • Ollama: the engine that runs the AI models. Install it from ollama.com.
    • Pull a model: ollama pull llama3.2
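
Before launching the app, you can confirm that Ollama is running and the model is pulled by querying Ollama's GET /api/tags endpoint, which lists installed models as { models: [{ name: ... }] }. The helper below is a minimal sketch of that check; the function names are illustrative, not part of this project.

```javascript
// Check whether a model appears in Ollama's /api/tags listing.
// Pulled models are reported with a tag suffix, e.g. "llama3.2:latest".
function hasModel(tags, modelName) {
  return (tags.models || []).some(
    (m) => m.name === modelName || m.name.startsWith(modelName + ':')
  );
}

// Reach out to a local Ollama instance (default port 11434) and
// report whether the requested model has been pulled.
async function checkOllama(modelName, baseUrl = 'http://localhost:11434') {
  const res = await fetch(`${baseUrl}/api/tags`); // built-in fetch, Node 18+
  if (!res.ok) throw new Error(`Ollama not reachable at ${baseUrl}`);
  return hasModel(await res.json(), modelName);
}

// Example (requires a running Ollama instance):
// checkOllama('llama3.2').then((ok) =>
//   console.log(ok ? 'model ready' : 'run: ollama pull llama3.2'));
```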

📥 Download & Install

Windows & Linux

  1. Go to the Latest Releases page.
  2. Download the installer or executable (.exe for Windows, .AppImage for Linux).
  3. Launch the application.

Web Version

Latest web deployment: cloudtolocalllm.online

📖 Documentation

🛠️ Development

Tech Stack

  • Frontend: Flutter (Linux, Windows, Web)
  • Backend: Node.js (Express.js)
  • Authentication: Auth0
  • Deployment: AWS EKS (Cloud) or Docker Compose (Self-Hosted)

Build from Source

  1. Clone: git clone https://github.com/CloudToLocalLLM-online/CloudToLocalLLM.git
  2. Deps: flutter pub get && (cd services/api-backend && npm install)
  3. Run: flutter run -d linux (Desktop) or flutter run -d chrome (Web)

📄 License

This project is licensed under the MIT License.
