
ai-assistant-public

Verified Safe

by turmex

Overview

The AI Assistant provides a local-first conversational interface, enabling users to chat with LLMs directly in their browser via WebLLM (WebGPU) or locally using an Ollama server. It features intelligent model management, hardware-aware recommendations, and token optimization for efficient conversation.
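The "token optimization" mentioned above can be illustrated with a minimal sketch: keep only the most recent messages that fit within a token budget. The function names and the rough 4-characters-per-token heuristic are illustrative assumptions, not the assistant's actual implementation.

```python
def estimate_tokens(text: str) -> int:
    """Rough heuristic: about 4 characters per token for English text."""
    return max(1, len(text) // 4)

def trim_history(messages: list[dict], budget: int = 150) -> list[dict]:
    """Drop the oldest messages until the estimated token count fits the budget."""
    kept: list[dict] = []
    total = 0
    for msg in reversed(messages):  # walk newest-first
        cost = estimate_tokens(msg["content"])
        if total + cost > budget:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))  # restore chronological order
```

A real implementation would use the model's own tokenizer rather than a character heuristic, but the budgeting logic is the same.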

Installation

Run Command
cd ~/Desktop/ai-assistant && ./launch.sh

Environment Variables

  • OLLAMA_BASE_URL
  • DEFAULT_LOCAL_MODEL
  • DATABASE_URL
  • HOST
  • PORT
  • DEBUG
  • GOOGLE_CLIENT_ID
  • GOOGLE_CLIENT_SECRET
  • SALESFORCE_CLIENT_ID
  • SALESFORCE_CLIENT_SECRET
  • SECRET_KEY
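A minimal example `.env` covering the variables above. All values are illustrative placeholders, not project defaults; consult the project's own documentation for real settings.

```shell
# Illustrative placeholder values only -- not project defaults
OLLAMA_BASE_URL=http://localhost:11434
DEFAULT_LOCAL_MODEL=llama3
DATABASE_URL=sqlite:///./app.db
HOST=127.0.0.1
PORT=8000
DEBUG=false
GOOGLE_CLIENT_ID=your-google-client-id
GOOGLE_CLIENT_SECRET=your-google-client-secret
SALESFORCE_CLIENT_ID=your-salesforce-client-id
SALESFORCE_CLIENT_SECRET=your-salesforce-client-secret
SECRET_KEY=change-me-to-a-long-random-string
```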

Security Notes

API keys are stored in browser localStorage with base64 obfuscation, which the project explicitly notes is 'not encryption' and is trivially reversible. CORS is configured permissively for the MVP and must be restricted in production. The `launch.sh` script uses `kill -9` to free ports, which is acceptable for local cleanup but risky if misused.
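To make the base64 caveat concrete: base64 is an encoding, not encryption, so anyone who can read localStorage can recover the original key with a single decode call. The key value below is illustrative.

```python
import base64

api_key = "sk-example-12345"  # illustrative value only

# What an obfuscated localStorage entry would hold:
stored = base64.b64encode(api_key.encode()).decode()

# Any reader of localStorage can reverse it in one step:
recovered = base64.b64decode(stored).decode()

assert recovered == api_key  # the "obfuscated" key is fully recoverable
```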

Stats

Interest Score: 0
Security Score: 4
Cost Class: Low
Avg Tokens: 150
Stars: 0
Forks: 0
Last Update: 2025-11-20

Tags

AI, LLM, Chatbot, WebLLM, Ollama, Model Management, Local AI, WebGPU, Python, FastAPI, JavaScript, Machine Learning, Conversation