mylocalai

Local AI Chat

A React chat application that runs AI locally using Ollama. Open source and completely free.

🎥 Watch Demo

Philosophy

This application runs AI entirely on your machine through Ollama: no cloud APIs, no accounts, and no fees. It is open source and completely free to use.

Quick Start

  1. Install Ollama

    Download and install Ollama from the official website:

    • macOS/Windows/Linux: Visit https://ollama.com and download the installer for your operating system
    • Follow the installation instructions for your platform
  2. Install a model
    ollama pull llama3.1:8b
    
  3. Run the application
    npm install
    npm start
    
  4. Open http://localhost:3000 in your browser (a quick connectivity check is sketched below)
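
If everything is wired up, the app talks to Ollama over its local HTTP API. The snippet below is a minimal sketch of that round trip, useful as a standalone sanity check; the script name is hypothetical and not part of this repo, and it assumes the default endpoint shown in the Configuration section.

// sanity-check.ts — hypothetical standalone script (Node 18+, which ships a global fetch).
// Confirms Ollama is reachable, then asks llama3.1:8b for a single reply.

const OLLAMA_URL = "http://localhost:11434"; // default from .env.local

async function main(): Promise<void> {
  // GET /api/tags lists locally installed models; failure means Ollama isn't running.
  const tags = await fetch(`${OLLAMA_URL}/api/tags`);
  if (!tags.ok) throw new Error(`Ollama not reachable (HTTP ${tags.status})`);

  // POST /api/chat with stream: false returns one JSON object instead of a stream.
  const res = await fetch(`${OLLAMA_URL}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.1:8b",
      messages: [{ role: "user", content: "Say hello in one sentence." }],
      stream: false,
    }),
  });
  const data = await res.json();
  console.log(data.message.content);
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});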

Features

• Chat with AI models running entirely on your own machine via Ollama
• No cloud APIs, accounts, or usage fees
• Configurable Ollama endpoint and dev-server port via .env.local

Requirements

• Node.js and npm (to install dependencies and run the dev server)
• Ollama installed and running locally
• At least one pulled model (llama3.1:8b recommended)

Supported Models

Works with llama3.1:8b (recommended). Run ollama list to see which models you already have installed.
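
To check programmatically which models your local Ollama can serve (the same list ollama list prints), the /api/tags endpoint reports them; a minimal sketch, with the file name being hypothetical:

// list-models.ts — hypothetical helper; equivalent to running `ollama list`.
fetch("http://localhost:11434/api/tags")
  .then((res) => res.json())
  .then((data: { models: { name: string }[] }) => {
    // Each entry's name is what you'd pass as the "model" field in chat requests.
    console.log(data.models.map((m) => m.name).join("\n"));
  });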

Configuration

Edit .env.local to customize:

REACT_APP_OLLAMA_URL=http://localhost:11434
PORT=3000
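
Both settings follow Create React App conventions: variables prefixed with REACT_APP_ are inlined into the client bundle at build time (so restart the dev server after editing .env.local), and PORT sets the dev server's port. A sketch of how client code would read the value — the module and constant names are assumptions, not necessarily what this repo uses:

// ollamaUrl.ts — hypothetical module; CRA substitutes process.env.REACT_APP_* at build time.
export const OLLAMA_URL: string =
  process.env.REACT_APP_OLLAMA_URL ?? "http://localhost:11434";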

Troubleshooting

Ollama Not Connected?

First make sure Ollama itself is running: launch the desktop app, or run ollama serve in a terminal. Then check that REACT_APP_OLLAMA_URL in .env.local matches the address Ollama listens on (http://localhost:11434 by default). If the browser console shows CORS errors, you may need to allow the app's origin via Ollama's OLLAMA_ORIGINS environment variable.
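
A quick programmatic check against Ollama's version endpoint (a hypothetical standalone script, not part of the app):

// check-ollama.ts — hypothetical script; exits non-zero if Ollama is unreachable.
const url = process.env.REACT_APP_OLLAMA_URL ?? "http://localhost:11434";

fetch(`${url}/api/version`)
  .then((res) => res.json())
  .then((v: { version: string }) => console.log(`Ollama ${v.version} is up at ${url}`))
  .catch((err: Error) => {
    console.error(`Could not reach Ollama at ${url}: ${err.message}`);
    process.exit(1);
  });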

Performance Issues?

Generation happens on your own hardware, so speed depends on your CPU/GPU and available RAM; an 8B model such as llama3.1:8b generally wants roughly 8 GB of free memory. If responses are slow, try a smaller model (for example, ollama pull llama3.2:3b) or close other memory-heavy applications.

Development

npm start       # Development server
npm run build   # Production build
npm test        # Run tests
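
The npm scripts above match Create React App's defaults, so npm test runs Jest in watch mode. A minimal smoke test in that style — the file name, component path, and asserted text are all assumptions about this repo:

// App.test.tsx — hypothetical; assumes CRA's default Jest + React Testing Library setup.
import { render, screen } from "@testing-library/react";
import App from "./App";

test("renders the chat UI", () => {
  render(<App />);
  // Adjust the matcher to whatever the real UI displays.
  expect(screen.getByText(/chat/i)).toBeInTheDocument();
});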

License

MIT License - See LICENSE file for details.