Self-hosted Providers

This page will help you set up self-hosted providers with LLM Vision.

LocalAI

To use LocalAI you need to have a LocalAI server running. You can find the installation instructions in the LocalAI documentation. During setup you'll need to provide the IP address of the machine LocalAI is running on and the port it listens on (the default is 8080). If you want requests sent over HTTPS, check the HTTPS box.
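LocalAI exposes an OpenAI-compatible API, so a quick way to confirm the server is reachable before adding it to LLM Vision is to request its model list. A minimal check, assuming the default port and a placeholder address of 192.168.1.10:

# 192.168.1.10 is a placeholder; use your LocalAI server's address and port
curl http://192.168.1.10:8080/v1/models

If the server is running, this returns a JSON list of the models LocalAI has available.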

Ollama

To use Ollama you must set up an Ollama server first. You can download it from the Ollama website. Once installed, run the following command to download the llava model:

ollama run llava
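The first run downloads the model and then drops you into an interactive prompt, which you can leave with /bye. To confirm the model is installed, list the models Ollama has stored locally:

ollama list

llava should appear in the output alongside its size and modification date.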

If your Home Assistant is not running on the same computer as Ollama, you need to set the OLLAMA_HOST environment variable so that Ollama accepts connections from other machines on the network.

Linux
  1. Edit the systemd service by calling systemctl edit ollama.service. This will open an editor.

  2. For each environment variable, add an Environment line under the [Service] section:

[Service]
Environment="OLLAMA_HOST=0.0.0.0"
  3. Save and close the editor.

  4. Reload systemd and restart Ollama:

systemctl daemon-reload
systemctl restart ollama
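To check that the override took effect, you can query the server from another machine on your network. A quick sketch, assuming Ollama's default port of 11434 and a placeholder server address of 192.168.1.20:

# 192.168.1.20 is a placeholder; use your Ollama server's address
curl http://192.168.1.20:11434/

A reachable server responds with "Ollama is running".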
Windows
  1. Quit Ollama from the system tray

  2. Open File Explorer

  3. Right click on This PC and select Properties

  4. Click on Advanced system settings

  5. Select Environment Variables

  6. Under User variables click New

  7. For variable name enter OLLAMA_HOST and for value enter 0.0.0.0

  8. Click OK and start Ollama again from the Start Menu
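To verify the setting, query the server from another machine on the network. A quick check, assuming the default port 11434 and a placeholder address of 192.168.1.30 for the Windows machine:

# 192.168.1.30 is a placeholder; use your Windows machine's address
curl http://192.168.1.30:11434/api/tags

This returns a JSON list of installed models. If the request times out, make sure Windows Defender Firewall allows inbound connections on port 11434.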

macOS
  1. Open Terminal

  2. Run the following command:

launchctl setenv OLLAMA_HOST "0.0.0.0"
  3. Restart Ollama
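To confirm the variable is set, you can read it back from launchd:

launchctl getenv OLLAMA_HOST

This should print 0.0.0.0. Note that launchctl setenv does not persist across reboots, so you may need to run the command again after restarting your Mac.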
