Providers
LLM Vision combines multiple AI providers into one easy-to-use integration. The following providers are currently supported:
Cloud-based Providers
OpenAI
Anthropic
Google
Groq
Self-hosted Providers
Ollama
LocalAI
If you are unsure which provider or model to use, there is a comparison available here: https://llm-vision.gitbook.io/getting-started/choosing-the-right-model
Each provider you want to use must be set up separately. Cloud-based providers require a valid API key. Continue for more detailed instructions on how to obtain an API key or how to host your own LLM locally.
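Before adding a provider, it can help to verify your credentials or local server outside the integration. The sketch below is illustrative only and not part of LLM Vision: it checks an OpenAI API key against the public /v1/models endpoint and confirms a local Ollama server answers on its default port (11434). The key, host, and port are placeholders; adjust them to your setup. It assumes the third-party requests library is installed.

```python
"""Quick pre-flight checks before configuring providers (illustrative, not part of LLM Vision)."""
import requests

OPENAI_API_KEY = "sk-..."               # replace with your own key
OLLAMA_URL = "http://localhost:11434"   # default Ollama address; adjust host/port if needed


def openai_key_is_valid(api_key: str) -> bool:
    """Return True if OpenAI's public /v1/models endpoint accepts the key."""
    resp = requests.get(
        "https://api.openai.com/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    return resp.status_code == 200


def ollama_is_reachable(base_url: str) -> bool:
    """Return True if a local Ollama server responds on its /api/tags endpoint."""
    try:
        resp = requests.get(f"{base_url}/api/tags", timeout=5)
        return resp.status_code == 200
    except requests.ConnectionError:
        return False


if __name__ == "__main__":
    print("OpenAI key valid:", openai_key_is_valid(OPENAI_API_KEY))
    print("Ollama reachable:", ollama_is_reachable(OLLAMA_URL))
```

The same pattern applies to the other providers: each cloud service exposes a lightweight authenticated endpoint you can query to confirm a key works, and each self-hosted server has a status or model-list endpoint you can reach from the machine running Home Assistant.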