feat: add AIML API provider
Introduces support for the AI/ML API provider, including configuration options, chat and embedding model loading, and UI integration. Updates documentation and sample config to reflect the new provider.
Parent: 0024ce36c8
Commit: c6d084f5dc
7 changed files with 138 additions and 1 deletion
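The description above mentions chat and embedding model loading for the new provider. Below is a minimal sketch of what that loading path could look like, assuming the provider follows the same LangChain, OpenAI-compatible pattern used for other hosted providers; the base URL, model ids, and function names are assumptions for illustration, not details taken from this diff.

```typescript
import { ChatOpenAI, OpenAIEmbeddings } from '@langchain/openai';

// Assumed OpenAI-compatible base URL for the AI/ML API; verify against the
// provider's documentation before relying on it.
const AIMLAPI_BASE_URL = 'https://api.aimlapi.com/v1';

// Hypothetical chat-model loader: returns a map of display name -> LangChain
// chat model so the UI can list the provider's models.
export const loadAimlApiChatModels = (apiKey: string) => ({
  'GPT-4o': new ChatOpenAI({
    openAIApiKey: apiKey,
    modelName: 'gpt-4o', // example model id; available ids depend on the provider
    temperature: 0.7,
    configuration: { baseURL: AIMLAPI_BASE_URL },
  }),
});

// Hypothetical embedding-model loader following the same pattern.
export const loadAimlApiEmbeddingModels = (apiKey: string) => ({
  'Text Embedding 3 Small': new OpenAIEmbeddings({
    openAIApiKey: apiKey,
    modelName: 'text-embedding-3-small', // example model id
    configuration: { baseURL: AIMLAPI_BASE_URL },
  }),
});
```

Pointing the standard OpenAI client classes at a provider-specific `baseURL` is the usual way to add an OpenAI-compatible service without writing a new client.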
@@ -90,7 +90,9 @@ There are mainly 2 ways of installing Perplexica - With Docker, Without Docker.
- `OLLAMA`: Your Ollama API URL. You should enter it as `http://host.docker.internal:PORT_NUMBER`. If you installed Ollama on port 11434, use `http://host.docker.internal:11434`. For other ports, adjust accordingly. **You need to fill this if you wish to use Ollama's models instead of OpenAI's**.
- `GROQ`: Your Groq API key. **You only need to fill this if you wish to use Groq's hosted models**.
- `ANTHROPIC`: Your Anthropic API key. **You only need to fill this if you wish to use Anthropic models**.
- `Gemini`: Your Gemini API key. **You only need to fill this if you wish to use Google's models**.
- `DEEPSEEK`: Your Deepseek API key. **Only needed if you want Deepseek models.**
- `AIMLAPI`: Your AI/ML API key. **Only needed if you want to use AI/ML API models and embeddings.**

**Note**: You can change these after starting Perplexica from the settings dialog.
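The new `AIMLAPI` entry is read from the same config file as the other provider keys. A small sketch of how it might be exposed to the rest of the app, assuming a flat `API_KEYS` section and a getter named `getAimlApiKey` (both assumptions, not taken from this diff):

```typescript
import fs from 'fs';
import path from 'path';
import toml from '@iarna/toml';

// Assumed shape of the relevant part of config.toml; the real sample config
// may group or nest these keys differently.
interface AppConfig {
  API_KEYS?: {
    AIMLAPI?: string;
  };
}

// Re-read the file on each call so a key changed from the settings dialog
// (see the note above) is picked up without a restart.
const loadConfig = (): AppConfig =>
  toml.parse(
    fs.readFileSync(path.join(process.cwd(), 'config.toml'), 'utf-8'),
  ) as unknown as AppConfig;

// Hypothetical getter mirroring the per-provider key getters.
export const getAimlApiKey = (): string | undefined =>
  loadConfig().API_KEYS?.AIMLAPI;
```

The real config module may cache or validate values; the sketch only illustrates where the key would live and how a provider loader could obtain it.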