mirror of
https://github.com/temporal-community/temporal-ai-agent.git
synced 2026-03-15 14:08:08 +01:00
readme

README.md
@@ -16,17 +16,24 @@ cp .env.example .env
### LLM Provider Configuration

The agent can use either OpenAI's GPT-4o, Google Gemini, or a local LLM via Ollama. Set the `LLM_PROVIDER` environment variable in your `.env` file to choose the desired provider:
- `LLM_PROVIDER=openai` for OpenAI's GPT-4o
- `LLM_PROVIDER=google` for Google Gemini
- `LLM_PROVIDER=ollama` for the local LLM via Ollama (not recommended for this use case)
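Taken together, a minimal `.env` sketch for the provider switch might look like this (the variable names are the ones documented here; the key value is a placeholder):

```shell
# .env — pick exactly one provider
LLM_PROVIDER=openai          # or: google, ollama
OPENAI_API_KEY=<your-openai-key>
```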

### Option 1: OpenAI Configuration

If using OpenAI, ensure you have an OpenAI key for the GPT-4o model. Set this in the `OPENAI_API_KEY` environment variable in `.env`.

### Option 2: Google Gemini

To use Google Gemini:

1. Obtain a Google API key and set it in the `GOOGLE_API_KEY` environment variable in `.env`.
2. Set `LLM_PROVIDER=google` in your `.env` file.
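The two steps above amount to a `.env` fragment along these lines (the key value is a placeholder):

```shell
LLM_PROVIDER=google
GOOGLE_API_KEY=<your-google-api-key>
```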

### Option 3: Local LLM via Ollama (not recommended)

To use a local LLM with Ollama:
@@ -38,13 +45,6 @@ To use a local LLM with Ollama:

Note: The local LLM is disabled by default, as GPT-4o was found to be MUCH more reliable for this use case. However, you can switch to Ollama if desired.
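To make the switching behavior concrete, here is a small illustrative sketch of how an agent might dispatch on the `LLM_PROVIDER` variable described above. This is not the repo's actual code; `resolve_provider` and `SUPPORTED` are made-up names for illustration only.

```python
import os

# Provider names documented in this README; anything else is rejected.
SUPPORTED = {"openai", "google", "ollama"}

def resolve_provider(default: str = "openai") -> str:
    """Read LLM_PROVIDER from the environment, falling back to a default."""
    provider = os.environ.get("LLM_PROVIDER", default).strip().lower()
    if provider not in SUPPORTED:
        raise ValueError(f"Unsupported LLM_PROVIDER: {provider!r}")
    return provider

os.environ["LLM_PROVIDER"] = "google"
print(resolve_provider())  # prints "google"
```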

## Agent Tools

* Requires a RapidAPI key for sky-scrapper (how we find flights). Set this in the `RAPIDAPI_KEY` environment variable in `.env`.
* It's free to sign up and get a key at [RapidAPI](https://rapidapi.com/apiheya/api/sky-scrapper).
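The bullet above maps to one more `.env` entry (the value is a placeholder):

```shell
RAPIDAPI_KEY=<your-rapidapi-key>
```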