2. Config
Aware is configured through a global .toml config file, which is provided during initial installation. The settings below can be modified according to your needs.

API keys are set per provider, for example:

[anthropic_api]
key = '...'

[openai_api]
api_key = '...'
Models Configuration (Mandatory)
Aware requires at least:
One embedding model
One chat model capable of agentic (multi-step) execution
Additional fast models can improve performance but are not required.
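As a rough illustration, a minimal setup pairing the two required models could look like the sketch below. The [models] section and its key names are hypothetical rather than Aware's confirmed schema, and the 'anthropic/' prefix is assumed by analogy with the 'openai/' and 'bedrock/' prefixes used in the examples further down; keep the names that appear in the config file shipped with your installation.

# Hypothetical minimal setup: one embedding model and one agentic-capable chat model.
# Section name, key names, and the 'anthropic/' prefix are illustrative only.
[models]
embedding    = 'openai/text-embedding-3-large'
agentic_chat = 'anthropic/claude-sonnet-4-5'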
Model Types
1. Embedding Model (Required)
Drives chunking and retrieval. May require tuning if replaced.
Example: text-embedding-3-large
2. Super-Fast Model (Optional)
A lightweight model for very quick operations. If not provided, the agentic model will handle these tasks, but indexing may become slower.
Example: gpt-4.1-nano
Recommendation: If no fast or super-fast model is available, it’s best to disable code description during indexing to avoid slowdowns.
3. Fast Chat Model (Optional)
Used for quick, non-reasoning tasks. If not provided, the agentic model can be used instead, with some performance impact.
Example: gpt-4.1-mini
4. Agentic-Capable Chat Model (Required)
Runs the core agent workflows and multi-step reasoning.
Example: claude-sonnet-4-5
Required Models Summary
Embeddings: text-embedding-3-large (required)
Agentic-capable chat: e.g., claude-sonnet-4-5 (required)
Fast chat model: e.g., gpt-4.1-mini (optional)
Super-fast model: e.g., gpt-4.1-nano (optional)
Example Model Configurations
OpenAI-Only Example
Embedding: openai/text-embedding-3-large
Super-fast model: openai/gpt-4.1-nano
Fast chat model: openai/gpt-4.1-mini
Agentic-capable chat model: openai/gpt-5.1
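Expressed in the config file, this OpenAI-only setup might look roughly like the following. Only the model identifiers come from the list above; the [models] section and its key names are assumptions for illustration.

# OpenAI-only sketch; key names are hypothetical, model identifiers are from the example above.
[models]
embedding    = 'openai/text-embedding-3-large'
super_fast   = 'openai/gpt-4.1-nano'
fast_chat    = 'openai/gpt-4.1-mini'
agentic_chat = 'openai/gpt-5.1'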
Bedrock-Only Example
Embedding: bedrock/amazon.titan-embed-text-v2:0
Super-fast, low-cost model: bedrock/anthropic.claude-haiku-4-5-20251001-v1:0
Fast, non-reasoning chat model: bedrock/anthropic.claude-haiku-4-5-20251001-v1:0
Agentic-capable chat model: bedrock/global.anthropic.claude-sonnet-4-5-20250929-v1:0
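The same sketch for the Bedrock-only setup, again with hypothetical key names; the model identifiers are taken from the list above, and the AI_MODELS_FILE variable described under Bedrock Configuration below must be set as well.

# Bedrock-only sketch; key names are hypothetical, model identifiers are from the example above.
[models]
embedding    = 'bedrock/amazon.titan-embed-text-v2:0'
super_fast   = 'bedrock/anthropic.claude-haiku-4-5-20251001-v1:0'
fast_chat    = 'bedrock/anthropic.claude-haiku-4-5-20251001-v1:0'
agentic_chat = 'bedrock/global.anthropic.claude-sonnet-4-5-20250929-v1:0'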
Default Configuration
Bedrock Configuration
To enable Bedrock embeddings, the following environment variable must be set:
AI_MODELS_FILE=ai_models_bedrock.py
LLM Gateway Support
Aware supports integration with custom LLM gateways, provided they implement the same API interfaces as the official OpenAI or Anthropic model endpoints.
If you have unique integration needs, please reach out to the Qodo team.
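As an illustration only, pointing Aware at an OpenAI-compatible gateway usually comes down to overriding the endpoint URL alongside the API key. The base_url key and the URL below are hypothetical placeholders rather than Aware's confirmed schema; check your installed config file or ask the Qodo team for the exact key names.

# Hypothetical sketch of routing OpenAI-format requests through a custom gateway.
# 'base_url' and the URL value are placeholders, not confirmed Aware config keys.
[openai_api]
api_key  = '...'
base_url = 'https://llm-gateway.example.com/v1'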
Monitoring config (Recommended)