This folder contains examples for direct chat client usage patterns.
| File | Description |
|---|---|
| `built_in_chat_clients.py` | Consolidated sample for the built-in chat clients. Uses `get_client()` to create the selected client and passes it to `main()`. |
| `chat_response_cancellation.py` | Demonstrates how to cancel chat responses during streaming, including proper cancellation handling and cleanup. |
| `custom_chat_client.py` | Demonstrates how to create custom chat clients by extending the `BaseChatClient` class. Shows an `EchoingChatClient` implementation and how to integrate it with `Agent` using the `as_agent()` method. |
| `require_per_service_call_history_persistence.py` | Compares two otherwise identical `FoundryChatClient` agents with `store=False`; the only difference is whether `require_per_service_call_history_persistence` is enabled, and only the run without it stores the synthesized tool result when middleware terminates the loop early. |
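The `custom_chat_client.py` sample extends the framework's `BaseChatClient`. As a rough, library-free sketch of the same echo pattern, the following uses hypothetical stand-in types (`BaseChatClient` and `ChatMessage` below are illustrative stubs, not the framework's actual classes, whose signatures differ):

```python
import asyncio
from dataclasses import dataclass


# Hypothetical stand-ins for the framework types; the real sample
# extends agent_framework's BaseChatClient instead.
@dataclass
class ChatMessage:
    role: str
    text: str


class BaseChatClient:
    async def get_response(self, messages: list[ChatMessage]) -> ChatMessage:
        raise NotImplementedError


class EchoingChatClient(BaseChatClient):
    """Echoes the last user message back instead of calling a model."""

    async def get_response(self, messages: list[ChatMessage]) -> ChatMessage:
        last_user = next(m for m in reversed(messages) if m.role == "user")
        return ChatMessage(role="assistant", text=f"Echo: {last_user.text}")


async def main() -> None:
    client = EchoingChatClient()
    reply = await client.get_response([ChatMessage("user", "hello")])
    print(reply.text)  # Echo: hello


if __name__ == "__main__":
    asyncio.run(main())
```

The point of the pattern is that anything implementing the chat-client interface can be swapped in wherever a model-backed client is expected; see the actual sample for the real base class and `as_agent()` integration.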
`built_in_chat_clients.py` starts with:

```python
asyncio.run(main("openai_responses"))
```

Change the argument to pick a client:

- `openai_responses`
- `openai_chat_completion`
- `anthropic`
- `ollama`
- `bedrock`
- `azure_openai_responses`
- `azure_openai_chat_completion`
- `foundry_chat`

Example:

```shell
uv run samples/02-agents/chat_client/built_in_chat_clients.py
```

The `require_per_service_call_history_persistence.py` sample uses `FoundryChatClient`, so set the usual Foundry settings first and sign in with the Azure CLI:

```shell
export FOUNDRY_PROJECT_ENDPOINT="https://<your-project>.services.ai.azure.com/api/projects/<project-name>"
export FOUNDRY_MODEL="<your-model-deployment-name>"
az login
uv run samples/02-agents/chat_client/require_per_service_call_history_persistence.py
```

Depending on the selected client, set the appropriate environment variables:
For Azure OpenAI clients (`azure_openai_responses` and `azure_openai_chat_completion`):

- `AZURE_OPENAI_ENDPOINT`: Your Azure OpenAI endpoint
- `AZURE_OPENAI_MODEL`: The Azure OpenAI deployment used by the sample
- `AZURE_OPENAI_API_VERSION` (optional): Azure OpenAI API version override
- `AZURE_OPENAI_API_KEY` (optional): Azure OpenAI API key if you are not using `AzureCliCredential`
For the Foundry client (`foundry_chat`):

- `FOUNDRY_PROJECT_ENDPOINT`: Your Azure AI Foundry project endpoint
- `FOUNDRY_MODEL`: The Foundry deployment used by the sample
For OpenAI clients:

- `OPENAI_API_KEY`: Your OpenAI API key
- `OPENAI_CHAT_MODEL`: The OpenAI model for `openai_chat_completion`
- `OPENAI_RESPONSES_MODEL`: The OpenAI model for `openai_responses`
For the Anthropic client (`anthropic`):

- `ANTHROPIC_API_KEY`: Your Anthropic API key
- `ANTHROPIC_CHAT_MODEL`: The Anthropic model to use (for example, `claude-sonnet-4-5`)
For the Ollama client (`ollama`):

- `OLLAMA_HOST`: Ollama server URL (defaults to `http://localhost:11434` if unset)
- `OLLAMA_MODEL`: Ollama model name (for example, `mistral`, `qwen2.5:8b`)
For the Bedrock client (`bedrock`):

- `BEDROCK_CHAT_MODEL`: Bedrock model ID (for example, `anthropic.claude-3-5-sonnet-20240620-v1:0`)
- `BEDROCK_REGION`: AWS region (defaults to `us-east-1` if unset)
- AWS credentials via standard environment variables (for example, `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`)
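Putting it together, a typical invocation for one client might look like this (the key and model values are illustrative placeholders, and this assumes you have edited `built_in_chat_clients.py` to pass `"anthropic"` to `main()`):

```shell
# Run the consolidated sample against Anthropic
# (after changing the argument in built_in_chat_clients.py to "anthropic")
export ANTHROPIC_API_KEY="<your-anthropic-api-key>"
export ANTHROPIC_CHAT_MODEL="claude-sonnet-4-5"
uv run samples/02-agents/chat_client/built_in_chat_clients.py
```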