## Authentication
Set your `HF_TOKEN` environment variable. You can get an access token from Hugging Face here.
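If you prefer not to rely on the environment variable alone, the token can also be passed explicitly. A minimal sketch, assuming the import path shown below (it may differ in your installation):

```python
import os

from agno.models.huggingface import HuggingFace  # assumed import path

# Pass the token explicitly; if api_key is omitted, the wrapper
# falls back to the HF_TOKEN environment variable.
model = HuggingFace(api_key=os.getenv("HF_TOKEN"))
```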
## Example

Use `HuggingFace` with your `Agent`:
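A minimal sketch of the setup, assuming the usual `Agent` and `HuggingFace` import paths and the `print_response` helper (names may differ in your version):

```python
from agno.agent import Agent  # assumed import path
from agno.models.huggingface import HuggingFace  # assumed import path

agent = Agent(
    model=HuggingFace(
        id="microsoft/DialoGPT-medium",  # default id from the table below; any Hub model id works
    ),
    markdown=True,
)

# Stream the model's reply to the terminal.
agent.print_response("Share a two-sentence horror story.", stream=True)
```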
View more examples here.
## Parameters
| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `id` | `str` | `"microsoft/DialoGPT-medium"` | The id of the Hugging Face model to use |
| `name` | `str` | `"HuggingFace"` | The name of the model |
| `provider` | `str` | `"HuggingFace"` | The provider of the model |
| `api_key` | `Optional[str]` | `None` | The API key for Hugging Face (defaults to the `HF_TOKEN` env var) |
| `base_url` | `str` | `"https://api-inference.huggingface.co/models"` | The base URL for the Hugging Face Inference API |
| `wait_for_model` | `bool` | `True` | Whether to wait for the model to load if it's cold |
| `use_cache` | `bool` | `True` | Whether to use caching for faster inference |
| `max_tokens` | `Optional[int]` | `None` | Maximum number of tokens to generate |
| `temperature` | `Optional[float]` | `None` | Controls randomness in the model's output |
| `top_p` | `Optional[float]` | `None` | Controls diversity via nucleus sampling |
| `repetition_penalty` | `Optional[float]` | `None` | Penalty for repeating tokens (higher values reduce repetition) |
`HuggingFace` is a subclass of the `Model` class and has access to the same params.
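As a rough illustration of how these parameters map onto the constructor (the import path and the model id below are illustrative assumptions, not defaults):

```python
from agno.models.huggingface import HuggingFace  # assumed import path

model = HuggingFace(
    id="mistralai/Mistral-7B-Instruct-v0.2",  # any Hugging Face Hub model id
    max_tokens=256,          # cap on generated tokens
    temperature=0.7,         # higher values increase randomness
    top_p=0.9,               # nucleus sampling cutoff
    repetition_penalty=1.1,  # values > 1.0 discourage repeated tokens
    wait_for_model=True,     # wait for a cold model to load
    use_cache=True,          # reuse cached results for identical requests
)
```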