model (optional): The model ID to use for the agent. Defaults to process.env.INFERENCE_MODEL_ID if not provided.
temperature (optional): Controls the randomness of the agent's LLM responses.
max… (optional): Maximum number of tokens per underlying inference request made by the agent.
stop (optional): List of strings that stop generation for the agent's LLM.
top… (optional): Proportion of tokens to consider (nucleus sampling) for the agent's LLM.
tools (optional): List of heroku_tool or mcp tools the agent is allowed to use.
api… (optional): Heroku API key. Read from the HEROKU_API_KEY environment variable if not provided.
api… (optional): Heroku API base URL. Defaults to inference.heroku.com.
max… (optional): Maximum number of retries for API calls.
timeout (optional): Timeout for API calls, in milliseconds.
additional… (optional): Allows passing any other Heroku-specific agent parameters not explicitly defined.
Interface for the fields used to instantiate HerokuAgent. It extends BaseChatModelParams and adds the Heroku-specific agent parameters listed above.
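For orientation, here is a minimal TypeScript sketch of the shape this interface describes. Where the property names above are abbreviated, the identifiers used below (maxTokens, topP, apiKey, apiUrl, maxRetries, additionalKwargs) are assumptions following common LangChain.js naming rather than confirmed names, and the BaseChatModelParams import path is assumed to be the @langchain/core one.

```typescript
// A minimal sketch, assuming LangChain.js-style property names where the
// names above are abbreviated; maxTokens, topP, apiKey, apiUrl, maxRetries,
// and additionalKwargs are assumed identifiers, not confirmed ones.
import type { BaseChatModelParams } from "@langchain/core/language_models/chat_models";

export interface HerokuAgentFields extends BaseChatModelParams {
  /** Model ID; falls back to process.env.INFERENCE_MODEL_ID when omitted. */
  model?: string;
  /** Controls the randomness of the agent's LLM responses. */
  temperature?: number;
  /** Maximum tokens per underlying inference request (assumed name). */
  maxTokens?: number;
  /** Strings that stop generation for the agent's LLM. */
  stop?: string[];
  /** Proportion of tokens to consider, i.e. nucleus sampling (assumed name). */
  topP?: number;
  /** heroku_tool or mcp tool definitions the agent may use (element shape assumed). */
  tools?: Array<Record<string, unknown>>;
  /** Heroku API key; read from process.env.HEROKU_API_KEY when omitted (assumed name). */
  apiKey?: string;
  /** Heroku API base URL; defaults to inference.heroku.com (assumed name). */
  apiUrl?: string;
  /** Maximum number of retries for API calls (assumed name). */
  maxRetries?: number;
  /** Timeout for API calls, in milliseconds. */
  timeout?: number;
  /** Any other Heroku-specific agent parameters not explicitly typed here (assumed name). */
  additionalKwargs?: Record<string, unknown>;
}

// Illustrative configuration object of this shape (values are placeholders):
const fields: HerokuAgentFields = {
  model: "your-inference-model-id", // omit to fall back to INFERENCE_MODEL_ID
  temperature: 0.2,
  maxRetries: 2,
  timeout: 30_000,
};
```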