model (optional)
The model ID to use for the agent. If not provided, defaults to process.env.INFERENCE_MODEL_ID.

temperature (optional)
Controls the randomness of the agent's LLM responses.

maxTokensPerInferenceRequest (optional)
Maximum number of tokens per underlying inference request made by the agent.

stop (optional)
List of strings that stop generation for the agent's LLM.

topP (optional)
Proportion of tokens to consider (nucleus sampling) for the agent's LLM.

tools (optional)
List of heroku_tool or mcp tools the agent is allowed to use.

apiKey (optional)
Heroku API key. Reads from the HEROKU_API_KEY environment variable if not provided.

apiUrl (optional)
Heroku API base URL. Defaults to inference.heroku.com.

maxRetries (optional)
Maximum number of retries for API calls.

timeout (optional)
Timeout for API calls, in milliseconds.

additionalKwargs (optional)
Allows passing any other Heroku-specific agent parameters not explicitly defined above.
Interface for the fields used to instantiate HerokuMiaAgent. Extends BaseChatModelParams and adds Heroku-specific agent parameters. Based on the SPECS.md table "HerokuMiaAgentFields Constructor Parameters" (Section 3.2.2).
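For illustration, a minimal instantiation sketch using these fields. It assumes the package exposes HerokuMiaAgent from "heroku-langchain", and the exact field names (maxTokensPerInferenceRequest, topP, apiKey, apiUrl, maxRetries, additionalKwargs) are inferred from the descriptions above; the model ID and the heroku_tool entry are placeholder values, not canonical ones.

```typescript
// Sketch only: field names, import path, model ID, and tool shape are assumptions
// based on the parameter descriptions above, not a verified API surface.
import { HerokuMiaAgent } from "heroku-langchain";

const agent = new HerokuMiaAgent({
  // Model ID; falls back to process.env.INFERENCE_MODEL_ID when omitted.
  model: "claude-3-5-sonnet-latest",

  // Sampling controls for the agent's underlying LLM.
  temperature: 0.2,
  topP: 0.9,
  maxTokensPerInferenceRequest: 1024,
  stop: ["\nObservation:"],

  // heroku_tool / mcp tools the agent is allowed to use (illustrative shape).
  tools: [
    {
      type: "heroku_tool",
      name: "dyno_run_command",
      runtime_params: { target_app_name: "my-app" },
    },
  ],

  // Connection settings; apiKey falls back to process.env.HEROKU_API_KEY,
  // and apiUrl defaults to inference.heroku.com when omitted.
  apiKey: process.env.HEROKU_API_KEY,
  maxRetries: 2,
  timeout: 60_000,

  // Escape hatch for Heroku-specific agent parameters not modeled above.
  additionalKwargs: {},
});
```

Only the fields you want to override need to be passed; everything here is optional, so `new HerokuMiaAgent({})` with the environment variables set would be the smallest viable configuration under these assumptions.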