Heroku LangChain - v0.2.1

    Class HerokuMiaAgent

    HerokuMiaAgent - Heroku Managed Inference Agent Integration

    A LangChain-compatible chat model that interfaces with Heroku's Managed Inference Agent API. This class provides access to intelligent agents that can execute tools and perform complex multi-step reasoning tasks. Agents have access to Heroku-specific tools like app management, database operations, and can integrate with external services via MCP (Model Context Protocol).

    Unlike the basic HerokuMia model, agents are designed for autonomous task execution with built-in tool calling capabilities and advanced reasoning patterns.

    import { HerokuMiaAgent } from "heroku-langchain";
    import { HumanMessage } from "@langchain/core/messages";

    // Basic agent usage
    const agent = new HerokuMiaAgent({
      model: "claude-3-7-sonnet",
      temperature: 0.3,
      tools: [
        {
          type: "heroku_tool",
          name: "dyno_run_command",
          runtime_params: {
            target_app_name: "my-app",
            tool_params: {
              cmd: "date",
              description: "Gets the current date and time on the server.",
              parameters: { type: "object", properties: {} },
            },
          },
        },
      ],
      apiKey: process.env.INFERENCE_KEY,
      apiUrl: process.env.INFERENCE_URL,
    });

    const response = await agent.invoke([
      new HumanMessage("Deploy my Node.js application to Heroku"),
    ]);
    // Agent with MCP tools
    const agentWithMCP = new HerokuMiaAgent({
      model: "claude-3-7-sonnet",
      tools: [
        {
          type: "mcp",
          name: "mcp/read_file",
          description: "Read file contents via MCP",
        },
        {
          type: "heroku_tool",
          name: "scale_dyno",
          runtime_params: {
            target_app_name: "production-app",
          },
        },
      ],
    });

    const result = await agentWithMCP.invoke([
      new HumanMessage("Read my package.json and scale the app based on the dependencies"),
    ]);
    // Streaming agent responses to see tool execution in real-time
    const stream = await agent.stream([
      new HumanMessage("Check the status of all my Heroku apps and restart any that are down"),
    ]);

    for await (const chunk of stream) {
      if (chunk.response_metadata?.tool_calls) {
        console.log("Agent is executing:", chunk.response_metadata.tool_calls);
      }
      if (chunk.content) {
        process.stdout.write(chunk.content);
      }
    }


    Constructors

    • Creates a new HerokuMiaAgent instance.

      Parameters

      Returns HerokuMiaAgent

      Throws an error when a model ID is not provided and the INFERENCE_MODEL_ID environment variable is not set.

      // Basic usage with defaults
      const agent = new HerokuMiaAgent();

      // With custom configuration
      const agent = new HerokuMiaAgent({
        model: "claude-3-7-sonnet",
        temperature: 0.3,
        maxTokensPerRequest: 2000,
        tools: [
          {
            type: "heroku_tool",
            name: "dyno_run_command",
            runtime_params: {
              target_app_name: "my-app",
              tool_params: {
                cmd: "date",
                description: "Gets the current date and time on the server.",
                parameters: { type: "object", properties: {} },
              },
            },
          },
        ],
        apiKey: "your-api-key",
        apiUrl: "https://us.inference.heroku.com",
      });

    Properties

    model: string
    temperature?: number
    maxTokensPerRequest?: number
    stop?: string[]
    topP?: number
    tools?: any[]
    apiKey?: string
    apiUrl?: string
    maxRetries?: number
    timeout?: number
    streaming?: boolean
    streamUsage?: boolean
    additionalKwargs?: Record<string, any>

    Methods

    • Returns the LangChain identifier for this agent class.

      Returns string

      The string "HerokuMiaAgent"

    • Returns the LLM type identifier for this agent.

      Returns string

      The string "HerokuMiaAgent"

    • Internal

      Get the parameters used to invoke the agent.

      This method combines constructor parameters with runtime options to create the final request parameters for the Heroku Agent API. Runtime options take precedence over constructor parameters.

      Parameters

      Returns Omit<HerokuMiaAgentFields, (keyof BaseLanguageModelParams) | "disableStreaming"> & {
          [key: string]: any;
      }

      Combined parameters for the agent API request
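      The precedence rule above (runtime options override constructor parameters) can be sketched with a plain object merge. This is an illustrative helper with assumed names, not the library's actual implementation:

      ```typescript
      // Sketch of constructor/runtime parameter merging. `mergeInvocationParams`
      // is a hypothetical name used for illustration only.
      type AgentParams = Record<string, unknown>;

      function mergeInvocationParams(
        constructorParams: AgentParams,
        runtimeOptions: AgentParams,
      ): AgentParams {
        // Later spreads win, so runtime keys shadow constructor keys
        // while untouched constructor defaults carry through.
        return { ...constructorParams, ...runtimeOptions };
      }
      ```

      For example, a `temperature` passed at call time would replace the one given to the constructor, while other constructor settings remain in effect.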

    • Parameters

      • messages: BaseMessage[]
      • options: Omit<
            HerokuMiaAgentCallOptions,
            | "configurable"
            | "recursionLimit"
            | "runName"
            | "tags"
            | "metadata"
            | "callbacks"
            | "runId",
        >
      • Optional runManager: CallbackManagerForLLMRun

      Returns Promise<ChatResult>

    • Parameters

      • messages: BaseMessage[]
      • options: Omit<
            HerokuMiaAgentCallOptions,
            | "configurable"
            | "recursionLimit"
            | "runName"
            | "tags"
            | "metadata"
            | "callbacks"
            | "runId",
        >
      • Optional runManager: CallbackManagerForLLMRun

      Returns AsyncGenerator<AIMessageChunk>