Heroku LangChain.js - v0.4.2

    Class HerokuAgent

    HerokuAgent - Heroku Managed Inference Agent Integration

    A LangChain-compatible chat model that interfaces with Heroku's Managed Inference Agent API. This class provides access to intelligent agents that can execute tools and perform complex multi-step reasoning tasks. Agents have access to Heroku-specific tools, such as app management and database operations, and can integrate with external services via MCP (Model Context Protocol).

    Unlike the basic ChatHeroku model, agents are designed for autonomous task execution with built-in tool calling capabilities and advanced reasoning patterns.

    import { HerokuAgent } from "heroku-langchain";
    import { HumanMessage } from "@langchain/core/messages";

    // Basic agent usage
    const agent = new HerokuAgent({
      model: "gpt-oss-120b",
      temperature: 0.3,
      tools: [
        {
          type: "heroku_tool",
          name: "dyno_run_command",
          runtime_params: {
            target_app_name: "my-app",
            tool_params: {
              cmd: "date",
              description: "Gets the current date and time on the server.",
              parameters: { type: "object", properties: {} },
            },
          },
        },
      ],
      apiKey: process.env.INFERENCE_KEY,
      apiUrl: process.env.INFERENCE_URL,
    });

    const response = await agent.invoke([
      new HumanMessage("Deploy my Node.js application to Heroku"),
    ]);
    // Agent with MCP tools
    const agentWithMCP = new HerokuAgent({
      model: "gpt-oss-120b",
      tools: [
        {
          type: "mcp",
          name: "mcp/read_file",
          description: "Read file contents via MCP",
        },
        {
          type: "heroku_tool",
          name: "scale_dyno",
          runtime_params: {
            target_app_name: "production-app",
          },
        },
      ],
    });

    const result = await agentWithMCP.invoke([
      new HumanMessage("Read my package.json and scale the app based on the dependencies"),
    ]);
    // Streaming agent responses to see tool execution in real time
    const stream = await agent.stream([
      new HumanMessage("Check the status of all my Heroku apps and restart any that are down"),
    ]);

    for await (const chunk of stream) {
      if (chunk.response_metadata?.tool_calls) {
        console.log("Agent is executing:", chunk.response_metadata.tool_calls);
      }
      if (chunk.content) {
        process.stdout.write(chunk.content);
      }
    }


    Constructors

    • Creates a new HerokuAgent instance.

      Parameters

      • Optional fields: HerokuAgentFields

        Optional configuration options for the Heroku Mia Agent

      Returns HerokuAgent

      Throws when a model ID is not provided and the INFERENCE_MODEL_ID environment variable is not set

      // Basic usage with defaults
      const agent = new HerokuAgent();

      // With custom configuration
      const configuredAgent = new HerokuAgent({
        model: "gpt-oss-120b",
        temperature: 0.3,
        maxTokensPerRequest: 2000,
        tools: [
          {
            type: "heroku_tool",
            name: "dyno_run_command",
            runtime_params: {
              target_app_name: "my-app",
              tool_params: {
                cmd: "date",
                description: "Gets the current date and time on the server.",
                parameters: { type: "object", properties: {} },
              },
            },
          },
        ],
        apiKey: "your-api-key",
        apiUrl: "https://us.inference.heroku.com",
      });

    Properties

    maxTokensPerRequest?: number
    tools?: any[]
    streamUsage?: boolean
    toolResultQueues: Map<string, any[]> = ...
    _localNoopTools: StructuredTool<ToolInputSchemaBase, any, any, any>[] = []
    model: string
    temperature?: number
    stop?: string[]
    topP?: number
    apiKey?: string
    apiUrl?: string
    maxRetries?: number
    timeout?: number
    streaming?: boolean
    additionalKwargs?: Record<string, any>

    Methods

    • Returns the LangChain identifier for this agent class.

      Returns string

      The string "HerokuAgent"

    • Returns the LLM type identifier for this agent.

      Returns string

      The string "HerokuAgent"

    • Internal

      Get the parameters used to invoke the agent.

      This method combines constructor parameters with runtime options to create the final request parameters for the Heroku Agent API. Runtime options take precedence over constructor parameters.

      Parameters

      • Optional options: Partial<HerokuAgentCallOptions>

        Optional runtime parameters that override constructor defaults

      Returns Omit<HerokuAgentFields, (keyof BaseLanguageModelParams) | "disableStreaming"> & {
          [key: string]: any;
      }

      Combined parameters for the agent API request
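The precedence rule described above can be sketched as a shallow merge in which runtime options win. This is an illustrative sketch, not the library's internal implementation; the field names are examples only.

```typescript
// Hypothetical sketch: runtime options override constructor defaults.
type AgentParams = Record<string, any>;

function mergeInvocationParams(
  constructorParams: AgentParams,
  runtimeOptions: AgentParams = {},
): AgentParams {
  // Later spreads win, so runtime options take precedence.
  return { ...constructorParams, ...runtimeOptions };
}

const merged = mergeInvocationParams(
  { model: "gpt-oss-120b", temperature: 0.3 },
  { temperature: 0.7 },
);
// merged.temperature === 0.7, merged.model === "gpt-oss-120b"
```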

    • Parameters

      • messages: BaseMessage[]
      • options: Omit<
            HerokuAgentCallOptions,
            | "configurable"
            | "recursionLimit"
            | "runName"
            | "tags"
            | "metadata"
            | "callbacks"
            | "runId",
        >
      • Optional runManager: CallbackManagerForLLMRun

      Returns Promise<ChatResult>

    • Parameters

      • messages: BaseMessage[]
      • options: Omit<
            HerokuAgentCallOptions,
            | "configurable"
            | "recursionLimit"
            | "runName"
            | "tags"
            | "metadata"
            | "callbacks"
            | "runId",
        >
      • Optional runManager: CallbackManagerForLLMRun

      Returns AsyncGenerator<AIMessageChunk>

    • Create a version of this agent that returns structured output by instructing the model to produce JSON matching the schema (jsonMode-style).

      Type Parameters

      • RunOutput extends Record<string, any> = Record<string, any>
      • IncludeRaw extends boolean = false

      Parameters

      • schemaOrParams:
            | Record<string, any>
            | ZodType<RunOutput, ZodTypeDef, RunOutput>
            | {
                schema: Record<string, any> | ZodType<RunOutput, ZodTypeDef, RunOutput>;
                name?: string;
                description?: string;
                method?: "functionCalling" | "jsonMode";
                includeRaw?: IncludeRaw;
            }
      • Optional maybeOptions: {
            name?: string;
            description?: string;
            method?: "functionCalling" | "jsonMode";
            includeRaw?: IncludeRaw;
        }

      Returns any
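Since `schemaOrParams` accepts a plain JSON Schema object as well as a Zod type, a schema for jsonMode-style structured output can be defined without extra dependencies. The schema below is invented for illustration; its field names are not part of the library's API.

```typescript
// Illustrative JSON Schema for use with withStructuredOutput.
// All property names here are hypothetical examples.
const deploymentReportSchema = {
  type: "object",
  properties: {
    appName: { type: "string", description: "Name of the Heroku app" },
    healthy: { type: "boolean", description: "Whether the app is up" },
  },
  required: ["appName", "healthy"],
};

// It would then be passed as, e.g.:
// const structured = agent.withStructuredOutput(deploymentReportSchema, {
//   name: "deployment_report",
//   method: "jsonMode",
// });
```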

    • Get the locally bound no-op LangChain tools mirroring server tools.

      Returns Tool<any>[]

    • Push a server-provided tool result into the local queue for a tool name

      Parameters

      • toolName: undefined | string
      • result: any

      Returns void

    • Consume the oldest queued server tool result for a tool name

      Parameters

      • toolName: string

      Returns any
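The two queue methods above describe per-tool FIFO queues backed by the `toolResultQueues` map. A minimal standalone sketch of that behavior, assuming the Map<string, any[]> shape shown in the Properties section:

```typescript
// Hedged sketch of per-tool FIFO result queues (not the class's code).
const toolResultQueues = new Map<string, any[]>();

function pushServerToolResult(toolName: string | undefined, result: any): void {
  if (!toolName) return; // results without a tool name are dropped
  const queue = toolResultQueues.get(toolName) ?? [];
  queue.push(result);
  toolResultQueues.set(toolName, queue);
}

function consumeServerToolResult(toolName: string): any {
  // shift() yields the oldest queued result, or undefined when empty
  return toolResultQueues.get(toolName)?.shift();
}
```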

    • Remove undefined keys to keep payloads clean

      Type Parameters

      • T extends Record<string, any>

      Parameters

      • obj: T

      Returns T

    • Standard headers for Heroku API calls

      Parameters

      • apiKey: string

      Returns Record<string, string>
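The exact header set is internal to the library; a typical shape for a bearer-token JSON API, offered here only as an assumption, looks like:

```typescript
// Hedged sketch: the library's actual headers may include more fields.
function buildHeaders(apiKey: string): Record<string, string> {
  return {
    Authorization: `Bearer ${apiKey}`,
    "Content-Type": "application/json",
  };
}
```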

    • POST JSON with retries, timeout, and consistent error wrapping.

      Parameters

      • url: string
      • apiKey: string
      • body: Record<string, any>

      Returns Promise<Response>
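A hedged sketch of such a method, assuming an AbortController-based timeout, simple linear backoff, and an injectable fetch for testability; the library's actual retry policy and error wrapping may differ.

```typescript
// Sketch only: retrying JSON POST with timeout (not the library's code).
async function postJsonWithRetries(
  url: string,
  apiKey: string,
  body: Record<string, any>,
  {
    maxRetries = 2,
    timeoutMs = 30_000,
    fetchFn = fetch,
  }: { maxRetries?: number; timeoutMs?: number; fetchFn?: typeof fetch } = {},
): Promise<Response> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const controller = new AbortController();
    const timer = setTimeout(() => controller.abort(), timeoutMs);
    try {
      const response = await fetchFn(url, {
        method: "POST",
        headers: {
          Authorization: `Bearer ${apiKey}`,
          "Content-Type": "application/json",
        },
        body: JSON.stringify(body),
        signal: controller.signal,
      });
      if (response.ok) return response;
      lastError = new Error(`HTTP ${response.status}`);
    } catch (err) {
      lastError = err;
    } finally {
      clearTimeout(timer);
    }
    // Simple backoff between attempts (100ms, 200ms, ...).
    await new Promise((r) => setTimeout(r, 100 * (attempt + 1)));
  }
  throw lastError;
}
```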