Creates a new HerokuMia instance.
Optional fields: HerokuMiaFields
Optional configuration options for the Heroku Mia model
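For illustration, a minimal construction sketch; the import path and the specific field values shown are assumptions based on the optional fields listed below, not a definitive reference:

import { HerokuMia } from "heroku-langchain"; // import path is an assumption

// All fields are optional; apiKey and apiUrl are assumed to fall back to
// environment configuration when omitted.
const model = new HerokuMia({
  model: "claude-3-5-sonnet", // hypothetical model identifier
  temperature: 0.2,
  maxTokens: 1024,
});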
Protected model
Protected Optional temperature
Protected Optional maxTokens
Protected Optional stop
Protected Optional topP
Protected Optional apiKey
Protected Optional apiUrl
Protected Optional maxRetries
Protected Optional timeout
Protected Optional streaming
Protected Optional additionalKwargs
Static lc_name
Returns the LangChain identifier for this model class.
The string "HerokuMia"
Returns the LLM type identifier for this model.
The string "HerokuMia"
Bind tools to this chat model for function calling capabilities.
This method creates a new instance of HerokuMia with the specified tools pre-bound, enabling the model to call functions during conversations. The tools will be automatically included in all subsequent calls to the model.
A list of StructuredTool instances or tool definitions to bind to the model
A new HerokuMia instance with the tools bound and tool_choice set to "auto"
import { DynamicStructuredTool } from "@langchain/core/tools";
import { HumanMessage } from "@langchain/core/messages";
import { z } from "zod";

// Define a simple calculator tool
const calculatorTool = new DynamicStructuredTool({
  name: "calculator",
  description: "Perform basic arithmetic operations",
  schema: z.object({
    operation: z.enum(["add", "subtract", "multiply", "divide"]),
    a: z.number(),
    b: z.number(),
  }),
  func: async ({ operation, a, b }) => {
    switch (operation) {
      case "add": return `${a + b}`;
      case "subtract": return `${a - b}`;
      case "multiply": return `${a * b}`;
      case "divide": return `${a / b}`;
    }
  },
});

// "model" is an existing HerokuMia instance
const modelWithTools = model.bindTools([calculatorTool]);
const result = await modelWithTools.invoke([
  new HumanMessage("What is 15 * 7?"),
]);
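The bound model does not execute the tool itself; it returns a tool call on the response message. A hedged follow-up sketch, assuming the standard LangChain AIMessage.tool_calls field:

// Inspect the tool call the model requested
const call = result.tool_calls?.[0];
if (call) {
  // Run the matching tool manually with the model-provided arguments
  const toolOutput = await calculatorTool.invoke(call.args);
  console.log(toolOutput); // "105"
}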
Internal
Get the parameters used to invoke the model.
This method combines constructor parameters with runtime options to create the final request parameters for the Heroku API. Runtime options take precedence over constructor parameters.
Optional options: Partial<HerokuMiaCallOptions>
Optional runtime parameters that override constructor defaults
Combined parameters for the API request
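For illustration, a hedged sketch of the precedence rule, assuming temperature is accepted both in the constructor fields and in HerokuMiaCallOptions:

import { HumanMessage } from "@langchain/core/messages";
import { HerokuMia } from "heroku-langchain"; // import path is an assumption

const model = new HerokuMia({ temperature: 0.7 }); // constructor default
// The call-time option below overrides the constructor value for this request only.
const reply = await model.invoke(
  [new HumanMessage("Summarize LangChain in one line.")],
  { temperature: 0 }
);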
Create a version of this chat model that returns structured output.
This method enables the model to return responses that conform to a specific schema, using function calling under the hood. The model is instructed to call a special "extraction" function with the structured data as arguments.
The type of the structured output
The schema for the structured output (Zod schema or JSON schema)
Optional config: StructuredOutputMethodOptions<false>
Configuration options for structured output
A new runnable that returns structured output
import { HumanMessage } from "@langchain/core/messages";
import { z } from "zod";

// Define the schema for extracted data
const personSchema = z.object({
  name: z.string().describe("The person's full name"),
  age: z.number().describe("The person's age in years"),
  occupation: z.string().describe("The person's job or profession"),
  skills: z.array(z.string()).describe("List of skills or expertise"),
});

// Create a model that returns structured output ("model" is an existing HerokuMia instance)
const extractionModel = model.withStructuredOutput(personSchema, {
  name: "extract_person_info",
  description: "Extract structured information about a person",
});

// Use the model
const result = await extractionModel.invoke([
  new HumanMessage("Sarah Johnson is a 28-year-old data scientist who specializes in machine learning, Python, and statistical analysis."),
]);
console.log(result);
// Output: {
//   name: "Sarah Johnson",
//   age: 28,
//   occupation: "data scientist",
//   skills: ["machine learning", "Python", "statistical analysis"]
// }

// With the includeRaw option, get both raw and parsed responses
const extractionModelWithRaw = model.withStructuredOutput(personSchema, {
  includeRaw: true,
});
const resultWithRaw = await extractionModelWithRaw.invoke([
  new HumanMessage("John is a 35-year-old teacher."),
]);
console.log(resultWithRaw.parsed); // { name: "John", age: 35, occupation: "teacher", skills: [] }
console.log(resultWithRaw.raw); // Original AIMessage with tool calls
Create a version of this chat model that returns structured output.
This overload applies when config.includeRaw is true: instead of returning only the parsed object, the runnable returns both the raw model message and the parsed structured output. Like the overload above, it uses function calling under the hood, instructing the model to call a special "extraction" function with the structured data as arguments. See the includeRaw portion of the example above.
The type of the structured output
The schema for the structured output (Zod schema or JSON schema)
Optional config: StructuredOutputMethodOptions<true>
Configuration options for structured output
A new runnable that returns structured output
Private isZodSchema
Helper method to check if input is a Zod schema
Private isStructuredOutputMethodParams
Helper method to check if input is StructuredOutputMethodParams
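A hedged sketch of how such checks are often implemented; the helper names and exact logic below are illustrative, not taken from the class itself:

// Illustrative only: hypothetical helpers, not the class's private implementations.
function looksLikeZodSchema(input: unknown): boolean {
  // Zod schemas expose parse() and safeParse() methods.
  return (
    typeof input === "object" &&
    input !== null &&
    typeof (input as { parse?: unknown }).parse === "function" &&
    typeof (input as { safeParse?: unknown }).safeParse === "function"
  );
}

function looksLikeStructuredOutputMethodParams(input: unknown): boolean {
  // Assumption: StructuredOutputMethodParams-style objects carry a `schema`
  // property alongside optional name/description fields.
  return typeof input === "object" && input !== null && "schema" in input;
}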
HerokuMia - Heroku Managed Inference API LangChain Integration
A LangChain-compatible chat model that interfaces with Heroku's Managed Inference API (Mia). This class provides access to various language models hosted on Heroku's infrastructure, including support for function calling, structured outputs, and streaming responses.
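Basic usage, as a minimal sketch (the import path is an assumption, and credentials are assumed to come from the environment):

import { HumanMessage } from "@langchain/core/messages";
import { HerokuMia } from "heroku-langchain"; // import path is an assumption

const model = new HerokuMia();

// Simple invocation
const reply = await model.invoke([new HumanMessage("Hello!")]);
console.log(reply.content);

// Streaming, assuming the standard LangChain .stream() interface
const stream = await model.stream([new HumanMessage("Tell me a short joke.")]);
for await (const chunk of stream) {
  process.stdout.write(String(chunk.content));
}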