Within AgentKit, models are adapters that wrap a specific model version (ex. gpt-3.5) from a given provider (ex. OpenAI, Anthropic).

Each Agent can select its own model, and a Network can set a default model.

import { openai, anthropic, gemini } from "@inngest/agent-kit";

How to use a model

Create a model instance

Each model helper first looks for the API key in the provider's environment variable (listed below). The API key can also be passed explicitly with the apiKey option of the model helper.
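
The sketch below shows both paths. The options-object call shape used for apiKey is an assumption, so verify it against the model helper's signature in your AgentKit version; MY_OPENAI_KEY is a hypothetical variable name used only for illustration.

import { openai } from "@inngest/agent-kit";

// Reads the API key from the OPENAI_API_KEY environment variable.
const model = openai("gpt-4o");

// Sketch: providing the key explicitly with the apiKey option
// (assumes the helper also accepts an options object).
const modelWithExplicitKey = openai({
  model: "gpt-4o",
  apiKey: process.env.MY_OPENAI_KEY, // hypothetical variable name
});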

Providing a model instance to an Agent

import { createAgent, openai } from '@inngest/agent-kit';

const supportAgent = createAgent({
  model: openai("gpt-3.5-turbo"),
  name: "Customer support specialist",
  system: "You are an customer support specialist...",
  tools: [listChargesTool],
});

Providing a model instance to a Network

The provided defaultModel will be used for all Agents without a model specified. It will also be used by the “Default Routing Agent” if enabled.

import { createNetwork, openai } from '@inngest/agent-kit';

const network = createNetwork({
  agents: [supportAgent],
  defaultModel: openai('gpt-4o'),
});
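
As a sketch of the fallback behavior described above, the variation below adds a second Agent that declares no model of its own; it will use the Network's defaultModel, while supportAgent keeps the model it declared. The summaryAgent shown here is hypothetical and only for illustration.

import { createAgent, createNetwork, openai } from '@inngest/agent-kit';

// Hypothetical agent with no model option; it falls back to the Network's defaultModel.
const summaryAgent = createAgent({
  name: "Conversation summarizer",
  system: "You summarize support conversations...",
});

const network = createNetwork({
  agents: [supportAgent, summaryAgent],
  // Used by summaryAgent (and by the Default Routing Agent, if enabled);
  // supportAgent keeps its own gpt-3.5-turbo model.
  defaultModel: openai('gpt-4o'),
});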

List of supported models

For a full list of supported models, check the models directory in the AgentKit GitHub repository.

Environment variable used for each model provider

  • OpenAI: OPENAI_API_KEY
  • Anthropic: ANTHROPIC_API_KEY
  • Gemini: GEMINI_API_KEY
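
As a sketch (plain Node.js, not part of the AgentKit API), you can fail fast when the expected variable is missing instead of waiting for a provider error at request time:

import { openai } from "@inngest/agent-kit";

// Check the expected environment variable before constructing the model helper.
if (!process.env.OPENAI_API_KEY) {
  throw new Error("OPENAI_API_KEY is not set");
}

const model = openai("gpt-4o");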

Contribution

Is there a model that you’d like to see included in AgentKit? Open an issue, create a pull request, or chat with the team on Discord in the #ai channel.

Contribute on GitHub

Fork, clone, and open a pull request.