Monitor Anthropic’s official TypeScript SDK by wrapping the client once with lunary/anthropic.
1

Install both packages

npm install lunary @anthropic-ai/sdk

JavaScript: Learn how to set up the JS SDK.
2

Monitor Anthropic

Wrap the Anthropic client once, then keep using the Anthropic SDK as usual.
import Anthropic from "@anthropic-ai/sdk"
import { monitorAnthropic } from "lunary/anthropic"

const anthropic = monitorAnthropic(
  new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY })
)
3

Supported surface

The monitored client supports the current Anthropic Messages API surface:
  • messages.create(...)
  • Raw streaming via messages.create({ stream: true })
  • messages.parse(...)
  • messages.stream(...)
  • beta.messages.create(...)
  • beta.messages.parse(...)
  • beta.messages.stream(...)
  • beta.messages.toolRunner(...)
Tool-runner loops appear in Lunary as one LLM run per underlying Anthropic request. Raw streams and helper streams keep the normal Anthropic SDK lifecycle, and Lunary preserves Anthropic content blocks in the run details, including thinking, redacted_thinking, tool_use, tool_result, server_tool_use, and web_search_tool_result, along with token usage and cached input tokens.
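The raw-stream and content-block behavior described above can be sketched without calling the API. The event and block shapes below mirror Anthropic's documented streaming events (`content_block_delta` carrying a `text_delta`) and content blocks (`tool_use`); the helper names `collectText` and `toolUses` are illustrative, not part of either SDK.

```typescript
// Loose shapes mirroring Anthropic streaming events and content blocks.
type StreamEvent = { type: string; delta?: { type: string; text?: string } }
type ContentBlock = { type: string; text?: string; id?: string; name?: string; input?: unknown }

// Accumulate text deltas the way a consumer of a monitored
// `messages.create({ stream: true })` raw stream would.
async function collectText(events: AsyncIterable<StreamEvent>): Promise<string> {
  let text = ""
  for await (const event of events) {
    if (event.type === "content_block_delta" && event.delta?.type === "text_delta") {
      text += event.delta.text ?? ""
    }
  }
  return text
}

// Pick out the tool_use blocks that Lunary preserves in the run details.
function toolUses(content: ContentBlock[]): ContentBlock[] {
  return content.filter((block) => block.type === "tool_use")
}
```

Both helpers operate on plain objects, so the same logic applies whether the events come from a raw stream or from a final message's `content` array.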
4

Typical usage

Structured outputs with Lunary context:
const parsed = await anthropic.messages.parse({
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 256,
  messages: [
    {
      role: "user",
      content: "Return JSON with `answer` and `confidence` for 2 + 2.",
    },
  ],
  output_config: {
    format: {
      type: "json_schema",
      schema: {
        type: "object",
        additionalProperties: false,
        properties: {
          answer: { type: "number" },
          confidence: { type: "string" },
        },
        required: ["answer", "confidence"],
      },
    },
  },
  tags: ["support", "structured-output"],
  userId: "user_123",
  userProps: { plan: "pro" },
  metadata: { user_id: "user_123" },
})
Helper streams and beta server tools:
const stream = anthropic.messages.stream({
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 256,
  messages: [{ role: "user", content: "Write one sentence about Lunary tracing." }],
})

const finalMessage = await stream.finalMessage()

const searchStream = anthropic.beta.messages.stream({
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 512,
  messages: [{ role: "user", content: "Use web search and summarize the latest Bun release." }],
  tools: [{ type: "web_search_20250305", name: "web_search" }],
})

const searchMessage = await searchStream.finalMessage()
Beta tool-runner loops:
import { betaTool } from "@anthropic-ai/sdk/helpers/beta/json-schema"

const toolResult = await anthropic.beta.messages.toolRunner({
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 256,
  max_iterations: 3,
  messages: [{ role: "user", content: "Call the weather tool and summarize the result." }],
  tools: [
    betaTool({
      name: "getWeather",
      description: "Returns a canned weather response for the requested city.",
      inputSchema: {
        type: "object",
        additionalProperties: false,
        properties: {
          city: { type: "string" },
        },
        required: ["city"],
      },
      run: ({ city }) => `The weather in ${city} is sunny and 20C.`,
    }),
  ],
})
Anthropic validates the provider-side metadata object. Use Anthropic-supported fields such as user_id there, and use tags, userId, and userProps for Lunary-only context. For Anthropic beta features that require betas: [...], pass the beta headers exactly as Anthropic documents.
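The split between Lunary-only context and Anthropic's provider-side `metadata` can be sketched as a small helper. The field names come from the examples above; `splitParams` itself is a hypothetical illustration of the separation, assuming the wrapper strips Lunary-only fields before the request reaches Anthropic, and is not part of either SDK.

```typescript
// Fields that carry Lunary-only context and never reach Anthropic.
const LUNARY_ONLY = ["tags", "userId", "userProps"] as const

// Partition call params into Lunary context and provider-side params.
// Anthropic-validated fields like `metadata.user_id` stay provider-side.
function splitParams(params: Record<string, unknown>) {
  const lunary: Record<string, unknown> = {}
  const provider: Record<string, unknown> = {}
  for (const [key, value] of Object.entries(params)) {
    if ((LUNARY_ONLY as readonly string[]).includes(key)) lunary[key] = value
    else provider[key] = value
  }
  return { lunary, provider }
}
```

Keeping this split in mind avoids the common failure mode of putting free-form keys into `metadata`, which Anthropic rejects, when they belong in `tags` or `userProps`.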