X Enterprises
fastify-x-ai

complete

Simple text completion — accepts a prompt string and returns the generated text directly.

complete

The simplest generation method. Takes a prompt string, delegates to generate(), and returns result.text directly — no result object to destructure.
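
Conceptually, the delegation looks like the sketch below. This is not the plugin source — generate() is stubbed here purely so the shape of the wrapper (validate, forward options, unwrap result.text) is visible and runnable.

```javascript
// Illustrative sketch only, not the plugin source. The real generate()
// calls the configured provider; this stub returns a result object of
// the documented shape ({ text, ... }).
async function generate({ prompt, ...options }) {
  return { text: `stubbed reply to: ${prompt}`, usage: { totalTokens: 0 } };
}

// complete() validates the prompt, forwards everything to generate(),
// and unwraps result.text so callers get a plain string back.
async function complete(prompt, options = {}) {
  if (prompt === undefined || prompt === null || prompt === "") {
    throw new Error("xAI complete: 'prompt' is required");
  }
  const result = await generate({ prompt, ...options });
  return result.text;
}
```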

Signature

fastify.xai.complete(prompt: string, options?: CompleteOptions): Promise<string>

interface CompleteOptions {
  provider?: "openai" | "anthropic" | "google"
  model?: string
  maxTokens?: number
  temperature?: number
  system?: string
}

Params

Name                 Type    Required  Description
prompt               string  Yes       The text prompt
options.provider     string  No        Override default provider
options.model        string  No        Override default model
options.maxTokens    number  No        Override defaultMaxTokens
options.temperature  number  No        Override defaultTemperature
options.system       string  No        System message

Returns

Promise<string> — the generated text.

Throws

Error                                When
xAI complete: 'prompt' is required   prompt is empty, null, or undefined
xAI: Provider '…' not configured     Specified or default provider has no API key
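
A try/catch pattern for the validation error above can be sketched as follows. The stub mirrors the documented error message so the snippet runs without the plugin installed; in an application you would call the decorated fastify.xai instead of `stub`.

```javascript
// Stand-in that reproduces the documented validation error, so the
// error-handling pattern is runnable standalone. Not the plugin source.
const stub = {
  async complete(prompt) {
    if (prompt === undefined || prompt === null || prompt === "") {
      throw new Error("xAI complete: 'prompt' is required");
    }
    return "generated text";
  },
};

(async () => {
  try {
    await stub.complete(""); // empty prompt triggers the validation error
  } catch (err) {
    console.error(err.message); // "xAI complete: 'prompt' is required"
  }
})();
```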

Examples

Basic usage

const text = await fastify.xai.complete("Write a haiku about the ocean.");
console.log(text);
// "Waves crash endlessly / Salt and foam kiss the warm shore / Peace in every tide"
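
With a system message

The `system` and `temperature` options from CompleteOptions are not exercised above; here is a sketch of passing them. The `fastify` object below is a stand-in so the snippet runs on its own — in an application, `fastify.xai` comes from registering the plugin, and the reply text comes from the model rather than this stub.

```javascript
// Stand-in for the decorated instance (illustration only); a real app
// gets fastify.xai by registering @xenterprises/fastify-x-ai.
const fastify = {
  xai: {
    async complete(prompt, options = {}) {
      return `(${options.system ?? "no system"}) ${prompt}`;
    },
  },
};

(async () => {
  // `system` steers tone; a low `temperature` keeps output more deterministic.
  const text = await fastify.xai.complete("Explain the event loop in one sentence.", {
    system: "You are a terse Node.js tutor.",
    temperature: 0.2,
    maxTokens: 120,
  });
  console.log(text);
})();
```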

With provider and model override

fastify.post("/summarize", async (request, reply) => {
  const { article } = request.body;

  const summary = await fastify.xai.complete(
    `Summarize the following article in three bullet points:\n\n${article}`,
    {
      provider: "anthropic",
      model: "claude-sonnet-4-20250514",
      maxTokens: 300,
    },
  );

  return { summary };
});

See Also

  • generate — What complete() calls internally; use when you need token usage or tool results
  • chat — Multi-turn conversation with message history

AI Context

package: "@xenterprises/fastify-x-ai"
method: fastify.xai.complete(prompt, options?)
use-when: Simplest text generation — takes a prompt string and returns just the text string
params: prompt (required, string), provider, model, maxTokens, temperature, system
returns: string (text only)
Copyright © 2026