# fastify-x-ai
Vercel AI SDK plugin for unified access to AI providers — text, streaming, embeddings, and structured output.
## Installation

```bash
npm install @xenterprises/fastify-x-ai
```
## Quick Start

```js
import Fastify from 'fastify'
import xAI from '@xenterprises/fastify-x-ai'

const fastify = Fastify()

await fastify.register(xAI, {
  provider: 'openai',
  apiKey: process.env.OPENAI_API_KEY,
  defaultModel: 'gpt-4o-mini',
})
```
## API

### Text Generation

```js
const text = await fastify.xai.generate({
  prompt: 'Summarize this document: ...',
  model: 'gpt-4o', // optional per-call override of defaultModel
  maxTokens: 1000,
})
```
### Streaming

```js
fastify.post('/chat', async (request, reply) => {
  const stream = await fastify.xai.stream({
    messages: request.body.messages,
  })
  return reply.send(stream)
})
```
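The stream is sent to the client as server-sent events. A minimal sketch of how a browser client might split the raw SSE text into `data:` payloads; the helper name, the `data:` framing, and the `[DONE]` sentinel are assumptions about the wire format, not documented plugin behavior:

```javascript
// Hypothetical client-side helper: extract the `data:` payloads from a
// raw SSE chunk, dropping the assumed `[DONE]` end-of-stream sentinel.
function parseSSE(raw) {
  return raw
    .split('\n')
    .filter((line) => line.startsWith('data: '))
    .map((line) => line.slice('data: '.length))
    .filter((data) => data !== '[DONE]')
}
```

A client would call this on each chunk read from the `/chat` response body and append the payloads to the visible message.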
### Structured Output

```js
import { z } from 'zod'

const result = await fastify.xai.generateObject({
  prompt: 'Extract the address from: ...',
  schema: z.object({
    street: z.string(),
    city: z.string(),
    zip: z.string(),
  }),
})
```
### Embeddings

```js
const embedding = await fastify.xai.embed('Hello world')
// Returns: number[]
```
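Embedding vectors are typically compared with cosine similarity, e.g. for semantic search over documents. A minimal sketch in plain JavaScript; this helper is not part of the plugin API:

```javascript
// Cosine similarity between two equal-length embedding vectors.
// Returns a value in [-1, 1]; closer to 1 means more similar.
function cosineSimilarity(a, b) {
  let dot = 0
  let normA = 0
  let normB = 0
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i]
    normA += a[i] * a[i]
    normB += b[i] * b[i]
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB))
}
```

For example, `cosineSimilarity(await fastify.xai.embed('cat'), await fastify.xai.embed('kitten'))` should score higher than the same comparison against an unrelated text.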
## AI Context

```yaml
package: "@xenterprises/fastify-x-ai"
type: fastify-plugin
decorates: fastify.xai
providers: [openai, anthropic, google, cohere]
use-when: AI text generation, document summarization, classification, embeddings
methods:
  generate: prompt → text
  stream: messages → SSE stream (for chat)
  generateObject: prompt + zod schema → typed object
  embed: text → number[]
env-required: provider API key (OPENAI_API_KEY, ANTHROPIC_API_KEY, etc.)
```
