Integration with AI SDK

Elysia provides support for response streaming out of the box, allowing you to integrate with the Vercel AI SDK seamlessly.

Response Streaming

Elysia supports continuous streaming of a ReadableStream and Response, allowing you to return a stream directly from the AI SDK.

ts
import { Elysia } from 'elysia'
import { streamText } from 'ai'
import { openai } from '@ai-sdk/openai'

new Elysia().get('/', () => {
    const result = streamText({
        model: openai('gpt-5'),
        system: 'You are Yae Miko from Genshin Impact',
        prompt: 'Hi! How are you doing?'
    })

    // Just return a ReadableStream
    return result.textStream

    // UI Message Stream is also supported
    // return result.toUIMessageStream()
})

Elysia will handle the stream automatically, allowing you to consume it in various ways.
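As a rough illustration, the streamed body can be consumed on the client with a plain fetch Response and a ReadableStream reader. `readTextStream` is a hypothetical helper, not part of Elysia or the AI SDK:

```typescript
// A minimal sketch of reading a streamed text response chunk by chunk.
// Works with any Response whose body is a ReadableStream of bytes.
async function readTextStream(response: Response): Promise<string> {
    const reader = response.body!.getReader()
    const decoder = new TextDecoder()
    let text = ''

    while (true) {
        const { done, value } = await reader.read()
        if (done) break

        // Decode each binary chunk as it arrives; { stream: true }
        // handles multi-byte characters split across chunks
        text += decoder.decode(value, { stream: true })
    }

    return text
}
```

In a real client you would pass the Response from `fetch('/your-endpoint')` and render each chunk as it arrives instead of accumulating the whole string.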

Server-Sent Events

Elysia also supports Server-Sent Events for streaming responses: simply wrap a ReadableStream with the sse function.

ts
import { Elysia, sse } from 'elysia'
import { streamText } from 'ai'
import { openai } from '@ai-sdk/openai'

new Elysia().get('/', () => {
    const result = streamText({
        model: openai('gpt-5'),
        system: 'You are Yae Miko from Genshin Impact',
        prompt: 'Hi! How are you doing?'
    })

    // Each chunk will be sent as a Server Sent Event
    return sse(result.textStream)

    // UI Message Stream is also supported
    // return sse(result.toUIMessageStream())
})
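On the receiving end, each event arrives as a block of `event:` and `data:` fields terminated by a blank line. In a browser you would typically use EventSource; the sketch below parses one such block by hand to make the format concrete (`parseSSEBlock` is a hypothetical helper, not an Elysia API):

```typescript
// A minimal sketch of parsing a single Server-Sent Event block.
// Fields follow the SSE wire format: "event: <name>" and "data: <payload>".
function parseSSEBlock(block: string): { event?: string; data?: string } {
    const result: { event?: string; data?: string } = {}

    for (const line of block.split('\n')) {
        if (line.startsWith('event:')) {
            result.event = line.slice('event:'.length).trim()
        } else if (line.startsWith('data:')) {
            const value = line.slice('data:'.length).trimStart()
            // Per the SSE spec, multiple data lines are joined with newlines
            result.data =
                result.data === undefined ? value : result.data + '\n' + value
        }
    }

    return result
}
```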

As Response

If you don't need type safety of the stream for further usage with Eden, you can return the stream directly as a Response.

ts
import { Elysia } from 'elysia'
import { streamText } from 'ai'
import { openai } from '@ai-sdk/openai'

new Elysia().get('/', () => {
    const result = streamText({
        model: openai('gpt-5'),
        system: 'You are Yae Miko from Genshin Impact',
        prompt: 'Hi! How are you doing?'
    })

    return result.toTextStreamResponse()

    // UI Message Stream Response will use SSE
    // return result.toUIMessageStreamResponse()
})

Manual Streaming

If you want more control over the streaming, you can use a generator function to yield the chunks manually.

ts
import { Elysia, sse } from 'elysia'
import { streamText } from 'ai'
import { openai } from '@ai-sdk/openai'

new Elysia().get('/', async function* () {
    const result = streamText({
        model: openai('gpt-5'),
        system: 'You are Yae Miko from Genshin Impact',
        prompt: 'Hi! How are you doing?'
    })

    for await (const data of result.textStream)
        yield sse({ 
            data, 
            event: 'message'
        }) 

    yield sse({
        event: 'done'
    })
})
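For reference, each yielded `sse({ data, event })` chunk corresponds to one event in the SSE wire format. The sketch below shows what that format looks like on the wire; `toSSEWire` is purely illustrative (Elysia performs the actual serialization internally):

```typescript
// A minimal sketch of the SSE wire format for a single event.
// Each field is a "name: value" line; a blank line terminates the event.
function toSSEWire(fields: { event?: string; data?: string }): string {
    let out = ''
    if (fields.event !== undefined) out += `event: ${fields.event}\n`
    if (fields.data !== undefined) out += `data: ${fields.data}\n`

    // The trailing blank line marks the end of this event
    return out + '\n'
}
```

So the generator above would produce events such as `event: message` followed by a `data:` line for each chunk, then a final `event: done` event.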

Fetch

If the AI SDK doesn't support the model you're using, you can still use the fetch function to make a request to the provider's API and stream the response directly.

ts
import { Elysia, fetch } from 'elysia'

new Elysia().get('/', () => {
    return fetch('https://api.openai.com/v1/chat/completions', {
        method: 'POST',
        headers: {
            'Content-Type': 'application/json',
            Authorization: `Bearer ${process.env.OPENAI_API_KEY}`
        },
        body: JSON.stringify({
            model: 'gpt-5',
            stream: true,
            messages: [
                {
                    role: 'system',
                    content: 'You are Yae Miko from Genshin Impact'
                },
                { role: 'user', content: 'Hi! How are you doing?' }
            ]
        })
    })
})

Elysia will proxy the fetched response with streaming support automatically.
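When streaming straight from the provider like this, the body consists of OpenAI-style `data: {...}` lines terminated by `data: [DONE]`. A hypothetical client-side sketch of stitching the text deltas back together (the payload shape assumed here follows the Chat Completions streaming format):

```typescript
// A minimal sketch of extracting text deltas from an OpenAI-style
// streamed chat completion body that has been fully buffered as text.
function extractDeltas(sseText: string): string {
    let output = ''

    for (const line of sseText.split('\n')) {
        if (!line.startsWith('data:')) continue

        const payload = line.slice('data:'.length).trim()
        // The provider signals end-of-stream with a literal [DONE] sentinel
        if (payload === '[DONE]') break

        const chunk = JSON.parse(payload)
        // Each chunk carries an incremental delta.content piece
        output += chunk.choices?.[0]?.delta?.content ?? ''
    }

    return output
}
```

In practice you would parse each chunk as it arrives rather than buffering the whole body, but the per-line handling is the same.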


For additional information, please refer to the AI SDK documentation.