OpenAI Completion Stream with Node.js and Express.js

I'm trying to build a ChatGPT website clone, and now I need to implement the streaming completion effect that shows the result word by word. My server is a TypeScript Node.js app that uses the Express.js framework.

Here's the route:

import express, { Request, Response } from 'express';
import cors from 'cors';
import { Configuration, OpenAIApi } from 'openai';

// ...

app.post('/api/admin/testStream', async (req: Request, res: Response) => {
    const { password } = req.body;

    try {
        if (password !== process.env.ADMIN_PASSWORD) {
            res.send({ message: 'Incorrect password' });
            return;
        }
        const completion = await openai.createCompletion({
            model: 'text-davinci-003',
            prompt: 'Say this is a test',
            stream: true,
        }, { responseType: 'stream' });

        completion.data.on('data', (chunk: any) => {
            console.log(chunk.toString());
        });

        res.send({ message: 'Stream started' });
    } catch (err) {
        console.log(err);
        res.send(err);
    }
});

// ...

Right now, it gives me an error saying

Property 'on' does not exist on type 'CreateCompletionResponse'.ts(2339)

even though I set { responseType: 'stream' }.

How can I solve this problem and send the response to the frontend chunk by chunk? (I'm using Socket.IO.)

F answered 29/4, 2023 at 19:15 Comment(4)
Instead of completion.data.on('data', ...), you might need to do completion.on('data', ...). – Myrilla
@Myrilla already tried that, it doesn't work either (same error). – F
You're right. I looked at the package's NPM page and it says "Streaming completions (stream=true) are not natively supported in this package yet, but a workaround exists if needed." Did you try it? It seems quite similar to what you've already done. Link to workaround: github.com/openai/openai-node/issues/18#issuecomment-1369996933 – Myrilla
Glad you got it working! :) – Myrilla

Finally solved it thanks to the help of @Myrilla! Here's what I did:

import express, { Request, Response } from 'express';
import cors from 'cors';
import { Configuration, OpenAIApi } from 'openai';
import { IncomingMessage } from 'http';

// ...

app.post('/api/admin/testStream', async (req: Request, res: Response) => {
    const { password } = req.body;

    try {
        if (password !== process.env.ADMIN_PASSWORD) {
            res.send({ message: 'Incorrect password' });
            return;
        }

        const completion = await openai.createChatCompletion({
            model: 'gpt-3.5-turbo',
            messages: [{ role: 'user', content: 'When was America founded?' }],
            stream: true,
        }, { responseType: 'stream' });

        // With { responseType: 'stream' }, completion.data is actually a Node stream,
        // so cast it to IncomingMessage to get the typed 'data'/'end'/'error' events.
        const stream = completion.data as unknown as IncomingMessage;

        stream.on('data', (chunk: Buffer) => {
            // Each SSE event is separated by a blank line and prefixed with "data: ".
            const payloads = chunk.toString().split('\n\n');
            for (const payload of payloads) {
                if (payload.includes('[DONE]')) return; // end-of-stream marker
                if (payload.startsWith('data:')) {
                    try {
                        const data = JSON.parse(payload.replace('data: ', ''));
                        const text: string | undefined = data.choices[0].delta?.content;
                        if (text) {
                            console.log(text);
                        }
                    } catch (error) {
                        console.log(`Error with JSON.parse and ${payload}.\n${error}`);
                    }
                }
            }
        });

        stream.on('end', () => {
            setTimeout(() => {
                console.log('\nStream done');
                res.send({ message: 'Stream done' });
            }, 10);
        });

        stream.on('error', (err: Error) => {
            console.log(err);
            res.send(err);
        });
    } catch (err) {
        console.log(err);
        res.send(err);
    }
});

// ...

For more info, visit https://github.com/openai/openai-node/issues/18

I also managed to send the message to the frontend chunk by chunk using Socket.IO events!
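In case it's useful, the Socket.IO part can be as simple as replacing the console.log(text) in the 'data' handler above with io.emit('chat:chunk', text), and emitting a chat:done event from the 'end' handler. This assumes you already have a Socket.IO server instance named io; the event names are just placeholders for illustration, not the ones from my repos. A minimal client sketch under the same assumptions:

import { io } from 'socket.io-client';

// Placeholder event names ('chat:chunk' / 'chat:done') matching the server-side emits described above
const socket = io('http://localhost:3000'); // adjust to your server's URL

let answer = '';

socket.on('chat:chunk', (text: string) => {
    answer += text; // append each delta as it arrives
    // update your UI state here (e.g. a React setState) for the word-by-word effect
});

socket.on('chat:done', () => {
    console.log('Stream done');
});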


Links to example code: github.com/alessandrofoglia07/LeafGPT-FE (frontend), github.com/alessandrofoglia07/LeafGPT-BE (backend)

F answered 30/4, 2023 at 20:45 Comment(3)
What does your client code look like? – Sorayasorb
@Sorayasorb you can find the entire codebase here: github.com/alessandrofoglia07/LeafGPT-FE (frontend); github.com/alessandrofoglia07/LeafGPT-BE (backend). – F
Thanks for that and thanks for the links to the example code! – Marseillaise

If you're using the REST API directly instead of the package:

const streamRes = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
        "Authorization": `Bearer ${agencyData.OPENAI_API_KEY}`,
        "Content-Type": "application/json"
    },
    body: JSON.stringify({
        "model": agentData.vg_defaultModel || 'gpt-3.5-turbo',
        "messages": [
            {
                "role": "system",
                "content": agentData.vg_systemPrompt || 'You are a helpful assistant.'
            },
            ...embedHistory,
            {
                "role": "user",
                "content": agentData.vg_prompt || 'Tell me a very short story'
            }
        ],
        "temperature": agentData.vg_temperature || 0.5,
        "stream": true
    })
});

// With stream: true, the API responds with server-sent events; read them off the body stream.
const reader = streamRes.body!.getReader();
let done = false;
let concatenatedJson = '';

while (!done) {
    const { value, done: readerDone } = await reader.read();
    done = readerDone;
    if (!value) continue; // the final read has no payload

    const textPayload = Buffer.from(value).toString();
    concatenatedJson += textPayload;

    // Wait until we have at least one complete "data: ...\n\n" event before parsing.
    if (!concatenatedJson.includes(`data: `) || !concatenatedJson.includes(`\n\n`)) {
        continue;
    }
    const payloads = concatenatedJson.split("\n\n");
    concatenatedJson = '';

    for (const payload of payloads) {
        if (payload.includes('[DONE]')) return;
        if (payload.startsWith("data:")) {
            try {
                const data = JSON.parse(payload.replace("data: ", ""));
                const chunk: undefined | string = data.choices[0].delta?.content;
                if (chunk) {
                    console.log(chunk);
                    // ws.send(chunk); // send the chunk to a websocket, for example
                }
            } catch (error) {
                console.log(`Error with JSON.parse and ${payload}.\n${error}`);
                // Keep the partial payload and retry once more data arrives.
                concatenatedJson += payload;
            }
        }
    }
}
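One note on the concatenation buffer above: a single reader.read() can end in the middle of an SSE event, so an incomplete payload fails JSON.parse and is carried over until the rest of it arrives. If you forward each chunk over a plain WebSocket as the ws.send(chunk) comment suggests, the receiving side could look something like this (a sketch only; the ws://localhost:8080 endpoint and the #output element are assumptions, not part of the code above):

// Browser-side sketch: append each streamed chunk to build up the reply word by word
const socket = new WebSocket('ws://localhost:8080'); // hypothetical WebSocket endpoint

let answer = '';

socket.onmessage = (event: MessageEvent) => {
    answer += event.data; // each message carries one content delta
    document.getElementById('output')!.textContent = answer; // assumes an element with id="output"
};

socket.onclose = () => {
    console.log('Stream finished');
};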
Danica answered 19/3 at 2:57 Comment(0)
