JavaScript: Consume Web streams from OpenAI using vanilla JavaScript

Last updated: 14th April 2023


In this video, I demonstrate how to download streamed data using vanilla JavaScript, without using any plugins or node modules from NPM. I'll be using OpenAI's API to download GPT data. The full code is here:

// main.js
const url = "https://api.openai.com/v1/chat/completions";

// Replace 'your_api_key_here' with your actual OpenAI API key
const apiKey = `your_api_key_here`;

// Create an AbortController to control and cancel the fetch request when the user hits the stop button
const controller = new AbortController();

document.querySelector("#stop").addEventListener("click", () => {
    // Exercise for the reader:
    // Add error handling for when the controller is aborted
    controller.abort();
});

// Make a POST request to the OpenAI API to get chat completions
const response = await fetch(url, {
    method: "POST",
    headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
        messages: [{ role: "user", content: "Tell me a joke" }],
        temperature: 0.6,
        model: "gpt-3.5-turbo",
        // Limiting the tokens during development
        max_tokens: 30,
        stream: true,
    }),
    // Use the AbortController's signal to allow aborting the request
    // This is a `fetch()` API thing, not an OpenAI thing
    signal: controller.signal,
});

// Create a TextDecoder to decode the response body stream
const decoder = new TextDecoder();

// Iterate through the chunks in the response body using for-await...of
for await (const chunk of response.body) {
    const decodedChunk = decoder.decode(chunk);

    // Clean up the data
    const lines = decodedChunk
        .split("\n")
        .map((line) => line.replace("data: ", ""))
        .filter((line) => line.length > 0)
        .filter((line) => line !== "[DONE]")
        .map((line) => JSON.parse(line));

    // Destructuring!
    for (const line of lines) {
        const {
            choices: [
                {
                    delta: { content },
                },
            ],
        } = line;

        if (content) {
            document.querySelector("#content").textContent += content;
        }
    }
}

And the HTML:

<!-- index.html -->
<!DOCTYPE html>
<html lang="en">
    <head>
        <meta charset="UTF-8" />
        <meta name="viewport" content="width=device-width, initial-scale=1.0" />
        <title>Web Streams Demo</title>
        <link rel="stylesheet" href="https://unpkg.com/mvp.css" />
        <script type="module" src="main.js"></script>
    </head>
    <body>
        <main>
            <h3>Tell me a joke</h3>
            <button id="stop">STOP</button>

            <p id="content"></p>
        </main>
    </body>
</html>

Notes below:

Making a Fetch Request

Note that the fetch request uses the gpt-3.5-turbo model - it's much cheaper than GPT-4. Setting stream: true tells OpenAI to stream data back.
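To see the request options in isolation, here's a sketch of just the payload, using the same model and prompt as the full example; fetch() wants a string body, so it gets serialized with JSON.stringify:

```javascript
// A sketch of the request payload on its own
const payload = {
    model: "gpt-3.5-turbo", // much cheaper than GPT-4
    messages: [{ role: "user", content: "Tell me a joke" }],
    temperature: 0.6,
    max_tokens: 30, // keep costs down during development
    stream: true, // ask OpenAI to stream tokens back as they're generated
};

// fetch() takes a string body, so serialize the payload
const body = JSON.stringify(payload);

console.log(JSON.parse(body).stream); // true
```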

Streaming Data

To handle the stream of data received from OpenAI, I use the for-await...of syntax, which works on response.body in Firefox. By setting the stream parameter to true in the payload, I ask OpenAI to send tokens back as soon as they are available.
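The for-await...of pattern can be tried without touching the API at all. This sketch stands in a hand-made ReadableStream for the real response body (the chunk contents here are made up for illustration); it runs in Node.js 18+ and Firefox, where web streams are async-iterable:

```javascript
// Simulate a streamed response body with a hand-made ReadableStream
const encoder = new TextEncoder();

const fakeBody = new ReadableStream({
    start(controller) {
        // Enqueue a few encoded chunks, as a server would, then close
        for (const token of ["Why ", "did ", "the ", "chicken..."]) {
            controller.enqueue(encoder.encode(token));
        }
        controller.close();
    },
});

const decoder = new TextDecoder();
let output = "";

// Consume the stream chunk-by-chunk with for-await...of
for await (const chunk of fakeBody) {
    output += decoder.decode(chunk);
}

console.log(output); // "Why did the chicken..."
```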

Handling Streamed Data

Each received chunk is a server-sent event: the data: prefix needs to be stripped and the remaining payload parsed as JSON before use. In production, a dedicated parser such as eventsource-parser is probably more comprehensive and resilient.
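Here's the clean-up step run on its own against a hand-written chunk in the server-sent-event shape OpenAI emits (the payload below is abbreviated for illustration, not a full API response):

```javascript
// A decoded chunk: each event is a "data: " line, ending with [DONE]
const decodedChunk =
    'data: {"choices":[{"delta":{"content":"Why"}}]}\n\n' +
    'data: {"choices":[{"delta":{"content":" did"}}]}\n\n' +
    "data: [DONE]\n\n";

// Strip the SSE framing and parse each payload as JSON
const lines = decodedChunk
    .split("\n")
    .map((line) => line.replace("data: ", ""))
    .filter((line) => line.length > 0)
    .filter((line) => line !== "[DONE]")
    .map((line) => JSON.parse(line));

// Pull the token out of each parsed event
const tokens = lines.map((line) => line.choices[0].delta.content);

console.log(tokens.join("")); // "Why did"
```

Note this naive approach assumes each data: line arrives whole within a chunk; a JSON payload split across two chunks would break JSON.parse, which is exactly why eventsource-parser is the safer choice in production.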

Running the Script with Node JS

Notice the script can also run in a recent version of Node.js: the TextDecoder, async iteration, destructuring, and response.body as an async iterable all work - even the AbortController. Of course, the document.querySelector() calls won't work in a Node.js environment.
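To run the loop under Node.js, the only change needed is swapping the DOM update for something environment-neutral. A minimal sketch, assuming the tokens come from the loop above and using a hypothetical appendContent helper in place of the querySelector call:

```javascript
// Accumulate streamed tokens into a plain string instead of the DOM
let answer = "";

function appendContent(content) {
    // In the browser this would be:
    // document.querySelector("#content").textContent += content;
    answer += content;
}

// Simulate two streamed tokens arriving
appendContent("Why did ");
appendContent("the chicken...");

console.log(answer); // "Why did the chicken..."
```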

See browser support for async iterables here.
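For browsers where response.body is not async-iterable (Chrome, at the time of writing), a getReader() loop is the equivalent. Sketched here against a hand-made stream rather than a real response:

```javascript
// Build a small stream to stand in for response.body
const encoder = new TextEncoder();
const stream = new ReadableStream({
    start(controller) {
        controller.enqueue(encoder.encode("hello "));
        controller.enqueue(encoder.encode("streams"));
        controller.close();
    },
});

// The getReader()/read() loop works everywhere web streams do
const reader = stream.getReader();
const decoder = new TextDecoder();
let text = "";

while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    text += decoder.decode(value);
}

console.log(text); // "hello streams"
```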
