Streaming JSON responses using the Fetch API

Shared by: devcanvas

javascript

async function streamJSONResponse(url, onChunk) {
    const response = await fetch(url);

    if (!response.ok) {
        throw new Error(`HTTP error! status: ${response.status}`);
    }

    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    let buffer = "";

    while (true) {
        const { value, done } = await reader.read();
        if (done) break;

        buffer += decoder.decode(value, { stream: true });

        // Process each complete newline-delimited line as one JSON object
        let boundary;
        while ((boundary = buffer.indexOf("\n")) !== -1) {
            const chunk = buffer.slice(0, boundary).trim();
            buffer = buffer.slice(boundary + 1);

            if (chunk) {
                try {
                    const json = JSON.parse(chunk);
                    onChunk(json); // Pass each JSON object to the callback
                } catch (e) {
                    console.error("Invalid JSON chunk:", chunk);
                }
            }
        }
    }

    // Flush any trailing object that wasn't followed by a newline
    buffer += decoder.decode(); // drain the decoder's internal state
    const rest = buffer.trim();
    if (rest) {
        try {
            onChunk(JSON.parse(rest));
        } catch (e) {
            console.error("Invalid JSON chunk:", rest);
        }
    }
}

Have you ever wondered how ChatGPT streams its responses, as if it were typing them out? The trick lies in processing data in chunks instead of waiting for the entire response to load. This concept isn't unique to AI tools; streaming JSON responses is a widely used technique in web development for handling large or real-time data streams efficiently.

Traditionally, when a web application fetches JSON data from an API, it waits until the entire response is received before processing it. This approach can be inefficient, especially when dealing with large datasets or real-time data, as it delays the processing and rendering of the data. For example, imagine fetching live updates from a server that sends a continuous stream of JSON objects. If the application waits for the entire response, it can’t display updates in real time, leading to poor user experience.
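For contrast, here is a minimal sketch of that traditional all-at-once approach (the URL is a placeholder, and `fetchAllAtOnce` is an illustrative name, not part of the snippet above):

```javascript
// Traditional approach: wait for the ENTIRE body, then parse it once.
// Nothing can be shown to the user until the last byte has arrived.
async function fetchAllAtOnce(url) {
    const response = await fetch(url);

    if (!response.ok) {
        throw new Error(`HTTP error! status: ${response.status}`);
    }

    // response.json() resolves only after the full body is downloaded
    return response.json();
}
```

With a continuous server stream, this promise would never resolve at all, which is exactly the problem the streaming approach below solves.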

Using the Fetch API’s ReadableStream feature, you can process server responses incrementally. This approach handles large datasets or real-time streams efficiently without waiting for the full response.
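You can also try the streaming parser without a live server. The sketch below (an assumption for demonstration, not part of the original snippet) builds an in-memory ReadableStream that emits newline-delimited JSON (NDJSON), the same shape `streamJSONResponse` above expects:

```javascript
// Build a ReadableStream that emits each object as one NDJSON line,
// mimicking what a streaming server endpoint would send.
function makeNDJSONStream(objects) {
    const encoder = new TextEncoder();
    return new ReadableStream({
        start(controller) {
            for (const obj of objects) {
                controller.enqueue(encoder.encode(JSON.stringify(obj) + "\n"));
            }
            controller.close();
        },
    });
}
```

Wrapping such a stream in `new Response(makeNDJSONStream([...]))` and returning it from a stubbed `fetch` lets you unit-test the chunk handling without any network traffic.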

// Usage: 
streamJSONResponse("https://example.com/stream", (chunk) => {
  console.log("Received chunk:", chunk);
});

Streaming JSON significantly enhances the performance and user experience of web applications. Handling data in chunks helps you build applications that feel faster and more responsive, just like ChatGPT and other LLM interfaces. Try out the snippet above in your next project!
