Echo Stream Response
Introduction
When building modern web applications, there are scenarios where you need to send data to the client continuously over time rather than in a single response. This concept is known as "streaming" and is particularly useful for real-time updates, large file downloads, or long-running processes where you want to provide continuous feedback to the user.
Echo framework provides elegant ways to implement streaming responses that allow your server to push data to clients as it becomes available. In this tutorial, we'll explore how to create streaming responses in Echo applications and understand the underlying concepts.
What is a Streaming Response?
A streaming response is an HTTP response where the server does not immediately close the connection after sending the initial data. Instead, it keeps the connection open and continues to send data chunks as they become available. This enables:
- Real-time updates without polling
- Progressive loading of large data sets
- Resource efficiency for long-running operations
- Better user experience through immediate feedback
Basic Streaming Response
Let's start with a simple example of a streaming response in Echo:
package main

import (
	"encoding/json"
	"fmt"
	"time"

	"github.com/labstack/echo/v4"
)

func main() {
	e := echo.New()
	e.GET("/stream", func(c echo.Context) error {
		c.Response().Header().Set(echo.HeaderContentType, echo.MIMEApplicationJSON)
		c.Response().WriteHeader(200)

		// Encode and send data in a loop
		enc := json.NewEncoder(c.Response())
		for i := 0; i < 10; i++ {
			data := map[string]interface{}{
				"message":   fmt.Sprintf("Update #%d", i),
				"timestamp": time.Now().Unix(),
			}
			if err := enc.Encode(data); err != nil {
				return err
			}
			c.Response().Flush()
			time.Sleep(1 * time.Second)
		}
		return nil
	})
	e.Logger.Fatal(e.Start(":1323"))
}
In this example, we:
- Set the content type header to JSON
- Write a 200 OK status code
- Create a JSON encoder for the response writer
- Loop and send 10 updates, one per second
- Use Flush() to ensure each chunk is sent immediately
Understanding Response Flushing
A key concept in streaming responses is "flushing". When you write data to the response, it often gets buffered before being sent to the client. Calling Flush() forces any buffered data to be sent immediately.
c.Response().Write([]byte("This data might be buffered"))
c.Response().Flush() // Force the data to be sent now
Without flushing, the client might not receive any data until the buffer fills up or the request completes, defeating the purpose of streaming.
Server-Sent Events (SSE)
For many real-time applications, Server-Sent Events (SSE) provides a standardized way to stream updates from the server to the client. Echo makes SSE implementation straightforward:
func streamSSE(c echo.Context) error {
	c.Response().Header().Set(echo.HeaderContentType, "text/event-stream")
	c.Response().Header().Set("Cache-Control", "no-cache")
	c.Response().Header().Set("Connection", "keep-alive")
	c.Response().WriteHeader(200)

	// Send a ping immediately and then every 15 seconds
	c.Response().Write([]byte("event: ping\n\n"))
	c.Response().Flush()
	pingTicker := time.NewTicker(15 * time.Second)
	defer pingTicker.Stop()

	// Channel that is closed when the client disconnects
	disconnected := c.Request().Context().Done()

	for {
		select {
		case <-disconnected:
			return nil
		case <-pingTicker.C:
			c.Response().Write([]byte("event: ping\n\n"))
			c.Response().Flush()
		case data := <-someDataChannel: // a chan string carrying your application's updates
			c.Response().Write([]byte("event: update\n"))
			c.Response().Write([]byte("data: " + data + "\n\n"))
			c.Response().Flush()
		}
	}
}
The client JavaScript to consume this SSE stream would look like:
const eventSource = new EventSource('/stream-sse');

eventSource.addEventListener('update', (event) => {
  console.log('Received update:', event.data);
});

eventSource.addEventListener('ping', () => {
  console.log('Received ping from server');
});

eventSource.onerror = (error) => {
  console.error('EventSource error:', error);
  eventSource.close();
};
Streaming Large Files
Echo is also great for streaming large files without loading them entirely into memory:
func streamLargeFile(c echo.Context) error {
	file, err := os.Open("large-file.zip")
	if err != nil {
		return err
	}
	defer file.Close()

	fileInfo, err := file.Stat()
	if err != nil {
		return err
	}

	c.Response().Header().Set(echo.HeaderContentType, "application/zip")
	c.Response().Header().Set(echo.HeaderContentDisposition, "attachment; filename=\"large-file.zip\"")
	c.Response().Header().Set(echo.HeaderContentLength, fmt.Sprintf("%d", fileInfo.Size()))

	// Stream the file in 4KB chunks. Note that Read can return data
	// together with io.EOF, so we write the bytes before checking the error.
	buf := make([]byte, 4096)
	for {
		n, err := file.Read(buf)
		if n > 0 {
			if _, werr := c.Response().Write(buf[:n]); werr != nil {
				return werr
			}
			c.Response().Flush()
		}
		if err == io.EOF {
			break
		}
		if err != nil {
			return err
		}
	}
	return nil
}
This approach allows you to stream files of any size while holding only a small, fixed-size buffer in memory.
Real-World Application: Progress Updates
A practical example is sending progress updates for a long-running operation:
func processWithProgress(c echo.Context) error {
	// Configure response for streaming
	c.Response().Header().Set(echo.HeaderContentType, echo.MIMEApplicationJSON)
	c.Response().WriteHeader(200)

	encoder := json.NewEncoder(c.Response())
	totalSteps := 10

	// Process each step and send progress
	for step := 1; step <= totalSteps; step++ {
		// Do some actual processing work
		time.Sleep(500 * time.Millisecond)

		progress := map[string]interface{}{
			"step":            step,
			"totalSteps":      totalSteps,
			"percentComplete": float64(step) / float64(totalSteps) * 100,
			"status":          "processing",
		}
		if err := encoder.Encode(progress); err != nil {
			return err
		}
		c.Response().Flush()
	}

	// Send completion message
	final := map[string]interface{}{
		"step":            totalSteps,
		"totalSteps":      totalSteps,
		"percentComplete": 100.0,
		"status":          "complete",
	}
	if err := encoder.Encode(final); err != nil {
		return err
	}
	c.Response().Flush()
	return nil
}
Handling Client Disconnection
In streaming responses, it's important to detect when the client disconnects to avoid wasting server resources:
func streamWithDisconnectionDetection(c echo.Context) error {
	ctx := c.Request().Context()

	// Configure response headers
	c.Response().Header().Set(echo.HeaderContentType, echo.MIMETextPlain)
	c.Response().WriteHeader(200)

	for i := 0; i < 100; i++ {
		select {
		case <-ctx.Done():
			// Client disconnected
			fmt.Println("Client disconnected, stopping stream")
			return nil
		default:
			// Continue processing
			c.Response().Write([]byte(fmt.Sprintf("Update %d\n", i)))
			c.Response().Flush()
			time.Sleep(1 * time.Second)
		}
	}
	return nil
}
This pattern is especially important for long-running streams to prevent server resources from being tied up after clients disconnect.
Performance Considerations
When implementing streaming responses, keep these performance considerations in mind:
- Memory Usage: Be careful not to accumulate large amounts of data in memory while streaming.
- Connection Limits: Each streaming connection consumes a server resource. Plan your application to handle the expected number of concurrent connections.
- Timeout Settings: Configure appropriate timeout settings in your reverse proxy and Echo server.
- Backpressure: Consider implementing backpressure mechanisms if the client can't consume data as fast as it's produced.
Summary
Echo Stream Response is a powerful feature that enables real-time communication between your server and clients. We've covered:
- Basic streaming response implementation
- Importance of response flushing
- Server-Sent Events for standardized streaming
- Streaming large files efficiently
- Progress updates for long-running operations
- Handling client disconnection
- Performance considerations
Streaming responses allow your application to provide a more interactive and responsive experience to users while efficiently handling server resources.
Exercises
- Create a streaming endpoint that provides real-time updates from a simulated sensor (temperature, humidity, etc.) every 2 seconds.
- Implement a file upload progress tracker that streams the upload progress to the client.
- Build a chat application that uses SSE to stream new messages to connected clients.
- Create a stream that sends random quotes every few seconds, with proper error handling for client disconnections.
If you spot any mistakes on this website, please let me know at [email protected]. I’d greatly appreciate your feedback! :)