Emitting stream responses with Slim

What is Chunked transfer encoding?

Chunked transfer encoding is a streaming data transfer mechanism available in version 1.1 of the Hypertext Transfer Protocol (HTTP). In chunked transfer encoding, the data stream is divided into a series of non-overlapping "chunks". The chunks are sent out and received independently of one another. At any given time, neither the sender nor the receiver needs any knowledge of the data stream outside the chunk currently being processed.

Read more
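
For illustration, each chunk is framed by its size in hexadecimal, a CRLF, the chunk data, and another CRLF; a zero-length chunk terminates the body. A body transferring the word "Wikipedia" in two chunks would look like this on the wire:

4\r\n
Wiki\r\n
5\r\n
pedia\r\n
0\r\n
\r\n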

Implementing a chunked response with a PSR-7 implementation is quite a complex task.

Most PSR-7 implementations use streams, and the response body is typically buffered in full before it is emitted. If requests or responses are too large, this can cause problems; where exactly the limit lies depends on the memory_limit setting and the rest of the configuration.
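
To make the problem concrete, here is a minimal sketch of the straightforward approach (the route and file name are just placeholders): the whole file is copied into the PSR-7 body stream, which, depending on the implementation, is held in memory until the response is emitted.

$app->get('/download-naive', function ($request, $response) {
    $fileName = __DIR__ . '/file.zip';

    // The entire file is written into the PSR-7 body stream
    // before Slim starts sending anything to the client.
    $response->getBody()->write(file_get_contents($fileName));

    return $response
        ->withHeader('Content-Type', 'application/zip')
        ->withHeader('Content-Disposition', 'attachment; filename="' . basename($fileName) . '"');
});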

The cleanest solution would be to use (or implement) a special SapiEmitter, as Diactoros does. Read more: Emitting Responses with Diactoros
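
The linked documentation describes a SapiStreamEmitter for exactly this case. A rough sketch of how it could be used (assuming zend-diactoros is installed; in newer releases the emitters moved to the laminas-httphandlerrunner package):

use Zend\Diactoros\Response;
use Zend\Diactoros\Response\SapiStreamEmitter;

// Wrap a file handle as the response body instead of an in-memory string
$response = new Response(fopen(__DIR__ . '/file.zip', 'rb'), 200, [
    'Content-Type'        => 'application/zip',
    'Content-Disposition' => 'attachment; filename="file.zip"',
]);

// The stream emitter reads and flushes the body in fixed-size chunks
(new SapiStreamEmitter())->emit($response);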

In Slim I couldn't find a way to plug in such an emitter, so I tried something different in vanilla PHP.

Here you can see a Slim route with a callback that reads a ZIP file into the output buffer and flushes it directly into the response.

use Slim\Http\Request;
use Slim\Http\Response;

$app->get('/download', function (Request $request, Response $response) {
    // Allow up to five minutes for large downloads
    set_time_limit(60 * 5);
    ini_set('max_execution_time', 60 * 5);

    $fileName = __DIR__ . '/file.zip';
    $baseName = basename($fileName);
    $file = fopen($fileName, 'rb'); // binary-safe read mode

    // Transfer-Encoding: chunked must not be combined with Content-Length,
    // and "chunked" is not a valid Content-Encoding, so neither header is sent.
    // This also assumes the SAPI/web server passes the output through without
    // applying its own chunked encoding on top.
    header("Transfer-Encoding: chunked", true);
    header("Content-Type: application/zip", true);
    header("Content-Disposition: attachment; filename=\"$baseName\"");
    header("Connection: keep-alive", true);

    // Emit each chunk in the chunked wire format: <size in hex>\r\n<data>\r\n
    while (!feof($file)) {
        $buffer = fread($file, 4096);
        if ($buffer === false || $buffer === '') {
            break;
        }
        echo sprintf("%x\r\n%s\r\n", strlen($buffer), $buffer);
        // Push the chunk to the client immediately
        if (ob_get_level() > 0) {
            ob_flush();
        }
        flush();
    }

    // A zero-length chunk terminates the body
    echo "0\r\n\r\n";
    flush();

    fclose($file);

    // Stop Slim from handling the response itself
    exit;
});

Reading a chunked response

You could use Guzzle to read bytes off the (response) stream until the end of the stream is reached.

http://docs.guzzlephp.org/en/latest/request-options.html#stream

// Example client; httpbin.org provides the /stream/ endpoint used below
$client = new \GuzzleHttp\Client(['base_uri' => 'http://httpbin.org']);

// With 'stream' => true Guzzle returns as soon as the headers have arrived
$response = $client->request('GET', '/stream/20', ['stream' => true]);

// Read bytes off of the stream until the end of the stream is reached
$body = $response->getBody();
while (!$body->eof()) {
    echo $body->read(1024);
}
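
If the goal is only to save the download to disk, Guzzle's sink request option can stream the response body straight into a file instead of reading it manually (the target path here is just an example):

// Write the streamed response body directly to a local file
$client->request('GET', '/stream/20', ['sink' => __DIR__ . '/streamed-response.txt']);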