S3 client

This page contains example usage for the S3 client. Authentication and configuration are covered in separate documentation.

The S3 package can be installed with Composer:

```
composer require async-aws/s3
```

The client supports presigning requests, so you can hand the URL to a normal mortal person and let them download a file within the next X minutes. Read more about presigned requests in the presign documentation.
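
A minimal sketch of presigning a GetObject request is shown below. It assumes the client's presign() method and the GetObjectRequest input class; the bucket, key, and expiry time are placeholders:

```php
use AsyncAws\S3\Input\GetObjectRequest;
use AsyncAws\S3\S3Client;

$s3 = new S3Client();

// Describe the request we want to presign
$input = new GetObjectRequest([
    'Bucket' => 'my-company-website',
    'Key' => 'invoice.pdf',
]);

// Generate a URL that stays valid for the next 60 minutes
$url = $s3->presign($input, new \DateTimeImmutable('+60 min'));

echo $url;
```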

Note: There is a SimpleS3Client that might be easier to work with for common use cases.
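
As a rough illustration, here is a sketch assuming the upload() and download() helpers provided by the async-aws/simple-s3 package (bucket and key names are placeholders):

```php
use AsyncAws\SimpleS3\SimpleS3Client;

$s3 = new SimpleS3Client();

// Upload a string body without building a full PutObject input
$s3->upload('my-company-website', 'robots.txt', "User-agent: *\nDisallow:");

// Download it again; download() returns a ResultStream
echo $s3->download('my-company-website', 'robots.txt')->getContentAsString();
```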

Usage

Upload files

If you want to upload a 1 GB file, you really don't want to load that file into memory before uploading it. You want to do it in a smarter way. AsyncAws allows you to upload files using a string, a resource, a closure, or an iterable. See the following examples:

```php
use AsyncAws\S3\S3Client;

$s3 = new S3Client();

// Upload plain text
$s3->PutObject([
    'Bucket' => 'my-company-website',
    'Key' => 'robots.txt',
    'Body' => "User-agent: *\nDisallow:",
]);

// Upload with stream
$resource = \fopen('/path/to/big/file', 'r');
$s3->PutObject([
    'Bucket' => 'my-company-website',
    'Key' => 'file.jpg',
    'Body' => $resource,
]);

// Upload with Closure
$fp = \fopen('/path/to/big/file', 'r');
$s3->PutObject([
    'Bucket' => 'my-company-website',
    'Key' => 'file.jpg',
    'ContentLength' => filesize('/path/to/big/file'), // This is important
    'Body' => static function (int $length) use ($fp): string {
        return fread($fp, $length);
    },
]);

// Upload with an iterable
$files = ['/path/to/file1.txt', '/path/to/file2.txt'];
$s3->PutObject([
    'Bucket' => 'my-company-website',
    'Key' => 'file_merged.jpg',
    'ContentLength' => array_sum(array_map('filesize', $files)), // This is important
    'Body' => (static function () use ($files): iterable {
        foreach ($files as $file) {
            yield file_get_contents($file);
        }
    })(),
]);
```

When using a Closure or an iterable, it's important to provide the ContentLength parameter. This information is required by AWS and cannot be guessed by AsyncAws. If ContentLength is absent, AsyncAws will read the whole body before sending the request, which can have a performance impact.

Download files

When you download a file from S3, AsyncAws gives you a ResultStream, which can be used as a string, as a resource, or iterated over in chunks. This allows you to handle large files without keeping them in memory.

```php
// Download a file and use it directly as a string
$result = $s3->GetObject([
    'Bucket' => 'my-company-website',
    'Key' => 'metadata.json',
]);
$metadata = json_decode($result->getBody()->getContentAsString());

// Download a big file and save it efficiently
$result = $s3->GetObject([
    'Bucket' => 'my-company-website',
    'Key' => 'bunny.mkv',
]);
$fp = fopen('/path/to/big_file.mkv', 'wb');
stream_copy_to_stream($result->getBody()->getContentAsResource(), $fp);

// Use an iterable to perform some business logic on chunks while downloading (or show a progress bar)
$result = $s3->GetObject([
    'Bucket' => 'my-company-website',
    'Key' => 'orders.csv',
]);
$fp = fopen('/path/to/orders.csv', 'wb');
foreach ($result->getBody()->getChunks() as $chunk) {
    fwrite($fp, $chunk);
    $progress->advance();
}
```

Virtual Hosted-Style Requests

When calling AWS endpoints, AsyncAws uses Virtual Hosted-Style Requests: the bucket name is part of the endpoint's host. To change this behavior and use path-style endpoints instead, set the pathStyleEndpoint parameter to true when initializing the client.

```php
use AsyncAws\S3\S3Client;

$s3 = new S3Client(['pathStyleEndpoint' => true]);
```

Non-AWS S3 endpoints

To use the S3Client with, for example, DigitalOcean Spaces, you need to initialize the S3Client with your endpoint.

```php
$s3 = new S3Client([
    'endpoint' => 'https://fra1.digitaloceanspaces.com',
    'pathStyleEndpoint' => true,
]);
```