Recently, while working on some E2E automation tests, I ran into a situation where I needed to upload files to S3.
Normally, you’d just throw the file up with the AWS CLI and call it a day. But when you’re running your flows with Playwright, you start hitting needs like: “I want to place a file on S3 as part of the test setup,” or “I want to simulate how the frontend uploads directly to S3 with a presigned URL.”
In this post, I’ll share the different upload patterns for S3 that I tried and found useful. I’ll also dive into some more advanced cases like managing credentials with .env, setting up keys safely in CI/CD, and handling large files with multipart uploads.
If you’re using Playwright or just like tinkering with AWS, I think you’ll find some ideas here.
1. Uploading Directly with the AWS SDK
The most straightforward approach: call the AWS SDK inside your test code.
import { test } from '@playwright/test';
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
import fs from 'fs';

test('upload directly to S3', async () => {
  const s3 = new S3Client({
    region: 'ap-northeast-1',
    credentials: {
      accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
      secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
    },
  });

  const filePath = 'tests/fixtures/sample.txt';
  const fileContent = fs.readFileSync(filePath);

  await s3.send(new PutObjectCommand({
    Bucket: 'my-bucket',
    Key: 'uploads/sample.txt',
    Body: fileContent,
    ContentType: 'text/plain',
  }));
});
Impressions
- Super simple and easy to reason about.
- Perfect for setup and teardown tasks in tests (a teardown sketch follows the table below).
- Doesn’t really reflect how the frontend works, so more of a behind-the-scenes helper.
| Pros | Cons |
|---|---|
| Simple to implement | Not representative of the real user flow |
| Great for prep/cleanup | Needs direct use of AWS keys |
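Since the table calls out prep/cleanup, here is the teardown half. A minimal sketch with DeleteObjectCommand, assuming the same bucket and key as the test above:

import { test } from '@playwright/test';
import { S3Client, DeleteObjectCommand } from '@aws-sdk/client-s3';

test.afterEach(async () => {
  const s3 = new S3Client({ region: 'ap-northeast-1' });

  // Remove the object the test uploaded so repeated runs start from a clean state
  await s3.send(new DeleteObjectCommand({
    Bucket: 'my-bucket',
    Key: 'uploads/sample.txt',
  }));
});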
2. Using Presigned URLs
This feels closer to how real apps work. The backend issues a presigned URL, and the client (or test) does a PUT directly to S3.
import { test, expect } from '@playwright/test';
import fetch from 'node-fetch'; // on Node 18+, the built-in global fetch works too

test('upload to S3 with a presigned URL', async () => {
  // Ask the backend for a presigned URL
  const apiRes = await fetch('http://localhost:3000/api/presigned-url');
  const { url } = (await apiRes.json()) as { url: string };

  // PUT the file straight to S3, just like the frontend would
  const fileContent = Buffer.from('Hello from Playwright!');
  const uploadRes = await fetch(url, {
    method: 'PUT',
    body: fileContent,
    headers: {
      'Content-Type': 'text/plain',
    },
  });

  expect(uploadRes.status).toBe(200);
});
Impressions
- Matches real-world workflows exactly.
- Safe since credentials don’t leak to the client.
- Requires a backend endpoint that generates the presigned URL (a minimal sketch follows).
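For context, here is roughly what that endpoint can look like with `@aws-sdk/s3-request-presigner`. This is a minimal sketch, not pulled from a real app; the handler name, bucket, and key are placeholders:

import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';

const s3 = new S3Client({ region: 'ap-northeast-1' });

// Handler body for GET /api/presigned-url (web framework omitted)
export async function createPresignedUploadUrl() {
  const command = new PutObjectCommand({
    Bucket: 'my-bucket',
    Key: 'uploads/from-client.txt',
    ContentType: 'text/plain',
  });

  // The URL is valid for 5 minutes; the client PUTs the file to it directly
  const url = await getSignedUrl(s3, command, { expiresIn: 300 });
  return { url };
}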
3. Uploading Through the UI
Here’s the “simulate the user” route: Playwright selects a file in `<input type="file">` and triggers the server → S3 flow.
import { test, expect } from '@playwright/test';
import path from 'path';

test('upload to S3 via the UI', async ({ page }) => {
  await page.goto('http://localhost:3000/upload');

  const filePath = path.resolve('tests/fixtures/sample.txt');

  // Clicking the button opens the native file chooser, so intercept it
  const fileChooserPromise = page.waitForEvent('filechooser');
  await page.click('#upload-button');
  const fileChooser = await fileChooserPromise;
  await fileChooser.setFiles(filePath);

  await page.click('#submit-button');
  await expect(page.locator('#upload-result')).toHaveText('Upload successful');
});
Impressions
- Closest to how actual users interact.
- Great for end-to-end assurance, but tests take longer.
- You’ll want some way to confirm the S3 save actually happened, whether backend monitoring or an SDK assertion like the one sketched below.
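On that last point, one lightweight option is to assert against S3 from the test itself once the UI reports success. A rough sketch with HeadObjectCommand, meant to sit at the end of the test above; the key is an assumption, since it depends on how the backend names objects:

import { expect } from '@playwright/test';
import { S3Client, HeadObjectCommand } from '@aws-sdk/client-s3';

// Confirm the object actually landed in S3, not just that the UI said so
const s3 = new S3Client({ region: 'ap-northeast-1' });
const head = await s3.send(new HeadObjectCommand({
  Bucket: 'my-bucket',
  Key: 'uploads/sample.txt', // assumed key
}));
expect(head.ContentLength).toBeGreaterThan(0);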
4. Preloading Test Data into S3
For CI/CD workflows, this one shines: preload required images or test data into S3 before running the tests.
import fs from 'fs';
import { test } from '@playwright/test';
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

test.beforeAll(async () => {
  const s3 = new S3Client({ region: 'ap-northeast-1' });
  const fileContent = fs.readFileSync('tests/fixtures/avatar.png');

  await s3.send(new PutObjectCommand({
    Bucket: 'my-bucket',
    Key: 'test/avatar.png',
    Body: fileContent,
    ContentType: 'image/png',
  }));
});
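If several fixtures are involved, a small loop inside the same beforeAll keeps things tidy. A sketch, assuming everything under tests/fixtures should land under the test/ prefix (ContentType is left to S3's default here):

import fs from 'fs';
import path from 'path';
import { test } from '@playwright/test';
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

test.beforeAll(async () => {
  const s3 = new S3Client({ region: 'ap-northeast-1' });
  const fixturesDir = 'tests/fixtures';

  // Upload every fixture file under the test/ prefix
  for (const name of fs.readdirSync(fixturesDir)) {
    await s3.send(new PutObjectCommand({
      Bucket: 'my-bucket',
      Key: `test/${name}`,
      Body: fs.readFileSync(path.join(fixturesDir, name)),
    }));
  }
});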
5. Managing Credentials with .env
Confession: I once hardcoded my AWS keys in test code (oops). Lesson learned.
The safe way: manage credentials in a `.env` file and load them via dotenv.
.env example
AWS_ACCESS_KEY_ID=xxxxxxxx
AWS_SECRET_ACCESS_KEY=yyyyyyyy
AWS_REGION=ap-northeast-1
Code setup
import dotenv from 'dotenv';
dotenv.config();
As long as `.env` isn’t committed to Git, this is a clean and secure workflow.
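One refinement: calling dotenv.config() in every test file is easy to forget. Loading it once in playwright.config.ts covers all tests. A minimal sketch (testDir is just an example):

// playwright.config.ts
import { defineConfig } from '@playwright/test';
import dotenv from 'dotenv';

// Load .env once, before any test file is evaluated
dotenv.config();

export default defineConfig({
  testDir: './tests',
});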
6. Secure Key Handling in CI
On CI/CD (GitHub Actions, GitLab CI, etc.), a `.env` file is not enough. Here are my go-to approaches:
- Store keys in GitHub Actions Secrets → inject as env vars
- Use IAM roles attached to EC2/Lambda
- Go fully keyless with OIDC role assumption
The last option (OIDC) is increasingly popular, and not having long-lived keys in CI feels so much safer. Conveniently, the test code barely changes, as the sketch below shows.
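With IAM roles or OIDC, you simply omit the credentials option and let the SDK walk its default provider chain (environment variables, shared config, attached role, web identity token), so the same code runs locally and in CI. A sketch:

import { S3Client } from '@aws-sdk/client-s3';

// No explicit credentials: the SDK resolves them from env vars,
// an attached IAM role, or an OIDC web identity token
const s3 = new S3Client({ region: process.env.AWS_REGION });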
7. Multipart Upload for Large Files
When dealing with GB-sized videos or archives, you’ll hit timeouts unless you switch to multipart upload.
import { Upload } from '@aws-sdk/lib-storage';
import { S3Client } from '@aws-sdk/client-s3';
import fs from 'fs';

const s3 = new S3Client({ region: 'ap-northeast-1' });

// Stream from disk instead of buffering the whole file in memory
const fileStream = fs.createReadStream('bigfile.zip');

const upload = new Upload({
  client: s3,
  params: {
    Bucket: 'my-bucket',
    Key: 'bigfile.zip',
    Body: fileStream,
  },
});

await upload.done();
Impressions
- More on the system-dev side of things, but sometimes tests need it.
- `@aws-sdk/lib-storage` is the key library here; it splits the file into parts and parallelizes the upload for you (tuning sketch below).
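If you want more control, the Upload constructor accepts tuning options and the instance emits progress events, which is handy for confirming a GB-sized upload is actually moving. A sketch; the part size and concurrency values are just examples:

import { Upload } from '@aws-sdk/lib-storage';
import { S3Client } from '@aws-sdk/client-s3';
import fs from 'fs';

const upload = new Upload({
  client: new S3Client({ region: 'ap-northeast-1' }),
  params: {
    Bucket: 'my-bucket',
    Key: 'bigfile.zip',
    Body: fs.createReadStream('bigfile.zip'),
  },
  partSize: 10 * 1024 * 1024, // 10 MB parts (S3's minimum is 5 MB)
  queueSize: 4,               // upload up to 4 parts in parallel
});

// Log progress as parts complete
upload.on('httpUploadProgress', (progress) => {
  console.log(`uploaded ${progress.loaded} of ${progress.total ?? '?'} bytes`);
});

await upload.done();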
Conclusion
Quick recap:
- Direct SDK upload → great for setup/cleanup
- Presigned URL → mirrors real-world flows
- UI upload → true end-to-end testing
- Preloading test data → perfect for CI/CD
- .env for safe credential management
- Secure CI strategies with Secrets or OIDC
- Multipart uploads for large files
Personally, working with S3 inside Playwright tests taught me that Playwright isn’t just a test framework. When combined with AWS, it turns into a mini integration testing platform.
E2E testing might seem boring at first, but add S3 into the mix, and suddenly it feels practical and exciting.
Next up, I’m thinking about trying download testing from S3… stay tuned.