How to Upload Files to AWS S3 with Node.js: Complete Step-by-Step Guide
By Braincuber Team
Published on March 24, 2026
File upload functionality is essential for modern web applications that handle user-generated content, documents, images, and media files. Amazon S3 provides a scalable and cost-effective solution for storing files, while Node.js offers excellent streaming capabilities for efficient file processing. This comprehensive step-by-step guide will teach you how to implement robust file uploads to AWS S3 using Node.js, the Express framework, and the AWS SDK.
What You'll Learn:
- Set up a Node.js server with Express framework for file handling
- Configure formidable parser for multipart form data processing
- Implement streaming file uploads to AWS S3 using transform streams
- Handle file upload events and error scenarios properly
- Configure AWS SDK credentials and S3 bucket permissions
- Implement best practices for file upload security and validation
Prerequisites
Before starting this tutorial, you should have a solid understanding of:
- Basic HTML: understanding of HTML forms and file input elements
- Node.js & Express: basic knowledge of a Node.js server and the Express framework
You also need an AWS account with an active S3 bucket. Follow the AWS documentation for detailed steps on setting up your account and creating an S3 bucket.
Step 1: Create the Node.js Server
Install Required Packages
Install express, dotenv, formidable, @aws-sdk/lib-storage, and @aws-sdk/client-s3 packages for the application.
Create Server File
Set up index.js with Express server, HTML form, and basic route configuration.
Configure Environment
Create .env file with AWS credentials and S3 bucket configuration.
```bash
npm install express dotenv formidable @aws-sdk/lib-storage @aws-sdk/client-s3
```

`index.js`:

```javascript
require('dotenv').config(); // load env vars before fileparser reads them
const express = require('express');
const app = express();
const parsefile = require('./fileparser');

app.set('json spaces', 5); // to prettify JSON responses

const PORT = process.env.PORT || 3000;

app.get('/', (req, res) => {
  res.send(`
    <h2>File Upload With Node.js</h2>
    <form action="/api/upload" enctype="multipart/form-data" method="post">
      <input type="file" name="file" />
      <input type="submit" value="Upload" />
    </form>
  `);
});

app.post('/api/upload', (req, res) => {
  parsefile(req)
    .then(result => {
      res.json({ success: true, data: result });
    })
    .catch(error => {
      res.status(500).json({ success: false, error: error.message || error });
    });
});

app.listen(PORT, () => {
  console.log(`Server running on port ${PORT}.`);
});
```

Note that `dotenv` must be loaded before `fileparser` is required, since that module reads the AWS variables from `process.env` at load time.

`.env`:

```ini
PORT=3000
AWS_ACCESS_KEY_ID=your_access_key_here
AWS_SECRET_ACCESS_KEY=your_secret_key_here
S3_REGION=us-east-1
S3_BUCKET=your-bucket-name
```
Security Note
Never commit your AWS credentials to version control. Always use environment variables and secure key management practices in production.
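One simple safeguard, assuming a standard Node project layout, is to exclude the `.env` file (along with installed dependencies) from version control:

```
# .gitignore
node_modules/
.env
```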
Step 2: Configure the File Parser
Create File Parser Module
Set up fileparser.js with formidable configuration for handling multipart form data and file uploads.
Configure Parser Options
Set maxFileSize to 100MB and allowEmptyFiles to false for proper file validation.
Handle Form Events
Implement event listeners for error, data, and fileBegin events to manage the upload process.
| Formidable Option | Description | Default Value |
|---|---|---|
| allowEmptyFiles | Determines if empty files should be allowed | true |
| minFileSize | Smallest file size allowed in bytes | 1 byte |
| maxFileSize | Largest file size allowed in bytes | 200 MB |
`fileparser.js`:

```javascript
const formidable = require('formidable');
const { Upload } = require('@aws-sdk/lib-storage');
const { S3Client } = require('@aws-sdk/client-s3');
const Transform = require('stream').Transform;

const accessKeyId = process.env.AWS_ACCESS_KEY_ID;
const secretAccessKey = process.env.AWS_SECRET_ACCESS_KEY;
const region = process.env.S3_REGION;
const Bucket = process.env.S3_BUCKET;

const parsefile = async (req) => {
  return new Promise((resolve, reject) => {
    const options = {
      maxFileSize: 100 * 1024 * 1024, // 100 MB converted to bytes
      allowEmptyFiles: false
    };

    const form = formidable(options);
    // Parsing is driven by the event handlers below; the callback is unused.
    form.parse(req, () => {});

    form.on('error', error => {
      reject(error.message);
    });

    form.on('data', data => {
      if (data.name === 'complete') {
        resolve(data.value);
      }
    });

    form.on('fileBegin', (formName, file) => {
      file.open = async function () {
        // Pass-through stream: formidable writes chunks into it and S3 reads
        // from it, so the file is streamed without being buffered on disk.
        this._writeStream = new Transform({
          transform(chunk, encoding, callback) {
            callback(null, chunk);
          }
        });
        this._writeStream.on('error', e => {
          form.emit('error', e);
        });

        // Upload to S3
        new Upload({
          client: new S3Client({
            credentials: { accessKeyId, secretAccessKey },
            region
          }),
          params: {
            ACL: 'public-read', // requires the bucket to have ACLs enabled
            Bucket,
            Key: `${Date.now().toString()}-${this.originalFilename}`,
            Body: this._writeStream
          },
          tags: [], // optional tags
          queueSize: 4, // optional concurrency configuration
          partSize: 1024 * 1024 * 5, // optional size of each part, in bytes, at least 5 MB
          leavePartsOnError: false // optional; set true to handle dropped parts manually
        })
          .done()
          .then(data => {
            form.emit('data', { name: 'complete', value: data });
          })
          .catch(err => {
            form.emit('error', err);
          });
      };

      file.end = function (cb) {
        this._writeStream.on('finish', () => {
          this.emit('end');
          cb();
        });
        this._writeStream.end();
      };
    });
  });
};

module.exports = parsefile;
```
Step 3: Test the Application
Start the Server
Run `node index.js` to start your Node.js server and verify it's running on the configured port.
Test File Upload
Open your browser to http://localhost:3000 and test uploading files of various sizes.
Verify S3 Upload
Check your S3 bucket to confirm files are uploaded with the correct permissions and unique timestamps.
```json
{
  "success": true,
  "data": {
    "$metadata": {
      "httpStatusCode": 200,
      "requestId": "request-id-here",
      "attempts": 1,
      "totalRetryDelay": 0
    },
    "ETag": "\"etag-here\"",
    "Key": "1640995200000-filename.jpg",
    "Location": "https://your-bucket.s3.region.amazonaws.com/1640995200000-filename.jpg",
    "Bucket": "your-bucket-name"
  }
}
```
Frequently Asked Questions
How do I handle large file uploads efficiently?
Use streaming uploads with transform streams to process files in chunks, reducing memory usage. Configure multipart uploads with appropriate partSize (minimum 5MB) and queueSize for optimal performance.
What's the difference between ACL public-read and private?
Public-read allows anyone to access the file via URL, while private restricts access to the file owner. Use public-read for publicly accessible files like images, and private for sensitive documents.
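If you keep objects private, you can still hand out temporary links with presigned URLs. A minimal sketch, assuming the `@aws-sdk/s3-request-presigner` package is installed (the bucket and key names are placeholders):

```javascript
const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3');
const { getSignedUrl } = require('@aws-sdk/s3-request-presigner');

const client = new S3Client({ region: process.env.S3_REGION });

// Returns a URL valid for one hour; the URL is signed locally with your
// credentials, so no network request is made here.
async function getTemporaryLink(Bucket, Key) {
  const command = new GetObjectCommand({ Bucket, Key });
  return getSignedUrl(client, command, { expiresIn: 3600 });
}
```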
How do I secure my AWS credentials in production?
Use AWS IAM roles instead of access keys when running on EC2, or store credentials in AWS Secrets Manager. Never commit credentials to version control or hardcode them in your application.
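For example, when the server runs on EC2, ECS, or Lambda with an IAM role attached, you can drop the explicit keys and let the SDK resolve credentials itself (a sketch of the client configuration only):

```javascript
const { S3Client } = require('@aws-sdk/client-s3');

// No `credentials` option: the SDK's default credential provider chain
// checks environment variables, shared config files, and the instance or
// task role, in that order.
const client = new S3Client({ region: process.env.S3_REGION });
```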
Can I upload multiple files at once?
Yes, add the multiple attribute to your file input element. Formidable will handle multiple files, and you can process each file separately in the fileBegin event handler.
How do I add file type validation?
Check the file.mimetype property in the fileBegin event handler. Create an array of allowed MIME types and reject files that don't match before starting the S3 upload process.
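A minimal sketch of such a check (the allow-list here is hypothetical; adjust it to your application):

```javascript
// Hypothetical allow-list of acceptable MIME types.
const ALLOWED_TYPES = ['image/jpeg', 'image/png', 'application/pdf'];

function isAllowedType(mimetype) {
  return ALLOWED_TYPES.includes(mimetype);
}

// In fileparser.js, reject the file before the S3 upload starts:
// form.on('fileBegin', (formName, file) => {
//   if (!isAllowedType(file.mimetype)) {
//     form.emit('error', new Error(`Unsupported file type: ${file.mimetype}`));
//     return;
//   }
//   // ...proceed with the file.open / file.end overrides...
// });

console.log(isAllowedType('image/png')); // true
console.log(isAllowedType('text/html')); // false
```

Note that the MIME type is supplied by the client, so treat this as a first-line filter rather than a security guarantee.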
Need Help with AWS Integration?
Our cloud experts can help you implement secure file upload solutions, optimize S3 performance, and set up proper AWS infrastructure for your applications.
