Integrate AWS S3 with Your Node.js Project: A Step-by-Step Guide

Mr. Freelancer
7 min read · Apr 13, 2024


Today, I’ll walk you through a simple method to connect your AWS S3 to a Node.js project in just a few easy steps. Let’s get started on our journey.

Step 1: Ensure you have an AWS account. Log in to your account and navigate to the AWS S3 panel.

Step 2: Create a bucket to store your files. Make sure your bucket name is unique, as AWS S3 validates that each bucket has a distinct name. For example, I’ll name my bucket “akshay-testing-01”.
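Bucket names must be 3–63 characters, use only lowercase letters, digits, hyphens, and dots, and must start and end with a letter or digit. As a quick illustration (this hypothetical helper is not part of the AWS SDK, and AWS enforces a few additional rules, such as forbidding IP-address-style names), you could pre-check a candidate name like this:

```javascript
// Hypothetical helper: checks an S3 bucket name against the core naming rules.
// Illustrative only -- AWS enforces additional rules beyond these.
function isValidBucketName(name) {
  if (typeof name !== 'string') return false;
  if (name.length < 3 || name.length > 63) return false;
  // Only lowercase letters, digits, dots, and hyphens; must start and end alphanumeric
  if (!/^[a-z0-9][a-z0-9.-]*[a-z0-9]$/.test(name)) return false;
  if (name.includes('..')) return false; // no consecutive dots
  return true;
}

console.log(isValidBucketName('akshay-testing-01')); // true
console.log(isValidBucketName('My_Bucket'));         // false (uppercase and underscore)
```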

Step 3: Next, create some subfolders within your bucket if needed. For instance, I’ll create “sub1” and “sub2” subfolders within the bucket I just created.

Step 4: Head to the IAM console. You can either create a new IAM user or use an existing one. Ensure the chosen user or role has the permissions required to access the S3 bucket. Once you’ve selected the user or role, open the “Security credentials” tab and choose “Create access key”. If you’re using an existing user, you may already see an access key listed here. After generating the access key, make sure to securely save both the access key ID and the secret access key.
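For reference, a minimal IAM policy granting just the object-level actions this guide uses (upload, download, existence check, delete) might look like the following. The bucket name here is the example one from Step 2; substitute your own:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::akshay-testing-01/*"
    }
  ]
}
```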

Step 5: Open your Node.js project in VS Code and install the following packages to establish a connection with AWS S3 (dotenv is used to load the environment variables we’ll add in the next step):

npm install @aws-sdk/client-s3
npm install @aws-sdk/s3-request-presigner
npm install dotenv

Step 6: I assume you already have a .env file in your project. If not, create one and add the following keys for the AWS S3 connection.

# Your created bucket name in AWS S3
AWS_BUCKET_NAME="akshay-testing-01"

# Your AWS access key ID
AWS_ACCESSKEYID="PGIP6KB2K1CXGVECA5RB"

# Your AWS secret access key
AWS_SECRETACCESSKEY="9hHnf7G8TLS128/v7sh8sH62LzHHqsveSxinAaVm"

# The AWS region where your bucket is located
AWS_REGION="ap-south-1"

# NOTE: The keys provided above are for illustration purposes only and won't function if used directly.

Step 7: After adding these details, save your .env file. Next, create a new file named awsS3connect.js. This file will contain the common functions for communicating with AWS S3.

Step 8: Let’s kick off by connecting AWS S3 to Node.js. Add the following code snippet to the newly created file.

// Import the modules we need from the AWS SDK
const { S3Client, GetObjectCommand, HeadObjectCommand, PutObjectCommand, DeleteObjectCommand } = require('@aws-sdk/client-s3');
const { getSignedUrl } = require('@aws-sdk/s3-request-presigner');
const fs = require('fs');

// Load the variables from the .env file
require('dotenv').config();

// Initialize an S3 client with the credentials from the environment
const s3Client = new S3Client({
  region: process.env.AWS_REGION, // AWS region from environment variables
  credentials: {
    accessKeyId: process.env.AWS_ACCESSKEYID, // Access key ID from environment variables
    secretAccessKey: process.env.AWS_SECRETACCESSKEY // Secret access key from environment variables
  }
});

// Export folder names for easier reference
exports.awsFolderNames = {
  sub1: 'sub1',
  sub2: 'sub2'
};
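Since every call that follows prefixes the folder name onto the file name by hand, a small helper can build object keys consistently. This is a hypothetical convenience function, not part of the AWS SDK:

```javascript
// Hypothetical helper: builds an S3 object key from a folder name and a file
// name, trimming stray slashes so keys never start with "/" or contain "//".
function buildS3Key(folder, fileName) {
  const cleanFolder = String(folder).replace(/^\/+|\/+$/g, '');
  const cleanFile = String(fileName).replace(/^\/+/, '');
  return cleanFolder ? `${cleanFolder}/${cleanFile}` : cleanFile;
}

console.log(buildS3Key('sub1', 'abcd.txt')); // "sub1/abcd.txt"
console.log(buildS3Key('/sub2/', 'a.png'));  // "sub2/a.png"
```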

Step 9: Now, let’s write our first function, which uploads a file to our S3 bucket and stores it in the “sub1” folder. Before proceeding, make sure you have an “uploads” folder in your Node.js project to store files locally before transferring them to AWS S3; we’ll be sending files from this folder to AWS S3.

exports.uploadFileToAws = async (fileName, filePath) => {
  try {
    // Configure the parameters for the S3 upload
    const uploadParams = {
      Bucket: process.env.AWS_BUCKET_NAME,
      Key: fileName,
      Body: fs.createReadStream(filePath),
    };

    // Upload the file to S3
    await s3Client.send(new PutObjectCommand(uploadParams));

    // Delete the file from the local filesystem after a successful upload
    if (fs.existsSync(filePath)) {
      fs.unlink(filePath, (err) => {
        if (err) {
          console.error('Error deleting file:', err);
        } else {
          console.log('File deleted successfully.');
        }
      });
    }
  } catch (err) {
    console.error('Error', err);
    return 'error';
  }
};

Step 10: Next, we’ll write a function to retrieve the uploaded file from the AWS S3 bucket as a pre-signed URL with an expiration time. The generated download URL expires after a set period; the default is 15 minutes (900 seconds), and the expiresIn option lets us adjust it, in seconds, to suit our requirements.

// Get a signed URL for downloading a file from AWS S3
exports.getFileUrlFromAws = async (fileName, expireTime = null) => {
  try {
    // Check if the file is available in the AWS S3 bucket
    const check = await exports.isFileAvailableInAwsBucket(fileName);

    if (check) {
      // Create a GetObjectCommand to retrieve the file from S3
      const command = new GetObjectCommand({
        Bucket: process.env.AWS_BUCKET_NAME, // AWS S3 bucket name
        Key: fileName, // File name
      });

      // Generate a signed URL with a custom expiration time (in seconds) if provided
      if (expireTime != null) {
        return await getSignedUrl(s3Client, command, { expiresIn: expireTime });
      } else {
        // Otherwise use the default expiration time (15 minutes)
        return await getSignedUrl(s3Client, command);
      }
    } else {
      // Return an error message if the file is not in the bucket
      return 'error';
    }
  } catch (err) {
    // Handle any errors that occur during the process
    console.error('error ::', err);
    return 'error';
  }
};
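Note that expiresIn is specified in seconds, and SigV4 pre-signed URLs are capped at 7 days (604,800 seconds), so very long expiries simply aren’t possible. If your application works with target dates rather than durations, a small hypothetical helper (not part of the AWS SDK) can convert a Date into a valid expiresIn value:

```javascript
const MAX_PRESIGN_SECONDS = 7 * 24 * 60 * 60; // 604800: the SigV4 pre-signed URL limit

// Hypothetical helper: converts a target expiry Date into an expiresIn value
// in seconds, clamped between 1 second and the 7-day maximum.
function secondsUntil(date, now = new Date()) {
  const seconds = Math.floor((date.getTime() - now.getTime()) / 1000);
  return Math.min(Math.max(seconds, 1), MAX_PRESIGN_SECONDS);
}

const base = new Date('2024-04-13T00:00:00Z');
console.log(secondsUntil(new Date('2024-04-13T00:15:00Z'), base)); // 900 (15 minutes)
console.log(secondsUntil(new Date('2026-04-13T00:00:00Z'), base)); // 604800 (clamped to 7 days)
```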

In the function above, we’ve added an additional step to ensure the file’s existence in AWS before generating its URL. This step is crucial as generating a URL for a non-existent file could lead to errors.

exports.isFileAvailableInAwsBucket = async (fileName) => {
  try {
    // HeadObject succeeds only if the object exists
    await s3Client.send(new HeadObjectCommand({
      Bucket: process.env.AWS_BUCKET_NAME,
      Key: fileName,
    }));
    return true;
  } catch (err) {
    if (err.name !== 'NotFound') {
      // Log unexpected errors (permissions, network, etc.)
      console.error('Error checking file:', err);
    }
    // File not found in the AWS bucket (or the check failed)
    return false;
  }
};

Step 11: Finally, we’ll create our last function to remove a file from our AWS S3 bucket.

exports.deleteFileFromAws = async (fileName) => {
  try {
    // Configure the parameters for the S3 delete
    const deleteParams = {
      Bucket: process.env.AWS_BUCKET_NAME,
      Key: fileName,
    };
    // Delete the file from S3
    await s3Client.send(new DeleteObjectCommand(deleteParams));
  } catch (err) {
    console.error('Error', err);
    return 'error';
  }
};

Now that we’ve implemented all the functions related to AWS S3 functionality, let’s proceed to the next step, which involves utilizing these functions to upload files to the AWS S3 bucket. To do this, I’ll create a new file named s3Use.js.

// s3Use.js - File for utilizing AWS S3 functions

// Import the module for AWS S3 functions
const awss3connect = require('./awsS3connect');

// Function to upload a file to the AWS S3 bucket
exports.uploadFileToAwsS3 = async function (dataObject) {
  try {
    // Path of the locally saved file to upload
    const savePath = `uploads/abcd.txt`;

    // Upload the file to the AWS S3 bucket in the specified subfolder
    await awss3connect.uploadFileToAws(`${awss3connect.awsFolderNames.sub1}/${dataObject.fileName}`, savePath);
  } catch (error) {
    console.error('Error uploading file to AWS S3:', error);
    throw error;
  }
};

// Function to retrieve a file from the AWS S3 bucket as a URL
exports.getFileFromAwsS3 = async function (dataObject) {
  try {
    const fileName = `abcd.txt`; // Specify the file name

    // Get the file URL with the default expiration time (15 minutes)
    const fileUrl = await awss3connect.getFileUrlFromAws(`${awss3connect.awsFolderNames.sub1}/${fileName}`);
    console.log('File URL:', fileUrl);

    // Set a custom expiration time in seconds (here, 7 days -- the maximum
    // allowed for SigV4 pre-signed URLs)
    const sevenDaysInSeconds = 7 * 24 * 60 * 60;
    const fileUrl1 = await awss3connect.getFileUrlFromAws(`${awss3connect.awsFolderNames.sub1}/${fileName}`, sevenDaysInSeconds);
    console.log('Custom Expiry File URL:', fileUrl1);
  } catch (error) {
    console.error('Error getting file from AWS S3:', error);
    throw error;
  }
};

// Function to delete a file from the AWS S3 bucket
exports.deleteFileFromAwsS3 = async function (dataObject) {
  try {
    const fileName = `abcd.txt`; // Specify the file name

    // Delete the specified file from the AWS S3 bucket
    await awss3connect.deleteFileFromAws(`${awss3connect.awsFolderNames.sub1}/${fileName}`);
  } catch (error) {
    console.error('Error deleting file from AWS S3:', error);
    throw error;
  }
};

Here is the complete awsS3connect.js file combined into a single block.

// Import the modules we need from the AWS SDK
const { S3Client, GetObjectCommand, HeadObjectCommand, PutObjectCommand, DeleteObjectCommand } = require('@aws-sdk/client-s3');
const { getSignedUrl } = require('@aws-sdk/s3-request-presigner');
const fs = require('fs');

// Load the variables from the .env file
require('dotenv').config();

// Initialize an S3 client with the credentials from the environment
const s3Client = new S3Client({
  region: process.env.AWS_REGION, // AWS region from environment variables
  credentials: {
    accessKeyId: process.env.AWS_ACCESSKEYID, // Access key ID from environment variables
    secretAccessKey: process.env.AWS_SECRETACCESSKEY // Secret access key from environment variables
  }
});

// Export folder names for easier reference
exports.awsFolderNames = {
  sub1: 'sub1',
  sub2: 'sub2'
};

exports.uploadFileToAws = async (fileName, filePath) => {
  try {
    // Configure the parameters for the S3 upload
    const uploadParams = {
      Bucket: process.env.AWS_BUCKET_NAME,
      Key: fileName,
      Body: fs.createReadStream(filePath),
    };

    // Upload the file to S3
    await s3Client.send(new PutObjectCommand(uploadParams));

    // Delete the file from the local filesystem after a successful upload
    if (fs.existsSync(filePath)) {
      fs.unlink(filePath, (err) => {
        if (err) {
          console.error('Error deleting file:', err);
        } else {
          console.log('File deleted successfully.');
        }
      });
    }
  } catch (err) {
    console.error('Error', err);
    return 'error';
  }
};

// Get a signed URL for downloading a file from AWS S3
exports.getFileUrlFromAws = async (fileName, expireTime = null) => {
  try {
    // Check if the file is available in the AWS S3 bucket
    const check = await exports.isFileAvailableInAwsBucket(fileName);

    if (check) {
      // Create a GetObjectCommand to retrieve the file from S3
      const command = new GetObjectCommand({
        Bucket: process.env.AWS_BUCKET_NAME, // AWS S3 bucket name
        Key: fileName, // File name
      });

      // Generate a signed URL with a custom expiration time (in seconds) if provided
      if (expireTime != null) {
        return await getSignedUrl(s3Client, command, { expiresIn: expireTime });
      } else {
        // Otherwise use the default expiration time (15 minutes)
        return await getSignedUrl(s3Client, command);
      }
    } else {
      // Return an error message if the file is not in the bucket
      return 'error';
    }
  } catch (err) {
    // Handle any errors that occur during the process
    console.error('error ::', err);
    return 'error';
  }
};

exports.isFileAvailableInAwsBucket = async (fileName) => {
  try {
    // HeadObject succeeds only if the object exists
    await s3Client.send(new HeadObjectCommand({
      Bucket: process.env.AWS_BUCKET_NAME,
      Key: fileName,
    }));
    return true;
  } catch (err) {
    if (err.name !== 'NotFound') {
      // Log unexpected errors (permissions, network, etc.)
      console.error('Error checking file:', err);
    }
    // File not found in the AWS bucket (or the check failed)
    return false;
  }
};

exports.deleteFileFromAws = async (fileName) => {
  try {
    // Configure the parameters for the S3 delete
    const deleteParams = {
      Bucket: process.env.AWS_BUCKET_NAME,
      Key: fileName,
    };
    // Delete the file from S3
    await s3Client.send(new DeleteObjectCommand(deleteParams));
  } catch (err) {
    console.error('Error', err);
    return 'error';
  }
};

That wraps up our guide on connecting to AWS S3! Thanks for diving into this straightforward approach. Stay tuned for more informative blogs like this one. Your enthusiasm keeps me going!
