Preface
I'm writing this tiny guide as I explore AWS using the JavaScript SDKs.
Overview
AWS can be overwhelming for first-time users.
I plan to explore the essential services commonly used in web development, including:
- S3 (Object Storage)
- SNS (Notification Service)
- DynamoDB (NoSQL Database)
- SQS (Message Queue)
- Lambda (Function as a Service)
Related code can be found on GitHub.
awscli
There are four primary ways to interact with AWS: the web console, the command line, IaC tools, and SDKs. Compared to the web console, the command line interface is typically more efficient and straightforward for many tasks.
awscli is the official command line interface for AWS. It is written in Python and can be installed via pip or Homebrew:
$ brew install awscli
Setting Up AWS CLI
After installation, configure your AWS credentials:
$ aws configure
You'll need to provide your Access Key ID and Secret Access Key, which can be obtained from the AWS Management Console. To do this, navigate to IAM (Identity and Access Management), create a new user, and during the process, create a user group and attach the necessary permission policies, such as "AmazonS3FullAccess" and "AmazonSNSFullAccess."
Once the user is created, generate an access key for that user to complete the CLI configuration.
Verify Configuration
To confirm that your setup is correct, run:
$ aws sts get-caller-identity
You should see an output similar to the following:
{
  "UserId": "***",
  "Account": "***",
  "Arn": "arn:aws:iam::<account>:user/<name>"
}
Here, "Arn" stands for Amazon Resource Name, which uniquely identifies AWS resources.
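Every ARN follows the same colon-separated layout, arn:partition:service:region:account-id:resource, so it can be taken apart mechanically. A minimal sketch (parseArn and the sample ARN below are made up for illustration):

```typescript
// Split an ARN into its colon-separated components.
// Format: arn:partition:service:region:account-id:resource
// The resource part may itself contain colons, so rejoin the tail.
function parseArn(arn: string) {
  const [prefix, partition, service, region, accountId, ...rest] = arn.split(":")
  return { prefix, partition, service, region, accountId, resource: rest.join(":") }
}

// IAM is a global service, so its ARNs have an empty region field
const parts = parseArn("arn:aws:iam::123456789012:user/alice")
console.log(parts.service)  // "iam"
console.log(parts.resource) // "user/alice"
```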
S3
Simple Storage Service (S3) is an object storage service and one of the earliest services offered by AWS. It is now a cornerstone of cloud storage thanks to its simplicity, scalability, and durability.
It provides a simple interface for storing and retrieving any amount of data.
Creating a Bucket
To get started, we need to create a bucket (mb stands for "make bucket"). The bucket namespace is shared by all AWS users, so the name must be globally unique:
$ aws s3 mb s3://your-unique-bucket-name
Uploading Files
$ aws s3 cp file.txt s3://your-unique-bucket-name/
Listing Files
$ aws s3 ls s3://your-unique-bucket-name
Output:
2024-10-30 10:22:11 115 file.txt
Downloading Files
$ aws s3 cp s3://your-unique-bucket-name/file.txt download.txt
Using the SDK
I will use bun for simplicity:
Install the SDK:
$ bun add @aws-sdk/client-s3
Add s3.ts with the following content:
import { S3Client, GetObjectCommand, PutObjectCommand } from '@aws-sdk/client-s3'
import fs from 'node:fs'

const s3 = new S3Client()
const Bucket = 'wdgfs'

// Upload this script itself as an object
await s3.send(
  new PutObjectCommand({
    Bucket,
    Key: 's3.ts',
    Body: fs.readFileSync('./s3.ts'),
  }),
)

// Download it back and print its content
const { Body } = await s3.send(
  new GetObjectCommand({
    Bucket,
    Key: 's3.ts',
  }),
)
console.log(await Body.transformToString())
Credentials
You might have noticed that no credentials are provided in the code; that's
because the SDK retrieves them from ~/.aws/credentials.
Run your script with:
$ bun s3.ts
This covers the very basics of S3.
SNS
SNS stands for Simple Notification Service. It's a fully managed messaging service that allows you to send messages to various endpoints, including mobile devices, email, SMS, and HTTP endpoints. It can also deliver messages to SQS queues or invoke Lambda functions.
Topics, Subscriptions, and Publishing
SNS follows a publish/subscribe model built around topics: publishers send messages to a topic, and all of the topic's subscribers receive them.
Creating a Topic
To create a topic via awscli:
$ aws sns create-topic --name topic0
This will return an ARN:
arn:aws:sns:us-east-1:<account>:topic0
Subscribing
First, install the SDK:
$ bun add @aws-sdk/client-sns
Next, create a file named subscribe.ts with the following content:
import { SNSClient, SubscribeCommand } from "@aws-sdk/client-sns"

const sns = new SNSClient({})
const params = {
  TopicArn: "arn:aws:sns:us-east-1:<account>:topic0",
  Protocol: "email",
  Endpoint: "<youremail>",
}
const command = new SubscribeCommand(params)
const data = await sns.send(command)
// For email subscriptions this prints "pending confirmation"
// until the subscription is confirmed
console.log("Subscription created successfully:", data.SubscriptionArn)
Run it using bun subscribe.ts.
You will receive an "AWS Notification - Subscription Confirmation" email, and you need to confirm it.
Publishing Messages
Next, create a file named publish.ts with the following content:
import { SNSClient, PublishCommand } from "@aws-sdk/client-sns"

const client = new SNSClient({})
const params = {
  TopicArn: "arn:aws:sns:us-east-1:112233445566:topic0",
  Message: "Hello from SNS!",
}
const data = await client.send(new PublishCommand(params))
console.log("Message published successfully:", data.MessageId)
Run it using bun publish.ts.
You should receive an email with the expected content.
This saves a lot of trouble in managing subscriptions and adapting to different transports, not to mention other benefits like scalability and reliability.
DynamoDB
DynamoDB is a fully managed NoSQL database service designed for high performance and scalability.
It is basically a key-value store, but it also supports document-like data structures.
Create a Table
$ aws dynamodb create-table --table-name users \
--attribute-definitions AttributeName=id,AttributeType=S \
--key-schema AttributeName=id,KeyType=HASH \
--billing-mode PAY_PER_REQUEST
Using the SDK
$ bun add @aws-sdk/client-dynamodb
Here's an example using the SDK to insert and retrieve an item:
import {
  DynamoDBClient,
  PutItemCommand,
  GetItemCommand,
} from "@aws-sdk/client-dynamodb"

const client = new DynamoDBClient({})

// Insert an item; each attribute is wrapped in a type descriptor
// (S = string, N = number, transmitted as a string)
await client.send(new PutItemCommand({
  TableName: "users",
  Item: {
    id: { S: "1" },
    name: { S: "Alice" },
    age: { N: "7" },
  },
}))

// Retrieve the item by its partition key
const data = await client.send(new GetItemCommand({
  TableName: "users",
  Key: {
    id: { S: "1" },
  },
}))
console.log(data)
The official SDK feels a bit low-level; third-party libraries like Dynamoose provide a better developer experience.
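The typed attribute maps are a big part of that low-level feel. The official @aws-sdk/util-dynamodb package ships marshall/unmarshall helpers to convert them to and from plain objects; as an illustration, a stripped-down unmarshal covering only the S and N types used above might look like this (unmarshalItem is a made-up helper, not part of the SDK):

```typescript
// Hypothetical helper covering only two DynamoDB attribute types:
// S (string) and N (number, which travels over the wire as a string).
type AttrValue = { S: string } | { N: string }

function unmarshalItem(item: Record<string, AttrValue>) {
  const out: Record<string, string | number> = {}
  for (const [key, value] of Object.entries(item)) {
    // N attributes are converted back to JavaScript numbers
    out[key] = "S" in value ? value.S : Number(value.N)
  }
  return out
}

console.log(unmarshalItem({ id: { S: "1" }, name: { S: "Alice" }, age: { N: "7" } }))
// → { id: "1", name: "Alice", age: 7 }
```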
SQS
Simple Queue Service (SQS) is a fully managed queue service, meaning AWS handles the infrastructure and scaling. The API is indeed a bit simpler compared to RabbitMQ or Kafka.
Create a Queue
$ aws sqs create-queue --queue-name queue0
Using the SDK
$ bun add @aws-sdk/client-sqs
Publish a Message
Add sqs-send.ts:
import {
  SQSClient,
  SendMessageCommand,
  GetQueueUrlCommand,
} from "@aws-sdk/client-sqs"

const sqs = new SQSClient({})
const { QueueUrl } = await sqs.send(new GetQueueUrlCommand({ QueueName: "queue0" }))
const data = await sqs.send(new SendMessageCommand({
  QueueUrl,
  MessageBody: "Hello, World!",
}))
console.log(data.MessageId)
Run it with bun sqs-send.ts.
Then add sqs-receive.ts:
import {
  SQSClient,
  ReceiveMessageCommand,
  DeleteMessageCommand,
  GetQueueUrlCommand,
} from "@aws-sdk/client-sqs"

const sqs = new SQSClient({})
const { QueueUrl } = await sqs.send(new GetQueueUrlCommand({ QueueName: "queue0" }))

// Long-poll for up to 20 seconds, receiving at most 10 messages
const { Messages } = await sqs.send(new ReceiveMessageCommand({
  QueueUrl,
  MaxNumberOfMessages: 10,
  WaitTimeSeconds: 20,
}))

for (const message of Messages ?? []) {
  console.log("Message:", message.Body)
  // A received message must be deleted explicitly,
  // otherwise it becomes visible again after the visibility timeout
  await sqs.send(new DeleteMessageCommand({
    QueueUrl,
    ReceiptHandle: message.ReceiptHandle,
  }))
  console.log("Message deleted")
}
Run it with bun sqs-receive.ts.
Lambda
Like other resources, Lambda can be deployed with awscli. But I'll use SST for better developer experience.
I'll also use hono as the web framework. It's minimal and works on other platforms as well.
Permission
SST requires a number of permissions to work correctly. For now, grant the
AdministratorAccess policy to your user; you can narrow the permissions down later.
Init
$ bunx sst@latest init
$ bun add hono
Update the run function in sst.config.ts:
async run() {
  const fn = new sst.aws.Function("Foo", {
    url: true,
    handler: "index.handler",
  })
  return {
    api: fn.url,
  }
}
Add index.ts:
import { Hono } from "hono"
import { handle } from "hono/aws-lambda"

const app = new Hono()
  .get("/", async (c) => {
    return c.text("Hono")
  })

export const handler = handle(app)
Start dev server:
$ bunx sst dev
Deployment
$ bunx sst deploy --stage prod
More On SST
SST combines IaC with a web framework, allowing you to allocate AWS resources declaratively.
Add a bucket in sst.config.ts:
+ const bucket = new sst.aws.Bucket("bucket0")
  const hono = new sst.aws.Function("Foo", {
    url: true,
+   link: [bucket],
    handler: "index.handler",
  })
Then you can use it in the application like this:
import { Resource } from "sst"
import { S3Client, ListObjectsV2Command } from "@aws-sdk/client-s3"

const s3 = new S3Client()
const objects = await s3.send(
  new ListObjectsV2Command({
    Bucket: Resource.bucket0.name,
  }),
)
SST ensures the bucket is created, and linking it grants the function permission to access it.