AWS Reference Guide: CLI, S3, EventBridge, and Best Practices

A collection of AWS tips, code snippets, and best practices covering AWS CLI installation, S3 bucket configuration, EventBridge setup, IAM policies, and more.

AWS CLI Installation and Setup

Installing AWS CLI v2 on ARM-based Systems

# Download the installer
curl "https://awscli.amazonaws.com/awscli-exe-linux-aarch64.zip" -o "awscliv2.zip"

# Extract the files
unzip awscliv2.zip

# Run the installer
sudo ./aws/install

Installing AWS CLI v2 on x64-based Systems

# Download the installer
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"

# Extract the files
unzip awscliv2.zip

# Run the installer
sudo ./aws/install

Installing AWS CLI v2 on Raspberry Pi 4

# Install dependencies
sudo apt-get install git python3-pip cmake

# Clone the AWS CLI repository
git clone https://github.com/aws/aws-cli.git
cd aws-cli
git checkout v2

# Install AWS CLI
pip3 install -r requirements.txt --upgrade --user
pip3 install . --upgrade --user

# Add to PATH
export PATH=/home/pi/.local/bin:$PATH

Verifying AWS CLI Installation

To check your current AWS identity and confirm the CLI is working correctly:

aws sts get-caller-identity

AWS EKS (Elastic Kubernetes Service)

Adding Users to EKS Clusters

To add users to your EKS cluster, follow the official documentation on mapping IAM users and roles to the cluster through the aws-auth ConfigMap.
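One common approach is to add an IAM identity mapping with eksctl; a minimal sketch (cluster name, region, account ID, and user name are placeholders, and system:masters grants cluster-admin, so scope the group as needed):

# Map an IAM user into the cluster's aws-auth ConfigMap
eksctl create iamidentitymapping \
  --cluster <CLUSTER_NAME> \
  --region <REGION> \
  --arn arn:aws:iam::<ACCOUNT_ID>:user/<USER_NAME> \
  --username <USER_NAME> \
  --group system:masters

# Verify the mapping
kubectl describe configmap aws-auth -n kube-system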

AWS Networking Concepts

Internet Gateway vs NAT Gateway

Gateway Type           | Purpose
Internet Gateway (IGW) | Allows instances with public IPs to access the internet and receive inbound connections
NAT Gateway (NGW)      | Allows instances with private IPs (no public IPs) to access the internet, but does not allow inbound connections

AWS EBS (Elastic Block Storage)

Mounting EBS Volumes to Ubuntu Instances

To mount an EBS volume to an Ubuntu instance, follow the steps in the official AWS documentation for using EBS volumes.
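A minimal sketch, assuming the volume is already attached and appears as /dev/xvdf (the device name is an assumption; on Nitro instances it may show up as /dev/nvme1n1) and that a new ext4 filesystem mounted at /data is wanted:

# Confirm the device name of the attached volume
lsblk

# Create a filesystem on the new, empty volume (skip if it already contains data)
sudo mkfs -t ext4 /dev/xvdf

# Create a mount point and mount the volume
sudo mkdir -p /data
sudo mount /dev/xvdf /data

# To persist the mount across reboots, add the volume's UUID to /etc/fstab
sudo blkid /dev/xvdf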

Resizing EBS Volumes

To increase the size of an EBS volume:

  1. First, modify the volume size in AWS (from the console or with aws ec2 modify-volume).

  2. Then, extend the partition and filesystem on the Linux instance to use the additional space, as sketched below.
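A sketch of both steps (the volume ID, new size, and device names are assumptions; growpart is in the cloud-guest-utils package on Ubuntu):

# Step 1: grow the volume itself
aws ec2 modify-volume --volume-id <VOLUME_ID> --size 200

# Step 2: on the instance, grow the partition and then the filesystem
lsblk
sudo growpart /dev/nvme0n1 1
sudo resize2fs /dev/nvme0n1p1
# For XFS filesystems, run xfs_growfs on the mount point instead, e.g. sudo xfs_growfs -d /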

Amazon S3 Quick Start Guide

Step 1: Create an IAM User for S3 Access

  1. Navigate to the IAM console
  2. Create a new user named s3-qa
  3. Generate access keys for programmatic access
  4. Attach appropriate S3 permissions (or use the bucket policy below); a CLI sketch of the user setup follows this list
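The user and access keys can also be created from the command line; a minimal sketch with the AWS CLI (only the user name s3-qa comes from the steps above):

# Create the IAM user
aws iam create-user --user-name s3-qa

# Generate an access key pair for programmatic access (note the output values)
aws iam create-access-key --user-name s3-qa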

Step 2: Create an S3 Bucket with User Access

  1. Navigate to the S3 console
  2. Create a new bucket named qa-app
  3. Configure the bucket settings as needed
  4. Add the following bucket policy to allow access for the IAM user:
{
    "Version": "2012-10-17",
    "Id": "BucketPolicy",
    "Statement": [
        {
            "Sid": "Stmt1295042087538",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::<ACCOUNT_ID>:user/s3-qa"
            },
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:ListMultipartUploadParts",
                "s3:GetObjectAttributes",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::qa-app/*",
                "arn:aws:s3:::qa-app"
            ]
        }
    ]
}
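The bucket creation and policy attachment can also be done with the AWS CLI; a minimal sketch, assuming the policy above is saved as bucket-policy.json (the file name and region are assumptions):

# Create the bucket (omit --create-bucket-configuration when the region is us-east-1)
aws s3api create-bucket --bucket qa-app --region <REGION> --create-bucket-configuration LocationConstraint=<REGION>

# Attach the bucket policy
aws s3api put-bucket-policy --bucket qa-app --policy file://bucket-policy.json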

Step 3: Configure S3 CORS Settings

To enable cross-origin resource sharing for your S3 bucket (necessary for web applications):

  1. Navigate to the bucket’s Permissions tab
  2. Scroll down to Cross-origin resource sharing (CORS)
  3. Click Edit and add the following configuration:
[
    {
        "AllowedHeaders": [
            "*"
        ],
        "AllowedMethods": [
            "HEAD",
            "GET",
            "PUT",
            "POST",
            "DELETE"
        ],
        "AllowedOrigins": [
            "http://localhost:3000",
            "http://localhost:4000",
            "http://localhost:4200",
            "https://*.example.com"
        ],
        "ExposeHeaders": [
            "ETag"
        ]
    }
]
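The same CORS rules can be applied with the AWS CLI; a minimal sketch, assuming the configuration above is saved as cors.json (the file name is an assumption):

aws s3api put-bucket-cors --bucket qa-app --cors-configuration file://cors.json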

EventBridge

Sample S3 Event

{
  "version": "0",
  "id": "17793124-05d4-b198-2fde-7ededc63b103",
  "detail-type": "Object Created",
  "source": "aws.s3",
  "account": "123456789012",
  "time": "2021-11-12T00:00:00Z",
  "region": "ca-central-1",
  "resources": ["arn:aws:s3:::<BUCKET_NAME>"],
  "detail": {
    "version": "0",
    "bucket": {
      "name": "<BUCKET_NAME>"
    },
    "object": {
      "key": "example-key",
      "size": 5,
      "etag": "b1946ac92492d2347c6235b4d2611184",
      "version-id": "IYV3p45BT0ac8hjHg1houSdS1a.Mro8e",
      "sequencer": "00617F08299329D189"
    },
    "request-id": "N4N7GDK58NMKJ12R",
    "requester": "123456789012",
    "source-ip-address": "1.2.3.4",
    "reason": "PutObject"
  }
}
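A rule on the default event bus can match events shaped like the sample above; a minimal event-pattern sketch (the bucket name is a placeholder), which could be supplied with aws events put-rule --name <RULE_NAME> --event-pattern file://pattern.json:

{
  "source": ["aws.s3"],
  "detail-type": ["Object Created"],
  "detail": {
    "bucket": {
      "name": ["<BUCKET_NAME>"]
    }
  }
}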

Enabling EventBridge for an S3 Bucket

https://docs.aws.amazon.com/AmazonS3/latest/userguide/enable-event-notifications-eventbridge.html

You can enable Amazon EventBridge using the S3 console, AWS Command Line Interface (AWS CLI), or Amazon S3 REST API.

To enable EventBridge event delivery using the S3 console:

  1. Sign in to the AWS Management Console and open the Amazon S3 console at https://console.aws.amazon.com/s3/.

  2. In the Buckets list, choose the name of the bucket that you want to enable events for.

  3. Choose Properties.

  4. Navigate to the Event Notifications section and find the Amazon EventBridge subsection. Choose Edit.

  5. Under Send notifications to Amazon EventBridge for all events in this bucket, choose On.
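The same setting can be applied with the AWS CLI; a minimal sketch (the bucket name is a placeholder, and EventBridgeConfiguration takes no parameters, so an empty object turns delivery on):

# Note: this call replaces the bucket's entire notification configuration
aws s3api put-bucket-notification-configuration \
  --bucket <BUCKET_NAME> \
  --notification-configuration '{ "EventBridgeConfiguration": {} }'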

EventBridge Pipe with SQS

When the SQS messages are encrypted with KMS, you need to add the policy statement below to the role assigned to the EventBridge pipe:

{
        "Version": "2012-10-17",
        "Statement": [
                {
                        "Sid": "kmsStatement",
                        "Effect": "Allow",
                        "Action": [
                                "kms:GenerateDataKey",
                                "kms:Decrypt"
                        ],
                        "Resource": "*"
                }
        ]
}
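One way to attach this statement is as an inline policy on the pipe's execution role; a sketch, assuming the statement above is saved as kms-policy.json and with a placeholder role name (both assumptions). For tighter scoping, replace the "*" resource with the ARN of the KMS key that encrypts the queue.

aws iam put-role-policy \
  --role-name <PIPE_ROLE_NAME> \
  --policy-name sqs-kms-access \
  --policy-document file://kms-policy.json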

S3 Access Using a Policy and User

Define the policy:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:ListBucketMultipartUploads",
                "s3:PutObjectAcl"
            ],
            "Resource": [
                "arn:aws:s3:::<BUCKET_NAME>",
                "arn:aws:s3:::<BUCKET_NAME>/*"
            ]
        }
    ]
}

Create a user and attach the policy to the user:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:ListBucketMultipartUploads",
                "s3:PutObjectAcl"
            ],
            "Resource": [
                "arn:aws:s3:::bucket-name",
                "arn:aws:s3:::bucket-name/*"
            ]
        }
    ]
}
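A sketch of these two steps with the AWS CLI, assuming the policy above is saved as s3-policy.json (the user name, policy name, and file name are assumptions):

# Create the user
aws iam create-user --user-name <USER_NAME>

# Attach the policy above as an inline policy
aws iam put-user-policy --user-name <USER_NAME> --policy-name s3-upload --policy-document file://s3-policy.json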

Create user credentials for command-line usage:
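A minimal sketch for generating an access key pair (the user name is a placeholder); record the AccessKeyId and SecretAccessKey from the output:

aws iam create-access-key --user-name <USER_NAME>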

Upload a File to S3

Create the user, attach the S3 policy, and create credentials as above, then install and configure the AWS CLI:

# Install the AWS CLI (Alpine Linux package)
apk add aws-cli

# Configure credentials and region for the session
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=
export AWS_DEFAULT_REGION=

# Upload a file (bucket, key, and file names are placeholders)
aws s3api put-object --bucket <BUCKET_NAME> --key folder/file --body file

To list all buckets in an Amazon S3 account, you need the s3:ListAllMyBuckets permission. This permission is required to navigate to buckets in the Amazon S3 console. To access the contents of a bucket, you also need the required AWS Identity and Access Management (IAM) permissions. For example, to perform the s3:ListBucket action, you need the s3:ListBucket permission in both your IAM policy and bucket policy. However, if your user or role is in the bucket owner's account, you only need one of the policies to allow the action. You can list buckets using the Amazon S3 console, the AWS CLI, or the AWS SDKs. To list buckets using the AWS CLI, use the aws s3 ls command without a target or options.
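For example, a quick check from the CLI:

# List all buckets visible to the caller
aws s3 ls

# Or, for the raw API response (JSON)
aws s3api list-buckets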

S3: List All My Buckets

https://docs.aws.amazon.com/AmazonS3/latest/userguide/walkthrough1.html#walkthrough-group-policy

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowGroupToSeeBucketListInTheConsole",
      "Action": [
        "s3:ListAllMyBuckets",
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Effect": "Allow",
      "Resource": ["arn:aws:s3:::*"]
    }
  ]
}

Listing All Files in S3 with a Given Prefix
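A minimal sketch (the bucket and prefix are placeholders):

# Recursively list every object under the prefix
aws s3 ls s3://<BUCKET_NAME>/<PREFIX>/ --recursive

# Equivalent with the lower-level API
aws s3api list-objects-v2 --bucket <BUCKET_NAME> --prefix <PREFIX>/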

EventBridge Custom Event Buses

Your custom bus will not receive any "aws.ssm" events. All aws.* events go to the default bus only. A custom bus can only receive custom events from your application, e.g.:

"source": [
  "myapp.test"
]

From the docs:

When an AWS service in your account emits an event, it goes to your account’s default event bus.

https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-service-event.html