
AWS Testing 101

Never used AWS before? Don't know how to test a cloud app?
This guide will teach you exactly how to pretend you have a real AWS cloud on your local computer using Ministack.

1. What is AWS and Why Do We "Fake" It?

Amazon Web Services (AWS) is a giant computer up in the sky. If you work as a developer or QA engineer, your company's app probably stores files or databases inside AWS (like "S3" for files, and "DynamoDB" for databases).

The Problem: When you are writing code or testing things on your own computer, you don't want to accidentally delete real files on Amazon. Plus, using the real AWS costs money every time you click a button!

The Solution: Ministack. Ministack is a software tool that mimics the AWS cloud's APIs but runs entirely offline on your laptop. You can test uploading files, deleting databases, and running code for free, safely. When you shut Ministack down, everything gets wiped clean!

2. Start Your Fake Cloud

To get your fake AWS cloud running, you just need Docker installed on your computer. Open your computer's terminal (Mac/Linux) or command prompt (Windows) and paste this exact line:

docker run -p 4566:4566 nahuelnucera/ministack

That's it! When you press Enter, Docker downloads the Ministack image and starts it running. Your computer is now pretending to be a giant AWS data center, specifically listening on port 4566.

3. Let's Write Your First AWS Test (S3)

๐ŸŒ The Real World: S3 completely replaced physical hard drives on servers. It acts exactly like an infinite, invisible Dropbox up in the sky.

โš ๏ธ The "Oops" Scenario: Writing a bad test script that accidentally wipes your company's real S3 production bucket will cost millions of dollars and likely get you fired. Faking it with Ministack costs $0 and hurts nothing.

Now we are going to write a simple script in Python to interact with our fake cloud. We will tell Python to pretend it is talking to Amazon's S3.

Normally, Python connects to the internet to talk to Amazon. But with one trick, the endpoint_url parameter, we force Python to talk to Ministack running locally instead!

# This is Python code. Boto3 is the official library that talks to AWS.
import boto3

# We tell Boto3 to talk to our local port 4566 instead of the real internet.
# We use fake passwords ('test', 'test') because Ministack doesn't care!
fake_amazon_s3 = boto3.client('s3',
    endpoint_url='http://localhost:4566',
    region_name='us-east-1',
    aws_access_key_id='test',
    aws_secret_access_key='test'
)

# Test 1: Let's create a digital folder (called a 'Bucket' in AWS)
fake_amazon_s3.create_bucket(Bucket='my-first-bucket')
print("Success! We created a fake cloud folder!")

4. Let's Build a Fake Database (DynamoDB)

๐ŸŒ The Real World: DynamoDB is a massive, lightning-fast database used by Twitch and Netflix. Think of it as an impossibly large Excel spreadsheet without the user interface.

โš ๏ธ The "Oops" Scenario: In real DynamoDB, writing a test loop that scans a table infinitely can rack up a $10,000 Amazon bill in an hour! Locally, infinite loops are totally free.

Using the exact same port trick, we can ask our fake cloud to give us a database instantly:

# We tell Boto3 we want to talk to the 'dynamodb' service instead of 's3'
fake_db = boto3.client('dynamodb',
    endpoint_url='http://localhost:4566',
    region_name='us-east-1',
    aws_access_key_id='test',
    aws_secret_access_key='test'
)

# Let's create a fake table called 'Users' to hold our customers
fake_db.create_table(
    TableName='Users',
    KeySchema=[{'AttributeName': 'UserId', 'KeyType': 'HASH'}],
    AttributeDefinitions=[{'AttributeName': 'UserId', 'AttributeType': 'S'}],
    BillingMode='PAY_PER_REQUEST'
)
print("Success! We created a fake database!")

5. The "Post Office" of the Cloud (SQS)

๐ŸŒ The Real World: SQS (Simple Queue Service) is literally a digital post-office. If an app needs to process 10,000 user sign-ups, it drops them in the SQS "line" immediately, and background workers fetch them one by one so servers don't crash.

โš ๏ธ The "Oops" Scenario: In real AWS, unread messages can stockpile and cost thousands of dollars or break downstream services. Locally, you can simply restart Docker to wipe it instantly!

Let's create a fake queue locally and drop a message in:

# Connect to the 'sqs' service locally
fake_queue = boto3.client('sqs', endpoint_url='http://localhost:4566', region_name='us-east-1', aws_access_key_id='test', aws_secret_access_key='test')

# Create the queue
response = fake_queue.create_queue(QueueName='my-fake-line')
queue_url = response['QueueUrl']

# Drop a message into the line!
fake_queue.send_message(QueueUrl=queue_url, MessageBody='Hello background workers!')
print("Success! Message sent to the fake post office!")

6. Infrastructure as Code (Terraform)

๐ŸŒ The Real World: Modern companies do not let humans physically click buttons in the Amazon website to create servers (people make mistakes). Instead, they write "blueprint" files mapping out their entire architecture as Code (Terraform).

To fake it locally with Ministack, you just tell your Terraform aws provider to point to your computer and skip all the real Amazon credential checks:

# In your main.tf blueprint file:
provider "aws" {
  region = "us-east-1"
  access_key = "test"
  secret_key = "test"
  s3_use_path_style = true
  skip_credentials_validation = true
  skip_metadata_api_check = true
  skip_requesting_account_id = true

  # Point the services to your fake cloud
  endpoints {
    s3 = "http://localhost:4566"
    dynamodb = "http://localhost:4566"
    sqs = "http://localhost:4566"
    lambda = "http://localhost:4566"
  }
}

# Now 'terraform apply' creates everything safely on your local machine!
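With the provider pointed at your fake cloud, a minimal blueprint might declare a bucket and a queue like this (the resource names below are illustrative, not from any real project):

```hcl
# main.tf (continued): declare the infrastructure you want to exist.
resource "aws_s3_bucket" "test_bucket" {
  bucket = "my-terraform-bucket"
}

resource "aws_sqs_queue" "test_queue" {
  name = "my-terraform-queue"
}
```

Run terraform init once, then terraform apply: because every endpoint points at port 4566, Terraform builds everything inside Ministack and nothing real is created.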

7. Why We Need This Offline Cloud

If you are a QA Engineer writing frameworks in PyTest or Cypress, you can use these fake services to safely test what happens to your app if the AWS database randomly crashes or is missing, without ever touching the real production app.

If you are a Cloud Developer, you can instantly test complex Cloud integrations (like SQS firing a Lambda function) rapidly on your laptop while sitting on an airplane without WiFi.

If you are a DevOps Engineer, you can run Ministack automatically inside GitHub Actions or GitLab CI to verify that all backend code correctly talks to S3 or DynamoDB before anyone is allowed to hit "Deploy to Production".

8. Serverless Functions โ€” Lambda

๐ŸŒ The Real World: Lambda is Amazon's most powerful and popular service. Instead of running a full server 24 hours a day, Lambda lets your code run only when needed โ€” in less than a millisecond. Think of it as a vending machine: it does nothing until you press a button, then immediately performs one task, and stops.

โš ๏ธ The "Oops" Scenario: Lambda functions in real AWS trigger from real events (user payments, file uploads, API calls). A buggy Lambda can silently skip processing thousands of user records in production. Testing it safely in Ministack first is absolutely essential.

Here we write a tiny Python Lambda function, package it up, deploy it to our fake cloud, and invoke it, all entirely locally:

# Step 1: Connect + create the function package
import boto3, json, zipfile, io

lam = boto3.client('lambda',
    endpoint_url='http://localhost:4566',
    aws_access_key_id='test', aws_secret_access_key='test', region_name='us-east-1')

# Our tiny Lambda function - it just says Hello!
function_code = '''
def handler(event, context):
    name = event.get('name', 'stranger')
    return {'statusCode': 200, 'body': f'Hello {name}!'}
'''

# Step 2: Package the code into a zip (how AWS expects functions)
zip_buffer = io.BytesIO()
with zipfile.ZipFile(zip_buffer, 'w') as zf:
    zf.writestr('lambda_function.py', function_code)

# Step 3: Deploy it to our fake cloud
lam.create_function(
    FunctionName='my-hello-function',
    Runtime='python3.12',
    Role='arn:aws:iam::000000000000:role/fake-role',
    Handler='lambda_function.handler',
    Code={'ZipFile': zip_buffer.getvalue()}
)

# Step 4: Invoke it with a payload {name: 'Maria'}
response = lam.invoke(
    FunctionName='my-hello-function',
    Payload=json.dumps({'name': 'Maria'})
)
result = json.loads(response['Payload'].read())
print(result) # {'statusCode': 200, 'body': 'Hello Maria!'}

That's a real Lambda function, deployed and invoked entirely offline, returning a real response. A QA engineer can now write tests that verify the output of this function for different inputs, with zero cloud account needed.
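In fact, you can sanity-check the handler logic before you even package it, with no cloud at all (real or fake). This sketch simply re-defines the same function as plain Python and calls it directly:

```python
# The same handler logic we zipped and deployed above, defined locally
# so we can exercise it as an ordinary Python function.
def handler(event, context):
    name = event.get('name', 'stranger')
    return {'statusCode': 200, 'body': f'Hello {name}!'}

# Call it with the same payload we used in the invoke step
result = handler({'name': 'Maria'}, None)
print(result)   # {'statusCode': 200, 'body': 'Hello Maria!'}

# And check the fallback path when no name is given
default = handler({}, None)
print(default)  # {'statusCode': 200, 'body': 'Hello stranger!'}
```

Testing the pure logic this way is instant; deploying to Ministack then verifies the packaging, runtime, and invocation plumbing on top of it.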

🎓 You are now a cloud tester.

You have just learned how to safely fake four of the most critical AWS services (S3, DynamoDB, SQS, and Lambda) plus Terraform, the tool that wires them together, all entirely free and offline.

The next step is to automate all of this with PyTest and run it on every code push with GitHub Actions CI/CD. Your future teammates will thank you.

→ Write your first PyTest → Set up CI/CD in GitHub