What is PyTest and Why Do QA Engineers Use It?
PyTest is a free Python testing framework. Think of it as a robot that automatically runs a checklist of tests for you every time someone changes the code. Instead of a human manually clicking through an app, PyTest does it in milliseconds.
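To make that concrete, here is a minimal sketch of a standalone PyTest test (the file and function names are illustrative, not from any real project). Any function whose name starts with test_ is discovered and run automatically:

```python
# test_basics.py - a minimal, hypothetical example file.
# PyTest discovers any function whose name starts with "test_"
# and treats a passing assert as a passing test.

def add(a, b):
    return a + b

def test_addition():
    # If this assert holds, the test passes; if it fails,
    # PyTest reports exactly which expression was false.
    assert add(2, 3) == 5
```

Running pytest in the same directory executes this with no extra wiring.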
When combined with MiniStack, PyTest can automatically verify that your application's AWS infrastructure (file storage, databases, message queues) behaves correctly, all without touching the real cloud.
Step 1: Install the Tools
Open your terminal and install both PyTest and the AWS Python library (Boto3) with one command:
Then start your fake AWS cloud via Docker:
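A sketch of the Docker command, assuming MiniStack publishes an image named ministack/ministack that listens on port 4566 (check the project's own documentation for the exact image name and ports):

```shell
# Hypothetical image name - substitute the one from MiniStack's docs.
# -d runs it in the background; -p 4566:4566 exposes the fake cloud
# so Boto3 can reach it at http://localhost:4566.
docker run -d -p 4566:4566 ministack/ministack
```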
Step 2: Write a Shared Test Setup (The "Fixture")
In PyTest, a fixture is a reusable helper that sets up your test environment. Our fixture will connect to MiniStack and hand off a ready-to-use AWS client to every test that needs it. Create a file called conftest.py:
import pytest
import boto3


@pytest.fixture
def s3_client():
    # Every test that uses this gets a fresh S3 connection
    # to our fake cloud - like a blank canvas every time.
    return boto3.client('s3',
        endpoint_url='http://localhost:4566',
        aws_access_key_id='test',
        aws_secret_access_key='test',
        region_name='us-east-1'
    )


@pytest.fixture
def dynamo_client():
    return boto3.client('dynamodb',
        endpoint_url='http://localhost:4566',
        aws_access_key_id='test',
        aws_secret_access_key='test',
        region_name='us-east-1'
    )
Step 3: Write Your First Cloud Tests
Now create a file called test_cloud.py. These are your actual tests. Each function starting with test_ will be discovered and run automatically by PyTest:
def test_can_create_a_file_bucket(s3_client):
    # ACTION: Create a storage bucket
    s3_client.create_bucket(Bucket='user-uploads')

    # CHECK: Confirm the bucket now exists in our fake cloud
    buckets = s3_client.list_buckets()['Buckets']
    bucket_names = [b['Name'] for b in buckets]
    assert 'user-uploads' in bucket_names


def test_can_upload_a_file(s3_client):
    s3_client.create_bucket(Bucket='test-files')

    # ACTION: Upload a file to the fake cloud
    s3_client.put_object(Bucket='test-files', Key='hello.txt', Body=b'Hello World')

    # CHECK: Download it back and verify the content is identical
    response = s3_client.get_object(Bucket='test-files', Key='hello.txt')
    content = response['Body'].read()
    assert content == b'Hello World'


def test_can_store_user_in_database(dynamo_client):
    # ACTION: Create a Users table
    dynamo_client.create_table(
        TableName='Users',
        KeySchema=[{'AttributeName': 'UserId', 'KeyType': 'HASH'}],
        AttributeDefinitions=[{'AttributeName': 'UserId', 'AttributeType': 'S'}],
        BillingMode='PAY_PER_REQUEST'
    )

    # ACTION: Insert a user record
    dynamo_client.put_item(
        TableName='Users',
        Item={'UserId': {'S': 'u1'}, 'Name': {'S': 'Maria'}, 'Role': {'S': 'QA Engineer'}}
    )

    # CHECK: Retrieve that user and confirm the name matches
    result = dynamo_client.get_item(TableName='Users', Key={'UserId': {'S': 'u1'}})
    assert result['Item']['Name']['S'] == 'Maria'
Step 4: Run Your Tests
With MiniStack running in the background, open a second terminal and run:
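From the directory containing test_cloud.py and conftest.py, the command is simply:

```shell
# -v (verbose) prints one line per test with PASSED or FAILED
pytest -v
```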
You will see a satisfying green output confirming your tests pass. PASSED means your app correctly talks to S3 and DynamoDB. FAILED means something is off, and you caught it before it ever hit production!
Why This Matters
With MiniStack, every developer and QA engineer runs their own private, disposable fake cloud. Tests are isolated, free, and instantaneous, and the entire cloud is wiped when Docker restarts. No shared state, no accidents, no bills.
You now have the foundation to test any AWS behaviour your application depends on: safely, locally, and for free. From here, explore the AWS 101 guide to learn about SQS, Lambda, and Terraform testing.