The limitations of AWS free tier


As pretty much all users of AWS will be aware by now, AWS offer a free tier of resources for certain of their managed services. This free tier is a fantastic way to learn about AWS services that you are unfamiliar with, but it does have limits that need to be taken into account when working with it. Failing to keep track of your development resources puts you at risk of racking up some non-trivial costs in a short space of time.

Let’s look at two AWS services that offer free tier resources to see what the limits of the free tier are, and how we can best manage their resources to get the most value from our free tier credits.

All pricing will be for On-Demand Linux instances in the Frankfurt Region (eu-central-1), taken from the AWS Pricing Calculator and assuming a month of 730 hours.


EC2

AWS offer 750 hours of t2.micro and t3.micro instances for both Windows and Linux instances, totalling 1,500 hours a month. There is a specific reason that AWS has chosen 750 hours: 31 days is exactly 744 hours long. This means that you can run a single t3.micro instance any month of the year, at 100% utilisation, with up to 30GB of EBS storage, without incurring a fee, and still have some hours to spare for a different instance. That’s pretty cool!

However, taking an average month of 730 hours and 30GB of storage, this also means that if you were to run 2 instances for a month, you would be charged for the second instance at the single-instance rate, less the remaining 20 free credit hours ($12.33 – $1.69), totalling $10.64. Each additional instance after this will be charged at the full rate of $12.33.

But wait, I said this was for 100% utilisation for the whole month. What if you only want to use these instances for development 20 hours a week? Well, that will cost you $0.90 per instance per month, or 173.33 instance hours. This means that you can run up to 4 instances for the entire time you are working before you start to incur a cost, and even after this the costs are still quite low. Here is a table for easier readability:

No. of instances | Cost 20 hours per week | Cost whole month
1  | Instances: $0, storage: $3.57 = $3.57      | Instances: $0, storage: $3.57 = $3.57
2  | Instances: $0, storage: $7.14 = $7.14      | Instances: $8.32, storage: $7.14 = $15.46
5  | Instances: $0, storage: $17.85 = $17.85    | Instances: $34.00, storage: $17.85 = $51.85
10 | Instances: $0.84, storage: $35.70 = $36.54 | Instances: $76.80, storage: $35.70 = $112.50
20 | Instances: $10.44, storage: $71.40 = $81.84 | Instances: $162.40, storage: $71.40 = $233.80
EC2 cost breakdown


OpenSearch

Much like EC2, OpenSearch also offers 750 hours of t2.small.search or t3.small.search instances every month, in a single-AZ cluster, with up to 10GB of storage space. The twist here is that you are required to run at least 2 instances in the cluster, and if you want to handle multiple clients hitting your cluster with multiple requests per minute at once, you will quickly find that you need more than 2 of the free tier instances to handle the load. You can spin up a maximum of 10 free tier instances in a cluster, and up to 20 standard search instances.

This table breaks down the monthly cost for potential clusters of instances. As you can see, the costs really begin to rack up if you are not careful about leaving your instances running, and these are free tier instances.

No. of instances | Cost 20 hours per week | Cost whole month
10 | $2.10  | $275.10
20 | $35.70 | $581.70
OpenSearch cost breakdown
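Because the 750 free hours are consumed by every instance in the cluster, the minimum two-node cluster exhausts them in half a month of wall-clock time. A rough sketch of that maths (a hypothetical helper, assuming one pooled 750-hour credit for the cluster):

```python
def free_coverage_hours(node_count: int, free_hours: float = 750) -> float:
    """Wall-clock hours a cluster can run before the free tier hours run out."""
    return free_hours / node_count

# a two-node cluster burns its free hours in roughly half a month
print(free_coverage_hours(2))   # 375.0

# the ten-node free tier maximum lasts only about three days
print(free_coverage_hours(10))  # 75.0
```

Everything the cluster runs beyond that coverage window is billed at the standard hourly rate, which is why the whole-month column above grows so quickly.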

Lambda to the rescue

A quick win for the EC2 instances is to use AWS Resource Tags and a Lambda function on a cron trigger to shut down any instances that do not need to be running outside of office hours. This prevents instances running overnight and eating up free tier credits, or worse, racking up costs when they are not being used.

# Lambda Function

import boto3

ec2 = boto3.resource("ec2")

def handler(event, context):

    # get all running instances with the "office-hours" tag
    instances = ec2.instances.filter(
        Filters=[
            {"Name": "instance-state-name", "Values": ["running"]},
            {"Name": "tag-key", "Values": ["office-hours"]},
        ]
    )

    # try to stop instances
    for instance in instances:
        try:
            instance.stop()
            print(f"{instance.id} stopped")
        except Exception as e:
            print(f"Error stopping {instance.id}, error: {e}")
# CDK to deploy resources

import * as cdk from "aws-cdk-lib";
import * as events from "aws-cdk-lib/aws-events";
import * as targets from "aws-cdk-lib/aws-events-targets";
import * as lambda from "aws-cdk-lib/aws-lambda";
import * as iam from "aws-cdk-lib/aws-iam";
import * as path from "path";

import { PythonFunction } from "@aws-cdk/aws-lambda-python-alpha";
import { Construct } from "constructs";

export interface UtilitiesProps extends cdk.StackProps {};

export class UtilitiesStack extends cdk.Stack {
  constructor(scope: Construct, id: string, props: UtilitiesProps) {
    super(scope, id, props);

    const lambdaService = "lambda.amazonaws.com";

    const lambdaRole = new iam.Role(this, "lambdaRole", {
      roleName: "office-hours-instances",
      assumedBy: new iam.ServicePrincipal(lambdaService),
    });

    // allow the function to look up and stop EC2 instances
    lambdaRole.addToPolicy(new iam.PolicyStatement({
      effect: iam.Effect.ALLOW,
      actions: ["ec2:DescribeInstances", "ec2:StopInstances"],
      resources: ["*"]
    }));

    const instance_stopper = new PythonFunction(this, "office_hours_functions", {
      runtime: lambda.Runtime.PYTHON_3_8,
      timeout: cdk.Duration.seconds(30),
      entry: path.join(__dirname, "lambda_functions/python/office_hours_ec2/"),
      index: "index.py",
      handler: "handler",
      functionName: "shut_down_instances_cron",
      role: lambdaRole
    });

    // Run script at 7 PM UTC every weekday.
    const rule = new events.Rule(this, "Rule", {
      schedule: events.Schedule.expression("cron(0 19 ? * MON-FRI *)")
    });

    rule.addTarget(new targets.LambdaFunction(instance_stopper));
  }
}


Sources used for this blog post:

AWS Pricing Calculator





I.M. Bruton