AWS Cost Optimization: FinOps Best Practices 2025
Transform your AWS spending with proven FinOps strategies that drive business value
Introduction
As cloud adoption accelerates, managing AWS costs effectively has become critical for business success. FinOps (Financial Operations) provides a framework for maximizing cloud value while optimizing spend. This guide covers the essential FinOps practices for AWS in 2025.
The Three Pillars of FinOps
1. Visibility & Allocation
Cost Visibility:
- Implement comprehensive tagging strategy
- Use AWS Cost Explorer for trend analysis
- Enable Cost and Usage Reports (CUR)
- Deploy cost anomaly detection
Example Tagging Strategy:
{
  "CostCenter": "Engineering",
  "Project": "WebApp",
  "Environment": "Production",
  "Owner": "team@company.com",
  "Application": "CustomerPortal"
}
Automated Tagging with CDK:
import * as cdk from 'aws-cdk-lib';
const app = new cdk.App();
// Apply tags at stack level
cdk.Tags.of(app).add('CostCenter', 'Engineering');
cdk.Tags.of(app).add('ManagedBy', 'CDK');
2. Optimization
Compute Optimization:
- Rightsizing: Analyze CloudWatch metrics and Compute Optimizer findings to match instance types to the workload (see the sketch after this list)
- Savings Plans: Commit to consistent usage for up to 72% savings
- Spot Instances: Use for fault-tolerant workloads (up to 90% savings)
- Auto Scaling: Scale based on actual demand
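For the rightsizing step, AWS Compute Optimizer exposes its findings via an API. Below is a minimal sketch in Python, assuming Compute Optimizer is already opted in for the account; the response handling is simplified and worth checking against the boto3 documentation for your version.
import boto3

co = boto3.client('compute-optimizer')

# Pull EC2 rightsizing findings (requires Compute Optimizer to be opted in)
response = co.get_ec2_instance_recommendations()

for rec in response['instanceRecommendations']:
    # Skip instances that are already sized correctly
    if rec['finding'].upper() == 'OPTIMIZED':
        continue
    options = rec.get('recommendationOptions', [])
    suggested = options[0]['instanceType'] if options else 'n/a'
    print(f"{rec['instanceArn']}: {rec['finding']} "
          f"- current {rec['currentInstanceType']}, suggested {suggested}")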
Storage Optimization:
- S3 Intelligent-Tiering: Automatic cost optimization
- EBS snapshots: Delete unused snapshots
- Lifecycle policies: Move old data to cheaper tiers
Example S3 Lifecycle Policy:
import * as cdk from 'aws-cdk-lib';
import * as s3 from 'aws-cdk-lib/aws-s3';

new s3.Bucket(this, 'DataBucket', {
  lifecycleRules: [
    {
      transitions: [
        {
          storageClass: s3.StorageClass.INFREQUENT_ACCESS,
          transitionAfter: cdk.Duration.days(30),
        },
        {
          storageClass: s3.StorageClass.GLACIER,
          transitionAfter: cdk.Duration.days(90),
        },
      ],
      expiration: cdk.Duration.days(365),
    },
  ],
});
3. Operations & Culture
FinOps Culture:
- Cross-functional collaboration (Finance, Engineering, Leadership)
- Regular cost reviews and optimization cycles
- Cost awareness in development process
- Shared responsibility for cloud costs
Automation:
# Automated resource cleanup: delete EBS volumes that are unattached and more than 30 days old
import boto3
from datetime import datetime, timedelta, timezone

ec2 = boto3.client('ec2')

# Find unattached ("available") EBS volumes
volumes = ec2.describe_volumes(
    Filters=[{'Name': 'status', 'Values': ['available']}]
)

for volume in volumes['Volumes']:
    # CreateTime is timezone-aware, so compare against a UTC-aware timestamp
    if volume['CreateTime'] < datetime.now(timezone.utc) - timedelta(days=30):
        print(f"Deleting unused volume: {volume['VolumeId']}")
        ec2.delete_volume(VolumeId=volume['VolumeId'])
AWS FinOps Best Practices for 2025
1. Implement Cost Anomaly Detection
AWS Cost Anomaly Detection uses machine learning to identify unusual spending patterns:
# Enable via CLI: create a monitor scoped to linked accounts, then subscribe to alerts
aws ce create-anomaly-monitor \
  --anomaly-monitor '{
    "MonitorName": "ProductionMonitor",
    "MonitorType": "CUSTOM",
    "MonitorSpecification": {"Dimensions": {"Key": "LINKED_ACCOUNT", "Values": ["123456789012"]}}
  }'

aws ce create-anomaly-subscription \
  --anomaly-subscription '{
    "SubscriptionName": "ProductionAlerts",
    "MonitorArnList": ["arn:aws:ce::123456789012:anomalymonitor/abc123"],
    "Subscribers": [{"Type": "EMAIL", "Address": "finops@company.com"}],
    "Frequency": "DAILY",
    "ThresholdExpression": {"Dimensions": {"Key": "ANOMALY_TOTAL_IMPACT_ABSOLUTE", "MatchOptions": ["GREATER_THAN_OR_EQUAL"], "Values": ["100"]}}
  }'
2. Leverage Graviton Processors
AWS Graviton-based instances can deliver up to 40% better price performance than comparable x86-based instances:
import * as ecs from 'aws-cdk-lib/aws-ecs';

// Run the task on Graviton (ARM64) Fargate capacity
const taskDefinition = new ecs.FargateTaskDefinition(this, 'Task', {
  cpu: 1024,
  memoryLimitMiB: 2048,
  runtimePlatform: {
    cpuArchitecture: ecs.CpuArchitecture.ARM64,
    operatingSystemFamily: ecs.OperatingSystemFamily.LINUX,
  },
});
3. Optimize Data Transfer Costs
Data transfer charges are often overlooked but can be a significant cost driver:
Best Practices:
- Use CloudFront for content delivery (reduces data transfer costs)
- Keep data in same region when possible
- Use VPC endpoints to avoid NAT Gateway costs
- Enable S3 Transfer Acceleration only when needed
import * as ec2 from 'aws-cdk-lib/aws-ec2';

const vpc = new ec2.Vpc(this, 'VPC');

// Add an S3 gateway endpoint so S3 traffic bypasses the NAT Gateway
vpc.addGatewayEndpoint('S3Endpoint', {
  service: ec2.GatewayVpcEndpointAwsService.S3,
});
4. Implement Reserved Instance & Savings Plans Strategy
Commitment Strategy:
- Analyze 30-day usage patterns
- Start with Compute Savings Plans (most flexible)
- Add EC2 Instance Savings Plans for stable workloads
- Commit to 70-80% of baseline usage
- Use Spot/On-Demand for variable load
Automated Recommendations:
import boto3

ce = boto3.client('ce')

# Get Savings Plans purchase recommendations
recommendations = ce.get_savings_plans_purchase_recommendation(
    SavingsPlansType='COMPUTE_SP',
    TermInYears='ONE_YEAR',
    PaymentOption='NO_UPFRONT',
    LookbackPeriodInDays='THIRTY_DAYS'
)

for rec in recommendations['SavingsPlansPurchaseRecommendation']['SavingsPlansPurchaseRecommendationDetails']:
    print(f"Hourly Commitment: ${rec['HourlyCommitmentToPurchase']}")
    print(f"Estimated Savings: ${rec['EstimatedMonthlySavingsAmount']}")
5. Database Cost Optimization
RDS Optimization:
- Use Aurora Serverless v2 for variable workloads
- Enable RDS storage autoscaling
- Use read replicas strategically
- Consider Graviton-based instances
import * as rds from 'aws-cdk-lib/aws-rds';

// Aurora Serverless v2 scales capacity (ACUs) up and down with load
const cluster = new rds.DatabaseCluster(this, 'Database', {
  engine: rds.DatabaseClusterEngine.auroraPostgres({
    version: rds.AuroraPostgresEngineVersion.VER_15_3,
  }),
  writer: rds.ClusterInstance.serverlessV2('Writer'),
  serverlessV2MinCapacity: 0.5,
  serverlessV2MaxCapacity: 16,
  vpc,
});
6. Container Cost Optimization
ECS/EKS Best Practices:
- Use Fargate Spot for fault-tolerant tasks (up to 70% savings)
- Implement pod rightsizing with VPA (Vertical Pod Autoscaler)
- Use Karpenter for node provisioning (EKS)
- Enable ECS container insights selectively
Fargate Spot Example:
import * as ecs from 'aws-cdk-lib/aws-ecs';

const service = new ecs.FargateService(this, 'Service', {
  cluster,
  taskDefinition,
  capacityProviderStrategies: [
    {
      // Prefer Spot capacity for the bulk of tasks
      capacityProvider: 'FARGATE_SPOT',
      weight: 2,
    },
    {
      // Keep at least one task on regular On-Demand Fargate
      capacityProvider: 'FARGATE',
      weight: 1,
      base: 1,
    },
  ],
});
Advanced FinOps Strategies
1. Multi-Account Cost Allocation
# Consolidated billing analysis across linked accounts
import boto3
import pandas as pd

ce = boto3.client('ce')

response = ce.get_cost_and_usage(
    TimePeriod={
        'Start': '2025-01-01',
        'End': '2025-02-01'  # End date is exclusive, so this covers all of January
    },
    Granularity='MONTHLY',
    Metrics=['UnblendedCost'],
    GroupBy=[
        {'Type': 'DIMENSION', 'Key': 'LINKED_ACCOUNT'},
        {'Type': 'TAG', 'Key': 'CostCenter'}
    ]
)

# Convert to DataFrame for analysis
df = pd.DataFrame(response['ResultsByTime'])
2. Automated Resource Scheduling
# Lambda function for EC2 scheduling
import boto3
import os

ec2 = boto3.client('ec2')

def lambda_handler(event, context):
    action = os.environ['ACTION']  # 'stop' or 'start'

    # Get instances with the AutoStop tag in the relevant state
    instances = ec2.describe_instances(
        Filters=[
            {'Name': 'tag:AutoStop', 'Values': ['true']},
            {'Name': 'instance-state-name', 'Values': ['running' if action == 'stop' else 'stopped']}
        ]
    )

    instance_ids = [
        instance['InstanceId']
        for reservation in instances['Reservations']
        for instance in reservation['Instances']
    ]

    if instance_ids:
        if action == 'stop':
            ec2.stop_instances(InstanceIds=instance_ids)
        else:
            ec2.start_instances(InstanceIds=instance_ids)

    return {'statusCode': 200, 'body': f'Applied {action} to {len(instance_ids)} instances'}
3. Cost Allocation Reports with QuickSight
Set up automated dashboards:
- Enable Cost and Usage Reports (CUR), as sketched after this list
- Configure S3 bucket for CUR delivery
- Create Athena database from CUR
- Build QuickSight dashboards for stakeholders
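The CUR step in particular can be scripted. A minimal sketch with boto3 follows, assuming a destination bucket (the my-cur-bucket name is hypothetical) that already exists with a bucket policy granting the billing service write access; the CUR API is only served from us-east-1.
import boto3

# The Cost and Usage Reports API is only available in us-east-1
cur = boto3.client('cur', region_name='us-east-1')

cur.put_report_definition(
    ReportDefinition={
        'ReportName': 'finops-cur',
        'TimeUnit': 'HOURLY',
        'Format': 'Parquet',                      # Parquet is required for the Athena integration
        'Compression': 'Parquet',
        'AdditionalSchemaElements': ['RESOURCES'],
        'S3Bucket': 'my-cur-bucket',              # hypothetical bucket; must exist with a CUR bucket policy
        'S3Prefix': 'cur',
        'S3Region': 'eu-west-1',
        'AdditionalArtifacts': ['ATHENA'],
        'RefreshClosedReports': True,
        'ReportVersioning': 'OVERWRITE_REPORT',   # required when the ATHENA artifact is enabled
    }
)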
FinOps KPIs to Track
Essential Metrics
- Unit Economics: Cost per customer, per transaction
- Cost per Environment: Production vs Development spend
- Waste Metrics: Unused resources, idle instances
- Commitment Coverage: % covered by Savings Plans/RIs (see the sketch after this list)
- Cost Anomalies: Unusual spending patterns
- Rightsizing Opportunities: Over-provisioned resources
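Several of these KPIs can be pulled directly from the Cost Explorer API. Here is a minimal sketch for commitment coverage, assuming Savings Plans are in use; the January 2025 time window is illustrative.
import boto3

ce = boto3.client('ce')

# Savings Plans coverage: share of eligible spend covered by commitments
coverage = ce.get_savings_plans_coverage(
    TimePeriod={'Start': '2025-01-01', 'End': '2025-02-01'},  # End date is exclusive
    Granularity='MONTHLY'
)

for period in coverage['SavingsPlansCoverages']:
    pct = period['Coverage']['CoveragePercentage']
    print(f"{period['TimePeriod']['Start']}: {pct}% of eligible spend covered")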
Dashboard Example
-- Athena query for monthly cost by service from the CUR table
SELECT
  line_item_product_code,
  SUM(line_item_unblended_cost) AS total_cost,
  DATE_FORMAT(line_item_usage_start_date, '%Y-%m') AS month
FROM cur_database.cur_table
WHERE line_item_usage_start_date >= DATE '2025-01-01'
GROUP BY 1, 3
ORDER BY 2 DESC
Implementation Roadmap
Month 1: Foundation
- Implement tagging strategy
- Enable Cost Explorer and CUR
- Set up cost allocation tags (see the sketch after this list)
- Create initial dashboards
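User-defined tags only become usable for allocation once they are activated as cost allocation tags. A minimal sketch with boto3, assuming the tag keys from the tagging strategy above and that the call is made from the management (payer) account:
import boto3

ce = boto3.client('ce')

# Activate user-defined tags as cost allocation tags (run from the management/payer account)
ce.update_cost_allocation_tags_status(
    CostAllocationTagsStatus=[
        {'TagKey': key, 'Status': 'Active'}
        for key in ['CostCenter', 'Project', 'Environment', 'Owner', 'Application']
    ]
)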
Month 2: Quick Wins
- Delete unused resources
- Stop non-production resources after hours
- Enable S3 Intelligent-Tiering
- Clean up old snapshots (see the sketch after this list)
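For the snapshot cleanup, a minimal sketch that only reports candidates older than an illustrative 90-day threshold so they can be reviewed first; it does not check whether a snapshot backs an AMI, so verify before deleting anything.
import boto3
from datetime import datetime, timedelta, timezone

ec2 = boto3.client('ec2')
cutoff = datetime.now(timezone.utc) - timedelta(days=90)  # illustrative threshold

# Report snapshots owned by this account that are older than the cutoff
paginator = ec2.get_paginator('describe_snapshots')
for page in paginator.paginate(OwnerIds=['self']):
    for snapshot in page['Snapshots']:
        if snapshot['StartTime'] < cutoff:
            print(f"Candidate for deletion: {snapshot['SnapshotId']} "
                  f"(created {snapshot['StartTime']:%Y-%m-%d})")
            # ec2.delete_snapshot(SnapshotId=snapshot['SnapshotId'])  # uncomment after review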
Month 3: Optimization
- Analyze rightsizing recommendations
- Purchase Savings Plans for baseline
- Implement automated scheduling
- Set up anomaly detection
Month 4: Culture & Automation
- Establish FinOps working group
- Create cost review cadence
- Implement automated optimization
- Develop chargeback model
Dutch Market Considerations
For Dutch companies:
- EU data residency impacts costs (eu-west-1 vs us-east-1)
- Consider Dutch public sector procurement rules
- Align with internal budget cycles (often calendar year)
- Include BTW/VAT in cost calculations
- Use euro-based cost reporting
Conclusion
Effective FinOps is not just about reducing costs; it's about maximizing business value from cloud investments. By implementing these best practices, you can:
- Reduce AWS spend, often by 30-50%
- Improve cost visibility and accountability
- Enable faster innovation with cost confidence
- Build a culture of cost awareness
Ready to optimize your AWS costs? Contact Forrict for expert FinOps guidance tailored to your organization.
Fons Biemans
AWS expert and consultant at Forrict, specializing in cloud architecture and AWS best practices for Dutch businesses.
