08 — Lambda & Serverless
What is Lambda?
Run code without servers. Pay only when your code executes.
Upload code → Define trigger → Lambda runs → You pay per invocation
No servers to manage, no patching, auto-scales to thousands of concurrent executions.
Lambda Function
// handler.ts
import { APIGatewayProxyEvent, APIGatewayProxyResult } from 'aws-lambda';
export const handler = async (event: APIGatewayProxyEvent): Promise<APIGatewayProxyResult> => {
const body = JSON.parse(event.body || '{}');
// Your business logic
const result = await processOrder(body);
return {
statusCode: 200,
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ success: true, data: result }),
};
};
Triggers (Event Sources)
| Trigger | Use Case |
|---|---|
| API Gateway | REST/HTTP API endpoints |
| S3 | File upload processing (resize images, parse CSV) |
| SQS | Queue processing |
| DynamoDB Streams | React to DB changes |
| EventBridge (formerly CloudWatch Events) | Scheduled tasks (rate/cron), event routing |
| SNS | Notification fan-out processing |
| Cognito | Auth triggers (pre-signup, post-confirm) |
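Each trigger delivers a different event shape. For example, an S3 trigger hands the function a batch of records describing the uploaded objects. The sketch below trims the event to just the fields it reads (the full type is `S3Event` from the `aws-lambda` package) and returns `bucket/key` pairs; note that S3 URL-encodes object keys, so they must be decoded before use:

```typescript
// Minimal slice of the S3 event shape (full type: S3Event from 'aws-lambda')
interface S3Record {
  s3: { bucket: { name: string }; object: { key: string } };
}
interface S3EventLike {
  Records: S3Record[];
}

// An S3 PUT trigger invokes the function once per batch of uploaded objects
export const handler = async (event: S3EventLike): Promise<string[]> => {
  return event.Records.map((record) => {
    const bucket = record.s3.bucket.name;
    // S3 URL-encodes keys and turns spaces into '+'
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));
    return `${bucket}/${key}`;
  });
};
```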
Configuration
aws lambda create-function \
  --function-name my-api \
  --runtime nodejs20.x \
  --zip-file fileb://function.zip \
  --handler dist/handler.handler \
  --role arn:aws:iam::123456:role/LambdaRole \
  --memory-size 256 \
  --timeout 30 \
  --environment "Variables={NODE_ENV=production,DB_HOST=my-db.rds.amazonaws.com}"
| Setting | Default | Max | Notes |
|---|---|---|---|
| Memory | 128 MB | 10,240 MB | CPU scales with memory |
| Timeout | 3s | 900s (15 min) | Use SQS+Lambda for longer tasks |
| Ephemeral storage | 512 MB | 10 GB | /tmp directory |
| Package size | — | 50 MB (zip), 250 MB (unzipped) | Use layers for shared deps |
| Concurrency | — | 1,000/account (soft limit) | Request an increase via Service Quotas |
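When a task risks running into the timeout, the handler can check its remaining time budget with `context.getRemainingTimeInMillis()` and stop cleanly instead of being killed mid-item. A minimal sketch (the `ContextLike` interface trims the real `Context` type from `aws-lambda` to the one method used; the 2-second safety margin is an arbitrary choice):

```typescript
// Minimal slice of the Lambda Context type (full type: Context from 'aws-lambda')
interface ContextLike {
  getRemainingTimeInMillis(): number;
}

// Process items until ~2s of budget remains, then return the leftovers
// (e.g. to re-queue them) instead of dying mid-item at the timeout.
export const handler = async (items: string[], context: ContextLike) => {
  const processed: string[] = [];
  const remaining: string[] = [];
  for (const item of items) {
    if (context.getRemainingTimeInMillis() < 2_000) {
      remaining.push(item); // out of budget: defer this item
      continue;
    }
    processed.push(item.toUpperCase()); // stand-in for real work
  }
  return { processed, remaining };
};
```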
Cold Starts
First invocation (cold start):
Download code → Init runtime → Init handler → Execute
~100ms-2s (depends on runtime, package size, VPC)
Subsequent invocations (warm):
Execute only
~1-10ms
Mitigation:
✅ Use smaller packages (tree-shake, exclude dev deps)
✅ Use Provisioned Concurrency (keeps N instances warm)
✅ Avoid VPC attachment unless needed (historically added ~1s; far smaller since AWS's 2019 networking rework, but still skip it when you can)
✅ Use Node.js/Python (faster cold starts than Java/.NET)
✅ Initialize SDK clients OUTSIDE the handler
import { DynamoDBClient } from '@aws-sdk/client-dynamodb';
import { DynamoDBDocumentClient, GetCommand } from '@aws-sdk/lib-dynamodb';

// ✅ SDK client initialized outside the handler (reused across warm invocations)
const dynamodb = DynamoDBDocumentClient.from(new DynamoDBClient({}));

export const handler = async (event: any) => {
  // ❌ Don't create the client here (it would be re-created on every call)
  const result = await dynamodb.send(new GetCommand({ ... }));
  return { statusCode: 200, body: JSON.stringify(result) };
};
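A simple way to observe cold starts in your own logs is a module-scope flag: module initialization runs once per execution environment, so the flag is `true` only on the first invocation of each instance. A minimal sketch:

```typescript
// Module scope runs once per execution environment, so a top-level flag
// distinguishes the cold first invocation from warm reuse.
let isColdStart = true;

export const handler = async (): Promise<{ coldStart: boolean }> => {
  const coldStart = isColdStart;
  isColdStart = false; // every later invocation in this instance is warm
  if (coldStart) console.log('cold start'); // shows up in CloudWatch Logs
  return { coldStart };
};
```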
Lambda Layers
Share code/dependencies across functions.
# Create layer — Node.js layers must nest deps under nodejs/
# (extracted to /opt/nodejs at runtime, which is on NODE_PATH)
zip -r layer.zip nodejs/
aws lambda publish-layer-version \
--layer-name shared-deps \
--zip-file fileb://layer.zip \
--compatible-runtimes nodejs20.x
# Attach to function
aws lambda update-function-configuration \
--function-name my-api \
--layers arn:aws:lambda:us-east-1:123:layer:shared-deps:1
Concurrency
Reserved Concurrency: guarantees this function N concurrent executions (and also caps it at N)
Provisioned Concurrency: keeps N instances initialized and warm (no cold starts; billed while provisioned)
Account limit: 1,000 concurrent executions by default, shared across all functions
┌──────────────────────┐
Function A: Reserved 100 │████████░░░░░░░░░░│
Function B: Reserved 200 │████████████████░░│
Unreserved: 700 │░░░░░░░░░░░░░░░░░░│
└──────────────────────┘
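For sizing reserved or provisioned concurrency, a rough rule of thumb (Little's law) is: steady-state concurrency ≈ request rate × average duration. The helper below is purely illustrative:

```typescript
// Little's law: steady-state concurrency ≈ requests/sec × avg duration.
// Illustrative helper for sizing reserved/provisioned concurrency.
const requiredConcurrency = (requestsPerSec: number, avgDurationMs: number): number =>
  Math.ceil((requestsPerSec * avgDurationMs) / 1000);

// 500 req/s at 200 ms each needs ~100 concurrent executions,
// comfortably inside the default 1,000-per-account limit.
const needed = requiredConcurrency(500, 200); // → 100
```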
AWS SAM (Serverless Application Model)
# template.yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Globals:
  Function:
    Runtime: nodejs20.x
    Timeout: 30
    MemorySize: 256

Resources:
  ApiFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: .          # project root; sam build packages it
      Handler: dist/handler.handler
      Events:
        GetOrders:
          Type: Api
          Properties:
            Path: /orders
            Method: get
        CreateOrder:
          Type: Api
          Properties:
            Path: /orders
            Method: post
      Environment:
        Variables:
          TABLE_NAME: !Ref OrdersTable

  OrdersTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: Orders
      BillingMode: PAY_PER_REQUEST
      AttributeDefinitions:
        - AttributeName: pk
          AttributeType: S
        - AttributeName: sk
          AttributeType: S
      KeySchema:
        - AttributeName: pk
          KeyType: HASH
        - AttributeName: sk
          KeyType: RANGE
sam build
sam local invoke # Test locally
sam local start-api # Local API Gateway
sam deploy --guided # Deploy to AWS
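The template wires both `/orders` methods to a single function, so the handler can branch on the HTTP method. A minimal sketch, with the event type trimmed to the fields used (the full type is `APIGatewayProxyEvent`) and an in-memory array standing in for the DynamoDB `Orders` table:

```typescript
// Minimal slice of the API Gateway REST event (full type: APIGatewayProxyEvent)
interface ApiEventLike {
  httpMethod: string;
  body: string | null;
}

const orders: unknown[] = []; // stand-in for the DynamoDB Orders table

export const handler = async (event: ApiEventLike) => {
  switch (event.httpMethod) {
    case 'GET':
      return { statusCode: 200, body: JSON.stringify(orders) };
    case 'POST': {
      const order = JSON.parse(event.body ?? '{}');
      orders.push(order);
      return { statusCode: 201, body: JSON.stringify(order) };
    }
    default:
      return { statusCode: 405, body: JSON.stringify({ error: 'Method not allowed' }) };
  }
};
```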
Key Takeaways
- Lambda = serverless functions — no servers, pay per invocation
- Cold starts are real — minimize package size, init clients outside handler
- Use SAM or CDK for defining serverless infrastructure as code
- Layers for sharing dependencies across functions
- Memory ↑ = CPU ↑ — increasing memory can make functions faster and cheaper
- Max 15-minute timeout — use SQS for longer tasks
- Provisioned Concurrency eliminates cold starts for latency-sensitive functions