A monorepo containing two serverless applications for product management using AWS Lambda, DynamoDB, and the Serverless Framework.
This project consists of two interconnected serverless applications:
- Product API - RESTful CRUD API for product management
- Data Importer - Scheduled service for importing products from S3 to DynamoDB
Both applications share common types and utilities, ensuring consistency and reducing code duplication.
```
serverless-product-management/
├── apps/
│   ├── product-api/              # REST API for product CRUD operations
│   │   ├── controllers/          # HTTP request handlers
│   │   ├── services/             # Business logic and DynamoDB operations
│   │   ├── routes.yml            # API route definitions
│   │   └── serverless.yml        # API deployment configuration
│   └── data-importer/            # Scheduled CSV import service
│       ├── controllers/          # Import logic handlers
│       ├── services/             # S3 and DynamoDB operations
│       ├── handler.ts            # Lambda entry point
│       └── serverless.yml        # Scheduler deployment configuration
├── shared/
│   ├── types/                    # Common TypeScript interfaces
│   │   └── product.ts            # Product data models
│   └── utils/                    # Shared utilities
│       └── dynamodb-client.ts    # DynamoDB client configuration
├── README.md
└── CLAUDE.md                     # Development guidance
```
Product API:

- ✅ Create new products with validation
- ✅ Retrieve products by ID
- ✅ List all products
- ✅ Update existing products
- ✅ Delete products
- ✅ Request validation using Yup schemas
- ✅ Comprehensive error handling

Data Importer:

- ✅ Scheduled daily imports at 8 AM Beirut time using EventBridge Scheduler
- ✅ CSV parsing from S3 buckets with date-based folder structure
- ✅ Batch processing to DynamoDB
- ✅ Automatic UUID generation for imported items
- ✅ Direct EventBridge trigger (no API Gateway)
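The "automatic UUID generation" step can be sketched in TypeScript. This is an illustrative sketch, not the actual importer code; `toItem` and the `CsvRow` shape are hypothetical names:

```typescript
import { randomUUID } from "node:crypto";

// Illustrative sketch: turn a parsed CSV row into the item written to
// DynamoDB, with a generated UUID and created/updated timestamps.
interface CsvRow {
  name: string;
  description: string;
  price: number;
}

function toItem(row: CsvRow) {
  const now = new Date().toISOString();
  return { id: randomUUID(), ...row, createdAt: now, updatedAt: now };
}

const item = toItem({ name: "Product A", description: "Demo", price: 19.99 });
console.log(item.id.length); // UUID string, 36 characters
```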
The data importer follows this automated workflow:
- Daily Schedule: EventBridge Scheduler triggers the Lambda function every day at 8:00 AM Beirut time
- Date-based File Structure: The system expects CSV files to be uploaded to S3 under folders named with the current date (YYYY-MM-DD format)
- File Processing: The Lambda function reads the CSV file from `s3://bucket-name/YYYY-MM-DD/items.csv`
- Data Parsing: The CSV contains items with the columns `name,description,price`
- Database Storage: Each parsed item is stored in DynamoDB with an auto-generated UUID and timestamps
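The date-based key lookup can be sketched as follows. `buildImportKey` is a hypothetical helper, and it uses UTC for brevity; the real service may compute the date in Beirut time:

```typescript
// Illustrative helper (not the actual service code): build the date-based
// S3 object key the importer reads each morning, e.g. "2024-01-15/items.csv".
function buildImportKey(date: Date): string {
  // YYYY-MM-DD derived in UTC here; the deployed service may instead
  // compute the current date in the Asia/Beirut timezone.
  const day = date.toISOString().slice(0, 10);
  return `${day}/items.csv`;
}

console.log(buildImportKey(new Date("2024-01-15T08:00:00Z")));
// 2024-01-15/items.csv
```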
```
s3://import-s3-to-ddb-dev-data/
├── 2024-01-15/
│   └── items.csv
├── 2024-01-16/
│   └── items.csv
└── 2024-01-17/
    └── items.csv
```
```
name,description,price
"Product A","Description of Product A",19.99
"Product B","Description of Product B",29.99
"Product C","Description of Product C",9.99
```

- Runtime: Node.js 20.x
- Framework: Serverless Framework v4
- Database: AWS DynamoDB
- Storage: AWS S3 (for CSV imports)
- Scheduler: AWS EventBridge Scheduler
- API Gateway: AWS HTTP API (Product API only)
- Language: TypeScript
- Validation: Yup (API only)
- CSV Processing: csv-parser
- AWS SDK: v3
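In production the `csv-parser` dependency listed above handles parsing. As a dependency-free illustration of the expected row shape for the sample CSV shown earlier (the function and regex are illustrative only and do not handle general CSV quoting):

```typescript
// Dependency-free illustration only - the real importer uses csv-parser.
// Parses rows of the sample form: "name","description",price
interface ParsedRow {
  name: string;
  description: string;
  price: number;
}

function parseSampleCsv(text: string): ParsedRow[] {
  const rows: ParsedRow[] = [];
  for (const line of text.trim().split("\n").slice(1)) { // skip header row
    const m = line.match(/^"([^"]*)","([^"]*)",([0-9.]+)$/);
    if (m) rows.push({ name: m[1], description: m[2], price: parseFloat(m[3]) });
  }
  return rows;
}

const sample = `name,description,price
"Product A","Description of Product A",19.99`;
console.log(parseSampleCsv(sample)[0].price); // 19.99
```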
- Node.js 18+
- Serverless Framework
- AWS CLI configured
- AWS account with appropriate permissions
- Clone the repository

  ```bash
  git clone <repository-url>
  cd serverless-product-management
  ```

- Install dependencies for both applications

  ```bash
  # Install Product API dependencies
  cd apps/product-api
  npm install

  # Install Data Importer dependencies
  cd ../data-importer
  npm install
  ```
Deploy the Product API:

```bash
cd apps/product-api
serverless deploy
```

Deploy the Data Importer:

```bash
cd apps/data-importer
serverless deploy
```

Or deploy both applications in one step:

```bash
cd apps/product-api && serverless deploy && cd ../data-importer && serverless deploy
```

For local development of the Product API:

```bash
cd apps/product-api

# Start local development server
serverless dev

# Or use offline mode
serverless offline
```

For local development of the Data Importer:

```bash
cd apps/data-importer

# Start local development server
serverless dev

# Test the import function locally
serverless invoke local -f importItems
```

Both applications support local DynamoDB development:

```bash
# Start local DynamoDB (in either app directory)
serverless dynamodb start
```

After deployment, the Product API provides these endpoints:
- `POST /products` - Create a new product
- `GET /products/{id}` - Get a product by ID
- `GET /products` - List all products
- `PATCH /products/{id}` - Update a product
- `DELETE /products/{id}` - Delete a product
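A hypothetical client call against the create endpoint might look like the sketch below. `BASE_URL` is a placeholder for the URL `serverless deploy` prints, and the payload mirrors the Product model minus the server-generated `id`/`createdAt`/`updatedAt` fields:

```typescript
// Hypothetical client sketch; replace BASE_URL with your deployed API URL.
const BASE_URL = "https://<api-id>.execute-api.<region>.amazonaws.com";

const newProduct = {
  name: "Product A",
  category: "demo",
  price: 19.99,
  quantity: 10,
  inStock: true,
};

async function createProduct() {
  // POST the new product as JSON to the create endpoint.
  const res = await fetch(`${BASE_URL}/products`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(newProduct),
  });
  return res.json();
}
```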
The shared `Product` model used by the Product API:

```typescript
interface Product {
  id: string;            // UUID primary key
  name: string;          // Product name
  category: string;      // Product category
  price: number;         // Product price
  quantity: number;      // Stock quantity
  inStock: boolean;      // Availability flag
  description?: string;  // Optional description
  imageUrl?: string;     // Optional image URL
  tags?: string[];       // Optional tags array
  createdAt: string;     // ISO timestamp
  updatedAt: string;     // ISO timestamp
}
```

The item shape written by the Data Importer:

```typescript
interface Item {
  id: string;          // Auto-generated UUID
  name: string;        // From CSV
  description: string; // From CSV
  price: number;       // From CSV (parsed as float)
  createdAt: string;   // Auto-generated timestamp
  updatedAt: string;   // Auto-generated timestamp
}
```

The Product API Lambda is granted these DynamoDB permissions:

```json
{
  "Effect": "Allow",
  "Action": [
    "dynamodb:Query",
    "dynamodb:Scan",
    "dynamodb:GetItem",
    "dynamodb:PutItem",
    "dynamodb:UpdateItem",
    "dynamodb:DeleteItem"
  ],
  "Resource": "arn:aws:dynamodb:*:*:table/http-crud-tutorial-items"
}
```

The Data Importer Lambda is granted S3 read and DynamoDB write access:

```json
{
  "Effect": "Allow",
  "Action": [
    "s3:GetObject",
    "dynamodb:PutItem"
  ],
  "Resource": [
    "arn:aws:s3:::import-s3-to-ddb-dev-data/*",
    "arn:aws:dynamodb:*:*:table/http-crud-tutorial-items"
  ]
}
```

Environment variables:

- `PRODUCTS_TABLE`: DynamoDB table name for the Product API (default: `http-crud-tutorial-items`)
- `TABLE_NAME`: DynamoDB table name for the Data Importer (default: `http-crud-tutorial-items`)
- `S3_BUCKET`: S3 bucket for CSV imports (default: `import-s3-to-ddb-dev-data`)
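Resolving these variables with their defaults might look like the sketch below; this is illustrative and not necessarily how `shared/utils/dynamodb-client.ts` is implemented:

```typescript
// Illustrative sketch: read each variable from the environment, falling
// back to the documented default when it is unset.
const PRODUCTS_TABLE = process.env.PRODUCTS_TABLE ?? "http-crud-tutorial-items";
const TABLE_NAME = process.env.TABLE_NAME ?? "http-crud-tutorial-items";
const S3_BUCKET = process.env.S3_BUCKET ?? "import-s3-to-ddb-dev-data";

console.log(PRODUCTS_TABLE); // "http-crud-tutorial-items" when unset
```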
The data importer uses EventBridge Scheduler with the following configuration:
- Schedule: Daily at 8:00 AM Beirut time
- Cron Expression: `cron(0 8 * * ? *)`
- Timezone: `Asia/Beirut`
- Target: Lambda function (direct invocation, no API Gateway)
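One way this schedule could be expressed in `serverless.yml` is sketched below. This is an assumption, not the project's actual configuration: the function and handler names are illustrative, and `method: scheduler` (which selects EventBridge Scheduler and enables `timezone`) requires a recent Serverless Framework version.

```yaml
functions:
  importItems:
    handler: handler.importItems   # illustrative handler path
    events:
      - schedule:
          method: scheduler        # EventBridge Scheduler, not classic rules
          rate: cron(0 8 * * ? *)  # daily at 08:00
          timezone: Asia/Beirut
```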
View logs for deployed functions:

```bash
# Product API logs
cd apps/product-api
serverless logs -f createProduct --tail

# Data Importer logs
cd apps/data-importer
serverless logs -f importItems --tail
```

The data importer includes comprehensive error handling:
- Missing S3 files for the current date
- CSV parsing errors
- DynamoDB write failures
- Invalid or malformed row data
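A sketch of how these failure modes might be distinguished; the function is illustrative, though `NoSuchKey` is the real error name the AWS SDK v3 S3 client throws for a missing object:

```typescript
// Illustrative sketch: map a caught error to one of the handled categories.
function classifyImportError(err: { name?: string; message?: string }): string {
  if (err.name === "NoSuchKey") return "missing-file";   // no CSV for today
  if (err.message?.toLowerCase().includes("csv")) return "parse-error";
  return "unknown";                                      // e.g. DynamoDB write failure
}

console.log(classifyImportError({ name: "NoSuchKey" })); // missing-file
```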
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
For questions and support:
- Check the CLAUDE.md file for development guidance
- Open an issue in the GitHub repository
- Review the Serverless Framework documentation