Note: This module is deprecated. For all projects, we recommend using the
openai_batch resource directly instead of this module.
This module previously simulated batch processing jobs before the official batch API was fully supported. Now that the openai_batch resource is fully implemented, this module exists only for backward compatibility with older code.
The recommended replacement, using the resource directly:

```hcl
resource "openai_batch" "example" {
  input_file_id     = openai_file.input_file.id
  endpoint          = "/v1/embeddings"
  model             = "text-embedding-ada-002"
  completion_window = "24h"

  # Optional metadata
  metadata = {
    environment = "production"
    project     = "document-embeddings"
  }
}
```

Legacy usage of this module:

```hcl
module "batch" {
  source = "../../modules/batch"

  input_file_id     = openai_file.input_file.id
  endpoint          = "/v1/embeddings"
  model             = "text-embedding-ada-002"
  completion_window = "24h"
}
```

If you're using this module in existing code, we recommend migrating to the direct resource approach. The module outputs approximately match the attributes of the resource, but the actual batch API provides more features and real-time status:
```hcl
# Before
output "batch_id" {
  value = module.batch.batch_id
}

# After
output "batch_id" {
  value = openai_batch.example.id
}
```

| Variable | Type | Description | Default |
|---|---|---|---|
| input_file_id | string | ID of the batch input file | Required |
| endpoint | string | API endpoint for batch processing | Required |
| model | string | Model to use for the batch | Required |
| completion_window | string | Time window for batch processing | "24h" |
| project_id | string | OpenAI project ID | "" |
| metadata | map(string) | Key-value pairs to attach to the batch | {} |
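For reference, a module call that also sets the optional variables might look like the following sketch (the project ID value is a placeholder, not a real ID):

```hcl
module "batch" {
  source = "../../modules/batch"

  input_file_id     = openai_file.input_file.id
  endpoint          = "/v1/embeddings"
  model             = "text-embedding-ada-002"
  completion_window = "24h"

  # Optional variables
  project_id = "proj_placeholder123" # placeholder, not a real project ID
  metadata = {
    environment = "staging"
  }
}
```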

| Output | Type | Description |
|---|---|---|
| batch_id | string | Simulated batch job ID |
| status | string | Always "in_progress" for simulated jobs |
| created_at | string | Static timestamp for job creation |
| expires_at | string | Static timestamp for job expiration |
| output_file_id | string | Simulated output file ID |
| error | string | Always null for simulated jobs |
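When migrating the remaining outputs, each one has a natural counterpart on the resource. The attribute names below (status, output_file_id) are assumptions based on the batch API's field names; verify them against the provider's documentation for openai_batch:

```hcl
# Module output "status" -> resource attribute (assumed name: status).
# Unlike the simulation, this reflects the real batch state.
output "status" {
  value = openai_batch.example.status
}

# Module output "output_file_id" -> resource attribute (assumed name:
# output_file_id), populated once processing completes.
output "output_file_id" {
  value = openai_batch.example.output_file_id
}
```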
This module uses openai_chat_completion to create a simulated batch job. It generates deterministic IDs and static timestamps to provide consistent outputs between runs.
The simulation provides:
- A deterministic batch ID
- Static timestamps for creation and expiration
- A simulated output file ID
- A fixed "in_progress" status
Since this is a simulation:
- Batch jobs don't actually process files
- Status is always reported as "in_progress"
- The simulation doesn't provide actual results
- You cannot monitor progress or retrieve actual outputs
The official openai_batch resource provides:
- Real batch processing through the OpenAI API
- Accurate status tracking (validating, processing, completed, etc.)
- Actual output file IDs when processing completes
- Error handling and reporting
- Metadata support for better organization
- Request count statistics
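As an illustration of the extra information available, the request count statistics could be surfaced as an output. The request_counts attribute and its fields (total, completed, failed) mirror the OpenAI batch API and are assumptions here; check the provider schema before relying on them:

```hcl
# Hypothetical: exposing batch progress from the assumed
# request_counts attribute ({ total, completed, failed }).
output "batch_progress" {
  value = format(
    "%d/%d requests completed, %d failed",
    openai_batch.example.request_counts.completed,
    openai_batch.example.request_counts.total,
    openai_batch.example.request_counts.failed,
  )
}
```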
Only use this module when:
- You have existing code using this module that you're not ready to migrate
- You need to maintain compatibility with configurations written for earlier versions
For all other use cases, use the openai_batch resource directly.