# Batch Predictions
Make predictions on multiple samples in a single API call.
## Input Format
To make batch predictions, provide an array of values for each feature:
```python
import requests

response = requests.post(
    f"{API_URL}/projects/{PROJECT_ID}/predict",
    headers={"X-API-Key": API_KEY, "Content-Type": "application/json"},
    json={
        "input": {
            "date": ["2024-01-15", "2024-01-16", "2024-01-17"],
            "category": ["A", "B", "A"],
            "value": [100, 200, 150]
        }
    }
)
```
> **Important:** All feature arrays must have the same length. This length determines the number of predictions returned.
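Before sending a request, it can help to verify that the arrays line up; a length mismatch is an easy mistake to make when assembling features from different sources. A minimal sketch (the `validate_input` helper is illustrative, not part of the API):

```python
def validate_input(input_data: dict) -> int:
    """Check that all feature arrays share one length; return that length."""
    lengths = {feature: len(values) for feature, values in input_data.items()}
    if len(set(lengths.values())) != 1:
        raise ValueError(f"Feature arrays have mismatched lengths: {lengths}")
    return next(iter(lengths.values()))
```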
## Response Format
The response contains an array of predictions matching the input order:
```json
{
  "success": true,
  "data": {
    "predictions": ["low", "high", "medium"]
  }
}
```
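Since predictions are returned in input order, pairing each prediction with its sample is a simple `zip`. A short sketch, reusing the `response` from the request above:

```python
predictions = response.json()["data"]["predictions"]

# Predictions line up with the input arrays by position
for date, prediction in zip(["2024-01-15", "2024-01-16", "2024-01-17"], predictions):
    print(f"{date}: {prediction}")
```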
## Batch Size Limits
| Parameter | Limit |
|---|---|
| Maximum samples per request | 1,000 |
| Maximum payload size | 10 MB |
For larger batches, split your data into multiple requests.
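One way to split is to slice every feature array into fixed-size chunks and send one request per chunk, concatenating the results. A minimal sketch, assuming the endpoint and headers from the example above (`chunked_predict` and `CHUNK_SIZE` are illustrative names, not part of the API):

```python
import requests

CHUNK_SIZE = 1000  # maximum samples per request

def chunked_predict(input_data: dict) -> list:
    """Send the input in chunks of CHUNK_SIZE and collect all predictions."""
    n_samples = len(next(iter(input_data.values())))
    predictions = []
    for start in range(0, n_samples, CHUNK_SIZE):
        # Slice every feature array over the same index range
        chunk = {
            feature: values[start:start + CHUNK_SIZE]
            for feature, values in input_data.items()
        }
        response = requests.post(
            f"{API_URL}/projects/{PROJECT_ID}/predict",
            headers={"X-API-Key": API_KEY, "Content-Type": "application/json"},
            json={"input": chunk},
        )
        response.raise_for_status()
        predictions.extend(response.json()["data"]["predictions"])
    return predictions
```

Because each chunk preserves input order and chunks are sent sequentially, the concatenated predictions line up with the original samples.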
## Performance Considerations
- **Async recommended:** Large batches may take longer than the synchronous timeout.
- **Parallel requests:** For very large datasets, send multiple batch requests in parallel, as sketched after this list.
- **Webhooks:** Use webhooks to get notified when large batch jobs complete.
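Since each call spends most of its time waiting on the network, a thread pool is one straightforward way to send several batch requests concurrently. A sketch under the assumption that `chunks` is a list of input dicts already split under the 1,000-sample limit (for example, with the splitting pattern above); the `predict_chunk` helper name is illustrative:

```python
import requests
from concurrent.futures import ThreadPoolExecutor

def predict_chunk(chunk: dict) -> list:
    """Send one batch request and return its predictions."""
    response = requests.post(
        f"{API_URL}/projects/{PROJECT_ID}/predict",
        headers={"X-API-Key": API_KEY, "Content-Type": "application/json"},
        json={"input": chunk},
    )
    response.raise_for_status()
    return response.json()["data"]["predictions"]

# chunks: a list of input dicts, each within the 1,000-sample limit (assumed)
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(predict_chunk, chunks))  # preserves chunk order

predictions = [p for chunk_predictions in results for p in chunk_predictions]
```

Keep `max_workers` modest; the API may throttle aggressive parallelism.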
## Example: Processing a CSV File
```python
import pandas as pd
import requests

# Load your data
df = pd.read_csv("data.csv")

# Convert to API format: one array per column
input_data = {
    col: df[col].tolist()
    for col in df.columns
}

# Make batch prediction (remember the 1,000-sample limit per request)
response = requests.post(
    f"{API_URL}/projects/{PROJECT_ID}/predict",
    headers={"X-API-Key": API_KEY, "Content-Type": "application/json"},
    json={"input": input_data},
)
response.raise_for_status()

# Add predictions to the dataframe (response order matches input order)
df["prediction"] = response.json()["data"]["predictions"]
```