Getting Started
The Lab Automation Platform is a distributed system designed for pharmaceutical quality control laboratories. It orchestrates complex analytical workflows using containerized lab instruments and provides real-time monitoring.
Key Features
- Event-Driven Architecture - PostgreSQL NOTIFY/LISTEN for real-time workflow execution
- Plugin System - Dynamic task, service, and instrument plugins
- Automatic Task Sequencing - Database events trigger next tasks automatically
- Real-time Monitoring - Live status updates and progress tracking
- Concurrent Processing - Multiple workflows run simultaneously
Installation
Get the platform running in minutes with Docker Compose.
Prerequisites
- Docker and Docker Compose
- Python 3.9+ (for local development)
- Node.js 18+ (for frontend development)
Quick Start
# Clone the repository
git clone https://github.com/your-org/automation-platform-FastAPI.git
cd automation-platform-FastAPI
# Start all services
docker compose -f compose_v1.yml up -d
# Verify services are running
docker compose -f compose_v1.yml ps
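Once the containers are up, you can confirm the backend answers before creating any workflows. A minimal sketch in Python, assuming the API is exposed on http://localhost:8001 (the base URL used throughout this document):

import requests

# Smoke test: GET /api/workflows/ should answer once the backend is ready.
resp = requests.get("http://localhost:8001/api/workflows/", timeout=5)
resp.raise_for_status()
print("Backend is reachable:", resp.status_code)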
Access Points
- Backend API: http://localhost:8001 (the base URL used in all examples below)
- Interactive API docs: http://localhost:8001/docs (FastAPI's built-in Swagger UI, unless disabled)
Workflow API
Manage workflows through RESTful API endpoints.
POST /api/workflows/
Create a new workflow with automatic event-driven execution.
Request Body
{
  "name": "Pharmaceutical Quality Control",
  "author": "Lab Technician",
  "tasks": [
    {"name": "Sample Measurement"},
    {"name": "Run Weight Balance"},
    {"name": "Weight Balance"}
  ]
}
Response
{
  "id": 93,
  "name": "Pharmaceutical Quality Control",
  "status": "pending",
  "message": "Workflow created - event-driven execution will start automatically",
  "tasks": [
    {
      "id": 278,
      "name": "Sample Measurement",
      "plugin_type": "task",
      "status": "awaiting_manual_completion"
    }
  ]
}
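The same request from Python, as a minimal sketch using requests; the id and status fields read below come straight from the response shown above:

import requests

# Create a workflow; event-driven execution starts automatically.
payload = {
    "name": "Pharmaceutical Quality Control",
    "author": "Lab Technician",
    "tasks": [
        {"name": "Sample Measurement"},
        {"name": "Run Weight Balance"},
        {"name": "Weight Balance"},
    ],
}
resp = requests.post("http://localhost:8001/api/workflows/", json=payload, timeout=10)
resp.raise_for_status()
workflow = resp.json()
print(f"Created workflow {workflow['id']} with status {workflow['status']}")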
GET /api/workflows/
List all workflows with their current status and tasks.
GET /api/workflows/{workflow_id}
Get detailed information about a specific workflow.
PUT /api/workflows/{workflow_id}
Update workflow status or configuration.
Request Body
{
  "status": "running|completed|failed|paused|stopped"
}
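For example, pausing workflow 93 from Python (a minimal sketch; the id comes from the create response above):

import requests

# Pause workflow 93; the allowed status values are listed above.
resp = requests.put(
    "http://localhost:8001/api/workflows/93",
    json={"status": "paused"},
    timeout=10,
)
resp.raise_for_status()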
Task API
Manage individual tasks within workflows.
POST /api/tasks/{task_id}/complete
Complete a manual task with data.
Request Body
{
  "data": {
    "materials_table": [
      {"run": 1, "material_1": 0.1, "material_2": 0.05}
    ]
  }
}
| Parameter | Type | Required | Description |
|---|---|---|---|
| task_id | integer | Yes | Unique identifier for the task |
| data | object | Yes | Task completion data and results |
| materials_table | array | No | Materials and measurements for analysis |
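The same completion call from Python (a minimal sketch; task id 278 comes from the create-workflow response above):

import requests

# Complete manual task 278; on success the database trigger fires a
# NOTIFY event and the next task in the workflow starts automatically.
payload = {
    "data": {
        "materials_table": [
            {"run": 1, "material_1": 0.1, "material_2": 0.05}
        ]
    }
}
resp = requests.post(
    "http://localhost:8001/api/tasks/278/complete",
    json=payload,
    timeout=10,
)
resp.raise_for_status()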
Plugin System
The platform uses a dynamic plugin architecture with three types of plugins:
Plugin Types
- Task Plugins - Manual/User tasks requiring human input
- Service Plugins - Lab services that prepare data and instructions
- Instrument Plugins - Physical/simulated instruments that execute measurements
Available Plugins
GET /api/plugins/available
List all available plugins by type.
Response
{
  "task_plugins": ["Sample Measurement", "Data Review"],
  "service_plugins": ["Run Weight Balance", "Data Processing"],
  "instrument_plugins": ["Weight Balance", "HPLC System", "Sample Prep Station"]
}
Plugin Configuration
Plugins are defined in JSON files and loaded dynamically:
{
  "name": "Weight Balance",
  "type": "instrument",
  "endpoint": "http://host.docker.internal:5011",
  "parameters": {
    "action": "dispense",
    "precision": "0.0001g",
    "capacity": "220g"
  }
}
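A minimal sketch of how such definitions could be discovered and grouped by type. The plugins/ directory layout and the loader itself are illustrative assumptions, not the platform's actual loading code:

import json
from pathlib import Path

def load_plugins(plugin_dir="plugins"):
    """Read every *.json plugin definition and group it by its 'type' field."""
    registry = {"task": {}, "service": {}, "instrument": {}}
    for path in Path(plugin_dir).glob("*.json"):
        spec = json.loads(path.read_text())
        registry[spec["type"]][spec["name"]] = spec
    return registry

plugins = load_plugins()
print(sorted(plugins["instrument"]))  # e.g. ['Weight Balance', ...]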
Event System
The platform uses PostgreSQL NOTIFY/LISTEN for real-time event-driven execution.
How It Works
- Workflow creation triggers database insert
- Database trigger emits NOTIFY event
- Event listener receives notification
- Task plugin loaded and executed via Celery
- Task completion triggers next task automatically
Event Types
- workflow_events - Workflow creation, status changes
- task_events - Task status updates, completions
- system_events - System-wide notifications
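A minimal listener sketch using psycopg2 to subscribe to these channels. The connection parameters are assumptions for a local setup, and the platform's own listener runs inside the backend service:

import select
import psycopg2

# Connect and subscribe to the notification channels listed above.
conn = psycopg2.connect("dbname=lab_automation user=postgres host=localhost")
conn.set_session(autocommit=True)  # LISTEN/NOTIFY works outside transactions
cur = conn.cursor()
for channel in ("workflow_events", "task_events", "system_events"):
    cur.execute(f"LISTEN {channel};")

while True:
    # Block up to 5 s waiting for the socket, then drain pending notifications.
    if select.select([conn], [], [], 5) == ([], [], []):
        continue
    conn.poll()
    while conn.notifies:
        note = conn.notifies.pop(0)
        print(f"{note.channel}: {note.payload}")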
Monitoring Events
# Monitor event-driven execution
docker compose -f compose_v1.yml logs backend --follow | grep "Event-driven:"
# Monitor plugin system activity
docker compose -f compose_v1.yml logs worker --follow | grep "Plugin:"
Monitoring & Troubleshooting
Monitor system health and diagnose issues effectively.
Real-time Monitoring
# Check the status of all containers
docker compose -f compose_v1.yml ps
# Monitor Celery worker activity
docker compose -f compose_v1.yml logs worker --follow
# Monitor instrument activity
docker compose -f compose_v1.yml logs sample-prep-station --follow
docker compose -f compose_v1.yml logs hplc-system --follow
Database Access
# Access PostgreSQL database directly
docker compose -f compose_v1.yml exec db psql -U postgres -d lab_automation
# Sample queries:
# SELECT * FROM workflows ORDER BY created_at DESC;
# SELECT * FROM tasks WHERE workflow_id = 11;
# SELECT * FROM results WHERE task_id = 23;
Performance Optimization
# Scale Celery workers
docker compose -f compose_v1.yml up -d --scale worker=4 worker
# Monitor Redis performance
docker compose -f compose_v1.yml exec redis redis-cli INFO memory
Production Deployment
Deploy the platform in production environments with Kubernetes.
Kubernetes Deployment
apiVersion: apps/v1
kind: Deployment
metadata:
  name: backend
  namespace: lab-automation
spec:
  replicas: 2
  selector:
    matchLabels:
      app: backend
  template:
    metadata:
      labels:
        app: backend
    spec:
      containers:
      - name: backend
        image: lab-automation/backend:latest
        ports:
        - containerPort: 8001
Scaling Considerations
- Multiple replicas for each service
- Horizontal Pod Autoscaler (HPA) for demand-based scaling (see the sketch after this list)
- Load balancing for API endpoints
- Redis clustering for high availability
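A minimal HorizontalPodAutoscaler sketch targeting the backend Deployment above; the CPU target and replica bounds are illustrative assumptions:

apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: backend
  namespace: lab-automation
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: backend
  minReplicas: 2
  maxReplicas: 8
  metrics:
  - type: Resource
    resource:
      name: cpu
      target:
        type: Utilization
        averageUtilization: 70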
Examples
Common usage patterns and integration examples.
Create and Monitor Workflow
# Create workflow
curl -X POST http://localhost:8001/api/workflows/ \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Pharmaceutical QC Analysis",
    "author": "QC Lab",
    "tasks": [
      {"name": "Sample Measurement"},
      {"name": "Run Weight Balance"},
      {"name": "Weight Balance"}
    ]
  }'
# Monitor workflow progress
curl -s http://localhost:8001/api/workflows/93 | python3 -m json.tool
# Complete manual task
curl -X POST http://localhost:8001/api/tasks/278/complete \
  -H "Content-Type: application/json" \
  -d '{
    "data": {
      "materials_table": [
        {"run": 1, "material_1": 0.1, "material_2": 0.05}
      ]
    }
  }'
Batch Processing
# Create multiple workflows for batch processing
for i in {1..5}; do
  curl -X POST http://localhost:8001/api/workflows/ \
    -H "Content-Type: application/json" \
    -d "{\"name\": \"Batch_$(date +%s)_$i\", \"author\": \"QC Lab\", \"tasks\": [{\"name\": \"Sample Measurement\"}, {\"name\": \"Run Weight Balance\"}, {\"name\": \"Weight Balance\"}]}"
done
Integration with LIMS
import requests

# Integration with a Laboratory Information Management System (LIMS)
def create_workflow_from_lims(sample_id, test_methods):
    """Create a workflow for a LIMS sample, with one task per test method."""
    workflow_data = {
        "name": f"LIMS Sample {sample_id}",
        "author": "LIMS Integration",
        "tasks": [{"name": method} for method in test_methods],
        "metadata": {
            "sample_id": sample_id,
            "source": "LIMS"
        }
    }
    response = requests.post(
        "http://localhost:8001/api/workflows/",
        json=workflow_data,
        timeout=10,
    )
    response.raise_for_status()  # fail loudly if the API rejects the request
    return response.json()

# Example usage
workflow = create_workflow_from_lims("S001", ["Sample Measurement", "HPLC System"])
print(f"Created workflow {workflow['id']}")
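A LIMS integration usually also needs to wait for results. A minimal polling sketch built on GET /api/workflows/{workflow_id}; the poll interval and the set of terminal statuses are assumptions based on the status values documented above:

import time
import requests

def wait_for_workflow(workflow_id, interval=5, timeout=600):
    """Poll until the workflow reaches a terminal status or the timeout expires."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        resp = requests.get(
            f"http://localhost:8001/api/workflows/{workflow_id}", timeout=10
        )
        resp.raise_for_status()
        status = resp.json()["status"]
        if status in ("completed", "failed", "stopped"):
            return status
        time.sleep(interval)
    raise TimeoutError(f"Workflow {workflow_id} did not finish within {timeout}s")

print(wait_for_workflow(workflow["id"]))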