API Documentation
Programmatic access to DicomPressor DICOM processing services
Overview
The DicomPressor API allows you to programmatically upload, process, and download DICOM files. You can merge multi-frame images, compress, anonymize, split, export to PNG/video, and more — all via simple REST API calls.
Base URL: https://dicompressor.sitnov.work
All API endpoints return JSON responses. File uploads use multipart/form-data; processing requests use application/json.
Authentication
All API endpoints require an API key. You can pass it in two ways:
| Method | Example |
|---|---|
| HTTP Header (recommended) | X-API-Key: your-key-here |
| Query parameter | ?api_key=your-key-here |
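As a client-side sketch (the helper names `header_auth` and `query_auth` are ours, not part of the API), the two styles look like this in Python:

```python
KEY = "your-key-here"

def header_auth(key: str) -> dict:
    """Recommended: send the key in the X-API-Key request header."""
    return {"X-API-Key": key}

def query_auth(key: str) -> dict:
    """Alternative: send the key as an api_key query parameter."""
    return {"api_key": key}

# With the requests library, either style works on every endpoint:
#   requests.post(url, headers=header_auth(KEY))
#   requests.post(url, params=query_auth(KEY))
```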
Quick Start
Merge 400 DICOM slices into one multi-frame file in 4 commands:
```bash
# 1. Create a session
SESSION=$(curl -s -X POST \
  -H "X-API-Key: YOUR_KEY" \
  https://dicompressor.sitnov.work/api/session \
  | jq -r .session_id)

# 2. Upload DICOM files
curl -X POST \
  -H "X-API-Key: YOUR_KEY" \
  -F "files=@0000.dcm" \
  -F "files=@0001.dcm" \
  https://dicompressor.sitnov.work/api/upload/$SESSION

# 3. Process (merge)
curl -s -X POST \
  -H "X-API-Key: YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"action":"merge"}' \
  https://dicompressor.sitnov.work/api/process/$SESSION

# 4. Download result
curl -H "X-API-Key: YOUR_KEY" \
  -o merged.dcm \
  https://dicompressor.sitnov.work/api/download/$SESSION/merged_multiframe.dcm
```
Workflow
Every processing job follows the same flow: create a session, upload files, run a processing action, then download the results.
Sessions are temporary and automatically expire after 1 hour.
Each session has an input/ directory for uploaded files and an output/
directory for results.
Endpoints
Create Session
Creates a new processing session. Returns a session_id (UUID) used in all subsequent requests.
Response
```json
{
  "session_id": "c256553e-b352-4280-af2c-21cb8bfb1f64"
}
```
Upload Files
Upload DICOM files to the session. Use multipart/form-data with the field name files. You can upload multiple files per request and call this endpoint multiple times (batch upload).
| Parameter | Type | Description |
|---|---|---|
| session_id | path | Session UUID from /api/session |
| files | form-data | One or more DICOM files |
Response
```json
{
  "uploaded": 400,
  "files": ["0000.dcm", "0001.dcm", "..."]
}
```
List Files
List all input and output files in the session.
Response
```json
{
  "input": [
    {"name": "0000.dcm", "size": 406602},
    {"name": "0001.dcm", "size": 406602}
  ],
  "output": [
    {"name": "merged_multiframe.dcm", "size": 162068266}
  ]
}
```
Process Files
Run a processing action on the uploaded files. Send a JSON body with an action field.
| Field | Type | Required | Description |
|---|---|---|---|
| action | string | Yes | One of the available actions |
| params | string | Only for anonymize | Anonymization parameters (DICOM tag=value, one per line) |
Response
```json
{
  "success": true,
  "action": "merge",
  "log": "[INFO] Processing 400 files...\n[INFO] Done.",
  "output_files": [
    {"name": "Patient_series31_multiframe.dcm", "size": 162068266}
  ]
}
```
Download Result
Download a result file. The filename comes from the output_files array in the process response.
Returns the file as application/octet-stream with Content-Disposition: attachment.
View File (VolView)
Serve a file inline (for loading in the VolView 3D viewer). Same as download, but without the Content-Disposition: attachment header; the file is served with the application/dicom MIME type.
To view in VolView, construct a URL like:
https://dicompressor.sitnov.work/volview/?urls=https://dicompressor.sitnov.work/api/view/{session_id}/{filename}
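As an illustration, the URL pattern above can be assembled in Python. The `volview_url` helper is our name, and percent-encoding the filename is a defensive choice on our part, not something the API requires:

```python
from urllib.parse import quote

BASE = "https://dicompressor.sitnov.work"

def volview_url(session_id: str, filename: str) -> str:
    """Build a VolView link that loads a session file via the inline /api/view endpoint."""
    view = f"{BASE}/api/view/{session_id}/{quote(filename)}"
    return f"{BASE}/volview/?urls={view}"

print(volview_url("c256553e-b352-4280-af2c-21cb8bfb1f64", "merged_multiframe.dcm"))
```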
Available Actions
| Action | CLI Flag | Description | Input | Output |
|---|---|---|---|---|
| merge | -j | Merge single-frame DICOM slices into one multi-frame file | Multiple .dcm | Single multi-frame .dcm |
| split | -s | Split a multi-frame DICOM into individual frames | Single .dcm | Multiple .dcm |
| compress_lossless | -x | Apply JPEG 2000 lossless compression | .dcm files | Compressed .dcm |
| compress_lossy | -z | Apply JPEG 2000 lossy compression | .dcm files | Compressed .dcm |
| decompress | -u | Remove compression from DICOM files | .dcm files | Uncompressed .dcm |
| anonymize | -a | Replace patient identifiers with custom values | .dcm files + params | Anonymized .dcm |
| export_png | -I png | Export each frame as a PNG image | .dcm files | .png images |
| export_video | -E | Export multi-frame DICOM as MP4 video | Single .dcm | .mp4 video |
| dicomdir | -d | Create a DICOMDIR index file | .dcm files | DICOMDIR file |
| headers | -t | Export DICOM headers to text | .dcm files | Modified .dcm |
| info | --info | Show detailed DICOM file information | Single .dcm | Text (in log field) |
| summary | --summary | Show folder summary statistics | .dcm files | Text (in log field) |
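Since every action in the table goes through the same /api/process endpoint, one small body builder covers them all. `build_process_body` is our helper, not part of the API:

```python
API = "https://dicompressor.sitnov.work"

def build_process_body(action, params=None):
    """JSON body for POST /api/process/{session_id}; params only matters for anonymize."""
    body = {"action": action}
    if params is not None:
        body["params"] = params
    return body

# With the requests library, any action from the table runs the same way:
#   requests.post(f"{API}/api/process/{session_id}",
#                 headers={"X-API-Key": KEY},
#                 json=build_process_body("compress_lossless"))
```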
CLI: Scheduler & Watch Mode
DicomPressor supports automated batch processing with idempotent markers and continuous monitoring.
--skip-if-done
Creates a .dicompressor_done JSON marker file after successful processing. On subsequent runs, folders with an existing marker are skipped instantly. Safe for cron jobs and schedulers.
```bash
# Merge, but skip if already processed
python3 dicompressor.py -j --skip-if-done -f /path/to/patient_folder

# Second run: instantly skips
python3 dicompressor.py -j --skip-if-done -f /path/to/patient_folder
# Output: SKIPPED (already processed)

# To re-process, delete the marker:
rm /path/to/patient_folder/.dicompressor_done
```
--watch N
Continuously monitors a parent folder for new patient subfolders and auto-processes them every N seconds. Implies --skip-if-done. Press Ctrl+C to stop.
```bash
# Watch /data/patients, merge new subfolders every 5 minutes
python3 dicompressor.py -j --watch 300 -f /data/patients/
```
--output-dir DIR
Copy merged result files to a separate directory. Works with -j, --skip-if-done, and --watch. The directory is auto-created if missing.
```bash
# Watch + copy all merged files to a central folder
python3 dicompressor.py -j --watch 300 --output-dir /data/merged -f /data/patients/
```
Standalone Watch Scripts
For production deployments, dedicated watch scripts are included in the download:
Linux / macOS / WSL (Bash):
```bash
# Basic watch:
./dicompressor-watch.sh /data/patients 300

# Watch + output dir:
./dicompressor-watch.sh /data/patients 300 /data/merged
```
Windows (PowerShell):
```powershell
# Basic watch:
.\dicompressor-watch.ps1 -WatchDir "D:\DICOM\Patients" -IntervalSeconds 300

# Watch + output dir:
.\dicompressor-watch.ps1 -WatchDir "D:\DICOM\Patients" -OutputDir "D:\Merged"
```
Both scripts scan for .dcm files in subfolders, skip already-processed folders (marker file), and log progress with timestamps.
Full Examples
cURL
Complete merge workflow with cURL:
```bash
#!/bin/bash
# DicomPressor API — Full merge example
# Replace YOUR_KEY with your actual API key

API="https://dicompressor.sitnov.work"
KEY="YOUR_KEY"

# Step 1: Create session
echo "Creating session..."
SESSION=$(curl -s -X POST -H "X-API-Key: $KEY" $API/api/session | jq -r .session_id)
echo "Session: $SESSION"

# Step 2: Upload all .dcm files from a folder
echo "Uploading files..."
for f in ./dicoms/*.dcm; do
  curl -s -X POST \
    -H "X-API-Key: $KEY" \
    -F "files=@$f" \
    $API/api/upload/$SESSION > /dev/null
done
echo "Upload complete"

# Step 3: Merge
echo "Merging..."
RESULT=$(curl -s -X POST \
  -H "X-API-Key: $KEY" \
  -H "Content-Type: application/json" \
  -d '{"action":"merge"}' \
  $API/api/process/$SESSION)
echo "Success: $(echo $RESULT | jq .success)"

# Step 4: Download each output file
echo $RESULT | jq -r '.output_files[].name' | while read fname; do
  echo "Downloading $fname..."
  curl -s -H "X-API-Key: $KEY" \
    -o "$fname" \
    $API/api/download/$SESSION/$fname
done
echo "Done!"
```
Python
```python
import requests
from pathlib import Path

API = "https://dicompressor.sitnov.work"
KEY = "YOUR_KEY"
HEADERS = {"X-API-Key": KEY}

# 1. Create session
session_id = requests.post(f"{API}/api/session", headers=HEADERS).json()["session_id"]
print(f"Session: {session_id}")

# 2. Upload files (batches of 20)
dcm_folder = Path("./dicoms")
files = sorted(dcm_folder.glob("*.dcm"))
for i in range(0, len(files), 20):
    batch = files[i:i+20]
    form_files = [("files", (f.name, open(f, "rb"))) for f in batch]
    r = requests.post(f"{API}/api/upload/{session_id}", headers=HEADERS, files=form_files)
    print(f" Uploaded {r.json()['uploaded']} files")

# 3. Process
result = requests.post(
    f"{API}/api/process/{session_id}",
    headers={**HEADERS, "Content-Type": "application/json"},
    json={"action": "merge"}
).json()
print(f"Success: {result['success']}")
print(f"Output files: {len(result['output_files'])}")

# 4. Download results
for f in result["output_files"]:
    r = requests.get(
        f"{API}/api/download/{session_id}/{f['name']}",
        headers=HEADERS
    )
    Path(f["name"]).write_bytes(r.content)
    print(f" Downloaded {f['name']} ({f['size'] / 1048576:.1f} MB)")
```
Python — Anonymize with Custom Parameters
```python
# After creating session and uploading files...
result = requests.post(
    f"{API}/api/process/{session_id}",
    headers={**HEADERS, "Content-Type": "application/json"},
    json={
        "action": "anonymize",
        "params": """(0010,0010)=ANONYMOUS^PATIENT
(0010,0020)=ANON001
(0010,0030)=19000101
(0010,0040)=O
(0008,0090)=DR ANONYMOUS"""
    }
).json()
print(f"Anonymized {len(result['output_files'])} files")
```
PowerShell
```powershell
# DicomPressor API — PowerShell example
$API = "https://dicompressor.sitnov.work"
$Key = "YOUR_KEY"
$Headers = @{ "X-API-Key" = $Key }

# 1. Create session
$session = (Invoke-RestMethod -Uri "$API/api/session" `
    -Method POST -Headers $Headers).session_id
Write-Host "Session: $session"

# 2. Upload files
$files = Get-ChildItem -Path ".\dicoms\*.dcm"
foreach ($file in $files) {
    $form = @{ files = Get-Item $file.FullName }
    Invoke-RestMethod -Uri "$API/api/upload/$session" `
        -Method POST -Headers $Headers -Form $form | Out-Null
}
Write-Host "Uploaded $($files.Count) files"

# 3. Process
$body = @{ action = "merge" } | ConvertTo-Json
$result = Invoke-RestMethod -Uri "$API/api/process/$session" `
    -Method POST -Headers $Headers `
    -ContentType "application/json" -Body $body
Write-Host "Success: $($result.success)"

# 4. Download
foreach ($f in $result.output_files) {
    Invoke-WebRequest -Uri "$API/api/download/$session/$($f.name)" `
        -Headers $Headers -OutFile $f.name
    Write-Host "Downloaded $($f.name)"
}
```
Bash — Batch Upload (Fast)
```bash
#!/bin/bash
# Fast batch upload — sends 50 files per request
API="https://dicompressor.sitnov.work"
KEY="YOUR_KEY"
FOLDER="./dicoms"

SESSION=$(curl -s -X POST -H "X-API-Key: $KEY" $API/api/session | jq -r .session_id)

# Build batch upload commands
FILES=($FOLDER/*.dcm)
BATCH=50
for ((i=0; i<${#FILES[@]}; i+=BATCH)); do
  CMD="curl -s -X POST -H 'X-API-Key: $KEY'"
  for ((j=i; j<i+BATCH && j<${#FILES[@]}; j++)); do
    CMD+=" -F 'files=@${FILES[$j]}'"
  done
  CMD+=" $API/api/upload/$SESSION"
  eval $CMD | jq -r '"Batch: uploaded \(.uploaded) files"'
done

# Process
curl -s -X POST \
  -H "X-API-Key: $KEY" \
  -H "Content-Type: application/json" \
  -d '{"action":"merge"}' \
  $API/api/process/$SESSION | jq .
```
Error Handling
The API returns standard HTTP status codes:
| Code | Meaning | Common Cause |
|---|---|---|
| 200 | Success | — |
| 400 | Bad Request | Invalid action, missing files |
| 401 | Unauthorized | Invalid or missing API key |
| 404 | Not Found | Invalid session ID or filename |
| 413 | Payload Too Large | Upload exceeds 500 MB |
| 429 | Too Many Requests | Rate limit exceeded |
| 503 | Service Unavailable | Max sessions reached (server busy) |
| 504 | Gateway Timeout | Processing exceeded the 5-minute limit |
All error responses are JSON:
```json
{
  "error": "Description of the problem"
}
```
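A thin wrapper in client code can surface that error field instead of a bare status code. `check` is our helper; it works with any requests-style response object (one exposing status_code, json(), and text):

```python
def check(resp):
    """Return parsed JSON on 200; raise with the API's error message otherwise."""
    if resp.status_code != 200:
        try:
            detail = resp.json().get("error", "unknown error")
        except ValueError:
            detail = resp.text
        raise RuntimeError(f"HTTP {resp.status_code}: {detail}")
    return resp.json()

# usage: result = check(requests.post(f"{API}/api/process/{session_id}", ...))
```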
Rate Limits
| Limit | Value |
|---|---|
| API requests per minute (per IP) | 30 |
| Session creation per minute (per IP) | 5 |
| Max concurrent sessions (server-wide) | 50 |
| Max upload per request | 500 MB |
| Max files per session | 2000 |
| Processing timeout | 5 minutes |
| Session TTL (auto-delete) | 1 hour |
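Given the 30 requests/minute ceiling, clients should back off on 429 (and 503) rather than fail outright. A minimal sketch; `with_retry` is our helper and the delay values are arbitrary defaults:

```python
import time

def with_retry(send, max_attempts=5, base_delay=2.0):
    """Call send() until the response is not 429/503, with exponential backoff."""
    resp = send()
    for attempt in range(1, max_attempts):
        if resp.status_code not in (429, 503):
            break
        time.sleep(base_delay * (2 ** (attempt - 1)))  # 2s, 4s, 8s, ...
        resp = send()
    return resp

# usage: resp = with_retry(lambda: requests.post(f"{API}/api/session", headers=HEADERS))
```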
Get an API Key
Email: [email protected]
Please include: your name, organization/project, and a brief description of how you plan to use the API. Keys are issued free of charge for research and medical use.