This document describes how to test the Astro Planner system to ensure it's functioning correctly.
```bash
docker-compose ps
```
You should see:
- ✅ `astronomus` - Main API (port 9247)
- ✅ `astronomus-celery` - Background worker
- ✅ `astronomus-redis` - Task queue

All should show status `Up` and be healthy.
```bash
curl http://localhost:9247/api/health | python3 -m json.tool
```
Expected output:
```json
{
    "status": "healthy",
    "service": "astronomus-api",
    "version": "1.0.0",
    "telescope_connected": false
}
```
Verify both containers have the Docker CLI:
```bash
# Main API container
docker exec astronomus docker --version

# Celery worker container
docker exec astronomus-celery docker --version
```
Both should return `Docker version 28.5.2, build ecc6942` (or similar).
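For scripted monitoring, the same health check can be validated in Python. A minimal sketch: `is_healthy` is a hypothetical helper, and the network call is shown only as a comment so the snippet runs offline.

```python
import json

def is_healthy(payload: str) -> bool:
    """Return True if an /api/health response body reports a healthy service."""
    data = json.loads(payload)
    return data.get("status") == "healthy"

# In a live check, fetch the body first, e.g.:
#   payload = urllib.request.urlopen("http://localhost:9247/api/health").read()
# Offline sample matching the expected output above:
sample = '{"status": "healthy", "service": "astronomus-api", "version": "1.0.0", "telescope_connected": false}'
print(is_healthy(sample))  # → True
```

Checking the parsed `status` field rather than grepping the raw text keeps the check robust if the JSON key order ever changes.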
The easiest way to test the complete processing pipeline is the bundled end-to-end script:
```bash
# Install requirements
pip install astropy requests numpy

# Run the test
python test_processing.py
```
This script will:
- ✅ Check API health
- 📸 Create synthetic star field FITS file
- 📁 Create processing session
- 📤 Upload FITS file
- ⚙️ Start processing job
- ⏳ Monitor progress
- 📥 Download result
- ✅ Verify success
Expected output:
```
============================================================
🧪 Astro Planner Processing Pipeline Test
============================================================

🏥 Checking API health...
✓ API is healthy: {'status': 'healthy', ...}

📸 Creating synthetic star field...
✓ Created test FITS: /tmp/tmpXXXX.fits
  Size: 1024x1024 pixels
  Stars: 100
  File size: 4.19 MB

📁 Creating processing session...
✓ Created session: test_session_1699401234 (ID: 1)

📤 Uploading FITS file...
✓ Uploaded: tmpXXXX.fits (4.19 MB)

✅ Finalizing session...
✓ Session finalized and ready for processing

⚙️ Starting processing with preset 'quick_dso'...
✓ Processing job started (ID: 1)

⏳ Monitoring job progress (timeout: 120s)...
  ⏸️ PENDING: 0.0% -
  🔄 RUNNING: 25.0% - Loading FITS file
  🔄 RUNNING: 50.0% - Applying histogram stretch
  🔄 RUNNING: 75.0% - Exporting to JPEG
  🔄 RUNNING: 100.0% - Complete
✅ Processing completed successfully!
  Output: /fits/sessions/1/result.jpg

📥 Downloading result...
✓ Downloaded: result.jpg (245.3 KB)

============================================================
✅ ALL TESTS PASSED!
============================================================

📄 Result saved to: result.jpg
```
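The monitoring step above boils down to a poll loop with a timeout. A sketch of that logic, with a stubbed status function standing in for the real job-status request (the dict shape mirrors the printed output, not a confirmed API schema):

```python
import time

def wait_for_job(get_status, timeout=120.0, poll_interval=0.01):
    """Poll get_status() until the job succeeds, fails, or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        print(f"{status['state']}: {status['progress']:.1f}% - {status['step']}")
        if status["state"] in ("SUCCESS", "FAILURE"):
            return status
        time.sleep(poll_interval)
    raise TimeoutError("job did not finish within the timeout")

# Stubbed status sequence standing in for repeated GET requests
states = iter([
    {"state": "RUNNING", "progress": 50.0, "step": "Applying histogram stretch"},
    {"state": "SUCCESS", "progress": 100.0, "step": "Complete"},
])
result = wait_for_job(lambda: next(states))
print(result["state"])  # → SUCCESS
```

Using `time.monotonic()` for the deadline avoids surprises if the system clock is adjusted mid-run.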
```bash
open http://localhost:9247
# or visit in your browser
```
- Click the "Process" tab
- Create a new session:
  - Enter session name: `manual_test_001`
  - Click "Create Session"
- Upload a FITS file:
  - Drag and drop or click to browse
  - Watch the upload progress bar
  - You should see ✓ with the file name when complete
- Click a processing preset button:
  - "Quick DSO" for fast auto-stretch
  - "PixInsight Export" for 16-bit TIFF
- Monitor job progress:
  - The progress bar updates every 2 seconds
  - Shows the current step
  - Shows a GPU badge if the GPU was used
- Download the result when complete:
  - Click the "📥 Download Result" button
  - The image opens or downloads
```bash
cd backend
pytest tests/ -v

# Processing tests only
pytest tests/test_processing_integration.py -v

# With coverage
pytest tests/ --cov=app --cov-report=html

# Verbose output with print statements
pytest tests/ -v -s
```
Expected coverage:
- Processing pipeline: >80%
- Direct processor: >90%
- API endpoints: >70%
- Astronomy services: >85%
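At their core, the processing presets these tests cover are stretch operations, and unit tests for them follow the usual pytest pattern. A toy example in that style (`linear_stretch` is a hypothetical stand-in, not the app's actual implementation):

```python
def linear_stretch(pixels, black_point, white_point):
    """Map [black_point, white_point] to [0.0, 1.0], clipping values outside."""
    span = white_point - black_point
    return [min(1.0, max(0.0, (p - black_point) / span)) for p in pixels]

def test_linear_stretch_clips_and_scales():
    out = linear_stretch([0, 50, 100, 150], black_point=50, white_point=150)
    assert out == [0.0, 0.0, 0.5, 1.0]

# pytest would collect and run this automatically; call it directly here
test_linear_stretch_clips_and_scales()
print("ok")
```

Small pure functions like this are what push the "direct processor" coverage target above 90%: they need no Docker, no fixtures, and run in milliseconds.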
```bash
curl -X POST http://localhost:9247/api/planner/plan \
  -H "Content-Type: application/json" \
  -d '{
    "date": "2025-11-15",
    "location": {
      "latitude": 45.9183,
      "longitude": -111.5433,
      "elevation": 1234
    },
    "target_preferences": {
      "target_type": "dso",
      "min_altitude": 30
    }
  }' | python3 -m json.tool
```
Should return scheduled targets with rise/set times.
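The same request body can be built in Python when scripting planner tests; a sketch (the actual POST is left as a comment so the snippet runs offline):

```python
import json

# Same payload as the curl example above
payload = {
    "date": "2025-11-15",
    "location": {"latitude": 45.9183, "longitude": -111.5433, "elevation": 1234},
    "target_preferences": {"target_type": "dso", "min_altitude": 30},
}
body = json.dumps(payload).encode("utf-8")
# POST with urllib.request or requests, e.g.:
#   requests.post("http://localhost:9247/api/planner/plan", data=body,
#                 headers={"Content-Type": "application/json"})
print(json.loads(body)["target_preferences"]["min_altitude"])  # → 30
```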
```bash
curl http://localhost:9247/api/weather/current | python3 -m json.tool
```
Should return current weather conditions (if an API key is configured).
```bash
curl "http://localhost:9247/api/targets/search?query=M31&limit=5" | python3 -m json.tool
```
Should return the Andromeda Galaxy and similar targets.
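When generating search requests from a script, building the query string with `urllib.parse.urlencode` keeps special characters (spaces in target names, for instance) correctly escaped. A minimal sketch:

```python
from urllib.parse import urlencode

# Build the search URL programmatically so the query string is always escaped
params = {"query": "M31", "limit": 5}
url = "http://localhost:9247/api/targets/search?" + urlencode(params)
print(url)  # → http://localhost:9247/api/targets/search?query=M31&limit=5
```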
Test with various file sizes:
```bash
# Small file (512x512)
python test_processing.py   # ~5-10 seconds

# Medium file (1024x1024): modify the script to create a larger image

# Large file (2048x2048): should complete in <60 seconds
```
Test multiple jobs running simultaneously. A sketch with a thread pool (`run_pipeline` here is a stand-in for the create-session, upload, and start-processing steps from `test_processing.py`):
```python
from concurrent.futures import ThreadPoolExecutor

def run_pipeline(i):
    # create session, upload file, start processing (see test_processing.py)
    return f"job-{i}: ok"

# Create multiple sessions and start jobs concurrently
with ThreadPoolExecutor(max_workers=5) as pool:
    results = list(pool.map(run_pipeline, range(5)))
# All should complete successfully
```
Check celery worker logs:
```bash
docker-compose logs -f celery-worker
```
Common issues:
- "Docker is not available"
  - Solution: rebuild the celery worker:
    ```bash
    docker-compose build celery-worker
    docker-compose up -d celery-worker
    ```
- "File not found"
  - Check the FITS_DIR volume mount
  - Verify the file was uploaded successfully
- "Memory error"
  - Increase the Docker memory limit
  - Process smaller files first
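The memory advice above is easy to quantify: a single-plane 32-bit FITS image costs roughly width × height × 4 bytes before any processing overhead. A quick estimator (a hypothetical helper, using decimal megabytes to match the test script's "4.19 MB" report):

```python
def fits_mb(width, height, bytes_per_pixel=4):
    """Approximate raw pixel-data size of a single-plane image, in decimal MB."""
    return width * height * bytes_per_pixel / 1e6

print(round(fits_mb(1024, 1024), 2))  # → 4.19
print(round(fits_mb(2048, 2048), 2))  # → 16.78
```

Intermediate buffers during stretching and export typically multiply this several times over, so a 2048x2048 job can plausibly hit a tight Docker memory limit even though the file itself is small.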
Check main API logs:
```bash
docker-compose logs -f astronomus
```
Restart services:
```bash
docker-compose restart
```
Reset database:
```bash
docker-compose down
rm -rf data/*.db
docker-compose up -d
```
Find what's using port 9247:
```bash
lsof -i :9247
```
Kill native processes:
```bash
./scripts/docker-clean.sh
```
Create `.git/hooks/pre-commit`:
```bash
#!/bin/bash
# Run tests before commit
echo "Running tests..."
cd backend
pytest tests/ -v
if [ $? -ne 0 ]; then
    echo "Tests failed! Commit aborted."
    exit 1
fi
echo "All tests passed!"
exit 0
```
Make it executable:
```bash
chmod +x .git/hooks/pre-commit
```
Create `.github/workflows/test.yml`:
```yaml
name: Tests

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Set up Python
        uses: actions/setup-python@v2
        with:
          python-version: "3.11"
      - name: Install dependencies
        run: |
          pip install -r backend/requirements.txt
          pip install -r backend/requirements-test.txt
      - name: Run tests
        run: |
          cd backend
          pytest tests/ -v --cov=app
      - name: Upload coverage
        uses: codecov/codecov-action@v2
```
The `test_processing.py` script creates synthetic star fields. You can also create custom test data:
```python
from astropy.io import fits
import numpy as np

# Create test image
data = np.random.poisson(lam=100, size=(1024, 1024)).astype(np.float32)

# Add gradient (to test gradient removal)
y, x = np.ogrid[0:1024, 0:1024]
gradient = (y / 1024) * 500
data += gradient

# Add stars
# ... (see test_processing.py for star generation code)

# Save
hdu = fits.PrimaryHDU(data=data)
hdu.header['OBJECT'] = 'Test with Gradient'
hdu.writeto('test_gradient.fits', overwrite=True)
```
For testing with real Seestar S50 data:
- Capture images with Seestar S50 app
- Export FITS files
- Copy them to the `fits/` directory
- Upload via the web UI or test script
Keep a set of baseline processed images to detect regressions:
```bash
# Process baseline image
python test_processing.py

# Save result as baseline
cp result.jpg baselines/test_001_baseline.jpg

# On future changes, compare (requires ImageMagick)
python test_processing.py
compare result.jpg baselines/test_001_baseline.jpg -metric RMSE diff.png
```
Track key metrics:
- Processing time: <30s for 1024x1024 image
- Output file size: 200-300 KB for JPEG
- Memory usage: <2GB per job
- Success rate: >95%
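ImageMagick's `compare` reports RMSE; the same metric is simple to compute in-script if you would rather avoid the external dependency. A minimal sketch on flattened pixel lists (real images would be loaded and flattened first):

```python
import math

def rmse(a, b):
    """Root-mean-square error between two equal-length pixel sequences."""
    assert len(a) == len(b)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

baseline = [10, 20, 30, 40]
candidate = [12, 22, 32, 42]
print(rmse(baseline, candidate))  # → 2.0
```

A regression test would then assert that the RMSE between `result.jpg` and its baseline stays below a chosen threshold.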
```bash
# Check API health
curl http://localhost:9247/api/health

# Check disk space
df -h data/
df -h fits/

# Check logs for errors
docker-compose logs --tail=100 | grep ERROR
```
```bash
# Run full test suite
pytest backend/tests/ -v

# Check database size
ls -lh data/*.db

# Clean old processed files
find fits/sessions/ -type f -mtime +30 -delete
```
```bash
# Update dependencies
pip list --outdated

# Check Docker image sizes
docker images | grep astronomus

# Review and archive old sessions
```
Before releasing changes:
- All unit tests pass
- Integration tests pass
- Manual UI testing complete
- Processing pipeline tested with real data
- Memory usage acceptable
- No error logs during testing
- Documentation updated
- Changelog updated
If tests fail:
- Check logs: `docker-compose logs`
- Verify the Docker setup: `docker-compose ps`
- Review error messages carefully
- Check the TESTING_GUIDE.md troubleshooting section
- Open an issue on GitHub with:
  - Test output
  - Logs
  - System info (`docker version`, `python --version`)
  - Steps to reproduce
Quick daily test:
```bash
python test_processing.py
```
Full test suite:
```bash
pytest backend/tests/ -v
```
System health:
```bash
docker-compose ps
curl http://localhost:9247/api/health
```
Keep testing regularly to catch issues early!
Last Updated: 2025-11-07