A Tiled-based data management system for the Advanced Light Source (ALS).

This project provides a specialized Tiled server configuration for managing scientific data at the ALS. It builds on the Tiled framework to provide efficient data access and management.
## Features

- Built on the robust Tiled framework
- Docker containerization for easy deployment
- Support for Python 3.13
- Comprehensive CI/CD pipeline with GitHub Actions
- Reproducible development environment managed with Pixi
## Quick Start

### Docker

```bash
# Build the Docker image
docker build -t splash_tiled -f Containerfile .

# Run the container
docker run -p 8000:8000 splash_tiled
```

### Local development

```bash
# Clone the repository
git clone https://github.com/als-computing/splash_tiled.git
cd splash_tiled

# Install and activate the environment
pixi install
pixi shell

# Run tests
pixi run test
```

## Prerequisites

- Pixi (recommended)
- Python 3.13
- Docker (for containerized deployment)
## Installation

```bash
# Install Pixi if needed
curl -sSf https://pixi.sh/install.sh | bash

# Install all dependencies
pixi install
```

Pixi manages all dependencies (including Python 3.13) and installs the package in editable mode automatically.
## Development

This project uses Pixi for reproducible environments.

```bash
# Install Pixi if you don't have it
curl -sSf https://pixi.sh/install.sh | bash

# Install the environment
pixi install

# Start a shell in the environment
pixi shell
```

To update dependencies:

```bash
pixi update
```

### Testing

```bash
# Run all tests with coverage
pixi run test

# Or run pytest directly inside a pixi shell
pytest tests/test_specific.py
```

### Code quality

The project uses Black, isort, flake8, and mypy for code quality.

```bash
pixi run lint    # Check formatting, imports, style, and types
pixi run format  # Auto-format code with black and isort
```

For more, see the Pixi documentation.
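As an illustration of the layout pytest discovers under `tests/`, here is a minimal sketch; the file, function, and helper names are hypothetical, not files in this repository:

```python
# tests/test_example.py -- hypothetical test module; pytest collects any
# file named test_*.py and runs every function named test_*.
def normalize_beamline(name: str) -> str:
    # Toy stand-in for project code under test: trim whitespace
    # from a beamline name like " 7.0.2 ".
    return name.strip()


def test_normalize_beamline_strips_whitespace():
    assert normalize_beamline(" 7.0.2 ") == "7.0.2"
```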
## Docker

```bash
# Build the image
docker build -t splash_tiled -f Containerfile .

# Basic run
docker run -p 8000:8000 splash_tiled

# With environment variables
docker run -p 8000:8000 -e TILED_SERVER_ENABLE_ORIGINS=* splash_tiled

# With volume mounting for data
docker run -p 8000:8000 -v /path/to/data:/data splash_tiled
```

### Development compose stack

For access-control testing, use the compose stack in `docker-compose.dev.yaml`. It starts:

- `tiled`: Tiled server from the project `Containerfile`
- `sync-worker`: ESAF + staff sync loop from the same image, running on an interval
- `seed-data`: one-shot service that creates a `beamlines` container with beamline containers, ESAF containers, and array nodes with different `access_blob.tags`
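The tag-based access model seeded here can be sketched with a toy example. The `access_blob` shape and the intersection rule below are assumptions for illustration only, not the project's actual policy code:

```python
# Sketch: assume each node carries an access_blob like
# {"tags": ["bl702", "esaf-12345"]} and each user is granted a set of
# tags; a node is visible when the two tag sets intersect.
def visible_nodes(nodes: dict[str, dict], user_tags: set[str]) -> list[str]:
    return [
        key
        for key, access_blob in nodes.items()
        if set(access_blob.get("tags", [])) & user_tags
    ]


nodes = {
    "bl702/scan1": {"tags": ["bl702"]},
    "bl1232/scan1": {"tags": ["bl1232", "esaf-999"]},
}
print(visible_nodes(nodes, {"bl702"}))  # -> ['bl702/scan1']
```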
Start the server and sync worker:

```bash
docker compose -f docker-compose.dev.yaml up --build -d tiled sync-worker
```

Seed the catalog data:

```bash
docker compose -f docker-compose.dev.yaml run --rm seed-data
```

Adjust sync cadence and target beamlines via environment variables in `docker-compose.dev.yaml`:

- `SYNC_CRON` (for example: `*/5 * * * *`)
- `BEAMLINES` (comma-separated list or `all`)
Stop everything:

```bash
docker compose -f docker-compose.dev.yaml down
```

## CI/CD

The project includes a comprehensive GitHub Actions workflow that:
- Linting: Runs code quality checks (black, isort, flake8, mypy)
- Testing: Executes the test suite with Python 3.13
- Building: Creates Docker images for multiple architectures
- Publishing: Pushes images to GitHub Container Registry
The workflow is triggered on:

- Pushes to the `main` or `develop` branches
- Pull requests to `main`
- Release publications
Docker images are automatically published to `ghcr.io/als-lbl/splash_tiled`.
## Configuration

The application can be configured through environment variables:

- `TILED_SERVER_ENABLE_ORIGINS`: Configure CORS origins
- `PYTHONPATH`: Python module search path
## ESAF sync CLI

The repository includes a small CLI for loading ESAFs into SQLite.

```bash
als-esaf-sync \
  --beamline 7.0.2 \
  --beamline 12.3.2 \
  --db-path ./esafs.db
```

The script stores data in four tables:

- `beamline` for sync metadata per beamline
- `user` for PI, experimental lead, and participant identities
- `esaf` for the ESAF record itself
- `esaf_user` for user-to-ESAF roles
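A minimal sketch of what such a schema could look like; the column names here are assumptions for illustration, and the actual schema lives in the sync code:

```python
import sqlite3

# Hypothetical minimal version of the four tables described above.
SCHEMA = """
CREATE TABLE beamline (
    name TEXT PRIMARY KEY,          -- e.g. "7.0.2"
    last_synced TEXT                -- sync metadata per beamline
);
CREATE TABLE user (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE esaf (
    id INTEGER PRIMARY KEY,
    beamline_name TEXT REFERENCES beamline(name),
    title TEXT
);
CREATE TABLE esaf_user (
    esaf_id INTEGER REFERENCES esaf(id),
    user_id INTEGER REFERENCES user(id),
    role TEXT,                      -- e.g. PI, lead, participant
    PRIMARY KEY (esaf_id, user_id, role)
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
)]
print(tables)  # -> ['beamline', 'esaf', 'esaf_user', 'user']
```

The junction table `esaf_user` lets one user hold different roles on different ESAFs without duplicating identity rows.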
## Contributing

- Fork the repository
- Create a feature branch
- Make your changes
- Run the test suite
- Submit a pull request
This project follows:
- PEP 8 style guidelines
- Black code formatting
- Import sorting with isort
- Type hints where appropriate
## License

This project is licensed under the MIT License. See the LICENSE file for details.
## Support

For support and questions, please open an issue on the GitHub repository or contact the ALS team at contact@als.lbl.gov.