Technical Architecture

File-Based DAM

Built alongside Photo Developer. SwiftData for local metadata, XMP for portability, FSEvents for auto-refresh. No catalog lock-in.

Pure Apple Stack

User Interface SwiftUI + NSEvent
Query Engine PhotoIndex (inverted indexes)
Read Layer PhotoSummary (value-type structs)
Write Layer SwiftData + CoreData bridge
Metadata Persistence XMP Sidecars (portable)
File System FSEvents + ImageIO + Bookmarks
Library Scale: Large
Folder Switch: Instant
Filter Toggle: Instant
First Photos On Screen: 300ms

No Catalog Lock-In

Shoebox Architecture

Point the app at folders on your disk. No import process. No copying files. Photos stay where you put them. SwiftData stores references and cached metadata locally, but the source of truth is always your files + XMP sidecars.

XMP Sidecar Files

All metadata written to .xmp files next to the RAW. Industry-standard XMP namespaces (xmp:, dc:, lr:, xmpDM:). Preserves Photo Developer settings (pd:) and Camera Raw settings (crs:).
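As a minimal sketch, the sidecar path can be derived from the RAW file's URL, assuming the common same-basename, same-directory convention (illustrative, not the app's exact code):

```swift
import Foundation

// Illustrative sketch: derive the .xmp sidecar URL for a RAW file,
// assuming the same-basename, same-directory convention.
func sidecarURL(for raw: URL) -> URL {
    raw.deletingPathExtension().appendingPathExtension("xmp")
}
```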

FSEvents Auto-Refresh

macOS FSEvents API watches shoebox folders for changes. New photos appear automatically. XMP edits from external apps detected and re-read. Deleted files removed from database. 500ms debounce batches rapid changes. Incremental updates — no full rescan.
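The 500ms debounce can be sketched with a DispatchWorkItem that restarts its window on every event; names and shape here are illustrative, not the app's actual API:

```swift
import Foundation
import Dispatch

// Sketch of a debounced change batcher: each file event resets a timer,
// and one flush fires with the whole batch once events go quiet.
final class ChangeBatcher {
    private var pending: Set<String> = []
    private var work: DispatchWorkItem?
    private let queue = DispatchQueue(label: "batcher")
    private let interval: TimeInterval
    private let flush: (Set<String>) -> Void

    init(interval: TimeInterval = 0.5, flush: @escaping (Set<String>) -> Void) {
        self.interval = interval
        self.flush = flush
    }

    func fileChanged(_ path: String) {
        queue.async {
            self.pending.insert(path)
            self.work?.cancel()                 // restart the debounce window
            let item = DispatchWorkItem {
                let batch = self.pending
                self.pending.removeAll()
                self.flush(batch)
            }
            self.work = item
            self.queue.asyncAfter(deadline: .now() + self.interval, execute: item)
        }
    }
}
```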

XMP Namespace Purpose Compatibility
xmp: Rating, label, metadata dates Adobe Compatible
dc: Dublin Core (title, description, keywords) Adobe Compatible
xmpDM: Pick status (-1 reject, 0 neutral, 1 pick) Adobe Compatible
lr: Hierarchical keywords with | separator Adobe Compatible
crs: Camera Raw Settings Preserved
pd: Photo Developer custom settings Preserved

Read/Write Split

SwiftData handles writes. A value-type struct layer handles all reads. The database can be deleted and rebuilt from XMP sidecars at any time.

Layer Key Properties Role
Photo Write url, rating, pickStatus, captureDate, keywordPaths SwiftData model. Mutations only via resolvePhoto(id:)
PhotoSummary Read ~600-byte struct, ~40 fields, pre-parsed keyword sets Value type for all filtering, sorting, display
PhotoIndex Read 15 inverted indexes + 6 convenience sets O(1) sidebar switch, set-intersection filtering
Keyword name, fullPath, parent/children Hierarchical tree. KeywordNode struct for sidebar
Collection name, parent, isSmartCollection, smartCriteria Regular + smart. Hierarchical with drag-and-drop
Shoebox path, bookmark, isPrivate Root library folder. Multiple shoeboxes, private flag
// Read path: lightweight value-type struct (~600 bytes)
struct PhotoSummary {
    let id: UUID
    let url: URL
    var rating: Int
    var pickStatus: PickStatus       // .picked, .rejected, .neutral
    let captureDate: Date?

    // Pre-parsed for O(1) keyword matching
    var keywordPathSet: Set<String>  // {"Animals|Birds|Owls", ...}
    var keywordNameSet: Set<String>  // {"Owls", "Birds", ...}

    // Pre-computed index keys (no string transforms at query time)
    let cleanedCameraName: String?
    let formattedAperture: String?
    let normalizedShutterSpeed: String?

    // RAW+JPEG pairing (denormalized)
    let pairedPhotoID: UUID?
    let pairedPhotoURL: URL?

    // ... ~30 more fields (EXIF, develop, dates, shoebox)
}

// Write path: resolve managed object only for mutations
func resolvePhoto(id: UUID) -> Photo? {
    // Fetch from SwiftData, edit, save
    // Then: update summary, incremental index update, re-query
}

Smart Collections

Auto-Updating Collections

Define criteria once, collection updates automatically. Filter by rating, pick status, keywords, camera, lens, aperture, shutter speed, ISO, focal length, or file type. AND logic: photos must match ALL selected criteria. Purple sparkles icon distinguishes smart from regular collections.

Live Preview

Smart collection editor shows live count of matching photos as you adjust criteria. "27 photos match these criteria" updates instantly. No surprises after creation.

// Smart collection criteria — 11 dimensions, AND logic
struct SmartCriteria: Codable {
    var ratings: [Int] = []          // [4, 5] = 4★ or 5★
    var pickStatuses: [Int] = []     // [-1, 1] = rejects + picks
    var keywordNames: [String] = []  // ["Portrait", "Studio"]
    var cameras: [String] = []       // ["Canon EOS R5"]
    var lenses: [String] = []        // ["RF 50mm F1.2 L USM"]
    var apertures: [String] = []     // ["f/1.2", "f/1.4"]
    var shutterSpeeds: [String] = [] // ["1/250", "1/500"]
    var isos: [String] = []          // ["100", "400"]
    var focalLengths: [String] = []  // ["50mm", "85mm"]
    var fileTypes: [String] = []     // ["ORF", "JPEG"]
    var editedStatuses: [String] = []// ["Photo Developer", "Unedited"]
}

// Evaluation runs on PhotoSummary, not managed objects
func matches(_ summary: PhotoSummary) -> Bool
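The declaration above stops at the `matches(_:)` signature. A standalone sketch of the AND-logic body, reduced to two dimensions with illustrative field names (an empty criteria list means "no constraint"):

```swift
// Sketch of the AND-logic evaluation: every non-empty dimension must match.
// Standalone reduction with illustrative fields, not the app's PhotoSummary.
func matchesSketch(rating: Int, keywordNames: Set<String>,
                   wantedRatings: [Int], wantedKeywords: [String]) -> Bool {
    // Empty criteria list = dimension not constrained.
    if !wantedRatings.isEmpty && !wantedRatings.contains(rating) { return false }
    // Keyword dimension: any selected keyword present on the photo passes.
    if !wantedKeywords.isEmpty && !wantedKeywords.contains(where: keywordNames.contains) { return false }
    // ... the remaining nine dimensions follow the same pattern.
    return true
}
```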

RAW+JPEG Pairing

Automatic Pairing

IMG_1234.ORF + IMG_1234.jpg = paired. IMG_1234-edited.jpg also pairs. Settings → General: "Show RAW+JPEG as separate items" toggle. Inspector: "View RAW" / "View JPEG" buttons. Context menu: "Show RAW/JPEG File". Toggle between RAW and JPEG instantly in loupe view.
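A minimal sketch of base-name pairing, assuming the "-suffix" convention for edited variants (illustrative, not the app's exact matching rules):

```swift
import Foundation

// Sketch of base-name pairing: strip the extension, then an optional
// "-suffix" (e.g. "-edited", "-bw"), so variants share the RAW's key.
func pairingKey(for url: URL) -> String {
    var base = url.deletingPathExtension().lastPathComponent
    if let dash = base.lastIndex(of: "-") {
        base = String(base[..<dash])   // "IMG_1234-edited" → "IMG_1234"
    }
    return base.lowercased()
}
```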

Version Tracking

Multiple edited versions linked to original RAW. IMG_1234-bw.jpg, IMG_1234-final.jpg detected as versions. Purple "Versions" badge in grid. Inspector shows version list. Click to view any version.

Staleness Detection

Orange "Stale" badge when XMP is newer than paired JPEG. Compares file modification dates. Inspector warning: "Settings changed - re-export recommended". Helps users understand when JPEG preview doesn't match current develop settings. Updated by FSEvents and rescan.

Photo Developer Sync

Feature Implementation Status
Copy/Paste Settings Copy and paste XMP develop settings between photos Shipped
Edit Tracking Display pd:EditCount, pd:EditDuration, pd:FirstEditDate, pd:LastEditDate Shipped
Staleness Warning Compare XMP vs JPEG modification dates, show orange badge Shipped
Dev Badge Cyan "Dev" badge for pd: namespace, teal "CRS" for other editors Shipped
Open in Photo Developer Context menu → "Edit in External Editor" Shipped
Statistics Dashboard Gear performance, hit rates, time patterns, "what works" insights Shipped
Selective Paste Choose which panels to paste (Light, Color, Tone EQ, etc.) Planned
"Organize in Photo Archive Pro. Process in Photo Developer. Sync via XMP."
The intended workflow

Inverted Index

Every filter dimension is a pre-built Set<UUID>. Queries are set intersections, not array iterations. Sidebar switching and filter toggling are O(1) dictionary lookups.

Index Cardinality Query Cost
Shoebox ~2–10 shoeboxes O(1)
Folder ~2,000 unique paths O(1)
Rating, Pick Status 6 + 3 buckets O(1)
Camera, Lens ~10–200 models O(1)
Aperture, Shutter, ISO, Focal Length ~20–50 each O(1)
Keyword (by fullPath) ~3,000 keyword paths O(1)
Edited Status, Develop Technique, File Type 4 + 8 + ~10 O(1)
// Query pipeline — no array iteration for filtering
let base = index.baseSetForSidebar(selection)        // O(1) dict lookup → Set<UUID>
let filtered = index.applyFilters(base, filterState) // set intersections
let sorted = index.materializeAndSort(filtered)      // map + sort
let options = index.computeCascadingFilterOptions()  // scoped per dimension

// Incremental updates — no rebuild needed
index.updateRating(id, from: 3, to: 5)               // O(1): remove from old set, insert into new
index.updateKeywords(id, old: oldSet, new: newSet)   // O(keywords changed)

Cascading Filter Options

Cross-dimensional scoping. Select Camera=E-510, and the Lens dropdown narrows to E-510 lenses only. Each dimension is scoped by all other active filters. Fast path when no filters are active.
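The scoping can be sketched as one set intersection over the other active filters, with a fast path when none are active (names are illustrative):

```swift
import Foundation

// Sketch of cross-dimensional scoping over inverted indexes:
// a dimension's visible options are the values whose ID sets still
// overlap the intersection of every *other* active filter.
func scopedOptions(
    dimension: [String: Set<UUID>],   // e.g. lens name → photo IDs
    otherActiveFilters: [Set<UUID>]   // ID sets from every other active filter
) -> [String] {
    guard var scope = otherActiveFilters.first else {
        return dimension.keys.sorted()  // fast path: no other filters active
    }
    for s in otherActiveFilters.dropFirst() { scope.formIntersection(s) }
    return dimension
        .filter { !$0.value.isDisjoint(with: scope) }
        .keys.sorted()
}
```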

Incremental Updates

Rating, pick status, and keyword changes update the index in place — no rebuild. removePhoto() cleans all 15+ indexes in a single pass. The full index only rebuilds on startup or bulk import.
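The in-place update amounts to moving one ID between buckets; a minimal sketch of the shape (the app's index spans many more dimensions):

```swift
import Foundation

// Sketch of an O(1) incremental index update: move one photo ID between
// rating buckets without touching any other entry.
func updateRating(_ byRating: inout [Int: Set<UUID>],
                  id: UUID, from old: Int, to new: Int) {
    byRating[old]?.remove(id)
    byRating[new, default: []].insert(id)
}
```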

Two-Phase Startup

Phase 1: First 500 Photos in ~300ms

Single-query scalar fetch via NSFetchRequest with dictionaryResultType. Sidebar-aware scope: respects last sidebar selection (folder, shoebox, keyword). Builds a mini-index and renders the grid before the full library loads. No managed objects created.
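In the app these rows come from an NSFetchRequest with dictionaryResultType; as a standalone sketch (with illustrative keys), turning such rows into a mini-index without creating managed objects looks like:

```swift
import Foundation

// Sketch: Phase 1 consumes dictionary rows (in the app, from a
// dictionary-result fetch) and builds a small rating index with
// no managed objects in play. Row keys here are illustrative.
func buildMiniIndex(rows: [[String: Any]]) -> [Int: Set<UUID>] {
    var byRating: [Int: Set<UUID>] = [:]
    for row in rows {
        guard let id = row["id"] as? UUID,
              let rating = row["rating"] as? Int else { continue }
        byRating[rating, default: []].insert(id)
    }
    return byRating
}
```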

Phase 2: Full Library in Background

Launched as a separate Task so Phase 1 fully releases the MainActor — SwiftUI gets a complete render cycle before Phase 2 begins. Full summary load via CoreData bridge dictionary fetch on a background queue. Index build + keyword tree in Task.detached.

CoreData Bridge

SwiftData's change tracking accumulates ~200 bytes per fault at scale. Solution: Mirror-based bridge to NSPersistentContainer. Bulk import via NSBatchInsertRequest (bypasses change tracking entirely). Dictionary fetches for the read layer — no managed objects on the main context.

Thumbnail Engine

3-tier sizing: 300px grid, 600px retina, 2048px loupe preview. LRU memory cache (500 items) + disk cache. EXIF orientation applied. Paired JPEGs preferred over RAW for faster decode.
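A minimal LRU sketch, capacity-bounded like the 500-item memory tier (illustrative; the actual engine may use NSCache or a different structure):

```swift
// Minimal LRU cache sketch: a dictionary for storage plus a
// recency-ordered key list; the least-recently used entry is evicted
// once capacity is reached.
final class LRUCache<Key: Hashable, Value> {
    private var store: [Key: Value] = [:]
    private var order: [Key] = []        // least-recently used first
    private let capacity: Int
    init(capacity: Int) { self.capacity = capacity }

    func get(_ key: Key) -> Value? {
        guard let value = store[key] else { return nil }
        touch(key)
        return value
    }
    func set(_ key: Key, _ value: Value) {
        if store[key] == nil, store.count >= capacity, let evict = order.first {
            store.removeValue(forKey: evict)
            order.removeFirst()
        }
        store[key] = value
        touch(key)
    }
    private func touch(_ key: Key) {
        order.removeAll { $0 == key }
        order.append(key)                // mark most-recently used
    }
}
```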

Batched Writes

Database saves debounced at 100ms — rapid rating changes batch into a single write. XMP sidecar writes queued with 500ms debounce. FSEvents watches all shoeboxes; incremental scan processes only changed files.

"The bottleneck is SwiftData's change tracking, not SQLite."
The lesson that shaped the architecture

One-Time Purchase

Pay Once, Own Forever

Mac App Store distribution. No subscription fatigue. Updates included.

No Catalog Lock-In

Delete the app, keep your files and XMP sidecars. Open them in any XMP-aware tool. Your data is portable.

Works Offline Forever

No license servers. No activation limits. The app runs without internet, now and in 10 years.