I keep a list of every feature I've considered for Photo Developer. It has 30-something entries. Five of them got fully built, shipped, and then ripped back out. The rest died at various stages between "2am idea" and "opened Xcode."

The kill list is longer than the feature list. I'm not sure what that says about me, but I think it says something good about the app.


The five I built and then deleted

These are the ones that sting. Not because they were bad ideas — a couple of them were genuinely clever — but clever and useful aren't the same thing, and apparently I need to keep learning that.

1. Nothing Mode

A mode that hid all UI chrome. Inspector, tab bar, overlays, badges — gone. Just the image on neutral gray. I gave it the L key. It had its own state machine, escape routing, settings persistence.

What I didn't check before building it: what happens when you just... toggle the sidebar? The sidebar is 95% of the chrome. Hiding it already gives you a near-bare canvas. Nothing Mode was a state machine that replicated a checkbox.

I wrote it, wired the shortcut, tested it, felt satisfied, then realized I could get the same result by clicking a button that was already there. That was a Wednesday.

2. Mask Feather Preview

Local adjustments use masks — gradients, brushes, AI-detected regions. The feather (edge softness) matters. So I built a diagnostic: Option+hover to see the mask as a grayscale image. White where it applies, black where it doesn't, feather gradient in between.

Immediately useless. A grayscale mask floating in space tells you nothing about what it's doing. Is the feather falling on the face or the background? You can't tell. The face isn't there anymore.

The app already had a colored tint overlay showing mask coverage on top of the actual image. Press O. That's the diagnostic. Mine removed the thing you're diagnosing — which makes it modern art, not a tool.

3. Tone EQ Guided Filter

This one was technically impressive. That should have tipped me off.

The tone equalizer lets you push specific brightness ranges around — lift shadows, pull down highlights, leave midtones alone. Mine worked per-pixel: this pixel is 40% bright, apply the 40% slider. Simple, correct, same as Lightroom and darktable.
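The per-pixel idea is simple enough to sketch in a few lines. This is an illustrative toy, not the app's Metal code — the five-band layout and the EV-gain blend are my assumptions:

```python
def tone_eq(luminance, band_gains_ev):
    """Map a pixel's luminance (0..1) to an exposure gain by linearly
    interpolating between evenly spaced brightness bands (gains in EV)."""
    n = len(band_gains_ev)
    # Position of this luminance along the band axis (0 .. n-1).
    x = luminance * (n - 1)
    i = min(int(x), n - 2)
    t = x - i
    # Blend the two nearest band sliders.
    ev = band_gains_ev[i] * (1 - t) + band_gains_ev[i + 1] * t
    return luminance * (2.0 ** ev)

# Lift shadows by +1 EV, leave highlights alone.
gains = [1.0, 0.5, 0.0, 0.0, 0.0]
dark = tone_eq(0.10, gains)    # ≈ 0.174: a 10%-bright pixel gets most of the lift
bright = tone_eq(0.90, gains)  # 0.90: a 90%-bright pixel is untouched
```

Every pixel consults only its own luminance, so smooth gradients in the input stay smooth in the output — which is exactly the property the next experiment broke.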

Then I read about guided filters — edge-aware spatial smoothing — and thought: what if the tool understood regions instead of pixels? A face in open shade wouldn't be scattered bright pixels; it'd be one region, lifted as a unit. No halos. Elegant. I wrote a Metal kernel, integrated it, tested it.
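The guided-filter idea itself is easy to sketch in one dimension. This is a self-guided toy version (the guide is the input itself), not the Metal kernel; the radius and epsilon values are made up:

```python
def box_mean(xs, r):
    """Mean over a sliding window of radius r, clamped at the edges."""
    n = len(xs)
    out = []
    for i in range(n):
        lo, hi = max(0, i - r), min(n, i + r + 1)
        out.append(sum(xs[lo:hi]) / (hi - lo))
    return out

def guided_filter_1d(p, r=2, eps=1e-3):
    """Edge-aware smoothing: flat regions (variance << eps) get averaged,
    strong edges (variance >> eps) pass through almost unchanged."""
    mean_p = box_mean(p, r)
    mean_pp = box_mean([x * x for x in p], r)
    var = [mpp - mp * mp for mpp, mp in zip(mean_pp, mean_p)]
    a = [v / (v + eps) for v in var]
    b = [mp * (1 - ai) for mp, ai in zip(mean_p, a)]
    mean_a, mean_b = box_mean(a, r), box_mean(b, r)
    return [ma * x + mb for ma, x, mb in zip(mean_a, p, mean_b)]

# A hard step survives almost intact instead of haloing.
step = [0.0] * 8 + [1.0] * 8
smoothed = guided_filter_1d(step)
```

The seductive part is visible even in the toy: the filter decides, per window, whether it's looking at a flat region or an edge. That decision is exactly what produced the blotches.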

At normal slider values — the ones people actually use — it turned smooth tonal transitions into blotchy painted regions. Like the app had decided where your shadows were and colored them in with a marker. The halos I was solving for were barely visible. The artifacts I introduced were very visible.

The per-pixel approach wasn't broken. I just decided it wasn't sophisticated enough. It was.

4. ISO-Adaptive Noise Reduction

Automatically scale noise reduction based on the image's ISO. High ISO = more smoothing, low ISO = less. A slider value of 30 would do different amounts of work depending on the shot. Smart.

Too smart. "I used 30 last time and it looked great" — yes, but last time was ISO 800. This time is ISO 3200, so 30 now means vaseline on the lens. The slider was lying about its own value.
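What the scaling amounted to, reconstructed as a toy — the actual curve is gone with the feature, and the log-stops-above-base-ISO formulation is my assumption:

```python
import math

def effective_nr(slider, iso, base_iso=800):
    """The visible slider value stays put while the effective strength
    silently scales with how many stops the ISO sits above a baseline."""
    stops_above = max(0.0, math.log2(iso / base_iso))
    return slider * (1.0 + stops_above)

effective_nr(30, 800)   # → 30.0: the slider means what it says
effective_nr(30, 3200)  # → 90.0: same "30", triple the smoothing
```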

If you want more noise reduction, move the slider further. That's what sliders are for. I removed it within a week.

5. Absolute Kelvin for Local White Balance

Stored local white balance as absolute Kelvin instead of a relative warmth offset. A local adjustment of "5200K" meant completely different things depending on whether the base image was tungsten or daylight.

Every other RAW editor uses relative offset for local white balance. There's a reason for that. I found the reason by doing it the other way first.
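The two storage schemes, side by side as a toy. The Kelvin numbers are illustrative, not the app's actual defaults:

```python
def applied_shift_absolute(stored_kelvin, base_kelvin):
    """Absolute storage: the shift a local adjustment applies depends
    on whatever the base image happens to be."""
    return stored_kelvin - base_kelvin

def applied_shift_relative(stored_offset, base_kelvin):
    """Relative storage: the same saved adjustment always means the
    same warmth shift, on any image."""
    return stored_offset

# The same stored "5200K" is a huge warm-up on a tungsten base
# and a slight cool-down on a daylight base.
applied_shift_absolute(5200, 3200)  # → 2000
applied_shift_absolute(5200, 5500)  # → -300
applied_shift_relative(300, 3200)   # → 300
applied_shift_relative(300, 5500)   # → 300
```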


The twelve I talked myself out of

These never got built. Some came close — design documents, feasibility studies, one had a hand-curated database of 15 film stock fingerprints. Sometimes the discipline is just closing the text editor before you start writing code.

Virtual Fill Light

A vignette that brightens away from the detected subject instead of from image center. Vision framework finds the person, calculates the centroid, positions the light there.

Three problems. I already had a Spotlight filter doing subject-centered light shaping. Vignettes are about the frame, not the subject — center falloff guides the eye inward regardless of where someone's standing. And "no other editor does this" turned out to be a warning, not an opportunity. Sometimes nobody does something because nobody should.

Anatomical Spotlight

Virtual Fill Light's weirder cousin. Body pose detection to shape light around detected limbs and torso. Would've been great for the 60% of images where the pose detection works. A disaster for the other 40%.

ML-Personalized Auto Enhance

Train a model on my editing history, predict settings for new images. If the model suggests Temperature +300 and Tint -12 — why? "Because your last 200 golden-hour photos averaged that." That's not a reason. That's a statistic dressed up as taste.

"Match This Look" (Automatic)

Feed in two images, app derives settings to make one look like the other. Histogram matching, feature transfer. The non-automatic version — pin a reference image, match by eye using the existing tools — is still on the roadmap. That one requires judgment. (It's also blocked by a keyboard shortcut conflict, which is its own kind of comedy.)

3D LUT Import

LUTs are popular. LUTs are also opaque transforms you can't edit or partially apply. The whole point of Photo Developer is that you can see the numbers, change the numbers, understand the numbers. A LUT is a black box with good marketing.

If you know what look you want, you can build it. If you don't, a LUT won't teach you.

AI Relighting

Generative fill light, key light, backlighting simulation. I thought about it for twenty minutes before remembering I'd spent two months building gradient and radial mask tools so that light adjustments would be predictable and not look like they were hallucinated.

Layer-Based Editing

Layers, blend modes, opacity stacks. That's Photoshop. Photo Developer is parametric — settings describe the edit, not pixel operations. Local adjustments with masks handle the "I want different settings here" use case. Layers would be solving it again, differently, for no one who asked.

Preset Approximation Detection

This is the one I'm embarrassed about — not the idea, but how far the thinking went.

The concept: detect when a photographer's manual edits happen to approximate a built-in preset. "You're 3 tweaks away from Portra 400." A quiet nudge.

I designed a directional signature matching algorithm. Fifteen hand-curated preset fingerprints — Portra 400, Ektar 100, CineStill 800T, Tri-X 400, eleven others. Confidence scoring. Parameter classification (corrective vs. stylistic). The whole design doc.
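The "directional signature matching" plausibly reduces to something like cosine similarity over parameter deltas — this is my guess at the shape of it, with made-up parameters and fingerprint values:

```python
import math

# Hypothetical parameter order: [temperature, tint, contrast, grain].
PORTRA_400 = [300.0, -5.0, 10.0, 20.0]  # made-up fingerprint values

def direction_match(edit, fingerprint):
    """Cosine similarity between a user's edit and a preset fingerprint,
    both expressed as deltas from default settings: 1.0 means the edit
    points exactly where the preset points."""
    dot = sum(e * f for e, f in zip(edit, fingerprint))
    norm = math.hypot(*edit) * math.hypot(*fingerprint)
    return dot / norm if norm else 0.0

direction_match([280.0, -4.0, 12.0, 18.0], PORTRA_400)    # close to 1.0
direction_match([-280.0, 4.0, -12.0, -18.0], PORTRA_400)  # close to -1.0
```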

Then I did the math. The design constraint was "fire rarely — once per 100-200 edits." That's also the problem. Too rare to become a habit. And the photographers skilled enough to trigger it — the ones touching 5-6 stylistic parameters across categories — already know what they're doing. The nudge is a micro-improvement for people who don't need micro-improvements. Beginners won't hit the threshold because they haven't developed the vocabulary.

The design would have been 400 lines of code and 15 hand-tuned fingerprints, all for a moment that feels clever but doesn't change what happens next. I closed the file.

B&W Filter Suggestion

Expose which filter the Auto B&W algorithm picked, offer alternatives. But auto is auto. If you want control, there are eight channel sliders right there. Showing the machine's reasoning doesn't make the tool better. It makes the interface noisier.

Color Crossover / Couplers

Shifting red would automatically warm orange, the way dye couplers interact on film. Needs spectral simulation. Would be technically interesting and practically invisible. The HSL sliders are independent on purpose.

Nitro's Segmentation Mattes

Apple's RAW pipeline exposes per-segment tone mapping — skin, foliage, sky — but only for iPhone ProRAW. It excludes every DSLR and mirrorless camera. I'm not building features for one phone.

Edge-Aware Vignette

Vignette falloff that follows subject edges. I evaluated three approaches. The distance field one would've been technically beautiful. Vignettes darken corners. That's what they do. That's all they need to do.


Purgatory

Some features aren't dead. They have design documents, feasibility studies, no urgency, and no keyboard shortcut left to assign them.

Oklab HSL Mixer — replace the color model with a perceptually uniform one. Blue saturation at 0.5 genuinely looks darker than yellow at 0.5 in HSL, and Oklab fixes that. But it breaks every existing XMP value, needs eight zone centers retuned by ear, and solves a limitation Lightroom also has. We'd be more correct than everyone and incompatible with everyone. Not yet.
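The claim about HSL is easy to demonstrate with the standard library. A rough check — these are gamma-encoded sRGB values, so treat the luminance numbers as approximate:

```python
import colorsys

def rel_luminance(r, g, b):
    """Rec. 709 relative luminance weights."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# Fully saturated blue and yellow at the same HSL lightness of 0.5.
# Note colorsys takes (hue, lightness, saturation), in that order.
blue = colorsys.hls_to_rgb(240 / 360, 0.5, 1.0)
yellow = colorsys.hls_to_rgb(60 / 360, 0.5, 1.0)

rel_luminance(*blue)    # ≈ 0.07: HSL says these two are equally "light"
rel_luminance(*yellow)  # ≈ 0.93: the eye strongly disagrees
```

Oklab builds that perceptual correction into the model itself — which is exactly why adopting it breaks compatibility with every HSL-based value already written to disk.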

Tone Curve Transfer — extract the tone curve from one image's edit, apply it to another. Algorithm designed, infrastructure exists, maybe 150 lines of code. But Copy/Paste Settings covers 90% of this. The remaining 10% doesn't justify a feature. Maybe later, as a button on a reference panel that also doesn't exist yet.

Soft Proofing — simulate how exports will look in sRGB or other color spaces. The conversion logic already exists in the export pipeline. The UI doesn't. It's waiting for me to care enough about print to give it a keyboard shortcut, and the keyboard is getting full.


What the kill list taught me

Every feature I didn't ship made the ones I did ship better — not in some motivational-poster way, but literally. Each killed feature was a shortcut I didn't assign, a state machine I didn't debug, a preferences pane I didn't build, a regression I didn't chase across releases.

After Nothing Mode and Mask Feather Preview died in the same week, I started asking three questions before opening Xcode. What does the app already do here? Does this add information or remove it? Would a photographer actually notice?

Nothing Mode failed question one. Mask Feather Preview failed question two. Preset Approximation failed question three. These aren't hard questions. I just wasn't asking them.

The guided filter taught me something else: don't improve a tool in a direction that changes what it is. A per-pixel luminance curve and a spatial region classifier are different instruments. Making one into the other doesn't improve either. And the ISO-adaptive NR taught the corollary — don't make a slider lie about its own position.

The features I never built? They taught me that "no other editor does this" is usually not a gap in the market. It's a wall everyone already walked into.


The math

If I'd shipped everything on this list, Photo Developer would have about 1,500 more lines of code, a dozen more keyboard shortcuts, and a settings screen that scrolls. It would also be worse — not theoretically, but in ways I can point to, because I built five of these and watched them make the app worse in real time.

The app has 19 custom Metal shaders, a 25-stage pipeline, seven mask types, spot removal, split toning, film simulation, batch export. It does a lot. But the reason it hangs together is the 30-something things it doesn't do.

Weniger, aber besser — "less, but better." I used to read that as philosophy. Now I read it as project management.


Photo Developer is part of a suite of photography tools I'm building to replace Lightroom. Leaving Lightroom covers the why. Building a RAW Developer covers the how. This one covers the why not.