
Requesting Support for iPhone 16 Pro Max RAW Pipelines

Posted: Sun Jan 04, 2026 4:26 pm
by KERICH
Over the past weeks, I’ve been experimenting extensively with Indigo RAW combined with ColorPerfect, and I’d like to share both my results and a request that naturally follows from this workflow.



Indigo RAW + ColorPerfect in Practice

I’d like to begin by showing several images processed with Indigo RAW + ColorPerfect.

The results genuinely surprised me:
• very natural color reproduction
• extremely wide and usable dynamic range
• smooth tonal transitions without harsh sharpening
• an overall “camera-like” rendering rather than a typical mobile look

As a landscape photographer who enjoys mountaineering and long hikes, this combination has fundamentally changed how I think about equipment. Carrying heavy camera bodies and lenses into the mountains has always been a trade-off between image quality and physical burden.

With iPhone + Indigo + ColorPerfect, I’m now achieving results that, in overall image impression, often rival—or even surpass—what I get from most traditional cameras, especially when weight, flexibility, and shooting conditions are considered.

Even when using ColorPerfect’s generic camera profile, the output quality is already extremely high. While some colors still benefit from minor manual refinement, the baseline is remarkably strong.



Comparing the RAW Pipelines on iPhone

To better understand where this performance comes from, I’ve been working with three distinct RAW pipelines on the iPhone, each with different design goals and trade-offs.



1. Apple ProRAW (48 MP and 12 MP)

Apple ProRAW provides very wide dynamic range, but it is deeply intertwined with Apple’s computational photography stack (Smart HDR, Deep Fusion, tone mapping, sharpening).

From both third-party analyses and my own observations, ProRAW appears to use different internal pipelines depending on output resolution:
• With 48 MP ProRAW, sharpening can apparently be disabled completely by setting the sharpening amount in Lightroom to zero.
• With 12 MP ProRAW, however, visible residual sharpening remains even with sharpening set to zero, suggesting that some sharpening is baked into the data.

I am not a RAW pipeline expert, so I would greatly appreciate confirmation or correction on this point. If these two ProRAW resolutions indeed follow different processing paths, there may also be additional differences (noise reduction, tone mapping, spatial processing) that are not immediately visible.

For this reason, I plan to submit both 48 MP and 12 MP ProRAW ColorChecker captures, clearly labeled and separated.



2. Standard RAW (DNG – ~12 MP only)

Standard RAW provides the most physically direct representation of the sensor output. The rendering is honest and natural, but image quality is limited by the intrinsic capabilities of the small mobile sensor—particularly in terms of noise control and usable dynamic range.

It is also important to note that Apple does not expose a full-resolution 48 MP Bayer RAW path via the standard RAW API. As a result, standard RAW output is limited to approximately 12 MP.

This format reflects the sensor’s true baseline performance without heavy computational intervention.
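
For anyone who wants to check this limitation on their own device, here is a minimal Swift sketch (the function names are mine, purely for illustration) that lists the Bayer RAW formats Apple's standard photo API actually offers; on current iPhones the formats returned here top out at roughly 12 MP:

import AVFoundation

// List the Bayer RAW pixel formats that Apple's standard RAW API exposes
// on the attached photo output. On current iPhones these top out at ~12 MP;
// the 48 MP path is only available as ProRAW.
func availableBayerRawFormats(for output: AVCapturePhotoOutput) -> [OSType] {
    output.availableRawPhotoPixelFormatTypes.filter {
        AVCapturePhotoOutput.isBayerRAWPixelFormat($0)
    }
}

// Request a single-frame Bayer DNG capture using one of those formats.
func makeBayerRawSettings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    guard let format = availableBayerRawFormats(for: output).first else { return nil }
    return AVCapturePhotoSettings(rawPixelFormatType: format)
}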



3. Indigo RAW (~12 MP only)

Indigo RAW represents a very interesting middle ground.

Technically, Indigo continuously captures Bayer RAW frames via Apple’s RAW API and, at the moment of capture, aligns and stacks up to 32 frames before outputting a linear DNG. This approach prioritizes signal accumulation over aggressive spatial noise reduction or sharpening.
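
To illustrate the principle only (this is a sketch of the general idea, not Indigo's actual implementation): because shot noise is uncorrelated between frames, averaging N aligned frames improves the signal-to-noise ratio by roughly sqrt(N), so 32 frames give roughly a 5.7x improvement without any spatial filtering.

// Sketch of multi-frame signal accumulation: average N aligned RAW frames.
// Real implementations must also align frames and reject motion; this only
// shows why stacking reduces noise without touching texture.
func stack(frames: [[Float]]) -> [Float] {
    guard let first = frames.first else { return [] }
    var sum = [Float](repeating: 0, count: first.count)
    for frame in frames {                 // assumes equally sized frames
        for i in frame.indices { sum[i] += frame[i] }
    }
    let n = Float(frames.count)
    return sum.map { $0 / n }
}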

In practice, Indigo RAW offers:
• exceptionally wide dynamic range
• significantly reduced noise through multi-frame stacking
• restrained sharpening with natural texture retention
• smooth, camera-like tonal transitions

Although Indigo RAW is currently limited to ~12 MP (due to the same Bayer RAW API limitations), its perceived detail often rivals much higher-resolution ProRAW output because fewer sharpening artifacts and less texture destruction are present.



Long Exposure and Manual Control: A Key Difference

A particularly important capability is that Indigo can output RAW in a true long-exposure mode while still allowing manual control of exposure parameters (ISO and shutter time).

By contrast, Apple’s native camera can produce RAW using long-exposure / Night-mode style processing, but the exposure decision is essentially automatic, without photographer-style manual control of shutter speed and ISO.

To my knowledge, Indigo is currently the only non-Apple camera app on iOS that can produce RAW under long exposure while still letting the user manually set shutter speed and ISO.
This specific combination is extremely valuable for real-world work such as astrophotography.
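
For context, the public hook that makes this possible at all is AVCaptureDevice's custom exposure mode. Here is a minimal sketch of how an app could set manual exposure (my assumption of the general approach, not Indigo's actual code); note that the maximum single-frame duration is capped by the active format, which is presumably why frame stacking matters for longer effective exposures:

import AVFoundation

// Set a manual shutter time and ISO, clamped to what the active format allows.
func setManualExposure(on device: AVCaptureDevice, seconds: Double, iso: Float) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    let maxSeconds = device.activeFormat.maxExposureDuration.seconds
    let duration = CMTime(seconds: min(seconds, maxSeconds),
                          preferredTimescale: 1_000_000)
    let clampedISO = max(device.activeFormat.minISO,
                         min(iso, device.activeFormat.maxISO))
    device.setExposureModeCustom(duration: duration, iso: clampedISO,
                                 completionHandler: nil)
}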



Request: Supporting iPhone 16 Pro Max RAW Pipelines

Based on these findings, I would like to request support for the iPhone 16 Pro Max across:
• three rear camera modules (each treated as an independent camera), and
• the following RAW pipelines:
  • Apple ProRAW (48 MP and 12 MP)
  • Standard RAW (DNG, ~12 MP)
  • Indigo RAW (~12 MP)

I am currently preparing ColorChecker captures for all three camera modules across all formats, under controlled daylight conditions, and will submit them clearly labeled and separated.

Given the results already achievable with a generic profile, the potential with proper characterization feels extremely promising.

Thank you very much for taking the time to read this, and for all the work that has made workflows like this possible. :D

Re: Requesting Support for iPhone 16 Pro Max RAW Pipelines

Posted: Wed Jan 07, 2026 7:34 pm
by C.Oldendorf
Thanks for the detailed write-up — the Indigo RAW + ColorPerfect results you’re showing are very interesting, and the way you’ve separated the three iPhone “RAW” pipelines is exactly the right approach.

One important detail that may not be obvious to everyone reading: the “Indigo” here is Adobe’s, officially Project Indigo, an Adobe Labs camera app. It’s currently distributed as a free, experimental app, and its central idea is explicitly multi-frame capture per shutter press with computational merging for lower noise and higher dynamic range (rather than a single exposure).

I’d love to try Project Indigo myself when I get the chance — but at the moment I can’t test it first-hand: in our household we have an iPhone 11 Pro and a plain iPhone 13, and Project Indigo does not run on either. According to Adobe / the App Store listing, it supports iPhone 12 Pro/Pro Max, iPhone 13 Pro/Pro Max, and all iPhone 14+ models (and Adobe recommends iPhone 15 Pro or newer for the best experience).

How Indigo differs from Apple ProRAW (and why this may matter for characterization)
Your post already describes the “what”; the key extra point is the “where” in the pipeline:
  • Apple ProRAW (12 / 48 MP) is Apple’s “hybrid”: it combines sensor data with elements of Apple’s computational pipeline (Smart HDR / Deep Fusion / noise reduction etc.) before you ever see the DNG. That makes it a very capable starting point, but it also means it’s not the same kind of “rawness” as a straight sensor readout.
  • Standard RAW (DNG, ~12 MP) is the classic iOS RAW API path (single-frame, as direct as Apple exposes).
  • Project Indigo RAW (~12 MP) captures a burst and merges it into one output image at capture time to improve SNR and DR, while aiming for a more natural look.
That last point is why your request for separate support by “pipeline” is well motivated: even if Indigo writes DNG, it’s a different acquisition process than single-frame RAW, and it is very plausible that it benefits from its own characterization in ColorPerfect versus ProRAW (which bakes in Apple’s choices). The good news is: if Indigo’s goal is fidelity-first merging, it often aligns better with what serious editing expects than a heavily “interpreted” raw.
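
To make “its own characterization” a bit more concrete: in the simplest linear case, profiling a pipeline means solving for a 3x3 matrix that maps the chart's measured camera RGB to known reference values, separately per pipeline. A toy least-squares sketch (illustration only, not ColorPerfect's actual method):

import simd

// Fit a 3x3 matrix M minimizing ||M * camera - target|| over the 24 patches,
// via the normal equations M = (sum of b*a^T) * (sum of a*a^T)^-1.
func fitColorMatrix(camera: [SIMD3<Double>], target: [SIMD3<Double>]) -> double3x3 {
    var ata = double3x3()   // accumulates a * a^T
    var bta = double3x3()   // accumulates b * a^T
    for (a, b) in zip(camera, target) {
        // column j of an outer product v * a^T is v scaled by a's j-th entry
        ata = ata + double3x3(columns: (a * a.x, a * a.y, a * a.z))
        bta = bta + double3x3(columns: (b * a.x, b * a.y, b * a.z))
    }
    return bta * ata.inverse
}

A ProRAW capture and an Indigo capture of the same chart would simply yield different matrices, which is the whole point of characterizing them separately.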

About the “12 MP limit” and why binning today differs from the early iPhone DNG era
This also connects nicely to your note about resolution. Apple’s “honest” RAW capture on iPhone started years ago with the iOS RAW API era (iPhone 6s / SE / 7 etc.), and anyone who shot those early DNGs remembers how brutally noisy they could be — even in decent light — precisely because they were basically small-sensor, single-frame captures.

Today’s main iPhone sensors are much larger and often quad-Bayer at ~48 MP, and many systems get their best real-world quality by relying on binning and/or stacking rather than trying to preserve “48 MP truth” at all costs. So “12 MP” in 2025/2026 doesn’t automatically mean “worse” — it can mean “the clean, stable path” (binning) and, in Indigo’s case, potentially “the clean, stable path plus multi-frame signal accumulation.”
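
For readers who haven't followed the sensor side: binning on a quad-Bayer sensor averages each 2x2 group of same-colored photosites into one site of a conventional Bayer mosaic, trading resolution for noise. An idealized sketch (real sensors bin in the analog domain, which is even more favorable):

// Idealized 2x2 bin of a quad-Bayer mosaic: each same-color 2x2 block of the
// 48 MP mosaic averages down to one photosite of a ~12 MP Bayer mosaic,
// improving SNR by roughly 2x per output pixel.
func bin2x2(mosaic: [[Float]]) -> [[Float]] {
    let h = mosaic.count / 2
    let w = (mosaic.first?.count ?? 0) / 2
    var out = [[Float]](repeating: [Float](repeating: 0, count: w), count: h)
    for y in 0..<h {
        for x in 0..<w {
            out[y][x] = (mosaic[2*y][2*x]     + mosaic[2*y][2*x + 1]
                       + mosaic[2*y + 1][2*x] + mosaic[2*y + 1][2*x + 1]) / 4
        }
    }
    return out
}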

Again: fascinating results — and Project Indigo being Adobe Labs makes this even more interesting, because it suggests a deliberate “fidelity-first” capture experiment rather than another “mobile look” engine.

Re: Requesting Support for iPhone 16 Pro Max RAW Pipelines

Posted: Sat Jan 17, 2026 12:09 pm
by C.Oldendorf
Here is a quick update from Saturday, January 17, 2026.

Even though mid-January in the northern hemisphere isn’t ideal for controlled daylight work (low noon sun angle), I managed to get a short window around noon with roughly half an hour of genuinely usable direct sun.

In that window, I captured a ColorChecker Classic 24 under the same light in the same spot (an open patio door at exactly the right angle, so that I could stay inside while shooting in outdoor lighting) on two phones:
  • iPhone 13 Pro
  • iPhone 15 Pro
And on both devices I recorded both RAW pipelines, so we can compare like-for-like:
  • Project Indigo RAW
  • Apple’s full-resolution RAW pipeline (captured via Halide)
What I’m hoping to learn from this is how closely the pipelines align numerically on each device when you actually measure the chart data.
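
Concretely, the measurement step will look something like this sketch (assuming the 24 patch means have already been extracted from each capture and white-normalized; the function is mine, purely for illustration):

import Foundation
import simd

// Report the per-channel ratio Indigo / Apple RAW for each ColorChecker patch.
// Ratios near 1.0 across the whole chart would mean the pipelines agree.
func comparePatches(indigo: [SIMD3<Double>], appleRaw: [SIMD3<Double>]) {
    for (i, (a, b)) in zip(indigo, appleRaw).enumerated() {
        let ratio = a / b
        print(String(format: "patch %2d  R %.3f  G %.3f  B %.3f",
                     i + 1, ratio.x, ratio.y, ratio.z))
    }
}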

My expectation is that the iPhone 13 Pro may end up being the more internally consistent case (Indigo vs Apple RAW might land closer together), while the iPhone 15 Pro remains the interesting question — because with a modern 48 MP quad-Bayer sensor and the way binning / reconstruction enters the chain, it’s not obvious ahead of time how comparable the two RAW outputs will be.

I’ll do the numerical work next week, and then we should have something concrete to report back rather than just impressions.