Imaging Systems Measured by Information, Not Pixels
March 23, 2026 · 3 min read
In a recent NeurIPS 2025 paper, researchers demonstrated that a single information metric can predict the performance of imaging systems across four diverse domains—color photography, radio astronomy, lensless imaging, and microscopy—with higher information consistently correlating with better downstream task performance. This finding challenges traditional evaluation approaches that rely on separate metrics like resolution and signal-to-noise ratio, which often fail to capture overall system capability when these factors trade off against each other. The work shows that what matters in modern imaging isn't how measurements appear to humans, but how much useful information they contain for the algorithms that process them directly.
Mutual information, which quantifies how much a measurement reduces uncertainty about the object that produced it, serves as the core metric. Two systems with the same mutual information are equivalent in their ability to distinguish objects, even if their measurements look completely different. This single number captures the combined effect of resolution, noise, sampling, and every other factor affecting measurement quality. A blurry, noisy image that preserves the features needed to distinguish objects can contain more information than a sharp, clean image that loses them, unifying quality assessments that were traditionally kept separate.
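To make the equivalence claim concrete, here is a toy sketch (not from the paper): for a linear Gaussian channel y = x + noise, mutual information has the closed form I = ½·log₂(1 + SNR), so two systems whose measurements look nothing alike but share the same signal-to-noise ratio carry identical information.

```python
import numpy as np

def gaussian_channel_info(signal_var, noise_var):
    """Bits of information per measurement for y = x + noise,
    with Gaussian signal and Gaussian noise (toy illustration)."""
    return 0.5 * np.log2(1.0 + signal_var / noise_var)

# A faint signal with very low noise...
print(gaussian_channel_info(4.0, 0.25))   # ~2.04 bits
# ...and a strong signal with proportionally more noise carry the
# same information, despite producing very different-looking data.
print(gaussian_channel_info(16.0, 1.0))   # ~2.04 bits
```

Real imaging systems have no such closed form, which is exactly why the estimation framework described next is needed.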
The researchers developed a framework to estimate mutual information directly from measurements, without requiring explicit object models and without ignoring physical constraints. They decomposed the problem into two terms: H(Y), measuring total variation in measurements from both object differences and noise, and H(Y|X), measuring variation from noise alone. Since imaging systems have well-characterized noise—photon shot noise follows Poisson distributions, electronic readout noise is Gaussian—H(Y|X) can be computed directly, leaving only H(Y) to be learned from data.
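The decomposition above can be sketched in a few lines. Assuming i.i.d. Gaussian readout noise with known standard deviation sigma (the function names here are illustrative, not the paper's API), the noise term H(Y|X) is available in closed form, and the information estimate is simply the difference:

```python
import numpy as np

def conditional_entropy_gaussian(sigma, n_pixels):
    """H(Y|X) in nats for i.i.d. Gaussian noise: the differential
    entropy of an N(0, sigma^2) variable, summed over pixels."""
    return n_pixels * 0.5 * np.log(2 * np.pi * np.e * sigma**2)

def mutual_info_estimate(h_y, sigma, n_pixels):
    """I(X;Y) = H(Y) - H(Y|X). H(Y) comes from a probabilistic
    model fit to the measurements; H(Y|X) from the noise model."""
    return h_y - conditional_entropy_gaussian(sigma, n_pixels)
```

For Poisson shot noise the conditional entropy has no simple closed form, but it can still be computed directly from the known noise distribution, which is the key point: only H(Y) must be learned.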
For H(Y), they fit probabilistic models to datasets of measurements, testing three approaches spanning the efficiency-accuracy tradeoff: a stationary Gaussian process (fastest), a full Gaussian (intermediate), and an autoregressive PixelCNN (most accurate). This approach yields an upper bound on the true information, meaning any modeling error can only overestimate it, never underestimate it. The framework requires only noisy measurements and a noise model, avoiding the need for ground truth data or subjective visual assessment.
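A minimal sketch of the "full Gaussian" option, under assumed simplifications (flattened measurement vectors, a small regularizer for numerical stability; names are illustrative): because the Gaussian is the maximum-entropy distribution for a given covariance, its entropy can only overestimate the true H(Y), which is where the upper-bound property comes from.

```python
import numpy as np

def gaussian_entropy_upper_bound(measurements):
    """Upper-bound H(Y) in nats by fitting a full Gaussian.

    measurements: (n_samples, d) array of flattened measurements.
    Entropy of N(mu, Sigma) is 0.5 * (d*log(2*pi*e) + log det Sigma).
    """
    d = measurements.shape[1]
    cov = np.cov(measurements, rowvar=False)
    cov += 1e-6 * np.eye(d)              # regularize for stability
    sign, logdet = np.linalg.slogdet(cov)
    return 0.5 * (d * np.log(2 * np.pi * np.e) + logdet)
```

The stationary Gaussian process variant would additionally tie covariance entries by spatial offset, trading accuracy for speed; the PixelCNN replaces the Gaussian likelihood entirely.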
In practical tests, information estimates correctly ranked color filter designs—traditional Bayer pattern, random arrangement, and learned arrangement—matching neural network demosaicing rankings without requiring reconstruction algorithms. For radio telescope arrays, information estimates predicted reconstruction quality across configurations, enabling site selection without expensive image reconstruction. In lensless imaging and microscopy applications, information estimates correlated with reconstruction accuracy and neural network performance at predicting protein expression.
The team also developed Information-Driven Encoder Analysis Learning (IDEAL), which uses gradient ascent on information estimates to optimize imaging system parameters without requiring a decoder network. When tested on color filter design, starting from random arrangements, IDEAL progressively improved designs to match end-to-end optimization performance in both information content and reconstruction quality. This approach avoids memory constraints and optimization difficulties associated with backpropagating through entire decoder networks during training.
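The core move in IDEAL, gradient ascent on an information estimate with no decoder in the loop, can be illustrated with a deliberately simplified toy problem (entirely illustrative, not the paper's setup): splitting a fixed photon budget across two channels with different signal strengths to maximize total Gaussian-channel information.

```python
import numpy as np

s = np.array([9.0, 1.0])   # per-channel signal variance (toy values)
noise = 0.5                # known readout noise variance

def info(w):
    """Total information (nats) when fraction w of the budget goes
    to channel 0 and 1-w to channel 1."""
    alloc = np.array([w, 1.0 - w])
    return np.sum(0.5 * np.log(1.0 + alloc * s / noise))

# Gradient ascent on the information estimate itself -- no decoder
# network, no reconstruction step (finite differences stand in for
# autodiff to keep the sketch dependency-free).
w, lr, eps = 0.5, 0.01, 1e-5
for _ in range(500):
    grad = (info(w + eps) - info(w - eps)) / (2 * eps)
    w = np.clip(w + lr * grad, 0.0, 1.0)

print(w)  # converges toward the information-optimal split
```

The optimum here favors the stronger channel but still allocates some budget to the weaker one, since information grows only logarithmically with signal. In the paper's actual setting, the "parameter" is the physical encoder (e.g., a color filter arrangement) and the information estimate is differentiated with automatic differentiation.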
Current evaluation approaches require subjective visual assessment, ground truth data that is often unavailable, or isolated metrics that miss overall capability. This framework provides an objective, unified metric computed from measurements alone, and its computational efficiency suggests possibilities for designing imaging systems that were previously intractable to optimize. The framework may extend beyond imaging to any sensing domain with deterministic encoding and known noise characteristics, including electronic, biological, and chemical sensors.
While the approach provides an upper bound on information and was validated across four imaging domains, the authors note it requires well-characterized noise models and measurement datasets. They explore these capabilities more extensively in follow-on work, with code available on GitHub and a video summary on their project website. This information-centric perspective could transform how we evaluate and design the sensing systems increasingly fundamental to technology from smartphones to scientific instruments.