We describe a framework for quantifying color image distortion based on an adaptive signal decomposition. Specifically, local blocks of the image error are decomposed using a set of spatio-chromatic basis functions that are adapted to the spatial and color structure of the original image. The adaptive functions are chosen to isolate specific distortions such as changes in luminance, hue, and saturation. These adaptive basis functions augment a generic orthonormal basis, and the overall distortion is computed as a weighted sum of the coefficients of the resulting overcomplete decomposition, with smaller weights assigned to the adaptive terms. A set of preliminary experiments shows that the proposed distortion measure is consistent with human perception of color images subjected to a variety of common distortions. The framework is easily extended to include any form of continuous spatio-chromatic distortion.
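The decomposition described above can be illustrated with a minimal sketch. All names and weight values here are hypothetical, not taken from the paper, and one simple way to resolve the non-uniqueness of an overcomplete decomposition is assumed: the adaptive terms absorb as much of the error as possible via least squares, and the residual is analyzed in the generic orthonormal basis.

```python
import numpy as np

def distortion(error_block, B, A, w_generic=1.0, w_adaptive=0.1):
    """Hypothetical sketch of a weighted-coefficient distortion measure.

    error_block : flattened local block of the image error (length n)
    B : (n, n) generic orthonormal basis (columns are basis vectors)
    A : (n, k) adaptive basis vectors adapted to the reference image
    w_generic, w_adaptive : illustrative weights; the adaptive terms
        get the smaller weight so distortions they isolate (e.g. a
        uniform luminance shift) are penalized less.
    """
    e = np.asarray(error_block, dtype=float).ravel()
    # Least-squares coefficients for the adaptive components.
    c_a, *_ = np.linalg.lstsq(A, e, rcond=None)
    residual = e - A @ c_a
    # For an orthonormal basis, analysis is just the transpose.
    c_b = B.T @ residual
    # Weighted sum of squared coefficients across both parts of the
    # overcomplete decomposition.
    return np.sqrt(w_generic**2 * np.sum(c_b**2)
                   + w_adaptive**2 * np.sum(c_a**2))
```

With a uniform error block lying entirely in the span of a single adaptive function, the generic coefficients vanish and the measure is scaled only by the small adaptive weight, mimicking the intended leniency toward isolated distortions.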