The conventional method of generating a basis that is optimally adapted, in the mean-squared-error (MSE) sense, to an ensemble of signals is Principal Component Analysis (PCA). A more ambitious modern goal is the construction of bases adapted to individual signal instances. Here we develop a new framework for instance-adaptive signal representation that exploits the fact that many real-world signals exhibit local self-similarity. Specifically, we decompose the signal into multiscale subbands, and then represent local blocks of each subband using basis functions that are derived linearly from the surrounding context. The linear mappings that generate these basis functions are learned sequentially, each optimized to account for as much of the blocks' variance as possible. We apply this methodology to learning a coarse-to-fine representation of images within a multiscale basis, demonstrating that the adaptive basis can account for significantly more variance than a PCA basis of the same dimensionality.
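The core idea, deriving a local block's representation linearly from its surrounding context and comparing the variance accounted for against a fixed PCA basis, can be sketched in a toy 1-D setting. The block and context sizes are illustrative assumptions, and a single least-squares mapping stands in for the paper's sequentially learned, variance-maximizing mappings applied to multiscale subbands of images:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D signal with local self-similarity: heavily smoothed noise.
n = 4096
sig = np.convolve(rng.standard_normal(n + 64), np.ones(32) / 32, mode="valid")

# Extract non-overlapping blocks and the context surrounding each block
# (hypothetical sizes, chosen only for illustration).
B, K = 8, 16  # block length; context length (K/2 samples on each side)
blocks, ctxs = [], []
for start in range(K // 2, len(sig) - B - K // 2, B):
    blocks.append(sig[start:start + B])
    ctxs.append(np.concatenate([sig[start - K // 2:start],
                                sig[start + B:start + B + K // 2]]))
X = np.asarray(blocks)  # (n_blocks, B)
C = np.asarray(ctxs)    # (n_blocks, K)
X -= X.mean(axis=0)
C -= C.mean(axis=0)

# Context-adaptive linear approximation: predict each block from its own
# context via one least-squares mapping M (a simplified stand-in for the
# sequentially learned mappings described in the text).
M, *_ = np.linalg.lstsq(C, X, rcond=None)
frac_adaptive = 1.0 - ((X - C @ M) ** 2).sum() / (X ** 2).sum()

# Fixed (non-adaptive) baseline: fraction of variance captured by the top
# principal component of the blocks themselves.
_, s, _ = np.linalg.svd(X, full_matrices=False)
frac_pca1 = s[0] ** 2 / (s ** 2).sum()

print(f"variance accounted for, adaptive (from context): {frac_adaptive:.3f}")
print(f"variance accounted for, PCA (1 component):       {frac_pca1:.3f}")
```

Because the synthetic signal's correlation length far exceeds the block size, the context carries most of the information needed to reconstruct each block, so the context-derived approximation accounts for a large fraction of the block variance.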