Multimodal Microscopy Image Alignment Using Spatial and Shape Information and a Branch-and-Bound Algorithm

Shuonan Chen, Bovey Y. Rao, Stephanie Herrlinger, Attila Losonczy, Liam Paninski, Erdem Varol

    Research output: Contribution to journal › Conference article › peer-review


    Multimodal microscopy experiments that image the same population of cells under different experimental conditions have become a widely used approach in systems and molecular neuroscience. The main obstacle is aligning the different imaging modalities to obtain complementary information about the observed cell population (e.g., gene expression and calcium signals). Traditional image registration methods perform poorly when only a small subset of cells is present in both images, as is common in multimodal experiments. We cast multimodal microscopy alignment as a cell subset matching problem. To solve this non-convex problem, we introduce an efficient and globally optimal branch-and-bound algorithm that finds subsets of point clouds in rotational alignment with each other. In addition, we use complementary information about cell shape and location to compute matching likelihoods for cell pairs across the two imaging modalities, further pruning the optimization search tree. Finally, we use the maximal set of cells in rigid rotational alignment to seed image deformation fields and obtain the final registration result. Our framework outperforms state-of-the-art histology alignment approaches in matching quality and is faster than manual alignment, providing a viable way to improve the throughput of multimodal microscopy experiments.
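The abstract's core idea, branching over a space of rigid rotations while pruning branches whose best possible match count cannot beat the incumbent, can be illustrated in miniature. The sketch below is not the authors' algorithm or code: it restricts the problem to 2-D rotation about the origin, and the function names, the radius-compatibility check, and the coarse angular-tolerance bound are all our own simplifications for illustration.

```python
import math

def rotate(p, theta):
    """Rotate a 2-D point about the origin."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1], s * p[0] + c * p[1])

def match_count(A, B, theta, eps):
    """Number of points of A landing within eps of a distinct point of B after rotation."""
    used, n = set(), 0
    for a in A:
        rx, ry = rotate(a, theta)
        for j, (bx, by) in enumerate(B):
            if j not in used and (rx - bx) ** 2 + (ry - by) ** 2 <= eps ** 2:
                used.add(j)
                n += 1
                break
    return n

def upper_bound(A, B, lo, hi, eps):
    """Optimistic match count over all theta in [lo, hi]: a point of A is countable
    if some point of B has a compatible radius and an angular offset whose
    tolerance window intersects the interval (a coarse, illustrative bound)."""
    n = 0
    for (x, y) in A:
        ra = math.hypot(x, y)
        for (bx, by) in B:
            if abs(ra - math.hypot(bx, by)) > eps:
                continue                      # radii incompatible under rotation
            if ra <= eps:                     # near the origin any rotation works
                n += 1
                break
            delta = math.atan2(by, bx) - math.atan2(y, x)
            tol = 2 * math.asin(min(1.0, eps / ra))
            d = (delta - lo) % (2 * math.pi)
            if d <= (hi - lo) + tol or d >= 2 * math.pi - tol:
                n += 1
                break
    return n

def bnb_rotation(A, B, eps=0.05, angle_res=1e-4):
    """Branch-and-bound over the rotation angle, maximizing the matched subset size."""
    best_theta, best = 0.0, match_count(A, B, 0.0, eps)
    stack = [(0.0, 2 * math.pi)]
    while stack:
        lo, hi = stack.pop()
        if upper_bound(A, B, lo, hi, eps) <= best:
            continue                          # prune: cannot beat the incumbent
        mid = 0.5 * (lo + hi)
        c = match_count(A, B, mid, eps)
        if c > best:
            best, best_theta = c, mid
        if hi - lo > angle_res:               # branch: split the angular interval
            stack.append((lo, mid))
            stack.append((mid, hi))
    return best_theta, best
```

For example, rotating three of four points by π/4 and adding an unrelated outlier to the target cloud, `bnb_rotation` recovers the rotation and the matched subset of size three while pruning the rest of the angular search space. The paper additionally prunes using shape-based matching likelihoods and seeds a deformation field from the matched subset, neither of which is modeled here.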


    Keywords

    • Biomedical signal processing
    • Microscopy
    • Multi-modal image registration
    • Branch-and-bound

    ASJC Scopus subject areas

    • Software
    • Signal Processing
    • Electrical and Electronic Engineering


