Optimal parallel dictionary matching and compression

Martin Farach, Shanmugavelayutham Muthukrishnan

    Research output: Contribution to conference › Paper › peer-review

    Abstract

    Emerging applications in multimedia and the Human Genome Project require storage and searching of large databases of strings - a task for which parallelism seems the only hope. In this paper, we consider the parallelism in some of the fundamental problems in compressing strings and in matching large dictionaries of patterns against texts. We present the first work-optimal algorithms for these well-studied problems, including the classical dictionary matching problem, optimal compression with a static dictionary, and universal data compression with the dynamic dictionary of Lempel and Ziv. All our algorithms are randomized, and they are of the Las Vegas type. Furthermore, they are fast, working in time logarithmic in the input size. Additionally, our algorithms seem suitable for a distributed implementation.
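    For context, the dynamic-dictionary (LZ78-style) compression scheme mentioned in the abstract can be sketched sequentially as follows. This is a generic illustrative implementation, not the paper's parallel algorithm; the paper's contribution is a work-optimal parallel (Las Vegas) version of such dictionary-based schemes.

    ```python
    def lz78_compress(text):
        """Sequential LZ78 parse: emit (dictionary index, next char) pairs.

        The dictionary grows dynamically as the text is parsed;
        index 0 denotes the empty phrase.
        """
        dictionary = {"": 0}  # phrase -> index
        output = []
        phrase = ""
        for ch in text:
            if phrase + ch in dictionary:
                phrase += ch  # extend the current match
            else:
                output.append((dictionary[phrase], ch))
                dictionary[phrase + ch] = len(dictionary)
                phrase = ""
        if phrase:  # flush a trailing phrase already in the dictionary
            output.append((dictionary[phrase], ""))
        return output

    def lz78_decompress(pairs):
        """Invert lz78_compress by rebuilding the phrase list."""
        phrases = [""]
        out = []
        for idx, ch in pairs:
            phrase = phrases[idx] + ch
            phrases.append(phrase)
            out.append(phrase)
        return "".join(out)
    ```

    For example, `lz78_compress("aaabba")` parses the text into the phrases `a`, `aa`, `b`, `ba`, yielding `[(0, 'a'), (1, 'a'), (0, 'b'), (3, 'a')]`. The sequential loop here is inherently left-to-right; removing that dependence while keeping total work optimal is precisely what makes the parallel problem hard.
    
    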

    Original language: English (US)
    Pages: 244-253
    Number of pages: 10
    State: Published - 1995
    Event: Proceedings of the 7th Annual ACM Symposium on Parallel Algorithms and Architectures, SPAA'95 - Santa Barbara, CA, USA
    Duration: Jul 17, 1995 - Jul 19, 1995


    ASJC Scopus subject areas

    • Software
    • Safety, Risk, Reliability and Quality

