Naïve Bayes with higher order attributes

Bernard Rosell, Lisa Hellerstein

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    The popular Naïve Bayes (NB) algorithm is simple and fast. We present a new learning algorithm, Extended Bayes (EB), which is based on Naïve Bayes. EB is still relatively simple, and achieves equivalent or higher accuracy than NB on a wide variety of UC Irvine datasets. EB is based on two interacting ideas. The first is to find sets of seemingly dependent attributes and to add them as new attributes. The second is to exploit "zeroes", that is, the negative evidence provided by attribute values that do not occur at all in particular classes in the training data. Zeroes are handled in Naïve Bayes by smoothing. In contrast, EB uses them as evidence that a potential class labeling may be wrong.

    Original language: English (US)
    Title of host publication: Advances in Artificial Intelligence
    Editors: Ahmed Y. Tawfik, Scott D. Goodwin
    Publisher: Springer Verlag
    Pages: 105-119
    Number of pages: 15
    ISBN (Electronic): 9783540220046
    State: Published - 2014
    Event: 17th Canadian Conference on Artificial Intelligence, Canadian AI 2004 - London, Canada
    Duration: May 17, 2004 - May 19, 2004

    Publication series

    Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Volume: 3060
    ISSN (Print): 0302-9743
    ISSN (Electronic): 1611-3349

    Other

    Other: 17th Canadian Conference on Artificial Intelligence, Canadian AI 2004
    Country/Territory: Canada
    City: London
    Period: 5/17/04 - 5/19/04

    ASJC Scopus subject areas

    • Theoretical Computer Science
    • General Computer Science
