Checks and balances: Monitoring data quality problems in network traffic databases

Flip Korn, S. Muthukrishnan, Yunyue Zhu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Internet Service Providers (ISPs) use real-time data feeds of aggregated traffic in their network to support technical as well as business decisions. A fundamental difficulty with building decision support tools based on aggregated traffic data feeds is one of data quality. Data quality problems stem from network-specific issues (irregular polling caused by UDP packet drops and delays, topological mislabelings, etc.), and make it difficult to distinguish between artifacts and actual phenomena, rendering data analysis based on such data feeds ineffective. In principle, traditional integrity constraints and triggers may be used to enforce data quality. In practice, data cleaning is done outside the database and is ad hoc. Unfortunately, these approaches are too rigid and limited for the subtle data quality problems arising from network data, where existing problems morph with network dynamics, new problems emerge over time, and poor quality data in a local region may itself indicate an important phenomenon in the underlying network. We need a new approach, both in principle and in practice, to face data quality problems in network traffic databases. We propose a continuous data quality monitoring approach based on probabilistic, approximate constraints (PACs). These are simple, user-specified rule templates with open parameters for tolerance and likelihood. We use statistical techniques to instantiate suitable parameter values from the data, and show how to apply them for monitoring data quality. In principle, our PAC-based approach can be applied to data quality problems in any data feed. We present PAC-Man, the system that manages PACs for the entire aggregate network traffic database in a large ISP, and show that it is very effective in monitoring data quality problems.
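To make the abstract's idea concrete, the following is a minimal, hypothetical sketch of how a probabilistic, approximate constraint might look in code: a rule template ("polled values stay within a tolerance band of a baseline") with two open parameters, a tolerance and a likelihood, where the tolerance is instantiated from historical data by a simple quantile estimate. The function names, the median baseline, and the quantile-based instantiation are illustrative assumptions for this sketch, not the paper's exact method.

```python
def instantiate_pac(history, likelihood=0.9):
    """Instantiate the open PAC parameters from historical data.

    Returns a (baseline, tolerance) pair such that at least `likelihood`
    of the historical deviations from the baseline fall within `tolerance`.
    """
    ordered = sorted(history)
    baseline = ordered[len(ordered) // 2]  # median as a robust baseline
    deviations = sorted(abs(x - baseline) for x in history)
    # Pick the tolerance as the `likelihood`-quantile of observed deviations.
    idx = min(int(likelihood * len(deviations)), len(deviations) - 1)
    return baseline, deviations[idx]


def pac_violated(window, baseline, tolerance, likelihood=0.9):
    """Check a window of fresh feed values against the instantiated PAC.

    The constraint holds approximately: it is violated only when the
    fraction of in-tolerance values drops below `likelihood`.
    """
    ok = sum(1 for x in window if abs(x - baseline) <= tolerance)
    return ok / len(window) < likelihood
```

For example, a polling feed that usually reports 10 with occasional spikes yields a tight tolerance, and a window with an unusually high spike rate is then flagged, while isolated spikes are tolerated. This illustrates the hedged nature of PACs: instead of a hard integrity constraint that every row must satisfy, a violation is reported only when the data strays from the learned behavior more often than the likelihood allows.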

Original language: English (US)
Title of host publication: Proceedings - 29th International Conference on Very Large Data Bases, VLDB 2003
Editors: Patricia G. Selinger, Michael J. Carey, Johann Christoph Freytag, Serge Abiteboul, Peter C. Lockemann, Andreas Heuer
Publisher: Morgan Kaufmann
Pages: 536-547
Number of pages: 12
ISBN (Electronic): 0127224424, 9780127224428
State: Published - Jan 1, 2003
Externally published: Yes
Event: 29th International Conference on Very Large Data Bases, VLDB 2003 - Berlin, Germany
Duration: Sep 9, 2003 - Sep 12, 2003

Publication series

Name: Proceedings - 29th International Conference on Very Large Data Bases, VLDB 2003

Other

Other: 29th International Conference on Very Large Data Bases, VLDB 2003
Country: Germany
City: Berlin
Period: 9/9/03 - 9/12/03

ASJC Scopus subject areas

  • Software
  • Information Systems
  • Hardware and Architecture
  • Information Systems and Management
  • Computer Science Applications
  • Computer Networks and Communications


Cite this

Korn, F., Muthukrishnan, S., & Zhu, Y. (2003). Checks and balances: Monitoring data quality problems in network traffic databases. In P. G. Selinger, M. J. Carey, J. C. Freytag, S. Abiteboul, P. C. Lockemann, & A. Heuer (Eds.), Proceedings - 29th International Conference on Very Large Data Bases, VLDB 2003 (pp. 536-547). (Proceedings - 29th International Conference on Very Large Data Bases, VLDB 2003). Morgan Kaufmann.