Efficient identification of multiple bad data

Yuzhang Lin, Ali Abur

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


This paper describes a computationally more efficient alternative to the bad data identification procedure known as the largest normalized residual (LNR) test. The LNR test is a sequential procedure in which measurements suspected of carrying gross errors are identified and removed from the measurement set one at a time. The computational burden of the test therefore grows in proportion to the number of bad data, making it prohibitively inefficient for systems that commonly contain large numbers of measurements with gross errors. In this paper, an improved version of this approach is proposed in which the number of identification and correction cycles needed to process a large number of bad data points is significantly reduced. Efficient application of the LNR test in very large practical power systems is thereby facilitated.
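The baseline procedure the paper improves upon can be sketched for a linear WLS model. The following is a minimal illustration, not the paper's improved method: the function names and the synthetic test system are assumptions for demonstration. Each cycle computes the WLS estimate, forms normalized residuals from the residual covariance Ω = R − H G⁻¹ Hᵀ, and removes the single worst measurement until the largest normalized residual falls below the threshold (commonly 3.0).

```python
import numpy as np

def lnr_test(H, z, R, threshold=3.0):
    """Classical sequential largest-normalized-residual (LNR) test
    for a linear measurement model z = H x + e (illustrative sketch).
    Removes one suspect measurement per cycle, as in the baseline
    procedure; returns the final WLS estimate and the original
    indices of measurements flagged as bad."""
    idx = np.arange(len(z))              # original measurement indices
    bad = []
    while True:
        Rinv = np.linalg.inv(R)
        G = H.T @ Rinv @ H               # gain matrix
        x_hat = np.linalg.solve(G, H.T @ Rinv @ z)
        r = z - H @ x_hat                # measurement residuals
        # residual covariance: Omega = R - H G^{-1} H^T
        Omega = R - H @ np.linalg.solve(G, H.T)
        rN = np.abs(r) / np.sqrt(np.diag(Omega))
        k = int(np.argmax(rN))
        if rN[k] <= threshold:           # no remaining suspect: done
            return x_hat, bad
        bad.append(int(idx[k]))          # flag and remove the worst one
        keep = np.ones(len(z), dtype=bool)
        keep[k] = False
        H, z, idx = H[keep], z[keep], idx[keep]
        R = R[np.ix_(keep, keep)]

# Hypothetical 2-state, 8-measurement system with one gross error.
np.random.seed(0)
H = np.vstack([np.eye(2)] * 4)
x_true = np.array([1.0, -2.0])
z = H @ x_true + 0.01 * np.random.randn(8)
z[3] += 5.0                              # inject a gross error
R = 0.01 ** 2 * np.eye(8)
x_hat, bad = lnr_test(H, z, R)
```

Because one re-estimation is needed per removed measurement, the cycle count scales with the number of bad data, which is exactly the cost the paper's method reduces.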

Original language: English (US)
Title of host publication: 2017 North American Power Symposium, NAPS 2017
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781538626993
State: Published - Nov 13 2017
Event: 2017 North American Power Symposium, NAPS 2017 - Morgantown, United States
Duration: Sep 17 2017 - Sep 19 2017

Publication series

Name: 2017 North American Power Symposium, NAPS 2017


Other: 2017 North American Power Symposium, NAPS 2017
Country/Territory: United States


Keywords

  • bad data
  • computational efficiency
  • largest normalized residual
  • state estimation

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Energy Engineering and Power Technology
  • Control and Optimization
  • Electrical and Electronic Engineering


