PhD Defense
5/27/2015 11:00 am
CoRE A (Room 301)

Quantifying Signal Model Misperception: Why Bad Things Happen to Good Data

John-Austen Francisco, Rutgers University

Defense Committee: Richard P. Martin (Chair), Thu Nguyen, Wade Trappe, and Alan Schiz (AT&T)

Abstract

The constant reduction in the cost and increase in the power of computing machinery has led to ever-increasing interest in, and deployment of, Internet-enabled sensing systems. Such systems face the distinctly difficult task of making use of noisy data sampled from dynamic environments. While some natural processes have very exact theoretical models, real data rarely conforms to the model, making the comparison, improvement, and iterative development of sensing applications extremely difficult. The inability to determine whether a sensing system's error results from noisy data or algorithmic miscomputation, or to assess the prevalence and significance of signal errors in a particular environment, makes the causes of the error inscrutable. In such cases, strongly amortized or very general probabilistic analysis is often used as a last resort, yielding conclusions that are overly generic, heuristic, or strongly underdetermined. We present a systematic method that can be used to construct a holistic synthetic error model for sensed data, the algorithms that process it, and the environment in which it is sampled. We demonstrate how this method can be applied to the problem of laterative localization to construct deductive, analytic, and evaluative mechanisms that allow model misperception, algorithmic error, and environmental character to be understood.
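For readers unfamiliar with laterative localization, the sketch below illustrates the basic setting the abstract refers to: estimating a position from noisy distance measurements to known anchor points. This is a generic linearized least-squares lateration example, not the thesis's specific method; the anchor positions, noise level, and function name are illustrative assumptions.

```python
import numpy as np

def trilaterate(anchors, distances):
    """Estimate a 2-D position from noisy distances to known anchors
    via linearized least squares (a standard lateration technique;
    illustrative only, not the method from the thesis)."""
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    # Each measurement gives ||x - a_i||^2 = d_i^2.  Subtracting the
    # first anchor's equation eliminates the quadratic term ||x||^2,
    # leaving a linear system 2*(a_i - a_0) . x = b_i.
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1)
         - np.sum(anchors[0] ** 2))
    est, *_ = np.linalg.lstsq(A, b, rcond=None)
    return est

# Hypothetical scenario: three anchors, a true position, and
# distance measurements corrupted by Gaussian noise.
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = np.array([3.0, 4.0])
rng = np.random.default_rng(0)
noisy_d = np.linalg.norm(np.asarray(anchors) - true_pos, axis=1) \
          + rng.normal(0.0, 0.1, size=3)
print(trilaterate(anchors, noisy_d))
```

Even this toy example shows the difficulty the abstract describes: the residual error of the estimate mixes measurement noise with linearization error, and nothing in the output alone distinguishes the two sources.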