Abstract
In the context of a random process scene environment model, a method is presented for fusing data from multiple sensors into a simplified, ordered space for performing electronic vision tasks. The method is based on a new discriminating measure, the tie statistic, introduced to quantify sensor/feature performance and to provide a mapping from sensor/feature measurement space to a simplified and ordered decision space. The mapping process uses the tie statistic to measure the closeness of an unknown sample probability density function (pdf) to a known pdf for a decision class. Theorems presented in this article relate the tie statistic to minimum probability of error decision making and to the well-known Kolmogorov‐Smirnov distance. As examples of the sensor/feature fusion method, the tie mapping process is applied to the object location (cueing) and texture recognition problems.
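The abstract does not define the tie statistic itself, but it states that the statistic measures the closeness of an unknown sample pdf to a known class pdf and that it is related to the Kolmogorov‐Smirnov (KS) distance. The sketch below is therefore only an illustration of that general decision scheme, using the KS distance as the closeness measure; the class names, distribution parameters, and sample are hypothetical and are not taken from the article.

```python
# Illustrative sketch only: uses the Kolmogorov-Smirnov distance (which the
# abstract relates to the tie statistic) to map an unknown sensor/feature
# sample to the closest known decision class. All names and parameters below
# are hypothetical assumptions, not the article's.
import numpy as np
from scipy.stats import norm

def ks_distance(sample, cdf):
    """Max gap between the empirical CDF of `sample` and a known class CDF."""
    x = np.sort(sample)
    n = x.size
    ecdf_hi = np.arange(1, n + 1) / n   # empirical CDF just after each point
    ecdf_lo = np.arange(0, n) / n       # empirical CDF just before each point
    f = cdf(x)                          # known class CDF at the sample points
    return max(np.abs(ecdf_hi - f).max(), np.abs(ecdf_lo - f).max())

# Hypothetical decision classes, each with a known (here Gaussian) CDF.
classes = {
    "target":     lambda x: norm.cdf(x, loc=5.0, scale=1.0),
    "background": lambda x: norm.cdf(x, loc=0.0, scale=2.0),
}

# Unknown sensor/feature measurements to be mapped into the decision space.
rng = np.random.default_rng(0)
sample = rng.normal(loc=4.8, scale=1.1, size=200)

# Smaller distance means the sample pdf is closer to that class's known pdf.
distances = {name: ks_distance(sample, cdf) for name, cdf in classes.items()}
decision = min(distances, key=distances.get)
print(distances, "->", decision)
```

Because each class receives a single scalar distance, measurements from different sensors and features can be compared on one simplified, ordered scale, which is the role the abstract assigns to the tie mapping.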
| Field | Value |
| --- | --- |
| Original language | English (US) |
| Pages (from-to) | 373-393 |
| Number of pages | 21 |
| Journal | Journal of Robotic Systems |
| Volume | 7 |
| Issue number | 3 |
| DOIs | |
| State | Published - Jun 1990 |
ASJC Scopus subject areas
- Control and Systems Engineering