Project: Three-Dimensional Object Recognition
Client: US Air Force and the Missile Defense Agency
Energid has developed powerful new methods for identifying and tracking three-dimensional objects. Our unique specialty is the ability to use CAD models of objects to automatically build tailored recognition algorithms. The approach is flexible and extensible, and it works with many types of imaging sensors: gray-scale, two-channel infrared, ladar, imaging radar, and hyperspectral intensity sensors.
Our algorithm uses three spatial components to analyze each image in isolation and a temporal component to combine spatial-analysis results over time. The first spatial component segments the image, identifying regions in a single set of raster data that might be objects of interest. The second makes an initial assessment of the type, pose, and geometry of each potential object; it uses harmonic functions to weight orientations when constructing template images and a four-dimensional geometric approach to select quaternions for orientation sampling. The third refines the type, pose, and geometry estimate through real-time differencing between the sensor image and synthetic images generated quickly on commodity PC graphics cards. The temporal component uses multiple hypothesis tracking (MHT) to integrate multiple looks at targets. We have extended traditional MHT, which applies to a collection of point targets, to apply to a logical tree structure of type, pose, and geometry.
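To give a flavor of quaternion-based orientation sampling, the sketch below draws uniformly distributed unit quaternions using Shoemake's method, a standard four-dimensional geometric technique. This is an illustrative stand-in, not Energid's actual sampling scheme; the function name and interface are hypothetical.

```cpp
#include <array>
#include <cmath>
#include <random>

// Draw a uniformly distributed unit quaternion (w, x, y, z ordering is
// arbitrary here) using Shoemake's subgroup method.  Illustrative only;
// not the toolkit's actual orientation-sampling code.
std::array<double, 4> randomUnitQuaternion(std::mt19937& gen) {
    const double pi = 3.14159265358979323846;
    std::uniform_real_distribution<double> u(0.0, 1.0);
    const double u1 = u(gen), u2 = u(gen), u3 = u(gen);
    const double a = std::sqrt(1.0 - u1);  // radius in one 2-D subspace
    const double b = std::sqrt(u1);        // radius in the other
    const double t2 = 2.0 * pi * u2, t3 = 2.0 * pi * u3;
    return { a * std::sin(t2), a * std::cos(t2),
             b * std::sin(t3), b * std::cos(t3) };
}
```

Each sampled quaternion defines one candidate orientation at which a template image could be rendered and compared against the sensor data.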
The algorithms are implemented as a C++ software toolkit. The toolkit supports single-, dual-, and multichannel images; the representation of general bifurcating articulated models; rendering of those models on a PC graphics card; and the top-level identification framework. Every module can be configured using the Extensible Markup Language (XML) to trade run time for accuracy.
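As a minimal illustration of the differencing that the toolkit's rendering support enables, the sketch below scores a sensor image against a synthetic rendering with a sum-of-squared-differences metric over flat gray-scale pixel buffers. The function name, data layout, and choice of metric are assumptions for illustration; the toolkit's actual API and error measure are not shown here.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Sum-of-squared-differences between a sensor image and a synthetic
// rendering, both stored as flat gray-scale pixel buffers.  A lower
// score indicates a better match between the hypothesized pose and the
// observed image.  Illustrative sketch only.
double imageDifference(const std::vector<double>& sensor,
                       const std::vector<double>& synthetic) {
    double sum = 0.0;
    const std::size_t n = std::min(sensor.size(), synthetic.size());
    for (std::size_t i = 0; i < n; ++i) {
        const double d = sensor[i] - synthetic[i];
        sum += d * d;  // accumulate squared per-pixel error
    }
    return sum;
}
```

In a refinement loop, a score like this would be minimized over candidate poses, with the synthetic image re-rendered on the graphics card at each iteration.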
Would you like Energid to apply this expertise to solve your machine vision problem? Please contact us.