The Air Force Research Laboratory (AFRL) has requested a system that quickly and automatically captures the movement of people, along with the sizes and descriptions of the targets it detects, using a portable radar system. Such a system would be useful in a variety of scenarios, including military reconnaissance and search and rescue efforts, where it is difficult for the naked eye to properly detect and identify targets. The main purpose of this particular project is to develop a way to automatically determine when the radar detects a human, and to determine the specific characteristics of the subject's body once he or she is detected.
The device to be designed is portable, small, and lightweight, and interfaces with a PC to provide a complete "Moving Human Electromagnetic Scattering Simulator." The result of this project is an enclosure containing a set of components capable of directly processing the signals captured by an imaging device.
The goal of this project is to develop a motion sensing device that will interface with AFRL software to render an end-to-end moving human electromagnetic scattering simulator. This device must:
- Be small, portable, and lightweight
- Connect to a PC via Ethernet or 802.11
- Use the Microsoft Kinect or a similar imaging device, accessed through an open source driver
- Accurately identify humans based on human body characteristics
- Include an enclosure that houses the imaging device and the PandaBoard ES
- Run from a battery or power supply
- Launch the AFRL software using parameters collected from the system
The hardware platform for this project is a PandaBoard ES, a single-board computer with a dual-core ARM processor and onboard Wi-Fi. The PandaBoard runs the Linux operating system; Ubuntu 11.10 was chosen for this application. Alongside the PandaBoard ES, an ASUS Xtion is used. The ASUS Xtion is a motion and depth sensing device similar to the Xbox Kinect, and it is capable of automatically detecting and tracking people.
Two software applications were developed for this project. The first runs on the PandaBoard under Linux; it hosts a TCP server that client PCs can connect to, and it is responsible for reading the sensor and processing the data received from it. The sensor is read through two freely available libraries, OpenNI and NITE, whose C++ APIs provide information about the people the sensor finds. The second application runs on a PC. It connects to the PandaBoard as a TCP client, sends and receives commands, and receives the data about the people the sensor has found. The PC application plots this data and can also pass it to an AFRL-provided Matlab program that computes the electromagnetic scattering simulation.
The Final Product
The result is an enclosure housing the PandaBoard, along with a battery for better portability; the ASUS Xtion sensor is mounted on top. The system successfully relays data about people within the sensor's field of view. A PC can connect to the PandaBoard server over Wi-Fi, as the PandaBoard acts as a wireless access point and broadcasts its own network. The PC application accurately plots the relative locations of all targets found by the sensor, displays parameters about those targets, and can feed the parameters into the AFRL-provided program to perform an electromagnetic scattering simulation. The parameters are sometimes of questionable accuracy, because the NITE binaries available for ARM processors are outdated and can report incorrect joint locations for the people found.