The rugged enclosure operates in temperatures from -40 to 71 degrees Celsius and houses combinations of Abaco's latest 3U VPX single-board computers, field-programmable gate arrays (FPGAs), digital signal processors (DSPs), and general-purpose graphics processing units (GPGPUs) to deliver as much as 14 teraFLOPS of computing power.
The Obox also has high-speed network switches to capture input from sensors such as lidar, radar, and cameras over Ethernet, and it drives actuators to control velocity and trajectory. The design and development system enables autonomous operations such as detection, recognition, tracking, fusion, perception, planning, decision-making, and effecting.
The Obox uses Xeon processors from Intel, CUDA-enabled GPUs from NVIDIA, and FPGA technology from Xilinx.
Software integration at the ground and node layers enables designers to pre-consolidate image-processing and neural-network algorithms for machine learning and autonomous vehicle applications.
Two accelerator processors support neural-network-based inference engines for object detection and image segmentation across multiple video streams and lidar point clouds. Developers can use the GPUs via NVIDIA's Deep Learning SDK, or the FPGAs via Xilinx's ML Suite.
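To illustrate the kind of work an object-detection inference engine performs, the sketch below implements greedy non-maximum suppression, the standard post-processing step that collapses overlapping raw detections into one box per object. It is a generic, self-contained example in plain Python, not Abaco's, NVIDIA's, or Xilinx's API; the box format and threshold are assumptions for illustration.

```python
# Illustrative sketch: greedy non-maximum suppression (NMS), a standard
# post-processing step for object-detection inference. Generic example only.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

def nms(detections, iou_threshold=0.5):
    """Keep the highest-scoring box in each cluster of overlapping detections.

    detections: list of (score, (x1, y1, x2, y2)) tuples.
    Returns the surviving detections, highest score first.
    """
    remaining = sorted(detections, key=lambda d: d[0], reverse=True)
    kept = []
    while remaining:
        best = remaining.pop(0)
        kept.append(best)
        # Discard every lower-scoring box that overlaps the kept one too much.
        remaining = [d for d in remaining if iou(best[1], d[1]) < iou_threshold]
    return kept

# Two overlapping detections of one object, plus one distinct object:
dets = [(0.9, (10, 10, 50, 50)), (0.8, (12, 12, 52, 52)), (0.7, (100, 100, 140, 140))]
print(nms(dets))  # the 0.8 box is suppressed; two detections survive
```

In a deployed pipeline this step would typically run on the accelerator alongside the network itself, once per frame per sensor stream.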
Supporting the Obox is Abaco's AXIS software development environment. AXIS ImageFlex provides optimized visualization capabilities to combine video, lidar, radar, and other sensor information into one unified viewer application. It provides high-performance algorithms for image transformation that correct for optical distortions in real time and project the image to different viewpoints.
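The optical-distortion correction mentioned above can be sketched with the common radial (Brown-Conrady) lens model: each pixel of the corrected output is mapped back to its distorted source location, which is then sampled from the raw frame. This is a minimal pure-Python illustration with assumed distortion coefficients, not the AXIS ImageFlex API; a production implementation would run the same per-pixel mapping on a GPU or FPGA.

```python
# Illustrative sketch of optical-distortion correction via inverse mapping
# with a radial lens model. Generic example; not Abaco's ImageFlex API.

def distort_point(x, y, k1, k2):
    """Apply radial distortion to a normalized image coordinate (x, y)."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def undistort_image(src, k1, k2):
    """Build a corrected image by inverse mapping with nearest-neighbor sampling.

    src is a list of rows of pixel values; coordinates are normalized to [-1, 1].
    """
    h, w = len(src), len(src[0])
    out = [[0] * w for _ in range(h)]
    for v in range(h):
        for u in range(w):
            # Normalized coordinates of the output pixel.
            x = 2.0 * u / (w - 1) - 1.0
            y = 2.0 * v / (h - 1) - 1.0
            # Where this pixel originated in the distorted source frame.
            xd, yd = distort_point(x, y, k1, k2)
            us = round((xd + 1.0) * (w - 1) / 2.0)
            vs = round((yd + 1.0) * (h - 1) / 2.0)
            if 0 <= us < w and 0 <= vs < h:
                out[v][u] = src[vs][us]
    return out
```

With both coefficients set to zero the mapping is the identity, which gives a quick sanity check; projecting to a different viewpoint works the same way, with a homography replacing the radial model in the inverse-mapping step.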
For more information contact Abaco Systems online at www.abaco.com.