BOULDER, Colo. – Black Swift Technologies (BST), a specialized engineering firm based in Boulder, Colo., recently announced its Automated Emergency Safe Landing (AESL) functionality for unmanned aerial systems (UAS). The technology integrates machine learning algorithms and onboard processors into the Black Swift S2™ UAS to capture and classify images at altitude, enabling a UAS to autonomously identify a safe landing area in the event of a catastrophic failure, a key enabler for safe beyond-line-of-sight flights. The system rapidly processes large volumes of image data onboard, identifying objects and terrain to avoid so the aircraft can be landed without harm to people or property.
“Our emphasis is to make UAS operations safer for both operators and the public,” emphasizes Jack Elston, Ph.D., CEO of Black Swift Technologies. “The goal of AESL is to be able to take a snapshot and within 60 seconds of something like a catastrophic engine failure, be able to identify a landing zone, calculate a landing trajectory, and safely land a UAS away from people and obstacles. We remain convinced that a thorough understanding and integration of artificial intelligence and machine learning can help serve as a catalyst for accelerating UAS growth and adoption industry-wide.”
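BST has not published AESL's implementation, but the pipeline Elston describes (classify the terrain in view, then select a safe zone away from obstacles) can be illustrated with a minimal sketch. Everything here is hypothetical: the grid representation, the `pick_landing_zone` function, and the "safe"/"unsafe" labels are assumptions for illustration only, not BST's code.

```python
# Illustrative sketch only -- not BST's actual AESL implementation.
# Assumes an upstream onboard classifier has already labeled each cell
# of a ground grid as "safe" (e.g. open field) or "unsafe" (people,
# buildings, water, etc.).

from math import hypot

def pick_landing_zone(grid, aircraft_cell):
    """Return the safe grid cell nearest the aircraft's position.

    grid: dict mapping (row, col) -> "safe" or "unsafe"
    aircraft_cell: (row, col) of the cell directly beneath the aircraft
    """
    safe_cells = [cell for cell, label in grid.items() if label == "safe"]
    if not safe_cells:
        return None  # no viable zone in view; a real system would widen its search
    ar, ac = aircraft_cell
    # Prefer the closest safe cell, a stand-in for the real trajectory planning.
    return min(safe_cells, key=lambda c: hypot(c[0] - ar, c[1] - ac))

# Toy example: a 3x3 grid with a single safe cell.
grid = {(r, c): "unsafe" for r in range(3) for c in range(3)}
grid[(2, 1)] = "safe"
print(pick_landing_zone(grid, (0, 0)))  # -> (2, 1)
```

A production system would of course plan a full glide trajectory to the chosen zone rather than a straight-line distance, but the classify-then-select structure is the same.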
AESL functionality is the result of a NASA SBIR grant awarded to BST and an ongoing collaboration with Luxonis LLC, a Colorado-based technology company that specializes in embedded machine learning, artificial intelligence, and computer vision, from concept through custom hardware, firmware, software and UI/UX.
While AESL functionality can serve as a significant stepping stone toward obtaining FAA exemptions for safe beyond-line-of-sight flights, observers and users describe its most striking feature as the small size and low power requirements of the components that perform image capture and processing onboard the aircraft.
“Our technology is intended to help enable a safer flight experience both for UAS operators and the general public,” Elston emphasizes. “All the data products that we’d be accessing are going to be common across any autopilot. We don’t want to give too much decision making data to the autopilot—just enough to know what to avoid and where to safely land.”