Visualis

GPS signals can be jammed, spoofed, or denied outright in combat, making them unreliable. Adversaries with modern electronic warfare capabilities can interfere with GPS, causing drones to lose their way or even be captured. Relying on radio control in these conditions is equally risky: the link is easily detected and can reveal the operator’s location, exposing them to attack. Long-range operations in contested environments also frequently suffer weak or disrupted signals, limiting a drone’s range and effectiveness. Modern warfare demands unmanned systems that do not depend on clean, functioning radio-frequency links.

In response, Roark developed Visualis: a solution that integrates Plexus intelligence and software to enable autonomous drone operations where GPS is compromised, navigating precisely while remaining entirely independent of GPS and radio control signals.

Using a simple camera and onboard compute, standard on every Roark drone, Visualis checks the drone’s position against satellite imagery stored onboard, allowing it to navigate without significant long-range drift.

In essence, Visualis replicates what humans did before GPS was everywhere: it lets the drone navigate by reading a map.

The Technology

Roark has spent years building products that combine computer vision with geospatial algorithms, with a particular focus on image and video georegistration (precisely aligning visual data with real-world locations). Visualis is the culmination of that expertise. Its navigation algorithms fuse three distinct data sources to build a detailed picture of the drone’s position and motion relative to its mission, and feed that picture back to the drone to keep it on an accurate flight path.

Source 1: Drone Sensors

Roark drones carry a range of onboard sensors, including accelerometers, gyroscopes, and magnetometers (collectively known as an inertial measurement unit, or IMU), along with barometers and other instruments. Together these sensors report the drone’s orientation, acceleration, and rate of turn. With a sufficiently precise IMU, it is possible to navigate long distances with fair accuracy from IMU data alone, a technique known as “dead reckoning.” But sensors that precise are large and expensive, often costing hundreds of thousands of dollars on their own. Smaller, cheaper platforms carry lower-accuracy IMUs that can typically sustain reliable dead reckoning for only a few seconds before becoming entirely unreliable.
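The scale of the problem is easy to see in a toy calculation. The sketch below is not Roark’s code, and the 5 mg accelerometer bias is a made-up figure typical of low-cost MEMS parts; it simply integrates a biased accelerometer twice and shows how quickly the position estimate runs away:

```python
# Illustrative sketch: pure-IMU dead reckoning with a small, uncorrected
# accelerometer bias. Not Roark's implementation; the bias value is hypothetical.
import numpy as np

dt = 0.01                      # 100 Hz IMU samples
t = np.arange(0, 60, dt)       # one minute of flight
true_accel = np.zeros_like(t)  # drone actually hovering (zero acceleration)

bias = 5e-3 * 9.81             # ~5 mg accelerometer bias, in m/s^2
measured_accel = true_accel + bias

# Dead reckoning: integrate acceleration twice to get position.
velocity = np.cumsum(measured_accel) * dt
position = np.cumsum(velocity) * dt

print(f"Position error after 10 s: {position[int(10/dt) - 1]:6.1f} m")
print(f"Position error after 60 s: {position[-1]:6.1f} m")
# Roughly 2.5 m of error after ten seconds and about 90 m after a minute,
# which is why a cheap IMU alone cannot hold a long-range course.
```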

Visualis fuses this sensor data, even from low-cost sensors, with computer vision to produce a complete autonomous navigation solution. The sensor data provides stability and responsiveness, while computer vision handles the hard problem of long-range navigation.

Source 2: Optics

Alongside the inertial sensors, Visualis uses a technique called optical flow: tracking how pixels move from one video frame to the next. Combining that pixel motion with the inertial data makes it possible to estimate the drone’s velocity. The estimate, however, depends on knowing the distance from the drone to the ground. A distant surface moving quickly and a nearby surface moving slowly can produce the same apparent motion on camera, so an error in the range estimate translates directly into an error in velocity. Even with a barometer providing altitude, the calculations degrade quickly, making precise long-distance navigation from optical flow alone nearly impossible.
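A minimal sketch of the idea, using OpenCV’s dense optical flow. It assumes a downward-facing camera, flat ground, rotation already compensated from the gyros, and known camera intrinsics; none of these details come from Roark, they are simply the standard textbook setup:

```python
# Illustrative optical-flow velocity estimate, not Roark's code.
# Assumes a nadir camera, flat ground, and gyro-compensated rotation.
import cv2
import numpy as np

def ground_velocity(prev_gray, curr_gray, altitude_m, focal_px, fps):
    """Estimate ground-plane velocity (m/s) from two consecutive frames."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

    # Robust average pixel displacement per frame (x, y).
    px_per_frame = np.median(flow.reshape(-1, 2), axis=0)

    # Pinhole model: ground displacement = pixel displacement * altitude / focal length.
    # This is exactly where the range estimate matters: a 10% altitude error
    # becomes a 10% velocity error, which then accumulates into position drift.
    metres_per_frame = px_per_frame * altitude_m / focal_px
    return metres_per_frame * fps
```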

Combining inertial measurements with optical flow yields a more sophisticated form of dead reckoning, suitable for manual control or navigation over short distances. This technique, known as Visual Inertial Odometry, underpins many “Visual Navigation” products on the market and can deliver reliable navigation over distances of tens to hundreds of meters. But it has a fundamental limitation: because it tracks only relative motion, its errors are unavoidable and uncorrectable, and they accumulate as drift over time. It is much like traditional wilderness navigation with a compass and pace count: small errors in distance and direction gradually compound, and before long you are lost in the woods.

Source 3: Reference Matching

The final and most distinctive data source in Visualis is what Roark calls “Reference Matching.” Using computer vision, the system automatically compares the drone’s camera feed against satellite imagery pre-loaded onto the onboard computer, finding corresponding points between the two. From those correspondences, Visualis continuously computes the drone’s precise location, automatically correcting any drift that has accumulated.
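Roark’s matching kernel is proprietary, so the sketch below only illustrates the principle with classical OpenCV building blocks (ORB features, brute-force matching, a RANSAC homography). The geotransform format is a simplification of how georeferenced tiles are usually described:

```python
# Illustrative reference-matching fix using classical tools; not Roark's kernel.
import cv2
import numpy as np

def reference_fix(drone_gray, sat_gray, sat_geotransform):
    """Return an (east, north) position fix by matching the drone frame
    against a georeferenced satellite tile, or None if matching fails."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp_d, des_d = orb.detectAndCompute(drone_gray, None)
    kp_s, des_s = orb.detectAndCompute(sat_gray, None)
    if des_d is None or des_s is None:
        return None

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_d, des_s), key=lambda m: m.distance)[:200]
    if len(matches) < 10:
        return None

    src = np.float32([kp_d[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_s[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None

    # Project the drone image centre (roughly the point beneath a nadir camera)
    # into satellite pixel coordinates.
    h, w = drone_gray.shape
    centre = cv2.perspectiveTransform(np.float32([[[w / 2, h / 2]]]), H)[0, 0]

    # Convert satellite pixel to world coordinates with the tile's geotransform
    # (origin_east, pixel_size_east, origin_north, pixel_size_north).
    oe, pe, on, pn = sat_geotransform
    return oe + centre[0] * pe, on + centre[1] * pn
```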

If this method is so effective, why is it not more widely adopted? At its core it is a form of classical image matching, and it inherits all the classical difficulties of computer vision matching: natural scenes with few distinctive man-made features, poor reference imagery, large seasonal variation, changed terrain, inconsistent lighting, mismatches between visual and infrared imagery, and more. Even state-of-the-art deep learning matchers frequently fail here, not least because they are typically built for computing resources more than ten times what a compact drone carries.

To overcome these challenges, Visualis uses Roark’s proprietary image matching kernel, refined across a wide range of terrains, drone sensors, and operational profiles. It is the final piece that lets Visualis read its map, correct the navigational drift introduced by the other techniques, and hold its course so the drone stays on target. The matching kernel has been tested reliably in both urban and natural environments, on visual, infrared, and even multispectral imagery.

Visualis Power

Visualis’ groundbreaking approach to drone navigation comes from fusing these three separate data sources through a mathematical framework known as a Kalman Filter: a statistical model designed to estimate a system’s state from incomplete or noisy measurements. Combined with Roark’s deep defense expertise and proprietary enhancements, this fusion compensates for the weaknesses of each individual data source.

Short-term noise in optical flow or reference matching is smoothed by dependable inertial telemetry. Drift in optical flow is corrected by reference matching, and gaps in reference matching are bridged by optical-flow calculations.
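A single-axis sketch of this fusion makes the division of labor concrete. The noise values below are illustrative guesses, and the real estimator is surely richer (full 3D state, attitude, sensor biases), but the structure is the same: the IMU drives the prediction, optical flow corrects velocity, and reference matching anchors absolute position:

```python
# Minimal 1-axis Kalman-filter sketch of the fusion idea; not Roark's estimator.
import numpy as np

class SimpleFuser:
    def __init__(self, dt):
        self.dt = dt
        self.x = np.zeros(2)                  # [position (m), velocity (m/s)]
        self.P = np.diag([10.0, 1.0])         # initial uncertainty
        self.F = np.array([[1, dt], [0, 1]])  # constant-velocity model
        self.B = np.array([0.5 * dt**2, dt])  # how acceleration enters the state
        self.Q = np.diag([0.02, 0.1])         # process noise (IMU imperfection)

    def predict(self, accel):
        """Propagate the state with an IMU acceleration sample."""
        self.x = self.F @ self.x + self.B * accel
        self.P = self.F @ self.P @ self.F.T + self.Q

    def _update(self, z, H, r):
        S = H @ self.P @ H + r                # innovation variance
        K = (self.P @ H) / S                  # Kalman gain
        self.x = self.x + K * (z - H @ self.x)
        self.P = (np.eye(2) - np.outer(K, H)) @ self.P

    def update_velocity(self, v_optical_flow):
        """Correct velocity drift with an optical-flow measurement."""
        self._update(v_optical_flow, np.array([0.0, 1.0]), r=0.25)

    def update_position(self, p_reference_match):
        """Remove accumulated drift with an absolute reference-match fix."""
        self._update(p_reference_match, np.array([1.0, 0.0]), r=4.0)
```

In use, predict() would run at the IMU rate (hundreds of times per second), update_velocity() at the camera frame rate, and update_position() whenever a reference match succeeds, so the filter always leans on whichever source is currently available and trustworthy.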

The result is an accurate, low-latency estimate of the drone’s state. Visualis feeds that position and velocity estimate to the drone’s flight controller over standard communication protocols, enabling the drone not only to navigate capably but to carry its mission through.
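The article does not name the protocol, but MAVLink is a common open standard for feeding external position estimates to a flight controller, so the sketch below assumes it; the connection string and frame conventions are hypothetical:

```python
# Hypothetical example of handing a non-GPS position estimate to an autopilot
# over MAVLink; the link address and NED frame convention are assumptions.
from pymavlink import mavutil

link = mavutil.mavlink_connection('udpout:127.0.0.1:14550')

def send_position_estimate(t_usec, north_m, east_m, down_m, roll, pitch, yaw):
    """Feed an external (vision-derived) position estimate to the autopilot."""
    link.mav.vision_position_estimate_send(
        t_usec,                   # timestamp in microseconds
        north_m, east_m, down_m,  # position in the local NED frame (metres)
        roll, pitch, yaw)         # attitude in radians
```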

By overcoming these traditional limitations and drawing on multiple data sources, Visualis offers a robust, dependable solution that redefines the future of drone navigation. Its approach reflects Roark’s commitment to pushing boundaries in autonomous navigation, and as concerns about GPS interference on the battlefield grow, it helps make missions safer and more effective.