Drones Enhance Autonomous Navigation with Visual Servoing

Drones, also known as unmanned aerial vehicles (UAVs), have revolutionized various industries by providing versatile platforms for tasks such as surveillance, mapping, delivery, and disaster response. One of the key capabilities that enables drones to operate autonomously is navigation: a drone's capacity to determine its position, plan a path, and adjust its trajectory based on real-time data, all without direct human control. This is made possible through a combination of technologies, with GPS receivers playing a pivotal role.

GPS receivers listen to a constellation of satellites orbiting the Earth, each broadcasting precise timing and orbital information. By measuring the signal travel time from at least four satellites and converting those times into distances, a process known as trilateration, the receiver computes the drone's position. The onboard computer then uses this fix to calculate the drone's trajectory, speed, and direction, allowing it to follow a predetermined flight path or respond to dynamic environmental changes.
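For readers curious how position falls out of those distance measurements, here is a minimal sketch of the underlying math. The satellite coordinates and receiver location below are made-up numbers for illustration, and the Gauss-Newton solver is a generic least-squares approach, not the algorithm inside any particular GPS chipset:

```python
import numpy as np

# Hypothetical satellite positions (km, Earth-centered frame) and the
# distances to each, as would be derived from signal travel times.
sats = np.array([
    [15600.0,  7540.0, 20140.0],
    [18760.0,  2750.0, 18610.0],
    [17610.0, 14630.0, 13480.0],
    [19170.0,   610.0, 18390.0],
])
true_pos = np.array([-41.8, -16.8, 6370.1])      # receiver location to recover
ranges = np.linalg.norm(sats - true_pos, axis=1)

# Gauss-Newton iteration: linearize the range equations around the current
# guess and solve a small least-squares problem for the position update.
est = np.zeros(3)
for _ in range(10):
    diff = est - sats
    pred = np.linalg.norm(diff, axis=1)
    J = diff / pred[:, None]                     # Jacobian of range w.r.t. position
    delta, *_ = np.linalg.lstsq(J, ranges - pred, rcond=None)
    est += delta

print(np.round(est, 1))                          # recovers the receiver position
```

Real receivers additionally solve for their own clock bias, which is why four satellites (not three) are the practical minimum.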

However, GPS signals are not always reliable or available. Interference, urban canyons, and adverse atmospheric conditions can degrade or block reception entirely, and without a position fix a drone struggles to maintain its intended course.

To address the problem of lost GPS signals, researchers at Fuzhou University in China have focused on enhancing the accuracy of vision-based control systems, specifically visual servoing. Visual servoing utilizes cameras and image processing algorithms to gather information about the drone’s surroundings. By improving the accuracy of visual servoing, the researchers aim to provide a low-cost and lightweight solution for UAV control tasks, eliminating the need for power-hungry computational hardware.
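In its classical form, image-based visual servoing turns the gap between where features appear in the camera image and where they should appear into a velocity command, using a so-called interaction matrix. The sketch below shows that textbook control law for point features; it assumes a pinhole camera and known feature depths, and is a generic illustration rather than the Fuzhou University controller:

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction matrix of a normalized image point (x, y) at depth Z.
    Its rows map camera velocity (vx, vy, vz, wx, wy, wz) to feature motion."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1 + x**2), y],
        [0.0, -1.0 / Z, y / Z, 1 + y**2, -x * y, -x],
    ])

def ibvs_velocity(features, desired, depths, gain=0.5):
    """Classical IBVS law: v = -gain * pinv(L) * (s - s_desired)."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    error = (np.asarray(features) - np.asarray(desired)).ravel()
    return -gain * np.linalg.pinv(L) @ error

# Example: four image points slightly offset from their desired positions.
desired = [(-0.1, -0.1), (0.1, -0.1), (0.1, 0.1), (-0.1, 0.1)]
features = [(x + 0.02, y + 0.02) for x, y in desired]
v = ibvs_velocity(features, desired, depths=[1.0] * 4)
print(np.round(v, 3))   # 6-vector: commanded translational + angular velocity
```

Note the depth Z appearing in the matrix: estimating it well is exactly where the image depth model discussed below comes in.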

One of the challenges with image-based visual servoing is that noise can negatively impact the accuracy of translational velocity measurements, which are essential for flight control. Additionally, tracking a rotating target and dealing with disturbances during flight pose further difficulties.

To overcome these challenges, the research team proposed several enhancements. They developed a velocity observer to estimate relative velocities between the drone and the target, eliminating the need for translational velocity measurements and thereby sidestepping the control problems caused by noise in those measurements. They also introduced a novel image depth model capable of tracking objects in any orientation, ensuring accurate tracking and smooth trajectory calculation even when the target rotates. To handle unpredictable disturbances, they incorporated an integral-based filter that improves tracking stability.
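To see why an observer can stand in for a direct velocity measurement, consider the simplest version of the idea: feed noisy position readings into a small feedback loop that maintains its own velocity estimate. The Luenberger-style observer below is a generic illustration of that principle, not the observer designed by the researchers, and all gains and noise levels are made up for the demo:

```python
import numpy as np

def luenberger_step(pos_est, vel_est, meas, dt, l1=2.0, l2=1.0):
    """One step of a constant-velocity Luenberger observer.
    Both states are corrected using only the position innovation."""
    innov = meas - pos_est
    pos_est = pos_est + dt * vel_est + l1 * dt * innov
    vel_est = vel_est + l2 * dt * innov
    return pos_est, vel_est

rng = np.random.default_rng(0)
true_vel = 1.5                       # m/s, held constant for the demo
dt, pos_est, vel_est = 0.01, 0.0, 0.0
for k in range(2000):
    true_pos = true_vel * dt * k
    meas = true_pos + rng.normal(0.0, 0.05)   # noisy position measurement
    pos_est, vel_est = luenberger_step(pos_est, vel_est, meas, dt)

print(round(vel_est, 2))             # estimate settles near the true 1.5 m/s
```

The appeal for a small UAV is that this runs in a few arithmetic operations per step, with no extra sensor and no heavy computation.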

The stability of the custom image-based visual servoing controller was analyzed using the Lyapunov method, which demonstrated that it tracks targets stably and remains robust in the presence of unexpected disturbances.
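In broad strokes, a Lyapunov analysis of a visual-servoing controller picks a positive, energy-like function of the tracking error and shows that the control law makes it decrease over time, so the error must shrink. A generic sketch of that argument (not the paper's actual proof), for a feature error $e$ driven by camera velocity $v$ through the interaction matrix $L$:

```latex
V(e) = \tfrac{1}{2}\, e^{\top} e > 0 \quad \text{for } e \neq 0,
\qquad
\dot{e} = L v .

\text{With the control law } v = -\lambda L^{+} e :
\qquad
\dot{V} = e^{\top} \dot{e} = -\lambda\, e^{\top} L L^{+} e \le 0 ,
```

since $L L^{+}$ is positive semi-definite. Handling disturbances, as the researchers did, requires extending such an argument so that $\dot{V}$ stays negative despite bounded perturbation terms.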

In the future, the research team plans to further refine their methods and apply them to real-world scenarios involving capturing dynamic targets and facilitating autonomous landings.

Sources:
– Definition of drones and autonomous navigation: Own knowledge
– Source article: Drones Enhance Autonomous Navigation with Visual Servoing. (2021, May 11). DroneDJ.