In the rapidly evolving landscape of unmanned aerial vehicles (UAVs), the term “pursue” has transitioned from a simple action verb to a complex technical classification. At its core, “pursue” refers to a drone’s autonomous capability to identify, lock onto, and follow a moving subject without direct manual input from a pilot. This functionality represents a pinnacle of drone tech and innovation, merging computer vision, artificial intelligence (AI), and advanced flight kinematics into a single, seamless operation.
As we move deeper into the era of autonomous flight, understanding what “pursue” means requires a look under the hood at the sophisticated software and hardware architectures that allow a machine to mirror the predatory instincts of nature, albeit for the purposes of cinematography, data collection, and industrial surveillance.

The Architecture of Pursuit: From GPS to Computer Vision
The concept of a drone following a target is not entirely new, but the “pursue” technology used today is light-years ahead of early “Follow Me” modes. To understand the innovation, one must distinguish between the two primary methods of autonomous tracking.
The Shift from GPS-Based Tethering
Early iterations of pursuit technology relied on GPS “leashing.” In this setup, the drone would track the position of the pilot’s controller or a wearable GPS beacon. The drone wasn’t “seeing” the subject; it was simply maintaining a calculated distance from a set of geographical coordinates. While effective in open fields, this method was plagued by latency and a total lack of spatial awareness. If the subject moved behind a building or under a canopy, the drone remained oblivious to the physical environment.
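The geometry of GPS leashing is simple enough to sketch. The hypothetical helper below (names are illustrative, not from any flight stack) computes a drone setpoint at a fixed distance and bearing from a beacon’s coordinates, which is essentially all these early systems did:

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def gps_leash_setpoint(beacon_lat, beacon_lon, follow_dist_m, bearing_deg):
    """Compute a drone setpoint at a fixed distance/bearing from a GPS beacon.

    A crude flat-earth approximation, adequate over the short distances
    typical of 'Follow Me' leashing. Note there is no environment model
    here at all -- exactly the limitation described above.
    """
    d_lat = (follow_dist_m * math.cos(math.radians(bearing_deg))) / EARTH_RADIUS_M
    d_lon = (follow_dist_m * math.sin(math.radians(bearing_deg))) / (
        EARTH_RADIUS_M * math.cos(math.radians(beacon_lat))
    )
    return (beacon_lat + math.degrees(d_lat), beacon_lon + math.degrees(d_lon))
```

Because the setpoint depends only on coordinates, a wall or tree between drone and beacon never enters the calculation.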
The Rise of Computer Vision (CV)
Modern “Pursue” modes instead utilize Computer Vision (CV). Using the primary camera and often a suite of auxiliary sensors, the drone’s onboard processor analyzes video frames in real time. Through a process called “feature extraction,” the AI identifies the pixels that constitute the subject—be it a vehicle, an animal, or a human—and separates them from the background. Once a “bounding box” is established around the subject, the drone’s flight controller uses closed-loop control algorithms to maintain that subject’s position within the frame, regardless of how the subject or the drone moves.
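The “keep the subject in frame” step can be sketched as a proportional controller: the offset of the bounding box from the frame center becomes a yaw/pitch rate command. This is a minimal illustration (function and gain names are hypothetical); real flight stacks add damping, deadbands, and gimbal compensation:

```python
def centering_command(bbox, frame_w, frame_h, k_yaw=0.002, k_pitch=0.002):
    """Convert a subject bounding box (x, y, w, h in pixels) into
    yaw/pitch rate commands that re-center the subject in frame."""
    x, y, w, h = bbox
    err_x = (x + w / 2) - frame_w / 2   # positive: subject right of center
    err_y = (y + h / 2) - frame_h / 2   # positive: subject below center
    yaw_rate = k_yaw * err_x            # yaw toward the subject
    pitch_rate = -k_pitch * err_y       # tilt to bring the subject back up/down
    return yaw_rate, pitch_rate
```

When the box drifts right of center the controller commands a rightward yaw; when it is perfectly centered, both commands are zero.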
The Mechanics of Visual Tracking and Deep Learning
For a drone to pursue a target effectively, it must do more than just see; it must understand. This is achieved through the integration of Deep Learning and Neural Networks, which are trained on millions of images to recognize shapes and predict movement.
Real-Time Object Recognition
When a pilot selects a subject to pursue, the AI begins a high-speed diagnostic. It identifies the “centroid” of the target and calculates its vector—the direction and speed at which it is traveling. Innovation in this sector has led to drones that can distinguish between a cyclist and a pedestrian, adjusting their flight dynamics accordingly. For instance, if the AI recognizes the target as a fast-moving car, it will preemptively increase its tilt angle and power output to compensate for expected acceleration.
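The vector calculation described above reduces, in its simplest form, to differencing successive centroid observations. A minimal sketch (a constant-velocity estimate over two samples; production trackers filter over many):

```python
def motion_vector(positions, dt):
    """Estimate a target's velocity vector (m/s) from its last two
    centroid observations, given as (x, y) in meters, dt seconds apart."""
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    return ((x1 - x0) / dt, (y1 - y0) / dt)
```

From this vector the flight controller can derive both the direction and the speed it must match, which is what drives the preemptive tilt and power adjustments mentioned above.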
Predictive Pathing and Motion Vectors
One of the most significant hurdles in pursuit technology is “occlusion”—when the subject momentarily disappears behind an obstacle like a tree or a sign. Sophisticated pursuit algorithms now include predictive pathing. By analyzing the subject’s previous velocity and trajectory, the drone can “guess” where the target will emerge. This prevents the drone from abandoning the pursuit the moment visual contact is lost, allowing the AI to re-acquire the target with high precision once it reappears.
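In its simplest form, this “guess” is dead reckoning: extrapolate the last known position along the last known velocity for the duration of the occlusion. A hypothetical sketch under a constant-velocity assumption (production trackers typically use a Kalman filter, which also handles noisy measurements):

```python
def predict_reacquire_point(last_pos, velocity, occlusion_time_s):
    """Dead-reckon where an occluded target should reappear by
    extrapolating its last known velocity (constant-velocity model)."""
    x, y = last_pos
    vx, vy = velocity
    return (x + vx * occlusion_time_s, y + vy * occlusion_time_s)
```

The drone points its camera at the predicted emergence point and resumes visual matching there, rather than searching the whole frame.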
Edge Computing and Processing Power
The ability to pursue is limited by the “brain” of the drone. To process 4K video data and translate it into flight commands in milliseconds, modern UAVs utilize specialized System-on-a-Chip (SoC) architectures. These chips are designed for “edge computing,” meaning the AI processing happens locally on the drone rather than in the cloud. This reduces latency to near-zero, which is critical when pursuing a target at speeds exceeding 30 or 40 miles per hour.
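The arithmetic behind that latency requirement is easy to make concrete: every millisecond of processing delay translates into distance the target covers before the drone reacts. A small illustrative helper (names are mine, not from any SDK):

```python
def latency_drift_m(speed_mph, latency_ms):
    """Distance (meters) a target travels during a processing latency window."""
    speed_mps = speed_mph * 0.44704  # mph -> m/s
    return speed_mps * latency_ms / 1000.0
```

At 40 mph, even a single dropped 30 fps frame (~33 ms) lets the target drift roughly 0.6 m, which is why cloud round-trips of hundreds of milliseconds are a non-starter and the processing must stay on board.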
The Intersection of Pursuit and Autonomous Obstacle Avoidance

A drone cannot pursue a target effectively if it cannot navigate the environment safely. This is where “pursue” technology intersects with stabilization and obstacle avoidance systems, creating a holistic autonomous ecosystem.
Advanced Pilot Assistance Systems (APAS)
In a pursuit scenario, the drone’s “attention” is split between the target and its surroundings. Advanced Pilot Assistance Systems (APAS) and similar technologies act as a secondary layer of intelligence. While the primary AI is focused on the pursuit, a network of binocular vision sensors and Time-of-Flight (ToF) sensors map the environment in 3D. This creates a “point cloud” of the surroundings, allowing the drone to autonomously bank, climb, or dive to avoid branches, wires, or walls without losing the pursuit lock.
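Conceptually, the avoidance layer’s job is to scan that point cloud for anything breaching a safety bubble around the airframe. A minimal sketch, assuming a drone-centered cloud of (x, y, z) points in meters (the schema is illustrative):

```python
import math

def nearest_obstacle_m(point_cloud, safety_radius_m=2.0):
    """Scan a drone-centered 3D point cloud and return the closest
    obstacle distance plus a flag for a safety-bubble breach."""
    nearest = min(math.sqrt(x * x + y * y + z * z) for x, y, z in point_cloud)
    return nearest, nearest < safety_radius_m
```

When the flag trips, the avoidance layer overrides the pursuit controller’s commanded path with an evasive bank, climb, or dive while the tracking lock is preserved.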
360-Degree Spatial Awareness
The most innovative pursuit drones now feature omnidirectional obstacle sensing. In the past, pursuit was often limited to “Follow” (behind the target) because drones lacked sensors on their sides or rear. Current innovation has enabled “Parallel Pursuit” and “Profile Pursuit,” where the drone flies alongside or even in front of the subject. This requires the drone to “see” in directions it isn’t flying, using a mesh of sensor data to ensure that while it pursues the target, it doesn’t collide with lateral obstacles.
SLAM Technology (Simultaneous Localization and Mapping)
SLAM is a cornerstone of high-level autonomous pursuit. It allows the drone to build a map of an unknown environment while simultaneously keeping track of its own location within that map. In complex environments like forests or urban canyons, SLAM enables the drone to remember where obstacles are, allowing for more aggressive pursuit maneuvers and smoother flight paths that mimic the intuition of a human pilot.
Industrial and Scientific Applications of Pursuit Technology
While many associate pursuit modes with consumer gadgets, the underlying technology is driving massive innovation in industrial and scientific sectors. The ability to autonomously track an object has implications far beyond the “active track” features found in consumer drones.
Wildlife Research and Conservation
In the realm of remote sensing and biology, pursuit technology allows researchers to monitor animal migration and behavior without human interference. Drones equipped with AI pursuit can follow a specific animal across rugged terrain, maintaining a non-invasive distance while collecting high-resolution data. This provides a level of observation that was previously impossible or prohibitively expensive via helicopter.
Search and Rescue (SAR) and Public Safety
In search and rescue operations, pursuit technology is used to “lock on” to a person spotted by a thermal or optical sensor. Once a potential victim is identified, the drone can pursue their heat signature or visual profile, providing constant eyes-on-target for ground teams. In public safety, this technology allows for the autonomous tracking of suspect vehicles during high-speed chases, reducing the need for dangerous ground-based pursuits in congested areas.
Infrastructure Inspection and Mapping
Pursuit technology is also being adapted for “linear infrastructure” inspection. For example, a drone can be programmed to pursue the path of a power line or a pipeline. By recognizing the structure as the “target,” the drone follows the line autonomously, using its sensors to detect anomalies or damage while maintaining a precise distance from the energized equipment.
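Following a linear structure reduces to minimizing cross-track error: the drone’s signed lateral offset from the line segment being inspected. A hypothetical sketch with a proportional correction (real inspection autopilots layer altitude hold and standoff distance on top of this):

```python
import math

def cross_track_correction(drone_xy, line_start, line_end, gain=0.5):
    """Signed lateral offset from an inspection line (e.g. a power-line
    corridor) and a proportional lateral velocity command toward it.

    Positive offset means the drone is right of the line's direction
    of travel; the command pushes it back toward zero offset.
    """
    (x, y), (ax, ay), (bx, by) = drone_xy, line_start, line_end
    lx, ly = bx - ax, by - ay
    norm = math.hypot(lx, ly)
    cte = ((x - ax) * ly - (y - ay) * lx) / norm  # signed perpendicular distance
    return cte, -gain * cte
```

The same loop that centers a moving subject in frame here centers the drone over a stationary “target” that happens to be hundreds of kilometers long.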
The Future of Pursuit: Swarms and Full Autonomy
As we look toward the future of Tech & Innovation in the UAV space, the concept of pursuit is expanding into multi-agent systems and “True Autonomy.”
Multi-Drone Collaborative Pursuit
The next frontier is collaborative pursuit, where multiple drones work in a swarm to track a single target or a group of targets. Through M2M (Machine-to-Machine) communication, these drones can share coordinates and visual data. If one drone loses the target due to an obstacle, another drone in the swarm—having a different vantage point—takes over the lead. This creates a “persistent pursuit” environment where escape or loss of visual contact becomes nearly impossible.
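The handover logic at the heart of this can be sketched as a lead election over shared telemetry. The schema below (drone records with `id`, `has_los` for line of sight, and `confidence`) is a hypothetical illustration, not a real swarm protocol:

```python
def elect_lead(drones):
    """Pick the next lead pursuer: among drones with an unobstructed
    view of the target, choose the one with the highest tracking
    confidence. Returns None if every drone is occluded."""
    candidates = [d for d in drones if d["has_los"]]
    if not candidates:
        return None  # all occluded: fall back to predictive pathing
    return max(candidates, key=lambda d: d["confidence"])["id"]
```

Because each drone broadcasts its view state continuously, the swarm can re-elect a lead within a frame or two of the current lead losing line of sight.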

Beyond Human-in-the-Loop
The ultimate goal of pursuit innovation is a “Level 5” autonomous system where no human-in-the-loop is required. We are moving toward drones that can be deployed to a specific area, autonomously identify a target based on pre-set parameters (such as a specific license plate or a thermal signature), and initiate a pursuit and reporting sequence entirely on their own. While this raises significant regulatory and ethical questions, the technological capability is rapidly approaching reality.
In conclusion, “pursue” is no longer a simple feature; it is a sophisticated discipline within drone technology. It represents the perfect marriage of AI, computer vision, and high-performance aeronautics. As processing power increases and algorithms become more refined, the ability of drones to autonomously navigate and track the world around them will continue to redefine our capabilities in filmmaking, industry, and global security. Understanding “pursue” is, in many ways, understanding the future of flight itself.
