In the rapidly accelerating world of unmanned aerial vehicles (UAVs), technology often outpaces the language we use to describe it. Only a decade ago, a “drone” was primarily a military asset or a fragile hobbyist project flown on finicky radio links and a prayer. Today, these machines are essentially flying supercomputers. As features shift from experimental prototypes to standard industry benchmarks, we often find ourselves reaching for terms that have since been rebranded, refined, or entirely replaced. We ask, “What was it called again?”—only to realize that the “Automatic Follow” of 2014 is the “Autonomous AI-Driven Pathfinding” of 2024.

This article explores the evolution of drone tech and innovation, specifically focusing on how the “intelligence” of the aircraft has transitioned from simple scripted behaviors to complex, real-time cognitive processing.
From “Smart Follow” to Neural Networks: The Evolution of Tracking Systems
The ability of a drone to identify a subject and follow it through space was once the “holy grail” of consumer and commercial drone tech. In the early days, this was a clunky, often unreliable feature that led to more than a few tree-bound quadcopters.
The Era of Follow-Me Mode
In the beginning, “Follow-Me” was essentially a tether. It relied almost exclusively on a GPS handshake between the drone and a ground station (usually the pilot’s smartphone or a dedicated beacon). The drone didn’t “see” the person; it simply chased a coordinate. If the person moved behind a building or under a canopy, the drone followed the coordinate blindly, often resulting in a collision. This was the era of “dumb” following—relying on external data rather than onboard intelligence.
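To make the limitation concrete, here is a minimal sketch of one iteration of a GPS-only Follow-Me loop. The function name, gains, and controller structure are all illustrative, not any vendor’s firmware:

```python
import math

EARTH_RADIUS_M = 6_371_000

def follow_me_step(drone_lat, drone_lon, target_lat, target_lon,
                   standoff_m=10.0, gain=0.5):
    """One iteration of a GPS-only Follow-Me loop.

    The drone chases a coordinate; nothing here knows what sits
    between the two points, which was the classic failure mode.
    """
    # Equirectangular approximation: fine over follow-mode distances
    mean_lat = math.radians((drone_lat + target_lat) / 2)
    dx = math.radians(target_lon - drone_lon) * math.cos(mean_lat) * EARTH_RADIUS_M
    dy = math.radians(target_lat - drone_lat) * EARTH_RADIUS_M

    distance_m = math.hypot(dx, dy)
    bearing_deg = math.degrees(math.atan2(dx, dy)) % 360

    # Proportional control: close the gap, but hold a standoff distance
    speed_m_s = max(0.0, gain * (distance_m - standoff_m))
    return bearing_deg, speed_m_s  # handed to the flight controller as a setpoint
```

Because the loop only ever sees two coordinates, a tree standing between them simply isn’t part of the math, which is exactly how those early collisions happened.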
ActiveTrack and the Rise of Computer Vision
As processing power increased, we saw the birth of computer vision. This is where the industry began to shift away from GPS-based tracking toward visual recognition. By using a series of algorithms to identify the pixels that constituted a “human,” a “car,” or a “cyclist,” drones began to perceive the world. This transition marked the first time the drone was truly “aware” of its subject. This phase introduced terms like “lock-on” and “visual tethering,” moving us closer to the sophisticated AI we see today.
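As a rough illustration of visual “lock-on,” the sketch below uses OpenCV’s off-the-shelf CSRT tracker (available in the opencv-contrib-python package) to follow a user-selected bounding box from frame to frame. Production ActiveTrack-style systems pair a trained detector with a tracker and run on dedicated hardware, but the loop has the same shape:

```python
import cv2  # pip install opencv-contrib-python

cap = cv2.VideoCapture("flight_footage.mp4")  # hypothetical clip
ok, frame = cap.read()

# The "lock-on": the operator draws one box around the subject
bbox = cv2.selectROI("Select subject", frame, showCrosshair=True)
tracker = cv2.TrackerCSRT_create()  # on some builds: cv2.legacy.TrackerCSRT_create()
tracker.init(frame, bbox)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, bbox = tracker.update(frame)  # re-find the subject in the new frame
    if found:
        x, y, w, h = (int(v) for v in bbox)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        # On a drone, the box center would feed the gimbal and yaw
        # controllers to keep the subject framed.
    cv2.imshow("Visual tether", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
```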
Modern AI-Driven Predictive Pathfinding
Today, we no longer just “follow.” Modern high-end drones utilize neural networks to predict where a subject is going to go. If a mountain biker disappears behind a thicket of trees, the drone’s AI calculates the rider’s trajectory and speed, maintaining the flight path to reacquire the subject the moment they emerge. This isn’t just a follow-mode; it is a predictive behavioral model. The industry has moved from “Follow-Me” to “Predictive Autonomous Tracking,” reflecting a massive leap in computational complexity.
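A toy version of the core idea, hedged accordingly: real systems use learned motion models and full state estimators such as Kalman filters, but a constant-velocity fallback is enough to show the “predict through the occlusion” logic:

```python
from dataclasses import dataclass

@dataclass
class TrackState:
    x: float   # last known position, meters
    y: float
    vx: float  # last estimated velocity, m/s
    vy: float

def predict_position(state: TrackState, seconds_occluded: float):
    """Dead-reckon the subject while the camera cannot see them.

    A rider doing 8 m/s who is hidden for 2 s should reappear
    roughly 16 m along their last heading; aim the camera there.
    """
    return (state.x + state.vx * seconds_occluded,
            state.y + state.vy * seconds_occluded)

rider = TrackState(x=0.0, y=0.0, vx=8.0, vy=0.0)
print(predict_position(rider, 2.0))  # -> (16.0, 0.0)
```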
Navigating the Fog: When “Obstacle Avoidance” Became “Spatial Awareness”
If tracking was the first hurdle, not hitting things was the second. The terminology surrounding how a drone interacts with its environment has undergone a similar transformation, moving from reactive safety features to proactive environmental mapping.
Early Ultrasonic and Infrared Sensors
Early attempts at keeping drones from crashing were primitive. We used “ping” sensors—ultrasonic or infrared—that worked much like a parking sensor on a car. They were effectively “stop signs.” If the sensor detected a wall within three meters, the drone would simply halt. These systems were famously unreliable in bright sunlight or against absorbent surfaces like bushes. This was “Obstacle Avoidance” in its most literal, rudimentary form.
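The “stop sign” behavior fits in a few lines. This is a hypothetical sketch of the logic, not any manufacturer’s code:

```python
STOP_DISTANCE_M = 3.0  # halt threshold, as on early consumer drones

def braking_command(ultrasonic_range_m: float, requested_speed: float) -> float:
    """Reactive obstacle 'avoidance,' ping-sensor era.

    No path planning: if anything echoes back inside the threshold,
    zero the forward velocity. A soft bush that absorbs the ping
    never triggers the brake at all, hence the crashes.
    """
    if ultrasonic_range_m < STOP_DISTANCE_M:
        return 0.0          # hard stop
    return requested_speed  # otherwise, carry on
```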
The Leap to Omnidirectional Vision Systems
As the tech matured, manufacturers began “skinning” the aircraft in cameras. By using stereoscopic vision—essentially two cameras placed a few centimeters apart to mimic human depth perception—drones began to see in 3D. This allowed for the transition from “avoidance” to “navigation.” The drone was no longer just stopping; it was calculating a way around the object. This is also when “VIO” (Visual-Inertial Odometry) entered the lexicon: a technique that fuses camera imagery with accelerometer and gyroscope data, allowing drones to hold their position even when GPS is unavailable.
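The depth math behind stereoscopic vision is compact: for a rectified camera pair, depth is Z = f * B / d, where f is the focal length in pixels, B is the baseline between the cameras, and d is the disparity (how far a feature shifts between the left and right images). A minimal sketch with made-up camera numbers:

```python
def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        return float("inf")  # zero disparity means the feature is at infinity
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: 700 px focal length, 5 cm baseline, 10 px disparity
print(stereo_depth_m(700, 0.05, 10))  # -> 3.5 (meters to the obstacle)
```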

SLAM and 3D Environment Mapping
The current gold standard is SLAM (Simultaneous Localization and Mapping). This technology allows a drone to build a 3D map of an unknown environment in real time while simultaneously tracking its own location within that map. When we look at modern enterprise drones used for inspecting mines or warehouses, they aren’t just “avoiding obstacles.” They are performing “Spatial Mapping.” They understand the geometry of the room, the texture of the surfaces, and the depth of the void. When a pilot asks, “What was that system that maps the room called again?” the answer is SLAM—a cornerstone of modern robotics.
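Full SLAM is well beyond a snippet, but the “mapping” half is easy to caricature: fuse each range reading into a log-odds occupancy grid. The sketch below assumes the drone’s pose is already known, which is precisely the “localization” half that real SLAM estimates simultaneously; every name and constant here is illustrative:

```python
import numpy as np

GRID = np.zeros((100, 100))  # log-odds occupancy; 0 means "unknown"
HIT, MISS = 0.85, -0.4       # log-odds increments (tuning constants)

def integrate_beam(pose_cell, hit_cell):
    """Fuse one range beam: free space along the ray, occupied at the end."""
    (x0, y0), (x1, y1) = pose_cell, hit_cell
    n = max(abs(x1 - x0), abs(y1 - y0), 1)
    for i in range(n):  # coarse ray-trace from the sensor toward the return
        cx = x0 + (x1 - x0) * i // n
        cy = y0 + (y1 - y0) * i // n
        GRID[cy, cx] += MISS  # the beam passed through: probably free
    GRID[y1, x1] += HIT       # the beam stopped here: probably occupied

integrate_beam((50, 50), (50, 80))  # one simulated range return
print(GRID[80, 50], GRID[65, 50])   # 0.85 (occupied) vs -0.4 (free)
```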
The Semantic Shift of Autonomy: Defining Levels of Flight Intelligence
The word “autonomous” is often thrown around loosely, but in the realm of tech and innovation, it has very specific tiers. Much like the automotive industry’s levels of self-driving, drone flight has moved through various stages of independence.
Semi-Autonomous vs. Fully Autonomous Flight
For a long time, drones were “semi-autonomous.” They could hover in place or return to their takeoff point if they lost signal. This was often called “Fail-Safe Mode” or “Auto-RTH” (Return-to-Home). However, the pilot was still the primary intelligence. Today, we are seeing the rise of “Level 4” and “Level 5” autonomy in drones. This means the drone can execute a mission—take off, fly a complex grid, avoid unpredictable obstacles, and land—without a single input from a human. We have moved from “automated” (following a script) to “autonomous” (making its own decisions).
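The distinction shows up clearly in code. A semi-autonomous failsafe is little more than a lookup table of scripted reactions; this hypothetical sketch mirrors the Auto-RTH behavior described above:

```python
from enum import Enum, auto

class FlightMode(Enum):
    MANUAL = auto()
    HOVER = auto()
    RETURN_TO_HOME = auto()

def failsafe(mode: FlightMode, link_ok: bool, battery_pct: float) -> FlightMode:
    """Scripted, semi-autonomous reactions: the pilot is still the brain.

    The drone never chooses *where* to go; it only falls back to
    pre-programmed behaviors when its inputs degrade.
    """
    if not link_ok or battery_pct < 15.0:
        return FlightMode.RETURN_TO_HOME  # the classic Auto-RTH trigger
    return mode

print(failsafe(FlightMode.MANUAL, link_ok=False, battery_pct=80.0))
# -> FlightMode.RETURN_TO_HOME
```

Nothing in that table “decides” anything; it only reacts. A fully autonomous mission planner, by contrast, generates and continuously re-plans its own trajectory.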
The Role of Edge Computing in Real-Time Decisions
The reason for this shift is “Edge Computing.” In the past, complex data had to be sent to a server to be processed, which was too slow for a fast-flying drone. Now, the “brain” is on the aircraft. High-performance AI modules (like those from NVIDIA or specialized proprietary chips) allow the drone to process gigabytes of visual data every second. This enables “Low-Latency Autonomy.” If you’re wondering why modern drones feel so much more “alive” and responsive, it’s because the processing is happening at the “edge”—right there on the aircraft itself.
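The latency argument is easy to quantify: at a given airspeed, every millisecond between “camera sees obstacle” and “motors respond” is distance flown blind. The round-trip figures below are illustrative assumptions, not benchmarks:

```python
def blind_distance_m(speed_m_s: float, latency_ms: float) -> float:
    """Distance covered before the drone can react to what it saw."""
    return speed_m_s * latency_ms / 1000.0

SPEED = 20.0  # m/s, a brisk sport-drone pace
print(blind_distance_m(SPEED, 250))  # ~250 ms cloud round trip -> 5.0 m flown blind
print(blind_distance_m(SPEED, 20))   # ~20 ms onboard inference -> 0.4 m flown blind
```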
Remote Sensing and the “Invisible” Pilot
Beyond just flying, drones are now intelligent sensors. We’ve seen the integration of LiDAR (Light Detection and Ranging) and ToF (Time of Flight) sensors. These innovations have changed the drone from a “flying camera” to a “data acquisition platform.” In industries like construction and agriculture, the “drone” is almost secondary to the “Remote Sensing” tech it carries. The innovation here isn’t just in the propellers, but in the drone’s ability to interpret the data it gathers on the fly, identifying crop stress or structural cracks without human intervention.
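Crop stress is a concrete example. The classic index is NDVI, computed per pixel from the near-infrared and red bands a multispectral drone camera captures: healthy vegetation reflects strongly in NIR and absorbs red. A minimal sketch with synthetic data:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Values near +1 indicate vigorous vegetation; values near 0
    suggest stress or bare soil.
    """
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / np.clip(nir + red, 1e-6, None)  # guard divide-by-zero

# Synthetic 2x2 patch: left column healthy, right column stressed
nir_band = np.array([[0.8, 0.4], [0.8, 0.3]])
red_band = np.array([[0.1, 0.3], [0.1, 0.3]])
print(ndvi(nir_band, red_band).round(2))  # [[0.78 0.14] [0.78 0.  ]]
```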
Future-Proofing the Vernacular: Beyond the Term “Drone”
As we look toward the future, the very word “drone” is starting to feel like an oversimplification. In professional and technical circles, the language is shifting to reflect a more integrated ecosystem of technology.
The Transition to Unmanned Aircraft Systems (UAS)
“What was it called again? A drone?” If you’re talking to a regulatory body or a high-end engineer, they will likely correct you: “It’s a UAS” (Unmanned Aircraft System, the term regulators such as the FAA use). This distinction is important because it acknowledges that the aircraft is only one part of the innovation. The “system” includes the ground control station, the data links, the AI cloud processing, and the sensor arrays. This shift in terminology reflects the maturity of the industry—moving away from toys and toward integrated infrastructure.
Swarm Intelligence and Collective Autonomy
One of the most exciting “What’s next?” moments in tech is Swarm Intelligence. We are moving away from the idea of a single pilot controlling a single aircraft. In the near future, we will see “Collective Autonomy,” where dozens or hundreds of drones communicate with each other to perform a task. They fly like a flock of birds, avoiding each other and dividing tasks efficiently. This isn’t just innovation; it’s a paradigm shift in how we think about aerial labor.
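The flock analogy is literal. Most swarm demonstrations descend from Craig Reynolds’ classic “boids” rules, in which each agent steers using only its neighbors’ positions and velocities: separate, align, cohere. A compressed, illustrative sketch:

```python
import numpy as np

def boids_step(pos, vel, dt=0.1, radius=5.0):
    """One update of the three classic flocking rules.

    pos, vel: (N, 2) float arrays. Each drone reacts only to
    neighbors within `radius`; there is no central controller.
    """
    new_vel = vel.copy()
    for i in range(len(pos)):
        offsets = pos - pos[i]
        dists = np.linalg.norm(offsets, axis=1)
        near = (dists > 0) & (dists < radius)
        if not near.any():
            continue
        cohesion = offsets[near].mean(axis=0) * 0.05          # drift toward the local center
        alignment = (vel[near].mean(axis=0) - vel[i]) * 0.10  # match neighbors' heading
        separation = -(offsets[near] / dists[near, None] ** 2).sum(axis=0) * 0.5  # keep clear
        new_vel[i] += cohesion + alignment + separation
    return pos + new_vel * dt, new_vel

# Ten drones scattered at random, drifting east; repeated steps pull them into a flock
rng = np.random.default_rng(0)
pos, vel = rng.uniform(0, 10, (10, 2)), np.tile([1.0, 0.0], (10, 1))
for _ in range(100):
    pos, vel = boids_step(pos, vel)
```

The key design point is that no central controller appears anywhere in that loop; the flock-level behavior emerges entirely from local rules, which is what “Collective Autonomy” means in practice.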

The Integration of 5G and Beyond-Visual-Line-of-Sight (BVLOS)
Finally, innovation in connectivity is changing the name of the game. We are moving into the era of BVLOS. Previously, a drone was limited by the pilot’s line of sight. With 5G integration and satellite links, the “What was it called?” question will likely refer to “Global Teleoperation.” A pilot in London could be “flying” a drone in the Australian Outback with latency low enough for supervised remote control.
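“Low latency” still has a physical floor, because signals cannot outrun the speed of light. A back-of-envelope check for the London-to-Outback scenario (the distance and medium are rough assumptions):

```python
C_VACUUM_KM_S = 299_792              # speed of light in vacuum
C_FIBER_KM_S = C_VACUUM_KM_S / 1.47  # light is slower in glass

def round_trip_ms(distance_km: float, speed_km_s: float) -> float:
    return 2 * distance_km / speed_km_s * 1000

# London to central Australia, very roughly 15,000 km of path
print(round_trip_ms(15_000, C_FIBER_KM_S))   # ~147 ms over fiber
print(round_trip_ms(15_000, C_VACUUM_KM_S))  # ~100 ms, the hard physical floor
```

This is why global teleoperation is expected to pair the long-distance human link with onboard autonomy: the drone handles split-second reactions locally while the distant pilot supervises.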
As we look back at the humble “quadcopters” of a decade ago, it is clear that “what it was called” matters less than what it has become. We have moved from mechanical toys to intelligent, autonomous agents that are reshaping our world. The next time you find yourself forgetting the name of a specific feature, remember that in this industry, today’s “innovation” is tomorrow’s “standard,” and the language will continue to evolve as fast as the wings can carry it.
