
Voice-controlled drones a military game-changer, Primordial Labs says

Hovering over old Civil War-era farmland, a small quadcopter is told to go up 125 feet and then move 50 meters north of a specific target on the ground — in this case, a soccer goal. The drone rises and flies off.

On the ground, there isn’t someone maneuvering the small drone with a standard gaming-stick controller. Rather, it is receiving all of its directions from someone’s voice.

“Doing that with the sticks is a lot of work,” Lee Ritholtz, CEO and co-founder of Primordial Labs, said during a recent demonstration of the capability at a local park in Leesburg, Virginia.

“He just told it to do a lot of things that would have required a lot of manual engagement,” Ritholtz said of his senior product manager and former Marine Raider, Jordan Dross, who was in the midst of talking the drone through its mission via a simple headset and push-to-talk radio.

Ritholtz and his co-founder Adrian Pope — both with roots at Lockheed Martin, including Skunk Works, where they focused on advanced development programs, and Sikorsky — created Primordial Labs in 2021 with the intent of developing technology that helps tackle large problems pervasive in human-machine interactions on the battlefield.

Defense technology often just recreates problems it intends to solve, such as cognitive overload, manpower requirements, physical workload and training burdens, Ritholtz said at the demonstration.

Ritholtz worked extensively on the Stalker XE program and knew there was much to solve when it came to the ground control conundrum.

“People love the airplane, but people always said this ground control station software sucks, and we just didn’t have a better way of doing it at the time,” he said. “And that’s where Anura comes in.”

Dross said he was hired after spending his military career using Stalker XE and not sugar-coating its user experience during his interview. He had no idea at the time that Ritholtz had worked extensively on its development.

The company developed the software Anura, which allows a human controller to simply speak to the drone to get it to do what they want.

“Some of the challenges that exist right now with the current set of interfaces for drones, for all kinds of robots … even using one robot, is overloading,” Ritholtz said. “We’re focused on new machine collaboration, two pieces that are really core to our mission, it’s easy and natural, and our focus … is about making it more human, the interaction more human.”

Anura’s key feature is the use of natural language.

“In the past there’s been work done trying to apply voice to these systems, oftentimes they are voice commands and they are really memorizing keywords, memorizing phrases. That’s doomed to fail, that will never work,” Ritholtz said.

The system is “truly natural language. You’re talking to it, you can say things in lots of different ways to express the intent,” he said. “There’s an element of feedback loop from the systems in Anura. That’s the battle space awareness … You can talk about things that emerge in real time.”

For example, when a controller talks to a robot, Anura understands the intent, breaks it down into a sequence of instructions that it feeds to the robot, measures the status of those instructions and can make changes on the fly, Ritholtz explained. “That’s our autonomy under the hood. Our autonomy is essentially a wrap around other people’s autonomy.”

The software is designed to be able to work on any platform or system. At the demonstration, Anura was integrated onboard both a small Skydio quadcopter and one from Teal Drones — both competitors for the Short-Range Reconnaissance program in the U.S. Army.

“Wherever we’re needed to live, we’ll find a way to fit in,” Ritholtz said. “We don’t use any big black boxes. There aren’t any [large language models] being used here. We’re not calling out to, kind of, any open [artificial intelligence] servers. Everything is running locally and it’s because we own our pipeline.”

The company focused its original scope of the technology development on unmanned aerial systems within Army aviation and the Special Operations Command universe. But that work has expanded.

Primordial Labs has also worked with the Army’s program executive office for ground combat systems to experiment on a number of platforms, according to Ritholtz.

The company continues to find ways to work both with government customers and original equipment manufacturers to incorporate the capability. Primordial Labs currently has contracts with four program executive offices and five OEMs that include both air and ground platforms.

In addition to working with potential candidates for the Short-Range Reconnaissance program, Anura has also been integrated for demos on the micro-drone Black Hornet, which is the Army’s Soldier Borne Sensor system.

In 2025, the company is providing the Army’s “transformation in contact” brigades and divisions — designed to test technologically ready, innovative capabilities in operational environments — a minimum of 8,000 Anura licenses to support human-machine integrated formations experimentation.

Primordial Labs supported a major prime at Project Convergence in 2024 and is participating in the Army’s Expeditionary Warrior Experiment in April.

Across the board, no matter what program Anura is supporting, “there is one Anura,” Ritholtz emphasized. “Even when we are using Anura for different applications, it’s the same software. Every time we have a new application, we create a fork of Anura and then we figure, is this something that has legs?”

Every time Anura gets a new application, “it makes Anura better for everything,” he noted.

At the demo, the company also showcased how it is able to control multiple drones at once as a team, which is key to finding a solution for controlling large swarms of drones at once to flood the battlefield and overwhelm the enemy.

A swarm of a thousand drones can’t be controlled by a thousand people, and even if the ground control was whittled down to a few controllers, the “graphical user interfaces break down when you have lots and lots and lots of objects and you can’t build enough dropdown menus and buttons to communicate all the constraints you want to an intelligent system,” Ritholtz said.

“As these robots get more and more intelligent … it’s kind of what we call the bad coworker problem, where you have an intelligent being that joins the office, but they don’t know anything about how you operate,” he explained. “They don’t know anything about your culture, your workplace culture, and so they’re kind of annoying to work with. That’s how I believe the robots of the future are going to be unless we find a better way to interact with them.”

Jen Judson is an award-winning journalist covering land warfare for Defense News. She has also worked for Politico and Inside Defense. She holds a Master of Science degree in journalism from Boston University and a Bachelor of Arts degree from Kenyon College.
