JON ASKONAS AND COLBY HOWARD
Are drones reliable? Would you bet your life on them? In a recent article, Jacquelyn Schneider and Julia Macdonald argued that “the troops don’t trust drones” to protect them in combat with close air support. To understand how the people who actually coordinate airstrikes feel, they interviewed and surveyed Joint Terminal Attack Controllers (JTACs) and Joint Fires Observers (JFOs) about their thoughts on working with manned and unmanned aircraft. They found some measure of hesitation and distrust toward working with unmanned aircraft. In their conclusion, they argue that manned aircraft overhead inspire a “warm fuzzy” feeling of comfort and confidence in ground troops that unmanned platforms cannot provide. Ultimately, the authors recommend that “[p]olicymakers should reexamine their apparent commitment to an unmanned future.”
While we are happy to see scholars doing empirical research on unmanned aircraft and close air support, the evidence as Schneider and Macdonald have presented it paints a misleading picture of the relationship between drones, their pilots, and troops on the ground, and leads to faulty policy recommendations. Schneider and Macdonald’s article has some important empirical and logical flaws, and does not reflect our experience with close air support. Rather, we believe that in future combat, JTACs and JFOs will evaluate unmanned aircraft the same way they evaluate any other aircraft: according to the capabilities of the platform and the skill and reliability of the pilots who fly it.
Misleading Evidence
Schneider and Macdonald chose not to interview manned and unmanned pilots, arguing that they’d likely be biased about the merits of their own platforms. Yet the authors never address the potential biases of the JTACs and JFOs they interviewed. The most likely bias in their results is simple lack of familiarity with unmanned platforms in action, though the authors haven’t yet released their data. While firm numbers are hard to come by, there simply were not enough unmanned platforms available outside the special operations community for most JTACs and JFOs to have worked with them before about 2010, much less worked with them frequently. Numerous combat urgent needs statements submitted for armed unmanned aircraft reinforce this point. For example, the Marines still don’t have a single armed unmanned platform in their aviation element, so the service’s JTACs and JFOs do not usually have the opportunity to work with these platforms and their aircrews in training. Thus, the vast majority of JTACs and JFOs are likely to have had little to no experience working with unmanned close air support, and even less with recent updates that have improved performance even further.
Indeed, it was not clear that any of the soldiers the authors surveyed had substantial experience working with unmanned aircraft, though the article does not explicitly address this question. None of the interview subjects they quoted referred to actual experience with drones, and the article doesn’t specify how many of those surveyed had ever actually called in close air support from an unmanned platform.
Limited experience with unmanned platforms may explain why the personnel interviewed in the article had a skewed perception of unmanned pilots. Rather than being “coffee-drinking gamers whose distance from the battlefield severed their emotional connection to friendly ground troops,” as Schneider and Macdonald reported hearing during their research, drone pilots are deeply engaged in their missions, emotionally and physiologically. Of all their mission sets, there is none they are more passionate about than close air support, where they witness their comrades under attack through high-resolution video and audio. In our experience, JTACs and JFOs who work with drone pilots come to understand and trust the pilots and the platform, and certainly see the pilots as human beings (something Schneider and Macdonald say their interviewees had a hard time doing). Before they are anything else, JTACs and JFOs are professionals. While suspicion toward platforms they haven’t trained with or that haven’t performed on the battlefield is understandable, JTACs and JFOs who work with drones treat them as they do any other platform: with a mission-oriented professional ethic.
The article’s discussion of JTACs’ concerns about the role of pilot risk is also misleading. The authors write: “In total, 63 percent of our survey respondents preferred a manned aircraft in scenarios that featured a high risk to both aircrew and ground troops.” Of course, it is exceedingly unlikely that any of the JTACs interviewed were ever themselves in such a scenario, given that the insurgencies in Iraq and Afghanistan have not managed to shoot down a single fixed-wing close air support platform (helicopters are sadly another story). They go on to quote a JTAC saying, “I’ve called in air [support] before from the ground. If the [aircraft] are afraid, they won’t come. If they think they can do it, they will do everything they can to help.” These findings are incongruous: respondents prefer manned aircraft in the very scenarios where, by the quoted JTAC’s own account, a pilot’s fear might keep help from coming. It is precisely because unmanned vehicles are remotely piloted that their pilots are able to ignore danger to the airframe in support of troops on the ground. The notion that pilots who are at risk themselves will perform the close air support mission better contravenes what empirical evidence exists.
The authors also suggest that the JTACs and JFOs they spoke to were concerned about friendly fire from unmanned platforms. While the comparison is inexact due to the greater number of manned platforms, far more troops have died by friendly fire from manned than from unmanned aircraft. In Iraq and Afghanistan, there is only one documented incident of friendly fire on U.S. troops from an unmanned platform, compared to dozens by manned platforms. A lower incidence of friendly fire makes sense: One of the chief virtues of unmanned platforms is a long loiter time, allowing crews to develop a situational awareness of the battlefield that is more difficult to achieve with fast-moving manned platforms like the F-16 or A-10. Moreover, remote piloting technology enables additional checks on a potential strike, reducing the risk of friendly fire. In the single confirmed U.S. friendly fire incident involving a drone over the past 16 years, no fewer than four groups of people were involved in signing off on the strike. Strikes involving only troops on the ground and pilots in the air are at higher, not lower, risk for friendly fire, due to miscommunication, misunderstanding, or technical error. The tragedies in Nasiriyah, Iraq, in 2003 and Zabul, Afghanistan, in 2014 are painful reminders of this fact.
Schneider and Macdonald do a good job trying to tease apart concerns about the engineering limitations of existing unmanned platforms from unmanned platforms as they might be. But the hypothetical scenario they proposed to their interviewees seemed designed more to touch on an emotional attachment to an existing platform (the beloved A-10) than to help personnel think critically about manned and unmanned platforms: “When we asked JTACs in interviews if they would prefer support from ten remotely piloted A-10s or one manned A-10, they chose the latter.” If you had asked a cavalryman in 1913 whether he preferred the support of ten horseless carriages or one horse, he might have said the latter too. That speaks more to trust in a proven platform and the difficulty of imagining future warfare than to any enduring concern about the underlying technology. Transformative technological military change is often accompanied by a lack of trust in new platforms, a preference for the familiar over the alien.
This resistance isn’t necessarily irrational; it takes time for a military service to figure out the tactics and organization that work. Senior drone pilots we have spoken with have seen unmanned aircraft go from missiles slapped onto a reconnaissance platform to sophisticated weapons systems employing advanced tactics to change the course of major engagements. One pilot we spoke with suggested that while early platforms did have frustrating limitations for close air support, drones — both the platforms and the pilots who fly them — had grown up in combat. “The 2007 Predator is as different from the 2017 Reaper,” he suggested, “as the Shooting Star [the first Air Force jet fighter] is from an F-16.” Future close air support-oriented unmanned platforms will have improved loiter times and durability, married to communications technologies that generate higher situational awareness and more confidence from the troops on the ground. As the authors themselves acknowledge, the differences in performance between unmanned and manned platforms are primarily an artifact of the missions they were engineered for. The scenario the authors proposed to the JTACs and JFOs misses the point: In the future, a JTAC won’t be able to tell whether the pilot he is working with is flying from overhead or from a base far away.
Throughout the piece, Schneider and Macdonald simplistically identify unmanned platforms with machines and manned platforms with humans, as if human piloting and machine capabilities weren’t equally involved in both. For instance, the authors doubt that “a remotely operated machine can make the same gut decisions that a human would make.” They discount the fact that there is a human being making those gut decisions. Far from being isolated from the battlefield, drone pilots experience rates of PTSD and depression equal to or greater than those of manned pilots. While downplaying the risks drone pilots face relative to older ways of warfare is nothing new, it is surprising and disappointing to see two scholars make the same mistake. Moreover, the latest research anticipates that drone pilots in future warfare (at least against a peer competitor) will even share combat risk as aviators fly unmanned platforms from airborne “motherships” and bases in theater. With military theorists increasingly looking to human-computer teaming and swarms of diverse manned and unmanned platforms to perform battlefield missions including close air support, it isn’t clear why the authors fixate on the waning paradigm of the self-contained, manned aircraft.
Misleading Conclusions
Schneider and Macdonald write that “in domains where humans are in direct physical contact with the enemy… troops will be reluctant to delegate decisions to machines. Instead, they will want to work with humans they can trust.” This conclusion is misleading on several points. For one thing, so-called unmanned systems are now, and will for the foreseeable future be, piloted by human aviators who are fully engaged with the mission. The article also reflects unjustified stereotypes about the American service members who fly unmanned platforms, stereotypes that hurt the mission.
Even more dangerously, Schneider and Macdonald’s recommendations entail pilots risking their lives solely to inspire confidence in troops on the ground. Using a manned system where an unmanned one would suffice results in unnecessary pilot deaths, from both airframe malfunction and combat risk, and it ties the hands of commanders. There will always be some situations where the need for a “man on the spot with a gun” is unavoidable. But it is unethical to ask pilots to risk their lives unless doing so actually reduces U.S. casualties or advances the mission, rather than just inspiring warm feelings.
Finally, there is an assumption throughout the article that the trust problem stems from unmanned platforms or the unmanned aviation community, rather than from inexperience with unmanned close air support and stereotypes about drones. The “warm fuzzy” Schneider and Macdonald discuss is not an emotion but a well-trained gut instinct for assessing the competence and capabilities of platforms and pilots often known only through a radio link. Platforms, manned or unmanned, and their pilots are judged by how well they perform the mission, and nothing else. The authors’ evidence does not prove that manned platforms are inherently preferable; it suggests instead that JTAC and JFO training should be modified to give additional exposure to and comfort with unmanned platforms.
There is no denying that Schneider and Macdonald have identified a possible trust problem in the JTAC community. But they are offering a hazardous solution (risking the lives of more American pilots) to a fleeting problem. In the future, increased familiarity with drones and adapted training of JTACs and JFOs will alleviate trust issues, as unmanned platforms more tailored to close air support come online, communications technologies improve, and JTACs and JFOs get more exposure to drone pilots’ passion for the mission and results on the battlefield. Already, JTACs and drone pilots working together in the special operations community have developed those bonds. We believe the further spread of unmanned aircraft will carry this cultural change into the rest of the force. Indeed, the critical role that unmanned close air support has played in actions against ISIL in Syria, Iraq, and Libya suggests this is already occurring. Like other War on the Rocks authors, we are optimistic that JTACs and JFOs in the future will count “unmanned” aviators among the group of “humans they can trust.”