Kris Osborn
ABERDEEN PROVING GROUND, MD—In the future, warfare is likely to involve a dangerous and unpredictable mixture of air, sea, land, space, and cyber operations, creating a complex, interwoven set of variables likely to confuse even the most elite commanders.
This anticipated “mix” is a key reason why futurists and weapons developers are working quickly to develop cutting-edge applications of artificial intelligence (AI), so that vast and seemingly incomprehensible pools of data can be gathered, organized, analyzed, and transmitted in real time to human decision-makers. In this respect, advanced algorithms can increasingly “bounce” incoming sensor and battlefield information off of a seemingly limitless database to draw comparisons, solve problems, and make critical, time-sensitive decisions for human commanders.
Many procedural tasks, such as finding moments of combat relevance amid hours of video feeds or surveillance data, can be performed exponentially faster by AI-enabled computers. At the same time, there are certainly many traits and abilities that are unique to human cognition. This apparent dichotomy is perhaps why the Pentagon is rapidly pursuing an integrated approach, combining human faculties with advanced AI-enabled computer algorithms.
Human-machine interface, manned-unmanned teaming, and AI-enabled machines all refer to a family of emerging technologies that will redefine the future of warfare and introduce new tactics and concepts of operation.
Just how can a mathematically oriented machine truly learn things? What about more subjective variables such as feeling, intuition, or certain elements of human decision-making faculties? Can a machine integrate a wide range of otherwise disconnected variables and analyze them in relation to one another?
Army Research Laboratory (ARL) scientists describe a drone without labeled data as one engaged in unsupervised learning, meaning that it may not be able to “know” or contextualize what it is looking at. In effect, the data itself needs to be tagged, labeled, and identified for the machine to be able to quickly integrate it into its database as a point of reference for comparison and analysis.
“If you want AI to learn the difference between cats and dogs, you have to show it images … but I also need to tell it which images are cats and which images are dogs, so I have to ‘tag’ that for the AI,” Dr. Nicholas Waytowich of the Army Research Laboratory told the National Interest in an interview.
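The distinction Waytowich describes can be sketched in a few lines of code. This is a minimal illustration, not anything drawn from ARL's actual software: the file names and tags below are hypothetical, and a real supervised-learning pipeline would attach these labels to image data before training a classifier.

```python
# Unsupervised setting: the machine sees only raw inputs, with no labels
# attached, so it cannot "know" which images show cats and which show dogs.
unlabeled_images = ["img_001.jpg", "img_002.jpg", "img_003.jpg"]  # hypothetical files

# Supervised setting: each image is "tagged" with its class, so a learning
# algorithm can associate image features with the correct label.
labeled_images = [
    ("img_001.jpg", "cat"),
    ("img_002.jpg", "dog"),
    ("img_003.jpg", "cat"),
]

def label_counts(dataset):
    """Count how many examples carry each tag -- a basic sanity check
    performed on a labeled dataset before training a classifier."""
    counts = {}
    for _image, label in dataset:
        counts[label] = counts.get(label, 0) + 1
    return counts

print(label_counts(labeled_images))  # prints {'cat': 2, 'dog': 1}
```

The point of the sketch is simply that the second dataset carries information the first does not: a human (or an earlier labeling pass) has told the machine what each image contains, which is what makes supervised learning possible.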