Over the past two years, the U.S. government has identified and developed initial strategies around the technologies it has determined to be most important to national security, particularly with respect to U.S.-China competition. Last year, the Under Secretary of Defense for Research and Engineering (USD(R&E)) published a technology vision identifying fourteen critical technology areas, and the White House’s National Science and Technology Council released [PDF] a list of nineteen “critical and emerging technologies.”
Most of the areas on these official critical technology lists reflect well-publicized topics, such as supply chain challenges for semiconductors, the development of new biotechnologies (including vaccines), and the risks and opportunities presented by breakthroughs in artificial intelligence. However, some [PDF] of these technologies, such as “advanced manufacturing,” “advanced gas turbine engine technologies,” and “human-machine interfaces,” are less well known outside the scientific community.
Identifying the most important technologies for national security and U.S.-China competition, especially those that don’t make headlines, currently requires an enormous amount of manual, time-consuming work, such as collecting, reading, and categorizing academic publications. This process tends to be top-down, with senior decisionmakers assigning topics for investigation to their staff. It can also be reactive and subject to human bias, with certain technologies prioritized only after high-profile events, such as a global pandemic, bring them to attention. Moreover, once experts identify critical technologies, determining the entities involved and the United States’ relative standing in these fields, an exercise known as net assessment, requires even more effort and can take months to years.
The U.S. government has a history of studying technological emergence and funding net assessments to find the next major technologies and understand how the United States compares to its adversaries in technological readiness. The Office of the Director of National Intelligence’s National Intelligence Council releases its “Global Trends” report every four years, which forecasts future technology trends, and the Department of Defense has a dedicated Office of Net Assessment. Federally funded research and development centers (FFRDCs) and non-profits also support both activities, and quantitative investment firms in the private sector similarly seek to identify emerging technologies and the entities commercializing them.
The U.S. government should align resources across the public, non-profit, and for-profit sectors by investing in the development of automated capabilities that leverage publicly available data for emerging technology detection and net assessment. Publicly available data is data that can be obtained by anyone, whether freely online or from data vendors [PDF] for a fee. This can include patent, business intelligence, supply chain, and other types of data. Applying sophisticated analytic methodologies to publicly available data would enable the U.S. government to prioritize the next generation of critical technologies more preemptively and organically, and to evaluate how the United States compares to China in these areas.
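To make this concrete, consider a deliberately minimal sketch of one such analytic methodology: ranking research topics by growth in publication counts over time, a simple signal that more sophisticated emerging technology detection methods build on. This is purely illustrative and not a description of any actual government system; the records, topic labels, and `topic_growth` function below are all hypothetical.

```python
from collections import defaultdict

# Hypothetical publication records as (year, topic) pairs, standing in for
# metadata extracted from publicly available paper or patent datasets.
records = [
    (2019, "alternative proteins"), (2020, "alternative proteins"),
    (2021, "alternative proteins"), (2021, "alternative proteins"),
    (2019, "gas turbine engines"), (2020, "gas turbine engines"),
    (2021, "gas turbine engines"),
]

def topic_growth(records, base_year, recent_year):
    """Rank topics by smoothed growth in publication counts between two years."""
    counts = defaultdict(lambda: defaultdict(int))
    for year, topic in records:
        counts[topic][year] += 1
    scores = {
        topic: (by_year.get(recent_year, 0) + 1) / (by_year.get(base_year, 0) + 1)
        for topic, by_year in counts.items()
    }
    # Topics with the sharpest upticks float to the top as candidate leads.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

for topic, score in topic_growth(records, base_year=2019, recent_year=2021):
    print(f"{topic}: growth score {score:.2f}")
```

In a real system, the counts would come from large-scale publication and patent metadata, and growth scores would be only one of many features surfaced for analyst review rather than a final answer.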
It would also enable policymakers, strategists, and funding agencies to identify American and allied innovators in emergent critical technology areas earlier and faster. Instead of senior leaders tasking their staff to investigate specific technologies in the classic top-down approach, insights on emerging technologies in U.S.-China competition would be delivered to agencies as algorithmically generated leads. Such capabilities would also enable the U.S. government to identify gaps where China is leading or lagging in discovered critical technology areas, and fields where innovation and technological maturity are comparable, as sketched below.
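To illustrate the net assessment side, the following hypothetical sketch computes each country’s share of recorded activity within a technology area from affiliation-style metadata, surfacing fields where one country leads and fields where the two are comparable. All records, labels, and the `country_shares` function are invented for illustration.

```python
from collections import Counter, defaultdict

# Hypothetical records as (technology_area, country) pairs, standing in for
# author or assignee affiliations in publicly available publication data.
records = [
    ("human-machine interfaces", "US"), ("human-machine interfaces", "CN"),
    ("human-machine interfaces", "CN"), ("advanced manufacturing", "US"),
    ("advanced manufacturing", "US"), ("advanced manufacturing", "CN"),
]

def country_shares(records):
    """Compute each country's share of recorded activity per technology area."""
    by_area = defaultdict(Counter)
    for area, country in records:
        by_area[area][country] += 1
    return {
        area: {c: n / sum(counter.values()) for c, n in counter.items()}
        for area, counter in by_area.items()
    }

# A simple gap metric: positive values suggest a U.S. lead, negative values a
# Chinese lead, and values near zero suggest comparable levels of activity.
for area, shares in country_shares(records).items():
    gap = shares.get("US", 0.0) - shares.get("CN", 0.0)
    print(f"{area}: US-CN activity gap {gap:+.2f}")
```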
A bottom-up approach leveraging publicly available data and automation would be less susceptible to hype and to human bias toward technologies popularized in the media. It would instead enable the discovery of advances that could otherwise go unnoticed by the broader defense community, such as breakthroughs in alternative protein development, which would be essential in a foreseeable future in which global fisheries are depleted.
Results from these automated processes would also be delivered faster, freeing government analysts and subject matter experts to spend more time reviewing and analyzing potential findings. This is preferable to current net assessment and emerging technology detection processes, which require countless hours of manually reviewing technical data from disparate sources.
The U.S. government should fund and oversee the development of data-driven, automated capabilities for net assessment and emerging technology detection. Because multiple agencies would benefit from this capability, a funding model incorporating all government stakeholders should be developed so that costs are shared fairly. Data vendors, quantitative investment firms, and other private sector partners could supply data and inform the technical approaches adopted to build the product.
A capability for data-driven, automated emerging technology detection and net assessment would empower policymakers, funding agencies, and other decisionmakers to prioritize cultivating and protecting emergent critical technologies, and the innovators behind them, earlier and faster than is currently possible. This would enable decisionmakers to be less reactive in U.S.-China technology competition and to avoid overlooking less publicized technologies that quietly become crucial to national security.
Connor Fairman is a systems engineer at the MITRE Corporation. The author’s affiliation with The MITRE Corporation is provided for identification purposes only and is not intended to convey or imply MITRE’s concurrence with, or support for, the positions, opinions, or viewpoints expressed by the author.