Robert Spalding
Joint all-domain command and control, or JADC2, is the military’s latest buzzy acronym and one of the stated priorities for the Defense Department. Yet it is a concept without a clear solution and is likely to languish without the infrastructure to ensure its success.
One of the biggest challenges for JADC2 is interoperability. Today, just in nuclear command and control, which is a subset of JADC2, there are more than 100 programs underway. Since these programs often involve technology that was not built to be interoperable, bridging information is a challenge. Data becomes trapped and unable to be used quickly to build a comprehensive picture for decision makers.
The question becomes, why not use the cloud? There, the focus is on data transport. If we could just get the data to the cloud, then we could solve many of the problems. But then that raises the question, which cloud? And inevitably, how do I move my data there?
There are many different clouds with different architectures, which makes managing information across disparate platforms difficult. And there is no easy way to onboard cloud service providers. The department attempted to do so through a competition called the Joint Enterprise Defense Infrastructure initiative. Doomed from the start, the program envisioned a winner-take-all cloud service procurement scenario. If that is not a recipe for stagnant innovation, I’m not sure what is.
So, how does the military move data? There are numerous transport networks that are designed to carry the bits and bytes of JADC2 back to some all-knowing cloud — Starlink, OneWeb, Hughes Network Systems, Lockheed Martin, Boeing, Iridium and on and on. Which will win?
Probably all in some fashion since the pie is so big. And that’s just the satellite providers. Don’t forget the telecom service providers and fiber-optic companies, which also play a part in this orchestra.
What is consistent about these technologies? They represent separate networks that need a gateway and a location to on-ramp and off-ramp data between and among them.
The answer the tech industry seems to be offering is the cloud. The cloud, however, is not ubiquitous. The data centers that can be the gateways for moving, transferring and processing data are often large and centralized. This presents an incredibly juicy target for would-be attackers.
Even if they formed a single network, we still wouldn’t have the bandwidth to move the bulk of the data to the cloud. Often, by the time data does make it to the cloud, it is too late to enable the processing required to make the information gleaned actionable. Once again, warfighters are stuck with data islands.
Multiple independent networks and stuck data turn out to be two main challenges that the Defense Department needs to solve. How is it possible to harness the power of the commercial sector to solve these problems? The answer lies in 5G and edge computing.
5G is best known for providing the radio access technology in the latest generation of cellular networks, but it is far more than that. 5G also represents the implementation of virtual network functions, software-defined networking and transport gateways for the already interoperable wireless networks.
The industry standards body, the 3rd Generation Partnership Project, or 3GPP, has taken on this challenge. A recent release envisions a set of standards for interoperability that combines computing, networking and storage across the different layers of commercial communication. It covers radio-access network standards, including network-to-network links. Properly configured, a 5G stand-alone network anywhere in the world should be able to hand off a slice of the network to a device according to the resource needs of the customer.
At the heart of 5G is edge computing. Without edge computing, trapped data will become a problem for commercial cellular networks as the devices attached to them multiply.
Autonomous systems and the artificial intelligence algorithms that seek to make sense of data at the edge will require computing at the edge.
There will not be enough bandwidth to transport the data to centralized clouds for processing. These edge nodes will bring the power of applications to the data. Some of these applications will create the opportunity for wireless gateways at the edge that bring fiber, telecom and satellite together into one seamless network. This network can be reconfigured on the fly, for any application, because it is entirely constructed in software.
It seems logical that embracing commercial 5G and edge computing standards for JADC2 would be a no-brainer. The problem is that 3GPP is not built for the battlefield.
Commercial networks are built to commercial standards. As a former B-2 pilot, I see commercial networks as an easy target, and judging from the bombing two years ago of a Nashville AT&T communications hub, you don’t even need a plane to take one down.
How do you build a 21st century secure, hardened, hyper-converged network? First, it would be hardened against electromagnetic pulse. Russia, China and North Korea — and soon perhaps Iran — have nuclear weapons. Depending on the altitude of the burst, a single nuclear weapon detonated in the upper atmosphere above the United States could take out the grid, communications networks and data centers.
Second, the network would be resistant to physical tampering and ensure the provenance and security of both the hardware and software.
Third, the personnel who work on this network would be vetted, and more than one would be required to do any work of a critical nature on the hardware or software.
Finally, the network would be configured so that classified systems could operate securely on the infrastructure alongside commercial applications.