According to the Consortium for Information and Software Quality (CISQ), in 2020 the United States wasted $2.08 trillion on bad software and its effects—nearly 10 percent of U.S. gross domestic product! Applying that same roughly 10 percent rate to the naval services’ estimated fiscal year 2021 budget of about $207 billion suggests the Sea Services may lose some $20 billion in 2021 to the effects of bad software. If the Navy were wasting $20 billion per year in fuel, command dismissals, inspector general investigations, and Congressional inquiries would follow.
The Navy’s response to this crisis has been to embrace and implement DevSecOps (development, security, and operations). In a nutshell:
DevSecOps is an organizational software engineering culture and practice that aims at unifying software development (Dev), security (Sec) and operations (Ops). The main characteristic of DevSecOps is to automate, monitor, and apply security at all phases of the software lifecycle: plan, develop, build, test, release, deliver, deploy, operate, and monitor. In DevSecOps, testing and security are shifted to the left through automated unit, functional, integration, and security testing—this is a key DevSecOps differentiator since security and functional capabilities are tested and built simultaneously.
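To make “shifting left” concrete, the sketch below shows the idea in miniature: functional and security-minded checks live in the same automated test suite and run on every change, long before delivery. It is only an illustration; the flight-planning function and its behavior are hypothetical.

```python
# Minimal illustration of "shift-left" testing: functional and
# security-oriented checks run together in one automated suite.
# The parse_waypoint function is a hypothetical example.
import unittest


def parse_waypoint(text: str) -> tuple[float, float]:
    """Parse 'lat,lon' operator input, rejecting malformed or hostile data."""
    parts = text.split(",")
    if len(parts) != 2:
        raise ValueError("expected 'lat,lon'")
    lat, lon = (float(p) for p in parts)  # non-numeric input raises ValueError
    if not (-90.0 <= lat <= 90.0 and -180.0 <= lon <= 180.0):
        raise ValueError("coordinates out of range")
    return lat, lon


class WaypointTests(unittest.TestCase):
    def test_valid_waypoint(self):                 # functional behavior
        self.assertEqual(parse_waypoint("36.85,-76.30"), (36.85, -76.30))

    def test_rejects_injection_like_input(self):   # security-minded check
        with self.assertRaises(ValueError):
            parse_waypoint("36.85,-76.30; DROP TABLE tracks")

    def test_rejects_out_of_range_values(self):    # input validation
        with self.assertRaises(ValueError):
            parse_waypoint("999,999")


if __name__ == "__main__":
    unittest.main()
```

In a DevSecOps pipeline, a suite like this would typically run alongside automated static analysis and dependency scanning on every commit, so security findings surface during development rather than after delivery.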
DevSecOps leverages the agile software development process, which excels at rapidly delivering minimum viable software products to users. While laudable in its intentions, DevSecOps will not solve the Navy’s problem. Within its constraints, DevSecOps is fundamentally sound—experimental programs such as the Air Force’s Kessel Run project have shown it can work, delivering increased capabilities to operators faster and more securely. But DevSecOps does not—cannot—fix fundamentally bad software. The “SecOps” stages can succeed only as well as the initial development work that feeds them.
“Agile” accelerates capability delivery and lowers risk. Instead of building a complete system and then deploying it, Agile delivers increments that provide initial capabilities and then grow them over time, so each increment carries less risk. Over time, though, all those increments add up to a big system with the same problems as current legacy software. Indeed, some argue that Agile places a premium on speed at the cost of good documentation, which can leave the entire system resting on a poor basic framework and compound the longer-term repercussions. In the old coder’s formulation: garbage in, garbage out. But new software development tools, like those that have revolutionized manufacturing, are changing how software is written, and they promise to bring the full power of DevSecOps to bear on the Navy’s code.
How We Got Here
Fundamentally, software is developed using a Middle Ages paradigm: the guild concept. New coders, as apprentices, work to gain journeyman status, and a few eventually become masters. They belong to a guild (there are other names, too) that maintains standards, provides tools, and insists on certain good practices. But, as in the Middle Ages, the guilds hold tremendous sway over practitioners, since belonging to a good guild is a competitive advantage. Many would argue that current software guilds are less devoted to craft than their medieval forebears, and the craftsman approach has driven software engineering into several long-term problems.
Software guilds are like tribes. In a perfect tribe, tribal wisdom is documented, proven, and passed down over time. It explains what, how, and why, and good practices beget more good practices. The reality, though, is that overextended software tribes tend to pass down tribal mythology: information not grounded in good practices.
In tribal mythology, the language often heard is: “Because that’s how I was taught,” “I’m not really sure, but Joe says that’s how it is done,” and “Jane could have told you, but she retired last year, so just copy what’s there.” If you doubt this, find your nearest coder and ask him or her to verify these comments.
Tribal mythology is a substantial contributor to this bad-software pandemic, and a major factor is poor software documentation, a problem as old as the first codebases. As one book puts it, “Agile techniques focus on working code, not documentation.” But: “Agile systems have a funny way of becoming legacy systems . . . [and therefore] face the same challenges as legacy systems: developers leave, documentation rots, knowledge is lost.”
When coding guilds lean on mythology and poor documentation, DevSecOps simply accelerates the delivery of bad software. Worse, CISQ has documented a direct correlation between software quality and security, so more bad software also begets more cyber vulnerabilities.
Cross-Appropriation Saves the Day?
Into the late 20th century, mechanical and aeronautical engineering used similar guild-like approaches. Engineers designed entire machines or airplanes, created intricate drawings, and enlisted scores of draftsmen. Drawings were delivered to machine shops, where grizzled machinists reviewed the designs and frequently rejected them as unrealistic. The dance between engineers and machinists might last weeks or months. Eventually, systems got built and new airplanes were manufactured. For machines, a design was relatively easy to test: turn it on, and, if it broke, figure out why. Daring test pilots became famous—if they lived long enough.
These engineers abandoned that medieval culture a generation ago, switching to computer-aided design (CAD) and computer-aided manufacturing (CAM) to realize their designs. CAD/CAM software environments include all the design rules, physics-based constraints, and mechanical and aeronautical engineering standards, and these limits are continually updated. A completed conceptual design is inserted into a virtual test harness or a virtual wind tunnel. Every conceivable operating condition is simulated; problems are identified and fixed. Eventually, the engineers are happy, and the drawings are passed to a machine shop now consisting of just a few computer operators, who ensure the robot machinists are set up correctly to build the parts. Everything fits together with microscopic precision.
Consider Boeing’s 777 airliner, for example. The “Triple-7” was the first commercial airliner designed entirely on computers (using the CATIA system). From first test flight to FAA certification and its first revenue flight was a mere 14 months. Everyone already knew how it would be assembled, because of the extensive simulation evaluations.
But a different example shows the contrast between the Middle Ages process and a modern approach. In 1996, the Department of Defense funded two Joint Strike Fighter prototypes. Both flew in 2000, less than four years after receipt of funds. Each was among the most complex aircraft ever built—an amazing accomplishment that owed everything to the CAD/CAM revolution. But it took 14 more years for the winner to reach initial operational capability—because of the aircraft’s software—and many aviation experts contend the airplane was not ready even then. The guild-based approach to software took 3.5 times longer than the CAD/CAM approach to aircraft design and production.
CAD/CAM, But for Software
Many software engineers recognized their community’s shortcomings and developed a way to bring coding out of the Middle Ages. Model-based systems engineering (MBSE) applies CAD/CAM-like practices to software development.
In MBSE, the software engineer creates a conceptual design model that is formally provable—that is, the logic of the software can be verified mathematically. Evaluating the conceptual design model is then as simple as connecting it to a virtual operating environment that includes cyber threats. When the engineers are satisfied that the model passes the test-and-evaluation regime, it is a simple matter to hand the model to specialized transform engines, which automatically generate code for any user-desired combination of hardware and operating system. This transforms the practice of software development from an ad hoc tribal coding culture into a 21st-century model-driven science.
Building the conceptual model spins off many benefits. First, the modeling tools provide thorough documentation automatically. Second, the tools can likewise automate the generation of required paperwork, such as Department of Defense Architecture Framework diagrams. But the largest benefit may be this: to adjust or add capabilities, or to respond to a new cyber threat, developers merely change the conceptual model and reevaluate it in the virtual operating environment. Updates, patches, or new capabilities can therefore be realized in hours or days. With code developed this way from the ground up, DevSecOps can become a powerful process for rapid capability delivery to the fleet.
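The transform-engine idea can be sketched in miniature. The toy Python below is not SysML or any real MBSE tool; it merely shows the pattern described above, under the assumption of a drastically simplified “model” of a few invented message definitions: the single model drives both code generation and documentation, so changing the model regenerates both.

```python
# Toy illustration of model-driven generation: one conceptual "model"
# (a few message definitions) drives both code and documentation.
# This is not SysML tooling; names and fields are invented for illustration.
from dataclasses import dataclass


@dataclass
class Field:
    name: str
    type: str          # target-language type name
    doc: str


@dataclass
class Message:
    name: str
    fields: list[Field]


# The "conceptual model": change it here and regenerate everything below.
MODEL = [
    Message("TrackReport", [
        Field("track_id", "int", "Unique identifier for the track"),
        Field("latitude", "float", "Degrees, WGS-84"),
        Field("longitude", "float", "Degrees, WGS-84"),
    ]),
]


def generate_code(model: list[Message]) -> str:
    """Emit Python dataclasses from the model (a stand-in transform engine)."""
    lines = ["from dataclasses import dataclass", ""]
    for msg in model:
        lines.append("@dataclass")
        lines.append(f"class {msg.name}:")
        lines += [f"    {f.name}: {f.type}  # {f.doc}" for f in msg.fields]
        lines.append("")
    return "\n".join(lines)


def generate_docs(model: list[Message]) -> str:
    """Emit human-readable documentation from the same model."""
    lines = []
    for msg in model:
        lines.append(f"Message {msg.name}:")
        lines += [f"  - {f.name} ({f.type}): {f.doc}" for f in msg.fields]
    return "\n".join(lines)


if __name__ == "__main__":
    print(generate_code(MODEL))
    print(generate_docs(MODEL))
```

Real transform engines work from far richer, formally checked models and can target multiple languages and platforms, but the division of labor is the same: engineers maintain the model, and the code and the paperwork follow from it.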
Further, legacy software development projects often reinvent a large proportion of their functionality, whereas in traditional manufacturing reuse is much more the norm, often enabled by base platforms shared across models. (For instance, the Lexus RX350, a very successful SUV, is built on the same basic platform as the Toyota Camry.)
The original MBSE language, Systems Modeling Language (SysML) 1.0, worked, but it was difficult to build a logically complete and formal model that lent itself to automated code generation. SysML 2.0, in development for some time and now close to release, is designed to fix these issues. Building a base software platform will be much easier with SysML 2.0 and its supporting model libraries. SysML 2.0 also offers morphological analysis of integrated models, which helps ensure that integrated systems will work properly, without vulnerabilities.
The Navy is coming around on MBSE as the software engineering path forward. It formed a Digital Systems Engineering Transformation Working Group (DSET) in 2019 to embrace these practices. Navy engineers have a long way to go to take full advantage, but work is underway. Many thousands of Navy engineers have received initial training in MBSE and SysML 1.0, and the Navy has begun to insist on MBSE techniques in contracts with third-party developers.
The Promise of MBSE
Several important outcomes result from employing conceptual designs. A conceptual design lasts forever: if a new software language is needed to modernize the code base, it is a simple matter to use the conceptual design to generate modified or modernized code. Today, legacy code is often difficult or impossible to modernize when it is written in an obsolete language or its documentation is poor; often, the entire codebase must be completely reinvented—remember the “Y2K bug”? For new programs, CISQ has developed tools that double-check conceptual designs for quality and security, so there is considerable confidence that the conceptual model will generate the highest-quality code possible.
And there is hope for older code. A set of tools built to standards from the Object Management Group (OMG), the standards body behind MBSE languages such as UML and SysML, enables modelers and engineers to reverse-engineer legacy code into conceptual models. Once evaluated for quality, security, and the formality discussed above, the legacy-code-turned-model can generate a modern, secure codebase for the legacy functions. This allows program managers to leapfrog the older codebase, which often carries accumulated inefficiencies, vulnerabilities, and code cul-de-sacs.
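Real modernization tooling recovers much richer models across many languages, but the underlying idea of recovering structure from the code itself, rather than from lost documentation, can be sketched with Python’s standard-library ast module. The legacy snippet below is invented for illustration.

```python
# Crude sketch of reverse-engineering structure from legacy source code:
# list each function and the functions it calls, using Python's ast module.
# Real modernization tools recover far richer models across many languages.
import ast


def extract_model(source: str) -> dict[str, list[str]]:
    """Map each function name to the names of the functions it calls."""
    tree = ast.parse(source)
    model: dict[str, list[str]] = {}
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            model[node.name] = [
                n.func.id
                for n in ast.walk(node)
                if isinstance(n, ast.Call) and isinstance(n.func, ast.Name)
            ]
    return model


if __name__ == "__main__":
    legacy_source = """
def validate(msg):
    return len(msg) > 0

def process(msg):
    if validate(msg):
        archive(msg)

def archive(msg):
    pass
"""
    for func, calls in extract_model(legacy_source).items():
        print(f"{func} -> {calls}")   # e.g., process -> ['validate', 'archive']
```

A recovered model like this can then be inspected, checked, and used as the starting point for regeneration, which is the leapfrog described above.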
Next Steps
How does the Navy initiate this software development transformation? Experts expect SysML 2.0 to be formal and stable by mid-2022; it has already been submitted to the Object Management Group as a new standard and is now being finalized. What follows is a minimum set of necessary transforming actions.
Increase support to DSET to explore SysML 2.0 adoption. Hire several formal MBSE masters from OMG to assist. Require the Naval Postgraduate School Systems Engineering Department to adopt SysML 2.0 and related tools, so students begin learning model-based best practices.
Place several of the Navy’s best and brightest modelers on the OMG SysML 2.0 finalizing team.
Identify industry performers who are leaders in formal MBSE methods and are planning to adopt SysML 2.0. Give them priority on contracts.
Ensure Defense Acquisition University updates its certifications to reflect this modern software engineering approach.
Require each systems command to begin a pilot software project using MBSE.
Most important, senior technical leaders must realize that the practice of developing software will change dramatically with the advent of SysML 2.0. Since code will be generated automatically, the naval services will need more broad-minded engineers able to translate operational requirements into formal conceptual models.
MBSE software development methods are not measurably faster at the outset, but they pay dividends every time there is a new security patch, software upgrade, user demand for a new capability, a reason to integrate with another system, or a newly discovered vulnerability. This aligns perfectly with the goals of DevSecOps.
Today’s Navy software practices, whether executed by Navy labs or contractors, cost the naval services billions of dollars annually. DevSecOps applied to the code those labs and contractors write only improves the speed with which more problem-ridden, stove-piped code is delivered; it cannot solve the problem of poor initial coding. By itself, it is the last thing the naval services need. As a 2020 CISQ report puts it:
The issues of poor-quality software and technical debt have been with us since the dawn of IT. It would appear we have reached a crisis point where we need to be much more serious about how we address these issues and we need to start thinking about how we make developers engineers.
Current software development practices are the province of Middle Ages thinking; they will fail no matter what adjustments are made at the margins. Only a full-scale transformation can help reclaim the $20 billion the Navy and Marine Corps will lose this year to bad software.