Scene: A conference room on the 5th floor. Mark Antony rises and moves to the front where a PowerPoint slide displayed on the video wall announces that the F-35 Joint Strike Fighter program has been cancelled.
Friends, Americans, Congressmen, lend me your ears.
I come to bury the Joint Strike Fighter, not to praise it.
The problems complex defense programs face live on in committee reports;
The good is oft interred in appendix footnotes . . .
I speak not to disprove what Critics speak,
But here I am to speak what I do know . . .
The assassination of Julius Caesar, and Antony’s funeral oration, make for a compelling story in their own right. Shakespeare’s telling, however, also offered a warning to Elizabethan audiences at a time of political conflict and uncertainty. Caesar’s assassins acted on professed motives of saving the Roman Republic, but their conspiracy triggered a civil war that destroyed the republic and brought about the very imperial rule they feared.
Antony’s speech convinced the Roman populace that Caesar’s actions were more complex than mere personal ambition. So, too, issues around the F-35 Joint Strike Fighter and other defense acquisitions are more complex than cost overruns and claims about capability shortfalls.
What’s at stake is a much deeper issue. How can the U.S. maintain military effectiveness at a time of rapid technological change and the disruption it brings?
We can look back 35 years to another time of rapid tech change to understand why questions around the F-35 program need to be seen in larger context.
On What, And How, To Standardize?
In the mid-1980s, the Defense Department asked an industry group for input on a pressing acquisition issue. Electronics in the form of computer microchips, ubiquitous today, were rapidly evolving. The Cold War was active, the Strategic Defense Initiative sought to anticipate major new capabilities on the part of the Soviet Union and its allies, and existing military planes were quickly becoming obsolete.
During a time of brisk change in computing electronics, the U.S. Air Force asked: in what ways should it standardize the development of avionics, flight control, and similar embedded hardware/software systems on military planes? These systems had to meet high performance standards, including time-sensitive, reliable responses to sensor and other data. What was the best tradeoff between the latest chip technologies and the ability of defense contractors to deliver needed system capabilities on time, within cost, and with reliable performance?
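That "time-sensitive, reliable responses" requirement is what engineers today call a hard real-time constraint: a correct answer that arrives after its deadline counts as a failure. A minimal sketch of that idea, in Python for readability (real avionics code would be in a language like Ada or assembly), with all names and numbers invented for illustration:

```python
# Hypothetical sketch of hard real-time deadline accounting.
# A response that arrives after its deadline is a failure, no matter
# how correct the computed value is.

def check_deadlines(events, deadline_ms):
    """events: list of (sensor_time_ms, response_time_ms) pairs.
    Returns per-event latencies and the number of deadline misses."""
    latencies = [resp - sense for sense, resp in events]
    misses = sum(1 for lat in latencies if lat > deadline_ms)
    return latencies, misses

# Simulated log: three sensor readings and when the system responded.
log = [(0, 4), (20, 23), (40, 52)]   # third response took 12 ms
latencies, misses = check_deadlines(log, deadline_ms=10)
print(latencies, misses)  # [4, 3, 12] -> one missed deadline
```

The point of the standardization debate was that every layer added between program and hardware makes those latencies harder to bound.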
The small company where I was technical director at the time provided defense contractors with the highly optimized software development tools they needed to design, implement, and test embedded software that in turn had to meet those rigorous reliability and performance requirements. The Air Force had previously standardized on a particular 16-bit CPU chip instruction set. The defense contractors were free to implement their own chip designs for avionics and flight control, so long as their chips responded to a standard vocabulary of low-level, hardware-oriented computational instructions. Our software development tools translated their code into the detailed hardware instructions that made their avionics and flight control systems function.
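The translation step those tools performed can be pictured as a tiny code generator: high-level arithmetic in, a fixed vocabulary of machine instructions out. The three-instruction vocabulary below is invented for illustration and is vastly simpler than any real military instruction set:

```python
# Toy code generator: translate nested arithmetic expressions into a
# hypothetical stack-machine vocabulary (LOAD, ADD, MUL).
# Real tools targeted a standardized 16-bit military instruction set.

def compile_expr(expr):
    """expr: a variable name, or an ('add'|'mul', left, right) tuple.
    Emits a flat list of stack-machine instructions."""
    if isinstance(expr, str):
        return [("LOAD", expr)]
    op, left, right = expr
    # Evaluate both operands onto the stack, then combine them.
    return compile_expr(left) + compile_expr(right) + [(op.upper(),)]

# (a + b) * c
program = compile_expr(("mul", ("add", "a", "b"), "c"))
print(program)
# [('LOAD', 'a'), ('LOAD', 'b'), ('ADD',), ('LOAD', 'c'), ('MUL',)]
```

As long as every contractor's chip understood the same instruction vocabulary, one tool chain could serve them all; that was the value of standardizing at the instruction-set level.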
Was this, the Pentagon asked the industry group, the best way to standardize as chip technology rapidly changed? Should they relax standards and require only that programmers use a specific high-level language (JOVIAL, Ada), trusting that tool developers would translate that code into efficient, reliable executable programs for embedded systems with rigorous performance needs? Or should they go in the other direction, and standardize on a specific chip hardware design to be mandated for all embedded systems?
DARPA, the Pentagon’s research arm, proposed adoption of the then-new RISC (reduced instruction set computing) chips whose development it had funded. RISC chips can be programmed to mimic other instruction vocabularies and theoretically would allow new and old systems to work together. But that hardware mimicry introduced major uncertainties and time lags into systems that needed to respond rapidly to incoming missiles, evasive maneuvers, and other challenging events.
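The worry about mimicry can be made concrete. Emulating one instruction set on another means every "foreign" instruction passes through a decode-and-dispatch step before any work happens, so worst-case timing depends on the emulation layer, not just the hardware. A deliberately simplified sketch, with an invented two-instruction vocabulary:

```python
# Toy emulator: a host machine interpreting a foreign instruction set.
# Each foreign instruction costs a decode/dispatch step on top of the
# actual operation -- the overhead and timing uncertainty that worried
# the Air Force for flight-critical systems.

def emulate(program, registers):
    """program: list of ('MOV'|'ADD', dst, src) foreign instructions.
    Returns the updated registers and the count of dispatch steps."""
    dispatches = 0
    for op, dst, src in program:
        dispatches += 1            # decode/dispatch overhead, every time
        if op == "MOV":
            registers[dst] = registers[src]
        elif op == "ADD":
            registers[dst] = registers[dst] + registers[src]
        else:
            raise ValueError(f"unknown foreign opcode: {op}")
    return registers, dispatches

regs, steps = emulate([("MOV", "r1", "r0"), ("ADD", "r1", "r0")],
                      {"r0": 5, "r1": 0})
print(regs, steps)  # {'r0': 5, 'r1': 10} after 2 dispatch steps
```

On native hardware the dispatch cost disappears into silicon; in emulation it sits in the critical path of every response.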
Ultimately, the Air Force decided to adopt a 32-bit version of its existing instruction set for embedded systems. Chip technology was just too unstable to settle on something new that would require new tool sets, retraining programmers and engineers, new testing regimens, and all of the rest of the scaffolding that goes into system designs and upgrades.
Drowning In Tech-Driven Change
In addition to rapid changes in chip technology, there was another reason to compromise in that mid-1980s Pentagon standards decision. A wide range of tech changes were driving new operational requirements by enabling new opportunities and threats. New digital communications capabilities. New materials for plane construction. New missile and other potential attack capabilities a plane needed to avoid.
In other words, a rapidly evolving, uncertain tech ecology within which U.S. military aircraft needed to function was disrupting military doctrines and strategies, operational requirements, and geopolitical alignments. The operational requirements that military aircraft needed to meet were themselves changing rapidly. Keeping the embedded system chip standard relatively stable made it possible to respond to changing external requirements without throwing aircraft system designs into total chaos.
That decision supported the arms race that peaked in the latter part of the 1980s. Reagan’s defense ramp up achieved its goal: at great financial cost, and with some inevitable program failures, U.S. innovation and determination drove the Soviet Union into bankruptcy as it continued to try and fail to keep up.
That wasn’t the only value of the Reagan defense investment. Many key innovations migrated into civilian and commercial use by the 1990s: the internet, for example, whose packet-switched approach to communications was originally developed for the Pentagon. Cell phones and the cell telephone system, an outgrowth of military radio approaches. New approaches to designing and building complex software systems that evolved over time into today’s reusable component methods, in which developers construct new websites, e-commerce platforms, and other systems out of standardized pieces rather than handcrafting them. And much more.
The commercialization of technologies originally developed for military use drove a major economic boom for the United States in the 1990s. It also brought major changes to our daily lives. Today you and I use sophisticated radio-enabled handheld computers hosting software that shares data with globally distributed data collections. We call them “smartphones.” And they, in turn, are driving the evolution of artificial intelligence for such uses as real-time identification of people based on face, voice, and the way they walk. China is already using this technology for massive surveillance of its population and to enforce politically approved repression of Muslim Uighurs and its Christian population.
Technology has consequences that are increasingly disruptive to societies, economies, and geopolitical alignments. As tech spreads, militarily relevant capabilities spread as well to state and non-state adversaries of the United States. China’s rapid military evolution today started with the transfer of chip and other manufacturing from U.S. companies in the 1990s—transfer that was touted as a key achievement of the Clinton Administration. Along with preferred entry of Chinese students into many U.S. graduate degree programs at universities known for advanced research on behalf of DARPA and other agencies, those policies helped an adversarial nation to become a military near-peer to the United States, and to threaten to surpass our capabilities in the near future.
The Challenge Ahead
Even more disruptive capabilities are on the near horizon. Quantum computing threatens to break data and communications encryption, exposing our most advanced systems to penetration and sabotage. Artificial intelligence enables increasingly autonomous robots, including unmanned combat air vehicles (UCAVs) that can work in concert with and potentially without piloted military planes.
How can our military integrate advanced tech systems like UCAVs into combat and other operations? Effective use of advanced tech systems requires coordination of data, communications, and actions among them, under the control of human commanders at the tactical and operational levels.
And that’s what the F-35 is intended to do. Beyond being a fighter jet flown by military pilots from the different services, with their different missions and training cultures, the Joint Strike Fighter is above all intended to be the coordination platform for sensors, weapons, and communications systems in a given airspace—including future systems not yet designed. Our military desperately needs this capability, and the need will only grow over the next decade or more.
Increasingly, our Army, Navy, Marine Corps, and Air Force must work together on short notice in a wide range of operations. Disparate systems make such coordinated deployment difficult. And yet the F-35 has had to fit into existing service-specific environments as the services struggle to adapt to tech-driven changes.
And so we have service variants of the Joint Strike Fighter. And that means more development and more debate around specific parameters for them, and more overruns. Those are real. And they point to a challenge in the overall Defense Department Joint Capabilities Integration and Development System (JCIDS) acquisition approach.
JCIDS is intended to reduce development time and expense by targeting shared needs across the various services. But system requirements are intertwined with training, mission approaches, and skills that are often service-specific. Teasing out commonalities and making good cost-benefit tradeoff decisions at a time of major change and uncertainty is not an easy thing to achieve.
Yet the deeper challenge remains. How can the United States maintain military effectiveness in the face of rapid, tech-driven disruption and the emergence of near-peer adversaries whose own capabilities are rapidly advancing?
If the F-35 Joint Strike Fighter is not the answer, then the answer nonetheless includes something like it: a software-intensive platform that can rapidly integrate evolving new sensors, weapons systems, and communications. A platform that is flown by highly skilled military crews but that increasingly places at their command integrated information and command capabilities that humans cannot achieve alone.
How do you get to Carnegie Hall? Practice, practice, practice. How do you build such a platform at a time of rapid tech change? With some stumbles, if the history of both military and commercial tech evolution is any indication.
Maintaining and enhancing American military effectiveness requires a new, integrated look at the role of advanced tech as a fundamental driver of changes in military operations and the nature of the military forces themselves. That challenge goes well beyond rooting out Beltway bandits and bureaucratic inertia and complicity in cost overruns. It means making the right decisions about what to keep and build on, too.
Photo Credit: DigitalStorm/Getty Images