Command to Collaboration: Rethinking How the DoD Builds
America doesn’t suffer from a lack of defense innovation; it suffers from a lack of alignment.
Despite billions spent on AI, advanced networks, and digital tooling, too many programs stall not because of technical failure but because of structural centralization, legacy silos, and inter-service mistrust.
What if we stopped treating innovation as something to be contained within prime contractors and program offices and instead treated it like an open ecosystem? What if the Pentagon built like GitHub, not Lockheed?
The Centralization Problem
The recent AI skirmish between the Army and Air Force is a perfect example. The Air Force’s generative AI tool, NIPRGPT, developed by AFRL, was blocked from Army networks out of concern for governance and data leakage. The Army opted to deploy its own internal LLM, Ask Sage, which it deemed more secure and production-ready.
This wasn’t a fight over tech maturity. It was a fight over trust, interoperability, and control. Each service is innovating within its own walled garden, reinforcing stovepipes under the guise of security. The result? Duplicative tools, fractured governance, and wasted time.
A recent Military.com article highlights this fragmentation, noting that inter-service competition is quietly stalling the rollout of next-gen AI capabilities across the Department of Defense. Officials within both services privately admit that risk aversion, control concerns, and classification issues often take precedence over collaboration, even in areas that demand unity of effort.
Meanwhile, the commercial world builds through collaboration. GitHub, open APIs, and federated development models allow distributed teams to improve shared tools in parallel. Version control, not ownership, is the currency. Silicon Valley doesn’t ask who owns the repo; it asks who’s contributing to it.
DoD’s Growing Appetite for Control
Secretary of Defense Hegseth’s recent memo on IT consulting contracts reveals another angle: a tightening grip on advisory and digital services spending. New thresholds now require DOGE review for IT contracts over $10M and advisory services over $1M, including justification and cost-benefit analysis. The policy aims to contain costs and rein in overuse of external support, but also exposes the fragility of internal capacity.
The move is designed to create more in-house muscle for tech oversight, but it also risks reinforcing the very bureaucratic barriers that slow innovation. A Defense.com analysis points out that, while the policy is a nod toward fiscal discipline, it’s also a warning shot at large-scale digital outsourcing, signaling a preference for internal stewardship. Yet internal capacity doesn’t scale overnight.
In theory, this should push the Pentagon toward greater in-house capability. In practice, it may lead to even more centralized bottlenecks where oversight, rather than empowerment, defines the workflow.
The more tightly we try to control innovation, the more we risk throttling it.
The Case for a Decentralized Arsenal
Imagine a future where defense platforms are treated like open-source projects:
Modular codebases maintained across agencies and commands
Secure LLM models updated through federated training pipelines (see the sketch after this list)
Cross-service tooling developed with version transparency and shared APIs
A community of engineers, inside and outside government, contributing to secure-by-design components
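To make the federated-training item above concrete, here is a minimal sketch in Python of how per-service model updates could be merged without raw data ever leaving a service’s network. It is a hypothetical illustration of federated averaging; the ServiceUpdate and merge_updates names, and the weighting scheme, are invented for this sketch and do not describe any existing DoD pipeline.

```python
from dataclasses import dataclass

@dataclass
class ServiceUpdate:
    service: str                # e.g. "Army" or "Air Force"
    weight_delta: list[float]   # locally computed model update
    num_samples: int            # weighting factor for the merge

def merge_updates(base_weights: list[float], updates: list[ServiceUpdate]) -> list[float]:
    """Apply a weighted average of per-service deltas to the shared base model."""
    total = sum(u.num_samples for u in updates)
    merged = list(base_weights)
    for u in updates:
        share = u.num_samples / total
        for i, delta in enumerate(u.weight_delta):
            merged[i] += share * delta
    return merged

# Toy usage: two services contribute updates to a three-parameter model.
base = [0.0, 0.0, 0.0]
updates = [
    ServiceUpdate("Army", [0.2, -0.1, 0.4], num_samples=800),
    ServiceUpdate("Air Force", [0.1, 0.3, -0.2], num_samples=200),
]
print(merge_updates(base, updates))  # roughly [0.18, -0.02, 0.28]
```

The point is not the arithmetic; it is that each service keeps custody of its own data while still improving a model everyone shares.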
The Navy CTO’s recent prioritization of AI and quantum is promising, but real traction will depend not just on the tech choices but on the model of development. In a recent Meritalk piece, Navy CTO Justin Fanelli emphasized the need for emerging tech to be deployed at the speed of mission, not bureaucracy. That sentiment can’t be realized without decentralized, modular development pathways that enable adaptation at the edge.
A GitHub-like structure enables not only agility but survivability: tools can be forked, adapted, and scaled by different users with shared visibility. Instead of reinventing the wheel inside every service, developers could fork existing tools, improve them, and push updates upstream. That’s not chaos; it’s evolution.
Interoperability > Ownership
The friction between NIPRGPT and Ask Sage highlights a deeper problem: in defense, interoperability is often treated as an afterthought. Each service fears loss of control more than missed opportunity. Until this changes, the U.S. will continue to build powerful but disconnected tools that require endless translation layers and interface bridges.
Defense.com recently reported on the growing concern that stovepiped AI programs are leading to redundant infrastructure and slowing delivery timelines. Without a shared development environment, new tools become integration liabilities rather than operational force multipliers.
In a decentralized model:
Trust is built through transparency and auditability, not control.
Services align on common standards, not common vendors.
Security is enforced through architecture, not through isolation.
Just as GitHub enables decentralized development while maintaining code security, DoD could adopt DevSecOps practices that allow cross-agency collaboration without compromising classification or integrity.
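As a rough illustration of what that could look like in practice, here is a minimal Python sketch of a single DevSecOps gate that every cross-agency contribution might pass before merging. The blocked patterns and the gate script are assumptions made for this sketch, not an official control set; the point is that the check is automated, transparent, and identical for every contributor.

```python
import re
import sys

# Example patterns a shared pipeline might refuse to merge automatically (illustrative only).
BLOCKED_PATTERNS = [
    r"\bSECRET//",                              # classification banner fragments
    r"\bTOP SECRET\b",
    r"(?i)aws_secret_access_key\s*=",           # hard-coded credentials
    r"-----BEGIN (RSA|EC) PRIVATE KEY-----",    # committed private keys
]

def scan_contribution(diff_text: str) -> list[str]:
    """Return human-readable findings; an empty list means the gate passes."""
    findings = []
    for lineno, line in enumerate(diff_text.splitlines(), start=1):
        for pattern in BLOCKED_PATTERNS:
            if re.search(pattern, line):
                findings.append(f"line {lineno}: matched blocked pattern {pattern!r}")
    return findings

if __name__ == "__main__":
    # Usage sketch: pipe a proposed change into the gate, e.g. `git diff main | python gate.py`.
    problems = scan_contribution(sys.stdin.read())
    for problem in problems:
        print(problem)
    sys.exit(1 if problems else 0)  # a nonzero exit blocks the merge in CI
```

Because the gate itself is code, it can be reviewed, versioned, and improved by the same community it governs.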
What Needs to Happen
Decentralizing the arsenal requires more than intention; it demands structural shifts in how the Department of Defense builds, governs, and scales technology. First, the Pentagon needs to establish a secure, interoperable development backbone: a digital equivalent of GitHub, hosted on IL5/6 infrastructure. This environment must support continuous ATO, granular access controls, and modular permissions, allowing distributed teams to contribute without compromising security.
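To picture what granular access controls and modular permissions could mean on such a backbone, here is a small, purely hypothetical Python sketch in which permissions attach to a module and a role rather than to an owning program office. The Role and ModulePolicy names are invented for illustration, not drawn from any existing DoD system.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Role(Enum):
    MAINTAINER = auto()   # can merge changes to the module
    CONTRIBUTOR = auto()  # can propose changes for review
    CONSUMER = auto()     # can read and reuse, but not modify

@dataclass
class ModulePolicy:
    module: str                                            # e.g. "shared-llm-tooling"
    classification: str                                    # e.g. "IL5"
    grants: dict[str, Role] = field(default_factory=dict)  # organization -> role

    def can_contribute(self, org: str) -> bool:
        return self.grants.get(org) in (Role.MAINTAINER, Role.CONTRIBUTOR)

# Usage sketch: one service maintains a module, another contributes, a third consumes,
# all under a single policy object that can be audited and versioned like any other code.
policy = ModulePolicy(
    module="shared-llm-tooling",
    classification="IL5",
    grants={"Air Force": Role.MAINTAINER, "Army": Role.CONTRIBUTOR, "Navy": Role.CONSUMER},
)
print(policy.can_contribute("Army"))  # True
print(policy.can_contribute("Navy"))  # False
```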
But infrastructure alone isn’t enough. Culture and incentives must evolve too. Developers and units that improve shared codebases, build reusable components, or enable others to move faster should be actively rewarded, not sidelined for not owning the original program. This means shifting from service-centric recognition to contribution-based validation.
Interoperability must become a foundational principle, not a compliance afterthought. Trusted interface standards across data pipelines, AI models, and logistics systems would ensure that innovation isn’t lost to proprietary traps or endless integration costs. Rather than prescribing specific platforms, the Department should define open, extensible APIs that invite modular innovation.
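As a hedged sketch of what such an open, extensible interface could look like, the Python below defines a small contract that any service or vendor could implement; the LogisticsFeed and Observation names are invented for illustration. Consumers bind to the interface, not to a particular platform, which is what keeps integration costs from compounding.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Iterable, Optional, Protocol

@dataclass(frozen=True)
class Observation:
    source: str          # which system produced the record
    timestamp: datetime
    payload: dict        # schema governed by a shared, versioned standard

class LogisticsFeed(Protocol):
    """Open contract: anything exposing observations this way can plug into shared tooling."""
    schema_version: str

    def query(self, since: datetime) -> Iterable[Observation]:
        ...

def newest_record(feed: LogisticsFeed, since: datetime) -> Optional[Observation]:
    """Consumer code depends only on the open interface, never on one vendor's stack."""
    records = list(feed.query(since))
    return max(records, key=lambda obs: obs.timestamp) if records else None

# Toy implementation: how one service might wrap an existing database or message bus.
@dataclass
class InMemoryFeed:
    schema_version: str
    records: list

    def query(self, since: datetime) -> Iterable[Observation]:
        return [r for r in self.records if r.timestamp >= since]

feed = InMemoryFeed(
    schema_version="1.0",
    records=[Observation("depot-a", datetime(2025, 1, 2), {"part": "rotor", "qty": 4})],
)
print(newest_record(feed, since=datetime(2025, 1, 1)))
```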
Procurement practices must also evolve. The current system rewards prime contractors who win full-system contracts. Instead, the focus should shift to capability-centric modules, funding specific functionality that can be integrated across programs and platforms. This modularity creates a marketplace of parts, not just programs.
Leadership will need to keep pace. The Department should elevate technical stewards and leaders who understand not just operational goals but the architectural demands of federated design. These are the individuals who can bridge mission and method, ensuring capability meets need without succumbing to rigid bureaucracy.
Finally, the Department must embrace experimentation. That means standing up protected zones, digital sandboxes where warfighters and engineers can co-develop tools without being crushed by compliance on day one. These spaces should prioritize speed, iteration, and direct user feedback, modeled after modern software incubators.
This doesn’t mean sacrificing oversight. It means refocusing it. True accountability should track performance, adaptability, and user satisfaction, not just process adherence. In a decentralized future, the best oversight is transparency, not bureaucracy.
The Bottom Line
We can’t afford to replicate legacy manufacturing and acquisition structures in the digital age. Deterrence today isn’t just about how many ships you can build; it’s about how fast your software can learn, adapt, and deploy.
If the Pentagon wants to win the race for digital superiority, it must stop building like Lockheed and start building like GitHub.
Decentralization isn’t disorder. It’s the future of agility, resilience, and scale in an age where speed and interoperability decide the fight.
Whether it’s generative AI, quantum algorithms, or predictive logistics, the next breakthrough will come not from command-and-control models but from empowered developers, interoperable frameworks, and systems that evolve through use rather than by decree.
Innovation doesn’t need another gatekeeper. It needs a platform and a shared mission that extends beyond service lines.