Building Sovereign AI: Ukraine’s Comprehensive Infrastructure Initiative

January 16, 2026

In Central-Eastern Europe, a nation is building one of the continent’s most ambitious artificial intelligence infrastructures. Ukraine’s recent partnership with NVIDIA to construct what they’re calling a “sovereign AI state” represents a convergence of digital independence principles and technological ambition that deserves close attention from European neighbors and technology leaders globally.

In this article you will discover:

  • How Ukraine is building world-class AI supercomputing infrastructure
  • The four strategic pillars of the Ukraine-NVIDIA partnership
  • The critical role of vendor-neutral networking in sovereign infrastructure

When Digital Infrastructure Becomes Strategic Priority

Ukraine’s digital transformation over the past few years has been remarkable. The Diia app, which lets citizens access government services, store digital IDs, and sign legal documents, seemed revolutionary when it launched. But what Ukraine is doing now represents the next evolution.

In November 2025, Ukraine’s First Deputy Prime Minister and Minister of Digital Transformation Mykhailo Fedorov announced a comprehensive partnership with NVIDIA to build essential sovereign infrastructure. He emphasized that building sovereign AI is a matter of national security and data protection, especially in wartime conditions.

The strategic importance is clear: as nations increasingly recognize AI and open networking as critical infrastructure, Ukraine is investing in indigenous capabilities and operational independence.

Learn more about SONiC's architecture, its capabilities, and explore inspiring success stories of SONiC deployments across diverse network environments.

The Four Pillars: Comprehensive Infrastructure Development

What distinguishes this partnership from typical government IT contracts is its comprehensive scope. Ukraine is building an entire ecosystem around four strategic pillars:

1. National AI Infrastructure

At the heart of this initiative is the “AI Factory,” a supercomputing environment built on NVIDIA’s powerful hardware – the DGX SuperPOD systems, capable of scaling to tens of thousands of GPUs. These are enterprise-grade platforms used to train trillion-parameter AI models that power everything from advanced language models to cutting-edge medical research.

NVIDIA DGX SuperPOD architecture

The timing is strategic. Ukrainian private cloud providers have already invested over $2.5 million in NVIDIA’s H200, H100, and L40S GPUs. The government is now creating a standardized platform that the entire tech ecosystem can leverage.
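To put “tens of thousands of GPUs” in perspective, here is a rough back-of-envelope sketch (our illustration, not a figure from the partnership). The per-parameter byte counts are standard rules of thumb for mixed-precision training with an Adam-style optimizer, and 80 GB matches H100/H200-class memory.

  # Back-of-envelope estimate of how many GPUs are needed just to hold a
  # model's training state in memory. All numbers are rules of thumb,
  # not vendor specifications.

  def gpus_for_training_state(params: float,
                              bytes_per_param_state: int = 16,
                              gpu_memory_gb: int = 80,
                              usable_fraction: float = 0.7) -> int:
      """Estimate GPUs needed to shard weights + gradients + optimizer state.

      bytes_per_param_state ~= 16 assumes mixed precision with Adam:
      2 B weights + 2 B gradients + ~12 B optimizer/master copies.
      usable_fraction leaves headroom for activations and buffers.
      """
      total_bytes = params * bytes_per_param_state
      usable_bytes_per_gpu = gpu_memory_gb * 1e9 * usable_fraction
      return int(-(-total_bytes // usable_bytes_per_gpu))  # ceiling division

  if __name__ == "__main__":
      one_trillion = 1e12
      print(gpus_for_training_state(one_trillion))  # ~286 GPUs just for training state

Memory is only part of the story: roughly three hundred such GPUs could hold the training state of a trillion-parameter model, but making training time practical is what pushes clusters into the thousands and tens of thousands of accelerators.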

2. Talent Development

The partnership includes structured workshops, architectural review sessions with NVIDIA experts, and AI education programs. It’s a masterclass in technology transfer – internalizing the knowledge needed to build and maintain the infrastructure independently. This addresses a critical challenge: building domestic AI expertise at scale while competing with higher-paying opportunities abroad.

3. Joint Research & Development

Ukraine and NVIDIA are collaborating on custom AI models adapted to Ukrainian legislation, language, and specific governance needs. The complexity is substantial: training an AI that understands Ukrainian administrative law, can process requests in Ukrainian (with all its linguistic nuances), and operates within the country’s legal framework requires far more than off-the-shelf solutions.
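One concrete illustration of the linguistic gap (our example, not part of the Diia project): tokenizers built around English fragment Ukrainian text into far more subword pieces, which inflates inference cost and shortens effective context. A quick check with the Hugging Face transformers library – the GPT-2 tokenizer is just a convenient stand-in:

  # Compare how an English-centric tokenizer fragments Ukrainian vs. English.
  # Requires the `transformers` package; downloads the GPT-2 tokenizer on first run.
  from transformers import AutoTokenizer

  tokenizer = AutoTokenizer.from_pretrained("gpt2")

  samples = {
      "en": "The citizen submits a tax declaration through the portal.",
      "uk": "Громадянин подає податкову декларацію через портал.",
  }

  for lang, text in samples.items():
      tokens = tokenizer.tokenize(text)
      words = text.split()
      print(f"{lang}: {len(words)} words -> {len(tokens)} tokens "
            f"({len(tokens) / len(words):.1f} tokens/word)")

A vocabulary and tokenizer trained on Ukrainian corpora is typically one of the first things a nationally adapted model changes.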

4. Startup Ecosystem Support

The commitment to support Ukraine’s AI startup scene is particularly strategic. By providing essential resources, the government is subsidizing innovation. Local companies can develop tech solutions, healthcare applications, and commercial products using world-class computing power, democratizing access to expensive infrastructure.

From Gemini to Sovereign: The Diia AI Architecture Evolution

In September 2025, Ukraine launched Diia.AI, which officials called the world’s first AI assistant integrated into a national public services system. The platform answers questions about taxes, permits, and social services in real time, providing genuinely impressive functionality.

The initial implementation runs on Google’s Gemini model – an excellent choice for rapid deployment and proof-of-concept validation.

However, for long-term digital sovereignty, relying on an external AI model for government services creates strategic dependencies. Questions around model integrity, data governance, and operational continuity drive the need for indigenous solutions.

Evolution of Ukraine's digital infrastructure

This is why the centerpiece of the NVIDIA partnership is developing the Diia AI LLM – a sovereign large language model specifically adapted to Ukrainian legislation, public services, and citizens’ needs. It’s a strategic transition from rapid deployment to sustainable sovereignty.
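Architecturally, this kind of transition is far less painful when services talk to the assistant through a thin, provider-neutral interface instead of calling one vendor’s API directly. The sketch below shows the pattern; all class names are hypothetical and do not describe Diia’s actual codebase.

  # Minimal provider-abstraction sketch: services depend on the interface,
  # so the backend can move from an external API to a sovereign model
  # without changing application code. All names are illustrative.
  from abc import ABC, abstractmethod


  class AssistantBackend(ABC):
      @abstractmethod
      def answer(self, question: str) -> str:
          ...


  class ExternalApiBackend(AssistantBackend):
      """Stand-in for a hosted model reached over a vendor API."""
      def answer(self, question: str) -> str:
          raise NotImplementedError("call the external vendor API here")


  class SovereignLlmBackend(AssistantBackend):
      """Stand-in for a locally hosted, nationally adapted model."""
      def answer(self, question: str) -> str:
          raise NotImplementedError("call the locally hosted model here")


  def handle_citizen_request(question: str, backend: AssistantBackend) -> str:
      # Application code never imports a specific vendor SDK.
      return backend.answer(question)

Swapping the backend then becomes a deployment decision rather than an application rewrite.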

The Road Ahead: Technical and Strategic Challenges

Building AI supercomputing infrastructure at national scale presents real challenges, and maintaining it to enterprise-grade standards demands sustained commitment.

Technical and strategic challenges

Financial sustainability is the most obvious concern. High-performance NVIDIA systems require massive spending – initial capital expenditure plus ongoing maintenance, power, cooling, and regular upgrades. Ukraine will need sustained investment, potentially blending national budget allocations, international donor funding (like the €250,000 from Estonia’s ESTDEV for strategy development), and creative public-private partnerships.
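To make “sustained investment” tangible, here is a deliberately simplified cost model. Every number in it is a placeholder assumption for illustration – none of these figures come from Ukraine’s budget or from NVIDIA pricing.

  # Rough annual-cost sketch for a GPU cluster. Every number below is a
  # placeholder assumption for illustration, not a quoted price.

  def annual_cluster_cost(num_gpus: int,
                          capex_per_gpu_usd: float = 30_000,   # assumed server cost share
                          amortization_years: int = 4,
                          watts_per_gpu: float = 1_000,        # GPU plus its share of node/cooling
                          power_price_per_kwh: float = 0.15,
                          staff_and_support_usd: float = 2_000_000) -> float:
      amortized_capex = num_gpus * capex_per_gpu_usd / amortization_years
      energy_kwh = num_gpus * watts_per_gpu / 1_000 * 24 * 365
      power_cost = energy_kwh * power_price_per_kwh
      return amortized_capex + power_cost + staff_and_support_usd

  if __name__ == "__main__":
      print(f"${annual_cluster_cost(1_000):,.0f} per year for 1,000 GPUs (illustrative)")

Even with conservative placeholders, amortized hardware and electricity add up to an eight-figure annual bill per thousand GPUs – which is why blended funding models keep coming up.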

Operational excellence also presents unique technical challenges. The AI Factory becomes critical national infrastructure requiring enterprise-grade reliability. This demands geographic dispersion, aggressive redundancy planning, backup power generation, and operational protocols that exceed typical data center standards.
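The value of geographic dispersion can be made concrete with the textbook availability formula for independent redundant sites – a simplification, since real failures are often correlated across sites.

  # Availability of N independent redundant sites: at least one must be up.
  # The independence assumption is optimistic; correlated failures (grid,
  # connectivity, attacks) reduce the real-world benefit.

  def combined_availability(site_availability: float, num_sites: int) -> float:
      return 1 - (1 - site_availability) ** num_sites

  for n in (1, 2, 3):
      a = combined_availability(0.995, n)   # 0.995 ~= one site down ~44 h/year
      downtime_hours = (1 - a) * 24 * 365
      print(f"{n} site(s): {a:.7f} availability, ~{downtime_hours:.2f} h downtime/year")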

Vendor management is a strategic consideration. While NVIDIA provides world-class technology, dependency on any single supplier creates vulnerabilities. What happens if prices spike? If new hardware faces supply delays? If geopolitical situations shift? Ukraine needs to balance the immediate benefits of this partnership with long-term supply chain resilience – a challenge that extends to every layer of the infrastructure stack, from compute to networking to storage.

The Networking Foundation: Beyond AI Hardware

While NVIDIA provides Ukraine’s AI “intelligence” layer, the networking fabric under it is equally critical. True sovereignty means controlling every layer – including the network operating systems that connect data centers, manage traffic, and ensure resilience. A sovereign AI on sovereign hardware is vulnerable if it depends on proprietary networking equipment or software.

This same logic is now shaping Europe’s digital sovereignty. To reduce dependency and regain control over critical infrastructure layers, the EU is turning to open-source networking solutions. PLVision, a leader in open disaggregated networking, exemplifies this approach through its work with the SONiC (Software for Open Networking in the Cloud) and DENT projects.

Considering a switch to open-source networking? Explore the reasons to choose an open-source NOS like SONiC, along with a breakdown of the Total Cost of Ownership (TCO) for both proprietary and open-source solutions.

Just as Ukraine is transitioning from Google’s Gemini to its own Diia AI LLM, organizations are moving from proprietary network operating systems to open alternatives that offer vendor neutrality, cost optimization, full control over network infrastructure, and community-driven innovation.

PLVision blends deep switch-software engineering with active community leadership to help organizations build and own their custom SONiC distributions.

By owning your custom SONiC version, you gain full control, eliminate vendor lock-in and licensing fees, and align your infrastructure with modern demands.

Conclusions

Ukraine has defined sovereignty practically: state-controlled infrastructure, domestically managed data pipelines, and nationally adapted AI models.

For Ukraine’s neighbors, partners, and allies across Europe, this initiative offers lessons in digital sovereignty, opportunities for technological collaboration, and a reminder that innovation often emerges from strategic necessity and real-world constraints.

Concretely, an open-source NOS such as SONiC lets organizations run infrastructure they control, keep data pipelines local, and tailor the network to demanding workloads such as AI training.

Ready to explore how open-source solutions can enhance your business?

Book a call with our experts to discuss your use case and unlock the full potential of open, disaggregated networking.
Kamil Krawczyk

FAQ

What technical hardware defines a "Sovereign AI Factory" infrastructure?

A sovereign AI infrastructure is typically anchored by high-density supercomputing clusters, such as the NVIDIA DGX SuperPOD. These systems leverage enterprise-grade GPUs (like the H200, H100, and L40S) capable of scaling to tens of thousands of units. This hardware enables the local training of trillion-parameter models, ensuring that critical data processing remains within domestic or organizational control rather than relying on external cloud providers.
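A rough sense of why the unit count matters comes from the widely used approximation that training compute is about 6 × parameters × training tokens. The peak throughput and utilization below are assumptions for illustration, not vendor-quoted specifications.

  # Rule-of-thumb training time: compute ~= 6 * params * tokens FLOPs.
  # Peak FLOPs and utilization below are assumptions, not vendor specs.

  def training_days(params: float, tokens: float, num_gpus: int,
                    peak_flops_per_gpu: float = 1e15,   # assumed ~1 PFLOP/s low precision
                    utilization: float = 0.35) -> float:
      total_flops = 6 * params * tokens
      sustained_flops = num_gpus * peak_flops_per_gpu * utilization
      return total_flops / sustained_flops / 86_400

  if __name__ == "__main__":
      # Illustrative: 1T parameters trained on 10T tokens.
      for gpus in (1_000, 10_000):
          print(f"{gpus:>6} GPUs: ~{training_days(1e12, 1e13, gpus):.0f} days")

At cluster sizes in the low thousands, a trillion-parameter run is a multi-year project; tens of thousands of accelerators bring it into a practical window.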

Why shift from external AI APIs (like Google Gemini) to a Sovereign LLM?

The transition from external AI models to an indigenous Sovereign Large Language Model (LLM) is driven by three technical imperatives:
  • Data Governance: Keeping sensitive information within a controlled environment to ensure model integrity.
  • Architectural Customization: Adapting the LLM to specific legal frameworks, local linguistic nuances, and specialized administrative logic.
  • Operational Independence: Eliminating strategic dependencies on third-party vendors, which mitigates risks related to API availability, pricing shifts, or geopolitical constraints.

What role does "Open Disaggregated Networking" play in AI scalability?

For AI supercomputing to be truly sovereign, the networking layer must be vendor-neutral. Open disaggregated networking separates hardware (the switch) from the software (the Network Operating System). Using open-source stacks like SONiC (Software for Open Networking in the Cloud) prevents vendor lock-in, allowing organizations to manage high-throughput AI traffic without being tied to proprietary licensing or closed hardware ecosystems.
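For a flavor of what vendor-neutral configuration looks like in practice, SONiC is driven by a Redis-backed CONFIG_DB whose contents are commonly expressed as JSON. The fragment below is a simplified sketch following the community schema; lane numbers and port speeds vary per platform, so the values are illustrative.

  # Simplified sketch of a SONiC CONFIG_DB fragment (PORT / VLAN / VLAN_MEMBER
  # tables follow the community schema; lanes and speeds are illustrative).
  import json

  config_db_fragment = {
      "PORT": {
          "Ethernet0": {"lanes": "0,1,2,3", "speed": "100000", "admin_status": "up"},
          "Ethernet4": {"lanes": "4,5,6,7", "speed": "100000", "admin_status": "up"},
      },
      "VLAN": {
          "Vlan100": {"vlanid": "100"},
      },
      "VLAN_MEMBER": {
          "Vlan100|Ethernet0": {"tagging_mode": "untagged"},
          "Vlan100|Ethernet4": {"tagging_mode": "untagged"},
      },
  }

  print(json.dumps(config_db_fragment, indent=2))

Because the schema stays the same across compliant hardware, the same automation can drive switches from different suppliers – the practical meaning of avoiding lock-in.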

What are the primary operational challenges of maintaining AI SuperPODs?

Maintaining enterprise-grade AI infrastructure requires addressing complex technical demands:
  • Thermal Management: Managing the massive heat output of thousands of H100/H200 GPUs.
  • Power Redundancy: Ensuring consistent uptime through aggressive backup power generation and geographic dispersion.
  • Networking Fabrics: Implementing low-latency, high-bandwidth fabrics (like InfiniBand or specialized Ethernet) to handle the massive synchronization requirements of distributed AI training (see the sketch after this list).
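To see why the fabric matters, consider the gradient all-reduce that runs at every training step. Under a standard ring all-reduce, each GPU sends and receives roughly 2 × (N−1)/N times the gradient buffer per step – a textbook estimate; real collectives and topologies differ. The gradient size and step time below are assumptions for illustration.

  # Per-step all-reduce traffic per GPU under a ring algorithm, and the
  # bandwidth needed to hide it within a target step time. Gradient size
  # and step time are illustrative assumptions.

  def ring_allreduce_gb_per_gpu(gradient_gb: float, num_gpus: int) -> float:
      # Each GPU sends and receives ~2*(N-1)/N of the buffer per all-reduce.
      return 2 * (num_gpus - 1) / num_gpus * gradient_gb

  gradient_gb = 20.0      # e.g. ~10B fp16 gradients per data-parallel replica (assumed)
  num_gpus = 1024
  step_seconds = 1.0      # assumed target step time

  traffic = ring_allreduce_gb_per_gpu(gradient_gb, num_gpus)
  print(f"~{traffic:.1f} GB moved per GPU per step; "
        f"needs ~{traffic * 8 / step_seconds:.0f} Gb/s to overlap with a "
        f"{step_seconds:.0f}s step")

Sustained per-GPU rates in the hundreds of gigabits per second are why AI fabrics are built on 400G-class links with careful congestion control rather than on general-purpose data center networking.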

How does SONiC enhance the performance of AI Data Centers?

SONiC provides a standardized, community-driven platform that supports advanced features like SONiC-DASH API implementation and xPU/SmartNIC offloading. These technologies optimize data transfer between compute nodes and storage, reducing the CPU overhead and ensuring that the high-performance GPUs in an AI factory are utilized at maximum efficiency.