The data center that OpenAI, Samsung SDS, and SK Telecom broke ground on in South Korea in March 2026 will start at just 20 megawatts of combined capacity across two facilities. By hyperscaler standards, that is a pilot project. By strategic standards, it is an anchor investment that determines who controls AI infrastructure across the Asia-Pacific region for the next decade. And the most interesting detail is not the compute capacity: it is the floating metal boxes that Samsung Heavy Industries is co-developing with OpenAI to solve a problem that will define where AI can physically exist at all.

What Actually Happened

In February 2026, OpenAI announced that Samsung and SK had formally joined its Stargate initiative, the company's overarching global AI infrastructure platform initially capitalized at $500 billion for US domestic deployment. The Korean announcement was structurally different from the US version: rather than a single capital commitment, it formalized a series of interlocking partnerships spanning compute supply, construction, real estate, connectivity, and model access. Samsung Electronics and SK Hynix committed to scaling advanced memory production to 900,000 DRAM wafer starts per month to supply OpenAI's accelerating compute appetite. Samsung SDS (the group's IT services and cloud subsidiary) and SK Telecom jointly agreed to begin construction of Korea's first Stargate data centers, with groundbreaking in March 2026 on two facilities with combined initial capacity of 20MW.

The structurally novel element was the Samsung C&T and Samsung Heavy Industries partnership with OpenAI to explore floating data centers. Samsung C&T is Samsung's construction and trading conglomerate; Samsung Heavy Industries is one of the world's largest shipbuilders. The floating data center concept (a modular, water-cooled compute facility mounted on a marine platform) is designed to address three constraints simultaneously: land scarcity in dense Asian markets, cooling costs that represent 30–40% of traditional data center operating expenses, and the ability to reposition compute capacity where energy prices are lowest or regulatory conditions are most favorable. OpenAI is not the first company to investigate offshore compute (Microsoft and Google have both explored undersea server pods), but the combination of a $500 billion infrastructure platform and Samsung Heavy's world-class shipbuilding capability represents the most credible attempt yet to build one at commercial scale.

Why This Matters More Than People Think

The surface story is that OpenAI is expanding its infrastructure globally and South Korea is a beneficiary. The more important story is what Korea's companies are getting in return, and the asymmetry that arrangement creates over time. When Samsung and SK are not merely suppliers to OpenAI's infrastructure but co-owners and co-developers of the physical plant, the power dynamic of the relationship changes fundamentally. As AI increasingly runs on infrastructure rather than discrete software transactions, the party that owns the pipes extracts a different class of value than the party that rents them. Samsung and SK are negotiating to be in the first category, and that distinction will compound for decades.

This matters enormously for the competitive dynamics of the Asian AI market. Japan, Taiwan, Singapore, and India are all pursuing AI infrastructure strategies. South Korea's Stargate anchor gives it a first-mover position in a region where compute access has been the binding constraint on AI development. Korean AI companies (Upstage, Rebellions, and the growing cohort of Korean AI labs) will have domestic access to frontier model infrastructure that would otherwise require routing compute jobs through US-domiciled data centers, with all the latency, data sovereignty, and cost implications that entails. The Korea Stargate is not just a Samsung and SK profit center; it is a national AI capability multiplier that Korea's $12.3 billion 2026 AI budget explicitly intended to catalyze, and it arrived ahead of schedule.

The Competitive Landscape

OpenAI's decision to anchor its Asian Stargate expansion in South Korea over Japan, Singapore, or India reflects a specific structural advantage Korea holds. Samsung is uniquely positioned as both a memory chip manufacturer and a construction and heavy industry conglomerate, meaning a single corporate family can supply the chips, build the building, maintain the cooling systems, and connect the facility to the internet. No other country offers a single-vendor solution of that completeness. SK adds SK Telecom's network connectivity and SK Hynix's memory production, creating a second vertically integrated stack that Korea's competitors cannot easily replicate. The result is the lowest total transaction cost for OpenAI to build at scale in Asia, a factor that outweighs marginal differences in land or labor costs.

The competitive pressure this creates for regional rivals is significant. Taiwan's TSMC is OpenAI's primary logic chip supplier but lacks Samsung's data center construction capability. Japan's SoftBank, which backed the original US Stargate announcement, provides capital but not the vertical manufacturing stack. Google's competing strategy, a $40 billion investment in Anthropic announced April 24, 2026, pursues a different model entirely: investing in the model layer rather than the infrastructure layer. The emerging structural split is between infrastructure-first players (OpenAI-Stargate-Samsung) and model-first players (Google-Anthropic-TPU). Samsung and SK have deliberately chosen the infrastructure side: a bet that as model capabilities converge toward parity, the party that controls compute will extract more durable value than the party that controls any particular model at any particular moment.

Hidden Insight: The Floating Data Center Is a Strategic Weapon, Not a Gimmick

The floating data center concept has been covered primarily as an engineering novelty, an interesting challenge for Samsung Heavy Industries to solve. The actual strategic logic is considerably more consequential than the coverage suggests. Land acquisition for large-scale data centers in Asia's most economically productive coastal cities (Seoul, Tokyo, Singapore, Shanghai) now routinely takes three to five years due to zoning constraints, environmental review, and political opposition from adjacent communities. A floating facility can be positioned in international waters or in maritime economic zones with entirely different regulatory frameworks, cutting the approval-to-operation timeline from years to months. That speed advantage, in an industry where compute supply is the binding constraint on AI capability development, is worth far more than any real estate cost saving over the facility's lifetime.

The cooling economics are equally significant and rarely mentioned in coverage of the announcement. Traditional data centers use air cooling systems that consume approximately 30–40% of total facility power just managing the heat generated by compute. Water-cooled facilities, whether floating or utilizing cold-water intake from adjacent bodies of water, can reduce that cooling overhead by more than half. At the scale of a gigawatt data center (which SK Telecom is explicitly targeting in its public roadmap), the difference between 35% and 15% cooling overhead represents hundreds of megawatts of power redirected to actual compute. Over a twenty-year facility lifecycle, that differential compounds into a cost advantage that is simply not available to landlocked competitors drawing from mixed-grid power at standard cooling efficiency.
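The arithmetic behind that claim is easy to check. Below is a back-of-the-envelope sketch, not facility data: it assumes "cooling overhead" means the fraction of total facility power consumed by heat management, with the remainder available for compute.

```python
# Back-of-the-envelope model: compute power freed by lower cooling overhead.
# "Cooling overhead" here is the fraction of total facility power spent on
# heat management; everything else is assumed to go to compute.

def compute_power_mw(total_mw: float, cooling_overhead: float) -> float:
    """Power left for compute after cooling takes its share, in MW."""
    return total_mw * (1.0 - cooling_overhead)

total = 1000.0  # a 1 GW facility, the scale SK Telecom's roadmap targets

air_cooled = compute_power_mw(total, 0.35)    # 35% overhead (air cooling)
water_cooled = compute_power_mw(total, 0.15)  # 15% overhead (water cooling)

print(f"Air-cooled compute:   {air_cooled:.0f} MW")    # 650 MW
print(f"Water-cooled compute: {water_cooled:.0f} MW")  # 850 MW
print(f"Freed for compute:    {water_cooled - air_cooled:.0f} MW")  # 200 MW
```

At 1 GW, the 20-point overhead difference frees roughly 200 MW, which is where the "hundreds of megawatts" figure comes from; the gap scales linearly with facility size.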

There is a third dimension almost entirely missing from coverage: carbon accounting. As AI companies face mounting pressure from enterprise customers to document Scope 2 and Scope 3 emissions, compute infrastructure running on renewable marine energy (wave, tidal, or offshore wind) carries a fundamentally different carbon profile than landlocked data centers drawing from coal-heavy grids. The ESG accounting for a Samsung Heavy floating facility powered by offshore Korean wind is dramatically better than the carbon math for an equivalent land-based facility. In a world where enterprise AI procurement increasingly runs through sustainability committees at global banks, pharmaceutical companies, and governments with net-zero commitments, that difference will appear directly in procurement decisions within the next three years.

Most importantly: if the floating data center concept works at commercial scale (and Samsung Heavy Industries has constructed more complex structures at sea than most countries' navies), it breaks the geographic bottleneck that has constrained where AI infrastructure can physically exist. An AI data center that can be repositioned is one that can arbitrage energy prices globally, avoid regulatory risk in any single jurisdiction, and expand into markets that have electricity demand but not land availability. The geopolitical implications of mobile compute infrastructure are genuinely novel. Samsung and OpenAI would become the first entities to demonstrate it at scale, setting the engineering template, the regulatory precedent, and the commercial model that everyone else would have to follow or license.

What to Watch Next

The 90-day milestone is the first operational status update on the Korea Stargate facilities. Ground was broken in March 2026; standard data center construction for a 10MW facility runs approximately 12–18 months. Any disclosure in Q3 2026 earnings calls about ahead-of-schedule delivery would indicate the partnership is functioning better than typical construction timelines, and signal that Samsung SDS's construction management capability is a genuine differentiator that OpenAI will want to replicate in its next Asian expansion. Watch specifically for announcements from Korea's Ministry of Science and ICT about the data centers' role in the national GPU reserve. The government has committed to 52,000 high-performance GPUs by 2028, and the Stargate facilities are the most plausible physical home for a significant portion of that national compute reserve.

For the floating data center, the concrete indicator is a formal engineering contract between Samsung Heavy Industries and OpenAI beyond the initial feasibility assessment agreement. The transition from feasibility to contracted engineering would represent an irreversible capital commitment and confirm that the floating concept has cleared OpenAI's internal technical review. South Korea's maritime zone regulations, specifically the designation of Exclusive Economic Zone areas where floating compute facilities could legally operate, will be a critical regulatory variable to monitor. Any parliamentary discussion or Ministry of Oceans and Fisheries guidance on maritime compute platforms in late 2026 would signal the legal framework is being actively prepared. By 2028, the question will not be whether floating AI infrastructure is technically possible. It will be who built the first one, who operates it, and who set the regulatory template that every other country has to follow.

The most important thing Samsung and SK are building with OpenAI is not a data center. It is the right to be called infrastructure, not a supplier, in the age of AI.


Key Takeaways

  • Stargate comes to Asia: OpenAI, Samsung SDS, and SK Telecom broke ground in March 2026 on Korea's first Stargate data centers: two facilities with combined initial capacity of 20MW, marking the first formal Asian deployment of OpenAI's $500B global infrastructure initiative.
  • Samsung Heavy goes to sea: Samsung C&T and Samsung Heavy Industries are co-developing floating data center designs with OpenAI, targeting land scarcity, cooling costs (30–40% of data center power), and carbon accounting constraints that make traditional construction increasingly untenable in dense Asian markets.
  • Memory at scale: Samsung Electronics and SK Hynix are ramping to 900,000 DRAM wafer starts per month to supply OpenAI's compute infrastructure, cementing Korea as the memory backbone of the global AI stack.
  • Co-owner, not supplier: Samsung and SK's Stargate participation grants them co-development and co-ownership positions in AI infrastructure: a structurally different relationship than simply selling chips or connectivity to a foreign AI company.
  • National compute backstop: Korea's government has committed to 52,000 high-performance GPUs by 2028, scaling to 260,000 by 2030, with the Stargate facilities providing the physical home for a significant portion of Korea's national AI compute reserve.

Questions Worth Asking

  1. If floating data centers become commercially viable, does that fundamentally change the geopolitics of AI infrastructure, and which countries benefit most from removing the constraint that AI compute must be physically anchored to land within a single jurisdiction?
  2. Samsung and SK are positioning as co-infrastructure owners rather than component suppliers to OpenAI's Stargate. Does that strategic upgrade come with meaningful governance rights over how the infrastructure is used, or is it a branding exercise that leaves operational control entirely with OpenAI?
  3. As AI infrastructure becomes as strategically critical as semiconductor fabrication, should governments with no domestic compute sovereignty (no data center footprint, no GPU reserves, no carrier AI platform) treat this as a national security deficit rather than a technology procurement question?