OpenAI, Samsung SDS, and SK Telecom Begin Data Center Construction in South Korea
OpenAI partners with Samsung SDS and SK Telecom to begin construction on new data centers in South Korea with an initial capacity of 20 MW, expanding the company's AI infrastructure footprint into Asia as global demand for inference compute grows.
OpenAI, Samsung SDS, and SK Telecom have begun construction on new data centers in South Korea with an initial capacity of 20 megawatts, marking OpenAI's first dedicated infrastructure investment in Asia and bringing AI inference compute closer to the rapidly growing Asian market.
Partnership Structure
The three-way partnership leverages each company's strengths: OpenAI provides the AI models and software stack, Samsung SDS brings data center design and construction expertise, and SK Telecom contributes network infrastructure and local market knowledge. The initial 20 MW facility is designed to handle AI inference workloads — running trained models to serve user queries — rather than model training, which requires larger facilities and more specialized hardware. The partnership includes plans for expansion to 100 MW based on demand.
Strategic Rationale
The South Korean data centers serve multiple strategic purposes. First, they reduce latency for Asian users of OpenAI's products by processing inference requests closer to the end user. Second, they address data sovereignty concerns — some South Korean enterprises and government agencies require that their data be processed within the country. Third, the partnership strengthens OpenAI's relationship with Samsung and SK Telecom, two of South Korea's most influential technology conglomerates, potentially opening channels for deeper enterprise adoption.
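The latency argument can be made concrete with a back-of-the-envelope calculation: light in fiber travels at roughly two-thirds the speed of light in vacuum, so physical distance alone imposes a hard floor on round-trip time. The distances below are illustrative approximations, not figures from the announcement:

```python
# Back-of-the-envelope fiber propagation delay, illustrating why
# serving inference from in-region data centers reduces latency.
# Distances are rough great-circle approximations.

SPEED_IN_FIBER_KM_S = 200_000  # light in fiber: roughly 2/3 of c

def round_trip_ms(distance_km: float) -> float:
    """Minimum physical round-trip time over fiber, in milliseconds."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000

# Seoul to a US West Coast region (~9,000 km)
print(f"Transpacific RTT floor: {round_trip_ms(9_000):.0f} ms")
# Seoul to an in-country facility (~300 km)
print(f"In-country RTT floor:   {round_trip_ms(300):.0f} ms")
```

Real-world round trips are higher still once routing and queueing are added, which is why the propagation floor alone makes in-region inference attractive.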
Asian AI Infrastructure Race
The investment comes as multiple AI companies expand infrastructure in Asia. Meta has broken ground on a second data center campus in Indiana, a $10 billion investment, and is also planning facilities in Asia. Google and Microsoft have announced data center investments in Japan, Malaysia, and Thailand. The race to build inference infrastructure close to users reflects the shift in AI economics from training (concentrated in a few large facilities) to inference (distributed globally), where the cost of serving billions of daily queries makes geographic proximity to users economically important.
Related Articles
NGINX 1.29.6 Adds Native Sticky Sessions and Fixes QUIC Reset Packet Overflow
NGINX 1.29.6 mainline release introduces a sticky-session directive for upstream blocks, enabling cookie-based session affinity without external load balancers and solving session-loss issues during worker restarts. The release also fixes oversized QUIC reset packets and improves SCGI backend proxying.
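As a rough sketch of what cookie-based session affinity looks like in an upstream block, assuming the new open-source directive mirrors the long-established NGINX Plus `sticky cookie` syntax (the backend hostnames and the cookie name `srv_id` here are illustrative, and the exact 1.29.6 syntax may differ):

```nginx
upstream app_backend {
    server app1.example.com:8080;
    server app2.example.com:8080;
    # Pin each client to one backend via a cookie, no external LB needed
    sticky cookie srv_id expires=1h path=/;
}
```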
FreeBSD 14.4 Delivers Post-Quantum SSH, OpenZFS 2.2.9, and Intel E610 Support
FreeBSD 14.4-RELEASE has arrived with OpenSSH 10.0p2 defaulting to hybrid post-quantum key exchange, OpenZFS 2.2.9, and new driver support for Intel Ethernet E610 NICs. The release also adds 9P filesystem support for Bhyve virtualization guests and patches vulnerabilities in OpenSSL and libarchive.
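Administrators who want to pin the hybrid post-quantum exchange explicitly, rather than rely on the new default, can use OpenSSH's standard `KexAlgorithms` directive. A sketch (the preference order is an assumption; `mlkem768x25519-sha256` is the OpenSSH identifier for the ML-KEM-768 + X25519 hybrid):

```
# /etc/ssh/sshd_config
# Prefer the post-quantum hybrid KEX, with a classical fallback
KexAlgorithms mlkem768x25519-sha256,sntrup761x25519-sha512@openssh.com,curve25519-sha256
```

The names a given build actually supports can be listed with `ssh -Q kex`.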
OFC 2026: Coherent and Broadcom Demonstrate 3.2 Terabit-Per-Second Optical Transceivers
At the Optical Fiber Communication Conference in Los Angeles, Coherent and Broadcom have demonstrated 3.2 Tbps optical transceiver modules — doubling the bandwidth of current-generation 1.6T interconnects. The technology is designed for the next wave of AI data center buildouts, where single training runs require moving exabytes of data between thousands of GPUs.