Samsung is preparing for one of the most aggressive expansion phases in its history, planning to invest over $73 billion (~110 trillion won) in 2026 to strengthen its position in the fast-growing AI chip market. The figure marks a significant increase over the company's already substantial 2025 capital expenditure of more than $65 billion (~90 trillion won). According to the electronics giant's latest filing, the spending will focus on advanced memory such as HBM, next-generation chips built on its 2nm process, and new manufacturing facilities.
This move comes as Samsung's semiconductor business experiences a strong rebound driven by rising AI demand. In 2025, the company recorded annual revenue of about $230 billion (~333.6 trillion won) and an operating profit of around $30 billion (~43.6 trillion won), one of its strongest financial performances in recent years. A significant share of that growth came from its chip division, which has become central to the firm's overall profitability as global demand for AI hardware continues to accelerate.
At the core of Samsung's AI strategy is its high-bandwidth memory (HBM) business. HBM has become essential for AI accelerators because it enables extremely high data transfer rates between processors and memory. Samsung is currently mass-producing HBM3E chips, which are used in modern AI GPUs, and is preparing to launch HBM4, its next-generation memory expected to significantly improve bandwidth and power efficiency. Crucially, the company is already aligning this technology with major customers: it recently expanded its partnership with AMD to supply HBM4 chips for AMD's Instinct MI455X GPUs and DDR5 memory for AMD's sixth-generation EPYC processors.
Along with memory, Samsung is pushing forward in advanced chip manufacturing through its foundry division. The company is developing 2-nanometer process technology, which represents the next frontier in semiconductor miniaturization. These chips are expected to deliver higher performance and lower power consumption, making them ideal for AI workloads. Samsung has already demonstrated progress with its earlier 3nm Gate-All-Around (GAA) architecture.
In parallel, Samsung is securing long-term demand through major contracts. A notable example is its multi-billion-dollar deal to produce AI chips for Tesla, which will be used in autonomous driving systems and robotics. The company is also moving toward three- to five-year supply agreements with large customers to stabilize revenue. Its foundry business, meanwhile, is beginning to gain traction through strategic partnerships, with Samsung manufacturing new AI chips for companies like Nvidia on its 4nm process.
Despite these efforts, however, the company faces intense competition on multiple fronts. In the memory sector, it trails SK Hynix, which currently leads the HBM market and has secured a dominant share of supply to leading AI chipmakers. In the foundry business, Taiwan Semiconductor Manufacturing Company (TSMC) remains the clear leader in advanced node production, manufacturing the majority of chips for companies like Apple and Nvidia. Meanwhile, Nvidia itself dominates the AI processor market and is driving demand across the entire supply chain, with projections suggesting AI chip revenue could reach $1 trillion by 2027.
The Tech Portal is published by Blue Box Media Private Limited. Our investors have no influence over our reporting.