Micron Technology (MU) has posted a 122.9% gain over the past six months, making it one of the best-performing semiconductor stocks in the market right now. That trounces the broader Zacks Computer and Technology sector, which returned just 3.4% over the same stretch.
The fuel behind that run is no mystery: AI infrastructure spending is accelerating fast, and memory chips are at the center of it. As data centers scale up to handle more AI workloads, demand for Micron’s DRAM, NAND, and especially High Bandwidth Memory (HBM) has surged. Supply contracts for its HBM3E and HBM4 chips are already sold out for the entire 2026 calendar year.
NVIDIA confirmed in 2025 that Micron is a core memory supplier for its Blackwell-generation GPUs, including the GeForce RTX 50 series. Demand for HBM4 is being driven heavily by NVIDIA’s upcoming Vera Rubin architecture.
Micron is also expanding its HBM advanced packaging facility in Singapore to meet that demand. Analysts at BofA Securities noted on April 7 that global AI capital investment is expected to nearly triple to $1.4 trillion by 2030, with Micron well-positioned in the memory subsector as hyperscalers and sovereign entities upgrade their IT systems.
In Q2 of fiscal 2026, Micron’s revenue hit $23.86 billion, up 196% year over year. Non-GAAP EPS came in at $12.20, a 682% increase from the same period a year ago. Both figures beat Wall Street estimates by a wide margin — revenue by 21.67% and EPS by 38.57%.
Non-GAAP gross margin expanded to 74.9%, up from 37.9% in the year-ago quarter. Operating income jumped to $16.46 billion from $2.01 billion. For full-year fiscal 2026, analysts expect revenue to grow 194% and EPS to grow 604%.
The growth story doesn’t stop there. Fiscal 2027 consensus estimates point to another 58.5% revenue increase and 63.9% EPS growth.
Even after its big run, Micron trades at a forward P/E of around 5 to 6 — well below the sector average of 23.43. For comparison, Marvell Technology trades at 26.74x, Texas Instruments at 31.23x, and Intel at 87.21x.
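The multiples above come from a simple ratio: forward P/E divides the current share price by consensus earnings per share for the next fiscal year. A minimal sketch, using hypothetical numbers chosen only to illustrate the arithmetic (not figures from the article):

```python
def forward_pe(price: float, forward_eps: float) -> float:
    """Forward P/E: current share price divided by estimated next-year EPS."""
    if forward_eps <= 0:
        raise ValueError("forward P/E is undefined for zero or negative EPS")
    return price / forward_eps

# Hypothetical inputs for illustration: a $300 share price against $55 of
# consensus forward EPS implies a multiple of roughly 5.5x, in the same
# low-single-digit range described above.
print(round(forward_pe(300.0, 55.0), 2))
```

The key takeaway is the denominator: when analysts expect EPS to grow several hundred percent, the forward multiple can stay low even after the share price has already doubled.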
One longer-term bull case ties into AI inference. Unlike model training, which happens episodically, inference runs continuously, firing every time someone interacts with a deployed AI system. That means memory demand scales with AI usage, not just with the size of the models themselves. Micron’s HBM3E and LPDDR5X chips are designed for exactly this environment.
There is also an edge AI angle that gets less attention. Autonomous vehicles, smart factories, and surgical robotics all need on-device memory to run compressed AI models locally. That workload runs on LPDDR and embedded NAND, giving Micron a second demand vector that is independent of data-center cycles.
BofA pointed out that while some analysts have raised concerns about Micron hitting “peak margin,” the stock is currently trading at the low end of its historical P/E range. Micron has also committed to investing over $25 billion in fiscal 2026 as it scales capacity.
The post Micron (MU) Stock Doubles in Six Months — Here’s Why Analysts Are Still Bullish appeared first on CoinCentral.