2 Artificial Intelligence (AI) Stocks That Could Go Parabolic


The adoption of artificial intelligence (AI) technology is set to continue at a rapid pace in 2025: Market research firm IDC estimates that investments in data center infrastructure, AI agents, and efforts taken by organizations to embed AI capabilities into their operations will add up to outlays of $227 billion this year.

What’s worth noting here is that IDC expects 67% of that total to go toward businesses’ efforts to integrate AI into their operations. So 2025 could be a year of solid growth for both AI hardware and software companies. That’s why now would be a good time to take a closer look at two AI companies that could win big from the massive spending on AI infrastructure and solutions, and potentially see parabolic increases in their share prices.

A parabolic move refers to a sharp increase in the stock price of a company in a short period, tracing a path that resembles one side of a parabolic curve. Micron Technology (NASDAQ: MU) seems to be on that path — its stock price has risen 20% in 2025 already. Snowflake (NYSE: SNOW), too, has experienced a sharp jump in its stock price in recent months, and could very well maintain its momentum.

The memory market is expected to enjoy another year of solid growth in 2025 thanks to the AI trend. Market research firm Gartner estimates that unprecedented demand for high-bandwidth memory (HBM), which is used in AI accelerators to deliver higher data transfer speeds and support greater computing performance, along with rising prices, could boost dynamic random access memory (DRAM) sales this year by 28% to $115.6 billion.

Micron is already making the most of the AI-driven opportunity in its core market. The memory specialist got a jump on larger rival Samsung: Micron's memory chips were selected for Nvidia's graphics processors for both gaming and AI workloads. More specifically, Nvidia's upcoming GeForce RTX 50 series gaming graphics cards will use Micron's GDDR7 memory.

Meanwhile, Micron's management announced on the company's December earnings conference call that Nvidia's Grace server CPU (central processing unit) uses its low-power LPDDR5X memory, and Nvidia has picked Micron's fastest HBM chips for its next-generation Blackwell AI systems. Samsung, on the other hand, has reportedly struggled to get its HBM chips qualified by Nvidia, paving the way for Micron to keep making the most of the HBM market's potential.

That bodes well for Micron, which forecasts that the size of the HBM market will grow from $16 billion in 2024 to more than $100 billion by 2030. At the same time, investors should note that Micron's growth is set to pick up remarkably in its fiscal 2025 (which began Aug. 30, 2024). Revenue in the first quarter of the fiscal year increased by an impressive 84% year over year to $8.7 billion.
