Nvidia praised DeepSeek’s R1 model as “an excellent AI advancement,” even as the Chinese startup’s debut contributed to a 17% drop in the chipmaker’s stock on Monday.
“DeepSeek is an excellent AI advancement and a perfect example of Test Time Scaling,” an Nvidia spokesperson told CNBC. “DeepSeek’s work illustrates how new models can be created using that technique, leveraging widely-available models and compute that is fully export control compliant.”
Nvidia’s remarks follow DeepSeek’s release of R1 last week, an open-source reasoning model that reportedly outperformed leading AI models from U.S. firms like OpenAI. With a self-reported training cost of under $6 million—just a fraction of the billions spent by Silicon Valley companies—R1 has drawn significant attention.
Nvidia’s statement suggests that it views DeepSeek’s advancement as driving further demand for its graphics processing units (GPUs), despite the market reaction.
“DeepSeek’s rise has sent shockwaves through the AI investment landscape. Their latest large language model (LLM) matches the performance of U.S. leaders like OpenAI—at just 5–10% of the cost. Developed for around USD 6 million, compared to the hundreds of millions others spend, DeepSeek’s efficiency has rattled the market. Major players like Nvidia, Microsoft, and Meta have seen significant selloffs,” said Jacob Falkencrone, chief investment strategist for Europe at Saxo Bank.
The AI boom and the surging demand for Nvidia GPUs were largely fueled by the “scaling law,” a concept introduced by OpenAI researchers in 2020. It holds that more capable AI systems can be built by dramatically increasing the computation and data used to train new models, which in turn drives the need for ever more chips.
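As a rough illustration only (the power-law form and exponent below come from the 2020 OpenAI scaling-laws paper, not from any statement by Nvidia or DeepSeek), the idea can be written as model loss falling as a power law in training compute:

$$L(C) \approx \left(\frac{C_c}{C}\right)^{\alpha_C}, \qquad \alpha_C \approx 0.05,$$

where $C$ is the training compute and $C_c$ a fitted constant. Because the exponent is small, each further reduction in loss along this curve demands vastly more compute, which is what tied model progress so tightly to chip demand.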
Earlier this month, Microsoft announced plans to spend $80 billion on AI infrastructure in 2025, while Meta CEO Mark Zuckerberg said the company’s capital expenditures for its AI strategy will range between $60 billion and $65 billion in 2025.
“If model training costs prove to be significantly lower, we would expect a near-term cost benefit for advertising, travel, and other consumer app companies that use cloud AI services, while long-term hyperscaler AI-related revenues and costs would likely be lower,” wrote BofA Securities analyst Justin Post in a note Monday.
While the market turbulence has sparked concerns, strategists such as Falkencrone argue that the long-term trajectory of AI remains strong: this is not the end of the AI boom, they say, but the beginning of its next phase.
“DeepSeek’s rise may have disrupted the market, but it doesn’t derail AI’s long-term potential. In fact, innovations that lower costs and broaden accessibility could spark the next wave of growth,” Jacob Falkencrone said.
“For investors, the key is to stay calm, diversify, and focus on the bigger picture. ETFs provide an easy and efficient way to participate in AI’s growth without the risks of single-stock dependency. AI isn’t just here to stay—it’s evolving into something even bigger. By staying disciplined and strategic, you can position yourself to benefit from this exciting investment theme,” he added.