
Cerebras Challenges Nvidia's Market Position With $10 Billion OpenAI Deal

Published by qimuai | First-hand compilation


Source: https://aibusiness.com/generative-ai/cerebras-poses-an-alternative-to-nvidia

Summary:


Cerebras's $10 billion order challenges Nvidia and reshapes the AI chip landscape
On Jan. 14, AI startup Cerebras announced a multiyear, $10 billion agreement with OpenAI. Under the deal, Cerebras will begin delivering wafer-scale AI chip systems totaling 750 megawatts to OpenAI later this year, to be used for near-real-time responses in tasks such as code generation, inference, image generation, and complex reasoning. The industry views the partnership as a key challenge to Nvidia's dominance of the GPU market.

A contest of technical approaches: can wafer-scale chips upend the industry?
Cerebras's core technology keeps an entire silicon wafer as a single, giant chip, avoiding the losses that come with dicing a wafer into conventional chips. Analysts note that the design is, in theory, more efficient, and is especially suited to complex AI tasks that require many steps of "thinking." For OpenAI, the partnership aims to break through the speed and cost bottlenecks of running today's large models and push AI from being a "conversation tool" toward acting as an "execution agent."

Market impact: open competition could lower enterprise AI costs
As Cerebras, AMD, Broadcom, and other vendors accelerate into the market, Nvidia's long-standing dominance of AI chips could be broken. Industry observers believe broader competition will give enterprises more hardware choices and could push AI service prices down. Cerebras, however, still has to solve practical challenges such as delivering at massive scale and managing high system-integration complexity, and whether its technology suits companies other than hyperscalers remains to be proven.

The strategic positioning behind the industry contest
For OpenAI, the deal is both a strategic response to intensifying competitive pressure, notably from Google, whose Gemini models will power Apple's AI plans, and a hardware foundation for future AI agents that can carry out tasks independently. Notably, OpenAI CEO Sam Altman is a personal investor in Cerebras, and the two companies have a longstanding relationship.

Although Nvidia recently struck a $20 billion licensing agreement with AI chip startup Groq and remains in talks to supply chips to OpenAI, analysts warn that if OpenAI validates Cerebras's wafer-scale technology, Nvidia's market share could face long-term erosion. The race over compute efficiency is quietly reshaping the balance of power in AI infrastructure.


English source:

The agreement gives Cerebras a chance to show if its highly touted wafer-scale engine can successfully drive giant AI models better than Nvidia's chips.
Cerebras' $10 billion deal with OpenAI positions the startup and its wafer-scale engine as a challenger to Nvidia in the AI chip market, while helping OpenAI try to accelerate the performance of its large AI models.
The multiyear deal, revealed on Jan. 14, requires Cerebras to deliver 750 megawatts of wafer-scale systems to OpenAI starting later this year. OpenAI will use the wafer-scale engine to deliver near-real-time responses for tasks such as coding, inference, image generation, and complex reasoning. The Cerebras version of an AI chip is larger and, the vendor says, faster than Nvidia GPUs.
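As a rough, back-of-the-envelope illustration of what a 750-megawatt commitment could mean in system counts (the per-system power draw below is my assumption; the article does not state one):

```python
# Back-of-envelope sketch: systems implied by a 750 MW commitment.
# The per-system draw is an ASSUMPTION for illustration only.
TOTAL_COMMITMENT_MW = 750          # figure reported in the deal
ASSUMED_KW_PER_SYSTEM = 25         # hypothetical draw per wafer-scale system

total_kw = TOTAL_COMMITMENT_MW * 1_000
approx_systems = total_kw / ASSUMED_KW_PER_SYSTEM
print(f"~{approx_systems:,.0f} systems at an assumed {ASSUMED_KW_PER_SYSTEM} kW each")
# -> ~30,000 systems under these assumptions
```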
The agreement gives Cerebras, which since its founding in 2015 has struggled to expand its customer base beyond Abu Dhabi-based AI and tech holding company G42, a foothold in the AI chip market as one of several chipmakers trying to rival Nvidia. It also addresses a key challenge for enterprises, which often find that massive generative AI models are too slow or costly for real-time use.
The deal gives Cerebras an opportunity to prove the capabilities of its wafer-scale engine.
While some have previously seen the AI vendor as a somewhat experimental company focused on science applications, this partnership is the “ultimate stamp of legitimacy,” said Mike Leone, an analyst at Omdia, a division of Informa TechTarget.
“It transforms them overnight from a niche alternative into a serious contender that every other AI lab now has to pay attention to,” Leone said.
Competition from chipmakers, including longstanding semiconductor companies Broadcom and AMD, along with Cerebras, could benefit enterprises by potentially driving down AI service prices over time.
“Overall, an enterprise customer is going to have more choices when it comes to how they get their AI stuff that they want,” said David Nicholson, an analyst at Futurum Group.
The deal also tackles the paramount market issue: AI model speed.
“The industry is grappling with a difficult trade-off right now where smarter models are becoming much heavier and slower to run,” Leone said. “It appears this partnership is trying to solve that specific friction point. By focusing on inference speed, they seem to be trying to ensure that future AI agents can handle complex tasks without the lag that currently frustrates users.”
For OpenAI, which has seen competition from Google and others intensify amid questions about its long-term financial viability, the agreement with Cerebras is a win, particularly after Google's major agreement with Apple earlier this week to power Apple's AI initiatives with the Google Gemini foundation model. The deal shows OpenAI is expanding its infrastructure for a future in which AI does the work for users rather than just conversing with them, Leone said.
“There is a deeper story here about the shift from chatbots to actual agents,” he said. “When you’re just chatting, a few seconds of delay is fine. But when you have an AI agent trying to solve a complex problem that requires twenty steps of 'thinking' in the background -- that requires a level of speed that standard hardware struggles to deliver efficiently.”
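Leone's point can be made concrete with simple arithmetic. The sketch below uses hypothetical per-step latencies (not figures from the article) to show how a delay that feels tolerable in a single chat reply compounds across a twenty-step agent workflow:

```python
# Illustrative only: how per-step inference latency compounds when an agent
# must "think" through sequential steps. Latencies here are hypothetical.
def total_agent_latency(steps: int, seconds_per_step: float) -> float:
    """Total wall-clock time if each reasoning step waits on the previous one."""
    return steps * seconds_per_step

chat_turn = total_agent_latency(steps=1, seconds_per_step=3.0)    # one chatbot reply
slow_agent = total_agent_latency(steps=20, seconds_per_step=3.0)  # 20-step agent, slow inference
fast_agent = total_agent_latency(steps=20, seconds_per_step=0.3)  # same agent, ~10x faster inference

print(f"chat: {chat_turn:.0f}s, slow agent: {slow_agent:.0f}s, fast agent: {fast_agent:.0f}s")
# -> chat: 3s, slow agent: 60s, fast agent: 6s
```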
If Cerebras proves its hardware accomplishes that, it could lead to erosion of market share for Nvidia.
“Cerebras, in my opinion, is the biggest single threat, from a hardware company perspective, to Nvidia,” Nicholson said. “This negatively affects Nvidia in the long term if OpenAI proves out that a data center built on Cerebras technology is superior.”
Nicholson noted that some observers see Nvidia as being at a relative disadvantage because it lacks a wafer-scale product. Nvidia's approach involves producing many separate chips from a wafer by cutting them out and assembling them, which can result in mechanical and electrical errors due to discarded, faulty chips. In contrast, Cerebras keeps the entire wafer intact as a single, large chip, connecting only the working cells. This approach may reduce system complexity and potential error points.
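A toy yield model (my own simplification, not from the article) helps illustrate the contrast Nicholson describes: when a wafer is diced into discrete dies, one defect can scrap an entire die, whereas a wafer-scale design keeps the wafer intact and routes around only the faulty cells. All numbers below are illustrative assumptions.

```python
# Toy defect-tolerance comparison; all figures are illustrative assumptions.
import random

random.seed(0)
CELLS = 10_000            # compute cells on one hypothetical wafer
CELLS_PER_DIE = 100       # cells grouped into a conventional die
DEFECT_RATE = 0.001       # chance any single cell is faulty

defective = [random.random() < DEFECT_RATE for _ in range(CELLS)]

# Diced-wafer model: a single bad cell scraps the whole die it belongs to.
usable_diced = sum(
    CELLS_PER_DIE
    for d in range(CELLS // CELLS_PER_DIE)
    if not any(defective[d * CELLS_PER_DIE:(d + 1) * CELLS_PER_DIE])
)

# Wafer-scale model: keep the wafer whole and disable only the bad cells.
usable_wafer_scale = CELLS - sum(defective)

print(f"diced: {usable_diced}/{CELLS} cells usable")
print(f"wafer-scale: {usable_wafer_scale}/{CELLS} cells usable")
```

Under these toy numbers, each faulty cell costs the diced approach a whole die while the intact wafer loses only the cell itself; real yield economics are, of course, far more involved.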
“At its surface, it's so obvious that what Cerebras does is technically superior,” Nicholson added. “But Nvidia has been able to get away with its inferior technology because it has the full stack and it has the industry connections and momentum.”
However, Cerebras faces the challenge of delivering at a massive scale almost immediately, Nicholson said.
Also, integration is a challenge. Enterprises interested in Cerebras might find its technology too complicated to integrate with their systems and discover they need more specialized talent to make it work, which could make it less appealing to non-hyperscalers.
“It’s a lot easier to go with the ready-made solution from Nvidia,” Nicholson said.
Meanwhile, Nvidia forged a $20 billion licensing agreement with Groq, another AI chip startup, last month. Nvidia has also been in talks to sell OpenAI chips representing 10 gigawatts of capacity.
Cerebras says it has also entered into deals with IBM and Meta. OpenAI and Cerebras had a pre-existing relationship, with OpenAI at one point exploring an acquisition of the chipmaker, the Wall Street Journal reported. OpenAI CEO Sam Altman is a personal investor in Cerebras.
