In a first, Google has released data revealing how much energy a single AI prompt uses.
Source: https://www.technologyreview.com/2025/08/21/1122288/google-gemini-ai-energy/
Summary:
Google publishes per-query energy figures for its AI models for the first time, setting a new benchmark for industry transparency

Google has released a technical report disclosing, for the first time, how much energy its Gemini AI products consume for each text prompt. The figures show that a median text prompt uses 0.24 watt-hours of electricity, roughly the energy a household microwave draws in one second. The report also gives corresponding carbon-emission and water-consumption figures, making it the most transparent disclosure of AI energy use yet from a major tech company.

According to the report, the AI-specific chips account for only 58% of total energy use; the CPUs and memory supporting that hardware consume another 25%, while backup equipment and data-center cooling contribute 10% and 8% respectively. Notably, per-query energy use fell 33-fold between May 2024 and May 2025, thanks to model improvements and software optimization.

In an exclusive interview, Google chief scientist Jeff Dean said: "We want users to understand that a Gemini query uses about as much energy as watching a few seconds of TV or consuming five drops of water; everyday use is nothing to be overly concerned about environmentally." By the company's estimates, each query produces 0.03 grams of carbon dioxide emissions and consumes 0.26 milliliters of water (about five drops).

Researchers have praised the release. University of Michigan professor Mosharaf Chowdhury noted that only companies can produce data of this kind, because they operate at a scale, and with internal access, that researchers cannot match. Jae-Won Chung, a University of Michigan PhD candidate and co-lead of the ML.Energy leaderboard, called the report "a keystone piece in the AI energy field."

The report has limitations, however: the figures reflect only the median energy use of text prompts, and complex queries (such as asking the model to summarize multiple books) or image- and video-generation tasks consume significantly more. The company also declined to disclose its total daily query volume, leaving overall energy use an open question.

The release comes amid growing global attention to AI's energy footprint. Hugging Face researcher Sasha Luccioni stressed that while voluntary corporate disclosure is progress, the industry still needs a standardized rating system akin to the Energy Star label. Other major AI companies, including OpenAI, have yet to publish comparable data.

As AI becomes embedded in daily life, tech companies face mounting environmental pressure even as they chase performance gains. Google's pioneering transparency may serve as an important reference for industry-wide energy-disclosure standards.
Original article:
In a first, Google has released data on how much energy an AI prompt uses
It’s the most transparent estimate yet from one of the big AI companies, and a long-awaited peek behind the curtain for researchers.
Google has just released a technical report detailing how much energy its Gemini apps use for each query. In total, the median prompt—one that falls in the middle of the range of energy demand—consumes 0.24 watt-hours of electricity, the equivalent of running a standard microwave for about one second. The company also provided average estimates for the water consumption and carbon emissions associated with a text prompt to Gemini.
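The microwave comparison holds up under a simple unit conversion. The sketch below checks it; the ~1,000 W rating is an assumed typical microwave wattage, not a figure from the report:

```python
# Sanity check on the microwave comparison: 0.24 Wh expressed in joules,
# versus an assumed typical ~1,000 W microwave.
WH_PER_PROMPT = 0.24
JOULES_PER_WH = 3600

prompt_joules = WH_PER_PROMPT * JOULES_PER_WH  # 864 J
microwave_watts = 1000  # assumed typical rating, not from the report

seconds_equivalent = prompt_joules / microwave_watts
print(f"{prompt_joules:.0f} J ≈ {seconds_equivalent:.2f} s of microwave use")
```

At roughly 0.86 seconds, the figure matches the report's "about one second" framing.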
It’s the most transparent estimate yet from a Big Tech company with a popular AI product, and the report includes detailed information about how the company calculated its final estimate. As AI has become more widely adopted, there’s been a growing effort to understand its energy use. But public efforts attempting to directly measure the energy used by AI have been hampered by a lack of full access to the operations of a major tech company.
Earlier this year, MIT Technology Review published a comprehensive series on AI and energy, at which time none of the major AI companies would reveal their per-prompt energy usage. Google’s new publication, at last, allows for a peek behind the curtain that researchers and analysts have long hoped for.
The study focuses on a broad look at energy demand, including not only the power used by the AI chips that run models but also by all the other infrastructure needed to support that hardware.
“We wanted to be quite comprehensive in all the things we included,” said Jeff Dean, Google’s chief scientist, in an exclusive interview with MIT Technology Review about the new report.
That’s significant, because in this measurement, the AI chips—in this case, Google’s custom TPUs, the company’s proprietary equivalent of GPUs—account for just 58% of the total electricity demand of 0.24 watt-hours.
Another large portion of the energy is used by equipment needed to support AI-specific hardware: The host machine’s CPU and memory account for another 25% of the total energy used. There’s also backup equipment needed in case something fails—these idle machines account for 10% of the total. The final 8% is from overhead associated with running a data center, including cooling and power conversion.
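As a back-of-envelope check (a sketch derived from the published percentages, not from the report's methodology), the component shares can be applied to the 0.24 Wh total; note that the rounded shares sum to 101%:

```python
# Per-component energy for a 0.24 Wh median prompt, using the shares
# reported by Google: TPU 58%, host CPU and memory 25%, idle backup
# machines 10%, data-center overhead 8%.
TOTAL_WH = 0.24
shares = {
    "TPU (AI chips)": 0.58,
    "Host CPU and memory": 0.25,
    "Idle backup machines": 0.10,
    "Data-center overhead (cooling, power conversion)": 0.08,
}

for name, share in shares.items():
    print(f"{name}: {TOTAL_WH * share * 1000:.1f} mWh")

# The published shares sum to 101% -- a rounding artifact in the report.
print(f"Share total: {sum(shares.values()):.0%}")
```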
This sort of report shows the value of industry input to energy and AI research, says Mosharaf Chowdhury, a professor at the University of Michigan and one of the heads of the ML.Energy leaderboard, which tracks energy consumption of AI models.
Estimates like Google’s are generally something that only companies can produce, because they run at a larger scale than researchers are able to and have access to behind-the-scenes information. “I think this will be a keystone piece in the AI energy field,” says Jae-Won Chung, a PhD candidate at the University of Michigan and another leader of the ML.Energy effort. “It’s the most comprehensive analysis so far.”
Google’s figure, however, is not representative of all queries submitted to Gemini: The company handles a huge variety of requests, and this estimate is calculated from a median energy demand, one that falls in the middle of the range of possible queries.
So some Gemini prompts use much more energy than this: Dean gives the example of feeding dozens of books into Gemini and asking it to produce a detailed synopsis of their content. “That’s the kind of thing that will probably take more energy than the median prompt,” Dean says. Using a reasoning model could also have a higher associated energy demand because these models take more steps before producing an answer.
This report was also strictly limited to text prompts, so it doesn’t represent what’s needed to generate an image or a video. (Other analyses, including one in MIT Technology Review’s Power Hungry series earlier this year, show that these tasks can require much more energy.)
The report also finds that the total energy used to field a Gemini query has fallen dramatically over time. The median Gemini prompt used 33 times more energy in May 2024 than it did in May 2025, according to Google. The company points to advancements in its models and other software optimizations for the improvements.
Google also estimates the greenhouse gas emissions associated with the median prompt, which it puts at 0.03 grams of carbon dioxide. To get to this number, the company multiplied the total energy used to respond to a prompt by the average emissions per unit of electricity.
Rather than using an emissions estimate based on the US grid average, or the average of the grids where Google operates, the company instead uses a market-based estimate, which takes into account electricity purchases that the company makes from clean energy projects. The company has signed agreements to buy over 22 gigawatts of power from sources including solar, wind, geothermal, and advanced nuclear projects since 2010. Because of those purchases, Google’s emissions per unit of electricity on paper are roughly one-third of those on the average grid where it operates.
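Those two figures imply a market-based emissions intensity, which the sketch below derives; both intensities are back-of-envelope values computed from the reported numbers, not figures stated in the report:

```python
# Implied market-based emissions intensity from the reported figures:
# 0.03 g CO2 per median prompt at 0.24 Wh per prompt.
G_CO2_PER_PROMPT = 0.03
WH_PER_PROMPT = 0.24

intensity = G_CO2_PER_PROMPT / WH_PER_PROMPT * 1000  # g CO2 per kWh
print(f"Implied market-based intensity: {intensity:.0f} g CO2/kWh")

# The report says the market-based figure is roughly one-third of the
# average grid where Google operates, implying a grid average of about:
print(f"Implied local grid average: {intensity * 3:.0f} g CO2/kWh")
```

The implied 125 g CO2/kWh (versus an implied grid average near 375 g CO2/kWh) illustrates how much the market-based accounting choice lowers the headline number.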
AI data centers also consume water for cooling, and Google estimates that each prompt consumes 0.26 milliliters of water, or about five drops.
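The five-drop figure follows from a common drop-size convention; the 0.05 mL per drop (20 drops per milliliter) used below is an assumption, not a value from the report:

```python
# Converting the reported 0.26 mL of water per prompt into "drops",
# assuming ~0.05 mL per drop (20 drops/mL) -- the drop size is an
# assumed convention, not stated in the report.
ML_PER_PROMPT = 0.26
ML_PER_DROP = 0.05  # assumed

drops = ML_PER_PROMPT / ML_PER_DROP
print(f"{drops:.1f} drops per prompt")
```

At 5.2 drops, the result rounds to the report's "about five drops."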
The goal of this work was to provide users a window into the energy use of their interactions with AI, Dean says.
“People are using [AI tools] for all kinds of things, and they shouldn’t have major concerns about the energy usage or the water usage of Gemini models, because in our actual measurements, what we were able to show was that it’s actually equivalent to things you do without even thinking about it on a daily basis,” he says, “like watching a few seconds of TV or consuming five drops of water.”
The publication greatly expands what’s known about AI’s resource usage. It follows recent increasing pressure on companies to release more information about the energy toll of the technology. “I’m really happy that they put this out,” says Sasha Luccioni, an AI and climate researcher at Hugging Face. “People want to know what the cost is.”
This estimate and the supporting report contain more public information than has been available before, and it’s helpful to get more information about AI use in real life, at scale, by a major company, Luccioni adds. However, there are still details that the company isn’t sharing in this report. One major question mark is the total number of queries that Gemini gets each day, which would allow estimates of the AI tool’s total energy demand.
And ultimately, it’s still the company deciding what details to share, and when and how. “We’ve been trying to push for a standardized AI energy score,” Luccioni says, a standard for AI similar to the Energy Star rating for appliances. “This is not a replacement or proxy for standardized comparisons.”