
Niantic's AR alien dog Peridot is now a talking tour guide.

Posted by qimuai · First-hand translation



Source: https://www.wired.com/story/niantic-peridot-augmented-reality-snap-spectacles/

Summary:

Augmented reality has taken a fresh step forward. Niantic Spatial, a studio spun off from Pokémon GO developer Niantic, has partnered with AI startup Hume AI to add AI voice interaction to its virtual-pet app Peridot. Wearing Snap's Spectacles smart glasses, users can hold lifelike conversations with their virtual pet and get immersive navigation and on-the-spot commentary.

The upgraded app, called Peridot Beyond, combines generative AI with geospatial technology. With the glasses on, the digital pet (a "Dot") appears as a 3D figure in the real-world scene. It can comment intelligently on whatever is in the user's view and provide real-world navigation by laying down a trail of footprints. Project lead Alicia Berry says the design aims to recreate the natural feel of being shown around by a local guide, easing the stress of getting around.

Hume AI founder Alan Cowen sees empathetic digital companions like this as the direction of future computing. The feature is still at the demo stage; the team is focusing on refining the avatar's behavior to avoid repeating the player-safety incidents that dogged Pokémon GO. Although it currently works only on Spectacles, the developers say it will serve as an important reference for how people interact with AR glasses.



English source:

Imagine you’re walking your dog. It interacts with the world around you—sniffing some things, relieving itself on others. You walk down the Embarcadero in San Francisco on a bright sunny day, and you see the Ferry Building in the distance as you look out into the bay. Your dog turns to you, looks you in the eye, and says, “Did you know this waterfront was blocked by piers and a freeway for 100 years?”
OK now imagine your dog looks like an alien and only you can see it. That’s the vision for a new capability created for the Niantic Labs AR experience Peridot.
Niantic, also the developer of the worldwide AR behemoth Pokémon Go, hopes to build out its vision of extending the metaverse into the real world by giving people the means to augment the space around them with digital artifacts. Peridot is a mobile game that lets users customize and interact with their own little Dots—dog-sized digital companions that appear on your phone's screen and can look like they're interacting with objects in your camera's view. They're very cute, and yes, they look a lot like Pokémon. Now, they can talk.
Peridot started as a mobile game in 2022, then got infused with generative AI features. The game has since moved into the hands of Niantic Spatial, a startup created in April that aims to turn geospatial data into an accessible playground for its AR ambitions. Now called Peridot Beyond, it has been enabled in Snap’s Spectacles.
Hume AI, a startup running a large language model that aims to make chatbots seem more empathetic, is now partnering with Niantic Spatial to bring a voice to the Dots on Snap’s Spectacles. The move was initially announced in September, but now it’s ready for the public and will be demonstrated at Snap’s Lens Fest developer event this week.
Alan Cowen, Hume AI’s founder, sees the project as a glimpse into the future of computing. “At some point, everyone is going to have AR in their lives in some format,” Cowen says. “You'll be talking to companions that help guide you through the world.”
Call on Dot while wearing the Spectacles and it will look like a 3D image interacting with the world around you. Ask it to find you walking directions to a nearby restaurant and it will show a little footprint path heading that direction in your field of vision. If you see something fun along the way, Dot might stop and comment on whatever you’re looking at.
“Isn't it way easier when you're in Tokyo to follow your friend who lives in Tokyo through the subway system than it is to follow the map?” says Alicia Berry, executive producer at Niantic Spatial. “We wanted to re-create that as close to reality as possible to reduce the stress of navigation but also just kind of make people feel calm.”
Niantic Spatial released a reveal video of this in action earlier this year, which gives a good idea of how the company hopes Dot will act and speak (if you can get past the cringy dialog).
“This is really a storytelling experience,” says Cowen. “It's curated for certain things. It is an AI that sees what you see. It is intelligent and it is knowledgeable and it's an empathetic friend.”
For the past year, Snap's chunky but functional Spectacles have been available to developers eager to tinker with their augmented reality glasses. Turning cartoon dogs into chatty tour guides is a way to show off what these key players hope the future of AR glasses will be. The caveat is that this service is available only for Snap Spectacles, so if you are comfortable walking around in public while wearing these big, chunky frames, you can talk to your make-believe alien dog. You also have stronger resilience against social anxiety than I do.
The tour guide experience won’t be available outside the Snap demo yet, as Niantic Spatial wants to take the time to ensure that, unlike Pokémon Go, the digital avatars won’t lead to anyone falling off cliffs.
“This is just our first step,” Berry says. “Let's figure out the personality.”

WIRED — AI Frontier
