
我们来聊聊门铃摄像头、走失的狗狗,以及监控无处不在的社会。

qimuai 发布 · 一手编译



内容来源:https://www.theverge.com/podcast/879203/ring-search-party-super-bowl-ai-surveillance-privacy-security

内容总结:

智能门铃制造商Ring近期因一项名为"搜寻派对"(Search Party)的AI寻宠功能陷入舆论风暴。该功能在"超级碗"赛事期间高调投放广告,宣称能协助用户快速找回走失的宠物,却意外引发公众对大规模监控的深切担忧。

争议焦点在于,同一套AI识别技术既能定位宠物,也可能被用于追踪特定人群,进而演变为侵犯公民隐私的工具。这种担忧在Ring长期与执法部门紧密合作的背景下被进一步放大。去年秋季,Ring宣布与安防公司Flock Safety建立合作,而后者系统曾被美国移民及海关执法局(ICE)使用,这直接触发了关于公民权利保障的激烈讨论。

舆论反弹迅速且猛烈。社交媒体监测数据显示,相关讨论在"超级碗"后两日达到峰值,情绪呈现压倒性负面。美国参议员埃德·马基直指广告呈现"反乌托邦"图景,呼吁亚马逊旗下Ring全面停止门铃人脸识别技术。面对压力,Ring在广告播出四天后紧急宣布终止与Flock的合作,称整合计划"需远超预期的时间与资源"。

公司创始人杰米·西米诺夫在近期访谈中重申其"消除犯罪"的使命愿景,强调AI将赋予摄像头智能判断能力,从简单运动检测升级为识别特定事件。他主张通过邻里安防网络与可控的视频分享机制,在提升社区安全的同时保障用户自主权。然而,当被问及"零犯罪"目标会否导向全天候主动监控时,西米诺夫否认会走向如此"反乌托邦"的境地。

当前,尽管争议性合作已被取消,"搜寻派对"功能仍默认开启(用户可手动关闭),而海量视频数据持续上传至科技公司的服务器。这些数据如何被存储、使用以及与政府机构的交互,仍处于缺乏透明监管的灰色地带。这场风波折射出更深层的时代命题:在人工智能与联网设备日益普及的今天,社会如何在技术创新、公共安全与个人隐私之间找到可持续的平衡点。

中文翻译:

今天,我们来聊聊摄像头公司Ring、走失的狗狗,以及监控社会。

我们来聊聊Ring、走失的狗狗,以及监控社会。

这家安防摄像头制造商在超级碗期间推广的"搜寻派对"功能,引发了一场关于监控的强烈反弹。

几周前的超级碗比赛中,你可能看到了这则广告:

由于在超级碗期间面向海量观众播出,Ring的"搜寻派对"广告迅速成为争议焦点——显而易见,同一项既能寻找走失狗狗的技术,也能被用来追踪人,进而被警察和普通民众以各种令人不适的方式侵犯我们的隐私。

Ring尤其以其与执法部门的合作为傲。这引发了关于我们公民权利的巨大疑问,特别是考虑到Ring去年秋天宣布与一家名为Flock Safety的公司建立合作关系,而移民海关执法局(ICE)能够访问该公司的系统。这其中有些复杂情况——我们稍后会再谈。

对Ring超级碗广告的反弹迅速、激烈且有效:数据公司PeakMetrics表示,在X等社交平台上关于该广告的讨论量在超级碗结束两天后达到顶峰,而他们监测到的舆论情绪明显是负面的。我的意思是,当运营"weratedogs"账号的马特·尼尔森都开始发布这样的视频时,你就知道情况不妙了:

参议员埃德·马基称这则广告"反乌托邦",并表示这证明拥有Ring的亚马逊需要停止在Ring门铃上使用所有人脸识别技术。他说:"这绝对与狗无关——这是关于大规模监控。"

接着,在2月12日星期四,也就是超级碗结束仅仅四天后,Ring宣布取消与Flock的合作关系,这一声明首先由The Verge的詹·图伊报道。这份声明本身也信息量巨大:

"经过全面审查,我们确定计划的Flock Safety集成需要比预期多得多的时间和资源。因此,我们共同决定取消计划的集成。该集成从未上线,因此没有任何Ring客户视频被发送给Flock Safety。"

该公司还继续提到,Ring摄像头曾被警方用于识别2025年12月布朗大学校园枪击案的枪手。在一份关于取消一项争议性合作的新闻稿中,这显得有点突兀,但也确实在很大程度上解释了Ring以及这家公司如何看待自己。

巧的是,Ring的创始人杰米·西米诺夫几个月前刚做客《解码器》节目,谈论了他如何以及为何创立这家公司,并详细解释了他为何将Ring的使命视为消除犯罪。不是销售摄像头、门铃、泛光灯或Ring制造的任何其他产品,而是消除犯罪。

我们确实讨论了"搜寻派对"以及人们对这类监控的感受,还有Ring如何与警方大量合作。事实上,杰米在2023年曾短暂离开Ring,公司也放缓了与执法部门的合作。但自从他回归后,对犯罪问题的关注以及与警方的合作只增不减。我问他:

尼莱·帕特尔: 你离开了,亚马逊说我们将停止与警方合作,你回来了,好家伙,Ring又要和警方合作了。你与制造泰瑟枪的Axon建立了合作关系,允许执法部门获取Ring的录像。这感觉像是一扇双向门吗?你不在的时候他们做了错误的决定,你回来后说,"我们要重新开始这么做"?

杰米·西米诺夫: 我不知道这是对是错,但我认为不同的领导者会做不同的事情。我确实花了很多时间跟随警车巡逻。我花了很多时间在那些对某些人来说并不安全的区域,我见过很多情况,我认为我们可以对它们产生积极影响。所以,我们与警方的合作方式不是……我想谨慎一点,因为我们不是……我们所允许的是,当事件发生时,机构可以请求获取录像。我们允许我们的邻居——在这一点上我要说明是我们的客户,为了清晰起见——我们允许我们的客户匿名决定是否愿意参与其中。

所以,如果他们决定不想成为这个网络的一部分,不想帮助这个向他们提出请求的公共服务机构,他们只需说不。如果他们决定愿意参与——顺便说一句,很多人都希望提高他们社区的安全性。很多人希望他们的孩子在更安全的社区长大,很多人希望拥有实现这一目标的工具,并且他们身处危险的地方。我们赋予他们说"是"的能力,并让他们与那些公共服务机构的沟通更高效,同时以非常可审计的数字格式进行。

这是另一方面。今天,没有这些工具,如果一名警察想去获取某事的录像,他们必须去敲你的门问你,这对任何人来说都不舒服。没有数字审计追踪记录,而有了这个,他们可以高效地完成并留下审计追踪。这非常清晰,而且是匿名的。

杰米实际上在这个背景下谈了很多关于寻找狗狗的事情,因为他如此兴奋地回到Ring的原因之一,就是利用人工智能来搜索Ring摄像头生成的海量视频。事实上,他告诉我,五年前Ring不可能构建"搜寻派对",因为当时还没有可用的AI系统来完成这件事。

杰米对此非常直接,我很欣赏这一点。他真的认为你可以使用AI和摄像头来减少甚至消除犯罪。但我对此有很多疑问:

杰米·西米诺夫: 但是当你把AI引入其中,突然间,你就有了AI赋予你的人性化元素。我认为,随着我们的产品进入社区——再次强调,你需要对此稍微具体一点——我确实看到了一条路径,我们可以真正开始将社区的犯罪率降至接近于零。我甚至说过,当然有些犯罪是你无法阻止的。

尼莱·帕特尔: 从机制上讲,向人们解释一下你的意思。你在一个社区安装足够多的Ring产品,然后AI对它们做什么来帮助你实现犯罪清零的使命?

所以,心智模型,或者说我的看法是,AI让我们能够拥有……如果你有一个拥有无限资源的社区,每栋房子都有保安,而且这些保安是那些在同一栋房子工作了10年或20年的人,我是从知识的角度来说的。所以,他们对那栋房子的了解是极致的;他们了解你、你的住所、你的家庭、你的生活方式、进出的人的一切。

然后,如果那个社区有一个业主协会,可以称之为私人安保,那些私人安保人员也在周围并且知道一切,会发生什么?当一只狗走失时,你会说:"哦,天哪,我的狗丢了。"嗯,他们会互相打电话,其中一个人会很快找到狗。所以,我们如何改变这一点并将其带入数字世界就是——

我能就那个具体的社区问你一个问题吗?

当然。

你有没有停下来想过,那个社区可能很糟糕?光是想象一下:我街上每栋房子都有无所不知的私人保安,我还得加入一个业主协会,而那个业主协会还养着一支私人安保力量。

你可以轻易地把这描绘成反乌托邦。每个人都如此害怕每个角落都有私人警察,我还要交业主协会费,这本身就是一场噩梦。

所以,我猜你住在一个安全的社区。

我希望如此,是的。

不,今天,我会去……如果你愿意,我可以带你去一个人们居住的地方,他们放学回家后必须锁上门待在家里,不能出去——

但我只是说,那个模式是"每个人都如此害怕,以至于他们有私人警察"。

我认为模式是,在像那样的社区犯罪是无利可图的,而且我认为你希望人们转向另一份工作。我不认为犯罪是好事,所以我认为……但是听着,这当然是一个值得争论的问题,我确实相信……我认为更安全的社区能让孩子们在更好的环境中成长,我认为这能让他们专注于重要的事情,这就是我们的目标。

我只是想质疑这个前提。

我认为这是一个合理的质疑。

模式就是到处都是警察。那种隐私程度。

是的,不是警察。我认为更多的是你将有能力了解正在发生的事情。不是像……但是,是的,我想,听着,这是一个合理的说法。我想我想住在一个安全的地方。

你的社区里有很多情报,也许是私人安保,也许不是。AI做什么?它只是让摄像头更智能吗?它让你对摄像头看到的东西进行更智能的评估?

现在,我们只说运动检测,运动检测,运动检测。有趣的是,当我创立Ring时……写那本书很有趣,因为我得以回顾并梳理整个故事,看这个东西是如何诞生的,运动检测是一个了不起的发明。你在机场,前门有动静,你看着它想:"哇,这太疯狂了。"

现在,有了AI,我们不应该再只告诉你检测到了运动;我们应该告诉你那里有什么、你什么时候该看、什么时候重要,而且我们不该一直打扰你。这就是我说的你家或你社区里有保安这个想法的意思。你的社区里应该有这种智能,可以告诉你什么时候你应该尝试参与某事,但不是总是告诉你。所以,它不仅仅是"车,车,狗,人,人",而是像:"嘿,看看这个。你现在需要注意这个。"

我真的很想追问杰米这一点,因为我仍然不认为Ring如何仅通过AI就实现消除犯罪是完全清楚的。这也是为什么当公司说不会使用能找到狗的系统去做其他侵犯我们权利的事情时,人们不信任它。毕竟,如果你的目标是使用AI来阻止犯罪,而你构建了一个能找到狗的AI系统……嗯,接下来会发生什么就很明显了,对吧?

尼莱·帕特尔: 当你谈到将社区犯罪清零时,你认为社区里每个人前院都有一个那种发光的Ring标志牌,这就足够了吗——

杰米·西米诺夫: 这是其中的一部分。

这足以起到威慑作用吗?坏人会知道他们的脸会被视频捕捉到,并被AI分析,然后会发生一些事情。你需要做更多的外向威慑吗?

我认为这是其中的一部分。意识是很大一部分。我认为也有使用灯光的方式,利用照明来做一些事情,这也是很大一部分。我认为仅仅……如果,突然间,有人因为出现异常情况而出来查看,这也是很大一部分。不一定是什么疯狂的事情。这就是我说的,很多这些小事情加起来才能实现这个目标。

所以,当你思考时,好吧,我们可以将一个社区的犯罪率降至接近于零,那么逐步推进的步骤是什么?是每个人都装上Ring摄像头,然后你的平台完成所有工作吗?还是有人被抓,然后他们在监狱里告诉所有朋友他们被抓了?步骤是什么?

我认为这实际上是为了让邻居们为了这个特定的事情团结起来。所以,关键在于你个人如何……我们一直认为每栋房子都是由邻居控制的独立节点,也就是由个人控制,我会一直回到这一点,那就是百分之百,你的视频由你控制;你所做的一切都由你控制,你是否想参与任何事情都由你控制。这必须是所有这一切的第一层。

但是,当事情发生时,你想参与吗?所以,如果你收到一个警报,说这只狗看起来像在你家门前的那只狗,你能联系你的邻居吗?你可以决定不参与,那么永远不会有人知道,这没关系,基本上就是删除了,或者你可以参与。我认为这就是我们如何能让一个社区变成这样一个节点,每个邻居都是独立的,但当事情发生时,他们可以按照自己的意愿合作。

你认为AI会加速这个过程吗?

我认为AI是一个副驾驶。它是他们的助手,帮助他们弄清楚这一点。因为,再次强调,如果你只是收到每一个运动警报,如果你有八个摄像头,整天都在收到运动警报,没有人能解析所有这些数据。这就是我和詹谈论的,我确实认为我看到了一种利用AI为我们提供更好数据的方法,这让我们能做出更好的决定,更好地合作。

这就谈到了Flock,Ring在2025年10月宣布与其建立合作关系。Flock主要为警方制造摄像头和搜索视频的系统。你可能在你住的地方见过Flock的设备——它们是那些安装在路灯上或停车场中央的小型太阳能摄像头和跟踪设备。它们收集大量数据,公司声称这些数据在提供给合作伙伴(通常是当地执法部门)之前是匿名的。

然而,根据优秀的404 Media的深入报道,Flock的数据经常在不需要搜查令的情况下,被提供给ICE、FBI、特勤局和其他执法机构。这是因为这些数据是当地警方自愿提供的。

上个月,在关于与Flock的交易意味着什么的严格审查下,Ring表示这种合作关系尚未"上线",并且"Ring与ICE没有合作关系,不向ICE提供视频、数据流或后端访问权限,也不与他们共享视频。"就Flock而言,他们的说法也一样——他们实际上不与ICE合作,而是与当地执法部门合作,是那些当地机构与ICE合作。这就是我之前提到的复杂情况。

如果你是《解码器》的听众,你知道这一切的走向。我问杰米关于所有这些数据库,谁拥有它们,以及用AI将它们全部连接起来意味着什么:

尼莱·帕特尔: 但是当你连接一堆这样的数据库,特别是连接到人脸识别时,隐私对话就会出现一个转折点,风险会急剧上升,也许隐私就永远消失了。

你如何考虑这个决策过程?好吧,我们的AI有很多情报;AI连接到另一个信息库是轻而易举的事。这是你可以用AI做的事情,尤其是在像亚马逊这样的大公司,你有很多其他信息库。有一条线,对你来说这条线在哪里?

杰米·西米诺夫: 显然,构建安全的产品是一种责任。我们就从这里开始。是的,我们确实宣布了人脸识别,我们称之为"熟悉面孔",但那不是联网的,那只是用于你的……就像你今天的iPhone。如果你搜索你的iPhone,这很疯狂。在你的照片中搜索某人的名字,他们的照片就会出现。

所以我认为,在允许那些应该存在、能帮助人们、给他们更高效率、更安全家园的技术存在,与显然不创造这个反乌托邦的地方之间,需要取得平衡。所以,我认为这就是责任,但我们用"熟悉面孔"所做的是,我们只是赋予你能力,让你可以说,当我妻子回家时,不要……因为这很傻。为什么我妻子回家时我会收到警报?我不想要,我不需要。

我问这个问题有很多原因,但我看看世界上监控录像的广泛情况。我不是说Ring参与其中,我只是给你举个例子。ICE有人脸识别系统,他们辩称,在其人脸识别系统中匹配成功,就是对某人移民身份的最终确定。这太离谱了。我不认为你在做那种事。

但你可以发展到,"好吧,我们有人脸识别,我们有大量来自Ring摄像头的证据,为了让它真正安全,你想从被动监控转向主动监控。研究显示是这样。现在摄像头将直接通过面部识别罪犯,并告诉警察这个人试图从这个车道上偷车,"而这正是让你真正实现犯罪清零的事情。

这些步骤中有很多风险。但如果我从你所说的理出一条线,那就是一直延伸到这样一个想法:罪犯不会来这里,因为摄像头会知道他们是谁并告诉警察。你愿意走那么远吗?

我认为还有摄像头会提醒人们。Ring之所以成功,以及Ring 1.0让邻居更安全的部分原因——我认为我们现在处于Ring 2.0——是因为家里没有人在场。人们是如何闯入住宅的?他们会去当"敲门"窃贼。他们会敲门,没人在家。下午3点,他们会去隔壁的房子,找到一个没人的地方,然后进入房子。

Ring让你能够,突然间,当有人来到门前时,你会说:"哦,我收到一个运动警报。嗨,有什么事吗?"这样它就给了家一种在场感。所以,我认为要达到我们谈论的程度,你不必走那么远去做那种实时的事情,我认为更多的是异常检测,让人们能够做到,如果有人进来,你能意识到社区周围正在发生什么,因为现在对周围发生的事情没有意识。所以我不认为它像你说的那样反乌托邦,当然也不是我们正在构建的,而且我确实认为我们可以在社区层面产生非常高水平的影响。再次提到詹·图伊那件事,我们谈论的是在社区里,通过AI和我们用一堆Ring设备一起做的事情。我认为即使是"狗狗搜寻派对"也是一个很好的观察角度,看看这些摄像头如何在社区里为善而协同工作。

今天,对监控的强烈反对导致Ring终止了与Flock的交易。该公司也在对"搜寻派对"进行损害控制,告诉The Verge,支持该功能的技术不具备用于寻找人的"能力",并且没有迹象表明此类功能会出现在未来的路线图上。

当然,Ring显然需要赢回很多信任,而且我们肯定都在思考,在家里安装联网安防摄像头意味着什么。我认为这是好事——而且肯定早就该思考了。

但让我给你稍微复杂化一下。在关于Ring的讨论进行的同时,我们也看到普通人用手机记录警察和ICE,捕捉这些机构如何侵犯普通美国人权利的关键证据,这些证据正在导致改变,尽管缓慢。

明尼苏达州州长蒂姆·沃尔兹现在告诉人们,看到ICE时就开始录像,以便这些录像能用于未来的起诉。就在上周,FBI发布了一段视频,谷歌似乎专门从南希·格思里家中的Nest摄像头系统中恢复出来,以帮助识别绑架她的人。

有很多视频正以可能感觉不那么具有侵入性的方式被捕捉和使用,在某些情况下甚至让人感觉良好。但创建、存储和共享这些视频的系统是同一套,而围绕它们的防护栏,与那些让我们对Ring感到不安的东西一样薄弱。

我不确定该如何看待所有这些视频,尤其是在一个AI让伪造变得容易的世界里,拥有某种真相来源似乎比以往任何时候都更重要。我也问了杰米这个问题——Ring是否会控制所有视频,或者用一些元数据签名,以便人们能确保它是真实的。

尼莱·帕特尔: 假设我们必须有一个经过认证的服务器,我的社区发生了犯罪,我选择了加入,并且我们要说警察只能从Ring服务器获取视频,我们知道那里的是真实的。我可能不再像以前那样完全控制我的视频了。

杰米·西米诺夫: 不,不是这样构建的,只要我在这里就不会,因为它的工作方式是,你将决定是否想与某人分享那段视频,那是你的财产。现在,一旦你分享了它,那么就要由我们来弄清楚,按照你的观点,我们如何分享它,我们如何确保数字指纹贯穿始终,或者这段视频的保管链如何运作以确保过程中没有伪造?我认为这就是为什么构建这些系统很重要。

不过,这将会很重要。这也是政府必须介入的地方。我们将不得不全面处理这个问题,因为我们也有来自手机的视频。所以,我们确实需要弄清楚如何构建……而且将会有公司,Axon可能就是其中之一。我不想替他们说话,但他们有evidence.com,所以构建这些证据系统来接收……

因为Ring只是接收数据——可以说是犯罪现场数据——的一部分,但手机视频今天可能是更重要的来源。那么,你如何接收它?你如何确保它确实是直接在iPhone上捕捉的,而不是在两者之间被篡改?我们将不得不弄清楚这一切。我认为我们必须共同努力,而AI技术正在推动我们去做。我自豪的是,通过Ring,我们已经构建了它,让你可以直接获取并保存在服务器上。你可以了解它在哪里,来自哪里,创建于何处,我们在上面有数字指纹和审计追踪。

随着世界的变化,你将不得不越来越多地这样做,你将无法仅仅因为有人发给你一段视频就相信它是真的。

你会感觉到,在未来几年里,我们会经常回到这个想法。

但今天Ring已经取消了与Flock的交易,而Flock本身正在发布博客文章,断然声明其与ICE没有合同,并指出Ring的另一个合作伙伴Axon确实与ICE有合同。

与此同时,"搜寻派对"仍然处于活动状态且默认开启,尽管你可以进入设置将其关闭。而我们所有人产生的海量视频正被上传到由大公司运营的服务器上,这些公司与政府和执法机构有着我们远无法控制的往来。

对于所有这些问题有很多解决方案,也有很多方法来设计法规,以平衡隐私和公民自由与警察的需求。但现在,在2026年的美国,我不确定我们是否真的能做到。

所以我们将继续追问这些公司的领导者他们的真实意图,并持续播报答案,以便你们可以倾听并决定。我认为是时候开始思考我们用来改善自己生活的所有技术如何影响他人了。

因为我们需要的那个更大的对话?是的,那才是真正重要的。

对本集有任何问题或评论?请发邮件至decoder@theverge.com。我们真的会阅读每一封邮件!

解码器,与尼莱·帕特尔一起

来自The Verge的播客,关于大想法和其他问题。

英文来源:

Today, let’s talk about the camera company Ring, lost dogs, and the surveillance state.
Let’s talk about Ring, lost dogs, and the surveillance state
The security camera maker’s Search Party feature, advertised during the Super Bowl, has sparked a surveillance backlash.
You probably saw this ad during the Super Bowl a couple of weekends ago:
Since it aired for a massive audience at the Super Bowl, Ring’s Search Party commercial has become a lightning rod for controversy — it’s easy to see how the same technology that can find lost dogs can be used to find people, and then used to invade our privacy in all kinds of uncomfortable ways, by cops and regular people alike.
Ring in particular has always been proud of its cooperation with law enforcement. That raises big questions about our civil rights, especially since Ring announced a partnership last fall with a company called Flock Safety, whose systems have been accessed by ICE. There’s some complication to that — we’ll come back to it in a bit.
The backlash to Ring’s Super Bowl ad was swift, intense, and effective: the data company PeakMetrics says conversation about the ad on social platforms like X actually peaked two days after the Super Bowl, and the vibes, as they measured them, were strikingly negative. I mean, you know it’s bad when Matt Nelson, who runs the weratedogs account, is posting videos like this:
Sen. Ed Markey called the ad “dystopian” and said it was proof Amazon, which owns Ring, needed to cease all facial recognition technology on Ring doorbells. He said, “This definitely isn’t about dogs — it’s about mass surveillance.”
And then, on Thursday, February 12th, just four days after the Super Bowl, Ring announced it was canceling its partnership with Flock, in a statement first reported by The Verge’s Jen Tuohy. That statement itself is a lot:
Following a comprehensive review, we determined the planned Flock Safety integration would require significantly more time and resources than anticipated. As a result, we have made the joint decision to cancel the planned integration. The integration never launched, so no Ring customer videos were ever sent to Flock Safety.
The company also goes on to say that Ring cameras were used by police in identifying a school shooter at Brown University in December 2025. It’s an odd non sequitur in a press release about canceling a controversial partnership that really explains a lot about Ring, and how the company sees itself.
As it happens, Ring’s founder Jamie Siminoff was just on Decoder a few months ago, talking about how and why he founded the company, and in detail about why he sees Ring’s mission as eliminating crime. Not selling cameras or doorbells, or floodlights, or anything else Ring makes, but getting rid of crime.
We actually talked about Search Party and how people might feel about that kind of surveillance, and how Ring works with the cops quite a bit. In fact, Jamie briefly left Ring in 2023, and the company slowed down on its work with law enforcement. But ever since he returned, the emphasis on crime and the work with police has only intensified. I asked him about it:
NILAY PATEL: You left, Amazon said we’re going to stop working with police, you came back, boy, Ring is going to work with police again. You have a partnership with Axon, which makes the taser, that allows law enforcement to get access to Ring footage. Did that feel like a two-way door? They made the wrong decision in your absence, and you came back and said, “We’re going to do this again”?
JAMIE SIMINOFF: I don’t know if it’s wrong or right, but I think different leadership does different things. I do believe that I spent a lot of time going on ride-alongs. I spent a lot of time in areas that I’d say are not safe for those people, and I’ve seen a lot of things where I think we can positively impact them. So, we don’t work with police in the way of ... I just want to be careful, as we’re not ... What we do allow is for agencies to ask for footage when something happens. We allow our neighbors, which I’ll say in this point are our customers, just to be clear, we allow our customers to anonymously decide whether or not they want to partake in that.
So, if they decide they don’t want to be part of this network and don’t want to help this public service agency that asks them, they just say no. If they decide that they do want to, which, by the way, a lot of people want to increase the security of their neighborhoods. A lot of people want their kids to grow up in safer neighborhoods, a lot of people want to have the tools to do that, and are in places that are dangerous. We give them the ability to say yes and make it more efficient for them to communicate with those public service agencies, and also do it in a very auditable digital format.
That’s the other side. Today, without these tools, if a police officer wanted to go and get footage from something, they’d have to go and knock on the door and ask you, and that’s not comfortable for anyone. There’s no digital audit trail of it, and, with this, they can do it efficiently with an audit trail. It is very clear, and it’s anonymous.
Jamie actually talked a lot about searching for dogs in this context, because one of the reasons he was so excited to come back to Ring was to use AI to search through the massive amounts of video generated by Ring cameras. In fact, he told me that Ring could not have built Search Party five years ago, because AI systems to do it weren’t available.
Jamie is nothing if not direct about this, which I appreciate. The man really thinks you can use AI and cameras to reduce or even eliminate crime. But I had a lot of questions about this:
JAMIE SIMINOFF: But when you put AI into it, now, all of a sudden, you have this human element that AI gives you. I think, with our products in neighborhoods and, again, you have to be a little bit specific to it, I do see a path where we can actually start to take down crime in a neighborhood to call it close to zero. And I even said, there are some crimes that you can’t stop, of course.
NILAY PATEL: Mechanically, walk people through what you mean. You put enough Ring products in a neighborhood, and then AI does what to them that helps you get closer to the mission of zeroing out crime?
So, the mental model, or how I look at it, is that AI allows us to have ... If you had a neighborhood where you had unlimited resources, so every house had security guards and those security guards were people that worked the same house for 10 years or 20 years, and I mean that from a knowledge perspective. So, the knowledge they had of that house was extreme; they knew everything about you and that residence and your family, how you lived, the people that came in and out.
And then, if that neighborhood had an HOA with, call it private security, and those private security were also around and knew everything, what would happen? When a dog gets lost, you’d be like, “Oh, my gosh, my dog is lost.” Well, they would call each other, and one of them would find the dog very quickly. So, how do we change that and bring that into the digital world is—
Can I just ask you a question about that neighborhood specifically?
Sure.
Do you ever stop and consider that that neighborhood might suck? Just the idea that every house on my street would have all-knowing private security guards, and I would have an HOA, and that HOA would have a private security force.
You can easily paint that as dystopia. Everyone’s so afraid that we have private cops on every corner, and I’m paying HOA fees, which is just a nightmare of its own.
So, I would assume you live in a safe neighborhood.
I hope so, yeah.
No, today, I’d go to ... If you want, I’ll take you to a place where people live and have to, when they get home from school, lock their doors and stay in their house, and they can’t go out and—
But I’m just saying that that model is “everybody is so afraid that they have private cops.”
I think the model is that doing crime in a neighborhood like that is not profitable, and I think that you want people to move into another job. I don’t think that crime is a good thing and so I think ... But listen, it certainly is an argument to have, I do believe that ... I think safer neighborhoods allow for kids to grow up in a better environment and I think that allows them to be able to focus on the things that matter and so that’s what we’re going for.
I just wanted to challenge the premise.
I think it’s a fair challenge.
The model is that there are cops everywhere. That level of privacy.
Yeah, it’s not cops. I think it’s more that you’ll have the ability to understand what’s happening. It’s not like ... But yeah, I think, listen, it’s a fair statement, I guess. I think I want to live in a safe place.
There’s a lot of intelligence in your neighborhood, and maybe it’s private security, maybe it’s not. What does the AI do? Does it just make the camera smarter? It lets you do a more intelligent assessment of what the cameras are seeing?
Right now, we just say motion detection, motion detection, motion detection. It’s funny, when I started Ring… The book was fun because I got to go back and actually go through this whole story of how this thing came to be, and motion detection was an amazing invention. You’re in the airport, and there’s a motion at your front door, and you look at it like, “Wow, this is crazy.”
Now, with AI, we shouldn’t be telling you about motion detection; we should be telling you what’s there, when you should look at it, when it matters, and we shouldn’t be bothering you all the time. That’s what I mean by this idea of these security guards at your house or in your neighborhood. There should be this intelligence in your neighborhood that can tell you when you should be trying to be part of something, but not always tell you. So, it’s not just like, “Car, car, dog, person, person.” It’s like, “Hey, look at this. You want to pay attention to this right now.”
I really pressed Jamie on this because I still don’t think it is entirely clear how Ring accomplishes the elimination of crime through AI alone. And it’s why people don’t trust the company when it says it won’t use systems that can find a dog to do things that otherwise violate our rights. After all, if your goal is to use AI to stop crime, and you built an AI system that can find a dog… well, it’s pretty obvious what comes next, right?
NILAY PATEL: Do you think when you talk about zero out crime in a neighborhood, the idea that everyone in a neighborhood has one of those illuminated Ring signs in the front yard, is that enough to—
JAMIE SIMINOFF: It’s a part of it.
Is that just enough of a deterrent? The bad guys will know their face is going to be captured on video, and that will be analyzed by an AI, and something will happen. Do you have to do more outbound deterrents?
I think that’s a part of it. Awareness is a big part of it. I think there are ways with lights also, using lighting to do stuff, that’s a big part of it. I think having just ... If, all of a sudden, someone comes outside because something’s an anomaly, that’s a big part of it. It doesn’t have to be some crazy thing. And that’s what I was saying, is a lot of these little things add up to make that work.
So, when you think about it, okay, we can bring crime down in a neighborhood to close to zero in a neighborhood, what are the ratcheting steps? Does everyone just get the Ring camera, and your platform does all the work? Is it that someone gets caught and they tell all their friends in jail that they got caught? What are the steps?
I think it’s really about bringing neighbors together for this particular thing. So, it’s about how you individually… and we’ve always thought about how each house is its own node controlled by the neighbors, so controlled by the person, and I’ll keep going back to that, which is one hundred percent, your video is in your control; everything you’re doing is in your control, whether you want to take part in anything is in your control. That has to be the first layer of all of it.
But then, when something happens, do you want to take part in it? So, if you get an alert that this dog looks like the dog that’s in front of your house, can you contact your neighbor? You can decide not to take part in it, and then no one will ever know, and it’s fine, it’s just basically deleted, or you can take part in it. I think that’s how we can do things that can make a neighborhood into this node where individual neighbors are all on their own, but when things happen, they can work together as they want to.
And you think that AI will accelerate the process?
I think AI is a co-pilot. It is their assistant, and it’s helping them to figure this out. Because, again, if you’re just getting every motion alert, and if you have eight cameras and you’re just getting motion alerts all day, no human being can parse all this data. So that’s what I was talking to Jen about, is that I do think I see a way to use AI to help feed better data to us, which allows us to make better decisions and work together better.
This is where we get to Flock, which Ring announced a partnership with in October 2025. Flock primarily makes cameras and systems to search video for the cops. You’ve probably seen Flock’s devices where you live — they’re those little solar-powered cameras and tracking devices affixed to streetlights or placed in the center of parking lots. They vacuum up large amounts of data, and the company claims it is anonymized before it’s made available to partners, which in most cases is local law enforcement.
However, according to in-depth reporting from the excellent 404 Media, Flock’s data has often found its way to ICE, the FBI, the Secret Service, and other law enforcement agencies, without the requirement of a warrant. That’s because that data is willingly provided by local police.
Last month, under intense scrutiny about what a deal with Flock would mean, Ring said that this partnership was not yet “live,” and that, “Ring has no partnership with ICE, does not give ICE videos, feeds, or back-end access, and does not share video with them.” For its part, Flock says the same — that it doesn’t actually work with ICE, but rather local law enforcement, and it’s those local agencies that work with ICE. This is the complication I mentioned earlier.
If you’re a Decoder listener, you know where this is all going. I asked Jamie about all these databases, who owns them, and what it means to connect them all with AI:
NILAY PATEL: But when you connect a bunch of those databases, particularly to facial recognition, there’s a turn in the privacy conversation where the stakes ratchet up really high, where maybe it’s gone forever.
How are you thinking about that decision-making? Okay, we have a lot of intelligence in the AI; it’s trivial for the AI to connect to another store of information. That’s a thing you can do with AI, especially at a big company like Amazon, where you have lots of other stores of information. There’s a line, what’s the line for you?
JAMIE SIMINOFF: There is a responsibility, obviously just to build safe products. So let’s just start with that. Yeah, we did announce facial, we call it Familiar Faces, but that’s not connected, that’s just for your... Your iPhone today. If you search your iPhone, it’s crazy. Search for someone’s name in your photos, and their pictures come up.
So I do think there’s a balance between not allowing technology to exist that should exist that helps people and gives them more efficiency, gives them safer homes and then also, obviously, not creating this dystopian place. And so, I think that’s the responsibility, but what we’re doing with Familiar Faces is we’re just giving you the ability to say, when my wife comes home, don’t... Because it is silly. Why do I get an alert when my wife comes home? I don’t want it, I don’t need it.
I’m asking this for a lot of reasons, but I look at what’s broadly happening with surveillance footage out in the world. And I’m not saying Ring is participating in this, I’m just giving you an example. ICE has facial recognition systems, and they are arguing that a positive match in their facial recognition system is a definitive determination of someone’s immigration status. That’s way out there. I don’t think you’re doing that.
But you can get to, “Okay, we have facial recognition, we have a bunch of evidence coming off of Ring cameras, to make it really safe, you want to go from passive surveillance to active surveillance. That’s what the studies show. Now the camera will literally identify the criminal by face and tell the cops this person tried to steal a car from this driveway,” and that’s the thing that would get you to actually zero out crime.
There’s a lot of risk in those steps. But if I draw the thread from what you’re saying, it’s all the way to the idea that the criminals won’t come here because the cameras will know who they are and tell the cops. Are you willing to go that far?
I think it’s also that the cameras will alert people. Part of what made Ring and what made neighbors safer with Ring 1.0, and I think we are in Ring 2.0, is that there was no presence at the home. How did people break into homes? They would go and be knock-knock burglars. They would knock-knock, no one was home. It was 3PM, they’d go to the homes next door, find a place that was empty, and they’d go into the home.
Ring allowed you to, now, all of a sudden, when someone comes up to the door, you’re like, “Oh, I got a motion alert. Hi, what’s going on?” and so it gave a presence to the home. So, I don’t think you have to go as far as that real time stuff to get to where we’re talking about, I think it’s more of the anomaly detection and allowing people to make it so that, if someone comes in, that you’re aware of what’s happening around the neighborhood because right now there’s no awareness of what’s going on around it. So I don’t think it’s as dystopian as where you’re going, and certainly it’s not what we’re building, and I do think we can impact things to a really high level in neighborhoods. Which, again, to the Jen Tuohy thing, in neighborhoods is what we were talking about, that with AI and what we’re doing with a bunch of Rings together. I think even the Dog Search Party is a good way to look at it, which is how these cameras come together for good in the neighborhood.
Today, the backlash against surveillance has led Ring to kill that Flock deal. The company is also doing damage control about Search Party, telling The Verge that the technology that powers the feature is not “capable” of being used to find people, and that there’s no indication that such features are on future road maps.
Sure, it seems obvious that Ring has a lot of trust to earn back here, and certainly, we are all thinking about what it means to put internet-connected security cameras in our homes. I think that’s good — and certainly overdue.
But let me complicate this a little for you. At the same time this conversation about Ring is happening, we’re also watching regular people record the police and ICE with their cell phones, capturing critical documentary evidence of how those agencies are violating the rights of everyday Americans in ways that are leading to change, however slow.
Minnesota governor Tim Walz is now telling people to hit record when they see ICE so the footage can be used in future prosecutions. And just last week, the FBI released video that Google appears to have specifically recovered from a Nest camera system at Nancy Guthrie’s house, in order to help identify her kidnapper.
That’s a lot of video that’s being captured and used in ways that maybe don’t feel so invasive, and in some cases even feels good. But the systems that create, store, and share that video are all the same, and the guardrails around them are just as weak as whatever makes us feel uncomfortable about Ring.
I’m not sure how to feel about all of that video, especially in a world where AI makes it easy to fake, and having some source of truth seems more important than ever. I asked Jamie about that, too — whether Ring will control all the video, or sign it with some metadata so people can ensure it’s real.
NILAY PATEL: Presuming we have to have an authenticated server, there’s a crime in my neighborhood, and I’ve opted in, and we’re going to say the cops can only get the video from the Ring server, where we know it’s true. I might not be as in control of my video anymore.
JAMIE SIMINOFF: No, not how it’s built and not while I’m here because the way it works is that you will decide if you want to or not want to share that video, which is your property, with someone. Now, once you share it, then it is up to us to figure out, to your point, how do we share it, how do we make sure that the digital fingerprint goes all the way through, or how does the chain of custody work of this video to make sure there’s no fake in the process of it? I think this is why it is important to build these systems.
It’s going to be important, though. This is also where the government is going to have to step in. We’re going to have to deal with this across the board because we also have video coming off of cell phones. So, we do need to figure out how to build... And there’s going to be companies, Axon would probably be one of the companies. I don’t want to speak for them, but they have evidence.com, so to build these evidentiary systems to take in…
Because Ring is one part of taking in data around, call it a crime scene, but cell phone video is maybe even more of a source today. So, how do you take that in? How do you make sure that it actually was captured on the iPhone directly and not tampered with between the two things? We’re going to have to figure it all out. I think we have to work together on it, and the AI stuff is pushing us to do it. I am proud that with Ring, we have built it so that you can take it directly and keep it on the server. You can understand where it was, where it’s from, where it was created, and we have that digital fingerprint on it and the audit trail of it.
You’re going to have to do that more and more as this world is changing, you’re just not going to be able to trust that just because someone sends you a video doesn’t mean it’s true.
You get the feeling we’ll be coming back to that idea quite a lot in the years to come.
But today Ring has canceled its deal with Flock, and Flock itself is putting out blog posts flatly stating it does not have a contract with ICE, and noting that Ring’s other partner, Axon, does in fact have an ICE contract.
In the meantime, Search Party is still active and on by default, although you can just go into settings and flip it off. And the enormous amount of video we all generate is being uploaded to servers run by big companies that have their own dealings with governments and law enforcement agencies far outside of our control.
There are a lot of solutions to all these problems, and lots of ways to design regulations to balance out privacy and civil liberties with the needs of police. But right now, in 2026 America, I’m not sure we’re really going to be able to do that.
So we’re going to keep pushing the leaders of these companies on what they really mean, and keep running the answers so you can listen and decide. I think it’s about time we started thinking about how all the technology we use to make our own lives better affects other people.
Because that bigger conversation we need to have? Yeah, that’s what it’s really about.
Questions or comments about this episode? Hit us up at decoder@theverge.com. We really do read every email!
Decoder with Nilay Patel
A podcast from The Verge about big ideas and other problems.
