Facebook’s quarterly earnings, released last month, surpassed most market expectations, sending its stock price to an all-time high. They also confirmed the company’s Teflon credentials: no public criticism ever seems to stick.
Wall Street has already forgiven Facebook its experiment on users, conducted without their knowledge, in which one group had more negative posts removed from their news feeds while another had more positive ones removed. The experiment revealed that those exposed to more positive posts felt happier and wrote more positive posts as a result. More positive posts mean more clicks, and more clicks mean more advertising revenue.
Troubling ethics notwithstanding, the experiment reveals a deeper shift in Facebook’s business model: the company can make money even when it deigns to allow its users a modicum of privacy. It no longer needs to celebrate ubiquitous sharing – only ubiquitous clicking.
On the earnings call, chief executive Mark Zuckerberg acknowledged that the company now aims to create “private spaces for people to share things and have interactions that they couldn’t have had elsewhere”. Accordingly, Facebook has recently allowed users to see how they are being tracked, and even to fine-tune that tracking so that they receive only the adverts they feel are relevant. The company, once a cheerleader for sharing, has even launched a nifty tool warning users against “oversharing”.
As usual with Facebook, this is not the whole story. For one thing, it has begun tracking users’ browsing history to identify their interests better. Its latest mobile app can identify songs and films playing nearby, nudging users to write about them. And it has acquired the Moves app, which does something similar for physical activity, using a phone’s sensors to recognise whether users are walking, driving or cycling.
Still, if Facebook is so quick to embrace – and profit from – the language of privacy, should privacy advocates not fear that they are the latest group to be “disrupted”? Yes, they should: as Facebook’s modus operandi mutates, their vocabulary ceases to match the magnitude of the task at hand. Fortunately, the “happiness” experiment also shows us where the true dangers lie.
Many commentators, for example, have attacked Facebook’s experiment for making some users feel sadder; yet the company’s happiness fetish is just as troubling. Facebook’s “obligation to be happy” is the converse of the “right to be forgotten” that Google was accused of trampling over. Both rely on filters. But while Google has begun to hide negative results because European authorities have told it to, Facebook hides negative results because doing so is good for business. And since unhappy people make the best dissidents in most dystopian novels, should we not also be concerned about all those happy, all too happy, users?
The happiness experiment confirms that Facebook does not hesitate to tinker with its algorithms when it suits its business or social agenda. Consider how, on May 1 2012, it altered its settings to let users declare their organ-donor status, complete with a link to their state’s donor registry. A later study found that this led to more than 13,000 registrations on the first day of the initiative alone. Whatever the public benefits, discoveries of this kind are clearly useful to companies and politicians alike. Alas, few nudging initiatives are as ethically unambiguous as organ donation.
The reason to fear Facebook and its ilk is not that they violate our privacy. It is that they define the parameters of the grey and mostly invisible technological infrastructure that shapes our identity. They do not yet have the power to make us happy or sad, but they will readily make us happier or sadder if it helps their earnings.
The privacy debate, incapacitated by misplaced pragmatism, defines privacy as individual control over information flows. This treats users as if they existed in a world free of data-hungry insurance companies, banks, advertisers and government nudgers. Can we continue feigning such innocence?
A robust privacy debate would ask who needs our data and why, while proposing institutional arrangements for resisting the path offered by Silicon Valley. Instead of bickering over interpretations of Facebook’s privacy policy as if it were the US constitution, why not ask how our sense of who we are is shaped by algorithms, databases and apps, which extend political, commercial and state efforts to make us – as the dystopian Radiohead song has it – “fitter, happier, more productive”?
That question stands outside the privacy debate, which, in the hands of legal academics, is disconnected from broader political and economic issues. The intellectual ping-pong over privacy between corporate counsels and legal academics moonlighting as radicals always avoids the most basic question: why build the “private spaces” celebrated by Mr Zuckerberg if our freedom to behave there as we wish – and not as companies or states nudge us to – is so limited?