Imagine being talked about behind your back. Now picture that conversation taking place covertly in your own sitting room, with you unable to hear it.

That is the modus operandi of SilverPush, an Indian start-up that embeds inaudible sounds in television advertisements. As the advert plays, a high-frequency signal is emitted that can be picked up by a mobile or other device installed with an app containing SilverPush software. This “pairing” — currently targeted at Indian consumers — also identifies users’ other nearby devices and allows the company to monitor what they do across those. All without consumers hearing a thing.
This “cross-device tracking technology”, being explored by other companies including Adobe, is an emblem of a new era with which all of us — governments, companies, charities and consumers — will have to contend.
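The listening side of such a scheme is, in principle, straightforward, which is part of what makes it unsettling. The toy Python below shows how an app with access to a microphone buffer might spot a near-ultrasonic beacon hidden in a soundtrack; the 18 kHz carrier, the detection threshold and the synthetic audio are assumptions made for illustration, not details of SilverPush's actual system.

```python
# Conceptual sketch only. The 18 kHz carrier, the detection threshold and the
# synthetic audio below are assumptions for illustration, not SilverPush's
# actual encoding scheme.
import numpy as np

SAMPLE_RATE = 44_100   # Hz; a common audio sampling rate
BEACON_FREQ = 18_000   # Hz; near-ultrasonic, inaudible to most adults
BANDWIDTH = 200        # Hz; width of the band inspected around the carrier

def beacon_present(audio: np.ndarray, threshold: float = 10.0) -> bool:
    """Return True if a narrow spectral peak near BEACON_FREQ stands well
    above the average energy of the rest of the spectrum."""
    spectrum = np.abs(np.fft.rfft(audio))
    freqs = np.fft.rfftfreq(audio.size, d=1.0 / SAMPLE_RATE)
    band = (freqs > BEACON_FREQ - BANDWIDTH) & (freqs < BEACON_FREQ + BANDWIDTH)
    return spectrum[band].max() > threshold * spectrum.mean()

if __name__ == "__main__":
    t = np.arange(SAMPLE_RATE) / SAMPLE_RATE               # one second of audio
    audible = np.sin(2 * np.pi * 440 * t)                  # ordinary soundtrack content
    beacon = 0.05 * np.sin(2 * np.pi * BEACON_FREQ * t)    # quiet, inaudible tone
    print(beacon_present(audible))            # False: nothing embedded
    print(beacon_present(audible + beacon))   # True: beacon detected
```

A production system would presumably also encode an identifier in the tone and listen continuously in the background, but the detection step alone is enough to show how little the viewer would ever hear.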
Last month, the Royal Statistical Society hosted a conference at Windsor castle to ponder the challenges of Big Data — an overused, underexplained term for both the flood of information churned out by our devices and the potential for this flood to be organised into revelatory and predictive rivers of knowledge.

The setting was apt: the ethics and governance surrounding the growing use of data are a right royal mess. Public discussion about how these vast quantities of information should be collected, stored, cross-referenced and exploited is urgently needed. There is excitement about how it might revolutionise healthcare — during outbreaks of disease, for example, search data can be mined for the greater good. Today, however, public engagement largely amounts to public outcry when things go wrong.
The extent to which tech shapes our lives — the average British adult spends more than 20 hours a week online, according to a report by UK media regulator Ofcom — means our behaviour, habits, desires and aspirations can be revealed by our swipes and keystrokes.
This has made analysis of online behaviour a new Klondike. Personal data are like gold dust, and we surrender them every time we casually click “OK” to a website’s terms and conditions.

And here is our first problem: most of us click unthinkingly (it is usually impenetrable legalese, anyhow). It is thus questionable whether we have given informed consent to all the ways in which our personal data are subsequently used. To demonstrate this, a security company set up a public WiFi spot in the City of London and inserted a “Herod clause” committing users to hand over their firstborn for eternity. Within a short period of time, several people unwittingly bartered away their offspring in return for a free connection.
Legal challenges aside, there is rarely independent scrutiny of what is a fair and reasonable relationship between an online company and its consumers. Facebook fell foul of this when it manipulated the news feeds of nearly 700,000 users for a psychology experiment. Users claimed they had been duped by the study, which found that those exposed to fewer positive news stories were more likely to write negative posts. The company retorted that consent had already been given. Approval last week of EU data protection rules permitting hefty fines for privacy breaches may prevent a repetition; consent will no longer be the elastic commodity it was.
A second challenge arises from the so-called internet of things, when devices bypass humans and talk directly to one another. So my depleted smart fridge could automatically email the supermarket requesting replenishment. But it could also mean my gossiping gadgets become a network of electronic spies that can paint a richly detailed picture of my prandial and other proclivities, raising privacy concerns. Indeed, at a robotics conference last month, technologists identified the ability of robots to collect data, especially in private homes, as the single biggest ethical issue in that field.
Alongside the new EU rules on data protection, we need something softer: a body of experts and laypeople that can bring knowledge, wisdom and judgment to this fast-moving field. There is already a Council for Big Data, Ethics and Society in the US, comprising lawyers, philosophers and anthropologists.
Europe should follow this example — because, as a stream of anecdotes at the Windsor conference revealed, companies and academics seem to be navigating this new data-rich world without a moral compass. In 2012 a Russian company created Girls Around Me, an app that pooled publicly available information to show the real-time locations and pictures of nearby women, without their consent; the app, a stalker’s dream, was withdrawn. High-tech rubbish bins in London’s Square Mile, which captured information from smartphones to track unwitting owners’ movements in order to target them with advertising, were ditched on grounds of creepiness.
Meanwhile, a scientist has created software that combs Twitter connections to infer a tweeter’s ethnicity and even religion, raising the question of whether public posts can legitimately be used to deduce private information. Do we, as one lawyer suggested, need laws against misuse of our online personae?
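To see why that question is live, consider how little machinery such an inference needs. The sketch below guesses an undisclosed trait for a user purely from the publicly visible accounts they follow, by taking a majority vote over labels already known for those accounts. The follow graph, account names and labels are invented, and this is only the general idea that people tend to connect with others like themselves, not the scientist's actual method.

```python
# Toy illustration only: the follow graph, account names and labels are
# invented, and this majority-vote rule is a generic stand-in for the kind
# of inference the article describes, not any researcher's actual method.
from collections import Counter

# Hypothetical public data: who follows whom.
follows = {
    "alice": ["news_uk", "charity_a", "club_x"],
    "bob": ["club_x", "club_y"],
}

# Hypothetical labels already known for some prominent accounts.
known_labels = {
    "news_uk": "group_1",
    "charity_a": "group_1",
    "club_x": "group_2",
    "club_y": "group_2",
}

def infer_label(user):
    """Guess a user's label from the labels of the accounts they follow."""
    votes = Counter(
        known_labels[account]
        for account in follows.get(user, [])
        if account in known_labels
    )
    return votes.most_common(1)[0][0] if votes else None

print(infer_label("alice"))  # group_1 (two of the three followed accounts)
print(infer_label("bob"))    # group_2
```

Nothing in this sketch touches private data; every input is something a user has chosen to make public, which is exactly the point at issue.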
We have wearable devices that, like Santa, see you when you are sleeping and know when you’re awake. It is possible that a company will find a way of deducing — through sentiment analysis of social media postings, visits to charity websites, checks on your bank balance and fitness tracking — if you’ve been bad or good.

This goes to show: just because big data makes it technically possible to do something, does not mean we should.