
人臉識別時(shí)代的來臨

2018-03-12 19:11:41 By Roman Krznari
英語學(xué)習(xí) 2018年1期


在最新的蘋果手機(jī)發(fā)布會上,最大的亮點(diǎn)也是被人吐槽最多的恐怕要算它的人臉識別功能了。除了指紋之外,面部特征無疑也是區(qū)分個(gè)體差異的最有效方式。人臉識別技術(shù)在不遠(yuǎn)未來的廣泛應(yīng)用是可預(yù)見的,但其潛在的安全隱患也值得人們關(guān)注。

The human face is a remarkable piece of work. The astonishing variety of facial features helps people recognise each other and is crucial to the formation of complex societies. So is the face's ability to send emotional signals, whether through an involuntary blush or the artifice1 of a false smile. People spend much of their waking lives, in the office and the courtroom as well as the bar and the bedroom, reading faces, for signs of attraction, hostility, trust and deceit. They also spend plenty of time trying to dissimulate2.

Technology is rapidly catching up with the human ability to read faces. In America facial recognition is used by churches to track worshippers' attendance; in Britain, by retailers to spot past shoplifters. In 2017, Welsh police used it to arrest a suspect outside a football game. In China it verifies the identities of ride-hailing3 drivers, permits tourists to enter attractions and lets people pay for things with a smile. Apple's new iPhone is expected to use it to unlock the homescreen.

Set against human skills, such applications might seem incremental4. Some breakthroughs, such as flight or the internet, obviously transform human abilities; facial recognition seems merely to encode them. Although faces are peculiar to individuals, they are also public, so technology does not, at first sight, intrude on something that is private. And yet the ability to record, store and analyse images of faces cheaply, quickly and on a vast scale promises one day to bring about fundamental changes to notions of privacy, fairness and trust.

The final frontier

Start with privacy. One big difference between faces and other biometric5 data, such as fingerprints, is that they work at a distance. Anyone with a phone can take a picture for facial-recognition programs to use. FindFace, an app in Russia, compares snaps of strangers with pictures on VKontakte6, a social network, and can identify people with a 70% accuracy rate. Facebook's bank of facial images cannot be scraped7 by others, but the Silicon Valley giant could obtain pictures of visitors to a car showroom, say, and later use facial recognition to serve them ads for cars. Even if private firms are unable to join the dots between images and identity, the state often can. Photographs of half of America's adult population are stored in databases that can be used by the FBI. Law-enforcement agencies now have a powerful weapon in their ability to track criminals, but at enormous potential cost to citizens' privacy.

The face is not just a name-tag. It displays a lot of other information—and machines can read that, too. Again, that promises benefits. Some firms are analysing faces to provide automated diagnoses of rare genetic conditions, such as Hajdu-Cheney syndrome8, far earlier than would otherwise be possible. Systems that measure emotion may give autistic people a grasp of social signals they find elusive.9 But the technology also threatens. Researchers at Stanford University have demonstrated that, when shown pictures of one gay man, and one straight man, the algorithm10 could attribute their sexuality correctly 81% of the time. Humans managed only 61%. In countries where homosexuality is a crime, software which promises to infer sexuality from a face is an alarming prospect.

Keys, wallet, balaclava11

Less violent forms of discrimination could also become common. Employers can already act on their prejudices to deny people a job. But facial recognition could make such bias routine, enabling firms to filter all job applications for ethnicity and signs of intelligence and sexuality. Nightclubs and sports grounds may face pressure to protect people by scanning entrants' faces for the threat of violence—even though, owing to the nature of machine-learning, all facial-recognition systems inevitably deal in probabilities. Moreover, such systems may be biased against those who do not have white skin, since algorithms trained on data sets of mostly white faces do not work well with different ethnicities. Such biases have cropped up in automated assessments used to inform courts' decisions about bail and sentencing.12
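The claim that facial-recognition systems "inevitably deal in probabilities" can be made concrete with a minimal sketch. Real systems map each photo to a fixed-length embedding vector and declare a match only when the similarity between two embeddings clears a tuned threshold. The embeddings, names and threshold below are purely hypothetical illustrations of that idea, not any vendor's actual pipeline:

```python
# Minimal sketch of threshold-based face matching. A real system would
# produce the embeddings with a neural network; here they are made-up
# numbers, chosen only to illustrate the thresholding step.
import math

def cosine_similarity(a, b):
    # Similarity of two embedding vectors, in [-1, 1] for real inputs.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(emb1, emb2, threshold=0.8):
    # The threshold trades false accepts against false rejects;
    # no setting eliminates both kinds of error, which is why every
    # decision is probabilistic rather than certain.
    return cosine_similarity(emb1, emb2) >= threshold

# Hypothetical embeddings: two photos of the same person, and a stranger.
alice_photo_1 = [0.90, 0.10, 0.40]
alice_photo_2 = [0.85, 0.15, 0.38]
bob_photo     = [0.10, 0.90, 0.20]

print(is_match(alice_photo_1, alice_photo_2))  # True: similar embeddings
print(is_match(alice_photo_1, bob_photo))      # False: dissimilar embeddings
```

Bias of the kind the paragraph describes enters upstream of this step: if the embedding network was trained mostly on white faces, embeddings for other ethnicities cluster poorly, and no choice of threshold can repair that.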

Eventually, continuous facial recording and gadgets13 that paint computerised data onto the real world might change the texture of social interactions. Dissembling helps grease the wheels of daily life.14 If your partner can spot every suppressed yawn, and your boss every grimace15 of irritation, marriages and working relationships will be more truthful, but less harmonious. The basis of social interactions might change, too, from a set of commitments founded on trust to calculations of risk and reward derived from the information a computer attaches to someone's face. Relationships might become more rational, but also more transactional.

In democracies, at least, legislation can help alter the balance of good and bad outcomes. European regulators have embedded a set of principles in forthcoming data-protection regulation, decreeing that biometric information, which would include “faceprints”, belongs to its owner and that its use requires consent16—so that, in Europe, unlike America, Facebook could not just sell ads to those car-showroom visitors. Laws against discrimination can be applied to an employer screening candidates' images. Suppliers of commercial face-recognition systems might submit to audits, to demonstrate that their systems are not propagating bias unintentionally.17 Firms that use such technologies should be held accountable.

Such rules cannot alter the direction of travel, however. Cameras will only become more common with the spread of wearable devices. Efforts to bamboozle18 facial-recognition systems, from sunglasses to make-up, are already being overtaken; research from the University of Cambridge shows that artificial intelligence can reconstruct the facial structures of people in disguise. Google has explicitly turned its back on matching faces to identities, for fear of its misuse by undemocratic regimes19. Other tech firms seem less picky. Amazon and Microsoft are both using their cloud services to offer face recognition; it is central to Facebook's plans. Governments will not want to forgo its benefits. Change is coming. Face up to it.

人臉是一件了不起的作品。多到令人驚訝的面部特征幫助人們辨認(rèn)彼此,而且對于復(fù)雜社會的形成至關(guān)重要。面部傳遞情緒信號的能力也是如此,無論是通過不自覺的臉紅還是虛偽的假笑。人們花費(fèi)大量醒著的時(shí)間在辦公室、法庭以及酒吧和臥室觀察人臉,來讀取愛慕、敵意、信任和欺騙的跡象。他們也花費(fèi)許多時(shí)間試圖掩飾自己。

技術(shù)也在快速跟上人類識別臉部的能力。在美國,教堂利用面部識別來監(jiān)測信徒的到場情況;在英國,零售商利用它來辨認(rèn)有犯罪歷史的扒手。2017年,威爾士警方利用面部識別在一場足球比賽的賽場外抓捕了一名嫌犯。在中國,面部識別被用來驗(yàn)證網(wǎng)約車司機(jī)的身份,允許游客進(jìn)入景點(diǎn)以及讓人們通過微笑付款。蘋果的新iPhone也將使用它來解鎖主屏幕。

與人類的技能相比,這些應(yīng)用看起來似乎只是細(xì)枝末節(jié)的進(jìn)步。一些突破性進(jìn)步,比如飛機(jī)或互聯(lián)網(wǎng),顯然改變了人類的能力;而面部識別似乎只是對人類能力的編碼。雖然面孔是個(gè)人所特有的,但它們也是公開的,因此,乍看之下,技術(shù)并沒有侵犯到私人領(lǐng)域。然而,低成本、高速度、大規(guī)模地記錄、存儲和分析面部圖像的能力有一天會使隱私、公平和信任的概念從根本上發(fā)生改變。

最后的邊界

從隱私說起。面部和其他生物識別數(shù)據(jù)(如指紋)之間的一大差異在于面部識別在一定距離之外就可以完成。任何人只要有部手機(jī)就可以拍張照供面部識別程序使用。FindFace是俄羅斯的一款應(yīng)用程序,能將陌生人的照片與社交網(wǎng)絡(luò)VKontakte上的照片進(jìn)行比較,其人像識別的準(zhǔn)確率高達(dá)70%。Facebook的面部圖像庫不能被其他人用程序自動抓取,但是這家硅谷巨頭,比方說,可以獲得那些光顧了汽車展廳的參觀者的照片,之后利用面部識別技術(shù)向他們展示汽車廣告。盡管私營公司無法將照片和身份關(guān)聯(lián)起來,國家卻往往可以。美國一半成年人的照片存儲在聯(lián)邦調(diào)查局能夠使用的數(shù)據(jù)庫中。執(zhí)法機(jī)構(gòu)如今在追蹤罪犯的能力方面擁有了一件強(qiáng)大的武器,但可能會以觸及公民隱私作為巨大代價(jià)。

人臉并非只是一個(gè)姓名牌。人臉可以展示出許多其他信息——而機(jī)器也能讀出這些來。當(dāng)然,這肯定會帶來好處。一些公司正在通過分析面部來自動診斷罕見的遺傳病,比如遺傳性骨發(fā)育不良并肢端溶骨癥,診斷速度之快遠(yuǎn)遠(yuǎn)超過其他可能的診斷方法。情緒評估系統(tǒng)也許能幫助自閉癥患者掌握他們覺得難以理解的社會信號。但這項(xiàng)技術(shù)也會帶來威脅。斯坦福大學(xué)的研究人員已經(jīng)表明,面對一張男同的照片和一張直男的照片,計(jì)算機(jī)算法能夠以高達(dá)81%的準(zhǔn)確率判斷他們的性取向,而人眼的準(zhǔn)確率只有61%。在同性戀尚不合法的國家,通過人臉就能準(zhǔn)確推斷性取向的軟件令人感到前景堪憂。

鑰匙、錢包、巴拉克拉法帽

不那么暴力的形形色色的歧視也可能會普遍起來。雇主早已經(jīng)可以因?yàn)槠姸芙^錄用求職者。但是面部識別可能會使這種偏見變得司空見慣,讓企業(yè)能夠基于種族以及智力和性取向的面部特征來篩選求職申請。夜店和體育場可能會迫于保護(hù)民眾的壓力而不得不掃描入場人員的臉來防止暴力威脅——盡管,由于機(jī)器學(xué)習(xí)的本質(zhì),所有面部識別系統(tǒng)都不可避免地是在概率上做文章。此外,這些系統(tǒng)可能對有色人種存在偏見,因?yàn)檫@些算法主要是根據(jù)采集自白人的面部數(shù)據(jù)而訓(xùn)練得出的,因此并不能很好地適用于其他種族。這類偏見已經(jīng)出現(xiàn)在用于供法院保釋和量刑參考的自動化評估中。

最終,持續(xù)的面部記錄和將計(jì)算機(jī)化的數(shù)據(jù)涂畫到現(xiàn)實(shí)世界的電子設(shè)備可能會改變社交活動的肌理。掩飾內(nèi)心有助于潤滑日常生活的各個(gè)環(huán)節(jié)。如果你的伴侶能夠發(fā)現(xiàn)每一個(gè)被強(qiáng)忍住的哈欠,你的老板可以注意到每一張惱怒的苦臉,婚姻和工作關(guān)系誠然多了一份真實(shí),卻少了一份和諧。社交的基礎(chǔ)也可能會發(fā)生改變,從基于信任的種種承諾轉(zhuǎn)變?yōu)橛?jì)算機(jī)通過某人面部信息所計(jì)算出的風(fēng)險(xiǎn)和回報(bào)。人際關(guān)系可能會變得更加理性,但也感覺更像在做交易。

在民主國家,至少立法可以幫助調(diào)節(jié)利弊端之間的平衡。歐洲監(jiān)管部門已經(jīng)制定出了一套原則用于即將出臺的數(shù)據(jù)保護(hù)法規(guī),要求生物識別信息(包括“面部信息”)屬于其所有者,其使用需要經(jīng)過所有者授權(quán)同意——因此,在歐洲,與美國不同,F(xiàn)acebook不能向那些光顧了汽車展廳的參觀者展示廣告。反歧視的法律也可用于禁止雇主掃描應(yīng)聘者面部圖像的情況。商用面部識別系統(tǒng)的供應(yīng)商可能會被要求接受審核,以表明其系統(tǒng)不會在無意中傳播偏見。使用這類技術(shù)的公司應(yīng)該承擔(dān)責(zé)任。

然而,這些規(guī)定并不能改變大勢所趨。照相機(jī)只會隨著可穿戴設(shè)備的普及而更加普遍。無論是依靠戴太陽鏡還是化妝來迷惑面部識別系統(tǒng)的嘗試都已經(jīng)不再奏效;劍橋大學(xué)的研究表明人工智能可以對那些偽裝自己的人進(jìn)行面部重構(gòu)。谷歌已經(jīng)明確表示不贊成將人臉與身份匹配,因?yàn)閾?dān)心會遭到非民主政權(quán)的濫用。其他技術(shù)公司則似乎沒有這么講究。亞馬遜和微軟都在利用其云服務(wù)來提供面部識別;而其對于Facebook的計(jì)劃也十分關(guān)鍵。政府不會讓面部識別帶來的好處白白溜走。變革正在來臨。面對吧。

1. artifice: 詭計(jì),狡詐。

2. dissimulate: 隱藏(真實(shí)情感或目的)。

3. ride-hailing: 叫車服務(wù)。

4. incremental: /ˌɪnkrəˈmentl/ 逐步增长的。

5. biometric: 生物識別的。

6. VKontakte: 俄羅斯最大的社交網(wǎng)站,VKontakte為“保持聯(lián)系”之意。

7. scrape: 本義是“艱難取得,勉強(qiáng)獲得”,這里指利用爬蟲程序抓取信息,爬蟲程序是一種數(shù)據(jù)采集程序。

8. Hajdu-Cheney syndrome: 遺傳性骨發(fā)育不良并肢端溶骨癥,于1948年和 1965年分別由Hajdu和Cheney兩位放射科醫(yī)生進(jìn)行了病例報(bào)道。

9. autistic: 自閉癥的;elusive: 難懂的。

10. algorithm: /ˈælɡərɪðəm/ 算法。

11. balaclava: 巴拉克拉法帽,一種僅露雙眼和鼻子的羊毛頭罩,本來用于御寒,后來由于其能掩蓋臉部、隱藏身份,常被特種部隊(duì)、恐怖分子、劫匪等佩戴。

12. crop up: 發(fā)生,出現(xiàn);bail: 保釋;sentence: 判決。

13. gadget: /ˈɡædʒɪt/(電子或機(jī)械)小裝置。

14. dissemble: 掩飾(真實(shí)的情感或想法);grease: 給……加潤滑油。

15. grimace:(表示疼痛或厭惡等的)怪相,鬼臉。

16. embed sth. in: 使嵌入,使成為……的重要部分;decree: 下令,命令;consent: 同意,許可。

17. audit: 審核,嚴(yán)格檢查;propagate: 宣傳,傳播。

18. bamboozle: /bæmˈbuːzl/ 愚弄,蒙蔽。

19. regime: 政權(quán),政體。
