英語新聞丨人工智能成為美國欺詐者的得力工具

2023-07-31 11:00:00 | 04:59 | 1.7萬
聲音簡介

People in the United States are being warned to stay vigilant against a growing number of scams using artificial intelligence that mimics a person's voice during a phone call to a concerned relative or friend, who is then asked to send money for ransom.

美國警告人們保持警惕,防范利用人工智能的新型詐騙手段,即通過模仿一個人的聲音打電話給其親朋好友,要求他們拿錢贖人來行騙。

The Federal Trade Commission, or FTC, issued the consumer warning alert this year after an increase in the number of people reporting they had been asked to send money after receiving a frantic phone call from a person who they believed was their loved one but was in fact a cloned voice using AI.

今年,聯邦貿易委員會(FTC)發出了消費者警告,因為有越來越多的人報告稱,他們在接到一通聽起來驚慌失措的電話后被要求匯款,他們以為來電者是自己的親人,但實際上那是用人工智能克隆偽造的聲音。

Jennifer DeStefano from Scottsdale, Arizona, experienced the crime firsthand. She told a US Senate judiciary hearing last month that she got a call from an unlisted number in April, and when she picked up, she could hear her daughter, Briana, crying.

來自亞利桑那州斯科茨代爾的珍妮弗·德斯特法諾親身經歷了這種犯罪。上個月,她在美國參議院司法委員會的聽證會上表示,她在四月份接到了一個沒有顯示號碼的電話,接聽時聽到女兒布里安娜在哭泣。

"Mom! I messed up," her daughter said sobbing on the phone call.

“媽媽!我搞砸了,”她女兒在電話里哭著說道。

DeStefano asked her daughter, "OK, what happened?"

珍妮弗問她女兒:“好吧,發生了什么事?”

She then heard a man's voice on the phone telling her daughter to "lay down and put your head back".

然后,她聽到一個男人的聲音在電話里告訴她女兒“躺下,把頭往后仰”。

He then told the worried mother: "Listen here, I have your daughter. You tell anyone, you call the cops, I am going to pump her stomach so full of drugs."

然后,那個男人告訴這位憂心忡忡的母親:“聽好了,你女兒在我手上。你敢告訴任何人,你敢報警,我就讓她肚子里灌滿毒品。”

DeStefano was at her other daughter Aubrey's dance rehearsal when she picked up the phone. She put the phone on mute and asked nearby parents to call 911.

珍妮弗接聽電話時,正在另一個女兒奧布里的舞蹈排練現場。她把電話調成靜音,讓身邊的其他家長撥打911。

The scammer first asked her to send $1 million, but when she said she did not have access to that much money, he asked for $50,000 in cash and arranged a meet-up spot.

騙子先是讓她匯款100萬美元,當珍妮弗說她沒這么多錢時,騙子又要了5萬美元現金,并安排了見面地點。

The terrified mother said the man on the phone told her that "if I didn't have all the money, then we were both going to be dead".

驚恐萬分的母親說,電話中的男子告訴她,如果她拿不出所有的錢,她們母女倆都得死。

However, she contacted her husband and daughter and found out Briana was safe, and it was a hoax.

不過,她聯系了丈夫和女兒,發現布里安娜平安無事,這只是一個騙局。

Cybercrimes on rise

網(wǎng)絡(luò)犯罪日益猖獗

Last year, frauds and scams rose 30 percent compared with the previous year, the FTC said. Cybercrimes are also increasing, with losses of $10.2 billion last year, the FBI said.

美國聯邦貿易委員會表示,去年的欺詐和詐騙案比前一年增加了30%。美國聯邦調查局(FBI)表示,網絡犯罪也在不斷增加,去年損失達102億美元。

Scammers use AI to mimic a person's voice by obtaining "a short audio clip of your family member's voice from content posted online and a voice-cloning program", the consumer protection watchdog said. When they call, they will sound just like the person's loved one.

美國聯邦貿易委員會稱,騙子利用人工智能模仿一個人的聲音:“從網上發布的內容中獲取您家人聲音的簡短片段,再使用語音克隆程序”。當他們撥打電話時,聽起來就和當事人的親人一模一樣。

In another scam, a Canadian couple was duped out of C$21,000 ($15,940) after listening to an AI voice that they thought was their son, The Washington Post reported in March.

據《華盛頓郵報》3月報道,在另一起騙局中,一對加拿大夫婦在收聽了以為是自己兒子的人工智能聲音后,被騙走了21,000加元(約合15,940美元)。

According to a recent poll by McAfee, an antivirus software organization in San Jose, California, at least 77 percent of AI scam victims have sent money to fraudsters.

根據位于加利福尼亞州圣何塞的殺毒軟件組織McAfee最近進行的一項民意調查,至少有77%的人工智能詐騙受害者曾向騙子匯款。

Of those who reported losing money, 36 percent said they had lost between $500 and $3,000, while 7 percent got taken for anywhere between $5,000 and $15,000, McAfee said.

McAfee說,在報告損失錢財的人中,36%的人說他們損失了500到3000美元,而7%的人被騙了5000到15000美元。

About 45 percent of the 7,000 people polled from nine countries — Australia, Brazil, France, Germany, India, Japan, Mexico, the United Kingdom and the US — said they would reply and send money to a friend or loved one who had asked for financial help via a voicemail or note.

在來自澳大利亞、巴西、法國、德國、印度、日本、墨西哥、英國和美國九個國家的7,000名受訪者中,約45%的人表示,如果朋友或親人通過語音郵件或紙條請求經濟幫助,他們會回復并匯款。

Forty-eight percent said they would respond quickly if they heard that a friend was in a car accident or had trouble with their vehicle.

48%的人表示,如果聽說朋友出了車禍或車輛出了問題,他們會立刻作出回應(yīng)。

Although phone scams are nothing new worldwide, in this AI version, fraudsters are getting the money sent to them in a variety of ways, including wire transfers, gift cards and cryptocurrency.

盡管電話詐騙在全球范圍內已不是什么新鮮事,但在人工智能的加持下,詐騙分子通過電匯、禮品卡和加密貨幣等各種方式獲得匯款。

Consumers are being encouraged to contact the person that they think is calling to check if they are OK before ever sending cash.

消費者被鼓勵在寄送現金之前,先與他們認為是打來電話的人取得聯系,確認他們是否安全。

FTC Chair Lina Khan warned House lawmakers in April that fraud and scams were being "turbocharged" by AI and were of "serious concern".

美國聯邦貿易委員會主席莉娜·汗在4月份警告眾議院立法者,人工智能正在“加速”欺詐和詐騙,需引起“嚴重關切”。

Avi Greengart, president and lead analyst at Techsponential, a technology analysis and market research company in the US, told China Daily: "I think that it is hard for us to estimate exactly how pervasive (AI) is likely to be because this is still relatively new technology. Laws should regulate AI."

美國技術分析和市場研究公司Techsponential總裁兼首席分析師阿維·格林加特告訴《中國日報》:“我認為,我們很難準確估計(人工智能)的普及程度,因為這仍然是一項相對較新的技術。法律應該對人工智能進行監管。”

The software to clone voices is becoming cheaper and more widely available, experts say.

專家說,克隆聲音的軟件越來越便宜,也越來越普及。

AI speech software ElevenLabs allows users to convert text into voice-overs meant for social media and videos, but many users have already shown how it can be misused to mimic the voices of celebrities, such as actress Emma Watson, podcast host Joe Rogan and columnist and author Ben Shapiro.

人工智能語音軟件ElevenLabs允許用戶將文本轉換為社交媒體和視頻的畫外音,但許多用戶已經展示了它如何被濫用來模仿名人的聲音,如女演員艾瑪·沃森、播客主持人喬·羅根和專欄作家兼作家本·夏皮羅。

Other videos mimicking the voices of US President Joe Biden and former president Donald Trump have also appeared on platforms such as Instagram.

其他模仿美國總統喬·拜登和前總統唐納德·特朗普聲音的視頻也出現在Instagram等平臺上。

Scammer

英/ˈskæmə(r)/ 美/ˈskæmər/

n.騙子

Cybercrime

英/ˈsaɪbəkraɪm/ 美/ˈsaɪbərkraɪm/

n.網(wǎng)絡(luò)犯罪


