◆ Earlier this month, Superintendent 陳純青 (centre) of the Cyber Security Division, Cyber Security and Technology Crime Bureau of the Hong Kong Police, together with four "Anti-deception Officers", explained ever-evolving scam tactics and how to guard against them. File photo

The following is translated from an editorial in Hong Kong's Wen Wei Po (《文匯報》), 3 July 2023:

Preventing scams with face-swapping technology starts with education

The Hong Kong Police Force has, for the first time, appointed four police inspectors as "Anti-deception Officers", and urged the public to be on the alert for new types of scams involving "deepfake" technology, in which scammers use artificial intelligence (AI) to replace a person's face or voice. Such scams are spreading rapidly around the world and, although they have not yet hit the city, warrant a high degree of vigilance in Hong Kong. The most basic way to combat them is to educate the public to recognise the tricks and take precautions early. The Police should work with local organisations to teach the public tips for preventing AI-driven fraud and strengthen their ability to guard against it, while identifying the patterns of AI-driven fraud as soon as possible and cracking down on it hard.

Seeing is not necessarily believing. The "close relatives and good friends" you see in social media videos may turn out to be impostors that fraudsters create by swapping faces and voices with AI. Such "deepfake" scams are spreading globally. Statistics from a technology company show that deepfake scams as a proportion of all fraud climbed in the first quarter of this year to 2.6% in the United States and 4.6% in Canada, and exceeded 5% in European countries such as the United Kingdom, Germany and Italy. On 24 May, the Internet Society of China issued a warning that, with deep-synthesis technology becoming open and open-source, deep-synthesis products and services are proliferating, and the illegal use of fake audio and video with AI-swapped faces or voices for fraud and defamation has become commonplace.

The Hong Kong Police Force has not yet received any reports of AI-driven scams, but it has handled two enquiries involving a Hong Kong element: a fraudulent investment case, and a blackmail case in which a victim's face was captured during a nude video chat and transposed onto a pornographic video. This indicates that new types of fraud using AI deepfake "face-swapping" and "voice-swapping" technology could hit Hong Kong at any time. Drawing together the characteristics of such cases in different countries, three types of scam emerge. The first is telephone scams: scammers mimic the voice of a specific person, such as a family member, friend or celebrity, to gain the victim's trust and then defraud them. The second is financial scams, in which the scammer gains the victim's trust through video calls. The third is blackmail: the victim is intimidated with indecent photos onto which his or her face has been transposed.

One of the most effective ways of tackling this type of technology-enabled fraud is to educate the public to recognise the tricks and strengthen their precautions. If a "friend or family member" requests a transfer of money in a video or recording, be especially wary: ask the person to pass a hand slowly across his or her face, because a video synthesised by AI face-swapping will visibly distort as the hand moves. Other effective precautions are to agree on code words with family members and friends, and to avoid posting photos, voice clips and videos of yourself and your family on social media. The Police should identify new features of technology-enabled fraud in a timely manner, distil tips for preventing AI-driven scams early, and strengthen co-operation with local organisations to publicise the features of such scams and ways to guard against them, so as to enhance citizens' ability to avoid falling victim.

Fraud in Hong Kong is still on the rise: 15,792 cases were recorded in the first five months of this year, up nearly 58% from 9,988 in the same period last year. Losses amounted to $2.11 billion, an average of $13 million per day, with one fraud case occurring roughly every 10 minutes. The Police's detection rate has nonetheless improved this year, with 1,739 arrests so far. In view of the rise in fraud cases and the ever-changing tricks, it is necessary for the Police to step up their efforts to combat fraud by appointing "Anti-deception Officers" and enhancing publicity during the "Anti-deception Month" campaigns in July and August this year.

The Police should strengthen communication with the central government and international co-operation, closely track the patterns and characteristics of AI-driven fraud, and combat it in a targeted manner, so as to ensure the safety of the public and consolidate the city's position as an international financial centre.

◆ Kevin Cheung (CUSCS Lecturer)