
Verified: AI Deepfake Clip Falsely Claims Thai PBS World Interviewed 2 Famous Businesspeople, Invites Online Investment: Invest 8,000 Baht, Get a 5-Times Return

28 Aug 2025, 11:47
Science and Technology #FakeNews

Thai PBS Verify discovered a fake advertisement that falsely used the Thai PBS World Tonight logo. It utilized AI deepfake technology to impersonate well-known news anchors and businesspeople, inviting people to invest in a scheme called CRYSTALLUM AI. But the original clip was actually a news report about a flood in 2013. Victims have warned others not to be fooled, as it could lead to financial losses and fraud.

Thai PBS Verify found the source of the fake news to be Facebook.

Screenshot of a fake post

Thai PBS Verify has found that a Facebook page named “Muscle Merge” is using the logo of
the program Thai PBS World Tonight. It published a video clip showing an interview with Mr. Toss Chirathivat, Executive Chairman of Central Group, and Mr. Kalin Sarasin, Honorary Chairman of the Thai Chamber of Commerce and Board of Trade of Thailand. The content of the clip encourages people to invest in a project called “CRYSTALLUM AI”. The video was distributed through Facebook ads and has received over 1,200 likes.

Upon review, the news anchor in the clip was identified as Ms. Patchari Raksawong, a former NHK World news anchor. When asked, Ms. Patchari stated that she has never been a news anchor for Thai PBS. She has only appeared on programs related to teaching English but has never served as an anchor for the network.

Ms. Patchari Raksawong, a former NHK World news anchor

For Ms. Raksawong, having her image used in an AI deepfake was a first. While she was familiar with the technology, she had never seen it used in such a negative context. She was shocked that her image was used to deceive people into investing, as it could harm others and damage her reputation. She felt it was necessary to speak out so that no one else would be misled in this way. She also wants everyone to be more aware that AI is a double-edged sword: it can be an excellent tool for efficiency, but it can also be used for malicious purposes, especially fraud that causes harm to others.

She believes the clip used was from an old news report she did for NHK, a Japanese news
agency with an Asia-Pacific office in Bangkok, where she worked for several years. She suspects that a malicious person found the old clip on the internet and used it to create the deepfake. She clarified that her work for NHK was always in English, never in Thai.

In closing, she urged viewers to use discretion and to check the source carefully before believing this type of news. She stressed the importance of double-checking information two or three times to ensure its authenticity.

The original clip was only a news report about a flood.

Based on Ms. Patchari’s interview, where she suggested that the clip might have been taken from her time as an anchor for NHK WORLD, we used the keywords “Patchari Raksawong NHK WORLD” to search for the original video on Google. The search led us to the original video: a news report about a flood in Phra Nakhon Sri Ayutthaya Province, published on the “Join Us to Do Good for Society” YouTube channel of the Habitat for Humanity Thailand Foundation on February 28, 2013. It has no connection to any investment solicitation offering returns.

The keyword search surfaced the video that was used to create the AI deepfake. Comparing the two clips revealed that the logo on the fake clip had been changed to the Thai PBS channel logo, and the original English-language news report had been replaced with an AI-synthesized Thai voiceover.

 

A screenshot showing the fake clip with the Thai PBS World Tonight logo superimposed (left), compared to the news report on building homes for flood victims from 2013 (right).

Verification Tool Confirms AI-created Clip

When the clip was examined with the Hive Moderation AI detection tool, it was found to have a high probability of being an AI deepfake: 72.3%.

The AI image detection tool reported a 72.3% probability that the clip was AI-generated.

Thai PBS World Tonight reports news exclusively in English.

Claire Pachimanon, the Director of Thai PBS World, expressed personal concern that viewers who see the clip might believe it to be real. She stated that Thai PBS World’s anchors have had their images used to create malicious AI deepfakes on several occasions in the past. However, Ms. Patchari, who appears in the clip, is not an anchor for the Thai PBS World program.

Claire Pachimanon, the Director of Thai PBS World

She also stated that Thai PBS World programs do not communicate in the Thai language. Thai PBS World Tonight is a weekly news summary program that reports exclusively in English. Therefore, if a report appears in any other language, it should immediately be considered a fake clip.

If anyone finds this kind of clip and is unsure, they can send it to the Thai PBS team for
verification. A clear message the team wants to convey is that Thai PBS is a non-profit organization. We want everyone to understand that if you see this type of clip, you should immediately be suspicious and question whether it is fake.

This is not the first time Thai PBS World news anchors have been victims of deepfakes used to spread false news. For example: Verified: Clip of Thai PBS Anchor Reporting on “Anne Jakrajutatip” Inviting Investment for a 500,000 Baht Return in 1 Month Was Actually Created by Deepfake.

Central Group Warns of Fake Investment Clip

Meanwhile, the Central Group Facebook page posted a warning about the issue: “Fake News Alert! Central Group Issues a Warning! Malicious individuals have been misrepresenting the company’s name and personnel to lure the public into investing with false information on various online platforms. Central Group has no policy of inviting outside individuals to invest in this manner.” The company asks the public not to believe, not to share, and not to click on these links.

Central Group Facebook Page Posts Fake News Warning

Current Laws Regulating Deepfakes in Thailand

■ Computer-Related Crime Act, B.E. 2560 (2017), Section 16, which states: “Whoever imports into a computer system accessible to the general public any computer data that appears as an image of another person, and that image is a creation, addition, or modification by electronic or any other means, in a manner likely to cause that person to lose their reputation, be humiliated, hated, or disgraced, shall be punished with imprisonment for not more than three years and a fine not exceeding two hundred thousand baht.”
■ The Criminal Code, in sections related to defamation, contempt, and the dissemination of obscene media.
■ The Personal Data Protection Act (PDPA), which prohibits the use of personal data (such as face, name, and voice) without consent.
■ If the victim is a minor, the act may also fall under offenses according to the Anti-Human Trafficking Act or child protection laws.

From its investigation of this matter, Thai PBS Verify found the news to be false. AI deepfake technology was used to impersonate the image and voice of a Thai PBS World news anchor and famous businesspeople, combined with content that falsely invited investment for a return of up to 5 times the initial amount.

Verification Process

■ Investigation using the keywords “Patchari Raksawong NHK WORLD” on Google led to the discovery of the original video, which has no connection to any investment solicitation offering returns.
■ Confirmation from Individuals Involved in the Clip

Confirmation was received from the victims whose personal data was used through AI deepfake technology. Additionally, inquiries were made with Thai PBS World and Central Group to confirm that the program does not report in Thai and that the businesspeople who were impersonated are not involved with the clip.

■ Investigation using an AI Image Detection Tool
The tool reported a high probability, 72.3%, that the clip was created by an AI deepfake.
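In practice, the tool-based check above comes down to comparing a model's reported probability against a decision threshold. A minimal Python sketch of that last step, assuming a score has already been obtained from a detection service such as Hive Moderation (the API call itself is omitted, and the 0.70/0.30 cutoffs are illustrative assumptions, not the tool's official thresholds):

```python
def classify_deepfake(probability: float,
                      high: float = 0.70,
                      low: float = 0.30) -> str:
    """Map an AI-detection probability (0.0-1.0) to a human-readable verdict.

    The thresholds here are illustrative only; real fact-checking workflows
    should treat any mid-range score as a prompt for manual verification.
    """
    if not 0.0 <= probability <= 1.0:
        raise ValueError("probability must be in [0, 1]")
    if probability >= high:
        return "likely AI-generated"
    if probability <= low:
        return "likely authentic"
    return "inconclusive - verify manually"

# The 72.3% score reported for the fake Thai PBS World clip:
print(classify_deepfake(0.723))  # -> likely AI-generated
```

A score like 72.3% clears the illustrative 70% cutoff, which is why the article describes it as a "high probability" rather than proof; scores in the middle band should always be cross-checked against other evidence, as Thai PBS Verify did here.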

Impact of this False Information

■ Damages the credibility of the media and news anchors.
■ If people are misled or misunderstand the content, it could lead to a decline in the overall
credibility of the news station or the media.
■ Violates personal rights. Using another person’s face or voice without permission is a
violation of their individual rights and is illegal.
■ Creates an avenue for fraudsters to operate. Fake clips can be widely used to create and
distribute scams, such as fraud or the deceptive sale of products.


Recommendations When You Find False Information

1. Immediately Collect Evidence: Inform the person whose images were misused by fraudsters. Save the edited images, videos, or related messages along with the time they were published. If the content was shared in a chat group or on social media, save the chat history and the post’s URL.
2. Report It on the Platform: Report the content to the platform where it was published (Facebook, YouTube, X, or TikTok) to have it removed.
3. File a Report or Consult with Relevant Agencies
If you are a victim whose image or voice was used, file a report at a police station or the Cyber Crime Investigation Bureau (CCIB). You can also contact additional relevant agencies, such as the Ministry of Digital Economy and Society (DE), to request assistance in coordinating the removal of the data or call the Cybercrime Hotline at 1212.
4. How to Spot an AI-Deepfake Video

■ Look for abnormal lip movements.
■ Pay attention to the accent of the voice in the clip.
■ Observe the video as a whole and look for any abnormalities in the objects or people in the
photo or video.
■ Consider the possibility of the event. For example, in this case, the clip showed a rally with
a background resembling a temple or the Grand Palace, which would not happen in reality.
