Viral Image of Crane Collapse onto Train at Sikhio Appears to Be AI-Generated, But Incident Is Real

Jan 15, 2026 | 16:02 · Jan 21, 2026 | 16:23
Science & Tech | #Disinformation

In the aftermath of the crane collapse incident in Sikhio District, Nakhon Ratchasima, the viral aerial photo showing wreckage and alleged casualties was not authentic. Verification found the image had no source and did not match the actual scene. Image‑analysis tools indicated it was likely created using artificial intelligence (AI). The State Railway of Thailand confirmed the picture was fake and not taken at the crash site.

Thai PBS Verify found the source of fake news on: Facebook

On January 14, 2026, it was reported that, in Sikhio District, Nakhon Ratchasima, a construction crane from the Thai–Chinese high-speed rail project collapsed onto Special Express Train No. 21 on the Bangkok–Ubon Ratchathani route, causing injuries and fatalities. At the same time, social media users widely shared an image resembling the accident, showing bodies covered with cloth lying on the railway track. The post was accompanied by the caption: “May the 22 souls rest in peace. May they achieve a better realm.”

The image gained more than 120,000 likes and was shared over 16,000 times, yet it circulated without a clear source, date, or time stamp. No official agency or media outlet reporting from the scene released this photo. This raised strong doubts about its authenticity and whether it had been created or manipulated by artificial intelligence (AI).

Post claiming to be an image of the crane collapse tragedy

Was the image AI-generated?

When the viral photo was examined using Google Lens, the “About this image” section indicated that the picture had been created with Google’s artificial intelligence (AI).

In addition, when the photo was analyzed using an AI-image detection tool on the Hive Moderation website, the results also indicated that the picture was likely generated by artificial intelligence.

Analysis with Hive Moderation found a 99.8% likelihood the photo was AI‑created.

Comparing the real train and the AI image

A review of official reports confirmed that the train involved in the accident was Special Express No. 21. According to information on the State Railway of Thailand’s website, the passenger train operating on this route was a Daewoo-type train.

The Daewoo passenger train involved in the accident (top) has doors at the front and rear of each carriage, whereas the ATR diesel railcar (bottom image) features mid‑carriage doors, matching the train shown in the viral social media photo.

These structural differences indicate that the circulated photo was not from the real incident but was generated by AI.

The AI‑generated image (top) shows a train with doors in the middle of the carriage, while the real accident photo (bottom) shows a Daewoo train with doors at the ends of the carriages.

Another observation concerns the placement of victims’ bodies. In the real incident, the bodies were not left on the railway tracks but were moved outside the crash site.

The AI‑generated image shows bodies lying in rows along the railway tracks.

Image showing the placement of victims’ bodies from the real incident, photographed by the news team in Sikhio District, Nakhon Ratchasima.

State Railway of Thailand confirms that the image is fake

Methapat Sunthorawaraphak, Director of Public Relations at the State Railway of Thailand (SRT), confirmed that the train photo circulating on social media is not from the real incident.

The SRT also told Thai PBS Verify that the train involved in the accident was a Daewoo passenger train, purchased from South Korea.

By contrast, the image identified as AI-generated shows structural features resembling an ATR diesel railcar, a model the SRT purchased from Japan that was not involved in the accident.

What is the truth?

The incident in which a construction crane from the Thai–Chinese high-speed rail project collapsed onto Special Express Train No. 21 in Sikhio District, Nakhon Ratchasima, on January 14, 2026, was real and resulted in injuries and fatalities.

However, the aerial photo widely shared on social media, showing a wrecked train with covered bodies lying on the tracks, did not come from the actual event. An AI-image detection tool indicated that the photo was AI-generated, and the train’s features and the placement of the bodies in the image did not match official data or real photos from the incident. The State Railway of Thailand has also confirmed that the circulated photo is fake.

Verification Process

  1. Checking the “About this image” feature in Google Lens: The feature indicated that the photo had been created with Google’s AI.
  2. Using Hive Moderation’s AI image detection tool: The tool found a 99.8% likelihood that the photo was AI-generated (an illustrative code sketch of this kind of automated check appears after this list).
  3. Fact‑checking with relevant authorities: The State Railway of Thailand (SRT) confirmed the photo was not from the real incident in Sikhio District, Nakhon Ratchasima. A comparison of train features showed that the actual train involved was a Daewoo passenger train, with doors located at the front and rear of each carriage. In contrast, the train in the circulated image resembled an ATR diesel railcar, which has mid‑carriage doors and was not the train involved in the accident.
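
For readers who want to automate a check like step 2, the sketch below shows, in Python, how a suspect photo could be submitted to an AI-image-detection service over HTTP and the returned likelihood printed. The endpoint URL, authentication header, and response field are placeholders rather than Hive Moderation’s documented API; the provider’s own documentation should be consulted before relying on any such check.

```python
# Illustrative sketch only: querying an AI-image-detection service over HTTP.
# The endpoint, authentication header, and response field below are
# placeholders, not any provider's documented API.
import requests

DETECTION_ENDPOINT = "https://example-detector.invalid/v1/classify"  # placeholder URL
API_KEY = "YOUR_API_KEY"  # placeholder credential

def ai_likelihood(image_path: str) -> float:
    """Send a local image file to the detection service and return the
    reported probability (0.0-1.0) that it was AI-generated."""
    with open(image_path, "rb") as f:
        response = requests.post(
            DETECTION_ENDPOINT,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"image": f},
            timeout=30,
        )
    response.raise_for_status()
    # Assumed response shape: {"ai_generated": 0.998}
    return float(response.json()["ai_generated"])

if __name__ == "__main__":
    score = ai_likelihood("viral_crash_photo.jpg")  # hypothetical local filename
    print(f"Likelihood of AI generation: {score:.1%}")  # e.g. 99.8%
```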

Impacts of this false information

  1. Information Misunderstanding: False content can mislead the public about the severity of the incident and its actual details.
  2. Erosion of Trust in News: When false information spreads, it can confuse people, undermine trust in credible news sources, and distort public perception during crises.
  3. Encouraging Misuse of AI: Sharing AI-generated images without verification opens the door to imitation, where AI may be used to create false visuals or information in other events. This makes it increasingly difficult to distinguish truth in the future.

Recommended Response

  1. Do not believe or share immediately: Images or information about serious incidents often trigger emotions. Take a break and consider whether the source is credible.
  2. Check the source and publication details: Look for clear references to the date, time, location, and origin of the image or information. If these are missing, treat the content with suspicion (a small metadata-checking sketch follows this list).
  3. Compare it with official or mainstream sources: Verify whether the same reports or images appear from relevant authorities or established media outlets covering the event.
  4. Use image or AI verification tools: Apply reverse image search or AI-detection tools to assess the credibility of the photo.
  5. If uncertain, stop forwarding: Not sharing is the most effective way to reduce the spread of false information and help maintain the accuracy of the information space.
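
As a small illustration of step 2 above, the sketch below uses Python with the Pillow library to read whatever EXIF metadata a locally saved copy of an image still carries, such as the capture date, camera model, or editing software. Most social media platforms strip this metadata on upload, so an empty result is common and proves nothing by itself; it is only one quick local check, and the filename used here is hypothetical.

```python
# Illustrative sketch: print any EXIF metadata stored in an image file.
# Requires Pillow (pip install Pillow). Social platforms usually strip
# EXIF on upload, so missing metadata alone is not evidence of manipulation.
from PIL import Image, ExifTags

def print_exif(image_path: str) -> None:
    """Print every EXIF tag found in the image, with human-readable names."""
    exif = Image.open(image_path).getexif()
    if not exif:
        print("No EXIF metadata found (common for images saved from social media).")
        return
    for tag_id, value in exif.items():
        tag_name = ExifTags.TAGS.get(tag_id, str(tag_id))
        print(f"{tag_name}: {value}")

if __name__ == "__main__":
    print_exif("viral_crash_photo.jpg")  # hypothetical local filename
```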