AI upscaling tools create fake details in FBI Kirk shooting investigation photos

Internet users are using AI tools to upscale and “enhance” blurry FBI surveillance photos of a person of interest in the Charlie Kirk shooting, but these AI-generated images are creating fictional details rather than revealing hidden information. The practice demonstrates how AI upscaling tools can mislead criminal investigations by inferring nonexistent features from low-resolution images.

Why this matters: AI upscaling has a documented history of inventing false details, including past incidents where an upscaler turned a pixelated photo of Barack Obama into a white man’s face and added features to Donald Trump’s appearance that aren’t there, making these “enhanced” images potentially harmful to legitimate investigations.

What happened: The FBI posted two blurry surveillance photos on X seeking information about a person of interest in the Charlie Kirk shooting.

  • Users immediately responded with AI-upscaled versions created using tools like X’s Grok bot and ChatGPT.
  • The AI-generated variations show dramatically different details, with some displaying obviously fabricated features like “Gigachad-level chin” and completely different clothing.
  • While ostensibly meant to help identify the suspect, the images also serve as attention-grabbing content for likes and reposts.

The technical problem: AI upscaling tools don’t actually uncover hidden details in pixelated images. They extrapolate, inferring what might be there based on training data (a short sketch after this list shows why).

  • The technology fills in gaps by making educated guesses rather than revealing actual information.
  • Different AI tools produce varying results from the same source image, highlighting their unreliability.
  • These tools can be useful in certain circumstances but should never be treated as evidence in criminal investigations.
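Why can no tool recover the lost detail? Downscaling is a many-to-one operation: countless distinct faces collapse into the same small grid of pixels, so any upscaler has to guess which one was there. A minimal Python sketch using Pillow illustrates the point (the filename and the 64x64 resolution are illustrative assumptions, not details of the actual FBI images):

```python
from PIL import Image

# Simulate low-resolution surveillance footage by downscaling,
# then upscale it again with a conventional (non-AI) filter.
original = Image.open("surveillance.jpg")  # hypothetical input file

low_res = original.resize((64, 64))        # detail is destroyed here, irreversibly
restored = low_res.resize(original.size, Image.LANCZOS)
restored.save("restored.jpg")

# The Lanczos result is blurry but honest: it invents no new features.
# A learned upscaler fills the same gaps with plausible pixels drawn
# from its training data, a guess that can look sharp yet be wrong.
```

Run against any photo, the round trip through 64x64 discards information that no algorithm can bring back, which is exactly why different tools like Grok and ChatGPT produce different faces from the same FBI image.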

Past examples: Previous incidents demonstrate the dangerous inaccuracy of AI upscaling in real-world scenarios.

  • AI tools have “depixelated” low-resolution images with completely incorrect results.
  • The technology has added nonexistent physical features to recognizable public figures.
  • These failures underscore why law enforcement agencies rely on original surveillance footage rather than AI enhancements.
