Some popular YouTubers noticed strange changes in their videos: people's skin looked "plastic," clothing details were oversharpened, and ears sometimes appeared distorted. It turned out the platform had been using neural networks to enhance video quality, and doing so without the creators' knowledge or consent.
Rick Beato, a music YouTuber with over five million subscribers, said his face looked odd in one of his videos: "I thought, 'Do I really look like that?' But the more I looked at it, the weirder it got." His colleague Rhett Shull noticed similar artifacts in his own videos and said, "If I wanted this horrible mix of sharpening and smoothing, I would do it myself." The result, he said, makes the videos look AI-generated, which undermines viewer trust.
User complaints began arriving over the summer, and after much discussion the service admitted it was running experiments. Rene Ritchie, YouTube's creator liaison, said the company was testing a machine learning system on YouTube Shorts that "removes blur and noise and improves clarity," comparing it to the video processing in modern smartphones. Researchers point out, however, that unlike on smartphones, where users decide for themselves whether to apply a filter, here the filter was simply switched on without the creator's say.
According to the BBC, the YouTube story reflects a broader trend: companies are increasingly deploying AI-powered filters that change the way we perceive photos and videos.