randycupertino 2 hours ago
A makeup influencer I follow noticed YouTube and Instagram are automatically adding filters to his face in his videos, without permission. If his content was about lip makeup, they made his lips enormous; if it was about eye makeup, the filters made his eyes gigantic. They're using AI to detect the type of content and automatically apply filters.

https://www.instagram.com/reel/DO9MwTHCoR_/?igsh=MTZybml2NDB...

The screenshots/videos of them doing it are pretty wild, and it's insane that they are editing creators' uploads without consent!

Aurornis 10 minutes ago
The video shown as evidence is full of compression artifacts. The influencer is non-technical and assumes it's an AI filter, but the output is obviously not good quality anywhere.

To me, this clearly looks like a case of a very high compression ratio with the motion blocks swimming around on screen. They might have some detail enhancement in the loop to try to overcome the blockiness which, in this case, results in the swimming effect.

It's strange to see these claims being taken at face value on a technical forum. It should be a dead giveaway that this is a compression issue because the entire video is obviously highly compressed and lacking detail.

reactordev 54 minutes ago
I can hear the ballpoint pens now…

This is going to be a huge legal fight, as the terms of service you agree to on their platforms amount to “they get to do whatever they want” (IANAL). Watch them try to spin this as a “user preference” that they just opted everyone into.

api 50 minutes ago
That’s the rude awakening creators get on these platforms. If you’re a writer or an artist or a musician, you own your work by default. But if you upload it to these platforms, they own it more or less. It’s there in the terms of service.
sodapopcan 42 minutes ago
What if someone else uploads your work?
benoau 11 minutes ago
Section 230 immunity for doing whatever they want, as long as they remove it if you complain.
echelon 45 minutes ago
This is an experiment in data compression.
plorg 37 minutes ago
If any engineers think that's what they're doing they should be fired. More likely it's product managers who barely know what's going on in their departments except that there's a word "AI" pinging around that's good for their KPIs and keeps them from getting fired.
echelon 32 minutes ago
> If any engineers think that's what they're doing they should be fired.

Seriously?

Then why is nobody in this thread suggesting what they're actually doing?

Everyone is accusing YouTube of "AI"ing the content with "AI".

What does that even mean?

Look at these people making these (at face value - hilarious, almost "Kool-Aid" levels of conspiratorial) accusations. All because "AI" is "evil" and "big corp" is "evil".

Use Occam's razor. Videos are expensive to store. Google gets 20 million video uploads a day.

I'm frankly shocked Google hasn't started deleting old garbage. They probably should start culling YouTube of cruft nobody watches.

asveikau 27 minutes ago
Videos are expensive to store, but generative AI is expensive to run. That will cost them more than storage allegedly saved.

To solve this problem of adding compute-heavy processing to video serving, they would need to cache the AI output, which uses up the storage you say they are saving.

echelon 16 minutes ago
https://c3-neural-compression.github.io/

Google has already matched H.266. And this was over a year ago.

They've probably developed some really good models for this and are silently testing how people perceive them.

hatmanstack 24 minutes ago
If you want insight into why they haven't deleted "old garbage", you might try The Age of Surveillance Capitalism by Zuboff. Pretty enlightening.
echelon 19 minutes ago
I'm pretty sure those 12 year olds uploading 24 hour long Sonic YouTube poops aren't creating value.
glitchc 15 minutes ago
Probably compression followed by regeneration during decompression. There's a brilliant technique called "Seam Carving" [1], invented two decades ago, that enables content-aware resizing of photos and can be applied sequentially to frames in a video stream. It's used everywhere nowadays. It wouldn't surprise me if arbitrary enlargements are artifacts produced by such techniques.

[1] https://github.com/vivianhylee/seam-carving
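For readers unfamiliar with it, the core of seam carving is a dynamic program over a per-pixel energy map. This is a minimal sketch of the standard algorithm (illustrative only, not code from the linked repo), assuming NumPy:

```python
import numpy as np

def min_vertical_seam(energy):
    """Find the connected top-to-bottom path of minimum total energy.
    Removing that seam shrinks the image by one column while sparing
    high-energy (detailed) regions."""
    h, w = energy.shape
    cost = energy.astype(float).copy()
    # Each cell accumulates the cheapest cost among its three upper neighbors.
    for i in range(1, h):
        for j in range(w):
            lo, hi = max(j - 1, 0), min(j + 2, w)
            cost[i, j] += cost[i - 1, lo:hi].min()
    # Backtrack from the cheapest bottom cell.
    seam = [int(np.argmin(cost[-1]))]
    for i in range(h - 2, -1, -1):
        j = seam[-1]
        lo, hi = max(j - 1, 0), min(j + 2, w)
        seam.append(lo + int(np.argmin(cost[i, lo:hi])))
    return seam[::-1]  # column index of the seam in each row

# Toy energy map: a cheap column at index 1 attracts the seam.
energy = np.array([[9, 0, 9, 9],
                   [9, 0, 9, 9],
                   [9, 0, 9, 9]])
print(min_vertical_seam(energy))  # [1, 1, 1]
```

Repeated removal (or, in reverse, seam insertion) is what makes resizing content-aware: detailed regions like faces accumulate high energy and are left alone.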

jazzyjackson 41 minutes ago
Totally. Unfortunately it's not lossless and instead of just getting pixelated it's changing the size of body parts lol
jsheard 32 minutes ago
What type of compression would change the relative scale of elements within an image? None that I'm aware of, and these platforms can't really make up new video codecs on the spot since hardware accelerated decoding is so essential for performance.

Excessive smoothing can be explained by compression, sure, but that's not the issue being raised there.

Aurornis 5 minutes ago
> What type of compression would change the relative scale of elements within an image?

Video compression operates on macroblocks and calculates motion vectors of those macroblocks between frames.

When you push it to the limit, the macroblocks can appear like they're swimming around on screen.

Some decoders attempt to smooth out the boundaries between macroblocks and restore sharpness.

The giveaway is that the entire video is extremely low quality. The compression ratio is extreme.
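For context, the motion vectors described above come from block matching: the encoder stores, per macroblock, an offset into the previous frame plus a small residual. A toy sketch of exhaustive block matching (illustrative only; real codecs like H.264 are far more elaborate), assuming NumPy:

```python
import numpy as np

def motion_vector(prev, cur, block_y, block_x, bs=8, search=4):
    """Exhaustive search: find the (dy, dx) offset in `prev` that best
    predicts the block at (block_y, block_x) in `cur`, by minimum sum
    of absolute differences (SAD)."""
    target = cur[block_y:block_y + bs, block_x:block_x + bs]
    best, best_sad = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = block_y + dy, block_x + dx
            if y < 0 or x < 0 or y + bs > prev.shape[0] or x + bs > prev.shape[1]:
                continue  # candidate block would fall outside the frame
            sad = np.abs(prev[y:y + bs, x:x + bs].astype(int)
                         - target.astype(int)).sum()
            if sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best

# Toy frames: a bright 8x8 square shifts 2 px right between frames.
prev = np.zeros((32, 32), dtype=np.uint8)
cur = np.zeros((32, 32), dtype=np.uint8)
prev[8:16, 8:16] = 255
cur[8:16, 10:18] = 255
print(motion_vector(prev, cur, 8, 10))  # (0, -2): block traces back 2 px left
```

At extreme compression the residuals are quantized almost to nothing, so blocks ride their motion vectors with little correction, which is exactly the "swimming" look.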

echelon 20 minutes ago
AI models are a form of compression.

Neural compression wouldn't be like HEVC, operating on frames and pixels. Rather, these techniques can encode entire features and optical flow, which can explain the larger discrepancies: larger fingers, slightly misplaced items, etc.

Neural compression techniques reshape the image itself.

If you've ever input an image into `gpt-image-1` and asked it to output it again, you'll notice that it's 95% similar, but entire features might move around or average out with the concept of what those items are.
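As a loose analogy (not YouTube's actual pipeline, and not a real learned codec), a truncated-SVD reconstruction shows how compressing into a few global components spreads errors across whole features rather than confining them to local blocks:

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.normal(size=(16, 16))  # stand-in "image"

# Keep only k singular components: the image is rebuilt from a few
# global patterns, so reconstruction error smears across entire
# features instead of appearing as local 8x8 block artifacts the way
# DCT-based codecs produce.
k = 4
U, s, Vt = np.linalg.svd(img, full_matrices=False)
recon = (U[:, :k] * s[:k]) @ Vt[:k]

stored = k * (U.shape[0] + Vt.shape[1] + 1)  # floats kept per image
print(stored, img.size)  # 132 vs 256: roughly half the data
```

Learned codecs replace the fixed SVD basis with a neural encoder/decoder, but the failure mode is analogous: the decoder hallucinates plausible global structure, so whole features can shift or average toward the model's prior.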

jsheard 15 minutes ago
Maybe such a thing could exist in the future, but I don't think the idea that YouTube is already serving a secret neural video codec to clients is very plausible. The dramatic increase in CPU usage alone would give away that something weird is going on, and nothing of the sort has been reported, nor has anyone noticed any weird undocumented streams showing up in tools like yt-dlp.
swatcoder 26 minutes ago
Selectively "compressing" content of interest into stylized cartoonish exaggeration probably does save some bytes on the wire, but it's an impactful alteration of the work that average people would naturally see as more like an unapproved "filter".

Why repeat the corporate line when, at best, this was an embarrassingly foolish execution?

If you look at any samples, you'll see that this isn't some brilliant new AI compression technique. It was irresponsibly deployed slop that's ripe to cause a backlash against them.

adzm 57 minutes ago
This is ridiculous
TazeTSchnitzel 50 minutes ago
The AI filter applied server-side to YouTube Shorts (and only Shorts, not regular videos) is horrible, and it feels like a deliberate case of boiling the frog. If everyone gets used to overly smooth skin, weirdly pronounced wrinkles, waxy hair, and strange ringing around moving objects, then AI-generated content will stand out less when they start injecting it into the feed. At first I thought this must be some client-side upscaling filter, but tragically it is not. There's no data savings at all, and there's no way for uploaders or viewers to turn it off. I guess I wasn't cynical enough.
api 46 minutes ago
I’ve been saying for a while that the end game for addictive short form chum feeds like TikTok and YouTube Shorts is to drop human creators entirely. They’ll be AI generated slop feeds that people will scroll, and scroll, and scroll. Basically just a never ending feed of brain rot and ads.
eagleinparadise 8 minutes ago
I buy into this conspiracy theory; it's genius. It's literally a boiling-the-frog strategy against users. Eventually, everyone will get too lazy to go through the mental effort of judging each increasingly convincing piece of content, asking "is this AI?" and spending energy trying to find clues.

And over time the AI content will improve enough where it becomes impossible and then the Great AI Swappening will occur.

bitwize 38 minutes ago
Yes, but what happens when the AIs themselves begin to brainrot (as happens when they are not fed their usual sustenance of information from humans and the real world)?
api 21 minutes ago
Have you seen what people watch on these things? It won’t matter. In fact, the surreal incoherent schizo stuff can work well for engagement.
echelon 43 minutes ago
Do you know how much data YouTube is having to store at scale?

Google isn't enhancing anything. They're compressing it.

Compressing videos could save Google an extremely large amount of money at YouTube scale.

Most of these shorts aren't even viewed more than a few times.

delichon 16 minutes ago
YouTube should keep their grubby hands off. And give that capability to us instead. I want the power to do personal AI edits built in. Give me a prompt line under each video. Like "replace English with Gaelic", "replace dad jokes with lorem ipsum", "make the narrator's face 25% more symmetrical", "replace the puppy with a xenomorph", "change the setting to Monument Valley", etc.
someothherguyy 10 minutes ago
I wonder how many years (decades?) out this still is. It would be wild to be able to run something like that locally in a browser. Although it will probably be punishable by death by then.
chao- 59 minutes ago
I learned to ignore the AI summaries after the first time I saw one that described the exact OPPOSITE conclusion/stance of the video it purported to summarize.
windex 39 minutes ago
There are entire fake-persona videos these days. Leading scientists, economists, politicians, and tech figures are being impersonated wholesale on YouTube.
acomjean 6 minutes ago
I saw this today where "influencers" were taking real doctors from videos and using AI to have them pitch products.

https://www.theguardian.com/society/2025/dec/05/ai-deepfakes...

AmbroseBierce 36 minutes ago
Speaking of AI, Google, and shady tactics, I wouldn't be surprised if we soon discover they are purposefully adding video glitches (deformed characters and so on) in the first handful of iterations when using Veo video generation, just so people get used to trying 3 or 4 times before they receive a good one.
VTimofeenko 29 minutes ago
Well the current models that cost per output sure love wasting those tokens on telling me how I am the greatest human being ever that only asks questions which get to the very heart of $SUBJECT.
data-ottawa 60 minutes ago
Are these AI filters, or just applying high compression/recompressing with new algorithms (which look like smoothing out details)?

edit: here's the effect I'm talking about with lossy compression and adaptive quantization: https://cloudinary.com/blog/what_to_focus_on_in_image_compre...

The result is smoothing of skin, which, applied heavily to video (as YouTube does; just look at any old video that was HD years ago), would look this way.
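The smoothing effect of heavy quantization can be illustrated with the per-block DCT-plus-quantization step used in JPEG/H.264-style codecs. A minimal sketch with toy values, assuming NumPy:

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis, as applied per 8x8 block in
    JPEG/H.264-style codecs."""
    k = np.arange(n)
    M = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    M[0] /= np.sqrt(2)
    return M * np.sqrt(2 / n)

def quantize_block(block, q):
    """Transform, quantize with step q, and reconstruct. A larger q
    rounds the small high-frequency coefficients to zero first, so
    fine texture (skin pores, grain) vanishes and the block comes
    back smoother than the source."""
    D = dct_matrix(block.shape[0])
    coeffs = D @ block @ D.T
    coeffs = np.round(coeffs / q) * q  # the lossy step
    return D.T @ coeffs @ D

rng = np.random.default_rng(1)
block = rng.normal(0, 1, (8, 8)) + 10  # noisy "texture" on a flat base
smooth = quantize_block(block, q=16.0)
print(np.var(block) > np.var(smooth))  # True: texture quantized away
```

Adaptive quantization just varies q per region, spending bits where the encoder thinks viewers will notice, which is why skin often ends up smoothed the hardest.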

randycupertino 58 minutes ago
It's filters, I posted an example of it below. Here is a link: https://www.instagram.com/reel/DO9MwTHCoR_/?igsh=MTZybml2NDB...
data-ottawa 37 minutes ago
It's very hard to tell in that instagram video, it would be a lot clearer if someone overlaid the original unaltered video and the one viewers on YouTube are seeing.

That would presumably be an easy smoking gun for some content creator to produce.

There are heavy alterations in that link, but having not seen the original, and in this format it's not clear to me how they compare.

randycupertino 25 minutes ago
You can literally see the filters turn on and off, making his eyes and lips bigger as he moves his face. It's clearly a face filter.
diputsmonro 8 minutes ago
To be extra clear for others, keep watching until about the middle of the video where he shows clips from the YouTube videos
ares623 55 minutes ago
The time of giving these corps the benefit of the doubt is over.
echelon 40 minutes ago
The examples shown in the links are not filters for aesthetics. These are clearly experiments in data compression.

These people are having a moral crusade against an unannounced Google data compression test thinking Google is using AI to "enhance their videos". (Did they ever stop to ask themselves why or to what end?)

This level of AI paranoia is getting annoying. This is clearly just Google trying to save money. Not undermine reality or whatever vague Orwellian thing they're being accused of.

skygazer 19 minutes ago
"My, what big eyes you have, Grandmother." "All the better to compress you with, my dear."
randycupertino 30 minutes ago
Why would data compression make his eyes bigger?
echelon 11 minutes ago
Because it's a neural technique, not one based on pixels or frames.

https://blog.metaphysic.ai/what-is-neural-compression/

Instead of artifacts in pixels, you'll see artifacts in larger features.

https://arxiv.org/abs/2412.11379

Look at figure 5 and beyond.

brailsafe 10 minutes ago
Whatever the purpose, it's clearly surreptitious.

> This level of AI paranoia is getting annoying.

Let's be straight here: AI paranoia is near the top of the most propagated subjects across all media right now, probably for the worse. If it's not "Will you ever have a job again!?" it's "Will your grandparents be robbed of their net worth!?" or even just "When will the bubble pop!? Should you be afraid!? YES!!!" And in places like Canada, where the economy is predictably crashing because of decades of failures, it's both the cause of and answer to macroeconomic decline. Ironically/suspiciously, it's all the same rehashed, redundant takes from everyone from Hank Green to CNBC to every podcast ever, late-night shows, radio, everything.

So to me the target of one's annoyance should be the propaganda machine, not the targets of the machine. What are people supposed to feel, totally chill because they have tons of control?

muppetman 33 minutes ago
They're heating the garbage slightly before serving it? Oh no.
superkuh 59 minutes ago
The citation chain for these mastodon reposts resolves to the Gamers Nexus piece on youtube https://www.youtube.com/watch?v=MrwJgDHJJoE
TheTaytay 35 seconds ago
Yes! Thank you! He is talking about AI generated summaries being inaccurate, which is plenty to get up in arms about.

A lot of folks here hate AI and YouTube and Google and stuff, but it would be more productive to hate them for what they are actually doing.

But most people here are just taking this headline at face value and getting pitchforks out. If you try to watch the makeup guy’s proof, it’s talking about Instagram (not YouTube), doesn’t have clean comparisons, is showing a video someone sent back to him, which probably means it’s a compression artifact, not a face filter that the corporate overlords are hiding from the creator. It is not exactly a smoking gun, especially for a technical crowd.

stevenalowe 53 minutes ago
Every YT short looks AI-ified and creepy now
koolba 38 minutes ago
What’s the point of doing this?

I don't understand the justification for the expense or complexity.

choilive 36 minutes ago
What PM thought this was a good idea? This has to be the result of some braindead "we need more AI in the product" mandate.
jeeeb 49 minutes ago
I really hate all the AI filters in videos. It makes everyone look like fake humans. I find it hard to believe that anyone would actually prefer this.
SilverElfin 39 minutes ago
I’ve also noticed YouTube has unbanned many channels that were previously banned for overt supremacist and racist content. They get amplified a lot more now. Between that and AI slop, I feel like Google is speed running the changes X made over the last few years.
MaxL93 53 minutes ago
"Making AI edits to videos" strikes me as a bit of an exaggeration; it might lead you to think they're actually editing videos rather than simply... post-processing them[1].

That being said, I don't believe they should be doing anything like this without the creator's explicit consent. I do personally think there's probably a good use case for machine learning / neural network tech applied to the clean up of low-quality sources (for better transcoding that doesn't accumulate errors & therefore wastes bitrate), in the same way that RTX Video Super Resolution can do some impressive deblocking & upscaling magic[2] on Windows. But clearly they are completely missing the mark with whatever experiment they were running there.

[1] https://www.ynetnews.com/tech-and-digital/article/bj1qbwcklg

[2] compare https://i.imgur.com/U6vzssS.png & https://i.imgur.com/x63o8WQ.jpeg (upscaled 360p)

ssl-3 49 minutes ago
Please allow me "post-process" your comment a bit. Let me know if I'm doing this right.

> "Making AI edits to videos" strikes me as something particularly egregious; it leads a viewer to see a reality that never existed, and that the creator never intended.

randycupertino 47 minutes ago
It's not post-processing, they are applying actual filters, here is an example they make his eyes and lips bigger: https://www.instagram.com/reel/DO9MwTHCoR_/?igsh=MTZybml2NDB...
MaxL93 40 minutes ago
Sure, but that's not YouTube. That's Instagram. He says so at 1:30.

YouTube is not applying any "face filters" or anything of the sort. They did however experiment with AI upscaling the entire image which is giving the classic "bad upscale" smeary look.

Like I said, I think that's still bad and they should have never done it without the clear explicit consent of the creator. But that is, IMO, very different and considerably less bad than changing someone's face specifically.

randycupertino 36 minutes ago
His followers also added screenshots of YouTube Shorts doing it. He says he reached out to both platforms, will be reporting back with an update from their customer service, and is doing some compare-and-contrast testing for his audience.

Here's some other creators also talking about it happening in youtube shorts: https://www.reddit.com/r/BeautyGuruChatter/comments/1notyzo/...

another example: https://www.youtube.com/watch?v=tjnQ-s7LW-g

https://www.reddit.com/r/youtube/comments/1mw0tuz/youtube_is...

https://www.bbc.com/future/article/20250822-youtube-is-using...

MaxL93 31 minutes ago
> Here's some other creators also talking about it happening in youtube shorts (...)

If you open the context of the comment, they are specifically talking about the bad, entire-image upscaling that gives the entire picture the oily smeary look. NOT face filters.

EDIT: same thing with the two other links you edited into your comment while I was typing my reply.

Again, I'm not defending YouTube for this. But I also don't think they should be accused of doing something they're not doing. Face filters without consent are a far, far worse offense than bad upscaling.

I would like to urge you to be more cautious, and to actually read what you brandish as proof.