This should be a showstopper, critical-severity issue. I'm surprised it's still in the wild after being posted last Friday.

Imagine getting still frames of somebody else's bedroom photos, or kinky selfies they sent to a partner, rendered into your own video.

It would be very much the cloud-based version of the fictional Tyler Durden splicing dick pics into single frames of 35mm movie reels.

Given what people use phones for these days, some sizable percentage of iCloud-synced photos has to be content you really wouldn't want leaking to random other iCloud users.

Imagine having CSAM inserted into your videos, then getting prosecuted for it. Oh wait, they already scan for it; good on them.

Hm, what if you modified a CSAM video to evade detection (adversarial ML, etc.), then injected it onto a target's machine with this fun bug? When the target's computer automatically generated a thumbnail for the video, that thumbnail would be flagged.

Or just make your own fake/questionable hash collisions with the NeuralHash model Some Guy extracted and posted on GitHub: https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX
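For context on how that works, here's a rough sketch of computing a NeuralHash with that repo's ONNX export, paraphrasing its nnhash.py. The model.onnx and neuralhash_128x96_seed1.dat paths are whatever you extracted following the repo's README, and the pipeline (360x360 RGB, [-1, 1] normalization, sign bits of a 96x128 seeded projection) is my reading of that script, so treat the details as approximate:

    # Sketch of NeuralHash via the ONNX export, per AppleNeuralHash2ONNX.
    # Assumes model.onnx and neuralhash_128x96_seed1.dat were extracted
    # following that repo's README.
    import sys

    import numpy as np
    import onnxruntime
    from PIL import Image

    model_path, seed_path, image_path = sys.argv[1:4]

    # Load the exported network and the 96x128 projection matrix.
    # The seed file reportedly has a 128-byte header before the float32 data.
    session = onnxruntime.InferenceSession(model_path)
    seed = np.frombuffer(open(seed_path, "rb").read()[128:], dtype=np.float32)
    seed = seed.reshape(96, 128)

    # Preprocess: RGB, 360x360, scaled to [-1, 1], NCHW layout.
    img = Image.open(image_path).convert("RGB").resize((360, 360))
    arr = np.asarray(img, dtype=np.float32) / 255.0
    arr = (arr * 2.0 - 1.0).transpose(2, 0, 1)[np.newaxis]

    # Run the network to get a 128-dim embedding, project it through the
    # seed matrix, and take the sign of each component as one hash bit.
    embedding = session.run(None, {session.get_inputs()[0].name: arr})[0]
    bits = seed.dot(embedding.flatten())
    bit_string = "".join("1" if b >= 0 else "0" for b in bits)
    print("{:024x}".format(int(bit_string, 2)))  # 96 bits = 24 hex digits

Since the whole hash is just sign bits of a learned embedding, two visually unrelated images can be nudged until they print the same 24-hex-digit string, which is exactly what the collision demos against this model did within days of the extraction.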