It's a truism of the TV business that the camera adds ten kilograms. And we all know about the airbrushing/photoshopping of models on magazine covers and in adverts. We need AI to get the lip-sync right, in other words, before today's simplistic AI-generated video porn turns toxic.
In this case, Rule 34 applies: it's being used to take porn videos and replace the actors with film stars.
The software runs on a high-end GPU and takes quite a while—hours to days—to do its stuff, but it's out there and it'll probably be available to rent as a cloud service running on obsolescent bitcoin-mining GPU racks in China by the end of next week.
It doesn't matter how insanely CPU-intensive this sort of application is: some dipshit with no social insight and an underdeveloped sense of morality is going to deploy it in an attempt to monetize us.
This is procedural porn, tailored to appeal to the micro-targeted audience's kinks, even if they don't think they have any (use A/B testing to see which random fetish images get their attention, then converge).
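That test-and-converge loop is essentially a multi-armed bandit. Here's a minimal sketch of the idea as epsilon-greedy selection; every name and number in it (the tag list, the response rates) is a made-up illustration, not anything from an actual system:

```python
import random

def epsilon_greedy(categories, got_attention, rounds=1000, epsilon=0.1, seed=42):
    """Converge on whichever category draws the most attention.

    categories:    list of tag names to test.
    got_attention: callable(category) -> bool, simulating whether the
                   viewer's attention lingered on an image from that tag.
    """
    rng = random.Random(seed)
    shows = {c: 0 for c in categories}
    hits = {c: 0 for c in categories}
    rate = lambda c: hits[c] / shows[c] if shows[c] else 0.0
    for _ in range(rounds):
        if rng.random() < epsilon:      # explore: show a random image
            c = rng.choice(categories)
        else:                           # exploit: show the current best guess
            c = max(categories, key=rate)
        shows[c] += 1
        if got_attention(c):
            hits[c] += 1
    return max(categories, key=rate)

# Toy simulation: this hypothetical viewer responds to "tentacles" 30% of
# the time and to everything else 5% of the time (purely invented numbers).
rates = {"tentacles": 0.30, "feet": 0.05, "latex": 0.05}
rng = random.Random(0)
best = epsilon_greedy(list(rates), lambda c: rng.random() < rates[c])
print(best)
```

After a thousand simulated impressions the loop has converged on the category with the highest response rate, which is exactly the "then converge" step: no prior knowledge of the viewer's kinks is needed, only a feedback signal.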
It's all very well to fake up a video of David Cameron fucking a goat, but without the bleating and mindless quackspeak it's pretty obvious that it's a fake.
Being able to train a network to recognize the cadences of our target's intonation, though, and then to modulate a different speaker's words so they come out sounding right, takes it to a whole new level of plausibility for human viewers, because we give credence to sensory inputs based on how consistent they are with our other senses.

For those of you who don't read the links: you can train off-the-shelf neural networks to recognize faces (or other bits of people and objects) in video clips. You can then use the trained network to edit them, replacing one person in a video with a synthetic version of someone else. The political consequences of this toxic metastasis of "false news" I leave for discussion in the comments.

For a while now there's been a very weird phenomenon on YouTube, whereby popular children's videos are pirated, remixed, and re-uploaded as advertising delivery vehicles.
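The recognize-then-replace pipeline described above can be caricatured in a few lines. This is a deliberately toy sketch, not a deepfake implementation: real systems run convolutional networks over pixels, whereas here a "face" is just a stand-in feature vector, the "trained network" is a nearest-centroid matcher, and all the labels and numbers are invented for illustration:

```python
# Toy recognize-then-replace pipeline: train a matcher on labelled "faces"
# (here, 2-D feature vectors), then sweep through the "video" replacing
# every frame recognized as the target with a synthetic stand-in.

def centroid(samples):
    n = len(samples)
    return tuple(sum(v[i] for v in samples) / n for i in range(len(samples[0])))

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def train(labelled):
    """labelled: {name: [feature vectors]} -> {name: centroid}."""
    return {name: centroid(vs) for name, vs in labelled.items()}

def recognize(model, face):
    """Return the name whose centroid is nearest to this face."""
    return min(model, key=lambda name: dist2(model[name], face))

def swap(model, frames, target, replacement_face):
    """Replace every frame recognized as `target` with a synthetic face."""
    return [replacement_face if recognize(model, f) == target else f
            for f in frames]

# Hypothetical training data: a few labelled frames of two people.
model = train({
    "actor":      [(0.9, 0.1), (1.0, 0.2), (0.8, 0.0)],
    "politician": [(0.1, 0.9), (0.0, 1.0), (0.2, 0.8)],
})
video = [(0.95, 0.15), (0.1, 0.95), (0.85, 0.05)]
faked = swap(model, video, target="actor", replacement_face=(0.05, 0.9))
print(faked)  # frames 0 and 2 (recognized as "actor") are replaced
```

The point of the sketch is the shape of the attack, not the maths: once recognition is a commodity, "edit this person out and that person in" is a loop over frames, and everything hard has already been done by the off-the-shelf network.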