June 20, 2023

I used AI to "uncensor" the images. Don't know if it's allowed or even wanted, but these are the results.

Edit: Removed until mod confirmation.
June 20, 2023

@vatras While you're at it, you might as well AI-unblur these... https://www.bellazon.com/main/topic/17352-josephine-skriver/?do=findComment&comment=5561945

Input to ChatGPT: "Please unblur Jojo's 🍑"
June 20, 2023

Done already. The one on the bed does not work, at least not to a good-looking degree.

Edit: Removed until mod confirmation.
June 20, 2023

1 hour ago, vatras said: I used AI to "uncensor" the images. Don't know if it's allowed or even wanted, but these are the results.
June 21, 2023

23 hours ago, Robyc said: This is not OK. Delete this. You will be banned. These are like fakes, AI or not. If Josephine or Cameron posts the uncensored photos, we will see them then. I think it's borderline criminal, what you have done.

I would like to get a mod's opinion on that. Like you said, these are basically fakes; that should be clear. Until I get an official statement from the mod team, I will delete them. So could a mod please clarify whether AI "uncensored" images are okay or not?
June 21, 2023

I wouldn't go as far as to say that the deblurring of those images is criminal, or even "fake" in the traditional sense of how the term has been used to identify obviously fake pictures. If you apply that definition across the board, then technically all the pictures you take with your modern smartphone are fake, because they are not 100% based on the optics but rely on computational photography algorithms and HDR to produce the best-looking picture. Also, the "photoshopping" done by SI and VS and other editorials to make models look "better" could fall under the definition of fake. Heck, the filters people use for their selfies could be considered "fake" too. Lastly, native smartphone photo apps now have built-in editing tools that "unblur" photos. So yeah, a can o' worms could be opened.

I think it really boils down to fair use and intent. In this case, @vatras just ran the images through an AI app to remove the blur; his intent was not to deceive or to make them fake, e.g., by enlarging a body part or superimposing someone else's body part from another image onto the original. So in my humble opinion, the deblurred pictures are OK. I think a better categorization would be original vs. edited. And you could also simply add a transparent watermark that overlays the photo, indicating that it has been edited from the original, with text like "edited" or "deblurred".

Just my 8 cents (adjusted for inflation). Carry on, folks!
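For anyone curious how such a marker could be applied in practice, here is a minimal sketch using Python's Pillow library. The file names, label text, and placement are illustrative assumptions, not anything @vatras actually used:

```python
# Minimal sketch of the "transparent watermark" idea, assuming Pillow is installed.
from PIL import Image, ImageDraw, ImageFont

def add_edit_watermark(src_path: str, dst_path: str, label: str = "deblurred") -> None:
    base = Image.open(src_path).convert("RGBA")
    # Draw the label on a fully transparent layer the same size as the image.
    overlay = Image.new("RGBA", base.size, (255, 255, 255, 0))
    draw = ImageDraw.Draw(overlay)
    font = ImageFont.load_default()
    # Semi-transparent white text near the bottom-left corner.
    draw.text((10, base.size[1] - 20), label, font=font, fill=(255, 255, 255, 128))
    # Composite the overlay onto the original; PNG keeps the alpha channel.
    Image.alpha_composite(base, overlay).save(dst_path, "PNG")

# Hypothetical usage; "original.jpg" stands in for whatever image was edited.
add_edit_watermark("original.jpg", "original_marked.png")
```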
June 22, 2023

Using an AI to unblur the images isn't illegal or fake. Unless we get a complaint from the model or photographer, I don't particularly care, but use your best judgment. Obviously they were not released uncensored; whether that is because they are going to be in a publication at a later date or because Jo has requested they be censored, I don't know. Just clearly mark that they have been modified.
June 22, 2023

I'm surprised that everyone focused on the blurred parts of the photo. In terms of content, they are great. In my opinion, they lose heavily due to poor quality and resolution; they're full of noise. And as for AI: if the blur covers something unique, then it should remain that way.
June 22, 2023

They try to make money off us, sell us their clothes, lipsticks, and colognes, and you're worrying about the morality of using AI to unblur their naked asses. Progress cannot be stopped. Use AI at full capacity and don't ask for permission. No one cares, as long as they continue to make a profit.
June 22, 2023

So, with @phenobarbie's okay: these are the AI-uncensored images. I added a little watermark as requested. Please note that this is not some magic that "uncensored" the image or recovered the unreleased original; it's an edit made by an AI that may or may not resemble the original. It goes without saying, but I don't want to offend anyone with these pics, especially not Josephine or Cameron. I just personally think that these blurs ruined the overall image, same as white bars in front of nipples or big white borders on the side of a pic to get the magazine/Instagram effect. In the case of Josephine, nothing was added that we haven't seen before, since her naked bum can be seen with a quick Google search, so I don't understand why it was even censored in the first place.
June 22, 2023

@vatras Since those photos are from IG, I'm guessing they were censored so they wouldn't get flagged by the IG algorithms. That's why I'm thinking that if the pics show up on Cameron's website, they will be the uncensored versions. Let's hope that comes to fruition. 🙏
June 22, 2023

15 hours ago, phenobarbie said: Using an AI to unblur the images isn't illegal or fake. Unless we get a complaint from the model or photographer, I don't particularly care, but use your best judgment. Obviously they were not released uncensored; whether that is because they are going to be in a publication at a later date or because Jo has requested they be censored, I don't know. Just clearly mark that they have been modified.

I'm not sure how you can say that. We don't know what's under the blur. Maybe the model is not naked, and this is just a trick by the photographer to make us think otherwise. The AI isn't magic; it draws what you tell it to. If the photos were published that way, who are we to say or do otherwise? Because if the line is drawn where you just put it, I could take the photo where she's covered in soap, tell the AI to remove the soap, and publish that. It's a big ethical problem, much more complex than your view. Worst-case scenario: the model or photographer gets upset when things like this start circulating on the internet. Best-case scenario: maybe they'll just never post blurred photos again, if they don't want to see them retouched with AI on the internet. The point is, it's not some user's place to decide to post them unblurred with AI.