At this point of social decay, I suspect that it's often the children themselves doing it, rather than predators.
Really? There’s like 10 examples here of children being abused and your response is to talk about how kids are bad?
Yeah, I mean, I'm not saying there are no predators out there, but with society this degraded, I think the majority of children on OF are not exactly victims of a predator.
What you’re experiencing is generational bias and forming an opinion without actual facts. Be better.
It's a matter of common sense… what did you expect to happen with so many abandoned children (missing a dad, a mom, or both), with these ideologies that objectify people, and with the easy access children have to the Internet today? You don't need to be a genius to see that most of the children on OF are probably not there because of any particular predator.
I understand there's a lack of studies on this, but I believe that even if you go looking, you'll find no information available.
“majority”
You sound like you feel guilty after you masturbate and your favorite expedition was the Mayflower
I've said this for years, while others online claimed OF is so safe. You don't know the full story. It is not necessarily a more ethical business model than traditional porn.
The main thing I’m getting from the article is that adults who try to profit from abusing children on OnlyFans get arrested.
The ones who get caught, yeah.
It seems to me that OnlyFans takes several steps that make it easier for police and prosecutors to do their job, some of which are detailed in the article. What additional steps do you think they should take?
None. My point is that there’s a shit ton of abuse out there. Only a tiny amount is caught
You said the people who do this get arrested. But they don’t. Only a tiny fraction get caught and arrested
I guess I’m thinking about this from a solutions-oriented mindset: what concrete, achievable steps can various entities take to reduce child abuse? For an adult content platform, the identity verification steps OnlyFans uses seem reasonable; even when an offender uses an account owned by someone else, that often provides enough of a lead for investigators.
It’s definitely more ethical in that it allows the creators to get more of the money.
But that doesn’t mean there aren’t issues. Clearly it doesn’t fix the whole issue of people under 18 sometimes making their way into the industry. It’s not a silver bullet that fixes an entire (sometimes problematic) industry.
I don’t know who was trying to tell you that OF solves everything about the porn industry, there would be no further issues, and that we’d live happily ever after, but that was obviously never going to be the case.
OF takes 30%, I think. What does an average scene make on OF? How does that compare to the pay rate for old-school porn?
I think receiving 70% of prices that you yourself set and deem acceptable is likely better than ?% of whatever PornHub or xHamster say they made from your video, predominantly through ad revenue.
At the very least, it gives creators a great deal more control: over setting prices, over creating the content they want to make rather than what a production company dictates, over how they advertise, over whether to lock their content behind a paid tier, etc.
And 30% is also pretty standard. Google, Apple, Valve, etc. all charge 30%. Shit, on Twitch it's 50%, IIRC. I'm not saying it's perfect and couldn't be cheaper, but it's the usual market rate.
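To make the comparison concrete, here's a minimal sketch of the arithmetic behind those cuts. The percentages are the rough figures mentioned above (~30% for OnlyFans and app stores, ~50% for Twitch subs), not verified current rates:

```python
# Illustrative arithmetic only: the creator's take-home pay under
# different platform revenue cuts. Cut percentages are the approximate
# figures from the discussion above, not confirmed current rates.
def creator_payout(gross: float, platform_cut: float) -> float:
    """Return the creator's share after the platform takes its cut."""
    return gross * (1.0 - platform_cut)

for name, cut in [("~30% cut (OF, app stores)", 0.30), ("~50% cut (Twitch subs)", 0.50)]:
    print(f"{name}: creator keeps ${creator_payout(1000, cut):.2f} of $1000")
```

So on $1000 of sales, a 30% cut leaves the creator $700 versus $500 at 50%, before any payment-processing fees, which this sketch ignores.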
Also, the article isn't even so much about underage users trying to get on the platform to post pictures of themselves or to gain access to porn; OF seems to be fairly good at keeping them out. It's about adults posting content involving minors, and that's a much harder problem to prevent without literally going through every upload manually.
It's almost like creating a platform whose whole point is for users to post content without any kind of curation or manual review is itself a flawed idea. I understand how tempting it is to set up a platform that lets you be a passive middleman and take a cut of all activity on it.
There should be a law that if a platform is making money from content, it is also responsible for that content. Curation shouldn't be mandated by law, but the legality of the content should be, whether the content is illegal in itself, as in this case, or fraudulent. Ads included.
You do realise how ironic posting that to Lemmy of all places is?
I'm talking about commercial platforms whose whole idea is to scale up to the point where some small fee yields large revenues. Companies often scale beyond their capacity to review the content on their platform, and others end up hurt in the process while the company profits from it.