I agree that AI work should not have copyright protection. Even with human intervention, it still collects data, without express permission, from numerous sources.
This will actually protect smaller artists by preventing giant companies from profiting from their work without credit or payment.
So we kill open-source models while proprietary models like Adobe's are fine, so they can be the only option and continue rent-seeking while independent artists eat dirt.
Whether or not the model learned from my art probably won't affect me in any way, shape, or form, unless I'm worried about being used as a prompt so people could use my style as a compass while directing their new image's aesthetic. Disney/Warner could already hire someone to do that 100% legally, so it's only the other peasants I'm worried about, and I don't think the peasants are the problem when it comes to the well-being and support of artists.
I believe a person can still sell or market AI-created art; I just believe they shouldn't have total ownership of the work.
Already most creators don't fret over fan art or fan fiction, so there is wiggle room for fair use. It's a lot like the game-modding scene: modders usually use pre-existing assets or code to create something new.
Generative AI models could be trained using only public-domain and royalty-free images. Should the output of those models be eligible for copyright, but not the output of models that also had unlicensed training data?
It seems there are two separate arguments being conflated in this debate. One is whether using copyrighted works as AI training data is fair use. The other is whether creative workers should be protected from displacement by AI.
Big companies like Adobe and Google can get the rights to use material to train their models. If stricter laws are passed, they will only slightly inconvenience the larger companies but might completely destroy the smaller companies or open-source alternatives.
The anti-AI lawsuits aren't going to stop AI art; they'll just determine whether or not it's completely controlled by the current tech giants.
Sadly, no matter what, the big media companies will have a huge advantage in everything because of decades of lobbying.
I think people should still be able to profit from selling the image itself; however, I don't think we have enough knowledge of how AI will truly impact things. If it becomes a minor fad, just a tool to help speed up a process, I think the law doesn't need to change much.
If AI becomes the majority creator on projects, then we have to have this conversation about who owns what.
Closed models will probably be the future, much like stock photos: people will have to pay to access them.
In the end big business will always fuck us over, copyright or not.
Let people play but not own AI work for now.