About AI technology: a plea for regulation and compensation for artists' contributions.

Discussion started by iterateCGI

Let's reflect a bit with the community here on the rapid developments in the AI space and the changing landscape they create for artists.

As we (probably most artists?) know by now, AI art/image generation (also AI generation of games, storytelling and music) has become hugely popular and will only keep growing as it matures in scope and quality. It's a very powerful tool for quickly conceiving and crafting ideas, something we as 3D artists can appreciate very much, right?

As fantastic as it all is, there are some caveats to consider though.

Consider the latest developments regarding "Stable Diffusion", probably one of the most popular AI generators right now, made by the developers at Stability AI. Let's hear some thoughts from the CEO of this company (https://www.youtube.com/watch?v=k124oUlY_6g) and some announcements of what's to come (https://www.youtube.com/watch?v=1Uy_8YPWrXo).

Sounds super fantastic, yes?

I agree, but…

Notice how often these very smart people mention the term “DATA”.
Data is what drives these AI machines: no data, no magic.
So data from what, from where, and from whom?
From all of us: it's the images/pixels, vectors, sounds and stories we produce.
Apparently it's all free for the grabbing by these companies and ready to be converted into piles of money (for them). Apparently they can do this unhindered, under the pretense of a legal loophole: copyright exceptions for scientific research purposes.

So now, every time we upload an image (or a 3D model, or whatever) to the internet, we potentially feed these "scientific" companies with free "DATA". And it's apparently all fine because they also provide these powerful tools to the masses for free?

Note:

Some build GPU rental services on top of them, and others are starting to notice the potential of selling custom-trained AI models (for top dollar) or keeping them to build exclusive subscription-based services around them (e.g. Midjourney). In short: commercial applications that do not pay a single penny in royalties to the creators who made the magic possible.

We can also see this in Shutterstock's move to integrate DALL-E.

They made a deal to integrate the technology into their platform and have already announced to the community that they will only allow AI images made with their own proprietary AI on the platform (for security reasons), and that they will "compensate" contributions.

Sounds like a great opportunity: feeding a proprietary Shutterstock super-AI training dataset that would eventually render any artistic intervention obsolete? Existing highly skilled contributors/sellers (who have built this huge image library over years) can now expect a million more contributors/competitors who need no skills at all. They will feed the exclusive Shutterstock AI training library beyond comprehension, using the image machine that all those previous contributors unknowingly helped build (with their own data).

Isn't it happening far too often that creators do all the heavy lifting and the big companies eventually get all the value and power?

As open and sharing-minded as companies like Stability AI appear, the company is now valued at a billion dollars, having used a freely available dataset from LAION (https://laion.ai/) to deliver Stable Diffusion, a magical proof of concept (now thrown to the public to prove its success).

And a success it is, now that it is evident that a large crowd pays up to 50 dollars a month in subscriptions to use this type of magic, and investors are lining up to get a piece of the cake.

Note however that the LAION dataset was initially set up to serve scientific research only, and that they (for safety reasons regarding potential infringement) do not store or provide images, only links to them. This means that whoever does the actual AI training needs to download the images into the VRAM of their GPU servers, in this case presumably the servers of Stability AI, which is all a very questionable practice.

No wonder this has created a backlash from many artists who have found links to their work in these AI training datasets without ever being told about it.

Imagine you spend half your life scraping by while building special art skills; then someone finds a way to capture those skills and throw them to the public, gets rewarded for it with millions, while you get not a single penny (except a free copy of the skillset). Meanwhile the technology is out and everybody can do the same thing individually: train custom AI models on art/data they can grab anywhere on the net and sell those models, or their output, for a profit (individuals who never needed to master anything).

In conclusion, we need to understand that everything we make and publish is in fact "DATA" that can be downloaded into the VRAM of a GPU to extract some magic that can be used in a variety of ways (e.g. endless variations of the same thing at the push of a button), and thus we need to re-imagine how we are going to license out this data for such purposes.

Every artist needs to consider whether he or she cares about this or not.

Those who do care need to take a stance and demand a proper legal framework for this.
For one, it cannot become normal practice that our data can be used for this without any restrictions.

For a start, it would probably make sense for the standard CGtrader license to get an extra paragraph specifically prohibiting any data from CGtrader from being used for AI training. That way we would have some legal ground to defend against unhindered data grabbing by these companies or by individual AI trainers. Also, if CGtrader gets approached by these companies with requests for data, then please let the artists know?

It should be an individual choice whether or not someone licenses out his/her data for AI training.

Maybe there is a positive side to this: there is already demand for custom-trained AIs, so maybe this could be an entirely new market for artists, providing quality content/data and special licenses for AI training (if they do not wish to do the AI training themselves). That could be better than just standing there watching the big corporations (and the general public) take it all and harvest the value?

The message to those companies (and the general public) should be: if you don't know how you are going to "properly" license and compensate artists' work for this type of use, then please don't use their work this way until there is a proper framework to exchange/compensate/license the data.

We don't necessarily need to fight against these fantastic technologies; there is vast potential in them for everyone. We do however need to condemn the careless use of our data and demand a fair exchange.

!!Make no mistake, 3D data is next for the grabbing; it's a massive pile of gold we have built here!!

If you want to support the ideas/concerns expressed here, then share them, inform yourself, and maybe join one of the boards found in the description of this video (https://www.youtube.com/watch?v=tjSxFAGP9Ss).


Let's talk and explore how we can push the balance toward a healthy exchange!

Answers

Posted about 2 years ago
0

This is just a rundown for anyone who wants the TL;DR:

1. Using AI does not pay the artist any royalties.
2. Shutterstock etc. who use AI are creating a way to 'compensate' contributors.
3. Creators are doing the heavy lifting and the big companies get all the value/power.
4. Customers can pay a subscription to use the AI-generated images/3D.
5. Companies profit from artists' hard work / special skills using AI technology.
6. First conclusion from the author: everything we artists publish is used as data; how do we license it?
7. Does the artist even care?
8. CGT should provide some legal framework to protect 3D artists.
9. Positive side - could provide a new market, better-quality products, and licenses for artists to profit from.
10. Open a discussion about the situation moving forward.

To be honest, like all technology it usually means less mundane work and actually getting to focus on the creative aspect. I remember before Wacom tablets I had to draw things by hand, scan them, trace over them, paint them, etc. Thank god we don't do that anymore.

Regarding compensation for artists: companies like Shutterstock will pay artists if this is indeed the case, since as a business it's only wise to compensate artists for legal/moral reasons.

Creators have always done 'the heavy lifting'. If you don't like it, do technical/managerial-type work: it pays more, you do less, and you output/deliver 10x more than a single 'artist' can do in a day. Alternatively, take out your savings or get a loan and build a company - a scary thought for those who do not dare to be brave.

Does the artist even care? Let's find out. Personally it doesn't bother me if they use my artwork to train AI; I use AI, so it helps me too. If there is a financial benefit, even better, but at the end of the day I'm focusing on the next step - which means technology needs to get smarter - win/win.

Pretty sure that covers your first 9 points, half of which cover the same topic. The 10th point is my contribution to this post, however meaningless it is in the long run.

Posted about 2 years ago
0

Thanks for the TL;DR version; the main questions apparently got lost in the long version.
I agree with most of your reply, but the main questions remain unanswered or up for debate.

Maybe some reframing?

1. Agreed, automation of mundane technical tasks is welcome (just respect copyright while making the tech products?). Giving the tech product away for free does not justify the art theft in the first place.
2. The Shutterstock example is beside the main point; the main question is: is commercial activity with your art acceptable, no license required, no questions asked? (Is the biggest art theft in human history just a footnote?)
3. Your personal stance of taking no further action is noted and respected.
4. I totally agree on the need for smarter tech, but I don't agree that artists should be fine with art theft for commercial tech applications without compensation. Note that we can get a copy of Stable Diffusion for free, but only the models getting further training from further art theft produce the best results, and those are not free, not even for the ones whose art was stolen.

I just want to point out here that standard licenses are not equipped for this new type of use (AI training). AI training is currently done on large sets of GPUs, but soon it will be possible to train on a single personal GPU.

You and I will be able to train/customize an AI by grabbing data from anywhere on the net (if we want). Personally I would want to pay the artist if I used his/her art to train my custom AI (and then generate countless variations from it); would you?

So, again part of the main question: do we need a legal refresh for this new type of use?
Maybe just a new category and separate license type?

Are we all just going to grab everything we can get our hands on to train our custom AIs, then drop the output here on CGtrader?

Second question: are we fine with CGtrader at some point (like Shutterstock) exchanging all our data here for some 3D-model AI generator (trained on our work), then silencing the questions asked with some nice promises? (Maybe instigating some war and division between the pro and con camps?)

Should CGtrader adjust the standard license for this type of use?

I hope this is more to the point?

Posted about 2 years ago
0

Thanks for clarifying. 

So if I understand you correctly, your first question is regarding licensing. Are you aware that you are able to use a "Custom License" type? "Custom License allows the uses of the product as specified by the designer."

You make a good point however regarding the current license restrictions.

Regarding your question about using AI to sell the output on CGT: if we generate textures and use neural filters, sky-replacement AI tools, etc., then why not generate models? There are always people who will try to abuse the system to their benefit (selling en masse), but this problem has always existed (100 models of the same person with a different color shirt). For the above-average artist, or those with a moral conscience, it will just be another tool at their disposal. Times change, workflows change, and like all changes we must adapt.

Regarding your second question about stock sites using the data for training: at this point I suppose you can only add a Custom License to your content. You can always contact CGT and inquire directly for clarification. If the stock sites you sell on don't align with your intended use, then you can always take your content off that platform.

If you are concerned about being silenced, then make sure your Custom License is verified by a competent lawyer. Not sure if you read about the case of Capcom being sued by Juracek for the unsolicited use of her photography in their video games - the case is still pending, but it shows that individuals can stand up for their rights if they have done their research and are prepared to protect their IP.

iterateCGI wrote
Yes, I have a custom license and will update it soon. It will however not stop these big companies from taking the content for AI training. Should there be a community effort to send a strong message that this is unjust? In this case, for example, bringing the CEO of Stability AI to court for a hearing on how he justifies downloading millions of copyrighted works to the VRAM of his GPU servers?
3DCargo wrote
Who said they will be taking the content without asking?
iterateCGI wrote
Because that's what they are doing right now.
3DCargo wrote
Haven't heard about CGT or any 3D stock site agreeing to this as of yet; do you have any reference for this?
iterateCGI wrote
For now it's only the presentation renders of the 3D models being taken, but better prepare to protect the 3D models themselves; maybe renders from different angles will be sufficient for AI in the near future?
Posted about 2 years ago
0

Just to add to this, I'm sure you have read Shutterstock's new policy regarding AI content:
https://support.shutterstock.com/s/article/How-is-Shutterstock-bringing-AI-generated-content-to-their-platform?language=en_US

I assume 3D stock sites would follow suit and give the contributors an opt in/out system. If not, best to clarify.

These items from their article are interesting points from my point of view:
1. In expanding its partnership with OpenAI and launching the Contributor Fund to compensate artists for their contributions...
2. However, we will not accept AI-generated content being directly uploaded to our library because we want to ensure the proper handling of IP rights and artist compensation.
3. Because AI content generation models leverage the IP of many artists and their content, AI-generated content ownership cannot be assigned to an individual and must instead compensate the many artists who were involved in the creation of each new piece of content. (as per my opt in/out comment)
4. We’re committed to unlocking this opportunity for our customers but want to do so in an ethically responsible way, so please stay tuned... (leads back to our previous topic, protect your IP if it's in your interest until new license types are available).

iterateCGI wrote
It was just food for thought, not the main point; Shutterstock will figure it out. However, they could have told the community beforehand that they were going to use their work to train an AI before actually doing it? That's just an idea.
3DCargo wrote
Well, I am a member of Shutterstock, and I have contacted them to see if this is opt-in/out - I can let you know when I find out... if they tell me. Shutterstock shut down their forums a while ago after drastically reducing the income for contributors, which may also have an effect on TS in the future, but I'm no fortune teller. This also means it's difficult to ask who has accepted this new AI policy, so I'm not sure if they 'told them' anything, or if they have asked the artists involved - obviously not me at this point.
Posted about 2 years ago
1

Now that the problem has been raised, in my opinion the best thing would be prevention.
I think the best way is to add a clause to the standard license that expressly prohibits using the protected material in AI training.
Starting from there, naturally, each creator can reach whatever agreements he considers suitable with those interested in acquiring his material.
But this way, one starts from a specific, already established copyright protection without having to argue it case by case.
In my opinion, CGT would do well to take this initiative, and it surely would if enough authors demand it.

Posted about 2 years ago
0

Sorry, once again (for newcomers to the debate), maybe a smaller and better recap of the core issue and the points to debate.

Rough picture of matters thus far:

A company (LAION.ai) provides huge datasets of links (billions) to images (much of it copyrighted material), including metadata with their descriptions (they stay out of reach of the law because they store no images/work directly). Many companies are now involved with these datasets (e.g. Midjourney, DALL-E, Shutterstock, etc.), and the public is also getting involved/included.

The work of many artists is in these link-sets (yours probably too), and more gets added/updated frequently. When a company uses these link-sets for AI training, the actual images/data (several million) get downloaded to the VRAM of GPU servers ((the core questionable event happens here)) to produce the end product (e.g. Stable Diffusion, DALL-E, Midjourney). Once done, they delete the images from VRAM (evidence gone). When the end product is solidified it is very hard to trace back the images involved (some essence of the data potentially remains in a single file), and what is in cannot be taken back out (a lot is in now and more keeps going in; the products have to compete, thus have to get better and need more essence).
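To make that "download to VRAM" step concrete, here is a minimal sketch of how a link-set of URLs and captions becomes actual training images in memory. This is my own illustration, not Stability AI's actual pipeline, and the file name and column names ("url", "caption") are hypothetical:

import csv
import io

import requests
from PIL import Image

def load_training_batch(linkset_csv, limit=100):
    # Fetch the images referenced by a link-set file into memory;
    # a training pipeline would then move these onto the GPU.
    batch = []
    with open(linkset_csv, newline="", encoding="utf-8") as f:
        for i, row in enumerate(csv.DictReader(f)):
            if i >= limit:
                break
            try:
                resp = requests.get(row["url"], timeout=10)
                resp.raise_for_status()
                img = Image.open(io.BytesIO(resp.content)).convert("RGB")
                batch.append((img, row["caption"]))  # image plus its text description
            except Exception:
                continue  # dead or blocked links are simply skipped
    return batch

The point of the sketch: at no step in this loop is the artist asked anything; the only thing standing between the published work and the training run is a public URL.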

The companies involved use loopholes in copyright law, under the pretense of scientific use, to commit the largest-scale "art theft" ever witnessed (to build billion-dollar tech companies/products)?

Most of us seem OK with it because some parts of the product are thrown to the public for free?

The companies will probably go full throttle (3D next) if they notice public discourse is OK with this type of default opt-in, as long as they provide some parts of the product for free?

The general public is probably going to do the same type of unlicensed art theft to customize their own AI instances if there is no common ground on this type of use case?

Is this soon to be artistic cannibalism on a never-before-seen scale?

Very few care, so it's fine?

What can we do if we care?

Just a few informed artists updating their custom licenses will probably have little to no effect?

CGtrader globally changing its license might have more effect?

Would advocating for a separate license, category and repository for AI-training content, and prohibiting the use of everything outside of it, be a better option?

Maybe some type of pixel tag applied while uploading work (a browser extension, or a tool for publisher platforms) that the automated AI image crawlers would then need to respect? (See the sketch after these questions.)

Opt-out as the default, with opt-in as an option?
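To illustrate the pixel-tag/opt-out idea from a few questions back: below is a minimal sketch of the check a well-behaved AI crawler could run before downloading an image for training. The "noai"/"noimageai" directives are just one possible convention (a few platforms have announced similar tags), and the function name and logic are my own illustration, not an existing standard implementation:

import requests

def may_use_for_training(image_url):
    # Return False when the server signals that AI-training use is not wanted.
    try:
        head = requests.head(image_url, timeout=10, allow_redirects=True)
    except requests.RequestException:
        return False  # unreachable: err on the side of not using the work
    robots_tag = head.headers.get("X-Robots-Tag", "").lower()
    return "noai" not in robots_tag and "noimageai" not in robots_tag

if may_use_for_training("https://example.com/artwork.jpg"):
    pass  # only now download the image and add it to a training set

Of course, something like this only matters if opt-out (or better, opt-in) becomes a rule the crawlers are actually required to respect, which loops back to the licensing question.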

Posted about 2 years ago
0

I was curious and just tried out the Stable Diffusion demo software, and I was not impressed. The images it generated were nowhere near the stock-image quality that would be accepted by a stock photo agency like Shutterstock. However, I know that with the rapid advance of technology this will probably improve a lot in the next few years. As a funny side note, being the human I am, I tried to see what images it would give me with various controversial or explicit words, just for laughs. With many of the words it gave me an "unsafe content" warning. However not always: when I typed in something as simple as "sex" it did give me some very weird and distorted explicit images. I also tried something ambiguous, the word "creampie", which in English has some, uh, let's say different meanings. Images of delicious dessert pies did not come up; instead some explicit sexual images, although very weird and distorted ones, came up, haha. So I wonder where they get some of their "data" from. I guess they have some work to do on their unsafe filter. I also typed in some English words that have the double meaning of a sexual organ and a jerky, mean person. Various distorted images of certain politicians came up; make of that what you will.

Posted almost 2 years ago
0

At least the artists on ArtStation are clear about it (https://imgur.com/a/KNTDqCC).

Posted over 1 year ago
0

Well, the news is a week old, but maybe it's a good opportunity to keep this thread going, because I think it is quite important. TurboSquid is partnering with Nvidia to harvest the data of 3D content creators who haven't specifically opted out, in order to train Nvidia's 3D-model AI.

A great opportunity for CGtrader to join Sketchfab and Epic Games in their "no harvesting of content creator data for AI training" policy.
