It only took a few years before artificial intelligence became the focal point of various lawsuits. One brought by a trio of artists wasn't the first, but it is among the first, and it is the first AI lawsuit not to be thrown out.

The artists say they are trying to safeguard their intellectual property.

The artists sued three AI art companies: Stability AI, Midjourney, and DeviantArt. They say specific examples of their works were lifted, noting that the AI-generated images were remarkably similar to their own. They argue that, whether the AI system knew it or not, their pieces were used as foundational templates for these images.

The companies shot back that the works differ enough that any perceived similarities are entirely subjective.

But the three artists say their works weren't just lifted; they were used to train the entire artificial intelligence system. Essentially, they claim, the program was specifically tasked with creating art based on the three artists' work and style.

That’s not automatic, unintentional generation. Some say that’s actually stealing.

The very definitions of law and property rights are bound to change as artificial intelligence becomes more and more prevalent. Claiming ignorance may serve as a defense in such lawsuits, but perhaps not.

The question is: how is guilt actually weighed and investigated? The companies say the similarities are merely coincidental. The artists say their work was obviously used as a template, and that the AI's particular style is strikingly similar to their own.

So, how do you settle that? Look at the companies' search history? What a bizarre, brave new tech world we live in.
