The American Matthew Butterick has launched a legal crusade against generative artificial intelligence (AI). In 2022, he filed the first lawsuit of its kind against Microsoft, whose subsidiary GitHub develops one of these tools, GitHub Copilot. Today, he’s coordinating four class action lawsuits that bring together complaints filed by programmers, artists and writers.
If successful, he could force the companies behind applications such as ChatGPT or Midjourney to compensate thousands of creators. They may even have to retire their models and retrain them on datasets that don’t infringe intellectual property rights.
Not going to happen, buddy.
The fundamental purpose of copyright is to promote the progress of science and the useful arts. The purpose is to expand our collective body of knowledge; to increase our collective intelligence.
It is impossible to infringe on copyright by reading a book. Even if the book was illegally produced and distributed, the act of reading it is not a copyright violation. A natural mind cannot be denied access to published information through copyright law.
That natural mind is restricted by not being allowed to produce or distribute a copy or a derivative work, but knowledge of the work and inspiration from that work are not restricted by copyright or patent law. Copyright exists specifically to promote providing knowledge to that mind.
Blocking the development of an artificial mind fundamentally breaks the purpose for which copyright exists.
That’s a very simplified narrative, if I may say so. I’d argue there is no such thing as an artificial ‘mind.’ What you call a mind is a stochastic parrot. Whatever the bot yields, the whole work is copied, because copying is the point of training a foundation model.
Copyright law in its current form can’t simply be applied here. I’m not a lawyer and can’t say how we should address the issue legally, but the models’ results are 100 percent copied. There can be no doubt: no mind has created anything original.