Court Rules Against Photographer Who Sued AI Dataset for Copyright Theft
A German court has ruled against a photographer who sued the AI image dataset company LAION, in a case that could have significant implications for how copyright law applies to AI training data.
The open-source LAION-5B dataset used to train AI image generators has been re-released after it was pulled last year when child sexual abuse material (CSAM) was discovered among its billions of pictures.
The best-known dataset for training AI image generators, LAION-5B, has been taken offline after a Stanford study found thousands of child sexual abuse images in its library.
A study has found that training AI image generators on AI-generated images degrades the quality of their output.
A prominent stock photographer who requested his photos be removed from a dataset used to train AI image generators has been sent an invoice demanding $1,000.
As photographers and artists alike realize that their images have been used to train AI image generators on a monumental scale, the backlash is growing stronger.