AI Image Dataset is Pulled After Child Sex Abuse Pictures Discovered
The best-known dataset for training AI image generators, LAION-5B, has been taken offline after a Stanford study found thousands of child sex abuse images in its library.
As photographers and artists alike realize that their images have been used to train AI image generators on a monumental scale, the backlash is growing stronger.