Ex-Google CEO Says AI Companies Can Steal Content With Impunity
Former Google CEO Eric Schmidt made controversial comments during a discussion at Stanford University where he suggested AI companies don’t need to worry about stealing copyrighted content.
Those weren’t the only contentious comments delivered by Schmidt, who left Google in 2020. He also blamed Work From Home (WFH) culture for the company’s woes.
A video of the event was posted on YouTube, but Schmidt requested that it be taken down after telling The Wall Street Journal that he “misspoke about Google and their work hours”.
‘That’s Typically How Things Are Done’
During the talk at Stanford University, Schmidt suggests to the students that Silicon Valley companies can use copyrighted data with impunity: those that succeed will be rich enough to hire expensive lawyers to clean up the mess, while those that fail will sink into obscurity before anyone notices their transgressions.
“If TikTok is banned, here’s what I propose each and every one of you do: Say to your LLM the following: Make me a copy of TikTok, steal all the users, steal all the music, put my preferences in it, produce this program in the next 30 seconds, release it, and in one hour, if it’s not viral, do something different along the same lines,” Schmidt says, per The Verge.
“That’s the command. Boom, boom, boom, boom.”
Schmidt returns to the topic a little later during the speech.
“So, in the example that I gave of the TikTok competitor — and by the way, I was not arguing that you should illegally steal everybody’s music — what you would do if you’re a Silicon Valley entrepreneur, which hopefully all of you will be, is if it took off, then you’d hire a whole bunch of lawyers to go clean the mess up, right? But if nobody uses your product, it doesn’t matter that you stole all the content,” he says.
“In other words, Silicon Valley will run these tests and clean up the mess. And that’s typically how those things are done.”
Schmidt’s remarks come amid the explosion of generative AI in recent years, which is predicated on masses of copyrighted data being used to build AI models, mostly without the permission of the copyright holders.
There are multiple ongoing lawsuits over the issue of training data. This week, a group of artists suing AI companies Stability AI and Midjourney scored a win when a federal judge decided that it is plausible the firms violated the artists’ rights by illegally storing copies of their work on their systems.