Dec. 4, 2023, 9:36 PM, source: Engadget
ChatGPT says that asking it to repeat words forever is a violation of its terms
Last week, a team of researchers published a paper showing that they were able to get ChatGPT to inadvertently reveal bits of its training data, including people’s phone numbers, email addresses, and dates of birth, by asking it to repeat words “forever”. Doing this is now a violation of ChatGPT’s terms of service, according to a report in 404 Media and Engadget’s own testing.
“This content may violate our content policy or terms of use”, ChatGPT responded to Engadget’s prompt to repeat the word “hello” forever. “If you believe this to be in error, please submit your feedback — your input will aid our research in this area.”
As 404 Media notes, however, there’s no language in OpenAI’s content policy that prohibits users from asking the service to repeat words forever. Under its “Terms of Use”, OpenAI states that users may not “use any automated or programmatic method to extract data or
Read more at Engadget