The US Department of Justice arrested a Wisconsin man last week for generating and distributing AI-generated child sexual abuse material (CSAM). As far as we know, this is the first case of its kind as the DOJ looks to establish a judicial precedent that exploitative materials are still illegal even when no children were used to create them. “Put simply, CSAM generated by AI is still CSAM,” Deputy Attorney General Lisa Monaco wrote in a press release.
The DOJ says 42-year-old software engineer Steven Anderegg of Holmen, WI, used a fork of the open-source AI image generator Stable Diffusion to make the images, which he then allegedly used to try to lure an underage boy into sexual situations. That alleged grooming attempt will likely play a central role in the eventual trial on the four counts of “producing, distributing, and possessing obscene visual depictions of minors engaged in sexually explicit conduct and transferring obscene material to a minor under the age of 16.”
The government says Anderegg’s ima