Uh-huh. Put AI in the same room as teenage boys, and what do you think will happen?
Stevie Hyder felt nauseous.
Standing in the hallway at her Illinois high school a few weeks ago, the 15-year-old found out one of her sophomore classmates was using artificial intelligence, or AI, to create nude photos of her. Dozens of doctored images of her and other teenage girls were floating around, a friend told her. Some even depicted teachers.
By the time the principal called her mom, Hyder was the 22nd girl on his list.
As AI gains a stronger foothold in the American economy and culture, administrators are watching it creep into schools.
In December, two middle school boys at a charter school in Miami were arrested on suspicion of using an AI app to create nude photos of their classmates, who were between the ages of 12 and 13, according to an arrest warrant.
Officials charged the boys with third-degree felonies, citing a state law that forbids the “unauthorized promotion of a sexually explicit image.” A number of states, including Texas and Virginia, have so-called “deepfake laws,” which criminalize the nonconsensual creation of pornography using an image of another person. Even more state legislatures are mulling putting such rules on the books.
A few months later, a similar scandal hit a middle school in Beverly Hills, California.
In February, five eighth-graders at Beverly Vista Middle School were involved in using AI to superimpose the faces of 16 other eighth-graders onto photos of nude bodies, according to CBS Los Angeles and a statement from the district.
The Beverly Hills Police Department launched an investigation into the incident, according to department spokesperson Andrew Myers. The probe is ongoing.

Students used AI to create nude photos of their classmates. For some, arrests came next.
As AI becomes more popular, principals are watching it creep into schools. But policing it isn't easy.
www.usatoday.com