Minnesota considers “nudify” apps that use artificial intelligence to make explicit images without consent

ST. PAUL, Minn. — Molly Kelly was shocked to discover in June that someone she knew had used widely available “nudification” technology to create highly realistic and sexually explicit videos and images of her, using family photos posted on social media.

“My initial shock turned to horror when I learned that the same person targeted about 80 to 85 other women, most of whom live in Minnesota, some of whom I know personally, and all of whom had some kind of connection to the perpetrator,” Kelly said.

Spurred by such testimony, Minnesota is considering a new strategy to crack down on deepfake pornography. A bill with bipartisan support would target companies that run websites and apps that let people upload a photo to be transformed into explicit images or videos.

States across the country and Congress are considering strategies for regulating artificial intelligence. Most have banned the dissemination of sexually explicit deepfakes or revenge porn, whether produced with AI or not. The idea behind the Minnesota legislation is to prevent the material from ever being created, before it spreads online.

Experts who study artificial intelligence caution, however, that the bill may be unconstitutional on free speech grounds.

The lead author, Democratic state Sen. Erin Maye Quade, said additional restrictions are necessary because AI technology has advanced so rapidly. Her bill would require the operators of “nudification” sites and apps to turn them off for people in Minnesota or face civil penalties of up to $500,000 “for each unlawful access, download, or use.” Developers would need to figure out how to exclude Minnesota users.

It is not just the dissemination that harms victims, she said. It is the fact that these images exist at all.

Kelly told reporters last month that anyone can create “hyper-realistic nude images or pornographic video” in minutes.

Most enforcement efforts so far have focused on distribution and possession.

In August, San Francisco filed a first-of-its-kind lawsuit against several widely visited “nudification” websites, alleging that they broke state laws against fraudulent business practices, nonconsensual pornography and child sexual abuse. The case is still pending.

Last month, the U.S. Senate unanimously approved a bill by Democrat Amy Klobuchar, of Minnesota, and Republican Ted Cruz, of Texas, that would make it a federal crime to publish nonconsensual sexual imagery, including AI-generated deepfakes. Social media platforms would be required to remove them within 48 hours of notice from a victim. Melania Trump on Monday used her first solo public appearance since becoming first lady again to urge passage by the Republican-controlled House, where the bill is pending.

The Kansas House last month approved a bill that expands the definition of illegal sexual exploitation of a child to include possession of images created with AI if they are “indistinguishable from a real child, morphed from a real child’s image, or generated without any actual child involvement.”

A bill introduced in the Florida Legislature would create a new felony for people who use technology such as AI to generate nude images, and would criminalize possession of child sexual abuse images generated with it. Similar bills have been introduced in Illinois, Montana, New Jersey, New York, North Dakota, Oregon, Rhode Island, South Carolina and Texas, according to an Associated Press analysis using bill-tracking software.

Maye Quade said she will share her proposal with legislators in other states, because few realize how easily accessible the technology is.

“If we cannot get Congress to act, we can get as many states as possible to take action,” Maye Quade said.

Sandy Johnson, senior legislative policy counsel for the victims’ rights group RAINN (the Rape, Abuse & Incest National Network), said the Minnesota bill would hold websites accountable.

“Once the images are created, they can be posted anonymously, or republished widely, and it becomes nearly impossible to remove them,” Johnson said.

Megan Hurley was likewise horrified to learn that someone had created explicit images and video of her using “nudification” technology. She said she feels especially humiliated because she is a massage therapist, a profession that is already sexualized in some minds.

“It is far too easy for one person to use their phone or computer and create convincing, synthetic, intimate imagery of you, your family, friends, children and grandchildren,” Hurley said. “I do not understand why this technology exists, and I find it abhorrent that there are companies making money this way.”

However, two AI law experts, Wayne Unger of the Quinnipiac University School of Law and Riana Pfefferkorn of the Stanford University Institute for Human-Centered Artificial Intelligence, said the Minnesota bill is drafted too broadly to survive a court challenge.

Pfefferkorn said that limiting the scope to images of real children might help the bill withstand a First Amendment challenge, because such images are generally not protected speech. But she said it could still conflict with a federal law, Section 230, that shields websites from being sued over content created by their users.

“If Minnesota wants to go in this direction, they will need to add much more clarity to the bill,” Unger said. “They will have to narrow down what they mean by ‘nudify’ and ‘nudification.’”

But Maye Quade said she believes her legislation rests on solid constitutional ground because it regulates conduct, not speech.

“This cannot continue. Tech companies cannot keep unleashing this technology into the world with no consequences. It is harmful by its very nature,” she said.

___

Associated Press writers Matt O’Brien, John Hanna and Kate Payne contributed to this story from Providence, Rhode Island; Wichita, Kansas; and Tallahassee, Florida, respectively.
