Amid growing concern about artificial intelligence being used to harass and exploit women, producing sexually explicit deepfake content is soon to be illegal in England and Wales.

The UK Ministry of Justice announced on Tuesday that, under a draft law, anyone who creates such an image or video of another adult without that person's consent would face an unspecified fine and a criminal record, even if the creator does not intend to share it.

Sharing explicit deepfakes without the subject's consent is already prohibited in England and Wales, and offenders risk jail time.

Deepfakes are images or videos that have been manipulated, frequently with the help of artificial intelligence, to make it appear that someone has said or done something they have not.

The UK's Minister for Victims and Safeguarding, Laura Farris, told ITV on Tuesday that, to the best of her knowledge, England and Wales would be the first jurisdictions in the world to outlaw the production of sexually explicit deepfakes.

Under the new law, such content would cover both pornographic images and nude deepfakes, whether or not the subject is depicted engaging in sexual activity.

The devolved governments of Northern Ireland and Scotland are responsible for enacting equivalent legislation in their own nations. “The Department is conscious of the Ministry of Justice’s proposals in this regard and is currently exploring options available for Northern Ireland,” a spokesman for Northern Ireland’s justice department told CNN.
