National emergency: UK to ban AI 'nudification' apps creating fake nudes

upday.com · 3 hours ago
Safeguarding minister Jess Phillips announced the plans (Jonathan Brady/PA)

The Government will ban AI "nudification" tools that create fake nude images and videos without consent. Safeguarding minister Jess Phillips announced the measure on Thursday as part of a broader strategy to halve violence against women and girls within a decade.

The new laws will prevent the creation and sharing of deepfake abuse content that transforms real people's pictures into fake nudes using artificial intelligence. Phillips called the issue a "national emergency" and said "change is coming".

The ban targets individuals who create or supply nudification apps. Phillips emphasised the severity of the problem: "Nudification apps are not used for harmless pranks. They devastate young people's lives, and we will ensure those who create or supply them face real consequences."

Wider strategy measures

The Government announced a £19 million funding boost for councils to provide safe housing for domestic abuse survivors.

Secondary schools will teach all children about healthy relationships, and the Government will train teachers to spot worrying behaviour in young men.

Every police force across the country will introduce specialist rape and sexual offences investigators.

The NHS will provide better support for survivors.

Phillips said the strategy addresses root causes: "We must stop these images being created and shared while tackling the root causes of negative influences on young men in their schools, homes and online. That's why we will join forces with tech companies to stop predators online and prevent the next generation from being exploited by sexual extortion and abuse."

The cross-government strategy aims to crack down on online predators and prevent young people from being exploited by sexual extortion.

Note: This article was created with Artificial Intelligence (AI).
