AI can steal your face and voice, and Cyprus now wants to make it a crime.
The government may soon give people legal ownership of their own face and voice, as lawmakers move to tackle the growing misuse of artificial intelligence and deepfake technology.
A bill submitted by AKEL was discussed Tuesday in the Parliamentary Committee on Trade, aiming to protect individuals from having their image, voice, or personal characteristics used without consent. The proposal is inspired by Danish legislation and introduces what its backers describe as a simple but powerful idea: your face is yours. Your voice is yours. Full stop.
AKEL MP Christos Christofides, who presented the bill, said the law would recognize a special legal right for every person over their physical and digital likeness.
“In essence, this bill says that no one can use your face, your voice, or your personal characteristics without your permission,” Christofides said, noting that exceptions would apply only in cases such as satire or criticism.
The discussion in committee was met with a generally positive response from those involved, he said. Lawmakers are also considering a recommendation from the Attorney General’s Office to make the creation of harmful deepfakes a criminal offense.
Christofides stressed that the timing is no accident. The European Union is already moving toward tighter regulation of artificial intelligence, while Cyprus has previously taken the lead by becoming the first EU country to criminalize child pornography produced using AI.
“We live in an era where the line between what is real and what is fake has become dangerously blurry,” he said, warning that deepfakes can cause serious harm to individuals and society.
He pointed to recent examples, including a fake interview attributed to President Nikos Christodoulides that circulated online before being exposed as AI-generated. Such material, he said, can damage reputations, mislead the public, and stir political or social unrest.
Beyond politics, Christofides warned of growing risks in advertising and commercial use, where people’s images and voices are used without consent, as well as in more malicious cases such as revenge videos.