‘They exploit someone’s condition and emotions’ Fraudsters join forces with artificial intelligence, becoming even more ruthless
Kurir, 14 hours ago

An increasing number of fraudsters are turning artificial intelligence into an accomplice in criminal activity.
We have come across a type of scam in which artificial intelligence is used to create voice messages from supposed victims – family members or close friends. Fraudsters then try to extort money by playing on the panic and concern of the person receiving the message. For example, they play a recorded voice – allegedly of an injured daughter – to pensioners, demanding an urgent, large sum of money so that the child can be operated on. Speaking to Kurir