Man Loses Rs. 5 Cr to a Scam Deepfake AI Call Posing as His Close Friend


AI can do harm as well as good. Recently, an individual from northern China fell victim to a scam involving highly sophisticated AI deepfake technology.

The incident shows that AI tools can be used to commit financial crimes. The question now is how far such abuse could go. The public and authorities are on high alert after this incident, as they should be regardless.

What happened?

The scammer convinced the victim that he was a close friend in dire need of money. He managed this by using AI-powered face-swapping technology: he made a video call to the man and persuaded him to transfer 4.3 million yuan.

The victim believed his friend urgently needed the money as a deposit during a bidding process, so he complied. He discovered he had been duped only when his real friend said he knew nothing about the whole affair.


Fortunately, most of the stolen money has been recovered, Reuters reported, citing a statement by local police on Saturday. Authorities are working to trace and recover the remaining funds as well.

China recognized the threat posed by AI early on and has been aggressively tightening its scrutiny of such applications and technology. AI-driven scams and frauds, primarily involving the manipulation of facial and voice data, have been on the rise. In response, new rules took effect in China in January to provide legal protection to victims.

Another case

Another case was reported last month in the USA. Jennifer DeStefano received a call from an unknown number demanding ransom for the release of her supposedly kidnapped daughter. DeStefano said her 15-year-old was on a skiing trip when the call came.

She heard her daughter saying "Mom" and sobbing on the phone. Then a man threatened her not to call the authorities. DeStefano could hear her child's voice calling for help in the background. The man demanded $1 million for the release.


"It was never a question of who is this? It was completely her voice. It was her inflection. It was the way she would have cried," she told local news media. "I never doubted for one second it was her. That's the freaky part that really got me to my core." Her daughter had not actually been kidnapped; it was all a scam, but a dreadful one nonetheless.

Be vigilant and cautious

Deepfakes have long been a problem, mainly used to spread misinformation. But as AI develops, deepfake technology is becoming more sophisticated, and the scope for misusing it is growing accordingly.

Since India is seeing a rise in scams of different sorts, people are being advised to remain vigilant and cautious during online interactions.
