
Amid the hype that artificial intelligence (AI) has unleashed worldwide, few are talking about its dangers, even though it poses a growing risk for users of bitcoin (BTC) and other cryptocurrencies who trade on exchanges.
Fraudsters are increasingly using AI tools to create deepfakes with which they seek to bypass the Know Your Customer (KYC) verification of exchanges. So warns Jimmy Su, chief security officer at Binance, the cryptocurrency exchange with the highest trading volume globally.
“The scammer will look for a normal picture of the victim online. Based on that, they can produce videos using deepfake tools to bypass verification,” Su said, according to a Cointelegraph report.
Detailing the scammers’ modus operandi, Su adds that AI tools have become so sophisticated that they can even respond to audio instructions designed to verify that the applicant is human. In fact, software already exists that can do this in real time.
“For example, it requires the user to blink their left eye, or to look left, right, up, or down. The deepfakes are now so advanced that they can actually execute these commands,” the security chief added.
Su added that while AI is becoming more advanced, “it’s not yet at the level where it can fool a human operator,” since deepfakes are still distinguishable from reality.
David Schwed, COO of blockchain security firm Halborn, suggests that a useful way to quickly detect a deepfake is to watch how the subject blinks. If the blinking looks unnatural, the video is most likely doctored with fraudulent intent.
This is because deepfake videos are generated from image files found on the internet, where the subject usually has their eyes open, Schwed explains. As a result, the subject’s blinking must be simulated in a deepfake, and that simulation often looks off.
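To illustrate the blink heuristic Schwed describes, here is a minimal sketch using the eye aspect ratio (EAR), a metric commonly used in blink detection. It assumes six eye landmark points per frame have already been extracted by some face-landmark model (not shown here); the function names, the landmark ordering, and the 0.2 threshold are illustrative choices, not anything from the article.

```python
import math

def eye_aspect_ratio(landmarks):
    """Eye aspect ratio from six (x, y) eye landmarks ordered p1..p6:
    p1/p4 are the eye corners, p2/p3 the upper lid, p5/p6 the lower lid."""
    p1, p2, p3, p4, p5, p6 = landmarks

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    # Vertical lid distances shrink toward zero when the eye closes,
    # while the horizontal corner distance stays roughly constant.
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def count_blinks(ear_series, closed_threshold=0.2):
    """Count blinks as open-to-closed transitions in a per-frame EAR series.
    A series with no such transitions in a long clip would be suspicious."""
    blinks = 0
    was_open = True
    for ear in ear_series:
        if was_open and ear < closed_threshold:
            blinks += 1
            was_open = False
        elif ear >= closed_threshold:
            was_open = True
    return blinks
```

A clip of a talking head that never crosses the closed threshold, or crosses it at a rigid, metronome-like rate, is the kind of unnatural blinking pattern the heuristic flags.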
Another measure users can take against deepfakes is to improve their privacy and anonymity, although, curiously, these safeguards conflict with the use of centralized exchanges that enforce KYC procedures.
Beware of crypto-phishing deepfakes
The security firm Kaspersky is also warning about the use of deepfakes to carry out various scams, such as cryptocurrency fraud or the bypassing of biometric security.
In some cases, scammers use images of celebrities or splice together old videos to stage live broadcasts on social networks. These streams display a pre-designed page asking victims to transfer a certain amount in cryptocurrency, promising to double any payment sent. As a result, victims of these scams lose the entire amount transferred.
Given this, Kaspersky recommends staying alert to deepfake-related threats to avoid becoming a victim. To that end, it suggests watching for abrupt changes in videos: shifts in lighting from one frame to the next, changes in people’s skin tone, strange flickering, or no blinking at all. It also recommends watching for speech that is out of sync with lip movements, digital artifacts in the image, and videos deliberately encoded in low quality or shot in poor lighting.
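The frame-to-frame lighting check Kaspersky mentions can be sketched in a few lines. This is a hypothetical illustration, not Kaspersky’s method: it treats each frame as a grid of grayscale pixel values (a real pipeline would decode video with something like OpenCV) and flags frames whose average brightness jumps by more than an arbitrary threshold, which here is an assumed value.

```python
def mean_brightness(frame):
    """Average pixel intensity of a grayscale frame (list of pixel rows)."""
    total = sum(sum(row) for row in frame)
    count = sum(len(row) for row in frame)
    return total / count

def lighting_jumps(frames, max_delta=30.0):
    """Return indices of frames whose mean brightness differs from the
    previous frame by more than max_delta, a possible splice artifact."""
    suspicious = []
    prev = None
    for i, frame in enumerate(frames):
        b = mean_brightness(frame)
        if prev is not None and abs(b - prev) > max_delta:
            suspicious.append(i)
        prev = b
    return suspicious
```

Genuine footage also has cuts and camera flashes, so a check like this only surfaces frames worth a closer look; it is a coarse signal, not proof of forgery.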
Users are also advised to remain skeptical of voice messages and videos. This will not guarantee they are never fooled, but it can help them avoid many of the most common pitfalls. Above all, verify before trusting, Kaspersky notes.
#scam #exchanges #Binance #chief #security #officer