Recent reports have highlighted the surge in popularity of Artificial Intelligence (AI)-generated content, which could present a significant challenge for crypto exchanges and the community in the never-ending battle against scams and illicit activity.
Can AI-Generated IDs Bypass Exchanges' Verification Processes?
An underground website named OnlyFake has gained popularity for its realistic AI-generated fake ID services. The website, as reported by 404 Media, claims to use “neural networks” to generate convincing photos of fake IDs for $15. The investigation led by Joseph Cox, 404 Media co-founder, details the process of creating these fake documents.
A fake ID can be created within minutes, and the service offers a variety of documents to choose from, including passports and driver's licenses. As the report shows, the generation panel offers customization options: users can fill in whatever personal data they wish to provide, or randomize it with the click of a button.
The site also lets users choose a photo from its own pool if they don't want to upload their own image.
Here's the process of me successfully bypassing the identity verification on OKX, a cryptocurrency exchange I've noticed is being used by criminals – Asks for passport– I took photo of my fake British passport I made earlier (didn't need in hand)https://t.co/hCjHWbKJPf pic.twitter.com/69PvbincUP
— Joseph Cox (@josephfcox) February 5, 2024
The website claims to “use ‘generators’ which create up to 20,000 documents a day.” The service’s owner, who uses the pseudonym ‘John Wick,’ told 404 Media that using their site to forge documents is prohibited, but also said that “hundreds of documents can be generated at once using data from an Excel table.”
Despite the ‘neural networks’ claim, the investigation found no evidence that generative AI tools were actually used to create the fake documents.
Additionally, “Wick” claimed that the service could bypass the verification process of several platforms, including the crypto exchanges Binance, Kraken, Bybit, Huobi, OKX, and Coinbase, as well as Revolut, Wise, Payoneer, and Airbnb.
Cox tested this claim by creating a fake British passport and attempting to open an account on OKX, an exchange that has previously been in the spotlight over illegal activity by its users.
The journalist focused on testing the exchange’s document recognition rather than its selfie check, using his own picture for the test. He bypassed the exchange’s verification system and created a cryptocurrency account with a fake passport and fabricated personal information.
OKX's system said it was reviewing my identity. Then it approved it. There you go: successfully made a cryptocurrency account with a fake name, fake ID. The face is mine (didn't want to implicate innocent person) but site says its going to launch AI faces https://t.co/hCjHWbKJPf pic.twitter.com/VlyIb2EbDE
— Joseph Cox (@josephfcox) February 5, 2024
What Does This Mean For Crypto Security?
Creating realistic fake identification documents requires specialized skills that the average person or fraudster doesn’t have. Without those skills, anyone looking for a fake ID would have to order one from a third party, which could come at a high cost, take time to be delivered, and risk being intercepted.
AI-generated documents, with their lower cost, faster turnaround, and reduced risk of interception, might be attractive to online criminals who, in recent years, have targeted crypto investors with various scamming methods.
Cybersecurity researcher Abhishek Mathew told 404 Media that “Many use this service for carding, creating fake bank accounts and also many use this service to unban their crypto accounts like Binance where they ask ID proofs.”
Senator Ron Wyden also shared a statement with the digital media company expressing his thoughts on the matter:
It is clear that AI-based tools that generate fake IDs and video deepfakes are going to pose a real fraud problem for government agencies, financial institutions and other companies. The United States desperately needs secure, authenticated IDs, so that Americans can verify their identity when conducting sensitive or high-risk transactions with the government and private sector.
AI-generated scams using deepfakes of prominent figures in the crypto industry have skyrocketed recently, sparking a conversation in the community about what measures or regulations should be taken to address the issue.
by Rubmar Garcia via Bitcoinist.com