By Alys Key
Weeks after Microsoft announced a new AI-powered version of its search engine Bing, scammers have been flooding the market with tokens trying to cash in on hype around the product.
The new search engine is powered by the same technology behind OpenAI’s popular text chat tool ChatGPT, kicking off an AI arms race with Google.
Interest has also extended into the crypto space, where AI-related crypto tokens such as Fetch (FET), SingularityNET (AGIX), and Ocean Protocol (OCEAN) have seen their value boosted by investors keen to get in on the action.
But the excitement also seems to have opened the door for fake tokens posing as well-known brands.
At least 20 tokens on the market are currently using the name BingChatGPT, according to a search carried out on DEXTools. Of these, 12 are on Binance’s BNB Chain, while another six have been created on Ethereum and two more on Arbitrum.
Another 170 are using the ChatGPT name, with BNB Chain once again the most common network on which they are issued.
No official crypto projects or tokens were announced as part of Microsoft’s plans for Bing. Decrypt has contacted both Microsoft and OpenAI for comment.
In a tweet flagging the proliferation of BingChatGPT, crypto security firm PeckShield said that at least three of the tokens using the BingChatGPT name appear to be honeypots.
A honeypot is a type of crypto scam in which the token’s contract lets investors buy in but blocks them from selling, so buyers who send funds in never see their money again.
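One common honeypot pattern lets anyone buy a token while the contract silently blocks everyone except the deployer from selling. A minimal sketch of that pattern, written as plain Python for illustration (the class and names here are hypothetical, not taken from any of the flagged tokens):

```python
class HoneypotToken:
    """Illustrative model of a honeypot token contract (not real contract code)."""

    def __init__(self, deployer: str):
        self.deployer = deployer
        self.balances: dict[str, int] = {}

    def buy(self, buyer: str, amount: int) -> None:
        # Buys always succeed: the balance is credited normally,
        # which is what makes the token look legitimate at first.
        self.balances[buyer] = self.balances.get(buyer, 0) + amount

    def sell(self, seller: str, amount: int) -> None:
        # The trap: only the deployer's sells go through.
        # Everyone else's funds are stuck in the contract.
        if seller != self.deployer:
            raise RuntimeError("transfer blocked by contract")
        self.balances[seller] -= amount
```

A victim can call `buy` and watch their balance grow, but any attempt to `sell` raises an error, while the deployer remains free to cash out.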
Two of the tokens posing as Bing-related schemes also carry extremely high sell taxes, meaning the issuer takes a hefty cut of any proceeds when the token is sold, in this case 99% or even 100% of funds.
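The arithmetic behind such a sell tax is simple: the contract skims a fixed percentage of every sale before the seller receives anything. A quick illustration (the function name is hypothetical; the rates match those reported above):

```python
def proceeds_after_sell_tax(sale_value: float, tax_rate: float) -> float:
    """Return what a seller keeps after the token's sell tax is deducted.

    tax_rate is a fraction, e.g. 0.99 for a 99% tax.
    """
    return sale_value * (1 - tax_rate)
```

At a 99% tax, selling $1,000 worth of tokens returns only about $10 to the seller; at 100%, the seller gets nothing at all.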
Meanwhile, another of the BingChatGPT tokens appears to have been created by a serial rug-puller, identified by PeckShield as “Deployer 0xb583,” whose previously deployed coins include ones with names referencing Tesla CEO Elon Musk and former UK Prime Minister Liz Truss.