ChatGPT Sent Users to a Website for a Feature It Didn't Have—So the Founder Built It

Decrypt
15 hours ago

What do you do when your website is bombarded with uploads it can’t process? That’s the situation software developer and musician Adrian Holovaty found himself in when he noticed a strange surge in failed uploads to his company’s sheet music scanner.


What he didn’t expect was that the culprit was allegedly ChatGPT.


In a recent blog post, the Soundslice co-founder explained that he was looking at error logs when he discovered that ChatGPT was instructing users to upload ASCII “tabs”—a simple musical format used by guitarists and others in lieu of musical notation—into Soundslice to hear audio playback. The problem was, the feature did not exist. So Holovaty decided to build it.


“To my knowledge, this is the first case of a company developing a feature because ChatGPT is incorrectly telling people it exists,” Holovaty wrote.





Launched in 2012, Soundslice is an interactive music learning and sharing platform that digitizes sheet music from photographs or PDFs.


“Our scanning system wasn’t intended to support this style of notation,” Holovaty wrote. “Why, then, were we being bombarded with so many ASCII tab ChatGPT screenshots? I was mystified for weeks—until I messed around with ChatGPT myself.”


“We’ve never supported ASCII tab; ChatGPT was outright lying to people. And making us look bad in the process, setting false expectations about our service.”


AI hallucinations are commonplace. Since ChatGPT's public launch in 2022, chatbots including ChatGPT, Google Gemini, and Anthropic's Claude have repeatedly presented false or misleading information as fact.


While OpenAI did not address Holovaty’s claims directly, the company acknowledged that hallucinations remain a concern.


“Addressing hallucinations is an ongoing area of research,” an OpenAI spokesperson told Decrypt. “In addition to clearly informing users that ChatGPT can make mistakes, we’re continuously working to improve the accuracy and reliability of our models through a variety of methods.”


OpenAI advises users to treat ChatGPT responses as first drafts and verify any critical information through reliable sources. It publishes model evaluation data in system cards and a safety evaluation hub.


“Hallucinations aren’t going away,” Northwest AI Consulting co-founder and CEO Wyatt Mayham told Decrypt. “In some cases, like creative writing or brainstorming, hallucinations can actually be useful.”


And that's exactly the approach Holovaty embraced.


“We ended up deciding: What the heck? We might as well meet the market demand,” he said. “So we put together a bespoke ASCII tab importer, which was near the bottom of my ‘Software I expected to write in 2025’ list, and we changed the UI copy in our scanning system to tell people about that feature.”
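Holovaty’s post doesn’t describe the importer’s internals, but the core idea of parsing ASCII tab can be sketched: each line names a guitar string, followed by dashes and fret numbers (e.g. `e|--0--3--|`), and the parser records which fret is played on which string at which column. The code below is a hypothetical illustration of that format, not Soundslice’s actual implementation.

```python
import re

# Matches a tab line: a guitar string name, a pipe, then the tab body.
# (Hypothetical format assumptions; real-world tabs vary widely.)
TAB_LINE = re.compile(r"^([eBGDAE])\|(.+)$")

def parse_ascii_tab(tab_text):
    """Return a list of (string_name, column, fret) note events."""
    events = []
    for line in tab_text.strip().splitlines():
        m = TAB_LINE.match(line.strip())
        if not m:
            continue  # skip lyrics, chord names, or blank lines
        string_name, body = m.groups()
        col = 0
        while col < len(body):
            if body[col].isdigit():
                # Frets can be two digits (e.g. 12), so read the whole number
                end = col
                while end < len(body) and body[end].isdigit():
                    end += 1
                events.append((string_name, col, int(body[col:end])))
                col = end
            else:
                col += 1
    return events

tab = "e|--0--3--|\nB|--1--0--|"
print(parse_ascii_tab(tab))
# → [('e', 2, 0), ('e', 5, 3), ('B', 2, 1), ('B', 5, 0)]
```

A production importer would also need to align columns across strings into chords and infer rhythm from spacing, which is where most of the real complexity lies.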


Holovaty did not respond to Decrypt’s request for comment.

