
Earlier this month, Adrian Holovaty, founder of music-teaching platform Soundslice, solved a mystery that had been plaguing him for weeks. Weird images of what were clearly ChatGPT sessions kept being uploaded to the site.
Once he solved it, he realized that ChatGPT had become one of his company's greatest hype men – but it was also lying to people about what his app could do.
Holovaty is best known as one of the creators of the open-source Django project, a popular Python web development framework (though he retired from managing the project in 2014). In 2012, he launched Soundslice, which remains “proudly bootstrapped,” he tells Technewss. Currently, he's focused on his music career, both as an artist and as a founder.
Soundslice is an app for teaching music, used by students and teachers. It's known for its video player, which is synchronized with the music notation that guides users on how the notes should be played.
It also offers a feature called “sheet music scanner” that lets users upload an image of paper sheet music and, using AI, automatically turns it into an interactive sheet, complete with notations.
Holovaty carefully watches this feature's error logs to see what problems occur and where to add improvements, he said.
That's where he started seeing the uploaded ChatGPT sessions.
The uploads were creating a bunch of error logs. Instead of images of sheet music, these were images of words and a box of symbols known as ASCII tablature, a basic text-based system for writing guitar notation with a regular keyboard. (There's no treble clef key, for instance, on a standard QWERTY keyboard.)
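For a sense of what that looks like, a simple ASCII tab for a few notes on a guitar's high E string might be written like this (a made-up illustration, not one of the uploaded images):

e|--0--2--3--|
B|-----------|
G|-----------|
D|-----------|
A|-----------|
E|-----------|

Each line stands for a guitar string, and the numbers mark which fret to play, which is why the format needs nothing more than an ordinary keyboard.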

The volume of these ChatGPT session images was not so onerous that it was costing his company money to store them or crushing his app's bandwidth, Holovaty said. Still, he was baffled, he wrote in a blog post about the situation.
“Our scanning system wasn't intended to support this style of notation. Why, then, were we being bombarded with so many ASCII tab ChatGPT screenshots? I was mystified for weeks — until I messed around with ChatGPT myself.”
That’s how he discovered ChatGPT was telling people they could hear this music by opening a Soundslice account and uploading the image of the chat session. Only, they couldn’t: uploading those images wouldn’t translate the ASCII tab into audible notes.
That left him with a new problem. “The main cost was reputational: new Soundslice users were going in with a false expectation. They’d been confidently told we would do something that we don’t actually do,” he told Technewss.
He and his team discussed their options: Slap disclaimers all over the site about it — “No, we can't turn a ChatGPT session into hearable music” — or build that feature into the scanner, even though he had never before considered supporting that offbeat musical notation system.
He opted to build the feature.
“My feelings on this are conflicted. I'm happy to add a tool that helps people. But I feel like our hand was forced in a weird way. Should we really be developing features in response to misinformation?” he wrote.
He also wondered if this was the first documented case of a company having to develop a feature because ChatGPT kept repeating its hallucination about it to so many people.
Fellow programmers on Hacker News had an interesting take on it: several of them said it’s no different from an over-eager human salesperson promising the world to prospects and then forcing developers to deliver new features.
“I think that’s a very apt and amusing comparison!” Holovaty agreed.