← Home

Can ChatGPT help me write a VST plugin from scratch?

2 January, 2026

So... I recently left my job and decided I wanted to start experimenting with different projects entirely for fun ("reconnect with my passions" is part of my sabbatical resolutions). But I did not know where to start, and what do people do these days when they don't know what to do? They ask an LLM, of course, so that's what I did.

I first prompted it to treat me like a pal, because this is a collaborative project between pals and not some large, complicated product. More specifically, I opened the conversation with something like "let's drop the bs and be real with each other here". I did my best to translate ChatGPT's quotes into English here, because the chat originally happened in extremely informal Portuguese, full of slang and grammar errors. I was somewhat surprised that ChatGPT picked up on the slangy, informal style I wrote in (which is mostly spoken, not written, Portuguese) and did a decent job of replicating it.

I then told it I was leaving my job, how frustrated I was with how long it takes to get any feedback when training big models is part of your development loop, and that I wanted to work on something with more instant gratification, closely related to my interests. I was kinda surprised that it peeked into our other conversations to figure out what my interests are! One of the projects it proposed was a VST for a weird, fun effect that doesn't necessarily have any commercial appeal. Deal.

You know the first thing I noticed about my new creative buddy? They're pretentious as fuck. Did that come from the language prompt, or does it always do that? Saying things such as "this is not a toy, it's an instrument"... Very annoying. But the premise was cool, so I agreed to work on it.

The idea that ended up winning was a plugin that would scan for frequency bands being "overused" and nudge you to move your musical content somewhere else. The idea is cool on paper, but when I asked how to implement it, it went on a long, hyped explanation of how basically a very weak 5-band equalizer could do that (spoiler alert: no, it can't).
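To make it concrete why a fixed 5-band EQ doesn't cut it: the bands have to come from the actual spectral content of the track. Here's a minimal, dependency-free sketch of what I mean by "find the overused bands" — the band edges, threshold, and the `bandEnergies` helper are all made up for illustration, and a real plugin would run something like this on FFT frames rather than a fake spectrum:

```cpp
// Hedged sketch (not the plugin code): given a magnitude spectrum, split it
// into log-spaced bands and flag the ones hogging a disproportionate share
// of the energy. All numbers here are placeholders.
#include <cmath>
#include <cstdio>
#include <vector>

struct Band { double lowHz, highHz, energy; };

// Build `numBands` log-spaced bands between 20 Hz and Nyquist and sum the
// spectral energy that falls into each one.
std::vector<Band> bandEnergies (const std::vector<float>& magnitudes,
                                double sampleRate, int numBands)
{
    const double nyquist = sampleRate / 2.0;
    const double binHz   = nyquist / magnitudes.size();

    std::vector<Band> bands (numBands);
    for (int b = 0; b < numBands; ++b)
    {
        bands[b].lowHz  = 20.0 * std::pow (nyquist / 20.0, (double) b / numBands);
        bands[b].highHz = 20.0 * std::pow (nyquist / 20.0, (double) (b + 1) / numBands);
        bands[b].energy = 0.0;
    }

    for (size_t bin = 0; bin < magnitudes.size(); ++bin)
    {
        const double hz = bin * binHz;
        for (auto& band : bands)
            if (hz >= band.lowHz && hz < band.highHz)
            {
                band.energy += magnitudes[bin] * magnitudes[bin];
                break;
            }
    }
    return bands;
}

int main()
{
    // Fake spectrum: pretend all the energy is piled up in one region.
    std::vector<float> mags (1024, 0.01f);
    for (int i = 100; i < 160; ++i) mags[i] = 1.0f;

    auto bands = bandEnergies (mags, 48000.0, 8);

    double total = 0.0;
    for (auto& b : bands) total += b.energy;

    // Flag any band holding more than twice its "fair share" of the energy.
    for (auto& b : bands)
        if (b.energy > 2.0 * total / bands.size())
            std::printf ("Band %.0f-%.0f Hz looks overused\n", b.lowHz, b.highHz);
}
```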

Several annoyances came up as we worked on this project. As the context got longer, it made more and more mistakes and started forgetting things from the beginning (we're not using Projucer, we want bands to be decided dynamically based on the contents of each track, etc etc). I know this is something tools such as Cursor and Claude Code supposedly have fixes for. Maybe ChatGPT just has a harder time keeping up with a codebase... but also, the larger the context, the more likely context summarization is to mess things up the way ChatGPT was doing here.

For a few iterations I had code that actually compiled... but segfaulted. At some point it could no longer figure out how to write to a WAV file (something it had been doing perfectly in the test for several iterations), and then blamed the JUCE API for being "buggy" when it was actually just using it wrong.
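For reference, here's a minimal sketch of the WAV-writing pattern JUCE expects — the function name and file path are placeholders, and this isn't necessarily the exact mistake ChatGPT made. The classic stumbling blocks are that createWriterFor() takes ownership of the stream only when it succeeds, and that FileOutputStream appends to an existing file instead of overwriting it:

```cpp
// Hedged sketch (not the code from the project): write an AudioBuffer<float>
// out to a 16-bit WAV with JUCE. Assumes the JUCE headers are already included.
static bool writeBufferToWav (const juce::AudioBuffer<float>& buffer,
                              double sampleRate,
                              const juce::File& outFile)
{
    outFile.deleteFile();  // FileOutputStream appends, so clear any old file first

    juce::WavAudioFormat wavFormat;
    auto stream = outFile.createOutputStream();
    if (stream == nullptr)
        return false;

    std::unique_ptr<juce::AudioFormatWriter> writer (
        wavFormat.createWriterFor (stream.get(), sampleRate,
                                   (unsigned int) buffer.getNumChannels(),
                                   16,    // bits per sample
                                   {},    // no metadata
                                   0));   // quality index (unused for WAV)
    if (writer == nullptr)
        return false;

    stream.release();  // on success the writer owns (and will delete) the stream
    return writer->writeFromAudioSampleBuffer (buffer, 0, buffer.getNumSamples());
}
```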

Conclusion

Imagine a music producer who learned everything from books but has never heard a song or touched a volume knob... yet knows how to hype their product. That's basically how it felt to brainstorm ideas for an audio plugin with ChatGPT. Even ignoring the issues with the code (which I know have fixes in tools such as Cursor and Claude Code), it just cannot really know what sounds cool in audio.

Are fully automated audio plugins the future??? Not yet, and certainly not while machines don't have ears that work like ours (at the very least).

HOWEVER

Can I fully blame ChatGPT? Of course not. We should use tools knowing their limitations and apply them to what they're good at. This is analogous to complaining that ChatGPT can't make me a cup of coffee; it probably knows it can't, and would tell me so, and ideally it should know about its other limitations as well. A few days ago I used Gemini to vibe code a "djent calculator", and it even proposed an algorithm I didn't know existed for Euclidean rhythms. It implemented a fully functional app with visualization and sound using Tone.js. It did what it was supposed to do and was useful, in a very limited capacity.
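In case you haven't run into Euclidean rhythms either: the idea is to spread k hits as evenly as possible over n steps. This isn't the code Gemini wrote (that app was JavaScript on top of Tone.js), and not necessarily the algorithm it proposed, just a compact sketch of the idea in C++ so the patterns are easy to see; the euclideanPattern helper is made up for illustration:

```cpp
// Hedged sketch: one simple way to spread `hits` onsets over `steps` steps
// "Euclideanly". It produces the classic patterns up to rotation.
#include <iostream>
#include <string>

std::string euclideanPattern (int hits, int steps)
{
    std::string pattern;
    for (int i = 0; i < steps; ++i)
        // Step i gets a hit whenever the running multiple of `hits` wraps
        // around modulo `steps` - this spreads the hits as evenly as possible.
        pattern += ((i * hits) % steps < hits) ? 'x' : '.';
    return pattern;
}

int main()
{
    std::cout << euclideanPattern (3, 8) << "\n";  // prints "x..x..x." (a tresillo)
    std::cout << euclideanPattern (5, 8) << "\n";  // prints "x.x.xx.x" (rotation of E(5,8))
}
```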

Basically, let's not throw the baby out with the bathwater: tools have limitations, and we should use them for what they're good at. I think LLMs can help a lot with boilerplate code generation and a lot of the more basic technical stuff, but they're not supposed to replace our creativity (although I do believe they can augment it when used the right way).