Comment by πŸš€ clseibold

Re: "Been seeing this term around the internet recently, called "…"

In: s/programming

@darkghost Well, I wasn't painting with a broad brush. I'm not entirely sure what "brush" I was painting anybody with by saying "What the absolute heck?" Care to explain?

I use AI sometimes. I just don't think *every* single thing in an application should be made with AI in a professional context, or in open source software that's intended to be used (as distinct from a learning project). I see it more as a tool to be used for specific things (including learning, as you say).

The term "vibe coding" came from Andrej Karpathy (an OpenAI co-founder), afaik, and here is what he said:

"There's a new kind of coding I call 'vibe coding', where you fully give in to the vibes, embrace exponentials, and forget that the code even exists."

I'm concerned that professional programming is going to become "vibe coding" where *everything* is made with an AI and you don't have to be concerned about things like performance or security (because those things require knowledge of how code and computers work).

But mainly I just think the term "vibe" is obnoxious.

πŸš€ clseibold [OP, πŸ›‚ Code of Conduct rule 1 violations]

2025-03-27 Β· 1 year ago

7 Later Comments ↓

πŸ‘» darkghost Β· 2025-03-27 at 13:08:

@clseibold I agree with every single thing you have said. I was more responding to some later comments I should have tagged their authors on. AI programming is all well and good until somebody invents a new language and there are no "public domain" coding examples on stackexchange to steal, er, I mean "train" on. But, I mean, what are the odds anybody will ever develop a new programming language, right? (obvious foreshadowing)

Edited to add: AI programming is "all well and good" except for a gazillion cases like some of the ones you mentioned.

πŸš€ clseibold [OP, πŸ›‚] Β· 2025-03-27 at 13:41:

@darkghost Ah, ok. Sorry.

Yeah, I agree. AI is being pushed *a lot* by certain individuals and companies. I think it's mostly a fad/trend that's being exaggerated. You have people claiming that AI will end up writing all of the software in the next 5 years, and other crazy things like that. I don't know about others, but I don't intend to stop writing code manually in the next 5 years, personally, lol.

And yet, there are still dangers too. AI is a tool that is sometimes useful, but when it gets overused, our performance and software-complexity problems could actually increase rather than decrease, which I'm not looking forward to. Especially now that Computer Science textbooks have finally started to note the obvious: Moore's Law has stopped holding. Maybe it can pick back up at some time in the future, but we'll have to wait for that time.

I do, however, think that people are going to *want* and *have* to go outside of AI eventually to get certain things they want done. Just like programmers nowadays go *outside* of game engines like Unity, or high level languages sometimes because they want to learn something deeper, or because they need something specific done. That's why I'm not convinced that all software will be written by AI in such a short period of time.

πŸ‘» darkghost Β· 2025-03-27 at 13:57:

I'll be surprised if AI programming is used for anything other than very small bespoke applications, at least in its entirety, for the next few years. While it can regurgitate code chunks, I am unconvinced it can produce projects all on its own. Even with future advancements, I don't see it progressing far beyond the kind of projects you find in how-to guides. Maybe I'll make VibesOS, the operating system you don't dare run on bare metal connected to the internet. It's all vibe coding, with the only comment in the code being "I don't know what any of this does."

🐦 wasolili [...] · 2025-03-27 at 15:15:

I don't think LLMs/AI will be successfully used for entire projects any time soon. The existing AI tools fucking suck. I have tried using them, and will continue to try, but the entirety of my experience can be summed up as "AI generates broken code, guiding it to near-correct code takes longer than if you wrote it yourself."

Reminds me of the phrase "an hour of googling can save you five minutes of reading the documentation" - "an hour of prompt engineering can save you five minutes of reading the documentation, and lighten your wallet!"

What I am worried about is that LLMs have given people who have absolutely no business being near a single line of code the illusion that they can write code. This results in horrible, broken code that actual developers have to waste time reviewing/refactoring/rewriting.

I am also highly suspicious of the claims companies make that a certain product or feature was developed with heavy use of AI. The people making those claims aren't programmers; they're the marketing people who heard AI was the new big thing. They're probably claiming AI was used to a greater extent than it was, partly because they're unfamiliar with how much the actual programmers used it, and (in larger part) because they think "made with AI" will appeal to stakeholders.

πŸš€ clseibold [OP, πŸ›‚] Β· 2025-03-27 at 15:18:

I've personally had good experiences with Claude generating a function or a very small portion of code pretty decently and correctly, but I haven't tried making whole projects with it, let alone anything more than like 25 lines max. It sounds kinda like a pain to use it exclusively for a whole project, honestly, lol.

I agree, if you can't read the code generated by an LLM to know whether it's correct or not, then that's really not a great thing.

🌲 Half_Elf_Monk · 2025-03-27 at 15:20:

I would love to have some kind of "code authority" institutions that verify the integrity and safety of code in OS projects. It'd help non-programmers evaluate the safety of code, and would provide some kind of imprimatur for a codebase's safety, LLM-free status, and bloat levels. I wish I could read/understand every line of code running on my machine, but realistically... that's not going to happen.

πŸ‘» darkghost Β· 2025-03-27 at 15:54:

I'm old enough to remember IRC. The prank we would play on people is "hey you can get IRC Operator status by just pressing Alt+F4!" then watching a bunch of people leave. Using an LLM to code without understanding what you're reading is a bit like that. Is it still social engineering if it's an LLM?

Original Post

πŸŒ’ s/programming

πŸš€ clseibold: [πŸ›‚]

Been seeing this term around the internet recently, called "Vibe Coding", and I finally looked it up. What the absolute heck?!? 🀣

πŸ’¬ 17 comments Β· 1 like Β· 2025-03-26 Β· 1 year ago