I really do think that software engineering as we know it is ending. It will take 3-5 years for tools like this to mature. But it will happen, and hard-fought skill sets like SQL database design, query design, and maybe even ORMs will become obsolete.
My biggest prediction is that ORMs will not be necessary when LLMs can generate SQL. Low-level SQL is a better abstraction for LLMs than ORMs, and as people are removed from the equation, so too will the abstractions built to help them craft code disappear.
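To make that concrete, here is a minimal sketch (the table, columns, and ORM call are hypothetical, chosen only for illustration) of the kind of query an ORM typically wraps. The claim above is that an LLM can just emit the plain SQL directly:

```python
import sqlite3

# Hypothetical schema, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, age INTEGER)")
conn.executemany("INSERT INTO users (name, age) VALUES (?, ?)",
                 [("ada", 36), ("bob", 24)])

# An ORM call along the lines of session.query(User).filter(User.age > 30)
# ultimately compiles down to plain SQL like this:
rows = conn.execute("SELECT name FROM users WHERE age > ?", (30,)).fetchall()
print(rows)  # [('ada',)]
```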
We already had SQL model and code generators well before LLMs. What does adding in random output do to improve that?
You haven't been using these tools if you think this
Yes, I don't need to use these tools because I already have code generators. When I'm wondering about config options, I use documentation or a search engine. It's cute to put them together in a single UI, but it doesn't make these tools inherently more intelligent. It just saves me a few alt+tabs.
An LLM just takes prefabbed templates and swaps in statistically likely possibilities for the answer. My code generator outputs a prefabbed template with a deterministic solution, no statistical guessing required.
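For the sake of the argument, a toy sketch of the deterministic kind of generator described here (the template and entity names are made up). The same spec always produces byte-identical output, which is the contrast being drawn with a sampled LLM completion:

```python
from string import Template

# A toy deterministic code generator: same input spec, same output, every time.
DAO_TEMPLATE = Template(
    "def get_$entity(conn, ${entity}_id):\n"
    '    return conn.execute("SELECT * FROM $table WHERE id = ?",'
    " (${entity}_id,)).fetchone()\n"
)

def generate_dao(entity: str, table: str) -> str:
    """Expand the template for one entity; purely textual, no randomness."""
    return DAO_TEMPLATE.substitute(entity=entity, table=table)

print(generate_dao("user", "users"))
```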
You don't understand how powerful LLMs are. Go use Claude 3.5 Sonnet. Paste in 1,000 lines of code and describe a code change you want to make. Iterate on the solution it gives. Do this for a week.
If I did that I would waste so much time. I know what code is in my codebase. Maybe if I was a novice this would be effective to help me learn it. Is the point of the exercise to wow myself for a week that an LLM can spit out solutions?
No, it's that it will do a day's worth of coding in a few minutes
10x LoC is going to require more automation to manage the sheer mass of it, which means more tools/money/layers of abstraction. AI coders need AI testers and AI peer reviewers, and need to iterate over and over to compensate for incorrectness to produce a working feature. That sounds hellishly inefficient. (But all it has to be is cheaper, I suppose.)
You're speaking theoretically but we're already using it like this and it's not hellish or inefficient, or I wouldn't use it. Granted, it fits certain tasks better than others but when it does it's a massive relief and I can't imagine going back.
Wow!
What kind of code generators are you talking about? The ones I have are just templates with macros for scaffolding boilerplate, but they are not even remotely comparable to how I use LLMs and definitely not a substitute.
I have, and I (mostly) agree with GP's point.
The utility of LLMs with code generation varies widely with the problem domain and the amount of experience the developer has.
Giving everyone a smartphone with a great video camera built in didn't obsolete the field of cinematography. I don't think giving everyone tools to help them build software will obsolete software engineering.
Are you sure? Most of the popular videos today do not have what one would call great cinematography, but it doesn't seem to matter. No one cares. Sure, movies still use cinematographers, but movie watching time is getting eaten up with Instagram/TikTok, where cinematography doesn't matter.
I fear applications will suffer the same fate. "Good enough" will overtake "well-architected".
I think you are 100% spot-on. Good enough has always been fine for the vast majority of people and the vast majority of use-cases.
Couple this with the decreasing cost of storage (and ideally compute), and it doesn't matter if the data model is garbage; people can still get something workable that's better than the awful Excel files they curate now. The LLM will still make errors, but eventually fewer than their spreadsheets do.
There is no "good enough" for data modeling. There is correct, and there is "this works, but it has latent bugs that will eventually surface." You either have referential integrity, or you don't.
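A small illustration of the point, in SQLite via Python (the `authors`/`posts` schema is invented for the example). With the foreign-key constraint declared and enforced, the orphaned row is rejected at write time; without it, the bad row would sit there silently until something joined on it:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite does not enforce FKs by default
conn.execute("CREATE TABLE authors (id INTEGER PRIMARY KEY)")
conn.execute("""CREATE TABLE posts (
    id INTEGER PRIMARY KEY,
    author_id INTEGER NOT NULL REFERENCES authors(id)
)""")

conn.execute("INSERT INTO authors (id) VALUES (1)")
conn.execute("INSERT INTO posts (id, author_id) VALUES (1, 1)")  # valid reference

rejected = False
try:
    # author_id 999 does not exist: with the constraint on, this fails now,
    # instead of surfacing later as a latent bug.
    conn.execute("INSERT INTO posts (id, author_id) VALUES (2, 999)")
except sqlite3.IntegrityError as e:
    rejected = True
    print("rejected:", e)
```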
LLMs don't have the context to make good decisions, though. You need all of those hard-fought skills to make those decisions. And people are the only ones who can hold enough context to actually make them.
Not only that, but AI is way more expensive than we think. We're currently in a hype bubble funded by last-ditch VC money. When that money runs out, and it will eventually, AI is going to get WAY more expensive.
I personally think LLMs make much better decisions than me. I often have a design in mind, and then when I prompt Claude it gives me a much better one, and also takes a lot of edge cases into account that I didn't even think of. Maybe I'm a useless programmer, but I'm sure I'm not the only one.
Even when I write working code, I prompt Claude, and it adds a bunch of stuff I would never have added. It astonishes me how good it is.
How does that make any of the skills obsolete? If anything, it makes them even more important.
In the 20th century you got away with knowing the syntax and hacking away. Now you really need to have a deep understanding of relational algebra, since the LLM is doing the typing for you.
Until a query becomes a bottleneck and no one knows why because no one knows how databases work anymore.
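This is exactly the kind of thing knowing how databases work lets you diagnose. A sketch in SQLite via Python (the `orders` schema and index name are made up): `EXPLAIN QUERY PLAN` shows the same query going from a full table scan to an index search once an index exists:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)

query = "SELECT total FROM orders WHERE customer_id = ?"

# Without an index on customer_id, the planner has to scan the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()
print(plan_before)  # detail column should mention a SCAN of orders

conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")

# With the index, the planner can search it instead of scanning.
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()
print(plan_after)  # detail column should mention a SEARCH using the index
```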
Reducing the barrier to entry is not a bad thing