This is a good way to maximize speed. I'm not convinced it's also a good way to maximize quality. Rushing ("speedrunning") to a first working version may force you into sub-optimal choices (algorithms, data types, etc.) that you won't have the time or the will to correct later.
I'd even postulate that this is why we have so many crap applications today that are first to market but slow, inefficient, and user-unfriendly.
If premature optimization is the root of all evil, totally disregarding it leads to painful refactoring.
I think it's the opposite. I think quality often comes from evolution and iteration.
There've been so many projects where I've gotten stuck because I wanted to maximize quality, so I got writer's block. The worst part is that sometimes you'll try to perfect something in your project that ultimately isn't of great value.
Building something quickly, and then iterating to perfect it seems to work for many people.
This is true for most things in life. People spend days and weeks on the logo, and the actual product never gets off the ground. People spend so much time planning the perfect vacation that it never happens. And so on.
Truth is, for most things in life, good enough is just good enough. Lots of things we do have a short shelf life anyways.
I guess deciding on the right level of goodness (or perfection) for the tasks/projects we do in life is a big skill in itself.
And what many people on either side forget: neither approach is one-size-fits-all. Some things need planning up front (a car or a rocket), and some things can be done agile and iteratively. Likewise, some things can't be made via solopreneurship/indiehacking, and some things can't be achieved with classic VC-backed entrepreneurship. There's a time for both.
That’s certainly one way to get a crappy application. Another way is to find optimal paradigms only to discover that the problem that needs to be solved has changed and now the optimal paradigms are technical debt that needs to be worked around.
Much of the reason sucky applications suck is because the people who work on them can't change them quickly enough. If you can open up your IDE, grab a flame graph, and chuck out your shitty brute-force algorithm in favour of a dynamic programming one that you thought of in the shower, then one Friday morning you're likely to do just that.
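As a toy illustration of that kind of Friday-morning swap (the problem and function names here are hypothetical, chosen for brevity, not taken from the comment): the same answer, first as the exponential brute force a flame graph would flag, then as the linear dynamic-programming replacement.

```python
def num_paths_brute(n: int) -> int:
    """Count ways to climb n stairs taking 1 or 2 steps at a time.

    Brute-force recursion: exponential time, since it re-solves the
    same subproblems over and over -- the hot spot a flame graph
    would light up.
    """
    if n <= 1:
        return 1
    return num_paths_brute(n - 1) + num_paths_brute(n - 2)


def num_paths_dp(n: int) -> int:
    """Same answer via bottom-up dynamic programming in O(n) time.

    Each subproblem is solved once; only the last two results are kept.
    """
    a, b = 1, 1  # ways to climb 0 stairs and 1 stair
    for _ in range(n - 1):
        a, b = b, a + b
    return b
```

The two functions agree on every input; only the second stays fast as n grows, which is exactly the kind of drop-in improvement that's easy to make when the codebase is easy to change.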
I suspect that the “crap applications” issue arises not necessarily due to the method being wrong, but more likely due to people disregarding step 4 in the article: “Finally, once completely done, go back and perfect”.
It may be because of tight deadlines, laziness (it's "good enough", so why bother?) or eagerness to jump to the next project (because it is more exciting or profitable than doing the hard work of getting the details right).
I guess there is also a personality type factor that plays into it, because many people seem to just care about the hard requirements and cannot be bothered about things like performance, accessibility, design consistency, simplicity, maintainability, good documentation, etc., at least as long as nobody complains about it.
It can definitely lead to under-optimized code, but on the flip side, prematurely optimizing can waste time and lead to overly complex code that is difficult to maintain. The key is to know how much to optimize and when.
The point of the article isn't to show you how to produce a shoddy first version as soon as possible, but rather how to avoid things like analysis paralysis and prematurely focusing on style over substance. This applies not just to code but to pretty much anything you create.
By completing a skeleton as soon as possible, you get a better idea of the different components you'll need and how they will interact, before you flesh any of them out. I think there is real value in this approach.
Yes, but at the beginning you can't be totally sure that what you are building is the right thing to build, or that you have the money/resources to go slow.
Agree. In the context of software development, you might choose different tools (programming language especially) if your goal is rapid application development rather than general high quality and long-term maintainability. You can't easily go back and change those decisions.
This is one of the perennial software development questions: to what extent can you improve an existing solution with a flawed or now-inappropriate architecture or implementation? This topic turned up a couple of months ago. [0]
[0] https://news.ycombinator.com/item?id=40623263