That advice is way too abstract when you start programming.
You can only understand them because you lived those situations, which implies experience you don't have.
I would say (specifically to my young self):
- There is no substitute for doing. Fewer tutorials, more coding.
- Stop being obsessed with quality: you are not at the level where you can provide it yet. Do dirty. Do badly. But ship. Some people will be mad at you, and they are right. But do it anyway. Yes, reboot the servers with users on them. Yes, commit the spaghetti. You'll eventually reach the level where you can avoid doing all this.
- The users don't care about the tech. They care about the result.
- Doing something for the hell of it is worth it. Just make it separate from the point above so they don't conflict.
- Programming is a battle against complexity. Again and again. Put that on a post-it. Make your decisions based on it. You will suck at it, but try.
- You have imposter syndrome because you are an imposter. You are really bad. It's ok, doctors hurt people for years while learning to save them. Don't sweat it. It will come.
- You need faith that it will work out. In the long run, you'll get better at this. But in the short term too. You'll find the bug. You'll figure out the solution. It will happen if you keep at it, even if it can be frustratingly long and unfair. Even if it doesn't feel like that right now.
- The right team is way more important than the right tech. If you are alone, get in touch with people who will lift you up from time to time.
What in the medical malpractice?
A recent Johns Hopkins study claims more than 250,000 people in the U.S. die every year from medical errors.
It is ok to be an imposter.
Is it your impression that the medical profession thinks that’s ok?
Yes. Nothing is perfect. The remedy they've chosen is to rely on insurance. There is a reason we call doctors' offices 'practices'.
There is, but it's not what you're implying.
You think they're called practices because of the less common usage of the term to mean an amateur learning something, to imply they're inexperienced/without training?
Rather than it being the place where a practitioner works? ("someone whose regular work has involved a lot of training")
The medical industrial complex clearly thinks it is, otherwise they would drastically lower patient loads, allow more spots in medical school, and provide better working conditions that would allow for fewer mistakes. None of this is happening, which is an implicit acceptance of medical-error deaths.
They obviously do. Just look at the way they run residency programs. They work those poor kids to the bone under minimal supervision. Only the most disingenuous can pretend it's an accident when something goes wrong. The system is clearly designed to harm people.
Show me a hospital, and I'll show you overworked, under-slept young doctors who have to deal with paperwork and are swimming in a field with heavy lobbying.
But even without this, the thing is, when you don't know, you will make mistakes.
And when your job is to take care of people, mistakes hurt them.
It's the nature of things.
Last year I suffered three medical errors, one of which almost got me killed, at the hands of very well-meaning professionals.
Closing your eyes and pretending it doesn't happen is naive at best.
But people have to start somewhere. They can't stay in theory forever. And no amount of preparation will save you from making terrible mistakes.
Since we can't put an experienced doctor behind every intern 24/7, there is no real solution to this for now.
Same for programming.
Yeah let's go build some shoddy bridges and crash some planes while we're at it.
Individuals can be new to a practice and feel like imposters but we shouldn't be pointing to statistics like this as an example of why it's ok.
I can't believe people think like this.
Planes and bridges are problems where quality control can be centralized, so you can build in a lot of redundancy.
Even then, the Boeing scandal shows it's not bulletproof.
It's not the same for medicine. There are far more doctors than you can cover with safety nets. Also, when a plane crashes, it's on the news; it costs money and PR. Much less so with doctors.
The plane industry is not inherently more moral, just more liable. Responsibility in the health industry is way more diluted.
It's not that we WANT to pay the price of people learning. I wish I hadn't been given the wrong meds last year.
It's just that the system is not currently set up to do it otherwise.
So you can feel guilty and stop moving, and you will not learn. You will not grow. And you will actually hurt people for a longer period because you will still make mistakes.
Or you can own it and accept the inevitable: doing things means having actual consequences on the world.
If you want, you can dedicate your life to changing the whole system and making it better. But I don't think anybody in this thread is doing that right now.
I did see a lot of people on their high horses doing nothing, though, paralyzed because they were looking for a way to never break anything.
Me, for example. For years.
Sure, we're human. There is no argument there.
However, normalizing the loss of human life as part of the learning algorithm is psychopathic. There's a time and a place to make mistakes... in mission-critical situations, that's in training. Everywhere else, we should aspire to safeguards that prevent the loss of life due to mistakes.
In medicine, a lot of these deaths can be prevented through greater care. It's no different from engineering in that respect. The greater problem is decision-makers putting profits over life. And this kind of mentality in this thread is just fuel for that kind of behavior. It's gross.
I think you're misreading a singular opinion as occurring between two disparate points here.
The initial phrase was > doctors hurt people for years while learning to save them
It's then a separate reply from someone else about deaths from errors/malpractice.
So, nobody seems to be expressing the mentality you are, correctly, lambasting (at least so far as I've read in the comments). But, as it is relevant to the lessons we'd all want to pass back to ourselves (in my opinion, it's because we wish to hold ourselves accountable to said lessons for the future), let's address the elephant in the comment.
Normalising deaths, especially the preventable ones, is absolutely not something that anybody here is suggesting (so far as my read of the situation is).
Normalising responsibility, by recognising that we can cause harm and so do something about it, that seems far more in-line with the above comments.
As you say it yourself, there's a time and a place, which is undoubtedly what we hope to foster for those who are younger or at the start of learning any discipline, beyond programming, medicine, or engineering.
Nobody is saying that the death and loss is acceptable and normalised, but rather that in the real world we need a certain presence around the knowledge that they will occur, to a certain degree, regardless. So, we accept responsibility for what we can prevent, and strive to push the frontier of our capacity further with this in mind. For some, that comes from experience, unfortunately. For others, they can start to grapple with the notion in lieu of it through considering the consequences, and the scale of consequence; as the above comments would be evidence of, at least by implication.
These are not the only ways to develop a mindset of responsibility, fortunately, but they are what these comments can be, even if you find the literal wording to suggest otherwise. I cannot, of course, attest to the "true" feelings of others, but neither can anyone else... In the spirit of the matter, though, consider: your own sense of responsibility seems attuned to spotting where such thinking can become justification for the very thing it would otherwise prevent, whether as a shield purpose-built for the role or one co-opted out of convenience. That vigilance is integral too: we will always need to guard against complacency, and so must promote this vigilance as a healthy part of a responsible mindset -- lest we become that which we seek to prevent, and all that.
Exactly as you say, there's a greater problem, but this thinking is not necessarily justification for it, and can indeed become another tool to counter it. More responsibility, more mindfulness about our intents, actions, and consequences? That will prove indispensable for us to actually solve the greater problem, so we must appreciate that different paths will be needed to achieve it; after all, there are many different justifications for that problem, all of which will need to be robustly refuted and shown for what they are. Doing so won't solve the problem, but it is one of many steps we will need to take.
Regardless, this mindfulness and vigilance about ourselves, as much as about each other, will be self-promoting through the mutual reinforcement of these qualities. If someone must attempt to visualise the staggering scale of consequence as part of developing this, then so be it. In turn, they will eventually grapple with this vigilance as well, as the responsibility behoves them to, else they may end up taking actions and having consequences that are equivalent to the exact mentality you fear, even if they do/did not actually "intend" to do so. The learning never ends, and the mistakes never will, so we must have awareness of the totality of this truth; even if only as best we can manage within our limited abilities.
Making a mistake or a reasonable but ultimately incorrect call is not malpractice. Doctors are just people, just like the rest of us, and certainly you. It’s scary if you think that the medical system is more of a safety net than it actually is, but your gripe is with the precarious nature of life, not medicine.
Figuring out how to be a doctor by faking it til you make it most definitely is malpractice though, and ‘don’t worry, you’re an imposter but you only learn by screwing up’ is very much not how doctors learn their job.
The Hippocratic oath is not ‘first, do some harm, as long as you’re learning’.
This may be good advice for yourself in the past, and for many people at lots of times, but I'd hesitate to give it as general advice. Reading code others have written should not be understated as a way to learn valuable things, from fundamental patterns and algorithms to language features and idioms. If you have a job on a team, this may happen anyway, but it's possible to write lots of code without realizing there's a better way.
I should point out, though, that reading code is not the same as reading tutorials. Reading code happens with a kind of intent and focus that is missing when you don't know much, and thus you may end up falling into the trap of studying multiple tutorials without really trying much yourself.
Tutorials include some code which is intended to be exemplary and simple enough for a new programmer to make sense of, so I wouldn't discount it as a part of reading code. Practical code does not always have those properties, although you'd certainly be missing a lot if you only read code from tutorials.
I completely agree with the claim "There is no substitute for doing", and I might even say that code you read without running and tweaking it doesn't count.
Trying to learn to become a developer from tutorials is like trying to become a carpenter by reading instruction manuals for saws, nail guns, &c.
To learn how to implement computer programs, read books about it.
But you're not going to realize which way it's better unless you've written the bad code before.
Tangential, but I think that's the problem with trying to learn from design patterns.
I read about them and tried to apply them.
That's backward and didn't produce anything good.
I started to understand them by doing it the other way around:
- Coding, solving problems.
- Reading other people's sources.
- Then reinventing half a terrible design pattern.
- Later on, looking at a book: "ohhh, that's what I tried to do" or "ohhh, hence the snippet in that source".
- Now I can discuss with people about the pattern and name my code entities according to that.
Design patterns are a communication tool.
Indeed, it's not general advice; it's for me specifically, someone who used to procrastinate a lot under the guise of learning.
Same. I would fall into the same trap. One more article, another tutorial, another chapter of a book… juuust to make sure I understood the concepts; and what actually helped? Just coding, getting it wrong, fixing it, getting it wrong, fixing it, etc.
I think reading code to understand it, because you forked a project and want to continue it, is very different from following tutorials.
I'd definitely agree that reading others' code can be a good way of learning, especially if you take the time to disassemble it line by line and look up things like what language features and functions they're using, how it's structured, etc.
A bit snarky, but I would add
- don't read opinions from a list and take them as gospel. Adapt to the jobs you're in, and you'll develop your own opinions, but now with the experience to explain why. Opinions are formed by getting repeatedly hit with the consequences of your (and others') decisions, and everyone just has to take enough hits till the pattern-seeking area of the brain takes over.
Maybe add to that: try to stay long enough in a role to really feel the consequences of your actions. Even better if you're on pager duty for a while too. I know it's not trendy to stay in a job for long these days, and conventional wisdom says it's not great for your salary either, but one thing it will do is allow you to understand whether the decisions you made were actually good or not.
There are roles I've been in where it's only been years later that the true impact of decisions made was actually apparent. I'm glad I hung around long enough to experience that.
Skin in the game is important.
I have a terrible programmer friend.
He is better at creating actual products than most people I've met, because his livelihood is on the line: he makes money only from websites.
So he is not living in the abstract world of best practices; he has had to make it profitable for the last two decades, with few resources to go on, on top of that.
And he does.
After 10 years, he is still asking me how to write git commands. And I'm still learning from him to shed a lot of BS from my practices.
Being directly chained to the financial outcome of your works will have you dropping Scrum for Scheme real quick...
I wholeheartedly agree with this advice and have for over a decade now. I think there's something in knowing that *you* caused it that helps the lesson stick. Potentially you will come to understand the assumptions you had then versus the ones you have now, which lets you clearly see the underlying lesson to take away.
I've been called back to code long after I left an org, and have had to review my own code 10-12-15 years after the fact. Seeing both the positive and negative aspects of decisions having played out in the real world was something it's hard to get from reading, and I'd argue somewhat harder to get even if you stay inside the same org for that length of time. Staying on the inside, you'll rationalize a lot of the changes to the code, the team and org over time, and it may be harder to be more objective about the impact the code has had.
I was quite proud of some decisions, but realized the negative long-term impact of others. Trying to share that experience and whatever 'wisdom' or 'lessons learned' with others has been its own challenge in some situations, because you can easily come across as "the old person" who doesn't "get it" with respect to current trends. Some issues are evergreen and fundamental, but it's difficult for less experienced people to understand why some of these things are really core. I'm not sure there's much substitute for experience in many cases.
I agree, and would say it's equivalent to:
Yes, true. Mostly just opining that the "list of everything you gotta know" is a bit TOO concrete IMO. (As opposed to too abstract.)
I was quoting similar things years ago at the start of my career; "it should be done this way because X said it" didn't help me at all.
It feels like insulation against being called too junior, since you can just wholesale adopt someone else's list of ideas and you're good. But making mistakes because you're new to the career is precisely what forms the base of those opinions in the first place.
I agree, form your opinions when you have enough information.
For example, if you can’t decide between two data structures or tech, pick one and add a comment:
// I’m not sure
*This*: "The users don't care about the tech. They care about the result."
To the extent users do care about the tech, they care about performance not how "clean" the code is or whether you're using the newest framework. Users hate when software is slow or uses an exorbitant amount of memory.
Most users have no idea how much memory a piece of software uses. They do care about user experience, though, which means they will care if the UI hangs or is unresponsive.
Bingo. I’d go further and say they don’t care about your application at all. They just want to do something, and your application’s quality is measured by how little it stands in the way of accomplishing that.
The sad fact is this encapsulates features (ease of development, a framework probably does help you ship faster), adaptability (clean abstractions that are easy to work with), and performance.
Finding that balance is always going to be hard but they’re all important!
And if the product is aimed at power users/communities who extend the functionality themselves with their own commands, scripts, and plugins.
Yeah, I totally agree. Those other factors are internal optimizations. What gets me is when a team wants to switch horses to new tech and do a forklift upgrade just to implement something using the new hotness.
This 100%. It's especially noticeable in the world of game development, where you see games that are ridiculously buggy and poorly made (on a coding basis) selling millions of copies and changing the industry. The original generation 1 Pokemon games are probably some of the best examples of this, though I'm pretty sure anyone who's reverse-engineered any game from the Atari era onwards has been left wondering "what the hell were they thinking?"
But it doesn't matter. They were designed well, they were fun to play, and millions of people enjoyed them.
Stop being obsessed with quality: you are not at the level where you can provide it yet. Do dirty. Do badly. But ship.
You'll eventually reach the level where you can avoid doing all this.
What if future you has reached that level and people on your team are shipping spaghetti?
I feel like this is bad advice, really: you need to be obsessed with quality and growth, but you can't let that stop you from shipping. Aim for clean to the best of your ability within the time constraints you have, but accept that it will be dirty.
I think aiming for clean is good, but it’s really hard to pin down what clean means when you’re starting out.
I feel like just emulating what you see in your first few jobs is ideal. (As in, ask coworkers who know the thing you’re working on) It could be great code to be inspired from, or mediocre. Either way you get some input about what decisions result in what outcomes, and what the outcome “feels” like.
And if it comes time to change one or many of those decisions later on in this codebase, the person doing it gets a uniform codebase to work from! Unique abstractions and fixes in random places make refactoring harder.
Obsession will stop you from shipping.
Or it's not an obsession.
It's caring.
Very few people can pull off a Steve Job level of nitpicking and actually finish a project.
I certainly couldn't, and that advice is for young me.
Then you find a new team; or else you can do the classic 'make one trivial hill your Happy Path and be prepared to die on it' routine.
A way to rephrase your point about complexity is this wonderful quote "Developers are drawn to complexity like moths to a flame, often with the same outcome" (Neal Ford)
I've experienced this too, but I never understood why this phenomenon happens.
Is it because we start adding abstractions before we understand the problems?
Like, there's a certain threshold where keeping things too simple makes them complex in another way, so people start introducing abstractions to reduce that complexity. But often they're the wrong abstractions, and then we accidentally end up with worse complexity that's harder to unravel.
There must be a term for this *waves hand* thing?
Edit: gosh, typing on phones is hard
The commonly used term, also mentioned by Fred Brooks, is accidental complexity. "Accidental" highlights the unintentional way devs introduce complexity.
Isn't accidental complexity just complexity that's not part of the problem domain? Say, in the context of serving content/downloads, accidental complexity would be caching, proxies, CDNs, etc. Basically stuff that we have to deal with to handle or optimise downloads, but that isn't inherently part of the "just downloading files" problem?
It’s because when you’re in the middle of things with a lot of context in your head, adding another little wrinkle feels like a negligible complication (in particular if the new wrinkle is ostensibly to reduce some complexity, or to ship more quickly), but those complications accumulate up to the limits of any developer’s comprehension (and beyond) rather sooner than later. Hence developers tend to work close to the limit of what they can handle in terms of complexity, which for anyone who hasn’t all the same context in their head (like the developers themselves some time later) is really more than they can handle in an effective manner.
From another perspective, this is a form of entropy: There are many more ways to increase complexity than to reduce it. This is also the reason why biological life has been getting more complex over time, the only criterion being if it’s fit to survive and reproduce.
These seem just as abstract as mine, if not more so, and at least I provided examples where I could. It feels weird to criticize my post for being general advice plus examples, then come up with your own general advice without examples.
Also, this was just an analogy, I know, but doctors definitely don't hurt people for years while learning to save them; it's a very different profession from ours. If anything, doctors earlier in their careers have been shown to have better results.
I think you both provide a kind of generational advice, like a parent serving a kid life advice. Unfortunately, or fortunately, they will have to learn it by experiencing their own failures first.
I think you are trying to address different audiences. While your tips are mostly targeted at people who are already working as programmers, the parent comment's tips are mostly targeted at complete beginners.
E.g. this tip:
- There is no substitute for doing. Fewer tutorials, more coding.
is directly addressing a common mistake for absolute beginners. Many beginners will read (or, worse yet, watch) loads of coding tutorials while doing little themselves. It is an issue a complete beginner encounters and understands.
Your tip on the other hand:
is addressing people working on medium to large projects with internal tooling. That is not a situation a complete beginner finds themselves in; it's a situation someone who already works in programming for a while finds themselves in.
I wouldn't necessarily say your tips are too abstract; they are simply too high level for a complete beginner.
That is not necessarily a bad thing; perhaps the you of 15 years ago already had the basic understanding necessary to be able to comprehend and make use of your tips.
I really liked “You should know all the major shortcuts in your editor. You should be a confident and fast typist. You should know your OS well. You should be proficient in the shell. You should know how to use the browser dev tools effectively.”
Typing skills are severely underrated in our profession and in roles adjacent to it, like PM.
I have a very severe case of impostor syndrome. :(
In a world of imposters, the half decent imposter is king.
Don't worry, you're not a real imposter. You've just inadvertently ended up in a position where you're expected to be one. Just fake it until you actually become a true imposter.
You probably are. But so are most of your colleagues :)
Most adults are kids in big meat suits, they fake it a lot.
I started to live like I was not completely worthless at 35.
Not saying that to be proud of it, just stating that if you think humanity should do better, the first person you'll judge is you.
It will be glaringly obvious you are not meeting your own standards.
The higher your standards, the longer it will take for you to reach them.
And the way to get there faster is to ignore the shame, and do it anyway. Because if you don't, your growth will be slower, and you will do more damage for longer.
Real life means real consequences.
It will make you more tolerant of others as well. Way more tolerant.
Seasoned, grownup engineers and tech business leaders are forgetting this, even today. Users don't care that your product is made with AI, but techies just will not shut up about Generative AI, LLMs, Transformers and all this shit that should be implementation details. Users don't care about any of what goes into the sausage.
Investors care though, and for many startups the customer they need to appeal to is investors.
I don't understand why investors care, either. Product A is made with traditional algorithms, product B is made with AI and LLMs, product C is made with literal magic and wizardry. But they all do exactly the same thing. Why does an investor prefer to invest in product B?
"Those are way too abstract advice when you start programming."
I have come to the conclusion that the use of these sorts of posts is not that the reader, young or otherwise, will instantly and correctly apply all the lessons to their lives.
It's more about sensitizing people to problems they may not currently see, and solutions they may not currently be aware of. It's about shortening the learning curve, rather than eliminating it.
A 1-year programmer is not going to read themselves into a 20-year programmer, no matter what they read. But at the 5-year level, I think you'll see a lot of difference between someone who never considers their craft and never pushes themselves, just keeping their head down and doing the next bug, and the person who has even just occasionally read this sort of post, pondered how it may apply to their current situation, and taken out of it what they can... which may be something completely different from what they'd take out of reading the exact same post two years later.
Yes, countering "you don't know what you don't know"
Modern medical education doesn't work this way.
Yeah, I'm surrounded by medical professionals.
It totally does.
They make grave mistakes all the time. And they hide them. They lie about them.
They have their ego and career on the line.
And they don't have enough resources at their disposal, not enough hours, too many patients, and they are exhausted.
In short, they are humans in a human system.
I think you're likely to reach that level a lot faster by trying and failing than by not trying at all.
This is a great use for new, standalone open source modules: feel free to experiment with styles or techniques or goals that you wouldn't want to justify. (Or for experiments that you start and then abandon without ever showing anyone.)
For example, when I was making lots of packages to help build out the Racket ecosystem, I'd already made an HTML-writing package, but I felt a craving to do one that used syntax extension at compile time, rather than dynamic s-expressions. So I just did it, as a separate package: https://www.neilvandyke.org/racket/html-template/
I don't recall anyone saying they saw value in it, but I like it, and I scratched that itch, and added another skill to my programming mental bag of tricks.
Most of these lists do not advise on "programming" the task but rather on "programming" the job/position. There's no problem with that; I just wish it were labelled "software engineering advice" or "advice for programmers". Few of the points are about programming itself, and the list is overall pretty good.
Consider this just a rant from an older programmer who hasn't fully recognized that 'programming' is now largely a social endeavor with online info for everything, language and library ecosystems, etc. I wonder how much of this information I would have internalized if it were all available in my time. Seems like the kind of thing I might read and nod in agreement and forget to apply when relevant without some hard earned run-ins (which is how I'd picked them up).
As for actual programming advice, the thing I'd highlight is to look at the data first and foremost. A program goes from initial conditions to post-conditions. The difference between them is what your program/function does, and both the pre- and post-conditions should be fully describable as a valid static state. Once you understand that (the problem), the code is largely plumbing, with a small part that applies the functional transformation. There's so much focus on the form and style of "the code" that seems to treat it as the main thing, rather than an artifact of getting the main thing done: think of any fancy abstraction that doesn't provide enough value to offset its complexity. To relate it to a point in the post: if you can't connect the difficulty you're having to the logical transformation that needs to be done (e.g. from database state to state), it's likely self-inflicted (or it could be due to a poor choice of database schema). Similarly for poor choices of request/response formats -- basically bad plumbing between systems (and not intrinsically hard because of the information being handled).
I never really got the "it's ok to be an imposter", "it's ok to be bad" part... And the internet is full of people saying things like "I am a developer and I have no idea what I'm doing haha".
Seriously, you might be inexperienced or not know everything, but you should clearly not be an imposter and you should be confident in your ability to improve and understand what you don't understand yet.
Take responsibility and ownership of what you do; don't hide behind the easy excuse that it's too complicated. You are the professional and you are getting paid for this.
From 30+ years of dev work:
I'd rephrase that as "just write something!"
Many times I find myself being the classic example of 'perfect is the enemy of good': I'll dwell on the perfect solution rather than write something *now* that works, and refactor towards perfect. TDD and all that.
Other things:
- Beware the beta and the boss. If it works, it will invariably get shipped if your manager sees it. Many managers cannot put a value on the future cost of maintaining something that's barely good enough.
- Classic Confucius: "I hear and I forget. I see and I remember. I do and I understand." If you're interested enough to read about some tech/language/framework, write something (again!).
- Learn at least one other language and ecosystem in addition to your main one. Even if it is syntactically similar (C++, Java, C#, for example).