
Code is run more than read

rob74
68 replies
9h52m

TIL of ≹, which "articulates a relationship where neither of the two compared entities is greater or lesser than the other, yet they aren't necessarily equal either. This nuanced distinction is essential in areas where there are different ways to compare entities that aren't strictly numerical." (https://www.mathematics-monster.com/symbols/Neither-Greater-...)

namanyayg
44 replies
9h43m

Likewise, TIL. In your link, however, it states

"Example 1: Numerical Context

Let's consider two real numbers, a and b. If a is neither greater than nor less than b, but they aren't explicitly equal, the relationship is ≹"

How can that be possible?

yummypaint
16 replies
9h32m

This doesn't really pass the smell test for me either, but to play devil's advocate:

Imagine you have 2 irrational numbers, and for some a priori reason you know they cannot be equal. You write a computer program to calculate them to arbitrary precision, but no matter how many digits you generate they are identical to that approximation. You know that there must be some point at which they diverge, with one being larger than the other, but you cannot determine when or by how much.

MikeDelta
15 replies
8h48m

Maybe you will find the proof that the infinite series 0.9999... exactly equals 1 interesting:

https://en.wikipedia.org/wiki/0.999...

jcul
14 replies
8h32m

Wow, can't believe I've never realised this. How counterintuitive.

I found the 1/3 * 3 argument the most intuitive.

thaumasiotes
10 replies
8h24m

I like the argument that observes "if you subtract 0.99(9) from 1, you get a number in which every decimal place is zero".

The geometric series proof is less fun but more straightforward.

As a fun side note, the geometric series proof will also tell you that the sum of every nonnegative power of 2 works out to -1, and this is in fact how we represent -1 in computers.
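That two's-complement connection is easy to check concretely. A Python sketch, assuming a 32-bit word standing in for the infinite sum:

```python
# Sketch: in n-bit arithmetic, the sum 1 + 2 + 4 + ... + 2^(n-1) is the
# all-ones bit pattern, which two's complement reads as -1.
import struct

n = 32
total = sum(2**i for i in range(n))   # 0xFFFFFFFF: all 32 bits set
assert total == (1 << n) - 1

# Reinterpret the unsigned bit pattern as a signed 32-bit integer.
signed = struct.unpack("<i", struct.pack("<I", total))[0]
assert signed == -1
```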

nly
9 replies
8h19m

How can the sum of a bunch of positive powers of 2 be a negative number?

Isn't the sum of any infinite series of positive numbers infinity?

xigoi
3 replies
8h6m

The infinite sum of powers of 2 indeed diverges in the real numbers. However, in the 2-adic numbers, it does actually equal -1.

https://en.wikipedia.org/wiki/P-adic_number
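That convergence can be checked mechanically (a Python sketch): in the 2-adic metric a number is "small" when it's divisible by a high power of 2, and each partial sum differs from -1 by exactly such a power.

```python
# Partial sums 1 + 2 + ... + 2^(k-1) miss -1 by exactly 2^k,
# so the 2-adic distance from the partial sum to -1 shrinks as k grows.
for k in (4, 8, 16, 32):
    partial = sum(2**i for i in range(k))
    assert partial - (-1) == 2**k
```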

Dylan16807
2 replies
7h41m

Eh, P-adic numbers basically write the digits backwards, so "-1" has very little relation to a normal -1.

xigoi
0 replies
7h39m

-1 means that when you add 1, you get 0. And the 2-adic number …11111 has this property.

mike_hock
0 replies
4h13m

Any ring automatically gains all integers as meaningful symbols because there is exactly one ring homomorphism from Z to the ring.

mr_mitm
1 replies
4h33m

In an introductory course to String Theory they tried to tell me that 1+2+3+4+... = -1/12.

There is some weird appeal to the Zeta function which implies this result and apparently even has some use in String Theory, but I cannot say I was ever convinced. I then dropped the class. (Not the only thing that I couldn't wrap my head around, though.)

thaumasiotes
0 replies
1h16m

The result isn't owed to the zeta function. For example, Ramanujan derived it by relating the series to the product of two infinite polynomials, (1 - x + x² - x³ + ...) × (1 - x + x² - x³ + ...). (Ok, it's the square of one infinite polynomial.)

Do that multiplication and you'll find the result is (1 - 2x + 3x² - 4x³ + ...). So the sum of the sequence of coefficients {1, -2, 3, -4, ...} is taken to be the square of the sum of the sequence {1, -1, 1, -1, ...} (because the polynomial associated with the first sequence is the square of the polynomial associated with the second sequence), and the sum of the all-positive sequence {1, 2, 3, 4, ...} is calculated by a simpler algebraic relationship to the half-negative sequence {1, -2, 3, -4, ...}.

The zeta function is just a piece of evidence that the derivation of the value is correct in a sense - at the point where the zeta function would be defined by the infinite sum 1 + 2 + 3 + ..., to the extent that it is possible to assign a value to the zeta function at that point, the value must be -1/12.

https://www.youtube.com/watch?v=jcKRGpMiVTw is a youtube video (Mathologer) which goes over this material fairly carefully.
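Sketched out in symbols, treating the divergent sums as formal (Abel-summed) values:

```latex
\frac{1}{(1+x)^2} = 1 - 2x + 3x^2 - 4x^3 + \cdots
  \;\xrightarrow{\;x \to 1^-\;}\;
  1 - 2 + 3 - 4 + \cdots = \tfrac{1}{4}

S - \bigl(1 - 2 + 3 - 4 + \cdots\bigr)
  = 4 + 8 + 12 + \cdots = 4S
  \quad\text{where } S = 1 + 2 + 3 + \cdots

\therefore\; S - \tfrac{1}{4} = 4S
  \;\implies\; S = -\tfrac{1}{12}
```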

defrost
1 replies
8h12m

The first question is a good one that deserves an answer.

The answer to the second is "not always" ..

Consider SumOf 1 + 1/2 + 1/4 + 1/8 + 1/16 + 1/32 ...

an infinite sequence of continuously decreasing numbers, the more you add the smaller the quantity added becomes.

It appears to approach but never reach some finite limit.

Unless, of course, by "Number" you mean "whole integer" | counting number, etc.

It's important to nail down those definitions.

thaumasiotes
0 replies
6h34m

The first question is a good one that deserves an answer.

The same argument I mentioned above, that subtracting 0.99999... from 1 will give you a number that is equal to zero, will also tell you that binary ...11111 or decimal ...999999 is equal to negative one. If you add one to the value, you will get a number that is equal to zero.

You might object that there is an infinite carry bit, but in that case you should also object that there is an infinitesimal residual when you subtract 0.9999... from 1.

It works for everything, not just -1. The infinite bit pattern ...(01)010101 is, according to the geometric series formula, equal to -1/3 [1 + 4 + 16 + 64 + ... = 1 / (1-4)]. What happens if you multiply it by 3?

        ...0101010101
      x            11
    -------------------
        ...0101010101
     + ...01010101010
    -------------------
       ...11111111111
You get -1.
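The same computation goes through in wrapping fixed-width arithmetic (a Python sketch, with 32 bits standing in for the infinite pattern):

```python
n = 32
mask = (1 << n) - 1
pattern = sum(4**i for i in range(n // 2))  # 0x55555555: the ...010101 bits
product = (pattern * 3) & mask              # multiply by 3, wrapping at 32 bits
assert product == mask                      # all ones, i.e. ...11111111
assert product - (1 << n) == -1             # the signed reading is -1
```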

twiceaday
0 replies
8h5m

https://youtu.be/krtf-v19TJg?si=Tpa3EW88Z__wfOQy&t=75

You can 'represent' the process of summing an infinite number of positive powers of x as a formula. That formula corresponds 1:1 to the process only for -1 < x < 1. However, when you plug 2 into that formula you essentially jump past the discontinuity at x = 1 and land on a finite value of -1. This 'makes sense' and is useful in certain applications.

quchen
1 replies
8h26m

It's a flawed psychological argument though, because it hinges on accepting that 0.333...=1/3, for which the proof is the same as for 0.999...=1. People have less of a problem with 1/3 so they gloss over this - for some reason, nobody ever says "but there is always a ...3 missing to 1/3" or something.

Scarblac
0 replies
8h16m

The problem is that there are two different ways to write the same number in infinite decimals notation. (0.999... and 1.000...).

That's what's counterintuitive to people; it's not an issue with 1/3. That has just one way to write it as decimals, 0.333...

nly
0 replies
8h22m

Another intuition:

All single-digit repeating decimals are fractions with a denominator of 9.

E.g. 0.1111.... is 1/9

0.7777.... is 7/9

It therefore stands to reason that 0.99999.... is 9/9, which is 1
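That intuition checks out algebraically: if x = 0.ddd..., then 10x - x = d, so x = d/9. A Python sketch with exact rationals:

```python
from fractions import Fraction

# If x = 0.ddd..., then 10x - x = d.ddd... - 0.ddd... = d, so x = d/9.
for d in range(1, 10):
    x = Fraction(d, 9)
    assert 10 * x - x == d

assert Fraction(9, 9) == 1   # and so 0.999... = 9/9 = 1
```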

kaba0
5 replies
8h17m

It is false - real numbers fulfill the trichotomy property, which is precisely the lack of such a relationship: any two real numbers are either less than, equal to, or greater than each other.

But the numerical context can still be correct: (edit: ~~imaginary~~) complex numbers for example don’t have such a property.

fsckboy
2 replies
7h48m

i'm learning about this right here as I read, but do you mean complex numbers rather than imaginary?

kaba0
0 replies
7h42m

Yep, that’s what I meant, sorry!

TeMPOraL
0 replies
7h41m

Same question.

Or more generally, vectors. They don't have a total order, because if we define "less than"/"greater than" in terms of magnitude (length), then for any vector V (other than 0) there are infinitely many vectors that are not equal to V but whose length is equal to the length of V.

Is this what ≹ is talking about?

shemii
1 replies
7h44m

As I've written in another comment here, a great example of a number-y field which is not totally ordered is Games ⊂ Surreal Numbers ⊂ ℝ. There you have certain "numbers" which can be confused with (read: incomparable to) whole intervals of numbers. Games are really cool :)

SetTheorist
0 replies
2h29m

You've got the subset relationships backwards: Reals are a subset of the field of Surreal Numbers which is a subset of the group of Games. (Probably better phrased as embeddings, rather than subsets, but the point remains...)

Note that Games do _not_ form a field: there is no general multiplication operation between arbitrary games.

shaunregenbaum
3 replies
9h40m

I would imagine trying to compare a purely imaginary number (3i) to a real number (3) would suffice.

thaumasiotes
2 replies
9h40m

An imaginary number wouldn't obey the stated constraint of being real.

coldtea
1 replies
6h15m

No, but if the parent's question goes beyond "how can this happen with reals" to "how can this happen with numbers in general", this answers his question.

thaumasiotes
0 replies
2h29m

The very next example on the page is "imagine two complex numbers with the same magnitude and different angles". For that to answer the parent's question, you'd have to assume he stopped reading immediately after seeing the part he quoted.

The question is why the page says "imagine two real numbers that aren't comparable".

rustybolt
3 replies
9h37m

I think they mean the case where a and b are variables for which you don't know the values.

didntcheck
1 replies
7h42m

Yeah, that's how I understood it. E.g one might write

  (a+b)^2 != a^2 + b^2
To mean that in general the equality doesn't hold. Despite exceptions like a=b=0

Strictly you should write something like

  ¬[∀ a,b  (a+b)^2 != a^2 + b^2]
But shorthand and abuse of notation are hardly rare

didntcheck
0 replies
5h28m

Noticed a copy and paste error too late - the != in the second expression should of course be =

coldtea
0 replies
6h16m

No, that can be the case also for mathematical entities for which you can know the values, not just for "unknown variables".

salmonellaeater
2 replies
9h36m

That doesn't seem possible with the reals. An example from programming that comes to mind is NaN ≹ NaN (in some languages).

jampekka
0 replies
7h32m

Isn't this what happens with infs (in maths and in many programming languages)?

Edit: Not in many programming languages. In IEEE-754 inf == inf. In SymPy too oo == oo, although it's a bit controversial. Feels sketchy.

bruce343434
0 replies
8h40m

The floating point specification mandates that NaN does not compare to NaN in any way, so it should be all languages. If you want to detect NaN, use isnan()
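A quick check of that behavior in Python (which follows IEEE 754 here); the one nuance is that NaN != NaN actually comes out true:

```python
import math

nan = float("nan")
# Every ordered comparison and equality with NaN is False...
assert not (nan < nan) and not (nan > nan) and not (nan == nan)
# ...which makes inequality the one comparison that is True:
assert nan != nan
# The reliable test:
assert math.isnan(nan)
```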

kevindamm
2 replies
9h2m

For the reals it is only hypothetical; the domain has a total order.

∞ and ∞ + 1 comes to mind but I don't think it really counts

thaumasiotes
1 replies
8h27m

∞ and ∞ + 1 comes to mind but I don't think it really counts

That just depends on the numeric structure you're working with. In the extended reals, +inf is equal to +inf + 1.

In a structure with more infinite values than that, it would generally be less. But they wouldn't be incomparable; nothing says "comparable values" quite like the pair "x" and "x + 1".

kaba0
0 replies
8h13m

I guess it depends on the exact definitions, but the reals usually don't include the infinities. At my uni we introduced the infinities precisely as an extension of the reals, with two values defined by `lim`.

runiq
1 replies
7h34m

Hmm, what about +0 and -0?

RedNifre
0 replies
6h44m

+0 = -0
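True under IEEE 754: the two zeros compare equal, even though they are distinct bit patterns. A Python check:

```python
import math

assert 0.0 == -0.0                          # equal under comparison
assert math.copysign(1.0, 0.0) == 1.0       # but the sign bit differs
assert math.copysign(1.0, -0.0) == -1.0
assert str(-0.0) == "-0.0"                  # and it even prints differently
```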

omeid2
1 replies
9h15m

Think how a number on the imaginary axis of the complex plane isn't equal to a number of the same magnitude on the real axis.

Now if you really think about it, a number of a given magnitude on the x axis also isn't exactly "equal" to a number of the same magnitude on the y axis, or vice versa. Otherwise, -5 and 5 should be equal, because they're the same magnitude from 0.

bruce343434
0 replies
8h39m

But |5|=|-5| so I don't exactly see your point.

Edit: oh, I see what you mean. 1 is not larger or smaller than i, but it also doesn't equal i.

tpoacher
0 replies
43m

Probably does not apply for real numbers, but could totally apply to, e.g., fuzzy numbers, whose 'membership function' bleeds beyond the 'crisp' number into nearby numbers.

You could imagine two fuzzy numbers with the same 'crisp' number having different membership profiles, and thus not being "equal", while at the same time being definitely not less and not greater at the same time.

Having said that, this all depends on appropriate definitions for all those concepts. You could argue that having the same 'crisp' representation would make them 'equal' but not 'equivalent', if that was the definition you chose. So a lot of this comes down to how you define equality / comparisons in whichever domain you're dealing with.

thaumasiotes
0 replies
9h40m

It isn't. The real numbers are a totally ordered field. Any two real numbers are comparable to each other.

magicalhippo
0 replies
8h41m

Perhaps this: if they represent angles, then 1 and 361 represents the same absolute orientation, but they're not the same as 361 indicates you went one full revolution to get there.

Contrived, but only thing I could think of.

brynbryn
6 replies
5h53m

That article is misinterpreting the meaning of the symbol. It isn't useful in mathematics because it is a contradiction in terms: if "neither of the two compared entities is greater or lesser than the other" then they are equal.

The author of the original article uses it correctly - think about it more in regards to importance for their example.

The business is no more or less important than the developer, but they are NOT equal.

It doesn't have to mean importance though, just the method by which you are comparing things.

Monday ≹ Wednesday

Come to think of it, it should be called the 'No better than' operator.

p4bl0
1 replies
5h33m

if "neither of the two compared entities is greater or lesser than the other" then they are equal.

Not in a partial order.

For example in this simple lattice structure, where lines mark that their top end in greater than their bottom end:

      11
     /  \
    01  10
     \  /
      00
11 is > all the others (by transitivity for 00), 00 is < all the others (by transitivity for 11), but 01 is not comparable with 10: it is neither lesser nor greater given the described partial order.

You can actually see this kind of structure every day: unix file permissions, for example. Given a user and a file, the permissions of the user are an element of a lattice where the top element is rwx (or 111 in binary, or 7 in decimal, which means the user has all three permissions to read, write, and execute) and the bottom element is --- (or 000 in binary, or 0 in decimal, which means the user has no permissions). All other combinations of r, w, and x are possible, but not always comparable: r-x is neither greater nor lesser than rw- in the permissions lattice; it's just different.
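The incomparability of r-x and rw- is easy to demonstrate with bit masks (a Python sketch; `leq` is a hypothetical helper name):

```python
def leq(a, b):
    """a <= b in the permission lattice iff every bit set in a is set in b."""
    return (a & ~b) == 0

rwx, rw_, r_x, none = 0b111, 0b110, 0b101, 0b000

assert leq(none, r_x) and leq(r_x, rwx)          # chains are comparable
assert not leq(r_x, rw_) and not leq(rw_, r_x)   # r-x ≹ rw-: incomparable
```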

astrobe_
0 replies
4h20m

Yes, or for more familiar examples: coordinates and complex numbers. The "default" less-than and greater-than don't have any meaning for them; you have to define one, which may be "imperfect" (because one can't do better), hence the concept of partial order.

denotational
1 replies
5h45m

It isn't useful in mathematics because it is a contradiction in terms: if "neither of the two compared entities is greater or lesser than the other" then they are equal.

That’s only true for a total order; there are many interesting orders that do not have this property.

It holds for the usual ordering on N, Z, Q and R, but it doesn’t hold for more general partially ordered sets.

In general one has to prove that an order is total, and this is frequently non-trivial: Cantor-Schröder-Bernstein can be seen as a proof that the cardinal numbers have a total order.

adastra22
0 replies
5h38m

Example: alphabetic ordering in most languages with diacritics. For example, "ea" < "éz", but also "éa" < "ez". That's because e and é are treated the same as far as the ordering function is concerned, but they are obviously also NOT the same glyph.
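One common way collations get this behavior is by comparing accent-stripped keys at the primary level. A rough Python sketch of that idea (not a real locale collation; `sort_key` is a hypothetical helper):

```python
import unicodedata

def sort_key(s):
    """Primary-level key: strip combining accents, so e and é compare equal."""
    return "".join(c for c in unicodedata.normalize("NFD", s)
                   if not unicodedata.combining(c))

assert sort_key("ea") < sort_key("éz")   # "ea" < "éz"
assert sort_key("éa") < sort_key("ez")   # "éa" < "ez"
assert "é" != "e"                        # yet the glyphs are not equal
```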

jcparkyn
0 replies
5h27m

Is that really a contradiction? What about complex numbers?

Chirono
0 replies
5h42m

That’s only true for linearly ordered structures, but isn’t true for partially ordered ones.

For example, set inclusion. Two different sets can be neither greater than not smaller than each other. Sets ordered by inclusion form a partially ordered lattice.

shemii
5 replies
7h47m

Reminds me of the concept of games in combinatorial game theory. They are a superset of the surreal numbers (which are themselves a superset of the real numbers) in which the definition of the surreal numbers is loosened in a way that loses the property of being totally ordered. This creates games (read: weird numbers) which can be "confused with", or "fuzzy" with, other numbers. The simplest example is * (star), which is confused with 0, i.e. neither bigger nor smaller than it; it's a fuzzy cloud around zero (notated 0║*). More complex games called switches can be confused with bigger intervals of numbers and are considered "hot". By creating numbers from switches you can create even more interesting hot games.

Akronymus
2 replies
6h58m

Heres a relevant video on the topic: https://www.youtube.com/watch?v=ZYj4NkeGPdM

I really love that video.

shemii
0 replies
5h40m

This exact video made me read more into this topic; I'm currently reading Winning Ways and Lessons in Play simultaneously. It's quite fun! I've just gotten started and am looking forward to what's left.

__MatrixMan__
0 replies
2h55m

Thanks for sharing this. I just discovered it now and I love it too.

The space of possible abstractions for any given phenomenon is vast, yet we almost always just assume that real numbers will do the trick and then begrudgingly allow complex ones when that doesn't work. If we're not lucky we end up with the wrong tool for the job, and we haven't equipped people to continue the exploration. It's a bias with some pretty serious consequences (thanks... Newton?).

I don't think I've seen the inadequacy of number-systems-you've-heard-of demonstrated so clearly as it is done here.

cosmojg
1 replies
7h6m

By creating numbers from switches you can create even more interesting hot games.

Well, don't leave us hanging! What are some of your favorite hot games on top of switches?

shemii
0 replies
5h38m

I'm just starting to learn about all this stuff, but IIRC the game of Go is famously "hot". Also, I'll emphasize that when talking about "games", what is usually meant is a game position. What specific game you are playing isn't too important, as it can be shown that some positions in different games are equivalent.

red_trumpet
1 replies
7h0m

The example z_1 ≹ z_2 for complex numbers z_1, z_2 is weird. Imo it would be clearer to state |z_1| = |z_2|, that is both complex numbers have the same absolute value.

To conclude, the ≹ symbol plays a crucial role in providing a middle ground between the traditional relational operators.

As a PhD student in math, I have never seen it before. I do not believe that it plays any crucial role.

s-phi-nl
0 replies
4h20m

It sounds like the symbol "≹" just means "incomparable", which is a well-known concept in math. https://en.wikipedia.org/wiki/Comparability

This symbol for it may be useful, but it's the concept that matters.

dfee
1 replies
8h51m

I was so hoping I could win the day with 0 ≹ -0. But alas, 0 == -0 and 0 === -0.

Figs
0 replies
5h55m

NaN ≹ NaN

rollulus
0 replies
6h29m

Apples ≹ pears.

quietbritishjim
0 replies
5h17m

One example would be if you define one set A to be "less than" another B if A is a subset of B. Then ∅ < {0} and {0} < {0, 1} but {0} ≹ {1}.

Such a thing is called a partial ordering and a set of values with a partial ordering is called a partially ordered set or poset (pronounced Poe-set) for short.

https://en.wikipedia.org/wiki/Partially_ordered_set
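In Python, sets come with this partial order built in via the comparison operators, which makes the example easy to try:

```python
A, B = {0}, {1}
# Neither is a subset of the other, and they aren't equal: A ≹ B.
assert not (A <= B) and not (B <= A) and A != B
# But chains under inclusion are still comparable: ∅ < {0} < {0, 1}
assert set() < {0} < {0, 1}
```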

nine_k
0 replies
5h26m

I suppose that this glyph should result from a combination of emojis for apples and oranges.

matharmin
0 replies
5h35m

I find this concept is important in understanding causal ordering for distributed systems, for example in the context of CRDTs. For events generated on a single device, you always have a complete ordering. But if you generate events on two separate devices while offline, you can't say one came before the other, and end up with a ≹ relationship between the two. Or put differently, the events are considered concurrent.

So you can end up with a sequence "d > b > a" and "d > c > a", but "c ≹ b".

Defining how tie-breaking for those cases are deterministically performed is a big part of the problem that CRDTs solve.
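A minimal sketch of how this shows up with vector clocks (hypothetical device names; real CRDT implementations differ in detail):

```python
def happened_before(u, v):
    """u <= v iff every device's counter in u is <= its counter in v."""
    return all(u.get(k, 0) <= v.get(k, 0) for k in u.keys() | v.keys())

a = {"dev1": 1, "dev2": 1}   # common ancestor event
b = {"dev1": 2, "dev2": 1}   # next event on device 1
c = {"dev1": 1, "dev2": 2}   # next event on device 2, made while offline

assert happened_before(a, b) and happened_before(a, c)
# b ≹ c: neither happened before the other, i.e. they are concurrent.
assert not happened_before(b, c) and not happened_before(c, b)
```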

lewisjoe
0 replies
4h16m

The jargon from category theory for this phenomenon is - partial ordering.

It really is an interesting thing. In fact, as human beings who by nature think in terms of abstract, non-concrete units (as opposed to mathematically precise units like a computer program), we tend to compare two related things. They might belong to the same category of things, but they might not be eligible for direct comparison at all.

Once you internalize partial ordering, our brain gets a little more comfortable handling similar, yet incomparable analogies.

_nalply
26 replies
10h33m

Because we as programmers need to know how to program, learning trumps all. Thus:

    learning > biz > user > ops > maintainer > author
There was that bit

    dev > *
meaning resume-driven development, but in reality it is

    learning > *
And that's the conflict programmers experience in corporate structures - for example, the hassle of interviewing for a position. Employers know it too, but they try to shirk it.

ahoka
22 replies
10h29m

My gripe with most programmers is that they keep re-learning the same narrow set of skills while ignoring the rest of what actually makes someone an efficient expert.

sirsau
9 replies
10h16m

What do you think programmers ignore that would be beneficial?

codebolt
8 replies
10h9m

Their working domain. If you're a developer in financial services, learn finance. If you're a developer in oil exploration, learn seismology. Don't constrict yourself to only being a programmer who writes whatever code you're told to. Be a holistic problem solver.

rexpop
7 replies
9h56m

What's in it for me?

creshal
1 replies
9h45m

Usually, a promotion from "one of the coding peons" to "the software architect who tells coding peons what to do"

darkwater
0 replies
5h31m

Even if not an architect, it will make you a very senior and respected developer among the other smart people that can actually evaluate this. And if in your current company there are no such smart people, it will open the doors for you to find another company where those skills will be rewarded.

theshrike79
0 replies
8h43m

It's easier to write code when you understand the context. The customer requirements aren't always flawless and might make assumptions about things that you don't know.

During my career I've had to study:

  - electrical engineering (phase angles, current transformers etc)
  - finance (payroll and accounting)
  - building management (maintenance schedules etc)
  - mining (terminology, how explosives are set in underground and open mines)
I'm not an expert on any of those, but I know enough to be productive and know _why_ I'm doing the task I'm assigned.

rob74
0 replies
9h51m

Maybe not needing to rewrite your code several times after being told that it's not what the client wanted to have?

rexpop
0 replies
2h0m

People are down voting me, but I am really enjoying the answers.

damethos
0 replies
9h46m

To have more experiences and expand your horizon. You never know where it will take you next. Even if it does not translate to more money, it will definitely give you a sense of self-accomplishment and MAYBE make you wiser.

FridgeSeal
0 replies
9h48m

Understanding your users, and what you have to solve for, better than they can probably explain it to you. This makes both your and their lives easier.

bmitc
4 replies
10h1m

That would be nice, but people keep hiring based upon specific programming languages and other programming specific skills.

antupis
1 replies
8h26m

You usually don't want to work for companies that do resume matching.

bmitc
0 replies
7h36m

That is indeed true, but there is sometimes a mismatch in how a team hires versus how it actually works and performs. So sometimes, there are some missed opportunities to work in interesting places or fields just because their hiring process is uninspired.

ahoka
1 replies
6h36m

Out of my last five positions, only two required a programming language I’ve already used before professionally. You just have to know enough to pass the screening and no competent interviewer will reject you based on lack of experience if you are a good fit otherwise. Hence the other skills, like basic soft skills, leadership, systems and ops knowledge and office politics matter more in the long run.

bmitc
0 replies
6h13m

I mean, that's more information for companies than me. And yes, every job I've been made an offer for and accepted has been much, much looser on the exact languages or actually looked at my submitted code samples if they were interested in understanding if I could actually write code.

One of the worst examples was Amazon. They gave no indication ahead of time, but I was presented with a coding test with only a small subset of languages. My chosen language was F# because that's what I was most comfortable with but of course it was unsupported in their online tool. I solved a problem using discriminated unions, pattern matching, and pipelines. The interviewers were very confused, a bit antagonistic that I'd choose anything other than Java (even though no one told me ahead of time that there was a limited selection), and proceeded to show their lack of knowledge. For example, the interviewer kept bringing up a violation of the open-closed principle in my code. However, as I explained to them, that principle is only really applicable to object-oriented code. Since I was using a union data structure and pattern matching, that principle is not really applicable, as the functional approach is a transpose of sorts from the OOP approach. But I just got blank stares and an automatic deflation in the room of the interviewers communicating that they were ready for the interview to end. What was mind-blowing about it is that my resume lists nothing but functional oriented programming languages, with not a single example of Java, and I have plenty of code examples posted. But yet, I get asked some dumb question about reading a file using some terrible online coding platform.

CobrastanJorji
2 replies
9h57m

It's surprisingly easy to have one year of experience despite working for ten years.

collyw
1 replies
4h14m

That one year is clearly more important than the other nine.

TheCleric
0 replies
2h53m

Not necessarily. I think at some companies you can just fall into local maximums.

For example maybe you've spent 10 years building the same CRUD front-ends over and over. You're probably really good at that. And the companies that you worked for needed that skill. However, you'd be a lot more marketable if you had other skills that you could put to use at future jobs.

thomastjeffery
0 replies
9h39m

I have the inverse problem, and it's why I don't have a tech job.

luzojeda
0 replies
4h37m

Which would the rest be? Asking genuinely as a 4 YOE dev

f1shy
0 replies
10h21m

I could expand that though to a much larger set than programmers!

antupis
0 replies
8h27m

I think learning some narrow skills is fine, but the real issue comes when you don't deploy code to production, because production is the best place to learn the actual problems.

__mharrison__
1 replies
9h23m

This is why blindly turning folks loose with AI is scary. They don't know what they don't know.

repelsteeltje
0 replies
8h3m

... and the "learning" is left to the AI

collyw
0 replies
4h15m

Learning trumps all until it leads to resume-driven development, and a CRUD app needs to be using all the shiniest new technologies that no one is actually an expert in.

smt88
23 replies
10h40m

For many of us, running our code 1 billion times will cost less than a few minutes of a developer's time.

Hell, I could spend $200 for a month of server time on AWS and run a lot of my (web API) code 100 billion times.

Optimizing for human readers is always better until you're working on something that proves itself to be too slow to be economical anymore.

krab
8 replies
10h7m

In my experience, you need to care about latency. That affects user experience. It's quite hard to pay for better latency.

rezonant
4 replies
9h32m

That depends on the application and the use case, but good performance and good readability aren't mutually exclusive. Easy to read software might not always be the most performant, but it's far easier to improve performance in an easy to read codebase than it is to make a hard to read but performant codebase easier to read.

krab
3 replies
9h17m

Yeah, that's right. I just feel that latency is sometimes missing from the discussions re. developer efficiency vs paying more for compute.

josephg
1 replies
7h44m

Yep. There's a huge experiential difference between something happening instantly and it happening in 200ms or so. Especially when typing or playing video games.

krab
0 replies
7h21m

It's everywhere. Next day delivery vs in a week. Streaming vs DVD rental. CI jobs. Notification about relevant news for stock traders. Switching files in editor. Jump to definition in IDE. Typing and videogames you mention have a very tight latency budget.

If you don't fit in the budget for the specific task, your product features don't matter much.

collyw
0 replies
4h19m

I see far more examples of premature optimization (resulting in horrible code) than of performance problems.

smt88
0 replies
2h1m

I can often get better latency by throwing a few extra bucks at vertical scaling. Still cheaper than a few hours of dev time a month.

Like I said, it works until it doesn't, and then you do have to optimize for performance to some extent.

groestl
0 replies
10h5m

You do that by paying for better developers ;)

bee_rider
0 replies
2h6m

The universe puts a hard speed limit on latency, but will give you all the bandwidth you want. There’s something almost mystical about latency, we should be very prudent when spending it.

TeMPOraL
4 replies
7h33m

The problem with "costs less than developer's time" math is that, usually, it's not you who's paying. Your users are, often in nonobvious ways, such as through higher electricity bills, reduced lifespan[0], lost opportunities, increased frustration, and more frequent hardware upgrades.

(And most of your users don't have developer's salary, or developer's quality of life, so it hurts them that many times more.)

--

[0] - Yes, wasting someone's time reduces QALY.

gnuvince
2 replies
5h46m

I've been calling the thinking of the parent "trickle-down devonomics": that by making the code better for the user, the benefits will trickle down to the users. Obviously, as the name suggests, that never happens. Devs get the quality of life and users end up paying for our disregard.

BoppreH
1 replies
5h23m

Loved the analogy. Also tracks with the fact that devs hold all the power in the user x dev relationship.

that by making the code better for the user,

Did you mean the dev?

astrobe_
0 replies
4h12m

Users can "vote with their feet (or wallet)"... Sometimes.

Capricorn2481
0 replies
15m

This implies that if a developer works half as long on something, all of that money that would be spent on them is spread out amongst their users. Which makes absolutely no sense.

Abstractions MIGHT make your code slower. But there's a reason we're not using assembly: the minor efficiency hit on the software doesn't match up with the bugs, the salaries for the experts, the compilation errors, etc.

A VM is a pretty good tradeoff for users, not just devs.

mch82
1 replies
7h9m

It seems like the author agrees with you and picked a confusing title for the article. The article ends with a set of equations:

    user > ops > dev
    biz > ops > dev
    biz ≹ user
The conclusion seems to be that code exists in service to the end-user and the business. The last equation (≹) is a neat way of describing that both end-user and the business are equally important to the existence of the code, even though their needs aren’t the same.

Lutger
0 replies
5h14m

Neat summary. I think many developers experience the degree to which biz and user diverges as a source of problems: the more divergence, the more problems.

marcus_holmes
1 replies
8h20m

Also, for my compiled code (in Go), the code that I write is not the code that the compiler generates. I can write simple code that's easy for {{me in 3 months}} to read and let the compiler do the fancy stuff to make it fast.

kaba0
0 replies
6h12m

For what it’s worth, Go might not be the best example here as its compiler does very little (hence the fast compile times).

Some LLVM-based language would fit the bill better, like Rust, C, or C++ (the same is also true of the Intel compiler and GCC).

hk__2
1 replies
8h20m

From the article:

When I say “run” I don’t just mean executing a program; I mean operating it in production, with all that it entails: deploying, upgrading, observing, auditing, monitoring, fixing, decommissioning, etc
melagonster
0 replies
8h1m

So the OP uses this point to circle back and support the opinion that readability is more important.

The inference process in the article is interesting, but the title tempts us into debating a less related topic.

Thanks for your comments; they got me to finish reading.

bee_rider
0 replies
2h0m

It seems like a false trade-off in the first place. The point of writing readable, maintainable code is that your team will be able to edit it later. Including adding performance enhancements.

Another way of stating the relationship could be something like: You have fewer brain-cycles to apply to optimization than the combined sum of everyone who will ever read your code, if your code matters. But that is a mouthful and kind of negative.

Xeamek
0 replies
2h10m

1million users waiting even a 1 second longer is about a month of single developers time.

This sort of calculation you preach is inherently untrue, as it completely ignores that one second times a million. After all, nobody bothers to economically evaluate just a single second. But it does amount to much when multiplied by the number of users. And when we multiply again by the number of times a single user uses your software, and then again by the number of users using different software from other developers who also thought "it's only 1 second, nobody cares", we end up living in a world where software usability gets lower and lower despite hardware getting faster and faster.
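As a sanity check on that multiplication (assuming a 160-hour working month; the numbers are illustrative):

```python
# One extra second for a million users, expressed in developer working months.
users = 1_000_000
extra_seconds_each = 1
wasted_seconds = users * extra_seconds_each   # 1,000,000 s of human time

dev_month_seconds = 160 * 60 * 60             # one 160-hour working month
print(wasted_seconds / dev_month_seconds)     # ~1.7 working months, per incident
```

So a single wasted second per user already costs more aggregate human time than a month of one developer's work.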

We end up living in a world where literally weeks are wasted every day waiting for the slow Windows File Explorer. If you wanted to evaluate that honestly, you would probably come to the conclusion that Microsoft should have a dedicated team working for a decade on nothing but Explorer startup optimization, and it would still pay for itself.

But they don't. Because at the end of the day, this whole "let's evaluate the developer's time working on a given improvement" is just a cope and a justification for our laziness, one that only pretends to be an objective argument so we can make ourselves feel better.

Certhas
0 replies
7h20m

That's the response to the article I was expecting to read. This is a different article, though. Go ahead and read it, worth it!

nehal3m
16 replies
8h18m

I think the corollary to the title (to turn it around on the author) is not 'Code is read more than written' but 'code that can't be read won't run for long'. Disclaimer: Experienced sysadmin trying to make a lateral move to development and as such a complete noob.

TeMPOraL
5 replies
7h36m

Proprietary software you don't have the sources for (say, a third-party library), or just about any black-box system, is a counterexample to your corollary.

nehal3m
3 replies
6h54m

Well sort of, I can't read the binaries but I expect the vendor to have source and maintain it properly right?

layer8
1 replies
6h39m

There's a lot of software that only exists in binary and whose vendors/maintainers are long gone, and that is used for decades because it just does its job.

nehal3m
0 replies
6h37m

You’re right, the sibling to the comment I responded to mentioned this as well and it gave me flashbacks to emulating an Amiga system to get a multi million dollar industrial system up and running.

LtWorf
0 replies
4h16m

I've had a coworker commit a binary on git and keep the source on his machine.

I'm sure it has happened more than once.

rob74
0 replies
6h55m

Yeah, some code runs for so long that the system's owners have issues with repairing/replacing the hardware it can only run on if it fails (and then sometimes resort to emulation of the old hardware to be able to keep running it).

rafaelmn
4 replies
7h40m

code that can't be read won't run for long

There's plenty of ossified code people are scared to touch because they don't understand it, but stake their business on it :)

thfuran
0 replies
5h32m

And even executables for which the source code has been lost but which are still important to business operations.

couchand
0 replies
5h12m

One of my clients is convinced that a short hunk of JavaScript code that nobody can clearly describe the behavior of is their most valuable asset.

Non-coders are weird.

Lutger
0 replies
5h17m

Or just code that works and nobody wants or needs to spend money on changing. I've written such code, very crappy stuff. And then coming back to the client many, many years later, finding it all just humming along, driving their business critical applications without it or the computer it runs on ever having been updated once in 5 years or so. I was so surprised, and a bit scared.

Sometimes when you don't change anything, it just keeps working.

So I guess that makes it a very boring:

   code that can't be read won't be changed and will not be relevant for long, but not always

LtWorf
0 replies
4h18m

I (very briefly) worked in a startup whose business was search (on a certain domain) and they had no tests for their "Search.java" file (the real name was 300x longer, because java devs…).

I had found some operations to tweak the scoring, except that some were multiplications by one, so I removed them. But I got told to not touch them because they wouldn't know if I had broken it until some customer complained.

The CTO told me off for my completely irresponsible behaviour.

ric2b
1 replies
5h14m

The entire financial industry disagrees. Also can I interest you in coming out of retirement to explain your cobol code to other developers?

pards
0 replies
4h37m

Or to help us understand that mission-critical MS Excel 5.0 spreadsheet? We need to retire that Windows 95 machine under John's desk.

raziel2p
0 replies
5h18m

That would be nice if it were true :(

benreesman
0 replies
5h14m

Welcome to this side of the shop, I hope we can all make you feel welcome. :)

Bad news: too few experienced ops people became one less!

TheCleric
0 replies
2h58m

I think it can be run for as long as you have the right infrastructure.

I'd say it's more 'code that can't be read won't be modifiable for long'.

t43562
15 replies
5h56m

Some users are not using a system because they like it but because their company bought it.

In those situations biz > user by definition and the developers end up having to cater to the needs of the middle management of their customers rather than the needs of the actual users. The price of not doing this is failing to win the contract. Users then get locked into whatever crap you have time to provide for them while you're really busy implementing new features that middle management likes.

You essentially only need a nice-looking login screen and some sort of reporting; the rest... doesn't matter much.

I am being a bit cynical but it does pay as an engineer to know if that's fundamentally the kind of company you're in.

An online retailer, for example, is hypersensitive to its users and I know of at least one that has different versions of its website for different countries because they know that Germans like X and Americans like Y - small changes make a huge difference to sales.

Other companies have no sensitivity to the usability of their products because the people who buy their products are never the users.

4death4
3 replies
1h35m

I wouldn’t say you have to cater to middle management instead of the end user. You just can if you want to. Of course you need to consider what middle management needs, since they’re paying you, but there is usually room for craftsmanship to bring a truly great UX to the end user. Most software engineers are lazy and lack a true sense of craft, so they usually skip building a great UX when it’s not a requirement.

bearjaws
2 replies
55m

In enterprise software, you cater exclusively to management, they will turn a blind eye to 99% of issues as long as the team is able to get the work done.

Take one look at EMRs and realize that they're sold and marketed to 0.01% of the hospital, despite 80% of the health system using them.

sophacles
0 replies
35m

Catering to a mixture of management and users helps sales. Having a product that users (the reports) love will get your product internal advocates. Some managers are self-centered jerks, but most will at least consider their reports' advocacy if the product also has the features they need.

4death4
0 replies
50m

In enterprise software, you cater exclusively to management, they will turn a blind eye to 99% of issues as long as the team is able to get the work done.

Again, you can exclusively cater to management, but you don’t have to. Look at Datadog. It’s a great product, but still ultimately purchased by management.

doctorpangloss
2 replies
48m

This is a very narrow-minded take. Every big software success story - Gmail, Slack, Dropbox, Zoom… - was business-to-consumer, explicitly or in disguise.

Then again, I’m not saying much that vendors screw up pricing when they choose a price other than “$0,” which is an essential part of B2B-disguised-as-B2C software. Easy for me to say.

Anyway, the boomers you are talking about are retiring out of the workforce, and it will be more likely than ever that the audience will be extremely sensitive to UX, desiring something the most like TikTok and Instagram than ever.

dingnuts
0 replies
23m

Slack is a weird choice of example, and so is Zoom. They're both currently getting their lunches eaten by MS Teams for specifically the reasons laid out in the grandparent.

Slack in particular had to take a just-OK buyout from Salesforce and the product has seriously stagnated.

bruckie
0 replies
38m

I'm curious how you'd define "success story". From my POV, it seems like there are definitely some B2B successes, such as Oracle or Salesforce.

derangedHorse
1 replies
4h53m

Typically middle management are users as well, but they are a minority of the user base and use a different set of features (like reporting). So now this becomes a question of which users are prioritized and finding a balance between prioritizing the experience of the small number of users who hold power over the rest, and keeping the product usable enough for the rest of the users to provide some value in data to management.

pixl97
0 replies
2h9m

Or as I've told the devs in the place I work "Don't fuck up the data path to the reports, it is a cardinal sin".

bluGill
1 replies
2h52m

This is short term thinking - users who hate software can voice enough complaints to get things changed in at least some situations. (not all: many SAP programs with garbage UIs exist)

mbb70
0 replies
2h42m

The user of SAP you describe is not the 'user' in the sense of the article. The user is the one who pays, i.e. some other business.

SAP absolutely delights its users to the tune of a $200 billion market cap.

Gabrys1
1 replies
28m

I used to work at a company that sold SaaS to big corporations.

We needed to win contracts, so we needed to tick their checkboxes, but we also cared for the user experience (good UX was almost never a strict requirement from our customer).

Our competitors' software was very painful to use, so we wanted to differentiate in this regard.

This made our own lives easier: the training was easier, and the users we interacted with (who usually had no say in whether our solution or our competitors' was bought for them) were happier and, where they could, recommended buying more stuff from us to their managers.

In the end this was 80% driven by pride (our software doesn't suck) and empathy (I couldn't stand using the software if it was as bad as our competitors') but to some extent this was also in our interest (especially in the long term, where your brand is built).

SpaceNoodled
0 replies
9m

I wish more companies realized your last point.

manvillej
0 replies
7m

the developers end up having to cater to the needs of the middle management of their customers rather than the needs of the actual users

this is why all enterprise software sucks

Deprecate9151
0 replies
2h32m

I've experienced this. I worked for a company that sold software to municipal governments. All that mattered was the Mayor/Town Manager/City Councils opinion. If the reports looked good and the price was right, they were going to renew.

I remember being in site meetings where people who used it every day would tell us to our face how terrible it was. Without fail, that site renewed with some promises to fix a couple specific bugs and a minimal price increase.

gostsamo
8 replies
10h25m

So, the author hijacks a perfectly good rule of thumb to build their grand theory of everything. It is all nice, clean, and wise, and besides the tortured turn of phrase is just rechewing of popular truisms.

m463
3 replies
9h47m

so:

  theory > /dev/null

sltkr
0 replies
1h53m

    blog < /dev/random

collyw
0 replies
4h18m

pragmatism > theory

Peritract
0 replies
9h15m

Beautiful.

thomastjeffery
0 replies
9h43m

Did you read to the end? Everything else is backstory.

rob74
0 replies
9h55m

To say it more precisely, their grand theory of everything that can go wrong in software development. But interesting read nevertheless...

ngrilly
0 replies
5h54m

the tortured turn of phrase

Usual reminder that many people in our industry are not native speakers and don't live in an English speaking country, and yet they make the effort to write in English, which may explain the "tortured turn of phrase".

just rechewing of popular truisms

And yet these "popular truisms" are particularly well put together in a coherent way, which makes this post a useful reference.

gotski
0 replies
9h9m

Honestly, I think there's value both in riffing off rules of thumb and using that riff to revisit and re-contextualise things we already think we know.

Everything is new to someone, and even if this was just confirming my own biases I found it an interesting take.

YoshiRulz
8 replies
10h11m

There’s a lot of software being produced that just doesn’t care about its users [... caused by] a mismatch between what we thought doing a good job was and what a significant part of the industry considers profitable [... so] perhaps we should take a stronger ethical stand not to harm users.

Or, we could ditch the "biz" part. It just so happens that software not written to serve business interests tends to also respect users.

vasco
4 replies
10h8m

If we ditch the business part what's your recommendation to pay rent?

YoshiRulz
1 replies
2h27m

I happen to believe in subsidised/affordable housing (I assume a UBI is what cybrox meant by not paying rent; also cracking down on housing-as-investments[1] sounds good to me, a layman, though apparently it wouldn't be enough[2] in our case, supply just needs to increase)—but I'm sure that other, less drastic societal changes would also be sufficient. For example, medium to large businesses could fund any open-source projects they make direct use of[3]. With the likes of OpenCollective[4][5] and GitHub Sponsors[6], the only roadblocks to adopting such a policy are corporate red tape and corporate greed. (And if it leads to smaller SBOMs, well, that's just a bonus as far as I'm concerned.)

All that said, I was actually referring to individuals (who aren't necessarily developers) choosing which software to use. Over the last 20 years, we've all taken free downloads and sign-ups for granted, ignorant to that whole "if you're not paying, you're the product" thing, and a lot of us now have a $500+ supercomputer in our pockets which is to some extent owned by a tech giant and not its nominal owner. It's apparent that such centralised platforms have some intrinsic problems with regards to users' freedom. That's what I'm cautioning against—not the idea of selling your programming labour to a business, which is fine.

[1]: https://en.wikipedia.org/wiki/Australian_property_bubble#For... (This article is outdated. From a quick web search, it seems foreign real estate investment fell during the pandemic but has now picked up again.)

[2]: https://www.abc.net.au/news/2022-09-02/housing-property-aust... (1M unoccupied houses in a country of 26M, ~120k of whom don't have secure housing...)

[3]: https://stackoverflow.blog/2021/01/07/open-source-has-a-fund...

[4]: https://blog.opencollective.com/funds-for-open-source/

[5]: https://docs.opencollective.com/help/financial-contributors/...

[6]: https://docs.github.com/en/sponsors/sponsoring-open-source-c...

vasco
0 replies
2h0m

I thought you'd have something practical in mind like specific industries or working for non-profits or something that would work today, not some theories of how the world could work. I can't go tell my landlord he should believe in subsidised housing like you do unfortunately.

layer8
0 replies
6h33m

It's possible to get paid for writing software that secretly doesn't solely serve business interests but also, and even primarily, respects users.

cybrox
0 replies
8h25m

Just ditch the paying rent part.

wruza
1 replies
6h0m

To ditch the biz we have to make dev and ops accessible to ourselves. Nobody does that. Making a proper login form still requires a team and a week.

The biz sits there rubbing its hands, watching us make the means of production more and more complex and expensive to use, so that it has a complete monopoly on creating software.

Devs are so used to relatively big paychecks from the biz that they unconsciously tend to ignore these issues.

It’s not implementing dark patterns and cookie popups that makes you on the biz > * side. Tolerating the complexity of software development is. “Biz > complexity > *”. Push the complexity to the right as much as possible to ditch the biz.

Due to developer deformation it should be explicitly stated what the complexity is:

- Pointless “constant evolution” (version.major++ in less than 5-7 years, previous versions abandoned),

- Embracing epoch-breaking changes (2/3, ESM)

- Low-level as the norm (scaffold, config, listen, route, ddl, connect, hash, jwt, css, setstate, useeffect, fetch, … in place of business logic)

YoshiRulz
0 replies
2h7m

It's interesting to hear you say that reducing complexity and building to last(?) is the solution. Do you know of any case studies (not even peer-reviewed necessarily, just examples) showing a "recovery" in terms of user freedom after making such changes? My view has always been that complexity should be reduced, but because it makes maintenance easier—reducing cost if you're paying maintainers—and can prevent bugs/vulns. Only tangentially related to privacy, via the latter.

Also, I don't understand your last point. Are they all React builtins or something? If you're suggesting that the "shape" of an app's navigation, or of network or system calls, etc. should be how business logic is made concrete, I'd have to disagree. That sounds like microservices but with worse intrinsic documentation (the purest documentation there is).

cousin_it
0 replies
1h15m

I think it might be enough to tax all electronic ads per viewer-second. It's what directly or indirectly drives most mistreatment of users. Such a tax could be pretty popular too.

bee_rider
5 replies
2h14m

Businesses don’t really exist, they are an imaginary construct that we’ve come up with to help organize resources, ultimately in the interests of working together.

Business isn’t more important than anything. There are multiple users, sometimes with competing interests; you can’t be everywhere and everything, so you have to prioritize. Going after more profitable users or users that align with some long-term strategy could be seen as “good for the business,” but really the goal is to serve the users (it might just take a couple extra steps).

When the internal politics get confused to the point that people are making decisions just to further the interests of the business without figuring out how it leads to user happiness, the organization has become poisonous. It shouldn’t exist anymore. It might lurch on in a zombie state for quite some time. But it is on the decline, and all the good people will leave.

fritzo
1 replies
1h9m

Let's charitably interpret 'business' here as "a sustainable funding model that can support maintenance, support, and future development". Without a business model, even great user-pleasing, deployable, maintainable software can fizzle out.

bee_rider
0 replies
1h2m

It is a useful imaginary construct. It captures the idea that your users, if you are doing something useful, have an interest in you continuing to be able to do it.

IMO the reason the article had bring up this obscure !>< operator is because it treated this particular set of user interests as somehow separate from the users. The reason it is hard to rank business vs user interest is because business \in user.

charles_f
1 replies
1h22m

I had the same initial reaction. Seeing something written that can be amounted to $ > person looks wrong.

Yet, importance is subjective. If you're working on your own pet code for your own pleasure, business has no importance. If you want to transform that into your main revenue source, business is the most important thing, because no amount of user love will transform into practical revenue if your software serves no-one.

bee_rider
0 replies
1h8m

If your project is useful to users and requires money to continue, it is in their interest that you make money. The business is just an abstraction around the fact that your users need you to be able to do the thing.

franga2000
0 replies
51m

This is simply not true. Businesses exist as legal constructs, and there are many things that are good for the business but bad for almost everyone else. Businesses also do not exist to serve users, whatever definition of user you might have.

Businesses, unfortunately, exist to serve their owners. In most cases (I'm talking primarily about larger companies, not <5-person micro-businesses) the owners want money, so everyone at the company is there in order to get the owner more money. The happiness of anyone else, let alone the users, is entirely irrelevant, unless it happens to correlate with revenue. The only other universal incentive inside a company is self-preservation, so besides doing whatever makes money, decision-makers will also take their own job security into account when making decisions.

Employees won't leave, because the company will make sure they're happy enough. It's surprisingly easy to keep people working for evil and/or faceless organisations if you pay them well, make them feel like part of a "community", etc. (see any FAANG office for an in-depth catalogue of these HR tricks)

I agree that this is "poisonous" and that such a company "shouldn't exist anymore", but this is how companies work in practice. This is not a sign of any kind of decline, but of a mature and healthy business that can keep going for decades. Execs change, products change, even owners change, but the business remains.

jeffparsons
3 replies
9h20m

Hey, author, if you end up reading this thread: put the conclusion on a sticker. I'll put one on my laptop and give a bunch to friends.

For anyone that didn't make it to the end, the conclusion was:

    user > ops > dev
    biz > ops > dev
    biz ≹ user

karmakaze
2 replies
9h15m

In the long run user > biz, otherwise users will be using a product of a different biz via the enshittification cycle.

layer8
0 replies
6h16m

There are enshittified monopolies and oligopolies, with lots of moats.

jeffparsons
0 replies
8h20m

Assuming a non-shit business, you could think of "biz" as "customers integrated over the long term". Therefore putting "users" and "biz" as incomparable could be interpreted as prioritising users, sustainably.

At least that's how I interpret it.

gary_0
3 replies
10h6m

"biz > user" is valid on paper because someone has to pay for the party; there are a finite number of software devs and capitalism is great at allocating scarce resources where they're most needed. Or at least the "spherical cows" version of free-market capitalism is great at that.

In that version of capitalism, if your OS spends too much time putting spam in the Start Menu and not enough on fixing bugs, you can just switch to that other OS with full support from hardware vendors, that runs all your closed-source legacy software. If your favorite bird-themed social media site starts doing boneheaded stuff, you can just switch to that other site that has all your friends and data and lots of posts from experts and celebrities. If your search results are full of spam, you can switch to that other search engine with 10,000 highly paid engineers and 20 years of experience in Web search and integration with all your devices.

And all the businesses you moved away from would have to clean up their act or go out of business. If only those cows were perfectly spherical.

thegrimmest
2 replies
8h10m

Just because this doesn’t happen instantaneously doesn’t mean it’s a fantasy. Large changes in an ecosystem always take time. It may be years before the incumbents are truly displaced, but that doesn’t mean they won’t be. MySpace and RIM also seemed like they’d be there forever.

layer8
0 replies
6h20m

You'd have to explain your optimism of things eventually getting better, given that they've largely become worse over the past 10-15 years. Maybe it'll just be meandering around a mediocre-at-best mean level long-term.

gary_0
0 replies
7h53m

"In the long run, we're all dead."

danielovichdk
3 replies
10h40m

That's not the point. The point is that a computer is not involved in the creative and qualitative process; it does not care. Humans are different, as you might all know.

So of course code should be readable over being runnable.

murkt
2 replies
10h38m

You haven’t read the article past the headline, have you? It’s not about computers running code, it’s all about humans.

danielovichdk
1 replies
10h10m

"It doesn’t matter how well written or maintainable the code is, nor how sophisticated the technology it uses if it doesn’t fulfill its purpose and provides a good experience to the user"

This is what I was commenting on. And I do not agree with it because it underlines the dogma about how little craft quality means as long as someone finds it useful.

You could also hammer 3 wooden boards together and call it a chair.

But that's not for me, it's simply too shallow a mindset for being a professional software engineer.

johnnyanmac
0 replies
8h49m

I do not agree with it because it underlines the dogma about how little craft quality means as long as someone finds it useful.

YMMV vastly based on industry (or, in the lens of this post, the demands and requirements on the user end to satisfy). For media, it's fine; very few users are going to suffer over a few minor bugs or a suboptimal system. For aerospace, it's fatal thinking for obvious reasons (with, sadly, many real-world examples to point to). Both are valid career aspects and both provide value.

it's simply too shallow a mindset for being a professional software engineer.

sounds like you fall under "the right thing", as the author calls it. A sadly dying aspect as elephants and late stage capitalism (again, as the author named them) run wild.

But I'd argue this is just fine for an engineer. You're thinking more like a professional computer scientist. Scientists observe and model the nature around them. Engineers apply those models while taking into account the imperfections of nature and, yes, humans. In my eyes, an engineer who can't adjust to the needs of the product (be it for biz, users, or someone that is not them) isn't an engineer, but a scientist. Or a craftsman, as you referred to earlier.

Again, not a slight nor compliment, just different roles with different philosophies.

bruce511
3 replies
12h12m

Very nicely explained, as I kept reading it covered all my experiential "yes, but" thoughts.

I will add that knowing all this is helpful, but implementing it when you are early in your career is hard.

For example, it's good that business is the big priority, but when you have no experience of what is good or bad in business, it can be hard to understand the ramifications of decisions made now.

Equally, business priorities should win, but your goals and the business goals may not be aligned. An individual may need to resume-pad (learn that new framework) while the business may want homogeneity (everything built on one framework.)

Finally, of course, this refers to commercial software. Open Source software is the exact opposite flow, dev > maintainer > user > business. Which ultimately explains why it's so hard to get funding for OSS.

goodpoint
1 replies
3h6m

Open Source software is the exact opposite flow, dev > maintainer > user > business

That's a pretty sad assumption.

"dev > user" is why a lot of projects have very poor usability.

bruce511
0 replies
2h45m

Obviously it's not true for all projects. But it's true for most of them, no?

082349872349872
0 replies
11h6m

Then again, the original ideal for OSS is that (just like owning a home is having the tenant and the landlord be the same person) devs and users are the same, so:

(dev = ops = user) ≹ (biz).

azangru
3 replies
9h5m

I think what I am seeing in this article is a curious mixture of should and does. For example, the "user > dev" formula is a clear example of "should"; but when he gets to "biz > user", he surreptitiously switches to "does". He explains the biz > user by pointing out stakeholders, and investors, and personal interests, and politics, and all other sorts of crap that, in the real world, puts biz before the user. Very understandable. But should it? And why isn't the same explanatory method applied to the "dev > *" formula? After all, there are clearly very strong personal interests involved in that too.

C0mm0nS3ns3
2 replies
6h22m

But should it?

How would it work if it weren't? He explains what he means by that. If you spend time and resources on all the things the users require and you run out of money and go out of business, everybody loses.

Of course you can take any of the "rules" and take them to an extreme where they become wrong. But I think if you don't push them to their breaking point they are good rules of thumb :)

azangru
1 replies
2h55m

How would it work if it weren't? He explains what he means by that. If you spend time and resources on all the things the users require and you run out of money and go out of business, everybody loses.

I think there is also a bit of a conflation of values vs ability. The formulas in the article represent values. Real life adds constraints based on what's possible.

Consider his formula for dev vs user: "user > dev". You could argue that, just as a business is constrained by time and money, so is the developer constrained by time and skills. And yet, the author is happy with turning the greater than sign towards the user in this formula. Why?

filleduchaos
0 replies
17m

It's the same thing. If you bias your time and resources towards the things the dev requires and/or would like, and nobody actually wants to use the resultant software, you equally crash and burn and everyone loses.

It's a priority ranking, not a zero-sum game.

Alifatisk
3 replies
5h42m

Code is run more than read. Code is read more than written. Code is written more than ...?

tomovo
1 replies
5h24m

...correct?

Alifatisk
0 replies
2h28m

"tested" feels more suitable

collyw
0 replies
4h5m

tested

xpe
2 replies
4h20m

I'm glad to see something approaching ethics discussed:

There’s a mismatch between what we thought doing a good job was and what a significant part of the industry considers profitable, and I think that explains the increasing discomfort of many software professionals.

"Discomfort" is quite the understatement. This leaves so much unsaid.

I will add some questions:

- What happens when your users are not your customers (the ones that pay)?

- Does your business have any ethical obligation to your users -- all of them -- even the ones that do not pay?

- What happens when your paying customers seek to use your business in ways that have negative downstream effects for your users?

For example, what if:

- Your platform makes fraud easier than the existing alternatives?

- Your platform makes it easier to misinform people in comparison to the alternatives?

- Your platform makes it easier to shape user opinions in ways that are attractive (habit-forming) but destructive in the long-term?

All of these have proven to be successful business models, over some time scales!

Given the reality of the dynamic, should a business pursue such an exploitative model? If so, can it do so more or less responsibly? Can a more ethical version of the business mitigate the worst tendencies of competitors? Or will it tend to become part of the problem?

A key take-away is clear: some classes of problems are bigger and more important than the business model. There are classes of problems that can be framed as: what are the norms and rules we need _such that_ businesses operate in some realm of sensibility?

Finally, I want to make this point crystal clear: a business inherently conveys a set of values: this is unavoidable. There is no escaping it. Even if a business merely takes the stance of 'popularity wins', that is in itself a choice that has deep implications on values. Political scientists and historians have known for years about the problems called 'tyranny of the majority'. Food for thought, no matter what your political philosophy.

I don't know 'the best' set of ethics, but I know that some are better than others. And I hope/expect that we'll continue to refine our ethics rather than leave them unexamined.

[Updates/edit complete as of 8:03 AM eastern time]

ebiester
1 replies
3h51m

I believe this is a different problem than the essay is speaking about. You can choose what problems and domains fit your ethics. This is about how you build a system and how you prioritize the work.

xpe
0 replies
6m

I believe this is a different problem than the essay is speaking about.

Hardly. I'll quote the last paragraph and the three inequalities:

There’s a mismatch between what we thought doing a good job was and what a significant part of the industry considers profitable, and I think that explains the increasing discomfort of many software professionals. And while we can’t just go back to ignoring the economic realities of our discipline, perhaps we should take a stronger ethical stand not to harm users. Acknowledging that the user may not always come before the business, but that the business shouldn’t unconditionally come first, either:

    user > ops > dev
    biz > ops > dev
    biz ≹ user
First, I want to emphasize "perhaps we should take a stronger ethical stand not to harm users". The author did a nice job of not "throwing it in our faces", but the underlying ethical currents are indeed there.

Second, "the user may not always come before the business, but that the business shouldn’t unconditionally come first". This is very much aligned with my question "what are the norms and rules we need _such that_ businesses operate in some realm of sensibility?"

I'll add one more thing. In much of the software world there is a mentality of "We'll figure out Problem X (such as a particular problem of scaling) if we get to that point." I'll make this claim: naively deferring any such problems that pertain to ethics is fraught. Of course there are practical considerations and people are not angels! For precisely these reasons, ethics must be something we study and put into practice before other constraints start to lock in a suboptimal path.

ofey404
1 replies
9h44m

A good article with a misleading name...

thomastjeffery
0 replies
9h34m

Then again, it doesn't mislead very far.

j16sdiz
1 replies
8h4m

I can’t make a Google search without getting back a pile of garbage.

I think SEO firms / spammers share a lot of that guilt. Not sure how to attribute it correctly.

xiphias2
0 replies
5h49m

It's quite easy: Google giving better results than the minimum needed to make sure users don't switch is detrimental to its ads revenue.

If you look at how many people work on search at Google, it's just a small part at this point; a vast number of people at Google work on different ways of monetizing it.

hasoleju
1 replies
9h10m

This article has opened up a new perspective for me. It articulates very well that software is a means to an end.

It's a craft that can be used to solve a problem.

In my past I often emphasized the craft part too much, as if writing perfect code were all you need to do in order to be successful. The really important stuff is understanding the problem you want to solve and being sure that software is the tool to solve this particular problem.

cybrox
0 replies
8h25m

The person understanding the problem to solve and the person crafting the solution don't necessarily need to be the same person, though.

They can, in fact, be two people with completely different skill sets and if one of them ("you") can "only" write perfectly beautiful code, they can still succeed by relying on the other for breaking down the particular problem.

bogrollben
1 replies
4h5m

Nice article.

I'd like to point out that a mature/legacy codebase contains every example listed under the "smell" section of the article, sometimes even within the same file. This creates a great deal of complexity to unwind beyond the mere coding of it all.

nikanj
0 replies
3h59m

And an enterprising code keener will insist on rewriting that legacy codebase to eliminate all "smells", replacing battle-tested, bug-free-but-bad-smelling code with fresh, elegant, untested and nice code.

Every generation relearns this the hard way, this blog post is 23 years old today and still 100% on the money https://www.joelonsoftware.com/2000/04/06/things-you-should-...

bmitc
1 replies
10h3m

Software should have a purpose, it’s supposed to provide a service to some user. It doesn’t matter how well written or maintainable the code is, nor how sophisticated the technology it uses if it doesn’t fulfill its purpose and provides a good experience to the user:

user > maintainer > author

This is completely wrong. To see why, replace the > signs with = and work backwards from there.

falserum
0 replies
10h0m

Please elaborate.

vinay_ys
0 replies
9h35m

The author's framing can be misunderstood in so many ways that it is not a useful shorthand at all. There can be no absolute rank order of these tokens.

Firstly, in this framing, the "dev" is not one person but it is a collective for lots of people with varied expertise and seniority levels in different orgs – product, engineering and design orgs.

Then, "ops" is again not one thing and not just engineering ops. It could be bizops, customer support etc. too.

Then, "biz" isn't one thing either. There's branding/marketing/sales/legal etc. and execteam/board/regulators/lenders/investors etc.

All of these people affect what code is written and how it is written and how and when it is shipped to users. Everyone should be solving the same "problem".

A lot of the times, a lot of people within the org are just there to make sure that everyone understands/sees the same "problem" and is working towards the same goals.

But that understanding is continuously evolving. And there is lag in propagation of it throughout the org. And hence, there is lag in everyone working towards the same goal – while goal itself is being evolved.

Finally, "user" is not one thing either nor any one cohort of users are static. There are many different cohorts of users and these cohorts don't necessarily have long-term stable behaviors.

So, it helps to understand and acknowledge how all the variables are changing around you and make sense of the imperfect broken world around you with that context. Otherwise it is very easy to say everyone else sucks and everything is broken and you want to restart building everything from scratch and fall into that and other well-known pitfalls.

timeon
0 replies
4h1m

Just a side note: the article is written in the context of business. But sometimes there is none, as in the public sector.

Nevertheless, the article covers this context in the end as well.

solatic
0 replies
10h29m

And while we can’t just go back to ignoring the economic realities of our discipline, perhaps we should take a stronger ethical stand not to harm users. Acknowledging that the user may not always come before the business, but that the business shouldn’t unconditionally come first, either.

This is why software engineering needs licensure. Not to check if people understand design patterns, or whether people know the fad library of the day, but to set clear ethical standards and hold people to those standards by disbarring them if they violate those standards.

Or more to the point:

  society > biz > user > ops > dev

rtpg
0 replies
7h20m

Kind of funny, but I have seen code that is indeed read more than it is run, at least for the first year or two of its existence. Code that is used for generating batch reports for some billing process would run once a month. But I had to spend a lot of time staring at it, wondering what is going on, and fixing things up as reports came in.

One could say that the test runs were run more than the code was read (of course), but in production? Definitely the same order of magnitude for a long time.

rexpop
0 replies
9h39m

To avoid violating my egalitarian principles, I choose to read ">" as "depends on."

redhale
0 replies
5h36m

I'm not sure I've ever read an article where so many times I thought "ok fine, but ..." only to have the next paragraph address that exact point each and every time. The ending point especially.

As others here have pedantically pointed out (as HN is wont to do), there are ways to misinterpret or muddle some of the phrasing used here. But I think that's true of any reasonably simple (and therefore broadly useful) mental model. To me, this is a useful way of framing these ideas at a high level.

rafaelmn
0 replies
7h41m

You have to have something worth running first.

quickthrower2
0 replies
10h44m

I was sceptical but you know what, I love this mental model.

Don’t follow it blindly of course, there are exceptions where dev > biz (see OpenAI debacle) and where dev > ops (early stage startup, move fast, in particular dev > ops because biz)

pjs_
0 replies
35m

Yeah man I press the gas more time than the brake but I still want the fuckin brakes to work...

otikik
0 replies
6h33m

I expected to be very defensive when reading the title but I actually fully agree with the article, thank you for sharing

low_tech_punk
0 replies
8h30m

I know this is considered wrong in so many ways, but my personal job satisfaction depends on dev > *, and in an ideal world, with open-source and end-user programming, we get user == dev > *

lhnz
0 replies
5h23m

The article is good on the whole but the title "Code is run more than read" seems false in many situations I've been in. A lot of software doesn't have users. Often the target users don't like the product enough and are just trialing it for you as a favour.

In these cases, it's not very meaningful to say it's being "run more than read", because either you should have made the code readable enough for other developers to delete it, modify it or rewrite it until users want to use it, or the code has to be valueless and able to be completely thrown away (the knowledge of how to build it is either written down elsewhere or tacit).

I also agree with what somebody else said about the times in which the features/fixes you create aren't seen by users but just benefit a business-to-business sales process. In these cases, you behave differently. You do the work because there's a very clear signal: your company meeting their expectations promptly.

I guess one way I would agree with the article is that once something has users/businesses and is somewhat readable you apply your extra effort on polishing the user experience as prioritising the happiness of your core users is generally a good idea.

kuldeepfouzdar
0 replies
9h56m

Open source [not-for-profit] software says: Hello?

kazinator
0 replies
8h55m

Code being run more than read means this: code is read more by machine than by person. Users don't read code much, on the other hand. Thus:

   machine > maintainer > author > user
QED

hamdouni
0 replies
9h22m

biz > user

This one sounds like there is only the HN reality of dev: startups, VC, growth hacking, free until it's not, client=product, burn every $ to reach monopoly...

But there is another one, with small businesses, craftsmanship, get what you pay for, client=client...

gwbas1c
0 replies
15m

maintainer > author

usually a good investment to make the code maintainable by keeping it simple, writing tests and documentation

I recently inherited a project where the leadership 100% believed this and tried to do it.

The problem is that the copious junior developers they hired, with all of their good intentions, just couldn't write maintainable code.

gwbas1c
0 replies
17m

There’s a lot of software being produced that just doesn’t care about its users, or that manipulates them, or that turns them into the product.

There’s a mismatch between what we thought doing a good job was and what a significant part of the industry considers profitable

Feels like one of the main plot points of Tron. (1980s Disney movie where programming was a major plot point.)

gloosx
0 replies
8h1m

Quality article – over years of developing software I've got the bingo: I've heard and gotten used to every smell as long as there was air to breathe.

I'm always at the end of this chain, doing the dirty work :)

emrahcom
0 replies
6h10m

Needed to define customers and users separately

ekaramese
0 replies
10h16m

It is filled with wisdom. I cannot imagine the most genius programmers trying to sell lingerie to people at Google, Facebook, etc. purpose > *

e____g
0 replies
9h50m

Counterpoint: machine code may be run often, but source code is run never.

drewcoo
0 replies
10h45m

It is not about the comparative values of humans.

It is about costs.

The reason we write readable code is because we spend more time reading it than writing it. Engineer time. Which is money.

Yes, other things are money considerations, too. Considerations the advice to write readable code is not meant to address.

devjab
0 replies
10h59m

I don’t think I agree with the sentiment that user doesn’t equal business, to some degree maybe, but a business is what its users produce. I buy the point that some users within a business can have a demand for processes that will not be good for the business because they may be too focused on their specific department, but as a whole I think they are largely the same thing. Happy users work better, better work is good for the business. Aside from that, I think a lot of the “bad” decisions are unavoidable as businesses grow, so too with the bureaucratic demands that are often not producing value, but it’ll be very hard for the “IT department” to do anything about it because it lives as a “service” department similar to HR, but less understood by management. Cost-centres never get political power to match money makers.

I do think it's a very interesting article, perhaps with a bit of a baiting headline. But I think it would be even better if it included the perspective of the modern CI/CD pipeline. What I mean by that is how we're building more and more automation into it, things like RenovateBot that will automate certain parts of your maintenance processes. Things that won't work automatically if you don't write clean, testable code. If you focus on delivering business value, spaghetti be damned, you're going to end up spending so much time maintaining it that you're eventually not going to be capable of doing what you set out to do. We used to live in a world where this was sometimes ok, because you'd clean it up once the value was created, but with how much you can gain from a highly automated pipeline, I just don't think that paradigm is true anymore. Because not only will you have to clean things up, you're also going to miss out on so much efficiency by not being capable of using these automated tools. You can make the argument that you don't need to update things as often as those tools help you to, and 5 years ago you would have been correct. In many places you may even still be correct, but in large organisations in the EU this is just no longer the case because of the added legislative bureaucracy that IT security needs to play into.

btown
0 replies
1h1m

From the OP:

When I say “run” I don’t just mean executing a program; I mean operating it in production, with all that it entails: deploying, upgrading, observing, auditing, monitoring, fixing, decommissioning, etc. As Dan McKinley puts it: "It is basically always the case that the long-term costs of keeping a system working reliably vastly exceed any inconveniences you encounter while building it."

One of the things separating a great developer from a good one, IMO, is the ability to treat the documented API boundary of an open-source library as something that isn't sacrosanct and impenetrable.

How comfortable are you with command-clicking into, or outright cloning and reading, a library's source to understand the nuance that the documentation may not convey (or cases where the documentation is outright wrong)? Depending on your language's support for it, have you monkey-patched issues, or forked a library (and ideally submitted patches upstream) to add extension points you might need? Are you used to setting breakpoints inside of libraries, not just your own code, so you can understand how data is represented internally when debugging?

And when you're evaluating whether to use a library in the first place, do you just look at the API and examples, or do you read the source to understand whether the underlying code prioritized maintainability, extensibility, and test coverage? Do you read changelogs and think about whether the library maintainers prioritize your ability to upgrade without breaking even your patched/forked code?

The brilliance of open source is that we can do this. We're not just fitting gears together - we're able to fabricate iterations on those gears as we need to.

bcrosby95
0 replies
1h40m

Imaginary software

Isn't this vaporware?

atoav
0 replies
7h55m

Also cars are driven more than repaired. Highways are used more than they need to be maintained. Yet if you don't include some thought to how to deal with a situation where something breaks, this could (and very likely: will) eventually come back to haunt you.

But code is not the same as your car. If money is not an issue and your car breaks you could just get another one that drives equally well. Try doing that if your code breaks.

So that values like readability, maintainability, etc. are important for good code is not in question; what remains an open question is (on a per-file basis) how to prioritize between readability and performance if such a need arises.

And that isn't a small "if": One should guard themselves against the simple thought that these are inevitable tradeoffs that are mutually exclusive. You can easily make code that is both unreadable and doesn't run well. You can also make extremely readable code that runs nearly perfectly. The latter is of course harder than writing the same code to just run well, but if you are not coding alone or your project is beyond a certain scale it might be well worth the extra thought.

The solution space for "clear structure" and "good performance" is smaller than for either of those alone, so finding a good solution here is a bigger challenge, but depending on what you plan to do it could be worth a try.

asylteltine
0 replies
2h51m

Rolled my eyes at the capitalism comment. Was that really necessary?

Idealism > logic

antoineMoPa
0 replies
10h51m

I'm not sure I agree with everything, but I love the mental models this introduces.

amai
0 replies
5h2m

I like the clean design of this blog. It is very readable on my smartphone.

a_c
0 replies
4h10m

I think the message is very profound. It is one of those messages like "eat your veg, exercise, sleep well, love your family". On the surface one would respond "yeah, who doesn't get that?" Who doesn't know software is only as good as its use? Oh wait, but no, what about knowledge, wealth, career, fame, status, entertainment and so much more? What about architecture, maintainability, scalability, trendiness, developer experience, team management, OKRs, career prospects, and others?

A punch is a punch. Software is about getting used. (I swapped user with used, IME user can mean different things to different people)

WolfOliver
0 replies
4h8m

Nice article. I always had the impression that we value DX a little too much. I mean, developers are paid to do the job, but users are not.

Retr0id
0 replies
2h5m

Code is a means to an end. Software should have a purpose, it’s supposed to provide a service to some user.

I will accept this as a premise for the sake of argument, but it's certainly not a universal truth.

Nevermark
0 replies
2h39m

For enterprise software:

Customer > User ??

HPsquared
0 replies
4h4m

CPUs are lower in the hierarchy than developers.

BlackCherry
0 replies
9h58m

This article codifies general power structures within organizations – who trumps whom when making software decisions in an org – and sells that status quo as some sort of useful rule of thumb, when in fact it has no utility at all.