It's a bit scary to see that one of the highest-voted answers to this question (188 points) is completely wrong. It says that the (0,0) hotspot simplified the calculations for a cursor position update, because you didn't have to add any (X,Y) offset.
https://ux.stackexchange.com/a/52349/43259
The problem with this idea is that the arrow pointer was never the only cursor. On the first Macintosh, there were many others including the text I-beam and a couple of kinds of crosshairs. And you could define any cursor of your own by providing a bitmap and transparency mask and the hotspot position.
You can see some of these cursors in the original Inside Macintosh Volume I and also in previous works from PARC.
https://web.archive.org/web/20230114223619/https://vintageap...
Page 50 of the PDF (page I-38 of the document) shows some sample cursors.
Page 158 of the PDF (page I-146 of the document) has the pixel detail and hotspot locations for several cursors.
Fun fact! The hotspot for the arrow cursor was not (0,0) but was (1,1).
Can anyone explain why? I think I used to know, but it has long since escaped my memory and I would appreciate a refresher.
This page also has the definition of the Cursor structure:
TYPE Bits16 = ARRAY [0..15] OF INTEGER;
     Cursor = RECORD
                data:    Bits16;  {cursor image}
                mask:    Bits16;  {cursor mask}
                hotSpot: Point;   {point aligned with mouse}
              END;
Point is defined on page I-139 and is more or less what you would expect, a pair of vertical and horizontal coordinates.

To be clear, the scary part is not that someone came up with the idea that (0,0) saved a few instructions. In fact, the notion came up elsewhere in this HN discussion. It's a perfectly reasonable hypothesis, until you realize that there are many cursor shapes that require different hotspots.
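To sketch why the hotspot matters: when the cursor is drawn, its bitmap's top-left corner is offset from the mouse position by the hotspot, so the "pointing" pixel of any shape lands exactly on the tracked coordinate. A minimal Python illustration (the I-beam and crosshair hotspots below are my illustrative values, not the historical ones; only the arrow's (1,1) is from Inside Macintosh):

```python
# Sketch: drawing any cursor so its hotspot lands on the mouse position.
# With a fixed (0,0) hotspot, only shapes that "point" from their
# top-left corner would track correctly.

def cursor_draw_origin(mouse_x, mouse_y, hotspot_x, hotspot_y):
    """Top-left corner at which to blit the 16x16 cursor bitmap."""
    return (mouse_x - hotspot_x, mouse_y - hotspot_y)

arrow = {"hotspot": (1, 1)}   # tip of the black arrow (Inside Macintosh)
ibeam = {"hotspot": (7, 7)}   # center of the I-beam (illustrative)
cross = {"hotspot": (7, 7)}   # center of a crosshair (illustrative)

# With the mouse at (100, 200), the arrow bitmap is drawn at (99, 199),
# while a centered crosshair is drawn 7 pixels up and left of the mouse.
print(cursor_draw_origin(100, 200, *arrow["hotspot"]))  # (99, 199)
print(cursor_draw_origin(100, 200, *cross["hotspot"]))  # (93, 193)
```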
The scary part is that 188 people upvoted this answer!
It's only scary at the beginning. Then you get used to it. Every single social media site - including HN - has uninformed people agreeing that a correct-sounding answer must be right. My friend the tax accountant gets downvoted for clarifying how taxes actually work. My wife the linguist gets downvoted for explaining that, no, that's not how language works. It's not scary - it's typical.
Let me guess: Tax brackets? That's the one thing that most regular workers in the US just don't seem to understand (and arguably, many people knowingly spread falsehoods to further some agenda).
I think the basic thing about taxes that is least understood is the difference between gross income and taxable income (the latter is the amount that tax brackets apply to). A close second is the difference between tax liability, and refund/balance due on the tax return.
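The distinction can be made concrete with a toy calculation. The brackets and standard deduction below are made-up round numbers, not any real year's figures; the point is only that each rate applies to the slice of taxable income inside its bracket, never to the whole gross amount:

```python
# Toy example: marginal brackets apply slice-by-slice to TAXABLE income
# (gross income minus deductions). All figures are illustrative,
# not real tax law.

BRACKETS = [          # (upper bound of bracket, marginal rate)
    (10_000, 0.10),
    (40_000, 0.20),
    (float("inf"), 0.30),
]
STANDARD_DEDUCTION = 12_000

def tax_liability(gross_income):
    taxable = max(0, gross_income - STANDARD_DEDUCTION)
    tax, lower = 0.0, 0
    for upper, rate in BRACKETS:
        if taxable <= lower:
            break
        tax += (min(taxable, upper) - lower) * rate
        lower = upper
    return tax

# Gross $52,000 -> taxable $40,000:
#   10% of the first $10,000 = $1,000
#   20% of the next $30,000  = $6,000
#   total liability          = $7,000 (an effective rate well below 20%)
print(tax_liability(52_000))  # 7000.0
```

Crossing into a higher bracket only raises the rate on the dollars above the threshold, which is exactly the part people tend to get wrong.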
Just try to convince the average person that “getting a big refund” is a bad thing, since it means you gave the U.S. government an interest free loan.
Oh yes, that's another fun one! Your yearly refund or balance due should be as close to 0 as possible; otherwise you're either over- or under-withholding. Then again, I've met some people who use it as a kind of piggy bank because they wouldn't be disciplined enough to save up for bigger purchases otherwise and... well, I can't even, but if it works for them, there are worse things to spend money on.
I have income from multiple sources and they are not aware of each other. For example, they will all keep withholding Social Security even when I've exceeded the annual maximum. It is far too complicated to correct the finance departments of multiple companies. I just reconcile it all at the end of the year and get a refund. Got a better strategy I can use?
You can file a W-4 with exemptions and avoid overwithholding! This is a fixable problem.
How do I use a w4 to fix the social security problem without incurring underpayment of state and federal? Exemptions apply to all the taxes, no?
https://www.irs.gov/forms-pubs/about-form-w-4
Fill it out correctly and your employers will do the right thing.
I think the W-4 only applies to federal income tax. There's no field that instructs employers how much to pay in FICA (their share and yours). At best you can reduce withholdings to account for the excess FICA payments.
Not sure about state, but you don't pay Federal Income and FICA separately. They are just numbers that get added together. You just pay money, and the IRS splits it up after they collect. If you "overpay" Income by $1000 and "underpay" FICA by a $1000, you're done, no problem.
FICA cap is per employer, not total. Is that what you're referring to?
The FICA cap is not per employer. Well, it is from a withholding perspective (only because it would be impractical to make employers monitor withholding outside their control), but once you do your taxes for that year, you'll get everything you paid in over the cap back as a credit.
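As a hedged sketch of that reconciliation: each employer withholds the 6.2% employee Social Security tax up to the annual wage base independently, and the excess across employers comes back at filing. The wage base below is an illustrative round number, not any particular year's cap:

```python
# Sketch: excess Social Security withholding with multiple employers.
# Each employer applies the wage-base cap independently; the combined
# excess is recovered on the tax return. Wage base is illustrative.

SS_RATE = 0.062
WAGE_BASE = 160_000      # illustrative, not a specific year's figure

def ss_withheld_by_one_employer(wages):
    return SS_RATE * min(wages, WAGE_BASE)

def excess_ss_credit(wages_per_employer):
    withheld = sum(ss_withheld_by_one_employer(w) for w in wages_per_employer)
    owed = SS_RATE * min(sum(wages_per_employer), WAGE_BASE)
    return withheld - owed

# Two employers paying $100k each: both withhold on the full $100k,
# but only the first $160k of the $200k total is subject to the tax.
print(round(excess_ss_credit([100_000, 100_000]), 2))
```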
Right, but employers aren't allowed to coordinate to calculate whether they hit the cap together. They don't have that discretion. See
https://www.ecfr.gov/current/title-26/part-31#p-31.3121(a)(1...
If your separate income streams are pretty predictable and so is the overwithholding, and if you care enough: you can put a negative number in the "extra withholding" box on your W-4.
I wouldn't say this is a better strategy, but you can definitely min/max this even if your income is not stable by extrapolating out your expected income and expected withholding a few times a year and adjusting your W-4 based on your calculations.
Wow I wouldn't trust that. I'd add extra exemptions plus a positive withholding if needed.
It sounds like you’re not one of the people they met who use it like a piggy bank. From my perspective, they’re just describing the habits of people who are used to not having any money: gotta spend this windfall quick because money doesn’t last long. It’s irrational and ultimately harmful but it’s borne from the practice of spending all of your money every month on non-trivial things and still being required to increase debt in order to stay in your apartment, e.g., credit card spending.
It's not necessarily irrational. For example, for some people if they ever have any extra money, someone else will immediately spend it for them. If the earner wants to make a larger purchase, perhaps something that will cost short term but pay off in the long term, they need some mechanism to save, outside of the regular controls that apply to daily life.
You may think this situation is still irrational, that the other person is being irrational. But again, there are many life situations out there. Perhaps they have lived in situations where they had to fight for what they needed. Perhaps they lived with an earner who would spend their money on drugs if it wasn't taken away, and yet if the non-earner saved it up themselves, the earner would find it and spend it.
The supposedly rational thing may depend on everyone around you to also be rational, and everyone around them, etc. And given that we are human, and humans are not fully rational...
Woah, it’s okay to have different tax situations. I started a business one year and got a pretty big refund. But we’re not out there bragging about how we get big refunds every year like it’s some goal to aim for and accomplishment to be proud of if achieved. That’s the mentality people are criticizing.
By the interest free loan logic, you should have your employer withhold zero and then you put your taxes in a high yield savings account and pay them all as late as possible.
The IRS already thought of this - they charge you interest on the money you owed them (with some exceptions, like waiving it the first year it happens, only charging you if you withheld less than last year, etc).
Not sure I understand. Taxes are due in April. You don't get charged a year of interest on the amount you owe when filing…
https://www.irs.gov/payments/underpayment-of-estimated-tax-b...
Yeah, it sucks that the IRS is such a buzz-kill here. With 4-5% HYSAs, it would be nice to just let all the taxes sit there and pay one lump sum in April.
I owed a decent chunk more one year due to investments I sold, and left the money in tbills since I knew I was withholding at least as much as the previous year.
The interest rate is also much higher than you can earn on anything risk free (8% right now) plus there’s penalties on top.
(Not from the US) Why is it a good thing to lend for free to the US gov ? Because the regional banks aren’t that stable ?
It is not a good thing, because it is interest-free and inflation exists. If you'd had that money earlier, you could have put it in a high-yield savings account or paid down debt.
It is not interest free. E.g., I was paid $480 in interest on overpayments last year.
It is interest free if the IRS pays you within N days of you filing. If they're slower, then they pay interest.
Where N is some value between like .... 30 and 90? I forget.
“Look! I filed my taxes and I got money back! Yay money!” (Could have had that money all along.)
The people that enjoy a tax refund would not really even notice the small amount they "could have had all along" by adjusting their withholding amounts.
People like getting the big lump sum and some don't even realize it was their money all along that they just overpaid throughout the year. It's not a good thing for individuals to overpay.
Nah, because for people with poor financial skills, saving is very difficult (even if they had the "extra" money in their account each pay period instead of paying extra taxes). So even though you're technically getting your money "back", some people would not have managed to save that much without it being forced on them.
That's not how the Earned Income Tax Credit works....
The average person is a financial train wreck of dumpster fires.
Decent guess, but nah. Something to do with corporate tax accounting. Can't remember the details because that's out of my element.
And yet here you are trying to spread an agenda in a thread about mouse pointers that taxes are too low because the majority of people are too stupid to understand tax brackets.
The way I internalize it: public voting selects for layman plausibility, not correctness.
Because laymen massively outnumber experts, the layman vote always overwhelms the informed one, so the reaction of people who don’t know the subject is the only thing that matters. Truth only seems to matter because most subjects either can be somewhat intuited by non-experts, or are in a niche that you’re not, so “layman plausibility” means your reaction, too. But the true nature of the dialog reveals itself as soon as people talk about something you’re an expert on.
Answers like this aren’t a bug in a truth machine, they’re a plausibility machine working as designed.
there's another reason for some optimism about a voting-truth connection: wisdom of the crowds. As long as there isn't a strong bias to people's estimate, the average will converge on the truth.
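That claim is easy to sandbox. A minimal simulation (assumptions are mine: independent estimates with unbiased Gaussian noise around the true value) shows the mean of many noisy guesses tightening around the truth, while a shared systematic bias shifts the crowd's average no matter how large the crowd gets:

```python
# Minimal wisdom-of-crowds simulation: independent, unbiased estimates
# average out toward the truth; a shared bias does not wash out with size.
import random

random.seed(0)
TRUTH = 100.0

def crowd_average(n, bias=0.0, noise=20.0):
    """Mean of n independent noisy guesses at TRUTH."""
    guesses = [TRUTH + bias + random.gauss(0, noise) for _ in range(n)]
    return sum(guesses) / n

print(crowd_average(10_000))            # close to 100
print(crowd_average(10_000, bias=15))   # close to 115, however large n gets
```

This is exactly the caveat in the parent comment: averaging kills independent noise (the standard error shrinks as 1/sqrt(n)) but leaves any common bias fully intact.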
I am quite unsure as to the veracity of the claim that "the average will converge [upon] the truth". I recall cases being made (as asides) for the opposite conclusion. Intuitively, even, this idea of equating truth with convergence toward the average opinion appears contradictory, counterfactual, and ahistorical. Excuse my being brash, but a "wisdom of crowds" seems to me oxymoronic on its face. I'd love to be persuaded otherwise, though, mainly due to my perception of a lack of credence for your view. Perhaps I have misunderstood your qualifier: "As long as there isn't a strong bias to people's estimate . . . "?

Off the top of my head, I can't imagine any scenario in which a mixed population of laypeople and academics/experts would converge toward the same (vote-average) findings as a sample of a handful of experts/academics. For example, would The Average converge toward correct mathematics or physics answers? Besides trivial, non-technical questions that do not require complex analysis, I think not. (See: False Memory: Mandela Effect. [0] [note])

[0]: https://en.m.wikipedia.org/wiki/False_memory#Mandela_effect

[1]: https://en.m.wikipedia.org/wiki/Information_cascade

[Note]: My point is that groups' thinking is liable to be compromised. (After all, what has been more important to a human, evolutionarily: the truth or social access?) Also see: Information Cascade. [1]

{Post-Scriptum: My position is that if averages for answers to questions were taken from the 'crowd' of the whole Earth, then these would diverge significantly and routinely from The Truth. If there are cases in which you feel this not to be so, I would inquisitively consider such scenarios, waveBidder.}
That only works when people bet that their guess is correct.
Wisdom of the crowds is obviously dog shit.
Unfortunately not, because wisdom of the crowds requires not only a lack of bias but also independence, which, let's face it, is usually impossible to achieve.
As we know in the age of the internet, truth doesn't matter, only popularity does.
The internet has taught me how many brilliant people there are out there. And how massively outnumbered they are by the rest of us!
It's amazing how far that can take you. I saw a post on another social media site about something being wrong, and a comment said it's not wrong, it was just missing a "not". Which was the exact reason it was entirely wrong.
So people can state absolute absurdities and have people agree.
"So people can state absolute absurdities and have people agree"
That's the Reddit mission statement.
Happens on HN all the time too.
Some people are able to correct typos when reading.
Reddit is exceptionally bad at this, though. It's basically about what sounds most positive to the upvoter's way of thinking rather than anything else.
Reddit is a place where you get downvoted for linking something that proves what someone was saying is wrong just because it goes against the site’s overall narrative. Lies are encouraged if they’re the correct lies.
The exact same can be said for academia
It's also typical to stand up in the audience of the kindergarten nativity and shout "Mary and Joseph weren't wearing teatowels on their heads!" and when the other parents turn and angrily "ssssh!" you, shout "shhh'd for telling the TRUTH! I thought this was a place of education and learning! Stay classy, parents".
"Happy birthday dear grandma, happy birthday to youuuu"
"Grandma's birthday was YESTERDAY you fuckin' liars!"
A casual chat interrupted by tax pedantry and grammar naziing. I have no idea why people wouldn't want that. Anyone? Anyone? Bueller?
To be honest, I was debating whether to even post my message initially. It was off topic, but I figured it would be ignored. If I'd known it was going to derail the discussion about Stratoscope's analysis this much, I wouldn't've posted it.
Edit: Also grammar nazi'ing has little to do with linguistics and more to do with being a jerk...usually.
This is why I ask for qualifications when someone has an authoritative tone.
It is the Gell-Mann Amnesia effect, but on social media.
Is it not possible to be both scary and typical?
I think this also partly explains the LLM hype — people can be as confidently incorrect as LLMs, or maybe LLMs are as confidently incorrect as humans since they are trained on text from social media.
Hacker News is similar to ChatGPT in that regard. Lots of correct-sounding answers that are really just word salad.
I've noticed whenever a topic comes up that I have a lot of knowledge in, people almost always chime in with incorrect or just flat out made up stuff. I always remain suspicious of anything I read in any comment section. Including here on HN.
Whole political movements are built on this kind of momentum.
The arrow has a white outline around it, so the hotspot is at the tip of the black arrow, at (1,1).
And if I'm not wrong, it still applies to today's Mac interface. The cursor still has a white outline all around.
Yup. You can even customise both the inner and outer colours as an accessibility feature!
What??? TIL.
A lot of the accessibility features are actually neat even to those without the need for them.
Yeah, it’s become super useful for me to color code the cursors between my work and personal Macs.
Bingo! Now that you jogged my memory, I can confirm this.
The next question is why you need a white outline around the black arrow.
This is easy to answer: if you didn't do that, what would the black arrow look like against a black background?
Some DE solved that by having an inverse outline.
It took me a really long pause to work out what DE meant, so to save others from similar waste: "desktop environment".
I'm pretty sure that even on Windows there's an option to have the whole cursor be the inverse of the background.
You can see similar things in the Apple Lisa source code as well: https://info.computerhistory.org/apple-lisa-code
The linked SO page is a page of complete speculation.
History isn't just a bunch of logical thought exercises, it's an assembling of documentation and evidence.
As far as I can see, there is no contemporaneous documentation claiming intentionality so the question remains unanswered.
A smoking gun would be a file with a name like cursor.bitmap or some code like "declare cursor_default = [ [ 1, 0 ... ] ];" from a major source (ms/xerox/apple) say, pre-1988 or so, with some comment above it explaining the rationale of why that cursor style in particular. I'd even accept a more minor source like Acorn, Digital Research, Quarterdeck, NeWS, VisiOn or MIT Athena (X).
Finding something that talks about say, lightpens and then defends the mouse cursor style in that way is working backwards from a hypothesis. It's weak and doesn't preclude other possibilities. Let's be rigorous and get it right.
The Inside Macintosh pages from 1985 I cited above may be what you're looking for.
Especially page 158 (I-146).
It doesn't give a longwinded rationale of why you need an X/Y hotspot offset, it does much better than that. It shows you several cursors with their hotspots, so you can see why a hotspot is needed. And it lists the data structure to support it.
but that is 4 years later than the xerox optical mouse tech report, and from a different company which copied their default mouse pointer style from xerox. it doesn't bear on the question of whether xerox was implementing cursors without hotspot coordinates at the time that they adopted the left-leaning shape
(i suspect xerox mouse cursors always had variable hotspot coordinates because it's, what, six microseconds extra in the screen update to subtract them? and i think smalltalk-76 mouse cursors have hotspots. but 01988 or even 01985 is way too late)
Within 8 thousand years, people will figure out variable length storage and processing for integers. I promise.
The second-highest answer is an incorrect just-so myth. It even includes a screenshot of the historically correct answer!
I was hoping that it would be lower than 188 when I clicked. It's not. (196) :-(
As you said ten years ago https://news.ycombinator.com/item?id=7253841
The scary part is that you will likely be saying it again in another ten years and again and then you’ll die as “that weird cursor offset obsessed fanatic”.
My assumption (not having an old Mac or documentation to confirm it...) is that the tip of the cursor had to be at (1, 1) to allow for a pixel's worth of mask around the outer edge of the tip.
Perhaps it's because cursors have a one pixel wide black border around them to enhance contrast, but users associate the cursor's position with the first bit of white (or color) at the tip. (0,0) is colored black for a typical cursor.
Edit: ninja'ed further down.
the first macintosh was very late to the party, there had already been GUI cursors for about a decade at PARC, and cursor styles had settled down to some standards.
in the early days of GUI cursors on relatively low resolution displays (by today's standards), an important issue was to reduce the amount of calculation and squinting the human had to do to identify the hotspot, so you could accurately select/swipe what you wanted to. the tilted arrow cursor points right at its hotspot quite effectively even if the tip pixel is blurred, as does the i-beam (whose vertical offset is not as important to know accurately). the five-fingered hand for moving bulk selections also does not require accurate placement, although i think its hotspot is at the end of a finger.
early GUIs let you edit your own cursors and hotspots.
I think you touched on a wider problem. People's shallow understanding of the world translates to a shallow world view and policies. It's kind of scary to me how much my high school sociology class's group projects became political policy decades later. Simplistic reductions, when in real life even unclogging a toilet can have complicated steps, nuanced decisions, and many caveats.
I was just about to say that.
There's an amazing video by Posy documenting mouse cursor history, and he even provides his own cursor pack:
https://www.youtube.com/watch?v=YThelfB2fvg
http://www.michieldb.nl/other/cursors/