Given that this uses `target`, doesn't it mean that unlike htmx you can't easily make this gracefully degrade when JS isn't enabled?
And, yes, I know, saying "when JS isn't enabled" in 2024 is a bit like saying "when the user is on Mars and has a 10 minute RTT" but forgive me for being an idealist.
I use Disable JavaScript extension with js disabled by default and only enable it if website is broken.
https://addons.mozilla.org/en-US/firefox/addon/disable-javas...
You should use uMatrix so you can enable only the scripts necessary to unbreak the site.
Or uBlock origin (from the same author) which is still maintained
It is maintained, but its UI for dealing with JS is horribly time-consuming and overly complex compared to uMatrix. I never really understood it, and I keep using uMatrix on my laptop. I switched to NoScript on my phone. Maybe I can install uMatrix there now, if Mozilla really unblocked many extensions. If uMatrix stops working, I'll switch to NoScript everywhere for JS and keep uBO for all the other issues.
Or uBlock Origin in advanced-user mode, which is not abandoned/dormant like uMatrix.
Could you describe your ideals for why websites should gracefully degrade without JS enabled? It’s not an unpopular view on HN, but from my perspective as a web developer, JS is part of the browser just like HTML, and there’s no reason for a website to work if you’ve disabled part of the browser.
I suspect “doesn’t have JavaScript” is being used as a proxy for a lot of other ideals that I do understand, like “should work on as many devices as possible” but that’s a rough correlation and doesn’t make the use of JS inherently bad.
Because there's a case for a very useful Web without running a Turing-complete language on the visitor's end.
If you just want to consume text, images, audio, and video, follow links, and fill in forms (and that's quite a lot, and pretty awesome already), you shouldn't need JavaScript.
A reason people might want to have JavaScript disabled is the immense tracking surface that JavaScript exposes, which can't easily be safeguarded against.
The people who do disable JavaScript completely are admittedly few and far between, but are, I would assume, more common among the Hacker News crowd.
Chances are I’m on your website for information, mostly text content. Which really doesn’t require JavaScript.
So then, most JavaScript on the web is enabling revenue generation rather than improving the user experience. So yeah, disabling JS is a proxy for, “don’t waste my time.”
I agree that it’s not inherently bad, just mostly bad (for the user).
I've been in numerous situations where having JavaScript enabled was either undesirable or impossible. Granted, it's my own fault for using strange platforms like a Nokia N900, with strange web browsers. But it's still nice when interactive websites continue to work even in contexts where JavaScript isn't a thing. I always thought of JavaScript as something you use to "upgrade" the web experience, not to build it, although obviously there are some things you simply can't do without JavaScript and for which there exists literally no alternative. There are also situations where JavaScript is a liability: see, for example, the Tor Browser.
In particular, my ideal is that all functionality which can work without JavaScript should work without JavaScript. I'm not expecting someone to implement a drag-and-drop calendar without JS, for example, but there's no reason why editing a calendar item should fundamentally require JS.
That being said, I know this is an idealist position: most companies that develop web applications simply don't care about the niche use cases where JS isn't an option, and won't design their applications to accommodate those use cases, to save on development costs and time. But while I am not really a web developer, whenever I do deal with the web I usually take a plain-HTML/CSS-first approach and add JavaScript later.
Yeah it breaks without JS. You could add the iframe behind JS, so the target would default to a new tab. But the server would still be designed to return HTML fragments. I never found a way for a server to check if the originating request is for an iframe or a new tab. It's not quite a graceful degradation.
> I never found a way for a server to check if the originating request is for an iframe or a new tab.
There is no such technique. One way to distinguish is to pick a URL convention and modify the iframe's URL (before the hash). For example, add ?iframe=true to the URL, and then have the server check for that. Perhaps more usefully, you could include information about the parent URL, e.g. url += `?parent=${document.referrer}` (note the backticks: that's a template literal). Or something.
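A minimal sketch of that URL convention, assuming the parameter names iframe and parent (both are made up for illustration, not anything htmz defines): tag the URL on the client before the iframe loads it, and have the server check the query string to decide between a fragment and a full document.

```javascript
// Client side: tag a URL so the server knows it's destined for the iframe.
// `parentUrl` is optional extra context (e.g. document.referrer).
function tagIframeUrl(url, parentUrl) {
  const u = new URL(url, "https://example.com/"); // base only used for parsing
  u.searchParams.set("iframe", "true");
  if (parentUrl) u.searchParams.set("parent", parentUrl);
  return u.pathname + u.search + u.hash;
}

// Server side: decide whether to render an HTML fragment or a full page.
function wantsFragment(requestUrl) {
  const u = new URL(requestUrl, "https://example.com/");
  return u.searchParams.get("iframe") === "true";
}
```

Without JS the links keep their plain URLs, so the server falls back to full documents; that is what makes this degrade gracefully.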
You could intercept the clicks with JS and add a special header, like htmx does, to return fragments, otherwise fall back to full documents.
Edit: rather than a header, dynamically adding a query parameter to the URL of the element that was clicked would probably fit better with htmz's approach.
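That interception could look something like the sketch below. The data-htmz attribute, the fragment=true parameter, and the iframe name "htmz" are all assumptions chosen for illustration; without JS the links are never retargeted and navigate normally, with JS they're pointed at the iframe and tagged so the server can return a fragment.

```javascript
// Pure helper: mark an href as a fragment request (parameter name is made up).
function toFragmentUrl(href) {
  const u = new URL(href, "https://example.com/"); // base only used for parsing
  u.searchParams.set("fragment", "true");
  return u.pathname + u.search + u.hash;
}

// Browser-only wire-up: only runs when JS is available, so plain links
// keep working as ordinary full-page navigation without it.
if (typeof document !== "undefined") {
  document.addEventListener("click", (e) => {
    // Ignore clicks that aren't on an opted-in link.
    const a = e.target.closest ? e.target.closest("a[data-htmz]") : null;
    if (!a) return;
    a.href = toFragmentUrl(a.getAttribute("href"));
    a.target = "htmz"; // name of the hidden iframe
  });
}
```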
It breaks without JS, but many JS-blocker extensions can be configured to always allow JS from the host serving the page. For example, NoScript on my phone has the "Temporarily set top-level sites to TRUSTED" option.
At only 181 bytes, it could even be inlined in the page; that's less than the combined meta tags on many sites.