
New Renderers for GTK

smallstepforman
54 replies
7h13m

I hope I don't sound bitter, but most decent graphics engine developers have created renderers that are a couple of generations ahead of the open source GUI toolkit renderers. There are several of us who could truly bring next-gen rendering to the open source desktop; however, we're working for gamedev companies (they pay our bills), and we have no time to contribute to open source stacks. If the community can organise a regular budget to pay for such devs, then you'd see significant renderer and toolkit updates. Same with other open source apps.

themerone
13 replies
5h50m

How many of these game engines are abstracted at a level where you can swap in a PDF or SVG backend? How many of them support CMYK and print units? I'm only scratching the surface of things a GUI renderer needs that a game engine doesn't.

I'm very skeptical that a bunch of game developers are going to whip together something that crushes Skia in performance without sacrificing a ton of capabilities.

lukan
12 replies
5h42m

"swap in a pdf or svg backend"

I am not sure if I understand you right, but do you mean rendering to PDF or SVG instead of to the GPU?

If so, are there real world use cases?

"crushes Skia in performance without sacrificing a ton of capabilities"

Same question: which of those capabilities are really in use and needed? Linux GUIs in general are really not a beacon of light in terms of performance or usability. I strongly suspect things could be better if some bloat were removed.

I am amazed by what is possible with PixiJS, a renderer for the web using WebGL and soon WebGPU. Having something simple but powerful as the base would be my way to go.

hmry
10 replies
5h38m

Saving websites as PDF comes to mind. If taking screenshots of windows as SVG was possible I would definitely use it.

lukan
8 replies
5h32m

A screenshot of a rasterized image saved as SVG is not something I see any use for. It would be a bloated monstrosity.

SVG is vector graphics; once you already have pixels, there is no clear way to go back.

chris_wot
6 replies
5h24m

I think he means taking the scenegraph and saving it directly to SVG. Pretty much how it would be done for PDF.

lukan
5 replies
5h17m

Ah, right. Does not sound too complicated, but it is an entirely separate render path. And apparently it is supported now? But I have never seen it put to use.

themerone
2 replies
4h34m

Cairo and Skia can do this.

lukan
0 replies
4h1m

But is there a real world use case, where this actually was put to use?

(Sorry, I am having flashbacks to the X vs. Wayland debate, where it was argued that X is network transparent, except that it hadn't been for a long time, and/or because no one used it.)

jandrese
0 replies
3h40m

The use case is usually sending the screen to the printer or saving a document.

chris_wot
1 replies
5h15m

It seems super complicated to me :-) cool idea though

lukan
0 replies
4h45m

I got the impression from the comments above that this is possible now, yet I have never seen such a capability. And I think the effort would be way greater than the reward if implemented from scratch.

("Easy" assumes there is already something there.)

vilhelm_s
0 replies
5h19m

SVG files can refer to images for bitmap graphics; only the rectangles, text, etc. would be specified as vectors. See https://www.joachim-breitner.de/blog/494-Better_PDF_screensh... for a demo.
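For illustration, such a hybrid "vector screenshot" could be assembled like this; a sketch only, with made-up element sizes, and `vector_screenshot` is a hypothetical helper, not a real tool:

```python
import base64

def vector_screenshot(png_bytes: bytes) -> str:
    """Build an SVG where the UI chrome stays vector and only the raster
    content (e.g. a photo shown in the window) is embedded as an <image>."""
    b64 = base64.b64encode(png_bytes).decode("ascii")
    return f"""<svg xmlns="http://www.w3.org/2000/svg" width="400" height="300">
  <rect x="0" y="0" width="400" height="24" fill="#ddd"/>
  <text x="8" y="17" font-size="12">My Window</text>
  <image x="0" y="24" width="400" height="276"
         href="data:image/png;base64,{b64}"/>
</svg>"""

# Placeholder bytes stand in for a real PNG. The title bar and text stay
# crisp at any zoom; only the embedded bitmap is resolution-bound.
svg = vector_screenshot(b"\x89PNG placeholder")
```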

meragrin_
0 replies
3h17m

Um, isn't that an application feature?

bluGill
0 replies
5h11m

Printers often want PDF or PostScript, or if not, some other format that isn't a regular GPU-supported thing. Games rarely support printing; the main use is taking a screenshot, which uses the OS, not the game engine, and that is fine for games. However, if you are writing something where printing is important, you want more control over the output than a screenshot can give. Printers tend to be much higher resolution, and you have to deal with paper size: two things that should be passed back to the application, since you can often adjust how things are shown once you know those limits. There are other compromises in printing that may matter too.
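A rough sketch of the resolution gap behind this point; the DPI values are illustrative, not from any particular device:

```python
def pixels(width_in: float, height_in: float, dpi: int) -> tuple:
    """Pixel dimensions needed to fill a physical area at a given DPI."""
    return (round(width_in * dpi), round(height_in * dpi))

# The same 8 x 6 inch page at screen vs printer resolution:
screen = pixels(8, 6, 96)    # a typical monitor
printer = pixels(8, 6, 600)  # a common laser printer
# The printer needs roughly 39x more pixels than the screen version holds,
# which is why print paths re-render from the scenegraph instead of
# upscaling a screenshot.
```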

Const-me
9 replies
5h21m

Couple times in the past I have implemented GPU-targeted GUI renderers, here’s an example: https://github.com/Const-me/Vrmac?tab=readme-ov-file#vector-... https://github.com/Const-me/Vrmac/blob/master/Vrmac/Draw/VAA...

2D graphics have very little in common with game engines; the problem is very different in many regards. In 2D, you generally have Bezier and other splines on input and a large amount of overdraw, and textures coming from users complicate VRAM memory management. OTOH, game engines are solving hard problems which are irrelevant to 2D renderers, like dynamic lighting, volumetric effects, and dynamic environments.
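As a toy illustration of the spline-on-input point: a 2D renderer has to turn cubic Beziers (what a curveTo call gives you) into something a GPU can rasterize. The sketch below uses fixed uniform subdivision; real renderers like the ones linked subdivide adaptively against an error tolerance:

```python
def cubic_bezier(p0, p1, p2, p3, t: float) -> tuple:
    """Evaluate a cubic Bezier at parameter t using the Bernstein form."""
    s = 1 - t
    x = s**3 * p0[0] + 3*s*s*t * p1[0] + 3*s*t*t * p2[0] + t**3 * p3[0]
    y = s**3 * p0[1] + 3*s*s*t * p1[1] + 3*s*t*t * p2[1] + t**3 * p3[1]
    return (x, y)

def flatten(p0, p1, p2, p3, steps: int = 16) -> list:
    """Approximate the curve with a polyline of `steps` line segments."""
    return [cubic_bezier(p0, p1, p2, p3, i / steps) for i in range(steps + 1)]

# Flatten one curve; the endpoints of the polyline match the control
# endpoints exactly, the interior points approximate the arc.
pts = flatten((0, 0), (0, 100), (100, 100), (100, 0))
```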

lukan
6 replies
5h12m

There is PixiJS, a 2D web graphics engine. It is really, really fast. I can imagine something like it as the base.

JoeyJoJoJr
2 replies
4h25m

I gotta disagree that PixiJS is fast. I’ve worked with a bunch of 2D graphics engines going back to the Flash days, and I found it’s really easy to hit a wall with PixiJS performance, especially if you are using text and filters. I wouldn’t have much issue with it if the cacheAsBitmap feature was reliable, but I found it buggy as heck and it didn’t help performance as much as you would expect. There is no way I would use PixiJS for a full screen game or mobile game.

paddy_m
0 replies
3h24m

Can you give examples of better JS renderers?

What is needed for performance of traditional GUI app rendering? I'm particularly interested in table rendering. Glide and Perspective are both canvas based renderers, but I haven't dug into the internals.

[1] https://github.com/glideapps/glide-data-grid

[2] https://github.com/finos/perspective

lukan
0 replies
4h4m

Ok, I also found that performance can drop in unexpected ways, requiring workarounds to get it flowing again.

And I don't use cacheAsBitmap, but do my own caching. (Not because I knew of its flaws, but because I was already doing it.)

And for text I really recommend BitmapText. That is fast. (But not possible with all use cases, sure)

Also Pixi8 with WebGPU will be stable soon, looking forward to it.

But all in all I am really impressed, I used quite a few other js graphic engines before and Pixi was by far the best. Or which one did you find better?

Jasper_
2 replies
5h10m

PixiJS is the easy problem, blitting a bunch of premade textures to the screen. The much harder problem is getting curveTo() and text and gradients and strokes, etc.

lukan
1 replies
4h57m

That is all or mostly possible with PixiJS or an extension, to my knowledge.

Edit: but then again, Pixi uses the HTML canvas element for text drawing, which uses the browser's text capabilities. So yes, at some point and somewhere those functionalities need to be implemented.

interactivecode
0 replies
4h43m

Sure, canvas can render text, but it's filled with problems and limitations. It doesn't do line wrapping or any of the font rendering correctly. Doesn't do anti-aliasing correctly. It needs so much hand-holding and manual handling to render semi-decently that basically all canvas apps just opt out and use HTML to render text on top of the canvas.

Canvas is almost always the wrong choice if you want to do layouts.

BlueTemplar
1 replies
5h12m

While 3D games are the most common ones these days, there's still plenty of development done on 2D games too :

https://www.factorio.com/blog/post/fff-251

https://www.factorio.com/blog/post/fff-264

https://www.factorio.com/blog/post/fff-281

Const-me
0 replies
4h58m

2D games are mostly rendering sprites. GUI renderers do that too for raster images and font glyphs, but the harder part for GUI is vector 2D graphics.

devnonymous
8 replies
6h25m

The *community* here are people like you - those who also have to pay bills but contribute anyway, in whatever way they can, by using the software and occasionally contributing code.

While yes it would be great if a community could raise funds, that coordination job itself would have to become someone's not-paying-the-bills work.

As much as I love Open source/Free/Libre software and am grateful it exists (and contributing to it, when possible), I've a long held belief that it is the pursuit of the privileged. You need to have the privilege of free time and then the privilege of being able to choose to spend that free time on something that doesn't improve your standard of living and then the privilege of being able to do it consistently.

throwaway89988
5 replies
6h13m

Sadly I totally agree: Open Source is the playground of people who can afford it.

I benefited a lot from open source in my career and life, so I am very thankful to all contributors (and try to give back in money/time when I can afford one or the other).

What really annoys me is that my government does not mandate that software built with tax money must be open source.

That would go a long way to fund Open Source and improve the quality.

notyoutube
4 replies
5h38m

What government is that?

red_trumpet
3 replies
5h30m

I dunno, all of them? Which governments mandate open source software when calling for bids?

notyoutube
0 replies
4h51m

(I wasn't trying to make a point.) As far as I know, that's what initiatives like PMPC¹ are for. I think in Switzerland a law recently passed that seems to go in that direction² (open source by default, but with some leniency, as far as I can interpret the text). According to this³ OSOR report, something similar happened in Italy in 2019. So I think we're slowly going in that direction in Europe.

¹: https://download.fsfe.org/campaigns/pmpc/PMPC-Modernising-wi...

²: https://www.admin.ch/gov/fr/accueil/documentation/communique...

³: https://joinup.ec.europa.eu/sites/default/files/inline-files...

germandiago
0 replies
1h6m

I do not think it should be mandated, but it should be promoted.

coldtea
1 replies
5h1m

The *community* here are people like you - those who also have to pay bills but contribute anyways

In principle, yes. In practice, most of the "community" is paid developers for companies like RedHat. So while they do have to pay the bills, they do so by those FOSS contributions.

devnonymous
0 replies
4h18m

True. I recognize that my statement was a bit simplistic.

That said, in the context of what the OP was saying, unfortunately, this is a chicken-and-egg situation. For someone, like the OP, who would like to get paid to do OSS, they'd need to have a reasonably active OSS presence prior to being hired at places like Red Hat. Which is another aspect of my original point of contributing to FLOSS being the pursuit of the privileged.

lynx23
3 replies
5h38m

That sounds a lot like modern extortion. I sincerely hope this attitude does not creep into FLOSS and open source more than it already has. Imagine a volunteer firefighter who has found a well-paying job announcing: "If you'd paid me more, your house wouldn't have burnt down."

coldtea
1 replies
4h56m

A better analogy is volunteer firefighters not cutting it, houses burning down left and right, and a professional firefighter saying "I'd like to come work in your district and help with firefighting, and I could do a much better job than the volunteer guys, but I need to get paid to work".

I sincerely hope this attitude does not creep into FLOSS and open source more than it already has

If you took away the people working on FOSS because they're paid to (people who otherwise wouldn't contribute, or would at a tenth of the rate), you'd remove the most prolific and important maintainers and contributors of many huge FOSS projects.

PaulDavisThe1st
0 replies
2h1m

Since I just recently completed my certification as a Firefighter I/II (also Wildland FF 2), and have been an open source developer for more than 35 years, with the last 25 years of that being full time and the last 15 having FLOSS as my actual income source, I'd like to comment.

The primary difference between professional firefighters and their volunteer counterparts is hours on the job. When I graduated from the state academy, I knew as much about firefighting as any of my fellow graduates and had been through precisely the same training requirements. However, the gap is going to open up very rapidly since the career guys will be doing regular shifts every week, whereas I will be answering 1-3 calls a month on average, most of which will not be fire-related. A year from now, the career guys will be even more familiar with everything we learned during our certification training and more, while I will be working hard to remember any of it.

So the question is: how well does this analogy hold for s/w development?

It doesn't.

First of all, the gap between most proprietary development outcomes and their FLOSS equivalents has more to do with UI/UX design questions than actual coding skills. At the source code level, it's generally proprietary projects that are "burning down left and right" (shoddily and quickly built, with inadequate attention to engineering and insufficient caring about one's work due to marketing deadlines).

Secondly, the difference between proprietary developers and their FLOSS equivalents in terms of hours of experience is not deterministic. It's going to be a function of employers, personalities, life situation. Plenty of (typically younger) FLOSS developers squeeze in more quality hours on their FLOSS work than their proprietary cousins do.

Thirdly, a firefighter only gets to put out the fires that actually happen. A software developer can pick their own problems and goals and work on them at any time. There's no relationship between the outside world and your ability to advance your skills and knowledge.

marcus0x62
0 replies
4h56m

Not a volunteer fire department, but this comes to mind: https://www.npr.org/sections/thetwo-way/2010/10/08/130436382...

seftggedf
2 replies
6h16m

What do you want the community to do? Pitch in with money from their day jobs so a gamedev savior can come in and make things render slightly faster? At the expense of hideous, unmaintainable code that only a gamedev genius could understand?

pid-1
0 replies
6h12m

Some open source projects hire full time developers with donations / sponsor money (e.g. python, zig).

I'm not saying that's the correct approach for GTK, just noting it's not an absurd idea.

coldtea
0 replies
4h59m

Yeah, because there aren't well maintained pro rendering engines /s

And FOSS devs always create non-hideous, maintainable code that everybody understands /s

bmitc
2 replies
5h54m

Why don't you get your company to pay for it?

lukan
1 replies
5h28m

Why should they, if they are interested in releasing games or a game engine?

bmitc
0 replies
2m

Then why should this happen?

If the community can organise a regular budget to pay for such devs, then you'd see significant renderer and toolkit updates. Same with other open source apps.

Jasper_
1 replies
5h20m

I'd be very skeptical about that. I work in the game industry, and while our 3D renderers are very good, I have not seen a 2D UI renderer I would consider at all competitive. What are you using for path and pattern rendering?

germandiago
0 replies
1h7m

I am using RmlUi in my own small game. Not sure how good it is.

thisislife2
0 replies
2h22m

You may be surprised to know that the open source Godot game engine has also been adopted by some developers as a GUI toolkit (See Standalone Tools and Applications Made with Godot GUI - https://alfredbaudisch.com/blog/gamedev/godot-engine/standal... ).

stonogo
0 replies
4m

I can't think of a segment of user interfaces worse for accessibility than game development. Generally speaking, the game-centric UI toolkits and immediate-mode GUIs are very pretty but completely lack any accessibility hooks, including the ability for a screen reader to even know there is text present. If GNOME switched to gamedev-style UIs, I would probably just go buy a Macbook.

olau
0 replies
5h57m

Some of the people working on GTK are funded by Red Hat or otherwise. If you're really interested, you could try talking to people.

Edit: In particular, Matthias Clasen, the guy blogging here, has been with Red Hat for many years.

mananaysiempre
0 replies
5h46m

[M]ost decent graphics engine developers have created renderers that are a couple of generations ahead of the open source GUI toolkit renderers.

Step one: get the knowledge out there. Have those developers at least, I don’t know, talked about what makes those toolkits better at GDC?

maeln
0 replies
4h40m

I am a bit skeptical about this. I feel like game UI toolkits and desktop GUI frameworks live in two separate worlds with different expectations. At least in my humble experience, having used both in my career.

GTK/Qt are usually good/very good with their integration with the OS, accessibility features, keyboard navigation, handling of features like copy/paste, ... These are the kinds of things that game UI toolkits tend to completely forgo since they don't need them, focusing instead on performance, theming, integration with a game engine, ...

Theoretically you could say that the renderer is agnostic to this, but in practice that is not completely true. There is also the simple fact that you have a limited budget to work on features, and the two camps would rather work on different ones. Having a very fast and accurate renderer is just not as important for desktop GUI frameworks as it is for game UI toolkits.

johnnyanmac
0 replies
2h4m

Game devs are already relatively underpaid for the value they bring to companies. I don't know how an open source initiative could even afford a single professional graphics developer.

Also, you say they are a couple of generations ahead, but does this kind of software need to be bleeding edge? Even many games don't; the kinds of software in research labs that do need it pay even better than gamedev (and of course require PhDs and whatnot).

aulin
0 replies
4h46m

There are several companies employing devs for this kind of free software work.

If you think you could contribute, you can apply to them.

I guess you will probably find there are already brilliant and competent people working on this stuff, and the problem is not as easy as you believe.

BlueTemplar
0 replies
4h43m

The recent "The things nobody wants to pay for in open source (lwn.net)"

https://news.ycombinator.com/item?id=39151000

comes to mind...

BlueTemplar
0 replies
4h45m

Game engines can afford to be generations ahead : they might even be targeting specific hardware (single console release), they typically assume a sandboxed environment that they are in total control of (at best, they'll make a limited and tightly controlled GUI framework for modders), they might have restricted themselves to a limited number of inputs/outputs (single screen with a specific resolution, gamepad only), they don't have to worry about unknown unknown uses by other developers...

None of these apply to a generalist renderer, therefore it can only "lag behind" the game ones. (Unless maybe we're talking about the "human side of the question": what are the best designs and layouts for a generalist human/machine interface? There it's the generalist GUIs that I would expect to be a couple of generations ahead: Xerox's labs, Apple's Macintosh, IBM's Common User Access standard, CERN's World Wide Web...)

akdor1154
34 replies
8h43m

Pixel-perfect fractional scaling, baby, woohoo!

kuschku
31 replies
7h29m

After over a decade of GTK claiming fractional scaling is "impossible" and GTK devs vetoing any fractional scaling in the wayland protocol, they finally got feature parity with Qt in this area.

Now we just need proper support in Wayland and we'll finally have support for HiDPI in all major Linux DEs.

FirmwareBurner
23 replies
6h36m

>After over a decade of GTK claiming fractional scaling is "impossible" and

So it's just as impossible as thumbnails in the file picker? The insanity of GTK, and the resilience of people willing to put up with it, is baffling to me.

Vinnl
15 replies
6h10m

At least the devs can find some motivation in the fact that they'll get comments like these every time they do something good...

EasyMark
6 replies
5h37m

Usually it's from someone who has never written a single line of code to improve the situation, either.

FirmwareBurner
4 replies
5h29m

Please spare us this tired trope. People who don't contribute to FOSS are also allowed to have an opinion on it if they're users of it. Otherwise, how would you know what to improve if you don't allow feedback from users? What you're saying is like car companies treating their customers' opinions as irrelevant because the customers never contributed to designing parts of a car.

If you're designing a mainstream product that's not just targeting professional SW developers who can contribute back with code, but also average users who will only use it but never contribute, then you will need to be open to mainstream levels of feedback that represent the average user and not just programmers.

Otherwise, let's keep complaining that Linux and FOSS alternatives have only ~2% market share, as if it's somehow the users' fault for only buying Microsoft/Apple and not the projects' fault for being stubborn about feedback and out of touch with their mainstream users.

EasyMark
2 replies
5h18m

I have contributed to and improved more than a few FOSS pieces of software that I used, when I thought they could use an improvement or a bug fix, or even just documentation of the bug. FOSS isn't a company, and a huge portion of contributors aren't paid a dime. It bothers me when people criticize the developers with vehemence; the opinion I'm expressing is about the nature of belligerent complaints vs helpful ones. Have a nice day.

Klonoar
1 replies
4h36m

No, they're right. It's not a requirement to sling code to put your opinions out there about open source.

bee_rider
0 replies
3m

Everyone has their opinions. There are so many opinions out there, it isn’t really clear what to do with them. Put them out there, fine, now everybody else just has to decide if they care.

One way to tell if you should care about an opinion is to check and see if the person professing it has taken any concrete actions toward getting the world to align with it.

t43562
0 replies
29m

I remember my cousin telling me how horrible it was to work at a hotel - you really found out how horrible people could be when they thought they were owed something. Staff there obviously had to put up with it because that was their job.

Regarding FOSS, I think some users forget that they're at a free lunch and are unpleasant and insistent about their complaints as if they were paying. They're not entitled to service and don't seem to realise it.

As to whether everyone else should listen - sure - but nobody is forced to work overtime just to right some wrong in OSS software because person A was very upset about it.

mardifoufs
0 replies
1h12m

In this case, PRs were made for close to a decade and tons of people implemented the feature in forks, but upstream just didn't want it. You can't tell people to just write the code since it's open source when that's irrelevant here; the code itself wasn't what was missing. Though I still think it's great that it is now done, regardless of the weird history that specific feature had.

FirmwareBurner
5 replies
5h51m

Thank you for moving the goalposts for the chance at a cheap-shot comment, but you know very well that's not what I meant.

GTK doing something right doesn't suddenly absolve it of all the wrongs. And just because something is FOSS doesn't mean it's without fault and beyond criticism.

SiempreViernes
4 replies
5h35m

Likewise, just because you are right in principle doesn't mean your actual comment is anything but rude and pointless.

FirmwareBurner
3 replies
5h31m

Am I the one being rude? Maybe. Was GTK also being rude for dismissing something everyone else was doing at the time, like fractional scaling, as "impossible"? Maybe.

Sorry, maybe I was rude, but someone needs to call out BS claims like this, as the project probably has enough "yes men" tooting its horn.

Vinnl
1 replies
1h48m

someone needs to call out BS claims like this as they probably have enough "yes men" tooting their horn.

Do they, though? Does it have any effect at all, other than potentially demotivating volunteer contributors? And if so, do those effects outweigh the negative effects?

FirmwareBurner
0 replies
25m

Gnome's problems are not the volunteers contributing code, but the leadership who decide what gets merged to mainline.

Devs could have implemented thumbnails in the file picker over a decade ago, but Gnome leadership was always against it.

Toxic and poor leadership of such large projects should be openly criticized and pointed at, not allowed to hide behind the "poor small volunteer devs" and gaslight critics.

tristan957
0 replies
2h46m

Please cite where a GTK developer said that fractional scaling was impossible.

kuschku
1 replies
4h33m

The devs who actually implemented it should be celebrated.

But they're not the same people as the ones who blocked specs and PRs by stating this was "impossible".

Blocking valuable improvements with an absolute statement like that makes contributing to such projects feel like tilting at windmills.

Vinnl
0 replies
1h45m

Possibly, but it wouldn't be the first time that the first category of devs feel like they're the target of such comments. And it's a well-known phenomenon that one negative comment outweighs ten positive ones.

So from where I sit, one might as well not make them. (And I'm pushing back a bit to hopefully support the contributors, whom I'm grateful to.)

ho_schi
6 replies
6h1m

As Vinnl pointed out with irony.

An honest "thank you" to the people who made the thumbnails possible would be helpful. As would getting involved yourself, or spending money. Gtk and GNOME are not companies. People need motivation, from their work or from outside.

FirmwareBurner
3 replies
5h49m

>Gtk and GNOME are not companies.

These are large entities made up of people, same as companies, not individual devs making a side project. Does that mean their work should not be criticized because their collective is not a registered publicly listed company?

>People need a motivation - from their work or from outside.

Also feedback. And that was my feedback, candid as it may be.

Like I said below, being FOSS work doesn't mean something should automatically be excluded from criticism, especially if that FOSS work is big and influential enough to have an impact on other products and on people's work and lives, like Gnome being the default on most free but also commercial distros.

So please let's stop attacking and gaslighting critics, as if GNOME were some teenager developing his first app for free in his bedroom, who should only be defended and praised for encouragement. It is a giant mammoth project with private and public funding[1], developed by vested professionals of the software industry who know very well what they're doing. They should be criticized when they do something wrong, or are being needlessly petty and stubborn with their decisions and arguments, just as they are praised when they do something right; there's already enough of that in this topic if you read it.

[1] https://www.omgubuntu.co.uk/2023/11/gnome-sovereign-tech-fun...

eptcyka
1 replies
5h35m

It can be criticized. But that criticism won't necessarily help anyone get what they want, least of all you.

xcdzvyn
0 replies
1h35m

I don't use GNOME :)

ho_schi
0 replies
5h41m

Constructive criticism is helpful. And yes, some people get it wrong or are stubborn. People don't always act perfectly, despite good intentions. I often struggle to find the right tone and to interpret it properly.

And I'm happy about Linux, GCC, Vim, Coreutils and Gtk in general. GNOME's keyboard-centric usage is wonderful.

That said, I'm usually sad about public long-term forks, because they are often not about learning or improvement but about people failing to communicate. Yes, sometimes their targets just differ.

Any good news? Maybe Mutter will support VRR itself. And maybe battery thresholds in a user-friendly way. Woohooo!

Now we just need to convince the right people to add type-ahead find back to Nautilus and background transparency officially back to gnome-terminal. Both "back"; there is existing code for both.

lelanthran
0 replies
2h38m

Gtk and GNOME are not companies.

"Company" doesn't mean "business". It means a bunch of companions. Like the company in `The Fellowship of the Ring`.

carlosjobim
0 replies
1h24m

Or spending money.

People are willing and ready to spend money on quality software. It is a trillion dollar industry. Open source is an endless circle of misery that needs to be ended as soon as possible.

ho_schi
4 replies
6h4m

Both fractional scaling and thumbnails required massive background work. The term "impossible" is therefore incorrect; they required huge changes in the backend which weren't feasible in the short term. The thumbnails are done already, and the new renderers are approaching.

Keep in mind, developers strive for good solutions. Users need good solutions. Companies are often happy with less reliable solutions because they are forced to be fast. And this less reliable stuff tends to stay around for a long time.

maccard
2 replies
5h59m

The trade-off is that other platforms have had this support for a decade. Stuff like this keeps people from switching.

ho_schi
1 replies
2h39m

I think there are other reasons which matter.

    * what comes preinstalled
    * a weird application
    * the end
Regarding features, for users, Linux with GNOME has been delivering features for decades which Windows lacks:

    * Tabs in file-browser
    * fast and modern terminal (ttf-fonts, utf8, OpenGL…)
    * clear concise settings
    * maintained application repositories
    * no „desktop“ which forces me to arrange icons (WTF?)

Other consider Cortana or ChatGPT as „required“. If they think so…

dog436zkj3p7
0 replies
56m

You seem to be very much out of date with the current state of Windows.

>tabs in file browser

Built in for the last 2 years; tabs can be rearranged, moved between explorer windows, etc.

>fast and modern terminal

Windows Terminal (released 2019 and built in by default) is one of the best terminal apps on any platform. Tabs, GPU acceleration, fonts, seamless multi-shell support (including Linux shells, local via WSL or remote).

>maintained application repositories

The winget repository (released 2020 and built in by default) is now the backend of the Windows Store and can maintain externally installed software; just run 'winget upgrade --all' in your shell of choice.

Also, you can now install any linux distro you want with a single click using WSL, with full x11 and Wayland support. If there’s a Linux app you prefer, you can use it seamlessly. You can even run a full desktop. I’d almost go as far as to say that nvidia hardware is supported better through WSL, where latest drivers “just work” and I’ve never experienced things breaking, than on baremetal Linux (although that says more about nvidia than any platform).

kuschku
0 replies
4h38m

I know how much work this is, and I genuinely appreciate that Gtk has finally reached this milestone. It was the last missing puzzle piece.

It's just sad that Gtk used to have this functionality and dropped it when Gtk3 was envisioned, as it meant the ecosystem was stuck waiting for over a decade.

zamadatix
0 replies
3h45m

Wayland has fractional-scale-v1 merged.

jandrese
0 replies
3h37m

I think it was impossible with the old rendering engine, hence the excitement in this blog post.

oever
1 replies
5h13m

From the blog post:

    If your  1200 × 800 window is set to be scaled to 125 %, with the unified renderers, we will use a framebuffer of size 1500 × 1000 for it, instead of letting the compositor downscale a 2400 × 1600 image.
The explanation is a bit confusing to me. I think they are saying that a window that should be drawn as 1500 × 1000 pixels on the screen because of 125 % scaling measures 1200 × 800 in application pixels. Since OpenGL and Vulkan use floats to render, they might as well render directly into a buffer that can be shown 1:1 on the screen by transforming the coordinates in the drawing instructions.

If that's what they are doing, that sounds like sanity finally.
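In numbers, the two strategies compare like this (a quick sketch; the exact rounding behavior is my assumption, not something the blog post specifies):

```python
# Sketch of the two scaling strategies for a 1200 x 800 window at 125%.
logical_w, logical_h = 1200, 800
scale = 1.25

# Old approach: render at the next integer scale (2x) and let the
# compositor downscale the oversized result.
old_fb = (int(logical_w * 2), int(logical_h * 2))

# Unified renderers: render directly at the fractional scale,
# producing a buffer the compositor can show 1:1.
new_fb = (int(logical_w * scale), int(logical_h * scale))

print("old:", old_fb)  # (2400, 1600)
print("new:", new_fb)  # (1500, 1000)
```

The new path pushes less than half the pixels of the old one for this window, and skips the compositor's downscale filter entirely.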

aulin
0 replies
4h11m

I believe he's saying exactly that: they'll render at the target size instead of rendering at 2x and scaling it down.

wg0
24 replies
10h32m

A long time ago, I think 2010ish, there was an experimental HTML renderer that would open up a GTK app in a browser with its UI rendered as plain HTML+CSS.

For the time, it was just jaw dropping. For context, I think it was before Atom, VS Code or Electron (or possibly even NodeJS?) was a thing.

Don't know if that HTML renderer is still around or not.

troupo
5 replies
10h20m

I think qbittorrent still uses that for its web ui

steve_rambo
4 replies
9h37m

Not really. qBittorrent is built on Qt (thus the prefix), and has a hand-rolled webui in pure HTML + CSS + JS (with a couple of helper libraries, but no heavy frameworks):

https://github.com/qbittorrent/qBittorrent/tree/master/src/w...

EasyMark
2 replies
5h44m

and it's actually quite responsive

BlueTemplar
1 replies
4h14m

As Qt usually is.

seba_dos1
0 replies
19m

...which is true, but completely irrelevant to the above-mentioned Web UI.

troupo
0 replies
9h28m

Kudos to them then. It's really well done

arghwhat
5 replies
10h28m
wg0
4 replies
10h7m

Impressive. Web was very primitive back then compared to how it is today. I think there was no flex, no grids, no ESM modules even.

It might not sound impressive in 2024 but back then, it was a huge deal. Very daring to even attempt something like that IMHO.

It was a time of YUI, jQuery, ExtJS, script.aculo.us and such.

toyg
2 replies
9h45m

QML pre-dates it by a year or so, it's basically in the same vein. Everyone wanted to co-opt web-devs for desktop development at that point.

actionfromafar
1 replies
9h23m

Haha, they got quite a bit more than they bargained for. Hello Electron.

toyg
0 replies
4h57m

Lol yeah. In a way, most people kinda got where the industry was moving to, but desktop toolkits moved too slowly and effectively got left in the dust.

deaddodo
0 replies
6h26m

It literally just rendered to a Canvas framebuffer. It's really not that crazy.

I mean cool, yes....but it wasn't generating DOM elements/complex HTML or anything like you're trying to sell it as.

ho_schi
3 replies
10h26m

You mean Broadway?

https://docs.gtk.org/gtk4/broadway.html https://www.phoronix.com/news/GTK4-Broadway-Being-Used

I did not consider it a main/official backend, but it is still there and was ported to Gtk4.

wg0
2 replies
10h11m

Oh yes, that's it. Thanks for posting. I think it's so beautiful. It is one of those things that have pure artisan value. Of craftsmanship. Whether it is used widely or not, I wish for this backend to stay around.

Back then, I did run Open Office in Firefox and it amazed me.

mananaysiempre
1 replies
6h13m

If nothing’s changed, the Gtk 4 design tool Cambalache[1] by the former maintainer of the now-defunct Glade project uses[2] Broadway to render its design view, because it was there and an embedded Wayland compositor widget wasn’t. Broadway is awesome, but this just makes me sad.

[1] https://gitlab.gnome.org/jpu/cambalache

[2] https://blogs.gnome.org/xjuan/2021/05/18/merengue-cambalache...

aktuel
0 replies
34m

I don't understand. What exactly makes you sad?

chrismorgan
3 replies
9h27m

Although it uses more HTML and CSS than some similar things, I don’t think it’s fair to call it “plain HTML + CSS”, because it displays more functional characteristics of the pure canvas approach (where you throw away just about everything the browser gives you and start from scratch). My three primary heuristics for proper behaviour are:

(a) using browser scrolling (the web doesn’t expose the right primitives to make a scroll-event-based reimplementation anything but bad, barely noticeable in some configurations but painfully broken in some of the most common configurations);

(b) using browser text rendering (strongly preferably by DOM nodes, but even canvas plus fillText can be acceptable); and

(c) handling links with real <a> elements (no alternative can provide the same functionality).

Broadway fails all three of these (reimplementing scrolling, rendering text on the server and sending images, and I think actually locking up when you try to click on a link). It also fails my fourth and fifth tests, which are (d) handling text input properly (it looks like it just uses key events, not even a backing <input> or <textarea> or contenteditable, so IME composition fails completely and keyboard navigation will presumably be GTK rather than native); and (e) having a meaningful accessibility tree (preferably backed by normal DOM stuff, but not necessarily; I place this one last despite its importance because it’s likely to be more retrofittable than the others, though it’ll still generally be hard).

I’d count Broadway as suitable for tech demos and for personal use where you know you don’t mind its limitations, but not for any sort of public deployment. It’s basically just an RDP/VNC sort of thing, with a smattering of DOM use. (Also remember that’s how it works—the code is all running on the server.)

KTibow
1 replies
3h53m

no alternative can provide the same functionality

I agree that real links should always be used to represent links, but if you were to emulate links surely it wouldn't be that hard? Just allow middle-clicking and ctrl-clicking and you should have the basic functionality. Is stuff like the ability to drag a problem?

chrismorgan
0 replies
2h45m

You can’t even emulate middle-click or Ctrl+click correctly! The best you can do in-page is open(), which will typically open a new foreground tab; but in typical desktop configurations, middle-click and Ctrl+click will open a new background tab. Users will notice you disrupting their normal workflow of opening a bunch of things at once and then going through them one by one, and get extremely frustrated.

And then beyond that, well. Right click or long press, browser context menu with things like copy, bookmark, open in new tab, open in private window, share… you can’t emulate most of this at all. And hover to see href in the browser’s status bar. And so it goes on.

No, the only acceptable technique is to have the user interact with a real HTML <a>, HTML <area> or SVG <a> element.

kaba0
0 replies
2h20m

I generally agree with you, but I don’t think c) is true. It’s trivial, and most people won’t go the extra mile to make it “native”, but you can properly manage history from JS, can also add accessibility attributes to make it focusable, etc, so any well-implemented DOM elem with a click handler could be just as good as an <a>. Or do you have any other requirements?

Edit: I have replied from an older state of HN, and haven’t yet seen your other comments. Fair enough, didn’t think hard enough of other edge cases, thanks!

no_time
1 replies
10h10m

Calling it an "HTML renderer" is a bit of a stretch. At least in GTK3 it just streams pixel data into a canvas element, effectively being the same as VNC with a web viewer.

https://imgur.com/a/2EDZ2Ti

wg0
0 replies
5h50m

Ok I'm not so sure if canvas was around back then so I might have misremembered it. But I did see div upon div with inline styles.

Maybe later they moved to canvas. Or like I said, I might be mistaken.

Recently, however, Flutter has had two backends for web: one HTML-based and the other canvas-based.

Some WebGL etc might be in the works.

moondev
0 replies
9h27m

Broadway - a while back I created a little POC for running it inside Docker, which worked really well.

https://github.com/moondev/gtk3-docker

My use case was running a browser inside a browser to easily interact with Kubernetes clusterip services without needing to port forward or proxy.

Another awesome example: running virt-manager to run a vm via the gtk virt-viewer (and interacting with it through the browser)

https://github.com/m-bers/docker-virt-manager

gtirloni
0 replies
4h32m

Reminds me of $something that allowed Delphi desktop applications to run in the browser. I think it was CGI-based and really fragile.

Eduard
17 replies
6h45m

Does anyone have a grasp/understanding of how desktop environments work on Linux? I don't. For me, everything feels to get more and more convoluted and tacked on.

pavlov
14 replies
5h48m

The X Window System was basically the wrong bet on how GUIs and computer hardware would evolve. Its client/server architecture was the opposite of the highly integrated graphics processing model we ended up with.

Instead of cutting its losses early and dumping X11, both the Unix vendors and open source spent far too long trying to make lemonade out of a truckload of rotten lemons. And that’s why Linux GUIs are so far behind.

It’s notable how rapidly Apple was able to evolve their Unix GUI because they were not tied to X11 and instead embraced the integrated model, even designing their own GPUs nowadays.

PaulDavisThe1st
2 replies
1h55m

This is utter horseshit. As someone who writes cross-platform native (desktop) GUI code, there's nothing about Apple's GUI system that is substantively superior to the one on Linux in any way that relates to X Window being the underlying implementation or not. Apple has also had to continually evolve their own GUI toolkit(s), because they also failed to adequately target certain new expectations that arose as time passed (animation being a good example).

snowpid
1 replies
53m

Can you elaborate the superiority of Apple GUI system?

PaulDavisThe1st
0 replies
32m

I said there wasn't any.

EasyMark
2 replies
5h42m

People keep saying this but I never really had any complaints of kde vs windows vs macos. All 3 are perfectly usable, it's only in the past 8 years or so where windows has gone crazy with the obtuse spying and ads and "MICROSOFT" everywhere. Even that isn't too hard to remove with a download or two.

Klonoar
1 replies
4h55m

They're not talking about usability, they're talking about the technical underpinnings. ;P

jwells89
0 replies
40m

And polish. With X you can sometimes feel the “layers” separating and slip-sliding around a bit, in a way that you don’t with a Wayland DE or macOS. This was especially true in the late 00s, when there was less computing power available to brute-force these issues into being less apparent.

candiddevmike
1 replies
4h43m

Nothing that you said is a real criticism of X in general though, just its client/server model.

pavlov
0 replies
4h22m

But it's so fundamental to the design that it's inescapable. The history of Linux GUI is a series of heroic workarounds to hide the misaligned X underpinnings.

Trying to separate X from the client/server model would be like saying: "I like Unix just fine, except files and processes and the shell and the software tool philosophy, really." — You just don't have much left.

taeric
0 replies
1h15m

Isn't this a bit of a non-sequitur? The desktop environments could have been made coherent regardless of the underlying window system. Indeed, I think you could say that there were many coherent desktop environments in the past. The catch comes from the fact that there were a ton of competing ideas, with no clear reason to want one over the other.

Apple rapidly evolved their GUI because they are trying to support just the one. And, frankly, they are starting to buckle on the idea that they are fully coherent.

charcircuit
0 replies
1h16m

The X Window System was basically the wrong bet on how GUIs and computer hardware would evolve.

I disagree. Other platforms did similar things where clients would command the OS what it should draw and the OS would handle all apps' rendering, which allowed for good performance on primitive hardware. Like other display servers, X did evolve to also pass around framebuffers of what the app drew once hardware became capable. X's problems come from being hard to maintain, being a monolith of unrelated concepts, having poor security, etc.

Zardoz84
0 replies
1h59m

I agree that the client/server approach was the wrong way in the end, but saying that Unix GUIs were far behind the others is a bridge too far. The first time I saw composited desktops using the GPU was on Linux. And the current KDE/Plasma is light years better than the inconsistent, ad-ridden mess that Windows has become.

Paianni
0 replies
1h41m

To be fair, the secondary technologies available to support the preferred model now are far more advanced than what Quartz Extreme (in macOS) and DWM (in Win. Vista) had to use, and there's no longer a need to provide backward compatibility with software rendering.

IE6
0 replies
2h20m

Tangential - I just watched this and found it very interesting: https://www.youtube.com/watch?v=R-N-fgKWYGU

GartzenDeHaes
0 replies
1h3m

Is running events through a domain socket really so different than Windows message ports?

hsbauauvhabzb
1 replies
6h33m

With gnome being the front runner on that one.

I’m curious to see how much of an impact the wayland architecture will be, and if ‘gnome only’ apps become a thing.

NekkoDroid
0 replies
24m

Funny that you say "gnome only", because currently there are "non-gnome only" apps due to a specific protocol not being implemented by GNOME (drm-leasing for VR, at least on main; there is a PR that is specifically marked not-to-merge since they want to handle it via an xdg-portal). They are very wary of exposing anything in a way that could backfire in the long run, be it as a general implementation or as something they would be stuck with.

bjourne
13 replies
6h50m

I don't get why performance degradations are accepted though. I do most of my computing on old hardware and these are features I would turn off if I could and perhaps are not even supported by my gpus.

ahartmetz
3 replies
6h2m

These renderers are not on by default and likely never will be. I have never seen a case where an immediate-mode rendering API translated to retained mode became faster. It is probably possible somehow, but it will require a ton of work and probably changes on the API client side to fix some pathological cases.

mananaysiempre
2 replies
5h47m

I have never seen a case where an immediate-mode rendering API translated to retained mode became faster.

I don’t think I get your point here. Gtk 4 is retained-mode whatever renderer you use; Vulkan and OpenGL are immediate-mode (well, kinda) whatever renderer you use. Whatever problems that forces in the new renderers would be just as present in the old ones, wouldn’t they?

ahartmetz
1 replies
5h24m

Oh, you are right about GTK4! It has switched to a scene graph and retained mode drawing. Somehow, I had never heard of that before. Presumably, it was done to be able to use 3D graphics APIs that all work in retained mode.

Vulkan and OpenGL, though, are in no practical sense immediate mode: You need to draw the whole frame every frame (barring a few exotic extensions for compositors and such), so you need to retain the state of everything in the frame so you can draw it.

epcoa
0 replies
2h15m

Your usage of these terms is completely unconventional and wacky.

You need to draw the whole frame every frame

This isn’t even true at a high level. You can composite buffers. That’s not exotic. That’s a fundamental operation.

But insofar as needing to actually do all your draw calls at once - that is literally what "immediate" in immediate mode means.

Vulkan and OpenGL

They don’t specifically retain state - they’re immediate. That’s what immediate mode means. What something higher up does has nothing to do with it.

What would you even consider immediate mode by your definition?

so you need to retain the state of everything in the frame so you can draw it.

That state can be a procedure and a handful of variables (which in the extreme is all a shader). The point is Vulkan and OpenGL have no say over the nature of that state.
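For what it's worth, the distinction being argued here can be shown in a few lines (toy code, all names invented for illustration): immediate mode means the application re-issues its draw commands every frame, while retained mode means the toolkit keeps a scene description and redraws from it.

```python
# Toy contrast of immediate-mode vs retained-mode drawing.
frame_log = []

def draw_rect(x, y, w, h):
    """Stand-in for a low-level draw call (e.g. a GPU command)."""
    frame_log.append(("rect", x, y, w, h))

# Immediate mode: the app re-describes the frame each time it draws.
def immediate_frame():
    draw_rect(0, 0, 100, 30)   # toolbar
    draw_rect(0, 30, 100, 70)  # content area

# Retained mode: the toolkit keeps a scene graph and replays it.
scene = [("rect", (0, 0, 100, 30)), ("rect", (0, 30, 100, 70))]

def retained_frame():
    for kind, (x, y, w, h) in scene:
        draw_rect(x, y, w, h)

immediate_frame()
retained_frame()
print(len(frame_log))  # both paths issued the same low-level calls
```

Either way the same draw calls reach the GPU; the difference is who owns the description of the frame between draws, which is what makes things like damage tracking and off-thread rendering possible in the retained case.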

ComputerGuru
3 replies
1h24m

This is actually the only case where I tolerate performance regressions: where the previous implementation was actually incorrect, rather than merely old or in need of a rewrite in whatever framework or technology is currently in vogue.

taeric
2 replies
1h6m

Performance is part of the implementation. Therefore, if it regresses, it is silly to claim it was incorrect before. It seems more likely that tradeoffs were made for performance.

I'm ok with performance leaning on advances in the hardware. I'm also ok with performance dropping if you are pushing more pixels. But we had high-resolution displays years ago, so that is a tough hill to defend.

ComputerGuru
1 replies
45m

Performance is a combination of a) required behavior, b) implementation. If an implementation does not correctly meet the requirements, its performance cannot be compared to one that does.

taeric
0 replies
42m

Close. Performance is also part of your "a" there, and you have to adjust the implementation accordingly.

That's why nobody says that old "3d" games like Doom were poorly implemented, even though they were not fully 3d.

ho_schi
1 replies
5h49m

    > No, the new renderers are not faster (yet).
I think "yet" is the important word here.

If you see a noticeable performance decrease:

    > GSK_RENDERER=gl

Please keep in mind that it is unlikely that Microsoft, Apple or Google would discuss these tradeoffs openly. It would probably be „forget the old API, here is another API“ (Microsoft), „enforced from ${WEIRD_NAME}“ (Apple) or „you won’t get that update“ (Google).
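For example, to launch an app with the old renderer (a Python sketch; GSK_RENDERER is GTK's real renderer-selection variable, while the launched program is just illustrative):

```python
import os

# GSK_RENDERER selects GTK 4's renderer at runtime; "gl" is the old
# GL renderer, while "ngl" and "vulkan" are the new unified ones.
env = dict(os.environ, GSK_RENDERER="gl")

# A GTK 4 app started with this environment uses the old GL renderer,
# e.g. (not executed here):
#   subprocess.run(["gtk4-demo"], env=env)
print(env["GSK_RENDERER"])
```

The nice part is that this needs no rebuild: it's a per-process switch, so you can A/B the renderers on the same app.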

BlueTemplar
0 replies
4h2m

Yeah, reminds me of Grinding Gear Games (Tencent-owned for a few years now): if you read the update notes for Path of Exile 1, you can see "performance improvements" mentioned many times, yet, mysteriously, the game today has effective minimum requirements that are orders of magnitude higher (if you multiply the factors together) than on release!

flohofwoe
1 replies
6h39m

In GL it's quite easy to accidentally miss the "fast path" and get surprising slowdowns (and conversely, GL can be surprisingly fast when hitting the fast path). What's disappointing though is that the Vulkan renderer only nearly reaches the same performance as the old GL renderer; this seems to indicate that the problem sits on the caller side of the 3D API.

It would probably have been a good idea to track performance throughout the implementation and iterate on that, instead of aiming for "architectural purity".

Vinnl
0 replies
6h12m

I don't understand half of the post, but claiming that they're doing this for "architectural purity" sounds ungenerous to me. The post lists a couple of tangible benefits, most of which do, I believe, relate to performance:

> Proper color handling (including HDR)

> Path rendering on the GPU

> Possibly including glyph rendering

> Off-the-main-thread rendering

> Performance (on old and less powerful devices)

(Which is not to say that tracking performance isn't a good idea. It's just that "architectural purity" sounds needlessly dismissive to me.)

unleaded
0 replies
5h18m

GNOME dev team (and by extension GTK) probably don't care that much; iirc most of them use expensive MacBooks, so many issues get ignored because it "works on my machine" (e.g. several issues with font rendering that don't affect Retina displays)

enriquto
6 replies
9h41m

I'd love to see an ansi text renderer, to be able to run gtk programs inside my xterm (optionally, with some sixel thrown in...).

captainmuon
3 replies
1h43m

Nowadays most GTK apps look very similar. A sidebar, some actions in the titlebar, a details view. (Same story for Mac apps, 'Modern' Windows apps, mobile apps.)

I wonder if a UX toolkit could be completely declarative and semantic - "I need a master/detail view, a list view with the following fields, some actions, ...". At the high level you don't give any positioning or styling. It would automatically use the appropriate system widgets. Then on top you would add some polish by using a bit of CSS, or maybe an escape hatch to get at the native widgets.

Almost everything that is not a browser, a WYSIWYG editor, or a media viewer would fit that mould. The kicker is that from such a description you could easily generate a TUI.
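Something like this, maybe (entirely hypothetical, just to illustrate the shape of the idea, including the trivial TUI backend):

```python
# A purely semantic UI description: no positions, no pixels, no styling.
# All keys and widget kinds here are invented for illustration.
ui = {
    "kind": "master_detail",
    "master": {"kind": "list", "fields": ["name", "size"]},
    "actions": ["open", "rename", "delete"],
}

def to_tui(desc: dict) -> str:
    """Render the semantic description as a crude text-UI outline."""
    lines = [f"[{desc['kind']}]"]
    master = desc["master"]
    lines.append("  master: " + " | ".join(master["fields"]))
    lines.append("  actions: " + "  ".join(f"<{a}>" for a in desc["actions"]))
    return "\n".join(lines)

print(to_tui(ui))
```

A GTK backend would map the same dict to real widgets; nothing in the description stops you from targeting either one.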

ComputerGuru
2 replies
1h27m

You’re basically talking about the underpinnings of XAML. Terribly underappreciated and completely alien to the FOSS world.

GartzenDeHaes
1 replies
54m

XAML based WPF and Silverlight were really nice and eliminated so much boilerplate and plumbing code. Unfortunately, Microsoft management-marketing didn't want to sell an opinionated framework, among other blunders.

ComputerGuru
0 replies
42m

I've tried to convince the fledgling GUI toolkit world of rust to take a look at XAML. I'm not going to rehash my latest round of comments [0] just yet, but suffice to say that XAML gives application developers and end users such freedom and enables very nice architectural splits that other options do not.

moondev
1 replies
9h8m

broadway (discussed in other comments) + carbonyl is kind of similar https://github.com/fathyb/carbonyl

https://i.imgur.com/pIQ4K7Q.png

enriquto
0 replies
4h28m

But this is not really usable, is it? I was expecting a look "native" to a terminal, i.e. a text-based user interface with the common conventions of the medium.

phkahler
2 replies
5h21m

I wish GTK didn't follow the trend of putting widgets in the titlebar. Some spots can drag the window, some cannot. There's less room for the app and file names. This is not a GTK-specific complaint.

dathinab
1 replies
2h5m

didn't gtk/gnome invent this trend?

mike_ivanov
0 replies
46m

AFAIR it was Chrome with the tabs in the titlebar thing.

HumblyTossed
2 replies
3h52m

Why isn't a lot of this just handled by the GPU?

wmf
1 replies
1h20m

GPUs are pretty low level so it takes a lot of code to render anything.

HumblyTossed
0 replies
54m

Just seems odd that it can't handle, for example, the fractional scaling and anti-aliasing.

Thanks for at least replying, everyone else was just downvoting.

sim7c00
0 replies
9h28m

Looks like so much fun working on this :) Cool stuff. When I read about the anti-aliasing I thought: nice, maybe signed distance fields will work just as nicely for font rendering at arbitrary scales as in game engines (Valve had a nice paper out there on this). There's lots of cool trickery in game renderers' UI code, and things like rendering decals might be nice in GUI code too.
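The core of the SDF trick from that Valve paper fits in a few lines (a toy sketch, with a circle standing in for a glyph): store a signed distance to the shape's edge, then get smooth anti-aliased edges at any scale by smoothstepping around distance zero.

```python
import math

def sdf_circle(x, y, cx, cy, r):
    """Signed distance to a circle: negative inside, positive outside."""
    return math.hypot(x - cx, y - cy) - r

def coverage(dist, smoothing=1.0):
    """Map a signed distance to an anti-aliased alpha in [0, 1]."""
    t = max(0.0, min(1.0, 0.5 - dist / (2.0 * smoothing)))
    return t * t * (3.0 - 2.0 * t)  # smoothstep

# Alpha is ~1 well inside the shape, ~0 well outside, 0.5 on the edge,
# and smooth in between - regardless of how far the shape is scaled.
inside = coverage(sdf_circle(10, 10, 10, 10, 8))   # dist = -8
outside = coverage(sdf_circle(30, 10, 10, 10, 8))  # dist = 12
print(inside, outside)
```

In a shader this is one texture fetch plus a smoothstep per fragment, which is why it scales so cheaply compared to re-rasterizing glyph outlines.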

mrdoob2
0 replies
7h51m

If they used https://wgpu.rs/ they would get DirectX and Metal for free (: