
The effect of CRTs on pixel art

wodenokoto
24 replies
3d14h

When I see pictures and video from the 8- and 16-bit era of Nintendo development, the artists draw their sketches on graph paper and transfer them to workstations with professional monitors.

This doesn't square with the idea that they optimized for cheap TVs with rectangular pixels.

bernawil
5 replies
2d19h

Yup, honestly, this whole take is so tired. It's a factoid pulled out of nowhere.

And the counterexample is easy: the Game Boy launched with an LCD screen, and its games were definitely meant to look blocky. The Pokémon games are perhaps the most influential nowadays in terms of pixel-art style, and they show examples of both: the very blocky style of the overhead view and the more detailed pictures of the Pokémon themselves. They also cover the 8-bit and 16-bit eras via the Game Boy Advance.

PhasmaFelis
2 replies
2d15h

Your counterexample makes me think you've misunderstood the point from the start.

Devs design games to look good on the target platform. Obviously you can achieve gradient effects with a CRT and 16-bit color that are very different from a four-tone LCD. Both things can be true.

Edit: And some of them did that very deliberately and documented it. https://news.ycombinator.com/item?id=41134689

bernawil
1 replies
2d3h

the central point of the piece is this:

blocky pixel art often is a kind of misdirected, anachronistic nostalgia.

and my point was that blocky pixel art is a faithful representation of an important percentage of games of the era. Even though some games of the era were not meant to be "blocky", many indeed were, as the Game Boy proves.

PhasmaFelis
0 replies
2d3h

The author said "often," not "always." The entire point of the article is to explore the details of the phenomenon and show that it's not as clear-cut as people think. What are you disagreeing with?

dmnmnm
1 replies
2d6h

It's a factoid pulled out of nowhere

Nope.

And your Game Boy examples are not counterpoints to the fact that games were designed around a target screen's limitations. They weren't designed around a CRT quirk, but they were designed around their own LCD quirks instead.

The first gameboy's screen was painfully bad, with a lot of ghosting during movement.

It was exploited in various ways in games:

https://nerdlypleasures.blogspot.com/2018/03/compatibility-i...

https://nerdlypleasures.blogspot.com/2019/05/screen-persiste...

There was even a game that exploited the screen of the original Game Boy to achieve a transparency effect, similar in spirit (but not implementation) to how the Sonic devs created transparency on CRTs on the Genesis.

https://youtu.be/MytSySMUwv8?t=2892

In this case, artificial flickering doesn't look like flicker on the original Game Boy screen, because the screen had a lot of ghosting. But it looks terrible on an emulator. You don't get those effects playing the same games on a modern computer monitor.
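
If it helps to see the mechanism: here's a toy model in C of why the same flicker trick reads as a stable blend on a ghosty panel and as harsh flicker on a fast one. The LCD pixel is approximated as an exponential moving average of the value the game drives each frame; the alpha values are illustrative guesses, not measurements of real panels:

    #include <stdio.h>

    /* A "transparent" sprite is driven off,on,off,on... every frame.
     * Model the LCD pixel as an exponential moving average of the
     * driven value: small alpha = heavy ghosting (DMG-like), alpha
     * near 1 = fast modern panel. Alpha values are illustrative. */
    static void settle(double alpha, double *lo, double *hi)
    {
        double p = 0.0;
        for (int f = 0; f < 120; f++) {
            double driven = (f & 1) ? 1.0 : 0.0; /* sprite off/on */
            p += alpha * (driven - p);           /* sluggish response */
            if (f == 118) *lo = p;  /* level right after an "off" frame */
            if (f == 119) *hi = p;  /* level right after an "on" frame */
        }
    }

    int main(void)
    {
        double lo, hi;
        settle(0.25, &lo, &hi);
        printf("ghosty DMG-like panel: %.2f..%.2f\n", lo, hi); /* ~0.43..0.57 */
        settle(0.95, &lo, &hi);
        printf("fast modern panel:     %.2f..%.2f\n", lo, hi); /* ~0.05..0.95 */
        return 0;
    }

The slow panel settles into a narrow band around 50%, which the eye reads as a translucent surface; the fast panel swings nearly full range, which the eye reads as flicker.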

The various Game Boys, and even the Advance, were not as crisp as you remember them to be; their LCDs weren't particularly high-quality stuff. None of the original pixel art of those times was meant to look the way it looks on the high-contrast, high-luminance, high-resolution LCD or OLED screens of today. That's particularly true as soon as movement is involved, as the ghosting was intense. It was a weakness of early LCDs as a whole: even the best computer monitors that came out when LCDs first hit the market looked horrible in motion compared to a CRT or a plasma. So the pixel art era was never about looking at a crisp image.

And many Game Boy games look very, very wrong when run crisply on an emulator. They absolutely weren't meant to look like this. That Batman game showing massive flickering around water looked fine on the original Game Boy hardware.

bernawil
0 replies
2d3h

I never said that CRTs or whatever didn't influence design sometimes. I just wanted to refute the central claim of the piece:

blocky pixel art often is a kind of misdirected, anachronistic nostalgia.

And no, blocky pixel art is a faithful representation of an important percentage of games from the 8-bit to 16-bit generations.

burnte
4 replies
2d20h

Because they weren't. The exact same art techniques used for 320x200 games have been used for centuries in other media. Pixel art was most like tile mosaic, which also uses dithering and has been around for millennia. We designed pixel art that way not because CRTs looked like that, but because we had to pack maximum communication into those pixels. Yes, CRTs had a softening and blurring effect. We didn't really talk about it back then because it was obvious, and it wasn't something we had the ability to turn off or on; it was simply a fact of the technology.

No one designed differently for monochrome monitors with higher resolutions, or for the nascent LCD technologies of the day either. Every single article that says what this article says is literally just inventing a story.
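
For anyone who hasn't seen the mosaic trick spelled out: ordered dithering trades spatial resolution for apparent shades, exactly what artists executed by hand with checkerboard and stipple patterns. A toy sketch in C using a classic 4x4 Bayer matrix (period artists placed these patterns by eye, not by algorithm):

    #include <stdio.h>

    /* Classic 4x4 Bayer threshold matrix. */
    static const int bayer[4][4] = {
        { 0,  8,  2, 10},
        {12,  4, 14,  6},
        { 3, 11,  1,  9},
        {15,  7, 13,  5},
    };

    int main(void)
    {
        /* Reduce a smooth 0..1 gradient to two "colors" ('.' and '#'). */
        for (int y = 0; y < 8; y++) {
            for (int x = 0; x < 64; x++) {
                double shade = x / 63.0;
                double threshold = (bayer[y % 4][x % 4] + 0.5) / 16.0;
                putchar(shade > threshold ? '#' : '.');
            }
            putchar('\n');
        }
        return 0;
    }

Viewed from a distance (or through a CRT's blur), the pattern density reads as a smooth ramp.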

nyanpasu64
1 replies
2d19h

Dithering certainly existed prior to CRTs, but CRTs have a unique property of scanline gaps for vertical sharpening and scanlines/composite video bandwidth for horizontal smoothing. I've found documentation of pixel artists taking CRT television properties into account, even when designing art on higher-resolution (CRT) computer monitors: https://www.tumblr.com/vgdensetsu/179656817318/designing-2d-...

we know that many developers, graphic designers as well as programmers, used that technique, from Kazuko Shibuya (Square) and Akira Yasuda (Capcom) to the developers behind Thunder Force IV (1992), who used many CRTs to take into account the specificities of each kind of screen.

“It’s a technique where, by slightly changing the color of surrounding pixels, to the human eye it looks like the pixels move by around 0.5 pixels,” explains Kazuhiro Tanaka, graphic designer on Metal Slug (1996). His colleague Yasuyuki Oda adds: "Back in the old days, we'd say [to our artists] "add 0.5 of a pixel", and have them draw in the pixels by taking scanlines into account. But with the modern Full HD monitor, the pixels come out so clearly and so perfectly that you can't have that same taste."
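
In its crudest digital form, the half-pixel trick is just a 50/50 blend with a neighbor, which moves the perceived centroid of an edge by half a pixel. The artists did this by eye with palette colors, not arithmetic; this C sketch only shows the principle:

    #include <stdio.h>

    int main(void)
    {
        /* A 3-pixel bright bar, 0..255 grayscale. */
        const int row[8] = {0, 0, 255, 255, 255, 0, 0, 0};
        int shifted[8];

        /* Half value from self, half from the left neighbor:
         * the bar's centroid moves right by ~0.5 pixels. */
        for (int x = 0; x < 8; x++) {
            int left = (x > 0) ? row[x - 1] : 0;
            shifted[x] = (row[x] + left) / 2;
        }
        for (int x = 0; x < 8; x++)
            printf("%4d", shifted[x]);
        putchar('\n');
        return 0;
    }

On a blurry CRT, the softened edges read as a genuine sub-pixel offset rather than as two dim columns.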

nottorp
0 replies
2d6h

I've found documentation of pixel artists taking CRT television properties into account, even when designing art on higher-resolution (CRT) computer monitors

Even if they didn't explicitly take the TV's properties into account, they drew the graphics on CRTs and saw them running on the target machine. And then adjusted them so they looked "better", even if only intuitively.

lomase
0 replies
2d9h

Did you ever play an MS-DOS game in CGA mode? It was not supposed to look like that!

https://www.youtube.com/watch?v=niKblgZupOc

Developers definitely took advantage of everything they could to show more colors on the screen. Remember all those demos?

PhasmaFelis
0 replies
2d15h

You're missing the fact that many if not most 8/16-bit devs did not make their art on graph paper but on PCs comparable to their target platform.

They did that because it was quicker and easier, not out of a specific intention to make better-fitting art, but it had that effect anyway. Take the pics of Dracula and the FF6 Siren from the post: there's no way someone plotted those on graph paper before importing them with no changes.

doctorpangloss
2 replies
2d21h

Yeah but you’re talking about primary source documents from real developers, aka knowledge. Can you somehow turn this into a 5 page long intellectually self edifying blog post about your personal brilliance?

throwup238
1 replies
2d20h

How self-edifying would you prefer, on a scale of 1 to Wolfram?

a1369209993
0 replies
2d18h

You're thinking of "self-aggrandizing" (and I think doctorpangloss is too, although I'm less certain since they didn't mention Wolfram).

xcv123
1 replies
2d20h

It does square with it. The analogy is music production: professional studios use high-end, expensive studio monitors with flat frequency response. That's a reference point. From this reference point they can optimize the mix to sound good on other speakers that don't accurately reproduce the frequency range.

ssl-3
0 replies
2d18h

Indeed. Many forms of art happen via an iterative process through many forms to reach their finished state, and things like drawing/viewing pixel art with graph paper or writing out/playing a musical score on paper can be cromulent parts of the journey that the art takes.

Just because someone drafted some pixel art on paper doesn't mean that this must be the final form, and that no revision may ever be done.

And with recorded music in particular: Yeah, a good studio usually has a variety of monitors. Some might be exquisitely flat and unnervingly detailed, some might be capable of effortlessly reaching big-stage PA levels, and some might resemble common consumer gear more than anything else.

If it works well with all of these, then: It might be considered a good mix.

The Yamaha NS-10 is/was a popular speaker in studios not because it was good, but because it represented a common medium-size bookshelf-ish speaker that people might actually use at home where they listened to music, and it was consistent betwixt different studio spaces.

vardump
1 replies
2d20h

Tooling wasn't static, but improved over the years.

The first games might have started as graph-paper sketches and might not have been so CRT-optimized.

Even in the eighties, nothing prevented a Nintendo development team from using Deluxe Paint on an Amiga (or PC) to draw the graphics. (They also had other specialized systems in Japan for pixel graphics.)

ndiddy
0 replies
2d16h

Before the 32-bit era, game development tooling was very nonstandard, and a lot of studios rolled their own. For example, Sunsoft's NES graphics/animation software ran on the Famicom Disk System (Japanese-exclusive NES add-on that let the system run software from proprietary floppy disks) so they were absolutely optimizing for consumer TVs with rectangular pixels. https://www.youtube.com/watch?v=O8PR2EShp70

bongodongobob
0 replies
2d14h

Yeah, they do that. But you're seeing the final form. It was as iterated as source code is. You're seeing the final result that was shaped by the constraints of the medium.

RodgerTheGreat
0 replies
2d21h

In some cases, developers used grid paper with rectangular cells sized to match the aspect ratio of rectangular pixels on the intended display.

PhasmaFelis
0 replies
2d15h

Most 16-bit games (and many 8-bit) were of course drawn in PC paint programs, not graph paper, for the same reasons that programmers stopped writing their code out in pencil before submitting it.

Dwedit
0 replies
2d20h

Some developers used graph paper, other developers drew the graphics using the actual hardware on a TV. Plot the pixels while zoomed in, but see the real thing live on a TV.

To make drawing on actual hardware work, you either used battery-backed memory, or you used a connection to a PC.

6SixTy
0 replies
2d21h

Be careful mentioning pixels next to the word 'CRT': you'll get people parroting that CRTs have no real pixels, when the reality is more complicated than that and condensing it to a one-liner would be disingenuous.

deaddodo
19 replies
2d13h

The latter was never available in the US of A (an apparently poor and technologically backwards nation), which means Americans will simply have to take my word when I say that RGB SCART on a good TV is nearly as sharp as an actual RGB computer monitor - but just nearly.

No, the US didn't have SCART (well, they did, it just wasn't common). Instead, they had VGA (by far superior to SCART for RGB signals) and component (better for native YPbPr/YCbCr, such as most consoles used).

This "SCART is superior" BS is just a weird anachronism from retroheads and europhiles. PS2, GameCube, Xbox, etc all benefit more from component due to 480p support, everything before that generation benefits equally from SCART or Component (it just comes down to what you can get out of your console, SCART tends to be easiest), and everything after benefits best from HDMI (or DisplayPort, if it's supported).

Nursie
14 replies
2d10h

SCART was an absolute pain in the backside.

Technical merit aside, its sockets and cables absolutely defied any attempts to line up and insert by touch.

Want to plug in a new device? You'd damn well better move the tv out because otherwise you're going to spend at least ten minutes with your arm round the back of the tv, fruitlessly wiggling the plug around the general area of the socket, and then you're going to pull the tv out and look anyway. Even if you were exactly on target with the cable, you weren't getting it in without sub-millimetre precision and there was no real way to tell when to apply the force.

They were such a bad design.

pezezin
10 replies
2d6h

SCART is the worst connector ever designed, that is true. Not only is it difficult to plug in, it is also super flimsy and will unplug if you glance at it wrong.

But on the other hand it provided the best possible picture quality (on consumer TVs) until the arrival of HDMI. Compared to composite video, it was marvellous.

deaddodo
9 replies
2d2h

But on the other hand it provided the best possible picture quality (on consumer TVs) until the arrival of HDMI

Consumer televisions had component for a decade+ before HDMI.

rasz
8 replies
1d19h

Oh wow, a whole decade! Meanwhile, SCART was standardized in the seventies.

deaddodo
7 replies
1d17h

That explains why it was inferior to the specialized standards in every one of the plethora of fields it attempted to handle with mediocrity.

rasz
6 replies
1d15h

Tell me more about inferiority of pure RGB goodness :)

deaddodo
5 replies
1d1h

I don't need to, the tech specs speak for themselves. Maybe try reading/comparing them.

This isn't reddit, take your try-hard trolling elsewhere.

rasz
4 replies
21h47m

But it's you who is trolling. The CRT gun is driven with RGB signals; every TV has to convert a CVBS/S-Video/component input to RGB anyway. Allowing raw RGB input simply skips unnecessary conversions.
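
Concretely, this is the kind of matrix a component (YPbPr) input still has to apply before the guns can be driven; a sketch with the standard BT.601 coefficients (real sets did this in analog circuitry, not code). An RGB input, like SCART's pins 15/11/7, skips the step entirely:

    #include <stdio.h>

    /* BT.601 YPbPr -> RGB. Y is 0..1, Pb/Pr are -0.5..0.5. */
    static void ypbpr_to_rgb(double y, double pb, double pr,
                             double *r, double *g, double *b)
    {
        *r = y + 1.402 * pr;
        *g = y - 0.344136 * pb - 0.714136 * pr;
        *b = y + 1.772 * pb;
    }

    int main(void)
    {
        double r, g, b;
        ypbpr_to_rgb(0.5, -0.1, 0.3, &r, &g, &b);
        printf("R=%.3f G=%.3f B=%.3f\n", r, g, b);
        return 0;
    }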

deaddodo
2 replies
15h55m

That has zero effect on how that data gets to said gun. Doing so at a maximum of 15 kHz/480i is intrinsically lower fidelity than at up to 31 kHz/480p.

In other words: you can send me literal ASCII characters at 1 ch/s; 60 ch/s via telegraphy would still be superior despite the necessary conversion.

So, congrats on a hobbled standard that got ten extra years of non-use and that you had zero hand in developing.

rasz
0 replies
11h47m

Why so angry? Thanks to the French, everyone and their mother in Europe had a TV with RGB (not to mention S-Video) input, for that perfect, non-pixel-crawly, always-the-same-color picture from VCRs, cable/sat set-top boxes, fourth-generation-and-up game consoles, and finally 16-bit home computers.

Meanwhile, in the US, video inputs were somehow used for market segmentation, with consumers forced to use RF modulators :o and later CVBS, with only a brief couple of years of component input availability on HDTVs around 2000.

pezezin
0 replies
8h12m

SCART could carry 480p, but few devices supported it.

In any case, this was only an issue in the 6th generation of consoles (Dreamcast, PS2, Gamecube, Xbox). Before that era, everything was 15 kHz, and after that, everything started to include HDMI. The 6th generation was caught in between.

pezezin
0 replies
17h51m

You are so right. There is a reason why PC and arcade monitors work in RGB.

But in addition to that, there is no cross-talk, no dot crawl, and all the other weird artifacts of composite video. Also, no chroma modulation means that it was much easier for TVs to support both 50 and 60 Hz. I only ever had a GameCube, but being able to play in RGB at 60 Hz was amazing.

At least in Europe, many old consoles* provided native RGBS output, and with the proper SCART cable the video quality was awesome. Nowadays, people in the retro gaming community go crazy with PVM/BVM and other fancy professional monitors with RGB input, but the French already had it figured out in the '70s!

* anything from Sega, Nintendo SNES~WiiU, the NeoGeo, Sony PSX~PS3.

nottorp
2 replies
2d7h

Frankly I don't see how HDMI is much better. You need to rotate it 180 degrees at least thrice, just like USB-A.

Nursie
1 replies
2d5h

I wouldn’t say it’s great, but both of those are minor league compared to SCART.

nottorp
0 replies
2d4h

I remember it took a lot of force to insert. Not sure if I always moved the TV.

Rinzler89
3 replies
2d12h

>Instead, they had VGA (by far superior to SCART for RGB signals) and component (better for native YPbPr/YCbCr, such as most consoles used).

A couple of issues with your ranty statement, from a European:

1) The vast majority of CRT TVs (not monitors) didn't have VGA input, nor component; the cheaper ones at least. The more premium, higher-end CRT TVs did have component, but almost nobody had those where I grew up in the '90s and early 2000s.

2) Even in the EU, SCART, while present on most devices, was rarely used. By far the most-used video connector on TVs here and worldwide was the famous yellow RCA composite connector, which was crap; SCART was definitely way better than that.

But Americans shouldn't feel bad they missed out on SCART, since they mostly used the S-Video alternative, which was relatively similar to SCART in quality.

roelschroeven
0 replies
2d9h

Your experience with SCART doesn't match mine: in my experience before the arrival of HDMI, SCART was used almost exclusively for connecting VHS recorders and DVD players.

jeroenhd
0 replies
2d10h

Even in the EU, SCART, while present on most devices, was rarely used

Funny, that's not what I remember at all. Anything I ever saw plugged into a TV as a kid used a SCART connector, in some cases through an RCA-to-SCART adapter (like video cameras and such). All VHS and DVD players I've seen hooked up definitely output over SCART, although they may very well have used the composite pins that are present within the SCART connector.

I've never seen a CRT with a VGA input, just monitors.

deaddodo
0 replies
2d2h

The vast majority of CRT TVs (not monitors) didn't have VGA input, nor component; the cheaper ones at least.

Correct, because most devices going into a television didn't operate on RGB (outside of game consoles), thus the preference for component. VGA input was certainly available on TVs, but exceedingly rare. And the rare people who had it and a Dreamcast got the absolute best quality.

The more premium, higher-end CRT TVs did have component, but almost nobody had those where I grew up in the '90s and early 2000s.

I'm assuming you grew up in Europe, which is kind of the whole point, isn't it? Component was widely available on non-budget television brands/lines in the US. You could grab a component cable for anything from PS2/GameCube/Xbox on, and S-Video for the majority before that (common with PSX and Saturn).

Obviously the majority of Americans ran it through RF or composite, since that was just simpler and cheaper. But if you wanted high quality configurations, the options were all there for you outside of "superior" SCART.

paol
13 replies
2d20h

An article to the same effect, but with more example images: https://wackoid.com/game/10-pictures-that-show-why-crt-tvs-a...

There are several instances where you can clearly see the artists not merely "living with" the CRT rendering but actually using it to their advantage: the eyes resulting from a single red pixel in the Castlevania picture, the smooth gradients in the Streets of Rage picture arising out of just 3 or 4 colors...

ASalazarMX
8 replies
2d15h

Having lived through that era, I believe people these days romanticize the limitations of CRTs too much. Simply adding a scanline/CRT-grid filter isn't even how consumer-grade CRT TVs looked: distortion, blurriness, color bleeding, static, cheap-selector interference... it was fine when you looked at your TV from several feet away, but even arcades preferred to use high-quality CRTs instead of exploiting their side effects, because you would look at them up close, like we use monitors now.

Right now is a great time to appreciate the exquisiteness involved in pixel art, its perfect pixels and perfect colors, scaled to high resolution with sharpness. I agree some of the genius of working within CRT limitations is lost, but the overwhelming majority of it became a pleasure to see.

Now, vector arcade games, those really are a sin to play on anything but a vector CRT :P

nottorp
2 replies
2d8h

I have played a lot of 320x200x256 games on high-end PC CRT monitors in my life, with no distortion, blurriness, etc.

Trust me, they didn't look like the emulators show them.

actionfromafar
1 replies
2d7h

PC monitors with RGB were a different ballgame; at that low resolution there was no distortion or blurriness, as you say. Even PCs had games targeting composite video: https://youtu.be/niKblgZupOc?t=504

TVs though, especially on RF, were pretty darn blurry.

nottorp
0 replies
2d6h

Oh, maybe I stopped before making my point. Even those high-end CRT monitors still had free horizontal anti-aliasing and pixel characteristics that made the image be perceived differently from how it looks on an LCD.

zozbot234
1 replies
2d8h

even arcades preferred to use high-quality CRTs instead of exploiting their side effects, because you would look at them up close, like we use monitors now.

But even high-quality CRTs had Gaussian blur. In no case on a CRT would a pixel be represented as a tiny square, which is what you see in a lot of modern "retro" art. The "pixel" of a native CRT display is always a smoothly blurred point sample.
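
A toy sketch of the difference, modeling each sample as a Gaussian spot rather than a square (the sigma is an illustrative guess; real spot profiles vary by tube; compile with -lm):

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        const double samples[5] = {0.0, 1.0, 1.0, 0.0, 1.0}; /* source pixels */
        const double sigma = 0.45; /* beam spot size in pixels; a guess */

        /* Render one scanline at 8x supersampling: each source pixel
         * deposits a Gaussian of intensity around its center, and the
         * overlapping tails produce a smooth profile. */
        for (int i = 0; i < 5 * 8; i++) {
            double x = i / 8.0;
            double intensity = 0.0;
            for (int p = 0; p < 5; p++) {
                double d = x - (p + 0.5);
                intensity += samples[p] * exp(-d * d / (2 * sigma * sigma));
            }
            int bar = (int)(intensity * 40);
            if (bar > 60) bar = 60;
            printf("%4.2f |%.*s\n", x, bar,
                   "************************************************************");
        }
        return 0;
    }

The ASCII plot shows the two adjacent lit pixels fusing into one wide lobe while the isolated one stays a round bump: no squares anywhere.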

dmnmnm
0 replies
2d6h

The "pixel" of native CRT displays is always a smoothly blurred point sample.

And that's why CRTs were capable of displaying different resolutions in a way that was appealing.

When I bought my first LCD monitor for my computer I was hit with instant regret. The early LCDs were:

Horrible at anything other than native resolution. If your computer hardware wasn't good enough to run a new game at the highest resolution of your monitor, the game would look terrible. The scaling used to bring lower resolution pixels to fit the screen's native pixels was just terrible, terrible and terrible. I have no words that can describe just how bad the experience of playing games at a lower resolution than native was in that era.

CRT blurriness smoothed things out in a way that didn't actually look excessively blurry. LCDs running lower-resolution games looked like someone had smeared vaseline on them.

And they were also terrible in motion. Ghosting! Ghosting! Ghosting! Playing games like Quake 3 and Unreal Tournament didn't feel good on that stuff.

jeroenhd
1 replies
2d10h

Simply adding a scanline/CRT-grid filter isn't even how consumer-grade CRT TVs looked: distortion, blurriness, color bleeding, static, cheap-selector interference...

This is why I find many "scanline filters" so jarring. My CRT experiences may have come from the last decade of CRTs, but I've never actually seen the dark, black grid hovering above the PS2 graphics from my youth. Yet, when I enable CRT emulation in many emulators, it looks like someone put a plastic grate in front of my screen.

The scanlines were there, but they didn't have the contrast a black overlay has on a modern flat screen.

I have seen some (GPU-intensive) filters that do mimic CRTs more accurately, but to display them well you need a 4K monitor or better, simply because the effects applied to each individual pixel involve multiple output pixels of blend/distortion, often in a non-linear fashion to mimic the bulging screen. ShaderGlass was the first tool that actually let me see some old games as I remembered them (after a bit of tweaking).

I think most people agree that the clarity of modern displays is a huge benefit, but when playing older games, a lot of nuance is lost. When (much younger) developers start emulating older pixel-art styles without ever having seen the blurry, distorted CRT graphics, you get this weird anachronistic super-crisp pixel art that's supposed to be retro. It still looks good, but it's graphically off even when they copy directly from old sprite sheets.

nyanpasu64
0 replies
1d20h

I think one factor is that the PS2 generally outputs 480i signals with scanlines shifting vertically every frame to fill gaps (so you don't see them unless you're tracking a vertically moving object), while many CRT shaders default to simulating a progressive signal with fixed scanline gaps.
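
A trivial illustration of the difference (this is the general 480i scheme, not PS2-specific timing):

    #include <stdio.h>

    /* 480i: each 1/60s field draws every other scanline, and the two
     * fields are offset by one line, so the "gaps" swap position every
     * field. A shader with fixed gaps is simulating 240p instead. */
    int main(void)
    {
        for (int field = 0; field < 2; field++) {
            printf("field %d (%s lines):\n", field, field ? "odd" : "even");
            for (int line = 0; line < 8; line++)
                printf("  line %d: %s\n", line,
                       (line % 2 == field) ? "drawn" : "dark gap");
        }
        return 0;
    }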

guenthert
0 replies
2d7h

Having lived through that era, I believe people these days romanticize the limitations of CRTs too much.

Yes, I do remember the colourful, blinky, not to say glamorous, 'white' of a VIC-20 or C64 connected via HF modulator to a TV. Still, coming from 'no computer, never encountered one' to such a device, it's difficult not to romanticize that time now. We were also at a very impressionable age.

pezezin
0 replies
2d6h

The transparency effects are NOT a byproduct of CRTs, but of the horrible artifacts of composite video.

How do I know? Because just yesterday I was playing with a Mega Drive connected to a Trinitron PC monitor through a line doubler. It looked gorgeous, but there were no transparencies.

deaddodo
0 replies
2d12h

Water, blends, and dither transparency were always the effects whose loss I felt most diminished retro games on LCDs (or even with bad CRT filters).

If you play those games today on the Switch and the like, the experience just isn't quite the same.

titusjohnson
7 replies
4d

One piece of hardware I periodically search the internet for, is a CRT emulation box. I want a piece of physical hardware I can place between my 4-port RCA switcher, and an HDMI connected modern TV, that emulates the display properties of a CRT.

I know that a real CRT would be the best for my nostalgia vibes, but honestly they take up so much space. This theoretical product doesn't need to be perfect, just give me a few knobs to turn to adjust whatever settings are available, and inject vertical bars to fix the aspect ratio. I'd pay good money for such a device.

scheeseman486
6 replies
3d23h

https://www.retrotink.com/product-page/retrotink-4k

It does all of what you ask, plus a lot more. CRT filters that emulate CRT shadow masks from different manufacturers, HDR to boost the max brightness to emulate the phosphor glow, VRR to allow for arbitrary refresh rates (useful for arcade superguns).

titusjohnson
5 replies
3d22h

Oh excellent, that's exactly what I'm looking for! Not quite impulse-buy territory but now I know what it'll take.

nyanpasu64
2 replies
3d20h

As I understand it, the OSSC Pro is significantly cheaper for most of the feature set, while the RT5x and the original OSSC provide varying feature sets (the OSSC only has a line buffer, not a framebuffer, which limits the resolutions it can output).

scheeseman486
1 replies
3d19h

You're not wrong that the OSSC Pro has most of the same feature set, but that deeply undersells the headline feature of the RT4K, particularly given the OP wants something that properly emulates the visual characteristics of a CRT. For that, it's impossible to beat the RetroTINK 4K. Matched with a 4K OLED, its CRT shaders are uncannily real looking.

snvzz
0 replies
3d14h

Not him, but the main feature of the OSSC and OSSC Pro is that they are proper open-source hardware.

I own an OSSC and have enjoyed its flexibility, and the improvements it got over time and continues to get.

Getting an OSSC Pro is on my TODO list.

unwind
1 replies
2d22h

Just to save folks a whole click or two: it's $750, which does seem ... steep, but I'm not into retro computing (yet, a part of my brain feels like adding). I've heard it mentioned on YouTube ("The 8-Bit Guy", I think) but never looked it up. Thanks, GP.

6SixTy
0 replies
2d20h

Here's my 2 cents: no one needs that kind of stuff. Retro can be free, and the only reason to break out your wallet is if you really care that much about things beyond the game itself.

Dwedit
6 replies
4d11h

Let's not discount the years since 1997 when people actually did play retro games on high resolution computer monitors, with big chunky pixels.

blargey
3 replies
4d

Notably, pixel art has been developed for and played on LCD screens since 1989, starting with the Game Boy.

Including the entire Pokémon series until its move to 3D, as well as many ports and continuations of older “pixel art” series originating on the NES/SNES.

The blockiness wasn’t exaggerated, of course, but there wasn’t any CRT magic obfuscating it either, so that would change later generations’ experience of the medium.

scheeseman486
2 replies
3d23h

For the Game Boy, Game Gear, GBC, and GBA in particular, there is pixel art and there are graphical effects that were often designed with the characteristics of those handhelds' LCDs in mind. The slow pixel response is used to fake more shades of grey/colour, and fonts in games for the GBC/GBA often lean heavily on the subpixel arrangement, which ends up making them look much worse if you simply scale the pixels up. As a result, most modern emulators have features that emulate the LCD panels as well.
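
To make the subpixel part concrete: an LCD pixel is three vertical R, G, B stripes, so an edge can be positioned with 3x horizontal precision by lighting only some channels, at the cost of color fringes that become glaring once the pixels are scaled up. A toy C sketch assuming a left-to-right RGB stripe order (real panels vary):

    #include <stdio.h>

    int main(void)
    {
        /* Edge of a dark glyph at 3x horizontal resolution (1 = ink). */
        const int fine[12] = {0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1};

        /* Pack each group of three stripes into one pixel's RGB.
         * A lit subpixel means "no ink in that third of the pixel". */
        for (int px = 0; px < 4; px++) {
            int r = fine[px * 3 + 0] ? 0 : 255;
            int g = fine[px * 3 + 1] ? 0 : 255;
            int b = fine[px * 3 + 2] ? 0 : 255;
            printf("pixel %d: R=%3d G=%3d B=%3d\n", px, r, g, b);
        }
        return 0;
    }

Pixel 1 comes out as (255, 0, 0): at native size on the intended panel it reads as an edge sitting a third of the way into the pixel; blown up, or on a different subpixel layout, it reads as a red fringe.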

dmnmnm
1 replies
2d6h

The original Game Boy had so much ghosting that you could use it to achieve effects like the transparency of water:

https://youtu.be/MytSySMUwv8?t=2892

It looks absolutely godawful on an emulator, since without the ghosting what remains is a high amount of flickering.

People taking the Game Boy as an example of crisp pixels have either:

1/ never had a real Game Boy in their hands, or

2/ put their nostalgia goggles on, hard.

The original Game Boy LCDs were nothing like what people have today. They had so many limitations and quirks, and all of them were used in game development, or at the very least taken into account while designing the games, so that things like animations would look decent. Pixels could never look crisp in motion on an LCD of that era. Not even on the early PC monitors.

When people are comparing CRTs vs LCDs, they're thinking of today's LCDs which show a very sharp, high resolution, high contrast image. That's definitely not what a GB had.

Dwedit
0 replies
1d23h

On the GBA and GBA SP screens, pure on/off flickering every frame does not perfectly fake transparency, but it gets very close. On the AGS-101 or NDS Lite screen, it looks a lot worse.

The NDS lite screen does have sharp crisp pixels.

pieix
1 replies
2d20h

Agree completely. For me, the author missed the mark with:

modern, blocky pixel art often is a kind of misdirected, anachronistic nostalgia.

All of _my_ pixel art nostalgia comes from the 240x160px LCD Game Boy Advance and crystal clear blocky 16x16 sprites.

Tade0
0 replies
2d7h

I recall being able to start an old game in our school computer lab back in 2003 or so. Its default resolution was 320x200, while the screens there were set to 1024x768.

lotusZZZ
3 replies
2d18h

It’s not just 8bit and 16bit games though. Everything up to the GameCube can benefit from a CRT. Resident Evil 4 looks insanely better on a CRT to the point that if you are not playing on a CRT, you are playing it wrong.

FMecha
2 replies
2d11h

There's also the issue of latency, which is why Smash Bros. Brawl tournaments were primarily played on CRT TVs.

nyanpasu64
1 replies
2d10h

I find that LCD monitors can display incoming image data on the panel as it's being received (based on slow-motion video captures of an LCD next to a CRT, both fed from a VGA distribution amp), but HDTVs generally add one or more frames of latency from image processing (for the TVs I've used, even in game mode). It's possible speedrunners still like CRT TVs because they don't suffer from pixel transition time (possibly lower on gaming and especially OLED monitors), don't require (often laggy) transcoding/scaling of 480i/p video to VGA or HDMI, or because they never carry the off-chance that a particular monitor has image-processing lag.

Also the game famous for competitive CRT tournaments is Melee rather than Brawl.

FMecha
0 replies
2d8h

I stand corrected, having noticed it only after the editing period ended.

bitwize
3 replies
4d11h

CRTs just have a richer, warmer picture.

naikrovek
2 replies
2d20h

The “warmth” is X-rays.

wiz21c
1 replies
2d10h

Really? I mean, do we actually see the X-rays? (Genuine question.)

qayxc
0 replies
1d21h

Kind of, yes. X-rays do interact with the visual system and cause perceivable effects, especially with dark-adapted eyes. It's not like seeing actual colours, though; it's more of an added effect.

Here's a fascinating article about the history of x-ray perception in humans and animals: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9426237/

UV and EMF radiation are further causes of these "warmth" effects. UV in particular is pretty bad and yes, you could actually get a tan from sitting in front of a (cheap) colour CRT monitor all day :)

latexr
2 replies
2d5h

This argument about CRT purism and nostalgia is getting old. It presumes to know why people like something based on one limited idea.

My favourite consoles were always the various Game Boys, which used LCDs, not CRTs¹. Furthermore, I enjoy modern pixel art² more than I ever liked it in old games (which were new when I played them). Even comparing the examples of old games, I still prefer the non-CRT look, because I enjoy the clarity of how the sprite is made; it's easy to learn from.

So that’s your arguments out the window. Neither CRTs nor nostalgia shaped my appreciation for the medium, and I’m not alone. Pixel art is an interesting aesthetic, and it’s great to see games both adhering to its limitations and “cheating” to do effects that were impossible at the time.

¹ https://www.nintendo.com/en-gb/Support/Game-Boy-Pocket-Color...

² https://cross-code.com

IggleSniggle
1 replies
2d5h

I read the article but assumed a different intent, and didn't read any "argument" into it whatsoever. Rather, the author was sharing some neat things about stuff they liked, clearly delineated from the opinions present throughout. I didn't get the impression the author was arguing about what you should and should not like, just sharing their own passion.

latexr
0 replies
2d5h

and didn't read any "argument" into it whatsoever.

The very first sentence in the article is:

There's a recurring argument (…)

Which the author then explains and links to a TikTok video which expands on it. The article continues:

This is correct. I also agree that modern, blocky pixel art often is a kind of misdirected, anachronistic nostalgia.

So yes, the article (and video) is arguing for the connection between pixel art and nostalgia, and that what we remember with fondness is something different from what was really there.

vardump
0 replies
2d20h

That's a VERY high-end CRT monitor, not really representing what 99.9% of people had to use.

aidenn0
0 replies
2d22h

Here's a good comparison from roughly what TFA calls the golden age of pixel art, on the SFC. Note particularly the messed-up aspect ratio of the latter. Also look at Guinevere's dress in the opening movie, particularly while she is running; there is no AA used in it, but the CRT gives it a more organic look.

CRT: https://www.youtube.com/watch?v=e9QhG2KU6A4

Not CRT: https://www.youtube.com/watch?v=8V2IG0makoo

Pikamander2
1 replies
2d7h

I also agree that modern, blocky pixel art often is a kind of misdirected, anachronistic nostalgia.

Articles like this (and /r/CRTgaming over on reddit) always seem to forget that devices like the Game Boy existed. I put hundreds of hours into Pokemon Sapphire, so I can pretty safely say that my nostalgia for crisp pixel art isn't "misdirected".

The fact that some old games were designed with CRTs in mind doesn't mean that others weren't, or that one rendering is inherently better than the other.

I like the article's illustration of the variable CRT aspect ratios, though. It's easy to forget that "pixels" weren't always perfect squares like they more or less are today.

grumbel
0 replies
2d

When it comes to CRTs vs. modern displays, there is also a much simpler issue at play: modern screens are twice the size of the CRTs we used to have. If a game looks too pixelated on a modern display, simply try shrinking the picture. That fixes the majority of issues, all without going through the trouble of trying to emulate esoteric CRT artifacts. A SNES game running at 256x224 was never meant to be shown on a 70" TV.
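
Rough numbers to make the point: a 20" 4:3 CRT is about 16" wide, so 256 SNES pixels come out to roughly 16 pixels per inch; a 65" 16:9 panel is about 57" wide, so the same 256 pixels stretch to about 4.5 per inch. Every pixel ends up well over three times as wide, before you even account for sitting closer.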

wizardforhire
0 replies
2d13h

Somewhat related: the resale market for CRTs has skyrocketed. Consider this a public service announcement: if you find a CRT on the side of the road and you want an extra few hundred bucks, you should definitely pick it up.

surfingdino
0 replies
2d11h

A similar lack of understanding of the old medium can be seen among photographers who shoot digital and then apply "film" LUTs. There is more to the film look than just grain.

raldi
0 replies
2d20h

Last year I was playing Super Mario Bros on an NES on a CRT for the first time since my childhood, and was blown away at how cool the shining coins and question-mark blocks were. They look like polished brass in bright sunshine. Somehow this effect is completely lost on a modern display.

pfannkuchen
0 replies
2d18h

It seems like the blur of the CRT allows the mind to fill in details more, like an upscaler built into the human brain. Possibly accidentally triggering a built in compensation layer for vision degradation? Which would have been very common prior to glasses.

Especially with the human forms, the eye sees details that just aren't really there.

nyanpasu64
0 replies
4d10h

I'm guessing the blur in the SMRPG Peach screenshot comes from a combination of an out-of-focus camera shot of phosphors, and possibly an older 2-PPU SNES's slow color transitions due to flawed silicon design. I don't see any composite artifacts but don't know if the image is RGB or S-Video.

When it comes to computer screens, there are always two resolutions involved: the one produced by the computer, such as 320x200, and the dot pitch, or "dots per inch", of the screen proper.

The fun thing about CRTs is that there's always more things affecting resolution, like spot size (PVMs/BVMs and their narrow spots made interlacing more obvious) and amplifier bandwidth (if you run at a high enough hsync rate, you can get horizontal blur because the electron gun changes brightness slower than once per phosphor).

As I understand it, 320x200/240 on a VGA monitor is actually line-doubled, because VGA cards and displays can't take hsync rates below 30 kHz. You can see this in your Duke Nukem 3D screenshot if you watch the source video, switch to 4K resolution, and zoom into the HUD or main screen. Line-doubling affects the appearance of pixel art in complex ways, taking it partway to how an LCD would look with blocky pixels (you've said later in the article it looks "basically as crisp and blocky").
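
The doubling itself is trivial; a sketch with hypothetical byte-per-pixel buffers (not real VGA register programming) just to show each source line being scanned out twice:

    #include <stdio.h>
    #include <string.h>

    static unsigned char src[200][320]; /* 320x200 source frame */
    static unsigned char dst[400][320]; /* 320x400 as actually scanned out */

    int main(void)
    {
        src[100][160] = 255; /* one lit pixel mid-screen */

        /* Each source scanline is emitted twice, lifting 200 lines to
         * 400 so the signal clears VGA's ~31 kHz hsync minimum. */
        for (int y = 0; y < 200; y++) {
            memcpy(dst[2 * y],     src[y], 320);
            memcpy(dst[2 * y + 1], src[y], 320);
        }
        printf("%d %d\n", dst[200][160], dst[201][160]); /* 255 255 */
        return 0;
    }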

Even among VGA monitors, the same resolution can look different; my newer VX720 has sharper focus (a smaller spot size) with visible gaps at 640x480 (and even faint gaps up to 1024x768), while I didn't see any scanline gaps in the Scandisk section of the video, even when I could make out phosphors.

mdp2021
0 replies
2d22h

CRTs enrich their content through their qualities as a medium; the article covers the definition of the content to some degree, but not in depth with regard to the design qualities of the content.

As a remarkable case of the latter, I suggest you consider (after the article set out to oppose Maniac Mansion and Mayhem in Monsterland) Monstro Giganto (C64, 2022, which uses the original character set in the C64 ROM).

https://pirates-of-zanzibar.itch.io/monstro-giganto

manfreed
0 replies
2d7h

I don't mind modern games not having the CRT effect. It certainly does look cool, but I can also enjoy graphics with sharp pixels, if it's good.

I'm more annoyed by games where the developer ignores constraints of pixelated graphics.

Like when objects aren't placed on the same pixel grid, or when they don't have a uniform pixel size.

The worst is when objects are rotated by rotating the sprite directly, meaning you get rotated pixels.

egberts1
0 replies
2d1h

It seems long forgotten that blending colors ONLY at the CRT pixel level was once used as a form of "FOR YOUR EYES ONLY" messaging.

It was a visual sleight-of-hand kind of "steganography" code whose name has eluded me to date.

The code is quite easily recreated today, but the diminishing supply of RGB-based CRT monitors has rendered this mode of stealthy information delivery rather obsolete.

This approach does not use the least significant bit of an RGB element for the hidden data stream; rather, neighboring RGB pixels create a separate color, particularly on a CRT screen, which is used to "draw" a picture or a set of words.

The higher resolution of today's LCD monitors now exceeds our eyes' perceptual resolution (although I could imagine someone getting up super close to an LCD screen to read such color-blended "messages").

It would escape nearly all stego detectors today due to their lack of deep-matrix (RGB mixing) analysis, no?

constantcrying
0 replies
2d5h

I also agree that modern, blocky pixel art often is a kind of misdirected, anachronistic nostalgia.

I think everything here (the author and his sources) misses the point. Modern pixel art is not trying to create a faithful reproduction of how some arcade or home console might have looked.

Pixel art, certainly now, is just an art style like many others. If you need a demonstration, look at actual pixel-art games: they often freely use totally anachronistic methods, such as very complex shaders. Many don't even align different objects to the same pixel grid. Clearly the artists do not want their games to look like faithful recreations.

Criticizing modern pixel art not for its merits or lack thereof, but for not accomplishing a goal it never set out to accomplish, is, to be frank, ridiculous.

arkh
0 replies
2d11h

My main gripe is with sloppy technique. A lot - not all, but a lot - of modern pixel art is purposely made to look overly blocky. Dithering and anti-aliasing just isn't utilized, which makes it look even chunkier than actual old game art does on a modern screen.

Also, a lot of pixel-art games come from people who are coders first and not artists, so they may not know of those techniques, or how to apply them.

_the_inflator
0 replies
2d9h

There is a reason why raster demos were so extremely popular on Commodore 64.

A CRT's resolution was blurry and totally dependent on the user's setup, not the programmer's. I started with a black-and-white TV on my VC 20, and many of my friends had to use the family's old b&w TVs for their computers.

Colors blended a lot, and what really made an impression was color cycling.

Later on, many graphic artists made use of the color blending, or blurring, effect by mixing colors (dithering, so to say). Everything was multicolor, hardly anything lo-res/hi-res.

Later, when sharper monitors specialized for computers became available, a shift towards hi-res appeared; blending was abandoned, just to be revived with the bulky 4x4 FLI modes etc.

Hardware and artistry went hand in hand to impress the audience. That's why some demos from the '80s look awful to some nowadays, but were the latest and coolest thing around in terms of color action on screen.

Remember, action movies were not yet CGI, no fast-paced stuff; that came with the computers.