Switch off bad TV settings

LeoPanthera
110 replies
19h15m

The thing I hate about "advice" like this is that it assumes that everyone likes the same things and feels the same way, and it comes across as an attempt to shame anyone who feels otherwise.

I like motion interpolation. I always have it turned on, and I chose my TV based on the quality of its motion interpolation.

Screens are so big these days that if you're watching a 24fps movie, any panning movement becomes a horrible juddering, shaking mess. Judder judder judder... ugh. I can't stand it.

With motion interpolation turned on, everything is silky smooth, and I can actually see what's on the screen, even when the picture is moving.

So no, I won't be turning it off, and I suggest that the next time you watch a shaky-cam action movie, you try turning it on too!

prometheus76
50 replies
19h7m

It makes everything look like a cheap soap opera to me. I can't stand it. I think this might be either a generational thing or perhaps a cultural thing, or maybe some of both.

llm_nerd
27 replies
18h26m

"It makes everything look like a cheap soap opera to me"

This is nothing more than conditioning. "Quality" TV shows "film" at 24 fps despite the fact that they were going to be viewed at 30/60. They did this because, even though 3:2 pulldown is incontestably a dirty, ugly hack that reduces quality, people were conditioned to think that if something went through that hack, it was quality. If it didn't, it couldn't be quality.

So when people talk about the "soap opera" effect, what they are usually describing is motion that is simply crisp and clear.

The best example of this was The Hobbit when presented at the 48FPS the director intended. People were so conditioned to movies being a blurry mess at 24FPS that a frequent complaint about The Hobbit was that it had the purported "soap opera effect".

lifeformed
17 replies
17h56m

It's not conditioning. Frame rate is a tool in the toolbox, and more isn't always better, just like a colorized black and white film isn't better just because it has added information.

This is most easily apparent in animation, which frequently makes use of variable frame rates to achieve different effects and feelings, often even within elements of the same scene. A character might be animated at 12 fps while the background is at 8 fps. Bumping those all up to 60 would be an entirely different shot with very different vibes.

andersa
10 replies
17h18m

Actually, more is always better. Why else do you think we try to run PC games at 144, or even 240 and 360 fps?

ted_bunny
4 replies
17h2m

What's the point? Can people even tell?

andersa
1 replies
16h51m

Of course they can. If you ever try to go back, it looks like garbage. There is no limit to improving the illusion of smooth motion, but it scales non-linearly (you need to double the frame rate each time to get a noticeable improvement). I personally can't tell whether it is running at 144 or 120 fps, but the larger jumps are very obvious.
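
A rough back-of-the-envelope sketch (my own arithmetic, not from the thread) of why the scaling is non-linear: each doubling of the frame rate halves the frame time, so every step buys a smaller absolute improvement than the last.

    # Frame time halves with each doubling of the frame rate, so each doubling
    # yields a smaller absolute gain: 24->48 saves ~20.8 ms per frame, while
    # 120->240 saves only ~4.2 ms.
    for fps in (24, 48, 60, 120, 240):
        print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")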

Flameancer
0 replies
14h46m

Yeah, 120 vs 144 isn't too big of a difference for me. Until I got a new GPU that could actually output 120fps at 1440p, I left my monitor on 120Hz. When AMD's AFMF drivers came out I went ahead and bumped the refresh to 144Hz, as I was definitely starting to see screen tearing, but it was still pretty fluid. I'd be curious whether the next jump for me will be 4K 144Hz or 1440p 240Hz.

girvo
0 replies
16h45m

Absolutely they can, though I personally found anything above 240Hz to be indistinguishable. The monitor I'm writing this on is 300Hz; its motion clarity is only bested by OLEDs and its $2000 AUD (twice what I paid for this) 360Hz bigger brother, the PG27AQN.

Flameancer
0 replies
14h50m

I've never had a display refresh rate higher than 144, but since I play games I can definitely tell the difference between 30/60 and even 60/120. After 120 the difference between 120/144 starts to diminish, but there's still a slight difference, more pronounced with very fast-moving objects, like moving a mouse.

lifeformed
2 replies
12h4m

Games are different. Interactive media requires responsiveness. Saying that higher framerate is always better in cinema is a pretty bold statement, one that would fail a simple survey among moviegoers. What does "better" even mean, in this context? Fidelity is just one of many axes of expression when creating art. Higher isn't better than lower, or vice versa.

andersa
1 replies
6h11m

For me the higher frame rate is not really about response time, it is all about smooth motion. Panning the camera at 60Hz is just barely acceptable. 120Hz is where it starts looking real nice and I can really see things during the pan. 24Hz pans are unwatchable garbage.

A movie properly authored at 120Hz (not interpolated with potential artifacts) would be objectively better than a 24Hz one in every possible way.

empiricus
0 replies
1h21m

Also, with a higher frame rate I can see subtle patterns in the movement and flow of things (clothes, water, humans). The eye can also pick up more detail and recognize the texture of things, the different kinds of fabrics, all kinds of details like that.

lmm
0 replies
14h16m

PC gamers pushing for higher framerates is mostly about reducing latency, which doesn't matter for a movie.

bee_rider
0 replies
12h54m

Need some way to justify an overpriced graphics card, is what I always assumed.

HPsquared
3 replies
17h29m

Surely 24 fps was made the standard to save money on film.

It's about the slowest frame rate that's not very distracting.

Needless to say, film is no longer the major expense it used to be. In the early days the film itself was a fairly big part of the production cost (and don't forget distribution - cost to replicate the film - and operating cost for the projectors - handling film rolls twice as big, etc.).

And standards are sticky, so the industry stayed on 24fps for decades.

andersa
2 replies
17h17m

It's about the slowest frame rate that's not very distracting.

Except, it is very distracting. Around 60 it starts to become bearable for me and allows camera pans to be interpreted as fluid motion.

llm_nerd
0 replies
16h40m

It's quite astonishing how terrible 24 FPS really is, but as the GP mentioned, it was "good enough" for the time. It's a trade-off: costs and processing were prohibitive enough that it was the point where most scenes, especially dramas and human-interest type things, were served well by the format.

Action scenes are just brutal at 24FPS. Running, pans, and so on. Either it's a blurry mess at a 180 degree shutter, or it turns into a rapid slide show (like early in Saving Private Ryan).

klausa
0 replies
15h28m

This is as much about frame rate as it is about display technology.

24hz on an OLED with very quick pixels is a very different experience than 24hz on a CRT or 24hz on a film projector.

LordDragonfang
1 replies
14h31m

Animation frames are almost totally orthogonal to what we're talking about here. In fact, I'd argue they're the exception that proves the rule. Animation had to develop tricks like spaghettification to look good at 12 fps because it looks inherently worse than higher frame rates. Techniques like smear frames are often used to directly emulate higher frame rates. It's an art form born of limitation, not a neutral tool. Just look at any 3D animated show that just drops the frame rate of mocapped characters to 12fps (Dragon Prince is one that comes to mind) - it looks like jarring, choppy shit.

lifeformed
0 replies
12h9m

Those are the origins, but the result is a style that can look as good as or better than higher framerates, depending on what you're going for. Art is not held back by fidelity - rather, fidelity is part of what defines it, along with other constraints.

hug
5 replies
17h51m

This is an odd take.

Firstly: No one on the mass market actually knows what 3:2 pulldown is, so it's hard for people to see it as an indicator of 'quality' -- most of HN, a technical audience, probably doesn't know what it is either. For reference, it's when a 24 frame per second film is converted to 29.97 frames per second by spreading each group of 4 film frames across 5 interlaced video frames. That and a tiny slowdown (about 0.1%) gets you to 29.97, which is the NTSC frame rate.
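
To make the cadence concrete, here's a minimal sketch (my own illustration of the scheme described above; the frame labels are hypothetical): each group of four film frames is emitted as alternating runs of 3 and 2 interlaced fields, so 4 film frames become 10 fields, i.e. 5 video frames.

    # 3:2 pulldown sketch: four 24 fps film frames (A-D) become ten interlaced
    # fields, i.e. five ~29.97 fps video frames.
    def three_two_pulldown(film_frames):
        cadence = [3, 2, 3, 2]  # fields emitted per film frame
        fields = []
        for frame, n in zip(film_frames, cadence):
            fields.extend([frame] * n)
        return fields

    print(three_two_pulldown(["A", "B", "C", "D"]))
    # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D'] -> 10 fields = 5 frames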

Secondly: Why do people in traditionally PAL regions also hate the soap opera effect? Again, for reference, PAL regions ran at 25 frames per second and so got away with a 4% speedup and a 2:2 pulldown that has no real frame-blurring effect.

Thirdly: Generally, I prefer higher frame rate content: I have a 144Hz monitor, and prefer content as close to that as I can get, but I still hate watching The Hobbit -- a lot of this has to do with motion blur: 48 frame per second content is not fast enough to get away with appearing seamless, and not slow enough that the per-frame motion blur you get with a 24 frame per second 180 degree shutter hides the stuttering.

llm_nerd
4 replies
16h44m

No one on the mass market actually knows what 3:2 pulldown is

People don't have to know what it technically is to know it when they see it, and the simple incantation of "soap opera effect" demonstrates that.

Again, almost all dramas shoot at 24 fps. There is zero technical reason for this (there once was a cost saving / processing reason for it, and then people retconned justifications, which you can see earlier in this very thread). They do this because, again, people are conditioned to correlate that with quality. It's going to take years to break us from that.

I have a 144Hz monitor, and prefer content as close to that as I can

This is not meaningful. Preferring games at a higher framerate has zero correlation with how you prefer movies. And however odd you think the take is, you like 24 FPS because you've been trained to like it, Pavlov's bell style.

hug
2 replies
15h33m

What you're suggesting is that you know better than every cinema-goer and every cinematographer, animator, producer, and director around what their preferences "really" are, which is a pretty wild thing to claim, especially in the face of people telling you exactly why they prefer unaltered 24 FPS content to horribly interpolated uncanny-valley messes.

The reason no one has changed process isn't because there are tonnes of better options that everyone is just studiously ignoring because of Pavlovian conditioning. It has absolutely nothing to do with people liking the look of interlaced 3:2 pulldowns. It's because the current options for HFR content just plain don't look very good. Some of this is unrelated to the technical specification of the recording & due to things like action content in HFR looking cheesy -- there's going to need to be a wild change in how content is choreographed & shot before we're anywhere near it being as well understood as current practices.

There are exceptions: 4K 120FPS HDR content for things like documentary content looks pretty good on a high refresh rate monitor (note: no one said games), but we haven't reached an era where that's even nearly commoditised and the in-the-middle stuff you'd want to do for cinema or TV just can't cut it.

llm_nerd
1 replies
14h39m

better than every cinema-goer

Humorously this submission, and so many just like it, are about people who are outraged that their parents / friends / etc actually like motion smoothing. So...I guess? I remember a similarly boorish, ultimately-failed "no one should ever take vertical video!" movement from a few years ago, again pushed by people really, really certain of the supremacy of their own preference.

and every cinematographer, animator, producer, and director

Now this attempt to appeal to authority is particularly silly. Peter Jackson -- you might have heard of him -- tried to do a movie at 48 FPS for a wide variety of quality reasons, only to be lambasted by people just like you. People who are sure that the completely arbitrary, save-our-rolls-of-film 24 FPS is actually some magical, perfect number. It is insane. Everyone else is simply handcuffed to that completely obsolete choice from years ago, and will be for years more.

I'm not going to convince you, and have zero interest in trying. And I am certain you're not going to convince me. But your argument at its roots is "that's the way it's done, therefore that's the perfect way and the way it will forever be done". It's utter nonsense.

shiroiuma
0 replies
11h51m

People who are sure that the completely arbitrary, save-our-rolls-of-film 24 FPS is actually some magical, perfect number. It is insane. Everyone else is simply handcuffed to that completely obsolete choice from years ago, and will be for years more.

Instead of trying to jump to 48fps or 60fps, maybe they should just adopt 30fps as the new standard for a while. The 24fps fans won't have too much to complain about, because it's not that much faster (and it's the same as the old NTSC standard), and the HFR fans will at least have something a little better. Then, in a decade, they could jump to 36fps, then 42, then 48, etc.

As a bonus, the file sizes for movie files won't be that much bigger at only 30fps, instead of 60+.

NickM
0 replies
15h50m

people are conditioned to correlate that with quality

Are you sure it’s really just conditioning? Impressionist paintings are obviously a lower fidelity reproduction of reality than photorealistic paintings, yet people tend to like Impressionism more, and I don’t think that’s necessarily just cultural conditioning. Sometimes less is more.

RcouF1uZ4gsC
1 replies
15h22m

The best example of this was The Hobbit when presented at the 48FPS the director intended. People were so conditioned to movies being a blurry mess at 24FPS that a frequent complaint about The Hobbit was that it had the purported "soap opera effect".

The reason I didn't like the Hobbit was because they went overboard on CGI. They had to make Orlando Bloom (Legolas) appear younger than he was in the Lord of the Rings which was released a decade before.

thaumasiotes
0 replies
11h13m

They had to make Orlando Bloom (Legolas) appear younger than he was in the Lord of the Rings which was released a decade before.

Tolkien's elves are literally immortal and the time between The Hobbit and The Lord of the Rings is less than a hundred years. Legolas' father is several thousand years old. There is no reason to expect Legolas to look younger in The Hobbit; you'd want him to look exactly the same.

theshrike79
0 replies
11h24m

I watched The Hobbit in the cinema in HFR.

It looked like absolute soap opera crap. And the worst thing was that I could _easily_ tell when the actors were carrying light prop weapons compared to the real thing.

At 24 fps your eye can't tell, but at 60fps you see the tiny wobble that the foam(?) prop weapons have, and it's really jarring. Same with other practical effects; they're easier to hide in ye olde 24 fps version.

Watching movies in 60fps is the same as when porn moved to 4k. You see every little flaw in great detail that used to be hidden by worse technology. Everyone needs to step up their game in areas where it wasn't needed before.

andersa
14 replies
18h8m

Where does this nonsensical argument keep coming from? No idea what a "cheap soap opera" is even supposed to be. The only thing I notice is that the horrifying judder during camera pans is replaced by minor artifacts that don't distract anywhere near as much. It's literally not possible for me to interpret a 24hz pan as fluid motion.

ac2u
11 replies
18h3m

See llm_nerd's answer below. Conditioning or not, it takes me out of the realism of whatever I'm watching if interpolation is applied. People don't really have to rationally justify it; they grew up used to footage being played a certain way, and when some folk see interpolation it just feels off to them right away and reminds them of those soap operas.

andersa
8 replies
18h1m

I wonder if you'd have the same reaction to a movie properly authored at 120Hz without the subtle artifacts.

hug
6 replies
17h32m

There's a point at which these things balance out and work. I'm not sure that 120 frames per second is high enough, but it's much closer. We have anecdotal evidence (the fact that everyone hated The Hobbit) that 48 frames per second isn't.

One of the rules of cinema when shooting at a low frame rate is to use a "180 degree shutter" (a term that comes from the spinning shutters used in old film cameras), or in other words a shutter that is open for half of each frame interval. i.e.: If you're filming at 24 frames per second, use a 1/48th second shutter speed.

The reason for this is that 24 FPS @ 1/48" exposure film adds enough motion blur to each exposure that any movement occurring over a small number of frames is naturally smeared as a consequence of the exposure time, but anything that is relatively steady across frames tends to look crisp. If you shoot at 24 FPS @ 1/1000", you end up with something that looks like a fast flipbook of static pictures. 24 FPS just isn't fast enough, and people can see that each individual frame doesn't contain movement.

Anecdotally, 120FPS @ 1/1000" on a 120Hz display doesn't exhibit this same problem, at least to me or people I've shown it to, although some people will notice it "feels fast".

48 FPS @ 1/96" seems to be the worst of both worlds: Too fast a shutter for motion blur to compensate nicely, too slow a frame rate to to make it look seamless, so it ends up in the uncanny valley where people know there's something missing, but not what.

The frame interpolation feature that people seem to hate is almost directly designed to fall into this horrible territory.

andersa
4 replies
17h25m

Motion blur

Motion blur doesn't work on modern TVs (i.e. OLED). They will display a static, blurry frame for 1/24 of a second, then snap to the next one practically instantly with a clearly visible jump. What I get in the end is a juddering mess that is also blurry, so I can literally see nothing.

kalleboo
1 replies
13h20m

I wonder if these 120 fps panels are fast enough to do black frame insertion to mimic the effect of film projection

andersa
0 replies
6h17m

I tried the black frame insertion mode on mine, but I can see the strobing, so it is extremely uncomfortable to look at.

hug
1 replies
17h0m

Motion blur doesn't really work in that way on any device, but what you're missing is that the content's captured frame rate has a relatively important relation to the display frame rate: You can't usually change one & not the other without making everything look worse, at least to most people, and this is what a lot of modern TVs do.

Naive interpolation of inter-frame content is awful, is what a lot of TVs do to implement their smoothing feature, and is why everyone complains about it.

The reason a lot of people hated The Hobbit may be partly because of this problem: It was shot at a 270 degree shutter to try to get a bit more motion blur back into each frame, which to a lot of people felt strange.

shiroiuma
0 replies
11h51m

Interestingly, some of The Hobbit fan edits changed the frame rate back to 24fps; watching those on my regular IPS screens, they look just fine.

lmm
0 replies
14h1m

We have anecdotal evidence (the fact that everyone hated The Hobbit) that 48 frames per second isn't.

Maybe, or maybe we just haven't adapted to 48fps yet. Something I heard a lot about The Hobbit was that the outdoor scenes looked great whereas the indoor scenes looked like film sets - which, well, they were film sets, and making a set that looks "natural" will require new techniques. Everyone hates early stereo mixes (e.g. The Beatles), not because stereo sound inherently sounds bad but because production takes time to adapt to new technology.

bozhark
0 replies
17h47m

Nah it’s much better at higher Hz

elzbardico
1 replies
15h22m

Kind of weird that you associate motion blur with realism.

ac2u
0 replies
1h49m

Of course it seems weird... if you read what I said I implied it's not really rational.

But no one's brain is saying "wow the lack of motion blur really affects the realism here".

I walk into a house and see a TV show playing on a TV with interpolation turned on and it just looks weird because of how TV looked growing up. I mean that's just the simple reality of it. I understand when film nerds come along and explain motion blur etc but that's all very "system 2" thinking. I'm talking about pure system 1 subconscious processing.

What's even weirder is how some folks can never seem to grasp why their explanations of how motion blur is bad can't convince others to suddenly like interpolation.

justsomehnguy
0 replies
13h34m

No idea what a "cheap soap opera" is even supposed to be

https://youtu.be/MFlz5cCDMmc?t=25

caseyohara
0 replies
16h21m

thedougd
2 replies
18h48m

Maybe some people can pick up on the fact that those extra frames aren't real more easily than others. Some innate sense thrown off by the interpolation. Motion interpolation gives me an uneasy feeling more than anything.

For example, some people can perceive the rapid color cycling and thus can't watch color wheel based DLP, because they see rainbowing on the screen. I can't watch my old plasma in 48hz mode because it flickers. My spouse can't see the flicker at all.

prometheus76
1 replies
18h31m

For me, the interpolation really seems to separate the "layers" in a shot, and it just completely destroys the illusion of them not being a set with some lights. Like I said, it feels like a cheap soap opera from a developing country no matter what movie or show I'm watching.

Some of the problem for me may be related to the fact that I worked as a camera operator and video editor during the years of transition from the old NTSC standard to HD, and I paid hyper-attention to detail as HD came online.

For some reason, the interpolation just screams "this is fake" to me.

resters
0 replies
18h0m

For me, the interpolation really seems to separate the "layers" in a shot, and it just completely destroys the illusion of them not being a set with some lights.

Great description. It's the same for me.

jjoonathan
0 replies
18h56m

I suspect it has to do with the degree to which your past was plagued with cheap soap operas vs poorly performing video games.

j45
0 replies
18h57m

It feels like the jump to 4K and 240Hz TVs arriving as one group's normal and another group's exception may have something to do with it.

Maybe the group who doesn't want it too lifelike is inoculated, or the other way around.

ilamont
0 replies
18h35m

I noticed this too after getting a 4K TV earlier this year. It really ruins a lot of films.

hackernewds
0 replies
14h3m

For as much debate as there was on 60fps, I do agree. Games and movies at 60fps do not feel "cinematic". I indulge in these to escape from reality.

jonstewart
20 replies
19h2m

My mind is blown. I didn't think -anyone- could possibly like motion interpolation for watching movies. I hate it so, so much. I'm trying to understand your POV.

How do you feel about watching movies in a theater? The frame rate there is low but the screen is so much larger.

viktorcode
3 replies
18h8m

I wish for theatre film releases to move on to HFR instead of sticking to 24 FPS which was arbitrarily set by ancient technology limits.

andersa
2 replies
16h53m

Even the ""HFR"" is a joke. What is 48 fps supposed to be? Every modern TV can do 120 already. Use that as a baseline.

LeoPanthera
1 replies
16h19m

48 allows theaters that haven't upgraded to show the movie by throwing away every other frame.

andersa
0 replies
6h9m

Well then, they can have that same alignment with an actual high frame rate movie at 120Hz, displaying every 5th frame.

andersa
3 replies
17h8m

Your mind is blown that someone might not like to watch movies at a prehistoric 24Hz on a screen that is capable of 120Hz, after video games have been running at this frame rate for a decade already?

jonstewart
2 replies
14h50m

Yes! Whenever I see motion interpolation on a TV, it gives me a sense of revulsion. It seems okay for sports, I guess, but awful for movies, to the point where I would rather turn the TV off than watch a movie with motion interpolation. Perhaps what I'm thinking of is the 3:2 pulldown thing for 60Hz TVs and I haven't seen what it's like on a 120Hz screen?

I don't play games much, but visually I've always distinguished between games and movies. I expect them to look quite different, not at all similar.

And I thought my feelings about this were universal, and had confirmation bias from seeing postings before Thanksgiving like "if your parents' TVs look weird, here's a guide for turning off motion interpolation depending on make and model", etc. I've assumed that the whole point of motion interpolation was for new TVs to impress people with how they look for sports, that this is what sells TVs in Best Buy, and the over-application of motion interpolation to other content was an unfortunate byproduct.

satvikpendem
0 replies
11h5m

How much of that high FPS or interpolated media have you seen? At first I too was jarred by it, as if it were sped up somehow, but after a few hours, I couldn't watch regular 24 FPS anymore as it literally looked like a slideshow. I distinctly remember watching Thor: Love and Thunder in theaters and when they panned over the pantheon, you literally couldn't see anything. In contrast, Avatar 2 with the 48 FPS scenes was one of the most immersive movies of my life.

andersa
0 replies
6h5m

A lot of lower end TVs have really bad implementations of it. Maybe try it again on an LG G3 or a Sony A95L and it might be completely different from what you remember.

Blackthorn
3 replies
18h58m

Movies have a different display though. Film shutters and whatnot. Helps a lot with keeping the motion from just being jerky. OLEDs don't have that, and attempts at black frame insertion don't really work there because they already struggle with brightness. Hence, mild motion interpolation is useful.

Different display technologies need different things. No difference from CRT filters on old video games played on modern screens.

criddell
1 replies
18h1m

Is anybody shooting or distributing film anymore?

squeaky-clean
0 replies
17h41m

Lots of people still shoot with film. But distribution will always be digital unless your screening is specifically advertised as a film showing. That's usually just big IMAX movies like Oppenheimer in select cities, or indie theaters showing old movies that can get access to the film.

jnwatson
0 replies
18h38m

Nah, it is just that filmmakers avoid a lot of panning shots because it looks like crap.

LeoPanthera
2 replies
18h58m

It's even worse in theaters. The screen is HUGE. Panning motions are so bad they often give me motion sickness.

There was one movie that they showed at 48fps - I think it was The Hobbit? I've forgotten. That was amazing. Blissful. My eyes have never been so happy.

Even if I forgot the plot already.

andersa
1 replies
17h7m

Yep. I hate theaters for this reason. Absolutely unwatchable. My LG G3 with motion smoothing provides a far better experience. Even in Avatar 2 they put some sections in a lower frame rate for no reason, and I noticed them instantly.

satvikpendem
0 replies
11h8m

Avatar 2 in the 48 FPS scenes was jaw dropping, it looked so real, as if you were transported there. It does not look the same in the 24 FPS scenes and pulled me right out of the movie.

squeaky-clean
0 replies
17h36m

I'm in agreement with them here, judder on movie screens is so bad to me it's distracting. I don't think I've had a better visual experience in a theater than at home in a long time. The screen is physically larger, but when I sit 4 feet from my TV it's the same size visually.

Audio still wins by a mile in a theater though. Though home theater audio has gotten pretty darn good. I just wouldn't know as long as I'm in an apartment.

You can also always choose when to have motion interpolation on or off. For sports and nature documentaries I think it's just better. For animation it's worse. For other films it depends, I usually prefer it in something fast and actiony, like John Wick.

satvikpendem
0 replies
11h9m

I like interpolation too, but it must be high quality, not the low quality that's built into TVs. For example, I use SVP 4 which does interpolation with your GPU, works great.

Movies are even worse; I can absolutely see the judder, especially when panning. It is one reason I prefer to wait for home releases and will only go to movies that are specifically phenomena irreplaceable at home, such as Oppenheimer in 70mm IMAX.

elzbardico
0 replies
15h16m

The problem with motion interpolation is that most of it is the cheap variety. I don't like motion blur, it is annoying, but bad motion interpolation is worse.

cobbal
0 replies
18h55m

There really is a wide range of opinions about this. I love high frame rate, but can't stand interpolation. I wish theaters used higher frame rates. I really enjoyed the parts of Avatar 2 that were smoother, and it felt jarring to me whenever it would switch back to low frame rates.

Probably it's just what you're used to and how much you've been trained to notice artifacts.

arh68
0 replies
16h35m

Motion smoothing is great on my 120Hz TV.

Theater is fine -- I have a home theater w/ a 24p projector. I wish it had 120Hz smoothing, but at least it's not dim like my local theater.

24p content at 60Hz is really, really bad (3:2 pulldown and whatnot). At least we can agree on that. That seems to be what "turn off motion smoothing" is about, to me.

24p content at 24Hz is fine, but 120Hz is better.

EDIT: I should say that the input lag (gaming) is much greater for smoothing, so even at ~30 fps I'd run my PlayStation at 60Hz no smoothing (game mode on &c).

Angostura
20 replies
19h10m

I understand and agree with what you are saying, but I think your preference for motion interpolation is quite unusual.

Perhaps it is a preference that will change with generations.

lupusreal
11 replies
19h4m

Probably not, young people watch most of their content on phones/etc, not big TVs with motion interpolation.

TeMPOraL
5 replies
17h55m

Someone tell the film industry, so that they stop lighting and color-grading the movies and TV shows for ultrabright, turbo-HDR-enabled expensive TVs - almost no one has those, and even those who do mostly watch on their phones. Maybe if the industry gets that memo, we'll once again have films and shows in which one can actually see what's happening.

nickthegreek
4 replies
17h49m

I'd rather my films be graded for home theater use than phone use. HDR TVs these days are basically all TVs. Why should we all take a step back in quality when the hardware to view it decently is more affordable than it's ever been? I don't care how Oppenheimer looks on your phone, and I know Nolan would care even less.

TeMPOraL
3 replies
17h44m

Because who has home theater? I'd think it was mostly killed by the Internet, streaming, and the overall trend of cable-cutting, ditching TVs, and switching to computer and laptop screens for one's entertainment.

(I guess at some point this must have reversed everywhere except the bubble myself and my friends and acquaintances live in...)

satvikpendem
0 replies
11h12m

Lots of people have TVs; not many are seriously watching Netflix on their phones (maybe their laptops). They generally watch on their TVs in the evening.

nickthegreek
0 replies
13h30m

I personally have a 100" drop-down screen, a UST projector, and a Sonos surround setup. This last month, I helped my work buddy shop some deals and got him a 75" Dolby Vision TV and a surround sound system with sub for $1200. These are affordable things.

js2
0 replies
17h0m

I have a home theater. I don't know how big the market is, but you can buy receivers from a half dozen manufacturers (Denon, Yamaha, Sony, Onkyo as well as a bunch of speciality brands), digital projectors (Epson, Sony, JVC, BenQ and also a bunch of speciality brands), and all manner of ancillary items (sound proofing, audio treatments, theater seating, etc).

/r/hometheater/ has nearly a million members.

https://www.avsforum.com/ is pretty active.

TVs are still getting larger, and they've recently crossed 100" in size.

So yeah, there's a least a few dozen of us!

nomel
4 replies
18h34m

This is absolutely the case with all of the nieces/nephews that I know. They watch the majority of their content on tablets, which 81% of kids have these days [1].

[1] https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&c...

jasonfarnon
3 replies
17h2m

That number is pretty high -- looking at your link, living in a house with a tablet isn't the same as having a tablet. I have a tablet from work but my kids don't use it or any other. BTW, it's really cool that you could direct the highlighting in the URL like that. If it's not specific to census.gov, what is that called? What level of the stack is interpreting it?

girvo
2 replies
16h48m

It's at the JS layer from what I can see:

#:~:text=In 2021%2C 81%25 of households,17 years old — owned tablets.

That query fragment is what triggered it, which means there's some JavaScript taking that in and highlighting what's on the page.
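
For reference, the general shape of such a link (a hypothetical example, not the actual census.gov URL): everything after #:~:text= is percent-encoded text for the page to locate and highlight.

    # Hypothetical Text Fragment URL; the fragment names a start string and,
    # after the comma, an end string bounding the passage to highlight.
    url = "https://example.com/report#:~:text=81%25%20of%20households,owned%20tablets"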

verteu
1 replies
13h18m

girvo
0 replies
12h15m

Neat! I learned something new :) Thanks

RussianCow
7 replies
18h56m

I understand and agree with what you are saying, but I think your preference for motion interpolation is quite unusual.

The number of TVs that have it enabled by default seems to indicate otherwise. I'm not saying that the manufacturers are correct, necessarily, but I don't think they would go through the effort if they didn't think enough people wanted it.

gffrd
3 replies
18h45m

Or: it just "shows" better when they're demo'ing the TV to a potential customer, not unlike the hyper-saturated nature footage they use to get you to look at their TV and not the one next to it.

RussianCow
2 replies
18h38m

I don't buy that. Why would motion interpolation look better in the store than it does in your own home?

nemo44x
0 replies
18h31m

I think it’s because of the content used for it. Some things do look better. But movies absolutely do not.

Timwi
0 replies
18h24m

Because the store chooses to show footage for which it is optimized, not a movie. But there's also the consideration that it looks better at first sight when you walk past it, than it does when you watch a whole movie looking straight at it.

resters
1 replies
18h7m

Just as the people who design Google Maps navigation don't drive, the people who design parental control features for Google and Apple surely don't have kids, and the PMs who choose to turn on motion interpolation by default likely don't watch movies on their TVs.

Major, shockingly obtuse sensibility gaps are sadly the norm today in product management.

TeMPOraL
0 replies
17h47m

It seems the side effect of treating customers as users in the classical sense is the cross-industry adoption of the mantra "don't get high on your own supply".

BikiniPrince
0 replies
18h37m

They are in fact quite incorrect. These poor upscale features are horrors crafted in the early days of post-processing nightmares. Half of it was pitched to make football games appear more real. They all murder the soul of the film and jack up delay. The industry even had to invent a synchronized audio mechanism to compensate for the summoning of these Eldritch horrors. What I don't mind are modern upscalers, which look very nice on expensive sets. Other modern features like HDR are also lovely because they focus on reproducing the movie with vibrant lighting. Anyhow, one summer working in a television station will improve your acuity for these things. You might even come to understand how these old creations will be the downfall of civilization. As an aside, do not forget to surgically remove Samba TV from any Android set with adb.

deanCommie
4 replies
19h3m

You can also prefer not to use subtitles, and watch films dubbed from their original language.

You can also prefer not to watch black & white films, and watch the colorized versions done decades later.

You can also prefer not to see nudity, violence, or profanity, and watch the edited-for-TV versions of those films.

Finally you can prefer a totally different story or shot composition or artistic choices altogether, and ask Generative AI to recreate, reedit, or augment scenes to your preference.

All of these are valid preferences and we have the technology to facilitate them for you.

But 1) They should never be the default on any technology you acquire. That's the PRIMARY sin of all of the technologies mentioned by OP - it's not that they exist, it's that they're on by default, and since most humans never change the default settings on ANYTHING, they experience content in a way that was not intended by the original artist behind the vision.

And 2) Well, this is subjective, and everything is a spectrum. But you are ultimately robbing yourself of the specific experience intended for you by the creator of the film. It's certainly within your rights not to care and to think that you know better than them, but on a spectrum of that philosophy, carried out across all of society, it's probably not a good thing.

WheatMillington
2 replies
18h36m

Oh look the exact type of shaming OP was talking about.

russelg
1 replies
16h58m

And so they should be shamed. There are professionals working on every aspect of a film, and these options just shit all over their work, ruining their intended vision.

Now if you're talking about watching YouTube or similar content then it's a different story.

LeoPanthera
0 replies
16h49m

"Vision" implies that there is a choice. How many feature films have been made at a frame rate other than 24fps? Ever since the sound era, I suspect you can count them all on your fingers.

So I don't buy this "vision" argument. It's just holier-than-thou. Directors choose 24fps because it's literally the only option. It's no choice at all.

Forcing me to watch your movie as a jerky motion-sickness-inducing mess? I will indeed, to use your words, shit on it.

elzbardico
0 replies
15h9m

The default should be what is more popular. The connoisseurs are probably savvy enough to change the settings from those defaults.

Most people don't give a damn about those things. They just want to be entertained, maybe they have other hobbies where they can exercise their particular snobbism, and movies are not particularly important for them as an art form or they don't care much about art at all.

sherry-sherry
2 replies
13h54m

Juddering 24fps footage means you have a display that isn't displaying 24fps properly (see here: https://www.rtings.com/tv/tests/motion/24p). The refresh rate and frame rate aren't matching; most modern TVs account for this and can correctly display it.
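
A quick sketch of what that mismatch looks like (my own illustration, not from the comment): on a 60Hz panel each 24fps frame is held for an uneven 3-2-3-2 pattern of refreshes, while a 120Hz panel holds every frame for exactly 5.

    # How long each 24 fps source frame is held on screen, in display refreshes.
    def hold_pattern(fps, hz, n_frames=6):
        ticks = [int(i * hz / fps + 0.5) for i in range(n_frames + 1)]
        return [b - a for a, b in zip(ticks, ticks[1:])]

    print(hold_pattern(24, 60))   # [3, 2, 3, 2, 3, 2] -> uneven holds: judder
    print(hold_pattern(24, 120))  # [5, 5, 5, 5, 5, 5] -> even cadence, no judder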

LeoPanthera
1 replies
13h4m

It's not that, because I experience exactly the same thing in movie theaters. I'm always amazed when people say it doesn't bother them.

sherry-sherry
0 replies
11h9m

We could be confusing terms here... I mean "judder" as uneven frame times: it looks like things skip slightly out of time. Uneven frame times shouldn't even happen with a cinema screening unless something has gone terribly wrong.

V1ndaar
1 replies
18h8m

My hope is that, thanks to stuff like DLSS frame generation in video games (or maybe AR/VR), the opinion of the majority of people will change over time, so that eventually maybe... we might actually see movies being filmed at higher framerates. People's conditioning really stands in their way, imo.

The only bad thing about motion interpolation on most TVs, in my book, is the fact that the /implementation is often pretty bad/. If perfect frame interpolation were a thing, I'd watch everything upsampled to the 165Hz of my monitor. Well, I do it anyway using the Smooth Video Project. But that also suffers from artifacts, so it's far from perfect. Much better than judder though...

andersa
0 replies
16h55m

DLSS 3 really is fantastic technology. It works amazingly well for interpolating a game from 60 to 120 (or higher). It fails pretty hard if you start with 24 though. We'll need something like RIFE for that, but currently no hardware exists that can run it in real time...

nunez
0 replies
13h53m

Don't know how you do it. I can't stand the jelly effect. Motion interpolation was the first thing I turned off when we got our A95K earlier this year.

malkia
0 replies
18h55m

Seriously... No! Our first LCD TV (several years old by now) had this, and watching LOTR on it was "wtf is this - am I in a theater?" My wife and I both sat and watched it, and we were secretly annoyed but dared not say anything or complain about it, because we had just spent tons of money on it... Then we found the setting and fixed it!

The new TV I've got has "Filmmaker mode" - I wasn't sure what exactly that was, but I turned it on and yes - it's how it should be. This article cleared it up for me.

epgui
0 replies
14h46m

it assumes that everyone likes the same things and feels the same way

I don't think this is about feelings. (But it just might be in a small minority of cases IMO, and if it's the case for you, then all the power to you.)

In the case of this article, all the points are about discernment. Some people not only notice the issues, but are also able to identify them and articulate precisely what is happening. The rest don't.

dclowd9901
0 replies
15h16m

No one was shaming you, get over yourself. The author repeatedly framed recommendations in the context of how the creators produced it. If you feel shame that’s all you.

beltsazar
0 replies
9h27m

The thing I hate about "advice" like this is that it assumes that everyone likes the same things and feels the same way

It's so ironic that at the end you gave a piece of advice yourself:

I suggest that the next time you watch a shaky-cam action movie, you try turning it on too!

Are you assuming that everyone likes the same things you like? I guess not. The same goes for the article - it doesn't assume that.

No advice fits everyone, but good advice fits many people. And I'd argue that the article's advice fits many, if not most, people.

TV is probably the only screen many people have that has motion interpolation on by default. They watch movies in theaters and on PC monitors, laptops, tablets, and phones probably more than on TVs.

Many people are already used to what movies "look like." Non-tech-savvy ones might not know what "FPS" means, but they would likely be able to notice that watching movies on their phone feels different from on a TV with the default setting. The article's suggestion to disable motion interpolation on TVs makes the watching experience more consistent across all screens.

DHPersonal
0 replies
18h58m

I like the feature, too. I remember watching the Battlestar Galactica remake with the interpolation setting active and getting an even deeper sense of realism out of the scenes. They were already aiming for a documentary style with the camera work and special effects, so the higher prosumer framerate fit the style very well. On other films and TV shows I like the interpolation for panning shots, which judder a lot at the standard framerate.

BLKNSLVR
0 replies
16h24m

Isn't the manufacturer doing a worse/larger version of this assumption normalisation by having these things on by default?

More articles explaining how to customise the settings are better than fewer, because they draw attention to the fact that these options are being forced on all consumers.

Shaky-cam action movies are their own unique problem :)

I think they exist because it saves a fair bit of cash on fight choreographers and the time it takes to train the actors to do it realistically. On the flip side, it really increases the respect I have for properly choreographed action scenes that are consistent and visible.

imhoguy
81 replies
19h41m

Modern smart TVs are so disappointing that I just prefer watching films on my 27" IPS computer monitor - no bloatware and every video just looks right.

Not to mention that after 6 years the TV becomes useless junk killed by bulky modern app updates. I think there is a market for something like "Framework TV".

imp0cat
35 replies
19h34m

Well, an OLED will be pretty much dead after six years anyways.

Also, you can reset an old, slow TV, put it in "dumb" mode, then add something like a Vero V or another box.

jeffbee
17 replies
19h27m

Well, an OLED will be pretty much dead after six years anyways.

How much TV are you people watching such that this statement becomes even remotely accurate? My 2017 LG OLED display says it has 6400 power-on hours and it looks as good as new.

sb057
14 replies
19h14m

The average American watches five hours of television every day, or just shy of 11,000 hours over six years.

https://www.marketingcharts.com/television/tv-audiences-and-...

sgt
8 replies
19h3m

Lots of people leave their television sets on for almost the entire day, even when not watching. Never understood that.

magicalhippo
6 replies
16h28m

SO does that. She'll put the show on pause, go do shopping, come back and resume. All while leaving the TV on.

I've tried to get her to change oh so many times, all in vain...

No rational explanation either. Just... that's how she rolls I guess.

mvanbaak
2 replies
16h16m

Enable energy saving and have the TV turn off after 2 hours. Problem solved! Most TVs show a popup when they are about to shut off because of power saving, and with a button on the remote you can reset the timer.

c0pium
1 replies
13h53m

This is extremely passive aggressive, tell me you’re single without telling me you’re single. People’s preferences being different than yours is not a technical problem to be worked around.

mvanbaak
0 replies
6h0m

I'm not the one complaining here. My wife turns off everything before she leaves the place. Even if it's just a quick walk with the dog :)

arp242
1 replies
14h37m

I had a girlfriend who would flush the toilet before using it. She said "it's cleaner this way". I tried to get her to stop, but it was a futile effort.

imp0cat
0 replies
10h1m

Before and after, I'd hope. :)

cschneid
0 replies
12h11m

That doesn't kill LG OLEDs though; that results in a nearly black screensaver (just a small, slow "firework" effect). It wastes a touch more energy than putting the TV into sleep mode (since nothing actually turns off anymore), but it's hard to believe that would shorten its life much.

eikenberry
0 replies
16h42m

Gives a feeling of having other people around and staves off loneliness.

electrondood
3 replies
15h25m

That's insane to me. I can't stand watching TV. I feel my life being sucked away.

eru
0 replies
14h30m

Many people leave the TV running in the background while they do other things, like eat dinner.

For the lifespan of the screen, it doesn't matter if anyone actually directs their eyes at it.

Hikikomori
0 replies
14h55m

Try sitting down.

Flameancer
0 replies
14h56m

That actually sounds pretty standard. That would be from 5pm-10pm. Evening news, game shows, then like two 1-hour shows, and it's already been five hours.

jeffbee
0 replies
19h1m

That explains a few things.

Still, the objective tests from RTINGS seem to suggest that it's the LCD sets that look all fucked up after a few years, while almost all the OLED ones look perfect. And the OLED sets aren't showing a downtrend in overall brightness over time.

https://www.rtings.com/tv/learn/longevity-results-after-10-m...

eropple
1 replies
19h6m

Yeah, that's a wild claim to me too. My 2018 LG C8 has about 5800 hours on it and it is indistinguishable from new. (Though I've never run it at max brightness because it would be blinding.)

sokoloff
0 replies
15h21m

Confusingly, the contrast control has a greater effect on image brightness (in the ordinary English sense) than the brightness control does.

Night_Thastus
8 replies
19h14m

This depends a LOT on both how, and how much the display in question is used.

You can make an OLED visibly burn in within a couple months if you max the brightness, cover it in static content and then leave it running constantly.

Or it can last a decade if used lightly, on lower brightness, with good burn-in compensation software, with few to no static images.

ptmcc
7 replies
18h48m

A decade is still a really short lifespan for a TV

jiggawatts
4 replies
17h3m

Is it though?

2004 is the first year I remember watching a HD widescreen TV show (Lost). There were a handful of test clips and the like earlier, but they were a rare thing. E.g.: there's a HD video taken of the aftermath of 9/11 because the operator was just learning to use this (at the time) brand new technology.

That lasted about a decade, and then 4K became a thing around the 2013-2014 timeframe. I got a Sony 4K TV in late 2013 and a Dell 4K monitor in 2014.

However, both of those were standard dynamic range, albeit with a fairly "wide" colour gamut.

I got my first OLED HDR display in 2021, and an OLED HDR TV in 2022.

In other words, every "generation" of television technologies has lasted about a decade before being superseded by something significantly newer and better.

Display technology still has some ways to go. No current consumer panel reproduces 100% of the Rec.2020 colour gamut. Similarly, there are no consumer panels that can reproduce the full HDR10 or Dolby Vision brightness range of up to 10,000 nits. There are 8K displays that are close to that target spec, but they're either one-off experimental models or more expensive than a luxury car.

A decade from now, around the 2033 mark, I fully expect something like VR technology to have replaced TVs for many people. Apple's new headset (available next year!) will set a new quality bar that no traditional TV can ever hope to match, combining an extremely wide field of view, stereo, and very bright HDR. Sure, they're expensive now, but after a decade of further development? They'll be amazing and cheap.

RDaneel0livaw
2 replies
14h42m

In your future, do households with more than one person exist? Because no matter what the tech is, that'll never happen. We have a family, and after dinner if we want to watch a sitcom and laugh, we... what... go put on our own individual headsets and lie in bed? What happens on a Sunday when it's icy outside and we want to put a James Bond movie on with the booming sound, and have a glass of wine while the kids make hot chocolate? How do we do that? If I'm a single dude with disposable income I can see the VR headset with insane quality being pretty awesome, but for ANY kind of household with a family or roommates, I just can't fathom that kind of dystopian nightmare.

jiggawatts
0 replies
14h0m

Teenage kids will likely want to play video games or watch their own thing.

Many young people live by themselves and can't afford a "huge" TV. Meanwhile VR will give them a virtual TV as big as they like.

Another potential future technology that might obsolete the current 55-85 inch typical TV sizes is "OLED wallpaper", or something similar that would allow wall-sized televisions.

Even before the 2030s I expect something like quantum nanorods to replace OLED panels, potentially allowing up to 10,000 nits in an affordable panel that doesn't suffer from burn-in.

On another front, I've noticed a lot more people watching something like TikTok, YouTube, or Netflix on their hand-held devices instead of traditional TV. I can picture a future super-thin, super-light Apple iPad with an OLED HDR display becoming very popular, especially for people on a budget that can't afford a huge TV.

eru
0 replies
14h31m

The comment you replied to wisely stated 'for many people', and not 'for all people'.

You are likely right about your specific use cases. Though you might also want to see how early on some people predicted that TV would never take off, because families just didn't have the time for it.

We don't have a TV at home, and we would just do different activities in the situations you described. (But, yes, those activities in these situations would likely not include hanging out with headsets on.)

fomine3
0 replies
12h52m

We're outliers in the TV market. 95% of buyers don't care about Rec.2020 and continue using a TV until it breaks (maybe 80%?). OLED isn't for those who use a TV for a decade.

andersa
1 replies
17h4m

...it is? I buy a new TV every two to three years as notably better tech comes out.

andybak
0 replies
16h56m

We probably have different definitions of "notably better"

mattgreenrocks
6 replies
19h27m

Well, an OLED will be pretty much dead after six years anyways.

Why is that?

imp0cat
5 replies
19h18m

They get dimmer as they age and might also develop "burn-in".

jasonfarnon
3 replies
17h9m

Screensavers became necessary again?

imp0cat
0 replies
10h2m

Not just that. OLED TVs nowadays have a screensaver that kicks in when a stationary image is detected for more than a few minutes.

They also shift the image around by a few pixels to prevent logos and tickers from permanently marking the screen.

And when turned off, they also try to level the brightness of individual pixels by displaying different patterns - again to combat "burn-in".

girvo
0 replies
16h50m

For OLED computer monitors: yep!

eru
0 replies
14h31m

They were never necessary to begin with: automatically turning the screens off (or at least to 'black') is never worse.

squeaky-clean
0 replies
17h48m

This really isn't an issue these days if you research your screen. Rtings.com has a good series on their website and YouTube testing OLED burn-in, and there are many TVs where this is not an issue. Also check out Wulff Den on youtube and his videos testing the OLED Nintendo Switch. He does get some bad burn in after a year, but that's also because it's on a still image at max brightness and never turns off. I don't mean a year of standard usage. I mean 8,765 hours of it displaying the same image.

stevenwoo
0 replies
19h13m

Anecdotally, my cheap old TCL dumb LED TV's backlight burned out after about six years, which is about average from what I'm seeing online, but maybe I can fix it. I did use it as suggested, with a Mi Box.

mtlmtlmtlmtl
20 replies
18h30m

Some webOS LG TVs, like mine, can be rooted. I can now do wonderful things like install an ad-blocking version of YouTube (it still works in spite of recent changes), and SSH into my TV and mess around with the Linux system if I want to.

aetherspawn
8 replies
17h51m

LG webOS in my opinion is the least bad out of all the smart TV operating systems, and so far (we have a few of these at home & at work) I don't really feel the need to root them as they are ad-free and working fine even after several years of auto updates. This lack of enshittification has urged me lately to buy and prefer LG products over Samsung et al.

freetime2
3 replies
16h19m

they are ad-free

The home screen on my LG TV seems to devote the top 60% of the page to sponsored content. Is this not an issue for you?

dclowd9901
0 replies
15h24m

I don’t even have a Home Screen on my C9

aetherspawn
0 replies
15h41m

I understand where you are coming from, but I personally think it's reasonable for a TV to show me sponsored content that is TV shows. I don't see this as any different to how Netflix or Spotify show sponsored TV shows or music.

But my friend has a Samsung TV and it shows him ads for things he can buy from eBay, ads for games etc. in a similar area .. and yeah that's nonsense, that would really get on my nerves.

abcdefg12
0 replies
15h7m

I think you can switch it off. Poke around in the menus.

OJFord
1 replies
16h30m

I agree, I'd rather use something else, even if only for philosophical reasons, but I couldn't find anything that didn't sacrifice something - or at least that could promise it didn't.

No separate box seems to do all of 4K, Atmos, Dolby Vision, HDR10+, HLG without some weird restriction like 'except the Netflix app'. Even worse if you wanted DV IQ I assume, but I think my TV doesn't support it anyway.

And honestly I think I prefer WebOS to Fire sticks' UI for example. Stock Android TV isn't great either. If I was into playing Xbox/PS games I imagine that would make a better 'hub', since it's presumably (I have no idea) more organised around your own content/games at least rather than random adverts and features you don't want. Also probably more money being spent on designing it.

mvanbaak
0 replies
16h19m

If you have an LG TV, HDR10+ is not needed, as the dynamic handling of HDR10 on the TV makes the HDR10 layer look as good as HDR10+. With this in mind, an Nvidia Shield Pro is pretty much what you are looking for.

eru
0 replies
14h38m

An external Chromecast stick also works fairly well (assuming you pay for YouTube premium and Netflix etc to avoid ads, but at least not much in the way of bloatware).

dylan604
0 replies
15h4m

The webOS on my LG is so out of date that modern apps won't install. So instead I don't use webOS, and treat it as a dumb TV with an HDMI input by using an external device for the smart things. There is nothing wrong with the picture that would require any of the updates. So just don't connect the smart TV to the rest of the world, and say thanks to all of the others who do, for subsidizing the price of your TV.

LtWorf
8 replies
17h43m

I don't want to hack my TV, I want to watch a film.

kilolima
7 replies
17h23m

It's "hacker" news...

aaviator42
3 replies
15h24m

Parent comment's point is that you shouldn't have to hack your TV just to get a half-decent user experience.

dylan604
2 replies
15h9m

You shouldn't have to, but the fact that you can is part of the hacker ethos. We've already gotten past the most fundamental question of hacking: can it be done? The next is: can it play Doom? The third is to announce it. Then we just have to get past the boring people asking "why".

arp242
1 replies
14h46m

I don't want to spend all my waking hours hacking all things in my life. That's not what "hacker ethos" means. Especially not when the "hack" is working around stupid wankery, which, to me, is completely boring and uninteresting and doesn't "build" anything.

I just want to watch a film.

I don't want to hack my forks either. Or my microwave. I just want to heat stuff and then shovel it in the appropriate foodhole so I can get on with things.

dylan604
0 replies
14h8m

Then don't buy the thing in the first place????

rcstank
0 replies
15h6m

I’d rather spend the time “hacking” something fun than a smart TV. My time is better spent elsewhere

jmprspret
0 replies
15h8m

Tell that to the VC posters, the corporate drama posts, the psychology links, the biology, space and popular science articles that are on here.

This place isn't for just hackers, not anymore at least. And that brings good and bad.

datameta
0 replies
16h56m

I suppose there's an opportune time to hack and a sub-optimal time to do so, subjectively.

IshKebab
1 replies
18h22m

But can you make the UI not super laggy?

I recently acquired a very old Sony TV and it reminded me how lag-free TV interfaces are supposed to work. But nooooo, LG is like "we're going to make great TVs, but the UI is a poorly written web page running on a 386, enjoy!"

mtlmtlmtlmtl
0 replies
18h10m

Not really, but you can improve it a little by disabling automatic firmware updates. So it at least doesn't get worse.

peruvian
10 replies
19h0m

It's not that hard to get around this. I kept my TV offline and plugged in an Apple TV immediately.

hedora
9 replies
17h28m

Some TVs constantly display nag screens or blink the power LED at you if you keep them offline.

I can't decide if it's better to check for such things before purchase, or just return it as defective if I end up getting a model that does this.

myself248
4 replies
14h56m

Absolutely return as defective. Make it sting a little.

If it doesn't say on the box "requires constant internet to function" in big bold letters, it should.

eru
3 replies
14h36m

What about saying in big bold letters 'requires constant electricity to function'?

That would be a ridiculous requirement, because people expect that TVs need electricity. But the producer can argue that people these days also expect TVs to need the internet, can't they?

myself248
2 replies
14h10m

They sure as fuck cannot, I've never connected a single TV to the internet. That's something that needs to be explicitly declared.

eru
1 replies
10h10m

That would be for a judge to decide.

myself248
0 replies
2h47m

They're going to take me to court for returning a TV?

radicality
1 replies
17h9m

I also use an Apple TV with my Samsung TV. I did give the TV WiFi credentials so that it can reach the internet if needed (e.g. firmware download), but I set up firewall rules to block all traffic for the TV in steady state.

stavros
0 replies
15h24m

Trust me, you don't want your TV to download firmware. My Sony TV started freezing every so often after downloading some random update. Since then, it's gone off wifi and only the Chromecast is allowed to connect.

nunez
0 replies
14h29m

This is why I bought a Sony. They have a ton of features packed in, but won't scream if you don't ever connect them.

eru
0 replies
14h37m

[...] blink the power LED [...]

You can fix that with a piece of tape or Blu-Tack.

lintimes
2 replies
19h3m

Worse yet, now high-end monitors (Samsung Odyssey G9 OLED) are offered with poorly implemented smart TV hub features.

petsfed
1 replies
16h27m

The only thing that works well on their smart TV hub features is the ability to block signal from my current-gen Chromecast.

Coordinated asshole design.

eru
0 replies
14h35m

How does that blocking work?

nolan879
1 replies
16h13m

There are some business/enterprise displays that offer a "framework" approach by using a Raspberry Pi CM4 to control everything. Unfortunately, that is out of reach for most consumers lacking a corporate sales account.

Flameancer
0 replies
15h3m

That also makes me wonder how much TVs are being subsidized with ads and sponsored content. Those business displays cost significantly more than a regular consumer TV.

c0pium
1 replies
15h40m

This gets most of the way there, with buying a monitor instead of a TV. You just need to take the next step and get a PG42UQ. OLED, super accurate color, great response time, and unfortunately the answer to the question of what modern TVs would cost without the app/telemetry subsidies.

imhoguy
0 replies
8h2m

It always surprised me that digital signage screens without all that TV crap were many times more expensive than plain consumer equipment.

acchow
1 replies
18h13m

It doesn’t become useless junk. You can still use it as a display without any of the smart features. Connect to a Roku or Apple TV (which also automatically turn on when the TV turns on) for smart features.

tentacleuno
0 replies
18h4m

I've honestly never used the smart features of my TV. It's an old 4K LG (Netcast era), so it's been in service for 6+ years. The "smart" stuff always sucked: riddled with privacy policies, generally quite slow, and could honestly be replicated way better with a media stick / NUC.

A NUC is generally the way to go for smart TV functionality in general, IMO: you get way longer support, and it's much easier to diagnose problems as compared to some locked-down, proprietary, do-it-ourselves OS.

whydoyoucare
0 replies
19h28m

Convert the Smart to DumbTV by offloading all apps to your favorite streaming device (for example, Apple TV or Nvidia Shield). If it helps, don't ever enable internet connectivity to the Smart TV.

satvikpendem
0 replies
15h2m

Funny, I use an LG OLED 48" TV as my monitor. It's really the cheapest option for a 4k 120hz OLED, although I'd prefer 240hz as GPUs are getting more powerful.

nunez
0 replies
14h31m

Doubt it. Most people only care that their TV is huge and cheap. Visual quality is a distant afterthought.

didntknowya
0 replies
15h24m

Yeah, it's a shame my TVs work so fast when I first buy them but become noticeably laggy with each update.

dontlaugh
46 replies
23h53m

Sadly, (mild) motion interpolation is necessary for those of us that get headaches from 24fps video, especially panning shots.

If only filmmakers started with decent frame rates. The few films that came out at 48 fps are so much nicer to look at.

jiayo
15 replies
23h17m

It's funny, watching films in 48fps in theatres (specifically the first Hobbit movie that pioneered the concept) to me looks like the actors acted in 2x slow motion and then someone pressed fast forward. Everything looks incredibly unnatural.

cassianoleal
10 replies
21h29m

I had a different experience. At first, I found things unnatural like you did. After a few minutes, I figured out that it was actually the opposite - it had a bit of a theatrical thing going on, i.e. live action vs. recorded and played back.

At that point I figured out that it was just because I've been so used to crappy frame rates that the more natural movements feel out of place.

I wonder what the first pass, with less motion blur, would have felt like. Maybe better, maybe worse. I kind of feel it would make it worse, in the same way that the transition from analog to high-definition digital made it look worse to me, since I could notice the transitions between frames. That is, at least at first. I'm used to it now.

whynotminot
9 replies
13h52m

I firmly believe most people complaining about the Hobbit in 48 FPS just don’t like change.

It’s not worse. In fact it’s so vastly better. Watching a 3 hour movie in 3D without getting a headache from 24 FPS judder (“magic”) was a revelation.

But it's different, and to many people different reads as bad.

kalleboo
6 replies
13h31m

I was really looking forward to 48 fps - the juddering in wide panning shots on 24 fps always takes me out of the immersion so I was hoping the cinema world could move forward.

There was just something very wrong with it. It kept feeling like suddenly parts of the movie were sped up and going at 2x speed. To the point where I wonder if the cinematographer was just not experienced enough with the technology to make it work right (like, maybe there are corollaries to the 180-degree shutter rule that need to be experimented with).

I watched it in 2D, I'm sure for 3D it makes a whole different experience.

wharvle
2 replies
11h6m

Also saw the first film in 2d in the theater at 48fps, also very excited going in, also felt like everything looked sped-up.

What it reminded me of was silent-era film that's been slightly sped up on purpose for comic effect. All the walking looked kinda jerky, like it does when you speed up footage of someone walking, for instance. Or very early hand-cranked film that was cranked a bit inconsistently. It was so distracting I could hardly focus on anything else the entire movie. If it'd been a better film, I'd say that gimmick ruined it, but… well.

kalleboo
1 replies
10h59m

Yeah it was very strange. Like, you don't get that effect from 60fps TV or video shot on your smartphone, so I wonder what went wrong to cause it.

wharvle
0 replies
2h16m

Yeah, I’d seen higher-frame rate video before, and since (though maybe not again in a movie theater?) but that’s the only time I’ve noticed that particular problem. I spent a little time searching around after I saw it, trying to figure out what happened, but the chatter over 48fps in general drowned out any signal about what might have caused that specific issue (though many others did report experiencing a similar sped-up effect, at the time—never found an explanation, though, aside from just blaming the high frame rate, but I suspect there’s more to it)

eternityforest
1 replies
8h51m

Maybe they should try dynamic frame rates, or even dynamic rates for different parts of the screen, since 24 is usually fine for most shots.

simondotau
0 replies
3h29m

This is common in drawn animation, where some elements are effectively 12fps while others are 24fps or 8fps. This can even occur simultaneously within the same shot. (This is known as "on twos" and "on threes", etc.)
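
Purely illustrative, a tiny Python sketch of the exposure idea (the names are made up): "on twos" means each drawing is held for two frames of the 24fps timeline, so 12 unique drawings fill a second.

```python
def expose(drawings, hold=2):
    """Hold each drawing for `hold` frames: hold=2 is 'on twos', hold=3 is 'on threes'."""
    return [d for d in drawings for _ in range(hold)]

# Three drawings exposed on twos occupy six frames of a 24fps timeline:
print(expose(["d1", "d2", "d3"]))  # ['d1', 'd1', 'd2', 'd2', 'd3', 'd3']
```

A naive interpolator sees those held frames as zero motion followed by a sudden jump, which is part of why interpolated animation looks off.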

olyjohn
0 replies
10h51m

I didn't like it either. I think it looks too real and takes you out of the fantasy. Something about the way 24fps looks is different from reality and lets your imagination take you away a bit easier. I can't really explain it.

Like I can watch an animated show, and I have no expectations of it looking real. I can still get lost in the story and enjoy it. I don't need it to look like I'm actually there.

zhoujianfu
0 replies
11h46m

I watched it in 3D HFR and it was terrible terrible terrible to me. I felt like I was watching a play. The special effects looked weird/bad, the acting felt worse, the makeup more obvious, everything yuck yuck.

stefandesu
0 replies
10h3m

Back in the day, I watched The Hobbit first in 3D 48 fps, and I hated it. Sure, it got less bad throughout the movie, but I just couldn't get used to it at all.

I then rewatched the movie (also in theaters) in 2D 24 fps and it was infinitely better.

I have the same with YouTube videos. I can't stand 60 fps videos (except for gaming content); it causes headaches for me. If it's something I really want to watch, I'll download the 24 fps version and watch it offline (YouTube has it on their servers, but only serves it to certain clients or something).

I know other people who just can't see a difference, or rather, they don't notice it much. I feel like I can spot a 60 fps YouTube video in about 2-3 seconds (usually I pause then to check and reconsider whether I really want to watch it). I also tend to notice 30 fps videos (compared to 24), although they don't really bother me.

Considering that I've always been the only one at movie nights to notice when the TV's frame interpolation setting is on, I guess I'm an outlier.

ender341341
1 replies
21h22m

I think people blamed the frame rate, but for me it was the rest of the effects that put me off: faces were too softened, lots of scenes had weird color saturation, and as others mentioned there was a lot of motion blur. Compared to LOTR, the VFX really pulled me out of lots of scenes.

thaumasiotes
0 replies
11h19m

I did not notice anything that I would attribute to the frame rate. Frankly I think the visuals of the movie were fine.

It was the writing that was the problem.

sp332
0 replies
22h33m

Peter Jackson added more motion blur in those because he said that it played better with a focus group. I think sticking to a normal 180-degree shutter angle would not feel so weird.

dontlaugh
0 replies
21h55m

There was weird motion blur, but to me it looked far more normal than most films. I realise it’s a minority opinion.

TylerE
9 replies
21h56m

Even if they just moved to 30, like normal TV, that'd be noticeably better.

dontlaugh
4 replies
21h45m

TV in the UK is generally 50 Hz, at least.

TylerE
3 replies
20h52m

That's interlaced though. Effectively half that for progressive scan.

dontlaugh
2 replies
20h48m

I don’t think interlaced video is still broadcast, since analog was shut down.

kalleboo
1 replies
13h37m

https://en.wikipedia.org/wiki/List_of_HD_channels_in_the_Uni...

All HD channels in the UK broadcast at 1080i, apart from Sky Sports Main Event UHD channel and the BT Sport Ultimate 4K

dontlaugh
0 replies
9h54m

That surprises me. I guess I never noticed since I always have interpolation on nowadays.

squidsoup
2 replies
19h41m

I don't think many cinematographers would agree with you. We have the technology to make films at higher framerates, yet few choose to do so.

Interestingly, David Lynch shot Inland Empire at 60fps interlaced, but the film was released at 24fps.

anthonypasq
1 replies
18h10m

Anything with CGI will be much more expensive at higher frame rates, right? I also have absolutely no idea how any of that works.

squidsoup
0 replies
17h53m

I would think so, double the render time for 48fps, which is how Cameron shot Avatar II.

mejutoco
0 replies
21h46m

Just a note that NTSC has 30 fps. PAL has 25 fps.

jokowueu
6 replies
21h58m

I've never heard of headaches from 24fps video, must be rare.

Finnucane
3 replies
21h45m

I'd never heard of it growing up in the era when that's all there was. I don't know if it's rare, but it's only in the last decade or so I've heard people complaining about it.

exitb
1 replies
19h42m

We don’t perceive all types of screens in the same way. Film projectors and CRTs display parts of the frame, only part of the time. TFT and IPS screens introduce a lot of inertia and blend the frames. Both of these help the motion illusion. OLED on the other hand has the harshest frame transition - it displays the entire area for the entire time and switches frame content almost immediately.

foresto
0 replies
9h20m

it displays the entire area for the entire time and switches frame content almost immediately.

I've heard this called the sample-and-hold effect. It looks a bit like a fast slide show, and really stands out in high-contrast, steady motion scenes.

JohnFen
0 replies
19h56m

I grew up in the same era, and I definitely knew people who could not watch TV because it induced headaches in them.

smcleod
0 replies
21h38m

I get it too. I visually see tearing / juddering on most video lower than about 40FPS and it’s incredibly tiring.

dontlaugh
0 replies
21h53m

Afaict it’s a kind of migraine. Everyone I know that gets headaches from low fps also gets migraines.

cpach
5 replies
21h25m

Interesting. Did you ever go to the movies in the 90s? Those were all 24 fps. Did you get a headache then?

rocqua
3 replies
18h36m

Cinema has a black period in the middle of every frame, which actually prevents these problems.

huytersd
2 replies
13h43m

Wait, are you saying that 50% of the movie is a black screen?

bee_rider
0 replies
12h59m

Should be 50% off in that case!

ace2358
0 replies
12h5m

Yes. Also, most of your LEDs are off 50% of the time too. If it’s done quick enough, you don’t see it.

dontlaugh
0 replies
9h58m

Cinema at 24 fps also gives me headaches in panning shots. It did as far back as I can remember.

thedougd
1 replies
18h43m

Are we including 3:2 pull down and frame doubling as interpolation?

dontlaugh
0 replies
9h57m

No, that makes it much worse. Avoiding it is a big reason to get a 120 Hz TV.

rocqua
1 replies
18h37m

There is the alternative of black frame insertion (BFI). It loses a lot of brightness, but helps a lot with stuttering at 24 fps.

The problem that causes stuttering is (simplified) that your eye is moving smoothly to follow something on the screen, whilst the image is moving in discrete steps. So when your eyes expect something fixed inside your view, it's actually stuttering.

Black frames make use of natural image retention in the eye, where you effectively continue to see the last bright thing. Hence what you expect to be stationary in your field of view does remain stationary.

This was actually key to film-based projectors working, because they need a period of black to advance to the next frame. Without image retention it would seem to flicker. Though 24 Hz is a bit too slow for that, so they actually added a black period (by just blocking the light) in the middle of each frame to even out the effect. They were already doing BFI, not for motion smoothing, but for flicker smoothing. It seems likely this is accidentally why 24 Hz film doesn't stutter whilst 24 Hz sample-and-hold screens do.

Personally I care too much about the brightness loss of BFI, but it might be interesting for you.
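
To make the brightness trade-off concrete, here's a minimal Python sketch of BFI timing, assuming a 24 fps source on a 120 Hz panel (the numbers are illustrative, not any particular TV's implementation):

```python
SOURCE_FPS = 24
PANEL_HZ = 120
SLOTS_PER_FRAME = PANEL_HZ // SOURCE_FPS  # 5 refresh slots per film frame

def bfi_schedule(frames, lit_slots=3):
    """Light each source frame for `lit_slots` refreshes, black for the rest.
    Fewer lit slots means less sample-and-hold blur, but proportionally
    less brightness (here 3/5 = 60% of full)."""
    schedule = []
    for frame in frames:
        schedule += [frame] * lit_slots
        schedule += ["BLACK"] * (SLOTS_PER_FRAME - lit_slots)
    return schedule

print(bfi_schedule(["A", "B"]))
# ['A', 'A', 'A', 'BLACK', 'BLACK', 'B', 'B', 'B', 'BLACK', 'BLACK']
```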

dontlaugh
0 replies
9h55m

It helps a little and I prefer lower brightness anyway. But it doesn’t remove the panning judder entirely. Mild interpolation does better.

lainga
0 replies
21h26m

Did you get this symptom from the narrow shutter snapshot-like filming in Saving Private Ryan?

https://cinemashock.org/2012/07/30/45-degree-shutter-in-savi...

hoseja
0 replies
8h20m

Yes! Panning shots are such a pain. Motion-blur them or something, please.

Timon3
0 replies
22h55m

I'm personally fine with 24 and 48 FPS (without interpolation), but what I absolutely can't stand is a variable rate. I saw Avatar 2 in the cinema with this, and it ruined the experience for me. Switching down always felt like the projector is lagging.

smcleod
33 replies
21h40m

Motion interpolation is an absolute essential for me. I can't stand how choppy TVs (even high-end models) are without it.

30fps videos look very jarring to me; I almost always notice "tearing" or "shuddering", even when in a cinema. Enabling motion interpolation / frame-rate upscaling usually fixes this for me.

It’s so distracting and at times almost painful for me to watch without it that at times I’ll use a tool (Topaz Video AI) to re-render videos to 50-60FPS.

ethbr1
11 replies
21h38m

Interesting!

I can't stand motion interpolation. Turned off on every TV I own. Will literally walk away and do something else if it's on a (non-sports) TV in public. There's something "too smooth" about it that irks me.

smcleod
4 replies
21h30m

It’s interesting to think how different our visual systems must be right? I’m always saying to friends “how can you watch this? It’s so choppy!” And some of them agree and others don’t see it at all.

Biology is weird so I say just give people the options to pick what works for them.

ethbr1
3 replies
20h18m

I read this (from HN?) a while ago, and it boggled my mind how much of subjective reality is actually invisibly self-fabricated.

https://www.portsmouthctc.org.uk/a-fighter-pilots-guide-to-s...

I expect refresh rate is similar, given that... if a substantial portion of your subjective perception is mentally fabricated, then your brain physiology contributes, and that's set during childhood.

For reference, I grew up on NTSC screens (29.97 interlaced frames per second).

MandieD
1 replies
19h52m

That link is an article I really, really wish I'd read while learning how to drive, and is something I'll teach my kid before he starts riding a bike with traffic. I hadn't seen it before, so thanks.

ethbr1
0 replies
18h42m

That and the Dutch(?) bike-safety trick [0] are minimal effort life hacks I got from HN.

[0] In urban/bike areas, always open a car door with your opposite hand (e.g. driver's side with right hand). It forces you to turn your body, which allows you to look behind you, which lets you notice bikers approaching from behind before you open the door and splat them.

toast0
0 replies
19h58m

For reference, I grew up on NTSC screens (29.97 interlaced frames per second).

Considering it as 30 interlaced frames per second isn't really accurate. It's 60 fields per second. A lot of content intended for interlaced broadcast is not 30 fps broken into fields, it's 60 distinct half pictures per second.

(Excuse my rounding)

saltcured
1 replies
19h46m

I tend to avoid it, but don't constantly try out newer devices and their settings. I always remember when I first saw it on a proud friend's new TV about a decade ago. I was deeply disturbed and asked him to turn off the feature.

We were watching an action/fighting movie with swords and other martial arts, and I distinctly saw these graceful arcs of the actors' limbs and weapons turned into polygons. The motion interpolation clearly inferred linear transitions between the different positions captured in the actual frames. Imagine a large swing tracing out an octagonal path with all the physical unreality that would entail.

It seemed like I was the only one in the room who perceived this travesty.

satvikpendem
0 replies
10h57m

Usually TVs have bad motion interpolation which ruins the concept for many people. I use SmoothVideoProject on my computer which uses your GPU to analyze the motion vectors between frames via deep learning (Nvidia optical flow analysis) so it's much better.
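
For the curious, a minimal single-pair sketch of the underlying idea (not SVP's actual pipeline): estimate dense optical flow between two frames with OpenCV's Farneback method, then warp halfway along the flow to synthesize an in-between frame. Naive warping like this is exactly what produces the artifacts people notice at occlusions and fast motion.

```python
import cv2
import numpy as np

def midpoint_frame(prev_bgr, next_bgr):
    """Synthesize a rough halfway frame between two video frames."""
    prev_gray = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_bgr, cv2.COLOR_BGR2GRAY)
    # Dense per-pixel motion: prev(y, x) ~ next(y + flow[..., 1], x + flow[..., 0])
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = prev_gray.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    # Sample `next` halfway back along the motion vectors.
    map_x = xs + 0.5 * flow[..., 0]
    map_y = ys + 0.5 * flow[..., 1]
    return cv2.remap(next_bgr, map_x, map_y, cv2.INTER_LINEAR)
```

Doubling 24fps footage would then mean inserting one such synthesized frame between every original pair.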

pseudosavant
1 replies
21h24m

The problem typically is that motion interpolation isn't consistently smooth. Generally a fixed 30fps will seem smoother than something that varies between 40 and 60fps. Our brains are sensitive to changes in the pacing.

smcleod
0 replies
21h18m

The motion of an object isn't the same as the frame rate though. You can have a 60fps scene where an object is moving fast on one side of the screen and slowly on the other. It only means that, for a given object travelling from A to B, its movement will have more fine detail over a given distance.

satvikpendem
0 replies
10h56m

Usually TVs have bad implementations, while on the PC, using something like SmoothVideoProject, which uses your GPU for motion vector analysis via optical flow, makes it much higher quality.

Also, you get used to it after a while. At first I was similarly jarred by the smoothness, as if it were fast forwarded, but after a few hours of getting used to it, when I saw 24 FPS content, it was literally unwatchable as it looked like a slideshow to me.

runeofdoom
0 replies
21h28m

Same here. It feels like it takes everything, from classic B&W to modern SF extravaganzas, and turns them all into somebody's home videos.

At the same time, I'm pretty confident that this is a subjective phenomenon. My parents have it on all their TVs and my mother both prefers it and notices immediately if video isn't 60fps or equivalent, while my father says he doesn't notice the difference.

xnx
6 replies
21h27m

Is tearing common? What video source are you watching where the frames recorded in the video do not match the frames displayed by the display?

smcleod
5 replies
21h22m

Very common for video footage lower than 40FPS. It doesn’t matter what the source is (AppleTV, laptop with HDMI, nvidia shield, PS5) - this is very noticeable to a large chunk of the population.

wolfd
2 replies
21h13m

What do you mean by “tearing”? Like, VSync-off tearing or a different effect?

smcleod
0 replies
20h44m

Sorry I was conflating juddering and tearing, I meant juddering.

Zetobal
0 replies
21h4m

Must be vsync off otherwise nobody would watch movies.

Retr0id
1 replies
21h2m

Perhaps you're conflating juddering and tearing? - they are distinct. Judder is what you see with, for example, low-fps panning, but tearing is where one segment of the screen (usually a horizontal strip) is out of sync, still displaying the previous frame, while the rest of the screen has moved on. This is not normal on a correctly configured system.

smcleod
0 replies
20h44m

Sorry - yes I am - I had to look up some examples but I’m talking about juddering.

jandrese
5 replies
21h29m

Decades of TV being filmed on cheap(er) video cameras that had lousy picture quality but captured at 60 fps vs. film that looked beautiful but only captured at 24 fps has taught people that blurry smeary motion is the ideal.

smcleod
2 replies
21h21m

If it’s blurry or smeary then your TV / source is doing it very wrong or just can’t keep up or is too low resolution / lacks quality upscaling.

jandrese
1 replies
21h16m

It's blurry and smeary in the movie theater. You just can't capture fast motion at 24 fps. Once you train yourself to look for it you will never be able to stop seeing it.

smcleod
0 replies
21h12m

Oh sorry you mean it’s blurry and smeary at 24 fps in the cinema! Yes I agree. Sorry I thought you meant that higher frame rates looks blurry.

J_Shelby_J
1 replies
21h24m

I used to think that but now I’m not so sure. Yeah 24fps is bad for panning and sweeping movements, but….

There is something about 24fps that I believe may have something to do with how the eye or brain works that makes viewing more immersive. Perhaps it’s due to historical cultural reasons, but I’m not sure that totally explains it.

FWIW I play Valorant on a 390 Hz monitor, so I am not a "the eye can only see 60fps" truther.

smcleod
0 replies
21h16m

24/30fps looks dreadfully unnatural and distracting to many people though. It’s almost painful to watch on larger screen sizes.

satvikpendem
2 replies
11h0m

Check out SmoothVideoProject, it does the interpolation in real-time. I also use Topaz sometimes but it's too slow for most use cases unless you have the time to wait.

smcleod
1 replies
6h33m

Very cool! Shame that Plex Server / AppleTV doesn’t implement this natively

satvikpendem
0 replies
4h24m

Yes, maybe it could be done as a plugin but I generally use it on my computer. They also have an Android app which interpolates video on your phone too, pretty cool.

doctorhandshake
1 replies
20h3m

Is it possible what you’re seeing is ‘judder’[1] or bad 3:2 pulldown? I really don’t think much actual ‘tearing’ [2] makes it to screens in theaters - that would be a big screwup!

1 - https://www.techtarget.com/whatis/definition/judder

2 - https://en.m.wikipedia.org/wiki/Screen_tearing

smcleod
0 replies
18h34m

Yes sorry, Judder is the correct term.

wolfd
0 replies
21h17m

A major issue with motion interpolation is that it can't be perfect, and is often far from it. The implementation on many TVs is jarring: you'll see super-smooth motion while an object is moving at a slow or medium speed, but as soon as the patch of pixels that it's tracking goes really fast, it assumes the patches are distinct, and the motion will be juddery. Individual objects switching from high frame rate to low in the span of a half-second is quite noticeable to my eyes, but I admit that most people around me don't seem to care.

Maybe one day the real-time implementation will be good enough, but I find that it’s shockingly bad most of the time.

smusamashah
0 replies
19h35m

For me, when motion interpolation is on, I can immediately see that it's interpolated. And then I keep noticing the artifacts at boundaries and where lines meet. It's very distracting. I experimented with this setting while watching Koyaanisqatsi, and for me it was best with very slight interpolation (at 3 on a scale of 1 to 10).

HPsquared
0 replies
21h24m

Same. It's more noticeable on large screens because more is in the peripheral vision. Screens are larger today (and perhaps people are putting bigger TVs into smaller rooms than before) so we see more of the screen in our peripheral vision than before.

Peripheral vision has a lot of rods (instead of cones) which are more sensitive to rapid motion. I can certainly pick up flicker and "perceive the frames" more clearly when looking in my peripheral vision.

Same goes for the old CRT monitors: 60 Hz was an absolute no-no, 85 was tolerable but higher was better.

Edit: CRTs were worse, of course, because they were constantly flashing light-dark, unlike LCDs which simply transition from frame to frame.

dissident_coder
20 replies
23h56m

motion smoothing is the worst feature ever conceived by man

jokowueu
10 replies
21h57m

It's only good for anime where the fps is extremely low

nyx
4 replies
21h46m

Counterpoint: this YouTube rant by an animation person called Noodle is a pretty good overview of why frame interpolation sucks. https://www.youtube.com/watch?v=_KRb_qV9P4g

Basically, low FPS can be a stylistic choice, and making up new frames at playback time often completely butchers some of the nuances present in good animation.

jandrese
2 replies
21h25m

Personally I'm dubious. There may be times when the low framerate is a stylistic choice, but the vast majority of the time it's purely a budget thing.

wodenokoto
0 replies
10h39m

Which the style is then centered around.

If it had looked better with frame interpolation, the studio could have baked it in before release.

tuna74
0 replies
19h12m

If you are a good director, you can make the most of that low budget. Look at the first episodes of Scum's Wish (https://www.imdb.com/title/tt6197170/) if you want a good example.

jeroenhd
0 replies
19h49m

Low FPS can be a stylistic choice, but as a member of the audience, I tend to disagree with that choice.

Perhaps it depends on the quality of the execution, but there are shows where I wished I had frame interpolation.

dully-abrading
4 replies
21h47m

Animation is the worst use case for motion interpolation because the frames are individually drawn and timed by the animators to achieve a particular look and feel.

ethbr1
3 replies
21h36m

*Used to be. Now they're rendered.

tuna74
2 replies
19h16m

A lot of animation is still drawn. Some CG anime is also combined with 2D drawn elements (like in Spider-Man: Into the Spider-Verse).

ethbr1
1 replies
18h40m

The luddite hacker in me wants to try interpolation on early Ghibli movies.

imp0cat
0 replies
10h9m

Do it! I personally love the old Looney Tunes; it makes them so smooth it's almost unreal (but in a good way).

smcleod
2 replies
21h36m

For me it’s the absolute most important setting to enable on any TV. Without it I really notice the tearing / juddering effect that footage <40fps has.

imp0cat
1 replies
19h26m

And you know what, you may both be right! Different TVs use different algorithms and tricks, so what looks great on one might look quite bad on another.

smcleod
0 replies
18h35m

Yeah, absolutely true. I've seen some Samsung TVs do a really bad job of it (also, their colours are terribly oversaturated by default), whereas I've found the LGs do a good job (at least the C and G series).

ThrowawayTestr
2 replies
21h50m

I hear it's good for sports.

drivers99
1 replies
21h30m

Sports are already shot at 60 fps. Same for any live TV events. It makes movies look terrible, like it was shot on video, to me and many others. Soap Opera effect. But for things that were shot live on video, it looks like what it is.

For the high frame rate version of The Hobbit that I saw, in my opinion it looked bad for character shots but cool for overhead action views.

imp0cat
0 replies
19h31m

Nowadays, TV sets can display a lot more than 60 fps (120 and 144 Hz sets are quite common).

todfox
1 replies
22h53m

It amazes me that humans continue to have this perverse desire to fix what isn't broken.

kevincox
0 replies
22h35m

1. Features on the box.

2. More pop on the store display.

It turns out that "fixing" these things does result in more people picking the TV. Just like an overstated display will typically be preferred in a side-by-side comparison.

ProfessorLayton
0 replies
21h48m

I use it on a game-by-game basis when playing my Switch, because it's so underpowered. It works surprisingly well when playing Zelda Tears of the Kingdom.

bkm
19 replies
19h16m

Just get a modern Sony TV and be done with it. They perfected Motionflow to the point where you no longer think about frame rate (neither choppiness nor soap opera). It's clearly a priority, probably because they are the only manufacturer with their own studios (Columbia/Sony Pictures). There is a reason people pay the $800+ Sony tax over any TV that has the same panel.

com2kid
6 replies
18h34m

The Sony tax is because ads on Sony TVs can all be turned off. Plenty of TVs have their price subsidized by ads, whereas when going through initial setup, I've had Sony TVs with ads disabled by default and questions asking if you want to turn them on.

Sadly disabling "recommended content" on the Google TV launcher also disabled voice search from the remote, but I am pretty sure that is a Google problem and not something Sony chose.

(Also my Sony TV cannot stay connected to my WiFi network for more than half an hour before I have to toggle WiFi on and off again...)

kridsdale1
3 replies
12h53m

All TVs can have ads disabled: unplug the Ethernet.

com2kid
2 replies
10h34m

Given that 100% of my TV usage falls into these categories:

1. Controlling Spotify
2. YouTube videos
3. Photo albums from Google Photos

No network connectivity would render my TV completely useless.

Though I think I could show photos from a thumb drive, so I'd have that going for me, I guess.

hirvi74
1 replies
10h13m

Chromecast, Apple TV, Fire Sticks, Rokus, etc. could all help out here too. I connect a few of those but never my TV.

com2kid
0 replies
8h55m

Fire Stick and Roku are worse for ads than what Sony ships.

To clarify, my TV shows a list of apps, and that is it, aside from a single "suggested channel" at the top that I cannot get rid of.

No content previews, no "watch now!" no special promos, just a list of apps.

To explain a bit more what I said up above: Sony TVs cost more than other TVs with identical specs, ~$200-$300 USD more, but compared to a mid-range LG or Samsung, Sony opts you out of advertising by default (the initial setup is hilarious: for the most part you'd have to manually select a bunch of checkboxes to opt into tracking!).

newhotelowner
0 replies
11h1m

Sadly disabling "recommended content" on the Google TV launcher also disabled voice search from the remote

WTF.

hackmiester
0 replies
14h6m

That behavior follows, given that Google is an ad company.

richwater
2 replies
19h13m

Is it just that easy? Do you have any specific model recs?

imp0cat
0 replies
10h16m

It's easy, but expensive. ;) Search for Digital Trends on YouTube, they have a lot of videos about this. Their current "budget" pick seems to be the Sony Bravia X90L.

Flameancer
0 replies
14h33m

I have an A80J and it's possibly the best panel I've owned/seen in any house that didn't have something similar. My dad, who is a big movie buff with an Atmos hi-fi, actually got one like a month later, after being a big LG fan.

deanCommie
2 replies
19h9m

That doesn't make sense. Are you saying these TVs still butcher the original artistic intent of the creators for the sake of arbitrary petty consumer desires to have their expensive TV purchase be justified?

But they just do it better than the other manufacturers do?

fomine3
0 replies
12h28m

It's a sort of truth. Creators' intention isn't so special to consumers.

flir
0 replies
18h30m

C'mon, that's a reddit-level wilful misinterpretation of what he actually said. I mean, look:

Are you saying the original artistic intent of the creators to insert unskippable ads at the beginning of the disc is more important than the consumer's right to control the playback of the content they bought? Plus I heard it might kill babies.

See? It's just silly.

thedougd
1 replies
18h45m

Are they changing their interpolation settings based on source material? Some TVs will disable motion interpolation when they detect 24 fps content.

imp0cat
0 replies
10h18m

Yes, Sony TVs try to detect the original fps and display it accurately.

https://www.reddit.com/r/bravia/comments/7ztuwv/what_is_moti...

viktorcode
0 replies
18h5m

I own one, and can say that Motionflow produces uneven results. In certain scenes it kicks in, while completely ignoring others. Still has a way to go.

dewbrite
0 replies
13h33m

Not sure exactly what display I have, but it's a recent OLED Bravia. Motion smoothing is still awful.

chpatrick
0 replies
19h7m

But then what framerate is it?

Flameancer
0 replies
14h35m

Bit the don’t tax almost three years ago on an A80J. Honestly the best panel I’ve had. Probably going to buy another Sony panel to replace a circa 2012 4K Samsung.

fideloper
18 replies
18h59m

One thing I seemingly can't disable is how my Samsung TV gets louder when ambient noise is high.

I absolutely do not want my TV to get louder when one of my kids is shrieking. Just adds stress on top of stress

j45
14 replies
18h59m

I believe there is a setting for that.

Using an external receiver can help too.

Daneel_
6 replies
18h57m

Open TV. Find microphone. Apply tape.

AlecSchueler
1 replies
18h26m

Just gotta find a proprietary Samsung screwdriver first.

justinclift
0 replies
13h24m

This should be what you're looking for: :)

https://www.milwaukeetool.com/Products/Hand-Tools/Hammers/48...

fideloper
0 replies
16h37m

I don’t hate this solution one bit

autoexec
0 replies
18h8m

Screw tape, cut it out. Any Samsung TV with a microphone and an internet connection is probably sending everything it picks up to data brokers.

OJFord
0 replies
16h37m

Tape? My television does not need a microphone, if I identify one and open it up, there is no reason to leave it there.

I_Am_Nous
0 replies
18h39m

Apply dollop of superglue, THEN tape for maximum quiet

jjoonathan
5 replies
18h49m

External bluetooth transmitters/receivers are also the cure for shitty PC bluetooth stacks.

They don't switch to garbage quality mode every time an app, website, or game queries the microphone. They don't re-enable shitty defaults every software update. They don't require text config files in Linux, and the critical settings in those files don't get ignored due to open source politics. They don't mess up pairing every time you reboot into a different OS. They just work. $50 will banish all your Bluetooth troubles to the deepest pits of the underworld, where they belong.

j45
2 replies
18h24m

I should have been more clear. An audio/video receiver.

Beyond Bluetooth, optical audio is quietly pretty decent.

jjoonathan
1 replies
18h15m

Yes, TOSLINK is a godsend. It's immune to ground loops and motherboard manufacturers that don't give a shit, which is all of them, even ones that brand around having decent audio (ProArt I'm looking at you).

jrockway
0 replies
13h25m

I have a different Asus motherboard and the audio hardware is on an apparently flaky USB bus (the motherboard has several, as they do). Even with an optical connection, the audio drops out sometimes. It was maddening to me when I first got the computer, because things like this are usually "not all the CPU pins connected to the motherboard" or "you know that RAM you bought on Amazon? yeah, it doesn't remember what you store in it! savings!". But... not this time. (I can pretty much kill USB on this machine by plugging in a bunch of unused USB cables; plugged into the computer, but nothing on the other end.)

I use an external DAC and I've learned which buses break USB when looked at the wrong way. But ... of course the on-board audio is just a USB device. Can't waste a PCIe lane on that!

BrandoElFollito
1 replies
2h17m

I tried a BT transmitter once (a few years ago) and the audio being out of sync with the video was unbearable.

Are there models that are recommended? I have an old Samsung TV (ca 2010) and would love to add BT to pair my wireless headphones.

jjoonathan
0 replies
1h4m

I use a 1Mii B03 connected to my PC through TOSLINK. It has a physical switch to toggle between low latency and high definition mode.

cf100clunk
0 replies
18h55m

an external receiver

Yes, passthrough digital audio to an AVR if at all possible.

squeaky-clean
1 replies
17h52m

Does this help? It's related to the "Sound Sensor" but the menu setting is not called sound sensor. There's also apparently a physical switch you can turn off? I can't check that though.

https://www.samsung.com/latin_en/support/tv-audio-video/how-...

fideloper
0 replies
16h36m

I’ll give it a shot, thanks!!

idontwantthis
0 replies
9h50m

It makes me anxious just knowing that TVs exist that do this.

karmakaze
7 replies
22h7m

I'm surprised it doesn't mention "sharpness". It's tricky to find the zero point: e.g. '0', '50', or '100' depending on whether it means 'add sharpening', 'smooth/sharp', or 'remove smoothing'.

smusamashah
4 replies
19h39m

Agree on sharpness; I undo this on every TV I get a chance to touch at friends' and family's places.

WheatMillington
3 replies
18h35m

You really go around to people's homes and change their TV settings?

smusamashah
1 replies
17h35m

As in I randomly knock some doors, ask to come in, find their TV, then their remote, then change settings to my liking and leave?

paledot
0 replies
12h21m

Not all heroes wear capes.

karmakaze
0 replies
17h11m

I once fixed the remote at a brother-in-law's place. Came back a week later and changed channels using the remote. He shouted "You mean it's been working the whole week and I've been on my knees changing channels?!"

layer8
0 replies
17h29m

Yeah, that’s often inconvenient. There are test images like http://www.lagom.nl/lcd-test/sharpness.php one can use for testing this. A USB stick with some test images can be useful.

jwells89
0 replies
19h43m

This is a frustration shared with some monitors, too. Either the zero-point should be obvious or there should be a toggle that disables the setting altogether.

hunter2_
7 replies
19h53m

I find the headline to be seriously clickbaity, as the word "smart" (in the context of TVs) generally refers to a network connection that facilitates streaming, telemetry, ads, etc. but TFA is not discussing that category of features whatsoever. It's discussing features totally unrelated to the growing popularity of disabling "smart TV" features for the sake of privacy and fewer ads.

Unfortunately, I don't know that there's a generic non-jargon word for this collection of settings, but let's not solve for that by overloading the word "smart"!

layer8
1 replies
17h35m

Those are image-enhancement settings. Too jargon-y?

hunter2_
0 replies
12h27m

Not too jargony, but it possibly conflates the regular calibration settings (brightness, contrast, etc.) which are actually worth tweaking because they let you get closer to the original signal. The settings this article discusses generally let you stray farther from the original signal.

dang
1 replies
14h18m

That was my fault - see https://news.ycombinator.com/item?id=38526517. Fixed now. Sorry!

hunter2_
0 replies
12h25m

No worries; I only meant to comment on the headline within the work itself and didn't even notice a discrepancy!

urbandw311er
0 replies
19h44m

Or indeed being a sort of tech apologist by describing it as “weird”.

j45
0 replies
18h58m

TVs were sold to us with some degree of smarts, but they weren't really smart, and upgrading or replacing the smarts is too much work.

gumby
0 replies
16h16m

Attend a CES and you'll see this use of "smart" is standard in the TV industry. They had net connections before they had special apps and the word "smart" wasn't used back then.

ssss11
6 replies
16h46m

I put my Samsung TV behind a Pi-hole and holy s** did it chatter. It stopped ads in the TV UI, and I can only imagine the other dystopian data-exfiltration API calls.

godelski
5 replies
16h17m

How did you get the ads to go away? Some left for me but most stayed. I understand that this is because pihole can only do DNS based rejection. I actually had to reduce some blocking because the tighter restriction on DNS calls completely made my Samsung TV inaccessible (a la dumb TV, which may be desired by others but not if you want to use viewing apps like Netflix).

I've heard chatter on and off about ad blocking through packet inspection but I suspect this would not be computationally feasible for a pi. But way out of my wheelhouse to even know tbh.

accrual
2 replies
14h0m

I had to fine-tune my local pihole to achieve this with a Samsung TV. I basically turned on maximum filtering, then unblocked domains one by one until the needed functionality worked.

huytersd
1 replies
13h46m

What do you mean by needed functionality? Do you mean apps like Netflix working or something more fundamental in the TV itself not working without being able to connect to the Internet.

accrual
0 replies
6m

Right. For example, the built-in Netflix app wouldn't launch until I unblocked some Samsung-specific domain, despite Netflix itself not being blocked. So it was kind of like test feature > unblock domain > retest feature until everything worked with fewer/no ads on the dashboard.
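
A hedged sketch of a helper for that workflow (the IP address and log path are assumptions for your own network, and any domains it prints are specific to your TV): parse Pi-hole's dnsmasq-format query log and rank what the TV has been asking for, so you know which domains are candidates to unblock or keep blocked.

```python
from collections import Counter

TV_IP = "192.168.1.50"       # hypothetical: the TV's DHCP-reserved address
LOG = "/var/log/pihole.log"  # default Pi-hole v5 query log (dnsmasq format)

domains = Counter()
with open(LOG) as f:
    for line in f:
        parts = line.split()
        # Query lines look roughly like:
        #   Dec  6 12:00:00 dnsmasq[123]: query[A] some.tv.domain from 192.168.1.50
        if len(parts) >= 2 and parts[-2] == "from" and parts[-1] == TV_IP:
            for i, tok in enumerate(parts[:-2]):
                if tok.startswith("query["):
                    domains[parts[i + 1]] += 1

for domain, count in domains.most_common(20):
    print(f"{count:6d}  {domain}")
```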

wpm
0 replies
11h42m

You can hook up an Apple TV or a PC to your TV and watch whatever you want.

ssss11
0 replies
8h18m

I think I had to block and unblock some calls to figure it out but I don’t recall it being too tricky

sp332
5 replies
1d

And don't forget overscan.

jaywalk
4 replies
21h46m

The fact that it's still enabled by default on TVs being sold today is an unforgivable sin.

jandrese
3 replies
21h26m

We had the perfect opportunity to dump it into the wastebin of history when TVs switched to HD, but for some damn reason the industry decided to carry it forward.

sumtechguy
1 replies
19h58m

You would be surprised how many movies on DVD/Blu-ray have junk in the margins. Usually just a line or an overscan of the audio track. But a lot of them have it.

I turn off overscan as those artifacts do not bother me.

xp84
0 replies
19h24m

Sometimes you can see the closed captions line of the NTSC signal at the top of the frame of the video when watching an old show converted to digital from an NTSC source. It looks like a single line of black and white dashes which dances around quickly every time the onscreen captions would have changed and then sits static until the captions update again.

toast0
0 replies
19h47m

Provisioning for overscan on 1080i CRTs seems just as valuable as with 480i CRTs.

People want content to the edge of the screen, but not to pay a TV technician to come and calibrate their tube to exacting standards in their home. Content creators need to know that some of their broadcast is invisible to some of their viewers as a result.

Pixel-perfect TVs came later, so the transition to HD wasn't the right time. ATSC3 could have been a reasonable time to change, but then broadcasters couldn't use the same feed for ATSC1 and ATSC3 ... and who knows if ATSC3 will ever win over ATSC1, or if all the OTA TV spectrum will be refarmed to telcos before that happens.

lucisferre
5 replies
23h40m

I've tried filmmaker mode and it is just another kind of bad smart TV setting, making everything way too dark instead of way too bright. Turning off most "features" of these TVs seems to be the only sane solution.

Of course perhaps it is the filmmakers that are to blame: https://www.avclub.com/how-to-watch-dark-movies-and-tv-shows...

pipes
3 replies
22h2m

Yeah I'm finding film maker mode way too dark on my Samsung oled.

I can't find any explanation of how it actually works. Does each movie get different settings set up by the director?! Doubt it.

room500
0 replies
19h21m

Typically, it means that if you put the TV in a dark room, it is calibrated to the same specifications that the monitors used in post-production used. Therefore, it is what the directors "intended" the video to look like since they were looking at the monitors (in a dark room).

However, if your room has even a little light in it, the settings would make the TV too dark.

It will also disable any effects the TV has that aren't "map video to screen 1:1", such as motion interpolation, upscaling algorithms, etc.

mjmsmith
0 replies
18h28m

I found FILMMAKER MODE (why is it capitalized in settings?) dark and muddy on a Samsung Frame. The "Samsung TV picture settings" section in the linked which.co.uk article [1] seems like decent advice.

[1] https://www.which.co.uk/reviews/televisions/article/getting-...

imp0cat
0 replies
19h32m

It supposedly makes the content look more "just as the director intended".

However, Samsung TVs are not exactly known for realistic colors. So turn up the eye candy and enjoy!

empiricus
0 replies
1h24m

The truth is the filmmakers want their movies to look dark (cinema projectors are kind of limited), to have no colors (use too much color grading) and to move like a slideshow (24fps) :)

izzydata
4 replies
19h52m

Frame interpolation is so incredibly awful looking in my opinion, especially when it comes to animation. I cannot comprehend all of the people on YouTube who take a beautifully drawn animation that is intentionally 24 frames per second and increase it to 60, thus ruining the hand-crafted perfection of the drawn key frames.

tuna74
2 replies
19h22m

Very little hand-drawn animation is done at 24 fps, which makes interpolating the actual movements even crappier.

extraduder_ire
1 replies
12h7m

I think the phrase they use is "drawing on 2s" for every second frame (12 fps) being drawn and "drawing on 3s" and so on depending on how many native frames there are between drawn ones. Usually, different things onscreen will be animated differently for effect/budget.

HankB99
0 replies
1h38m

Question: Does this mean that animators draw every 2nd or 3rd frame and then some technology interpolates the missing frames? It sounds to me like that is not fundamentally different from interpolating 24 fps to 60 fps.

I suppose there are two differences:

1. The artist retains control of the interpolation between drawn frames.

2. The computational resources available to the artist far exceed those available in the TV.

Apologies if this is an uninformed question, I watch animation but don't know how it is produced.

Doxin
0 replies
9h41m

A lot of good animators will also vary how many frames they animate per second. Trying to smear all of it into 60fps doesn't improve anything.

A good example is FUNKe[0], He's got a style with very pronounced changes in framerate. One movement will be 3fps and the next 30, never mind that the lipsync tends to be at a high framerate no matter what the rest of the animation is doing. Imagine trying to convert that to 60fps and still have it look good.

[0] e.g. https://www.youtube.com/watch?v=maqIaT_ZUxs

ggm
4 replies
16h59m

I can't quite phrase this right, but persistence of vision and "frames" are really quite distinct from I+P blocks, and what we have in modern LCD screens is anything but a fixed-rate scan.

I really liked old TV. Those wagon wheels running backwards were a direct artifact of PoV and frame rate.

The whole NTSC vs PAL vs SECAM debate was about what is aesthetically best against what eyeballs do. Personally I didn't get the argument that NTSC was better for live action. I just think PAL was superior to all the others.

Yes it was wasteful of bandwidth. Yes, it was analogue signalling of something which was at the fringes of what analogue and digital is (encoding colour as a layer over b&w signalling of grey levels, under a time code synchronisation for framing). But, it was also amazing, and I think we've lost something in MP2 -> MP4. We've also gained fidelity of what is sent to what is received and we've gained far more content.

I do really miss old analogue(ish) TV

layer8
3 replies
16h33m

What I miss is the motion clarity of impulse-driven displays vs. today's sample-and-hold displays. Smooth 60 Hz scrolling on a CRT is something so visceral that just can't be replicated on modern displays.

grishka
1 replies
8h56m

Scrolling on a modern 120 Hz display does feel extremely smooth though.

layer8
0 replies
2h57m

It’s smooth, but it suffers from motion blur.

username135
0 replies
13h15m

#CRTmasterrace

codetrotter
4 replies
19h27m

The smartest thing you can do with a “smart tv” is to keep it unconnected from your WiFi and instead plug a Raspberry Pi into one of the HDMI ports and use that for your YouTube etc needs.

varjag
2 replies
19h26m

None of the functions TFA discusses has anything to do with connectivity.

codetrotter
1 replies
19h21m

No but my point is that if you keep your “smart TV” offline you don’t need to worry about any of the settings on it. And that’s just aside from all of the problematic things of allowing it to connect.

clintfred
0 replies
19h16m

But their point is that your comment doesn't have anything to do with what the article is talking about. TVs having picture settings have nothing to do with connecting it to the network.

Flameancer
0 replies
14h30m

I mean sure, but then my TV wouldn’t have VRR that was enabled with a software update. Plus a lot of TVs will just connect to an open WiFi network anyways.

j-bos
3 replies
16h20m

I just want to turn the smart tv dumb, simple input switch, volume up down, brightness/contrast, and basic color adjustment, that's it.

godelski
1 replies
16h15m

My TV turns into this if I set it behind a pihole and block all requests to it. I have heard some TVs will look for open networks (or known ones with deals, such as Xfinity), but there are none around me that I'm aware of, and it responds as if it is dumb, other than the features that can be turned off, like frame interpolation. But those do randomly reset themselves, which is quite frustrating.

Flameancer
0 replies
14h39m

The key is to block all network access. A lot of IoT devices have hard-coded DNS, so even running a pihole will sometimes not work.

drivebyadvice
0 replies
16h10m

Don't forget the sleep timer. Such a simple QoL improvement, and none of the TVs I've owned in the past 10-15 years have had one. Seems like every TV had it when I was a kid (or at least a good sized portion did).

resters
2 replies
18h10m

it is surprising that manufacturers turn on motion smoothing by default, as it ruins the aesthetic of most content and makes it look like it was shot on a 1990s camcorder.

theragra
1 replies
17h47m

New generations will get used to more fps, and new content could then be filmed at 60 fps. I think that's a good thing. We can't be stuck in the past because of current habits.

resters
0 replies
16h58m

True. However my theory is that filmmakers would not use the same shots in a 60fps world that they use in a 24fps world.

The aesthetic of it is so different, seemingly much about the creative aspects of a production would be done differently.

Motion blur is evidently used by filmmakers in an artistic/aesthetic way similar to the way rock bands use guitar distortion and electronic musicians let some signals clip, photographers use lens flare, use limited camera dynamic range for artistic effects, etc.

Adding a post-production step to remove the "limitations" that were used artistically does alter the aesthetics of the production.

Imagine if metal albums were played back on "better" stereo equipment that removed all of the distortion products produced by the amplifiers.

omnicognate
2 replies
9h40m

A modern panel is a complex, powerful device with a wide range of capabilities. Human beings are diverse and even an individual's preferences can change over time. It's right that there are settings for this stuff.

What isn't right is that instead of using standard terms common across manufacturers and clearly related to what the settings actually do, manufacturers expect anyone who wants to change them to reverse engineer what Dynamic TruScungle means, how it interacts with Active ProHorngus and whether 100 means the ProHorngus is turgid or flaccid.

TheGRS
1 replies
9h5m

This article is timely because we just set up our new Cyber Monday Samsung TV. The screen was on for about 2 seconds before I recognized the frame interpolation. We spent a good 15 minutes going through the settings to find what we thought was doing it and seeing how it looked. Hate that they turn this shit on by default.

Even worse, I’m always the one who notices it when no one else does. It’s immediately obvious to me while everyone else is going “I don’t see what you’re talking about”, which drives me bonkers. And then trying to find the setting that makes me happy takes too long.

matt89
0 replies
8h53m

I feel you, I have the exact same experience. I always see this and I'm bothered by it, yet most of the time nobody else sees it and it's hard to really convey to other people what you mean.

layer8
2 replies
17h39m

Those aren’t Smart-TV features, and they precede the introduction of Smart TVs. Smart TVs are internet-connected TVs with apps [0].

[0] https://en.wikipedia.org/wiki/Smart_TV

hackernewds
0 replies
14h5m

haha what a nit-pick

dang
0 replies
14h19m

Ok, sorry - the submitter did have "Switch off bad TV settings", and I changed it later to "Switch off weird smart TV settings" because that phrase felt more informative and does appear in the article. It doesn't work if it's wrong though!

jasonjamerson
2 replies
15h44m

Important to know that Roku has an "adjust frame rate to content native rate" setting which, insanely, is off by default. By default it will make up frames on any content whose frame rate is less than what your TV is capable of. I just got a new LG C3, and this was making us nauseous until I figured it out. I thought it was the TV, but unbelievably it was the Roku.

klausa
1 replies
15h30m

Then Roku must be doing something _really_ weird.

What this setting is usually used for is to avoid "blinking" and the screen going dark when you leave menus and start watching videos — there is an unavoidable renegotiation period (...mostly; VRR kinda fixes it?) whenever going from 60Hz<->24Hz or 60<->30. So to avoid the TV going dark for a second to re-negotiate the connection, the device steps in and "pretends" it's all 60Hz all the time.

_Usually_ what a set-top box does is just frame-double (if possible) or do 3:2 pulldown, which _should_ be unnoticeable to human eyes, unless you're very sensitive.

Side-note: this is why HDMI 2.1 and 120Hz HDR is so good for everyone. 120 is an exact multiple of 24, 30, and 60, so you never have to deal with 3:2 conversions.
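
To make that arithmetic concrete, here's a quick Python sketch (the cadence helper is purely illustrative) of how many refresh cycles each source frame is held for on a fixed-rate panel:

    def cadence(fps, hz, frames=8):
        """Refresh cycles each source frame is held for on a fixed-rate panel."""
        shown, counts = 0, []
        for i in range(1, frames + 1):
            total = i * hz // fps  # refreshes elapsed by the end of frame i
            counts.append(total - shown)
            shown = total
        return counts

    print(cadence(24, 60))   # [2, 3, 2, 3, ...] -> uneven 3:2 pulldown
    print(cadence(24, 120))  # [5, 5, 5, 5, ...] -> every frame held equally
    print(cadence(30, 120))  # [4, 4, 4, 4, ...]
    print(cadence(60, 120))  # [2, 2, 2, 2, ...]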

jasonjamerson
0 replies
15h18m

I went back and watched content that had been jarring, and it's fixed. My wife, for whom all this is instinctual as opposed to technical, is able to watch now without feeling sick. IDK, it's weird.

DitheringIdiot
2 replies
17h52m

Well, I didn't expect that to get so many comments.

It looks like there are a lot of people with perfectly valid reasons to keep some or all of these settings on. Which is great. I will rewrite the article to reflect that — and change the title to something more neutral and less absolutist.

hebleb
1 replies
17h17m

I'm honestly shocked at the number of pro-motion-interpolation comments in here lol

imp0cat
0 replies
10h13m

Why? If you have a decent TV (hint: SONY), a little bit of motion interpolation can make stuff look amazing.

smusamashah
1 replies
19h41m

I have done this on family and friends' TVs a number of times already. Most of these settings are crappy and very visibly make the picture worse instead of better. The worst offenders in my opinion are noise reduction and sharpness. Noise reduction is incidentally also the one that makes smartphone or other cheap camera output worse.

Retr0id
0 replies
19h24m

Noise reduction is incidentally also the one that makes smartphone or other cheap camera output worse.

This is all subjective of course, but I think you'll find that in cheap cameras, overdone noise reduction is the culprit, rather than noise reduction itself. If you're able to look at the raw sensor data, I think you'll find something even worse still. Small sensors are inherently very noisy, in typical lighting conditions.

So yes, the images look worse than optimal, but not worse than if there was no filtering at all.

shijie
1 replies
19h29m

I recently went down the rabbit hole to find a dumb TV. It was surprisingly difficult. I ended up with a Sceptre 65 inch TV, to which I’ve plugged in a rooted, jailbroken Chromecast.

It’s been awesome. The TV is fast to boot up, responsive, doesn’t spy on me, and doesn’t need useless software updates.

delecti
0 replies
17h51m

What's the benefit of rooting and jailbreaking a Chromecast? You can already cast anything you want to them, so I'm assuming there must be added functionality.

ryaneager
1 replies
22h26m

I treat the configuration settings from www.rtings.com as the Lord’s word.

pstorm
0 replies
22h17m

Took me a while to find the configuration settings you mentioned. For anyone else, you go to a specific TV model's page, and there is a tab called "Settings."

prmoustache
1 replies
9h26m

Between this and the embedded crapware, I am glad I went the video-projector route. I initially thought I would not bear the sound of its fan, but since it is a constant noise the brain just filters it out. And I use my noise-cancelling headset when I am alone.

foresto
0 replies
9h18m

Thankfully, there are a few projectors with very quiet fans. I haven't kept up with the current models, but Sony made some roughly eight years ago.

jeffmcmahan
1 replies
13h4m

As a former high end audio/video salesperson, I just want to state for accuracy that local dimming, as a feature, is not dynamic contrast. Dynamic contrast is terrible. By having an LED backlight array dim spots that are darker in the source material, the display achieves better absolute contrast. It is not adjusting the exposure to fake it. It is instead getting closer to the contrast given by the source material. It created some halo issues, but it was a step in the right direction.
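
For the curious, a toy model of that mechanism (made-up numbers and a two-zone "panel"; just a sketch of the principle, not any vendor's actual algorithm):

    # Local dimming: split the backlight into zones, set each zone's LED to
    # the brightest pixel it has to show, and let the LCD layer compensate.
    # Dark zones get a genuinely dim backlight -> deeper blacks and better
    # absolute contrast, without touching the overall exposure.
    frame = [0.02, 0.03, 0.01, 0.90, 0.95, 0.88]  # pixel luminances, 2 zones of 3
    zones = [frame[:3], frame[3:]]
    for i, zone in enumerate(zones):
        backlight = max(zone)                  # dim LED behind dark regions
        lcd = [px / backlight for px in zone]  # LCD opens up to compensate
        print(f"zone {i}: backlight={backlight:.2f}, lcd=" +
              ", ".join(f"{v:.2f}" for v in lcd))
    # Halos appear because real zones are far coarser than single pixels.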

This is not new tech at all - I was selling Samsung LED TVs with this feature in 2007 or so. Samsung, Sharp, and Sony had little choice but to improve contrast, because their LED sets were sitting right next to Pioneer KURO plasmas that were just absolutely amazing - OLEDs are only catching up to their PQ now, 15 years later. First on the scene was the Samsung LN-T5781 - https://www.cnet.com/reviews/samsung-ln-t5781f-review/

jonny_eh
0 replies
11h13m

I came here to warn people that they should not disable local dimming on their TVs. It'll make dark scenes look overly bright and washed out.

grishka
1 replies
8h49m

I don't know if it carried over into 4K TVs, but I'm surprised that the dreaded HDMI overscan isn't mentioned anywhere. I'll never understand WHY and HOW someone thought that implementing this at all, let alone making it the default, was a good idea. You had one job: accept a 1080p signal and output it pixel-perfectly to the 1080p display panel. Yet somehow, someone thought it would be a great idea to cut off the edges of the image and interpolate the rest so everything looks atrocious. And then every single TV manufacturer agreed and implemented it. Whenever I see this kind of cropped image on a TV, I grab the remote and set it to the only mode that should exist on digital displays: direct pixel-to-pixel mapping. It just blows my mind that I have to do that.

ProllyInfamous
0 replies
3h4m

I have an Aorus 4K display [Gigabyte], which is technically "not just a TV screen", and in order to defeat the overscan mismatch [having found no functional fix either in the OS or in the TV settings] I plug the HDMI cable into a DVI->HDMI adapter [instead of connecting HDMI directly].

For some reason, this dongle hack makes the overscan settings [which never seem to work/hold] unnecessary. But of course then there is no 4K output [which is fine].

Retr0id
1 replies
21h36m

Motion interpolation or motion smoothing increases the frames per second from 24fps to 60fps

It's pretty common for TVs to have even higher refresh rates these days - my 4-year-old mid-range LG OLED is 120Hz, for example. Conveniently, 24 evenly divides 120, so when you turn off the interpolation you get perfectly consistent frame times.
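
To see the judder numerically, a rough sketch (frame_times_ms is just an illustrative helper):

    def frame_times_ms(fps, hz, n=4):
        """How long each source frame stays on screen, in ms, on a fixed-rate panel."""
        refresh = 1000 / hz
        shown, times = 0, []
        for i in range(1, n + 1):
            total = i * hz // fps  # refreshes elapsed by the end of frame i
            times.append((total - shown) * refresh)
            shown = total
        return times

    print(frame_times_ms(24, 60))   # ~[33.3, 50.0, 33.3, 50.0] -> alternating = judder
    print(frame_times_ms(24, 120))  # ~[41.7, 41.7, 41.7, 41.7] -> perfectly even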

As a more general note, don't be afraid to experiment with the settings. If you're watching low-bitrate netflix streams, some of the artifact-reduction filters can be worthwhile, especially on the lower intensity settings.

For watching bluray remuxes, however, "filmmaker mode" or equivalent settings are generally the way to go.

deckar01
0 replies
10h54m

My LG OLED had intermittent contrast issues that persisted after toggling every permutation of picture bell and whistle. After finally stumbling across the correct search term, it turned out the offending feature was hidden in a maintenance menu that requires a button that only exists on service remotes… The same issue was fixed in an update for newer models. Tons of forum posts on the subject resulted in RMAs and refunds. How did we get like this?

Liquix
1 replies
19h23m

Or don't buy products that are subsidized by recording and selling your data! Not to mention these half-baked "features" produce thousands of hours of headaches, tech support calls, and general unhappiness. $tvManufacturer couldn't care less because red line go up.

Build quality and software invasiveness are both going to keep trending in the wrong directions until people stop buying smart TVs. And it's not like you need to break the bank or order commercial displays - $150 on Amazon for a dumb 43" 1080p, $260 for a 55" 4K.

clintfred
0 replies
19h18m

I don't necessarily disagree, but this article doesn't talk about any of that. It talks about picture settings, like motion smoothing, dynamic range, local dimming, etc.

zoom6628
0 replies
15h42m

Every electronic device should be fitted with a large red mechanical switch labelled "Fuck Off" to reset the device to its most basic usable level. Mobile phones, TV, computers, cars, washing machines, any device where manufacturer or seller has options they can configure for you, should be required by law to have such a switch.

And FTR, we don't have a TV anymore since our 3-year-old tested the strength of the screen with a hammer. Don't miss it. Hypnotic on Linux for news TV, and Netflix for (mostly stupid) ways to waste an evening.

vsskanth
0 replies
21h27m

I'm one of those people who turns on motion smoothing. For some weird reason, it makes older shows like Friends or The Sound of Music crystal clear on my LG C2 OLED. I can't explain why.

tuna74
0 replies
19h13m

"At first this seems great. Why shouldn't 90s sitcoms seem like they were filmed in 4k at 60 frames per second? Then you start noticing things…"

Interpolation will never give the same results as actual capture, so the author is wrong here.

sbliny
0 replies
19h30m

Consumer Reports has a "TV Screen Optimizer" that aims to give users optimal picture settings by Brand/Model.

Also nice that they mention how to turn off ACR and other privacy related features as well.

https://www.consumerreports.org/mycr/benefits/tv-screen-opti...

saturdaysaint
0 replies
17h45m

I found this to be an informative and well-laid-out argument in favor of motion interpolation: https://www.wired.com/story/motion-smoothing-defense-hdtv/

I have come to prefer it. I think this is a combination of recent algorithms getting better (there are relatively few artifacts and they are far subtler than they once were) and newer consoles conditioning me to abhor any content running at less than 60 fps. The jaggedness of 24 fps grates on my eyes. I’ve had my iPhone recording 60 fps video for years.

rubatuga
0 replies
19h14m

Noise reduction and dynamic brightness aren't too bad if done tastefully. But it's really up to the TV manufacturers to do it properly, which is why the general advice is just to turn them off.

rocqua
0 replies
18h49m

The local dimming suggestion isn't fully in line with the rest.

It's about bringing some parts of the image closer to what's intended by the filmmaker, at the cost of other parts of the image (usually noticeable as gradients added to flat colors). That isn't going against the filmmaker's intention so much as respecting the contrast the filmmaker wants, at the cost of some gradients. It's a different way to approximate the actual signal the TV should send.

porjo
0 replies
17h57m

Not a smart TV problem specifically, but: the inability to adjust the volume level while on mute (at least to adjust it down). I often turn the TV on after the rest of the household has gone to bed and it's way too loud, so I hit mute. But as soon as I hit volume down, sound comes back on at the initial loud volume! Aargh. The only workaround is to navigate away to a silent input, adjust the volume, then navigate back to the previous channel. Annoying. I have seen this feature once before (Sony, I think).

phkahler
0 replies
17h21m

I'd be happy if my Samsung didn't put up an OSD overlay at power-on.

mrbigbob
0 replies
1h17m

I highly recommend, when buying or setting up a TV, looking to see if HDTVTest on YouTube has a review of it. He really does a good job of going into the effects certain TV settings have on the picture. Be forewarned that he is in the UK, so TV model numbers won't line up a lot of the time, but with a little digging you can find the model of yours that corresponds to his.

https://www.youtube.com/@hdtvtest/featured

mrangle
0 replies
17h54m

At first, I bought into the "switch off weird (AI) settings" talk.

Quickly, it became apparent that I couldn't come close to the overall picture quality potential with manual settings. This is for a late model qled. The chip and its auto-adjust capability are built for the panel.

So now I deal with some soap opera effect as a trade off for an amazing picture in every other respect. And I rarely notice the soap opera effect anymore. One gets used to it.

maxgashkov
0 replies
17h59m

Obligatory PSA: if you have at least a 120Hz-capable panel, enable the setting that matches the output frame rate to the content's native frame rate on your source device.

E.g. if you're playing a 24fps movie from a device that reports 60fps uniformly across all content, you will have a bad time.

For Apple TV it can be set like this: https://support.apple.com/en-us/102277

layer8
0 replies
16h53m

HDTVTest [0] is a great YouTube channel if you want to go into some of the depths of this.

[0] https://youtube.com/@hdtvtest

jowea
0 replies
19h22m

Wouldn't I have remembered Elaine being a 9ft-tall blue humanoid alien with a tail?

I've never observed what TFA is complaining about; does someone have a screenshot?

j45
0 replies
18h59m

Another neat idea is to connect all “smart” equipment to an isolated VLAN and separate WiFi that can still be seen by your normal network devices.

For example, if your WiFi is called “Home”, an additional “Home-IoT” network is for every such device.

The IoT devices can then be prevented from sniffing your network, or even from connecting out, if you want.

A good example of this is in this EdgeRouter setup guide, which is a pretty decent walkthrough of how to plan a home network for more than just basic home browsing.

https://github.com/mjp66/Ubiquiti/blob/master/Ubiquiti%20Hom...

interestica
0 replies
17h9m

For those with a Sony Bravia panel (2015 and later I think), you can enable a pro-mode with a key combo that can turn a laggy and unusable panel into mostly a dumb display.

Display, Mute, Vol +, Home

I used this to basically save a frustrating laggy panel.

iambateman
0 replies
15h58m

Motion smoothing is so horrible… I don’t understand how TV makers have persisted with it for so long. Surely the people who make televisions appreciate a good picture enough not to spread this nonsense, right?

The effect makes movies completely unwatchable. I was at a friend’s house watching Elf and it felt like watching Will Ferrell host a Zoom call.

hammock
0 replies
19h34m

Does the same apply for audio settings as picture settings?

For example, Dialog Clarity/Enhancement, TruVolume (automatic volume leveling), and DTS Virtual:X?

Why or why not?

Do you use Spatial Audio on your Apple products (which sounds great to me)?

fortyseven
0 replies
21h30m

Genuinely tired of people telling me what an awful person I am for the weird shit I like. You do you.

empiricus
0 replies
1h28m

Using the author's logic, we should also watch Kramer with the image shrunk to a quarter of the screen (TVs were smaller in the past) and with garbage color accuracy (I don't actually know how color-accurate TVs were in the past, but I suspect it was horrible). I am still waiting for 4K/8K 100Hz movies and YouTube, but until then I will use motion interpolation on my TV as the next best thing.

cpeterso
0 replies
19h13m

A couple years ago, my Samsung TV slowed to a crawl. Each click through a menu took multiple seconds. I eventually discovered a new setting buried deep to turn off "real-time anti-virus scanning". That immediately fixed the performance problems.

How would my TV get a virus? This was a Tizen TV, not an Android TV where I'm installing shady apps.

https://www.techspot.com/news/78967-samsung-loading-mcafee-a...

cloudking
0 replies
15h38m

Motion features are the worst; I find they always make shows and movies look uncanny. I have turned this off on many friends' and family members' devices, whose owners seem not to notice it until I turn it off.

clnq
0 replies
9h43m

My Samsung “HDR” TV can only approach what an HDR image looks like on a studio display if a bunch of these settings are on and balanced just right.

It feels as if Samsung deliberately lowers contrast, for example, until their dynamic contrast is enabled, as the picture matches studio displays much more closely then.

No proof, and I'm not making a firm claim, but it did feel this way when I had my TV side by side with a studio display.

boredinstapanda
0 replies
14h40m

It's a shame no one makes a control board for TV panels like the ones you can often find for old screens, e.g. from digital photo frames.

avazhi
0 replies
9h53m

Or just - get rid of your TV. I’m no Luddite but I’ve quite happily forgone a TV - and the concomitant ads and other irritants TVs bring - for nearly 20 years.

andersa
0 replies
18h12m

Imagine actually, unironically watching a movie with the motion smoothing off. Yes, I sure like to watch my movie on Google Slides and see every single frame during camera pans.

Perhaps once movies are recorded at a frame rate from the current decade this advice will be useful. We play PC games at 144Hz+ for a reason. There is no need for movies to still use a format derived from the technical limitations of a hundred years ago.

__MatrixMan__
0 replies
18h37m

I wonder what it would look like if you designed a movie to glitch these settings as badly as possible.

10729287
0 replies
10h43m

Basically, deactivate everything; not that complicated :) I must admit it's always a pain when friends ask me to acknowledge how awesome their brand-new TV is, only for me to notice that it's all default: very catchy, but definitely not good, and tiring to watch.

Do you guys remember that hero putting papers on cars' windshields to explain to users how to fix their TVs? You are what you fight for!