The thing I hate about "advice" like this is that it assumes that everyone likes the same things and feels the same way, and it comes across as an attempt to shame anyone who feels otherwise.
I like motion interpolation. I always have it turned on, and I chose my TV based on the quality of its motion interpolation.
Screens are so big these days that if you're watching a 24fps movie, any panning shot becomes a horrible juddering, shaking mess. Judder judder judder... ugh. I can't stand it.
With motion interpolation turned on, everything is silky smooth, and I can actually see what's on the screen, even when the picture is moving.
So no, I won't be turning it off, and I suggest that the next time you watch a shaky-cam action movie, you try turning it on too!
It makes everything look like a cheap soap opera to me. I can't stand it. I think this might be either a generational thing or perhaps a cultural thing, or maybe some of both.
"It makes everything look like a cheap soap opera to me"
This is nothing more than conditioning. "Quality" TV shows "film" at 24 fps despite the fact that they were going to be viewed at 30/60. They did this because even though 3:2 pulldown is incontestably a dirty, ugly hack that reduces quality, people were conditioned to think that if something went through that hack, it's quality. If it didn't, it can't be quality.
So when people talk about the "soap opera effect", what they usually mean is simply that the footage lacks the artifacts they've been conditioned to read as quality.
The best example of this was The Hobbit when presented at the original, director-intended 48 FPS. People were so conditioned to movies being a blurry mess at 24 FPS that a frequent complaint about The Hobbit was that it had the purported "soap opera effect".
It's not conditioning. Frame rate is a tool in the toolbox, and more isn't always better, just like a colorized black and white film isn't better just because it has added information.
This is most easily apparent in animation, which frequently makes use of variable frame rates to achieve different effects and feelings, often even within elements of the same scene. A character might be animated at 12 fps while the background is at 8 fps. Bumping those all up to 60 would be an entirely different shot with very different vibes.
Actually, more is always better. Why do you think we try to run PC games at 144, or even 240 and 360 fps?
What's the point? Can people even tell?
Of course they can. If you ever try to go back, it looks like garbage. There is no limit to improving the illusion of smooth motion, but it scales non-linearly (you need to double the frame rate each time to improve it noticeably). I can't personally tell apart whether it is running at 144 or 120 fps, but the larger jumps are very obvious.
Yea, 120 vs 144 isn't too big of a difference for me. Until I got a new GPU that could actually output 120fps at 1440p, I left my monitor on 120Hz. When AMD's AFMF drivers came out I went ahead and bumped the refresh to 144Hz, as I was definitely starting to see screen tearing, but it was still pretty fluid. I'd be curious whether the next jump for me will be 4K 144Hz or 1440p 240Hz.
Absolutely they can, though I found anything above 240Hz to be indistinguishable for me personally. The monitor I'm writing this on is 300Hz; its motion clarity is only bested by OLEDs and the $2000 AUD (twice what I paid for this) 360Hz bigger brother, the PG27AQN.
I've never had a display refresh rate higher than 144, but maybe because I play games, I can definitely tell the difference between 30/60 and even 60/120. After 120 the difference between 120/144 starts to diminish, but there's still a slight one; it's most pronounced with very fast-moving objects, like moving a mouse.
Games are different. Interactive media requires responsiveness. Saying that higher framerate is always better in cinema is a pretty bold statement, one that would fail a simple survey among moviegoers. What does "better" even mean, in this context? Fidelity is just one of many axes of expression when creating art. Higher isn't better than lower, or vice versa.
For me the higher frame rate is not really about response time, it is all about smooth motion. Panning the camera at 60Hz is just barely acceptable. 120Hz is where it starts looking real nice and I can really see things during the pan. 24Hz pans are unwatchable garbage.
A movie properly authored at 120Hz (not interpolated with potential artifacts) would be objectively better than a 24Hz one in every possible way.
Also, with a higher frame rate I can see subtle patterns in the movement and flow of things (clothes, water, humans). The eye picks up more detail and can recognize the texture of things: the different kinds of fabric, all kinds of details like that.
PC gamers pushing for higher framerates is mostly about reducing latency, which doesn't matter for a movie.
Need some way to justify an overpriced graphics card, is what I always assumed.
Surely 24 fps was made the standard to save money on film.
It's about the slowest frame rate that's not very distracting.
Needless to say, film is no longer the major expense it used to be. In the early days the film itself was a fairly big part of the production cost (and don't forget distribution - cost to replicate the film - and operating cost for the projectors - handling film rolls twice as big, etc.).
And standards are sticky, so the industry stayed on 24fps for decades.
Except, it is very distracting. Around 60 it starts to become bearable for me and allow camera pans to be interpreted as fluid motion.
It's quite astonishing how terrible 24 FPS really is, but as the GP mentioned it was "good enough" for the time. It's a sliding scale, and costs and processing were prohibitive enough that it settled at the point where most scenes, especially dramas and human-interest type things, were served well by the format.
Action scenes are just brutal at 24FPS. Running, pans, and so on. Either it's a blurry mess at a 180 degree shutter, or it turns into a rapid slide show (like early in Saving Private Ryan).
This is as much about frame rate as it is about display technology.
24hz on an OLED with very quick pixels is a very different experience than 24hz on a CRT or 24hz on a film projector.
Animation frames are almost totally orthogonal to what we're talking about here. In fact, I'd argue they're the exception that proves the rule. Animation had to develop tricks like spaghettification to look good at 12 fps because it looks inherently worse than higher frame rates. In fact, techniques like smear frames are often used to directly emulate higher frame rates. It's an art form born of limitation, not a neutral tool. Just look at any 3d animated show that just drops the frame rate of mocapped characters to 12fps (Dragon Prince is one that comes to mind) - it looks like jarring, choppy shit.
Those are the origins, but the result is a style that can look good or better than higher framerates depending on what you're going for. Art is not held back by fidelity - instead fidelity is part of what defines it, as well as other constraints.
This is an odd take.
Firstly: No one on the mass market actually knows what 3:2 pulldown is, so it's hard for people to see it as an indicator of 'quality' -- most of HN, a technical audience, probably doesn't know what it is either. For reference, it's when 24 frame per second film is converted to 29.97 frames per second by spreading each group of 4 film frames across 5 interlaced video frames (alternating 2 and 3 fields per film frame). That, plus a tiny slowdown of about 0.1%, gets you to 29.97, which is the NTSC frame rate. (There's a short sketch of the cadence at the end of this comment.)
Secondly: Why do people in traditionally PAL regions also hate the soap opera effect? Again, for reference, PAL regions ran at 25 frames per second and so got away with a 4% speedup and a 2:2 pulldown that has no real frame-blurring effect.
Thirdly: Generally, I prefer higher frame rate content: I have a 144Hz monitor, and prefer content as close to that as I can get, but I still hate watching The Hobbit -- a lot of this has to do with motion blur: 48 frame per second content is not fast enough to get away with appearing seamless, and not slow enough that the per-frame motion blur you get with a 24 frame per second, 180 degree shutter hides the stuttering.
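As promised above, a minimal sketch of the 2:3 cadence (purely illustrative, not any real telecine implementation). Four film frames are spread over ten fields, and it's that uneven 2-then-3 timing that produces the characteristic judder:

```python
def three_two_pulldown(film_frames):
    """Spread 24 fps film frames across 60i fields using a 2:3 cadence (toy model)."""
    fields = []
    cadence = [2, 3]  # fields emitted per film frame, alternating
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * cadence[i % 2])
    return fields

# Four film frames become ten fields (i.e. five interlaced video frames):
print(three_two_pulldown(["A", "B", "C", "D"]))
# -> ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
```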
People don't have to know what it technically is to know it when they see it, and the simple incantation of "soap opera effect" demonstrates that.
Again, almost all dramas shoot at 24 fps. There is zero technical reason for this (there once was a cost saving / processing reason for it, and then people retconned justifications, which you can see earlier in this very thread). They do this because, again, people are conditioned to correlate that with quality. It's going to take years to break us from that.
This is not meaningful. Preferring games at a higher framerate has zero correlation with how you prefer movies. And however odd you think the take is, you like 24 FPS because you've been trained to like it, Pavlov's bell style.
What you're suggesting is that you know better than every cinema-goer and every cinematographer, animator, producer, and director around what their preferences "really" are, which is a pretty wild thing to claim, especially in the face of people telling you exactly why they prefer unaltered 24 FPS content to horribly interpolated uncanny-valley messes.
The reason no one has changed process isn't because there's tonnes of better options that everyone is just studiously ignoring because of pavlovian conditioning. It's absolutely nothing to do with people liking the look of interlaced 3:2 pulldowns. It's because the current options for HFR content just plain don't look very good. Some of this is unrelated to the technical specification of the recording & due to things like action content in HFR looking cheesy -- there's going to need to be a wild change in how content is choreographed & shot before we're anywhere near it being as well understood as current practises.
There are exceptions: 4K 120FPS HDR content for things like documentary content looks pretty good on a high refresh rate monitor (note: no one said games), but we haven't reached an era where that's even nearly commoditised and the in-the-middle stuff you'd want to do for cinema or TV just can't cut it.
Humorously this submission, and so many just like it, are about people who are outraged that their parents / friends / etc actually like motion smoothing. So...I guess? I remember a similarly boorish, ultimately-failed "no one should ever shoot vertical video!" movement from a few years ago, again pushed by people really, really certain of the supremacy of their own preference.
Now this attempt to appeal to authority is particularly silly. Peter Jackson -- you might have heard of him -- tried to do a movie at 48 FPS for a wide variety of quality reasons, to be lambasted by people just like you. People who are sure that the completely arbitrary, save-our-rolls-of-film 24 FPS is actually some magical, perfect number. It is insane. Everyone else is simply handcuffed to that completely obsolete choice from years ago, and will be for years more.
I'm not going to convince you, and have zero interest in trying. And I am certain you're not going to convince me. But your argument at its roots is "that's the way it's done, therefore that's the perfect way and the way it will forever be done". It's utter nonsense.
Instead of trying to jump to 48fps or 60fps, maybe they should just adopt 30fps as the new standard for a while. The 24fps fans won't have too much to complain about, because it's not that much faster (and it's the same as the old NTSC standard), and the HFR fans will at least have something a little better. Then, in a decade, they could jump to 36fps, then 42, then 48, etc.
As a bonus, the file sizes for movie files won't be that much bigger at only 30fps, instead of 60+.
people are conditioned to correlate that with quality
Are you sure it’s really just conditioning? Impressionist paintings are obviously a lower fidelity reproduction of reality than photorealistic paintings, yet people tend to like Impressionism more, and I don’t think that’s necessarily just cultural conditioning. Sometimes less is more.
The reason I didn't like the Hobbit was because they went overboard on CGI. They had to make Orlando Bloom (Legolas) appear younger than he was in the Lord of the Rings which was released a decade before.
Tolkien's elves are literally immortal and the time between The Hobbit and The Lord of the Rings is less than a hundred years. Legolas' father is several thousand years old. There is no reason to expect Legolas to look younger in The Hobbit; you'd want him to look exactly the same.
I watched The Hobbit in the cinema in HFR.
It looked like absolute soap opera crap. And the worst thing was that I could _easily_ tell when the actors were carrying light prop weapons compared to the real thing.
At 24 fps your eye can't tell, but at 60fps you see the tiny wobble that the foam(?) prop weapons have and it's really jarring. Same with other practical effects, they're easier to hide in the ye olde 24 fps version.
Watching movies in 60fps is the same as when porn moved to 4k. You see every little flaw in great detail that used to be hidden by worse technology. Everyone needs to step up their game in areas where it wasn't needed before.
Where does this nonsensical argument keep coming from? No idea what a "cheap soap opera" is even supposed to be. The only thing I notice is that the horrifying judder during camera pans is replaced by minor artifacts that don't distract anywhere near as much. It's literally not possible for me to interpret a 24hz pan as fluid motion.
See llm_nerd's answer below. Conditioning or not, it takes me out of the realism of whatever I'm watching if interpolation is applied. People don't really have to rationally justify it, they're used to a certain way of footage being played growing up and when some folk see interpolation it just feels off to them right away, and reminds them of those soap operas.
I wonder if you'd have the same reaction to a movie properly authored at 120Hz without the subtle artifacts.
There's a point at which these things balance out and work. I'm not sure that 120 frames per second is high enough, but it's much closer. We have anecdotal evidence (the fact that everyone hated The Hobbit) that 48 frames per second isn't.
One of the rules of cinema and shooting at low frame rates is to use a "180 degree shutter" (a term that comes from the spinning shutters used in old film cameras), or in other words a shutter that is open for half of each frame's duration. i.e.: If you're filming at 24 frames per second, use a 1/48th second shutter speed.
The reason for this is that 24 FPS @ 1/48" exposure adds enough motion blur to each frame that any movement occurring over a small number of frames is naturally smeared as a consequence of the exposure time, while anything that is relatively steady across frames tends to look crisp. If you shoot at 24 FPS @ 1/1000", you end up with something that looks like a fast flipbook of static pictures. 24 FPS just isn't fast enough, and people can see that each individual frame contains no movement.
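A quick back-of-the-envelope sketch of that relationship (just the definition worked through in code; the numbers aren't tied to any particular camera):

```python
def exposure_time(fps: float, shutter_angle_deg: float) -> float:
    """Exposure per frame = (shutter angle / 360) * frame duration."""
    return (shutter_angle_deg / 360.0) / fps

# The classic pairings discussed here:
print(1 / exposure_time(24, 180))    # 48.0  -> 1/48 s at 24 fps, 180 degrees
print(1 / exposure_time(48, 180))    # 96.0  -> 1/96 s at 48 fps, 180 degrees
# A 1/1000 s exposure at 24 fps corresponds to a very narrow shutter angle:
print(360 * 24 / 1000)               # 8.64 degrees -> the "fast flipbook" look
```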
Anecdotally, 120FPS @ 1/1000" on a 120Hz display doesn't exhibit this same problem, at least to me or people I've shown it to, although some people will notice it "feels fast".
48 FPS @ 1/96" seems to be the worst of both worlds: Too fast a shutter for motion blur to compensate nicely, too slow a frame rate to make it look seamless, so it ends up in the uncanny valley where people know there's something missing, but not what.
The frame interpolation feature that people seem to hate is almost directly designed to fall into this horrible territory.
Motion blur doesn't work on modern TVs (i.e. OLED). They will display a static, blurry frame for 1/24 of a second, then snap to the next one practically instantly with a clearly visible jump. What I get in the end is a juddering mess that is also blurry, so I can literally see nothing.
I wonder if these 120 fps panels are fast enough to do black frame insertion to mimic the effect of film projection
I tried the black frame insertion mode on mine, but I can see the strobing, so it is extremely uncomfortable to look at.
Motion blur doesn't really work in that way on any device, but what you're missing is that the content's captured frame rate has a relatively important relation to the display frame rate: You can't usually change one & not the other without making everything look worse, at least to most people, and this is what a lot of modern TVs do.
Naive interpolation of inter-frame content is awful, is what a lot of TVs do to implement their smoothing feature, and is why everyone complains about it.
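To illustrate what "naive" can mean here (a hypothetical toy, not how any particular TV actually works): the crudest interpolation just blends adjacent frames, which turns a moving object into a double image instead of placing it at an in-between position. Motion-compensated interpolators estimate motion vectors to avoid this, but when the estimation fails they degrade toward the same kind of artifact.

```python
import numpy as np

def naive_midframe(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Blend two frames; t=0.5 gives the halfway frame. Ghosts on any motion."""
    mixed = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
    return mixed.astype(frame_a.dtype)

# A bright pixel moving one position to the right between frames:
a = np.array([[0, 255, 0, 0]], dtype=np.uint8)
b = np.array([[0, 0, 255, 0]], dtype=np.uint8)
print(naive_midframe(a, b))  # [[  0 127 127   0]] -> half-bright in both places at once
```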
The reason a lot of people hated The Hobbit may be partly because of this problem: It was shot at a 270 degree shutter to try to get a bit more motion blur back into each frame, which to a lot of people felt strange.
Interestingly, some of The Hobbit fan edits changed the frame rate back to 24fps; watching those on my regular IPS screens, they look just fine.
Maybe, or maybe we just haven't adapted to 48fps yet. Something I heard a lot about The Hobbit was that the outdoor scenes looked great whereas the indoor scenes looked like film sets - which, well, they were film sets, and making a set that looks "natural" will require new techniques. Everyone hates early stereo mixes (e.g. The Beatles), not because stereo sound inherently sounds bad but because production takes time to adapt to new technology.
Nah it’s much better at higher Hz
Kind of weird that you associate motion blur with realism.
Of course it seems weird... if you read what I said I implied it's not really rational.
But no one's brain is saying "wow the lack of motion blur really affects the realism here".
I walk into a house and see a TV show playing on a TV with interpolation turned on and it just looks weird because of how TV looked growing up. I mean that's just the simple reality of it. I understand when film nerds come along and explain motion blur etc but that's all very "system 2" thinking. I'm talking about pure system 1 subconscious processing.
What's even weirder is how some folks can never seem to grasp why their explanations of how motion blur is bad can't convince others to suddenly like interpolation.
https://youtu.be/MFlz5cCDMmc?t=25
The "Soap Opera Effect" is a real thing https://en.wikipedia.org/wiki/Motion_interpolation#Soap_oper...
Maybe some people can pick up on the fact that those extra frames aren't real more easily than others. Some innate sense thrown off by the interpolation. Motion interpolation gives me an uneasy feeling more than anything.
For example, some people are sensitive enough to rapid color flashing that they can't watch color-wheel-based DLP because they see rainbowing on the screen. I can't watch my old plasma in 48Hz mode because it flickers. My spouse can't see the flicker at all.
For me, the interpolation really seems to separate the "layers" in a shot, and it completely destroys the illusion, making it obvious it's just a set with some lights. Like I said, it feels like a cheap soap opera from a developing country no matter what movie or show I'm watching.
Some of the problem for me may be related to the fact that I worked as a camera operator and video editor during the years from transitioning from the old NTSC standard to HD, and I paid hyperattention to detail as HD came online.
For some reason, the interpolation just screams "this is fake" to me.
Great description. It's the same for me.
I suspect it has to do with the degree to which your past was plagued with cheap soap operas vs poorly performing video games.
It feels like the jump to 4K and 240Hz TVs arriving as one group's normal and another group's exception may have something to do with it.
Maybe the group who doesn't want it too lifelike is inoculated, or the other way around.
I noticed this too after getting a 4K TV earlier this year. It really ruins a lot of films.
For as much debate as there was on 60fps, I do agree. Games and movies at 60fps do not feel "cinematic". I indulge in these to escape from reality.
My mind is blown. I didn't think -anyone- could possibly like motion interpolation for watching movies. I hate it so, so much. I'm trying to understand your POV.
How do you feel about watching movies in a theater? The frame rate there is low but the screen is so much larger.
I wish for theatre film releases to move on to HFR instead of sticking to 24 FPS which was arbitrarily set by ancient technology limits.
Even the ""HFR"" is a joke. What is 48 fps supposed to be? Every modern TV can do 120 already. Use that as a baseline.
48 allows theaters that haven't upgraded to show the movie by throwing away every other frame.
Well then, they can have that same alignment with an actual high frame rate movie at 120Hz, displaying every 5th frame.
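A toy illustration of that compatibility argument (a hypothetical decimate_to_24fps helper; it only works when the master frame rate is an integer multiple of 24):

```python
def decimate_to_24fps(frames, source_fps):
    """Keep every Nth frame so a high-frame-rate master plays back at 24 fps."""
    assert source_fps % 24 == 0, "only works for integer multiples of 24"
    step = source_fps // 24          # 2 for a 48 fps master, 5 for a 120 fps master
    return frames[::step]

one_second_at_120 = list(range(120))
print(len(decimate_to_24fps(one_second_at_120, 120)))  # 24 frames for non-HFR theaters
```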
Your mind is blown that someone might not like to watch movies at a prehistoric 24Hz on a screen that is capable of 120Hz, after video games have been running at this frame rate for a decade already?
Yes! Whenever I see motion interpolation on a TV, it gives me a sense of revulsion. It seems okay for sports, I guess?, but awful for movies, to the point where I would rather turn the TV off than watch a movie with motion interpolation. Perhaps what I'm thinking of is the 3:2 pulldown thing for 60Hz TVs and I haven't seen what it's like on a 120Hz screen?
I don't play games much, but visually I've always distinguished between games and movies. I expect them to look quite different, not at all similar.
And I thought my feelings about this were universal, and had confirmation bias from seeing postings before Thanksgiving like "if your parents' TVs look weird, here's a guide for turning off motion interpolation depending on make and model", etc. I've assumed that the whole point of motion interpolation was for new TVs to impress people with how they look for sports, that this is what sells TVs in Best Buy, and the over-application of motion interpolation to other content was an unfortunate byproduct.
How much of that high FPS or interpolated media have you seen? At first I too was jarred by it, as if it were sped up somehow, but after a few hours, I couldn't watch regular 24 FPS anymore as it literally looked like a slideshow. I distinctly remember watching Thor: Love and Thunder in theaters and when they panned over the pantheon, you literally couldn't see anything. In contrast, Avatar 2 with the 48 FPS scenes was one of the most immersive movies of my life.
A lot of lower end TVs have really bad implementations of it. Maybe try it again on a LG G3 or a Sony A95L and it might be completely different from what you remember.
Movies have a different display though. Film shutters and whatnot. Helps a lot with keeping the motion from just being jerky. OLEDs don't have that, and attempts at black frame insertion don't really work there because they already struggle with brightness. Hence, a mild motion interpolation is useful.
Different display technologies need different things. No difference from CRT filters on old video games played on modern screens.
Is anybody shooting or distributing film anymore?
Lots of people still shoot with film. But distribution will always be digital unless your screening is specifically advertising a film showing. That's usually just big Imax movies like Oppenheimer in select cities, or indie theaters showing old movies who can get access to the film.
Nah, it is just that filmmakers avoid a lot of panning shots because it looks like crap.
It's even worse in theaters. The screen is HUGE. Panning motions are so bad they often give me motion sickness.
There was one movie that they showed at 48fps - I think it was The Hobbit? I've forgotten. That was amazing. Blissful. My eyes have never been so happy.
Even if I forgot the plot already.
Yep. I hate theaters for this reason. Absolutely unwatchable. My LG G3 with motion smoothing provides a far better experience. Even in Avatar 2 they put some sections in a lower frame rate for no reason, and I noticed them instantly.
Avatar 2 in the 48 FPS scenes was jaw dropping, it looked so real, as if you were transported there. It does not look the same in the 24 FPS scenes and pulled me right out of the movie.
I'm in agreement with them here, judder on movie screens is so bad to me it's distracting. I don't think I've had a better visual experience in a theater than at home in a long time. The screen is physically larger, but when I sit 4 feet from my TV it's the same size visually.
Audio still wins by a mile in a theater though. Though home theater audio has gotten pretty darn good. I just wouldn't know as long as I'm in an apartment.
You can also always choose when to have motion interpolation on or off. For sports and nature documentaries I think it's just better. For animation it's worse. For other films it depends, I usually prefer it in something fast and actiony, like John Wick.
I like interpolation too, but it must be high quality, not the low quality that's built into TVs. For example, I use SVP 4 which does interpolation with your GPU, works great.
Movies are even worse; I can absolutely see the judder, especially when panning. It's one reason I prefer to wait for home releases and only go to theaters for experiences that are genuinely irreplaceable at home, such as Oppenheimer in 70mm IMAX.
The problem with motion interpolation is that most of it is the cheap variety. I don't like motion blur, it is annoying, but bad motion interpolation is worse.
There really are a large range of opinions about this. I love high frame rate, but can't stand interpolation. I wish theaters used higher frame rates. I really enjoyed the parts of Avatar 2 that were smoother, and it felt jarring to me whenever it would switch back to low frame rates.
Probably it's just what you're used to and how much you've been trained to notice artifacts.
Motion smoothing is great on my 120Hz TV.
Theater is fine -- I have a home theater w/ a 24p projector. I wish it had 120Hz smoothing, but at least it's not dim like my local theater.
24p content at 60Hz is really, really bad (3:2 pulldown and whatnot). At least we can agree on that. That seems to be what "turn off motion smoothing" is about, to me.
24p content at 24Hz is fine, but 120Hz is better.
EDIT: I should say that the input lag (gaming) is much greater for smoothing, so even at ~30 fps I'd run my PlayStation at 60Hz no smoothing (game mode on &c).
I understand and agree with what you are saying, but I think your preference for motion interpolation is quite unusual.
Perhaps it is a preference that will change with generations.
Probably not, young people watch most of their content on phones/etc, not big TVs with motion interpolation.
Someone tell the film industry, so that they stop lighting and color-grading the movies and TV shows for ultrabright, turbo-HDR-enabled expensive TVs - almost no one has those, and even those who do mostly watch on their phones. Maybe if the industry gets that memo, we'll once again have films and shows in which one can actually see what's happening.
I’d rather my films be graded for home theater use than phone use. HDR TVs these days are basically all TVs. Why should we all take a step back in quality when the hardware to view it decently is more affordable than it’s ever been? I don’t care how Oppenheimer looks on your phone I know Nolan would care even less.
Because who has home theater? I'd think it was mostly killed by the Internet, streaming, and the overall trend of cable-cutting, ditching TVs, and switching to computer and laptop screens for one's entertainment.
(I guess at some point this must have reversed everywhere except the bubble myself and my friends and acquaintances live in...)
Lots of people have TVs, not many are seriously watching Netflix on their phones (maybe their laptops) but generally on their TVs in the evening.
I personally have a 100" drop-down screen, a UST projector, and a Sonos surround system. This last month, I helped my work buddy shop some deals and got him a 75" Dolby Vision TV and a surround sound system with sub for $1200. These are affordable things.
I have a home theater. I don't know how big the market is, but you can buy receivers from a half dozen manufacturers (Denon, Yamaha, Sony, Onkyo as well as a bunch of speciality brands), digital projectors (Epson, Sony, JVC, BenQ and also a bunch of speciality brands), and all manner of ancillary items (sound proofing, audio treatments, theater seating, etc).
/r/hometheater/ has nearly a million members.
https://www.avsforum.com/ is pretty active.
TVs are still getting larger, and they've recently crossed the 100-inch mark.
So yeah, there's at least a few dozen of us!
This is absolutely the case with all of the nieces/nephews that I know. They prefer to watch the majority of their content on tablets, which 81% of kids have these days [1].
[1] https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&c...
That number is pretty high -- looking at your link, living in a house with a tablet isn't the same as having a tablet. I have a tablet from work but my kids don't use it or any other. BTW, that's really cool that you could direct the highlighting in the URL like that. If it's not specific to census.gov, what is that called? What level of the stack is interpreting it?
It's at the JS layer from what I can see:
#:~:text=In 2021%2C 81%25 of households,17 years old — owned tablets.
That fragment is what triggered it, which means there's some JavaScript taking it in and highlighting what's on the page.
It's a Chrome feature: https://www.theverge.com/2021/4/17/22389519/google-feature-c...
Neat! I learned something new :) Thanks
The number of TVs that have it enabled by default seems to indicate otherwise. I'm not saying that the manufacturers are correct, necessarily, but I don't think they would go through the effort if they didn't think enough people wanted it.
Or: it just "shows" better when they're demo'ing the TV to a potential customer, not unlike the hyper-saturated nature footage they use to get you to look at their TV and not the one next to it.
I don't buy that. Why would motion interpolation look better in the store than it does in your own home?
I think it’s because of the content used for it. Some things do look better. But movies absolutely do not.
Because the store chooses to show footage for which it is optimized, not a movie. But there's also the consideration that it looks better at first sight when you walk past it, than it does when you watch a whole movie looking straight at it.
Just as the people who design google maps navigation don't drive, the people who design parental control features for google and apple surely don't have kids, and the PMs who choose to turn on motion interpolation by default likely don't watch movies on their TVs.
Major, shockingly obtuse sensibility gaps are sadly the norm today in product management.
It seems the side effect of treating customers as "users" in the classical sense is the cross-industry adoption of the mantra "don't get high on your own supply".
They are in fact quite incorrect. These poor upscale features are horrors crafted in the early days of post-processing nightmares. Half of it was pitched to make football games appear more real. They all murder the soul of the film and jack up delay. The industry even had to invent a synchronized audio mechanism to compensate for the summoning of these Eldritch horrors. What I don't mind are modern upscalers, which look very nice on expensive sets. Other modern features like HDR are also lovely because they focus on reproducing the movie with vibrant lighting. Anyhow, one summer working in a television station will improve your acuity for these things. You might even come to understand how these old creations will be the downfall of civilization. As an aside, do not forget to surgically remove Samba TV from any Android set with adb.
You can also dislike subtitles, and watch films dubbed from their original language.
You can also dislike black & white films, and watch the colorized versions done decades later.
You can also prefer not to see nudity, violence, or profanity, and watch the edited-for-TV versions of those films.
Finally you can prefer a totally different story or shot composition or artistic choices altogether, and ask Generative AI to recreate, reedit, or augment scenes to your preference.
All of these are valid preferences and we have the technology to facilitate them for you.
But 1) They should never be the default on any technology you acquire. That's the PRIMARY sin of all of the technologies mentioned by OP - it's not that they exist, it's that they're on by default, and since most humans never change the default settings on ANYTHING, they experience content in a way that was not intended by the original artist behind the vision.
And 2) Well, this is subjective, and everything is a spectrum. But you are ultimately robbing yourself of the specific experience intended for you by the creator of the film. It's certainly within your right not to care and think that you know better than them, but on a spectrum of that philosophy, carried out across all of society, it's probably not a good thing.
Oh look the exact type of shaming OP was talking about.
And so they should be shamed. There are professionals working on every aspect of a film, and these options just shit all over their work, ruining their intended vision.
Now if you're talking about watching YouTube or similar content then it's a different story.
"Vision" implies that there is a choice. How many feature films have been made at a frame rate other than 24fps? Ever since the sound era, I suspect you can count them all on your fingers.
So I don't buy this "vision" argument. It's just holier-than-thou. Directors choose 24fps because it's literally the only option. It's no choice at all.
Forcing me to watch your movie as a jerky motion-sickness-inducing mess? I will indeed, to use your words, shit on it.
The default should be what is more popular. The connoisseurs are probably savvy enough to change the settings from those defaults.
Most people don't give a damn about those things. They just want to be entertained, maybe they have other hobbies where they can exercise their particular snobbism, and movies are not particularly important for them as an art form or they don't care much about art at all.
Juddering 24fps footage means you have a display that isn't displaying 24fps properly (see here: https://www.rtings.com/tv/tests/motion/24p). The refresh rate and the frame rate aren't matching; most modern TVs account for this and can display 24p correctly.
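A rough sketch of what that mismatch does to on-screen frame timing (a toy model that simply repeats each source frame for whole refresh cycles; real TVs vary):

```python
import math

def frame_durations_ms(fps: int, refresh_hz: int, n_frames: int = 6):
    """How long each source frame stays on screen if the panel repeats whole refreshes."""
    per_frame = refresh_hz / fps          # 2.5 refreshes/frame at 60 Hz, 5.0 at 120 Hz
    durations, acc, shown = [], 0.0, 0
    for _ in range(n_frames):
        acc += per_frame
        refreshes = math.floor(acc) - shown   # whole refreshes allotted to this frame
        shown += refreshes
        durations.append(round(refreshes * 1000 / refresh_hz, 1))
    return durations

print(frame_durations_ms(24, 60))   # [33.3, 50.0, 33.3, 50.0, 33.3, 50.0] -> uneven: judder
print(frame_durations_ms(24, 120))  # [41.7, 41.7, 41.7, 41.7, 41.7, 41.7] -> even cadence
```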
It's not that, because I experience exactly the same in movie theaters. I'm always amazed when people say it doesn't bother them.
We could be confusing terms here... I mean 'judder' as uneven frame times, it looks like things skip slightly out of time. Uneven frame times shouldn't even happen with a cinema screening unless something has gone terribly wrong.
My hope is that thanks to stuff like DLSS frame generation in video games (or maybe AR/VR) that the opinion of a majority of people will change over time. So that eventually maybe... we might actually see movies being filmed in higher framerates. People's conditioning really stands in their way, imo.
The only bad thing about motion interpolation on most TVs in my book is the fact that the /implementation is often pretty bad/. If perfect frame interpolation were a thing, I'd watch everything upsampled to my monitor's 165Hz. Well, I do it anyway using the Smooth Video Project. But that also suffers from artifacts, so it is far from perfect. Much better than judder, though...
DLSS 3 really is fantastic technology. It works amazingly well for interpolating a game from 60 to 120 (or higher). It fails pretty hard if you start with 24 though. We'll need something like RIFE for that, but currently no hardware exists that can run it in real time...
Don't know how you do it. I can't stand the jelly effect. Motion interpolation was the first thing I turned off when we got our A95K earlier this year.
Seriously... no! Our first LCD TV (several years old by now) had this, and watching LOTR on it was a "wtf is this, am I in a theater?" moment. My wife and I sat and watched it together, secretly annoyed, but we dared not say anything or complain, because we had just spent tons of money on it... Then we found the setting and fixed it!
The new TV I've got has `Filmmaker mode` - I wasn't sure what exactly that was, but I turned it on and yes, that's how it should be. This article cleared it up for me.
I don't think this is about feelings. (But it just might be in a small minority of cases IMO, and if it's the case for you, then all the power to you.)
In the case of this article, all the points are about discernment. Some people not only notice the issues, but are also able to identify them and articulate precisely what is happening. The rest don't.
No one was shaming you, get over yourself. The author repeatedly framed recommendations in the context of how the creators produced it. If you feel shame that’s all you.
It's so ironic that at the end you gave advice of your own: that people should try turning motion interpolation on.
Are you assuming that everyone likes the same things you like? I guess not. The same for the article—it doesn't assume that.
No advice fits everyone, but good advice fits many people. And I'd argue that the article's advice fits many, if not most, people.
TV is probably the only screen many people have that has motion interpolation on by default. They watch movies in theaters and on PC monitors, laptops, tablets, and phones probably more than on TVs.
Many people are already used to what movies "look like." Non-tech-savvy ones might not know what "FPS" means, but they would likely be able to notice that watching movies on their phone feels different from on a TV with the default setting. The article's suggestion to disable motion interpolation on TVs makes the watching experience more consistent across all screens.
I like the feature, too. I remember watching the Battlestar Galactica remake with the interpolation setting active and getting an even deeper sense of realism out of the scenes. They were already aiming for a documentary style with the camera work and special effects, so the higher, prosumer-video framerate fit the style very well. On other films and TV shows I like the interpolation for panning shots, which judder a lot at the standard framerate.
Isn't the manufacturer making a worse/larger version of this same assumption by having these things on by default?
More articles explaining how to customise the settings are better than fewer, because they draw attention to the fact that these options are being forced on all consumers.
Shaky-cam action movies are their own unique problem :)
I think they exist because it saves a fair bit of cash on fight choreographers and the time it takes to train the actors to do it realistically. On the flip side, it really increases the respect I have for properly choreographed action scenes that are consistent and visible.