
Re-creating Disney's sodium vapor process [video]

modeless
27 replies
21h23m

Hmm, why don't people do this with near infrared? It would allow using full spectrum lighting, and would be easier to selectively filter.

I'm guessing the answer is "they do, it's just uncommon and you haven't heard about it"

largbae
9 replies
21h8m

Does the magic color need to be within the frequency range that the camera and video format can record?

modeless
5 replies
21h5m

Most camera sensors need a filter to block infrared light because they are sensitive to it. Remove the filter and replace it with an infrared-pass filter, and you've got an infrared camera. It may not be as sensitive there as at visible wavelengths, but you can use brighter infrared lights to compensate if necessary. This sodium vapor process already uses two cameras, so it would work the same way.

It seems possible to produce a four-channel camera sensor specifically for this use case, if it were popular enough. Given how common green screen is in Hollywood, I'm surprised I haven't heard of anyone doing it. Maybe Hollywood just doesn't care because they can do regular green screen and just hire someone to fix it in post.
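
For example, once the infrared camera's frame is registered to the color camera's, turning it into a matte is mostly arithmetic. A minimal numpy sketch, assuming the backdrop is the only thing bright at the key wavelength (array names and thresholds are hypothetical):

    import numpy as np

    def matte_from_ir(ir_frame, lo=0.1, hi=0.9):
        """Turn an aligned near-IR frame into an alpha matte.

        The backdrop reflects the key (IR) illumination and reads bright;
        the subject blocks it and reads dark. Values between lo and hi
        become partial alpha, which is what preserves soft edges, hair,
        and translucency, just like the sodium vapor matte.
        """
        ir = ir_frame.astype(np.float32)
        ir = (ir - ir.min()) / (ir.max() - ir.min() + 1e-6)  # normalize to 0..1
        return np.clip((hi - ir) / (hi - lo), 0.0, 1.0)      # backdrop -> 0, subject -> 1

    def composite(foreground_rgb, background_rgb, alpha):
        """Standard alpha-over composite using the matte from the IR camera."""
        a = alpha[..., None]  # broadcast alpha over the color channels
        return a * foreground_rgb + (1.0 - a) * background_rgb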

hollerith
3 replies
20h49m

brighter infrared lights to compensate if necessary

I am guessing that that is dangerous to human health because infrared fails to elicit the flinch reaction that too-strong visible light does.

hansvm
2 replies
20h13m

Near-infrared penetrates very well. The same intensity is much safer, up till you get to the levels of radiation that are dangerous regardless of how they're distributed in your tissues.

That's one of the things that makes it suitable for thermal nano cancer therapies. You

(1) Create a nanoparticle with a gold core sized so that it resonates and heats when exposed to near-infrared radiation (the far end of "near" is ideal),

(2) Coat it with something that's hopefully less toxic than the cancer because the gold nanoparticles are pretty reactive (silica was popular last I checked, though I'm not convinced that's actually safe, and there hasn't been much testing on it),

(3) Coat that in something that binds to the right antigens,

(4) Inject that into the patient so that you'll eventually have a tumor rich in these nanoparticles while the rest of the body has a very low concentration,

(5) Shine an intense near-infrared beam at the nanoparticles. It safely penetrates the body and causes minimal heating, instead depositing all its energy at the tumor, selectively cooking just the bits of tissue you don't want anymore.

It's yet another way to make mice immortal, but human testing is a long ways off. The hard parts are (2) and (3), along with the (4a) I didn't mention where most of these things don't want to stay "nano" in most chemical environments, and the solution they're suspended in is usually also not ideally suited to being injected into living mammals who would like to retain that "living" property.

hollerith
1 replies
19h5m

Near-infrared penetrates very well. The same intensity is much safer

Glowing-hot steel produces enough IR that workers who handle it have 7 times the rate of cataracts of the general population.

hansvm
0 replies
17h41m

Sure, enough of almost anything is dangerous.

NIR is much safer than the visible spectrum at the same power levels. The threshold for a few minutes of high-intensity NIR increasing your chance of eventual cataracts is staring at an NIR source 10x brighter (in W/cm^2) than the sun integrated across all its wavelengths. The threshold for sun-colored wavelengths causing eventual cataracts above baseline, from the same few minutes of staring, is under 1x.

Also worth noting, very very near IR (basically red, though we can't see it) doesn't quite enjoy those same properties, and during their time working with hot metal you'd expect a lot of energy in that band.

Also worth noting, many steel workers are exposed to dangerous amounts of visible light too. I absolutely believe that they can get enough NIR to cause problems, but if I wanted to try to prove that NIR specifically causes their problems to somebody else then I'd want to try to account for that fact (and for incidental welder exposure, ...).

magicalhippo
0 replies
20h8m

The sensors should be sensitive[1] well into the near-IR, and there are people in the astrophotography community who can modify digital cameras to remove the IR-cut filter, these[2] for example.

Another alternative I was thinking about is just to use one of those monochrome astrophotography cameras, something along the lines of this[3]. The monochrome version doesn't have the IR-cut filter and sensitivity is pretty OK out into the near-IR.

edit: The IMX492 sensor in that one has a 40% response at 850nm, and 850nm IR diodes are plentiful and shouldn't emit[4] much at all below the 700nm cutoff of the IR-cut filter the color camera should have.

Not sure how mixing sensor sizes and such would affect things, or if it's better/easier to just run two of the same camera.

[1]: https://s1-dl.theimagingsource.com/api/2.5/packages/publicat...

[2]: https://www.lifepixel.com/

[3]: https://www.zwoastro.com/product/asi294mm-mc/

[4]: https://lumileds.com/wp-content/uploads/files/DS191-luxeon-i...

laserbeam
2 replies
17h17m

You want the 2 cameras you are recording with to be near identical. If the lens is different (for example) then the mask might be distorted and would not line up with the colored image.

Building 2 nigh identical cameras, but one with a sensor outside of the frequency range of the other, is probably a very expensive custom job.

moron4hire
0 replies
10h48m

Lens distortions can be corrected mathematically in post after a relatively simple calibration. Your standard DSLR actually already does this. Most lenses have a chip that reports an ID, which gets recorded with each photo and is used to look up distortion parameters in a database.
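
A rough sketch of that correction with OpenCV, assuming a checkerboard calibration target and made-up file names (for a two-camera rig like this you'd calibrate and undistort each camera the same way):

    import glob
    import cv2
    import numpy as np

    # Inner-corner count of the calibration checkerboard (placeholder values).
    PATTERN = (9, 6)
    objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)

    obj_points, img_points = [], []
    for path in glob.glob("calibration/*.png"):       # hypothetical calibration shots
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, PATTERN)
        if found:
            obj_points.append(objp)
            img_points.append(corners)

    # Solve for the camera matrix and the lens distortion coefficients.
    ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)

    # Undistort a frame from the matte camera so it lines up with the color camera.
    frame = cv2.imread("matte_frame.png")             # hypothetical frame
    cv2.imwrite("matte_frame_undistorted.png", cv2.undistort(frame, K, dist))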

Glyptodon
0 replies
16h10m

If you have the prism, you just need a lens mount on one face at sensor distance from the prism and two reverse mounts on the other faces to attach mirrorless cameras (assuming the light exits the prism straight, or that you can just split the distance between the two exit faces -- note I have no idea about this). It doesn't seem that expensive: buy a couple of mirrorless cameras, get yourself a cube with the mounts, and get going with one lens, no?

It seems like the cube with mounts is the most expensive part, and there's no way it can cost more than one of your cameras, let alone the lens, right?

wokwokwok
8 replies
20h35m

Did you see them putting the black panels up in the video and measuring the color spectrum off the costume, and covering the skylight?

They did that to get a 'perfect' mask.

If you use infrared, which is generated by random environmental objects, how would you prevent random spill?

Practically speaking (this is my understanding), if you got random spill on the surface, you'd get random alpha in the mask.

The point of using the sodium light is that it's a single wavelength that you really really would not get naturally, so you can totally control the emission sources of it.

Could you do that with a single frequency infrared? I guess it's possible, but I'm not sure how you'd do it technically, and it's unclear what the benefit of using a different frequency would be?

modeless
7 replies
20h25m

Near infrared isn't generated by random environmental objects. You're thinking of far infrared. Infrared is a big spectrum, much larger than the visible spectrum.

It may be generated by lights but could be filtered out. It is generated by the Sun, but so is the sodium wavelength. The technique would work indoors just as well as the sodium one, but with important advantages. I think it could even work pretty well outdoors if you used a wavelength that is mostly filtered out of sunlight by the atmosphere, or maybe two different wavelengths in two channels simultaneously.

wokwokwok
6 replies
17h59m

Huh, my bad, I thought thermal sources (i.e. random hot stuff, not glowing-hot stuff) emitted a wide band of frequencies including near-IR, but fair enough. You'd still get it from any sort of incandescent light source, right? (Same as a sodium lamp.)

but with important advantages

What's the benefit over using any other single frequency visible light?

Even if the atmosphere absorbs most of a particular band, if any of it gets through, your mask is going to be messy, which kind of ruins the point; I don't know that anything can really fix that for outdoor use.

I mean, broadly speaking, all you need is a way to emit a sharp frequency band and avoid having any environmental pollution in the same band.

If there's a good way to do that in near-IR, it would work; but I feel like the same could be said about pretty much any frequency.

I'm not aware of anything specific about near-IR that makes it particularly attractive.

(but, clearly, I'm no expert in near-IR)

modeless
3 replies
17h45m

The advantages I stated were

It would allow using full spectrum lighting, and would be easier to selectively filter.

Making a broad-spectrum IR filter is far easier than making a notch filter for a single visible wavelength. In general, more selective filters are harder to make, and an IR filter can be much less selective because the IR band is so large and so far away from any visible light.

Also the infrared version would allow using any color background including black, and the on set lighting would appear to the eye exactly as it does to the camera. With the sodium vapor process the bright yellow background would make everything on set look different in person because the eye wouldn't be able to subtract it the way the camera does.

Yes, you would need a filter on any incandescent light. But this is also true of the sodium vapor process. Actually I'm not sure anyone makes a sodium vapor wavelength notch filter suitable for putting in front of an incandescent light. Consequently, as seen in the video, they restricted themselves to special LED lights that emit essentially only three wavelengths. These are very much not full spectrum and will not produce natural colors for some materials. And they had to completely block all windows, whereas for IR you could just install some IR blocking film on the windows and still use natural sunlight.

wokwokwok
2 replies
15h26m

The whole point of this technique is that you have that sharp frequency band and a crystal / filter that splits that band out.

You're basically proposing something completely different here.

So... while I agree you could do something as you describe, and I'd be interested to see the results of it, I'm not sure what the results would be like.

The advantages I stated were

> It would allow using full spectrum lighting, and would be easier to selectively filter.

All I can say is the CC results in that video were pretty great, despite the obvious caveats, compared to green-screen.

If you (or anyone) can build on it to do better, people will be interested in it; but there's obviously a difference between us idly speculating on how/if it might work, and this, where they've actually done it and shown that it does work.

"they do, it's just uncommon and you haven't heard about it"

I've never heard of it being done.

If anyone has, please post a link or something.

modeless
1 replies
15h18m

No, the point of the technique is you have a frequency band that you can isolate and use in the background and not in the foreground. Using IR instead of visible as your isolated band makes the isolation easier and cheaper with fewer artifacts and limitations. It's still a very similar process otherwise.

Yes, Corridor's results were great. Fewer limitations would make the process even better!

ctrw
0 replies
13h10m

Near-IR LEDs are also cheap and easy to power; they're what's in remote controls. Low-pressure sodium lights aren't. I remember using them in postgrad labs and it wasn't fun.

Alternatively, you could use a solid-state laser or narrowband LEDs if you need the band you isolate to be in the visible spectrum too.

pfdietz
1 replies
6h20m

What's the benefit over using any other single frequency visible light?

One advantage might be to pulse the IR and detect it via that, if the detectors are sufficiently fast. Maybe have a filter that's pulsed also (Pockels cell?)

https://en.wikipedia.org/wiki/Pockels_effect
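
As a toy illustration of the pulsing idea: strobe the IR backdrop on alternate frames and difference the pairs, so any steady ambient IR cancels out. A numpy sketch, assuming hypothetical frame arrays and negligible subject motion between the paired frames:

    import numpy as np

    def matte_from_pulsed_pair(frame_ir_on, frame_ir_off):
        """Difference a lit/unlit frame pair from the IR (matte) camera.

        Only things that brighten while the pulsed backdrop is on show up in
        the difference; constant ambient IR sources cancel out.
        """
        diff = frame_ir_on.astype(np.float32) - frame_ir_off.astype(np.float32)
        diff = np.clip(diff, 0.0, None)
        diff /= diff.max() + 1e-6             # normalize the backdrop response to ~1
        return 1.0 - np.clip(diff, 0.0, 1.0)  # subject -> 1, backdrop -> 0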

randomfrogs
0 replies
5h41m

One problem you might run into is that a lot of common plastics are opaque to NIR light, so you might find certain materials give you strange results (water bottles that appear transparent to the eye would not actually pass the NIR light needed to make the mask layer).

ikiris
3 replies
20h55m

Because you want color bands that you control, not that are naturally emitted.

modeless
2 replies
20h6m

Near infrared isn't naturally emitted by room temperature objects. You're thinking of far infrared.

ikiris
1 replies
16h52m

I didn't say room temperature objects. I said naturally emitted. Trying to ensure that the lights and other objects involved in a film production have no emission inside the near-infrared mask band would be a lot of effort compared to just... not having the problem at all, with something easily attained from cheap materials like narrowband sodium.

modeless
0 replies
16h33m

By that definition the sodium wavelength is naturally emitted too. The sodium vapor process as implemented in the video requires special lights that don't emit the sodium wavelength. It's actually much harder to do that (and it affects color reproduction too) than to block some infrared from existing lights. Infrared blocking film and filters are readily available and cheap. They don't need to be fancy narrowband notch filters like a sodium wavelength filter would have to be.

Now that I think about it, Disney must have done something pretty special for the lighting in the foreground in their version of the process. They didn't have single wavelength LEDs back then. I wonder how they did it?

xemoka
0 replies
21h11m

Neat little rabbit hole, thanks. It led to Netflix's "Magenta Green Screen: Spectrally Multiplexed Alpha Matting with Deep Colorization" by Dmitriy Smirnov, Chloe LeGendre, Xueming Yu, Paul Debevec: https://arxiv.org/abs/2306.13702

Not with a non-visible spectrum, but along the same lines. I can't seem to directly find anything with NIR though.

wiml
0 replies
16h6m

From what I've read in old books on SFX, they did, and near-UV as well.

My guess though is that using a visible wavelength makes it easier to set up the lighting and ensure it's even and so on (remember, this was before electronic technologies that would make it possible to live-preview the result). Also, sodium lights and sodium band filters had other uses in industry so they might have been more available than NIR/NUV equivalents.

ginko
0 replies
13h51m

Apart from what others have said, you'll run into problems with chromatic lens aberrations the further your "key" wavelength is outside of the visible spectrum so your two images might start to mismatch too much.

I know in IR still photography you're supposed to add a focus offset marked on your lens to adjust for that. A regular achromatic lens quickly fails outside of the wavelength range it was designed for[1].

[1] https://en.wikipedia.org/wiki/Chromatic_aberration#Minimizat...

eurekin
0 replies
12h52m

Maybe... it would get too hot?

EDIT: actually, since it's outside of the visible spectrum, a typical lens design, which corrects longitudinal chromatic aberration and does the spherical-to-rectilinear projection, simply might be off, resulting in soft edges and general artifacts unwanted in a clean mask

kadoban
23 replies
1d

Very interesting stuff.

I'm curious if this means anything in terms of good vs. perfect technology, or which tech wins out in practice. This process seems just plain better than green screen, so why did it lose out? Is it just that green screen is good enough? Was it too hard to do this sodium vapor process with random cameras back in the day? Are sodium vapor lights bad enough in some way that practically it lost out?

buck746
10 replies
23h56m

It lost because there were only 3 prisms made. It also required a lot more light, since running through an optical splitter means at least half of your light is lost; that's not a big issue today, as modern cameras are insanely more sensitive than film was at the time. Alfred Hitchcock used this on "The Birds". In the YouTube video they got a great result even with a smaller studio than would have been used on Mary Poppins. The trick needed the screen as far back as you could get it to minimize the chance of spill; that used to be normal on blue screen before the software improved.

kadoban
9 replies
23h52m

It lost because there were only 3 prisms made.

But it seems so much better that that shouldn't have been a blocker. Like optics isn't magic, and knowing that someone did it before should be enough incentive? Not like movies are a small or poor industry.

cooper_ganglia
3 replies
21h28m

I wonder if it’s just more cost effective to use a well-established tool like green screen, and then contract out all the rotoscoping clean-up to other countries for less than minimum wage?

kadoban
0 replies
21h19m

Yeah maybe it's something like the person choosing the camera process doesn't directly have to deal with 12 hours of roto per day, so practically it's just not something they think about.

bigiain
0 replies
20h37m

"and then contract out all the " blame for it's shortcomings...

"Sorry boss, that's industry best practice. Nothing we can do to make sit better."

486sx33
0 replies
21h18m

I think part of the deal is that no amount of manual intervention will handle translucent objects (like the veil) as nicely as sodium vapor lighting does; same goes for motion blur.

astrodust
2 replies
20h48m

Incentive is one thing, budget is another.

Keep in mind stuff gets lost all the time, even really important things, like the nuclear weapon component Fogbank: https://en.wikipedia.org/wiki/Fogbank

kadoban
1 replies
20h5m

Fair. But looking at that story (very interesting btw, thanks), they did recreate it. So incentive does mean a lot when $$ is also available, which I'd suspect it is in movies.

astrodust
0 replies
50m

I think the US nuclear program has a much bigger budget than Disney, and if they need something, they just get it.

The movie industry had cheaper options, so they went with those.

wbl
1 replies
19h29m

Today we could use a dichroic coating, but back then I think fabrication was much harder.

namibj
0 replies
13h21m

Well, the technology for them is far from recent. But the process control and chamber-geometry design needed to hit the required precision and uniformity, at a throughput that isn't extremely expensive, required cheap microprocessors, plus a good decade for control engineers to develop the skills to actually go and build a machine like this. And suitable computers were still triple digits in the '80s, so this only really started to deploy to factories ~25 years ago.

YoshiRulz
2 replies
18h17m

This was addressed at the end of the video: The main points are that it can't be used outside, or when the shot includes the subjects' legs/feet (as the ground would need to be an amber-screen as well), and on top of that, it requires a certain level of setup which some shoots can't afford.

laserbeam
1 replies
17h12m

I don't think I've ever seen green screen used outside either. I think you can kinda use it outside... But any random cloud passing by would destroy your morale in the editing room if you're keying against green screen. Also lighting conditions change during a day and you'd have to be aware of that when keying. Not fun.

niccl
0 replies
16h56m

Lord of The Rings used some huge external green/blue screens. One I know of was the scene in Return of the King where there's a dead oliphaunt and, IIRC, Eowyn fighting the Witch King. But there were lots of others.

tadfisher
1 replies
21h31m

Practically, you could see it in the video: they had to have near-perfect lighting to avoid any 589nm spill on the subject, or they would end up with semi-transparent regions on the matte where they didn't want any. Chroma-keying is more tolerant to spill, only needs one camera with one film reel, and you don't need a special screen with special lamps and special cameras.

kadoban
0 replies
21h26m

Chroma-keying is more tolerant to spill, only needs one camera with one film reel, and you don't need a special screen with special lamps and special cameras.

Fair on the number of films/cameras.

Is normal keying actually more tolerant of spill, or is it just worse overall, so you're going to have to manually deal with spill anyway and just live with it? I don't even have good eyes, and even I sometimes see spill in actual movies, or especially in TV. Edit: I guess that's _different_ spill though.

I mean you do famously need a special screen :) And you don't need _special_ lamps per se but you do have to light your greenscreen fairly well to have real success.

cladopa
1 replies
21h19m

so why did it lose out? Is is just that green screen is good enough?

It looks like the problem was manufacturing the prism, as simple as that.

Today you can use unusual-wavelength LEDs without phosphor to do the same thing, probably way cheaper.

namibj
0 replies
13h27m

No, LED emission is far wider than low-pressure sodium, which consists of two lines, each about 0.1nm wide (thermal broadening; IIUC basically the Doppler effect between the emitting particle and the surrounding world/containment vessel), spaced about 0.6nm apart. See https://lambdasys.com/uploads/LLE-2.pdf for a measured spectrum that's not as coarse as the one on en-wiki, where both lines are blurred together.

LED linewidths are 10~50nm, untuned laser diodes 1~10nm. It is my understanding that it's easy to make a filter for the camera that very selectively targets this line pair, though it may entail mild optical complications: a sub-nm pass/reject bandwidth detunes at off-axis angles, so lenses may need to collimate the light before it hits the filter and de-collimate it again afterwards.
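
To put a rough number on that collimation point: the usual thin-film estimate for an interference filter's passband center at an off-normal angle is lambda(theta) = lambda_0 * sqrt(1 - (sin(theta) / n_eff)^2). A quick sanity check in Python (n_eff ~ 2 is just an assumed effective index; real filters publish their own):

    import math

    LAMBDA_0 = 589.0   # nm, passband center at normal incidence
    N_EFF = 2.0        # assumed effective index of the filter stack

    def center_wavelength(theta_deg):
        """Approximate passband center at an off-normal angle of incidence."""
        s = math.sin(math.radians(theta_deg)) / N_EFF
        return LAMBDA_0 * math.sqrt(1.0 - s * s)

    for angle in (0, 5, 10, 15):
        shifted = center_wavelength(angle)
        print(f"{angle:2d} deg: center {shifted:7.2f} nm, shift {LAMBDA_0 - shifted:5.2f} nm")

    # At ~10 degrees the passband has already moved by roughly 2 nm, several
    # times a sub-nm bandwidth, which is why such a selective filter wants
    # collimated light.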

rperez333
0 replies
6h7m

Because it creates more problems than it solves. A movie set is expensive and moves fast – and keying is not as big of a problem as they made it out to be.

For example, Niko doesn't seem to be aware of the IBK keyer (based on his comments in the VFX subreddit), which is the keyer used on almost every shot in high-budget movies. While it's not a single-click solution, it's the main tool to tackle thousands of shots with defocused edges, motion blur, and semitransparency every year.

Niko deserves credit for bringing Paul's incredible experiment to a wider audience, but Corridor is at its core a group of entertainers, and most of them have never had experience in a large production/VFX environment. Take everything they present with a grain of salt.

rendall
0 replies
16h8m

so why did it lose out?

The video suggests that it was difficult to replicate and extremely expensive.

mrguyorama
0 replies
6h14m

In modern production, an army of underpaid, un-unionized visual effects contractors pitching lowest-bidder contracts to manually retouch basically every single pixel in the film six times over is a fine price for studios to pay. This method, by contrast, would require carefully planning everything ahead of time, proper scripting, and giving up the ability to change scenes after shooting nearly as easily, and it would probably also involve hiring unionized labor: talented tradespeople who can say "naw, that's a bad deal and a dumb idea."

giantrobot
0 replies
20h46m

As I understand it (not an expert, but an interested amateur), the sodium vapor method was a cheaper way of producing the mattes movies used to use. To make a matte they'd film the actors over a black background with normal lighting for the actors. The black was velvet or batting that didn't reflect any of the foreground lights. They'd take that film and do a bunch of exposures that would eventually create a white silhouette that preserved a decent amount of fine detail from the foreground. That matte would be used in an optical printer to double-expose the film, once with the background plate and then with the foreground plate. While effective, this was really expensive and time consuming.

The sodium vapor light technique allowed for a very easily generated matte with a high level of detail. It however required specific lights and a lot of control over spillage from the background lighting into the foreground. It was also difficult to film from multiple angles since the lighting would need to be adjusted when the camera was repositioned. With those limitations in mind it was much cheaper and less time consuming than making mattes with multiple exposures and such. Cheaper effect shots meant more effect shots for the same size budget. With a movie like Mary Poppins that wanted a ton of effects shots the traditional method of making mattes would have been cost prohibitive.

Chroma keys came about because they were much cheaper than the multiple exposure method of doing mattes. The back and foreground could be lit together and the matte could be knocked out relatively easily by applying a chroma filter when making the work print. The mattes weren't as exact and had their own limitations but they were relatively inexpensive. For video (as opposed to film) chroma keys could be done electrically by filtering and mixing the video signal.

Digital mattes have a lot of flexibility. As shown in the video digital editing software has lots of toggles to help make chroma mattes look good. It's a very manual process and hard to really get right. Also as shown in the video there's some things that just do not work really well with chroma keying like translucent materials and anything with high index of refraction. They're typically a good balance of cheap and effective with a lot of known techniques to make look good.

WorldMaker
0 replies
1h39m

Some of why it "lost out" just seems that it was a proprietary technology carefully guarded with a rather low "bus factor". Blue/Green screens and chroma-keying have been "common knowledge" for a long time, with a lot of implementations over decades, but the sodium vapor process was a more closely held secret because it was a competitive advantage. Fewer people knew how to do it. Fewer people had access to do it. Eventually those people passed on or retired and the industry "forgot" how to do it entirely. It may have been "lost" simply because not enough people were allowed to know how it worked or have access to recreating it.

I think there are a lot of analogies here to find related to the software industry, such as the relationships between closed source and open source software.

lqet
6 replies
13h52m

Yes, extremely impressive. "Mary Poppins" rightfully received an Oscar for best visual effects.

I am continuously amazed at the level of innovation done at the Disney studios back when Walt Disney was still alive. Rewatching Snow White with my daughter a few months ago, I was flabbergasted by the animation quality, even when compared to animation movies made decades later. Just look at these water scenes:

https://youtu.be/54QeNL5ih6A?si=dOdrRrISu8f4I96G&t=95

https://youtu.be/khmrr7-W6BA?si=whn5mvuL9x95JP1l&t=70

This was in 1937! Compare these scenes to this Paramount animation clip (also featuring Snow White), which was finished just 4 years earlier: https://www.youtube.com/watch?v=cKOSJ5AAwfc

The multiplane camera [0] is also ingenious and is imho the single biggest reason why Disney could produce animation movies that looked "cinematic". I read somewhere that they re-used it for "The Little Mermaid" in the '80s, and nobody really remembered how it worked anymore.

Disney also developed Xerox animation [1], which is cheaper than traditional animation, but which gave 101 Dalmatians a very nice "sketchy", artsy look.

[0] https://en.wikipedia.org/wiki/Multiplane_camera

[1] https://www.smithsonianmag.com/innovation/how-one-hundred-an...

degauss
1 replies
2h46m

Around 1996 I worked at Walt Disney Feature Animation (Hercules, Tarzan, Mulan, Dinosaur era). I was tasked with implementing the Disney multiplane 2D scanning camera in a digital environment, Maya, a 3D modeling and animation package. I was given full specs and it was straightforward. One thing was odd: the 2D coordinate system used 5000,5000 as the origin, I forget what units. No one seemed to be able to explain why. A while later the original mechanical multiplane scanning camera was put on display. Looking at it, it made sense. It used a 4-digit mechanical counter for the position of each axis with a range of [0,9999], so 5000 is approximately the center. Warner Brothers Animation used software called Animo and it had the same origin. I ended up working with an Animo developer at my next job and asked him about it. He said Warner Brothers told them to implement it that way with 5000,5000 as the origin because that's how they do it at Disney and Disney must have a good reason.

atlas_hugged
0 replies
55m

Hahaha that’s awesome. Witnessing the formation of a mini-cargo cult, so-to-speak.

beAbU
1 replies
13h31m

Re the animation quality: I wonder if Disney didn't create a lot of their later animation by rotoscoping over live-action recordings? The quality of the animation and the realism of the characters' movements in those examples you posted definitely point to this.

Disney was known to recycle some of their animations in older movies [1], so this could be because they have a source live action recording somewhere that they rotoscoped over more than once for different productions. Also note the different backgrounds with different characters that move - excellent example of the multiplane camera in action. In the foreground they have a "recycled" animation, in the background they can then put something else in, and record it all in one go with their special camera.

1 - https://www.youtube.com/watch?v=vd09WdOgOeg

lqet
0 replies
12h41m

I found this interesting paragraph on Wikipedia:

Very few of the animators at the Disney studio had had artistic training (most had been newspaper cartoonists); among these few was Grim Natwick, who had trained in Europe. The animator's success in designing and animating Betty Boop for Fleischer Studios showed an understanding of human female anatomy and, when Walt Disney hired Natwick, he was given female characters to animate almost exclusively. Attempts to animate Persephone, the female lead of The Goddess of Spring, had proved largely unsuccessful; Natwick's animation of the heroine in Cookie Carnival showed greater promise, and the animator was eventually given the task of animating Snow White herself. Though live action footage of Snow White, the Prince and the Queen was shot as reference for the animators, the artists' animators disapproved of rotoscoping, considering it to hinder the production of effective caricature. Nevertheless, all of the above-mentioned characters were fully rotoscoped and utilized by their respective artists, some more, some less.[98] Despite Graham and Natwick's objections, however, some scenes of Snow White and the Prince were directly traced from the live-action footage.

https://en.wikipedia.org/wiki/Snow_White_and_the_Seven_Dwarf...

hondo77
0 replies
4h40m

It's not that they forgot how the multiplane worked. They understood it...in theory. However, they were quirky beasts and it was knowledge of those quirks that was lost. Like getting an old car to run--most people these days don't know about pumping the gas pedal to keep it running.

PaulHoule
0 replies
9h22m

Disney's animation team was decimated in the 1970s; when they made The Black Cauldron, it took them two years to figure out how to draw decent humanoid animation again.

bcraven
3 replies
18h16m

After seeing the Olafur Eliasson retrospective at the Tate Modern a few years ago I tracked down the components to build my own low pressure sodium vapour light.

It's so weird, as those people say, to see things coloured... yet monochrome.

jlarcombe
1 replies
16h3m

I rather miss them in the streets. It's only been, what, ten years or so since they've been replacing most of them with white LEDs round my way, but it's almost hard to remember that everything was yellow whenever you went out in the dark! Especially amazing atmosphere when it snowed...

BoxOfRain
0 replies
9h38m

I really prefer the old yellow sodium light to the harsher white LEDs, I get that the SOX bulbs weren't economical to make any more for various reasons but wish we could have migrated to monochromatic yellow LEDs instead of white ones. They're not just more pleasant aesthetically, they're also better for insects and animals than broad-spectrum white lights. I don't get the obsession with making lights as cold and harsh as possible, give me warm light that makes the night look like night rather than a bad approximation of day.

At least by the time they got round to doing my street they used a lower temperature white than the ultra-harsh 'prison yard white' colour used in the earlier installations.

harywilke
0 replies
11h26m

I was fortunate to be at the Tate for that exhibition and got to see them turn the 'sun' off at closing time. I was visiting my brother in London at the time; he wasn't a big fan of modern art, so we split up: I went to the Tate, he went elsewhere to sightsee. Later at dinner, after effusing over the 'sun' exhibit and showing him photos of how cool it was, we rushed across town to get in just before closing. We got to see it together for about 10 minutes before they turned it off. It went from a Sun to the Moon. Spectacular.

esafak
9 replies
1d20h

tl;dr: Narrow-band lighting and optical filtering.

crawsome
8 replies
20h58m

Thanks. The sensationalism and feigned overenthusiasm of these thirsty youtubers is so grating. I'd rather just read about it.

wincy
1 replies
17h44m

I don't know, when Nico gets excited about a project, I suspect it's genuine. It's part of why I watch the channel.

His employees (Wren especially) give off a silly fake-enthusiasm vibe (or maybe he's just like that, and that's why he got hired, I don't know), but Nico only really shows up in the videos he wants to make. Guy seems like he's getting to live out his dream. The videos he spearheads are both technically impressive and time-consuming, such as Anime Rock Paper Scissors, which was a groundbreaking use of AI diffusion models a year ago, or this one.

coayer
0 replies
4h23m

I've been watching these guys for so long — when Wren's speaking to camera in his solo videos he overdoes it but I think he's naturally pretty excitable. Niko is my favorite though.

ChrisClark
1 replies
20h9m

He's been trying to recreate this for years, so I can understand why he's so excited to do it and that it works so perfectly.

kibibu
0 replies
18h34m

The actual cleverness was from Paul Debevec, who has done incredible things for the VFX industry over his career.

krasin
0 replies
19h55m

Normally, I would also prefer reading, but for special effects, I would rather see them on video.

kidintech
0 replies
11h18m

Although I hold a similar view about "youtubers", please give these guys a chance; it's at best ignorant to type that instead of supporting one of the few teams/channels that are visibly passionate, hard working, and constantly putting out fantastic content.

astrodust
0 replies
20h50m

While I normally agree, the fact that they went to the trouble of creating an actual demonstration makes this one worth the video click.

CitrusFruits
0 replies
20h18m

The Corridor Crew have been talking about this and trying to find a modern example for years. From what I understand, this is actually a pretty big deal to be able to re-create a technique that was lost to time.

On a related note, I think this is the sort of channel that we should be encouraging and rewarding more. They do their best to make original content, while fairly regularly experimenting with and even pioneering new filmmaking techniques (e.g. Anime Rock Paper Scissors). Sure they play the algorithm game, but they're not the worst offenders by any means.

jmpman
4 replies
20h39m

I’ve been doing something similar using “cheap” Chinese band notch filters. Had no idea about this Disney process. Also using this for AI training purposes. It’s amazing what you can buy on Alibaba, fully custom frequencies.

sixothree
1 replies
16h17m

Are you using sodium vapor or a different light source?

jmpman
0 replies
5h46m

Different frequency.

krasin
0 replies
19h59m

I would be interested in some links!

fnands
0 replies
11h58m

Also using this for AI training purposes.

Cool! Training for what? You can't just drop a juicy comment like this and then leave us hanging ;-)

jiggawatts
4 replies
18h8m

The original beamsplitter cube isn't some exotic custom device that would cost tens of thousands to make.

It's just a dichroic beamsplitter, which is a standard optical element available from online sellers. Variants of these are commonly used in scientific experiments, optical devices, etc.

Here's one for under $300: https://www.meetoptics.com/beamsplitters/plate/longpass-dich...

Just search for: "589 nm beamsplitter".

rerdavies
2 replies
14h2m

Presumably, the exotic part of the beamsplitter is that the dichroic filter must be a bandpass filter with a very narrow passband around 589nm. Yours is a high-pass filter, which wouldn't work for this application.

eurekin
1 replies
12h11m

So, a second beamsplitter with a low-pass filter wouldn't cut it?

lightedman
0 replies
8h25m

Nope, you need a notch filter.

laserbeam
0 replies
16h58m

I can't find any that are prism shaped (in the short time I spent on the website). I don't actually know what kind of manufacturing challenges you get if you want to place these splitters at the center of a prism.

However, I'd guess these are the components the guys are using between the prism and the cameras.

slacker_news
2 replies
9h15m

I saw this video come up on my YouTube feed multiple times, but never wanted to watch it because of the clickbait title and thumbnail. But I saw the title on Hacker News, was immediately interested, and clicked on it. I wish the title could be as straightforward on YouTube.

spdif899
1 replies
9h7m

Check out this extension: https://dearrow.ajay.app/

Same guy who made SponsorBlock; it crowdsources titles and thumbnails. Tends to have good coverage on anything reasonably popular. Also has features to reduce the clickbaitiness of videos that haven't had someone submit better titles/thumbnails.

slacker_news
0 replies
8h46m

That looks perfect for me thank you!

Animats
1 replies
21h5m

It looks like Redwood City's old sodium-vapor streetlights, recently replaced with LEDs. The Golden Gate Bridge also used to use such lighting. That still looks yellow, but in 1972 the original sodium-vapor lamps were replaced with ones that required a yellow filter.

jayknight
0 replies
7h58m

On a trip to San Diego, I ran to a grocery store one night and had a hard time finding my car because of lighting like that; I couldn't tell what any of the colors were. Is that still common out west?

stolsvik
0 replies
4h33m

A bit light on the detail of how the standard green-screen process works in post: I'd have preferred a deeper dive into the problems one encounters, and then a harder compare-and-contrast with how this became easier with the sodium lights. I think that was glossed over too fast. Otherwise really interesting!

sschueller
0 replies
5h5m

What I find incredible is that this was done with film without modern tools such as those light meters and a preview mask on an LCD screen.

They had to do a test and then process the film for each adjustment. An extremely tedious process, and that doesn't include the work required to produce the prism.

scotty79
0 replies
2h49m

Narrow band amber LED would probably not be terrible for this application.

Or even magenta ones with different filters.

garaetjjte
0 replies
1h36m

I've wondered before how feasible it would be to manufacture digital camera sensors with a modified color filter mosaic, so instead of the usual Bayer RGBG pattern you would have an RGB+589nm mosaic.
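
As an idle sketch, here's what reading such a sensor might look like, assuming the 589nm "key" photosite replaces the second green in each 2x2 cell (the layout and names are made up for illustration):

    import numpy as np

    def split_rgbk_mosaic(raw):
        """Split a hypothetical RGB+589nm mosaic into color planes and a key plane.

        Assumed 2x2 cell layout (for illustration only):
            R  G
            K  B
        where K is the 589nm photosite that replaced the second green.
        """
        r = raw[0::2, 0::2]
        g = raw[0::2, 1::2]
        k = raw[1::2, 0::2]   # narrowband key channel: bright only where the backdrop shows
        b = raw[1::2, 1::2]
        return (r, g, b), k

    def key_to_alpha(k):
        """Backdrop bright -> alpha 0, subject dark -> alpha 1."""
        k = k.astype(np.float32)
        k = (k - k.min()) / (k.max() - k.min() + 1e-6)
        return 1.0 - k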