I get the sense that applications with true realtime requirements are generally demanding enough that they cannot tolerate even a remote possibility of failure. Think avionics, medical devices, automotive, and military applications.
If you really need realtime, then you really need it, and "close enough" doesn't exist.
This is just my perception as an outsider though.
I wonder if this getting fixed will end up displacing a notable share of made-for-realtime hardware/software combos, especially now that there are lots of cheap, relatively low-power, high-clock-rate ARM and x86 chips to choose from. With clock rates that high, perfect realtime matters less, since you'd often have plenty of cycles to spare to absorb a miss.
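To put rough numbers on that cycles-to-spare point, here's a quick back-of-envelope sketch; the clock rate, deadline, and cycle count are all assumptions I made up for illustration, not measurements:

    # Rough back-of-envelope in Python; every number here is a
    # made-up assumption for illustration, not a measured figure.
    clock_hz = 1_500_000_000   # assume a 1.5 GHz commodity ARM core
    deadline_s = 0.001         # assume a 1 ms deadline (a 1 kHz control loop)
    task_cycles = 50_000       # assume worst-case cycles the task needs

    budget = clock_hz * deadline_s   # cycles available per deadline
    slack = budget - task_cycles     # headroom for jitter, cache misses, etc.
    print(f"budget: {budget:,.0f} cycles, slack: {slack / budget:.1%}")

Even with those generous assumptions, the point stands: a fast commodity core can leave enormous slack inside a millisecond-scale deadline, so an occasional scheduling hiccup has room to hide.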
I understand it's less elegant, efficient, etc. But sometimes commodity wins over correctness.