A physicist does what physicists do, and analyzes them: The Quantum Physics Of Vampires
Like a lot of things these days, it started with a tweet, by Zach Weinersmith:
"The fact that vampire skin can tell candle light from sunlight HAS to violate quantum mechanics. I just need to figure out how..." -- Zach Weinersmith (@ZachWeiner), March 27, 2018
And since then, I've spent an inordinate amount of time thinking about the quantum physics of vampires... Having put that time in, though, I might as well get a blog post out of it. Hey, I might as well get one out of it too.
The central physics idea that I'm about to grossly overthink is that vampires are somehow distinguishing sunlight from other forms of light. They're perfectly capable of appearing in brightly lit rooms to attack ordinary humans, but sunlight reduces them to ash in seconds. But in physics terms, one photon is just like another. So what could possibly distinguish sunlight from other forms of light?
Most sources that generate a significant amount of light are either thermal sources or atomic line sources. A thermal source is just an object that's emitting light because it's very hot-- the heating element in a toaster, say, or the filament of an incandescent bulb. An atomic line source, on the other hand, consists of a collection of atoms of a particular element that are then induced to emit light at one of the characteristic frequencies associated with those atoms-- a neon light, or those yellowish sodium-vapor streetlights, say. For these purposes, lasers are a special case of an atomic line source-- they emit only a single narrow range of frequencies (though in the case of semiconductor lasers, these aren't actually coming from atomic states).
So, if you're looking for a distinction between sunlight and candlelight (as Zach originally noted) or sunlight and an incandescent bulb (for more modern vampires), the key distinction between them is the temperature. A candle flame is pretty hot in human terms, but only around 2000K (reminder: Kelvin temperatures are measured starting at absolute zero, and a one-kelvin step is the same size as one degree Celsius; room temperature is a little bit less than 300K), while a really hot light bulb filament might hit 3000K. The Sun's spectrum closely matches a black-body at something like 5600K.
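To put numbers on those temperatures, here's a quick back-of-the-envelope sketch (the function name and the three example temperatures are my choices, not from the post) using Wien's displacement law, which says the peak of the black-body spectrum sits at a wavelength inversely proportional to the temperature:

```python
WIEN_B = 2.898e-3  # Wien displacement constant, in meter-kelvins

def peak_wavelength_nm(T):
    """Wavelength of peak black-body emission (Wien's law), in nanometers."""
    return WIEN_B / T * 1e9

for name, T in [("candle flame", 2000), ("bulb filament", 3000), ("Sun", 5600)]:
    print(f"{name} ({T} K): spectrum peaks near {peak_wavelength_nm(T):.0f} nm")
```

The candle's spectrum peaks way out around 1450 nm in the infrared, the filament near 970 nm, and the Sun around 520 nm, right in the middle of the visible band, which is the quantitative version of "red, then yellow, then white" in the next paragraph.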
What's the difference between these? Well, the peak of the black-body spectrum shifts toward shorter wavelengths as the temperature increases, which is why objects being heated glow first a dull red, then yellow, then white. So sunlight has a lot more short-wavelength radiation than candlelight or incandescent bulbs-- really a lot more, because the drop-off at the short wavelength end of the spectrum is extremely rapid. The spectrum of the sun extends well into the ultraviolet, while candles and light bulbs produce next to no UV light.
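Just how rapid that short-wavelength drop-off is can be checked numerically. This is my own sketch, not from the post: it integrates the dimensionless Planck distribution to get the fraction of total black-body power emitted below 400 nm (roughly the violet/UV edge) for each source temperature:

```python
import math

def uv_fraction(T, lam_cut=400e-9):
    """Fraction of black-body power emitted at wavelengths below lam_cut.
    Uses fraction = (15/pi^4) * integral from x0 to infinity of x^3/(e^x - 1) dx,
    where x0 = h*c / (lam_cut * k_B * T)."""
    h, c, kB = 6.626e-34, 2.998e8, 1.381e-23
    x0 = h * c / (lam_cut * kB * T)
    # crude trapezoid rule; plenty accurate for an order-of-magnitude comparison
    n, xmax = 20000, 60.0
    dx = (xmax - x0) / n
    total = 0.0
    for i in range(n + 1):
        x = x0 + i * dx
        weight = 0.5 if i in (0, n) else 1.0
        total += weight * x**3 / math.expm1(x)
    return (15 / math.pi**4) * total * dx

for T in (2000, 3000, 5600):  # candle, filament, Sun
    print(f"{T} K: fraction of power below 400 nm = {uv_fraction(T):.1e}")
```

A 5600 K black-body puts roughly a tenth of its power below 400 nm, a 3000 K filament only a few parts in a thousand, and a 2000 K candle flame a few parts in a hundred thousand-- thousands of times less than sunlight, which is the "next to no UV" claim in concrete terms.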
So, it might be just the ultraviolet light that's the problem-- one series of vampire novels by Charlie Huston has vampires exposed to sunlight dying from extremely rapid cancers caused by UV light, which is a nod toward this feature. But, of course, if UV alone were the culprit, that would present another problem-- as we know from modern vampire movies, they frequently hunt on the dance floors of nightclubs, and it's a rare nightclub that doesn't feature some "black lights" bathing the crowd in ultraviolet radiation...
So, how could you distinguish sunlight from a black light source? Well, the sun emits light over a huge range of wavelengths, where "black lights" tend toward the atomic line source end of things, so maybe you need both short-wavelength radiation and long-wavelength radiation at the same time. Maybe, vampires in sunlight are victims of a two-photon process.
For vampires to be sensitive to sunlight specifically, you might imagine some process involving two photons of different invisible-to-humans wavelengths, one in the ultraviolet and one in the infrared. The sun produces huge amounts of infrared radiation, but many human light sources do not, opting instead to optimize the amount of visible light emitted. The likelihood of this process would depend very strongly on the intensity of the light-- it would go like the product of the intensity at each of the relevant wavelengths, so cutting the intensity in half would reduce the rate of two-photon absorption by a factor of four. This would explain why vampires are sensitive to sunlight, but can strike dramatic poses in the light of the full moon, which is, after all, just reflecting light from the Sun-- the many-times smaller amount of light from the moon has all the right wavelengths, but not enough intensity to be a problem. I'm glad we got that worked out. But I think it's solar neutrinos.
So, there's my crazy retcon for why sunlight specifically, but not candles, incandescent bulbs, fluorescent bulbs, and black lights in nightclubs: something about the process that reanimates them is disrupted by a two-photon process involving both the long and short wavelengths found in sunlight. It's a theory that makes fairly specific predictions, so as I said on Twitter, now we just need to collect a bunch of vampires and put it to the test...
You know as well as I do that vampires dying in sunlight is overhyped.
I know you read the original Dracula. English sunlight weakened him; it did not kill him.