Relative Humidity vs Dew Point

There are two main ways to describe the amount of moisture in the atmosphere. Relative humidity is described in percentages and is the most common way you'll see moisture described in the media and weather reports. Dew point, though, is the measure most commonly used by weather enthusiasts and storm chasers. So what are the differences between the two, and why do storm chasers typically use dew point?

First we need to look at the definitions of relative humidity and dew point.

Wet and dry bulb thermometers used for measuring humidity. (Source: Wikimedia Commons - Edal Anton Lefterov)

Wikipedia defines relative humidity as the ratio of the partial pressure of water vapour to the saturated vapour pressure of water at a prescribed temperature. What does that actually mean, though? Partial pressure is the pressure a gas would exert if it were the only gas in the volume it occupies. So if you took a jar of air and removed everything except the water vapour, the pressure left in that jar would be the partial pressure of water vapour. Now if we increased the amount of water vapour in the jar to the maximum amount that could be present and still remain a vapour, the pressure in the jar would be the saturated water vapour pressure. That maximum varies with temperature.

So relative humidity can loosely be defined as the percentage of moisture in the air out of the maximum amount the air can hold at its current temperature and pressure. So 50% humidity at 20C means the air is holding half the water vapour it could hold at 20C. The interesting thing, though, is that at 30C the air can hold much more moisture, so 50% humidity at 30C means there is a lot more water vapour.
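As a rough sketch of this, we can use the Magnus approximation for saturation vapour pressure (a standard approximation; the exact constants vary slightly by source) to compare how much vapour 50% relative humidity actually represents at two temperatures:

```python
import math

def saturation_vapour_pressure(temp_c):
    """Approximate saturation vapour pressure in hPa (Magnus formula)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

# The same 50% relative humidity at two different temperatures:
for t in (20.0, 30.0):
    actual = 0.5 * saturation_vapour_pressure(t)
    print(f"{t:.0f}C at 50% RH -> vapour pressure {actual:.1f} hPa")
```

The warmer air at the same relative humidity holds roughly 80% more water vapour, which is exactly the point: the percentage alone doesn't tell you how much moisture is there.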

What about dew point? Wikipedia defines dew point as the temperature at which, for a constant pressure, water will condense from the air at the same rate as it evaporates. At any temperature below the dew point, water condenses out of the air, and when it accumulates on an object it's called dew. So instead of being a relative measure of the moisture in the air compared with how much the air can hold, dew point is an absolute measure of how much moisture is in the air: the higher the dew point, the more moisture there is. A dew point of 20C always represents the same amount of moisture, regardless of the air temperature. The closer the dew point is to the temperature, the higher the relative humidity.
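To illustrate the "absolute" part: a fixed dew point pins down the actual vapour pressure, so as the air temperature changes, the relative humidity changes while the moisture content stays the same. A small sketch, again using the Magnus approximation:

```python
import math

def saturation_vapour_pressure(temp_c):
    """Approximate saturation vapour pressure in hPa (Magnus formula)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

# A dew point of 20C fixes the actual vapour pressure,
# whatever the air temperature is:
vapour_pressure = saturation_vapour_pressure(20.0)

for air_temp in (25.0, 35.0):
    rh = 100.0 * vapour_pressure / saturation_vapour_pressure(air_temp)
    print(f"air {air_temp:.0f}C, dew point 20C -> RH {rh:.0f}%")
```

Same moisture in both cases, yet the relative humidity drops from roughly 74% to roughly 42% just because the air warmed up.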

So why use one over the other? It comes down to the absolute versus relative measure. A relative humidity of 50% might sound like a dry day, but 50% RH at 35C corresponds to a dew point of 23C, which is actually a good amount of surface moisture for producing thunderstorms. At a glance, the dew point tells meteorologists and storm chasers more about the state of the atmosphere than relative humidity does, and that's why we use it.
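That 50%-at-35C figure can be checked by inverting the Magnus approximation to get dew point from temperature and relative humidity (one common parameterisation of the formula; constants vary slightly by source):

```python
import math

def dew_point(temp_c, rh_percent):
    """Approximate dew point in C from air temperature and RH (Magnus formula)."""
    a, b = 17.62, 243.12
    gamma = math.log(rh_percent / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

print(round(dew_point(35.0, 50.0)))  # prints 23
```

At 100% relative humidity the dew point equals the air temperature, which is a handy sanity check for any implementation like this.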