Understanding Dew Point and Relative Humidity

Among the general public, there is much confusion when it comes to dew point and relative humidity. Television forecasters are using the term dew point more and more these days, but unfortunately without enough explanation. If you can follow everything you are about to read, you'll have a solid handle on dew point versus relative humidity. It's really not that difficult; it's the lack of explanation that makes the subject seem so hard to grasp.

First, here are the more technical, official, straight-to-the-point definitions of each term from NOAA/NWS, taken from one of their glossaries.

  • Relative Humidity - A dimensionless ratio, expressed in percent, of the amount of atmospheric moisture present relative to the amount that would be present if the air were saturated. Since the latter amount is dependent on temperature, relative humidity is a function of both moisture content and temperature. As such, relative humidity by itself does not directly indicate the actual amount of atmospheric moisture present. See dew point.
  • Dew Point (or Dew-point Temperature) - A direct measure of atmospheric moisture. It is the temperature to which air must be cooled in order to reach saturation (assuming air pressure and moisture content are constant).
    Now here is a more understandable, detailed, and simplified (albeit more lengthy) explanation, along with some examples. Humidity, or invisible moisture in gaseous form (water vapor) in the air, can be measured by several values... each telling something a little different about atmospheric moisture. However... there is only one value that gives a true, exact measure of moisture in the air at any given time or location using just that one indicator. This is what is known as the dew point.
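
    To tie the two definitions above together numerically, here is a small Python sketch (purely illustrative, not an official NOAA/NWS formula) that estimates relative humidity from a temperature and dew point. It uses the widely known Magnus approximation for saturation vapor pressure; the coefficients below are one common set, and the function names are simply made up for this example.

      import math

      def saturation_vapor_pressure(temp_c):
          # Approximate saturation vapor pressure (hPa) over water using the
          # Magnus formula; other references use slightly different constants.
          return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

      def relative_humidity(temp_f, dewpoint_f):
          # Relative humidity (%) is the ratio of the vapor pressure actually
          # present (the saturation pressure at the dew point) to the
          # saturation pressure at the current air temperature.
          temp_c = (temp_f - 32) * 5 / 9
          dewpoint_c = (dewpoint_f - 32) * 5 / 9
          return 100 * saturation_vapor_pressure(dewpoint_c) / saturation_vapor_pressure(temp_c)

      print(round(relative_humidity(75, 55)))  # about 50 (percent)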

    Dew point is a direct measure of the amount of moisture present in the air, measured in degrees just like the actual air temperature. Dew point is an irreplaceable, very useful value to any forecaster. Forecasts rely heavily on dew point values... from the surface up through thousands of feet into the atmosphere. Previously used primarily behind the scenes, dew point has only in recent years become a "public" value shown on television weather segments. The dew point temperature reflects increases and decreases in moisture in the air. It also shows at a glance, under the same atmospheric conditions, how much cooling of the air is required to condense moisture from the air... in other words, when the air would become saturated and fog, dew, frost, clouds, etc., could form.

    Relative humidity, expressed as a percentage, is just what its name implies - relative. It is relative to the air temperature. The relative humidity of a volume of air is the ratio between the water vapor actually present and the water vapor necessary for saturation at a given temperature. Essentially, it says how near the air is to being saturated. Relative humidity shows the degree of saturation, but it gives no clue to the actual amount of water vapor in the air. (For reference, water vapor is a totally invisible, gaseous form of water.) For example, with a 50% relative humidity you would know that, at that exact temperature and that exact time, the air is halfway to being saturated. However, that gives no indication whatsoever of how much moisture is present in the air... it simply tells you that the air is halfway to being completely saturated, or in other words that the air contains half of the water vapor it is capable of holding at that temperature. If the air temperature goes up or down at any time (and on most days the temperature is always changing)... the relative humidity will also change, yet the moisture in the atmosphere has not changed at all. Saturation occurs when the air temperature and dew point are exactly equal, making the relative humidity 100%. At that point, you know the air cannot hold any more moisture. BUT, you still have no idea how much moisture is in the atmosphere!
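
    To make that point concrete, here is a rough sketch (same Magnus approximation as above, with a made-up function name) that works backward from a relative humidity to a dew point. Two airmasses can both sit at 50% relative humidity yet contain very different amounts of moisture:

      import math

      def dewpoint_from_rh(temp_f, rh_percent):
          # Estimate the dew point (F) from temperature (F) and relative
          # humidity (%) by inverting the Magnus approximation; values are rough.
          temp_c = (temp_f - 32) * 5 / 9
          # Actual vapor pressure = RH fraction times saturation pressure at temp_c.
          vapor_pressure = (rh_percent / 100) * 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))
          gamma = math.log(vapor_pressure / 6.112)
          dewpoint_c = 243.5 * gamma / (17.67 - gamma)
          return dewpoint_c * 9 / 5 + 32

      print(round(dewpoint_from_rh(40, 50)))  # about 23F -- 50% RH, but very dry air
      print(round(dewpoint_from_rh(90, 50)))  # about 69F -- 50% RH, but plenty of moisture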

    Perhaps this example will help you understand how relative humidity is relative to air temperature. Picture a glass of water that is half full - the relative humidity would be 50%. If the glass is three-quarters full, the relative humidity is 75%. The relative humidity depends on how tall the glass is and how full it is. Imagine the surface of the water in the glass as the dew point... and the very top of the glass itself as the current temperature. How full the glass is - the water versus the empty space above it - represents the relative humidity, because relative humidity reflects how close the dew point is to the temperature. The relative humidity increases if the dew point rises toward the temperature OR if the temperature falls closer to the dew point. As is often the case, it is the temperature that is changing all the time... so the relative humidity depends on the air temperature much of the time. Imagine the falling temperature as the glass getting shorter and shorter. The empty space in the glass shrinks, and the relative humidity increases as the temperature gets closer to the dew point. If the glass keeps getting shorter, eventually the top of the glass reaches the surface of the water (the dew point). The glass can now hold no more water - it is saturated (condensation occurs, where invisible gaseous water vapor turns into visible liquid water), with a relative humidity of 100% and an equal temperature and dew point. So the relative humidity increased from 50% to 100% as the temperature decreased (as the glass got shorter)... yet the water in the glass (the dew point) never changed!

    Theoretically, the air can be saturated at any temperature... 10F... 50F... 70F, etc. However, when the air is saturated at 10F versus saturated at 70F... there is significantly more moisture present in the air at 70F. Of note, the air temperature can never be lower than its dew point (or the dew point higher than the temperature, same thing)... for the simple fact that when the temperature reaches the dew point the air is completely saturated. In technical terms, saturation is where the amount of water condensing is exactly equal to the amount of water evaporating - any water molecule escaping from a water surface is immediately replaced by another condensing out of the air around it. For example, if you took a damp towel and put it outside in a saturated atmosphere, it would never dry as long as the air stayed saturated.
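
    A rough back-of-the-envelope comparison shows just how different saturated 10F air and saturated 70F air really are. The sketch below (assuming standard sea-level pressure, with the same Magnus approximation as before) estimates the saturation mixing ratio - the grams of water vapor per kilogram of dry air at saturation:

      import math

      SEA_LEVEL_PRESSURE_HPA = 1013.25  # assumed standard pressure

      def saturation_mixing_ratio(temp_f):
          # Approximate grams of water vapor per kilogram of dry air at saturation.
          temp_c = (temp_f - 32) * 5 / 9
          es = 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))  # hPa
          return 622 * es / (SEA_LEVEL_PRESSURE_HPA - es)           # g/kg

      print(round(saturation_mixing_ratio(10), 1))  # roughly 1.5 g/kg
      print(round(saturation_mixing_ratio(70), 1))  # roughly 15.8 g/kg

    In other words, saturated 70F air contains roughly ten times the water vapor of saturated 10F air.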

    Relative humidity will often - almost always, in fact - be highest in the early morning when the temperature is at its lowest... at or near the dew point. Once the sun rises, through the morning and afternoon the temperature will climb away from its dew point (going slightly off subject, this is why most fog dissipates). Assuming the atmosphere does not undergo any major changes during the day (such as frontal passages, mixing of drier or moister air aloft down to the surface, etc.), the dew point will stay around the same throughout the day, which means the moisture content has not changed. However... with every degree the temperature rises away from its dew point... the relative humidity will fall. The amount of heating that occurs during the day depends on several factors, one of the most important being the amount and duration of sunshine. On a clear night followed by a clear, sunny day with no major atmospheric changes... it is normal for the relative humidity to range from around 100% early in the morning, just before and around sunrise... to less than 50% during the maximum heating of the afternoon. On some days, relative humidity can drop as low as 20% or so (from the near-100% early morning value)... give or take a little depending on the exact temperature and dew point. That can be a change of 70-80 percentage points in relative humidity during the course of a 12-hour (or shorter) period. Yet... the moisture content of the atmosphere remained the same the entire time! It's simply because the dew point did not change, which we have already learned is the direct measure of moisture. When the relative humidity changes so greatly in a day - or at all, for that matter - it's because of a change in temperature, and you are seeing how close the air is to being saturated at a given temperature... not how much moisture is in it. Below are some examples of temperature, relative humidity, and dew point.
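
    The daily swing described above is easy to sketch with made-up numbers. Suppose the dew point holds steady at 55F while the temperature climbs from 57F around sunrise to 85F in mid-afternoon; the relative humidity (same Magnus approximation as earlier) falls from the 90s to the 30s even though the moisture never changes:

      import math

      def relative_humidity(temp_f, dewpoint_f):
          # RH (%) from temperature and dew point (F), Magnus approximation.
          to_c = lambda f: (f - 32) * 5 / 9
          es = lambda t_c: 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))
          return 100 * es(to_c(dewpoint_f)) / es(to_c(temp_f))

      dewpoint_f = 55  # held constant all day in this made-up example
      for hour, temp_f in [(6, 57), (9, 68), (12, 78), (15, 85)]:
          print(f"{hour:02d}:00  temp {temp_f}F  RH {relative_humidity(temp_f, dewpoint_f):.0f}%")
      # Prints roughly 93%, 63%, 45%, 36% -- a big swing with no change in moisture.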

  • Case 1) Temp = 90F Dwpt = 50F RH = 26%
  • Case 2) Temp = 60F Dwpt = 50F RH = 70%
  • Case 3) Temp = 85F Dwpt = 74F RH = 70%
  • Case 4) Temp = 76F Dwpt = 74F RH = 94%
  • Case 5) Temp = 100F Dwpt = 70F RH = 38%
  • Case 6) Temp = 85F Dwpt = 57F RH = 38%
  • Case 7) Temp = 25F Dwpt = 25F RH = 100%

    In cases 1 and 2... the dew point remains the same (which means the moisture content stays the same). However, when we drop the air temperature 30 degrees (from case #1 to case #2)... much closer to its dew point... the relative humidity increases dramatically. The moisture has not changed at all... the air is simply much closer to being saturated in #2 than it was in #1.

    Now compare cases 2 and 3. It is the reverse of what we just discussed - the relative humidity does not change, but the dew point increases dramatically. So there is significantly more moisture in case #3 than in #2. Then look at #4... where the dew point remains the same as in #3, but the relative humidity increases as the temperature falls closer to the dew point, thus closer to saturation. As stated above... air can be saturated at any temperature... but the amount of moisture present at that saturation point can vary greatly from instance to instance. Saturated air simply means the air is holding the most water vapor it possibly can at that time... whether it is 10F or 70F. That alone gives you no indication of the amount of moisture in the air. Looking at the dew point tells you instantly, and exactly, how much moisture is present.

    Case 5 is just another example of very low relative humidity paired with a very high moisture content in the air. Case 6 shows the same relative humidity, but a significant drop in moisture as the dew point falls quite a bit from #5. Case #7 is one more example of 100% relative humidity... yet a very dry airmass.
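
    For anyone who wants to check the seven cases above, the sketch below recomputes each relative humidity with the same Magnus approximation used earlier; the results land within about a percentage point of the listed values (different coefficient sets round slightly differently):

      import math

      def relative_humidity(temp_f, dewpoint_f):
          # RH (%) from temperature and dew point (F), Magnus approximation.
          to_c = lambda f: (f - 32) * 5 / 9
          es = lambda t_c: 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))
          return 100 * es(to_c(dewpoint_f)) / es(to_c(temp_f))

      cases = [(90, 50), (60, 50), (85, 74), (76, 74), (100, 70), (85, 57), (25, 25)]
      for number, (temp_f, dewpoint_f) in enumerate(cases, start=1):
          print(f"Case {number}: {relative_humidity(temp_f, dewpoint_f):.0f}%")
      # Prints about 25, 69, 70, 94, 38, 39, 100 -- within a point of the values above.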

    Now, after all this, you should realize that dew point is directly related to "mugginess", or the lack thereof. Yes... the higher the dew point, the more moisture in the atmosphere. But you will only begin to feel muggy and sticky when the dew point reaches a certain level (below that, the air is just too dry to feel muggy at all). This level is slightly different for every person... due to skin moisture, sweat rate, etc. But a general guide follows below:

  • 50's = This is the range where a difference of 5 degrees or so can make a comfort difference for some people. A dew point of 59F would feel a bit more muggy than 54F, but nothing even close to excessive.

  • 60's = As dew points increase up through the 60s... a notable increase in stickiness is felt. For most people... by the time the upper 60s are reached, it will feel quite muggy, or even very muggy.

  • 70's = When dew points reach into the 70s you know there is a lot of moisture in the atmosphere. Everyone will feel extremely muggy, sticky, and steamy. Dew points high into the 70s are rare for most areas.

  • 80+ = Dew point values at or above 80 are extremely rare in most parts of the world, and very dangerous. However, in the famed Midwest heat wave of July 1995 (during which over 1,000 people died as a direct result, many in Chicago, along with tens of thousands of animals), dew points surged into the upper 70s and lower 80s. The most moist place in the world is claimed by the Ethiopian Red Sea coastline... where the average June dew point is 84F!

    Of course, the amount of moisture present in the air is significantly impacted by the season. Cold air cannot hold as much moisture as warm air, so a cold airmass will typically be drier than a warm one. After going through winter... or during a warm-up within winter... you would start to feel the increased moisture in the air once dew points got into the 50s. Not dramatically, but most people will feel the increase. On the other hand... after a summer heat wave with dew points in the 70s... a cold front ushering in dew points in the 50s would feel refreshingly dry. Your body gets used to a prolonged period of low or high moisture during a season... and a sudden change is felt very much by your body.
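
    As a closing illustration, the rough comfort guide above can be written as a small helper function. The breakpoints below are simply the informal ones from this section (they are not an official scale, and individual tolerance varies):

      def mugginess(dewpoint_f):
          # Rough comfort description for a dew point (F), following the
          # informal ranges above; every person's threshold differs a bit.
          if dewpoint_f >= 80:
              return "extremely rare, dangerous moisture levels"
          if dewpoint_f >= 70:
              return "extremely muggy, sticky, and steamy"
          if dewpoint_f >= 60:
              return "noticeably muggy, very muggy by the upper 60s"
          if dewpoint_f >= 50:
              return "a touch humid, but nothing close to excessive"
          return "too dry to feel muggy at all"

      print(mugginess(74))  # extremely muggy, sticky, and steamy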

    That about concludes this section, and hopefully it clears up all the confusion about temperature, dew point, and relative humidity relations. Yes, there are numerous other measures of atmospheric moisture... such as Wet Bulb Temperature, Precipitable Water... Humidity Mixing Ratio... and Specific and Absolute Humidity, to name some. These are used primarily for forecasting purposes, however... generally values the public would not have much use for.