Difference Between Gauge and Absolute Pressure Measurement
Pressure can be described as the force applied per unit area. There are several different references for pressure measurement, with absolute pressure and gauge pressure being two of the most common. The differences between these two references have significant effects on how each is used and measured. Depending on why you are measuring pressure, determining whether you need a gauge or absolute reference is as important as selecting the pressure range itself, particularly at low pressures. Choosing the wrong reference can introduce large errors into your measurements.
The simplest way to explain the difference between the two is that absolute pressure uses a perfect vacuum (absolute zero) as its zero point, while gauge pressure uses atmospheric pressure as its zero point. Because atmospheric pressure varies, a gauge pressure reading is made against a shifting reference, while an absolute pressure reading is always made against a fixed, unchanging zero.
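The relationship between the two references can be sketched in a few lines of Python. The helper names and the sea-level standard atmosphere value (101325 Pa) are illustrative assumptions, not part of any particular sensor's API:

```python
# Relationship between the two references:
#   P_absolute = P_gauge + P_atmospheric
# (hypothetical helper functions for illustration)

STANDARD_ATMOSPHERE_PA = 101_325  # sea-level standard atmosphere, in pascals


def gauge_to_absolute(p_gauge_pa, p_atm_pa=STANDARD_ATMOSPHERE_PA):
    """Convert a gauge reading to an absolute pressure."""
    return p_gauge_pa + p_atm_pa


def absolute_to_gauge(p_abs_pa, p_atm_pa=STANDARD_ATMOSPHERE_PA):
    """Convert an absolute reading to a gauge pressure."""
    return p_abs_pa - p_atm_pa
```

A gauge sensor open to the atmosphere reads 0, which converts to one standard atmosphere absolute; the same conversion run in reverse recovers the gauge value.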
The most common pressure reference is gauge pressure, signified by a ‘g’ after the pressure unit, e.g. 30 psig. Gauge pressure is measured relative to ambient atmospheric pressure, so changes in atmospheric pressure due to weather conditions or altitude directly influence the output of a gauge pressure sensor. A gauge pressure higher than ambient pressure is referred to as positive pressure; a measured pressure below atmospheric pressure is called negative pressure or vacuum gauge pressure.
Gauge pressure sensors usually have only one pressure port. The ambient air pressure is directed through a vent hole or a vent tube to the back of the sensing element. A vented gauge pressure transmitter exposes the outside air pressure to the negative side of the pressure sensing diaphragm, so it always measures with reference to the ambient barometric pressure. A vented gauge pressure sensor therefore reads zero when the process pressure connection is held open to atmospheric air.
A sealed gauge reference is very similar, except that atmospheric pressure is sealed on the negative side of the diaphragm. This is usually adopted in high pressure applications, such as measuring hydraulic pressures, where atmospheric pressure changes have only a slight effect on the accuracy of the sensor. Sealed gauge pressure is measured relative to whatever pressure was trapped inside the device when it was sealed; that zero point is set by the manufacturer of the sealed pressure gauge.
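A quick back-of-the-envelope calculation shows why a fixed sealed reference is tolerable at high pressures. The figures below are illustrative assumptions: a 700 bar hydraulic range and a typical weather-driven barometric swing of about 3 kPa:

```python
# Why sealed gauge is acceptable for hydraulics: a typical barometric
# swing is tiny compared with the measured range (illustrative figures).
HYDRAULIC_RANGE_PA = 70_000_000  # 700 bar, an assumed hydraulic full scale
ATM_SWING_PA = 3_000             # typical weather-driven variation, ~30 mbar

error_pct = 100 * ATM_SWING_PA / HYDRAULIC_RANGE_PA
# The reference error is only a few thousandths of a percent of full scale.
```

The same 3 kPa swing on a low-range sensor, by contrast, could swamp the measurement, which is why the choice of reference matters most at low pressures.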
Absolute pressure is measured relative to a perfect vacuum, a space containing no matter. Measurements taken in absolute pressure use this absolute zero as their reference point. The best example of an absolute referenced pressure is the measurement of barometric pressure. To produce an absolute pressure sensor, the manufacturer seals a high vacuum behind the sensing diaphragm. If you hold the process pressure connection of an absolute pressure transmitter open to the air, it will therefore read the actual barometric pressure.
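The barometric reading that an open absolute sensor reports falls off with altitude. A rough sketch of that dependence is the isothermal barometric formula, a simplification that assumes constant air temperature:

```python
import math


def barometric_pressure_pa(altitude_m, p0_pa=101_325.0, temp_k=288.15):
    """Approximate absolute barometric pressure at a given altitude.

    Uses the isothermal barometric formula, a simplification that
    assumes a constant air temperature (here 15 degrees C).
    """
    g = 9.80665     # gravitational acceleration, m/s^2
    m = 0.0289644   # molar mass of dry air, kg/mol
    r = 8.31446     # universal gas constant, J/(mol K)
    return p0_pa * math.exp(-g * m * altitude_m / (r * temp_k))
```

At sea level the formula returns the standard atmosphere; at around 1000 m it predicts roughly 90 kPa, which is why an uncorrected gauge sensor moved to altitude no longer agrees with one at sea level.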
So how do you know when to measure absolute pressure or when to measure gauge pressure?
This is not always straightforward, but generally, if you want to measure or control a pressure that is influenced by changes in atmospheric pressure, such as the level of liquid in an open tank, you would choose vented gauge pressure, as you are interested in the pressure reading minus the atmospheric pressure component.
If you want to measure pressures that are not influenced by changes in atmospheric pressure, e.g. leak testing a completely sealed, non-flexible container, you would use an absolute pressure sensor. If a gauge pressure sensor were used instead to measure the container pressure, and the barometric pressure changed, the sensor’s reading would change even though the pressure in the container remained the same.
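The leak-test scenario can be made concrete with a short sketch. The container pressure and the two barometric values below are illustrative assumptions:

```python
# A sealed rigid container holds a constant absolute pressure (no leak).
CONTAINER_ABS_PA = 150_000  # assumed container pressure, in pascals


def gauge_reading(p_abs_pa, p_atm_pa):
    """What a gauge-referenced sensor would report for this container."""
    return p_abs_pa - p_atm_pa


# The same container observed on two different days:
reading_high_atm = gauge_reading(CONTAINER_ABS_PA, 102_500)  # high-pressure day
reading_low_atm = gauge_reading(CONTAINER_ABS_PA, 99_500)    # low-pressure day

# The gauge readings differ by 3 kPa even though nothing leaked;
# an absolute sensor would report 150 kPa in both cases.
drift = reading_low_atm - reading_high_atm
```

The 3 kPa apparent "drift" comes entirely from the weather, which is exactly the false leak signal an absolute sensor avoids.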