ABSTRACT: Graham’s ratio (GR) expresses the amount of carbon monoxide produced in proportion to the amount of oxygen consumed by the coal. It is a useful indicator for coal mines to determine the level of coal oxidation and to respond accordingly in the event of spontaneous combustion. The intensity of the coal reaction is related to the carbon monoxide produced and the oxygen consumed (the oxygen deficiency). Graham’s ratio is particularly important because it is often used as a trigger for Trigger Action Response Plans (TARPs) in the management of spontaneous combustion. Samples with a composition similar to air may return a negative or minuscule measured oxygen deficiency that is unsuitable for calculating Graham’s ratio. The same problem occurs in samples diluted with seam gas, or when there are inaccuracies in other measured components and nitrogen is calculated by difference rather than measured directly. When the oxygen deficiency is too small, the GR result can be overestimated and falsely trigger a TARP level. Some mine sites have introduced a filter on the minimum oxygen-deficiency value to avoid alarm “fatigue” for the Control Room Operator (CRO). In some cases, however, this minimum value is not suitable and valid oxygen-deficiency values have been filtered out. This paper presents case studies in which the filter value was adjusted to suit the mine site’s actual data and analysis technique.
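The idea described above can be sketched as a small calculation, using the commonly cited form of Graham’s ratio in which the oxygen deficiency is estimated from nitrogen (O2 deficiency ≈ 0.2648 × N2 − O2, from the fresh-air O2/N2 proportion 20.93/79.04). The function name and the filter value of 0.3% are illustrative assumptions, not values taken from the paper; each mine site sets its own filter threshold.

```python
def grahams_ratio(co, o2, n2, min_o2_deficiency=0.3):
    """Graham's ratio (%) = 100 * CO / O2 deficiency.

    Gas concentrations are in volume %. The oxygen deficiency is
    estimated from nitrogen (0.2648 ~ 20.93/79.04), which is the
    case discussed in the abstract where N2 may itself be
    calculated by difference rather than measured directly.

    Returns None when the deficiency falls below the filter value,
    mirroring the minimum oxygen-deficiency filter some mine sites
    apply to avoid spurious TARP triggers (0.3% is an assumed,
    site-dependent threshold).
    """
    o2_deficiency = 0.2648 * n2 - o2
    if o2_deficiency < min_o2_deficiency:
        return None  # filtered: GR would be unreliable or overestimated
    return 100.0 * co / o2_deficiency

# A sample close to fresh air: tiny deficiency, so GR is filtered out.
print(grahams_ratio(co=0.001, o2=20.8, n2=79.0))  # None
# An oxidising atmosphere: deficiency is large enough to report GR.
print(grahams_ratio(co=0.01, o2=18.0, n2=79.5))
```

Without the filter, the first sample would divide a small CO reading by a near-zero (or negative) deficiency and return a misleadingly large (or meaningless) ratio, which is exactly the overestimation problem the paper describes.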