A number of indicators are used to assess the level of coal oxidation in underground coal mines. One of the most common, Graham's ratio, expresses the amount of carbon monoxide produced in proportion to the amount of oxygen consumed by the coal (the oxygen deficiency). The more carbon monoxide produced relative to the oxygen consumed, the greater the intensity of the coal's oxidation reaction. Graham's ratio is often used as a trigger in Trigger Action Response Plans (TARPs) for the management of spontaneous combustion. This emphasises the importance of accurate measurement of the oxygen deficiency, and hence the ability to reliably determine the status of the underground atmosphere. Samples with a composition similar to fresh air may return a negative or minuscule measured oxygen deficiency that is unsuitable for Graham's ratio. The same problem arises in samples diluted with seam gas, or when inaccuracies in other measured components propagate into nitrogen calculated by difference. If the oxygen deficiency is too small, the Graham's ratio result can be overestimated and falsely trigger a TARP level. Conversely, if the minimum oxygen deficiency limit is set too high in order to avoid alarm "fatigue", valid data may be excluded from interpretation. The optimal value indicating the onset of a spontaneous combustion event is site specific. This paper presents case studies in which the oxygen deficiency minimum limit has been adjusted to suit the mine site's actual data and analysis technique.