Graham’s ratio is a commonly used indicator of the intensity of coal oxidation in underground mine atmospheres. Accurate measurement of oxygen deficiency is critical to generating valid results and meaningful data trends. Graham’s ratio is often used as a trigger in Trigger Action Response Plans (TARPs) for the management of spontaneous combustion. If Graham’s ratio is calculated where the oxygen deficiency is insufficient, the result can be overestimated and trigger a TARP level. Mitchell (1996) and the NSW Mines Rescue gas detection and emergency preparedness book (2014) have previously identified issues with using Graham’s ratio when the oxygen deficiency is less than 0.3%, due to analytical limitations. This issue is often encountered in samples whose composition is close to air, because of their low inherent oxygen deficiency; the same problem arises in samples diluted with seam gas. Errors in oxygen deficiency can be compounded by inaccuracies in other measured components when nitrogen is calculated by difference. A concern with applying the 0.3% oxygen deficiency requirement (minimum limit) to dilute or near-air samples is that valid data may be excluded from interpretation. This paper reviews the magnitude and application of the minimum oxygen deficiency required for a consistently valid measurement of Graham’s ratio, using real data from a range of samples and a number of modern analysis techniques in underground coal mines.
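The calculation and the 0.3% minimum-deficiency guard described above can be sketched as follows. This is a minimal illustration, not the authors' code: the function names are hypothetical, oxygen deficiency is computed from nitrogen-by-difference using the commonly cited fresh-air O2/N2 ratio of about 0.265 (20.93/79.04 ≈ 0.2648), and gas concentrations are assumed to be in volume percent.

```python
# O2/N2 ratio in fresh air (20.93% O2 / 79.04% N2 + inerts); a commonly
# cited constant for Graham's ratio, sometimes rounded to 0.265.
AIR_O2_N2_RATIO = 0.2648


def oxygen_deficiency(o2_pct, n2_pct):
    """Oxygen deficiency (%): O2 expected from the air (N2) fraction
    minus the O2 actually measured in the sample."""
    return AIR_O2_N2_RATIO * n2_pct - o2_pct


def grahams_ratio(co_pct, o2_pct, n2_pct, min_deficiency=0.3):
    """Graham's ratio = 100 * CO / oxygen deficiency.

    Returns None when the oxygen deficiency falls below the minimum
    limit (default 0.3%), where analytical error makes the ratio
    unreliable and prone to overestimation."""
    od = oxygen_deficiency(o2_pct, n2_pct)
    if od < min_deficiency:
        return None  # insufficient oxygen deficiency: result not valid
    return 100.0 * co_pct / od
```

A sample very close to air (e.g. 20.8% O2, 79.04% N2) yields a deficiency of roughly 0.13%, so the guard rejects it rather than returning an inflated ratio, which is exactly the overestimation concern raised above.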