Nuclear magnetic resonance (NMR) well logging is a well-established technique for in-situ fluid measurement. It is, however, not routinely used in the mining industry to quantify water content in high-magnetic-susceptibility iron ore rock cuttings because the ore's magnetic properties adversely affect the measurement. In this study the relationship between NMR signal intensity and magnetic susceptibility is examined using a low-magnetic-field (2 MHz) benchtop NMR instrument. Magnetic susceptibility is commonly measured during well logging protocols and could therefore be used to correct, and hence render quantitative, NMR measurements of moisture content in iron ore. To obtain such a correlation, the NMR signal from water-saturated synthetic samples (magnetite, maghemite, or hematite mixed with borosilicate glass beads) is measured as a function of iron content and magnetic susceptibility. The correlation is then applied to iron ore rock cuttings. Comparison with gravimetric measurements shows that the standard error of estimate of water content across all samples tested is reduced from 6.4 wt% without correction to 1.5 wt% with correction, corresponding to a relative error reduction from 47% to 11%. An additional comparison with whole rock cores shows that a higher magnetic susceptibility threshold for applying the corrections is required for whole cores than for rock cuttings; this difference requires further study. These results encourage the broad use of such correlations for NMR measurements of iron-rich cuttings and the application of NMR well logging to iron ore deposits.