The clay mineralogy rather than the clay content determines radiocaesium adsorption in soils
Abstract. The transfer of radiocaesium (137Cs) from soil to crops is the main long-term radiation risk after nuclear accidents. The prevailing concept is that 137Cs sorption in soil, and hence its bioavailability, is controlled by the soil clay content (0–2 µm). This study tested that assumption using 24 soils collected worldwide. The Radiocaesium Interception Potential (RIP), i.e., the capacity for selective 137Cs adsorption, was measured for the bulk soils and for their clay and silt fractions. The RIP varied by a factor of 438 among soils and was unrelated to clay content (p > 0.05). The RIP of the clay fractions was lowest for young volcanic soils with allophane and mica, and for highly weathered tropical soils with kaolinite. In contrast, RIP values about two orders of magnitude higher were found in intermediately weathered temperate soils dominated by illite. Soil RIP was, hence, related to soil illite content (R2 = 0.50; p < 0.001). A significant fraction of soil RIP originated from clay minerals embedded in the silt fraction. The sum of the RIPs of the clay and silt fractions overestimated the soil RIP by, on average, a factor of 2, indicating that isolating the clay opens selective 137Cs sorption sites that are inaccessible in intact soils. Soil mineralogy, not just clay content, governs soil RIP. Existing 137Cs bioavailability models therefore require recalibration for use on a global scale.