Thermal Resistivity of Soil
Soil thermal resistivity testing is a process used to evaluate the ability of soil to conduct and dissipate heat. This understanding is crucial for designing and installing underground infrastructure like pipelines and transmission cables, ensuring they don't fail prematurely. The test identifies whether the soil can keep a buried power cable at a safe temperature or if it risks overheating. Overheating can lead to reduced efficiency and, in severe cases, melting of the cable.
This testing, typically performed with a needle probe, assesses the soil's thermal resistivity in its natural state so that potential problems can be foreseen and addressed before installation. If problems are detected, solutions might include derating the cable's current-carrying capacity, upgrading its insulation, or placing an engineered thermal backfill in the trench.
Key applications of this testing include guiding the design of underground pipelines, preventing heat accumulation around power cables, and determining the right cable specifications for local soil conditions. Before laying a cable, it is recommended to perform the test at the intended installation depth. The test uses the transient line heat source method, employing a handheld needle probe that contains a heater and a temperature sensor. The probe, which meets the IEEE 442 and ASTM D5334 standards, is inserted into the soil; as heat is applied, the sensor's temperature is monitored over time to calculate the soil's thermal resistivity. Heating time is kept as short as possible to avoid driving moisture away from the probe and skewing the readings, and temperature is resolved to a thousandth of a degree. Results are expressed in degree Celsius metres per watt (°C·m/W).
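The calculation behind the transient line heat source method can be sketched in a few lines. During heating, the probe's temperature rises approximately linearly with the logarithm of time, and the resistivity falls out of the slope of that line. The sketch below assumes idealized, noise-free readings; the function name, heater power, and sample data are illustrative, not from any specific instrument.

```python
import math

def thermal_resistivity(times_s, temps_c, q_w_per_m):
    """Estimate soil thermal resistivity (degC.m/W) from a heating curve.

    Line heat source model: dT ~ (q / (4*pi*k)) * ln(t), so the
    resistivity rho = 1/k = 4*pi*slope / q, where slope is the
    least-squares slope of temperature versus ln(time) and q is the
    heater power per unit length of the needle (W/m).
    """
    x = [math.log(t) for t in times_s]
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(temps_c) / n
    slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, temps_c))
             / sum((xi - mean_x) ** 2 for xi in x))
    return 4 * math.pi * slope / q_w_per_m

# Synthetic heating curve for a soil with resistivity 0.9 degC.m/W,
# measured 10-160 s after switching on a 20 W/m heater.
times = [10, 20, 40, 80, 160]      # seconds since heater on
q = 20.0                           # heater power per metre (W/m)
temps = [25.0 + q * 0.9 / (4 * math.pi) * math.log(t) for t in times]

print(round(thermal_resistivity(times, temps, q), 3))
```

In practice, readings from the first few seconds are discarded (the probe's own heat capacity dominates there), and the fit is made only over the linear-in-ln(t) portion of the curve.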