The rate of root water uptake can be limited by either the hydraulic conductivity of the rhizosphere soil, i.e. the soil immediately adjacent to the roots, or by the water potential gradient between the soil and the roots. The traditional conceptual model of root water uptake held that root water uptake lowered the water content of the rhizosphere soil, which increased the hydraulic gradient between the rhizosphere soil and the bulk soil. This increased gradient would act to increase water flow toward the rhizosphere and roots. However, the lowered water content of the rhizosphere soil also decreased the soil hydraulic conductivity (recall Fig. 4‑7), which would act to decrease water flow to the roots. As long as the effect of the increased gradient was sufficient to offset the effect of the decreased hydraulic conductivity, root water uptake could proceed at a steady rate.

But, inevitably, a time would come when the water potential of the rhizosphere soil reached near-equilibrium with the water potential of the roots. From that time on, the hydraulic gradient could only decrease, and the hydraulic conductivity would continue to decrease as well. Thus, the rate of root water uptake would start to decline sharply. This is analogous, in many ways, to the transition from the constant-rate stage of evaporation to the falling-rate stage.
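The tug-of-war between the rising gradient and the falling conductivity can be sketched numerically. The short Python script below is only an illustration, not a model from this text: the power-law conductivity function, all parameter values, and the Darcy-style flux approximation q = K(ψ) × Δψ / Δx are hypothetical choices made for the example. It shows that as the rhizosphere water potential drops (with the bulk soil held wet), the gradient from bulk soil to rhizosphere grows while K plummets, so the flux declines far more gently than K alone would suggest, until the rhizosphere nears the root water potential.

```python
# Illustration of the two competing effects in the traditional conceptual
# model of root water uptake. All parameter values are hypothetical and
# chosen only so the numbers are easy to follow.

K_s = 1e-5          # saturated hydraulic conductivity, m/s (hypothetical)
psi_e = -1.0        # air-entry potential, kPa (hypothetical)
n = 1.5             # exponent controlling how steeply K falls (hypothetical)
psi_bulk = -50.0    # bulk-soil water potential, kPa (held wet)
psi_root = -1500.0  # root water potential, kPa (near the wilting point)
dx = 0.01           # bulk-soil-to-rhizosphere distance, m (hypothetical)

def K(psi):
    """Unsaturated hydraulic conductivity, declining steeply as the soil
    dries (a simple power-law stand-in for curves like Fig. 4-7)."""
    return K_s * (psi_e / psi) ** n

def flux_to_root(psi_rhizo):
    """Darcy-style flux from the bulk soil toward the drying rhizosphere:
    conductivity times the water potential gradient."""
    return K(psi_rhizo) * (psi_bulk - psi_rhizo) / dx

for psi in (-100.0, -500.0, -1000.0, -1450.0):
    print(f"psi_rhizo = {psi:8.1f} kPa | K = {K(psi):.2e} m/s | "
          f"gradient = {psi_bulk - psi:7.1f} kPa | q = {flux_to_root(psi):.2e}")
```

In this sketch, K drops by roughly two orders of magnitude between ψ = −100 and ψ = −1450 kPa, yet the flux falls only about twofold, because the growing gradient partially compensates. Once ψ_rhizo approaches ψ_root, the gradient can no longer grow, and the uptake rate must fall sharply, just as the text describes.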
Rain or Shine by Tyson Ochsner is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.