Abstract
The classical assumption that the lowest dissolved oxygen (DO) occurs at the highest temperature may not always hold. The DO saturation concentration decreases monotonically with increasing temperature, lowering the DO, while the reaeration coefficient increases monotonically with increasing temperature, tending to raise it. The decay coefficient also increases monotonically with increasing temperature, lowering the DO for single discharges but not necessarily for multiple discharges. (The lower decay rates attending lower temperatures could produce low DO at the point where the impact of one discharge meets that of another.) This paper addresses the question of whether DO might, under some circumstances, worsen with decreasing temperature. Using a linear programming model, it is shown that for a uniform stream at constant streamflow, the pattern of discharge that maximizes the derivative of critical dissolved oxygen with respect to temperature is an infinite, uniformly distributed load. This suggests that streams receiving a large number of discharges may be more susceptible to DO worsening with decreasing temperature than streams receiving a small number of discharges. (Copyright (c) 1989 by the American Geophysical Union.)
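The abstract does not state the paper's formulation, but the competing temperature effects it describes can be sketched with the standard Streeter-Phelps sag equation for a single discharge. The sketch below assumes Arrhenius-theta temperature corrections for the decay and reaeration coefficients and an empirical freshwater saturation fit; the rate constants (kd20, ka20), theta values, and BOD load L0 are illustrative assumptions, not values from the paper.

```python
import math

def do_sat(T):
    # Empirical freshwater DO saturation (mg/L) at temperature T (deg C);
    # cubic fit of the form used in standard water-quality references.
    return 14.652 - 0.41022*T + 7.9910e-3*T**2 - 7.7774e-5*T**3

def critical_do(T, L0=20.0, kd20=0.3, ka20=0.6,
                theta_d=1.047, theta_a=1.024):
    """Critical (sag-point) DO for a single discharge, zero initial deficit.

    All rate constants in 1/day; L0 is the initial BOD (mg/L).
    kd20, ka20, L0 are assumed illustrative values.
    """
    # Arrhenius-theta temperature correction of both coefficients:
    # decay rises faster with T (theta_d > theta_a), so the two effects compete.
    kd = kd20 * theta_d**(T - 20.0)
    ka = ka20 * theta_a**(T - 20.0)
    # Time to the sag point, then the critical deficit Dc.
    tc = math.log(ka / kd) / (ka - kd)
    Dc = (kd / ka) * L0 * math.exp(-kd * tc)
    return do_sat(T) - Dc
```

For this single-discharge case the critical DO improves as temperature drops (e.g. `critical_do(10.0) > critical_do(20.0)`), consistent with the abstract's statement that the reversal, if it occurs, arises from the interaction of multiple discharges rather than from one sag curve alone.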