A comparison is made of the predictions of the Complex Terrain Dispersion Model (CTDM) with wind-tunnel observations of flow and diffusion in a simulated neutral atmospheric boundary layer over two- and three-dimensional hills. The measure used to evaluate the ability of the model to simulate the laboratory data is the terrain amplification factor, defined as the ratio of the maximum ground-level concentration occurring in the presence of the terrain to the maximum that would occur (irrespective of position) due to the same source located in flat terrain. In general, CTDM predicted considerably smaller terrain amplification factors than were measured in the wind tunnel. When the measured values of flow speed-ups and streamline deformations were input to CTDM, the predictions of terrain amplification factors improved substantially. These results underscore the need for an adequate representation of the flow field around the terrain, and suggest that substantial improvements in model performance may require computing actual streamline patterns to obtain the strain along the flow trajectory, rather than the simple 'relaxation' of the strain factors away from the hill crest that is currently used.
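In symbols, the terrain amplification factor defined above may be written as follows (the notation is introduced here for illustration and does not appear in the original):

```latex
A = \frac{C_{\max}^{\mathrm{terrain}}}{C_{\max}^{\mathrm{flat}}}
```

where \(C_{\max}^{\mathrm{terrain}}\) is the maximum ground-level concentration observed with the hill present and \(C_{\max}^{\mathrm{flat}}\) is the maximum ground-level concentration, at whatever location it occurs, from the same source over flat terrain.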