The Sievert integral, widely used to compute dose distributions about filtered line sources, assumes that the emitted energy fluence is exponentially attenuated by the filter thickness traversed by the primary photons. To evaluate this approximation, a Monte Carlo simulation was performed that realistically models the diffusion and energy degradation of primary photons due to coherent, incoherent and photoelectric interactions in the source filter. Estimates of the exposure rate at points near the source were obtained using analytical averaging. Comparison of the two models shows that for 226Ra and 192Ir sources the Sievert algorithm consistently overestimates the exposure rate per unit activity. Such errors may be significantly reduced, however, if source intensity is expressed in terms of exposure rate. Computed exposure rate distributions based on exposure rate calibrations are also less sensitive to uncertainties in the available spectroscopic data.
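As a point of reference, the Sievert model described above can be sketched numerically. The following is a minimal illustration, not the implementation evaluated in this work: it evaluates the classic Sievert integral for a filtered line source by trapezoidal quadrature, with the point of interest at transverse distance y from the source axis and axial coordinate x measured from one end of the active length. All function and parameter names (and the assumed unit conventions noted in the docstring) are illustrative assumptions, not taken from the source.

```python
import math

def sievert_exposure_rate(gamma_const, activity, length, y, x, mu, t, n=1000):
    """Evaluate the Sievert integral for a filtered line source (illustrative sketch).

    gamma_const : exposure rate constant (assumed R cm^2 mCi^-1 h^-1)
    activity    : source activity (assumed mCi)
    length      : active source length (cm)
    y           : perpendicular distance from the source axis (cm)
    x           : axial coordinate of the point, measured from one source end (cm)
    mu          : filter attenuation coefficient (cm^-1)
    t           : radial filter thickness (cm)
    n           : number of quadrature intervals
    """
    # Angles subtended at the point by the two ends of the active length,
    # measured from the perpendicular dropped onto the source axis.
    theta1 = math.atan2(-x, y)
    theta2 = math.atan2(length - x, y)
    # Trapezoidal quadrature of exp(-mu * t / cos(theta)): the oblique filter
    # path t / cos(theta) embodies the exponential-attenuation assumption.
    total = 0.0
    for i in range(n + 1):
        theta = theta1 + (theta2 - theta1) * i / n
        weight = 0.5 if i in (0, n) else 1.0
        total += weight * math.exp(-mu * t / math.cos(theta))
    integral = total * (theta2 - theta1) / n
    return gamma_const * activity / (length * y) * integral
```

With mu or t set to zero the expression reduces to the unfiltered line-source result, gamma_const * activity * (theta2 - theta1) / (length * y), which is a convenient sanity check. It is precisely the exponential term inside the integral that the Monte Carlo comparison in this work calls into question, since scattered photons reaching the point are not accounted for.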